Do we need to tell our children what happens in the world? Do we need to tell them about the happy and dangerous sides of relationships? Do we need to tell them about the good and bad things going on in the world? And at what age?
My oldest daughter is 11 years old now. She asks me lots of those questions, and sometimes I really don't know the answer. Some questions are complex. She is smarter than me. And eventually children will find out anyway, through others telling them, through school and friends…
What is your opinion: at what age should we tell our children about the bad things that happen in the world?