Is there any merit to this thinking?
We can attribute some of the problems in society to lying to our children. What I mean is this: we raise our kids to the point where they can understand and comprehend things, and then, right after they reach that point, we start lying to them about Santa, the Easter Bunny, the Tooth Fairy, and things of that nature. Then, at some point, they find out they've been lied to for years on end, which shatters their faith in their parents. And if kids believe they can't trust their parents, who CAN they trust? So they carry the scars of those lies on their hearts, until some of them snap, or end up with intimacy issues, trust issues, and so on. Thoughts?