Why do people claim to not care what people think?
We are all social creatures. We all want acceptance, friends, love. How are we supposed to get these things without caring about what people think? Frankly, I'd say that caring what others think is the second most important factor in life, the first being self-preservation/self-gain. So why would a society even pretend not to care what people think?