Over the last few years I've read many articles about how 'we' are paying people to teach our kids to hate our society, history, capitalism, democracy, and possibly even themselves, and to discourage our young people from being too patriotic. We've probably all heard about what's happening in our universities: making sure so many females on campus believe they are victims of the male patriarchy and that there is a rape culture on campus (which has been proven not to actually exist). Apparently the majority of lecturers lean to the Left politically. More and more people are saying that 'we' are producing young adults who aren't properly prepared for living successfully in our society. It is counter-productive for our education system to turn out people like this. So when did it all start? Has this been happening for many years? And I don't only mean when it comes to education. We, and our kids, are possibly the most fortunate people ever to have lived, yet all I hear from people, especially younger ones, is complaining about society. Shouldn't we be making sure that each new generation comes as close to ideal as possible when it comes to believing in our society and fitting in? But then who decides what the 'ideal' is? Sometimes it seems that we ignore common sense and end up doing the wrong thing instead.