Has it ever been anything but, honestly? Some things in society are getting better today. The worst change since the days of the 1950s is that hardly anyone reads books anymore, at least not to any great extent. If I'm wrong, why are the book chains going out of business? I don't see any other retailers closing up shop in the strip malls. Bed Bath & Beyond, all of the restaurants, Target, Kohl's, and the rest are still alive and well; it's only the bookstores that can't get customers in the USA.

People just don't know anything about geography or culture. Education must be partly to blame, and TV as well. Look at the so-called "History Channel" lineup: aliens, the FBI, pawn shops, outer space, swamp men, crab fishermen, truckers. Is it really the "job channel"?

Another thing that has changed since those earlier days is that men and women no longer know how to dress themselves. That is more a breakdown of the old conventions, which is good, but I still feel it's important to understand the basics of history and convention in every facet of life. With dress go good manners, which people no longer think are important, and people are more narcissistic than ever. The old conventions are breaking down and tolerance is improving, yet people remain content to stay ignorant of other important subjects.