Has the entertainment field contributed to our society's decline?
As someone with a TV background, I feel a certain duty to defend it. There's some excellent factual TV that is educational and informative, aspires to make the world a more positive place, and genuinely helps people. Unfortunately, this is not always the television that people choose to watch! To be honest, I also watch some "rubbish" TV and enjoy it. But it's not true that it's all rubbish!
I'll be interested to see other people's answers.
Well, I'm not sure just how useful my opinion will be - but of course I'll proffer it anyway! Personally, I don't think the field has contributed to any sort of decline in society - there are some powerful and moving films, novels, and television programmes out there, both fictional and factual. Besides, society is (believe it or not) not quite so degenerate as the mass media might have us believe, at least in our privileged western nations. There are problems, of course... but as a whole, I do think members of our society have it far, far better than those in some of the unenlightened and unprogressive societies of history. I have no duty to defend artwork or any entertainment in particular - but I'll hold out hope that thoughtful and meaningful work is still being produced, if you look for it.