I am an American educator. In order to be as effective as I can be in the classroom, I think it is important for me to understand popular culture and its relationship with children and teens. This is also to satisfy my own curiosity. What do you believe are the cultural norms of children and teens in America? How do you feel about this?
I have my own fatalistic notions but I'd like to hear the better informed (hopefully) opinions of parents, teens, and others.
As an adult, I see how heavily influenced I was by the media in the 60s and 70s although I didn't see it at the time.
I'm an educator also, and I think that marketing in America has an even greater influence on youth today than it did in my 'time.' Many young girls seem convinced that they are not important unless boys 'want' them, and many boys believe that they are only as important as the number of conquests they have. Everything is sexualized by the media (and I'm not a prude)... in one propaganda lesson I do every year, students brought in advertising images in which sex sold everything from booze to guitar strings to cell phones and shoes... and more. And teens are even more stubborn today, insisting they are not influenced at all by media and advertising... thus, I would say that for teens, popular culture tells them they are totally autonomous beings outside its sphere of influence... although this is the exact opposite of the truth... just my opinion.
I think popular culture produces feelings of self-entitlement in some teens and children. Surprisingly, I think there are also many children who couldn't care less about the mainstream and actually just want to learn. If your ultimate goal is to reach all of them, and actually make them learn something, then I suggest reviewing the work of Jaime Escalante.