I am starting to think women are partly to blame (not 100%, by the way) for the way they are treated in society and in the media. Women love to blame the media for the pressure to be thin, the pressure put on women who express their sexuality, and so on. However real these issues are, we seem to reinforce and repeat these unhealthy ideas ourselves: slut shaming, accusing someone of being too skinny (like 'Biggest Loser' winner Rachel Frederickson), calling others too fat, wanting a thigh gap (if you don't know what this is, google it). We seem to bring each other down more than we build each other up. Not only that, but we have the nerve to blame others (the media, men, etc.) without owning up to our role in this. Are women just hypocrites? By the way, I am not saying all women are like that, but a good majority are, in my opinion... LET ME REPEAT: MY OPINION!!! I am just frustrated with the ignorance of my gender and the lack of responsibility and accountability; we expect it from others but don't give it in return. MEN AND WOMEN, WHAT DO YOU THINK? I KNOW THIS IS NOT GOING TO BE POPULAR, BUT I HAVE TO SPEAK ON THIS.