
Do we really need feminism in the United States?

  1. kittre posted 4 months ago

    Do we really need feminism in the United States?

  2. dashingscorpio posted 4 months ago

    https://usercontent2.hubstatic.com/13737479_f260.jpg

    Feminism is nothing more than a continuous effort to strive for equality between women and men. As long as there remains discrimination, sexual harassment, unequal pay for the same work, and gender-based bias in hiring for certain positions, there will be a need for both women and men to take a stand.

  3. LimeyFeline posted 4 months ago

    Everywhere you turn there are politicians and religious fanatics trying to deny women access to birth control and safe abortions. The US also has the worst rate of maternal deaths in the developed world, not to mention that its federal maternity leave policies are, again, among the worst in the developed world. So you tell me, is feminism needed?

  4. NewsOnline23 posted 4 months ago

    The revolution shouldn't be about women trying to be men or gaining supremacy over men, but about equal rights in every area of life. Which is a fair request.

 