
Do you think that Christianity is the only group that teaches morals and ethics?

  1. Lady Guinevere posted 5 years ago

    Do you think that Christianity is the only group that teaches morals and ethics?

  2. lburmaster posted 5 years ago

    No. There are other religions that teach morals and ethics. However, they leave a few out and add their own as well.

  3. Renee Abbott posted 5 years ago

    I have never been Christian, and I learned morals and ethics very young. I can't fathom that people honestly believe morals and ethics are just a Christian thing. If that is true... wow. My manners will not permit me to add more on that. Smiles.

  4. Old Empresario posted 5 years ago

    Since when did Christianity ever teach about morals and ethics? Seriously. The only things it teaches are "Go to church, keep going to church, and here's why you should go to church...". I guess it does harp on its own morals that are relative to itself, like who you can have sex with and how you should do it. But ethics? Where are those mentioned?

    1. justmesuzanne posted 5 years ago in reply to this

      Yeah, really! The way Christianity has been co-opted by the Ayn Randians these days, it does anything but teach morals and ethics.

  5. davidkaluge posted 5 years ago

    No, even a pagan knows and teaches morals and ethics. Moreover, morals and ethics depend on the group: what is morally good in one group may be wrong in another.

    1. Renee Abbott posted 5 years ago in reply to this

      Another way to answer this question is to ask the Native American Nations whose ancestors experienced Wounded Knee and the Trail of Tears. That is just one example. The morals and ethics behind those events are not ones I would teach, nor is the burning of witches.

  6. ChristinS posted 5 years ago

    Absolutely not. We are atheists/agnostics in my family, and I have done my best to instill proper values and morals in my children. I get tired of hearing that Christians have a corner on the morality market; they most certainly do not. People do good things and people do bad things, regardless of their religion or lack thereof.

  7. JMcFarland posted 5 years ago

    I would argue that the "morality" taught by the Bible is immoral; that's why the majority of the laws it dictates (there are 615 of them) are not followed today. Morality doesn't come from an old book. It is dictated by society at large and concerns the way human beings treat each other. It has nothing to do with Christianity. Morality has existed since human beings formed societal bonds in order to co-exist. The Bible, and therefore Christianity, came along much, much later.