1-12 of 12 discussions (55 posts)

Driver-less Cars - Someone Must Die

  1. GA Anderson profile image85
    GA Andersonposted 13 months ago

    As the move towards driver-less cars plows forward, one philosophical question raises the ante.

    In a collision scenario where someone will die, a pedestrian or a passenger in the car, what action should the driver-less car take: swerve and kill the passenger, or stay on course and kill the pedestrian?

    Tap dancing with qualifying conditions and what-ifs won't alter the basic question. The car cannot judge intent, so it won't know if the pedestrian is suicidal, or if the passenger is a human slug that doesn't deserve to live anyway. And the detail of how many pedestrians or passengers are involved isn't a qualifier.

    But a choice still must be programmed into the car.

    What say you?

    ps. here is the story that prompted the question: Quartz Thought Experiment

    GA

    1. wilderness profile image95
      wildernessposted 13 months ago in reply to this

      Hit the pedestrian.  Reasoning:

      The car is driving legally and in the safest manner consistent with efficient operation.  We know this because speed limits, crosswalks, roads, etc. are all set up to accomplish just that.  Ergo, the pedestrian is jaywalking or in some other way not behaving safely.  The passenger should not be penalized for such actions.

      In addition, a more pragmatic view is that no one will buy or use such a car knowing their life is forfeit should some idiot get in the way.  As driverless cars will greatly reduce the death toll on highways by eliminating driver error, it is a losing proposition in the big picture to discourage the use of the technology.  Again, then, the answer is to hit the pedestrian.

      1. GA Anderson profile image85
        GA Andersonposted 13 months ago in reply to this

        Hey guy, I was reaching for the "wrong answer" buzzer, but since it is a philosophical question involving ethical and moral determinations, I don't think there is an accepted right or wrong answer... yet.

        But your justifications for your answer are wrong either way. Your use of qualifiers regarding behavior as justification isn't part of the question.

        In this question, the conditions are blameless. Neither the passenger nor the pedestrian is at fault in any way. It is just a raw bone. The reason a choice is needed doesn't matter at all. Your rationalization that the pedestrian must have been jaywalking or something adds something unintended to the question.

        And the market influences affected by the question aren't a consideration either.

        But, to address your validations, let me rephrase the question:
        "A pregnant young mom, (the soon-to-be baby will bring world peace when it grows up), is the passenger in a driver-less car, and the pedestrian is a young doctor that is days away from finding a cure for cancer... something causes both the car and the pedestrian to have the right-of-way. Yada, yada, yada..."

        ... no participant bears any blame, and one of them will die. Who dies?

        ps. this quote "... As driverless cars will greatly reduce the death toll on highways by eliminating driver error, it is a losing proposition in the big picture to discourage the use of the technology. Again, then, the answer is to hit the pedestrian." recalled something I read about Edison and his struggle to get his version of electricity accepted over Tesla's version. Edison volunteered to use his electricity for a legal death sentence execution - which failed miserably, immediately associating his electricity with death in the public's mind. Wouldn't a driver-less car that mows down pedestrians in lieu of its passenger run the same danger?

        GA

        1. wilderness profile image95
          wildernessposted 13 months ago in reply to this

          Sorry, I didn't view either as a qualifier.  But if you won't allow any reasoning at all, the only answer available is to flip a coin.  There is nothing that says, legally or morally, that one should live and the other die, so chance is all that's left.

          But if it's the "wrong" answer, what justification do you have for valuing one life over another?

          1. GA Anderson profile image85
            GA Andersonposted 13 months ago in reply to this

            Tsk..Tsk... now you know I did not say your answer was wrong. I said your justifications for reaching your answer were wrongly applied, or something like that.

            And then you proceed, (as one politician once said; "There you go again..."), to say no reasoning was allowed. Nope, wrong again.

            I think this question was the article's attempt to link a decades-old philosophical example to a potential dilemma of modern advances. I also think they picked a lousy example. In the theoretical example, (that Trolley car), the moral choice would be a human action or inaction - hence the moral dilemma. I say that substituting the car as the action taker changed the question. The human action requires a moral decision, the car's action requires data-set(s) evaluation.

            That question is easy for me to answer. In favor of the passenger.

            So you see, there was reasoning involved. And not a single coin-flip needed to get there.

            GA

            1. wilderness profile image95
              wildernessposted 13 months ago in reply to this

              But inaction is really a decision/action in and of itself.  Movement is not necessary for an action.

              Nor can the car be substituted for a human action, as it is a human that told it what to do - set up the data-set evaluation you are using.  And as neither pedestrian nor passenger exhibits any legal or moral superiority in the choice, it reverts 100% to the feelings of the programmer.

              1. GA Anderson profile image85
                GA Andersonposted 13 months ago in reply to this

                Alright buddy, to say I expected your responses to be as they are is intended as a compliment, so take it as such and let me change the point of the OP to this: can the philosophical "Trolley Car" question be applied to driver-less car programming?

                I still disagree with your movement, inaction, action dissection, but.... it does bring us to what I see as the realistic consideration of that moral conundrum of the "who dies" question.

                After considering my own OP, I have decided that the original Quartz article was wrong to apply a moral question, i.e. the Trolley Car scenario, to driver-less car programming.

                Everything about the question, whether it be a strict application of the pure moral question or a less restrictive question of programming data-set evaluations, assumes an instance of judgement. A driver-less car does not have that capability. A moral decision requires an evaluation of best interest served, whereas I say a driver-less car does not use a moral question at all. It does not have that capability. It merely evaluates a set of data to determine an action. It is all an "if this, then do that..." decision. There are no mitigating considerations, i.e. the pedestrian is a turd that doesn't deserve to live, or the passenger is a mass murderer with a bomb strapped to his butt.
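                [Editor's note: GA's "if this, then do that..." description can be made concrete with a minimal Python sketch. Everything here is invented for illustration - the function name, inputs, and rules - and no real autopilot works from a table this small; the point is only that the car applies pre-coded rules rather than weighing "best interest served":]

```python
# Hypothetical rule table: the car evaluates sensor data against fixed,
# pre-programmed rules. There is no judgment at decision time; whatever
# happens in the "no safe option" case was decided by the programmer.
def choose_action(obstacle_ahead: bool, clear_lane_available: bool) -> str:
    """Pure data-set evaluation: if this, then do that."""
    if not obstacle_ahead:
        return "stay on course"
    if clear_lane_available:
        return "swerve to clear lane"
    # The moral dilemma lives here, but the car only sees a rule.
    return "brake and stay on course"

print(choose_action(obstacle_ahead=True, clear_lane_available=False))
```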

                So following that back to your "programmer's decision" rationale, I think they too are immune from the question. Putting aside all commercial and capitalistic justifications, the programmer's job is to code a car to be as safe as possible, considering all realistically foreseeable possibilities. And the safety they must be primarily concerned with is that of the passenger. The moral question of the OP is only one of many possible scenarios they must code for.

                As applied to my 5-10 year prophecy that smart driver-less cars will be a major part of our national transportation system, I say the only ethical way to program the cars is to protect the passenger. The pedestrian is the necessary loser.

                GA

                1. Kathryn L Hill profile image85
                  Kathryn L Hillposted 13 months ago in reply to this

                  I will not set foot near a road.

                  1. GA Anderson profile image85
                    GA Andersonposted 13 months ago in reply to this

                    C'mon Kathryn, really? That is your response? You want to contribute to national discussions with that rationale backing your assertions?

                    GA

    2. rhamson profile image77
      rhamsonposted 13 months ago in reply to this

      Unless they come up with a version of the computer on Jeopardy, Watson I believe, that knows all the legalities and caveats of responsibility of driving a car, I will just drive my own car, thank you. smile

      1. GA Anderson profile image85
        GA Andersonposted 13 months ago in reply to this

        Nope, no dodges. I think you, (and I), will see an explosion of driver-less cars in our near future. So, as a moral and ethical person, how do you want the question resolved, who, (of two equally blameless participants), does the car decide to save?

        GA

        1. PrettyPanther profile image86
          PrettyPantherposted 13 months ago in reply to this

          As a former transportation planner, I can tell you pedestrians always have the right of way.  Walking is our natural state.  Driving is an artificial state, a societal convenience that can add danger to self and others.  Therefore, the onus is on the driver (or, in this case, the car) to defer to the pedestrian.

          1. GA Anderson profile image85
            GA Andersonposted 13 months ago in reply to this

            Hi there, pretty hard to argue with that logic... except you did the same thing Wilderness did. Added a qualifier; "pedestrians always have the right of way..."

            Is your answer the same without it?

            GA

        2. rhamson profile image77
          rhamsonposted 13 months ago in reply to this

          The car will logically save the passenger as that is what it is programmed to do. Is it ethically or morally right? Probably not is all I could add. But it is just like any other chicken or the egg argument.

          1. GA Anderson profile image85
            GA Andersonposted 13 months ago in reply to this

            I don't think it is like that at all, and the question is all about the car's programming, i.e. the human coder's moral determination when they "program" the car's actions. Your assumption that "...that is what it is programmed to do." can only be made after the question is answered, because the answer will determine the programming.

            I have ended up thinking there is no moral question at all as long as the question is answered as posed; with the car's actions being questioned rather than a human's actions.

            Of course, when we reach the I, Robot stage of AI, I may have to give it another think.

            GA

            1. rhamson profile image77
              rhamsonposted 13 months ago in reply to this

              If it ever got out that the passenger's safety would be in question in a passenger-over-pedestrian scenario, it would make me wonder how sales would go from there. You need only look at Volkswagen to see how a software program has endangered the whole company.

    3. Credence2 profile image85
      Credence2posted 13 months ago in reply to this

      There it is, the Kobayashi Maru Test: the no-win scenario.

      Intuition and judgement are unique to those operating with neurons rather than microprocessors. Those traits are valid bases on which decisions are made. I don't know if I will ever be comfortable deferring all of that to a machine.

      All things being equal, IMHO, the greater responsibility to preserve life belongs with the individual in the vehicle over the pedestrian. By his own volition the vehicle driver hurtles at high speeds in conveyances; in a collision with a pedestrian, he or she would, more likely than not, have the advantage.

      You take an implied risk driving in a vehicle at high speed, why should I have an equal risk just walking around?

      I am leaning toward mortality for the driver. But I can't be adamant about it as there are so many variables involved...

      1. wilderness profile image95
        wildernessposted 13 months ago in reply to this

        Yep, the Kobayashi Maru Test.  So reprogram to add lift fans to the car and jump over the pedestrian! big_smile

        1. Credence2 profile image85
          Credence2posted 13 months ago in reply to this

          That's the ticket, Laddie!

          1. wilderness profile image95
            wildernessposted 13 months ago in reply to this

            Warp factor 7!

            1. Credence2 profile image85
              Credence2posted 13 months ago in reply to this

              Make it So!

      2. GA Anderson profile image85
        GA Andersonposted 13 months ago in reply to this

        Ha! I knew somebody would bring that in. That is a Capt. Kirk reference, isn't it?

        I thought about that too, but then considered that asking the question relative to the car's actions, (I know, the root is human), and making the degree of injury equal, (1 vs. 1), means it isn't really the pure philosophical scenario it looks like. There is a name for that question in the article, (The Trolley Car Problem).

        The OP article takes that chestnut and asks the question relative to driver-less cars. I think that changes the question.

        Without an explanation, I agree with Wilderness that the car must be programmed to protect the passenger. But without his classic pure-Wilderness type justifications.  Your answer appears to be more related to the kielbasa macho test question you mentioned;)

        GA

    4. Don W profile image82
      Don Wposted 13 months ago in reply to this

      The passenger. Because they (according to my assumptions) chose to be in the car, and therefore hit "Accept" when Google's terms and conditions flashed up on the multimedia screen. Those T&Cs clearly say:

      "In any scenario where the Personal Automated Vehicular Device™ ("automatic car") must choose between making an emergency course correction that will result in an Immediate Life Extinction Event ("death") to the Carbon Based Biped with Designated Binomial: Homo Sapien ("human") inside the automatic car, or a human outside the automatic car, then I agree to forfeit the Biological Signalling and Self-Sustaining Processes ("life") of the human inside the automatic car. Other than as expressly set out in these terms or additional terms, neither Google Inc ("Google") nor its suppliers or distributors make any specific promises about the services. We do not make any commitments about the content within the services, the specific functions of the services, or their reliability, availability, or ability to avoid life extinction events. We provide the services "as is""

      I Agree  -  I Disagree (selecting this option will immobilize your vehicle and make all the tyres and doors fall off)

      1. wilderness profile image95
        wildernessposted 13 months ago in reply to this

        lol

      2. GA Anderson profile image85
        GA Andersonposted 13 months ago in reply to this

        (Chuckling the whole time weren't you?) But would this;
        "The passenger. Because they (according to my assumptions) chose to be in the car..."
        still be your answer if the ToS was tossed?

        GA

        1. Don W profile image82
          Don Wposted 13 months ago in reply to this

          Don't you just love corporate speak? But yes, I think there is a grain of truth in there. Anyone who chooses to ride in the vehicle has implicitly accepted an element of risk that goes with that. On the other hand it could be argued that anyone walking down the street has implicitly accepted the risk of doing so. On the other hand, surely walking down the street counts as a basic necessity, whereas riding in an automatic car does not. So accepting the risks associated with the former is practically unavoidable; accepting the risks associated with the latter is not, it's a choice that the occupant doesn't necessarily have to make. So yes, I'll stick with my first answer: the occupant of the vehicle.

          1. GA Anderson profile image85
            GA Andersonposted 13 months ago in reply to this

            Hey Don, others have mentioned similar reasoning. I see the apparent logic of your choice, and it may be as valid an answer as the one I see, but in your reasoning of consent by the passenger, would a non-consensual use of the vehicle also hold?

            My first thought was of driver-less paddy wagons or medical vehicles. In one, a person may be forced to become a passenger, and in the other the passenger may be unable to give consent. Just a thought. I suppose the thread will shed some light.

            As I mentioned to Wilderness, I am pondering the thought that any reasoning to choose to save the pedestrian may be based on an incorrect view of the question. The question was posed as a moral question, but as asked relative to driver-less cars, can a moral question be asked of a non-human action?

            The obvious first response is that humans program the cars, so the car does have a moral question via the programmers. Hmm... Is the action of programming the same class of action as the maneuvers the car would take? Is a passive action the same as an active one?

            I think I have answers for those, but the gears are still turning.

            GA

  2. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    Swerve and kill the passenger.
    The person who trusts such a car deserves to die out of sheer stupidity.

    1. GA Anderson profile image85
      GA Andersonposted 13 months ago in reply to this

      Just like those idiots that rode a rocket to the moon, right?

      GA

      1. Kathryn L Hill profile image85
        Kathryn L Hillposted 13 months ago in reply to this

        well, yeah!
        It's a gamble and the gamblers deserve their fate.
        Of course the manufacturers and marketers of this car will not admit this to themselves or anyone else!
        so we have to realize it ...
        and take the consequences when we refuse to be proactive.
        I wouldn't step foot in such a car!

        1. wilderness profile image95
          wildernessposted 13 months ago in reply to this

          You'd rather share the car, or the road, with a drunk driver?  A truck driver high on pep pills, or falling asleep?  The teen driver texting her boyfriend?  I fully expect the accident rate to drop precipitously when smart cars are common.

          1. Kathryn L Hill profile image85
            Kathryn L Hillposted 13 months ago in reply to this

            If there is one chance the car would choose pedestrians over me, I would not step foot into such a car. Let alone buy it. 

            If such a car would choose me over pedestrians, I would not step foot in such a  car.

            They will not be sold.

            But, if these cars do end up on the road, I will not set foot in any vehicle on any road.


            I shall have to buy a good bike
            or horse.

          2. GA Anderson profile image85
            GA Andersonposted 13 months ago in reply to this

            Ha! Stop stealing my thunder! Ignore him Kathryn. Don't let his semi-provable facts change your mind. Stay the course.

            GA

            1. wilderness profile image95
              wildernessposted 13 months ago in reply to this

              Hah!  Semi-provable!  100% of the long distance drives by a car (that I know of) have proven successful, with no injuries.  All 1 of them!

  3. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    Hopefully by the time those self-driving cars are perfected flying cars will have hit the market.
    Perhaps they will be based on drone technology. If they are self-driving flying drone cars, self-driving might be okay up in (inner) space!?

    You really think the nation is reading what we keyboard here? yikes!

  4. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    This is the issue: (Maybe? hmm !)

    "The former school of thought argues that the moral action is that which causes the maximum happiness to the maximum number of people, a theory known as utilitarianism."

                                                           versus

    "The creed which accepts as the foundation of morals, Utility, or the Greatest Happiness Principle,
    holds that actions are right in proportion as they tend to promote happiness, wrong as they tend to produce the reverse of happiness. By happiness is intended pleasure, and the absence of pain; by unhappiness, pain, and the privation of pleasure. To give a clear view of the moral standard set up by the theory, much more requires to be said; in particular, what things it includes in the ideas of pain and pleasure; and to what extent this is left an open question. But these supplementary explanations do not affect the theory of life on which this theory of morality is grounded- namely, that pleasure, and freedom from pain, are the only things desirable as ends; and that all desirable things (which are as numerous in the utilitarian as in any other scheme) are desirable either for the pleasure inherent in themselves, or as means to the promotion of pleasure and the prevention of pain.

    Now, such a theory of life excites in many minds, and among them in some of the most estimable in feeling and purpose, inveterate dislike. To suppose that life has (as they express it) no higher end than pleasure- no better and nobler object of desire and pursuit- they designate as utterly mean and grovelling; as a doctrine worthy only of swine, to whom the followers of Epicurus were, at a very early period, contemptuously likened; and modern holders of the doctrine are occasionally made the subject of equally polite comparisons by its German, French, and English assailants."

    http://www.utilitarianism.com/mill2.htm


    Forget the one car "proof," wilderness.

  5. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    A moral question could be asked in the light of utilitarianism.
     
    Which result, (1 passenger dying / 1 pedestrian dying,) would cause the greatest happiness to the greatest amount of people?

    Should a self-driving space-aged computer-operated (very cute) vehicle be programmed to:

    A.) Allow the passenger to perish in a tragic accident in order to save the pedestrian who, after all, was just minding his own business walking along the side of the road?
    OR
    B.) Hit and kill a pedestrian and thereby save the passenger?


    I still think, based on this theory, the pedestrian should be saved and here is why:

    After the crash, passenger Bob would be faced with the death of pedestrian Jack. Bob would suffer way more than Jack, (now dead / feeling nothing,) due to the guilt and agony of CAUSING this tragic event.

    Jack, on the other hand, would not suffer any guilt or sense of responsibility whatsoever upon witnessing a self-driving vehicle swerve and plummet, (or whatever,) on his behalf. Therefore, Jack would come out the happier.

    Conclusion: The passenger would have to be the one programmed to die in a crash and the pedestrian would have to be the one kept intact.

    Unless, of course, Jack had been illegally jaywalking and afterwards realized he had caused the fancy self-driving vehicle to veer into on-coming traffic ...

    huh?... yikes!

    Thanks for nothing, GA.
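    [Editor's note: Kathryn's greatest-happiness comparison amounts to picking the outcome with the higher total utility. A toy Python sketch under her assumptions - the utility numbers are invented; only the guilt asymmetry (the dead feel nothing, the surviving "cause" carries guilt) comes from her post:]

```python
# Hypothetical utilities under Kathryn's reasoning: the dead feel nothing
# (utility 0), while a survivor who "caused" the death carries a lasting
# guilt penalty. The specific numbers are invented for illustration.
outcomes = {
    "save passenger Bob": {"Bob": -50, "Jack": 0},    # Bob lives with guilt; Jack dies
    "save pedestrian Jack": {"Bob": 0, "Jack": -10},  # Bob dies; Jack witnesses, no guilt
}

def total_happiness(utilities):
    """Sum happiness across everyone involved (the utilitarian tally)."""
    return sum(utilities.values())

# Greatest Happiness Principle: choose the outcome with the highest total.
best = max(outcomes, key=lambda o: total_happiness(outcomes[o]))
print(best)  # save pedestrian Jack
```

    Under these made-up numbers the tally lands where Kathryn's argument does: the pedestrian is saved. Different numbers, of course, flip the answer, which is the whole dispute in this thread.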

    1. wilderness profile image95
      wildernessposted 13 months ago in reply to this

      Why would it be necessary to hit a pedestrian walking down the side of the road?  Must have been in the middle of the road, where he does not belong.

      1. Kathryn L Hill profile image85
        Kathryn L Hillposted 13 months ago in reply to this

        You mean Jack was jaywalking.
        If Jack sees the oncoming cute little space-aged computer-programmed self-driving vehicle, (carrying Bob), coming at him, why would he cross in front of it?

        1. wilderness profile image95
          wildernessposted 13 months ago in reply to this

          You'd have to ask Jack.  Or any of the other hundreds/thousands that jaywalk.

          (Read an article reviewing the Toyota Prius once, where the driver on a dark, rainy night suddenly found 3 women walking in his lane - in the middle of a 6 lane interstate!  All three were killed, one by him, but yes, people will jaywalk.)

          1. Kathryn L Hill profile image85
            Kathryn L Hillposted 13 months ago in reply to this

            I guess the marketers should never speak of the human life which exists willy nilly outside the automatic car.

    2. GA Anderson profile image85
      GA Andersonposted 13 months ago in reply to this

      You are welcome, Kathryn. Look at the bright side... at least you had the opportunity to respond without needing to choose a right or wrong side.

      It is also interesting that you find guilt to be a motivating factor for your decision. Hmm...

      GA

  6. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    "In any scenario where the Personal Automated Vehicular Device™ ("automatic car") must choose between making an emergency course correction that will result in an Immediate Life Extinction Event ("death") to the Carbon Based Biped with Designated Binomial: Homo Sapien ("human") inside the automatic car, or a human outside the automatic car, then I agree to forfeit the Biological Signalling and Self-Sustaining Processes ("life") of the human inside the automatic car. Other than as expressly set out in these terms or additional terms, neither Google Inc ("Google") nor its suppliers or distributors make any specific promises about the services. We do not make any commitments about the content within the services, the specific functions of the services, or their reliability, availability, or ability to avoid life extinction events. We provide the services "as is""

    Who would agree to this in the first place??

    ~ unless one has no concern or regard whatsoever for the "Biological Signalling and Self-Sustaining Processes ("life") of the human inside the automatic car," i.e. Bob.

  7. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    I guess it could become a national campaign to make sure not to cross in front of a visually obvious Personal Automated Vehicular Device™ ('automatic car')! Maybe they should be equipped with flags extended on thin, flexible poles, for the sake of the human passengers.
    Then Bob might survive.
    and no one has to die! big_smile
    EVER!

  8. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    Feel free to get us back on track, GA.

    But in all actuality, the manufacturers are the deciding factor in all this.
    Do they wish to sell the vehicle and get sued by passengers or pedestrians?
    (or their families in case of death to either party.)
    ~ why did you have to bring this issue up?

    WHY?

    1. GA Anderson profile image85
      GA Andersonposted 13 months ago in reply to this

      Because it was a question that did not need political or capitalist bias to defend a choice. (such as the cute one about the ToS, and the others that mention sales factors and manufacturer liability)

      I think the "passenger made a choice" rationales were reasonable - even though I disagree. Then there was the nut to crack as to how to answer without adding any conditions or assumptions; jaywalking, etc.

      Your "guilt" and which survivor would suffer the most explanation was an unexpected angle. I don't agree, but I can see that line of thought as a valid path for some folks.

      As the thread progressed I discovered an easy way out of the moral "someone must die" question for my answer. I now think that the moral question does not apply until we reach AI capabilities that include intuitive judgement, rather than just tangible data-set evaluations. The programmed actions of a non-human decision maker vs. the judgement-driven actions of a human decision maker.

      Which is a camouflaged example of following Capt. Kirk's machu picchu test scenario, (as Credence2 mentioned)

      ps. To atone for my cowardly bailout, maybe I should start a new thread on the real Trolley Car question.

      GA

  9. PrettyPanther profile image86
    PrettyPantherposted 13 months ago

    I think you all are making it way too complicated.  Pedestrians have the right of way.  A car will be programmed to swerve or stop to avoid pedestrians.  It will also be programmed to avoid running into other objects.  So, if there is a chance of stopping or swerving without running into something else, that will happen.  If there isn't, then the vehicle will be programmed to favor running over an inanimate object over running over a human.  That is assuming it can recognize which is which.  I don't know how nuanced the programming can be.  For example, would the vehicle be able to distinguish between a deer running into its path versus a person?
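    [Editor's note: PrettyPanther's ordering reads like a simple priority list. A toy Python sketch of that reasoning, with hypothetical obstacle categories and ranking drawn only from the post above, not from any real vehicle's software:]

```python
# Hypothetical ranking: lower number = more acceptable collision target.
# The ordering mirrors the post: avoid humans above all, prefer hitting
# inanimate objects, and take a clear path whenever one exists.
PRIORITY = {"clear path": 0, "inanimate object": 1, "animal": 2, "human": 3}

def pick_path(options):
    """Choose the available path whose obstacle ranks lowest."""
    return min(options, key=lambda kind: PRIORITY[kind])

# If swerving into a parked car avoids a pedestrian, swerve.
print(pick_path(["human", "inanimate object"]))  # inanimate object
```

    As the post notes, the hard part is not the ranking but the recognition: the sketch assumes the car can already tell a deer from a person.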

    Anyway, in my mind, this isn't really that complicated.  Maybe I just have a simple mind, though.

    1. Kathryn L Hill profile image85
      Kathryn L Hillposted 13 months ago in reply to this

      The question was directed at the programmers.
      How would you like to be in charge of saving either the rider or the walker?

      "Which result, (1 passenger dying / 1 pedestrian dying,) would cause the greatest happiness to the greatest amount of people?"

      or maybe it should be considered in this light:
      Which choice would cause the greatest unhappiness to the greatest number of people?

    2. GA Anderson profile image85
      GA Andersonposted 13 months ago in reply to this

      OK, does that mean I am supposed to agree with you, that you do have a simple mind, or am I supposed to protest wildly that you have an excellent mind?

      I think you find it easy to answer because you are considering the question as a legal one and not a moral one.

      If it were the original moral question, before the driver-less car was added, then there would be no realistic legal aspect.

      The original question concerned a trolley car and pedestrians. One version, (this is from the OP linked article), has a passenger with the choice to pull the trolley car brake lever to stop the car and save the pedestrians, but that action would result in the trolley car crashing and killing all on-board, or do nothing and the trolley car folks live. Another version involves a footbridge over the trolley tracks. A person on this footbridge sees a trolley car on a path to crash into a group of people - sure death for the pedestrians. Standing beside this person is a very fat person, fat enough to stop the trolley. Should one fat person be thrown off the bridge to save the lives of multiple people? (the illegal act of murder is not a consideration - it's a moral question)

      GA

      1. PrettyPanther profile image86
        PrettyPantherposted 13 months ago in reply to this

        LOL, I was just poking fun at myself, but you are welcome to express your opinion, or not.

        I still think that a pedestrian is in its natural state, while a vehicle is a man-made contraption that must yield to that which is in its natural state.

        With regard to the first trolley scenario, the same thing applies.  The trolley is a man-made convenience that can also be dangerous.  You explicitly stated there can be no qualifiers.  To me, that makes it simple.  Humans using a man-made contraption with the potential to harm other humans must yield to the humans in their natural state.  [Edited to add:  In the 2nd scenario where sacrificing the fat person could save the others, then I go with sacrificing one to save many. ]

        Again, to me this seems quite simple and in line with the moral reasoning that if you build or create something you are responsible for minimizing its negative effects on people who are merely existing in their natural state.

  10. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    The link brought me to an abortion discussion. Maybe I didn't read far enough.

    "The original question concerned a trolley car and pedestrians..." - GA Anderson

    1.) A passenger with the choice to PULL THE trolley car BRAKE lever to stop the car and save the pedestrians:
    That action would result in the trolley car crashing and killing all on-board,
    or DO NOTHING and the trolley car folks live.

    A. Do nothing to let the greater number live.
    However, the trolley operator would certainly feel awful.
    The trolley passengers would feel nothing, being dead.

    OR WOULD THEY???????


    2.) A person on a footbridge over the trolley tracks sees a trolley car on a path about to crash into a group of people - sure death for the pedestrians. Standing beside this person is a very fat person, fat enough to stop the trolley.
    Should one fat person be thrown off the bridge to save the lives of multiple people? (the illegal act of murder is not a consideration - it's a moral question)

    A. I guess throw off the fat person to save the greatest number of people! yikes !
    Of course the thrower would certainly feel a lot of guilt!
    Now, if they were all left to die instead, and the fat person, Joe, and the would-have-been thrower, Gail, lived, Gail and Joe might have lived happily ever after, as the event pulled them into a beautiful, long-lasting relationship. Meanwhile, the trolley victims were dead and felt nothing at all.

    OR DID THEY?????

  11. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    The metaphysical must be taken into consideration!

    1. PrettyPanther profile image86
      PrettyPantherposted 13 months ago in reply to this

      In what way?

  12. Kathryn L Hill profile image85
    Kathryn L Hillposted 13 months ago

    The effect of death upon the soul.

 