You would need a crazy low probability of a lunatic or mass murderer being down the line to justify not killing the one person.
Edit: Sum(2^n (1-p)^(n-1) p) ~ Sum(2^n p) for small p. So you'd need a p = 1/(2×2^32 - 2) ~ 1/(8 billion) chance of catching a psycho for the expected values to be equal. I.e. at most a single person on earth would decide to kill everyone.
Well, what about the fact that after 34 people the entire population is tied to the tracks? What are the chances that one person out of 35 wants to destroy humanity?
Also, tying the entire human population to the tracks is going to cause some major logistical problems; how are you going to feed them all?
I just calculated the sum from n=0 to 32 (because 2^33>current global population). And that calculation implies that the chance of catching someone willing to kill all of humanity would have to be lower than 1/8 billion for the expected value of doubling it to be larger than just killing one person.
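That break-even calculation can be sketched in a few lines of Python (a sketch of the sum above, assuming n runs from 0 to 32 and each person down the line independently chooses "kill" with probability p):

```python
# Expected deaths if everyone keeps doubling until someone pulls "kill".
# Assumption from the thread: person n faces 2^n potential victims,
# n = 0..32, since 2^33 already exceeds the global population.

def expected_deaths(p, stages=33):
    """Expected deaths when each person independently kills with probability p."""
    return sum(2**n * (1 - p)**n * p for n in range(stages))

# For small p this is roughly p * (2^33 - 1), so the break-even probability
# (where always doubling matches just killing the one person) is about:
break_even = 1 / (2**33 - 1)

print(expected_deaths(break_even))  # close to 1
print(break_even)                   # roughly 1 in 8.6 billion
```

So unless fewer than about one person on earth would pull "kill", the expected body count of doubling exceeds one.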
Yeah I think I was in a stupor when I commented. I don't think I even tried to understand your comment. My apologies. But now that I am trying, I am struggling to understand the notation.
Oh come on. A trolley is not going to have the momentum to kill that many people nor would the machinery make it through. The gears and whatnot would be totally gummed up after like 20 or so people.
You don't even need a lunatic or mass murderer. As you say, the logical choice is to kill one person. For the next person, the logical choice is to kill two people, and so on.
It does create the funny paradox where, up to a certain point, a rational utilitarian would choose to kill and a rational mass murderer trying to maximise deaths would choose to double it.
Idk which moral system you operate under, but I'm concerned with minimising human suffering. That implies hitting kill, because the chances of a mass murderer down the line are too high not to. You also don't follow traffic laws to a T, but exercise caution, because you don't really care whose fault it ends up being: you want to avoid bad outcomes (in this case the extinction of humankind).
Even if your moral system solves those "problems", you only "solved" them by replacing the obvious and logical basis, utility, with personal responsibility. Personal responsibility is no inherent good, unlike utility: if people are unhappy/"feel bad", it doesn't matter how personally responsible everyone is being, that world is still a shit place.
Also, the threat isn't imagined. I can assure you that there are a lot more than one person on earth who would choose to kill as many people as possible if given the option.
Why do you care whose fault it is? You'd want to minimise human deaths, not win a blame game.