Do we really want to leave moral decisions (morals being based primarily on our emotional responses) to non-emotional beings?
I think so.
For example, ask most people to choose who has to die: a 90-year-old or a newborn. How would deterministic logic play a role there?
Will Smith's I, Robot: the robot chose based on survivability chance. Bad example, though. How about a young child versus a healthy adult? I still say the adult, as they are more likely to survive. I agree with the robot's decision.
It's called triage, and it happens all the time.
Plenty of things are already decided by statistics and hard numbers; we just don't have to look at them. You can't save everyone all the time, and loss of life is inevitable. If the value of an individual's life can't be quantified, then you have to choose the option that saves more lives, or the lives of those more likely to actually survive.
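The "save more lives, or those most likely to survive" rule above can be sketched as a toy scoring function. This is just an illustration of the idea, not a real triage system; all the names and survival probabilities here are made up:

```python
# Toy sketch of the triage rule described above: when you can't save
# everyone, pick the option expected to save the most lives.
# People and probabilities are hypothetical examples.

def expected_survivors(option):
    """Sum of survival probabilities for everyone saved by this option."""
    return sum(prob for _, prob in option)

def choose(options):
    """Pick the option with the highest expected number of survivors."""
    return max(options, key=expected_survivors)

option_a = [("healthy adult", 0.9)]   # likely to survive
option_b = [("young child", 0.6)]     # less likely to survive

best = choose([option_a, option_b])
print([name for name, _ in best])
```

On survival odds alone, the rule picks the adult, which is exactly the robot's logic people object to when a machine does it out loud.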
The only difference is that when a human makes the decision, it's unlikely their choice would be as heavily scrutinized as a computer's. Which is stupid, imo. Better to remove human biases like racism and sexism from the equation.
Let the cars kill who they will.
edit: ps
in the trolley situation, it's likely the computer would have a better handle on events and be able to react, like stopping the train, faster and more reliably than a sleepy driver who missed whatever led to the situation in the first place. Zero deaths instead of one or five.
There are already auto-braking systems in cars. There's a reason they override the driver's human error to stop the car safely: it's not a philosophical debate, it's reaction time and data crunching.