[LINK] Robot cars and the fear gap
David Lochrin
dlochrin at key.net.au
Thu Jul 14 18:13:39 AEST 2016
On 2016-07-14 14:06 Karl Auer wrote:
> The trolley problem as I understand it has a person at a switch. A runaway train (trolley) is coming down the track. The switch is set so that if nothing is done, the trolley will kill five people standing on the track. If the switch position is changed, the trolley will kill only one person standing on the other track. Should the person at the switch change the setting? Would you, if you were the person at the switch?
This is an old problem in moral philosophy, and it was addressed by Jeremy Bentham's theory of Utilitarianism: broadly, the greatest good for the greatest number. From Wikipedia: "Utilitarianism is a theory in normative ethics holding that the best moral action is the one that maximizes utility. Utility is defined in various ways, but is usually related to the well-being of sentient entities."
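For what it's worth, Bentham's rule can be caricatured in a few lines of Python. This is only a sketch of the "maximize utility" calculus applied to Karl's scenario; the action names and casualty counts are my own illustrative inventions, not anything from Bentham or the original post.

    # Hypothetical sketch of a utilitarian decision rule: among the
    # available actions, pick the one that maximizes utility -- crudely
    # modelled here as minimizing the expected number of deaths.
    def utilitarian_choice(actions: dict[str, int]) -> str:
        """Return the action whose outcome kills the fewest people."""
        return min(actions, key=actions.get)

    # Karl's trolley scenario: do nothing (5 die) vs. change the switch (1 dies).
    trolley = {"leave switch": 5, "change switch": 1}
    print(utilitarian_choice(trolley))  # -> "change switch"

Of course, the whole debate is about whether "utility" can be reduced to a single number like this at all.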
However I agree it hasn't much to do with driverless cars, unless of course a government legislated that they would be the only ones allowed on the grounds that the overall accident statistics were lower. Sillier laws have been passed, but that's another story...
The core question is to what extent people are to be held responsible for their actions. Is a driverless car which kills someone the responsibility of the owner, the manufacturer, the agency which approved it, or nobody?
David L.