DRIVERLESS CARS COULD LET YOU CHOOSE WHO SURVIVES IN A CRASH

Would you ride in a car that was prepared to kill you? An “ethical knob” could let the owners of self-driving cars choose their car’s ethical setting. You could set the car to sacrifice you for the survival of others, or even to always sacrifice others to save you.
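
To make the idea concrete, here is a minimal, purely illustrative sketch (in Python) of what an owner-selectable “ethical knob” might look like as a software setting. The setting names, harm estimates and decision rule are assumptions made for illustration, not any manufacturer’s actual system.

```python
from enum import Enum


class EthicalSetting(Enum):
    """Hypothetical positions of the 'ethical knob' (illustrative only)."""
    ALTRUISTIC = "spare others, even at the passenger's expense"
    IMPARTIAL = "minimise total harm, whoever bears it"
    EGOISTIC = "protect the passenger, even at others' expense"


def choose_outcome(setting: EthicalSetting,
                   passenger_harm: int,
                   bystander_harm: int) -> str:
    """Toy decision rule for an unavoidable crash with two outcomes.

    Harm values are assumed estimates of casualties for each outcome;
    this is a sketch, not a real control-system interface.
    """
    if setting is EthicalSetting.ALTRUISTIC:
        return "protect bystanders"
    if setting is EthicalSetting.EGOISTIC:
        return "protect passenger"
    # Impartial (utilitarian) mode: pick whichever outcome causes less overall harm.
    return ("protect bystanders"
            if bystander_harm >= passenger_harm
            else "protect passenger")


if __name__ == "__main__":
    # Example: with the knob set to impartial, the car spares the larger group.
    print(choose_outcome(EthicalSetting.IMPARTIAL,
                         passenger_harm=1, bystander_harm=3))
```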

The dilemma of how self-driving cars should tackle moral decisions is one of the major problems facing manufacturers. When humans drive cars, instinct governs our reaction to danger. When fatal crashes occur, it is usually clear who is responsible.

But if cars are to drive themselves, they cannot rely on instinct; they must rely on code. And when the worst happens, will it be the software engineers, the manufacturers or the car owner who is ultimately responsible?

People’s attitudes to the issue are also complicated. A 2015 study found that most people think a driverless car should be utilitarian, taking actions to minimise overall harm, even if that means sacrificing its own passengers in certain situations. But while people agreed with this in principle, they also said they would never get into a car that was prepared to kill them.
