Most people have no trouble choosing between hitting a person and hitting a shopping cart with their car, but can self-driving vehicles make the same moral decisions? Yes, according to a new study by researchers developing technology to give self-driving cars a kind of ethics training.

Turning ethics into an algorithm isn't as difficult as you might think, according to researchers at the Institute of Cognitive Science at the University of Osnabrück in Germany. After plugging people into virtual reality simulations of critical traffic scenarios, they found patterns in the way we prioritize saving other people, animals and inanimate objects.

By assigning values to those hazards, researchers can teach computers to make humane decisions on the road.

"Human behavior in dilemma situations can be modeled by a rather simple value-of-life-based model that is attributed by the participant to every human, animal, or inanimate object," wrote Leon Sütfeld - the primary author of the study that has been published in 'Frontiers in Behavioral Neuroscience.'

But researchers say a few gray areas remain. For instance, if a child runs into the street, making an accident likely, should the child's life be saved even if that means the car must veer onto the sidewalk and hit a bystander who played no part in creating the dilemma? Such ethical nightmares are hard for people to resolve, let alone to teach to computers.

And if the self-driving vehicle makes the wrong decision, do we blame the machine or the inventor? 

The researchers urge society to consider those troubling questions before allowing machines to make life-or-death decisions for us.

"We need to ask whether autonomous systems should adopt moral judgements," says Gordon Pipa - a senior author of the study. "[I]f yes, should they imitate moral behavior by imitating human decisions, should they behave along ethical theories and if so, which ones and critically, if things go wrong who or what is at fault?"