The conscience of a self-driving car

NexusMedia

Imagine you are driving down the street when two people — one child and one adult — step onto the road. Hitting one of them is unavoidable. You have a terrible choice. What do you do?

Now imagine that the car is driverless. What happens then? Should the car decide?

Until now, no one believed that autonomous cars — robotic vehicles that operate without human control — could make moral and ethical choices, an issue that has been central to the ongoing debate about their use. But German scientists now think otherwise.

They believe eventually it may be possible to introduce elements of morality and ethics into self-driving cars.

To be sure, most human drivers will never face such an agonizing dilemma. Nevertheless, “with many millions of cars on the road, these situations do occur occasionally,” said Leon Sütfeld, a researcher in the Institute of Cognitive Science at the University of Osnabrück and lead author of a new study modeling ethics for self-driving cars.

The paper, published in Frontiers in Behavioral Neuroscience, was co-authored by Gordon Pipa, Peter König, and Richard Gast, all of the institute.

Driverless cars have gained popularity as a way to combat climate change, since autonomous vehicles drive more efficiently than most humans do. They avoid rapid acceleration and braking, two habits that waste fuel.

Also, a fleet of self-driving cars could travel close together on the highway to cut down on drag, thereby saving fuel.

Driverless cars will also encourage car-sharing, reducing the number of cars on the road and possibly making private car ownership unnecessary.


Improved safety is also an energy saver. “[Driverless cars] are expected to cause fewer accidents, which means fewer cars need to be produced to replace the crashed ones,” providing another energy savings, Sütfeld said.

“The technology could help [fight climate change] in many ways.”

The study suggests that cars can be programmed to model human moral decision-making, choosing which of several possible collisions would be the least harmful option.

Scientists placed human subjects into immersive virtual reality settings to study behavior in simulated traffic scenarios.

They then used the data to design algorithms for driverless cars that could enable them to cope with potentially tragic predicaments on the road just as humans would.
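To make that concrete, here is a minimal sketch of the kind of decision rule such an algorithm could implement: each obstacle category gets a scalar "value of life", and the car steers toward the option with the lowest total value. The categories, numbers, and function names below are hypothetical illustrations, not the model or parameters actually fitted in the paper.

```python
# Minimal sketch of a "value of life" decision rule, in the spirit of the
# Osnabrück study. All numbers here are invented for illustration; they are
# NOT the parameters fitted in the paper.

# Hypothetical learned value per obstacle category (higher = more worth sparing).
VALUE_OF_LIFE = {
    "child": 1.00,
    "adult": 0.85,
    "dog": 0.30,
    "goat": 0.20,
    "deer": 0.15,
    "boar": 0.10,
    "trash_can": 0.01,
}

def choose_lane(lanes):
    """Given a mapping of lane -> obstacles in that lane, steer into the
    lane whose obstacles sum to the lowest total value, i.e. spare
    whatever the model values most."""
    return min(lanes, key=lambda lane: sum(VALUE_OF_LIFE[o] for o in lanes[lane]))

# Example: swerving left hits a dog, staying in the lane hits an adult.
print(choose_lane({"swerve_left": ["dog"], "stay": ["adult"]}))  # -> swerve_left
```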

Participants “drove” a car through a typical suburban neighborhood on a foggy day when they suddenly faced a collision with an animal, a human, or an inanimate object, such as a trash can, and had to decide what or whom to spare. For example, adult or child? Human or animal? Dog or other animal?

In the study, children fared better than adults. Among the animals, the dog was the most valued, followed by the goat, deer and boar.


“When it comes to humans versus animals, most people would certainly agree that the well-being of humans must be the first priority,” Sütfeld said.

But “from the perspective of the self-driving car, everything is probabilistic. Most situations aren’t as clear cut as ‘should I kill the dog, or the human?’ It is more likely ‘should I kill the dog with near certainty, or alternatively spare the dog but take a 5 percent chance of a minor injury to a human?’

“Adhering to strict rules, such as always deciding in favor of the human, might just not feel right for many.”
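In code, Sütfeld’s example amounts to comparing expected harms rather than applying a categorical rule. The sketch below works through that comparison; the probabilities and harm weights are invented for illustration and are not drawn from the study.

```python
# Toy expected-harm comparison for Sütfeld's example: hit the dog with near
# certainty, or spare it and accept a small chance of a minor human injury.
# Probabilities and harm weights are hypothetical.

OPTIONS = {
    "hit_dog": {"p": 0.98, "harm": 0.30},            # near-certain dog fatality
    "risk_human_injury": {"p": 0.05, "harm": 0.10},  # 5% chance of minor injury
}

def expected_harm(option):
    o = OPTIONS[option]
    return o["p"] * o["harm"]

for name in OPTIONS:
    print(name, round(expected_harm(name), 3))  # hit_dog 0.294, risk_human_injury 0.005

print("choose:", min(OPTIONS, key=expected_harm))  # -> risk_human_injury
```

Under these made-up weights, the car would risk the minor human injury rather than kill the dog with certainty; a categorical “humans first” rule would pick the opposite branch, which is exactly the tension Sütfeld points to.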

Other variables also come into play. For example, was the person at fault? Did the adult look for cars before stepping into the street? Did the child chase a ball into the street without stopping to think? Also, how many people are in harm’s way?

The German Federal Ministry of Transport and Digital Infrastructure attempted to answer these questions in a recent report. It defined 20 ethical principles for self-driving cars, several of which stand at odds with the choices humans made in Sütfeld’s experiment.

For example, the ministry’s report says that a child who runs onto the road is more to blame — and less worthy of saving — than an adult standing on the footpath as a non-involved party. Moreover, it declares it unacceptable to take a potential victim’s age into account.


“Most people — at least in Europe and very likely also Northern American cultures — would save a child over an adult or elderly person,” Sütfeld said.

“We could debate whether or not we want cars to behave like humans, or whether we want them to comply with categorical rules such as the ones provided by the ethics committee report.”

Peter König, a study co-author, believes their research creates more quandaries than it solves, as sometimes happens in science.

“Now that we know how to implement human ethical decisions into machines we, as a society, are still left with a double dilemma,” he said.

“Firstly, we have to decide whether moral values should be included in guidelines for machine behavior and secondly, if they are, should machines act just like humans?”

The study doesn’t seek to answer these questions, only to demonstrate that it is possible to model ethical and moral decision-making in driverless cars, using clues as to how humans would act. The authors are trying to lay the groundwork for additional studies and further debate.

“It would be rather simple to implement, as technology certainly isn’t the limiting factor here,” Sütfeld said. “The question is how we as a society want the cars to handle this kind of situation, and how the laws should be written.

“What should be allowed and what shouldn’t? In order to come to an informed opinion, it’s certainly very useful to know how humans actually do behave when they’re facing such a decision.”

Marlene Cimons writes for Nexus Media, a syndicated newswire covering climate, energy, policy, art and culture.

Source: NexusMedia. Reproduced with permission.

Comments

16 responses to “The conscience of a self-driving car”

  1. Charles

    Can we please stop taking stupid, dangerous situations that humans get themselves into and inserting a machine into the human’s place? It’s a purely theoretical example that is only useful in fuelling arguments on the internet.

    An actual autonomous vehicle would be aware of its environment and not get into these situations to start with.

    1. neroden

      And this is correct. No autonomous vehicle should be permitted to drive recklessly — it should not be programmed to speed down roads so fast that it can’t stop if people at the side of the road step out.

  2. Damien van Hoogen van

    Most of these discussions are just sensationalist moral arithmetic (‘is one woman worth two children?’, etc.), and unfortunately this article is no different.

    If we ever install the technology needed to make gender/age/species distinctions, I don’t think the moral rules will necessarily be complicated: the driver must always be protected, and if there is any chance that swerving will cause injury to the driver, then the car will run over whatever combination of children and dogs is required. What people say their morals are can be very different from what they practice; nobody actually wants a car that will prioritize somebody else’s life over theirs. Braking is almost always the right answer in these situations, not swerving, and because of incomplete information it will never be as clear cut as ‘he dies or she dies’; it will be a probabilistic array in which the car should choose the lowest risk.

    If we ever can discriminate gender/age/race, then hopefully we opt for simple numbers: swerve and kill one instead of two, regardless of their identity. A woman is not worth more than a man, a child is not worth more than a woman, and a white person is not worth more than an Asian person.

    1. neroden

      You’re actually wrong. The rules are already established in the legal system — a human driver has *already* broken the law, and is driving recklessly, if he is driving so fast that he doesn’t have time to stop for pedestrians.

      The same will apply for self-driving cars. They will slow down preemptively when there are pedestrians at the side of the road, in case pedestrians walk into the street.

      Just as *competent* human drivers already do.

      We don’t want to make autonomous killing machines which speed recklessly down city streets like today’s reckless human drivers. So we aren’t going to.

      1. Damien van Hoogen van

        Imagine driving through a CBD at 60 kmph and somebody steps out from the sidewalk into the street before the lights indicate they should. A driver will physically not have time to stop even if they are completely focused. This happens frequently in Melbourne; although generally it’s bikes that hit them, not cars, it still results in deaths. In what way is the driver culpable in that case, assuming they were following the posted limit?

        You may be right about the legal status… but I doubt it; if you can find the relevant passage, please link it as evidence. There are laws in existence (incest law, for example) that are so foolish that cops and the courts use their discretion to effectively ignore them.

        1. neroden

          In that case, the motorist is driving much too fast down the city street. 60 kmph is reckless. So, good example. The driver is legally culpable for manslaughter.

          You must not drive faster than what is reasonable and prudent, regardless of the speed limit. This may require driving substantially below the speed limit.

          FWIW, our city speed limits in the US are lower than that — typically 30 mph (48 kmph). In the UK there’s a campaign to lower city speed limits to 20 mph (32 kmph), a speed at which most pedestrians *survive* collisions without major injuries.

          I’m not sure about the Australian law — I am sure about the US law.

          It is true that corrupt police and DAs usually let the killers get away with vehicular manslaughter. There is a large campaign to stop this. (In the Netherlands it was “Stop de Kindermoord”, and happened earlier.)

          1. Damien van Hoogen van

            Since you didn’t post any links re laws, I can only assume you are conflating the law with your own driving ethics. It IS legal to do the posted limit; you consider it reckless, but the law does not.

            I’m actually in full agreement with the spirit of what you are saying, just not the letter. I would love to see most cars removed from our CBDs and seriously slowed down, so pedestrians can take back the streets. Additionally, infrastructure could be dramatically altered to improve safety.

            There are situations in which neither driverless nor driven cars will be able to avoid hitting a pedestrian, and just calling the driver reckless is not grappling with these scenarios.

  3. Robin_Harrison

    Morality plays no part in this. Human drivers/riders are increasingly less likely to be in this position the more they take the trouble to learn how to drive well. Autonomous vehicles will always drive well, reducing the possibility of this situation enormously.

  4. Alastair Leith

    Moral and ethical choices made in a probabilistic model aren’t moral and ethical choices; they’re a simulacrum of moral and ethical choices. The self-driving car actually doesn’t care. At all. They are even less what we call ‘consciousness’, which is way, way more than human thought.

  5. Alex Hromas

    The question is false in that it assumes that in such a situation human drivers will respond rationally rather than reflexively or in panic. The most likely outcome is that a human driver would hit both, whereas an autonomous vehicle would hit only one, or might sacrifice itself by running into an obstacle, hitting no one and not injuring its occupants.

  6. Ian

    The machine does not think; it just follows the program installed in it. The morals of the decisions are those of the programmers. The machine is just a fancy steering wheel operated by the designer. It’s good to plan and anticipate the worst-case scenario so that most of the time the best outcome is achieved.

  7. DugS

    This is a pointless thought experiment that seems to gain traction only amongst those who propose it in the first place, i.e. Schrödinger’s cat fanciers. To imbue an inanimate object with ethics is as absurd as regarding the entire population of another country as ‘the enemy’. It should be rebutted as puerile nonsense with a counter-question, ‘is there a bomb that selectively kills only bad people?’, that will put the cat amongst the pigeons.

  8. neroden

    “Imagine you are driving down the street when two people — one child and one adult — step onto the road. Hitting one of them is unavoidable…”

    If this happens, it means you are a bad driver who has already committed an illegal act of reckless driving.

    You should ALWAYS be driving at a speed where you can stop in time if anyone steps out into the road. This is what I was taught in driving school. Anything faster is reckless. “Don’t drive faster than your range of vision.”

    Now, would you like to reconsider this entire article? The self-driving car should, of course, refuse to out-drive its range of vision. It should go at reasonable and proper speeds and be able to stop before murdering pedestrians. If it doesn’t, the engineers who misprogrammed it are liable for manslaughter. Period.

  9. Kyle

    Hey, if they “drive more efficiently than most humans,” that isn’t saying much.

    1. John Saint-Smith

      Well, then, let’s quantify it. Autonomous cars will have more sensory inputs, no distractions, much faster and more consistent responses, and a higher level of driving ‘skill’/programming. As a result, they will be ten times less likely to be involved in an accident than a fresh, intelligent, safety-minded human, who unfortunately cannot concentrate on two issues at once, let alone make a moral calculation of the kind described here while under their own evolutionary ‘self-preservation’ imperative.

      Integrated autonomous electric vehicle services should replace our present commuter chaos and drunken bogan late-night carnage as soon as possible. Most privately owned vehicles spend 95% of their time parked, using up expensive real estate. When moving, they clog the roads, and their drivers can be guaranteed not to merge or give way sensibly and efficiently, causing thousands of cascading ‘stops’ on the highway and reducing commuting speeds much more than necessary. You would arrive sooner, safer and refreshed, having had time to relax and catch up on reading or correspondence.

      Considering that the insurance, fuel, time, and parking costs would all be reduced dramatically, it is likely that an annual car-service subscription with more reliability and safety than driving your own car would cost less than $5,000 per year, compared with $15,000-20,000 for the privilege of buying, operating, housing and depreciating your existing vehicle.

  10. Robert Massaioli

    Here is a hypothetical: imagine that a self-driving car gets into this situation and that, every time it does, it just explodes instead. It kills the passengers, it kills the pedestrians; everybody in the scenario dies.

    Even with this crazy behaviour, self-driving cars would still be better because, as Charles says below, self-driving cars will only be allowed on the roads if they are 100-1000 times safer than humans anyway. And humans are not great drivers to start off with, so it may turn out to be a low bar.

    That means that, even if ~6 people died in every horrific self-driving accident and each human-caused accident caused a death only once in every five accidents, the numbers work out like this: self-driving car deaths / human deaths = 6 / (100 / 5) = 6 / 20 = 3/10. Or, with the 1000-times-safer figure, 6 / 200 = 3/100.

    That means that a self-driving car would, on average, save somewhere between 70% and 97% of the people that humans would have killed with their driving. And that is with the crazy “explode and kill everybody” behaviour that will never be reality.

    The bottom line is this: if self-driving cars are even marginally better than humans at driving, then we will all have a moral and social imperative to stop driving ourselves and let the trained computers do it.

    If I was standing in a room with ~1000 people in it and I knew that, by mandating self-driving car use, I could save the lives of 700-970 of those people, then I would do it. In a heartbeat.
