Legal lessons for Australia from Uber’s self-driving car fatality

The Conversation

Consider this thought experiment: a woman pushes a bicycle across a busy road at 10 pm. She is struck by a car and dies.

Would this event usually generate headlines or cause all driving to be halted?

No. Some 5,984 pedestrians were killed by cars in the United States in 2017, so such an accident would be sad but not unusual.

Now, modify the facts so that the car is self-driving, and is owned by Uber. Today’s tragic accident in Tempe, Arizona, which killed 49-year-old Elaine Herzberg, has resulted in headlines and the suspension of Uber’s self-driving cars in Arizona, Washington, and Pennsylvania.

Arizona is a leader in autonomous vehicle (AV) testing, and the state has taken a light-touch approach to regulation of self-driving vehicles. This incident points to the need for further scrutiny of those laws – and Australia should take note.

Reports of the accident indicate that the Uber car was travelling at 40mph (about 64km/h) in full autonomous mode, and that a “safety driver” was on board.

Unsurprisingly, Uber is under attack and there are calls for strong action, including criminal prosecution. But it would be a mistake to overreact, and action against Uber must wait until we have facts. The National Highway Traffic Safety Administration has dispatched a team to investigate the accident.

Self-driving car regulation in Arizona

In March, Arizona Governor Doug Ducey issued an executive order recognising that the state had become a hub for driverless car research and development, with more than 600 vehicles undergoing testing on public roads.

Prior to testing AVs without a human present in Arizona, operators are required to submit a statement to the Arizona Department of Transportation certifying:

  1. compliance with federal and state law and motor vehicle safety standards, and insurance, title, and licensing requirements;
  2. if the automated system fails, the AV will achieve a minimal risk condition.

The second requirement is inapplicable if a human is present. Both the “automated driving system” (robot) and the person (or company) who submitted the statement are subject to all applicable laws, and regarded as licensed to operate the vehicle. The state can issue citations and penalties for violations.

Perhaps the most important part of the governor’s executive order is a directive to create a “law enforcement interaction protocol”, which would provide the meat of the legal regime in cases such as today’s accident.

That protocol does not yet exist, so the accident must be addressed in its absence. As a result, there is doubt about the potential liability of Uber and the safety driver.

Currently, there is no evidence that Uber’s vehicle was defective, or that the safety driver was negligent. Uber may still be held liable under product liability or negligence laws, but it is unlikely that the safety driver will face liability.

Regulation in other US states

Contrast the Arizona order with other state laws.

For instance, legislation in Michigan specifies that the automated driving system is considered the driver of the vehicle, and so a car manufacturer is liable for incidents in which the automated driving system is at fault (as long as no modifications have been made to the car without the manufacturer’s consent).

The legislation in Georgia imposes a “super-insurance” requirement until December 31, 2019. An AV has to be covered by “liability coverage equivalent to 250%” of what is normally required.

Meanwhile, Nevada has adopted perhaps the most detailed law on self-driving cars. In essence, its law deems the owner of the vehicle to be the driver, and therefore legally responsible for the automated driving system.

It’s clear there is a lot of variation in legislation relating to AVs across the US – 33 states introduced laws in 2017 alone. All of this reflects the legal complexities of advancing driverless car technology without sacrificing safety.

Critics argue that there is a “race to the bottom”, with states that have strong tech-industry lobbies at the forefront of light-touch regulation.

How Australia regulates self-driving cars

In Australia, the governments of New South Wales, Victoria, and South Australia have passed laws on the subject. NSW adopted legislation in 2017 – the Transport Legislation Amendment (Automated Vehicle Trials and Innovation) Act 2017.

The NSW legislation’s purpose is to:

  1. authorise the approval of trials of AVs that may not otherwise be permitted;
  2. provide for adequate insurance cover against physical and property damage;
  3. provide modifications to laws that are human-centric.

The minister, Melinda Pavey, may approve applications for AV trials, for specified periods and subject to any conditions.

Crucially, the law requires a public liability insurance policy of at least A$20 million to cover damage caused by, or arising from, the use of the trial vehicle.

Unlike some recent US laws, the NSW law requires a human presence in the vehicle. This “vehicle supervisor” must be approved, and have a valid licence.

The human must be “in a position to take control of the trial vehicle at any time or to stop [it] in an emergency or if required” by law enforcement.

Approvals for trials are conditioned on the operator reporting accident and collision data. Importantly, the human is legally deemed to be the “driver” of the vehicle.

Clearly, the NSW law is a tentative first step – cautious, not fully embracing self-driving technology, and more human-centric. That may not be a bad thing. As the Uber accident today shows, much is unknown in the realm of automated vehicles.

Elaine Herzberg’s death will provide the impetus for clearer liability rules for self-driving cars. In the absence of such clarity, the aggressive tactics of operators such as Uber have the potential to kill AV development. Australia is wise to adopt a wait-and-watch approach and maintain its human-first orientation.

Source: The Conversation. Reproduced with permission.

Comments

11 responses to “Legal lessons for Australia from Uber’s self-driving car fatality”

  1. MaxG

    Populist outcry has always led to overzealous regulation… Why is nobody questioning the fact that the woman suddenly stepped onto the road? Was it even physically possible to stop the car? Then auto or manual drive does not matter. Lots of noise… typically media frenzy…

    1. Joe

      The headline grabber is the fact an autonomous vehicle is involved. Big deal, as if an autonomous vehicle is never, never, ever going to be involved in a traffic incident. But the media shark feeding frenzy pumps the story for maximum attraction to the punters to pile in. All the while the hundreds of people killed on Aussie roads barely register any interest.

      1. Ken Dyer

        The problem is that it gives the right wing liberals ammo. However, before anybody goes overboard we should remind them what Aristotle said, ‘One swallow does not a summer make’.

        1. John Gardner

          One small problem, perhaps. I understand that it was late at night, the vehicle seems to be dark in colour, it makes little if any noise, and the person just stepped onto the road in front of it. Maybe we need a noise generator to alert people that a car is approaching so the vehicle doesn’t surprise them (not a man with a red flag).

          1. Michael Dufty

            It was night time, so the colour of the car shouldn’t make any difference assuming the lights were on. I should think autonomous vehicles make about the same amount of noise as every other vehicle. It was travelling at about 60km/h, so tyre noise would drown out any engine noise; it would make no difference whether the engine was running or not (it is a hybrid, not an EV).

          2. Roger Brown

            I watch YouTube videos of a motorbike with loud pipes (RJ) in central London, filtering through the traffic, and people just walk out in front of him. He is not a quiet rider and likes changing gears up and down. There are always going to be “blind spots”, but the many sensors should have stopped the car, unless it was set up to “save passengers in the car” and not “save pedestrians or passengers in other cars”.

    2. Rod

      Surely with all that gadgetry AVs have a couple of dashcams?
      If they don’t they should, to cover themselves for the situation you suggest.
      Pure physics means AVs can’t avoid every collision.
      As long as they can prove they are safer, even slightly, than a human-controlled vehicle, they should be encouraged.

      1. Michael Dufty

        I think it was at night, so dashcams may not show much. The AV should have sufficient sensors to cope, though; in fact, hard-to-see pedestrians in darkness are exactly the situation where AVs, with their extra sensors, ought to be safer than human drivers.

        1. MaxG

          There are cars fitted with night vision – no idea whether this Volvo was…

    3. Michael Dufty

      Where did you get that information from? I read elsewhere that she had crossed four lanes of a five-lane road before being hit, which suggests she didn’t suddenly step out. I have no idea whether that is true though, just read it on the interwebs.

      1. MaxG

        Different stories everywhere… the latest police info is that the car is not at fault… media frenzy… after all, the Murdoch press would rather have petrol cars going strong… 🙂
