Robot cars and the fear gap

Sarah Connor told me that robots are going to kill us all, but I’m weirdly excited about climbing inside one and letting it drive me around. I’m an exception; last year, I wrote about how broader media and community reactions to autonomous vehicles would take shape:

“Humans are responsible for the vast majority of motor vehicle deaths and injuries. In 2010 in Australia, there were 1,248 fatalities associated with motor vehicles — 75% to 90% of these were caused by human error”

“The [media coverage] pattern is simple: the magnitude of a risk is misperceived, individuals and media outlets react, and politicians demand greater regulation”

This tragic trajectory is taking shape. No technology will result in the complete eradication of vehicular fatalities, but given the major role of human error in car accidents, computer assistance and eventual control should lead to a serious reduction in death and injury. For now, that doesn’t matter. We’re set to see a period of skewed perception of risk, thanks to information presented in a way that maximises skew and minimises math.


Recently, Tesla enthusiast Joshua Brown was killed in a horrible accident on an American highway, in which a Model S in ‘Autopilot’ mode failed to brake after a truck turned into the path of the vehicle. Tesla said, in a blog post:

“The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.”


Since then, a variety of articles have criticised Tesla and focused heavily on the incident. A context-free, salient focus on dramatic events is a key component in the exaggeration of risk, something I highlighted in my predictions.

Tesla CEO Elon Musk has asserted, in blog posts and on Twitter, that though the accident is indeed a tragedy, a double standard exists – the Model S has an excellent safety record, and autonomous vehicle technology is something that will almost certainly lead to a serious reduction in road fatalities.

You can feel Musk’s frustration when you scroll through his replies (and when you read Tesla’s follow-up blog post). This has led to criticism, of him and the company, from some ‘Crisis Communications’ experts:

“According to Bernstein, resorting – as Musk did – to statistics to try to put an accident or malfunction which resulted in a death into a wider context, however well-meaning, was ill-judged. “I haven’t seen anybody foolish enough to try the statistics approach in a long time,” he said.

Asked what advice he would give to Musk, Bernstein said that he should “take a step back, take a deep breath, and practice delivering a message that communicates compassion, confidence, and competence”.

“And if you can’t do that, keep your mouth shut,” he added.”

I’ve seen plenty of instances where people lean too heavily on facts, and disregard community perceptions. I’ve also seen the inverse, where important facts are left out in favour of eagerly prodding community reactions.

The important thing here is that Musk’s company is designing and deploying a safety technology into modern cars. This runs parallel to his efforts to transition vehicles away from internal combustion engines and towards electric motors – in essence, another safety feature removing the dangers created by sole reliance on carbon-intensive fuels.

In this context, statistics matter – the crisis communications experts seem only to value community reactions, and fully disregard the question of whether this technology actually works to reduce risk. They are being misled by their instincts, and this hostility towards communicating the mathematics of risk will likely contribute to delays in the deployment of seemingly valuable safety features.
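To make that concrete, the comparison at stake is a simple rate calculation: fatalities per unit of distance driven, with and without driver assistance engaged. The Python sketch below uses purely illustrative placeholder figures – not Tesla’s or any regulator’s actual data – just to show the shape of the arithmetic.

```python
# A minimal sketch of the rate comparison behind 'the mathematics of risk'.
# All figures below are illustrative placeholders, not real Tesla or road-safety data.

def fatality_rate_per_million_miles(fatalities: int, miles_driven: float) -> float:
    """Fatalities per million miles driven."""
    return fatalities / (miles_driven / 1_000_000)

# Hypothetical exposure and outcome figures for the two populations.
assisted = {"fatalities": 1, "miles_driven": 130_000_000}     # driver-assist engaged
unassisted = {"fatalities": 10, "miles_driven": 900_000_000}  # conventional driving

assisted_rate = fatality_rate_per_million_miles(**assisted)
unassisted_rate = fatality_rate_per_million_miles(**unassisted)

print(f"Assisted:   {assisted_rate:.4f} fatalities per million miles")
print(f"Unassisted: {unassisted_rate:.4f} fatalities per million miles")
print(f"Relative risk (assisted / unassisted): {assisted_rate / unassisted_rate:.2f}")
```

Note how fragile the numerator is: one additional crash in the assisted group would double its apparent rate. That fragility is part of why a single dramatic event can dominate coverage while telling us very little about the underlying risk.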

Musk has shunned this evidence-blind approach; hopefully, that’s because deploying this safety feature is more important than bowing to the dominant skew in media coverage of new technology. Ideally, Tesla wouldn’t have to do this – the statistical context would be included in media coverage by default.


There’s a historical precedent for this. An article from a 1984 edition of the New York Times outlines efforts to encourage seatbelt use, and the dramatic reasons people refused to use them:

“Researchers say that misperceptions about the uses of seat belts, in addition to the discomfort of wearing them, contribute to drivers’ reluctance to buckle up. There is a persistent myth, for example, that it is safer to be thrown free of the car than to be restrained by a belt. In fact, the chances of being killed in a crash increase 25 times if an occupant flies from the vehicle. Others fear being trapped by a belt if the car catches fire or falls into water”

Emotionally jarring tales of seatbelts taking the lives of occupants by locking them in burning cars, or the strap causing injury, have been around since the 1960s. Unfortunately, some of these are probably based on real occurrences. In the 1930s, The Gippsland Times editorialised on seatbelts:

“That it is deemed necessary for such a safety measure to be even suggested for self protection of drivers and passengers in motor cars is a regrettable reflection of the frailty of human nature and an indictment against bad and careless driving – the cause of nearly all motor accidents.”

This prescient piece highlights something important, as remarked upon by a psychologist in the NYT article: “if you ask a room full of people to rate themselves as drivers, they all say they are at the top of the distribution”. The Gippsland Times errs in assuming that those who suffer serious accidents are ‘bad’ and ‘careless’ – they’re normal. The real culprits are the limits of perception, reaction time, attention and salience. One person operating a car safely is easy. Millions of humans driving for many hours a day on congested and complicated road networks is something else altogether.


It seems counter-intuitive, but as this safety technology saves an increasing number of lives, we’ll increasingly perceive it as risky and unsafe, and media outlets will continue to maximise this emotionally salient disconnect.

It would be quite nice to skip this gap altogether, but an inherent feature of new technology is the presentation of familiar risks as new and scary threats. If this ‘Hazard-Perception’ gap emerged around seatbelts, it’s definitely going to emerge around people stepping inside computer cars. We pay a lot of attention to warnings presented in the right way; hence the incentive for click-driven media coverage to widen the gap. But, as Michael Barnard writes at Clean Technica:

“This level of [autonomous technology] sophistication and competence leads to drivers using it. A large part of the reason Tesla is in the spotlight for the fatality is because its driver assist features are so much better than its competitors. Simply put, fatalities are a statistical inevitability of driving, but the other manufacturers’ technology hasn’t been used enough that the sad inevitability has occurred yet”

The ‘crisis communications’ experts aren’t talking about the many thousands of daily deaths from cars controlled by human minds because they don’t need to – incumbency has desensitised us to the serious risks that emerge when you blend our brain (with all of its perceptual flaws and cognitive shortcuts) with a sharp, heavy and enormous metal box powered by compressed and refined old, dead plant matter.

In this instance, the lesson is clear: don’t take a binary approach to communications. Sensitivity is vital, but the statistical context has to be championed in a way that is not only accessible, but more interesting and compelling than the story of fear that will, by default, dominate coverage of robot cars.

Ketan Joshi is a communications consultant working with emerging clean energy technologies. You can follow him on Twitter at @KetanJ0.
