On Self-Driving Cars and Illusory Superiority

Now that Uber has revealed its first self-driving car, and we’ve got Google, Volvo, Tesla, and probably Apple working on self-driving cars, I’d like to move from the ride sharing/hailing discussion to the self-driving car discussion. This is another one of those topics I stomp around the house having opinions about but have never written up or really talked about publicly, so let’s talk about how I see this evolving.

The part of the self-driving car evolution that interests me most is the human reaction to the idea. The technology will arrive much faster than our ability to accept giving up control of our cars, and certainly well in advance of the legislation. Let’s talk about why that is.

Illusory Superiority

The biggest problem is that humans think they’re excellent drivers. In a 1981 study by Ola Svenson, 88% of the US participants and 77% of the Swedish participants put themselves in the top 50% of drivers for safety. Think about that: more than 80% of people said they were safer than the average driver. The results of this study were replicated in 1986 by McCormick, Walkey, and Green, who again found about 80% of subjects rating themselves as better than average drivers.

But it gets better. In a study by the car insurance company CheapCarinsurance.net, 38% of drivers believed they were above average WHILE TEXTING. Given the source, I can’t vouch for how scientific that study was, but it’s backed up by other studies that tell us how many people do text while driving. They must think they’re above average.

This phenomenon in psychology has a name: illusory superiority. It’s a cognitive bias whereby individuals overestimate their own qualities and abilities relative to others. We don’t do this just about driving; we think we’re smarter than average, we think our job performance is better than average, and we think our habits are healthier than everyone else’s. Basically, we’re delusional.

What terrifies me is that I think I’m a below average driver, so I must be truly dreadful!

Ok, let’s say that you can’t let go of this illusory superiority, that you really are the best driver on the road. I hope I’ve at least convinced you that everyone else is delusional about their skills. As a result, they think they’re awesome drivers while texting and talking on the phone and yelling at the kids in the back seat and eating a sandwich and shaving and putting on makeup and every other idiotic thing we’ve seen people do while driving. No matter how skilled and attentive you are, you’re still in danger because there are humans behind the wheel.

Who Should the Car Save?

Many people are asking an interesting question about self-driving cars, which Tom Merritt called “The Car-Bayashi Maru,” or the unwinnable scenario. The question is what a self-driving car should do if faced with the choice between the certain death of its passenger and the certain death of multiple other people. First of all, why has no one responded to this question with, “The needs of the many outweigh the needs of the few”? Seriously missed Star Trek quote opportunity there.

The reason this question gets my ire up is that we never ask humans this question! What would I do in that scenario? What would you do? I bet that among all the people reading or listening to this we could get a really nice sampling of responses and still not have the right answer. Why do we hold self-driving cars to a higher standard than ourselves? I also question a) the probability of that exact scenario ever happening, and b) whether, even if it did happen, we could be certain that death would occur on either path.

If a truck with a giant metal plate on the back comes to a dead stop in front of me on the freeway, and I can be pretty certain this will end badly for me, there is zero chance that I won’t try to swerve. That might increase the chances of someone else dying horribly, but I don’t know that for certain, so how could a self-driving car know it? I find myself getting spun up every time this question is posed as though it were a justification for not having self-driving cars. If we’re just talking about it for the fun of it, ok. But I think asking this question increases people’s fear of this new technology.
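To make that concrete, here’s a minimal sketch in Python of the kind of reasoning a car can actually do. It never knows an outcome for certain; the best it can do is compare expected harm across the maneuvers available to it, using probability estimates. The maneuvers and numbers below are entirely hypothetical and not any manufacturer’s real logic.

```python
# Hypothetical sketch: a car compares expected harm across possible maneuvers
# instead of choosing between two "certain" deaths. All numbers are made up.

def expected_harm(outcomes):
    """outcomes: list of (probability_of_harm, people_at_risk) for one maneuver."""
    return sum(p * people for p, people in outcomes)

# The truck-stopped-ahead scenario described above, with invented probabilities.
maneuvers = {
    "brake_hard": [(0.6, 1)],              # 60% chance the single occupant is hurt
    "swerve_left": [(0.2, 1), (0.15, 2)],  # lower risk to occupant, some risk to two others
}

best = min(maneuvers, key=lambda m: expected_harm(maneuvers[m]))
print(best, {m: expected_harm(o) for m, o in maneuvers.items()})
```

The point isn’t the particular numbers; it’s that “certain death either way” is a framing the car itself never sees.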

Survey Says

The last thing that spins me up about the self-driving car discussion is the surveys. Most of the surveys out there seem to ask questions in the form, “How afraid are you of giving up complete control to a machine?” Well, what do you think the results of that survey will show?

In “Thinking, Fast and Slow” by Daniel Kahneman, the book Dr. Garry suggested we read, he talks about the concepts of anchoring and priming. The way questions are asked and the information you’re given beforehand influence your answers.

Kahneman references a study by German psychologists Thomas Mussweiler and Fritz Strack that demonstrated the anchoring effect. They asked two separate groups to estimate the annual mean temperature in Germany. One group was asked whether the mean temperature was higher or lower than 20°C; the second group was asked whether it was higher or lower than 5°C. The two temperatures, 5°C and 20°C, were the anchors. The subjects were then briefly shown a series of words. Those who had been given the 20°C anchor found it easier to recognize summer words like sun and beach, while those given the colder anchor more easily recognized words like frost and ski. Obviously the two tasks had nothing to do with each other, but the temperature the subjects were thinking about changed their performance on the unrelated task.

The effect of anchoring can be even more dramatic than in the experiment I just described. From the same book: German judges with an average of more than 15 years on the bench were given a pair of dice to roll. The dice were loaded to come up as either a 3 or a 9. After rolling the dice, the judges were asked about a hypothetical case of a woman caught shoplifting, and they had to say whether her sentence should be more or fewer months than the number shown on the dice. These are judges with 15 years of experience, and yet on average, those who rolled a 9 said they’d sentence her to 8 months, while those who rolled a 3 averaged 5 months. The roll of the dice affected their hard-earned judgment even though they knew it was completely uncorrelated with the case.

You can see how this concept of anchoring is so easy to use in manipulating surveys!

I would like to see the results of a self-driving car study that had the following questions:

  1. How do you feel about the time you spend on your daily commute to and from work?
  2. Do you find other drivers generally polite on the road?
  3. Have you ever been in an accident because of another person’s stupidity?
  4. Do you enjoy being the designated driver when you and your friends go out in the evening?
  5. Do you wish you could get work done, chat with friends online, or watch movies while driving?
  6. Do you think self-driving cars sound like a good idea?

I’m betting we would see completely different answers to the questions surrounding self-driving cars! Sure, there are people out there (I’m looking at you, Steve) who will still say they enjoy driving, but I think a huge percentage of people, given those things to think about first, would answer positively about self-driving cars.

First-Hand Experience

I’d like to close this out with my impressions from first-hand experience in a Tesla Model S with the autopilot feature. Our buddy Ron recently upgraded to the most recent model so that gave Steve and me a chance to experience what it could do.

Autopilot on the Tesla is designed for highway driving. The Tesla reads speed limit signs, automatically adjusts its distance from the car in front, and maintains a safe following distance while traveling at the posted speed limit (unless you tell it to go faster). At one point Ron took his hands off the wheel, and a warning came up telling him not to be stupid and to put his darn hands back on the wheel! It was more polite than that, but it was insistent.
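For the curious, here’s a rough sketch of the kind of rule a traffic-aware cruise system might follow: hold the posted limit (plus whatever offset the driver allows) unless the car ahead forces a slower speed. This is only an illustration of the behavior we observed, not Tesla’s actual Autopilot code, and the time-gap numbers are invented.

```python
# Illustrative only: pick a cruise speed from the speed limit and the gap to
# the car ahead, assuming a simple time-gap rule. Not Tesla's implementation.

def target_speed(posted_limit_mph, driver_offset_mph,
                 lead_gap_s, desired_gap_s, lead_speed_mph):
    """Return the speed to hold given the limit and the car in front."""
    cruise = posted_limit_mph + driver_offset_mph
    if lead_gap_s < desired_gap_s:
        # Too close to the car ahead: fall back to its speed if that's slower.
        return min(cruise, lead_speed_mph)
    return cruise

print(target_speed(65, 5, lead_gap_s=1.2, desired_gap_s=2.0, lead_speed_mph=55))  # -> 55
print(target_speed(65, 5, lead_gap_s=3.0, desired_gap_s=2.0, lead_speed_mph=55))  # -> 70
```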

If you put on your turn signal, the car will actually check traffic and, when it’s clear, move over one lane in the direction you selected. Ron tested that out several times and it worked flawlessly. It was probably the scariest thing to let the car do, but it did a great job every time he tried it.
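Again purely as an illustration (and definitely not the real Autopilot logic), a signal-initiated lane change boils down to a check like this: only move over when the gaps ahead and behind in the target lane are clear and nobody is closing in fast. The thresholds below are hypothetical.

```python
# Hypothetical gate for a signal-initiated lane change; thresholds are made up.

def safe_to_change(gap_ahead_m, gap_behind_m, closing_speed_mps,
                   min_gap_m=20.0, max_closing_mps=3.0):
    """True if the target lane has room and no one is closing in too quickly."""
    return (gap_ahead_m >= min_gap_m
            and gap_behind_m >= min_gap_m
            and closing_speed_mps <= max_closing_mps)

print(safe_to_change(gap_ahead_m=35, gap_behind_m=28, closing_speed_mps=1.0))  # True: change lanes
print(safe_to_change(gap_ahead_m=35, gap_behind_m=12, closing_speed_mps=1.0))  # False: wait
```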

We also gave Autopilot a try while driving in stop-and-go traffic on the freeway. It’s recommended only above 18 mph, but we did it anyway in slower traffic (see how humans are the real problem?). I have to say it was the best part of the self-driving experience for me. Ron was able to relax and let the car speed up, slow down, come to a complete stop, and start back up again, which removed the aggravation that is most of the driving we do in Los Angeles.

So overall, Steve, Ron, and I were delighted with the experience of a semi-automated car, but one thing did scare us a bit. We were going probably 60 mph on the freeway when there was a slight dip on the left side of our lane. After a lot of cars go through the same dip, they leave a little bit of oil, so dips are darker than the flat parts of the road. This confused the Tesla, and it swerved rather violently toward the dip. This happened twice in about two hours of driving, and it was enough to convince Ron that they really mean it when they insist you keep your hands on the wheel. It sobered us up a bit from suggesting that the technology is totally ready.

Conclusion

I’m a huge fan of the concept of self-driving cars, and I think they can’t get here soon enough. I picture a world where the elderly can get out and have fun and go to the doctor without assistance, where teenagers can text while being transported, where commuters don’t experience road rage from 2 hour commutes, and where no one ever drives drunk again.

Sadly, this week a Tesla Model S driver was killed while the car was in Autopilot mode. A tractor-trailer made a left turn in front of the Tesla, and the car failed to apply the brakes. The investigation into this incident has only just started, so we don’t know whether it would have been avoidable had the driver been in control. It could have been the fault of the algorithm, the tractor-trailer driver, or even the Tesla driver himself, but we don’t really know yet.

What we do know is that in 2015, 38,300 people died in vehicle-related accidents in the US, up 8% from 2014. That’s the biggest percentage increase in 50 years. No matter what our illusory superiority tells us, we are not getting better at driving, and we are not getting safer.

Let’s get this self-driving car thing going, and let’s do it as quickly as we possibly can.

2 thoughts on “On Self-Driving Cars and Illusory Superiority”

  1. Bob Goodrich - July 1, 2016

    Good writing!

  2. Pete Myers - July 2, 2016

    The “who should the car save?” question irritates me too. Any halfway decent algorithm will make that decision better than 99% of the drivers out there.
