[Image: Tesla Autopilot]
Many years ago I became stuck fast in a line of traffic because the guy in front was obeying the law. It got very ugly. I was in a queue to turn right at a major intersection in Sydney during rush hour, where the routine is for the cars at the head of the queue to move out into the intersection, wait for the oncoming traffic to get the red light, and then turn right as the traffic on the cross street gets the green. Only two or three cars get around at every change of lights, but that’s better than none.

What happened was that the guy at the head of the queue wouldn’t move out into the intersection; he stayed glued behind the line, waiting for a break in the oncoming traffic that never came. He was simply obeying the law, an argument that didn’t sit well with the motorists banking up behind him and blowing their horns. Eventually I pulled out and went straight ahead through the lights, leaving him there. Maybe he’s still there.

[Image: Nissan self-drive car]
I was reminded of this experience recently when I read about cars that drive themselves and what’s happening with them now that they’re out in the real world. The bulk of them are in California, the only place keeping statistics on their accident rates, and the news is surprising: they’re having about twice as many bingles as cars driven by real, live people.

Now the vast majority of these shunts happen at very low speeds and don’t involve injury or even major panel damage, and the experts believe they’re happening because these cars are programmed to obey the traffic laws absolutely, under all circumstances. In other words, they behave like the guy who got us all stuck that morning by refusing to move out into the intersection and wait for the light to turn red before turning.

Actually, the most common accident with self-driving cars happens when two lanes of traffic are merging. Instead of nosing forward, forcing their way into the merge and dovetailing nicely with the cars in the other lane, they sit there and wait for a hole in the traffic to come along. Which never happens. What happens instead is that the cars behind bump into them.
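For the technically minded, a toy simulation makes the point. Everything below is a hypothetical sketch, with made-up gap sizes and thresholds that come from no real car’s software; it just shows how quickly “wait for a safe gap” turns into “wait forever” as the gap you insist on grows:

import random

def wait_to_merge(min_gap, mean_gap=1.5):
    """Seconds spent waiting for a gap of at least min_gap seconds
    to appear in the next lane; gaps are drawn from an exponential
    distribution with the given mean. All numbers are invented
    for illustration, not taken from any real car's software."""
    waited = 0.0
    while True:
        gap = random.expovariate(1 / mean_gap)
        if gap >= min_gap:
            return waited   # a big enough hole finally came along
        waited += gap       # too small; let it pass and keep waiting

random.seed(1)
for label, threshold in [("assertive (1 s gap)", 1.0),
                         ("cautious (4 s gap)", 4.0),
                         ("by-the-book (8 s gap)", 8.0)]:
    trials = [wait_to_merge(threshold) for _ in range(1000)]
    print(f"{label}: average wait about {sum(trials) / len(trials):.0f} s")

With those made-up numbers, the assertive driver merges almost instantly, the cautious one waits around fifteen seconds, and the by-the-book one averages roughly five minutes, which in rush hour might as well be forever. And a car sitting stationary in a merging lane is exactly what the driver behind isn’t expecting.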

So, should the self-driving software algorithms be changed to make such cars a little less law-abiding and a little more aggressive? The lawmakers would certainly have a view on this, and so would the folk stuck in traffic. A self-driving car was recently booked in California for obstructing traffic by going too slowly along a freeway, which led to a problem: who gets the ticket? Is it the guy behind the steering wheel (who can legally say he wasn’t the driver) or the guy who wrote the software, who wasn’t driving either?

[Image: Tesla Model S]
And so we reach the favourite ethical problem of the whole self-driving car debate: when a self-driving car is faced with the choice of either killing its occupants by running into a wall or mowing down a group of schoolkids, what does it do? At the moment the great bulk of them simply decide this is too hard and hand control back to the person behind the wheel. Which is fine, as long as that person is ready to act within a split second and isn’t in the middle of reading email or decanting hot coffee from flask to cup.

It’s just such dilemmas, the experts agree, that will keep self-driving cars from being anything but oddities for at least the next decade. A clutch of cars may now be good at parallel parking or staying in their own lane on a freeway, but when it comes to negotiating a tight, off-camber corner on a dark back street or handling a dirt road, well, if you’re behind the wheel, don’t stop paying attention. Early self-driving Teslas had a penchant for suddenly taking freeway exits.

Face it: when all this is fixed and self-driving cars are everywhere, the stark reality is that they’ll be about as exciting as riding around with your granddad.

Published February 2017
