Who Gets The Ticket?

What happens when the driver isn’t “speeding,” but the car is?

Who gets the ticket?

It’s a question bound to come up soon in traffic court given the fact that some new cars—like Sammy Hagar back in the day—can’t drive 55.

Or rather, a driverless car can be tricked into driving 85 when the speed limit is 35.

How do you trick a car into driving 50 MPH faster than the posted speed limit?

By using black electrical tape to change the “3” to an “8” on a speed limit sign.

McAfee (the computer anti-virus software company) researchers Shivangee Trivedi and Steve Povolny did exactly that, and it fooled the apparently not-so-smart automated speed control system built into a late-model Tesla electric car.

Its Traffic-Aware Cruise Control uses a camera to “read” speed limit signs and automatically adjusts the car’s speed up or down accordingly, without any action by the driver. When the camera “saw” 85, it raised the car’s speed toward 85, or at least tried to.
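For the curious, here is a minimal sketch, in Python, of why the trick works. It rests on an assumption, not on Tesla’s actual code: that the cruise set speed simply follows whatever number the sign-reading camera returns, with no sanity check on whether that number makes sense for the road.

```python
# A minimal sketch (not Tesla's code) of how a sign-reading cruise control gets
# fooled: the set speed simply tracks whatever number the camera reports, so a
# taped-over "35" read as "85" becomes the new target.

from typing import Optional

def update_set_speed(current_set_mph: int,
                     camera_reading_mph: Optional[int]) -> int:
    """Adopt the camera's speed-limit reading as the new cruise set speed."""
    if camera_reading_mph is None:       # no sign detected; keep current setting
        return current_set_mph
    return camera_reading_mph            # trust the reading, right or wrong

def update_set_speed_checked(current_set_mph: int,
                             camera_reading_mph: Optional[int],
                             map_limit_mph: int) -> int:
    """Same idea, with a hypothetical cross-check against mapped limit data."""
    if camera_reading_mph is None:
        return current_set_mph
    if abs(camera_reading_mph - map_limit_mph) > 15:   # wildly off? ignore it
        return current_set_mph
    return camera_reading_mph

print(update_set_speed(35, 85))              # -> 85: the tape trick "works"
print(update_set_speed_checked(35, 85, 35))  # -> 35: the bogus reading is rejected
```

The second function shows one plausible safeguard: cross-check the camera against mapped speed-limit data and throw out readings that are wildly off. Whether any particular car actually does this is not something this sketch claims.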

The researchers hit the brakes to prevent it from running amok, but the point was made. If this hadn’t been a test, and the car had “seen” 85 rather than 35 while the driver wasn’t paying attention, the result could have been a very big ticket or a very bad accident.

Possibly both.

Teslas aren’t the only cars with this type of tech, either. It’s a building block of autonomous or automated driving tech, which, as the McAfee researchers just established, isn’t infallible and can be just as dangerous as a reckless driver.

Especially when the driver isn’t paying attention to a recklessly driving driverless car.

Given that the whole point of automated driving tech is to enable the driver not to drive, and given that technology is no more infallible than the human beings who create it, it’s inevitable that automated cars will drive fallibly.

And fatally.

There have already been several fatalities involving auto-piloted cars, so it’s not a hypothetical. It’s an actual. And it is probable there will be more accidents and fatalities as more and more cars are fitted with partially and fully automated driving technology.

And drivers pay less and less attention to driving.

Almost all new cars have some degree of automated driving tech already, including a very questionable feature called Lane Keep Assist/Steering Assist. It is meant to help the driver keep the car in its travel lane by using cameras that “see” the double yellow lines painted on the road to the car’s left and the white line to its right, demarcating the shoulder. The idea is that when the inattentive driver wanders too close to either painted line, the car will steer itself back into the travel lane, using electric motors attached to the steering gear.
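To make that dependence on painted lines concrete, here is a rough sketch, in Python, of the sort of correction logic involved. It is purely illustrative, not any manufacturer’s implementation, and the numbers are made up:

```python
# Illustrative lane-keep logic only; not any automaker's implementation.
# The steering nudge depends entirely on the camera finding the painted lines.
# If they're too faded to detect, there is nothing to correct against and the
# "assist" quietly does nothing.

from typing import Optional

def lane_keep_correction(left_line_offset_m: Optional[float],
                         right_line_offset_m: Optional[float],
                         drift_threshold_m: float = 0.3) -> float:
    """Return a steering nudge (positive = steer left, negative = steer right),
    or 0.0 if the lane lines can't be seen or the car is centered well enough."""
    if left_line_offset_m is None or right_line_offset_m is None:
        return 0.0                  # faded or missing lines: no correction at all
    if left_line_offset_m < drift_threshold_m:
        return -1.0                 # drifting toward the double yellow: nudge right
    if right_line_offset_m < drift_threshold_m:
        return 1.0                  # drifting toward the shoulder: nudge left
    return 0.0                      # comfortably inside the lane

print(lane_keep_correction(0.2, 1.4))    # -> -1.0, steered back toward the lane
print(lane_keep_correction(None, None))  # -> 0.0, no lines seen, no help coming
```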

That’s how it’s supposed to work.

It often works differently in the real world. Or not at all, if the painted lines are too faded to be seen by the camera. A driver who expects the tech to take over and does not monitor closely might just find his or her car in the opposite travel lane or even veering off the road.

The system also sometimes pulls the car in a direction the driver doesn’t want to go, which then forces the driver to fight for steering control.

This can lead to loss of control, especially if the driver is startled by the car countersteering and over-reacts to it.

There is also something called Automated Emergency Braking (or Automated Collision Avoidance), which is supposed to apply the brakes in an emergency, as when the driver fails to react quickly enough to a looming threat, such as another car suddenly pulling in front of you or the car ahead braking hard before you have time to react.
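For readers who like to see the moving parts, here is a bare-bones sketch of the kind of time-to-collision calculation such a system might use. It is an assumption-laden illustration, not any automaker’s actual algorithm:

```python
# A bare-bones time-to-collision (TTC) sketch of automated emergency braking.
# Illustrative only; not any automaker's algorithm. Note that the logic only
# knows what the sensors report: any detection below the threshold triggers
# braking, whether the "object" is real or not.

from typing import Optional

def should_emergency_brake(range_to_object_m: Optional[float],
                           closing_speed_mps: float,
                           ttc_threshold_s: float = 1.5) -> bool:
    """Brake hard if the computed time-to-collision drops below the threshold."""
    if range_to_object_m is None:        # nothing detected ahead
        return False
    if closing_speed_mps <= 0:           # object is pulling away or keeping pace
        return False
    time_to_collision_s = range_to_object_m / closing_speed_mps
    return time_to_collision_s < ttc_threshold_s

print(should_emergency_brake(20.0, 25.0))  # -> True: genuine cut-in, brake hard
print(should_emergency_brake(None, 0.0))   # -> False: clear road, no braking
```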

But sometimes the car auto-brakes or even auto-stops when there’s no emergency or any reason to slow down.

This writer experienced both of these scenarios last summer while driving a brand-new press car equipped with the tech. The car slammed on the brakes and came to a complete stop in the middle of the road. There were no other cars around—luckily for me. If there had been a car behind me, it might have accordioned me.

A National Highway Traffic Safety Administration investigation is currently looking into similar problems reported by dozens of drivers of cars equipped with this tech, which is increasingly not optional but part of a new car’s suite of standard “safety” features.

It all raises several vital questions. Most of all: who is responsible, legally and morally, when an automated car violates a traffic law?

Or causes an accident?

Cars already in circulation that have automated driving tech come with caveats and asterisks insisting that the driver is fully responsible for the vehicle and for driving it, and that the driver must always be “fully attentive” and “prepared to intervene at any time” if there is a problem.

For many, the only reason to have automated driving capability is precisely not to have to pay attention to driving. Expecting drivers of cars equipped with automated driving tech to not text, check email, or even take a nap is as absurd as expecting a kid given free rein of a candy store not to eat any candy.

These issues are going to have to be hashed out soon, as more and more cars are fitted with automated driving tech – whether we want it or not.

They should have been hashed out before any cars with automated tech were allowed to “beta test” on public roads rather than on closed-to-the-public test tracks.

Comments?

www.ericpetersautos.com

Photo attribution: BP63Vincent licensed under the Creative Commons Attribution-Share Alike 3.0 Unported license.
