Drivers Who Aren’t, But Who Gets the Bill?

Another auto-piloted Tesla crashed into another car (two cars, actually) last month. The Tesla's "driver" (in quotes to underscore the irony) was reportedly "checking on his dog" in the back seat when his car rammed into a Connecticut State Police cruiser and then ping-ponged into another car parked on the shoulder of the road.

Several interesting questions come to mind, again, including one near-inevitability that hasn't been much discussed but that will eventually affect all of us, including those of us who prefer to be drivers rather than meat sacks driven around by autopiloted (and glaucomic) cars.

The first question is: Who's responsible for these crashes? Is it the "driver" who uses technology specifically designed to avoid the need for him or her to drive? Or is it the company which designed the technology that, its protestations and lawyer-ese notwithstanding, encourages the "driver" to abdicate responsibility for driving the car?

Tesla says, of course, that the "driver" must be "ready to intervene at all times," which (for those who remember it) is wink-wink-speak of the kind used to market the catalytic converter "test pipes" you used to be able to buy at auto parts stores back in the '80s. These, too, carried the necessary for-legal-purposes verbiage. But everyone understood the "test" would be ongoing. Just as everyone understands that the whole point of Autopilot is to not pilot.

Else why bother with it at all?

If the driver is expected to keep his eyes on the road, be ready to "intervene" at all times, and be responsible for what the car does, then he is still driving. In which case Autopilot and all other varieties of automated driving tech are elaborate but fundamentally useless gimmicks that ought to be banned for the same reason you can't buy a catalytic converter test pipe anymore.

For a better reason.

Catalytic converter test pipes never hurt anyone. Autopiloted cars have already killed several people, including people not in the Autopiloted cars.

Innocent victims, in other words.

More will be killed, inevitably, as this technology filters into general use. Teslas are not the only cars that have it. Several high-end non-electric cars, including those made by Cadillac, Mercedes, and BMW, have similar technology. And unless it's outlawed, which the government (usually so "concerned" for our "safety") seems very reluctant even to consider, more and more cars will soon have it, because gadget-mania is now the principal "sell" for new cars, electric and not. People love to tap and swipe, and do anything except drive.

Which this technology (it’s necessary to repeat) encourages them not to do.

But when they wreck, or rather, when the car wrecks, who will get the bill? Should it be the "driver," who is still, legally speaking, supposed to be "ready to intervene" at all times, even though it's understood he won't be if he uses the tech? Or will it be the manufacturer of the tech, which at least implicitly encourages the "driver" to not drive, in the manner of perpetually "testing" the catalytic converter?

Well, I've got news for you, sunshine.

The bill will be handed to all of us, including those of us who want nothing to do with automated driving technology but who'll be dunned by the insurance mafia to recover the costs imposed (in metal and flesh) by automated driving technology. Our premiums will rise, not because we imposed any costs, and even though we haven't wrecked, but to "cover" the costs of those who did and have.

If you don’t think so and don’t understand how, allow me to elucidate.

We are already paying more to cover the high repair costs of modern, government-mandated cars, even if we drive older cars with fewer government-mandated "features," such as six air bags and back-up cameras. Because when cars so equipped are wrecked, they cost plenty to fix, and someone's gotta pay.

Guess who?

These costs are too much to be borne by the individual owners of government-mandated cars, so they’re spread out to owners of all cars. Your premiums increase because of what it’ll cost to fix your neighbor’s car if he hits you.

These costs include the higher repair costs imposed by the de facto mandated design of modern cars: paper-thin exterior sheet metal that can be bent by hand (and crumpled beyond repair in minor accidents), plastic front and rear ends that shear off in the same minor accidents, and expensive-to-repair aluminum body parts. All are now common attributes of new/late-model cars, resorted to in order to cut weight and so increase MPGs without compromising the car's ability to pass government-mandated crash tests (the beneath-the-skin structure of the car comes into play here).

The cars pass the tests, but they’re much more expensive to fix. They are not infrequently thrown away because they’re too expensive to fix.

Now fold into the mix the costs (in steel and flesh) of automated car wrecks, which will continue to happen for the same reason that planes still crash and rockets blow up on the launch pad sometimes.

Technology made by fallible creatures isn’t infallible. Things will get broken, and worse—and that means someone’s going to have to pay for it.

And because insurance is mandatory—you won't be allowed to say no to "covering" the cost of fixing other people's automated cars, or the mayhem they cause—the cost of your "coverage" is going to go up.


One Response to “Drivers Who Aren’t, But Who Gets the Bill?”

  1. C. S. P. Schofield says:

    I have always thought the idea of 'self driving' cars was idiotic, for one simple reason. I have been dealing with computers since my Lady bought an Apple II in the 1980s. In that time, I have yet to meet one that didn't crash, freeze, or do some other spastic thing at least once a month. Translate that into computer-run automobiles, and that's quite a pile of wreckage. I know the engineers and programmers claim they will have the bugs out, soon. Right. To err is human. To replicate that error ten thousand times a second requires a computer.