
Photo caption: A Tesla Model S is displayed inside the new Tesla flagship facility in 2016. Tesla’s less expensive models have the capability of driving longer distances, but a computer program restricts the battery power from being accessed. (Justin Sullivan, Getty)

I was getting ready to embark on a 550-mile road trip in a $145,000 Tesla Model S. I’d be driving through the twisted passes of the Sierra Nevada in California, and couldn’t imagine a more enjoyable vehicle for the journey. But when I mentioned the trip to friends and family, all they wanted to talk about was a recent crash involving Autopilot, Tesla’s version of cruise control — on steroids.

“Is it safe?” they asked.

Even my wife said, “Don’t do any dangerous Autopilot stuff.”

Autopilot was engaged during a horrific broadside accident made public in June, and it’s no exaggeration to say that it has changed the way people think about self-driving cars. It was only a matter of time before something like this happened — cars kill people all the time — but that didn’t stop the swift and stern rebuke that followed.

This is a real problem for Tesla because Autopilot is, at its core, a suite of features designed to make driving safer: lane-keeping, emergency braking, collision avoidance — it pays attention when you don’t. The Model S is, quite possibly, the safest car on the road today, with or without Autopilot.

The first thing to know about a Tesla on Autopilot is that it is not a self-driving car. Think of it as the next level of cruise control. Pull the lever once, and the car takes over acceleration and deceleration. Pull the lever twice, and it takes over the steering, too. Under the right conditions, Autopilot will accelerate itself from a dead stop, keep you locked in your lane during hairpin turns, and slam on the brakes to avoid collisions. It handles stop-and-go traffic beautifully. The limitations, however, become clear pretty quickly — like when I almost plowed into an SUV.

As I settled into my long mountain ascent, I engaged the car’s turn-signal feature, which changes lanes at the flick of a finger. The sensors failed to register a gold-colored SUV beside me, and the car would have driven into its path had I not taken over.

Normally, when changing lanes, the car will sense adjacent traffic and wait for an appropriate time to merge, but the nearsighted sensors didn’t see this one. The car’s front-facing radar and camera can “see” much farther than the sensors on the rear and sides of the car, which have a range of about 16 feet.

For another example of Autopilot’s limitations, take it for a spin along the scenic mountain switchbacks that surround Lake Tahoe. No, actually, don’t. The narrow roads are literally eroding off the mountain in places, and the lane markers are faint. I engaged Autosteer to see what would happen and had to immediately take over before it drove me right off the edge of the dazzling cliffs of Emerald Bay.

These may seem like egregious “failures” of an unsafe system — but compared with what? It’s only a failure if you’re thinking about a Tesla as a self-driving car that just isn’t up to the task. It’s not that car. Consider this: If a Toyota driver had standard cruise control set for 70 mph on the highway and failed to take over and reduce speed for a 25 mph turn, would we blame the cruise control for the resulting crash? Relinquishing full control to Autopilot is no different.

When the conditions are right, Autopilot unburdens us of the most tedious tasks of driving. The machine maintains a fidelity to the center of the lane. It stops and goes with traffic and adjusts with the speed limit. It liberates drivers to look up, enjoy the view, groove to the music and engage with their children. And that, of course, is Autopilot’s biggest weakness: the distractible human driver.

The driver in the fatal Autopilot crash was reportedly watching a movie when his car hit the side of a 50-foot semi. He never applied the brakes. He knew Autopilot’s limits well — limits that a driver can choose to ignore. Autopilot makes that easy.

When I drove the Model S with a careful eye on the road, there’s no question that I was a better driver with Autopilot engaged. There’s also no question I spent less time with a careful eye on the road. Tesla should do more to make sure drivers “check in” with the steering wheel more often, and should be more liberal about making Autosteer unavailable on poorly marked lanes and in uncertain conditions.

In this brave new world, there may be a window of time in which crashes caused by inattentive Autopilot drivers outnumber those prevented by the feature. But just as we don’t remove radios and standard cruise control from cars — both of which can also lead to inattentive drivers — there’s an argument that we shouldn’t hold back autonomous driving. By next year, Tesla will have collected data from a billion miles of Autopilot use, and it won’t be long before the lives saved may vastly outpace the risk from the imperfect human.
