What the Tesla Autopilot crash means for driverless cars – Mashable

Tesla Model S 70-D electric car is test driven in Detroit.

Image: Carlos Osorio/AP
By Nick Jaynes | 2016-07-01 18:15:14 UTC

The fact that a driver was killed in a crash in his Tesla while the semi-autonomous Autopilot system was engaged, a collision that occurred in May and came to light this week amid a federal investigation, is tragic. Beyond the tragedy, the collision will have ripple effects across Tesla and the rest of the automotive industry.

That’s because it calls into question people’s trust in self-driving systems. While it’s fair for people to feel wary of Tesla Autopilot and other semi-autonomous driving systems, it’s important they don’t conflate those systems with truly autonomous cars, or use the incident to write off self-driving vehicles altogether.

Tesla’s blog post explaining the crash that took the life of one of its drivers while its Autopilot system was engaged is impressively detailed. From the description, we learn it was essentially a perfect storm of circumstances.

Whether or not the driver was distracted is only one factor to take into account.

According to Tesla, the Model S was traveling on a divided highway when a tractor trailer crossed in front of the car. The white side of the trailer apparently blended into the brightly lit sky, and because of that neither Autopilot nor the driver saw it. Autopilot also missed it because the car was lined up with the middle of the trailer, where there are no wheels, and the Tesla’s low-mounted sensors look toward the road ahead, not high above it. That allowed the car to drive underneath the trailer, killing the driver.

Many are saying, “Clearly, this driver wasn’t watching the road, like he was supposed to. How could he miss that truck?” But we weren’t there. Maybe any one of us would have missed the truck, too, as unlikely as that sounds. That said, it’s possible (perhaps even probable) he was distracted.

It turns out, though, that whether or not the driver was distracted is only one factor to take into account in the context of autonomous driving. Obviously, carmakers need to get the tech right before putting it in the hands of drivers. And even when they do, there’s a larger question of whether that tech, specifically semi-autonomous systems, might lull drivers into a false sense of security.

Beta testing with cars

Tesla clearly states that Autopilot is in beta. Anyone with an iPhone or PC probably isn’t bothered by that — public betas happen all the time. Apple’s Siri is a great example of a beta project that was put in the hands of customers.

That’s essentially what’s going on here with Autopilot. It’s in beta, meaning it’s not finished. That’s why Tesla warns drivers that they still need to pay attention while Autopilot is engaged. But it’s becoming fairly clear that the better your semi-autonomous system is, the more drivers are tempted to “check out” from the experience.

In the world of autonomous driving, there are levels, measured from 0 to 5. Level 0 is traditional driving, with no automation at all. Level 1 is a single assist feature, such as autonomous emergency braking, where the car can independently hit the brakes to prevent an imminent collision; that kind of tech is slowly being added to most new cars. Level 2 pairs automatic cruise control with some automated steering, but the driver has to keep supervising. Level 3 is autonomous driving in certain situations, like on the freeway, with the driver expected to take back control when the car asks. Level 4 is full autonomy within defined conditions, driving from your garage to your destination without you touching a steering wheel or a pedal. Above that, there’s Level 5, where the car can drive itself anywhere, in any conditions, and no human driver is needed at all (think Google’s steering wheel-less prototypes).

Tesla’s Autopilot is considered Level 2. It uses cameras and radar to see lanes and cars and can steer the car through most everyday freeway scenarios. Some other manufacturers, like Mercedes and Volvo, have deployed Level 2 systems as well.
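To make that taxonomy concrete, here is a minimal sketch of the levels in code. It is purely illustrative and assumes nothing beyond the descriptions above; the names and the supervision rule are not drawn from any carmaker’s software.

    from enum import IntEnum

    class AutonomyLevel(IntEnum):
        """Driving-automation levels, as described in the article."""
        NO_AUTOMATION = 0           # traditional driving
        DRIVER_ASSISTANCE = 1       # a single assist, e.g. automatic emergency braking
        PARTIAL_AUTOMATION = 2      # cruise control plus some steering; driver supervises
        CONDITIONAL_AUTOMATION = 3  # self-driving in limited situations, e.g. freeways
        HIGH_AUTOMATION = 4         # no driver needed within defined conditions
        FULL_AUTOMATION = 5         # no driver needed anywhere

    def driver_must_supervise(level: AutonomyLevel) -> bool:
        """At Level 2 and below, the human is always responsible for watching the road."""
        return level <= AutonomyLevel.PARTIAL_AUTOMATION

    # Autopilot, as the article notes, sits at Level 2.
    print(driver_must_supervise(AutonomyLevel.PARTIAL_AUTOMATION))  # True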

The better your semi-autonomous system is, the more drivers are tempted to “check out” from the experience.

The difference between Level 2 and Level 4 is immense. Full autonomy requires more than forward-facing sensors like those on the Tesla Model S; it incorporates 360-degree computer vision across multiple parts of the spectrum, enabled by cameras and LIDAR (laser-based ranging), as well as car-to-car communication. Autopilot does not have most of this tech.

What’s more, these systems and others, including the car’s battery, need to have backups, so that if one of them fails the car can still handle itself. Only with all these things onboard should a driver be allowed to really disengage from driving, because the whole idea is that, no matter what happens, the car can handle it. That’s why Volvo has said it will accept liability for its cars when they are in fully autonomous mode; a carmaker wouldn’t make that claim if it weren’t fully confident in its cars’ capabilities.
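As a rough illustration of that redundancy argument, here is a hypothetical sketch, not any automaker’s actual code, of how a system might fall back to a backup reading and degrade safely when a sensor fails. All names and thresholds are invented for the example.

    from typing import Callable, Optional

    def read_with_fallback(primary: Callable[[], Optional[float]],
                           backup: Callable[[], Optional[float]]) -> Optional[float]:
        """Return the primary sensor reading, or the backup's if the primary fails."""
        value = primary()
        return value if value is not None else backup()

    def plan_action(distance_to_obstacle: Optional[float]) -> str:
        """If every redundant source has failed, degrade safely instead of guessing."""
        if distance_to_obstacle is None:
            return "slow down and hand control back to the driver"
        if distance_to_obstacle < 30.0:  # meters, illustrative threshold
            return "brake"
        return "continue"

    # Example: the primary camera fails (returns None), the backup radar still reports 25 m.
    reading = read_with_fallback(lambda: None, lambda: 25.0)
    print(plan_action(reading))  # "brake"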

Despite being a Level 2 system, Tesla’s Autopilot is treated by many drivers as if it were Level 3 or 4. As I’ve experienced first-hand, and as we’ve seen from Tesla drivers, people feel Autopilot is so in control that they disengage: they look away, start texting or even take a nap. Behind the wheel of an Autopilot-fitted Tesla, you feel like you can pretty much check out. That is, if you don’t know any better.

Tesla did say in its blog post that Autopilot “makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected.” I never experienced this when I tested the system, but that was prior to an update Tesla pushed out to Autopilot in February. It’s unclear whether those alerts didn’t happen, whether the driver ignored them, or whether he simply tapped the wheel now and then without looking up. Or, again, maybe he was paying attention; we just don’t know yet.
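To give a sense of how a hands-on check like that could work, here is a simplified, hypothetical sketch of an escalating nag timer. The thresholds and responses are invented for illustration; Tesla has not published how Autopilot’s alerts are actually implemented.

    def hands_on_alert(seconds_since_hands_detected: float) -> str:
        """Escalating reminders the longer torque on the wheel goes undetected.
        Thresholds are made up purely for illustration."""
        if seconds_since_hands_detected < 15:
            return "none"
        if seconds_since_hands_detected < 30:
            return "visual warning on the instrument cluster"
        if seconds_since_hands_detected < 45:
            return "audible chime plus visual warning"
        return "slow the car and require the driver to take over"

    for t in (10, 20, 40, 60):
        print(t, "->", hands_on_alert(t))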

Other semi-autonomous systems from Volvo, Honda and Mercedes are as technically advanced as Autopilot. But for safety reasons, they don’t allow you to remove your hands from the wheel for long periods of time. Those other automakers have also purposely limited the car’s ability to correct itself, instead relying on the driver to do things like keep the car from drifting into another lane. That’s partly why those systems aren’t considered beta, like Tesla’s is.

In fact, the Mercedes-Benz E-Class’s Drive Pilot system is so capable that it earned the first autonomous driving license ever granted to a production car in Nevada. But Mercedes didn’t ship that full capability to customers. Instead, it dialed the system back significantly so that drivers weren’t just told, but actually understood, that they, not the car, were ultimately in charge of the driving.

Setback for driverless cars?

Let’s not forget that this crash was a human tragedy. The driver was a Tesla enthusiast with a family, and whatever the fallout is in the car industry, it’s terrible this happened at all. But there’s also a real fear that the incident will derail self-driving tech. Not long ago, a Jaguar engineer told me he worried that an incident with Autopilot could set back autonomous driving by a decade.

The endgame of autonomy is zero road fatalities.

Ultimately, though, the crash isn’t a condemnation of automated driving, which has real potential to save lives. Going by average statistics, 87 people died on the road in America yesterday, and 90% of those were due to human error. Using one death to suggest that semi-autonomous tech isn’t safe is misinformed at best. The endgame of autonomy is zero road fatalities.
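For readers who want the back-of-the-envelope math behind that daily figure, here is a quick sketch. It assumes roughly 32,000 U.S. road deaths a year, a figure in line with federal statistics from this period, which works out to about the 87-a-day average cited above.

    annual_us_road_deaths = 32_000  # approximate annual U.S. figure, assumed for illustration
    human_error_share = 0.90        # share of those deaths attributed to human error

    deaths_per_day = annual_us_road_deaths / 365
    human_error_deaths_per_day = deaths_per_day * human_error_share

    print(round(deaths_per_day))              # roughly 88, in line with the article's 87 a day
    print(round(human_error_deaths_per_day))  # roughly 79 attributable to human error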

But it’s a fair question whether a sufficiently powerful semi-autonomous system like Autopilot instills too much false confidence in the car’s ability to drive itself, leading drivers to disengage more than they should. During testing of its self-driving cars, Google found that its drivers, who were paid to pay attention to the road, would disengage because they essentially thought, “The car’s got it.” If disengagement is the norm for Level 2 systems (as I suspect it is for many drivers), this tragic crash could be just the first.

The NHTSA is investigating the fatal wreck and Autopilot’s role in it. When it’s done, the government may be forced to write more laws that pertain to self-driving systems. That could mean the end of the questionable practice of beta testing semi-autonomous driving systems, and a wider acknowledgement that those systems have to do everything possible to ensure drivers keep their eyes on the road.

Have something to add to this story? Share it in the comments.


One comment

  1. Raymond Kawakami

    The accident occurred back in May, not earlier this week. We are just hearing about it now because of the NHTSA investigation. As far as the driver being distracted, on his famous YouTube post from this past April, he mentioned listening to an audiobook while driving in Autopilot mode. It is conjecture that he was doing the same when he was killed, but it’s in the realm of distinct possibility. I saw some recent articles on this crash with a quote from the truck driver (or other witness; articles were unclear on this) saying he thought Mr. Brown was watching a Harry Potter movie. Some reports say a portable DVD player was in the car. I’ll wait for a formal statement from police or NHTSA on whether or not this is true.

    I have a deposit down for a Model 3. My understanding is that the hardware to support Autopilot will be installed but I’m not going to activate (pay) for it. My philosophy is that cars are meant to be driven. If your desire is to have a vehicle drive you around while you relax and otherwise “check out” of your obligation to pay attention to your surroundings, there’s Uber, Lyft, taxis or buses.
