
Two die in driverless Tesla incident. Where are the regulators?

Tesla instructs drivers to keep their hands on the wheel when Autopilot or Full Self-Driving is engaged, but not all drivers do. (Dreamstime/TNS)
It’s a 21st century riddle: A car crashes, killing both occupants — but not the driver.

That’s what happened over the weekend in Houston, where a Tesla Model S slammed into a tree and killed the two men inside. According to police, one had been sitting in the front passenger seat, the other in the back of the car.

Although investigators have not said whether they believe Tesla’s Autopilot technology was steering, the men’s wives told local reporters the pair went out for a late-night drive Saturday after talking about the system.

Tesla Chief Executive Elon Musk pushed back on the speculation without offering a firm conclusion, tweeting Monday that “Data logs recovered so far show Autopilot was not enabled.” The company has resisted sharing its data logs for independent review without a legal order.


After Musk’s tweet, a county police official told Reuters that the department would serve a warrant for the data.

Autopilot technically requires the human driver to pay full attention, but the system is easy to cheat, and the internet is rife with videos of pranksters sitting in the back while a Tesla cruises down the highway with the driver’s seat empty.

It’s a state of affairs that leaves many auto safety experts and driverless technology advocates wondering just what it will take before regulators step in and put an end to the word games and rule-skirting that have allowed it to continue. Could the crash in Houston provide that impetus?


“I suspect there will be big fallout from this,” said Alain Kornhauser, head of the driverless car program at Princeton University.

Tesla’s Autopilot system has been involved in several fatal crashes since 2016, when a Florida man was decapitated as a Tesla on Autopilot drove him under the trailer of a semi-truck. Less lethally, Teslas have slammed into the back of firetrucks, police cars and other vehicles stopped in highway lanes.

Yet little action has been taken by federal safety officials and none at all by the California Department of Motor Vehicles, which has allowed Tesla to test its autonomous technology on public roads without requiring that it conform to the rules that dozens of other autonomous tech companies are following.


The National Highway Traffic Safety Administration said Monday that it had dispatched a “Special Crash Investigation team” to Texas. The agency, an arm of the U.S. Department of Transportation, said it “will take appropriate steps when we have more information.”

The agency declined to speak with The Times about what those steps might be.

Since 2016, NHTSA has launched investigations into at least 23 crashes involving Autopilot, but if those investigations resulted in any conclusions or action, the agency hasn’t told the public about it.

Jason Levine, executive director of the Center for Auto Safety, thinks it’s about time that changes.

“There doesn’t seem to be much activity coming out of our federal safety administration with respect to what is pretty evidently becoming a public danger,” he said. “You’ve got the market getting ahead of regulators, which isn’t uncommon, but this all didn’t start yesterday.”

Tesla sells an enhanced version of Autopilot called Full Self-Driving Capability for $10,000, although there is no car sold anywhere in the world today that is capable of full self-driving.


Although Tesla technology might well be safe when used as directed, Tesla’s marketing can lead people to believe the car is capable of autonomous driving. NHTSA, Levine points out, has rules against “predictable abuse” in automotive technology.

“It is predictable when you call something Autopilot it means autopilot, and when you call something Full Self-Driving it means full self-driving,” he said.

Incidents such as the fatal Texas crash “are foreseeable incidents,” Levine said, “no matter how many disclaimers Tesla lawyers decide to insert in fine print.”

Musk disbanded the company’s media relations department in 2020. Emails to the company were not returned.

The California DMV is in a position to clarify matters but thus far has not. In previously undisclosed emails sent in recent months and made public by the legal-document transparency organization Plainsite, Tesla told the DMV that its system is not autonomous but a so-called Level 2 driver-assist system.

The DMV’s own regulations bar companies from advertising the sale or lease of a vehicle as autonomous if the advertising “will likely induce a prudent person to believe a vehicle is autonomous.”


In public presentations and slideshows, DMV Deputy Director Bernard Soriano described Level 4 automation, which requires no human driver, this way: “Full self-driving.”

In a lengthy emailed statement, the DMV suggested that it views what Tesla is selling as a non-autonomous system. It did not address questions about whether the company, in using the term Full Self-Driving, is violating the regulation against misrepresenting driving systems as autonomous.

Adding to the confusion, Musk himself has appeared on “60 Minutes” and Bloomberg TV behind the wheel of a Tesla with his hands in the air. Since 2016, he has talked about Tesla’s fully autonomous technology as if it were imminent. That year, Tesla posted a video showing one of its cars driving in autonomous mode through Palo Alto. “The person in the driver’s seat is only there for legal reasons,” the video said.

The same year, he announced a coast-to-coast test drive of an autonomous Tesla by the end of 2017; as of April 2021, it has not happened. He told a Shanghai conference in 2020 that the “basic functionality” for fully autonomous driving would be complete that year. It wasn’t. He said the company would have 1 million driverless robotaxis on the road by the end of 2020, which would cause Tesla cars to appreciate in value. So far there are none.

The misleading promises and the confusing nomenclature are beginning to rile other players in the driverless car industry. Several industry executives have told The Times that they fear that Musk’s behavior could disturb the public and cause policymakers to enact restrictive laws and regulations that could unnecessarily delay the introduction of driverless cars.

Now, some are beginning to speak out publicly.

“We’ve had multiple years of claims that ‘by the end of the year it’s going to be magically self-driving by itself without a human in the car,’” John Rich, Ford’s head of autonomous vehicles, said at a recent Princeton University conference. “It is not helpful, OK? It is confusing the public. Frankly even the investor community is very, very confused as to what paths are plausible and what the capabilities of the different systems are.”


Musk has long cultivated a maverick approach to robot-car technologies. Other car and tech companies combine radar, lidar and visual sensors in their systems to identify and analyze a robot-car’s surroundings. Musk believes lidar is an unnecessary expense and recently announced Tesla would soon stop using radar, too, relying solely on visual sensors for the main driving task.

And although other companies with advanced driver-assist systems similar to Autopilot use infrared cameras to make sure a human is in the driver’s seat and paying attention to the road ahead, Musk specifically rejected that technology in favor of a steering wheel sensor that can be easily defeated by hanging a weight off the wheel or jamming an object into it.

General Motors’ Super Cruise system, for example, allows hands-free driving and automated lane changes on interstates and other limited-access highways, but it monitors the driver to ensure they’re paying attention to the driving task. If not, warning lights and sounds are triggered. If the driver remains inattentive, the car will exit traffic lanes and stop itself.

Ford recently announced a similar product, BlueCruise, expected to become available later this year on Mustang Mach-E electric cars and Ford F-150 pickups. Neither company refers to its technology as full self-driving.
