
A 38-year-old Apple engineer died in a high-speed crash because his Tesla Model X’s Autopilot driving system steered the car into a median on Highway 101 in Mountain View in 2018, federal officials concluded Tuesday, following a two-year inquiry.

The probable cause of the crash, approved by the National Transportation Safety Board (NTSB) at the Feb. 25 meeting, lays significant blame on Tesla for shortcomings in the electric car company’s partially autonomous driving system. But it also points to driver complacency as a major factor in the crash — he was likely playing a video game at the time — along with larger concerns that car manufacturers are marketing and selling autonomous features without adequate testing and clear disclosure of their limitations.

The NTSB also blasted Caltrans for failing to repair safety equipment along Highway 101 that contributed to the severity of the crash, and found that the driver would likely have survived the collision if a safety buffer called an “attenuator” had been in place at the median.

“When all of these combine, we end up with a tragic accident and one less father that’s coming home to the child he had just dropped off to school that morning,” said NTSB board chair Robert Sumwalt.

In March 2018, San Mateo resident and Apple engineer Walter Huang was commuting south on Highway 101 in his Tesla Model X when his vehicle veered left towards the “gore area” between the southbound lanes and the Highway 85 carpool flyover lane in Mountain View. The Model X struck the concrete barrier at over 70 mph, destroying the front of the vehicle and causing it to careen into two other vehicles before coming to a stop. Huang was pulled out of the vehicle shortly before the damaged battery caught fire. He was taken to a local hospital, where he died.

NTSB investigators launched a probe into the accident almost immediately, and spent two years collecting data and conducting interviews to determine what caused the Tesla to veer from the roadway and crash into a barrier at full speed. The subsequent findings show that the vehicle’s autosteering function was enabled at the time of the collision and, about 6 seconds before the crash, steered the SUV to the left toward the median.

The errant path may have been caused by faded paint markings along the left-hand lane of Highway 101, but NTSB staff couldn’t say for sure. NTSB accident investigator Don Karol told board members that video imagery that could have shed light on the crash wasn’t available because the vehicle’s computer system was heavily damaged due to the “catastrophic nature” of the crash.

Tesla representatives did not immediately respond to requests for comment.

Tesla’s Autopilot is a partially autonomous system that can control the vehicle’s steering, braking and lane changing without driver input, using radar, cameras and ultrasonic sensors to detect objects and lane markings. Though it goes beyond typical driver assistance, Tesla warns consumers that drivers must maintain awareness, understand the limitations of the autonomous features and have their hands on the steering wheel at all times.

Information extracted from the Model X involved in the crash shows Huang’s hands were off the steering wheel for roughly one-third of the half-hour trip, and he did not attempt to correct the vehicle’s path when it veered into the Highway 85 barrier.

Perhaps the biggest reveal at the Tuesday meeting was that NTSB investigators found that Huang’s lack of response before the crash was likely because he was distracted by “a cell phone game application” and was over-reliant on Autopilot.

“If you own a car with partial automation, you do not own a self-driving car, so don’t pretend that you do. This means that when driving in the supposed self-driving mode, you can’t sleep. You can’t read a book. You can’t watch a movie or TV show. You can’t text. And you can’t play video games,” Sumwalt said. “Yet that’s precisely what we found that this driver was doing. He was playing a video game on his smartphone when his car veered over into the median and struck the barrier in the median.”

Sumwalt, in his opening comments, said that the fatal collision could have been avoided, but the car manufacturing industry and federal regulators have failed to implement the safeguards the NTSB proposed back in 2017. There needs to be a way to limit autonomous functions in road conditions that Autopilot was never designed to handle, and there need to be effective ways to flag drivers who are complacent on the road, he said.

“What struck me the most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology. Instead, the industry keeps implementing technology in such a way that people get injured or killed,” Sumwalt said. “And the industry, in some cases, is ignoring the NTSB’s recommendations intending to help prevent such tragedies.”

Drivers need to be aware that Tesla’s Autopilot, and comparable systems in commercial vehicles, are not autonomous, and shouldn’t be branded as such. “The car in this crash was not a self-driving car, as I’ve said many times before,” he said. “You cannot buy a self-driving car today. We’re not there yet.”

Beta testing at highway speeds

One of the major sticking points among NTSB members was the idea that Tesla’s autosteering feature, despite its widespread, daily use on roadways, is actually still a work in progress. The company itself considers the feature a “beta,” a label that it believes encourages drivers to approach Autopilot with a clear-eyed understanding that they need to remain vigilant and attentive.

The label didn’t sit well with board member Bruce Landsberg, who said that enabling autosteering amounts to using a faulty safety feature that still has bugs. Federal regulators only test semi-autonomous collision avoidance systems under a limited set of conditions, such as rear-end crashes at speeds up to about 45 miles per hour, rather than in high-speed scenarios or in cross traffic, which have been at the center of multiple Tesla-related investigations.

Two of the Autopilot functions — the Forward Collision Warning system and Automatic Emergency Braking — did not activate during the crash.

Landsberg slammed the current Autopilot system and testing scheme, calling it “completely inadequate,” and said it’s not enough for Tesla to simply give consumers the caveat that Autopilot is prone to issues.

“It seems to me when you put a system out there that is a safety-critical system, and it has known bugs in it, that it’s probably pretty foreseeable that somebody is going to have a problem,” Landsberg said. “Then they come back and say, ‘Oh, but we warned you’ — that doesn’t seem like a very robust safety mechanism.”

Since the fatal Mountain View crash, Tesla has pushed out a firmware update for the Autopilot system. Among the changes, the vehicle will more quickly alert drivers when their hands are not detected on the steering wheel and when it loses its ability to keep the vehicle centered in a lane.

The features are valuable improvements to Autopilot, said Robert Molloy, NTSB’s director of highway safety, but he underscored that it’s important to be proactive about making upgrades to autonomous vehicle functions.

“Fixing problems after people die is not really a good highway (safety) approach,” he said.

Huang had complained of problems with his Autopilot and navigation systems in the weeks leading up to the crash.

Board member Jennifer Homendy said she is concerned that the National Highway Traffic Safety Administration (NHTSA) is shirking its responsibility to regulate the emerging market of partially autonomous vehicles, described as “Level 2” automation on a scale of 0 to 5 (fully automated). She said a contributing factor in a fatal Tesla crash in Delray Beach, Florida, was the “failure of NHTSA” to compel vehicle manufacturers to incorporate acceptable safeguards for vehicles with Level 2 automation.

Homendy said she was also troubled by a Twitter post by the agency stating that the regulations should take into account the cost of buying a car. The Jan. 7 tweet states “For some, affording any car — let alone a new one — can be a challenge. That’s why NHTSA is working to keep regulations reasonable so cars, trucks and SUVs — with the latest safety features — are more affordable and families can be safer on the roads.”

The NHTSA’s mission is not to sell cars, and lowering the bar on safety to reduce costs shouldn’t even be a consideration, she said.

Not a new problem

Throughout the Feb. 25 meeting, NTSB board members referred to Tesla’s poor track record of responding to the agency’s past recommendations, which could have led to safety improvements and prevented further collisions.

In 2017, the NTSB wrapped up a yearlong investigation into a fatal Tesla crash in which a Model S struck the side of a truck, killing the driver. Among the safety recommendations, the NTSB suggested that six car manufacturers with Level 2 automation systems incorporate safeguards that limit the use of autonomous features to the roads and conditions they were designed to handle. The agency also suggested that vehicles need to better detect when the driver is complacent and not paying attention to the road, and alert them when “engagement is lacking.”

Sumwalt said the request was pretty simple — respond to the NTSB’s recommendations within 90 days. Five of the manufacturers responded in time and stated they would comply with the recommendations. Tesla alone ignored the request and remains the only non-compliant company.

“It has been 881 days since the recommendations were sent to Tesla and we’ve heard nothing,” Sumwalt said. “We’re still waiting.”

Tesla has butted heads with NTSB officials since the investigation into the Mountain View crash first launched in March 2018. The federal agency originally invited Tesla to actively participate in the investigation and provide technical assistance, but took the rare step of dropping the company from the investigation one month later, after Tesla released multiple public statements speculating that the driver, and not its technology, was at fault.

Tesla was releasing incomplete investigative information that was bound to lead to “speculation and incorrect assumptions about the probable cause of a crash, which does a disservice to the investigative process and the traveling public,” NTSB said in a statement at the time.

Sumwalt took time during the meeting to criticize Caltrans for its role in the severity of the Mountain View Tesla crash, noting that it was one of multiple occasions in which safety equipment was damaged and not adequately repaired or replaced. A crash attenuator normally sits in front of the Highway 85 concrete barrier, but it had been damaged to the point of being “nonoperational” in a solo-vehicle crash 11 days before the March 23 fatality.

The attenuator could have significantly reduced the damage to the Model X, and NTSB investigators made clear at the Tuesday meeting that Huang likely would have survived if it had been there to cushion the impact.

NTSB officials also addressed the Mountain View Fire Department’s role, determining that the emergency response to the accident was adequate and well-executed, given the circumstances.

The full report on NTSB’s investigation will be published in the coming weeks. An abstract of the report, released Feb. 25, lists 23 findings that enumerate the factors that contributed to the fatal collision. Limitations of Tesla’s Autopilot lane-keeping assistance caused the vehicle to veer into the median, and the system failed to alert the driver in the seconds leading up to the crash. The Model X’s collision avoidance system was not designed to detect a crash attenuator, and the NHTSA does not require such capability, according to the report; the result was a severe crash in which the automatic braking and collision warning systems failed to activate.

“In order for driving automation systems to be safely deployed in a high-speed operating environment, collision avoidance systems must be able to effectively detect and respond to potential hazards, including roadside traffic safety hardware, and be able to execute forward collision avoidance at high speeds,” according to one of the findings approved by the NTSB board Tuesday.

The NTSB doubled down on the recommendations it made to Tesla in 2017, adding that if Tesla does not create safeguards preventing the use of Autopilot on roads and in conditions it was not designed to handle, it risks future crashes. It also took a jab at NHTSA for “failing to ensure” that manufacturers of partially autonomous vehicles provide these safeguards.

The report’s findings also state the driver did not attempt to correct the route of his Model X as it steered into the concrete barrier, most likely because he was distracted by a game on his cell phone. Distracted driving could be curbed by new technology and company policies that prohibit the use of portable electronic devices in a moving vehicle, the findings suggest.

Among the nine recommendations, NTSB is asking nine smartphone manufacturers, including Apple, Google and Samsung, to develop a “lock-out” mechanism that would automatically disable any functions that could distract a driver while the vehicle is in motion — making it a default setting that would need to be disabled by the user. In the case of Apple, the board took it a step further and asked the company to create a company policy banning the use of cell phones by “all employees and contractors driving company vehicles, operating company-issued portable electronic devices, or using a portable electronic device to engage in work-related communications.”

Kevin Forestieri is the editor of Mountain View Voice, joining the company in 2014. Kevin has covered local and regional stories on housing, education and health care, including extensive coverage of Santa...

Join the Conversation


  1. This father of two, the sole breadwinner for his family and his parents, was speeding in his Tesla in its pseudo-automatic mode, a mode he had complained about to his wife, his brother, and Tesla, and taken it in for malfunctioning at this very stretch of highway, all while playing a video game behind the wheel?

    How many different ways are there to say selfish? Once you decide to have kids, your life is no longer your own. Taking large risks like this for personal gratification is the height of foolish selfishness.

    Suing Tesla for this is just chutzpah.

  2. The headline here should be that he was probably distracted by a video game.

    Yes, the other factors were instrumental, seriously instrumental. But the long and the short of it is that as a Tesla driver with knowledge of how his car reacted on this part of his commute, he was not paying attention. If he had been driving using the autodrive feature correctly, paying attention, both hands on the wheel, as Tesla’s self drive feature instructs, this accident would not have happened and he would still be alive today.

    There is a certain arrogance in this. The idea that he not only did not pay enough attention to driving but that he was paying attention to of all things a video game on his phone, is tantamount to negligence and dangerous driving. Being distracted by a crying child or similar is almost understandable, but to be actively involved in another task is reprehensible.

    It puts all Tesla drivers in a bad light and opens a good car to ridicule.

  3. Tesla Autopilot is a danger to Tesla drivers and to all other road users. Tesla should be forced to disable it until they can prove that they are properly training their drivers to use it safely.

  4. @resident

    I use Tesla autopilot all the time. Is my life at risk? I don’t think so as I’ve never even had a close call.

    How do I manage this? I make sure I don’t play video games while autopilot is on. That’s a small, but very important detail.

  5. This particular location was missing its “attenuator” because of numerous and frequent crashes at this location. This certainly implies to me that there is something about the design of the freeway in this location that makes it difficult to negotiate for both humans and computers. I pay extra attention driving through there now. The lane design has you driving straight at the gore point for quite a while and the seam in the pavement creates an apparent lane marking directing you into it. It’s not surprising Tesla software got fooled.
