
A Tesla driver who died after striking a highway median in Mountain View reportedly complained of problems with his Autopilot and navigation systems in the weeks leading up to the crash in 2018, according to a trove of newly released documents.

Federal investigators at the National Transportation Safety Board (NTSB) released nearly 1,500 pages of information on the 2018 fatal accident, in which 38-year-old Walter Huang’s Model X collided with the barrier between southbound Highway 101 and the Highway 85 carpool flyover. The investigation is looking into whether the highway conditions and the vehicle’s Autopilot lane-keeping assistance played a role in the crash.

While the agency has yet to make a determination, an attorney representing Huang’s family asserted in a letter last year that the vehicle’s Autopilot had been a problem, particularly at the location of the crash. Huang reportedly told his wife, Sevonne, and a friend that his vehicle’s lane-keeping technology was problematic and had a tendency to steer towards the median, also known as the gore point.

“Walter told Sevonne the Autopilot would cause his Tesla to veer towards the barrier involved in his crash, prior to the crash,” according to Mark Fong, an attorney with Minami Tamaki.

Tesla representatives did not immediately respond to requests for comment.

Vehicle maintenance records show that, two weeks prior to the crash, Huang brought his car to a Sunnyvale service center reporting problems with his GPS and navigation system that prevented his cruise control from functioning. A service advisor reportedly was unable to duplicate the problem during the visit, and had “no recollection whether the driver told him about problems encountered while driving vehicle in the vicinity of the gore area on US-101,” according to the documents.

An earlier report released by NTSB found that the Tesla’s Autopilot system, shorthand for a suite of functions including adaptive cruise control and autosteer lane-keeping assistance, was enabled at the time of the crash. Huang’s vehicle was in a lane traveling south on Highway 101 when it moved left and sped up from 62 mph to 70.8 mph. No “precrash braking or evasive steering” movement was detected.

The severe damage to the Tesla breached the battery, causing it to catch fire shortly after the crash. Though bystanders were able to pull Huang from the vehicle just before it was engulfed in flames, he later died of his injuries.

NTSB will be holding a board meeting on Feb. 25 to determine the probable cause of the fatal crash. In a previous report, the agency slammed Caltrans for “systemic problems” that prevented the swift repair of traffic safety equipment that could have lessened the severity of the crash. Caltrans is responsible for maintaining a crash attenuator at the site of the collision, which is equipped with a hydraulic cylinder and cable assembly designed to “telescope” and absorb impact when a vehicle hits it at high speeds.

The attenuator at the highway median had been smashed by a Prius in a solo-vehicle accident 11 days before the Tesla crash, leaving it damaged to the point of being “nonoperational,” and it had not yet been replaced.

Alongside the NTSB investigation, Huang’s family has also filed a wrongful death suit in Santa Clara County Superior Court. The suit alleges that Autopilot, which is marketed as a safety feature designed to prevent crashes, should have prevented Huang’s Model X from accelerating into a fixed object on the road.

The full public docket released by NTSB can be reviewed online at https://go.usa.gov/xd9u9.

Kevin Forestieri is the editor of Mountain View Voice, joining the company in 2014. Kevin has covered local and regional stories on housing, education and health care, including extensive coverage of Santa...

Join the Conversation


  1. Was this crash caused by defective technology or by a user not understanding the limits of the technology? What does the Tesla owner’s manual say the system is supposed to do in situations like this?

  2. The premise of an auto-drive system is unsound when the vehicles do not carry transceivers, when their devices are not programmed with the dimensions of the vehicle, the safe tolerances of the vehicle’s acceleration and deceleration, and the ability to integrate all vehicles on the road at the same time.

    Without those, the safety of such an idea is greatly at risk.

    Why are we letting this happen?

    The commercial attitude is that every person in the U.S. is a guinea pig, and be damned if these guinea pigs even understand the risks they are taking. Thus the people are, without consent, being subjected to an experiment.

  3. From the article, it didn’t sound like it was established that the driver had reported past problems with the auto drive function. Just because his family members say it happened doesn’t make it true.

  4. So let’s get this straight.

    The driver claims that the auto-pilot was giving him problems.

    Yet he engaged the auto-pilot in a zone that confuses even the most astute drivers?

    The facts and allegations don’t match up. Common sense says that if a system as critical as an auto-pilot is malfunctioning, then don’t use it.

    The allegations are clearly false and aimed at getting a settlement from Tesla.

  5. I remember when this happened, and at the time I could see why the autopilot was confused.

    The divider area where the left overpass splits from 101 south got wider as it approached the crash barrier, and near the end it was wide enough to be a lane. Additionally, that divider area was not striped or marked on the ground.

    Finally, I think what should have been a solid white line near the divider area was actually worn out to the point where it looked striped. I think at one time there were temporary barriers there for construction, and those ruined the paint.

    Then on top of that, people drove like idiots there. I would constantly see people not paying attention and then cutting across the divider at the last second.

    Not an area where I would be relying on auto-pilot.

  6. So a dad of two small kids used an autopilot feature, which he had already reported as malfunctioning, while speeding?

    Once you decide to be a dad and have dependents, your life is not your own anymore. This was selfish.
