
The family of a man who died when his Tesla Model X crashed last year has filed a wrongful death lawsuit against the car manufacturer, claiming the vehicle’s Autopilot function and emergency braking system were defective. Attorneys representing the family say Tesla’s actions amount to beta testing vehicle software on “live drivers.”

Walter Huang, a 38-year-old San Mateo resident and an Apple engineer, was traveling south on Highway 101 in Mountain View on March 23, 2018, when his vehicle veered left and struck the barrier between southbound Highway 101 and the Highway 85 carpool flyover.

Huang was extricated from the Tesla shortly before the vehicle’s damaged battery caught fire. He died of his injuries in the hospital several hours later.

The fatal crash has been the subject of scrutiny after it was revealed that Tesla’s Autopilot — described as a driving assistance tool that includes cruise control and Autosteer “lane-keeping assistance” — had been active in the moments prior to the crash. Four days after the collision, the National Transportation Safety Board (NTSB) announced it was opening an investigation into the accident.

A preliminary report from the agency found that the vehicle had started a “left steering” movement seven seconds prior to the crash, and accelerated from 62 miles per hour to 70.8 miles per hour with “no pre-crash braking or evasive steering movement detected” in the final three seconds before striking the highway barrier.

The civil complaint from the family, filed in Santa Clara County Superior Court, alleges that Tesla marketed its Autopilot and automatic emergency braking systems as safe features designed to prevent crashes, and that the vehicle did not deliver on those promises.

“A safe and properly functioning automatic emergency braking system does not allow a crash to occur that could otherwise have been avoided or reduced in severity,” the lawsuit states. “A safe and properly functioning automatic emergency braking system should prevent a vehicle from accelerating into any fixed object.”

The suit goes on to claim that Tesla’s own testing and reports from the NTSB found the Model X was prone to “unwanted, unwarranted or un-commanded acceleration” and lacked the sensors and systems needed to keep the car from leaving a travel lane. Such risks, the suit asserts, should have warranted a post-market warning, advisory or recall.

The law firm representing the family, Minami Tamaki LLP, released a statement Wednesday morning, May 1, announcing the wrongful death suit, which was filed on behalf of Huang’s wife, Sevonne Huang. Walter Huang is also survived by his son and daughter, ages 4 and 7, and his elderly parents, who depended on him for financial support.

“Mrs. Huang lost her husband, and two children lost their father because Tesla is beta testing its Autopilot software on live drivers,” Mark Fong, an attorney for the firm, said in the statement. “The Huang family wants to help prevent this tragedy from happening to other drivers using Tesla vehicles or any semi-autonomous vehicles.”

A spokesperson for Tesla declined to comment on the lawsuit.

The suit also accuses Caltrans of failing to maintain safe conditions at the location of the crash, arguing that there was a damaged safeguard where the Highway 85 carpool lane splits from Highway 101. The preliminary NTSB report found that a cushioning system called a “crash attenuator,” designed to soften a high-speed collision into the median, had been damaged 11 days before the crash and had not been replaced.

The complaint argues that the attenuator was not replaced in a timely manner and that Caltrans’ actions were “negligent and careless” and amounted to leaving a dangerous, defective and hazardous condition of public property.

Just weeks after the crash, the NTSB took the unusual step of revoking Tesla’s involvement in the investigation after agency officials said the company released “incomplete information” that was neither vetted nor confirmed by the agency. The information, released by Tesla in a series of blog posts and statements to the media, implied that user error was a factor in the fatal crash.

“Such releases of incomplete information often lead to speculation and incorrect assumptions about the probable cause of a crash, which does a disservice to the investigative process and the traveling public,” according to an NTSB statement on April 12, 2018.

Tesla’s website cites data for the first quarter of 2019 showing that there was one accident for every 2.87 million miles driven in Tesla vehicles with Autopilot engaged, and one accident for every 1.76 million miles without Autopilot. The average, across all automobiles, is a crash every 436,000 miles, according to the website.

The March 23, 2018, crash also posed an unusual challenge for the Mountain View Fire Department, which responded to a vehicle fire fueled by the Model X’s 400-volt lithium-ion battery. The department blasted the exposed portion of the battery with 200 gallons of water and foam, according to the preliminary NTSB report, and received additional support from Tesla engineers before the car was towed to a San Mateo impound lot. The battery reignited five days later and was extinguished by the San Mateo Fire Department.

Kevin Forestieri is the editor of Mountain View Voice, joining the company in 2014. Kevin has covered local and regional stories on housing, education and health care.

Join the Conversation


  1. It would seem to me that it would be appropriate to “ground” every Tesla and disable any autopilot functions before they can be driven. I sure don’t want one behind me on the road.

  2. I thought that drivers must stay alert and capable of retaking control when a car is in Autopilot? Seven seconds is a long time to notice you’re veering off the road. It’s an unfortunate accident, but I don’t see Tesla at fault unless they said “Autopilot is 100% safe.”
    I also don’t see Caltrans at fault unless they knew there was damage to the safeguard or conditions at the crash site were unsafe. Caltrans covers tens of thousands of miles. I don’t see how they know an area needs repairs unless it is reported to them.

  3. Witnesses say the driver was asleep at the wheel. How is Tesla responsible for this? Seven seconds is plenty of time to make corrections. If not asleep, he was no doubt texting or goofing around with his phone. I see this every day as I drive around Silicon Valley… including cops.

  4. If he wasn’t asleep, how do you account for the fact that there is a 7-second gap in time when the car was accelerating and moving left? He was probably on his phone, and now people want money to replace their loved one who acted irresponsibly. Personal responsibility. I know it is sad that this happened, but why blame Tesla or Caltrans or the fire department or anybody else you can think of when there is obviously something done wrong here on the part of the driver? You can take it to the Supreme Court if you want to, but it doesn’t change the fact that this driver did something wrong. This is not an autonomous vehicle, nor did Tesla claim that it was; you need to be paying complete attention to your driving when you are behind the wheel. I know I’ll get blasted for being unsympathetic, not politically correct, etc., but this is how I feel. I’m tired of people not taking responsibility for their own actions and trying to blame anyone or everyone else for their mistakes and the tragedies that they cause.

  5. @D Moore
    Directly from the NTSB prelim. report:
    “The Autopilot system was engaged on four separate occasions during the 32-minute trip, including a continuous operation for the last 18 minutes 55 seconds prior to the crash.”

    @AllYouCanEat
    I don’t see anything in NTSB’s preliminary report, nor anything in Tesla’s two blog posts, nor anything in my notes from the MVFD fire chief on the day of the crash, indicating the driver was asleep at the wheel. You are likely confusing this collision with another incident.

  6. What this article does not remind people of is that the driver was already fully aware that this specific stretch of damaged roadway was somehow tricking the Tesla Autopilot software into exactly this behavior IF he drove in the lane near the damage and used Autopilot without his hands on the wheel.

    Mr. Huang himself knew that what he was doing was very dangerous and against Tesla’s Autopilot directions, and that this specific section was potentially deadly, yet he continued to tempt death by using Autopilot and NOT keeping full control of the car as he passed this section.

    Mr. Huang had even taken his wife on a trip past that area in order to demonstrate for her the problem. Mr. Huang had also complained to Tesla about the Autopilot for this reason and he had told his wife and others about his prior experience with this stretch of damaged road.

    Mr. Huang had intentionally repeatedly experimented with Autopilot while driving along this specific section of the road in that lane to see how it would behave.

    Mr. Huang had stated that he knew this section of the road had damage to the safety devices. He stated that due to the damage, an old, obsolete and dangerous set of lane markings were visible to the Tesla system and that he believed these old lane markings were the cause of the problem.

    When the freeway was in the process of being reconfigured years ago, there were a variety of lane markings that changed from time to time as the construction proceeded. This specific section had lane markings from one old phase of the construction that were NOT scrubbed off the road surface, so when a prior crash (by a normal car) destroyed the safety barrier, the old lane markings could be seen.

    The reason these old and dangerous lane markings had not been scrubbed off, which would be normal procedure, was that the plan was to plant a set of safety barriers in that location and thus they felt there was no real need to remove the old lane markings. They never considered what might happen if a crash took out these barriers and thus exposed the dangerous lane markings to view.

    I understand why the construction people failed to follow standard procedures and remove these old lane markings, and I can understand why they violated the rules for how long a damaged safety rail was allowed to sit and wait for repair, but the fact is that we have two errors of judgment and two failures to do safety-critical freeway work in the proper manner. I understand that they never considered the what-ifs, but that’s exactly why we have standard procedures.

    Nobody ever claimed that the Autopilot was perfect and anyone who has ever used a computer or device that uses software should already be well aware of the limitations of software and should NOT risk their lives in foolish ways as Mr. Huang chose to do repeatedly until he finally paid the ultimate price of being careless with his own life.

    And let’s not forget that it was equally possible that when Mr. Huang ultimately did crash, his crash could have killed other people in other cars as well. Mr. Huang was risking the lives of anyone who was driving near enough to him at the critical moment.

    The bottom line is that Tesla Autopilot has SAVED many lives already and the tiny number of crashes have all been due to a combination of driver failure to maintain proper control and some extraordinary road situation.

    Tesla is certainly not at fault here.
