
The family of a man who was killed in a fiery crash involving a Tesla Model X vehicle plans to file a wrongful death lawsuit against the car company over allegations of a faulty Autopilot system, according to an announcement by a legal firm representing the family.

San Mateo resident Walter Huang, 38, suffered major injuries and later died after his 2017 Tesla collided with a median on Highway 101 in Mountain View on March 23. Vehicle logs recovered by Tesla showed that Huang had been using the vehicle’s Autopilot function at the time of the crash, when the vehicle hit a cement barrier between Highway 101 and the Highway 85 carpool flyover, according to the company.

In an online post Wednesday, the law firm Minami Tamaki stated that the family intends to file the wrongful death suit against Tesla, and could potentially extend the suit to any subcontractors involved in the design and construction of the Autopilot system. An early review by the law firm indicated that the Autopilot system installed in the Model X may have misread painted lanes on the roadway, failed to detect the concrete median and failed to brake the car, according to the post.

“The firm believes Tesla’s Autopilot feature is defective and likely caused Huang’s death, despite Tesla’s apparent attempt to blame the victim of this terrible tragedy,” the law firm said in a statement.

The law firm lists grounds for the suit that include product liability, defective product design, and intentional and negligent misrepresentation.

In a blog post last month, Tesla officials said the company reviewed the crash and found that Huang had Autopilot engaged in the moments before the crash, and that he had ignored “several visual” and one audible warning to take the wheel again. The vehicle did not detect Huang’s hands on the steering wheel during the six seconds prior to the collision, according to the blog post, and he had “about five seconds and 150 meters of unobstructed view of the concrete divider.”

Tesla doubled down on the argument that Huang was largely at fault for the crash, noting that the “only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”

“The fundamental premise of both moral and legal liability is a broken promise, and there was none here,” according to the statement issued to media outlets Wednesday. “Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged.”

In the weeks following the crash, Tesla officials have emphasized the strong safety track record of the company’s Autopilot technology, arguing that crashes are far less likely to occur when Autopilot is active. The company cites statistics gathered by a federal traffic safety agency showing that the first iteration of Autopilot, released a year ago, reduced crash rates by about 40 percent, and argues that the technology has only gotten better since then.

“The reason that other families are not on TV is because their loved ones are still alive,” Tesla officials said in the statement.

Both Tesla and the aggrieved family have also pinned blame on Caltrans for allegedly failing to maintain a safety guard, known as an attenuator barrier, at the crash site that could have reduced the severity of the impact. Tesla officials claimed, shortly after the accident, that the barrier had “either been removed or crushed” in a prior accident and had not been replaced. The Minami Tamaki law firm also said the family “may” file a lawsuit against the California Department of Transportation for what it calls dangerous conditions of public property.

Kevin Forestieri is the editor of Mountain View Voice, joining the company in 2014. Kevin has covered local and regional stories on housing, education and health care, including extensive coverage of Santa...


8 Comments

  1. Tesla’s Autopilot is designed to assist the driver, not to drive the car itself. The driver was obviously not paying any attention to the road, and Autopilot is not 100% safe. There’s a liability disclaimer signed when purchasing the car that informs buyers of exactly that. It’s not Tesla’s fault, it’s the driver’s! Obviously. Sorry for the loss, however.

  2. Customers have to realize that TESLA AUTOPILOT is not a real autopilot. The name is just a marketing scam. The product is really only slightly more advanced than what other brands call CRUISE CONTROL. I would never think of taking both hands off the steering wheel when using the cruise control on my non-Tesla car. Taking one hand off the steering wheel is sometimes necessary to use the gear shift or turn signal.

  3. Very true, it is meant to assist the driver. Pilots of commercial planes have autopilot systems, but they still have to pay attention to the controls and navigation in case they need to take over. Tesla cars are not approved self-driving cars like Waymo’s Google cars. Sorry about your loss.

  4. Let’s try this again, maybe I’m just too reasonable?

    It seems that the MV Voice changed the wording of the original article, which used the term “navigation system” instead of “Autopilot”.
    It’s a common mistake, one I have seen in TV news interviews as well.

    Which is exactly what was found in the Tesla service records of the car in question. The owner had reported that the “navigation system” (which is the GPS, not the Autopilot) had a problem, but Tesla service could find nothing wrong with the navigation system. There was no mention of the “Autopilot” system in the service records. He never demonstrated the Autopilot problem to Tesla or to anyone else who has come forward. The wife of the owner has stated on TV that the owner had taken her out to that dangerous stretch of road to try to demonstrate the dangerous situation to her, but the problem did not occur.

    Not something I would have done, risking my wife’s life like that, but
    Maybe I’m just too reasonable?

    “allegations of a faulty Autopilot system,”
    “had been using the vehicle’s Autopilot function at the time of the crash”

    So, maybe it’s just me, maybe I’m just too “reasonable”, but if I believed that a certain stretch of road, in a certain lane of that road, in a certain mode of driving my car, was likely to cause my car to crash, then I would not place myself and any passengers and any other cars behind me in such a dangerous situation.

    What ever happened to rational risk/benefit thinking?
    What was the benefit of repeatedly putting yourself at risk when it was so easy to avoid that risk?

    I would simply avoid that lane in that area, or better yet, keep my hands firmly on the wheel at that location, or better yet, not have Autopilot on during that short dangerous stretch of road.
    Or am I just being too reasonable?

    I would have mounted a camera on the dash until I captured the event.
    I would have gone to a service manager and carefully described the problem and had I caught it on video, I would have given them a copy.
    I would have had Tesla service people take the car out to that stretch of road and in the same conditions allowed them to experience the problem.

    Or is it more reasonable to keep putting yourself and others in danger until you finally crash?

    Tesla has always been clear that the Autopilot system is a driver assist, not a self-driving car system, and the human driver must at all times control the steering wheel and be alert and watchful of the road and be ready to take full charge at any time. That’s why the car has a warning system to alert you that your hands are not on the steering wheel.

    Tesla has always been clear that the performance of the Autopilot system depends on road markings for its decision making, and that drivers must be ready to correct any misreads the software might make when the road markings are non-standard or in error.

    So, while I feel for his family, I cannot see how anything the owner did was reasonable. He repeatedly chose to risk his life in a manner which could have easily been avoided and eventually, the predictable outcome happened.

    It’s like sticking a metal fork in an electric toaster to pull out your stuck bagel and seeing an arc of electricity the first time, but then continuing to use a metal fork to get your stuck bagel out until you eventually get electrocuted, and then blaming the maker of the toaster.

    If your bagel gets stuck and you really have no other tool than a metal object to remove the bagel, then at least unplug the toaster first!

    Or am I being too reasonable?

  5. Tesla Autopilot thought the concrete median was just another lane and steered right into it. This was inevitable because Autopilot is in beta-testing which means that finding problems is to be expected. It’s the driver’s job to test the computer’s mistakes and report them for fixing.

    We’ve decided as a society that we value progress over safety so treating customers as lab rats is perfectly legal in this situation. Many would even argue that this dangerous testing is necessary in order to more quickly achieve true autonomous driving and save even more lives down the road.

  6. @resident

    “Customers have to realize that TESLA AUTOPILOT is not a real autopilot. The name is just a marketing scam. The product is really only slightly more advanced than what other brands call CRUISE CONTROL.”

    No, “CRUISE CONTROL” is nothing more than a system that holds a “set speed” until the driver taps the brakes or the buttons to change the set point.
    CRUISE CONTROL only affects the throttle, adding more gas as the car climbs a hill and reducing it when going downhill. That’s pretty much it.

    No CRUISE CONTROL can hit the brakes for you.
    No CRUISE CONTROL can steer the car on a straight course, like an airplane autopilot can.
    No CRUISE CONTROL can steer the car at all.
    No CRUISE CONTROL can detect lines, or obstructions or help the driver avoid accidents in any manner.

    CRUISE CONTROL is simply a means to avoid speeding tickets and to somewhat relieve the strain on the driver’s right foot on long trips at a steady speed.

    Tesla Autopilot is very much like the autopilot in an airplane, which also cannot avoid crashes, nor react to changes, nor do anything safely without a pilot keeping an eye on things ready at all times to take full control.

    Just like Tesla Autopilot.
    Just like Tesla clearly tells its owners.

  7. @ Cordelia

    “Tesla Autopilot thought the concrete median was just another lane and steered right into it.”

    The news reports I have heard and read say it was not the concrete median itself, but rather road lane markings that the Tesla was following. Specifically, a set of old, obsolete markings that had been hidden behind a long stretch of safety barrier and so were not visible. Then one day another car crashed into that safety barrier, and Caltrans removed it but did not replace it quickly enough. The old, obsolete lane markings led the Tesla Autopilot to think they were the correct lane markings to follow, and thus the car crashed into the unprotected concrete median.

    “This was inevitable because Autopilot is in beta-testing which means that finding problems is to be expected.”

    Finding problems, yes, but this crash was 100% avoidable if the driver had simply followed Tesla’s directions, or had done anything at all to avoid the crash, like not using that lane or not using Autopilot when approaching that spot. Or if Caltrans had quickly replaced the barrier or had done a better job of removing the old lane markings.

    “It’s the driver’s job to test the computer’s mistakes and report them for fixing.”

    Sure, but this driver knowingly and repeatedly drove in exactly the manner most likely to kill himself and never did report the actual problem either.

    “We’ve decided as a society that we value progress over safety”

    Many people exploit the call for “safety” to justify many bad things, which almost never actually improve real safety. People say “if only one life is saved, then it’s worth doing”, and then they do something that won’t in fact save any lives but will certainly cost more.

    “so treating customers as lab rats is perfectly legal in this situation.”

    I don’t think that’s a fair way to put it.

    “Many would even argue that this dangerous testing is necessary in order to more quickly achieve true autonomous driving and save even more lives down the road.”

    Considering how few deaths or even serious injuries have resulted from driving a Tesla compared to any other car, I don’t see how Tesla is putting people at risk. Rather, Tesla is offering people a way to drastically reduce their risk of injury or death, if they follow Tesla’s driving directions along with the driving laws that apply to all other cars.

    As with so many other issues, the media is stirring up fear by grossly misrepresenting the facts and ignoring the overall impact of something the general public does not yet understand well.

    Fear sells, and fear results in giving a smaller number of people more and more power over everyone else, in the name of “safety”. And it is virtually never true that giving people in authority more power actually results in more safety for anyone but those few who gain power by screaming about “safety”.

    Compare the safety record of Tesla cars to any other maker and Tesla always wins.

  8. @Richt

    Seems we agree on just about everything. The path to reliable autonomously driving vehicles is not going to be bloodless. Humans will not pay attention to the road at all times, whether beta-testing systems are involved or not. Computers will be more reliable than human drivers some day, so let’s keep our eyes on the prize.
