Tesla crash: NTSB faults an "overreliance on technology"
David Wagman | February 26, 2020

The National Transportation Safety Board (NTSB) said that multiple factors — including an ineffective driver monitoring system and driver distraction — led to a fatal March 2018 accident involving a Tesla vehicle on a highway in Mountain View, California.
The NTSB said that the driver over-relied on the vehicle's “Autopilot” function and was likely distracted by a cell phone game app when the accident occurred. Even so, the NTSB faulted what it said was the Tesla vehicle’s "ineffective monitoring of driver engagement" as also contributing to the crash.
Earlier, the NTSB said that systemic problems with the California Department of Transportation’s repair of traffic safety hardware and the California Highway Patrol’s failure to report damage to a crash attenuator led to the Tesla striking a damaged and nonoperational crash attenuator. The NTSB said these factors contributed to the severity of the driver’s injuries, which proved fatal.
(Read "In probing a fatal Tesla crash, NTSB faults a state highway agency.")
“This tragic crash clearly demonstrates the limitations of advanced driver assistance systems available to consumers today,” said NTSB chairman Robert Sumwalt. “There is not a vehicle currently available to U.S. consumers that is self-driving."
Sumwalt said that the crash was the result of an overreliance on technology, driver distraction, lack of policy prohibiting cell phone use while driving and infrastructure failures.
A 2017 Tesla Model X P100D similar to the crash vehicle. Source: Tesla

"The lessons learned from this investigation are as much about people as they are about the limitations of emerging technologies,” said Sumwalt.
Beta version
In April 2018 as the accident investigation was getting under way, the NTSB removed Tesla as a party to the probe. The board said it acted because Tesla violated the party agreement by releasing investigative information before it was vetted and confirmed by the NTSB.
Tesla previously said Autopilot was initially released in “beta,” meaning it was being tested and improved as bugs were identified, a lead NTSB investigator said at a February 25 meeting that presented the board's findings on the accident.
NTSB vice chairman Bruce Landsberg was reported by the Associated Press as saying that if the system had known bugs, “it’s probably pretty foreseeable that somebody’s going to have a problem with it. And then they (Tesla) come back and say ‘oh, we warned you.’”
Uncorrected drift to the left
Walter Huang, the 38-year-old driver of the 2017 Tesla Model X P100D electric-powered sport utility vehicle, died from multiple blunt-force injuries after his SUV steered to the left and entered what is referred to as the "gore area" of the US-101 and State Route 85 exit ramp and hit a damaged and nonoperational crash attenuator at a speed of 70.8 mph.
The fatal accident vehicle and one of two vehicles involved in the March 2018 accident. Seconds before the accident, the Tesla steered into the gore area of an exit off-ramp, apparently confused by indistinct lane markings and bright sunshine. An accident barrier, visible at the lower left of the photo, had been damaged days before the accident and contributed to the severity of the Tesla accident. Source: NTSB

The Tesla was then struck by two other vehicles, resulting in the injury of one other person. The Tesla’s high-voltage battery was breached in the collision and a post-crash fire ensued. Witnesses removed the Tesla driver from the vehicle before it was engulfed in flames.
During a public session announcing the investigation's findings, NTSB staff members were quoted by the AP as saying they could not pinpoint exactly why the car steered left into the gore area. They said it was likely a combination of faded lane lines, bright sunshine that affected the vehicle's cameras, and a vehicle in the lane ahead of the Tesla.
Final seconds
In its official findings, the NTSB said it learned from Tesla’s “Carlog” data (data stored on a non-volatile memory SD card in the media control unit) that during the last 10 seconds prior to impact, the Tesla’s Autopilot system was activated and the traffic-aware cruise control was set at 75 mph.
Between six and 10 seconds prior to impact, the SUV was following another vehicle at a distance of about 83 ft. The Tesla’s lane-keeping assist system (“Autosteer”) then initiated a "left steering input" toward the gore area. This took place while the SUV was about 5.9 seconds and roughly 560 ft from the crash attenuator. No driver-applied steering wheel torque was detected by Autosteer at the time of the steering movement. The NTSB said the hands-off steering indication continued up to the point of impact.
The Tesla’s cruise control no longer detected a lead vehicle ahead when the SUV was about 3.9 seconds and 375 ft from the attenuator. At this point, the SUV began accelerating from 61.9 mph to a preset cruise speed of 75 mph.
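As a rough sanity check on the Carlog figures above (a sketch of the arithmetic, not NTSB methodology, and assuming a roughly constant speed near 65 mph over each interval), converting the reported speeds to feet per second shows the reported times and distances are broadly consistent:

```python
MPH_TO_FTPS = 5280 / 3600  # 1 mph = ~1.467 ft/s

def distance_ft(speed_mph: float, seconds: float) -> float:
    """Distance covered in feet at a constant speed."""
    return speed_mph * MPH_TO_FTPS * seconds

# Assuming ~65 mph at the steering input, 5.9 s of travel covers
# roughly the reported 560 ft to the attenuator
print(round(distance_ft(65, 5.9)))    # 562

# At 61.9 mph, 3.9 s covers ~354 ft -- close to the reported 375 ft,
# with the gap explained by the SUV accelerating toward 75 mph
print(round(distance_ft(61.9, 3.9)))  # 354
```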
A view of the accident scene looking back at the Tesla's approach to the exit ramp gore area. Source: ABC7 News

The NTSB said the vehicle's Autosteer lane-keeping-assist subsystem was tracking both the left and right lane lines until about five seconds before the crash. At that point, the Tesla "lost acquisition of lane lines" and began following lines to the left, into the gore point.
About 3.5 seconds before the crash, the Tesla reacquired what the NTSB said was a "right high-confidence lane line" as the vehicle was near-centered in the neutral area of the gore, tracking both the left and right 8 in wide channelizing lines of the gore area.
The Tesla’s forward collision warning system did not provide an alert and automatic emergency braking did not activate. The SUV driver did not apply the brakes and did not initiate any steering movement to avoid the crash.
Software upgrades
Following the Mountain View crash, Tesla made design changes to Autopilot firmware.
The NTSB said the changes included an improved vision system, more immediate Autosteer hands-off steering wheel alerts and an Autopilot “Driveable Space” forward collision warning and avoidance system. These changes include:
Hydranet Vision System (Firmware Update 2018.10.4): The firmware update includes vision system changes designed to improve the ability of the system to recognize poor and faded lane markings, slopes/banks, and higher-curvature roads. A higher fidelity “roadway estimator” was incorporated to improve lane and path prediction. The Hydranet Vision System also includes changes that impact depth detection, vision-radar association, and vehicle detection.
Autopilot Hands-Off Wheel Alert Timing (Firmware Update 2018.23): The firmware changed the alert timing to include more immediate alerts when hands are not detected on the steering wheel and there is no valid lane line (plus no lead vehicle) or if an unusual lane line is detected. It also includes visual alerts based upon speed-based escalation for travel speeds over 25 mph.
NTSB chair Robert L. Sumwalt

For example, at a speed of 25 mph, if the vehicle does not detect hands on the wheel for 60 seconds, a visual alert would be provided. At 90 mph the alert would be provided after 10 seconds. Tesla advised that more immediate audible warnings are also provided if the system no longer recognizes a lane while hands are not detected on the steering wheel.
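The speed-based escalation above can be sketched as a simple lookup. Note this is illustrative only: the article gives just two data points (60 seconds at 25 mph, 10 seconds at 90 mph), and the linear interpolation between them is an assumption, not Tesla's published alert schedule:

```python
def hands_off_alert_delay(speed_mph: float) -> float:
    """Seconds of hands-off driving before a visual alert fires.

    Endpoints come from the reported figures (60 s at 25 mph, 10 s at
    90 mph); linear interpolation between them is an assumption.
    """
    lo_speed, lo_delay = 25.0, 60.0
    hi_speed, hi_delay = 90.0, 10.0
    if speed_mph <= lo_speed:
        return lo_delay
    if speed_mph >= hi_speed:
        return hi_delay
    frac = (speed_mph - lo_speed) / (hi_speed - lo_speed)
    return lo_delay + frac * (hi_delay - lo_delay)

print(hands_off_alert_delay(25))    # 60.0
print(hands_off_alert_delay(90))    # 10.0
print(hands_off_alert_delay(57.5))  # 35.0 (midpoint under the linear model)
```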
Driveable Space Collision Warning (Firmware Update 2018.23): The firmware was updated with a forward collision warning and active braking when certain unknown objects are identified in the path of the vehicle. This “Drivable Space” time-to-collision check provides a warning up to 2.5 seconds before a predicted impact. Since this warning is part of the Autopilot system, it cannot be disabled when the Autopilot system is operating. The warning is determined by the camera vision system, without radar fusion, and establishes a boundary ahead of the vehicle.
When the vehicle approaches the end of the driveable space, a chime will sound and maximum braking will be applied. Driveable space provides heavy deceleration but is primarily intended to give drivers an audible warning and reduce impact severity, not to fully prevent all crashes at highway speeds.
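A time-to-collision check of the kind described can be sketched as below. This is a minimal illustration assuming a constant closing speed; the 2.5-second threshold comes from the article, while the function names and the simple distance/speed model are illustrative assumptions, not Tesla's implementation:

```python
from typing import Optional

def time_to_collision(distance_ft: float, closing_speed_ftps: float) -> Optional[float]:
    """Seconds until impact at the current closing speed, or None if not closing."""
    if closing_speed_ftps <= 0:
        return None
    return distance_ft / closing_speed_ftps

def driveable_space_warning(distance_ft: float, closing_speed_ftps: float,
                            threshold_s: float = 2.5) -> bool:
    """True when the detected object is inside the ~2.5 s warning horizon."""
    ttc = time_to_collision(distance_ft, closing_speed_ftps)
    return ttc is not None and ttc <= threshold_s

# At ~104 ft/s (roughly the 70.8 mph impact speed), a stationary barrier
# 260 ft away is 2.5 s out -- right at the warning boundary
print(driveable_space_warning(260, 104))  # True
print(driveable_space_warning(300, 104))  # False
```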
Driver distraction
The NTSB said that the driver was an avid gamer and game developer who worked for Apple. A review of cell phone records and data retrieved from his Apple iPhone 8 Plus showed a game application was active and was the frontmost open application on his phone during his trip to work.
The driver’s lack of evasive action, combined with data indicating his hands were not detected on the steering wheel, is consistent with a person distracted by a portable electronic device, the NTSB said.
In his opening comments, NTSB chairman Sumwalt said that "Employers have a critical role in fighting distracted driving, including a ban on using Personal Electronic Devices (PEDs) while driving. The driver in this crash was employed by Apple — a tech leader," Sumwalt said. "But when it comes to recognizing the need for a company PED policy, Apple is lagging because they don’t have such a policy."
Safety issues and recommendations
Seven safety issues were identified in the crash investigation:
- Driver distraction
- Risk mitigation pertaining to monitoring driver engagement
- Risk assessment pertaining to operational design domain (in other words, the operating conditions under which a driving automation system is designed to function)
- Limitations of collision avoidance systems
- Insufficient federal oversight of partial driving automation systems
- Need for event data recording requirements for driving automation systems
- Highway infrastructure issues
To address these safety issues, the NTSB made nine safety recommendations that seek:
- Expansion of NHTSA’s New Car Assessment Program testing of forward collision avoidance system performance.
- Evaluation of Tesla “Autopilot”-equipped vehicles to determine if the system’s operating limitations, foreseeability of misuse, and ability to operate vehicles outside the intended operational design domain pose an unreasonable risk to safety.
- Collaborative development of standards for driver monitoring systems to minimize driver disengagement, prevent automation complacency and account for foreseeable misuse of the automation.
- Review and revision of distracted driving initiatives to increase employers’ awareness of the need for strong cell phone policies prohibiting portable electronic device use while driving.
- Modification of enforcement strategies for employers who fail to address the hazards of distracted driving.
- Development of a distracted driving lock-out mechanism or application for portable electronic devices that will automatically disable any driver-distracting functions when a vehicle is in motion.
- Development of policy that bans nonemergency use of portable electronic devices while driving by all employees and contractors driving company vehicles, operating company issued portable electronic devices or when using a portable electronic device to engage in work-related communications.
Wow.
Yet another oil company paid hack job.
The finding should have been DRIVER WAS PLAYING VIDEO GAME DURING CRASH.
That is not an over reliance on technology. That is abject stupidity.
Easy way to tell it's an oil company hack job?
What's the only car company called out specifically? Waymo? Uber? GM?
Nope, only Tesla.
In reply to #2
Agree. There are plenty of notifications from Tesla that the driver must be attentive and have his hands on the wheel. This "tech" guy should have known better, but was too involved with his "gaming". Losing his life over a game? What's next, falling off a cliff for a selfie?
In reply to #2
The driver was relying on the heavily promoted misnamed system called AUTOPILOT.
A system called AUTOPILOT that is not an autopilot is a lethal weapon - as is precisely proven by this case.
This has nothing at all to do with what the driver was incidentally doing because he was using the vehicle in the exact manner it was promoted - on AUTOPILOT.
Your attempt to blame the driver speaks to your own personal bias and has nothing to do with the facts.
"Oil company hack job??" Because a Tesla crashed they mentioned only Tesla. Yep gotta be the oil companies. Self driving tech getting slammed because its not really self driving. Yep must be the oil companies.
Guess they should have said Ford truck hits barrier because it runs on diesel.
In reply to #5
Ah, I am glad you cleared that up. WE ARE SAVED!
Megacorps are not driven by money but are altruistic, humanistic and benevolent.
How could I be SO wrong!
They would never do anything shady or illegal. They will GLADLY lose money to do the right things, just like GM with the ignition switch, Takata with the airbags, Boeing with the 737, BP with Deepwater Horizon. Oh, and the regulators didn't see any problems with any of these either. Those were all FAKE NEWS perpetrated by the liberal elite to take away your guns.
Oh and that whole American revolution conspiracy? Never happened. Fake News and paid crisis actors. We all know King George lets us go and wished us well.
Now put your fingers back in your ears and start humming again and you'll be fine. Just dismiss me as a mean old Republican. I wouldn't want to scare you.