The federal government’s top auto safety agency said Monday it had opened a formal investigation into the Autopilot system used in hundreds of thousands of Tesla’s electric cars.
The investigation was prompted by at least 11 accidents in which Teslas operating on Autopilot, an assisted-driving system that can steer, accelerate and brake on its own, drove into parked fire trucks, police cars and other emergency vehicles, the National Highway Traffic Safety Administration said. One woman was killed and 17 people were injured in those crashes.
Safety experts and regulators have scrutinized Autopilot since the first fatal crash involving the system in 2016, when the driver of a Tesla Model S was killed after his car collided with a tractor-trailer in Florida. In that case, the safety agency concluded that the system had no defects, a position it maintained for years even as the number of Autopilot crashes and deaths grew.
On Monday, the agency appeared to change course. The investigation is the most comprehensive examination yet of Autopilot and of the potential defects that could make the system, and the Teslas that use it, dangerous.
Depending on its findings, the safety agency could force Tesla to recall cars and modify the system. It also has the power to compel automakers to equip their cars with safety devices and features, as it did when it required backup cameras and airbags.
A central question for investigators will be how Autopilot ensures that Tesla drivers are paying attention to the road and are ready to retake control of their cars if the system fails to detect an obstacle and brake. The company’s manuals instruct drivers to keep their hands on the steering wheel, but the system keeps operating even if a driver merely taps the wheel occasionally.
“Driver monitoring has been a major shortcoming of Autopilot,” said Raj Rajkumar, an engineering professor at Carnegie Mellon University who focuses on autonomous vehicles. “I think this investigation should have started some time ago, but it’s better late than never.”
Tesla, by far the world’s most valuable automaker, and its charismatic and brash chief executive, Elon Musk, have maintained that Autopilot is not flawed and have insisted that it makes cars far safer than others on the road. They have brushed aside warnings from safety experts and criticism from the National Transportation Safety Board of the company’s handling of Autopilot.
The company and Mr. Musk, who comments frequently on Twitter, did not respond to requests for comment on Monday and made no public statement about the new investigation.
Mr. Musk has dismissed the idea that Tesla’s advanced driver-assistance system should monitor drivers, saying in 2019 that human intervention could make such systems less safe.
His views stand in stark contrast to the approach taken by General Motors and other automakers. GM, for example, offers a driver-assistance system called Super Cruise on some models. The system lets drivers take their hands off the steering wheel, but it uses an infrared camera to monitor their eyes and make sure they are looking at the road.
The safety agency said it would also investigate how Autopilot detects objects on the road and under what conditions the system can be engaged. Tesla urges drivers to use the system only on divided highways, but it can be activated on smaller streets and roads. GM uses GPS to limit Super Cruise to major highways that have no oncoming or crossing traffic, intersections, pedestrians or cyclists.
Tesla’s Autopilot system appears to have particular trouble detecting and braking for stopped vehicles, including passenger cars and trucks without flashing lights. In July, for example, a Tesla crashed into a parked sport utility vehicle. The driver had turned on Autopilot, fallen asleep and later failed a sobriety test, the California Highway Patrol said.
The safety agency’s investigation will cover all Tesla models, the Y, X, S and 3, from model years 2014 through 2021, a total of 765,000 cars and the vast majority of the vehicles the company has made in the United States.
The new investigation comes on top of inquiries the agency is already conducting into more than two dozen crashes involving Autopilot. Eight of those crashes resulted in a total of 10 deaths, the agency has said. Those inquiries are meant to examine the details of individual cases and produce data and insights the agency and automakers can use to improve safety or identify problem areas.
Tesla has acknowledged that Autopilot sometimes fails to recognize stopped emergency vehicles. And safety experts, videos posted on social media and Tesla drivers themselves have documented a variety of Autopilot weaknesses.
In some crashes involving the system, Tesla drivers were found to be asleep at the wheel, or awake but distracted or impaired. A California man was arrested in May after leaving the driver’s seat of his Tesla while it was on Autopilot; he was sitting in the back of the car as it crossed the San Francisco-Oakland Bay Bridge.
At least one person died in the 11 accidents involving emergency vehicles that the agency is investigating. A few days after Christmas in 2019, Derrick and Jenna Monet were driving on Interstate 70 in Indiana, west of Indianapolis, when their Tesla crashed into a parked fire truck, the Indiana State Police said at the time. Ms. Monet, who was 23 and a passenger in the Tesla Model 3, died. Mr. Monet, who was driving, could not be reached for comment.
Some of the other accidents caused serious injuries. In February, police officers in Montgomery County, Texas, north of Houston, were conducting a traffic stop when one of their vehicles was struck by a Tesla. Several officers and a police dog were treated for minor injuries, and one person at the scene was taken to a hospital with serious injuries, a local officer said. The Tesla’s driver was arrested on suspicion of driving under the influence of alcohol. In another crash last year in Nash County, N.C., near Raleigh, the sheriff’s office said on Facebook that the Tesla’s driver had been watching a movie.
The National Transportation Safety Board, which investigates accidents but has no power to compel automakers to make changes, has urged the National Highway Traffic Safety Administration to regulate Autopilot and other advanced driver-assistance systems more aggressively. Last year, the safety board said in a report that Tesla’s “ineffective monitoring of driver engagement” had contributed to a 2018 crash in Mountain View, Calif., in which Wei Huang, the driver of a Model X, was killed when his car struck a highway barrier.
After the release of that report, Robert L. Sumwalt, then the chairman of the safety board, urged the traffic safety agency “to exercise its oversight to ensure corrective action is taken.”
“It’s time to stop allowing drivers of partially automated vehicles to pretend that they have self-driving cars,” he said.
In a statement on Monday, the National Transportation Safety Board said NHTSA’s investigation of Autopilot was “a positive step forward for safety.”
Auto safety experts have often criticized the highway safety agency for doing too little to investigate deadly car defects, such as a faulty GM ignition switch and defective Takata airbags. Some of the agency’s top officials have come from the auto industry or joined it after leaving government.
In January, President Biden named Steven Cliff, previously a deputy executive officer of the California Air Resources Board, as the agency’s deputy administrator. Mr. Cliff has spent much of his career working on air pollution and vehicle emissions.
The first Tesla fatality in the United States occurred in 2016, when Joshua Brown, an Ohio man and former member of the Navy SEALs, was killed in Florida. His Model S was on Autopilot on a divided highway when a tractor-trailer crossed the road in front of it. Tesla said Autopilot failed to recognize the truck because it was white and the sky behind it was bright.