Tesla Faces Another US Investigation: Unexpected Braking

U.S. auto safety regulators have launched another investigation of Tesla, this time tied to complaints that its cars can come to a stop for no apparent reason.  

The government says it has 354 complaints from owners during the past nine months about “phantom braking” in Tesla Models 3 and Y. The probe covers an estimated 416,000 vehicles from the 2021 and 2022 model years.  

No crashes or injuries were reported. 

The vehicles are equipped with partially automated driver-assist features, such as adaptive cruise control and “Autopilot,” which allow them to automatically brake and steer within their lanes. 

Documents posted Thursday by the National Highway Traffic Safety Administration say the vehicles can unexpectedly brake at highway speeds.  

“Complainants report that the rapid deceleration can occur without warning, and often repeatedly during a single drive cycle,” the agency said. 

Many owners said in their complaints that they feared being rear-ended on a freeway.

The probe is another in a string of enforcement efforts by the agency that include Autopilot and “Full Self-Driving” software. Despite their names, neither feature can legally drive the vehicles without people supervising.

Messages were left Thursday seeking comment from Tesla. 

It’s the fourth formal investigation of the Texas automaker in the past three years, and NHTSA has supervised 15 Tesla recalls since January 2021. In addition, the agency has sent investigators to at least 33 crashes involving Teslas using driver-assist systems since 2016 in which 11 people were killed.

In one of the complaints, a Tesla owner from Austin, Texas, reported that a Model Y on Autopilot brakes repeatedly for no reason on two-lane roads and freeways. 

“The phantom braking varies from a minor throttle response to decrease speed to full emergency braking that drastically reduces the speed at a rapid pace, resulting in unsafe driving conditions for occupants of my vehicle as well as those who might be following behind me,” the owner wrote in a complaint filed February 2. People who file complaints are not identified in NHTSA’s public database.  

Tesla CEO Elon Musk has been fighting with U.S. and California government agencies for years, sparring with NHTSA and the Securities and Exchange Commission.  

Last week, NHTSA made Tesla recall nearly 579,000 vehicles in the U.S. because a “Boombox” function can play sounds over an external speaker and obscure audible warnings that alert pedestrians to an approaching vehicle. Musk, when asked on Twitter why the company agreed to the recall, responded: “The fun police made us do it (sigh).”

Michael Brooks, acting executive director of the nonprofit Center for Auto Safety, said it’s encouraging to see NHTSA’s enforcement actions “after years of turning the other way” with Tesla. But he said the company keeps releasing software onto U.S. roads that isn’t tested to make sure it’s safe.

“A piecemeal investigative approach to each problem that raises its head does not address the larger issue in Tesla’s safety culture — the company’s continued willingness to beta test its technology on the American public while misrepresenting the capabilities of its vehicles,” Brooks wrote in an email Thursday. 

Other recent Tesla recalls covered “Full Self-Driving”-equipped vehicles that were programmed to run stop signs at slow speeds, heating systems that don’t clear windshields quickly enough, seat belt chimes that don’t sound to warn unbuckled drivers, and a feature that allowed movies to play on touch screens while cars are being driven. Those issues were to be fixed with online software updates.

In August, NHTSA announced a probe of Teslas on Autopilot failing to stop for emergency vehicles parked on roadways. That investigation covers a dozen crashes that killed one person and injured 17 others.  

Thursday’s investigation comes after Tesla recalled nearly 12,000 vehicles in October for a similar phantom braking problem. The company sent out an online software update to fix a glitch with its more sophisticated “Full Self-Driving” software. 

Tesla did a software update in late September that was intended to improve detection of emergency vehicle lights in low-light conditions. 

Selected Tesla drivers have been beta testing the “Full Self-Driving” software on public roads. NHTSA also has asked the company for information about the testing, including a Tesla requirement that testers not disclose information. 

 
