Many car enthusiasts know consumer advocate Ralph Nader as the man who wrecked the Chevrolet Corvair with his famous book, Unsafe at Any Speed. In the book, Nader pointed out the dangerous shortcomings of several domestic vehicles, including the Corvair, which he said had a tendency to spin and roll over due to its rear-engine layout and suspension design.
Nader’s book and other advocacy made major waves and went a long way toward shaping the standard safety equipment found in consumer vehicles today. Things like seat belts and anti-lock brakes are now standard in part because of his efforts. Now Nader is turning his ire on Tesla and its “Full Self-Driving” (FSD) software which, while not final, is currently being used by thousands of ordinary drivers in “beta” form. The National Highway Traffic Safety Administration (NHTSA) is currently investigating several crashes (some fatal) in which Tesla’s Autopilot feature, a far more limited driver aid that shares some sensors and nascent abilities with FSD, is implicated as a possible cause.
Nader released a statement lambasting Tesla for the software and arguing that it should never have been put in its cars in the first place.
“Tesla’s major deployment of so-called Full Self-Driving (FSD) technology is one of the most dangerous and irresponsible actions by an automaker in decades. Tesla should never have put this technology into its vehicles. Today, more than 100,000 Tesla owners are currently using technology that research shows malfunctions every eight minutes.”
Putting that claim aside for a moment, it’s worth reminding everyone that despite its name, FSD cannot pilot Teslas autonomously. Drivers are still required to monitor the car while the system is active, and the system has limitations. That is the really inconvenient part of it all: when other companies test autonomous driving in the real world, often with explicit permission from the relevant authorities, they do so with trained professionals behind the wheel monitoring the cars’ behavior. Those drivers are specially trained to intervene in the event of a malfunction and know the system’s operating parameters and limits. Tesla, by contrast, is using its own customers as real-world testers, collecting data on their use of FSD while wielding the software’s “beta” label as a shield. After all, customers must opt in to the beta, though many are eager Tesla fans thrilled to watch FSD grow and improve. We’re not here to rain on that enthusiasm, but you can see how such optimism (free of the professional obligations or scientific rigor of, say, a corporate engineer) can lead to poor judgment when choosing appropriate operating contexts for FSD.
Nader seems to be tapping into this thread – that a feature that isn’t ready for prime time, and whose failure can lead to catastrophic consequences, shouldn’t be in the hands of the public. His statement goes on to call on the National Highway Traffic Safety Administration (NHTSA) to use its authority to force Tesla to remove FSD software from all of its vehicles.
“I call on federal regulators to act immediately to prevent the growing number of deaths and injuries from Tesla manslaughter crashes with this technology. The National Highway Traffic Safety Administration (NHTSA) has the power to act quickly to prevent such disasters,” the statement read. “NHTSA must use its safety recall authority to order that FSD technology be removed from every Tesla.”
The statement ends with Nader delivering a message to “glib-minded regulators” that “Americans should not be test dummies for a powerful, high-profile corporation and its famous CEO. No one is above the laws of manslaughter.”
NHTSA has previously tangled with Tesla over the FSD beta, asking it to recall software updates, but it has not forced the electric vehicle maker to remove the software altogether. Since NHTSA is already investigating FSD and Autopilot, it’s unlikely to take further action based on Nader’s statement alone, but it’s certainly not a great look for Tesla.
The automaker is also currently grappling with a complaint from the California DMV alleging false advertising for positioning Teslas with FSD as fully autonomous vehicles, despite the fact that Teslas equipped with Autopilot and FSD offer only Level 2 autonomy.