
Elon Musk admits Tesla’s autopilot will ‘never be perfect’

Billionaire hi-tech mogul Elon Musk admitted on Thursday that the autopilot program his automotive company is working on will ‘never be perfect’ at preventing accidents.

Speaking with CBS News on Thursday, Musk said that the system is not designed to replace human control of a vehicle but instead aid motorists with driving tasks like staying within a lane. 

‘Well what’s the purpose of having autopilot if you still have to put your hands on the wheel, Elon?’ Gayle King, co-host of ‘CBS This Morning,’ asked the famed inventor during their interview. 

Musk replied:  ‘Oh it, it’s because the probability of an accident with autopilot is just less.’

‘The system worked as described, which is that it’s a hands-on system. It is not a self-driving system,’ he added. 

Last month, federal officials removed Musk’s electric car company, Tesla, from an investigation into the death of a man who was using the company’s autopilot system when his vehicle crashed into a highway barrier.

The National Transportation Safety Board said on Thursday that Tesla released information about the probe without first coordinating with the agency, drawing heavy criticism from federal officials whose investigation was still under way, according to CBS News. 

The carmaker said in a statement, however, that it withdrew from the NTSB party agreement because ‘We believe in transparency, so an agreement that prevents public release of information for over a year is unacceptable.’ 

The NTSB is currently investigating the autopilot system in Tesla’s Model X, the vehicle Walter Huang was driving when he died last month in California.  

‘It’s important to emphasize we’ll never be perfect,’ Musk said. ‘Nothing in the real world is perfect. But I do think that long term, it can reduce accidents by a factor of 10. So there are 10 fewer fatalities and tragedies and serious injuries. And that’s a really huge difference.’ 

Huang’s widow said she intends to sue the car company, claiming Tesla knew, or should have known, that the autopilot system was not ready for human testing.  

While Tesla said it was ‘incredibly sorry’ for the family’s loss, the company blamed the accident on Huang, saying he did not have his hands on the wheel at the time of the crash despite repeated warnings to take control of the vehicle.

‘Tesla autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur,’ the company said in a statement. 

Tesla’s statement continued: 

We are very sorry for the family’s loss. 

According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.

The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang’s drive that day.

We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive.