Federal investigators warn Tesla is using customers as ‘guinea pigs’ to test its ‘Full Self-Driving’

The National Transportation Safety Board (NTSB) suggests Tesla is using customers as ‘guinea pigs’ to test its autonomous driving technology before it is officially approved – and blames its sister agency for letting it happen.

In a letter to the National Highway Traffic Safety Administration (NHTSA), NTSB is calling for stricter requirements for the design and use of automated driving systems on public roads, CNBC reports.

Tesla is named 16 times in the document, mainly because it released its ‘Full Self-Driving’ (FSD) beta version to the public ‘with limited oversight or reporting requirements.’

Although NTSB points to the Elon Musk-owned firm for its lack of safeguarding, the agency is also slamming NHTSA for its ‘hands-off approach’ to monitoring such testing on public roads.


Tesla first launched its FSD beta program in October to a limited number of customers who were deemed ‘expert and careful drivers.’ 

The firm now has more than 1,000 paying customers testing the beta version, which is currently at Level 2 autonomy – drivers are required to remain aware and in control of all driving activity.

However, NTSB is not happy that thousands of customers are cruising around with the unapproved system – and the agency is taking action.

Robert Sumwalt, chairman of the NTSB, wrote a letter dated February 1 holding both Musk’s firm and NHTSA responsible for a number of fatal crashes.


‘Tesla recently released a beta version of its Level 2 Autopilot system, described as having full self-driving capability. By releasing the system, Tesla is testing on public roads a highly automated AV [automated vehicle] technology but with limited oversight or reporting requirements,’ Sumwalt wrote. 

‘NHTSA’s hands-off approach to oversight of AV testing poses a potential risk to motorists and other road users.’ 

He notes that it was Tesla’s lack of appropriate safeguards and NHTSA’s inaction that led to the deaths of drivers in Florida.

‘The NTSB remains concerned about NHTSA’s continued failure to recognize the importance of ensuring that acceptable safeguards are in place so that vehicles do not operate outside their ODDs [operational design domain] and beyond the capabilities of their system designs,’ the letter reads.

Following a number of crashes in which Tesla drivers were operating the vehicle under Autopilot, NTSB determined a thorough investigation into how the system was being misused was necessary – and NHTSA was tasked with doing the job.

Robert Sumwalt, chairman of the NTSB, wrote a letter dated February 1 holding both Musk’s firm and NHTSA responsible for a number of fatal crashes. Pictured is a fatal accident in Mountain View, California – an incident cited in the letter

‘NTSB recommended that NHTSA evaluate Tesla Autopilot-equipped vehicles to determine if the system’s operating limitations, the foreseeability of driver misuse, and the ability to operate the vehicles outside the intended ODD pose an unreasonable risk to safety; and that if safety defects are identified, the agency should use its enforcement authority to ensure that Tesla takes corrective action,’ Sumwalt wrote in the letter.

‘To date, NHTSA has shown no indication that it is prepared to respond effectively and in a timely manner to potential AV safety-related defects.’

Although the two agencies fall under the US government umbrella, they are tasked with different roles.

The NTSB investigates accidents to determine the underlying causes of damaging incidents, such as the fatal crashes of Tesla drivers using Autopilot in Mountain View, California, in March 2018 and Delray Beach, Florida, in March 2019.

They also create safety recommendations for regulators and the auto industry as a whole.

However, once the recommendations are made, NHTSA is required to mandate any necessary recalls, along with releasing its own standards and requirements for safety and design.

If NTSB is able to push stricter requirements on autonomous technology, it could spell bad news for Tesla and its future systems.

It would have to test its FSD updates internally, instead of using thousands of paying customers to lead the way.

However, it seems Tesla may not have any new updates in the near future – leaked emails say its beta version will not surpass Level 2.

Documents between Tesla attorneys and the California Department of Motor Vehicles (DMV) surfaced last week stating that the software, known as ‘Autosteer on City Streets,’ is far from giving drivers hands-free capabilities.

‘City Streets continues to firmly root the vehicle in SAE Level 2 capability and does not make it autonomous under the DMV’s definition,’ wrote Eric Williams, Tesla associate general counsel, in a statement attached to an email with the California DMV that has been published to PlainSite.

‘City Streets’ capabilities with respect to the object and event detection and response (OEDR) sub-task are limited, as there are circumstances and events to which the system is not capable of recognizing or responding.’

Read more at DailyMail.co.uk