Tesla safety criticised after Autopilot crash

APD NEWS

Tesla has been criticised for a lack of safeguards in its Autopilot system, following a fatal crash in California.

Apple engineer Walter Huang died when his self-driving Tesla Model X hit a concrete barrier near Mountain View in March 2018.

The 38-year-old had complained to friends and family that the vehicle's Autopilot feature had steered it towards the same barrier on previous occasions.

At a hearing on Tuesday, the National Transportation Safety Board set out its findings on what caused the crash.

NTSB chairman Robert Sumwalt said: "What struck me most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology.

"This car has level two automation, meaning that it could only drive itself under certain conditions and, most importantly, that an attentive driver must supervise the automation at all times, ready to take control."

Mr Sumwalt said government regulators had "provided scant oversight, ignoring this board's recommendations for system safeguards".

He added: "Industry keeps implementing technology in such a way that people can get injured or killed, ignoring this board's recommendations intended to help them prevent such tragedies.

"There is not a vehicle currently available to US consumers that is self-driving. Period.

"Every vehicle sold to US consumers still requires the driver to be actively engaged in the driving task, even when advanced driver assistance systems are activated."

Mr Sumwalt said that two recommendations were made in 2017 to six manufacturers regarding the inappropriate use of driving automation systems - five had responded positively but Tesla "ignored us".

"It's been 881 days since these recommendations were sent to Tesla. We're still waiting,"

The NTSB can only make recommendations - it is up to the National Highway Traffic Safety Administration to regulate vehicle safety.

The NHTSA said it would carefully review the conclusions.

It comes amid growing concern about systems that can perform driving tasks with little or no human input but cannot completely replace drivers.

Tesla's Autopilot system has been linked to at least three deadly crashes since 2016 and is suspected of being a factor in others.

Tesla has not commented.

(SKY NEWS)