FutureFive New Zealand - Consumer technology news & reviews from the future

Fatal 'autopilot' car crash puts Tesla's life on the line

Fri, 1st Jul 2016

Autonomous driving may not be as safe as it claims to be: Tesla has acknowledged the first fatal crash involving one of its semi-autonomous cars, news that could prove a blow to the rapidly developing industry.

Tesla has publicly shared its distress about the first fatal crash involving its Model S. The crash claimed the life of 40-year-old Joshua Brown, a customer Tesla describes as an innovator and a friend of the company and the broader EV community who "believed strongly in Tesla's mission".

Tesla stated in a blog today that the accident was the first known fatality in the 130 million miles (about 209 million km) driven with the car's Autopilot feature activated.

The crash happened in the United States back in May, when the Tesla collided with a truck-and-trailer. The car - and its driver - failed to detect the white truck against the bright sky. Neither the car nor the driver applied the brakes.

Tesla says the trailer's unusually high ride height, combined with its road positioning, caused the car to pass underneath it, with the trailer striking the car's windscreen. One media report stated that the impact tore the roof off the car.

Tesla has admitted in the past that its sensors have difficulty detecting all objects around the car; the system remains a work in progress. Investigations are underway to determine whether there was a fault in the car's systems.

The company states that the Autopilot assist still needs to be human-managed. With this in mind, it's not entirely a "self-driving" accident, but it does show that both the technology and the people behind the wheel remain vulnerable, and that drivers may trust the system a little too much.

Tesla says that had the car hit the front or back of the trailer at high speed, its advanced crash safety system would have deployed and likely prevented serious injury, as it has done in the past.

Tesla says it disables its Autopilot system by default, and users must specifically agree to use the technology while it is still in a public beta phase. The company points to the Autopilot clauses in that agreement, which describe it as "an assist feature that requires you to keep your hands on the steering wheel at all times", with the driver retaining control, awareness and responsibility, and warn that users must "be prepared to take over at any time".

Tesla says its system regularly checks that the driver's hands are on the wheel and issues alerts when they are not. If the alerts are ignored, the car slows down until the driver's hands are back on the wheel.

Tesla admits that its autopilot is continuously being improved as the software accounts for rare events, but driver smarts are definitely still needed.

Will this be a setback for Tesla, at a time when the company is successfully driving the autonomous and EV industry forward? Tesla puts the blame mostly on the driver in this case, particularly as the technology is still in beta.

While the crash is still under investigation by the US National Highway Traffic Safety Administration (NHTSA), it provides a tragic lesson for the future of autonomous vehicles.
