NTSB to probe Tesla Model S crash in which sedan rear-ended fire truck with Autopilot reportedly engaged
The National Transportation Safety Board will conduct an investigation into a crash involving a Tesla Model S and a fire truck that took place on a freeway in the Los Angeles area this week. In the Jan. 22 crash, a Model S sedan rear-ended a fire truck that was parked in the emergency lane and was responding to another accident, The Mercury News reported.
No one was injured in the crash, which caused heavy frontal damage to the Model S and some damage to the rear left portion of the fire truck.
"Amazingly there were no injuries! Please stay alert while driving!" the Culver City Firefighters union said in a tweet.
The driver of the Model S, who was reportedly the only person in the vehicle, said that the Autopilot driver-assist system was engaged at the time of the crash, according to The Mercury News. However, investigating agencies have not definitively established that Autopilot was in fact engaged at the time of the crash or in the seconds before it.
As in other crashes involving Autopilot, it's expected that the car's computer could provide investigators at the NTSB and the automaker with telemetry data from the incident.
This is the second investigation involving Tesla's Autopilot driver-assist system by the NTSB; in 2016, the agency investigated a fatal Florida crash involving a Model S in which Autopilot use was eventually confirmed. In that incident, a Model S drove under the trailer of a semitruck at high speed, ripping the roof off the car and killing the driver. The NTSB determined that the "truck driver's failure to yield the right of way and a car driver's inattention due to overreliance on vehicle automation" were the probable cause of the crash.
"Advanced Driver Assistance Systems, such as Tesla's Autopilot, require the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes," the NHTSA stated following the report on the fatal Florida crash. "Automated Emergency Braking systems were designed to aid in avoiding or mitigating rear-end collisions. The systems have limitations and may not always detect threats or provide warnings or automatic braking early enough to avoid collisions. Although perhaps not as specific as it could be, Tesla provides information about system limitations in the owner's manuals, user interface and associated warnings/alerts, as well as a driver monitoring system that is designed to aid the driver in remaining engaged in the driving task at all times. Drivers should read all instructions and warnings provided in owner's manuals for ADAS technologies and be aware of system limitations."
In the weeks following the Florida crash, Tesla published a blog post reminding users to keep their hands on the wheel while using Autopilot, and it subsequently made the system stricter about monitoring the actions of the driver.
The most surprising finding of the fatal Florida crash probe was the NTSB's determination that the Autopilot system could not detect the truck crossing the road directly in its path and was not even designed to do so.
"The Tesla's automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla's path or recognize the impending crash," the NTSB said in a statement. "Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate."
In the wake of several crashes in which Autopilot use was suspected or confirmed, Tesla has faced criticism for overstating the abilities of the system and for not building stronger safety mechanisms into it that would compel proper usage.
Tesla had to retrain store staff in China after it was publicized that the cars were being marketed as "self-driving" (a translation snafu was blamed) and that store staff gave demonstration rides to prospective buyers while keeping their hands off the steering wheel for extended stretches.