Posts with «transport accident» label

NTSB: Autopilot was not a factor in fatal Tesla Model S crash

Tesla's Autopilot was not at fault in a 2021 crash in which two people died, according to the National Transportation Safety Board (NTSB). In a final report spotted by Ars Technica, the agency determined that the 2019 Model S accelerated just before hitting a tree in Spring, Texas, just north of Houston. Neither occupant was in the driver's seat when they were found, leading to questions about the use of Autopilot.

Based on information provided by Tesla, the NTSB found (PDF) that the car's rapid acceleration from 39 mph to 67 mph in the two seconds before the crash, and the resulting loss of control of the EV, were likely due to "impairment from alcohol intoxication in combination with the effects of two sedating antihistamines, resulting in a roadway departure, tree impact and post-crash fire." The NTSB says data indicated that Autopilot had not been engaged "at any time during this ownership period of the vehicle." Investigators also did not find any "evidence of mechanical deficiencies" that could have caused or contributed to the crash.

One of the occupants was found in the front passenger seat, while the other was in the rear, presumably because the driver had moved to the back seat while trying to escape. Security footage showed that the men were in the front seats as they set off, and data showed that both front seatbelts were buckled at the time of the crash, which occurred around 550 feet from the driver's home. The men died as a result of the collision and the post-crash battery fire.

Jury finds Tesla just '1%' responsible for a Florida teen's crash

Tesla is receiving minimal blame for a fiery 2018 crash in South Florida, which killed two teenagers and injured another. A jury today found Tesla just one percent responsible for the crash, reports the AP, which means it's only responsible for paying $105,000 of the $10.5 million awarded to the teen's family. Ninety percent of the blame was placed on the teen driver, Barrett Riley, while his father, James Riley, received nine percent.

According to an NTSB investigation, Barrett Riley was driving at 116 mph in a 30 mph zone near Fort Lauderdale Beach. The agency concluded he most likely lost control of the vehicle. James Riley initially sued Tesla over the crash, claiming that it would have been survivable if the electric car's lithium ion batteries hadn't "burst into an uncontrollable and fatal fire." He also noted that the company removed a speed limiter that was meant to keep the vehicle under 85 mph. An investigation later found that his son had asked a Tesla dealership to remove that limiter.

Tesla's lawyers argued that Riley's parents were negligent in allowing him to drive the car despite his record of reckless driving and speeding, and they denied negligence on the company's part. After the 2018 crash, Tesla released an update allowing drivers to set their own speed limits, a feature it dedicated to Barrett Riley.

NHTSA deepens its probe into Tesla collisions with stationary emergency vehicles

The National Highway Traffic Safety Administration (NHTSA) has upgraded (PDF) its investigation into a series of Tesla crashes involving first responders to an engineering analysis. As The Washington Post explains, that's the last stage of an investigation, and the agency typically decides within a year whether a vehicle should be recalled or the probe should be closed. In addition to upgrading the probe's status, the investigation now covers 830,000 vehicles, or almost all of the Model Y, Model X, Model S and Model 3 vehicles Tesla has sold since 2014.

This development expands on the investigation the NHTSA opened back in 2021 following 11 collisions between Tesla vehicles and parked emergency response vehicles and trucks. Since then, the agency has identified and added six more incidents that occurred over the past couple of years. In most of those crashes, Autopilot relinquished control of the vehicle less than one second before impact, though Automatic Emergency Braking intervened in at least half of them.

The NHTSA also found that the first responders on the road would have been visible to the drivers an average of eight seconds before impact. Yet forensic data showed no driver took evasive action in the two to five seconds prior to impact, even though they all had their hands on the wheel. Nine of the 11 vehicles originally involved in the investigation exhibited no driver-engagement visual or chime alerts until the final minute before the collision, and four of them exhibited no such alerts at all.

The NHTSA also looked into 191 crashes not limited to incidents involving first responders. In 53 of those collisions, the agency found that the driver was "insufficiently responsive," as evidenced by their failure to intervene when needed. All of this suggests that while drivers are complying with Tesla's instructions to keep their hands on the wheel at all times, they're not necessarily paying attention to their environment.

That said, the NHTSA noted in its report that "a driver's use or misuse of vehicle components, or operation of a vehicle in an unintended manner does not necessarily preclude a system defect." As University of South Carolina law professor Bryant Walker Smith told The Post, monitoring the position of a driver's hands isn't effective enough, because it doesn't ensure a driver's capability to respond to what they encounter on the road. 

In addition, the NHTSA noted that the way a driver interacts with the system is an important design consideration for Level 2 autonomous driving technologies. These systems still aren't fully autonomous and still largely depend on the human driver, after all. "As such, ensuring the system facilitates the driver's effective performance of this supervisory driving task presents an important safety consideration," the agency wrote.

Tesla Autopilot under investigation following crash that killed three people

A recent Model S crash that killed three people has sparked another federal probe into Tesla's Autopilot system, The Wall Street Journal has reported. The National Highway Traffic Safety Administration (NHTSA) is conducting the investigation and said it's currently looking into more than 30 incidents involving the system.

The accident occurred on May 12th on Newport Beach's Mariners Mile strip, according to the Orange County Register. The EV reportedly struck a curb and ran into construction equipment, killing all three occupants. Three construction workers were also taken to the hospital with non-life-threatening injuries. Police declined to say whether Tesla's Autopilot was involved.

Tesla is one of a number of automakers that have released Level 2 driver assistance systems designed to ease driving chores. Those systems are far from full self-driving (Level 4 or 5) though, and Tesla specifically instructs drivers to pay attention to the road and keep their hands on the wheel. 

The NHTSA said last August that it was opening an investigation into Autopilot following 11 crashes with parked first responder vehicles since 2018 that resulted in 17 injuries and one death. 

The NHTSA itself has been criticized by the National Transportation Safety Board (NTSB) for not ensuring automakers include the right safety features in their Level 2 autonomous vehicles. NTSB chair Jennifer Homendy has called Tesla's use of the term "Full Self-Driving" for its latest Autopilot system "misleading and irresponsible," saying "it has clearly misled numerous people to misuse and abuse technology." 

Tesla driver in fatal California crash first to face felony charges involving Autopilot

A Tesla owner is facing the first felony charges filed against someone who was using a partially automated driving system in the US, according to AP. The defendant, Kevin George Aziz Riad, was driving a Model S when he ran a red light and crashed into a Honda Civic at a California intersection in 2019, killing the Civic's two occupants; Riad and his companion sustained non-life-threatening injuries. California prosecutors filed two counts of vehicular manslaughter against Riad in October of last year.

The court documents reportedly didn't mention anything about Autopilot. However, the National Highway Traffic Safety Administration (NHTSA), which has been investigating the incident over the past couple of years, recently confirmed that the system was switched on at the time of the crash. The NHTSA formally opened a probe into Tesla's driver assistance system in August of last year following a string of 11 crashes involving parked first responder vehicles that resulted in 17 injuries and one death. It's also investigating other types of crashes involving Tesla vehicles, including one complaint blaming the beta version of the company's Full Self-Driving technology for a collision in California.

As AP notes, Riad is the first to face charges involving a widely used driver assistance technology, but he's not the very first person using an automated driving system to be charged in the US. In 2020, an Uber backup driver was charged with negligent homicide after the company's autonomous test vehicle struck and killed a pedestrian in Arizona. According to an investigation by the National Transportation Safety Board (NTSB), Uber's technology detected the victim more than five seconds before the crash but wasn't able to identify her as a pedestrian. The driver could have avoided the crash if she had been paying attention. 

The NHTSA told AP in a statement that "every vehicle requires the human driver to be in control at all times" even if it has a partially automated system. On its Autopilot page, Tesla says that Autopilot is "intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment."

Apple is reportedly working on a way for iPhones to detect car crashes and auto-dial 911

Your iPhone might gain a new capability as soon as next year: detecting a car accident and automatically dialing 911. Apple plans to unveil a feature called "crash detection" for both the iPhone and the Apple Watch, according to a Wall Street Journal report. The feature would reportedly use sensors like the accelerometer built into Apple devices.

Apple has reportedly been working on the feature for several years and testing it using real-world data. According to documents seen by the WSJ, Apple has been collecting data shared anonymously by iPhone and Watch users. It has detected more than 10 million suspected vehicle impacts, with more than 50,000 of those accompanied by a call to 911. Apple has been using that data to improve the accuracy of its crash-detection algorithm, since a 911 call is fairly solid confirmation of a serious crash.
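The WSJ report doesn't describe how Apple's algorithm actually works, but impact detectors of this kind generally watch for sustained, abnormally high accelerometer readings. Here's a minimal, purely illustrative sketch; the threshold, sample format and streak length are all assumptions, not details from the report:

```python
from dataclasses import dataclass

# Hypothetical threshold: sustained deceleration above ~4 g is far beyond
# normal hard braking (roughly 1 g) and suggests a collision.
CRASH_G_THRESHOLD = 4.0
SUSTAINED_SAMPLES = 3  # require several consecutive high-g readings

@dataclass
class AccelSample:
    x: float  # acceleration along each axis, in g
    y: float
    z: float

    def magnitude(self) -> float:
        return (self.x ** 2 + self.y ** 2 + self.z ** 2) ** 0.5

def detect_crash(samples: list[AccelSample]) -> bool:
    """Return True if enough consecutive samples exceed the crash threshold."""
    streak = 0
    for sample in samples:
        if sample.magnitude() >= CRASH_G_THRESHOLD:
            streak += 1
            if streak >= SUSTAINED_SAMPLES:
                return True
        else:
            streak = 0  # an isolated spike (a dropped phone, say) resets the count
    return False
```

A real implementation would combine the accelerometer with other signals (GPS speed, gyroscope data) and, as with the Watch's fall detection, prompt the user before dialing 911 so false positives can be canceled.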

Apple certainly isn't first out of the gate with this. Google introduced a similar feature for the Pixel 3 and Pixel 4 via its Personal Safety app, which can detect when you've been in a car crash and alert emergency services. GM has offered crash detection for years in its cars through OnStar, and recently brought it to smartphones via the OnStar Guardian app. OnStar's in-vehicle service reportedly responds to over 6,000 crash notifications a month, as the WSJ noted.

Apple introduced fall detection with the Apple Watch Series 4, which can automatically call emergency services and contact your loved ones if you don't respond to a prompt within a certain amount of time. The crash-detection feature is slated to come to the iPhone and Apple Watch in 2022, provided everything goes to plan.

Automakers must report crashes involving self-driving and driver-assist systems

The National Highway Traffic Safety Administration (NHTSA) has implemented a new policy requiring car companies to report incidents involving semi- and fully autonomous driving systems within one day of learning of an accident. In an order spotted by The Washington Post, NHTSA mandates that automakers fill out an electronic incident form and submit it to the agency whenever one of their systems was active either during a crash or immediately before it. They must report an accident any time there's a death, an injury that requires hospital treatment, a vehicle that's towed away, an airbag deployment, or a pedestrian or cyclist involved. The order covers Level 2 advanced driver-assistance systems through Level 5 fully autonomous vehicles, meaning it spans the gamut from Tesla cars with Autopilot to Waymo taxis.
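The order's reporting triggers amount to a simple checklist: the automated system must have been active during or immediately before the crash, plus at least one severity criterion must apply. This hypothetical helper just restates the rules described above in code form; the names and structure are illustrative, not NHTSA's:

```python
from dataclasses import dataclass

@dataclass
class Incident:
    system_was_active: bool        # ADAS/ADS engaged during or immediately before the crash
    fatality: bool
    hospital_treated_injury: bool
    vehicle_towed: bool
    airbag_deployed: bool
    pedestrian_or_cyclist: bool

def must_report(incident: Incident) -> bool:
    """An incident is reportable only if the automated system was engaged
    and at least one of the severity triggers applies."""
    if not incident.system_was_active:
        return False
    return any([
        incident.fatality,
        incident.hospital_treated_injury,
        incident.vehicle_towed,
        incident.airbag_deployed,
        incident.pedestrian_or_cyclist,
    ])
```

Note that the severity triggers are disjunctive: any single one of them, combined with an active system, makes the incident reportable within one day.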

"This action will enable NHTSA to collect information necessary for the agency to play its role in keeping Americans safe on the roadways, even as the technology deployed on the nation's roads continues to evolve," the regulator said. NHTSA said it would also require automakers to send in monthly reports detailing all incidents with injuries or property damage involving their automated driving systems. Companies that fail to comply with the order could face fines of up to $22,992 per day, according to The Post.

NHTSA's order comes some two months after a 2019 Tesla Model S was involved in a high-profile crash in which investigators initially said there was no one behind the wheel. The National Transportation Safety Board (NTSB) later said home security footage showed the owner getting into the driver's seat before the fatal accident. Just weeks before that incident, Robert Sumwalt, the chair of the NTSB, sent a letter to NHTSA calling on the agency to implement stricter regulation of automated vehicle technology. NHTSA "must act" to "develop a strong safety foundation," he said, citing Tesla frequently in his letter.