Autopilot was active when a Tesla crashed into a truck, killing driver

NTSB report says driver engaged Autopilot 10 seconds before the deadly crash.

Timothy B. Lee
– 5/16/2019, 7:10 PM

[Image: National Transportation Safety Board]

A Tesla Model 3 had Autopilot active in the seconds before it crashed into a semi truck in March, killing the driver, the National Transportation Safety Board reported on Thursday.
Jeremy Banner was driving his Model 3 on a divided four-lane highway in Palm Beach County, Florida. As the car approached a driveway, a semi truck pulled out in front of the car, making a left-hand turn from the driveway, across the car's path, toward the opposite travel lanes.
The Tesla was moving at 68mph (110km/h) and slid under the truck's trailer. The trailer sheared off the top of the car, killing Banner. The vehicle continued down the road for another 1,600 feet (500m) before coming to rest in the median.
[Image: Google Maps]
“Preliminary data show that the Tesla’s Autopilot system… was active at the time of the crash,” the NTSB reports. “The driver engaged the Autopilot about 10 seconds before the collision. From less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver’s hands on the steering wheel.”
The NTSB says that preliminary data suggests that neither the driver nor the Autopilot system made evasive maneuvers.
“We are deeply saddened by this accident and our thoughts are with everyone affected by this tragedy,” a Tesla spokeswoman wrote by email. “Tesla drivers have logged more than one billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance.”
Banner is the second Tesla customer to die this way
It’s never good news for Tesla when one of its cars is involved in a deadly crash. But this incident is particularly awkward for Tesla because it so closely mirrors the circumstances of the very first Autopilot-related death.
In May 2016, Joshua Brown was driving his Model S on another Florida highway when a semi truck made a left turn in front of his vehicle. Brown had Autopilot engaged, but he apparently didn’t notice the truck. Brown’s car smashed into the trailer at 74mph (120km/h), shearing off the top of the vehicle and killing Brown.
While both systems were called Autopilot, Brown’s and Banner’s cars actually had completely different driver assistance technologies. The Autopilot system in Brown’s Model S was based on technology supplied by Mobileye, an Israeli startup that has since been acquired by Intel. Brown’s death reportedly contributed to the two companies parting ways later in 2016.
Since late 2016, Tesla vehicles have shipped with a second-generation Autopilot system that Tesla developed in-house. By the time of Banner’s death, Tesla’s engineers had had more than two years to improve the system and avoid a repeat of the situation that led to Brown’s death.
Adaptive cruise control systems like Autopilot traditionally rely mostly on radar to avoid running into other vehicles on the road. Because radar can detect Doppler shifts, it is good at detecting moving objects. But it is not good at recognizing stationary objects (or objects, like a truck crossing the road, that are not moving in the car’s direction of travel). Radar systems lack the angular resolution to distinguish a truck crossing the road from a large street sign suspended over the road, so they tend to simply ignore stationary objects. That has led to Tesla cars (and cars from other manufacturers) plowing into parked cars and concrete barriers.
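
To make that concrete, here is a minimal sketch, in Python, of the kind of filtering logic described above. It is purely illustrative: the class, threshold, and speeds are assumptions for this example, not anything from Tesla's or any radar vendor's actual code.

```python
# Hypothetical sketch of how a radar-based ACC pipeline might filter targets.
# None of this reflects Tesla's actual implementation; the names and
# thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float           # distance to the target
    radial_speed_mps: float  # Doppler-derived closing speed along the line of sight

EGO_SPEED_MPS = 30.4  # ~68 mph

def is_relevant_target(ret: RadarReturn) -> bool:
    # The target's own speed along the line of sight is the measured radial
    # speed plus the ego vehicle's speed. Anything near zero looks
    # "stationary" to the radar -- like an overhead sign -- and is discarded.
    ground_speed = ret.radial_speed_mps + EGO_SPEED_MPS
    return abs(ground_speed) > 2.0  # illustrative threshold

# A trailer crossing the road moves mostly perpendicular to the radar beam,
# so its measured radial speed is almost entirely the ego car's closing speed
# (~ -30.4 m/s), and it appears stationary -- the filter drops it.
crossing_trailer = RadarReturn(range_m=120.0, radial_speed_mps=-30.4)
print(is_relevant_target(crossing_trailer))  # False -> ignored
```

The uncomfortable consequence is visible in the last line: to a Doppler-based filter like this, a trailer crossing the road is indistinguishable from a sign hanging over it.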

Theoretically, it should be possible to detect the side of a truck using cameras. But it’s not always easy. In some lighting conditions, for example, the side of a trailer might be hard to distinguish from the sky behind it.
Most other companies working on fully self-driving technology use lidar sensors to supplement cameras and radar. Lidar sends out laser beams and measures how long it takes for them to bounce back. Then it builds a three-dimensional point-cloud of the vehicle’s surroundings—that would have made it easy to detect a large object like a truck trailer.
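
As a rough illustration of that time-of-flight idea, the sketch below computes a range from a round-trip time and converts one beam into a 3-D point. The function names and numbers are assumptions for the example, not any lidar maker's API.

```python
# Illustrative lidar math only; names and values are assumptions
# for this example, not a vendor's actual interface.
import math

C = 299_792_458.0  # speed of light, m/s

def tof_to_range(round_trip_s: float) -> float:
    # The beam travels out and back, so divide the round trip by two.
    return C * round_trip_s / 2.0

def to_point(range_m: float, azimuth_rad: float, elevation_rad: float):
    # Convert one range/angle measurement into an (x, y, z) point;
    # sweeping many beams over many angles builds the 3-D point cloud.
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# A return that comes back in ~0.8 microseconds is about 120 m away.
r = tof_to_range(0.8e-6)
print(round(r, 1), to_point(r, azimuth_rad=0.05, elevation_rad=0.01))
```

Because every point carries its own measured distance, a trailer across the road shows up as a dense wall of returns at a consistent range, regardless of its color or the lighting.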
Unfortunately, lidar with sufficient range and reliability for self-driving applications still costs tens of thousands of dollars, making it impractical for mainstream use in customer-owned cars. Tesla CEO Elon Musk has offered a different criticism of lidar, describing it as a crutch that will actually hamper companies’ progress toward full self-driving.
Whatever you think of that argument, it’s still worth asking whether lidar could have prevented the crash that killed Jeremy Banner.

Promoted Comments

I recently got a Model 3, and while I love it, and love the adaptive cruise control, I don't trust it at all. I've used it for less than 10 hours of highway/freeway time, and it has suddenly slowed for phantom cars. White cars in neighboring lanes, for example, seem to confuse it miserably, and if you look at how it estimates the locations of neighboring cars on the screen, you'll see that they often jump around erratically in their estimated position. I'm not surprised that the truck trailer was white, honestly.

I've tried the lane-keeping for probably less than two hours, and it was fairly terrifying. California concrete freeways especially are prone to having lots of leftover paint from prior lane lines, with the most recent markings often barely clearer than the ancient ones, and the car was easily confused. And if there was a concrete barrier with no shoulder, the car just got as close as it could, scaring both me and my passengers.

As for detecting hands on the wheel: even though I would never let go of the steering wheel while lane keeping was on, the car asked me to apply resistance to the steering wheel all the time, so I wonder how honest Tesla is being here when they say that drivers haven't had their hands on the wheel. Everything about the system feels untrustworthy.

The on-screen display labels it "Beta" software, and Tesla compares Autopilot to mobile app software that's ready to ship, as if that means it's acceptable for highway use. It is not. These guys are jokers when it comes to marketing their stuff, and the engineers should be ashamed of what the marketing team has done to their work.

The Tesla is a great car and does great stuff, but Autopilot is absolute crap, and Musk does a great disservice by being so dishonest in how he represents it. There should be SEC and NTSB suits about that, IMHO.

Veritas super omens wrote:
I seem to misunderstand something. Does Florida have cross traffic with no stop sign on its freeways? Just looking at the overview, it looks like the truck should yield to oncoming traffic on 441. If that is, in fact, the case, the truck is clearly at fault. The driver inattention would be irrelevant if the truck had followed the rules of the road. The driver has some culpability for failure to avoid the accident, whilst Tesla has a very tiny culpability for misleading drivers into thinking it is truly self-driving. My take? 95% on the truck driver, 4% on the Tesla driver, and 1% on Tesla.

Driving rules don't require you to wait until you can make a turn without oncoming/crossing traffic having to slow down when they see you. The rules require that you wait until you can make the turn without being a hazard (at least that's how it works in my state). You are allowed to assume the other drivers are actually looking out the windshield when deciding what will or won't be a hazard. Traffic would grind to a halt all over the place otherwise.

EDIT: It's not clear from the NTSB report, however, how much time the Tesla driver had in order to stop. Considering the Tesla went under the trailer, the driver almost certainly had enough time to stop, had he noticed the truck pulling out. Trucks don't pull out all that fast, after all, and the stopping time from 68mph is about 4-5 seconds. However, whether the Tesla would have had to make an emergency stop to avoid the accident, or merely would have had to slow down somewhat, makes a big difference in whether one would consider that the truck had created a hazard.
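
For what it's worth, the commenter's 4-5 second figure checks out under ordinary assumptions. A quick back-of-the-envelope calculation, with the deceleration value assumed rather than taken from the crash report:

```python
# Sanity check of the "4-5 seconds" stopping estimate in the comment above.
# The deceleration is an assumption (typical hard braking on dry pavement),
# not a measurement from this crash.
MPH_TO_MPS = 0.44704

v0 = 68 * MPH_TO_MPS         # initial speed, ~30.4 m/s
a = 7.0                      # deceleration, m/s^2 (~0.7 g, assumed)

t_stop = v0 / a              # time to stop
d_stop = v0**2 / (2 * a)     # distance to stop

print(f"time = {t_stop:.1f} s, distance = {d_stop:.0f} m")
# -> time = 4.3 s, distance = 66 m
```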
