Tesla Autopilot, fatal crash into side of truck


VEBill
CBC News

The Telegraph

"[It didn't notice] the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."
 
Surprising that the front radar would likewise have failed to detect the truck. Moreover, since the truck made the left turn in front of the car, there should have been a time when both the cameras and the radar detected the tractor, and the autopilot should have had enough history on file to say: wait, something passed in front of us earlier; did it really complete its turn yet?

We had problems with a tracker along the same lines: no history file, no track history.
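To sketch the idea (purely illustrative Python; the class name, retention window, and API are made up, not anything from a real system):

# Hypothetical track-history check: if something was recently tracked
# crossing our path, don't declare the way clear until that track has
# been confirmed gone. The threshold is invented for illustration.
import time

class TrackHistory:
    def __init__(self, retention_s=5.0):
        self.retention_s = retention_s   # how long to remember a track
        self.tracks = {}                 # track_id -> last_seen timestamp

    def update(self, track_id):
        self.tracks[track_id] = time.monotonic()

    def path_possibly_obstructed(self):
        # True if any object crossed our path recently and has not yet
        # been confirmed clear of it
        now = time.monotonic()
        return any(now - seen < self.retention_s
                   for seen in self.tracks.values())

With no history file, each frame is judged in isolation: the trailer only has to be missed once, at the worst moment, for the system to conclude the road is empty.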

TTFN
I can do absolutely anything. I'm an expert!
faq731-376 forum1529
 
I must say that Tesla's response is rather blasé. Yes, one fatality in 130 million miles is slightly better than the USA average, but I bet when you correct for demographics that changes significantly.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
And yes, a sample size of one is not very useful statistically, but that's all we have.
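To put rough numbers on how little one event tells you, here's an exact Poisson confidence interval in SciPy (the US comparison figure cited in Tesla's statement was reportedly about one fatality per 94 million miles, itself only approximate):

# Exact (Garwood) 95% Poisson confidence interval for the fatality
# rate, given k = 1 event in 130 million miles.
from scipy.stats import chi2

k = 1                  # observed fatalities
miles = 130e6          # exposure
lo = 0.5 * chi2.ppf(0.025, 2 * k) / miles
hi = 0.5 * chi2.ppf(0.975, 2 * (k + 1)) / miles
print(f"point estimate: one per {miles/k/1e6:.0f}M miles")
print(f"95% CI: one per {1/hi/1e6:.0f}M to one per {1/lo/1e6:.0f}M miles")
# Roughly one per ~23M miles up to one per ~5,100M miles: the interval
# easily straddles the US average, so the comparison says almost nothing.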

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
That's not an easy thing to watch for. You think, "well, look higher," but seeing a bridge or a street light and stopping suddenly could be disastrous too.

Keith Cress
kcress -
 
At the height of the tractor trailer, it's quite likely the radar had no cross-section to get a return signal from.
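A rough geometry check of that point (the mounting height, beamwidth, and underride gap below are all assumed numbers, just to show the effect):

# Does a bumper-mounted radar's vertical beam even illuminate the side
# of a high-riding trailer? All figures below are assumptions.
import math

radar_height_m = 0.5     # assumed bumper-height mounting
half_beam_deg = 2.5      # assumed +/-2.5 deg vertical half-beamwidth
trailer_bottom_m = 1.2   # assumed gap from road to bottom of trailer side

for range_m in (10, 20, 30):
    beam_top = radar_height_m + range_m * math.tan(math.radians(half_beam_deg))
    verdict = ("illuminates trailer side" if beam_top >= trailer_bottom_m
               else "passes underneath")
    print(f"{range_m} m: beam top at {beam_top:.2f} m -> {verdict}")

With those numbers the beam slips under the trailer inside about 16 m, right where a return would matter most.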

Dan - Owner
 
It's a new and complicated technology, so there are bound to be bugs to work out. That said, I have no doubt it will result in a more pleasant, safer, and easier mode of transportation. I certainly hope this doesn't result in a massive backlash against the technology, because it's definitely something we could use, considering the number of human-error-related accidents on the road.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
 
"...height of the tractor trailer..."

In other words, perhaps the radar system design implicitly 'assumes' that the car is only two feet tall, and thus fails to ensure that the way ahead is clear to the car's full height. If that's the case, then it seems like a fairly major flaw in the system design process.
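As a sketch of the flawed versus corrected check (hypothetical code, not anything from Tesla's actual software):

# Illustration of the suspected design flaw: testing the path only near
# ground level versus over the car's full height. Hypothetical logic.
VEHICLE_HEIGHT_M = 1.45  # assumed roofline

def path_clear_flawed(obstacles):
    # implicitly "assumes the car is two feet tall":
    # only objects near ground level count as obstructions
    return not any(o["bottom_m"] < 0.6 for o in obstacles)

def path_clear(obstacles):
    # an obstruction is anything overlapping the band from the road
    # surface up to the roofline
    return not any(o["bottom_m"] < VEHICLE_HEIGHT_M and o["top_m"] > 0.0
                   for o in obstacles)

trailer_side = {"bottom_m": 1.2, "top_m": 4.0}
print(path_clear_flawed([trailer_side]))  # True: "way ahead is clear"
print(path_clear([trailer_side]))         # False: underride hazard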

This tragic incident is so 'blatant' (driving straight into the side of a truck) that it will hopefully reduce the industry hype about the near-term future of self-driving vehicles.

How many more blind spots do these systems have?
 
Might as well anticipate a point...

Some will take the position that as long as the accident rate is equal to or at least slightly better than humans, then it's all okay. "It's an improvement."

I suspect that such levels would prove to be unacceptable. For example, the concentration of liability on one doorstep might be financially impossible.

It seems that there's an implicit requirement that they be much, much safer, and that they totally avoid such systematic 'dumb' mistakes.

They've still got a lot of work to do.
 
Since the average person is such a poor driver (as in the case in point), the "autopilot" should be directing the driver rather than driving the car: "Warning: low bridge ahead."

This particular driver had been driving unsafely for some time and posting about it on YouTube.


 
So next we will see rules that trucks can't be painted white.
 
I'd kind of make the point that a big part of the reason so many drivers are so poor at it these days is that the cars are so good. Power everything, automatic everything, infotainment systems; drivers are more and more insulated/isolated from both the physical and mental driving environment.

But, I guess the only thing to do is go all the way and let the machines do it all.

Regards,

Mike

The problem with sloppy work is that the supply FAR EXCEEDS the demand
 
cranky108 said:
So next we will see rules that trucks can't be painted white.
Well, maybe - or maybe rules that say white trucks will need some appliqué feature to make them more machine-identifiable.

We used not to have a rule saying that vehicles over 6 m (20 ft) long had to have steady amber lights at intervals down the side, so that you can tell (visually) when someone has stopped their truck across a junction ahead of you at night. We (in the UK) do now.

Imposing new rules on one class of road users to protect against other road users' perceptual deficiencies (like not being able to see in the dark) is nothing new and (with certain exceptions) tends to find acceptance eventually.

A.

 
Mike said:
I'd kind of make the point that a big part of the reason so many drivers are so poor at it these days is that the cars are so good. Power everything, automatic everything, infotainment systems; drivers are more and more insulated/isolated from both the physical and mental driving environment.

But, I guess the only thing to do is go all the way and let the machines do it all.

As a pilot, I often see the argument that automation or information overload causes accidents or makes us worse pilots. Overall, the conclusion I've drawn (and I believe it's supported by studies) is that it's not automation or excessive use of computers that drives these accidents; rather, it's our lack of preparedness to identify problems and take control when things do go wrong.

This fatal car crash does expose a flaw, though: how is a driver supposed to tell when the automation isn't working? Many "famous" crashes in aviation occurred when the autopilot turned off, the pilot wasn't aware it was no longer in control, and the plane was allowed to slowly fly itself into a crash. This is about my only concern with the systems in self-driving cars from an engineering standpoint: we need to see what the car is reacting to before or while it's reacting (or not reacting). Perhaps the driver saw the truck, expected the car to brake for him, and then wasn't ready to intervene when it didn't.

In aviation we follow strict procedures to ensure the autopilot functions properly on the ground, can be overridden, and presents the proper warnings when it is disabled. In addition, it's always emphasized during training to be aware of what mode the autopilot is in and be prepared to take control at any moment.

In my educated opinion, autopilots definitely make flying a safer and more comfortable experience. While I trained on a simple trainer aircraft, I currently fly a modern-cockpit aircraft with multifunction displays and a full autopilot. I "hand fly" the vast majority of the time, but when workload is high it definitely improves safety if I don't need to strictly monitor my altitude and heading while I focus on reading an approach chart or preparing to land.

But with this convenience/safety feature comes another skill that becomes the crux of the issue: you must be mentally prepared to respond to an abnormal situation. Many aviation crashes occur when automation fails to perform as expected and pilots fixate on getting the automation to do what they want, rather than simply reverting to manual flying and dealing with the problem once reestablished on the appropriate course. Still, the vast majority of pilots understand this and practice this skill regularly.

Of course, some people will use this feature as a crutch. Some general aviation pilots are chided for simply "following the magenta line" (a reference to the magenta color used to display course information from a GPS source), meaning they lack non-automated flying skills. You see this in drivers today as well, such as people who drive into lakes because their GPS told them to. These people will abuse self-driving features, but they will exist regardless of the technology given to them. I'd rather have a computer system whose software can be updated driving towards me than a person looking down to read that important text they just got.

Overall, the best example of this is to remind ourselves that we used to need two pilots and a flight engineer to safely operate an airliner. Does anyone lament the loss of the flight engineer position as a safety issue? In fact, many modern charter aircraft are flown with a single pilot these days. This is only made possible by increased automation, and yet flight safety continues to improve while the cost of flights continues to decrease. While this is a little "apples and oranges," the assertion that automation = worse pilots/drivers isn't quite that black and white, if you ask me.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
 
Years ago, law enforcement stepped up patrols in school zones, which produced many more tickets for people not watching their speed. So people started setting their cruise control to 25 MPH so they wouldn't get ticketed for driving too fast. That alarmed some, because the 25 MPH limit was intended to give drivers more time to react, not to free their attention from their speed.

Now most cruise controls can't be set below 35 MPH, so drivers must watch their speed, and not so much the road. (Has technology improved things?)

On the same topic, I thought trucks were required to have mid-trailer reflectors so they can be seen. The same is now being required on rail cars because of the number of accidents with trains.

But the comment about the white trailer and the bright sky is still bothersome, when a red reflector should have been all it took to prevent this. Hence my comment about not allowing white trailers.

Come to think of it, I believe mid-trailer lights are also required, but it being daylight, they likely were not on.
 
Perhaps the sides of all trucks should have this painted on them:

[image: phosphene-style artistic depiction]

Or, for easier computer recognition:

[image: machine-readable pattern]
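If anyone wanted to prototype that, the third-party Python "qrcode" package will turn out a marker in a couple of lines (the payload here is of course made up):

# Toy machine-readable side marking, using the "qrcode" package
# (pip install qrcode[pil]). The encoded payload is invented.
import qrcode

img = qrcode.make("VEHICLE=TRACTOR-TRAILER;LENGTH_M=16")
img.save("trailer_side_marker.png")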



Keith Cress
kcress -
 
TehMightyEngineer said:
another skill that becomes the crux of the issue. You must be mentally prepared to respond to an abnormal situation.

Never more true than in an accident that was reported yesterday. Imagine a ship motoring through a wreck-infested patch of water on autopilot in the middle of the night when the system throws in an undemanded 30° turn towards the rocks. Should the watchkeeper:

a. Disengage the autopilot and attempt to manoeuvre to safety in hand steering?
b. Apply astern propulsion to stop the ship in the water, giving him time to think before sorting himself out?
c. Throttle back a wee bittie, then wander off to wake the skipper?

[image: St Apollo website photo]


A.
 
What I've been hearing is that the Tesla's self-driving system apparently "saw" the truck, but mis-categorized it as an overhead sign board.
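If that account is right, the failure mode is easy to caricature in code: a braking decision gated on object class, so a single misclassification silently suppresses the response. Purely illustrative pseudologic, certainly not Tesla's actual software:

# Caricature of classification-gated braking (illustrative only).
# If the classifier calls the trailer an overhead sign, braking is
# suppressed no matter how close the object is.
OVERHEAD_CLASSES = {"overhead_sign", "bridge", "gantry"}

def should_brake(detection):
    if detection["class"] in OVERHEAD_CLASSES:
        return False  # assumed to pass safely above the car
    return detection["time_to_collision_s"] < 2.0

trailer = {"class": "overhead_sign",        # misclassified trailer side
           "time_to_collision_s": 1.1}
print(should_brake(trailer))  # False: no braking despite imminent impact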
 