The always interesting Eric Peters reports:
Something strange – and dangerous – happened to me the other day while I was out test-driving a new Toyota Prius.
The car decided it was time to stop. In the middle of the road. For reasons known only to the emperor.
Or the software.
I found myself parked in the middle of the road – with traffic, very much not parked, coming up behind me, fast. Other drivers were probably wondering why that idiot in the Prius had decided to stop in the middle of the road.
But it wasn’t me. I was just the meatsack behind the wheel. The Prius was driving.
Well, stopping.
Like almost all 2019 model year cars, the Prius has something called automated emergency braking. It’s a saaaaaaaaaaaaaaafety system meant to correct for distracted driving – or just slow-to-react driving.
Sensors embedded in the car’s front and rear bumpers scan the perimeter and if they see something in your path that you don’t – or you haven’t applied the brakes in time to avoid hitting whatever it is – the system will automatically brake for you.
. . .
This instance of non-emergency braking may have occurred because we had an ice storm the previous day. Everything got shellacked with a coating of the stuff.
I scraped the ice off the windshield and side glass before I headed out – as people have been doing for generations – so that I could see. The problem – I suspect – was that the car couldn’t see.
Those sensors embedded in the bodywork were probably still covered by ice, giving the car a case of temporary glaucoma. As a result, the Prius may have thought it saw something in the road – and slammed on its brakes to avoid hitting what wasn’t there.
To prevent this from happening, those sensors must be kept clean. Especially if there’s no way to turn off the saaaaaaaaaaaaaaaafety system tied into those sensors. Which in most cases, there isn’t.
But people haven’t been advised about keeping those sensors clean – at least, not strongly enough. There is info to that effect in the fine print of the owner’s manuals of most cars equipped with this feature, including the Prius.
But even if one is diligent about checking (and cleaning off) the car’s various embedded sensors before one begins driving, what about while one is driving?
Weather happens sometimes.
It was sunny and clear when you left the house – or when you set out for home from work – but mid-trip, it begins to snow or sleet . . . and the car’s entire front end (where those sensors are embedded) gets coated by slush/slurry/road spray . . . and the car can no longer see very well, or even at all.
What then?
There aren’t warning icons/buzzers in the gauge cluster of any new car equipped with this system (so far as I have been able to determine) to let you know that it’s time to stop and wipe off the bumpers because the car can no longer see – and (like your grandma, who also can’t see very well anymore) might just do something unpredictable.
This is arguably . . . dangerous.
The car braked hard, too.
I can now describe what the dashboard of a Prius tastes like. Needs A1.
There's more at the link, including a video report on the incident.
I'd never heard of this problem before; but then, I've never driven a car with predictive emergency braking, either. After reading Mr. Peters' report, I'm going to do my best never to get behind the wheel of one, thank you very much!
Who approved this system for production without thinking of so basic a problem, anyway?
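For the technically minded: here's a very rough sketch, in Python, of the kind of decision logic involved. It's purely my own illustration – I have no idea what Toyota's actual code looks like, and every name and number below is invented for the example – but it shows how a system that acts on a raw sensor return can slam on the brakes over a phantom obstacle, while one that checks sensor health first would simply switch itself off and warn the driver instead.

# Purely illustrative sketch (mine, not any manufacturer's code) of why a blocked
# sensor can trigger phantom braking, and how a sensor-health check turns the same
# situation into a dashboard warning instead of a panic stop.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarReading:
    distance_m: Optional[float]  # None when the sensor gets no usable echo
    confidence: float            # 0.0 (garbage return) to 1.0 (clean return)

def naive_aeb(reading: RadarReading, speed_mps: float) -> str:
    """Brakes on any 'close' return, even a low-confidence one from an iced-over sensor."""
    if reading.distance_m is not None and reading.distance_m < 2.0 * speed_mps:
        return "FULL BRAKE"      # phantom braking when the 'obstacle' is really just ice
    return "no action"

def guarded_aeb(reading: RadarReading, speed_mps: float) -> str:
    """Checks sensor health first; degrades to a warning instead of acting on bad data."""
    if reading.confidence < 0.5:
        return "AEB DISABLED - clean front sensors"   # dash warning, no braking
    if reading.distance_m is not None and reading.distance_m < 2.0 * speed_mps:
        return "FULL BRAKE"
    return "no action"

# An ice-covered sensor: a short, low-confidence 'obstacle' that isn't there.
iced = RadarReading(distance_m=3.0, confidence=0.1)
print(naive_aeb(iced, speed_mps=25.0))    # FULL BRAKE - what the Prius apparently did
print(guarded_aeb(iced, speed_mps=25.0))  # AEB DISABLED - clean front sensors

The "two times speed" distance threshold and the 0.5 confidence cutoff are made up; the point is only where the health check sits in the decision, not the numbers.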
Peter
21 comments:
Not a whole lot different from the "self-driving" vehicles that don't stop for pedestrians; but do run into walls.
Forcing this technology is wrong.
I'm going to do my best never to get behind the wheel of one, thank you very much!
Nice try, Peter. That won't help.
If you're behind one when that happens you'll be just as dead, because the car/truck/18-wheeler behind you won't be able to stop, either. Peter points out that this event was caused by not clearing the ice off the sensors, but also asked what happens if the sensors ice up while on the road. Or get splashed with slush, mud, road debris, oil/whatever from another vehicle, etc. I predict that last one will quickly be discovered by The Nefarious Types as a means of either wreaking havoc on the roads or attacking the occupants (picture Antifas leaning out of a car with a Super Soaker filled with an opaque liquid, or Bad Bart & Co. dumping material from the back of a car in front).
I suggested this at Eric Peters' place: cars equipped with automatic braking systems need to be visibly - and very obviously - marked to allow the rest of us to avoid them. A 6" high, 48" wide strip of Day-Glo orange across the rear bumper might be adequate.
This "safety improvement" is going to kill a lot of people.
Over the course of my career in the nuclear power industry I've taken several seminars/courses in safety analysis and risk assessment. One of my favorite quotes is "[Instructor's Name]'s first rule of risk assessment: It is possible to make a system too safe. It will then be unreliable, and the operators will 'fix' the unreliability issue. The system will then be unsafe."
Yep Nuke Warrior, that was my experience too. System at spec won't perform so interlocks and safeties get bypassed to force it to work to spec. Predictably and almost always, there's a terrible unanticipated accident. Who'd a thunk?
I have thought this was going to be a problem from the very first time I heard about 'self-driving' technology, and its various side technologies like this braking 'safety' feature. You aren't removing the human being from the decision stream. You are removing the human being that is IN the decision stream from the car by miles and years. Why anyone thought this was a good idea baffles me.
As I see it, our would-be Lords and Masters desperately WANT this technology, because they despise the idea of all us peasants driving wherever we want to. It will last right up until the morning that 3000 copies of the same vehicle suddenly decide, for no readily apparent reason, to turn left in the middle of morning rush hour. Thereafter the technology and the companies that pushed it will be buried under wrongful death suits, never to emerge again. The politicians who pushed the idea will naturally hold themselves blameless.
*spit*
I was driving a borrowed 2018 Ford F-350 with the so-called forward collision avoidance system (aka automatic emergency braking). Cruising at around 75 MPH on a twisty interstate winding through the mountains. The road curved to the left at about the same time I was passing a slow moving semi in the right lane. The geometry of the curve worked out so that as we went around the curve the semi entered my vehicle's 'forward cone of projected collision,' or whatever the heck it's called.
It fooled my collision avoidance system into thinking that something had just jumped in front of me and a collision was imminent. My brakes were automatically applied - hard! It was only by the grace of God that I wasn't rear-ended by the car behind me, who was following at a reasonable distance but who didn't (again, reasonably) expect 'me' to slam on my brakes for no reason.
I wish there was some way I could have explained this to him as he zoomed around me and shot me a dirty look, complete with the appropriate gesture. Can't blame him a bit...
Apparently, they need to get with it and repeal the law of unintended consequences. It's particularly messy since no one is subject to it going forward but everyone is subject to it looking back.
I can attest to the same problems in a Honda Accord, and it can get worse:
The Lane Control system fights you if you try to swerve suddenly, as it did when a tire rim fell off a truck in front of me and I tried to dodge it to avoid wrecking my wheels. Fortunately I was able to make the car swerve.
The Adaptive Cruise Control will hit the brakes hard if someone cuts in front of you. This has caused people behind me to have to go onto the shoulder to avoid a rear-end collision. You can dial this down a bit, but it will not turn off.
The Emergency Auto Brake has stopped me twice when there was nothing in front of me. Both times I was almost rear-ended.
I spoke to the dealer about this and was told Honda knew about it but couldn't do anything because it was "deep in the software".
Had I known I would become a test rabbit for their autonomous car technology I would not have bought the thing.
Oh, THAT will NEVER occur! Riiiiiiiiiiiiiiiiiiiight.
Four years ago Chevy brought out the "ECO" Cruze. Had a few aerodynamic tricks, including shutters in the lower front fascia which could open/close for more efficiency at speed.
So one snowy morning, our service manager found FIVE of these Cruzes, customers aboard, waiting for service. Every one said that their "Check Engine" light was on. He looked at the cars and noted that the shutters were clogged with snow....
He called GM, of course. The response from the Technical Gurus at GM Tech-Line?
"Can you send us some pictures??"
This is why Chevrolet Division will be selling electric bicycles in the 2019 model year. No, I'm not kidding...and they'll probably screw that up, too.
Here is something else to think about from further in the article, and it's a scarier prospect.
Some cars are set up to counter-steer if the sensors/computer/software "thinks" you are steering into "trouble", like a curb, or off the road, etc. So let's say that you deliberately steer off the road, as I did with my Mazda to avoid T-boning a car coming into the intersection, and the computer counter-steers your car back onto the road, causing an accident.
Who is liable?
I'll note that this isn't idle speculation: airplane automation has progressively taken decision-making out of the hands of the pilot. Two manufacturers have taken two paths. 1) Boeing has said that the pilot has ultimate control of the plane, and the plane is to do what the pilot commands. 2) Airbus has a computer/software layer, and the pilot's commands are taken as "input" to the operation of the plane, but the computer/software makes the ultimate decision.
However, when there is an accident, Airbus has consistently cited "pilot error", even if the Airbus computer/software overrode the pilot's command. For example: at the 1988 Habsheim air show, a pilot tried to pull the plane up to go around, because he didn't feel the plane was in a safe position to land. The computer said, "no, you're landing now," and put the plane into the forest beyond the airport. Airbus said, "pilot error". A large group of people (including me) said that if the computer can override the pilot, then Airbus & its software developers are responsible for the accident.
So when the computer overrides the driver and causes the car to have an accident, who is liable, and who pays?
I've got a 2018 Honda Fit with Honda Sensing. Not perfect, but pretty good--and this is (I believe) the smallest, cheapest car with adaptive cruise. I've had it about a year.
I use the adaptive cruise and lane assist for most of my highway driving, when conditions are appropriate. Even then, when someone cuts me off, my car doesn't brake if the other car is going faster than me. It has had false alarms where it misinterprets oncoming cars in a curve as being in my lane - but only enough to alarm, not enough that it does anything. I've had a couple of times where it warned me it was turning off some of the driver assist systems because the sensors were covered, and last week it was false-alarming (but not braking) in icy weather...so I turned the collision mitigation system off.
These systems aren't better than expert drivers in all conditions...but despite imperfections, I'm convinced that it makes drivers of my skill level safer. And this is a first generation system that will rapidly improve. And they scold when you change lanes without signaling. I want everyone else's car to have that.
Your last line says it all. "I want...."
Steve Sky:
Boeing seems to have decided to emulate Airbus recently. Unfortunately, with even less ability in design and documentation. Killed an airliner's crew and passengers. Scuttlebutt is that it might come close to bankrupting them, due to the apparent total lack of competency exhibited.
If this turns out to be the case, this might put a damper on the car tech. Maybe.
It's not known as "die by wire" for no reason . . .
The industry meets a set "quota", then moves on, nonchalant and cavalier.
"The safety device has been installed. Problem solved." On paper, anyway.
Without further experimentation or analysis, then not really.
I have collision alert (not braking) in my 2015 F-150. When the sensors were ice-encrusted, the system recognized it and disabled itself (with a warning on the dash). Not sure why Toyota couldn't figure out how to do that.
Well you gotta remember that these "safety" mandates come from the same people who f&cked-up the gas can.
We are much too close to automobiles being completely out of our individual control. It will be get in, sit down, shut up. And you (me,us, all of us) will be going wherever the powers-that-be allow us to go/force us to go. AI will likely only add to that scenario.....
I like to say, "The only thing they cannot do with electronics is what they cannot think of to do with electronics."
To commentor CenTexTim: I find it hard to believe that another vehicle was following you at a reasonable distance at 75 mph. Or at any other speed for that matter.
Some of this is being driven by insurance: a small fender-bender is 15 to 30 grand for a Benz or similar car. They were getting uninsurable - look at the rates for a Tesla. Dave