Autopilot failing to stop with incapacitated driver


ibgeek

Top-Contributor
Joined
Aug 26, 2019
Messages
441
Location
Northern California
Country
Tesla Owner
Model 3
#42
I don't believe that they are limited by the law. I've watched a dozen FSD Beta videos, and all are showing errors that could lead to crashes if the car was driving unmonitored. To my knowledge, FSD Beta is not in any way limited by the law.
OMG man! Which of those items you've watched in FSD Beta resulted in an action where the driver was unable to override? ZERO! Honestly, I think you just want to argue, so I'm done. From now on I will call you out on any inaccuracies, but I will no longer engage you directly because, frankly, I think you are just a troll.

And of the 86 videos that are currently out regarding FSD Beta, there are many which do not show any errors. So your claim on that is false. Now back to the actual topic.
 

DocScott

Top-Contributor
TOO Supporting Member
Joined
Mar 6, 2019
Messages
503
Location
Westchester, NY
Country
Tesla Owner
Model 3
#43
OMG man! Which of those items you've watched in FSD Beta resulted in an action where the driver was unable to override? ZERO! Honestly, I think you just want to argue, so I'm done. From now on I will call you out on any inaccuracies, but I will no longer engage you directly because, frankly, I think you are just a troll.

And of the 86 videos that are currently out regarding FSD Beta, there are many which do not show any errors. So your claim on that is false. Now back to the actual topic.
No one is suggesting that the car be able to do something that can't be overridden. The proposal is that the car not obey the instruction of having the accelerator pressed down when there is an imminent collision and the driver has not been applying torque to the wheel. If for some reason, including legalities, it is necessary for a driver to be able to accelerate at an unsafe speed into what the car judges is an imminent collision, the driver could still do so, simply by applying a slight torque to the wheel while accelerating.
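The proposed logic boils down to a simple conditional. A minimal sketch, in Python, assuming hypothetical inputs and an invented torque threshold (none of this reflects Tesla's actual implementation):

```python
# Hypothetical sketch of the proposed override logic -- not Tesla's actual
# implementation. Function name, inputs, and threshold are all invented.

def should_ignore_accelerator(collision_imminent: bool,
                              accelerator_pressed: bool,
                              steering_torque_nm: float,
                              torque_threshold_nm: float = 0.5) -> bool:
    """Ignore accelerator input only when a collision looks imminent AND
    the driver shows no sign of engagement (no torque on the wheel)."""
    driver_engaged = abs(steering_torque_nm) >= torque_threshold_nm
    return collision_imminent and accelerator_pressed and not driver_engaged

# Hands-off driver flooring it into an imminent collision: ignore the pedal.
assert should_ignore_accelerator(True, True, 0.0) is True
# Driver applying torque to the wheel: the accelerator is honored.
assert should_ignore_accelerator(True, True, 1.2) is False
# No imminent collision: the accelerator is always honored.
assert should_ignore_accelerator(False, True, 0.0) is False
```

The key design point is that the override never removes the driver's authority: any deliberate steering input restores full control of the accelerator.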
 
Last edited:

Needsdecaf

Top-Contributor
Joined
Dec 27, 2018
Messages
1,496
Location
The Woodlands, TX
Country
Tesla Owner
Model 3
#44
OMG man! Which of those items you've watched in FSD Beta resulted in an action where the driver was unable to override? ZERO! Honestly, I think you just want to argue, so I'm done. From now on I will call you out on any inaccuracies, but I will no longer engage you directly because, frankly, I think you are just a troll.

And of the 86 videos that are currently out regarding FSD Beta, there are many which do not show any errors. So your claim on that is false. Now back to the actual topic.
Errm, ok. Not a troll. Owned two Model 3s, put over 50k miles on them. I've got as much standing as anyone to state my opinion, and I will. If you disagree, you're completely entitled to.

You're the one who brought up the claim that Tesla is somehow miraculously better than the average driver but just so handcuffed by the law. I'm sorry, but what does that have to do with anything? Nothing.

I never claimed, ever, that the driver could not override the car. Ever. But you brought it up, seemingly to imply that's why the car doesn't stop itself when Autopilot is screaming for the driver to take control? So yes, I'm arguing that point, because I don't believe it to be true.

So sorry, not a troll. I pointed out what I feel is a major safety flaw in the Tesla software, which, as someone else showed, allowed a collision with a car while traveling over 130 mph. I find that incredibly concerning. I want Tesla to succeed and to continue growing. But I do not agree with everything they're doing, nor am I willing to give Tesla a pass when they have issues. Don't need to give the haters any more ammo.
 
Last edited:

Needsdecaf

Top-Contributor
Joined
Dec 27, 2018
Messages
1,496
Location
The Woodlands, TX
Country
Tesla Owner
Model 3
#45
No one is suggesting that the car be able to do something that can't be overridden. The proposal is that the car not obey the instruction of having the accelerator pressed down when there is an imminent collision and the driver has not been applying torque to the wheel. If for some reason, including legalities, it is necessary for a driver to be able to accelerate at an unsafe speed into what the car judges is an imminent collision, the driver could still do so, simply by applying a slight torque to the wheel while accelerating.
Exactly.
 

garsh

Dis Member
Moderator
TOO Supporting Member
Joined
Apr 4, 2016
Messages
14,120
Location
Pittsburgh PA
Country
Tesla Owner
Model 3
#46
OMG man! Which of those items you've watched in FSD Beta resulted in an action where the driver was unable to override? ZERO! Honestly, I think you just want to argue, so I'm done. From now on I will call you out on any inaccuracies, but I will no longer engage you directly because, frankly, I think you are just a troll.
Name calling is not appropriate. Please refrain.

You said that Tesla can't override driver inputs due to laws. I believe @Needsdecaf is asking "what laws?". I tried searching around for laws that would be applicable, but I couldn't find any myself.

It's not unreasonable for people to request some evidence to back up this assertion. Do you actually know of any laws? If not, and this is just a presumption, that's ok, but it would be good to clarify.
Tesla can make changes, but not ones that cannot be overridden. That is not because Tesla doesn't want to. Tesla knows that the automation is going to be, and in some cases already is, magnitudes better than what a human can do. And they want the car to protect you, but they are limited by the law at the moment.
 

FrancoisP

Well-known member
TOO Supporting Member
Joined
Sep 28, 2018
Messages
279
Location
Cleveland, OH
Country
Tesla Owner
Model Y
#47
No one is suggesting that the car be able to do something that can't be overridden. The proposal is that the car not obey the instruction of having the accelerator pressed down when there is an imminent collision and the driver has not been applying torque to the wheel. If for some reason, including legalities, it is necessary for a driver to be able to accelerate at an unsafe speed into what the car judges is an imminent collision, the driver could still do so, simply by applying a slight torque to the wheel while accelerating.
Bingo! And I would advance that once FSD is "uncrashable", if a human decides to drive, there should be safety measures to prevent him from getting into an accident voluntarily or involuntarily.
 
Last edited:

FrancoisP

Well-known member
TOO Supporting Member
Joined
Sep 28, 2018
Messages
279
Location
Cleveland, OH
Country
Tesla Owner
Model Y
#48
I argue that Full Self Driving would be much better served to make the car uncrashable and let the human drive, rather than let the car drive and wait for the human to save it.
I don't understand this statement. Full Self Driving is about the car driving itself, and the passengers enjoying the ride. Of course, FSD will have to be "uncrashable" for this to happen.
 

Needsdecaf

Top-Contributor
Joined
Dec 27, 2018
Messages
1,496
Location
The Woodlands, TX
Country
Tesla Owner
Model 3
#49
I don't understand this statement. Full Self Driving is about the car driving itself, and the passengers enjoying the ride. Of course, FSD will have to be "uncrashable" for this to happen.
Yes, I agree that FSD will have to be uncrashable in order to be deployed. I'm talking about a completely different postulate. I believe that Full Self Driving is something many companies are rushing toward believing that most people want it. I believe that is false. I believe that a better direction to go in would be to use the technology as the ultimate driver's aid: to have the driver in control but the computer being your safety net, always watching. Not to make a robo-taxi. I'll type more when I'm in front of a real keyboard.
 

JasonF

Top-Contributor
Joined
Oct 26, 2018
Messages
1,908
Location
Orlando FL
Country
Tesla Owner
Model 3
#50
You said that Tesla can't override driver inputs due to laws. I believe @Needsdecaf is asking "what laws?". I tried searching around for laws that would be applicable, but I couldn't find any myself.
I think it’s more accurate to say that if the car overrides driver inputs, it drops liability for any crashes (or injuries) into Tesla’s lap.
 

FrancoisP

Well-known member
TOO Supporting Member
Joined
Sep 28, 2018
Messages
279
Location
Cleveland, OH
Country
Tesla Owner
Model Y
#51
I think it’s more accurate to say that if the car overrides driver inputs, it drops liability for any crashes (or injuries) into Tesla’s lap.
I don't buy into the liability argument. Cars already have accident mitigation systems using preemptive seatbelt tensioning and automatic braking. These systems don't prevent an accident but reduce the severity of it. I've never heard of any manufacturer being sued for that. Tesla taking over the car when a driver becomes incapacitated should not be an issue. And if it is, it should be discussed with the Transport Administration. To me this is a benefit, not a liability. It's definitely better than letting the car ram another car at 130 mph.
 

garsh

Dis Member
Moderator
TOO Supporting Member
Joined
Apr 4, 2016
Messages
14,120
Location
Pittsburgh PA
Country
Tesla Owner
Model 3
#52
I think it’s more accurate to say that if the car overrides driver inputs, it drops liability for any crashes (or injuries) into Tesla’s lap.
That's certainly not the case with current laws. Regardless of what driver's aids are available or in use, the driver is still liable for any accidents that happen. You would currently have to take Tesla to court to have them be liable due to some sort of "defect".

Perhaps that's what @ibgeek is talking about? It's not that there's a law against the car overriding driver input - it's just that allowing the driver to override protects the company from being liable in a case like this, since the driver is ultimately responsible AND continues to always have the ability to be ultimately responsible.
 

FrancoisP

Well-known member
TOO Supporting Member
Joined
Sep 28, 2018
Messages
279
Location
Cleveland, OH
Country
Tesla Owner
Model Y
#53
Yes, I agree that FSD will have to be uncrashable in order to be deployed. I'm talking about a completely different postulate. I believe that Full Self Driving is something many companies are rushing toward believing that most people want it. I believe that is false. I believe that a better direction to go in would be to use the technology as the ultimate driver's aid: to have the driver in control but the computer being your safety net, always watching. Not to make a robo-taxi. I'll type more when I'm in front of a real keyboard.
That's a whole different discussion. Let's focus on what Tesla offers today and promises we'll have tomorrow.
 

JasonF

Top-Contributor
Joined
Oct 26, 2018
Messages
1,908
Location
Orlando FL
Country
Tesla Owner
Model 3
#54
I don't buy into the liability argument. Cars already have accident mitigation systems using preemptive seatbelt tensioning and automatic braking. These systems don't prevent an accident but reduce the severity of it. I've never heard of any manufacturer being sued for that. Tesla taking over the car when a driver becomes incapacitated should not be an issue. And if it is, it should be discussed with the Transport Administration. To me this is a benefit, not a liability. It's definitely better than letting the car ram another car at 130 mph.
All of those things you mentioned don't interfere with the driver's ability to override. The driver can turn off emergency braking, or can push the accelerator to override it. Yes, the driver can force a crash even if the protection systems want to do their best to prevent it.

Of course it would be nice for the car to take over when a driver is incapacitated. But then the question becomes, how does the car know a driver is incapacitated and not just doing something crazy or dangerous? If it's purely speed and ignoring AP warnings, what if they're doing it on purpose and watching a movie while driving? What if they're intentionally taking a nap and the car suddenly slowing abruptly wakes them up and causes them to do something that leads to a crash?

And then, of course, there's liability in the other direction. If the car can safely ignore acceleration input and pull over for someone who's incapacitated and ignoring AP warnings, why wasn't it able to do the same for someone who was intentionally napping? Or that person who was watching a movie while driving? Where do you draw the line of the car taking over? What do you consider an unsafe enough situation to seize control from the driver? Do you force a driver to pull over and stop if they go 50 mph in a school zone because there are potentially kids' lives at stake? Do you have the car notify the police if the driver consistently speeds, drives dangerously, or exhibits signs of driving under the influence? All of those things are safety issues, but the slope becomes really slippery.
 

FrancoisP

Well-known member
TOO Supporting Member
Joined
Sep 28, 2018
Messages
279
Location
Cleveland, OH
Country
Tesla Owner
Model Y
#55
All of those things you mentioned don't interfere with the driver's ability to override. The driver can turn off emergency braking, or can push the accelerator to override it. Yes, the driver can force a crash even if the protection systems want to do their best to prevent it.

Of course it would be nice for the car to take over when a driver is incapacitated. But then the question becomes, how does the car know a driver is incapacitated and not just doing something crazy or dangerous? If it's purely speed and ignoring AP warnings, what if they're doing it on purpose and watching a movie while driving? What if they're intentionally taking a nap and the car suddenly slowing abruptly wakes them up and causes them to do something that leads to a crash?

And then, of course, there's liability in the other direction. If the car can safely ignore acceleration input and pull over for someone who's incapacitated and ignoring AP warnings, why wasn't it able to do the same for someone who was intentionally napping? Or that person who was watching a movie while driving? Where do you draw the line of the car taking over? What do you consider an unsafe enough situation to seize control from the driver? Do you force a driver to pull over and stop if they go 50 mph in a school zone because there are potentially kids' lives at stake? Do you have the car notify the police if the driver consistently speeds, drives dangerously, or exhibits signs of driving under the influence? All of those things are safety issues, but the slope becomes really slippery.
There is a saying: perfection is the enemy of the good. No one is claiming that whatever solution Tesla implements has to handle all situations. All that is being said is that once the driver becomes unresponsive, voluntarily or not, the car, when under Autopilot's control, could and should disregard accelerator input and safely take the car out of traffic, park it, and call the cops (I'm kidding). Possibly, only then would the driver be allowed to regain control.
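The flow being proposed reads like a small state machine. A minimal sketch, with every state name and transition invented for illustration (this is not how Autopilot actually works):

```python
# Hypothetical state machine for the "unresponsive driver" flow -- purely
# illustrative; the states, transitions, and trigger names are assumptions.
from enum import Enum, auto

class DriverState(Enum):
    RESPONSIVE = auto()
    WARNING = auto()        # Autopilot nag active, awaiting driver input
    UNRESPONSIVE = auto()   # nags ignored: disregard accelerator, pull over
    STOPPED = auto()        # parked out of traffic; driver may regain control

def next_state(state: DriverState,
               nag_acknowledged: bool,
               nag_timed_out: bool) -> DriverState:
    """Advance one step: acknowledging a nag restores normal driving;
    letting it time out escalates toward a controlled stop."""
    if state is DriverState.WARNING and nag_acknowledged:
        return DriverState.RESPONSIVE
    if state is DriverState.WARNING and nag_timed_out:
        return DriverState.UNRESPONSIVE
    if state is DriverState.UNRESPONSIVE:
        return DriverState.STOPPED  # after a controlled pull-over
    return state
```

The point of the sketch is that control is only withheld at the very end of an escalation chain the driver had every chance to interrupt, and is returned once the car is safely stopped.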
 

DocScott

Top-Contributor
TOO Supporting Member
Joined
Mar 6, 2019
Messages
503
Location
Westchester, NY
Country
Tesla Owner
Model 3
#56
Liability is likely one reason that you don't need to have AP to get some of the emergency safety features that use AP's abilities. If they didn't do that, you'd have a 737 MAX-type situation, where a car had the ability to avoid or reduce the severity of an accident but didn't because the owner hadn't paid for a safety feature, and I don't know if I'd want to face that one in court if I were Tesla. That's in addition to the fact that it's good ethics and good PR to provide safety features to everyone with the necessary hardware once they get past the beta stage.

For that same reason, I expect we'll see a raft of emergency features for HW3 cars without FSD once FSD features start to exit beta. For example, maybe an HW3 car without FSD will slam on the brakes if it looks like a red light is about to be run, particularly if it sees potential cross-traffic and no one is tailgating. The FSD braking for a red light would activate earlier and more smoothly, but the non-FSD version would kick in during an emergency. As with all the current emergency features, there could be ways for the driver to override when needed.
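The speculated trigger condition above can be sketched in a few lines. Everything here is an assumption: this is a hypothetical feature, and all inputs are invented perception signals, not real Tesla APIs:

```python
# Sketch of the speculative non-FSD emergency braking described above.
# Hypothetical feature; every input name is an invented assumption.
def emergency_brake_for_red(red_light_ahead: bool,
                            will_run_light: bool,
                            cross_traffic_possible: bool,
                            tailgater_behind: bool) -> bool:
    # Only slam the brakes in the clearly dangerous case: about to run
    # a red with potential cross-traffic and no one close behind.
    return (red_light_ahead and will_run_light
            and cross_traffic_possible and not tailgater_behind)
```

Note how the tailgater check encodes the trade-off raised elsewhere in the thread: a hard emergency stop can itself cause a collision, so the non-FSD version would fire only when stopping is safer than not stopping.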
 

M3OC Rules

Top-Contributor
TOO Supporting Member
Joined
Nov 17, 2016
Messages
772
Location
Minneapolis, MN
Country
Tesla Owner
Model 3
#57
I think there are cases where it seems like the car could intervene, but there is a big difference between the car controlling steering, acceleration, signaling, and braking, and the car overriding the driver. I don't think the steering wheel can be argued to be an incapacitated-driver sensor. A terrifying alarm also doesn't intuitively tell you to turn the wheel. I've had it go off and not known why. Sad to say, but early in my ownership I sometimes forgot whether Autosteer was on or not. I think there is a possibility of a false positive, where the driver doesn't understand what's going on and that itself causes an accident. It had better not have false positives if it overrides.

I think that's the difference between this and other overrides like emergency braking: this one could itself cause an accident.

Another possible issue is that many people don't want it. Read the Jalopnik reader comments on the recent driver-assist article. Maybe Tesla doesn't care.
https://jalopnik.com/another-study-shows-just-how-careless-self-driving-tech-1845717398
 

JasonF

Top-Contributor
Joined
Oct 26, 2018
Messages
1,908
Location
Orlando FL
Country
Tesla Owner
Model 3
#58
There is a saying: perfection is the enemy of the good. No one is claiming that whatever solution Tesla implements has to handle all situations. All that is being said is that once the driver becomes unresponsive, voluntarily or not, the car, when under Autopilot's control, could and should disregard accelerator input and safely take the car out of traffic, park it, and call the cops (I'm kidding). Possibly, only then would the driver be allowed to regain control.
That still raises the question of how does the car tell if a driver is incapacitated? It's actually possible for someone to zone out while fully awake, and simply not hear the Autopilot alarm. Some people are just naturally very still while they're wide awake, so detecting lack of movement in the cabin won't work either.

I know it shouldn't matter, since it's only a slight annoyance for the car to pull itself over and stop because you weren't paying attention to AP alerts...but in reality, an inattentive or incapacitated driver is just as likely to happen in heavy traffic as a lonely road, and it could be just as likely they're going slower than traffic rather than faster. Meaning that if the driver is indeed not incapacitated, it becomes a danger rather than just an annoyance for the car to declare an emergency and pull over suddenly.

Once Full Self Drive is more mature, and once Autopilot can use the interior camera for verification, I might be more in favor of the car taking over in highly dangerous situations to get you safely pulled over. It might even be helpful if the controls and camera can detect that you're inattentive and unresponsive, pull over safely for you, and call for help. Because at that point, if you're that unresponsive, chances are you're either having some kind of medical incident, or you're very drunk and will end up killing yourself in a crash otherwise. It's just that right now, I'm afraid the car might cause more danger trying to resolve the situation.

Still, though, once that barn door is open, the horses have escaped, and there's no getting them back. If a feature like that works really well and becomes widespread, the next thing you will eventually see are cars that are legislated to obey the instructions of police and pull over. It might be to prevent high-speed pursuits, which might be okay... but then again, it might be a cop sitting on the side of the road who orders your car to pull over because you were 5 mph over the speed limit, just because it's so easy for them to do, vs. deciding whether it's worthwhile to catch up to you and stop you over such a small infraction.
 

FrancoisP

Well-known member
TOO Supporting Member
Joined
Sep 28, 2018
Messages
279
Location
Cleveland, OH
Country
Tesla Owner
Model Y
#59
That still raises the question of how does the car tell if a driver is incapacitated? It's actually possible for someone to zone out while fully awake, and simply not hear the Autopilot alarm. Some people are just naturally very still while they're wide awake, so detecting lack of movement in the cabin won't work either.
I've said my point. I'm done.
 

JasonF

Top-Contributor
Joined
Oct 26, 2018
Messages
1,908
Location
Orlando FL
Country
Tesla Owner
Model 3
#60
I thought this thread was kind of a lively and intelligent discussion of the possibilities and pitfalls (legal and technical) of making Autopilot smarter and safer. I'm a little disappointed that instead, discussion is apparently not allowed, and it's become a thread directing that this simply be done. I guess the only answer I have to that, then, is: sorry, I don't have the power to make it happen, and I don't think anyone at Tesla will even read this thread.