# Autopilot failing to stop with incapacitated driver



## Needsdecaf (Dec 27, 2018)

FRC said:


> While your defense of CR has merit, your ridiculous washing machine example does not. My FSD does not threaten me in any way and improves over time. I'm not afraid of it ripping my clothes to threads or burning my house down. But if CR claims someone currently is producing a better self driving product, I'd call them out on that. And if they say that Tesla is way short of a self driving product, I'd have no argument with that as long as they are clear that Tesla provides ongoing software upgrades that improve the product over time at no additional charge.


I didn't think it was ridiculous. FSD in its current form is capable of inflicting permanent harm or death. The situation highlighted below shows that Tesla's driver monitoring systems are woefully inadequate to prevent a driver from doing something stupid, or to deal with a disabled or sleeping driver. I can't believe that no one died in this collision.


__ https://twitter.com/i/web/status/1326755485610020864


----------



## francoisp (Sep 28, 2018)

Needsdecaf said:


> I didn't think it was ridiculous. FSD in its current form is capable of inflicting permanent harm or death.
> 
> 
> __ https://twitter.com/i/web/status/1326755485610020864


I have a hard time believing the car was on Autopilot. Are you suggesting that? Because if you're not, I don't see the point of this video. It was probably someone under the influence.


----------



## Needsdecaf (Dec 27, 2018)

FrancoisP said:


> I have a hard time believing the car was on Autopilot. Are you suggesting that? Because if you're not, I don't see the point of this video. It was probably someone under the influence.


If I understand Green's tweets, yes, it was on AP. Follow his whole sequence of tweets. The best guess seems to be that the driver fell asleep or had a medical incident.


__ https://twitter.com/i/web/status/1327408848941027328
Specifically this tweet. Look at Green's CAN bus data overlaid on the video (how he gets this stuff is beyond me!).


It appears that the car was on AP and that the driver was no longer responding to driver attention warnings, but still had their foot on the accelerator. So AP was flashing the warning but was not slowing the car to a stop. This seems to be a major safety drawback, and not one that's limited to this situation. See video below.






This driver deliberately tested the situation to see what would happen if he continued to ignore the hands-on-the-wheel prompts but kept his foot on the accelerator. Net result: the car continued driving, speed controlled by the driver.

Now, the likelihood is that if you're using AP and fall asleep, your foot won't be pressing the throttle. But it's certainly possible. I'm not sure why AP doesn't override the pedal command when AP is in "time out" mode and just put on the hazards and bring you to a stop. That's what it would do if you weren't pressing the throttle.


----------



## francoisp (Sep 28, 2018)

If the car did not slow down and pull over, as I've always read it would after a fair amount of warning, then I agree it is a safety issue that Tesla needs to address. However, I think this has nothing to do with FSD per se, other than that it was engaged when the driver had his episode. With any other vehicle, when a driver loses consciousness with a foot on the accelerator, the car will zigzag around, possibly colliding with oncoming traffic, running over pedestrians, maybe falling into a ditch and killing the driver. We've seen this picture before.


----------



## Needsdecaf (Dec 27, 2018)

FrancoisP said:


> If the car did not slow down and pull over, as I've always read it would after a fair amount of warning, then I agree it is a safety issue that Tesla needs to address. However, I think this has nothing to do with FSD per se, other than that it was engaged when the driver had his episode. With any other vehicle, when a driver loses consciousness with a foot on the accelerator, the car will zigzag around, possibly colliding with oncoming traffic, running over pedestrians, maybe falling into a ditch and killing the driver. We've seen this picture before.


I agree it is of course safer than no driver's aids in the event that the driver becomes incapacitated.

I don't agree that it "has nothing to do with FSD per se". When the driver engages TACC / Autosteer, they are counting on it functioning as intended. And as the owner's manual states:



> Autosteer requires that you pay attention to your surroundings and remain prepared to take control at any time. If Autosteer still does not detect your hands on the steering wheel, the request escalates by sounding chimes that increase in frequency.
> 
> If you repeatedly ignore Autosteer's prompts for having your hands on the steering wheel, Autosteer disables for the rest of the drive and displays the following message. If you don't resume manual steering, Autosteer sounds a continuous chime, turns on the warning flashers, and slows the vehicle to a complete stop.


So we've found a major loophole in the operation as intended.
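The escalation the manual describes, together with the accelerator loophole seen here, can be sketched as a tiny state machine. This is a hypothetical model for illustration only, not Tesla's actual logic:

```python
from enum import Enum, auto

class NagState(Enum):
    NORMAL = auto()
    VISUAL_WARNING = auto()
    CHIMES = auto()
    DISABLED_SLOWING = auto()  # flashers on, car slowing to a stop

def next_state(state, hands_on_wheel, accel_pressed):
    # Hypothetical model of the escalation the manual describes,
    # plus the loophole seen in this crash: a pressed accelerator
    # counts as driver input, so the car never escalates to
    # DISABLED_SLOWING while the pedal is held down.
    if hands_on_wheel:
        return NagState.NORMAL
    if accel_pressed:
        return NagState.CHIMES  # keeps driving; never slows to a stop
    if state is NagState.NORMAL:
        return NagState.VISUAL_WARNING
    if state is NagState.VISUAL_WARNING:
        return NagState.CHIMES
    return NagState.DISABLED_SLOWING
```

With a pressed pedal the model gets stuck at CHIMES forever; only with the pedal released does it ever reach the slow-to-a-stop state.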


----------



## garsh (Apr 4, 2016)

Needsdecaf said:


> So we've found a major loophole in the operation as intended.


It seems like this could be solved by ignoring an accelerator press when the driver doesn't respond to the request to take over steering. Then the car would have slowed down and activated the four-way flashers as it was meant to do for an incapacitated driver.


----------



## francoisp (Sep 28, 2018)

garsh said:


> It seems like this could be solved by ignoring an accelerator press when the driver doesn't respond to the request to take over steering. Then the car would have slowed down and activated the four-way flashers as it was meant to do for an incapacitated driver.


I agree. This can't be an oversight on Tesla's part; it's too obvious. I don't understand Tesla's choice. I've sent Elon Musk a tweet asking for his feedback ;-).


----------



## Bigriver (Jan 26, 2018)

Needsdecaf said:


> So we've found a major loophole in the operation as intended.


It seems that a foot on the accelerator is used to indicate an attentive driver, and it overrides the inattentive-driver assessment from the lack of steering wheel input. I see nothing in the owner's manual about a foot on the accelerator while on Autopilot, although I do know there is a warning displayed in the car that it will not auto-brake. It does seem like something they can improve. But a few things pop out to me from the tweet thread:

- The probable accident report for this indicates the Tesla driver was intoxicated.
- He was going over 130 mph immediately prior to impact.
- Evidently no one was killed.

What other drunk-driving incident at such a high speed avoids loss of life? My conclusion is: way to go, Autopilot, even if there are future improvements that can still be made.


----------



## francoisp (Sep 28, 2018)

Bigriver said:


> What other drunk driver incident at such a high speed avoids loss of life? My end conclusion is way to go autopilot, even if there are future improvements that can still be made.


Well, hopefully this will never happen at a busy intersection. Right?


----------



## garsh (Apr 4, 2016)

FrancoisP said:


> I agree. This can't be an oversight on Tesla's part; it's too obvious. I don't understand Tesla's choice.


It sounds to me like two different systems interacting in an unexpected way.

- Autopilot will allow you to control the speed with the accelerator in order to temporarily go faster than the set speed. Useful for passing.
- Autopilot will continue to steer if you ignore the autopilot nag, then eventually slow down the car and stop.

I think this is an oversight. I can't see any reason why Tesla would want to explicitly allow this scenario to occur. Hopefully this failure gains Musk's attention so that it gets fixed.


----------



## garsh (Apr 4, 2016)

FrancoisP said:


> Well, hopefully this will never happen at a busy intersection. Right?


This is why we need actual Full Self Driving rather than "driver's aids". People will continue to do stupid things with "driver's aids".


----------



## francoisp (Sep 28, 2018)

garsh said:


> It sounds to me like two different systems interacting in an unexpected way.
> 
> - Autopilot will allow you to control the speed with the accelerator in order to temporarily go faster than the set speed. Useful for passing.
> - Autopilot will continue to steer if you ignore the autopilot nag, then eventually slow down the car and stop.
> 
> I think this is an oversight. I can't see any reason why Tesla would want to explicitly allow this scenario to occur. Hopefully this failure gains Musk's attention so that it gets fixed.


Every time someone on twitter praises Autopilot, I'm going to tweet back about this.


----------



## M3OC Rules (Nov 18, 2016)

garsh said:


> Autopilot will continue to steer if you ignore the autopilot nag, then eventually slow down the car and stop.


I think you have to be very careful with overriding the driver; basically, they only do it when an accident is imminent, AFAIK. (I think the exception is the pedal-confusion setting.) Imagine a person doesn't respond to the nag in time. The car starts beeping in a terrifying way and then starts slowing down. The driver is unsure what is going on, pushes the accelerator, and the car continues to slow down. They might not be thinking that they need to turn the wheel to regain control. When the car doesn't react like you expect, it really messes with your head.


----------



## ibgeek (Aug 26, 2019)

Needsdecaf said:


> I didn't think it was ridiculous. FSD in its current form is capable of inflicting permanent harm or death. The situation highlighted below shows that Tesla's driver monitoring systems are woefully inadequate to prevent a driver from doing something stupid, or to deal with a disabled or sleeping driver. I can't believe that no one died in this collision.
> 
> 
> __ https://twitter.com/i/web/status/1326755485610020864


That vehicle was exceeding 90 mph, which makes it quite impossible for Autopilot to have been engaged.


----------



## garsh (Apr 4, 2016)

M3OC Rules said:


> I think you have to be very careful with overriding the driver; basically, they only do it when an accident is imminent, AFAIK. (I think the exception is the pedal-confusion setting.) Imagine a person doesn't respond to the nag in time. The car starts beeping in a terrifying way and then starts slowing down. The driver is unsure what is going on, pushes the accelerator, and the car continues to slow down. They might not be thinking that they need to turn the wheel to regain control. When the car doesn't react like you expect, it really messes with your head.


Agreed. And even the current software waits through several seconds of the car screaming and flashing at you before it decides to slow down to a stop. So I think they can make a change to the software to have this override the fact that the accelerator is being pressed for all that time while THERE ARE NO HANDS DETECTED ON THE STEERING WHEEL. <- that's the other important bit here.


----------



## garsh (Apr 4, 2016)

ibgeek said:


> That vehicle was exceeding 90 MPH rendering it quite impossible for autopilot to be engaged.


Be careful not to knee-jerk react to what @Needsdecaf wrote. Read that post carefully.

- Autopilot was originally engaged.
- If autopilot is engaged, it is still possible to push the accelerator to go faster.
- Autopilot will disengage if you exceed 90 mph.
- HOWEVER, if you never touch the steering wheel, autopilot will continue to steer the car even though it WANTS to deactivate.

The FLAW here is that the car continues to allow the accelerator pedal position to override autopilot disengaging and automatically slowing the car to a stop in this situation where the driver refuses to, or can't, take over steering.


----------



## Ed Woodrick (May 26, 2018)

garsh said:


> Be careful not to knee-jerk react to what @Needsdecaf wrote. Read that post carefully.
> 
> Autopilot was originally engaged.
> If autopilot is engaged, it is still possible to push the accelerator to go faster.
> ...


Overriding a manual input would be a no-no.

While the event is not good, let's look at the alternatives. Say there was no lane guidance and the car swerved and hit another vehicle or an embankment: not great results, probably a worse outcome. The car could have T-boned something else or flipped. Not great outcomes.
Of all the outcomes, a straight-in, solid hit tends to be the safest, as that's the best position for all the safety equipment to activate.

The car seemed to be doing exactly as the driver commanded it.

Simply put, you can't override stupidity at this time.


----------



## JasonF (Oct 26, 2018)

I would go further than this to avoid writing code just to handle an edge case:

What I would do is have Autopilot record "violations". Among them: not touching the wheel (or not paying attention, if they change it to the camera), ignoring the alert, driving a certain percentage above the speed limit, triggering an imminent-collision alert, missing your exit. Violations could be weighted or cumulative - so if you trigger the inattention violation, then ignore the warning, plus you're going 10+ mph over the speed limit, AP will start forcing you to pull over. Of course, if it's wrong and you're awake, you can override it by turning off AP and driving like an idiot. But at least it will _try_ to stop you.

Note also, btw, that in that scenario, if you trigger a certain number of "violations" in a short time, it would actually cause AP to go into fail-safe _faster_ than it would normally. So let's say the person in the OP was intentionally driving 122 mph, inattentive, ignoring the _first_ warning; then once AP had to dodge a car (the first one, not the one it crashed into), that would have been it. It's pull-over-and-shut-down time.
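A weighted, cumulative violation scheme like this could look roughly as follows; the violation names, weights, and threshold are all invented for illustration:

```python
# Hypothetical weighted "violation" tracker for a driver-assist system.
# All names, weights, and the threshold are made up for illustration.
VIOLATION_WEIGHTS = {
    "hands_off_wheel": 2,
    "ignored_alert": 3,
    "speeding_10_over": 2,
    "imminent_collision_dodge": 5,
    "missed_exit": 1,
}
FORCE_PULLOVER_THRESHOLD = 6

def should_force_pullover(recent_violations):
    # Sum the weights of violations seen within a short time window.
    # Crossing the threshold triggers a forced pull-over, which an
    # awake driver could still cancel by disengaging AP entirely.
    score = sum(VIOLATION_WEIGHTS[v] for v in recent_violations)
    return score >= FORCE_PULLOVER_THRESHOLD
```

With these made-up weights, the scenario above (inattentive, ignored first warning, then a collision dodge) scores 2 + 3 + 5 = 10 and triggers immediately, while a lone missed exit (1) does not.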


----------



## bwilson4web (Mar 4, 2019)

The old saying that a foolproof system will always find better fools applies. Fortunately, each driver has the freedom not to use Autopilot and to post their complaints. Personally, I choose to use every driver's aid.

Early Autopilot handled several micro-sleep events flawlessly. As Autopilot has gotten better and experience has improved my confidence, I use it more and more. It takes learning how to drive a smarter car, but that can be a bridge too far for others, who choose not to use it.

Bob Wilson


----------



## JWardell (May 9, 2016)

Nothing is new here. Back in the YouYouXue days, he made a video ignoring the AP shutdown and keeping the pedal pressed. Agitating to watch, but it forever ingrained that the car will keep steering and going.
The same can be argued about falling asleep with the weight of a hand on the wheel.
And I always wished Tesla prevented the car from ever hitting ANYTHING, no matter what the driver commanded, without some secondary override on the screen. It would prevent all those Teslas from crashing into storefronts.

But it has been clear from day one that they made a conscious decision to always obey manual input overrides. Because people WILL find corner cases where they need to, and there would be even more backlash, especially if it prevented someone from overriding and avoiding an accident.
Maybe this will change if and when we get to full Level 3. But _I, Robot_ taught us even that might be a bad idea.


----------



## JasonF (Oct 26, 2018)

JWardell said:


> But it has been clear from day one they made a conscious decision to always obey manual input overrides. Because people WILL find corner cases where they need to, and there will be even more backlash. Especially when it prevented someone from overriding and avoiding an accident.
> Maybe this will change if and when we get to full Level 3. But _I, Robot_ taught us even that might be a bad idea.


Not to mention that there's this legal line where, if the car starts making too many decisions for the driver in the name of safety, Tesla becomes liable for all the crashes that people manage to cause anyway.

That's why I was careful above to suggest that Autopilot bail out of the equation if the driver insists on doing dangerous things like sleeping. Eventually it should be able to give in to human stupidity and say "I want no part of this".


----------



## ibgeek (Aug 26, 2019)

garsh said:


> Be careful not to knee-jerk react to what @Needsdecaf wrote. Read that post carefully.
> 
> Autopilot was originally engaged.
> If autopilot is engaged, it is still possible to push the accelerator to go faster.
> ...


OK, I see what you're talking about here. Thanks for clarifying.


----------



## ibgeek (Aug 26, 2019)

JWardell said:


> Nothing is new here. Back in the YouYouXue days, he made a video ignoring the AP shutdown and keeping the pedal pressed. Agitating to watch, but it forever ingrained that the car will keep steering and going.
> The same can be argued about falling asleep with the weight of a hand on the wheel.
> And I always wished Tesla prevented the car from ever hitting ANYTHING, no matter what the driver commanded, without some secondary override on the screen. It would prevent all those Teslas from crashing into storefronts.
> 
> ...


One slight correction... Tesla will kind of override the pedal with regard to running into a storefront. I mean it will still hit the store, but if you are parked in front of the store, put the car in drive, and then hit the pedal, it will scream at you and only go very slowly towards the store wall. I know this because several months ago I accidentally hit the accelerator in this scenario. The car crawled forward and yelled, with plenty of time for me to realize the issue and stop. Had the car accelerated at its normal rate, it would have hit for sure and done significant damage.


----------



## JWardell (May 9, 2016)

ibgeek said:


> One slight correction... Tesla will kind of override the pedal with regard to running into a storefront. I mean it will still hit the store, but if you are parked in front of the store, put the car in drive, and then hit the pedal, it will scream at you and only go very slowly towards the store wall. I know this because several months ago I accidentally hit the accelerator in this scenario. The car crawled forward and yelled, with plenty of time for me to realize the issue and stop. Had the car accelerated at its normal rate, it would have hit for sure and done significant damage.


Well, that's a newer feature.


----------



## garsh (Apr 4, 2016)

Ed Woodrick said:


> Simply put, you can't override stupidity at this time.





bwilson4web said:


> The old saying that a foolproof system will always find better fools applies.


In this case, we're not talking about "fools" - we're talking about someone who has become incapacitated.

At least, that's the only scenario that I care about. If you've turned on Autopilot, you've already given the car permission to make some decisions. If you can't be bothered to put your hands on the steering wheel with the car screaming at you for several tens of seconds to do so, then I think it's safe to ignore the fact that something has pressed down on the accelerator.


----------



## garsh (Apr 4, 2016)

JWardell said:


> Well that's a newer feature


It's been around for over a year now.

A Quick Look At Tesla's Obstacle-Aware Acceleration


----------



## Klaus-rf (Mar 6, 2019)

garsh said:


> It's been around for over a year now.
> 
> A Quick Look At Tesla's Obstacle-Aware Acceleration


Only a year is new, right?


----------



## JWardell (May 9, 2016)

garsh said:


> It's been around for over a year now.
> 
> A Quick Look At Tesla's Obstacle-Aware Acceleration


"over a year" = newer than AP steering with pedal


----------



## JeanDeBarraux (Feb 18, 2019)

ibgeek said:


> One slight correction... Tesla will kind of override the pedal with regard to running into a storefront. I mean it will still hit the store, but if you are parked in front of the store, put the car in drive, and then hit the pedal, it will scream at you and only go very slowly towards the store wall. I know this because several months ago I accidentally hit the accelerator in this scenario. The car crawled forward and yelled, with plenty of time for me to realize the issue and stop. Had the car accelerated at its normal rate, it would have hit for sure and done significant damage.


There's a setting for this, actually.


----------



## garsh (Apr 4, 2016)

JWardell said:


> "over a year" = newer than AP steering with pedal


You were the one who remembered YouYou using the "AP steering with pedal" technique. His North America road trip was in Dec 2017 - Jan 2018. His crash in Europe was in May 2018.


JWardell said:


> Nothing is new here. Back in the YouYouXue days he made a video ignoring AP shutdown and keeping the pedal pressed.


----------



## francoisp (Sep 28, 2018)

garsh said:


> In this case, we're not talking about "fools" - we're talking about someone who has become incapacitated.
> 
> At least, that's the only scenario that I care about. If you've turned on Autopilot, you've already given the car permission to make some decisions. If you can't be bothered to put your hands on the steering wheel with the car screaming at you for several tens of seconds to do so, then I think it's safe to ignore the fact that something has pressed down on the accelerator.


Until a year ago, before the "obstacle-aware acceleration" setting was added, someone could have pushed hard on the accelerator and rammed into a store, and the comments on this forum would have been "the car does whatever the 'fools' are prompting it to do". I don't understand why some are defending what I consider a poor software design choice. It's pretty obvious to me that if the driver isn't responding to repeated safety warnings, no matter what, the car should slow down and safely pull over. I cannot see a scenario where letting the car carry on will yield a satisfactory outcome.


----------



## ibgeek (Aug 26, 2019)

FrancoisP said:


> Until a year ago, before the "obstacle-aware acceleration" setting was added, someone could have pushed hard on the accelerator and rammed into a store, and the comments on this forum would have been "the car does whatever the 'fools' are prompting it to do". I don't understand why some are defending what I consider a poor software design choice. It's pretty obvious to me that if the driver isn't responding to repeated safety warnings, no matter what, the car should slow down and safely pull over. I cannot see a scenario where letting the car carry on will yield a satisfactory outcome.


So I talked to some folks. It's actually a current regulatory requirement that the car not be able to override the human once the vehicle is moving. Work is being done to get this changed.


----------



## Needsdecaf (Dec 27, 2018)

ibgeek said:


> So I talked to some folks. It's actually a current regulatory requirement that the car not be able to override the human once the vehicle is moving. Work is being done to get this changed.


Hmmm, what's the context for this "regulatory requirement"? Because the way you stated it, "the car not be able to override the human once the vehicle is moving", would mean that any autonomous system from any manufacturer, such as autonomous emergency braking, lane-change correction, rear-obstacle autonomous braking, etc., is, as a matter of course, "overriding the human". By your statement above, those would be illegal.

Since these are pretty widespread, I'm sure they're not illegal. Hence my request for clarification of your statement above.


----------



## Ed Woodrick (May 26, 2018)

garsh said:


> In this case, we're not talking about "fools" - we're talking about someone who has become incapacitated.
> 
> At least, that's the only scenario that I care about. If you've turned on Autopilot, you've already given the car permission to make some decisions. If you can't be bothered to put your hands on the steering wheel with the car screaming at you for several tens of seconds to do so, then I think it's safe to ignore the fact that something has pressed down on the accelerator.


Just to clarify, intoxicated drivers are fools.


----------



## ibgeek (Aug 26, 2019)

Needsdecaf said:


> Hmmm, what's the context for this "regulatory requirement"? Because the way you stated it, "the car not be able to override the human once the vehicle is moving", would mean that any autonomous system from any manufacturer, such as autonomous emergency braking, lane-change correction, rear-obstacle autonomous braking, etc., is, as a matter of course, "overriding the human". By your statement above, those would be illegal.
> 
> Since these are pretty widespread, I'm sure they're not illegal. Hence my request for clarification of your statement above.


I'll see if I can get further clarification on this. I suspect it has to do with actions taken at speed. I'm sure you've seen the alert when you have your foot on the accelerator while on AP stating that it will not brake.
Lane-change correction can also be overridden by a driver if he counters it. There are no actions (steering, accelerating, braking) that you can't override. The law simply states that if the car starts turning left, you must be able to stop the turn if you want to. If the car starts to brake, you must be able to accelerate or abate the braking if you want to.


----------



## francoisp (Sep 28, 2018)

ibgeek said:


> I'll see if I can get further clarification on this. I suspect it has to do with actions taken at speed. I'm sure you've seen the alert when you have your foot on the accelerator while on AP stating that it will not brake.
> Lane-change correction can also be overridden by a driver if he counters it. There are no actions (steering, accelerating, braking) that you can't override. The law simply states that if the car starts turning left, you must be able to stop the turn if you want to. If the car starts to brake, you must be able to accelerate or abate the braking if you want to.


Sure, a driver should be able to take control of the car at any time. But what about the car overriding a driver's input to keep him safe? There are situations where the car does this already. For example, my car's anti-skid system will reduce the throttle and keep the car under control with selective braking, all the while disregarding my accelerator input, because the software can react much faster than I can. Back to Tesla: it's not a big jump to think that if a driver is incapacitated, the car should take over regardless of the inputs it's getting, much like my anti-skid system does. I'd like to hear a counterargument based on real-life situations where that logic wouldn't apply.


----------



## Needsdecaf (Dec 27, 2018)

FrancoisP said:


> Sure, a driver should be able to take control of the car at any time. But what about a car overriding a driver's input to keep him safe? There are situations where the car is doing this already. For example, my car's antiskid control system will reduce the throttle and keep the car under control with selective braking all the while disregarding my accelerator input because the software can react much faster than I. Back to Tesla, it's not a big jump to think that if a driver is incapacitated, the car should take over regardless of the inputs it's getting, much like my antiskid system. I'd like to hear a counterargument that's based on real life situations where that logic wouldn't apply.


I'd argue that Full Self Driving would be much better served by making the car uncrashable and letting the human drive, rather than letting the car drive and waiting for the human to save it.


----------



## ibgeek (Aug 26, 2019)

FrancoisP said:


> Sure, a driver should be able to take control of the car at any time. But what about a car overriding a driver's input to keep him safe? There are situations where the car is doing this already. For example, my car's antiskid control system will reduce the throttle and keep the car under control with selective braking all the while disregarding my accelerator input because the software can react much faster than I. Back to Tesla, it's not a big jump to think that if a driver is incapacitated, the car should take over regardless of the inputs it's getting, much like my antiskid system. I'd like to hear a counterargument that's based on real life situations where that logic wouldn't apply.


Tesla can make changes, but not ones that cannot be overridden. That is not because Tesla doesn't want to. Tesla knows that the automation is going to be, and in some cases already is, orders of magnitude better than what a human can do. They want the car to protect you, but they are limited by the law at the moment. They are working on changing these laws, as many of them do not entirely apply when it comes to automation.


----------



## Needsdecaf (Dec 27, 2018)

ibgeek said:


> Tesla can make changes, but not ones that cannot be overridden. That is not because Tesla doesn't want to. Tesla knows that the automation is going to be, and in some cases already is, orders of magnitude better than what a human can do. They want the car to protect you, but they are limited by the law at the moment. They are working on changing these laws, as many of them do not entirely apply when it comes to automation.


I don't believe that they are limited by the law. I've watched a dozen FSD Beta videos, and all are showing errors that could lead to crashes if the car was driving unmonitored. To my knowledge, FSD Beta is not in any way limited by the law.


----------



## garsh (Apr 4, 2016)

Needsdecaf said:


> I don't believe that they are limited by the law. I've watched a dozen FSD Beta videos, and all are showing errors that could lead to crashes if the car was driving unmonitored. To my knowledge, FSD Beta is not in any way limited by the law.


This thread is about Autopilot, not FSD Beta.


----------



## Needsdecaf (Dec 27, 2018)

garsh said:


> This thread is about Autopilot, not FSD Beta.


OK, so what about Autopilot is limited by law?


----------



## ibgeek (Aug 26, 2019)

Needsdecaf said:


> I don't believe that they are limited by the law. I've watched a dozen FSD Beta videos, and all are showing errors that could lead to crashes if the car was driving unmonitored. To my knowledge, FSD Beta is not in any way limited by the law.


OMG, man! Which of those items you've watched in FSD Beta resulted in an action where the driver was unable to override? ZERO! Honestly, I think you just want to argue, so I'm done. From now on I will call you out on any inaccuracies, but I will no longer engage you directly, because frankly I think you are just a troll.

And of the 86 videos that are currently out regarding FSD Beta, there are many which do not show any errors. So your claim is false. Now back to the actual topic.


----------



## DocScott (Mar 6, 2019)

ibgeek said:


> OMG, man! Which of those items you've watched in FSD Beta resulted in an action where the driver was unable to override? ZERO! Honestly, I think you just want to argue, so I'm done. From now on I will call you out on any inaccuracies, but I will no longer engage you directly, because frankly I think you are just a troll.
> 
> And of the 86 videos that are currently out regarding FSD Beta, there are many which do not show any errors. So your claim is false. Now back to the actual topic.


_No one_ is suggesting that the car be able to do something that can't be overridden. The proposal is that the car not obey the instruction of a pressed accelerator when there is an imminent collision _and the driver has not been applying torque to the wheel_. If for some reason, including legalities, it is necessary for a driver to be able to accelerate at an unsafe speed into what the car judges is an imminent collision, the driver could still do so, simply by applying a slight torque to the wheel while accelerating.
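The rule proposed here (honor the accelerator unless a collision is imminent and no steering torque has been detected) reduces to a small predicate. A minimal sketch; the function and signal names are hypothetical:

```python
def honor_accelerator(accel_pressed, collision_imminent, wheel_torque_detected):
    # Hypothetical sketch of the proposed rule: obey the pedal except
    # in the narrow case where a collision is imminent AND the driver
    # has applied no steering torque (i.e., is likely incapacitated).
    if accel_pressed and collision_imminent and not wheel_torque_detected:
        return False  # ignore the pedal; let the car slow down instead
    return accel_pressed
```

An attentive driver who genuinely wants to accelerate through keeps control simply by holding slight torque on the wheel, which makes the predicate true again.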


----------



## Needsdecaf (Dec 27, 2018)

ibgeek said:


> OMG Man! Which of those items you've watched in FSD Beta resulted in an action where the driver was unable to override? ZERO! Honestly, I think you just want to argue, so I'm done. From now on I will call you out on any inaccuracies, but I will no longer engage you directly because frankly I think you are just a troll.
> 
> And of the 86 videos that are currently out regarding FSD Beta, there are many which do not show any errors. So your claim on that is false. Now back to the actual topic.


Errm, OK. Not a troll. I've owned two Model 3s and put over 50k miles on them. I've got as much standing as anyone to state my opinion, and I will. If you disagree, you're completely entitled to.

You're the one who brought up the claim that Tesla is somehow miraculously better than the average driver but handcuffed by the law. I'm sorry, but what does that have to do with anything? Nothing.

I never claimed, ever, that the driver could not override the car. Ever. But you brought it up, seemingly to imply that's why the car doesn't stop itself when Autopilot is screaming for the driver to take control? So yes, I'm arguing that point, because I don't believe it to be true.

So sorry, not a troll. I pointed out what I feel is a major safety flaw in the Tesla software, which, as someone else showed, allowed a collision with a car while traveling over 130 mph. I find that incredibly concerning. I want Tesla to succeed and to continue growing. But I do not agree with everything they're doing, nor am I willing to give Tesla a pass when they have issues. Don't need to give the haters any more ammo.


----------



## Needsdecaf (Dec 27, 2018)

DocScott said:


> _No one_ is suggesting that the car be able to do something that can't be overridden. The proposal is that the car not obey the accelerator being pressed down when there is an imminent collision _and the driver has not been applying torque to the wheel_. If for some reason, including legalities, it is necessary for a driver to be able to accelerate at an unsafe speed into what the car judges is an imminent collision, the driver could still do so, simply by applying a slight torque to the wheel while accelerating.


Exactly.


----------



## garsh (Apr 4, 2016)

ibgeek said:


> OMG Man! Which of those items you've watched in FSD Beta resulted in an action where the driver was unable to override? ZERO! Honestly, I think you just want to argue, so I'm done. From now on I will call you out on any inaccuracies, but I will no longer engage you directly because frankly I think you are just a troll.


Name calling is not appropriate. Please refrain.

You said that Tesla *can't* override driver inputs due to laws. I believe @Needsdecaf is asking "what laws?". I tried searching around for laws that would be applicable, but I couldn't find any myself.

It's not unreasonable for people to request some evidence to back up this assertion. Do you actually know of any laws? If not, and this is just a presumption, that's ok, but it would be good to clarify.


ibgeek said:


> Tesla can make changes but not ones that can not be overridden. That is not because Tesla doesn't want to. Tesla knows that the automation is going to be and in some cases already is magnitudes better than what a human can do. And they want the car to protect you, but they are limited to the law at the moment.


----------



## francoisp (Sep 28, 2018)

DocScott said:


> _No one_ is suggesting that the car be able to do something that can't be overridden. The proposal is that the car not obey the accelerator being pressed down when there is an imminent collision _and the driver has not been applying torque to the wheel_. If for some reason, including legalities, it is necessary for a driver to be able to accelerate at an unsafe speed into what the car judges is an imminent collision, the driver could still do so, simply by applying a slight torque to the wheel while accelerating.


Bingo! And I would advance that once FSD is "uncrashable", if a human decides to drive, there should be safety measures to prevent him from getting into an accident voluntarily or involuntarily.


----------



## francoisp (Sep 28, 2018)

Needsdecaf said:


> I argue that Full Self Driving would be much better served to make the car uncrashable and let the human drive, rather than let the car drive and wait for the human to save it.


I don't understand this statement. Full Self Driving is about the car driving itself, and the passengers enjoying the ride. Of course, FSD will have to be "uncrashable" for this to happen.


----------



## Needsdecaf (Dec 27, 2018)

FrancoisP said:


> I don't understand this statement. Full Self Driving is about the car driving itself, and the passengers enjoying the ride. Of course, FSD will have to be "uncrashable" for this to happen.


Yes, I agree that FSD will have to be uncrashable in order to be deployed. I'm talking about a completely different postulate. I believe Full Self Driving is something many companies are rushing toward in the belief that most people want it. I believe that is false. I believe a better direction would be to use the technology as the ultimate driver's aid: the driver in control, with the computer as a safety net, always watching. Not a robo-taxi. I'll type more when I'm in front of a real keyboard.


----------



## JasonF (Oct 26, 2018)

garsh said:


> You said that Tesla *can't* override driver inputs due to laws. I believe @Needsdecaf is asking "what laws?". I tried searching around for laws that would be applicable, but I couldn't find any myself.


I think it's more accurate to say that if the car overrides driver inputs, it drops liability for any crashes (or injuries) into Tesla's lap.


----------



## francoisp (Sep 28, 2018)

JasonF said:


> I think it's more accurate to say that if the car overrides driver inputs, it drops liability for any crashes (or injuries) into Tesla's lap.


I don't buy into the liability argument. Cars already have accident-mitigation systems using preemptive seatbelt tensioning and automatic braking. These systems don't prevent an accident but reduce its severity, and I've never heard of any manufacturer being sued for that. Tesla taking over the car when a driver becomes incapacitated should not be an issue. And if it is, it should be discussed with the Transport Administration. To me this is a benefit, not a liability. It's definitely better than letting the car ram another car at 130 mph.


----------



## garsh (Apr 4, 2016)

JasonF said:


> I think it's more accurate to say that if the car overrides driver inputs, it drops liability for any crashes (or injuries) into Tesla's lap.


That's certainly not the case with current laws. Regardless of what driver's aids are available or in use, the driver is still liable for any accidents that happen. You would currently have to take Tesla to court to have them be liable due to some sort of "defect".

Perhaps that's what @ibgeek is talking about? It's not that there's a law against the car overriding driver input - it's just that allowing the driver to override protects the company from being liable in a case like this, since the driver is ultimately responsible AND continues to always have the ability to be ultimately responsible.


----------



## francoisp (Sep 28, 2018)

Needsdecaf said:


> Yes, I agree that FSD will have to be uncrashable in order to be deployed. I'm talking about a completely different postulate. I believe Full Self Driving is something many companies are rushing toward in the belief that most people want it. I believe that is false. I believe a better direction would be to use the technology as the ultimate driver's aid: the driver in control, with the computer as a safety net, always watching. Not a robo-taxi. I'll type more when I'm in front of a real keyboard.


That's a whole different discussion. Let's focus on what Tesla offers today and promises we'll have tomorrow.


----------



## JasonF (Oct 26, 2018)

FrancoisP said:


> I don't buy into the liability argument. Cars already have accident-mitigation systems using preemptive seatbelt tensioning and automatic braking. These systems don't prevent an accident but reduce its severity, and I've never heard of any manufacturer being sued for that. Tesla taking over the car when a driver becomes incapacitated should not be an issue. And if it is, it should be discussed with the Transport Administration. To me this is a benefit, not a liability. It's definitely better than letting the car ram another car at 130 mph.


None of the things you mentioned interferes with the driver's ability to override. The driver can turn off emergency braking, or can push the accelerator to override it. Yes, the driver can force a crash even if the protection systems want to do their best to prevent it.

Of course it would be nice for the car to take over when a driver is incapacitated. But then the question becomes, how does the car know a driver is incapacitated and not just doing something crazy or dangerous? If it's purely speed and ignoring AP warnings, what if they're doing it on purpose and watching a movie while driving? What if they're intentionally taking a nap and the car suddenly slowing abruptly wakes them up and causes them to do something that _leads_ to a crash?

And then, of course, there's liability in the other direction. If the car can safely ignore acceleration input and pull over for someone who's incapacitated and ignoring AP warnings, why wasn't it able to do the same for someone who was intentionally napping? Or that person who was watching a movie while driving? Where do you draw the line of the car taking over? What can you consider an unsafe enough situation to seize control from the driver? Do you force a driver to pull over and stop if they go 50 mph in a school zone because there are potentially kids' lives at stake? Do you have the car notify the police if the driver consistently speeds, drives dangerously, or exhibits signs of driving under the influence? All of those things are safety issues, but the slope becomes really slippery.


----------



## francoisp (Sep 28, 2018)

JasonF said:


> None of the things you mentioned interferes with the driver's ability to override. The driver can turn off emergency braking, or can push the accelerator to override it. Yes, the driver can force a crash even if the protection systems want to do their best to prevent it.
> 
> Of course it would be nice for the car to take over when a driver is incapacitated. But then the question becomes, how does the car know a driver is incapacitated and not just doing something crazy or dangerous? If it's purely speed and ignoring AP warnings, what if they're doing it on purpose and watching a movie while driving? What if they're intentionally taking a nap and the car suddenly slowing abruptly wakes them up and causes them to do something that _leads_ to a crash?
> 
> And then, of course, there's liability in the other direction. If the car can safely ignore acceleration input and pull over for someone who's incapacitated and ignoring AP warnings, why wasn't it able to do the same for someone who was intentionally napping? Or that person who was watching a movie while driving? Where do you draw the line of the car taking over? What can you consider an unsafe enough situation to seize control from the driver? Do you force a driver to pull over and stop if they go 50 mph in a school zone because there are potentially kids' lives at stake? Do you have the car notify the police if the driver consistently speeds, drives dangerously, or exhibits signs of driving under the influence? All of those things are safety issues, but the slope becomes really slippery.


There is a saying: perfection is the enemy of the good. No one's claiming that whatever solution Tesla implements has to handle all situations. All that is being said is that once the driver becomes unresponsive, voluntarily or not, the car, when under Autopilot's control, could and should disregard accelerator input, safely take itself out of traffic, park, and call the cops (I'm kidding). Possibly, only then would the driver be allowed to regain control.
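The escalation being described (warn, then take over, then park, then hand back control only to a responsive driver) can be sketched as a tiny state machine. This is a toy illustration; the state names and transitions are invented and do not describe any real vehicle's behavior.

```python
# Toy state machine for the unresponsive-driver sequence discussed above.
# NORMAL: driver in control.  WARNING: escalating alerts, driver still
# has control.  TAKEOVER: accelerator input disregarded, car exits
# traffic.  PARKED: stopped until the driver responds again.

def next_state(state: str, driver_responsive: bool) -> str:
    if state == "NORMAL":
        return "NORMAL" if driver_responsive else "WARNING"
    if state == "WARNING":
        # A responsive driver cancels the escalation at any time.
        return "NORMAL" if driver_responsive else "TAKEOVER"
    if state == "TAKEOVER":
        # The car is committed to getting out of traffic and parking.
        return "PARKED"
    # PARKED: control is handed back only once the driver responds.
    return "NORMAL" if driver_responsive else "PARKED"

# Walk an unresponsive driver through the full sequence.
state = "NORMAL"
for responsive in (False, False, False):
    state = next_state(state, responsive)
print(state)  # PARKED
```

One design choice worth noting: once in TAKEOVER, the machine ignores the responsiveness signal until parked, matching the idea that the driver regains control only after the car is safely stopped.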


----------



## DocScott (Mar 6, 2019)

Liability is likely one reason that you don't need to have AP to get some of the emergency safety features that use AP's abilities. If they didn't do that, you'd have a 737 MAX-type situation, where a car had the ability to avoid or reduce the severity of an accident but didn't because the owner hadn't paid for a safety feature. I don't know if I'd want to face that one in court if I were Tesla. That's in addition to the fact that it's good ethics and good PR to provide safety features to everyone with the necessary hardware once they get past the beta stage.

For that same reason, I expect we'll see a raft of emergency features for HW3 cars without FSD, once FSD features start to exit beta. For example, maybe a HW3 car without FSD will slam on the brakes if it looks like a red light is about to be run, particularly if it sees potential cross-traffic and no one is tailgating. The FSD braking for red would activate earlier and more smoothly, but the non-FSD version would kick in during an emergency. As with all the current emergency features, there could be ways for the driver to override when needed.


----------



## M3OC Rules (Nov 18, 2016)

I think there are cases where it seems like the car could intervene, but there is a big difference between the car controlling steering, acceleration, signaling, and braking, and the car overriding the driver. I don't think the steering wheel can be argued to be an incapacitated-driver sensor. A terrifying alarm also doesn't intuitively tell you to turn the wheel; I've had it go off without knowing why. Sad to say, but early in my ownership I sometimes forgot whether Autosteer was on or not. I think there's a real possibility of a false positive, where the driver doesn't understand what's going on and the intervention itself causes an accident. It had better not have false positives if it overrides.

I think that's the difference between this and other overrides like emergency braking: this one could cause an accident.

Another possible issue is that many people don't want it. Read the Jalopnik reader comments on the recent driver-assist article. Maybe Tesla doesn't care.
https://jalopnik.com/another-study-shows-just-how-careless-self-driving-tech-1845717398


----------



## JasonF (Oct 26, 2018)

FrancoisP said:


> There is a saying: perfection is the enemy of the good. No one's claiming that whatever solution Tesla implements has to handle all situations. All that is being said is that once the driver becomes unresponsive, voluntarily or not, the car, when under Autopilot's control, could and should disregard accelerator input, safely take itself out of traffic, park, and call the cops (I'm kidding). Possibly, only then would the driver be allowed to regain control.


That still raises the question of how the car can tell that a driver is incapacitated. It's actually possible for someone to zone out while fully awake and simply not hear the Autopilot alarm. Some people are just naturally very still while they're wide awake, so detecting lack of movement in the cabin won't work either.

I know it shouldn't matter, since it's only a slight annoyance for the car to pull itself over and stop because you weren't paying attention to AP alerts... but in reality, an inattentive or incapacitated driver is just as likely to turn up in heavy traffic as on a lonely road, and they could just as easily be going slower than traffic rather than faster. Meaning that if the driver is indeed _not_ incapacitated, it becomes a _danger_ rather than just an annoyance for the car to declare an emergency and pull over suddenly.

Once Full Self Drive is more mature, and once Autopilot can use the interior camera for verification, I might be more in favor of the car taking over in highly dangerous situations to get you safely pulled over. It might even be helpful if the controls and camera can detect that you're inattentive and unresponsive, pull over safely for you, and call for help. Because at that point, if you're _that_ unresponsive, chances are you're either having some kind of medical incident, or you're very drunk and will end up killing yourself in a crash otherwise. It's just that right now, I'm afraid the car might cause more danger trying to resolve the situation.

Still, though, once that barn door is open, the horses have escaped, and there's no getting them back. If a feature like that works really well and becomes widespread, the next thing you _will_ see eventually are cars that are legislated to obey the instructions of police and pull over. It might be to prevent high-speed pursuits, which might be okay...but then again, it might be a cop sitting on the side of the road who orders your car to pull over because you were 5 mph over the speed limit, just because it's so easy for them to do, vs deciding if it's worthwhile to catch up to you and stop you over such a small infraction.


----------



## francoisp (Sep 28, 2018)

JasonF said:


> That still raises the question of how the car can tell that a driver is incapacitated. It's actually possible for someone to zone out while fully awake and simply not hear the Autopilot alarm. Some people are just naturally very still while they're wide awake, so detecting lack of movement in the cabin won't work either.


I've said my point. I'm done.


----------



## JasonF (Oct 26, 2018)

I thought this thread was kind of a lively and intelligent discussion of the possibilities and pitfalls (legal and technical) of making Autopilot smarter and safer. I'm a little disappointed that discussion apparently isn't allowed, and that it's instead a thread directing that it be done. I guess the only answer I have to that, then, is sorry, I don't have the power to make it happen, and I don't think anyone at Tesla will even read this thread.


----------



## Needsdecaf (Dec 27, 2018)

JasonF said:


> I thought this thread was kind of a lively and intelligent discussion of the possibilities and pitfalls (legal and technical) of making Autopilot smarter and safer. I'm a little disappointed that discussion apparently isn't allowed, and that it's instead a thread directing that it be done. I guess the only answer I have to that, then, is sorry, I don't have the power to make it happen, and I don't think anyone at Tesla will even read this thread.


Some people don't want an open or honest discussion, and/or don't want to hear that maybe FSD isn't something we should be rushing toward as an inevitability.

My point in posting all of this was to spark that discussion. I'm not anti-FSD, but I am against the way Tesla is releasing this software as a beta.


----------



## ibgeek (Aug 26, 2019)

garsh said:


> Name calling is not appropriate. Please refrain.
> 
> You said that Tesla *can't* override driver inputs due to laws. I believe @Needsdecaf is asking "what laws?". I tried searching around for laws that would be applicable, but I couldn't find any myself.
> 
> It's not unreasonable for people to request some evidence to back up this assertion. Do you actually know of any laws? If not, and this is just a presumption, that's ok, but it would be good to clarify.


My apologies for letting my temper get the best of me. But I'm not going to argue with those who don't agree with me. I put info in the forum to assist and inform. There is nothing in it for me if I post BS. 
As I stated prior, I've asked for clarification from my contact. I do want to clarify one thing. It's a regulatory limitation.


----------



## garsh (Apr 4, 2016)

ibgeek said:


> I do want to clarify one thing. It's a regulatory limitation.


Are you able to get any more information about the regulation? I'd really like to read and learn more about what this limitation is.


----------



## Mr.K (Jan 6, 2019)

garsh said:


> Are you able to get any more information about the regulation? I'd really like to read and learn more about what this limitation is.


A quick Google search turns up this, but it also seems to be a bit of a patchwork, with different regulations per state.

https://en.wikipedia.org/wiki/Self-driving_car

*Legal status in the United States*
In Washington, DC's district code: 
...
An autonomous vehicle may operate on a public roadway; provided, that the vehicle: 

(1) Has a manual override feature that allows a driver to assume control of the autonomous vehicle at any time;
...


----------



## ibgeek (Aug 26, 2019)

Mr.K said:


> A quick Google search turns up this, but it also seems to be a bit of a patchwork, with different regulations per state.
> 
> https://en.wikipedia.org/wiki/Self-driving_car
> 
> ...


Thanks, yeah I wouldn't be surprised if there are many variations given how different driving laws can be from town to town.

@garsh I have asked for more info but I don't want to be a pain if you know what I mean.


----------



## Needsdecaf (Dec 27, 2018)

Mr.K said:


> A quick Google search turns up this, but it also seems to be a bit of a patchwork, with different regulations per state.
> 
> https://en.wikipedia.org/wiki/Self-driving_car
> 
> ...


OK, so that still doesn't inhibit Autopilot / FSD from working. It just requires a manual override, which it has.

Moreover, my point was that the car should override the driver's commands if it senses that the driver is not actually operating the vehicle. This law does not seem counter to that in any way. In the case I am discussing, as well as in the YouTube video, it's clear that the driver is ignoring the warnings to take control. The vehicle should automatically enter emergency stopping procedures at that point, even if the pedal is pressed.


----------

