# Navigate on autopilot should NEVER override a safe driver instruction



## lance.bailey (Apr 1, 2019)

I had NoA override a lane change I initiated and cause a hazardous situation.

Here is the situation.

About a km before my freeway exit on northbound 99, there is a merge of traffic from 91 onto 99. Their on ramp quickly ends at the start of a bus lane. This morning, as is typical, there were 4 or 5 cars trying to merge from 91 onto 99 into the right lane, where I was driving. There was nobody in the left (passing) lane. I was driving with Navigate on Autopilot, which knew about, and was warning about, the upcoming exit from 99.

To make room for the cars (and to avoid slowing to their speed, etc.) I indicated a lane change to the empty left lane with a pull down of the left wand (not a tap, but a full pull). The car started to change lanes as requested, but a third of the way through the maneuver it flashed up the familiar "changing lanes to follow route" message and abruptly swung back into the right lane.

Now, the other cars were still on their on ramp, but with signals on and the lane ending, they were about to enter the same lane into which I had just abruptly swung back. I had my left signal on (from the wand pull), so they had every indication I was leaving that lane - until the car overrode that wand-pull command.

I don't care if the driver command is contrary to what NoA wants to do, or even if the driver command will cause an exit to be missed, but as long as the driver command is safe, this should be the rule:

*If the driver tells the car to do something which is safe to do, then do it - the driver is in charge of the car.*
It's that simple. If I hadn't taken over control of the car, there would have been a side swipe or two.
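The proposed rule could even be sketched in code. This is a purely hypothetical illustration - the types, fields, and safety check below are my own invention, not Tesla's actual logic:

```python
# Hypothetical arbitration rule for mid-maneuver lane-change conflicts.
# Purely illustrative: the names and the safety check are invented,
# not Tesla's actual implementation.
from dataclasses import dataclass

@dataclass
class LaneChangeRequest:
    source: str      # "driver" or "route_planner"
    target_lane: int
    is_safe: bool    # outcome of the car's own safety assessment

def arbitrate(active: LaneChangeRequest,
              competing: LaneChangeRequest) -> LaneChangeRequest:
    """A safe, driver-initiated maneuver already in progress wins."""
    if active.source == "driver" and active.is_safe:
        return active        # never abort a safe driver command mid-maneuver
    return competing         # otherwise defer to the route planner

driver_cmd = LaneChangeRequest(source="driver", target_lane=1, is_safe=True)
route_cmd = LaneChangeRequest(source="route_planner", target_lane=2, is_safe=True)
print(arbitrate(driver_cmd, route_cmd).source)  # driver
```

Under this sketch, the wand pull in the scenario above would stay in force, and the route planner would have to wait until the maneuver completed before requesting a lane change back.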


----------



## FF35 (Jul 13, 2018)

It's still in Beta. It will improve. It works well more than 95% of the time. As long as you're prepared to take over, there shouldn't be any side swipes.


----------



## lance.bailey (Apr 1, 2019)

I understand that it is beta - all the more reason to listen to and not override a safe driver command partway through the maneuver. If the car wants to immediately request a lane change back after completing the maneuver, that is fine, but aborting a safe driver command partway through with an abrupt swing of the car is not cool.

What I would love is a button on the screen to quickly flag "AutoPilot got it wrong" moments. The car could then send logs and details to the mothership. Abuse or overuse of the button would disable Autopilot, since overuse would indicate a problem serious enough that Autopilot is unsafe to use and a service check-up is warranted.


----------



## Greg Appelt (Sep 27, 2018)

Hopefully your interruption of AP is flagged as an edge case that gets reviewed so they can work on a solution. But, as you said and as everyone needs to stay vigilant about, the driver is ultimately responsible for any action the car takes.


----------



## FF35 (Jul 13, 2018)

lance.bailey said:


> I understand that it is beta - all the more reason to listen to and not override a safe driver command partway through the maneuver. If the car wants to immediately request a lane change back after completing the maneuver, that is fine, but aborting a safe driver command partway through with an abrupt swing of the car is not cool.
> 
> What I would love is a button on the screen to quickly flag "AutoPilot got it wrong" moments. The car could then send logs and details to the mothership. Abuse or overuse of the button would disable Autopilot, since overuse would indicate a problem serious enough that Autopilot is unsafe to use and a service check-up is warranted.


When you disengage, that is the "autopilot got it wrong" moment.


----------



## jrzapata (Apr 23, 2018)

Glad you got away from the situation without any issues. You can always send this to [email protected]


----------



## lance.bailey (Apr 1, 2019)

FF35 said:


> When you disengage, that is the "autopilot got it wrong" moment.


Not always. There have been other threads discussing how disengaging with a twist is a valid way of returning to TACC only. Sometimes I just want to drive the fun toy. Sometimes I'm too heavy with my hand and it disengages. Sometimes ....

I am not so sure that Tesla has divisions full of people who go over every disengagement that happens - that would be far too much. I know that if I am on a service call they can pull up messages and log alerts.

Once I called service about the exclamation mark that kept appearing in the upper right corner of the screen. Service was able to pull up the event and tell me what it was. Very cool. But until I asked about the exclamation mark alert, not a single person had looked at my car's logs.

I want a button on the screen that lets me flag that "beta has had a learning moment" so that someone does take a look at the moment. They could even contact the driver to get more details.

You see - and I'm speaking as someone who has designed software systems for over 30 years, for use by people of all levels of computer savvy - beta testing absolutely requires user feedback so that the designer can understand real-world use and experience. When I started out I would hold the opinion that "they aren't using it right" or "the system wasn't designed to handle that", and that was my tail wagging the user's dog.


----------



## FF35 (Jul 13, 2018)

lance.bailey said:


> Not always. There have been other threads discussing how disengaging with a twist is a valid way of returning to TACC only. Sometimes I just want to drive the fun toy. Sometimes I'm too heavy with my hand and it disengages. Sometimes ....
> 
> I am not so sure that Tesla has divisions full of people who go over every disengage moment that happens - that would be way too much. I know that if I am on a service call they can pull up messages and log alerts.
> 
> ...


I don't think they go over every disengagement. I think they're reviewed in groups. Could be wrong but that's my interpretation.


----------



## lance.bailey (Apr 1, 2019)

jrzapata said:


> Glad you got away from the situation without any issues. you can always send this to [email protected]


I have no faith that [email protected] is of any use.

I sent a report when neither my wife nor I could remotely lock the car with our phone apps after we discovered the car unlocked.

Two weeks later I got the response that I should have rebooted the screen (along with instructions on how to hold down the scroll wheels and so on). Brilliant. The problem as reported involved me being away from the car - how could I reach the scroll wheels to reboot when the car is kilometers away?


----------



## lance.bailey (Apr 1, 2019)

FF35 said:


> I don't think they go over every disengagement. I think they're reviewed in groups. Could be wrong but that's my interpretation.


Exactly. They are not individually reviewed - hence the need for the "hey, NoA blew chunks just now" button.

They may want to work on that wording.


----------



## FF35 (Jul 13, 2018)

lance.bailey said:


> I have no faith that [email protected] is of any use.
> 
> I sent a report when neither my wife nor I could remotely lock the car with our phone apps after we discovered the car unlocked.
> 
> Two weeks later I got the response that I should have rebooted the screen (along with instructions on how to hold down the scroll wheels and so on). Brilliant. The problem as reported involved me being away from the car - how could I reach the scroll wheels to reboot when the car is kilometers away?


You're trying to draw a correlation between Autopilot reports and normal car problems. AP disengagements/reports sit at a much higher level in the review process (a lot of it is automated) than a problem with unlocking the car.


----------



## lance.bailey (Apr 1, 2019)

Not really making a correlation - more that I was giving an example of the ludicrously low skill level of the responses I have been getting, and the level I now expect from emailing [email protected]


----------



## kort6776 (Apr 30, 2019)

The AP system will never override the operator's disengaging of the system. This is just a case of someone not assessing a situation and placing too much faith and dependence in a system that is BETA.


----------



## lance.bailey (Apr 1, 2019)

That is not what I said. My complaint is that I asked the car to change lanes with a wand pull. The maneuver started, and partway through the car swerved back to maintain the route. I disengaged AP successfully to take control and never said otherwise. I did assess the driving situation and decided that changing lanes was required. The car decided it knew better.

As originally posted, AP overrode a driver instruction, not a driver disengagement. I might say that this is a case of a follow-up poster not reading the thread.


----------



## kort6776 (Apr 30, 2019)

lance.bailey said:


> That is not what I said. My complaint is that I asked the car to change lanes with a wand pull. The maneuver started, and partway through the car swerved back to maintain the route. I disengaged AP successfully to take control and never said otherwise. I did assess the driving situation and decided that changing lanes was required. The car decided it knew better.


Something does not add up. If you DISENGAGED the AP system, just how was the car able to "decide" differently?


----------



## lance.bailey (Apr 1, 2019)

I disengaged after the swerve, which was caused by the car following the route with NoA (Navigate on Autopilot) to take the next exit.


1. Car is in Navigate on AP.
2. I am in the right lane.
3. I initiate a lane change to the left lane with a wand pull.
4. Car begins to change to the left lane.
5. Car flashes a "changing lanes to follow route" message.
6. Car swerves back to the right lane.
7. I disengage AP with a wheel twist to the left.

The car's decision to follow the route and swerve back to the right came before my disengagement.


----------



## JasonF (Oct 26, 2018)

lance.bailey said:


> That is not what I said. My complaint is that I asked the car to change lanes with a wand pull. The maneuver started, and partway through the car swerved back to maintain the route. I disengaged AP successfully to take control and never said otherwise. I did assess the driving situation and decided that changing lanes was required. The car decided it knew better.


I believe that when you disengage AP suddenly, it saves the "event" as whatever happened immediately before the disengagement. So if AP threw up a warning and you then shut it off, Tesla knows that happened. In your case, it would save that it was attempting to change lanes to follow NoA, and that you shut it off to abort it.

Those events are probably reviewed in groups to find trends. If a lot of people are cutting off NOA lane changes, Tesla recognizes there is a problem with that, and then they might start reviewing aggregate event details to see exactly what's causing so many drivers to have to interrupt NOA lane changes.

If you want to draw more attention to it, though, push the voice activation button and say "Bug report: navigate on autopilot overrides the driver and forces a lane change even when it's unsafe".

Edit for everyone else: Please try not to be dismissive of the OP. It's traumatic to get used to, and start to rely on, Autopilot, and then have it very suddenly try to kill you in heavy traffic. He's most definitely a little freaked out about it. I know there isn't a lot we can do to fix it, but we can calmly explain stuff like the paragraph above.
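The kind of grouped review described above could look something like this in spirit. A toy sketch only - the event names and fields are made up, and this has nothing to do with Tesla's real pipeline:

```python
# Toy sketch: reviewing disengagement events in aggregate to find trends.
# Event names and fields are invented for illustration.
from collections import Counter

events = [
    {"last_action": "noa_lane_change", "outcome": "driver_disengaged"},
    {"last_action": "noa_lane_change", "outcome": "driver_disengaged"},
    {"last_action": "phantom_brake",   "outcome": "driver_disengaged"},
    {"last_action": "noa_lane_change", "outcome": "completed"},
]

# Count disengagements by what the car was doing just beforehand.
trend = Counter(e["last_action"] for e in events
                if e["outcome"] == "driver_disengaged")
print(trend.most_common(1))  # [('noa_lane_change', 2)]
```

If aborted NoA lane changes dominate the counts, that category gets flagged for closer review - which is the trend-finding idea sketched above.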


----------



## kort6776 (Apr 30, 2019)

lance.bailey said:


> I disengaged after the swerve, which was caused by the car following the route with NoA (Navigate on Autopilot) to take the next exit.
> 
> The car's decision to follow the route and swerve back to the right came before my disengagement.


That is contrary to what you claimed in earlier posts. Have a nice day and please drive safely.


----------



## lance.bailey (Apr 1, 2019)

@kort6776 I just reread my posts in this thread - please help me here: I do not see where I claimed otherwise in earlier posts. Can you point that out? I try to be as clear as possible and would appreciate you showing me the contradiction.

Thanks.


----------



## lance.bailey (Apr 1, 2019)

For the record, I am not "freaked out" or "traumatized". I disagree with the actions of the car, and I have no faith in the reporting mechanisms of Tesla.

I never claimed the car tried to kill me, and I take exception to the suggestion that something needs to be fixed.


----------



## iChris93 (Feb 3, 2017)

I wonder if the car thought it was an unsafe lane change. I would not expect NoA to cancel it otherwise.


----------



## lance.bailey (Apr 1, 2019)

iChris93 said:


> I wonder if the car thought it was an unsafe lane change. I would not expect NoA to cancel it otherwise.


The left lane was clear. The right lane (my current lane) had a car ahead of me by 6 or so lengths and 4 or 5 cars coming in from the on ramp.


----------



## iChris93 (Feb 3, 2017)

lance.bailey said:


> The left lane was clear. The right lane (my current lane) had a car ahead of me by 6 or so lengths and 4 or 5 cars coming in from the on ramp.


I'm sure it was safe from what you're saying, but sometimes my lane changes are swervy for no reason, and I have to assume it's falsely flagging something.


----------



## lance.bailey (Apr 1, 2019)

Heh - do we start calling those "phantom swerves", the way we refer to "phantom brakes"?

Perhaps the car did "see" something in the lane. We need to get a run of "I brake for hallucinations" bumper stickers.


----------



## racekarl (Jul 31, 2018)

Maybe there is a disconnect in how we think about control vis-a-vis navigate on autopilot. An alternative way to look at it is that you didn't really "tell the car to do something" - that's what the steering wheel is for. You "asked" the computer to alter its course - perhaps it didn't think it could reconcile that request with its overall goal (approaching an exit it needs to take, and courtesy notwithstanding, it had no reason to change lanes since it has the right of way). It should have responded better than by half-attempting it then changing its mind, certainly. It did do marginally better than HAL9000 when faced with contradictory commands, so you've got that going for you!


----------



## lance.bailey (Apr 1, 2019)

Interesting, @racekarl - I hadn't considered that mindset.


----------



## jrzapata (Apr 23, 2018)

lance.bailey said:


> I have no faith that [email protected] has any use.
> 
> I sent a report when neither my wife or I could remotely lock the car with our phone apps when we discovered the car unlocked.
> 
> Two weeks later I got the response that I should have rebooted the screen (along with instructions on how to hold down the scroll wheels and so on). Brilliant. The problem as reported defined me being away from the car - how could I reach the scroll wheels to reboot when the car is kilometers away?


Details!


----------



## slacker775 (May 30, 2018)

racekarl said:


> Maybe there is a disconnect in how we think about control vis-a-vis navigate on autopilot. An alternative way to look at it is that you didn't really "tell the car to do something" - that's what the steering wheel is for. You "asked" the computer to alter its course - perhaps it didn't think it could reconcile that request with its overall goal (approaching an exit it needs to take, and courtesy notwithstanding, it had no reason to change lanes since it has the right of way). It should have responded better than by half-attempting it then changing its mind, certainly. It did do marginally better than HAL9000 when faced with contradictory commands, so you've got that going for you!


This is exactly how I'm reading this situation. This is the sort of 'interpretation' stuff that really gets difficult between computers/AI and people. And coding the 'proper' behavior for this scenario is also extremely difficult. One person will feel 'the car knows my exit is coming up, it should have never started the lane change' whereas someone else would feel 'the car should do as I say, miss the exit if it has to, just do what I say'. Both users would be right, and yet both would also be kind of wrong.

And this is why full autonomy would ultimately require no steering wheel at all. Otherwise, either the person in the driver's seat or the computer is a back-seat driver.


----------



## Long Ranger (Jun 1, 2018)

iChris93 said:


> I wonder if the car thought it was an unsafe lane change. I would not expect NoA to cancel it otherwise.


This is exactly what I think. Auto lane change under AP or NOA has always had a problem with abruptly aborted lane changes for no good reason. I think the timing of the route notification was just a coincidence. Those route based lane changes are always extremely slow to initiate, in my experience.

I started holding the turn signal on until the lane change is nearly complete, and this seemed to prevent the aborted attempts on 2019.8.5. Not sure yet on 2019.12.1.2.

Also, this may not apply here, but one possible explanation for these aborted lane changes is this comment from the manual:

> Midway through the lane change, Auto Lane Change must be able to detect the target lane's outside lane marking. If this lane marking cannot be detected, the lane change is aborted and Model 3 returns to its original driving lane.


----------



## FF35 (Jul 13, 2018)

What I’ve found, especially in foggy conditions, wet weather, or with poor lane markings, is that AP works much better when white reflectors are installed along the dashed lines on the highway. That’s most likely because AP can see the reflections and knows where the lanes are.

Next time you’re on the highway, take notice of whether there are reflectors. You’ll see that AP works better in all conditions where they’re installed.


----------



## M3OC Rules (Nov 18, 2016)

I had a similar situation where it wanted to be in a certain lane for an exit that was a ways ahead. I cancelled the lane change via the screen button, but it just kept trying to change lanes. It was definitely a HAL moment, but turning off NoA stopped the madness. Based on my experience I'm not surprised by what happened to you, and I agree it's not good at all. Add it to the list...


----------



## MJJ (Aug 7, 2016)

People be like “my car doesn’t drive as good as me”

Tesla be like “we’re just trying not to kill you”

Keep in mind that Tesla’s end game is to NOT have you be in control. If you had not been intent on changing lanes into the clear lane (a good choice to be sure), what would have happened? Maybe there was a different plan, one more compatible with AI, that would have unfolded, uneventfully. Maybe instead of maintaining speed in a different lane, the car would have slowed and merged.

This is a very awkward phase we are in. We are responsible. We are expected to take over in case of error. But I find that if I expect AP to respond the way I would have, I’m frequently disappointed. If I let it do its job, I’m usually pleasantly surprised.


----------



## lance.bailey (Apr 1, 2019)

I hear you, @MJJ, and one day the car will see four cars with signal lights indicating that they want to be in the lane I am already in, with a clear lane I could use to make room for them. I have heard rumour that Tesla is looking at the signal lights of other cars to do just that, and I imagine down the road (intentional pun) the cars will sort it out amongst themselves. Hopefully not using the CSMA/CD algorithms of Ethernet.

But we are not there yet, so I keep coming back to that "feedback button" which would raise awareness that the car could have done better in a given situation.


----------



## MelindaV (Apr 2, 2016)

I think in cases like this, it is best to disable NoA and go with standard autosteer to position yourself in the lane you want to be in.


----------

