Phantom braking & false positive analysis suggestions to Tesla


  • It's OK to discuss software issues here but please report bugs to Tesla directly at servicehelpna@teslamotors.com if you want things fixed.

msjulie

Top-Contributor
TOO Supporting Member
Joined
Feb 6, 2018
Messages
504
Location
San Fran Bay Area, Ca
Tesla Owner
Model 3
Country
#1
For those getting the phantom braking, are you all recording a bug report with each one?

I had more than half a dozen on one drive, 'bug reported' each via the voice interface, then had one that was so alarming I called support. After talking for about 20 minutes (this was early this year, when you could still get them on the phone), they decided to schedule a camera calibration service visit. Long story short, after about 15 minutes at service they sent me on my way - beta software. Hmmm
 

DocScott

Top-Contributor
TOO Supporting Member
Joined
Mar 6, 2019
Messages
421
Location
Westchester, NY
Tesla Owner
Model 3
Country
#2
That is efficient for people, yes, for computers, no. It would be mind-blowing to have a computer positioned on the street corner to scan each passer-by, diagnose and refer the sick ones to physicians. That would really do a lot for preventive medicine.
For preventive medicine, that would be...terrible.

You know all the controversy over guidelines for screening tests? For example, at what age should mammograms be performed for women with no other risk factors? Or prostate cancer screening for men?

The reason to not just perform them for everyone every checkup is not primarily the expense of the test. Instead, it's that false positives create a lot of problems, including unnecessary stress for the patient. The health risk factors associated with that extra stress might, in some cases, cause more harm among a population than the harm prevented by the diseases that wouldn't have been detected as early without the screening.

This is always the problem with screening for rare-but-terrible events. Suppose a certain kind of cancer affects one person in a hundred thousand of a given age. Suppose also that the false positive rate is only 0.1%. If you screen everyone, then 99% of your positive results on the screening are false positives. At that point, you have to weigh the problems associated with false positives (e.g. invasive follow-up tests that might themselves have health risks, stress, resources) vs. the additional detections.

It's pretty much the same as the phantom braking problem. (At least the severe "panic braking" variety. The more gradual version which seems to be associated with getting the safe speed for the section or road wrong while on TACC, AP, or NOA is a different kettle of fish.)
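The screening arithmetic above can be checked directly. A quick sketch using the post's numbers (1-in-100,000 prevalence, 0.1% false-positive rate) and assuming, for simplicity, that the test catches every true case:

```python
# Hypothetical numbers from the post: prevalence 1 in 100,000,
# false positive rate 0.1%, and an assumed perfect detection rate.
prevalence = 1 / 100_000
false_positive_rate = 0.001
sensitivity = 1.0  # assumption for simplicity, not from the post

population = 1_000_000
true_cases = population * prevalence                               # 10 people
true_positives = true_cases * sensitivity                          # 10 detected
false_positives = (population - true_cases) * false_positive_rate  # ~1000

ppv = true_positives / (true_positives + false_positives)
print(f"False positives per true positive: {false_positives / true_positives:.0f}")
print(f"Share of positives that are false: {1 - ppv:.1%}")   # prints 99.0%
```

About a hundred false alarms per real detection, so roughly 99% of all positives are false, matching the post's figure.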
 

Dr. J

Private
TOO Supporting Member
Joined
Sep 1, 2017
Messages
1,355
Location
Fort Worth
Tesla Owner
Model 3
Country
#3
For preventive medicine, that would be...terrible.

You know all the controversy over guidelines for screening tests? For example, at what age should mammograms be performed for women with no other risk factors? Or prostate cancer screening for men?

The reason to not just perform them for everyone every checkup is not primarily the expense of the test. Instead, it's that false positives create a lot of problems, including unnecessary stress for the patient. The health risk factors associated with that extra stress might, in some cases, cause more harm among a population than the harm prevented by the diseases that wouldn't have been detected as early without the screening.

This is always the problem with screening for rare-but-terrible events. Suppose a certain kind of cancer affects one person in a hundred thousand of a given age. Suppose also that the false positive rate is only 0.1%. If you screen everyone, then 99% of your positive results on the screening are false positives. At that point, you have to weigh the problems associated with false positives (e.g. invasive follow-up tests that might themselves have health risks, stress, resources) vs. the additional detections.

It's pretty much the same as the phantom braking problem. (At least the severe "panic braking" variety. The more gradual version which seems to be associated with getting the safe speed for the section or road wrong while on TACC, AP, or NOA is a different kettle of fish.)
Umm, my mythical street corner screening was assumed to achieve a higher level of accuracy, I suppose. :) The example you give of screening everyone for an extremely rare cancer is, sure, a dumb idea. Not to mention expensive.

And I don't understand the parallel to phantom braking. We already have the positives. The problem is sorting out the false positives from the real positives and figuring out how to avoid the false ones in the future. I postulate that even though neither you (probably) nor I have ever experienced a legitimate AP braking event, they probably do happen somewhere to somebody. You're not suggesting that Tesla remove emergency braking entirely from AP, are you?
 

Frully

Top-Contributor
Joined
Aug 30, 2018
Messages
1,042
Location
Calgary, AB. Canada
Tesla Owner
Model 3
Country
#4
As a scientist/engineer who has worked on many problems where false positives had to be held below one in 10K or 100K events, and given that other cars with TACC are not exhibiting this high frequency of false positives, I think this should be a priority for Tesla to resolve. Unfortunately, they don't provide the user with enough information to help them solve the issue.

For example, one of the problems I worked on was chemical agent detection. We needed detectors that could detect quickly enough, and at low enough levels, to give our servicemen time to put on protective gear. It is a lot of work, and very uncomfortable in high-heat desert environments, to put on this gear based on a false positive. People naturally start to ignore the warning after a certain number of false positives, or they quit using your system and revert to the famous Kuwaiti chickens as a more reliable means of getting a warning (the canary in the coal mine is another). In our case, we needed to develop better and more reliable ways to test the systems and keep them properly calibrated.

For example, Tesla could show us what the car thought it was reacting to and ask us to confirm whether the danger was real. Then it could learn to distinguish the ghosts from the real threats. Without this type of real-time driver input, there is no way, just looking at the data after the fact, that Tesla can identify and fix the problem.
I would argue (at least from my understanding) that Tesla already DOES record every false positive.

How? From how the driver reacted. If it phantom-braked and you quickly torqued the wheel to take over, or mashed the accelerator to undo the slowdown...it knows that was probably a false positive.
Every time AP is disengaged by the user, particularly by torquing the wheel, that's a flag that something was probably wrong.

It's also, as I understand it, why whenever you park after a long drive, tracking the wifi shows scads of data being sent back to the mothership.
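The labeling heuristic described above can be sketched: treat an AP braking event followed quickly by a driver override as a probable false positive. The event names and the 2-second window are my own illustrative assumptions, not Tesla's actual telemetry schema:

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float     # seconds since trip start
    kind: str    # "ap_brake", "accel_override", "wheel_torque" (hypothetical names)

def probable_false_positives(events, window=2.0):
    """Flag AP braking events that the driver overrode within `window`
    seconds -- the signal the post suggests Tesla already records."""
    flagged = []
    for i, e in enumerate(events):
        if e.kind != "ap_brake":
            continue
        for later in events[i + 1:]:
            if later.t - e.t > window:
                break
            if later.kind in ("accel_override", "wheel_torque"):
                flagged.append(e)
                break
    return flagged

trip = [Event(10.0, "ap_brake"), Event(10.8, "accel_override"),
        Event(55.0, "ap_brake")]            # second braking event not overridden
print(len(probable_false_positives(trip)))  # prints 1
```

The uncontested second braking event is left alone: by this heuristic, braking the driver accepts is presumed legitimate.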
 

lance.bailey

Top-Contributor
Joined
Apr 1, 2019
Messages
423
Location
cloverdale, BC
Tesla Owner
Model 3
Country
#6
I don't have a great answer for why it's taking so long, but keeping in mind that humans are error-prone, I would respectfully disagree that drivers are by definition the best judge of anything related to driving. Eyewitnesses often identify the wrong person as a suspect in crime, people are often notoriously bad at choosing life partners, and drivers in accidents will often attribute blame to anything but their own actions.
we may be coming at the same issue with different points. i agree with you that humans are horrid at witnessing *what* happened. But I think they are pretty good at identifying *when* something happened.

For me, the issue is people flagging *when* unnecessary/unexpected braking happens (aka "phantom braking"). Tesla literally has a fleet of computers driving around deciding when to brake, but people keep complaining that the computers are getting it wrong.

What I would like is a way for people to be able to identify "hey, the car got it wrong here" so that the mothership analysts can hone in on the situations that people want corrected.
 

Dr. J

Private
TOO Supporting Member
Joined
Sep 1, 2017
Messages
1,355
Location
Fort Worth
Tesla Owner
Model 3
Country
#7
we may be coming at the same issue with different points. i agree with you that humans are horrid at witnessing *what* happened. But I think they are pretty good at identifying *when* something happened.

For me, the issue is people flagging *when* unnecessary/unexpected braking happens (aka "phantom braking"). Tesla literally has a fleet of computers driving around deciding when to brake, but people keep complaining that the computers are getting it wrong.

What I would like is a way for people to be able to identify "hey, the car got it wrong here" so that the mothership analysts can hone in on the situations that people want corrected.
I don't disagree, but I think the implementation would be akin to a placebo thermostat. They've already got the data they need, they need to hop on it and get it working!
 

lance.bailey

Top-Contributor
Joined
Apr 1, 2019
Messages
423
Location
cloverdale, BC
Tesla Owner
Model 3
Country
#8
I don't disagree, but I think the implementation would be akin to a placebo thermostat.
yep. In my original idea about the feedback button there was a limit per trip, after which it and AP/NOA/TACC would be disabled due to obviously unsafe or unmanageable road conditions. This was inspired by Tesla's turning off AP if the driver gets too many warnings to put hands back on the wheel, and also by Tesla's disabling AP when the wipers are going too often, whether by rain-sense or manual setting.

Some people would overuse the button. Turn it off for them :)

They've already got the data they need, they need to hop on it and get it working!
yes, we can agree on that.
 

garsh

Dis Member
Moderator
TOO Supporting Member
Joined
Apr 4, 2016
Messages
12,406
Location
Pittsburgh PA
Tesla Owner
Model 3
Country
#9
not in squirrel/no squirrel situation i describe above.
One-off events don't matter.

If Tesla cars often brake at the same location, and people often hit the accelerator to override at that location, then that's a good sign of phantom braking. Tesla can automate this type of analysis to determine locations where phantom braking occurs.
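The automated "same location" analysis could look something like this sketch, which buckets override-after-brake events into a lat/lon grid and reports cells with repeats. The grid size (~0.001°, roughly 100 m) and repeat threshold are illustrative assumptions:

```python
from collections import Counter

def hotspot_locations(events, cell=0.001, min_count=3):
    """Bucket braking-override events into lat/lon grid cells and
    report cells where the event recurs -- a crude version of the
    location clustering described above.
    events: iterable of (lat, lon) tuples."""
    cells = Counter((round(lat / cell), round(lon / cell))
                    for lat, lon in events)
    return {c: n for c, n in cells.items() if n >= min_count}

# Three reports near one spot, one isolated report elsewhere
reports = [(49.1042, -122.7300), (49.1043, -122.7301),
           (49.1041, -122.7299), (40.0000, -80.0000)]
print(hotspot_locations(reports))  # prints {(49104, -122730): 3}
```

One-off events fall below the threshold and drop out, matching the point that isolated incidents don't matter; repeated braking in the same cell is the phantom-braking signature.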
 

lance.bailey

Top-Contributor
Joined
Apr 1, 2019
Messages
423
Location
cloverdale, BC
Tesla Owner
Model 3
Country
#10
If Tesla cars often brake at the same location, and people often hit the accelerator to override at that location, then that's a good sign of phantom braking....
ah. "at the same location" - yes, that would matter. I personally don't get phantom braking at the same regular locations on my usual commute; I instead get it at different places each time. I can only think of one location on my regular route which the Tesla cannot handle with any predictability.
 

M3OC Rules

Top-Contributor
TOO Supporting Member
Joined
Nov 17, 2016
Messages
637
Location
Minneapolis, MN
Tesla Owner
Model 3
Country
#11
This happens to me reasonably often (darn squirrels). Now let's table top a phantom brake.
They could upload a frame every time it recognizes a squirrel and then analyze it with a separate, more powerful AI that is more accurate and can use different thresholds. Then have a human review the subset where it's not confident. You probably only need a sampling for the statistics, and then you can also use them for training the AI. It's hard to know what they do or have the resources to do, but there are lots of possibilities, and they have said they do these kinds of things.
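The triage pipeline sketched in that post might route detections by model confidence: auto-accept the confident ones, discard the clear misses, and send the uncertain middle (plus a small random audit sample) to human review. All thresholds below are made-up illustrations, not anything Tesla has published:

```python
import random

def triage(detections, auto_accept=0.95, auto_reject=0.20, sample_rate=0.1):
    """Split detections into accepted / rejected / human-review buckets
    by confidence. A random sample of auto-handled detections is also
    reviewed, to keep the accuracy statistics honest."""
    review, accepted, rejected = [], [], []
    for det in detections:
        conf = det["confidence"]
        if auto_reject < conf < auto_accept:
            review.append(det)           # uncertain: human in the loop
        else:
            bucket = accepted if conf >= auto_accept else rejected
            bucket.append(det)
            if random.random() < sample_rate:
                review.append(det)       # spot-check the automatic calls
    return accepted, rejected, review

dets = [{"id": 1, "confidence": 0.99},   # confident hit
        {"id": 2, "confidence": 0.55},   # uncertain
        {"id": 3, "confidence": 0.05}]   # confident miss
accepted, rejected, review = triage(dets)
```

Only the middle band needs guaranteed human attention, which is what makes reviewing a fleet-scale stream of frames tractable at all.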
 

bwilson4web

Top-Contributor
TOO Supporting Member
Joined
Mar 4, 2019
Messages
512
Location
Huntsville, AL
Tesla Owner
Model 3
Country
#12
The reason to capture and analyze 'phantom braking' is to get some metric:
Code:
Frame    GPS time    GPS speed
0     06:26:56 AM    17.57
1     06:26:57 AM    12.52
24    06:26:57 AM    12.52
25    06:26:58 AM     9.97
49    06:26:58 AM     9.97
50    06:26:59 AM    16.89
74    06:26:59 AM    16.89
75    06:27:00 AM    17.60
A solar-induced phantom braking event, as measured by GPS, lasts ~3 seconds. In this case, the car shed ~7 mph from an initial ~17 mph, then recovered about 3 seconds later. To get higher resolution, I'll have to use my 400 Hz recording accelerometer.

Human reaction times are in the 200 ms range, so this event lasted ~15 times the reaction time of a typical human.
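Bob's summary numbers can be re-derived from the log above. A small sketch, with timestamps simplified to seconds within the minute (60 standing in for 06:27:00) and only one sample kept per second:

```python
# De-duplicated GPS log from the post: (seconds, speed in mph)
log = [(56, 17.57), (57, 12.52), (58, 9.97), (59, 16.89), (60, 17.60)]

speeds = [v for _, v in log]
drop_mph = log[0][1] - min(speeds)   # depth of the dip: 17.57 - 9.97

# Onset: first sample slower than the initial speed.
start = next(t for t, v in log if v < speeds[0])
# Recovery: first later sample back within 10% of the initial speed.
end = next(t for t, v in log if t > start and v > speeds[0] * 0.9)
duration_s = end - log[0][0]         # onset-to-recovery, from the first sample

print(f"~{drop_mph:.1f} mph dip, ~{duration_s} s event")  # ~7.6 mph dip, ~3 s event
```

This matches the stated ~7 mph dip over ~3 seconds; anything finer than the 1 Hz GPS samples does indeed need the accelerometer.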

Bob Wilson
 

airbusav8r

Active member
Joined
Feb 24, 2019
Messages
34
Location
Santa Barbara, CA
Tesla Owner
Model 3
Country
#14
Would you prefer the car not err on the side of caution while they are still training it, with the ML chip recording overrides, etc.? I’d rather have the car slow for a shadow on the road than assume that shadow is okay when it’s really a trailer.

AP works perfectly at night; it's in the sun that the cameras have a hard time. The radar counteracts and says no, it’s fine; nevertheless, there is “phantom braking.” This is beta software to a T, just at mass scale, and we are paying for it because we are nerds. In time it will be fine; they are recording everything you do and it will get better.

The problem these teams are facing at the moment is concrete refracting the sun. Generally, when the sun is behind you, the concrete acts as a mirror and bounces it at the camera. This is no easy challenge: it’s hard enough to extract metadata from a wrinkled PDF; imagine real-time analysis at 4K operations a millisecond, with human life at stake and one bad accident away from having to turn off AP. I’m okay with a little braking here and there.
 

garsh

Dis Member
Moderator
TOO Supporting Member
Joined
Apr 4, 2016
Messages
12,406
Location
Pittsburgh PA
Tesla Owner
Model 3
Country
#15
Would you prefer the car not err on the side of caution while they are still training it
It's a tough call, because sudden, aggressive braking isn't really "the side of caution". Sudden braking for no discernible reason will eventually get you rear-ended by someone following you. Sure, it may end up technically being their fault for not paying attention, but you generally want to drive "predictably" to help prevent inattentive and aggressive drivers around you from hitting you.
 

SalisburySam

Well-known member
TOO Supporting Member
Green Level Supporter
Joined
Jun 6, 2018
Messages
316
Location
Salisbury, NC
Tesla Owner
Model 3
Country
#16
Would you prefer the car not err on the side of caution while they are still training it, the ML chip is recording overrides, etc...?...I’m okay with a little braking here and there.
My Model 3 is a fantastic car without question, and I’m so glad to be an owner. But it is still a car. And it has to live and play with all the other cars/trucks/buses/mobile-home carriers, etc., vanishingly few of which exhibit any phantom braking. As @garsh mentions, other vehicles’ expectations of what MY car will do are very important to everyone’s safety. I understand this is a difficult problem to solve, and although there has been some improvement in the year I’ve had my 3, I’m growing impatient with this issue and the constant reminder about beta software. And I don’t want to subject passengers to this behavioral oddity, so I don’t use the features with others in the car.
 

lance.bailey

Top-Contributor
Joined
Apr 1, 2019
Messages
423
Location
cloverdale, BC
Tesla Owner
Model 3
Country
#17
From the fact that the driver "takes over". In this case, by pressing the accelerator.
Had a phantom brake yesterday, but in a new location on my regular route. Nothing major, but enough to pitch me forward in the seat and make the nose dip down. I just quietly swore and wished this would go away.

A couple of seconds later I realized that I hadn't even accelerated out of it. I'm getting accustomed to them, so I just let the car return to speed on its own.
 

DocScott

Top-Contributor
TOO Supporting Member
Joined
Mar 6, 2019
Messages
421
Location
Westchester, NY
Tesla Owner
Model 3
Country
#18
It's a tough call, because sudden, aggressive braking isn't really "the side of caution". Sudden braking for no discernible reason will eventually cause you to get rear-ended by someone following you. Sure, it may end up technically being their fault for not paying attention, but you generally want to drive "predictably" to help prevent inattentive and aggressive people around you from hitting into you.
Also, if excessive phantom braking discourages people from engaging AP in a situation where AP would be safer than a human driver, then that's a net loss.

On the other hand, if AP is not much, much safer than a human driver then the cases where there's a fatal accident will continue to make the news, and that will discourage some people from using it (or, for that matter, buying a Tesla, with all its other safety features), which might also be a net loss (i.e. more people might die from not having a Tesla/using AP than from flaws in AP).

So it's another trade-off.

Auto design, manufacturing, and marketing is not for the faint of heart...
 

garsh

Dis Member
Moderator
TOO Supporting Member
Joined
Apr 4, 2016
Messages
12,406
Location
Pittsburgh PA
Tesla Owner
Model 3
Country
#19
Also, if excessive phantom braking discourages people from engaging AP in a situation where AP would be safer than a human driver, then that's a net loss.
Completely agree.

I have to imagine that Tesla is actively trying to fix this issue, and is just finding it very difficult to fix the neural net programming in an acceptable way.
 

adam m

Active member
Joined
Feb 1, 2019
Messages
40
Location
New England
Tesla Owner
Model 3
Country
#20
After dealing with this problem for quite a while, I also started using the accelerator pedal to override, and I took some videos of the screen to watch the throttle during these braking events. I'm pretty sure the issue is related to the throttle mapping/regen while in Autopilot. I don't think the computer is trying to stop so much as to stop accelerating while it figures out what is going on at a given moment. The way it changes speeds via throttle response is extremely digital: the car drops off the throttle too abruptly and causes this whiplash sensation. Tesla needs to map the throttle to the selected regen mode and make it scale throttle input -%---0---+%. It seems like it's controlling the throttle the way we do, 0%-100%, and it doesn't properly account for the fact that 0% is braking, not coasting.

I've found if you set the car Regen to Low instead of standard while in autopilot it's much more enjoyable. I'd be interested if others tried this and agreed.
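One way to read the proposed "-%---0---+%" mapping is a single pedal axis spanning regen, coast, and power, with a small coasting dead-band so lifting off doesn't snap straight into braking. A toy sketch with a made-up neutral point and regen limit (not Tesla's actual calibration):

```python
def pedal_to_torque(pedal, max_regen=0.3):
    """Map a normalized pedal position [0, 1] onto a torque demand in
    [-max_regen, 1.0]: full lift commands regen braking, a dead-band
    around the neutral point commands coasting, and the rest commands
    power. All constants are illustrative assumptions."""
    neutral = 0.2    # pedal position that means "coast" (hypothetical)
    band = 0.02      # dead-band half-width around neutral
    if pedal < neutral - band:
        # Below neutral: blend linearly down to full regen at pedal = 0.
        return -max_regen * (neutral - band - pedal) / (neutral - band)
    if pedal > neutral + band:
        # Above neutral: blend linearly up to full power at pedal = 1.
        return (pedal - neutral - band) / (1.0 - neutral - band)
    return 0.0       # inside the dead-band: coast

print(pedal_to_torque(0.0))   # full regen: -0.3
print(pedal_to_torque(0.2))   # coasting: 0.0
print(pedal_to_torque(1.0))   # full power: 1.0
```

With a continuous map like this, an autopilot controller backing off the "throttle" passes through coasting before it ever reaches braking torque, which is exactly the smoothing the post is asking for; a 0%-100% map with braking hidden at 0% cannot express that middle state.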
 