Is HW3 required to fix the problems with AP / EAP / NoA / AutoHighBeam / AutoWipers?

  • It's OK to discuss software issues here but please report bugs to Tesla directly at servicehelpna@teslamotors.com if you want things fixed.

mswlogo

Top-Contributor
Joined
Oct 8, 2018
Messages
699
Location
MA
Tesla Owner
Model 3
Yeah, I tried just Autopilot again this weekend (16.2). Drove 150 miles to my destination at night with no phantom braking; drove back in bright sun and it phantom-braked once on a bridge, with no other cars around.
Surprised it only did it that once, though.
 

GDN

Moderator
TOO Supporting Member
Joined
Oct 30, 2017
Messages
3,488
Location
Far North Dallas, TX
Tesla Owner
Model 3
I'd reported no phantom braking events in months, but to be honest, I had one on the way home Friday, the first one ever in this car. The other car hasn't had one in months. It was a different time of day, and I was passing under an overpass.
 

webdriverguy

Top-Contributor
Joined
May 25, 2017
Messages
678
Location
MA
Tesla Owner
Model 3
I'm not sure what you are expecting. NOA frankly does a far better job than any human who has only been driving for six months. It's downright impressive what it can do. OF COURSE it has a hard time with difficult situations, like merging into very competitive Massachusetts traffic. Just like any teenager. It probably works well on 90% of highways worldwide. But we live in a very special, crazy driving place. Surely you must be aware and understanding of that. Surely you must be smart enough not to expect early versions of AI to handle this stuff perfectly. Even experienced humans can't.

If you want a perfect experience from NOA, then only use it on open highways with light traffic. Have some patience and watch it improve slowly over the next year or two.

Wipers used to have issues just six months ago; now they work spectacularly, better than in any previous auto-wiper-equipped car I've driven. Of course that's a much simpler problem, and of course there are still exceptions, such as light mist at night. But that should also be understandable, as those tiny drops are well below the resolution of the images being processed.

I honestly have never experienced phantom braking in over nine months with plenty of AP use. Maybe it's a unique situation or a calibration issue?

If you expect perfection from these early beta technologies, then please turn them off and wait a few years before you use them. Might as well not let any humans drive either; those computers are also prone to failure.
Sorry, I have to disagree when you said "NOA frankly does a far better job than any human who has only been driving for six months." In my experience this is not true at all, in fact not even close. However, I am going to patiently wait and see the system improve over time. For now, using NOA does not make sense; TACC is much less stressful.
 

M3OC Rules

Top-Contributor
TOO Supporting Member
Joined
Nov 17, 2016
Messages
509
Location
Minneapolis, MN
Tesla Owner
Model 3
If this article is any indication, we're in for more phantom braking, not less. Cruise was braking 10 times per 10 miles over a test period of 30,000 miles, and hasn't gotten much better over the last year.
 

mswlogo

Top-Contributor
Joined
Oct 8, 2018
Messages
699
Location
MA
Tesla Owner
Model 3
If this article is any indication, we're in for more phantom braking, not less. Cruise was braking 10 times per 10 miles over a test period of 30,000 miles, and hasn't gotten much better over the last year.
Why would their struggles be tied to Tesla's? I do believe things will get better with HW3, and probably after a year of optimizing code for it. I'm just not sure how much better they could be with the current hardware.
 

DocScott

Well-Known Member
TOO Supporting Member
Joined
Mar 6, 2019
Messages
248
Location
Westchester, NY
Tesla Owner
Model 3
If this article is any indication, we're in for more phantom braking, not less. Cruise was braking 10 times per 10 miles over a test period of 30,000 miles, and hasn't gotten much better over the last year.
This sort of casts doubt on the "autonomy requires LIDAR" argument. Tesla AP is clearly not causing passenger discomfort with maneuvers like sudden braking an average of once per mile! Distinguishing between the need for real braking and a shadow is exactly the kind of thing LIDAR is supposed to be good at. So if the LIDAR-equipped cars aren't doing that well, it suggests Tesla's software is far enough ahead to make up for the smaller hardware suite.

On the other hand, I'm still skeptical that HW3 and further software iterations of the kind they've been doing will solve all of Tesla's phantom braking problems. I think Tesla may need an approach where each individual car tweaks its own neural net, perhaps with a more formal and intentional calibration routine (e.g. something done at a service center or by mobile service). Otherwise, incidental differences in sensors from one car to the next (e.g. a scratch on a camera window or a slight difference in alignment) might be an unsolvable problem.

My prediction is that eventually we'll need a protocol where Teslas get yearly "eye exams": each sensor (camera, radar, ultrasonic) is presented with standard stimuli, and that information is used to calibrate and adjust the car's AP.
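To make the "eye exam" idea concrete, here's a minimal sketch of what a per-sensor check against standard stimuli could look like. Everything below (names, readings, tolerances) is invented for illustration; none of it comes from Tesla.

```python
# Hypothetical "yearly eye exam": present each sensor with a known
# reference target and flag any sensor that drifts out of tolerance.
from dataclasses import dataclass

@dataclass
class SensorCheck:
    name: str
    measured: float   # sensor's reading of the reference target
    expected: float   # ground-truth value for that target
    tolerance: float  # allowed deviation before recalibration

def run_eye_exam(checks: list[SensorCheck]) -> list[str]:
    """Return the names of sensors that need recalibration."""
    return [c.name for c in checks
            if abs(c.measured - c.expected) > c.tolerance]

# Example stimuli: a camera reads a test pattern's position, radar
# ranges a corner reflector at a known distance, an ultrasonic sensor
# pings a fixed panel.
checks = [
    SensorCheck("front_camera_offset_px", measured=3.2,   expected=0.0,  tolerance=2.0),
    SensorCheck("radar_range_m",          measured=10.05, expected=10.0, tolerance=0.1),
    SensorCheck("ultrasonic_range_m",     measured=0.52,  expected=0.5,  tolerance=0.05),
]
print("needs recalibration:", run_eye_exam(checks))
# -> needs recalibration: ['front_camera_offset_px']
```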
 

M3OC Rules

Top-Contributor
TOO Supporting Member
Joined
Nov 17, 2016
Messages
509
Location
Minneapolis, MN
Tesla Owner
Model 3
Business Insider is one of the more well-known Tesla FUDers, so take their info with a grain of salt.
This was from a hit piece on Cruise that originated from The Information, I think. It's paywalled, though, so I didn't link that one. But ya, FUDsters were a large topic of conversation at the investor meeting yesterday.
 

M3OC Rules

Top-Contributor
TOO Supporting Member
Joined
Nov 17, 2016
Messages
509
Location
Minneapolis, MN
Tesla Owner
Model 3
Why would their struggles be tied to Tesla's? I do believe things will get better with HW3, and probably after a year of optimizing code for it. I'm just not sure how much better they could be with the current hardware.
Ya, it may not be a good comparison. Cruise is mainly in San Francisco, I believe, so their braking is probably mostly on streets rather than the highway/freeway. But if Tesla releases their Autopilot for streets later this year like they say they will, I wouldn't be surprised if the complaints about phantom braking start going up. I think their priority is adding functionality: Elon Musk talks about Navigate on Autopilot as if it's finished, and yesterday he said they were working on curb detection. Maybe they see phantom braking as a training issue that will resolve over time.
 

mswlogo

Top-Contributor
Joined
Oct 8, 2018
Messages
699
Location
MA
Tesla Owner
Model 3
Ya, it may not be a good comparison. Cruise is mainly in San Francisco, I believe, so their braking is probably mostly on streets rather than the highway/freeway. But if Tesla releases their Autopilot for streets later this year like they say they will, I wouldn't be surprised if the complaints about phantom braking start going up. I think their priority is adding functionality: Elon Musk talks about Navigate on Autopilot as if it's finished, and yesterday he said they were working on curb detection. Maybe they see phantom braking as a training issue that will resolve over time.
I think it's a training issue too, but I'm concerned it might take HW3 frame rates to deploy the fix.
Then again, if it were just training, I'd think it would have been resolved by now; it's been going on a long time. It may well be a dozen different cases that each need hundreds of markups, though.
 

GDN

Moderator
TOO Supporting Member
Joined
Oct 30, 2017
Messages
3,488
Location
Far North Dallas, TX
Tesla Owner
Model 3
Indeed it needs attention and is no doubt an issue, but I'd have to say the semi looks way closer than it should be for that speed. Barely a road stripe behind the 3, way too close for highway speeds. It's nearly impossible to tell the speed for sure, but the traveling distance between the cars helps. There is no way you get the two-second rule there; I'm going to say not even 0.5.
 
Last edited:
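For anyone who wants to sanity-check that estimate: the time gap is just following distance divided by speed. A quick back-of-the-envelope in Python, where the 65 mph speed and the ~40 ft lane-stripe cycle are assumptions, not measurements from the video:

```python
# Rough time-gap arithmetic; all inputs are illustrative assumptions.
def time_gap_seconds(gap_m: float, speed_mph: float) -> float:
    """Time gap = following distance / speed."""
    speed_ms = speed_mph * 0.44704  # mph -> m/s
    return gap_m / speed_ms

# A US dashed lane stripe plus its gap is roughly 40 ft (~12 m), so
# "barely a road stripe behind the 3" is about one cycle or less.
for gap_m in (6.0, 12.0):
    print(f"{gap_m:4.1f} m at 65 mph -> {time_gap_seconds(gap_m, 65):.2f} s")
# 12 m at 65 mph works out to ~0.41 s, consistent with "not even 0.5";
# the two-second rule at 65 mph would require about 58 m of gap.
```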

mswlogo

Top-Contributor
Joined
Oct 8, 2018
Messages
699
Location
MA
Tesla Owner
Model 3
Indeed it needs attention and is no doubt an issue, but I'd have to say the semi looks way closer than it should be for that speed. Barely a road stripe behind the 3, way too close for highway speeds. It's nearly impossible to tell the speed for sure, but the traveling distance between the cars helps. There is no way you get the two-second rule there; I'm going to say not even 0.5.
BTW, that's not my video.

I agree, that's the first thing I noticed too: the truck was way too close to begin with. But you can't always control that stuff or always catch it. That truck driver sure was alert, though, and was smart to switch to the other lane in case he couldn't stop in time. But he could have lost control too.

You can also see the shadow on the road that TACC got tripped up by. I was expecting it at the bridge, but I believe it was the shadow from the sign before it.

You can also tell TACC doesn't care what's behind you when it slams on the brakes.

If Elon doesn't do something soon, the NHTSA will do it for him.

Pretty soon the semis will be ICEing us.
 

Bokonon

Self-identified Teslaholic
Moderator
TOO Supporting Member
Joined
Apr 12, 2017
Messages
3,040
Location
Boston
Tesla Owner
Model 3
You can also see the shadow on the road that TACC got tripped up by. I was expecting it at the bridge, but I believe it was the shadow from the sign before it.
Yeah, it looks as though the braking incident happened when the camera saw the sign's shadow "come out of" the vehicle it was following and concluded that the vehicle was braking. Radar should have been able to confirm whether that was actually the case (and would likely have triggered a collision warning if it were), but apparently the image of the shadow alone was enough for Autopilot to apply the brakes out of caution.

I'd love to see an Autopilot video overlay (of the sort Green likes to do on Twitter) for one of these incidents, just to see what it looks like through Autopilot's eyes. I'd also hope that the Autopilot team has the ability to construct a similar view from footage and telemetry... ideally the sort that is automatically reported when these incidents occur, similar to "driver takeover" events...
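As a thought experiment, the cross-check described above could be as simple as requiring the radar to corroborate the camera before any hard braking. A toy sketch of that gating idea, not Tesla's actual logic (the function, inputs, and threshold are all invented):

```python
# Toy camera/radar agreement gate: brake hard only when the camera's
# "lead car is slowing" cue is corroborated by a real closing speed
# measured on radar. Not Tesla's implementation; purely illustrative.
def should_brake(camera_sees_decel: bool,
                 radar_closing_speed_ms: float,
                 radar_threshold_ms: float = 2.0) -> bool:
    radar_confirms = radar_closing_speed_ms > radar_threshold_ms
    return camera_sees_decel and radar_confirms

# A shadow sliding out from under the lead car fools the camera but
# produces no radar closing speed, so no brake event:
print(should_brake(camera_sees_decel=True, radar_closing_speed_ms=0.0))  # False
# A genuinely braking lead car trips both sensors:
print(should_brake(camera_sees_decel=True, radar_closing_speed_ms=5.0))  # True
```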
 

mswlogo

Top-Contributor
Joined
Oct 8, 2018
Messages
699
Location
MA
Tesla Owner
Model 3
Yeah, it looks as though the braking incident happened when the camera saw the sign's shadow "come out of" the vehicle it was following and concluded that the vehicle was braking. Radar should have been able to confirm whether that was actually the case (and would likely have triggered a collision warning if it were), but apparently the image of the shadow alone was enough for Autopilot to apply the brakes out of caution.

I'd love to see an Autopilot video overlay (of the sort Green likes to do on Twitter) for one of these incidents, just to see what it looks like through Autopilot's eyes. I'd also hope that the Autopilot team has the ability to construct a similar view from footage and telemetry... ideally the sort that is automatically reported when these incidents occur, similar to "driver takeover" events...
I'd be really disturbed if Tesla doesn't know all about this issue. I think enough people see it routinely that it should be easy to reproduce and train against.

The only thing different in this video is that there was a semi two car lengths behind, which was not the root of the problem. Braking for shadows happens way too often.
 

Mike

Legendary Member
Joined
Apr 4, 2016
Messages
2,264
Location
Batawa Ontario
Indeed it needs attention and is no doubt an issue, but I'd have to say the semi looks way closer than it should be for that speed. Barely a road stripe behind the 3, way too close for highway speeds. It's nearly impossible to tell the speed for sure, but the traveling distance between the cars helps. There is no way you get the two-second rule there; I'm going to say not even 0.5.
Agreed.

But unfortunately in the driving I do around here, this is standard practice.

When I'm being followed that close on the freeway, I always de-couple from autopilot.
 

M3OC Rules

Top-Contributor
TOO Supporting Member
Joined
Nov 17, 2016
Messages
509
Location
Minneapolis, MN
Tesla Owner
Model 3
I'd be really disturbed if Tesla doesn't know all about this issue. I think enough people see it routinely that it should be easy to reproduce and train against.

The only thing different in this video is that there was a semi two car lengths behind, which was not the root of the problem. Braking for shadows happens way too often.
Tesla knows better than any other manufacturer. During their FSD presentation, the last speaker said he has reviewed every single accident, and we know they look at near misses as well. That video is a great example that it can cause accidents, but it's also possible it hasn't, or has caused very few, and in that case it would be the semi's fault. That doesn't mean it's not a problem, but they are publishing accident rates with Autopilot on versus off, and the accidents from this may not be a priority from a statistical point of view.

Also, if you're designing the system, you want to err on the side of safety. I have seen several accidents where the car drove into something, like the fire truck on the road or a barrier on the freeway; those sell a lot of news. But that video of the semi was the first phantom braking incident I've seen, and it didn't result in an accident.

Regardless, I think it's a problem from a user-experience point of view. I have certainly thought about whether the possibility of phantom braking into an accident makes it less safe than me driving, and having your heart skip a beat when the car starts slowing is not super awesome. Elon has said they worry about the steering-wheel nag causing people to use Autopilot less, resulting in more accidents. Phantom braking also causes people to use Autopilot less, which causes more accidents, maybe more than the phantom braking itself.

Ironically, the one accident I was in that was 100% my fault was the result of people braking for shadows (entering a short tunnel, well before radar cruise control existed) and would not have occurred if I had been using Autopilot (or any other car with radar cruise control).
 

mswlogo

Top-Contributor
Joined
Oct 8, 2018
Messages
699
Location
MA
Tesla Owner
Model 3
Tesla knows better than any other manufacturer. During their FSD presentation, the last speaker said he has reviewed every single accident, and we know they look at near misses as well. That video is a great example that it can cause accidents, but it's also possible it hasn't, or has caused very few, and in that case it would be the semi's fault. That doesn't mean it's not a problem, but they are publishing accident rates with Autopilot on versus off, and the accidents from this may not be a priority from a statistical point of view.

Also, if you're designing the system, you want to err on the side of safety. I have seen several accidents where the car drove into something, like the fire truck on the road or a barrier on the freeway; those sell a lot of news. But that video of the semi was the first phantom braking incident I've seen, and it didn't result in an accident.

Regardless, I think it's a problem from a user-experience point of view. I have certainly thought about whether the possibility of phantom braking into an accident makes it less safe than me driving, and having your heart skip a beat when the car starts slowing is not super awesome. Elon has said they worry about the steering-wheel nag causing people to use Autopilot less, resulting in more accidents. Phantom braking also causes people to use Autopilot less, which causes more accidents, maybe more than the phantom braking itself.

Ironically, the one accident I was in that was 100% my fault was the result of people braking for shadows (entering a short tunnel, well before radar cruise control existed) and would not have occurred if I had been using Autopilot (or any other car with radar cruise control).
Are they publishing all the accidents avoided by folks smart enough to stop using it? Or the stats when users cancel it at every bridge shadow, or cancel it when cars/trucks are traveling too close behind? They are taking credit for the HUMAN who is making the judgment to use it only when it's "safe" (or safer) to do so.

The problem is that the data they are collecting is very biased.

The only way to truly measure how good it is would be to not let humans intervene.

They are implicitly getting "cherry-picked" data. I suspect they know this, and I suspect there is in-house testing with less intervening, to get more accurate data. Public data is very uncontrolled.
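The "cherry-picked" point above is a selection-bias argument, and its size is easy to illustrate with made-up numbers: if drivers disengage on most of the hard miles, the miles Autopilot actually logs look several times safer than the system would be on its own. A sketch, with every rate invented purely to show the mechanism:

```python
# Selection bias from human takeovers, back-of-the-envelope.
# Every number below is hypothetical.
P_HARD = 0.05        # fraction of miles that are "hard" (bridge shadows, close traffic)
RISK_EASY = 1e-6     # assumed per-mile accident risk on easy miles
RISK_HARD = 1e-4     # assumed per-mile accident risk on hard miles
P_TAKEOVER = 0.9     # chance the human disengages on a hard mile

# What the fleet would see if Autopilot drove every mile:
true_rate = (1 - P_HARD) * RISK_EASY + P_HARD * RISK_HARD

# What shows up in the logs when humans filter out most hard miles:
logged_easy = 1 - P_HARD
logged_hard = P_HARD * (1 - P_TAKEOVER)
observed_rate = ((logged_easy * RISK_EASY + logged_hard * RISK_HARD)
                 / (logged_easy + logged_hard))

print(f"true per-mile risk:    {true_rate:.2e}")      # ~5.95e-06
print(f"rate seen in the logs: {observed_rate:.2e}")  # ~1.52e-06, ~4x too optimistic
```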
 

M3OC Rules

Top-Contributor
TOO Supporting Member
Joined
Nov 17, 2016
Messages
509
Location
Minneapolis, MN
Tesla Owner
Model 3
Are they publishing all the accidents avoided by folks smart enough to stop using it? Or the stats when users cancel it at every bridge shadow, or cancel it when cars/trucks are traveling too close behind? They are taking credit for the HUMAN who is making the judgment to use it only when it's "safe" (or safer) to do so.

The problem is that the data they are collecting is very biased.

The only way to truly measure how good it is would be to not let humans intervene.

They are implicitly getting "cherry-picked" data. I suspect they know this, and I suspect there is in-house testing with less intervening, to get more accurate data. Public data is very uncontrolled.
I agree that the data they publish is very limited and more marketing-driven than science-driven. Two points, though. First, the data shows the safety of the system as it is actually used, and people are supposed to intervene; also, if Autopilot is on you have to have TACC on, so if it significantly increased accidents, that would presumably show up in their published data, which would be bad for them. The more important point is that Tesla should have very good data on the safety of this when they are making decisions. We're doing a bunch of speculation; if we had the data we might still disagree with them, but at least they should be making informed decisions.