# Autopilot: Where's the Beef?



## potatoee (Aug 26, 2018)

So, there was supposed to be a "major release" towards the end of 2019. Come Monday we will be in March, nearing a full quarter since the last statement from Elon. Since 2020.4.1, it seems like new SW drops have pretty much ceased, with the exception of a very limited deployment of 2020.4.10.

Any news, rumors?

Respectfully,

Waiting F. Gotot


----------



## Ed Woodrick (May 26, 2018)

I got an update around Christmas, didn't you?
If you are really referring to the gaps between releases, that's often a very good sign. That can be indicative of a lot of changes that have to go through a lot of work and testing before release. Small gaps between releases often represent small feature increases.


----------






## FRC (Aug 4, 2018)

The "beef" may still be veal at this point.


----------



## eXntrc (Jan 14, 2019)

Personally I think Tesla (Elon) may have gotten themselves in a bit of a jam. Elon said that HW3 retrofits will ramp up by the time the software can truly take advantage of the hardware. But HW3 retrofits don't seem to be going as smoothly as they originally planned. In my view, this puts Tesla in a bit of a bind. If they start rolling out features that depend on HW3, they're going to have a lot of HW2.x owners who paid for FSD demanding their free upgrade. (And I say that with full compassion, since I happen to be one of those people.)

I'm betting that Tesla won't roll out any major features that depend on HW3 until they have a plan to seriously ramp up on retrofits.


----------



## JasonF (Oct 26, 2018)

I believe that the jam started _much_ earlier. Tesla probably should have sold FSD as a "reservation" instead of an add-on to a car purchase. You pay for Full Self Drive, and then you can redeem the reservation toward whatever Tesla you own at whatever point you choose. In other words, you would be able to choose the point at which you're satisfied with FSD, and then apply it to the car. You could be impatient and apply it when Advanced Summon is introduced, or you could hold out until the car can 100% drive itself - whatever you're comfortable with.

The reason customers would be less angry about it that way is that they'd have total control over it. If FSD were late, those people with a "reservation" could simply say they're waiting until they get more for what they paid. And it would have been better for Tesla, because those people would keep buying Teslas and be unafraid that upgrading would mean losing their "investment".


----------



## TrevP (Oct 20, 2015)

This could be the reason firmware updates have stalled:

https://twitter.com/i/web/status/1220873912009011200


----------



## garsh (Apr 4, 2016)

eXntrc said:


> But HW3 retrofits don't seem to be going as smoothly as they originally planned.


Agreed, but they seem to have the process figured out at this point.


----------



## potatoee (Aug 26, 2018)

TrevP said:


> This could be the reason firmware updates have stalled:
> 
> https://twitter.com/i/web/status/1220873912009011200


Very plausible, especially if they want to resolve all this stuff without confusing the matter when FSD rolls out.

Regarding this...
Personally, I'm very concerned about what the perceptions will be when "FSD" is deployed.

I've always felt that "Autopilot" conveys a false notion that some risk-taking fools exploit, getting into trouble with our cars. Although I do believe in an individual's right to compete for a Darwin Award ;-), I don't like the bad press that ensues afterwards (and I feel bad for the families they leave behind). Anyway, since FSD != Level 4 autonomy, I fear the media and consumer backlash as this "fresh meat" becomes apparent to the non-Tesla community.

Personally, I think Tesla should be more proactive in clarifying these differences and reconciling expectations as a preemptive measure to avoid backlash against the overall mission of fleet transformation. This might help, since not everyone reads manuals, let alone follows them.


----------



## garsh (Apr 4, 2016)

potatoee said:


> Personally, I think Tesla should be more proactive in clarifying these differences and reconciling expectations as a preemptive measure to avoid backlash against the overall mission of fleet transformation.
> 
> This might help, since not everyone reads manuals, let alone follows them.


Think about those two sentences that you just wrote back-to-back.
...
Is there any way for Tesla to clarify things further that will actually reach more people? In particular, the type of people who won't even read the short WARNING that is displayed on the car's screen when they try to activate the feature? I really don't think there is. The *biggest* obstacle to overcome is news reports that mention "self-driving Teslas" without themselves explaining the limitations.

I think we as the community can try to correct peoples' expectations about Autopilot whenever we hear or see incorrect ones. I almost always add a comment on this website whenever somebody describes Autopilot "messing up" when it's clear to me that it was used in a "non-supported" situation. Usually, the person who was using it knows that they were "testing the boundaries", but I'm always concerned that new people who are reading these threads won't realize that, and they'll come away with an incorrect impression that Autopilot is failing.


----------



## potatoee (Aug 26, 2018)

garsh said:


> Think about those two sentences that you just wrote back-to-back.
> ...
> Is there any way for Tesla to clarify things further that will actually reach more people? In particular, the type of people who won't even read the short WARNING that is displayed on the car's screen when they try to activate the feature? I really don't think there is. The *biggest* obstacle to overcome is news reports that mention "self-driving Teslas" without themselves explaining the limitations.
> 
> I think we as the community can try to correct peoples' expectations about Autopilot whenever we hear or see incorrect ones. I almost always add a comment on this website whenever somebody describes Autopilot "messing up" when it's clear to me that it was used in a "non-supported" situation. Usually, the person who was using it knows that they were "testing the boundaries", but I'm always concerned that new people who are reading these threads won't realize that, and they'll come away with an incorrect impression that Autopilot is failing.


Let me be more explicit:
+ I agree with you that one of the biggest obstacles is news reports.
+ Another potential obstacle is people in legislative or regulatory positions who can have a huge positive or negative impact.
+ The vast majority of the world is outside the Tesla bubble, doesn't understand this stuff, and has a gut reaction to what things like Autopilot and FSD should be based on the terms rather than the facts. I think we, within the bubble, are naive if we assume that community support/communication, although necessary, is *sufficient* to correct misconceptions about the technology that we enjoy.
+ When an incident happens with a Tesla, many of us believe there's a preconceived bias among those not familiar with Tesla to blame the vehicle regardless of the driver's role.

Although I don't think you're confusing my statements with an indictment of AP (quite the contrary), I do think it's important that we have discussions about how accidents occur, whether induced by the driving system, the weather, the operator, or the situation. Because there are often multiple factors behind accidents or other "surprises," forums like this are important for teasing out those contributors. I believe that by doing so we can collectively come to correct overall impressions of how these systems we drive work.

The above helps us *within the bubble* understand, but I assert that, *outside the Tesla community bubble, more can be done.* I do believe that Tesla could do more with PR, legislative outreach, and/or community outreach to help correct these misconceptions. Left uncorrected, these misconceptions can result in futile regulatory attempts to force AP off the road until it corrects for problems that are inherently driver- or situation-caused. I note that Senator Markey of MA was seriously considering forcing Tesla to pull AP last year. Luckily, common sense prevailed. Attached is a response I got back from him, provided for our collective enlightenment.


----------



## bwilson4web (Mar 4, 2019)

A company that can distribute games to our cars, and that gives us the ability to upload videos, SHOULD be able to share a training video game showing where Autopilot fails.

Apparently news articles about Autopilot crashes are the preferred source. If anyone has a question about my testing, send a PM. I’m not here to cause angst.

Bob Wilson


----------



## garsh (Apr 4, 2016)

And once again, I feel I need to point out that Autopilot is currently NOT advertised as being able to handle ANY of these situations. When you activate the feature, there is a large warning that says that it should only be used on divided highways without intersections.

Calling these "bugs" and "failures" is misleading to those not familiar with Autopilot.


bwilson4web said:


> A company that can distribute games to our cars and the ability to upload videos SHOULD be able to share a training video game showing where Autopilot fails. Text is nice but video makes it real:
> 
> 
> 
> ...


----------



## bwilson4web (Mar 4, 2019)

garsh said:


> And once again, I feel I need to point out that Autopilot is currently NOT advertised as being able to handle ANY of these situations. When you activate the feature, there is a large warning that says that it should only be used on divided highways without intersections.
> 
> Calling these "bugs" and "failures" is misleading to those not familiar with Autopilot.


Ok, I've re-labeled mine with "Do not repeat testing <subject> (V01.00)". I don't get hung up on semantics. Would you prefer different words?

Bob Wilson


----------



## garsh (Apr 4, 2016)

bwilson4web said:


> Ok, I've re-labeled mine with "Do not repeat testing <subject> (V01.00)". I don't get hung up on semantics. Would you prefer different words?


I'm mostly concerned about people coming here for the first time, reading this post and watching this video, and coming away with the impression that Autopilot "can't do anything right" and is a "big, buggy failure" and "obviously dangerous and should be illegal" until they fix it.


----------



## eXntrc (Jan 14, 2019)

I have to agree with @garsh here. Even when FSD is deployed as "feature complete" to the fleet (which obviously hasn't happened yet), Elon has said it will take millions of miles of training data before it should be expected to handle city streets without human intervention. We can argue all day about what perception the name "AutoPilot" conveys, but there is no gray area about what the system is currently sold as capable of. Calling any of the scenarios in that video "failures" or "bugs" does not line up with the stated and sold capabilities. And doing so can easily just add to further consumer confusion.


----------



## Klaus-rf (Mar 6, 2019)

In order to make a list of what AP *can* do (currently, as designed), one must also include a list of what it CANNOT do. 

In any case it's a LOOOOOOOOOONG way from FSD.


----------



## bwilson4web (Mar 4, 2019)

I come from an engineering background, and testing is critical. But I'm not in the business of causing angst, so I just won't share my results here. That doesn't mean they go away, but willful ignorance as a policy is something we're experienced with.

Enjoy the silence.

Bob Wilson


----------



## garsh (Apr 4, 2016)

bwilson4web said:


> willful ignorance as a policy


When you say "this screwdriver doesn't do a good job of hammering nails", I feel the need to point out, to others who may not know better, that screwdrivers aren't designed to hammer nails.

I'm not sure where you get "willful ignorance" out of that.


----------



## bwilson4web (Mar 4, 2019)

garsh said:


> When you say "this screwdriver doesn't do a good job of hammering nails", I feel the need to point out, to others who may not know better, that screwdrivers aren't designed to hammer nails.
> 
> I'm not sure where you get "willful ignorance" out of that.


Relax, I won't share my Autopilot experiments here. Your team has made it clear my experimental tests are not welcome and I'm quite happy to comply.

Now if anyone is curious, they can contact me via PM and I'll be happy to share what I've learned from my testing. Of course, everyone can enjoy the news images from Autopilot accidents. There are plenty and, near as I can tell, a growing number of examples. I prefer education over unplanned accidents, but that is because I had four younger brothers. Literature is seldom appreciated, but I recommend:

> *Jonathan Livingston Seagull*, written by Richard Bach and illustrated by Russell Munson, is a fable in novella form about a seagull who is trying to learn about life and flight, and a homily about self-perfection.
Bob Wilson


----------



## eXntrc (Jan 14, 2019)

Klaus-rf said:


> In order to make a list of what AP *can* do (currently, as designed), one must also include a list of what it CANNOT do.


You have some very important words in that sentence. Namely *currently* and *designed*. I would also add to that *intended*. There are many things that people can do with their vehicles or any other tool which they were not intended or designed to do. I can *currently* use my Perf Model 3 for "4 wheeling" and climbing muddy mountains. And if I say it's not very good at that, I'd be entirely correct in saying so. But the important difference here is that the average non-Tesla owner still intuitively understands that the Model 3 is not *designed* or *intended* for that task.

The average non-Tesla owner doesn't intuitively know what AutoPilot is designed or intended to do. Posting videos or text saying "The Model 3 totally failed at navigating an intersection" can be damaging to Tesla and AutoPilot because that suggests to any non-Tesla owner that it's *designed* for that purpose but fails to live up to a promise.

Now, I'm all for posting videos of the Model 3 failing to do things it was designed and intended to do. Advanced Summon fails are good examples, so long as the video poster fairly mentions that Advanced Summon is still beta. These videos serve as a reminder for other Tesla owners what can happen in beta and to be careful with it. These videos also serve to help Tesla learn more about unexpected edge cases. But again, owners really need to fairly mention it's still beta.

Ultimately, for autonomous driving to continue gaining acceptance, it's critical that we owners paint a fair picture of what it's _designed_ to do vs. what we're trying to _make it do_.


----------



## Klaus-rf (Mar 6, 2019)

eXntrc said:


> The average non-Tesla owner doesn't intuitively know what AutoPilot is designed or intended to do. Posting videos or text saying "The Model 3 totally failed at navigating an intersection" can be damaging to Tesla and AutoPilot because that suggests to any non-Tesla owner that it's *designed* for that purpose but fails to live up to a promise.


 So Tesla has a communication problem. They need to assign a PR team to this subject.



> Now, I'm all for posting videos of the Model 3 failing to do things it was designed and intended to do. Advanced Summon fails are good examples, so long as the video poster fairly mentions that Advanced Summon is still beta. These videos serve as a reminder for other Tesla owners what can happen in beta and to be careful with it. These videos also serve to help Tesla learn more about unexpected edge cases. But again, owners really need to fairly mention it's still beta.


IIRC all AP functions are ßeta and require an attentive driver, at all times, ready to take over at any moment without notice.


----------



## JasonF (Oct 26, 2018)

Klaus-rf said:


> So Tesla has a communication problem. They need to assign a PR team to this subject.


As I see it, the problem is they used to be a "bleeding edge" car producer, where all of their customers stayed in touch with other owners and Tesla itself via social media, message boards, etc. Now they're becoming "mainstream", where there's just as likely a chance an owner might buy a Model 3 and never speak to other owners or Tesla after purchase. Their only connection with Tesla is the "What's new" page after a software update.

That means no matter how Tesla tries to communicate with people on safe use of Autopilot, a large portion of people simply won't see it. And then there will also be people who do see it/hear about it but think they know better - because they've already taken naps or watched movies while driving, and nothing bad happened before, so why would it now?

There is honestly no amount of documentation or PR that will keep people from doing stupid things.


----------



## eXntrc (Jan 14, 2019)

JasonF said:


> There is honestly no amount of documentation or PR that will keep people from doing stupid things.


Related article on Teslarati today:

https://www.teslarati.com/tesla-autopilot-prank-infuriating-facepalm-video/

This is from a channel with 19.2 *million* subscribers. It's going to be pretty hard for PR and documentation to counteract misinformation like that. I'm not saying Tesla shouldn't try. In fact I _personally_ feel they have an obligation to. But this is new tech that's significantly different from what came before it, and most consumers don't have access to it. It's going to take time and education for the general public to understand how it's truly intended to be used.


----------



## Bigriver (Jan 26, 2018)

garsh said:


> And once again, I feel I need to point out that Autopilot is currently NOT advertised as being able to handle ANY of these situations. When you activate the feature, there is a large warning that says that it should only be used on divided highways without intersections.


I have never seen this warning. Here is what currently pops up when I activate Autosteer. Am I missing something else?

The owner's manual does contain a warning to use Autosteer only on "highways and limited-access roads," but to me this is not as restrictive as your words, "divided highways without intersections." I use Autosteer on a number of midwest roads that I believe fully fit Tesla's definition but do not match your wording.

I feel like my post is sounding contentious (it is a Monday 😬), but it's not meant to be. I'm in full agreement that Autopilot has its limitations, and that it is difficult to communicate them to owners, much less the general public. I also think Autopilot doesn't currently do all that it should (phantom braking needs to not exist), but I guess that is covered under the Beta disclaimer. More than anything, though, I love Autopilot but still feel fully responsible as the human in the cockpit.

BTW @eXntrc, I wish you worked in Tesla's communication department. I thought your posts were exceptional, both in content and clarity.


----------



## garsh (Apr 4, 2016)

Bigriver said:


> I have never seen this warning. Here is what currently pops up when I activate auto steer. Am I missing seeing something else?


The very first time you activate the Autosteer feature within the car's configuration, you are presented with this warning.

The above screenshot was taken from this video:


----------



## eXntrc (Jan 14, 2019)

Bigriver said:


> The owner's manual does contain a warning to use Autosteer only on "highways and limited-access roads," but to me this is not as restrictive as your words, "divided highways without intersections."


Yeah, that's a totally fair point. I honestly couldn't tell you what the verbiage is. I remember when I first bought the car and read the manual and all the on-screen warnings, it was quite clear that it was intended for highway use. And any attempt by me to use it on city streets was at my own risk. But I couldn't tell you now what I read and agreed to. It was probably what garsh posted from that video, but there's a very important point to be made here: it doesn't ever pop up again.

It seems like there should be some kind of balance struck here. On one hand, I don't want a large confirmation dialog popping up every time I engage AutoPilot. On the other hand, I can't tell you what _any_ software "clickwrap" I accepted a year ago said. Maybe this could be a monthly reminder, but personally I like the idea of renewing confirmation at each major software update. Like "It couldn't do _____ on its own before but it can now. It still can't do _____ and _____. The driver must remain attentive and in control at _all_ times. Do you accept?" To me, that sort of info would be valuable at every major update and I wouldn't find it overly obtrusive to accept.
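The per-major-update confirmation idea could work something like the sketch below (purely hypothetical, in Python; the function name and version scheme are my own illustration, not anything Tesla actually ships). The car would store the last release the driver acknowledged and only re-prompt when the major feature version changes:

```python
def needs_reconfirmation(last_acked: str, current: str) -> bool:
    """Return True when the major feature version has changed since the
    last release the driver acknowledged (hypothetical example only)."""
    # Compare only the first two components, e.g. "2020.4" of "2020.4.1",
    # so point releases don't trigger a new confirmation dialog.
    major = lambda v: tuple(int(x) for x in v.split(".")[:2])
    return major(current) > major(last_acked)

print(needs_reconfirmation("2020.4.1", "2020.4.10"))  # point release -> False
print(needs_reconfirmation("2020.4.1", "2020.8.1"))   # major update  -> True
```

That way the dialog shows up rarely enough to stay meaningful, which is the balance being argued for above.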



Bigriver said:


> BTW @eXntrc, I wish you worked in Tesla's communication department. I thought your posts were exceptional, both in content and clarity.


Wow @Bigriver, that's very kind! Part of my day job includes helping explain emerging technologies to Devs, sales and C-Level folks. So I think some of that experience is just showing through, but I genuinely do appreciate the compliment!


----------



## Klaus-rf (Mar 6, 2019)

eXntrc said:


> ... it was quite clear that it was intended for highway use. And any attempt by me to use it on city streets was at my own risk.


 FIFY.

It's ßeta everywhere. There are no operating conditions for AP that are not 100% "at your own risk".


----------



## Klaus-rf (Mar 6, 2019)

Bigriver said:


> BTW @eXntrc, I wish you worked in Tesla's communication department.


 Wait - Tesla has a communications department???


----------



## garsh (Apr 4, 2016)

eXntrc said:


> On one hand I don't want a large confirmation dialog popping up every time I engage AutoPilot.


When I had my Nissan Leaf, it popped up this dialog.
Every time I started the car.

Every.
Single.
Damn.
Time.

I never want to experience that again.


----------



## JasonF (Oct 26, 2018)

eXntrc said:


> This is from a channel with 19.2 *million* subscribers. It's going to be pretty hard for PR and documentation to counteract misinformation like that. I'm not saying Tesla shouldn't try. In fact I _personally_ feel they have an obligation to. But this is new tech that's significantly different from what came before it, and most consumers don't have access to it. It's going to take time and education for the general public to understand how it's truly intended to be used.


Be careful with that wording. What would make _lawyers_ happy (and probably government as well) is if Tesla is required to only "install" Autopilot at a physical service center, where they can make sure you sign the lengthy release forms, and endure a scripted training course with one of their service people. Only after that would the service tech activate it for you.

And even then, Tesla will still be sued because a person who crashed while watching a movie claimed that the tech didn't make it clear enough, or that they weren't paying attention, and nobody bothered to confirm that they understood every step of the lesson.


----------



## Bigriver (Jan 26, 2018)

garsh said:


> When I had my Nissan Leaf, it popped up this dialog.
> Every time I started the car.
> 
> Every.
> ...


Ha, yeah, such a thing still pops up each time before you can get to the NAVIGATION screen on a 2005 Honda Odyssey.

There was a time when Tesla Autopilot couldn't be activated on certain roads, and other roads where you could activate it but it offered only lane-keeping, with no ability to change lanes. As they removed the restrictions, I don't recall any changes to the disclaimers or the owner's manual. Beta, beta, beta... so one disclaimer fits all, I guess.


----------



## eXntrc (Jan 14, 2019)

garsh said:


> Every.
> Single.
> Damn.
> Time.


I think I physically shuddered when I read that message! Wow. I feel your pain!



JasonF said:


> What would make _lawyers_ happy (and probably government as well) is if Tesla is required to only "install" Autopilot at a physical service center


Actually, I think it'd be even worse than that, because AutoPilot continuously evolves. Can you imagine having to re-sign a contract for each new release!?



Bigriver said:


> Beta, beta, beta... so one disclaimer fits all, I guess.


Well, from a legal standpoint I suppose so. But I do feel there are opportunities to better educate people about each release. I mean, really, the very first thing you see on tesla.com/autopilot is a video of their car doing things that no car on the road can do. And I don't think the text surrounding that video does enough to clarify what you're seeing either.

But these are all just my personal opinions.


----------



## eXntrc (Jan 14, 2019)

eXntrc said:


> I like the idea of renewing confirmation at each major software update. Like "It couldn't do _____ on its own before but it can now. It still can't do _____ and _____.


Quoting my own self here, but I think the image below is relevant to what I posted above.

Just saw the manual instructions for the upcoming AutoPilot feature *Stopping at Traffic Lights and Stop Signs*.



http://imgur.com/a/K6m91n3


Personally I think that's _very_ clear. Like, impressively clear. I just wonder how many people will see it. That image supposedly comes from the Model Y manual. I wonder how much of that information will be presented to the user for confirmation when the feature arrives.


----------

