# Will FSD really make it?



## Garlan Garner (May 24, 2016)

I summon my car all of the time in various places in the Chicagoland area. 

Some people absolutely love it...and some people are scared to death of it. Some have even called the police out of fear. Then when I talk to the police about it.....they love it; however, because they don't have laws authorizing it, they err on the side of caution and ask that it not be used.

The comment I have received the most from the naysayers as well as from police officers is "you know....computers crash and aren't always reliable". Totally NON technical folks who know NOTHING about Tesla itself are commenting on FSD. So they condemn it. Sample size: 1,000 people like it and 2 don't....so it doesn't happen.

THEN..... when I go to forums the same thing occurs. "FSD doesn't do this and doesn't do that". 

Where does someone go to have non biased conversations to help introduce FSD features to the public? I'm meeting with local authorities and municipalities in community board meetings just to introduce the concept. 


Humans will be a large hurdle in FSD acceptance, and it's my contention that humankind will be a much larger challenge than the technical one. I looked at the Illinois law concerning autonomy, and there is an executive order in place that accepts everything Tesla has in production today, except that someone would always have to be in the driver's seat.


----------



## JasonF (Oct 26, 2018)

Like a lot of new tech with cars, I suspect there's going to be a long and painful fight in the U.S. to gain full self drive acceptance. Not so much because of the supposed safety issues (though that's what the regulators will claim), but because of liability. If Tesla or GM or Waymo owns a fleet of self-driving cars, they can be held accountable as an entity for all crashes. If a bunch of individuals own them, then it gets harder to regulate all of them at once.

I've predicted before the possibility that actual Full Self Drive might be available in Europe before it's available in the U.S. because Tesla has to fight with government regulators for a year or two. Look at the fights Tesla had with the government over Autopilot, and supposedly how close it still is to being ordered disabled in the U.S. by the NHTSA.


----------



## iChris93 (Feb 3, 2017)

Garlan Garner said:


> Where does someone go to have non biased conversations to help introduce FSD features to the public?


What does that mean to you? To me, that means discussing both what the car currently excels at and the shortcomings. Without both, the conversation is a dangerous misrepresentation of the capabilities.


----------



## Garlan Garner (May 24, 2016)

iChris93 said:


> What does that mean to you? To me, that means discussing both what the car currently excels at and the shortcomings. Without both, the conversation is a dangerous misrepresentation of the capabilities.


That's a good question.

Are Tesla owners' concerns the same as the public's? I hear from Tesla owners that their concern is that FSD gets close to the lane lines from time to time, or goes too slow on public streets, or....., whereas the public is concerned about the car's computer crashing and killing someone. Would it then be possible to address the public's rudimentary fears while owners pursue their more detailed concerns?

Should we tell the public - "FSD is very unlikely to kill you; however, it really hugs the lane at times on curvy roads - and that's scary"? Would that be a good commercial to promote FSD against their fears?

I don't mind discussing both sides of the fence; however, it's totally disconcerting hearing problems from the public all of the time and then hearing problems from owners all of the time. Everyone has problems with it, on top of more problems with it. I scan/google conversations about FSD all of the time, and the majority of the time it's problems on top of problems. Shortcomings on top of shortcomings.

I don't mind discussing both sides of the fence; however, when I bring up successes to balance the conversations about shortcomings....it's always met with such disdain and.....


----------



## Garlan Garner (May 24, 2016)

JasonF said:


> Like a lot of new tech with cars, I suspect there's going to be a long and painful fight in the U.S. to gain full self drive acceptance. Not so much because of the supposed safety issues (though that's what the regulators will claim), but because of liability. If Tesla or GM or Waymo owns a fleet of self-driving cars, they can be held accountable as an entity for all crashes. If a bunch of individuals own them, then it gets harder to regulate all of them at once.
> 
> I've predicted before the possibility that actual Full Self Drive might be available in Europe before it's available in the U.S. because Tesla has to fight with government regulators for a year or two. Look at the fights Tesla had with the government over Autopilot, and supposedly how close it still is to being ordered disabled in the U.S. by the NHTSA.


You are totally correct.

I don't understand how the NHTSA has explicit data indicating fewer accidents with Autopilot than with human driving (on average) and still looks to disable it. How can owners thwart that thinking?

When the needle of anything is leaning far to one side......you have to push with such force just to get it moving in the other direction just to get the needle to the middle.


----------



## JasonF (Oct 26, 2018)

Garlan Garner said:


> I don't understand how the NHTSA has explicit data indicating fewer accidents with Autopilot than with human driving (on average) and still looks to disable it. How can owners thwart that thinking?


For some reason Americans are generally afraid of new technology. The humble pushbutton starter, and the keyfob unlock on cars, existed in Europe for years before making it to the U.S. - not because of safety, but because surveys indicated that Americans were afraid to buy a car that didn't use a physical key to unlock and start. And not because they felt it was more "solid" - the surveys indicated Americans were afraid the car might start or unlock on its own, without them present. Having a physical key made them feel like that was impossible.

In the U.S., any time something bad hits the press, the riddle for government to solve becomes "how can we make sure this _never ever_ happens again?" If it's a specific technology that keeps coming up in the news, the government focuses on it, and then in spite of any statistics to the contrary, it gets labeled "unsafe". And then the wheels start turning to solve that riddle with strict regulation.

It's likely to end with either Tesla being required to retrofit Autopilot cars with a specific mandated device (like the eye-tracking device GM uses, for example) or being required to disable it across the fleet until they can submit a plan to the government that will make it impossible for the driver to not pay attention. At which time Tesla might just stop spending money on Autopilot and switch to TACC until full self drive is ready - because that's where their heart really is, and it costs the same to re-certify AP as it does to certify FSD.

What Tesla could do is enable Autopilot for the rest of the fleet. I know they'll get complaints from earlier buyers who paid a lot for it, so they'll have to get past that worry first. Enabling it for the rest of the fleet, though, would both increase the positive use statistics that they could use as supporting evidence and cause it to become more of a fixture instead of an optional feature - meaning the government would have a harder time mandating that it be disabled. Even though it could still technically be disabled via software, making it ubiquitous legally puts it in the category of essential features like lane departure warnings.


----------



## NR4P (Jul 14, 2018)

I don't expect we will have real FSD for a number of years. The "Full" at the beginning is a misleading term. Full means no room to add a dang thing. Some definitions:
- containing or holding as much or as many as possible; having no empty space.
- not lacking or omitting anything; complete.

We have Limited Self Driving today, with the designation Beta. So is it Beta SD?

I declined to pay $5K when I ordered my car, and the price crept up to $7K, with $8K coming. I got lucky with a rare event and grabbed FSD for $2K since I had EAP. I enjoy the new toys with HW3, such as stop signs and traffic lights. No complaints for what I paid, but for now I stand behind my original decision as a good one, and whether the next Tesla is an X or a Y, it is doubtful I will pay for FSD unless I can take my eyes off the road and hands off the wheel. That's FSD to me.


----------



## Garlan Garner (May 24, 2016)

JasonF said:


> For some reason Americans are generally afraid of new technology. The humble pushbutton starter, and the keyfob unlock on cars, existed in Europe for years before making it to the U.S. - not because of safety, but because surveys indicated that Americans were afraid to buy a car that didn't use a physical key to unlock and start. And not because they felt it was more "solid" - the surveys indicated Americans were afraid the car might start or unlock on its own, without them present. Having a physical key made them feel like that was impossible.
> 
> in the U.S., any time something bad hits the press, the riddle for government to solve becomes "how can we make sure this _never ever_ happens again?" If it's a specific technology that keeps coming up in the news, the government focuses on it, and then in spite of any statistics to the contrary, it gets labeled "unsafe". And then the wheels start turning to solve that riddle with strict regulation.
> 
> ...


But what can WE do?

Or better yet....would you be willing to be an advocate of FSD to the public who knows nothing about it?

With autopilot being where it is today.....and possibly tomorrow....would you advocate for its current features ( to the government).


----------



## Garlan Garner (May 24, 2016)

NR4P said:


> I don't expect we will have real FSD for a number of years. The "Full" at the beginning is a misleading term. Full, means no room to add a dang thing. Some definitions.
> -containing or holding as much or as many as possible; having no empty space.
> -not lacking or omitting anything; complete
> 
> ...


But what can WE do?

Or better yet....would you be willing to be an advocate of FSD to the public who knows nothing about it?

Current features where they are today.....would you be willing?

Without looking into the future or past...do current features work? <------despite what you have paid for them.


----------



## NickJonesS71 (May 11, 2020)

Garlan Garner said:


> Everyone has problems with it on top of problem with it. I scan/google conversations about FSD all of the time and the majority of the time its problems on top of problems. Shortcomings on top of Shortcomings.


That should tell you all you need to know. You nailed it. The tech is great when it works; when it doesn't (phantom braking), it's downright dangerous. We often become complacent quickly. Just the other day I was on the highway and realized I hadn't looked up from the screen (I was trying to find a good in-car web shortcut landing page that I lost with the HW3 retrofit) for at least 3 minutes. I've grown quite comfortable with it on highways, and thus I had a lapse in judgement; I'm sure I'm not alone.

I understand that this is a difficult set of deliverables, and I get that it's a long road. But I'm not a fan of advertising or paying for something that is 25% complete. I do not think for one second that Tesla is transparent with new owners; the FSD section on the order page sounds like the best thing since sliced bread, so you order it for $7K only to find out most of the features are shoddy at best.

I think Tesla needs to separate us out. If you're a member of an officially sanctioned Tesla club and you choose to opt in as a developer, then by all means, sure. But I don't think deploying these features to the masses, especially at this price point, is the right thing to do.

The only features I think are ready for prime time and the mass public are lane centering and lane changes. Even NOA to off-ramps can be nerve-wracking in my experience. Where I live it constantly wants me to change lanes to follow the route; however, I can assure you it's not necessary, so even "no confirmation lane change" is disabled, else I'd be ping-ponging all over the road.


----------



## Friedrich (Mar 4, 2017)

JasonF said:


> I've predicted before the possibility that actual Full Self Drive might be available in Europe before it's available in the U.S. because Tesla has to fight with government regulators for a year or two.


No chance. As it stands now, the FSD options here in Europe are limited to an extent that leaves FSD unusable for all intents and purposes. There's no conceivable chance that the European functionalities will come close to those currently available in the US, let alone surpass them any time soon.


----------



## JasonF (Oct 26, 2018)

Garlan Garner said:


> But what can WE do?


As a group we can use Autopilot as it's supposed to be used, and stop filming youtube videos with the driver sleeping in the back seat.

I don't have AP activated myself because in 2 years, I haven't been able to justify the $3000 upgrade cost vs how much I would actually use it - my daily route doesn't really take me that far. But I still like the concept of it, as a tool to reduce exhaustion that causes people to make bad decisions on long drives, or in lengthy traffic jams.


----------



## Needsdecaf (Dec 27, 2018)

Garlan Garner said:


> I don't understand how the NHTSA has explicit data indicating fewer accidents with Autopilot than with human driving (on average) and still looks to disable it.


You've said this in other threads, but I don't believe you are correct. I believe the only data anyone other than Tesla has is Tesla vs non-Tesla. Do you have a source for the above? I could be wrong.


----------



## msjulie (Feb 6, 2018)

JasonF said:


> But I still like the concept of it, as a tool to reduce exhaustion that causes people to make bad decisions on long drives, or in lengthy traffic jams.


I wish I found this true. Because of phantom braking (less than before, but far from zero) and the still-odd behavior when the right-hand lane lines get confusing at an entrance ramp, I find I actually get more stressed from keeping an eagle eye on the car. Less so without passengers, so maybe that's something; I hate passengers being jerked around by odd behaviors from the car.


----------



## NR4P (Jul 14, 2018)

Garlan Garner said:


> But what can WE do?
> 
> Or better yet....would you be willing to be an advocate of FSD to the public who knows nothing about it?
> 
> ...


I have been asked by the wife not to use AP because of the phantom braking. Shadows causing sudden hard braking at 70 mph is dangerous.
I have been asked by the wife not to use AP because it refuses to budge from the center of the lane, and when the next lane has an oversized truck or load or mirrors, we have had too many near misses.

How about these two things get fixed before we promise others that FSD is nearly ready?


----------



## Garlan Garner (May 24, 2016)

ok,

I don't believe I'm asking the correct question. Blame it on me. 

I think everyone is clear as to what the dislikes are about FSD. It's clear. We get it. Phantom braking, not moving over for other vehicles coming too close.......etc., the price is too high.

What can we do "positively" to help?


----------



## JasonF (Oct 26, 2018)

Garlan Garner said:


> I think everyone is clear as to what the dislikes are about FSD. Its clear. We get it. Phantom braking, not moving over for other vehicles coming too close.......etc, price is too high.
> 
> What can we do "positively" to help?


Not a lot without Tesla's support. They're the only ones who can communicate directly with the NHTSA. I guess the only suggestion I might have is to get Tesla to consider making Autopilot ubiquitous (unlock it for everyone who still doesn't have it) to make it harder legally for the NHTSA to take it away...and then, both for publicity and for a huge pile of brand-new safety data for the NHTSA to sort through, launch a Ten Billion Mile Challenge to encourage all Tesla drivers to use AP as much as possible to reach and surpass that goal as quickly as possible.


----------



## Needsdecaf (Dec 27, 2018)

Garlan Garner said:


> ok,
> 
> I don't believe I'm asking the correct question. Blame it on me.
> 
> ...


Use it responsibly, and don't make stupid videos and post them on YouTube.


----------



## Bigriver (Jan 26, 2018)

To me, a huge element of this whole FSD experience is that Tesla has included us, the owners, through the development process. This is unheard of. And I don't know that it is fully wise. But my family and I have willingly been a part of this experiment.

I have both loved and cursed autopilot. I started this journey at the end of 2017, when basic autopilot was truly awful and worthless. I've been through myriad updates that sometimes took things backwards. I hated the phantom braking, the LONG hesitation that used to happen before changing lanes (letting the opening close up and pissing off the car I just pulled in front of), and the ping-ponging between lines. Right now, though, I'm in a good phase of my relationship with autopilot. It has been very good and stable for me lately. Of course, I've also learned not to let it do some things. (@NickJonesS71 I so agree with you about NOA.) And I'm a wee bit nervous about what additional bumps there will be in the autopilot journey; the upcoming software re-write makes me sweat just a bit.

But back to the OP's intended question:


Garlan Garner said:


> What can we do "positively" to help?


I don't really feel a need to do anything. I feel blessed to have Teslas and I eagerly talk to anyone who wants to talk about them. To those who think autopilot sounds perfect, I caution them of its beta and ever-changing state. To those who think it couldn't possibly be safe to let a computer drive, I tout how life-changing it has been for me. I think autopilot (and EVs) are in everyone's future, and I feel lucky to be ahead of the general population in seeing this oncoming revolution. But I don't feel a need to sugar coat anything about current capabilities, nor do I feel any sense of responsibility to convince individuals or regulators of anything. I don't foresee any hearings that ask us cult-like Tesla owners to say how we feel about our car's self-driving capabilities. I also don't foresee Tesla needing to open up autopilot to non-paying existing owners for more data, as I think they are already swimming in more data than they can utilize.

Tesla says FSD is contingent on regulatory approval. However, it is first contingent on getting it fully working, eliminating the issues that most owners experience and acknowledge. That is outside my control or influence.

Will FSD make it? I think eventually yes, although I think the definition of what "FSD" is will remain fluid for a very long time.


----------



## Garlan Garner (May 24, 2016)

Bigriver said:


> To me, a huge element of this whole FSD experience is that Tesla has included us, the owners, through the development process. This is unheard of. And I don't know that it is fully wise. But my family and I have willingly been a part of this experiment.
> 
> I have both loved and cursed autopilot. I started this journey at the end of 2017 when basic autopilot was truly awful and worthless. Been through myriad updates that sometimes took things backwards. Hated the phantom braking, the LONG hesitation that used to be there before changing lanes (and letting the opening close up, pissing off the car that I just pulled in front of) and ping-ponging between lines. Right now, tho, I'm in a good phase of my relationship with autopilot. It has been very good and stable for me lately. Of course I've also learned not to let it do some things. (@NickJonesS71 I so agree with you about NOA.) And a wee bit nervous about what additional bumps there will be in the autopilot journey, as an upcoming software re-write makes me sweat just a bit.
> 
> ...


Fabulous response. Thanks for the clarity and honesty.


----------



## Mr. Spacely (Feb 28, 2019)

The easiest thing we can do is teach folks what the car can and cannot do one conversation or ride at a time. Over time a good percentage of people will have heard our message...


----------



## Garlan Garner (May 24, 2016)

Mr. Spacely said:


> The easiest thing we can do is teach folks what the car can and cannot do one conversation or ride at a time. Over time a good percentage of people will have heard our message...


Indeed. That's a great idea. Thanks.


----------



## shareef777 (Mar 10, 2019)

Garlan Garner said:


> Where does someone go to have non biased conversations to help introduce FSD features to the public? I'm meeting with local authorities and municipalities in community board meetings just to introduce the concept.


The way the question is worded, ironically, sounds biased to me. It's like asking "where can I go to talk to people who only like FSD, so I don't have to hear the negatives?" Adapting and responding to constant complaints/nagging is what makes a great company, especially when it comes to software, as bugs are naturally going to be in the code base - and even more so for machine-learning software, since the software needs to adapt to a changing environment through user input.


----------



## DocScott (Mar 6, 2019)

Bigriver said:


> I also don't foresee Tesla needing to open up autopilot to non-paying existing owners for more data, as I think they are already swimming in more data than they can utilize.


I agree--that wouldn't be the reason they'd open up AP to non-paying existing owners.

But if AP can eventually be shown to be _conclusively_ safer than not having AP, Tesla might eventually do it for that reason. It's one thing to have a hardware-based safety feature that costs a company money to retrofit on cars that didn't pay for the option. It's another thing to be able to just flip a switch and suddenly have a bunch of people have safer cars.

We're not there yet. We kinda sorta think having AP is safer, but Tesla's numbers haven't demonstrated that yet. (The fact that there are fewer accidents per mile on AP than off could just be because AP tends to be used in situations that are safer anyway.) But once they do demonstrate that, it might happen. After all, by that point it will be a small fraction of the fleet (since all new Teslas have had AP for some time now), and most of the people who paid for AP back in the day (like me) will have had several years of use to show for it.


----------



## shareef777 (Mar 10, 2019)

DocScott said:


> I agree--that wouldn't be the reason they'd open up AP to non-paying existing owners.
> 
> But if AP can eventually be shown to be _conclusively_ safer than not having AP, Tesla might eventually do it for that reason. It's one thing to have a hardware-based safety feature that costs a company money to retrofit on cars that didn't pay for the option. It's another thing to be able to just flip a switch and suddenly have a bunch of people have safer cars.
> 
> We're not there yet. We kinda sorta think having AP is safer, but Tesla's numbers haven't demonstrated that yet. (The fact that there are fewer accidents per mile on AP than off could just be because AP tends to be used in situations that are safer anyway.) But once they do demonstrate that, it might happen. After all, by that point it will be a small fraction of the fleet (since all new Teslas have had AP for some time now), and most of the people who paid for AP back in the day (like me) will have had several years of use to show for it.


I generally don't say never, but we'll NEVER get there. People (admittedly myself included) PAID for features and Tesla can't simply decide to give it away without repercussions.


----------



## Garlan Garner (May 24, 2016)

shareef777 said:


> The way the question is worded, ironically, sounds biased to me. It's like asking " where can I go to talk to people that only like FSD and I don't have to hear the negatives". Adapting and responding to constant complaints/nagging is what makes a great company. Especially when it comes to software as bugs are naturally going to be in the code base. And more critically for machine learning software as the software needs to adapt to changing environment through user input.


Indeed it is biased.

I indicated earlier in the thread that the predominant conversations have been biased against FSD. The public seems scared of it and Tesla owners complain about it.

Absolutely I would like to hear some good news for a change.

I don't believe that constant complaints and nagging make a great institution - no matter what the institution is.

Even if that's true.....there are those who don't want to ALWAYS hear that. I'm one of them.

There are other threads where that happens. My attempt is to create one where that isn't the primary focus and where we can positively and publicly help TESLA with AP.

I guess we just can't do it.


----------



## Garlan Garner (May 24, 2016)

shareef777 said:


> I generally don't say never, but we'll NEVER get there. People (admittedly myself included) PAID for features and Tesla can't simply decide to give it away without repercussions.


Oh well...thanks (and @DocScott) for changing the thread.

Geeeshhh.

Maybe I can eventually create another one.


----------



## FRC (Aug 4, 2018)

Garlan Garner said:


> Oh well...thanks (and @DocScott) for changing the thread.
> 
> Geeeshhh.
> 
> Maybe I can eventually create another one.


I can appreciate your point. Constant whining and complaining is not helpful or productive. And constant negativity makes your position less impactful. But, all those same things can be said about constant positivity. A rose-colored-glasses, pollyanna approach does no more to improve a product or company than constant negativity does.

Lack of bias in either direction is the way to go. If something is great, say so. If it sucks, also say so.


----------



## garsh (Apr 4, 2016)

FRC said:


> Lack of bias in either direction is the way to go. If something is great, say so. If it sucks, also say so.


Be sure to point out that - unlike other cars - if something sucks, it will most likely be improved via an over-the-air update, AFTER YOU'VE BOUGHT IT.

We've gotten so used to that being the case, that it's sometimes hard to remember that this is not true for _any_ other vehicle, so non-Tesla owners might think that if something is bad now, then there's no point in buying a Tesla now.


----------



## DocScott (Mar 6, 2019)

shareef777 said:


> I generally don't say never, but we'll NEVER get there. People (admittedly myself included) PAID for features and Tesla can't simply decide to give it away without repercussions.


I paid for it too.

But note that some people paid $2000 and some paid $3000, if I recall correctly. There may have been other prices at some point, too. So I guess they "gave away" $1000 worth of features?

The idea that paying some price means everyone else has to pay the same price forever--where does that come from? If you pay for something and get what you paid for, then you got what you paid for. If you pay for energy-efficient light bulbs and then a few years later your utility company decides to start giving them away for free, is that a problem?

I do feel differently about FSD. For FSD, many people paid a substantial sum for nothing at all--just the promise of features later. And many of those features kept getting transferred to Enhanced Autopilot (I'm looking at you, Smart Summon). Only now are the first FSD-only features starting to appear, far later than what was promised when they bought the option. So people who purchased FSD in, say, 2017 _didn't_ get what they paid for. That's the thing that should have repercussions. And that sort of gets us back to the original topic of this thread...

EDIT: Now that I think about it, there's a way we could both end up being sort of right, and that's if the features in each level of automation change over time. That's already been happening. So maybe at some point Tesla decides all cars capable of it are given TACC. No one paid for something called "TACC"--it was a feature within AutoPilot. Then, to keep the people who paid for AutoPilot happy, automatic lane change with the turn signal could be moved into AutoPilot. And those with Enhanced AutoPilot could get stop sign recognition, or something like that (although that part's tricky, since some of that might require HW3, and that's not just "flipping a switch"). And those with FSD would, of course, always be getting the bleeding-edge features.


----------



## Needsdecaf (Dec 27, 2018)

Garlan Garner said:


> Indeed it is biased.
> 
> I indicated earlier in the thread that the predominant conversations have been biased against FSD. The public seems scared of it and Tesla owners complain about it.
> 
> ...


Most of the time, eliminating the negatives has a further-reaching effect than accentuating the positives, especially in today's Yelp-driven world. You can have 300 great reviews, but someone posts a takedown, you mishandle it, and all that hard work goes out the window. So as much as you'd like to help by accentuating the positives, you can't do that without also focusing on the negatives because.....



FRC said:


> I can appreciate your point. Constant whining and complaining is not helpful or productive. And constant negativity makes your position less impactful. But, all those same things can be said about constant positivity. A rose-colored-glasses, pollyanna approach does no more to improve a product or company than constant negativity does.
> 
> Lack of bias in either direction is the way to go. If something is great, say so. If it sucks, also say so.


Exactly. Constantly focusing only on the positive aspects paints you as someone who wears rose-colored glasses and is a cheerleader. You have to do both.

Having said that, I do agree with you that you can make the positive attributes the PRIMARY focus while still dealing with the negatives. However.....



DocScott said:


> I paid for it too.
> 
> But note that some people paid $2000 and some paid $3000, if I recall correctly. There may have been other prices at some point, too. So I guess they "gave away" $1000 worth of features?
> 
> ...


IMO, Tesla has really fumbled the way FSD has been sold. This is a perfect example. It's never been defined well. Timeframes have been nebulous, features deliberately vague. When I bought my car with EAP in 2018, FSD wasn't even an option....yet people wanted to watch my car drive itself to lunch. Completely by itself. Tesla has not done a good job of defining what FSD is and what people can expect. They've constantly changed the price and the feature set, not to mention the dates when many of those features are deployed. Their customers should be their own best advocates, and yet Tesla has created a scenario whereby many of those customers are frustrated and disenfranchised. So how can the general public be expected to feel?

Finally, you have Elon saying things like "Robotaxi in the next 18 months" and "The car will drive itself across the country". Most people take this at full face value. They are used to automotive manufacturers being very concrete and deliberately conservative about what a car can and cannot do. Tesla comes in and acts like a startup, making grand promises that are not always met, and sometimes met but not in the promised timeframe. This has confused and angered a lot of people. And likely for a while it was necessary, because no one thought an EV could be viable, let alone cool. Tesla has changed that. But they haven't changed their rhetoric. And now the two things are somewhat at odds.

Back to your original question, my answer above was only a little tongue in cheek when I said "don't do stupid stuff and post it on YouTube". I agree with some other posters; I feel no obligation to evangelize on Tesla's behalf. I think the best thing to do is point out the benefits, deal with the negatives, and continue to espouse how much we enjoy the cars overall. Not from some feature that's half baked at best.


----------



## shareef777 (Mar 10, 2019)

DocScott said:


> I paid for it too.
> 
> But note that some people paid $2000 and some paid $3000, if I recall correctly. There may have been other prices at some point, too. So I guess they "gave away" $1000 worth of features?
> 
> ...


You're right. I'm referring to FSD too. I paid $3k for AP, and now it's free. I'm OK with that. But I paid for FSD (and it's development). To start to give it away before I even have a chance to use what I paid for years ago isn't fair. Likely can't do anything about it, but I'll surely be grumpy (for at least a few days, shoot maybe a whole week).


----------



## shareef777 (Mar 10, 2019)

Garlan Garner said:


> Oh well...thanks ( and @*DocScott ) *for changing the thread.
> 
> Geeeshhh.
> 
> Maybe I can eventually create another one.


As garsh pointed out, our whining and complaining can finally improve a product we already own. So you're welcome for all the improvements I'm making to your car 😆


----------



## Garlan Garner (May 24, 2016)

Everyone,

I tried.

It got taken over. Now we have 2+ threads about this. 

I'm out. Enjoy.


----------



## Garlan Garner (May 24, 2016)

garsh said:


> Be sure to point out that - unlike other cars - if something sucks, it will most likely be improved via an over-the-air update, AFTER YOU'VE BOUGHT IT.
> 
> We've gotten so used to that being the case, that it's sometimes hard to remember that this is not true for _any_ other vehicle, so non-Tesla owners might think that if something is bad now, then there's no point in buying a Tesla now.


Indeed, we should do what we can to counter the public's reasons for NOT buying a Tesla, whether those come from their fears or from what they read from Tesla owners.

We should help reassure them that it's not ALL what they see or read here or in the news.

Teslas don't all catch on fire.
Teslas don't just continuously run out of juice (range anxiety).
Tesla's FSD doesn't kill you or run over other people.

I see all the time that people who aren't owners yet read this forum and watch its podcast (that's obvious for various reasons). They are doing their research before buying a Tesla and/or FSD, which is great. When I put a deposit on my Model 3 with EAP (standing in line in the rain for 4 hours) on March 31, 2016, I knew it wasn't finished, but I trusted the update process as I had seen it on the Model S, and I was right...it eventually came about. Now it's FSD's turn. But...oh well....

Anyway....everyone - enjoy


----------



## JasonF (Oct 26, 2018)

shareef777 said:


> You're right. I'm referring to FSD too. I paid $3k for AP, and now it's free. I'm OK with that. But I paid for FSD (and it's development). To start to give it away before I even have a chance to use what I paid for years ago isn't fair. Likely can't do anything about it, but I'll surely be grumpy (for at least a few days, shoot maybe a whole week).


This part of the topic seems to be taking on a life of its own and running away. Addressing not just you (I'm not singling you out) I said above that unlocking Autopilot for the remaining Tesla owners who don't have it would be controversial for those who paid for it, but it might be worth it for Tesla because _it would make that feature ubiquitous and more difficult for the NHTSA to order it removed. _Right now it's considered an "optional feature", and it's well known that it can be switched off at will by Tesla, so the NHTSA might feel it's no big deal to make them turn it off until they can demonstrate x or y - which would be a big deal for Tesla, because it takes away a big unique point of its vehicles. If they switch it on for everyone via a software update, that demonstrates it's now "part of the vehicle" and therefore more difficult to remove, which means Tesla can leave it turned on while trying to satisfy the NHTSA. The extra data and the "10 million mile challenge" idea I mentioned would just be a side benefit.

That said, I do not know the accounting impact it would have on Tesla. It's possible they can never, ever enable AP for everyone because it would open up huge accounting issues that I would never have considered. Or perhaps some legal agreement sets the price for non-unlocked AP, and if they ever dare change it again, they will face lawsuits from a group of owners. Or it could simply be that their execs decided that no AP on older models will become their incentive to buy a new Tesla that comes with it for free, so they don't want to add value to a model they're hoping to deprecate. There are a whole lot of reasons why it might be impossible.

I did not mention anything about making FSD free. That must remain optional for now because it's not complete. Paying for it constitutes an agreement that you _know_ it's not complete and are willing to pay for it anyway. If it came with the car and no specific agreement, Tesla would be open to class-action lawsuits for not providing a feature that was promised to come with the car.



Garlan Garner said:


> Everyone,
> 
> I tried.
> 
> ...


It sounds like you had very specific, focused responses in mind, almost like a multiple choice - and none of our responses matched them, so it seems like we're derailing the conversation. If that's what you need, then you could present it as a poll or a list of items to choose from. It's really hard to guess what you have in mind.


----------



## Needsdecaf (Dec 27, 2018)

Garlan Garner said:


> Everyone,
> 
> I tried.
> 
> ...


I really don't understand what you were expecting. Honestly.



JasonF said:


> It sounds like you had very specific, focused responses in mind, almost like a multiple choice - and none of our responses matched them, so it seems like we're derailing the conversation. If that's what you need, then you could present it as a poll or a list of items to choose from. It's really hard to guess what you have in mind.


Agreed. I don't think the responses here have been bashing or carping at all. I think it's merely stating the truth of the situation. There's good, and there's bad. There's way more good than bad. Most owners are thrilled with their car, warts and all. Hell, my car got wrecked and I bought another one. Does it get any clearer than that as an endorsement? I just don't get what you expected people to do, say, or feel about the situation in order to promote FSD.....


----------



## P&J (Apr 10, 2016)

Frankly, I don't care how we get to FSD ... I just want to keep pressing on. I am with the crowd that wants to join with Tesla to keep moving ahead. Why? Because we need to keep advancing technology in this country, and we need to support the people working on this tech. Yes, we paid too much for what we thought it should be at this point in time. However, I have spent a lot of money on cars in the last 50 years, and never have I had the feeling the car would keep improving the longer I kept it.

But the 3 was/is great, and with its new brain it will continue to impress; funding the ongoing improvements is the price of admission for advancement.
My son will get the 3 as soon as Tesla gets a few bugs and paint work done on our new Y, and I am already loving the extra height and hauling capability.
I fully expect the Y will take me around town to the places I need to go for the next 10 years, SAFELY. And that is why I funded FSD on both cars, and why I am very impressed with the S and X folks who plunked their considerable coin down to give this a shot.

If we continue to listen to those who believe the world owes them a perfect paint job, we have lost sight of the goal, which was not a pretty toy but a tech wonder.
Early adopters lived through the warped fiberglass on Corvettes to get an inexpensive supercar (when that was important) and the really awful cars of the 70s. We went to war for oil while we were going to space. Now we need to keep funding the tech improvements and generating enthusiasm for the next generation, because ICE cars won't do it any more.


----------



## Gatica (Oct 25, 2018)

Here's my take on FSD:

FSD will be here one day, maybe soon, maybe years from now and maybe Tesla isn't first. There are a lot of companies working on their own version of FSD.
Tesla seems to be doing micro releases (cones, trash cans, etc.), and these visuals have slowly improved over time (a static person picture moving across the screen vs. a silhouette now with moving body parts, walking in the direction the real person is). To me these are great for people to get an idea of what the car can see. All of these images on the screen had to be drawn by someone, and it's good to see the image library growing and improving.

I have heard from many people (non-Tesla owners) that they do not trust any car to drive itself because the car doesn't know the difference between a life and an inanimate object, so the car would just hit whatever is less damaging to itself. I have changed some people's minds and blown others' minds by showing that the car can in fact see people, trash cans, etc. This means the car can make a choice about what to hit if that choice had to be made.

I have seen far less "phantom braking" than when I first got my Tesla. I make it a point to log a bug report in the car, note the time and date, then email Tesla stating the AP issue and when I logged the bug report, so they can pull the log and improve the issue. Unfortunately, for things like "phantom braking" to get fixed, a Tesla engineer needs to know about it and have the data to look at. The SC personnel don't always relay the information to the engineers and will usually give some line that it was a bridge or a shadow, so it never makes it to an engineer (a personal experience I had at an SC).

The biggest hurdle, though, is regulatory. Tesla may in fact have some version of software that would be perfectly fine for FSD, but they can't just release it and hope that some government agency rules in their favor.

I think Tesla doing micro-release improvements will give more people time to get used to the fact that a car can drive itself, along with feedback from the ever-growing fleet of vehicles.

To the OP:
Don't take it too personally. A lot of people only post negative reviews of products/services because something is irritating them and they want to know their issue is heard and will be addressed. Few people post positive reviews, hence why some companies pay people to post positive reviews. When I am asked about my Tesla, I tell people the things I love about my car (instant torque, not going to the fueling station, over-the-air updates, the Superchargers, NOA, etc.), but I also tell them the bad (Autosteer can be twitchy at times, the visualizations look like they're having seizures sometimes, the car's infotainment screen crashes and needs to be rebooted occasionally just like my home computer, there are not enough service centers, etc.). Some of these are improving, just not as fast as I would like. So hang in there, and talk about what you like or don't like in the current state of AP and what you are looking forward to with FSD.


----------



## John (Apr 16, 2016)

The most immediate thing everybody can do is not wreck using Autopilot or Summon.


----------



## dreitz (Oct 22, 2018)

Yes, but not for a long time. AP on freeways works like a dream, though. I had FSD in my first Tesla, which was a lemon, and my wife didn't trust it for ****, understandably. I'm very excited to see where it leads and rooting for it like crazy.


----------



## FRC (Aug 4, 2018)

Just completed a 1000-mile road trip and used NOA extensively (stop signs/lights turned off). On the freeways (probably 600 miles) NOA was close to flawless. With the minor exception of two inexplicably aborted lane changes and one attempt to take an improper exit, it performed exactly as it should have, with no input from me.

It also performed as expected on the 400 miles of surface roads. However, on long trips, the speed limit restrictions are tiresome. Depending on the road and the Tesla-detected speed limit, you're limited to 45 mph (when no limit is detected), or the speed limit, or 5 mph over, or your chosen offset (+9 mph in my case). These limits bounce back and forth for no immediately obvious reason, and they are too slow for a man trying to get home after celebrating SpaceX (too much)! I'm hoping that Tesla is close to having enough data to allow us to choose our speed.


----------



## M3OC Rules (Nov 18, 2016)

Garlan Garner said:


> Indeed, we should do what we can to counter the public's reasons for NOT buying a Tesla, whether those come from their fears or from what they read from Tesla owners.


I wholeheartedly agree with this. It's hard to convince people that Teslas really are that much better.

When it comes to FSD, I think it's not enthusiasts that are causing the problems but Tesla themselves. I would recommend Autopilot and the parking features if they were priced appropriately for their current value. Bundling hope for several thousand dollars makes it hard for me to recommend. They are selling future features with timelines they provide and then never meet. I feel like people should be honest so those doing research realize what they are buying. Originally they sold FSD as a separate addition to EAP, which was far more reasonable. If you were crazy enough to buy a feature that got you nothing, it was your choice. They took that choice away, and now you have to pay for it to get certain actually useful features. I purchased FSD during the $2k sale because I love to watch them develop this amazing technology. I'm an engineer. But it has not provided any significant value in terms of useful features. Stop sign/light stopping is not a driving aid. It's a demo. Sure, it will get better, but let people buy it when it's ready.

Isn't there a risk that people spend way more than they want on a feature bundle that's filled with half baked features and are disappointed?

I'm all for ways to convince people to buy Teslas, but how they handle FSD doesn't help. I would recommend people not buy FSD unless they are an enthusiast/techie. The threats of price increases are also hard for me to believe, because you can't just charge whatever you want by saying it's going to be worth $100k someday. People aren't that gullible. If they want more people to use it AND want to make more money, they can't overcharge. Sure, they will raise the price $1k. They might also have a $1k sale in a year.

Also, regulators and the public will be on board when the technology works. That means it's safe, price competitive, and trip-time competitive. They have already oversold when the technology will be ready.


----------



## slave0418 (Aug 4, 2019)

Sure, it will in the future, but after people settle on Mars. As Elon said, not in his lifetime. lol


----------



## M3OC Rules (Nov 18, 2016)

Scranton Model 3 owner said:


> It does not recognize the yellow speed signs. Nor does it recognize speed limit signs in construction zones. And it can get confused (sees speed limit sign then goes back to the location tagged speed limit and then sees another speed limit sign then goes back to location tagged limit). This happened several times, it switched the speed limit 4 times in less than 2 miles. Quite annoying.


This is disappointing. I like that they release things early in development, but presumably they have been working on this for years. The Mobileye split happened in 2016. It gets touted that they have so much more data than everyone else, but then shouldn't they be able to do better than that?


----------



## SimonMatthews (Apr 20, 2018)

FRC said:


> In order to achieve FSD, the car must be correct 100% of the time. Good enough ain't gonna get it.


That would be true if human drivers never had accidents. All it has to do is be safer than human drivers.

"Don't let the perfect be the enemy of good".


----------



## GDN (Oct 30, 2017)

SimonMatthews said:


> That would be true if human drivers never had accidents. All it has to do is be safer than human drivers.
> 
> "Don't let the perfect be the enemy of good".


I would like to agree with you, and I do figure there will come a day when most people realize that "safer than human drivers" is a good thing, but it will take a long time to prove that. We kill many people across the world every day with cars being driven by humans, and we just accept it - along with a few lawsuits, etc. However, let the car kill someone on its own, even at a rate lower than what humans are doing, and I don't think we'll ever accept it.

You could have the most perfect self driving system that never kills anyone and it would take years for it to be accepted and allowed on the road without a driver.


----------



## M3OC Rules (Nov 18, 2016)

DocScott said:


> Speed limit signs are a good example. We already require speed limit signs to have standardized fonts, colors, and sizes, as an aid to _human_ drivers. It would not be a big deal to add a strip with information easy for a computer AI to read, such as a bar code or QR code. It wouldn't even require new signs in most cases; just a permanently adhered label road crews could slap on to signs.





DocScott said:


> If reading speed limit signs was so easy, then how come it took Tesla so long to implement a new, post-Mobileye, method?





JWardell said:


> We've made it clear over and over again that IDing speed limit signs has been a political challenge, not a technical one. We don't know what Tesla did to agree or get around Mobileye. My guess is they just waited long enough, and know they are big enough to survive a lawsuit now. Nothing to do with the tech.


Mobileye wasn't the first to patent road-sign detection on a car. Everyone suggests it's because of that patent, but I wonder if they think that just because of the relationship with Tesla. If it has nothing to do with the tech, then I would expect better performance. I've only driven once in a downtown environment, over less than 2 miles, with the new speed limit sign detection, and it worked perfectly. Some of the "edge" cases people have found very quickly, though, suggest maybe it's not very robust. Tesla should have these edge cases with all their touted data. I don't think they are edge cases. I think they aren't finished with development of the code to handle non-map-based speed limit information, and they don't have enough data. And the data issue isn't because of a lack of miles driven or lack of coverage; it's because their data collection is necessarily targeted. Or maybe limited labeled data as well. Just guesses, of course. Hopefully it's better than I think, because I've been waiting a long time for this.

In terms of bar codes on signs, I don't think this is a great solution. I think the cost of implementation would be significant. You would create new issues of maintenance, inventory tracking, and temporary situations due to construction. How many times are construction signs that aren't meant to be honored left sitting on the side of the road long after a project is finished, or even during it? Most of the time it works, but not always. Which does better if the sign is partially blocked, AI or barcode? It would also likely have to be done on a city-by-city basis. So I could see a city supporting the cost if it meant robo-taxis could operate there, but you have a chicken-and-egg problem. That said, the technology apparently has been developed and tested. Check this out if you haven't seen it: https://multimedia.3m.com/mws/media/1584051O/2d-barcode-whitepaper.pdf In a more general sense, maybe I'm more optimistic that someone will solve this before the infrastructure changes. Tech vs. government. I realize the government has to approve autonomy solutions, but even so.


----------



## garsh (Apr 4, 2016)

M3OC Rules said:


> Mobileye wasn't the first to patent road sign detection on a car.


Citation needed. 
Or at least desired. Do you know of any in particular that you can point us at?


M3OC Rules said:


> ...but I wonder if they think that just because of the relationship with Tesla.


That relationship is important though. The punishment for patent infringement is MUCH worse when it can be proven that the infringement was willful. Because of this, it's generally recommended that companies don't go looking for prior art when working on a patentable invention. If Tesla were to infringe on Mobileye's technology, then Mobileye would be able to easily prove that Tesla had prior knowledge of their patent. If Tesla accidentally infringes on some other company's patent, they'll probably be able to settle out of court for a small sum.


----------



## GDN (Oct 30, 2017)

Being interested in the same, I just did a quick Google. I did not review the details of either of these patents, but it seems there may be two in play from the early 2000s; links pasted below. However, their dates may be the key. 20 years was the original limit on patents, and that seems to have changed to 15 to 17 years. So if the patents weren't Mobileye's and belonged to someone else, they may have just expired, and Mobileye may have been paying royalties for their use.

https://patents.google.com/patent/US20080137908A1/en
https://patents.google.com/patent/US6813545B2/en


----------



## garsh (Apr 4, 2016)

GDN said:


> Being interested in the same - I just did a quick Google. Did not review the details of either of these patents, but seems there may be 2 in play from the early 2000's. Pasted the links below. However, their dates may be the key. 20 years was the original limit on patents and that has changed to 15 to 17 years it seems. So if the patents weren't Mobileye's and belonged to someone else they may have just expired. Mobileye may have been paying royalties for their use.
> 
> https://patents.google.com/patent/US20080137908A1/en
> https://patents.google.com/patent/US6813545B2/en


On Google's patents website pages, the table to the right usually has an "expiration" date of some sort.
The first one above has an "adjusted expiration" in 2030, while the second one has an "anticipated expiration" in 2023.


----------



## M3OC Rules (Nov 18, 2016)

garsh said:


> Citation needed.
> Or at least desired. Do you know of any in particular that you can point us at?


There are many. Here are a few I found in a quick search, not all US. There are many more. I guess I'm skeptical that there is one Mobileye patent that has prevented Tesla from offering this. These types of things have happened in the past, but that also leads me to believe someone speculated that and then it just got repeated until people think it's the truth.


On a sidenote the 1990 VW one also includes bar codes! 

Edit: Also I added an interesting one at the bottom for Google stop light detection.


----------



## GDN (Oct 30, 2017)

None of this rules out that Tesla simply found the patents that align with what they need to do, negotiated royalties, and moved on down the road.


----------



## lance.bailey (Apr 1, 2019)

Data point: I remember a loaner Volvo S60 from the dealer at least 4 years ago, and it had the speed limit in the dash. When I asked about it, they said they did the recognition from the front camera. So the technology has been out there in cars for quite a while.


----------



## tencate (Jan 11, 2018)

M3OC Rules said:


> I wholeheartedly agree with this. It's hard to convince people that Teslas really are that much better.


This is the long story I give people when they ask me how far Autopilot/FSD has come in the past couple of years:
⁃ I drive quite a lot. 70k miles on the car, lots of highway miles.
⁃ When I first tried Enhanced Autopilot in January 2018, it was pretty scary, like a 12-year-old trying to drive. I was in hyper-alert mode whenever I had it engaged.
⁃ It kept getting better. A year or so ago, at an EV event---those don't happen much anymore---I described Autopilot to people as much like having a student driver. You still had to pay attention and really had to be ready to take over immediately at any time; it wasn't a relaxing experience. I never quite felt comfortable with it driving, even though it got lots better.
⁃ Maybe 6 months ago, I recall telling someone that it was like letting your teenager take over on long drives. It was really pretty good, and long drives were actually a dream. No longer did you have to spend time keeping the car in its lane or watching your speed, and even lane changes were lots better. Even when I tried it out around town, I was amazed at how well it was managing.
⁃ Now I treat FSD as a driving partner. On highways, when I wanna grab a sandwich or find something else to listen to or check the weather radar or do other things that would normally be awkward in an ordinary car, I'm happy and comfortable letting Max do the driving; 98% of the time he drives better than I do. In heavy traffic he knows what lanes to be in and when to get over, makes his way around LA quite well, and when cruising long distances in the desert southwest, it's a dream having him do almost all of the driving. I'm just an observer. Yes, always ready to take over for the occasional retread in the road, etc., but otherwise he's doing the driving.
⁃ Sure, Max still makes mistakes, and I can almost predict the situations where it'll be challenging; there I'm especially alert, and I'll even let FSD fail, hopefully to teach it a lesson, help it get better, and contribute edge cases.
⁃ In conclusion, it's the best $2000 upgrade ever. No regrets, and I've been happy to be a part of the "learning experience". But I also realize there are folks who don't want to be part of any such "experiment". There are early adopters who love trying out the latest version, and people who just pay their money and want it to work. This thread seems to have both sorts of people!


----------



## garsh (Apr 4, 2016)

lance.bailey said:


> data point - I remember a loaner Volvo S60 from dealer at least 4 years ago and it had the speed limit in the dash. When I asked about it they said that they did recognition from the front camera. So the technology has been out there in cars for quite a while.


Yep, via Mobileye.


garsh said:


> Mobileye is a company that creates an Automatic Emergency Braking system and sells it to many companies, including Tesla, BMW, GM, Volvo, Hyundai, and more.


Note: Mobileye's "about" page has moved, and they no longer list any customers by name, but they did back when I made that post.

https://arstechnica.com/cars/2020/0...n-to-dominate-self-driving-and-it-might-work/

_A number of carmakers have developed similar systems. Shashua says Mobileye is supplying the technology for 70 percent of them, including systems from Nissan, Volkswagen, and BMW._​


----------



## DocScott (Mar 6, 2019)

GDN said:


> I would like to agree with you and do figure there will come a day most people realize that "Safer than human drivers" is a good thing, but it will take a long time to prove that. We kill many people across the world every day with cars being driven by humans and we just accept it - along with a few lawsuits, etc. However, let the car kill someone on its own, even at a rate lower than what humans are doing, and I don't think we'll ever accept it.
> 
> You could have the most perfect self driving system that never kills anyone and it would take years for it to be accepted and allowed on the road without a driver.


Weirdly, part of the PR problem is that an autonomous car is never inattentive.

"I just looked away for a second" is the kind of an excuse a human driver can use.

Unless there's a sensor failure, though, the car _got_ the input it needed, it just didn't interpret it (or, alternatively, the broader context) correctly. "The car thought the truck was an overpass" infuriates/scares people, because if they were paying attention and looking at that visual input, they would know the truck wasn't an overpass. For human drivers, although once in a while a person might misinterpret what they're seeing, much more often the problem was they weren't looking, or weren't looking in the right place: "the other car came out of nowhere."

Since the most common mistakes leading to accidents will be very different for humans than for autonomous cars, many people will see autonomous cars as unreliable/dangerous, since they don't do as well as people the things that people do well. The fact that autonomous cars do very well the things people do poorly (paying continual attention, looking in all directions at once, never driving angry/drunk/sleepy, ...) should count in their favor, but it's hard to make that argument in the face of "it thought the speed limit on this narrow residential road was 85 mph" or whatever non-human mistake an autonomous vehicle will occasionally make.


----------



## DocScott (Mar 6, 2019)

M3OC Rules said:


> In terms of bar codes on signs, I don't think this is a great solution. I think the cost of implementation would be significant. You would create new issues of maintenance, inventory tracking, and temporary situations due to construction. How many times are construction signs not meant to be honored sitting on the side of the road for a long time after a construction project is finished or during? Most of the time it works but not always. Which does better if the sign is partially blocked? AI or barcode? It would also likely have to be on a city by city basis. So I could see a city supporting the cost if it meant robo-taxis could operate in their city but you have a chicken and an egg problem. But the technology apparently has been developed and tested. Check this out if you haven't seen it: https://multimedia.3m.com/mws/media/1584051O/2d-barcode-whitepaper.pdf In a more general sense maybe I'm more optimistic that someone will solve this before the infrastructure changes. Tech vs government. I realize the government has to approve autonomy solutions but even so.


I brought up bar codes as a for-instance; I don't know that they will be the best infrastructure solution. Maybe RFID or something like that would be better.

But in terms of partially blocked, I just about guarantee that a well-designed bar code is better. If it's literally bars, rather than a 2-D grid like a QR code, then you only need to be able to see _some part_ of each bar, which isn't always true for numerals. And you could probably build in redundancy so that the bar code lists the speed limit more than once, while only listing less crucial but still useful information, such as GPS coordinates, a single time. That would largely solve the partial-obstruction problem.

But whether it's bar codes or RFID or some other aid, I'm pretty confident improvements will come from both sides. And Tesla will pursue a belt-and-suspenders approach, as it does now: Tesla uses map data when available, but reads signs (lights, stop signs, and now speed limits) as well. If those signs start to have features to help autonomous vehicles, then it will be one more arrow in the quiver for getting things right.

One bit of math related to these ideas: it's much, much harder to get something (just about anything) to perform correctly 99.99% of the time than to get it to perform correctly 95% of the time, right? But suppose you have four independent techniques that each get it right 95% of the time: say, reading the numbers on a sign, knowing the speed limit from a map, reading a bar code from the sign, and knowing from the fleet how fast previous cars drove through the area. Then the rate at which _all four_ of those fail is 5% of 5% of 5% of 5%, which is 0.0006%. Yes, if they disagree, then the car has to make some decisions as to which to trust, but at the very least it knows to be careful. Total failure becomes very unlikely. In a broad sense, that seems to be Tesla's approach to these kinds of issues, and I think it's a good one.
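The arithmetic above is easy to sanity-check in a few lines of Python (a toy sketch; the 95% success rates and the four source names are the hypothetical figures from the example, not measured reliability numbers):

```python
# Chance that ALL independent speed-limit sources fail at the same time.
def all_fail_probability(success_rates):
    prob_all_fail = 1.0
    for rate in success_rates:
        prob_all_fail *= (1.0 - rate)  # this source's individual failure chance
    return prob_all_fail

# Four hypothetical sources: sign OCR, map data, bar code, fleet speed history.
sources = [0.95, 0.95, 0.95, 0.95]
print(f"{all_fail_probability(sources):.7%}")  # prints 0.0006250%
```

Adding a fifth independent 95% source would cut the combined failure rate by another factor of 20, which is the force behind the belt-and-suspenders argument.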


----------



## lance.bailey (Apr 1, 2019)

Thanks @garsh. I didn't know they used Mobileye, and I didn't see Volvo in your earlier list of cars. Mea culpa.


----------



## garsh (Apr 4, 2016)

DocScott said:


> But in terms of partially blocked, I just about guarantee that a well-designed bar code is better. If it's literally bars, rather than a 2-D grid like a QR code, then you only need to be able to see _some part_ of each bar, which isn't always true for numerals.


Then some kids come by with a roll of electrical tape to "adjust" the bars as a prank. Better to just have cars read the numerals.


----------



## DocScott (Mar 6, 2019)

garsh said:


> Then some kids come by with a roll of electrical tape to "adjust" the bars as a prank. Better to just have cars read the numerals.


Which simply makes the car reject the bar code as unreliable, because bar codes generally have a checksum, and in the "prank" case, the checksum will indicate something is wrong. So the net result of the prank would be that the car ignores the bar code, and relies on just reading the numbers alone, or on map data.

Also, I don't think we have an epidemic of kids with sharpies adjusting bar codes in grocery stores as a prank so that people going through self-checkout pay the wrong price...
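For a concrete sense of how check digits catch tampering, here's the standard UPC-A rule used on retail barcodes (a road-sign format would presumably use something stronger, but the principle is the same):

```python
def upc_check_digit(digits11: str) -> int:
    """UPC-A: 3x the sum of digits in odd positions, plus the sum of digits
    in even positions, modulo 10, determines the 12th (check) digit."""
    odd = sum(int(d) for d in digits11[0::2])   # positions 1, 3, ..., 11
    even = sum(int(d) for d in digits11[1::2])  # positions 2, 4, ..., 10
    return (10 - (3 * odd + even) % 10) % 10

def is_valid_upc(code: str) -> bool:
    return (len(code) == 12 and code.isdigit()
            and upc_check_digit(code[:11]) == int(code[11]))

print(is_valid_upc("036000291452"))  # True: a genuine code
print(is_valid_upc("036000291457"))  # False: one altered digit is caught
```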


----------



## garsh (Apr 4, 2016)

DocScott said:


> Which simply makes the car reject the bar code as unreliable, because bar codes generally have a checksum


It's trivial to adjust the codes to have a correct checksum.


> Also, I don't think we have an epidemic of kids with sharpies adjusting bar codes in grocery stores...


No, but we have an epidemic of thieves changing barcodes.

Alleged Theft Ring Used Switched Barcodes To Steal From Retailers
Police: Woman switched barcodes at Butler County Walmart to pay less
Woman Arrested After Allegedly Switching UPC Codes At Walmart On Four Visits
FDLE: Suspects in retail theft ring used fake barcodes to steal $300K in merchandise
And many, many more.
Again, it's hard for humans to tell whether a barcode is correct. If it were trivial for Walmart to change all of their barcode readers to text readers, they'd probably do it to help prevent these types of thefts from happening.

If changing a speed limit sign would also cause Teslas to speed up or slow down, I'm sure some deviants would be motivated to make that happen.


----------



## iChris93 (Feb 3, 2017)

garsh said:


> If changing a speed limit sign would also cause Teslas to speed up or slow down, I'm sure some deviants would be motivated to make that happen.


And that will happen regardless of whether a barcode or text reading is used for speed limit detection.


----------



## M3OC Rules (Nov 18, 2016)

If you look at what 3M is testing, it might be a little different from what you're thinking: it's invisible to humans ("Invisible" 2D Bar Code to Enable Machine Readability of Road Signs - Material and Software Solutions). It covers a large portion of the sign (65%). They can encode other information in there, including markers for navigation and unique codes for sign tracking/maintenance. They also mention how many different federal signs there are, plus some numbers on detection accuracy:

"For example, in 2010, the German Traffic Sign Benchmark (GTSRB), a 50000-sign image set was published in conjunction with using neural networks for sign classification [1]. Subsequent convolutional neural network implementations achieved 99.81% classification accuracy, compared to human results around 98.5% accurate [2]. The GTSRB data set included only 43 sign classifications, less than 5% of the number of different sign classifications in the US. In the United States alone, there exist over five hundred unique federal signs each with multiple different sizes. Only 23 states have standardized to the Standard Highway Signs stipulated in the Federal Highway Administration's (FHWA) Manual on Uniform Traffic Control Devices (MUTCD), while the other 27 states have "largely conforming supplemental volumes" introducing many hundred more variations."

They also cite a study on speed limit recognition specifically, which I haven't looked at yet:

"Besides providing the MUTCD sign classification (a limit of current technologies), signs can convey more advanced functions of navigation information. The classification performance metrics referenced above only evaluate classification of the sign code; they do not attempt to decode the text in the sign, for example the numerical speed limit. Citing a more recent speed limit decode challenge shows a significantly different set of results with the successful decoding of a speed limit sign at a mere 84.31%, which confirms the opportunity to deliver sign content in a more optimized fashion [3]."

Of course this is a whitepaper from a company trying to sell something. But it is an interesting read.

https://multimedia.3m.com/mws/media/1584051O/2d-barcode-whitepaper.pdf

I wouldn't underestimate kids or criminals but with this technology you could create an automated system that checked for sign tampering at least for permanent signs.


----------



## garsh (Apr 4, 2016)

iChris93 said:


> And that will happen regardless of whether a barcode or text reading is used for speed limit detection.


Sure. The difference being that it's easier for everybody to notice an alteration to an Arabic numeral than to a barcode.


----------



## DocScott (Mar 6, 2019)

garsh said:


> Sure. The difference being that it's easier for everybody to notice an alteration to an Arabic Numeral than to a barcode.


That is a strawman argument.

No one is suggesting that we _replace_ human-readable signs with bar codes, or that we shouldn't have autonomous vehicles that can read Arabic numerals. The argument is that having both makes the system more robust, and is relatively inexpensive.

So, suppose there's a sign with Arabic numerals and a bar code, and a prankster goes to the trouble of applying a fake bar code, complete with a valid checksum. Sure, no one might detect the alteration..._if it weren't for the presence of the Arabic numerals_. If the car detects a discrepancy between a bar code with a proper checksum and the Arabic numerals, whichever one was altered, it's probably best for the car to disregard the sign. For a speed limit sign, that's fine: it can keep going the speed it was going, double-check against map data if it has it, and alert the driver to the issue.

If you think a human driver can detect a change to the Arabic numerals (in your scenario, it sounds like we're talking about L2 autonomy), then that driver can surely make the judgment call as to what speed to choose. If you're envisioning L3 autonomy or higher, the driver wouldn't be paying enough attention to notice the alteration anyway.
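The arbitration I'm describing is roughly this (a hypothetical sketch; the inputs and the fallback order are my own assumptions, not anything any manufacturer has published):

```python
def effective_speed_limit(barcode_mph, numeral_mph, map_mph, current_mph):
    """barcode_mph is None when the barcode is absent or fails its checksum."""
    if barcode_mph is None:
        return numeral_mph              # no usable barcode: read the numerals
    if barcode_mph == numeral_mph:
        return barcode_mph              # both encodings agree: trust the sign
    # Valid checksum but a mismatch: one encoding was altered, so
    # disregard the sign and fall back to map data or the current speed.
    return map_mph if map_mph is not None else current_mph

print(effective_speed_limit(85, 55, 60, 50))    # 60: tampered sign is ignored
print(effective_speed_limit(85, 55, None, 50))  # 50: no map, hold current speed
```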

Or are you thinking that a change to the Arabic numerals shown on the sign will be reported by people driving non-autonomous cars, and thus fixed? Well, yes. But that's because more people are using them right now. The more people have autonomous cars, the more problems with things like bar codes (or RFID or whatever) will be reported too. And I think we're all envisioning a future with a lot of autonomous cars, including commercial vehicles. The trucking companies are not going to take kindly to people messing with their infrastructure, and are likely to report vandalism of that kind quite quickly.

If it turns out pranksters replacing bar codes is a widespread problem--and it's hard for me to picture that (switching bar codes on products is motivated mostly by greed, not mischievousness or churlishness)--then it becomes a problem like counterfeit bills or documents, and the solutions become the same: use particular colors of ink that are hard to simulate in a regular printer, or whatever. No, it won't defeat a criminal mastermind bent on changing a speed limit sign, but it should stop most of these juvenile delinquents we seem to be imagining. It helps, by the way, if the bar code is more than 14" wide, which it probably would be anyway, because then it becomes harder to print out on a single sticker.

As I've said, I don't necessarily expect bar codes would be the solution we'd end up with. It's the lowest-tech version of these inexpensive infrastructure changes; my guess is we'll end up with something a little higher-tech but still cheap. My basic point is that there's no need to expect the car to handle every edge case in our _current_ world, any more than we'd expect a typical autonomous car to be able to drive properly if it was magically transported back to the road network of the early 19th century. They already utilize the infrastructure we've developed to help regularize things for human drivers, such as standardized lane markers and such. As more companies develop robust autonomous driving systems, I fully expect our infrastructure to evolve to help those driving systems be more reliable in a wider range of circumstances, particularly when the changes don't cost much to install or maintain. That's the overall point I'm trying to make, and I'm not clear on why that's even remotely controversial. Perhaps I just haven't been sufficiently clear.


----------



## garsh (Apr 4, 2016)

DocScott said:


> No one is suggesting that we _replace_ human-readable signs with bar codes


I know you were suggesting barcodes in _addition_ to normal text.

My point is that a barcode can be changed, and *nobody* will notice except for one or two cars that read barcodes instead of text. Those one or two cars will act wonky, and it won't be immediately obvious to anybody that a vandalized barcode is to blame.

But if the numbers are changed, *everybody* will notice. All the cars reading signs. All the drivers reading signs.


----------



## M3OC Rules (Nov 18, 2016)

DocScott said:


> I don't think we're _ever_ going to get to robust FSD that way alone. It would require the car to have the abilities of a human, bleeding far outside the areas of driving alone: it would need to understand language, context, etc..


While there are a lot of different signs and a lot of scenarios, making barcodes (or RFID, etc.) for each of them doesn't reduce the number of signs or make it easier to understand what to do, does it? Detecting what a sign says doesn't seem that hard relative to the rest of the problems with FSD. I think the AI will reach human level in recognizing signs.

I could see a situation where one sign modifies another, where it would be easier to make a single barcode covering both signs. But that is still a perception solution; it doesn't help the planning, which is where you need to understand what the signs mean.


----------



## M3OC Rules (Nov 18, 2016)

I wonder what their plan is for regulatory approval. If they get FSD to the point where it could be Level 5 a year from now, for argument's sake, will Tesla run its own test drives with safety drivers to get disengagement counts? I imagine disengagement counts from owners would be high for various reasons. Or will they provide sample disengagement data from owners?

From here:
"It comes down to standardization," said Pete Kelly, managing director of researcher LMC Automotive. He believes policymakers should spell out more clearly the circumstances in which a safety driver should or should not take control from the autonomous system. A better measure of the technology's evolution will come only when legislators require more detailed reports, with fuller descriptions of the vehicle's behavior during disengagement.

Industry executives including Cruise's Vogt don't like that the DMV's disengagement metric is used to infer whether self-driving technology is ready for commercial deployment. Vogt suggests that more qualitative analysis of autonomous testing, such as reviews of large volumes of raw video material, could better inform companies and regulators of the reasons for disengagement.


----------

