# V9 Autopilot Confidence, Behavior in "Unsupported" Situations



## sdbyrd79 (Nov 28, 2017)

I've been on v9 for a couple of weeks now, and my initial impressions of autopilot were positive. There were several spots where I was routinely forced to take over, and things were fairly consistent. However, after a couple of weeks of new roads and tests (I'm constantly using it), it has honestly scared the crap out of me in a few instances. When it gets confused, it really jerks left or right at times, even nearly going into another lane! I'm well aware that v9 is learning to use all of its cameras and will take time to improve, but it's concerning that I've been less confident in the past 2 weeks than I was in the past 4 months driving on autopilot with v8. Anyone else feel like it's better in some cases, but when it's not good, it's far less predictable than v8? I'm still going to use it the same, but my hands have gotten much tighter on the wheel!


----------



## garsh (Apr 4, 2016)

sdbyrd79 said:


> When it gets confused, it seems to really jerk left or right at times and even going into another lane nearly!


I've found that if you understand the current system's limitations, and only operate it in ideal conditions, that it works fine.

Can you describe the conditions under which you had it jerk the car into another lane? Maybe also provide a link to Google Maps showing where you were?

I've had one strange occurrence that I've not yet been able to explain. I was using autopilot on an interstate, set to 60mph (55mph zone). The car suddenly started accelerating with no input from me, and when I looked at the screen, the set cruise speed had somehow changed to 64mph. I have no idea how that happened. But I held down the right scroll wheel and submitted a bug report to Tesla, so hopefully they can track it down.


----------



## sdbyrd79 (Nov 28, 2017)

These are the same roads within a mile of my house that I've traveled for the past few months on v8, and it never once jerked like that - yet it did in two different spots just this morning. I've noticed on other roads I routinely travel with autopilot that v9 is getting confused more frequently - again with jerking rather than just beeping to let me know it has an issue or is confused. That's really my biggest concern: on v8 it would beep (not the red alert) when it needed your attention, but v9 swerves left/right more violently than I can ever remember v8 doing. I'm sure it's a short-lived problem, but I really hope we don't start seeing more autopilot accidents now that v9 is here. Stay safe and diligent folks!


----------



## garsh (Apr 4, 2016)

sdbyrd79 said:


> These are the same roads within a mile of my house that I've traveled for the past few months...


But that doesn't help me to understand the situation at all. What roads? A four-lane divided highway? A back road with no lane dividers?

Autopilot is currently only really competent on limited-access divided highways. You should not expect consistently-good behavior on other types of roads.


----------



## sdbyrd79 (Nov 28, 2017)

I get what you're saying, and of course you're right - by what the manual states. However, like nearly all of us, I've logged thousands of miles with autopilot on "non-standard" back/side roads, etc. under close supervision.

This is a very well-marked two-lane road with a shared turning lane, and it is unmarked at that intersection for about 15-20' by design. My only point is that I've traveled that road countless times on v8 and it NEVER even flinched at that intersection. Now with v9 it swerves erratically and is clearly confused. Other (similar) routes I routinely take NEVER swerved/jerked on v8, and now they do with v9. At the end of the day we're all looking to test/debug/push the limits for everyone's benefit down the road. I'm merely curious whether this is a broader problem among v9 adopters compared to v8. Normally you wouldn't expect things to get "worse" than what you had before - that's all I'm trying to say.

Side note: I went to grab the USB stick out of my 3, and even though the red dashcam icon was on this morning, it didn't have any of today's recordings on it.

Was hoping to show exactly what happened - oh well.


----------



## garsh (Apr 4, 2016)

Thanks for clarifying.

I have no problem with people experimenting with autopilot in "unsupported" situations. But when you state that v9 appears to have regressions, you need to make it clear that you're talking specifically about its use in non-supported situations. I'm not completely surprised that the behavior in such situations could vary noticeably from one software revision to another, and not necessarily as an improvement. Eventually, Tesla will concentrate on improving performance in those areas, but they currently seem to be concentrating on making it better in limited-access highway use. They still have a lot of issues even in the "supported" situations (phantom braking events being the most prominent).


----------



## Mike (Apr 4, 2016)

garsh said:


> I've found that if you understand the current system's limitations, and only operate it in ideal conditions, that it works fine.
> 
> Can you describe the conditions under which you had it jerk the car into another lane? Maybe also provide a link to Google Maps showing where you were?
> 
> I've had one strange occurrence that I've not yet been able to explain. I was using autopilot on an interstate, set to 60mph (55mph zone). The car suddenly started accelerating with no input from me, and when I looked at the screen, the set cruise speed had somehow changed to 64mph. I have no idea how that happened. But I held down the right scroll wheel and submitted a bug report to Tesla, so hopefully they can track it down.


After two of my phantom braking events with V9 (15 Oct 2018, Hwy 401 eastbound in the London ON area), the set speed had reset down from 103 kph to 90 kph all on its own.

I have not driven in that area since.


----------



## kort677 (Sep 17, 2018)

AP in v9 seems to work fine for me; I really don't see any difference other than sometimes having to double-click the stalk to get it running.


----------



## kort677 (Sep 17, 2018)

garsh said:


> Thanks for clarifying.
> 
> I have no problem with people experimenting with autopilot in "unsupported" situations. But when you state that v9 appears to have regressions, you need to make it clear that you're talking specifically about its use in non-supported situations. I'm not completely surprised that the behavior in such situations could vary noticeably from one software revision to another, and not necessarily as an improvement. Eventually, Tesla will concentrate on improving performance in those areas, but they currently seem to be concentrating on making it better in limited-access highway use. They still have a lot of issues even in the "supported" situations (phantom braking events being the most prominent).


What is a non-supported situation? I've never heard that term used before.


----------



## kort677 (Sep 17, 2018)

garsh said:


> Autopilot is currently only really competent on limited-access divided highways. You should not expect consistently-good behavior on other types of roads.


I don't concur with that comment; I use AP on two-lane undivided highways regularly.


----------



## MelindaV (Apr 2, 2016)

kort677 said:


> I don't concur with that comment; I use AP on two-lane undivided highways regularly.


THAT is a non supported situation.


----------



## littlD (Apr 17, 2016)

I'm actually more confident with v9, for the following reasons IMHO:
1. Auto Lane Change is much safer to use. The car even slows down when needed to fit between two cars with enough room in the adjacent lane.
2. I've noticed fewer issues with suddenly slowing down due to shadows, especially those around overpasses. Not completely fixed, but better for sure.
3. Tracking within the lane seems more accurate on average.

One minus I've noticed is a little ping-ponging on occasion, like I used to experience way back in May when I first took delivery. But it corrects pretty quickly.


----------



## kort677 (Sep 17, 2018)

MelindaV said:


> THAT is a non supported situation.


If on an unsupported road, AP wouldn't engage. The only thing that happens on roads like two-lane undivided roads is that your max speed is limited to 5 mph over the limit; otherwise the system functions just fine.


----------



## garsh (Apr 4, 2016)

kort677 said:


> If on an unsupported road, AP wouldn't engage.


That's not true. AP will engage if it either sees lines on both sides of the lane, or it has a car to follow. But that does not actually mean that it's a supported road type. Unfortunately, Tesla does not geo-fence the feature to only work on supported roads (they probably should), so this allows people to try it in all sorts of unsupported situations.


----------



## M3OC Rules (Nov 18, 2016)

Haven't tried v9 yet, but is this really surprising considering it's a new neural net? V8 does work quite well in many unsupported situations; I use it all the time as well. It would be irresponsible for Tesla to let you use it in situations where it's really unsafe and then say it's unsupported - especially if there is some regression. They know when you're in an unsupported situation, so why do they let you use it if you're not supposed to? In fact, they change the behavior, so there is no argument that they don't know. I'm not saying @sdbyrd79 is suggesting that this is too unsafe to use, but there is a threshold somewhere, and Tesla has been criticized a lot for this. I see that autosteer statement as a legal one - kind of like when Honda made their navigation system fully operational while driving, while everyone else restricted it while driving with a legal statement when you first start it. In the same way that drew me to Honda, I like Tesla's stance on this, but it's a risk no doubt.

Edit: Changed to "It would be irresponsible ..." to make it clear this is a hypothetical and a grammatically correct sentence.


----------



## M3OC Rules (Nov 18, 2016)

I also agree people should be aware of the possibility of regression (hopefully temporary) in some cases in order to achieve major progression, given how neural nets work. This will surprise some people and is a risk for Tesla. My question is whether it's like going from an old, reliable driver who didn't see too well to some young whippersnapper who just got their license.


----------



## sdbyrd79 (Nov 28, 2017)

And for the record, I'm all for highway AP being improved in v9 even if the secondary roads have some slight regression. After all, it's primarily used on highways. I've continued testing on all the usual secondary roads I travel since my original post, and I haven't experienced the same issues. That means it's either learning from its mistakes or the conditions weren't exactly the same (cars/shadows, etc.) as my previous runs. Unfortunately, my wife witnessed a couple of those moments, so she's not nearly as "confident" as I am at this point. We'll just have to keep testing and build up the trust again over time.


----------



## garsh (Apr 4, 2016)

M3OC Rules said:


> It irresponsible for Tesla to let you use it in situations where it's really unsafe and then say it's unsupported.


Let's try a little thought experiment:

1. On any other car, you can engage cruise control at any speed on any road you like. Yet you never before thought that it was "irresponsible" for a car company to allow that. Why?
2. On any other car, you can hold down the accelerator and go twice the posted speed limit. Yet you never before thought that it was "irresponsible" for a car company to allow that. Why?
3. What makes autopilot different? Other cars have GPS and navigation systems too, and therefore "know where they are" as well. Why do you hold Tesla to a higher standard?


----------



## pcenginefx (Sep 12, 2017)

I also agree with the OP that I have less confidence in v9 Autopilot vs v8, for exactly the same reasons - when my Model 3's Autopilot has what I call a "failure," the reaction from the car is for sure scarier than how it reacted in v8. For example, any false braking event is *much* harder than in v8 (to the point where it could cause a rear-end collision if someone was following me), so I always keep my foot near the gas pedal just in case. I've also experienced the steering failure where it will actually steer into/across the center line - v8 never did that for me on the same roads.


----------



## M3OC Rules (Nov 18, 2016)

garsh said:


> Let's try a little thought experiment:
> 
> On any other car, you can engage cruise control at any speed on any road you like. Yet, you never before thought that it was "irresponsible" for a car company to allow that. Why?
> On any other car, you can hold down the accelerator and go twice the posted speed limit. Yet, you never before thought that it was "irresponsible" for a car company to allow that. Why?
> What makes autopilot different? Other cars have GPS and navigation systems too and therefore "know where they are" as well. Why do you hold Tesla to a higher standard?


I hadn't tried it yet, and I didn't say it was really unsafe and they were being irresponsible. Maybe that was poorly worded and implied. I was just saying there is a threshold somewhere. If some manufacturer's cruise control went unstable once in a while, I would probably say it's irresponsible for them to allow use of it, and I imagine there would be a recall. The speed thing is more politics and history than a manufacturer decision. Tesla wants to be on the leading edge with autopilot, and it's part of their reputation. I think this is a little different.

I guess I just don't like the "you're holding it wrong" argument. You're right, and maybe it's good to remind people, but these are calculated Tesla decisions, and how it works in unsupported areas matters. They know people will use, and are using, it in "unsupported" scenarios. I haven't seen a ton of outcry about v9 being less safe, so hopefully @sdbyrd79's and @pcenginefx's issues are more isolated. But I think it's good to hear them. Slapping "beta" on something doesn't let you off the hook. The test will be when v9 starts hitting the autopilot safety reports. I assume they don't remove accidents/miles where people are using it in unsupported areas, and I don't see anything that says they do. It's not going to look good if they start going backwards on their quarterly safety reports.

PS: I just had my first v9 experience and didn't notice much difference, better or worse, in terms of autosteer in the unsupported areas I was just in - but that's very limited experience.


----------



## garsh (Apr 4, 2016)

M3OC Rules said:


> Slapping beta on something doesn't let you off the hook.


The problem is that if you are in an accident due to using Autopilot in one of these "unsupported scenarios", it's not Tesla that's "on the hook" - it's you. As far as the law is concerned, you are the driver of the vehicle and you are ultimately responsible for what happens. Tesla has already pointed out that you must pay attention at all times, and they attempt to enforce that via the "hands on wheel" nagging.

It's because of this that I keep reminding people about the limitations of Autopilot. I don't think that Tesla does enough to make sure people realize its limitations. I think most people who have been on this forum for a while have gotten the message by this point, but I don't think the vast majority of people carefully read that little popup window that appears when they activate autopilot.


----------



## garsh (Apr 4, 2016)

M3OC Rules said:


> I didn't say it was really unsafe and they were being irresponsible.





M3OC Rules said:


> It irresponsible for Tesla to let you use it in situations where it's really unsafe...


----------



## M3OC Rules (Nov 18, 2016)

M3OC Rules said:


> It irresponsible for Tesla to let you use it in situations where it's really unsafe and then say it's unsupported.





M3OC Rules said:


> I didn't say it was really unsafe and they were being irresponsible.


@garsh Hehe. I really didn't say it was really unsafe - I hadn't even tried it at that point. That was a hypothetical. Tesla says: "As we are working hard to make our cars the safest and most capable cars on the road in terms of passive safety, active safety, and automated driving, we must continue to encourage driver vigilance on the road - that is, by and large, the best way to prevent traffic accidents. Safety is at the core of everything we do and every decision we make, so we cannot stress this enough." If they really are all about safety, they may agree with me that it would be irresponsible; if they are half safety and half marketing, then maybe "risky" is a better term. If you ask me to define my term "really unsafe," I will punt.


----------



## evannole (Jun 18, 2018)

I use Autopilot only on interstate highways. So far, I am finding it to be much more stable, smooth, and reliable on v9 than it was on v8.


----------



## PNWmisty (Aug 19, 2017)

I have a little under 200 miles on v9, about half of it on rural highways. I think it's quite a bit better than previous versions, especially on unsupported road types. I've not noticed anything scarier about it; it seems more competent than ever. I did notice a little more mild ping-ponging occasionally on the interstate, but it was barely noticeable. Everything else seemed better and more capable.


----------



## garsh (Apr 4, 2016)

M3OC Rules said:


> @garsh Hehe. I really didn't say it was really unsafe.


I thought you were saying that it was irresponsible of Tesla to allow autopilot to be used in situations where it's not supported. I have a few issues with this:

1. Why don't you hold other automakers to this same standard? It sounds like you're saying the reason is that Tesla stresses safety in their vehicles. OK, but so has Volvo, since the 1980s. And so does every other carmaker when they tout their 5-star safety ratings in advertisements.
2. "Geofencing" (using the car's knowledge of its current position to allow/disallow features) is buggy. Yeah, the car basically knows where it is. But then suddenly the navigation system decides that you're actually on that little local road running parallel to the interstate, and the car decelerates to the new 35 mph speed limit. That's bad enough. It would be even worse if autosteer suddenly stopped working in that situation too.
3. Tesla *does* prevent activating Autopilot if it can't find lines on the road. So they already perform some amount of sanity checking on the situation. It's just not good enough to prevent people from using it in _all_ unsupported situations.
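For readers unfamiliar with the term, the geofencing idea being debated here can be sketched in a few lines. This is a toy illustration of the concept only - the road-type names, the function, and the gating logic are hypothetical, not Tesla's actual software:

```python
# Toy sketch of position-based feature gating ("geofencing"): allow a
# driver-assist feature only when the mapped road type is supported AND
# the vision system currently sees lane lines. All names are hypothetical.

SUPPORTED_ROAD_TYPES = {"limited_access_divided_highway"}

def autosteer_allowed(road_type: str, lane_lines_detected: bool) -> bool:
    """Gate the feature on both the map-matched road type and vision input."""
    return road_type in SUPPORTED_ROAD_TYPES and lane_lines_detected

# On a mapped interstate with visible lines, the feature engages:
print(autosteer_allowed("limited_access_divided_highway", True))   # True

# The failure mode described above: map-matching snaps the car onto the
# parallel frontage road, and the feature would cut out mid-drive:
print(autosteer_allowed("local_road_parallel_to_interstate", True))  # False
```

The bug garsh describes is exactly why this is harder than the sketch suggests: the `road_type` input comes from map-matching, which can flip to the wrong road mid-drive, so gating a steering feature on it trades one failure mode for another.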


----------



## smatthew (Jul 1, 2018)

garsh said:


> I thought you were saying that it was irresponsible of Tesla to allow autopilot to be used in situations where it's not supported. I have a few issues with this:
> 
> Why don't you hold other Automakers to this same standard? It sounds like you're saying that the reason is because Tesla stresses safety in their vehicles. Ok, but so does Volvo, since the 1980's. And so does every other carmaker when they tout their 5-star safety ratings in advertisements.
> "Geofencing" (the act of using the car's knowledge of its current position to allow/disallow features) is buggy. Yeah, the car basically knows where it is. But then suddenly the navigation system decides that you're actually on that little local road running parallel to the interstate, and it decelerates to this new 35 mph speed limit. That's bad enough. It would be even worse if autosteer suddenly stops working as well in that situation.
> Tesla *does* prevent activating Autopilot if it can't find lines on the road. So they already perform some amount of sanity checking on the situation. It's just not good enough to prevent people using it in _all_ unsupported situations.


I disagree about geo-fencing. Prior to v9, Auto Lane Change only worked in "supported situations," i.e. restricted-access highways. That restriction was pretty much flawless in my experience. I never had lane change activate on parallel side roads, nor did I have it go away on the freeway due to nearby roads.

If Tesla had the capability to restrict auto lane change - they could also restrict Autopilot.


----------



## MelindaV (Apr 2, 2016)

smatthew said:


> I disagree about geo-fencing. Prior to V9, Auto Lane Change only worked in "supported situations", i.e. restricted access highways. That restriction was pretty much flawless in my experience. I never had lane change activate on parallel side roads, nor did I have it go-away on the freeway due to nearby roads.
> 
> If Tesla had the capability to restrict auto lane change - they could also restrict Autopilot.


In Tesla's world, San Ramon is toward the top of the list. Imagine you are going through a small town in North Dakota and expecting it to react like it does in the Bay Area.


----------



## smatthew (Jul 1, 2018)

MelindaV said:


> In tesla's world San Ramon is toward the top of the list. Imagine you are going thru a small town in North Dakota and expecting it to react like it does in the Bay Area


Does the car react differently in North Dakota?

BTW - prior to V9, has anyone ever heard of a Tesla doing an auto lane change on a non-restricted access highway?


----------



## MelindaV (Apr 2, 2016)

smatthew said:


> Does the car react differently in North Dakota?


There are just many more miles driven by Teslas in the Bay Area than in some other parts of the world (like North Dakota).


----------



## M3OC Rules (Nov 18, 2016)

garsh said:


> I thought you were saying that it was irresponsible of Tesla to allow autopilot to be used in situations where it's not supported.


Nope. It was a hypothetical: IF it were really unsafe, it would be irresponsible for them to allow it. I don't believe it is, and I use it every chance I get. And I totally agree with reminding people who question its safety that they are responsible, that it's beta, and that there is a warning about using it in unsupported situations. Separate from the individual-user level/advice, I think this does have ramifications for Tesla.

I agree with @smatthew that they could do more to restrict autopilot if they wanted to.
Reasons to restrict:
1. Safety - if they think it's unsafe
2. Legal
3. Public perception, media and NTSB - bad press, recalls, etc.

Reasons to not restrict:
1. Safety - if they think it's safer
2. Marketing
3. Engineering - if it's too hard

As I said, I think they could do much more to limit its use if they wanted to, so I don't think the lack of restriction is for technical reasons. I think the marketing aspect is important to Tesla and their reputation as a technical leader, and Tesla's lead in this area is not as obvious as it used to be. In terms of public issues, we've already seen them add more nagging due to investigations and bad press after some accidents. Elon was not happy about this, based on his tweets, because he felt people would use it less and be less safe. In terms of safety, I wonder what Elon Musk would say if you asked whether he thought using autopilot in unsupported areas was less safe than not using it at all.


----------



## M3OC Rules (Nov 18, 2016)

I was thinking about how Electrek questioned the benefit of "Navigate on Autopilot" if it didn't change lanes on its own. One thing it might solve is the issue of splitting lanes at a freeway exit. That was the cause of the Model X death where the car got ripped in half, and of YouYou's accident in Europe.


----------



## iChris93 (Feb 3, 2017)

M3OC Rules said:


> YouYou's accident in Europe.


Was this confirmed? Thought there was suspicion of suspension/steering component failure.


----------



## undergrove (Jan 17, 2018)

Nick's Tesla Life said:


> I had a close call with V9 Autopilot steered left going through a intersection on a local road. I know, I know, we shouldn't be using AP on local roads, but I made this video to show people a potential problem with V9 AP.


I had a similar swerve to the left in an empty intersection on a two lane country road. It deviated slightly to the left just as it came to the intersection and then right back on center as it left. The road was completely deserted so I felt safe trying this. It happened twice in the same intersection. The second time I got video from the TeslaCam. I will try to post video later.

In general, I have found Autopilot to be smoother and better. I did not try using Autosteer in this location with v8.


----------



## joelliot (Jan 25, 2018)

undergrove said:


> I had a similar swerve to the left in an empty intersection on a two lane country road. It deviated slightly to the left just as it came to the intersection and then right back on center as it left. The road was completely deserted so I felt safe trying this. It happened twice in the same intersection. The second time I got video from the TeslaCam. I will try to post video later.
> 
> In general, I have found Autopilot to be smoother and better. I did not try using Autosteer in this location with v8.


Had a similar incident in a similar situation: a two-lane road with an intersection where the lines did not go through the intersection. Autopilot seemed like it was trying to go into the oncoming lane.


----------



## undergrove (Jan 17, 2018)

joelliot said:


> Had a similar incident in a similar situation. Two lane road, with intersection where the lines did not go thru the intersection. Autopilot seemed like it was trying to go into the oncoming lane.


In my case I did not feel the need to take control. The move was small and immediately corrected. However, I would not want to use Autopilot in this kind of situation with oncoming traffic in the opposite lane. It would likely startle the oncoming driver.


----------



## M3OC Rules (Nov 18, 2016)

Is this the type of thing that affected your confidence? It's definitely something that could potentially cause a bad accident, even if someone was looking forward and paying attention. Clearly an unsupported area, but definitely the type of road some will use autopilot on.

https://teslaownersonline.com/threa...-39-x-pre-release-megathread.8878/post-166751


----------



## M3OC Rules (Nov 18, 2016)

iChris93 said:


> Was this confirmed? Thought there was suspicion of suspension/steering component failure.


I don't know if anything was confirmed. Here is his original statement:

https://www.reddit.com/r/teslamotors/comments/8m8bn4
It sounds like Tesla did download the data from the car, but I haven't seen any statement from them.


----------



## garsh (Apr 4, 2016)

@Nick's Tesla Life , @undergrove, @joelliot

If you don't yet know, please realize that Autopilot should only be used on limited-access, divided highways. It doesn't handle intersections, stop signs, red lights, or lack of painted lines marking your lane.

If you do know this, and you're just experimenting, then PLEASE be sure to state this in any videos or posts you make. Explain to your viewers/readers that you are merely trying it out in an unsupported situation to see how it reacts. There are too many people in the world who believe that Autopilot == Self Driving, and they'll wrongly take these kinds of stories as evidence of failure.


----------



## ADK46 (Aug 4, 2018)

I've experienced two episodes of violent misdirection commanded by AP v9. Despite my full attention and hands on the wheel, I could not stop the car from crossing into the opposing lane - it was that violent. It was the sort of steering I'd use to avoid a deer, but only if the opposing lane were clear.

The circumstances were straight sections of rural 2-lane roads at intersections, where the centerline goes away briefly. Dry, daytime. Once at 55 mph, once at 40.

This is definitely the product of a dangerous shortcoming in the software. It tells me that (1) AP does not look very far down the road, and (2) AP pays little attention to map data, or even to the fact that real roads have gentle curves. Yes, this was "unsupported" use, but I find it very disturbing that the AP software is so unsophisticated. If the centerline of an interstate were briefly interrupted, the same thing might happen. Sophisticated software would react differently, such as by looking further down the road - duh.

After years of reading about the vaunted ability of Teslas to "drive themselves", I am profoundly disappointed.


----------



## garsh (Apr 4, 2016)

ADK46 said:


> I find it very disturbing that the AP software is so unsophisticated.





> After years of reading about the vaunted ability of Teslas to "drive themselves"


This is why I continue to beat this drum.

*AUTOPILOT IS NOTHING BUT FANCY CRUISE CONTROL.*

The "neural net" that they're currently using has a very limited purpose: determine where the lines are on the pavement so that the car can steer between them.
That's it.
It doesn't do anything more sophisticated than that.

Autopilot doesn't make use of GPS or map data (other than for determining the current speed limit).
Autopilot doesn't look at previous camera snapshots - just the current images.
And yes, Autopilot doesn't even appear to look very far down the road for lines.

All of the articles being written about Tesla's autopilot allowing the car to "drive itself" are misleading the public.


----------



## MelindaV (Apr 2, 2016)

ADK46 said:


> I've experienced two episodes of violent misdirection commanded by AP v9. Despite my full attention and hands on the wheel, I could not stop the car from crossing into the opposing lane - it was that violent. It was the sort of steering I'd use to avoid a deer, but only if the opposing lane were clear.
> 
> The circumstances were straight sections of rural 2-lane roads at intersections, where the centerline goes away briefly. Dry, daytime. Once at 55 mph, once at 40.
> 
> ...


You were on a street with two-way traffic, trying to use a feature designed only for one-way divided roads.
Your disappointment should be aimed at your unrealistic expectations, not at how the product performs.


----------



## ADK46 (Aug 4, 2018)

MelindaV said:


> You were on a street with two way traffic trying to use a feature designed only for one way divided roads. I find it disappointing people are still blaming the software for errors it makes when used in unsupported situations


Not streets, roads. I know most of you live in congested areas - I live in a one-stoplight town of mostly roads (and one lightly used interstate) that are simple to navigate. I have only experimented with AP on two-lane roads, fully prepared to take over. The car allows it, though it could easily forbid it. I am a retired engineer from the aerospace industry, naturally prudent, but I did not expect the car to suddenly veer into an opposing lane, completely confused, so quickly that I could not take over.

I paid $5000 for this unsophisticated and potentially dangerous software. Is it really a pure neural network steering the car? Is there nothing to stop an obviously incorrect action? If so, and since it is officially in beta, perhaps it should have a big Student Driver sign on it until it has finished its training.

It is not much use if I must grip the steering wheel and pay even more attention to where the car is going than if I were to just steer it myself. Unless I have a stroke, I'm not going to veer if I lose sight of a bit of paint on the road. The word "confidence" is in the title of this thread.

The real kicker is that I almost paid $3000 for full self-driving, as if that was going to become possible in my lifetime. Until the current EAP becomes sophisticated enough to read a map and cope with brief interruptions of a centerline, that's a pipe dream bordering on fraud.

I love the car otherwise. Great engineering, great features, programmed by people who know how to do user interfaces. I have to leave now, to give a test drive to a traditionalist.


----------



## MelindaV (Apr 2, 2016)

ADK46 said:


> but I did not expect the car to suddenly veer into an opposing lane,


Without even reading further, this is the issue. It is not designed for situations where there is oncoming traffic


----------



## Flashgj (Oct 11, 2018)

garsh said:


> This is why I continue to beat this drum.
> 
> *AUTOPILOT IS NOTHING BUT FANCY CRUISE CONTROL.*
> 
> ...


Spot on! I am totally shocked by the number of people using Autosteer on any road besides divided highways (which the manual clearly states is the only place it is intended to be used), and then stating their total disappointment in how it handles.

I personally use it on interstates all the time and find it to be a great feature, and I am amazed at how well it performs in most situations. It has a few hiccups (exits, for example) but keeps improving with each update. I understand that in its current form it relies on lane markings, so when I see that I am entering a situation where the lane markings are not clearly defined, I take over. Using it on roads that have a lot of breaks in the lane markings (intersections, for example) is, in my opinion, very irresponsible and just asking for something bad to happen.


----------



## garsh (Apr 4, 2016)

ADK46 said:


> The car allows it, though it could easily forbid it.


Cars allow you to go over the speed limit, though they could easily forbid it.
Cruise control allows you to set speeds way above the speed limit in the middle of a congested city, though they could easily forbid it.
The car is a tool.
You as the driver are responsible for using the tool safely.



> I paid $5000 for this unsophisticated and potentially dangerous software.


Compared to your expectations, *YES*.

It's really NOT all that sophisticated. And if you fail to heed the warnings and continue to use it in unapproved scenarios, then *YES, it is dangerous.*

Two and a half years ago, when I put down my $1000 deposit on a Model 3, I too thought that Autopilot could basically drive the car. But since then, I've discovered just how simplistic it actually is. It's not a problem of Tesla advertising it as more than it's capable of - it's all of the news articles saying that it is, and all of the ill-informed Tesla fans saying that it is. Both of those sources are unfortunately more vocal and more popular than I am.

Please, everybody, help me get out the word. When you see someone talking about how great it is that their new Tesla can drive itself, CORRECT THEM. Stop this misinformation from spreading!


----------



## PNWmisty (Aug 19, 2017)

Flashgj said:


> Spot on! I am totally shocked by the amount of people using Autosteer on any road besides divided highways (which the manual clearly states it is intended to only be used on). And then stating their total disappointment in how it handles.


I agree. I use it in unsupported situations all the time - rural roads, state two-lane highways, winding residential lanes, etc. - and don't find it dangerous at all. Probably because I have realistic expectations. I do find it useful in that it maintains following distances and lane position, allowing me to focus on the bigger picture.

I don't understand people who say it veers into dangerous situations faster than they can react. Here's why:

1) I'm always paying attention to what's going on around me.
2) My hands are on the wheel.
3) If it tries to do something sudden that is not justified by the situation, I feel it immediately and don't let it do it. It does not normally make sharp movements, so when it does, there is no mistaking it for normal. I only see this as a problem if someone thinks it's actually competent to drive and lets their normal guard down. I'm more relaxed when it's engaged, and this makes me more alert and better able to react if it does something sudden.

The one time it put the brakes on really hard, there was a car pulling out from behind a large box truck that would have been a certain collision had one of "us" not applied the brakes. It's like having two drivers in one, me and EAP. Of course, I'm the better driver.


----------



## Mike (Apr 4, 2016)

ADK46 said:


> It tells me that #1, AP does not look very far down the road, and #2, AP pays little attention to the map data, or even that real roads have gentle curves. Yes, this was "unsupported" use, but I find it very disturbing that the AP software is so unsophisticated. If the centerline of an interstate is briefly interrupted, the same thing might happen. Sophisticated software would react differently, such as by looking further down the road - duh.


@ADK46, this is the only thing I am looking for right now: the maps lend logic to the autopilot.

I don't care if I still have to use my mirrors to check the "blind" spots.

I don't care if i have to initiate a lane change with the use of the signal stalk.

I do care that the autopilot does not know to ignore any turns/intersections/offramps until the active nav system tells it otherwise.


----------



## ADK46 (Aug 4, 2018)

MelindaV said:


> Without even reading further, this is the issue. It is not designed for situations where there is oncoming traffic


I know what the manual says, but I also know what the car allows, knowingly, on purpose - by design. Words versus reality. I went along with Tesla's deliberate ambiguity - I gave it a try, cautiously, on an easy road. I've stopped now. I encourage everyone not to try. I'm OK that it didn't work well on curves, but shocked at _how violently_ it can fail, too suddenly to catch. Can we be confident our cars won't do the same if they encounter a brief interruption in a lane marking on an interstate?

My issue - my disappointment - is not that AP cannot be used on 2-lane roads, it is that I've come to understand that Tesla's AP system is remarkably unsophisticated given its cost and - hello! - it's 2018. I better understand now why it steers poorly on interstates, though less dangerously. I had to bail on a curve coming home this morning from giving my friend a demo. AP does not react to my fear of guardrails.

Garsh has it right. My expectations were too high - about $4200 too high, after accounting for adaptive cruise control.


----------



## changsteer (Sep 7, 2017)

PNWmisty said:


> I agree. I use it in unsupported situations all the time, rural roads, state two-lane highways, winding residential lanes, etc. and don't find it dangerous at all. Probably because I have realistic expectations. I do find it useful in that it maintains following distances and lane position, allowing me to focus on the bigger picture.
> 
> I don't understand people who say it veers into dangerous situations faster than they can react. Here's why:
> 
> ...


Well said, and I couldn't agree more. I'm grateful that a company offers me such a nice car to drive and enjoy its advanced technology. I love to test out the technology and also understand its limits. When I use AP in supported or unsupported conditions, I pay attention to the road as I should. And when AP got confused in several situations, I took over and corrected it right away. If we are asking a computer to be responsible for its behavior and actions, we'd better hold the same standard to ourselves, if not higher.


----------



## joelliot (Jan 25, 2018)

undergrove said:


> In my case I did not feel the need to take control. The move was small and immediately corrected. However, I would not want to use Autopilot in this kind of situation with oncoming traffic in the opposite lane. It would likely startle the oncoming driver.


Yes, there was no one behind me or in the other lane. There was someone in front of me that might have been amused if they were looking in the rear-view mirror, but as mentioned by others, you need to be vigilant if using Autopilot in unsupported situations. That said, it seems much better on secondary roads, and I have a good feel now for where I think it's going to have problems.


----------



## undergrove (Jan 17, 2018)

garsh said:


> @Nick's Tesla Life , @undergrove, @joelliot
> 
> If you don't yet know, please realize that Autopilot should only be used on limited-access, divided highways. It doesn't handle intersections, stop signs, red lights, or lack of painted lines marking your lane.
> 
> If you do know this, and you're just experimenting, then PLEASE be sure to state this in any videos or posts you make. Explain to your viewers/readers that you are merely trying it out in an unsupported situation to see how it reacts. There are too many people in the world who believe that Autopilot == Self Driving, and they'll erringly believe that it's a failure when seeing/reading these kinds of stories.


Garsh:

Your warning is well taken. I posted to affirm what the previous poster had illustrated--that Autopilot with Autosteer is specifically *not* to be trusted on surface streets at intersections.

I did this as a test on a country road well known to me with no other traffic. The swerving noted could easily have caused other drivers to react in a traffic situation and potentially caused an accident.

I responded to the post here, but I think this discussion might be better carried on at the *Autopilot confidence v8 vs. v9* forum, where sdbyrd79 described a similar tendency to "...jerk right or left at times."


----------



## garsh (Apr 4, 2016)

ADK46 said:


> ...I've come to understand that Tesla's AP system is remarkably unsophisticated...


Yet, keep in mind that it's also the most advanced system available. It's the only system available today that succeeds at keeping the car between the lines.


----------



## M3OC Rules (Nov 18, 2016)

ADK46 said:


> I know what the manual says, but I also know what the car allows, knowingly, on purpose - by design.


I agree here. Tesla could do more to limit its use and chooses not to.



ADK46 said:


> Tesla's AP system is remarkably unsophisticated given its cost


Do you mean its limited use cases?

One thing I'd say to keep in mind is that it's getting better over time. It could be full self-driving someday. Enhanced Autopilot is a stepping stone, and Drive on Nav is coming tomorrow. You're paying for sensors that no other consumer car has, with an upgradeable computer. Wait a few months and maybe it will meet your expectations.


----------



## ADK46 (Aug 4, 2018)

First, let me repeat that I love the car. I like that I get to experiment with "unsupported" and/or beta features. My lament is over the sophistication of the AP system. I did read the report published recently comparing the systems from several manufacturers, showing that Tesla had the best. But not by much. I thought Tesla would have a big lead.

(If I recall, the testing was done on two-lane roads to challenge each system, and one test was specifically on what happens when you go over a rise, when lane markings can't be seen very far ahead.)

I've been loosely following computer science since 1968, and have programmed in languages from Fortran to Swift, on systems from the IBM 360 to iOS. I did some data analysis using artificial neural networks in the 1990s - that was a real eye-opener into a very different world of computing. Having observed the trends over 50 years, perhaps my surprise is at how naive I could be about the general state of machine vision.

It's easy to be misled into believing "The future is here!" from press releases, but I thought I was immune. Musk may have special powers here, like Jobs' Reality Distortion Field. Didn't he tweet something like "Imagine summoning your car home from California!" The actual meaning of Full Self-Driving is left to our imagination.

Anyway, to dive into one detail based on my rudimentary knowledge: If indeed the steering is the direct output of neural net computations, why wouldn't this net be trained with a smidgeon of map data: the current and upcoming radii of curvature? Compared to all the visual data, this seems easy. Training a net is about repetitively presenting data from known situations where the correct output is known - this is an apple, this is an orange. Did this steering input move the car into a better position or not? The training continues until the net is smart enough for your purposes. Difficult cases must be presented if you want it to be smart about them.

By providing curvature inputs, the net would be trained to know the path of the road beyond the rise. It would know better what to do if a lane marking became obscured or missing - the net would learn to trust whatever lines remain that follow the curvature, don't panic, slow down if it persists.
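To make the idea concrete, here is a deliberately toy sketch in plain NumPy - my own illustration, nothing to do with Tesla's actual architecture. A steering model's inputs mix a camera-style lane offset with a map-derived curvature, and training pairs those inputs with a known-correct steering output. The weights (2.0 and -0.8) are invented stand-ins for "follow the curve, re-center in the lane":

```python
import numpy as np

rng = np.random.default_rng(0)

def make_sample():
    # Hypothetical inputs: one from map data, one from the cameras.
    curvature = rng.uniform(-0.1, 0.1)    # upcoming road curvature, 1/m
    lane_offset = rng.uniform(-0.5, 0.5)  # offset from lane center, m
    # "Known correct" steering for training: follow the curve, re-center.
    steer = 2.0 * curvature - 0.8 * lane_offset
    return [curvature, lane_offset], steer

samples = [make_sample() for _ in range(500)]
X = np.array([s[0] for s in samples])
y = np.array([s[1] for s in samples])

# Simplest possible "net": a single linear layer, trained by gradient
# descent on mean-squared steering error.
w = np.zeros(2)
for _ in range(2000):
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w -= 0.5 * grad

print(w)  # recovers roughly [2.0, -0.8]: it learned to use the map input
```

The only point of the toy: nothing stops map data from being one more training input alongside the visual features.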

I'm sure this is a comically simple view of the problem. I'd love to bump into some AP engineers in a bar....

I love the car! And people who are smart enough to buy them.


----------



## garsh (Apr 4, 2016)

ADK46 said:


> I did read the report published recently comparing the systems from several manufacturers, showing that Tesla had the best. But not by much. I thought Tesla would have a big lead.


Waymo has a 10+ year lead over everybody else in this area, including Tesla. It's just a shame that you can't yet buy a Waymo-powered self-driving car. I hope that becomes available soon - the world needs self-driving cars. They will prevent so many accidents.


> Didn't he tweet something like "Imagine summoning your car home from California!"


Yes, but that was when discussing Full Self Driving (FSD). They're working towards it.


> If indeed the steering is the direct output of neural net computations, why wouldn't this net be trained with a smidgeon of map data: the current and upcoming radii of curvature? Compared to all the visual data, this seems easy.


I agree. And they may be working on it. I think they'll have to add some additional data if they want Autopilot to be able to follow navigation instructions - it can't yet handle an interstate that splits, for example.

Training a neural net is very much more an art than a science. You - the trainer - have to decide what inputs will be useful. You usually have to "simplify" the inputs to become one or more numbers in a range from 0 to 1. Adding a single input _multiplies_ the computing power required for training. So ideally you try to limit yourself to a minimal set of inputs.

You have to *manually* create a large set of training data, which includes all of the inputs along with the desired output. Additionally, you have to also create a large set of "validation" data. This is similar to training data, except you DO NOT use it to train the neural net - you use it to test the completed neural net to confirm that it produces desirable output. It takes a _lot_ of manpower to create the training and validation datasets.
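As a toy illustration of that training/validation discipline (my own example, nothing Tesla-specific): hold some labeled data out of training entirely, and judge the model only on the held-out set.

```python
import random

# Toy labeled dataset: input x in [0, 1), desired output x squared.
data = [(x / 100, (x / 100) ** 2) for x in range(100)]
random.seed(1)
random.shuffle(data)

# The validation set is never used for training - only for judging.
train, validation = data[:80], data[80:]

def predict(x, training_set):
    # Stand-in "model": 1-nearest-neighbour lookup into the training set.
    return min(training_set, key=lambda pair: abs(pair[0] - x))[1]

val_error = sum(abs(predict(x, train) - y)
                for x, y in validation) / len(validation)
print(f"mean validation error: {val_error:.4f}")
```

A low error on `validation` is evidence the model generalizes; a low error on `train` alone proves nothing, since the model has already seen those examples.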


----------



## M3OC Rules (Nov 18, 2016)

I am pretty happy with how well Autopilot works in supported and unsupported situations. There are cases that could be dangerous if someone is following too close, like if a car crosses in front a block up and it hits the brakes, or if it gets confused by shadows or overpasses and hits the brakes. These are TACC behaviors, so they don't have supported and unsupported situations, and I assume other cars have the same issues. That behavior has worried me more than Autosteer in my experience.

The places where Autosteer worries me are when it's closer to the car next to me than I want, or when there is no shoulder and there's a barrier or guardrail. I feel like the ultrasonic sensors might keep it from hitting the side, but I don't really know, and I really don't want to find out the hard way. I don't know if it's safer to be on Autopilot or not - I can give examples in both supported and unsupported situations. If it's actively making sure I don't hit the barrier, maybe it's safer because I can spend more time watching the other cars. I really don't know.

Now, the case people are getting upset about is crossing the intersection and turning hard. I've seen the well-produced video, and it's exactly what you don't want it to do. It's hard to tell from a video whether you would have time to respond if you were paying close attention, but it's clearly dangerous. This is all anecdotal, but it seems like this may be a v9 thing. I have no idea how often it does this or how many people use Autosteer in unsupported areas. If it's causing significant accidents, it's a problem for Tesla and one they need to address - they publish accident data and say they care about accidents.

When and where you should use it is a personal decision. Unfortunately, you may think it's worse or better than it is and make a bad decision for yourself in terms of safety; this can go both ways. But it's a little unpredictable, so you're probably never going to know.
It's not going to be perfect, and statistics will ultimately tell the story.


----------



## MelindaV (Apr 2, 2016)

M3OC Rules said:


> I don't know if its safer to be on autopilot or not.


Conveniently, Tesla has tracked this for you!


> Over the past quarter, we've registered one accident or crash-like event for every 3.34 million miles driven in which drivers had Autopilot engaged.
> For those driving without Autopilot, we registered one accident or crash-like event for every 1.92 million miles driven.




__ https://twitter.com/i/web/status/1047934267794186240
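As a quick back-of-the-envelope comparison of those two figures:

```python
# Miles driven per accident or crash-like event, from Tesla's numbers above.
miles_per_event_with_ap = 3.34e6
miles_per_event_without_ap = 1.92e6

ratio = miles_per_event_with_ap / miles_per_event_without_ap
print(round(ratio, 2))  # about 1.74x more miles between events with AP engaged
```

Note the usual caveat: this is a raw ratio, not controlled for where Autopilot tends to be engaged (mostly highways, which have fewer crashes per mile to begin with).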


----------



## ADK46 (Aug 4, 2018)

Does the Waymo scheme rely on Lidar? I have not encountered the answer to a question I've had: what makes Tesla think they can get by with just cameras when other efforts involve a big wart on the roof? The flip answer is that humans drive pretty well with just two cameras, sometimes just one, but there must be a more serious answer.

The last thing I read about neural nets was about a new method for preparing an efficient training set. It was over my head, but yes - this is a big deal. I've assumed Tesla gathers a huge amount of potential data to form sets. I would hope it would flag cases where the vehicle went over a line, or detected that a driver bailed out for cause.

I know many believe our car is engaged in training its own neural net, but I presume we all have the same one, trained by Tesla.

Actually, I think it's more likely there is a suite of neural nets (together with more procedural routines), each dedicated to a task. Finding lines in an image is a task. Choosing which line to follow might be a different task. There ought to be a task that predicts where to look for lines during the next iteration. There could be a supervisory task, watching over the others, scoring the overall quality of the effort. I'd like to see this score displayed.

There I go, dreaming again.


----------



## MelindaV (Apr 2, 2016)

I think part of the answer to that question, @ADK46, is the vast volume of Teslas collecting data and creating crowd-sourced mapping vs. relying on lidar (while Waymo has very few vehicles collecting info for mapping, and in very few locations). Lidar also has limitations that camera+radar do not as far as what types of objects are detectable.
(Without going back to check myself: radar can detect solids but not liquids (water-based people/animals), lidar can see solids but is impeded by liquids (weather), sonar can see solids and liquids, and cameras can see both solids and liquids.)
Lidar is also higher cost, not aerodynamic, not attractive, etc., so if there is an alternate system that can provide the same or better results, why choose lidar?


----------



## PNWmisty (Aug 19, 2017)

ADK46 said:


> The flip answer is that humans drive pretty well with just two cameras, sometimes just one, but there must be a more serious answer.


That seems like a pretty damn serious answer to me. It actually amounts to proof that, correctly implemented, cameras are all that is necessary. I would add a couple of accelerometers for that "seat of the pants" capability - but the car already has those.

This is all that's necessary for a true "Mad Max" mode, with perfectly executed tire-smoking donuts thrown in for bonus attitude points!


----------



## M3OC Rules (Nov 18, 2016)

MelindaV said:


> conveniently, Tesla has tracked this for you!
> 
> 
> __ https://twitter.com/i/web/status/1047934267794186240


Yes. And as near as I can tell, it does not distinguish the situation it's being used in, supported or unsupported, which is great. Those results are compelling. Based on that Tesla data, you might say it's always better to have it on. But we don't really know if there are situations where it's statistically less safe. Tesla probably does, but I'm pretty sure we're never going to see that data unless we go work there or it gets subpoenaed. Use in unsupported areas could be bad without affecting the results, if it's a very small percentage of the use - or maybe it's not causing more accidents than it's preventing. Is there a situational crossover point where it becomes less safe? This is what I'm unclear on.


----------



## M3OC Rules (Nov 18, 2016)

MelindaV said:


> I think part of the answer to that question @ADK46 is the vast volume of Teslas collecting data and creating crowd-sourced mapping vs relying on lidar (while waymo has very few vehicles collecting info for mapping, and in very few locations) lidar also has limitations that camera+radar do not as far as what types of objects are detectable.
> (without going back to check myself, radar can detect solids but not liquids (water-based people/animals), lidar can see solids but is impeded by liquids (weather), sonar can see solids and liquids, camera can see both solids and liquids).
> lidar is also higher cost, not-aero dynamic, not attractive, etc... so if there is an alternate system that can provide the same or better results, why choose lidar?


One obvious advantage with lidar is being able to precisely detect how far away things are. If you think about taking the camera image and then attaching a distance to every pixel, that would really help tell where all the objects are and their trajectories. You wouldn't have shadows or overpasses causing it to slam on the brakes, or trouble distinguishing a semi from the sky. You wouldn't have the cars jumping around all over the screen like they do on v9. It cannot replace cameras, as you say. But I think Tesla would use it if it were available at a reasonable cost. Do any other manufacturers put lidar on a consumer car yet?


----------



## MelindaV (Apr 2, 2016)

M3OC Rules said:


> One obvious advantage with lidar is being able to precisely detect the distance things are away.


unless it's snowing/foggy/raining


> Do any other manufacturers put lidar on a consumer car yet?


nope, only private fleets are using lidar (at least to date)


----------



## M3OC Rules (Nov 18, 2016)

garsh said:


> This is why I continue to beat this drum.
> 
> *AUTOPILOT IS NOTHING BUT FANCY CRUISE CONTROL.*
> 
> ...


That was yesterday...


----------



## Rick Steinwand (May 19, 2018)

MelindaV said:


> In tesla's world San Ramon is toward the top of the list. Imagine you are going thru a small town in North Dakota and expecting it to react like it does in the Bay Area


As a ND resident, I was thinking of posting my experience. 

I primarily use it for road trips to my home town, which is about 50 miles one-way on two-way roads. The only place I couldn't engage AP was a newer paved road (primarily used by farmers) that had no speed limit recognized. So except for that 10 mile stretch, I've found it worked on all roads. One time I did have a phantom brake event, when I saw nothing obvious for it. Other times I saw lots of discolored areas of blacktop road where it could have had issues and it worked fine.

I do have two areas where the speed limit changes from 55 to 40 for about a half-mile near small towns and usually the vehicle speed is unchanged and I'm driving 60 mph in a 40 zone. One time it detected this and braked too hard to slow to that speed, where I would have preferred more of a coast down to 40.

TBH, I have more complaints about the auto high-beams than AP.


----------



## garsh (Apr 4, 2016)

ADK46 said:


> Does the Waymo scheme rely on Lidar? I have not encountered the answer to a question I've had: what makes Tesla think they can get by with just cameras when other efforts involve a big wart on the roof? The flip answer is that humans drive pretty well with just two cameras, sometimes just one, but there must be a more serious answer.


No, that's not a flip answer. That's the serious answer.

All road markings and all road signs are designed to be seen and interpreted in the visual spectrum. Therefore, visible light *must* be a part of any vision system for self-driving. We also know that 3D vision (including depth perception) is possible with visible-light cameras simply by employing more than one. Remember, Elon likes to break things down to first principles when solving a problem. Lidar is expensive, and this problem _should_ be solvable using nothing but visible-light cameras.
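For what it's worth, the textbook stereo relation behind that two-camera depth claim (a standard formula, not anything specific to Tesla's cameras; the numbers below are made up):

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    # Two cameras a baseline B apart see the same point shifted by a
    # disparity d (in pixels) between the two images; with focal length
    # f (in pixels), the point's depth is Z = f * B / d.
    return focal_px * baseline_m / disparity_px

# Made-up example: f = 1200 px, cameras 0.3 m apart, 12 px of disparity.
print(stereo_depth_m(1200, 0.3, 12))  # 30.0 metres
```

The catch is in the denominator: disparity shrinks with distance, so small matching errors at long range translate into large depth errors - one reason the camera-vs-lidar question is not settled by the formula alone.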


----------



## Dr. J (Sep 1, 2017)

Rick Steinwand said:


> TBH, I have more complaints about the auto high-beams than AP.


Those you can turn off. I also found them very annoying, certainly not an improvement on my ability to manage high beams.


----------



## garsh (Apr 4, 2016)

Dr. J said:


> Those you can turn off. I also found them very annoying, certainly not an improvement on my ability to manage high beams.


I use them, but I only turn them on when on mostly-empty local roads. It's been a workable compromise for me. Mostly because turning them on-and-off just requires a tap of the stalk - you don't have to use the screen.


----------



## ADK46 (Aug 4, 2018)

Two differences between auto-dimming headlights and fully manual: 1) you can treat auto-dimming as a backup to your own better judgement and possible inattentiveness (good) and 2) the damn thing dims for huge green signs on the side of a straight road, and other things that are clearly (?) not cars (very annoying). I will soon make a final cost/benefit determination, but I'll probably be turning it off.

Regarding my "flip" answer, I might have phrased it differently to reflect my own opinion: I think it is a persuasive argument that Lidar is not required, subject to the limitations of machine vision. That's probably why Waymo and others went down the Lidar path: when they began, machine vision was deemed incapable of quickly figuring out the exact locations of baby carriages and other things.

Edit: I may not be using the term "machine vision" correctly - I meant the analysis of camera images only.


----------



## ADK46 (Aug 4, 2018)

NY Times article today: https://nyti.ms/2R6wrMn

Superficial, of course, but touches upon some things we have discussed.


----------



## Rick Steinwand (May 19, 2018)

garsh said:


> I use them, but I only turn them on when on mostly-empty local roads. It's been a workable compromise for me. Mostly because turning them on-and-off just requires a tap of the stalk - you don't have to use the screen.


I only use them for rural driving. Don't want to hit a deer.


----------



## sdbyrd79 (Nov 28, 2017)

Nick's Tesla Life said:


> I had a close call where V9 Autopilot steered left going through an intersection on a local road. I know, I know, we shouldn't be using AP on local roads, but I made this video to show people a potential problem with V9 AP.


That pretty much sums up my thoughts on v9 - for that type of scenario. v8 would have had no issues with that type of "unsupported" scenario, but v9 is doing more complex decision making, which leads to more variance and unpredictability on these secondary roads. Bottom line: secondary "unsupported" roads have definitely regressed (which I'm totally fine with), but for those that have driven thousands of miles on those same "unsupported" roads with v8, you really need to be super careful when you upgrade to v9. That's all I have to say.


----------



## PNWmisty (Aug 19, 2017)

sdbyrd79 said:


> Bottom line is secondary "unsupported" roads have definitely regressed (which I'm totally fine with) but for those that have driven thousands of miles on those same "unsupported" roads with v8, you really need to be super careful when you upgrade to v9. That's all I have to say


My experience has been that v9 has more capability on secondary roads containing sharper curves. I think what's going on in Nick's video is that the road has some bad elevation changes from the right to left tire tracks (notice the car rolling severely from side to side). I think this kind of uneven road is outside the current abilities of EAP. I think when the car rolled, it remapped the oncoming lane as the current lane (when the road lines disappeared for the intersection). Obviously an area that needs work. Be extra cautious on roads that cause the car to roll from side to side.


----------



## ADK46 (Aug 4, 2018)

I see some roll to the car from the sudden steering input, but no elevation change. There seems to be a bump in the road beforehand, but the bad steering happens later. I think the cause is the sudden absence of any lines, though it is just for a short distance. The double yellow line is very prominent just beyond, and the side lines are good, too. 

How did the neural net get trained to go sharply left when presented with a view like this? If it has not been given any training for a road like this, it should not be possible to invoke it, of course.


----------



## Nick's Tesla Life (May 19, 2018)

garsh said:


> @Nick's Tesla Life , @undergrove, @joelliot
> 
> If you don't yet know, please realize that Autopilot should only be used on limited-access, divided highways. It doesn't handle intersections, stop signs, red lights, or lack of painted lines marking your lane.
> 
> If you do know this, and you're just experimenting, then PLEASE be sure to state this in any videos or posts you make. Explain to your viewers/readers that you are merely trying it out in an unsupported situation to see how it reacts. There are too many people in the world who believe that Autopilot == Self Driving, and they'll erringly believe that it's a failure when seeing/reading these kinds of stories.


I completely agree, and that's why I put "This video is meant for educational purposes only and please don't use AP on local roads" in the description and within the video!


----------



## Nick's Tesla Life (May 19, 2018)

Nick's Tesla Life said:


> I had a close call where V9 Autopilot steered left going through an intersection on a local road. I know, I know, we shouldn't be using AP on local roads, but I made this video to show people a potential problem with V9 AP.


Hey, an update on this --- I went through the same intersection again and it did fine. Wonder if it learned??? I was driving at 45 instead of 50. I saw the Tesla think it was a curve again on the screen for a split second this time, much quicker than before, but the steering stayed straight.


----------



## garsh (Apr 4, 2016)

Nick's Tesla Life said:


> Hey, an update on this --- I went through the same intersection again and it did fine. Wonder if it learned???


Were you following a car this time? If so, it probably just locked onto the car in front.


----------



## garsh (Apr 4, 2016)

ADK46 said:


> If it has not been given any training for a road like this, it should not be possible to invoke it, of course.


It's impossible to *not* invoke the neural net. This is how the car decides where to steer. The only other possibility is to disengage Autosteer. It's possible for a net to provide bad output. But unless they also train the net to produce a "turn off Autosteer" output, you're stuck with whatever it produces. And turning it off means that you've replaced Autosteer with "random steer", which won't be better.


----------



## ADK46 (Aug 4, 2018)

garsh said:


> It's impossible to *not* invoke the neural net. This is how the car decides where to steer. The only other possibility is to disengage autosteer. It's possible for a net to provide bad output. But unless they also train the net to produce a "turn off autosteer" output, you're stuck with whatever it produces. And turning if off means that you replaced autosteer with "random steer", which won't be better.


I'm saying that the system should not permit drivers to invoke auto-steer on types of roads for which it is not trained. From a certain perspective, this is obvious. It's not the only perspective, so we're having an interesting discussion.


----------



## changsteer (Sep 7, 2017)

sdbyrd79 said:


> That pretty much sums up my thoughts on v9 for that type of scenario. v8 would have had no issues with that kind of "unsupported" scenario, but v9 is doing more complex decision-making, which leads to more variance and unpredictability on these secondary roads. Bottom line: secondary "unsupported" roads have definitely regressed (which I'm totally fine with), but for those who have driven thousands of miles on those same "unsupported" roads with v8, you really need to be super careful when you upgrade to v9. That's all I have to say.


When I was on v8, identical behavior happened to me, too. The only difference is the intersection I went through was bigger than Nick's. Autopilot got through the intersection fine one day when there was a car to follow, but got confused another day when there was no car ahead of me to follow. I haven't had a chance to test the same intersection on v9 yet. It'll be interesting.


----------



## undergrove (Jan 17, 2018)

I want to report further information on the swerving I reported earlier in this thread:

https://teslaownersonline.com/threa...or-in-unsupported-situations.9297/post-166802

I was testing Autosteer going through an intersection on a two-lane country road near my home. *I fully understand that this is not intended to be supported in the current state of EAP,* however there was no traffic, and I felt it safe to try this at moderate speeds: 20-30 mph.

What I observed was the car swerving slightly to the left as it entered the intersection and swerving back as it left. Since it did not do this at similar intersections elsewhere, I examined the intersection more closely--it is near my home and I frequently pass it on walks.

It turns out the truck painting the center line had itself swerved slightly to the left as it entered the intersection and then back as it left, so the center line curved slightly left just before the intersection and back on center on the other side. So the behavior of Autosteer was logical, but it indicated that the system was not looking very far ahead to see that the road actually continued in a straight line. The swerving was more pronounced at 20 mph than at 30 mph.

This was with v9.0 39.7.1

We have since been upgraded to 42.3, and I can report that our M3 now passes through this intersection with no swerving at 20-30 mph.

A definite improvement from 39.7.1. This is consistent with my general impression that EAP is better and smoother in 42.3, although we still do not have NoA.


----------



## garsh (Apr 4, 2016)

undergrove said:


> *I fully understand that this is not intended to be supported in the current state of EAP,*


Thanks for including that in your description. It's important that when reading these sorts of posts, people who are less familiar with autopilot are made aware that Autopilot is not yet meant to handle these situations.


----------



## sdbyrd79 (Nov 28, 2017)

I've also noticed that 42.3 (and time) have made big improvements, leading to less swerving than the initial v9 release. I would go as far as to say it has not only regained my confidence, but really excels in lots of "unsupported" scenarios, topping what v8 could do for sure. I've been downright impressed to see it glide through questionable intersections and badly-painted roads with the latest updates, all while my hands are on the wheel ready to take over, of course! It's amazing what a difference a month can make.


----------



## Scubastevo80 (Jul 2, 2018)

My Navigate on Autopilot has a long way to go on NJ roads. Taking most exits is a "slam on the brakes" type of affair, and in some cases it won't get over onto the offramp. My biggest complaint, though, is the auto lane change ping-ponging I have to deal with daily. There are two scenarios I seem to encounter often: (1) if another car is a good ways back and decides to accelerate even a bit, my car flies back into the original lane; (2) if the car begins to move into a faster lane, but the car ahead in that faster lane begins to slow down, the Model 3 freaks out and, instead of adjusting speed and moving over, flies back into the original lane. I also wish it would just stop after one failed attempt instead of trying again and failing. I must look like the drunk electric car driver, and I'm just waiting to have to explain this to the police if they see this behavior.


----------



## ateslik (Apr 13, 2018)

ADK46 said:


> I'm sure this is a comically simple view of the problem. I'd love to bump into some AP engineers in a bar....


It's more likely that if you saw some AP engineers in a bar that they'd bump into you!

haha!


----------



## MelindaV (Apr 2, 2016)

ateslik said:


> It's more likely that if you saw some AP engineers in a bar that they'd bump into you!
> 
> haha!


more like veer slightly toward you, then adjust and continue on their way


----------



## ADK46 (Aug 4, 2018)

MelindaV said:


> more like veer slightly toward you, then adjust and continue on their way


Nevertheless, better keep both hands on your glass.


----------



## tivoboy (Mar 24, 2017)

Is there a thread somewhere that covers unexpected and unintended braking while traveling at speed on a highway? A supported highway, no less, where the car suddenly and briefly applies the brakes unnecessarily?


----------



## wawam3 (May 2, 2018)

Mod: Please move/delete if not the right place to post.

Be careful while on autopilot, and make sure to keep your eyes looking forward.


Autopilot Rear Ended a Truck


----------



## garsh (Apr 4, 2016)

wawam3 said:


> Mod: Please move/delete if not the right place to post.


Thank you! Done! 

I encourage everybody to watch this video. This is a situation that I would have expected autopilot to handle, and it didn't. Always be paying attention. It isn't self-driving yet.


----------

