# I don't think this was the intended action



## MJJ (Aug 7, 2016)

We have been taking this offramp a couple times a month with our Tesla for 2 1/2 years now. We have only recently (last 2 months or so) been having this particular problem. It is highly repeatable in various conditions (speed, light, etc). I let it go farther than usual this time to see what the outcome would really be and I do think we would be the proud owners of a traffic sign if I hadn't intervened. Has anyone else experienced this behavior on a bus pull through?


----------



## Long Ranger (Jun 1, 2018)

I have to admit, on my first viewing I was a bit confused by the road markings. Do I go left or do I go right? But splitting the difference is never a good solution!


----------



## jsmay311 (Oct 2, 2017)

That’s pretty bad. There really shouldn’t be too much ambiguity about which way to go when there’s a solid line on the left.


----------



## MJJ (Aug 7, 2016)

Was able to make another pass this weekend. Went slower. Did not go any better.


----------



## JasonF (Oct 26, 2018)

Reminds me of the I-4 and Lee Road exit here in Orlando, which, appropriately enough, is how you get to the Tesla service center.

The exit is constructed so there are three left-turn lanes. Two of them are obvious - but the third one is what you need to get to Wymore Road, which goes to the Tesla service center. If you follow the first link below, you can _kind of_ see a sign above the exit ramp explaining it, but very badly - you can really only read that sign if you come to a complete stop on the exit ramp.

The second link shows what happens if you follow your instinct - or the GPS, or possibly even FSD (I don't have it, so I can't check) and take the two obvious left turn lanes. You end up trapped behind an island, unable to turn left at Wymore.

The exit: 
https://www.google.com/maps/@28.603...4!1s_X0EFli0Nn3MPf9Muyp_CA!2e0!7i16384!8i8192
What happens if you use the wrong left turn lane:
https://www.google.com/maps/@28.605...4!1sAPG7xEBQ5RnVY-dkavIk0g!2e0!7i16384!8i8192
End result is a ton of people turn left from behind the island. And quite often, there is a cop parked in the office parking lot just beyond the turn, who will happily ticket you and tell you that you should have stopped to read the sign on the exit ramp.

Yes, that last part makes me pessimistic that badly designed traffic directions like this will ever be corrected - the city/county asks the police if it's causing any problems, and the police say no, it's completely clear as long as people stop to read the sign (because they get to ticket the people who don't).


----------



## lance.bailey (Apr 1, 2019)

Yep, I have a few spots where the road splits - some in the 35mph/60kmh range - and what used to be navigated successfully is now a shift left and right, as if the car is deciding, then down the middle until it veers over to the choice it made. Bad car.

Taking an off ramp last week (60mph/100kmh) in AE (auto everything), the car started to exit right as expected, but then veered left back into the lane and directly at a traffic sign. Very bad car.


----------



## evdude88 (Aug 15, 2021)

ouch that could've gone badly. I believe they improved the software since then though


----------



## MJJ (Aug 7, 2016)

evdude88 said:


> ouch that could've gone badly. I believe they improved the software since then though


Nope, same thing, at 25 mph, on the most recent update. On the same trip autopilot drove off the edge of a sharp but well-lined corner on Lucas Valley Road. To be fair, these are scenarios that will fall under the control of FSD rather than the controlled-access suite. Elon is too busy with his rocket to get FSD pushed out to us common folk, though.

Before full AI, crossing a line seemed unthinkable.


----------



## TomT (Apr 1, 2019)

As a matter of course, I always take the car off of autopilot when approaching such intersections. I simply don't trust it enough at this stage of the software to make the right decision, and would rather be safe than sorry...


----------



## MJJ (Aug 7, 2016)

TomT said:


> As a matter of course, I always take the car off of autopilot when approaching such intersections. I simply don't trust it enough at this stage of the software to make the right decision, and would rather be safe than sorry...


I still somewhat hold the view that challenging the software helps generate useful data. I think an attentive driver can allow a failure to develop and still recover safely. Not in all cases - and in those cases I wouldn't let the failure play out.

Elon says every intervention is meaningful. However after literally years of having to make the same intervention at the same place, day after day, I question whether the data is being mined even for the low hanging fruit.


----------



## MJJ (Aug 7, 2016)

Finally got a chance to negotiate this corner with FSDB and it handled it perfectly. Some things do get better.


----------

