# Destination Ditch: Tesla Driver Blames Autopilot for New Jersey Crash



## Needsdecaf

Saw this article on The Truth About Cars. This accident happened close to where I used to live, although that was quite a few years ago.

https://www.thetruthaboutcars.com/2...topilot-for-new-jersey-crash/#comment-9694702

From the article, it sounds as if the car split the difference between the exit ramp and the travel lanes and plowed into the grass and some signs or whatever beyond the marked gores on the ground. See pic below.

Wonder what really happened? I've yet to see EAP, even when set to NOA, make an error this egregious. That said, I would never run NOA on this road. I might run EAP, but while paying pretty careful attention.

And certainly the claim of "Autopilot wouldn't let me take control" sounds quite bogus.

It's a shame that drivers not doing what they're supposed to will lead to more crashes blamed on Autopilot. I will state, as I did in the comments on the article, that I feel Elon oversells the capabilities of Autopilot. It's a fantastic system. But I'm not envisioning a day anytime soon where the car drives itself to pick me up, or drives to its destination completely without my guidance, as if I were asleep in the car. Not based on my experiences with EAP and NOA, that's for sure. But issues and crashes like this don't help the situation as a whole.


----------



## Bigriver

Thanks for providing a clear view of the location of that specific incident. Seeing the location lends a wee bit of credibility to the story, as EAP often doesn't handle the addition of turn-off lanes very well. It wants to follow the right lane, but then realizes it shouldn't and jerks back to the left... yeah. However, I really, really, really don't believe that EAP wouldn't let the driver take control. If I thought that were true, I would abandon my use of EAP. I do wonder, though, whether the accident mitigation system was responding to some real or perceived threat. I think it's great that Tesla should be able to recreate the details of the accident, complete with video from all sides.

I don't see this accident as all that similar to the California one that it is being compared to. That was in a construction zone where lanes really weren't marked, and there is the problem of the former and current lines. I drive through construction zones all the time and would NEVER let EAP (in its current form) try to maneuver that mess.


----------



## JasonF

I noticed with all of these Autopilot related crashes that it's fairly certain that the driver wasn't even looking. Reading, texting, watching a movie, maybe, but not looking. I haven't yet seen video of one where either the crash happened too fast for the driver to react, or the driver was unable to correct.

From what I read in this case, it's highly likely the driver was busy reading/texting/watching a movie and only looked up when fixed objects were looming above them, and crashed before they could react.


----------



## mswlogo

Well, the whole AutoPilot thing is really tricky. Because it's so "good" a lot of the time, you can't help becoming complacent, and then it suddenly screws up.

I would never use it on a road like that; in its current form it's too risky. Just highways, and I avoid right lanes and try to stay in the middle lane.
That said, it's surprising it had an issue with that divider. Maybe it was the bad paint on the lines just before it.
You can predict by now which road splits are risky, and you generally have plenty of time to be attentive.

The worst thing right now is phantom braking. And if someone's on your butt when that happens, you're screwed.

Labeling it "Beta" and "Requires User Attention" is kind of a cop-out.

I wouldn't mind so much if we saw more incremental progress. Like, why is the phantom braking issue not resolved, or at least improved, yet? It's that serious.
Being attentive isn't gonna be quick enough for the phantom braking.

I kind of can't blame someone for blaming AutoPilot. It's so easily accessible, and folks are used to having things be thoroughly validated if they have easy access to them. Especially on a car.
I'm sure many people flip it on thinking, well, I'm sure they wouldn't give me access if it could kill me. Well, it can.

Don't get me wrong, I love AutoPilot and what Tesla is doing. But sometimes they kind of treat it like a new app on your phone, not an 80 mph missile.

I'm hoping the "gap" in AutoPilot progress is because they are doing major refactoring to make it a lot better. If it were me, I would have put more restrictions on the conditions it can be used in until it's improved.
There are gonna be some major accidents, and it doesn't help their PR. Much better to have customers screaming for improvements than dead ones.

I do use it, I'm a geek like many here, but I know that any time I use it, it's risky. Even on a clear summer day, in the middle lane, with clear markings, it can goof up faster than you can react.


----------



## JasonF

As far as I see it, Tesla's biggest problem with autopilot is that they _charge extra for it_, and they charge a _lot of money_. No, I'm not starting that "autopilot should be free" thing again.

Except I kind of _am_. Because the problem they've created is that once a customer has paid $5000 (or $7200 in some cases) for Autopilot, they can't say "we're disabling it over the air until we can fix it" and then shut it off for an indeterminate amount of time. No, they would end up having to refund everyone who asks for one, and that's a lot of money for a struggling company. If AP had been free, they _could_ get away with disabling it for a while, just like when Apple turned off group FaceTime (which they could do because it's not an extra-cost option).

So when there's an issue like phantom braking, they ask everyone to be careful, and hang on a little bit longer until they can figure out the issue and solve it. Most people will, as long as they keep calling Autopilot a "beta". But someday that patience will run out, and it won't be because of a crash.

A secondary reason for why charging for autopilot might be bad for Tesla is that people demand way more of a product they paid extra for, and the public tends to back them up on those demands. Some of them might even insist that it's perfectly reasonable to use Autopilot to catch up on their reading, or binge on Star Trek Discovery, because they paid $5000 for it, Tesla better damn well make it possible. If AP is free, it doesn't appear to be a _special feature_. It's just yet another driving feature people use incorrectly.

And finally, because Autopilot is being sold as a separate feature, it's not thought of as an integral feature of the car. If the NHTSA decides against Tesla in a case, that creates a high likelihood that they could order Tesla to disable Autopilot permanently in all vehicles, because they _know_ Tesla has the ability to do so without hardship. If it were a free part of the car software, Tesla could make the case that it's integrated with the car's systems and would be a hardship to remove quickly.

On that topic: U.S. culture is rife with examples of "shut it down until you can absolutely prove it will never harm anyone again". I fully expect someday to see Europeans laughing at Americans because by law we still have to drive our own cars, and they've been self driving for years.


----------



## Needsdecaf

Bigriver said:


> Thanks for providing a clear view of the location of that specific incident. Seeing the location lends a wee bit of credibility to the story as EAP often doesn't handle very well the addition of turn-off lanes. It wants to follow with the right lane, but then realizes it shouldn't, and jerks back to the left... yeah.





mswlogo said:


> Well the whole AutoPilot thing is really tricky. Because it's so "good" a lot of the time you can't help become complacent, and then it suddenly screws up.
> 
> I never would use it on a road like that, in it's current form, too risky. Just highways and I avoid right lanes , I try to stay in the middle lane.
> That said, it's surprising it had an issue with that divider. Maybe it was the bad paint on the lines just before it.
> You can predict now which road splits are risky and you generally have plenty of time to be attentive.


You both make good points. But in this case, I can't see how AP would have led to this crash. Take a look at the photos below. The first one shows way, way back (like 1/4 mile) where the right lane splits and picks up an exit lane. This is REALLY far before the divide. Here I can see the Tesla maybe wanting to follow into the right lane, but then it would be in the middle of two sets of dashed lines. Even if it didn't pick one, the driver certainly would have had enough time to react prior to the actual off-ramp.

Second photo is closer to the off ramp than the first.

Bottom line, I'd be surprised if AP really caused this accident.


----------



## gary in NY

I suspect driver inattention more than any other explanation. Still, Tesla should review the logs, and maybe they already have since they issued a statement regarding this crash. 

It would be interesting if a TOO member in that area could drive this route and check out EAP’s reaction. I’m a little too far away.


----------



## mswlogo

Needsdecaf said:


> You both make good points. But in this case, I can't see how AP would have led to this crash. Take a look at the photos below. The first one shows way, way back (like 1/4 mile) where the right lane splits and picks up an exit lane. This is REALLY far before the divide. In this case I can see the Tesla maybe wanting to follow into the right lane, but then it would be in the middle of two sets of dashed lines. If it didn't pick one, the driver certainly would have had enough time to react prior to the actual off ramp.
> 
> Second photo is closer to the off ramp than the first.
> 
> Bottom line, I'd be surprised if AP really caused this accident.
> 
> View attachment 21890


I can see RED flags immediately in that first photo for AutoPilot.

First, the lane opens up double wide. AutoPilot will center itself in this wide lane, then suddenly find itself directly over the dashed line. AP will then jerk to one side or the other to get back into a lane.

Nav on AutoPilot would not be operating there.

Bet the accident was on his left side, because to a driver at his left rear it would look like he was moving to take that exit. The driver starts to pass and wham, AP jerks back into the original lane.

And it will happen QUICK. I would not blame attentiveness here so much as awareness that AutoPilot is horrible in those exact situations and should NOT be used in them in the first place.


----------



## Needsdecaf

mswlogo said:


> I can see RED Flags immediately in that First Photo for AutoPilot.
> 
> First the lane opens up double wide. AutoPilot will center itself into this wide lane, then suddenly it will find itself directly over the dashed line. AP will then Jerk to one side or the other to get back in a lane.
> 
> Nav On AutoPilot would not be operating there.
> 
> Bet the accident was on his left side. Because a driver on his left rear will look like he has gestured to take that exit. Driver starts to pass and wham AP jerks back into the original lane.
> 
> And it will happen QUICK. I would not blame attentiveness on this, but awareness that AutoPilot is horrible in those exact situations and should NOT be used in situations like that in the first place.


That's not what happened in this accident. The car proceeded all the way to the exit ramp, went straight through the middle of the gores (the striped lines) between the ramp and the travel lanes, and then proceeded up onto the grass, clipping some manner of poles, signs, etc. From the article:

_According to a police report cited by NJ.com, the Tesla (model unspecified) was operating in Autopilot mode as it travelled down Route 1 in Middlesex County. As it neared the Adams Lane exit, North Brunswick police claim the vehicle "got confused due to the lane markings" and ultimately ended up off the road, taking out several signs in the process._

"The vehicle could have gone straight or taken the Adams Lane exit, but instead split the difference and went down the middle, taking the vehicle off the roadway and striking several objects at the roadside," the police report states.

I agree that Autopilot might have gotten confused at the initial lane split. But that happened WAY before the actual ramp. And as you said, Autopilot would have chosen one lane or the other and jerked into it quickly. At that point, there's still 1/4 mile or so to go before the crash site, and the car would be in a lane.

That doesn't explain why it would have crashed where the exit ramp broke off, as that ramp is exit-only and the lane leading to it splits off way back, where I showed in the picture. If by the "first picture" you mean the first one in the thread showing the ramp, the actual point where EAP would have gotten confused was well prior to that.


----------



## Love

(In a not too distant land, on my way home from work)
MEEEP MEEEP!!!!
"Outta my way, JERKASS, I own a Tesla and am speeding home! Road type and rules be damned! Time to engage autopilot..."
(BOONG BING)
"...zzzzzzz."










"TESLA'S FAULT!!!!"


----------



## Needsdecaf

Lol.


----------



## JWardell

This is a road with traffic lights. Autopilot should NOT be in use there. It is completely the driver's fault, AP or no AP. If you use AP on a road with lights and stop signs, then it's entirely up to you to monitor its every move closely. This is another blatant case of driver misuse, and the driver was so distracted that they didn't notice something wrong until they were already in the grass. At that point, hitting the brakes or jerking the wheel will just make you slide. Grass has about the same traction as ice.

I got a breaking news notification about this from ABC News yesterday, as if it were some national tragedy. Shame, shame on the media for pushing this crap.


----------



## Love

I think @JWardell said it much more gracefully than I did.


----------



## mswlogo

Needsdecaf said:


> That's not what happened with the accident. The car proceeded all the way to the exit ramp, went straight through the middle of the gores (the striped lines) between the ramp and the travel lanes, and then proceeded up onto the grass clipping some manner of poles, objects, etc. From the article:
> 
> _According to a police report cited by NJ.com, the Tesla (model unspecified) was operating in Autopilot mode as it travelled down Route 1 in Middlesex County. As it neared the Adams Lane exit, North Brunswick police claim the vehicle "got confused due to the lane markings" and ultimately ended up off the road, taking out several signs in the process._
> 
> "The vehicle could have gone straight or taken the Adams Lane exit, but instead split the difference and went down the middle, taking the vehicle off the roadway and striking several objects at the roadside," the police report states.
> 
> I agree that Autopilot might have gotten confused at the initial lane split. But that happened WAY before the actual ramp. And as you said, Autopilot would have chosen one lane or the other and jerked into it quickly. But at that point, there's still 1/4 mile or so to go before the crash site, and the car would be in a lane.
> 
> Doesn't explain why it would have crashed where the exit ramp broke off, as that ramp is an exit only and that lane leading to the exit was way back, where I showed in the picture. If by the "first picture" you mean the first one in the thread showing the ramp, the actual point where EAP would have gotten confused was way prior to that.


I see. Totally the driver's fault. It's not gonna take that turn.

Beta should come with a real quiz: you'd need to clearly show you understand its limits before it enables.

It could be pure, genuine misunderstanding/lack of understanding, or distraction/abuse (knowing its limits). But I can see some drivers being very confused about why Nav on AutoPilot can take an exit elsewhere but won't here, and about where it's OK to use, since the car allows it.


----------



## Needsdecaf

Lovesword said:


> I think @JWardell said it much more gracefully than I did.


And he's right.


----------



## Love

To add something besides an OT attempt at humor to this thread, I want to point out (or reiterate actually as I've stated this elsewhere before I believe) that at delivery of both vehicles my wife and I own, the Enhanced Auto Pilot (EAP) feature was OFF. Our Tesla delivery specialist mentioned each time that they're not allowed to (or not supposed to, perhaps) turn this feature on for the customer. Instead they assisted us in navigating to the screens in the menu so that we could enable EAP ourselves if we wanted to. Each time as we were navigating through the screens to turn it on, they were giving us the instructions and disclaimers on Tesla being at LEVEL 2 of autonomous driving.* They wanted it to be VERY CLEAR that if you are activating EAP, that you know and understand everything about it and its current iteration limitations.

*[NOTE: By chance if someone happens on this that doesn't know, there are 5 levels of autonomous driving.. the 5th, and highest being that the car drives itself without human intervention. Info on self driving levels.]
Per the link above:
Level 2 ("hands off"): The automated system takes full control of the vehicle (accelerating, braking, and steering). The driver must monitor the driving and be prepared to intervene immediately at any time if the automated system fails to respond properly. The shorthand "hands off" is not meant to be taken literally. In fact, contact between hand and wheel is often mandatory during SAE 2 driving, to confirm that the driver is ready to intervene.

I'm sympathetic to people for their situation(s) after an accident, especially some that we've all read about that have involved loss of life. However, I'm baffled by the decisions people are making on a daily basis to entrust their lives, the lives of their most precious cargo (their loved ones), the untold amount of lives that surround them over the course of a drive, to a machine/system/feature that is so clearly described, defined and marketed as 40% ready to handle driving for you.

This isn't a knock on EAP or those that use it. I use it myself. I'm talking about those outliers that so blatantly ignore all of what is written above... all the screen warnings and chimes the car itself gives about maintaining "hands on the wheel" and all the history of accidents up to and including DEATH that we read about where the drivers are not even paying attention. All of that information is passed down at delivery, is pasted everywhere and is just plain old common sense and yet there are those that just put on EAP and ignore the road.

Maybe some more info will come out on this particular incident that either absolves the driver or Tesla. But to me, from appearances in that first picture posted by @Needsdecaf , I'm pretty sure that the driver wasn't paying attention.


----------



## NJturtlePower

Yeah... so this has been the hot topic on the local Tesla Club NY/NJ FB page since the accident... more interesting is how the story has changed and evolved, with the driver now walking back his claim about being "unable to take control" after Tesla released an official statement.

The consensus is that this guy is a CLOWN and, likely in an attempt to avoid being ticketed at the scene, decided to pull the "AP card," aka the "my car did it" excuse. Everyone local agrees that AP should NOT be used in this area, which beyond being full of lights, ramps, etc., is also under construction in stretches.

Also disappointed in how the media handled it and continued to hype it, given there were no injuries or other parties involved. But we all know Tesla ALWAYS has a target on its back. Can't tell you how many co-workers and friends stopped me to say, "hey, did you hear about the Tesla..." Yeah, I heard.

Local Article with updates and video: http://newjersey.news12.com/story/3...t-crashed-on-highway-struck-number-of-objects


----------



## Needsdecaf

NJturtlePower said:


> Yeah... so this has been the hot topic on the local Tesla Club NY/NJ FB page since the accident... more interesting is how the story has changed and evolved with the driver now taking back his part about the "unable to take control" after Tesla released an official statement.
> 
> The consensus is that this guy is a CLOWN and in an attempt to likely avoid being ticketed at the scene, decided to pull the "AP card" aka "my car did it" excuse. Everyone local agrees that AP should NOT be used in this area which beyond being full of lights, ramps etc, is also under construction in stretches.
> 
> Also disappointed in how the media handled it and continued to hype it being there were no injuries or other parties involved. But we all know Tesla ALWAYS has a target on its back. Can't tell you how many co-worker and friends stopped me to say, "hey did you hear about the Tesla....." yeah, I heard.
> 
> Local Article with updates and video: http://newjersey.news12.com/story/3...t-crashed-on-highway-struck-number-of-objects


Yeah, it's sad how Tesla is targeted. I can't tell you how many people in my office keep coming up to me asking how my car is doing, as if they expect it to have turned into fairy dust by now. Hopefully with time, and more and more on the road, people will stop that BS. Of course, Elon does have a big mouth and invites criticism.

Flemington, eh? I dated a girl from Glen Gardener for a number of years, and we made the trip down Rt. 31 often, as she grew up near there and went to Hunterdon Central. I miss that area, such an underappreciated part of the NY metro area. So close to NY and Philly, but when you're out there, it feels a world away. If I were still in that part of the country, I'd definitely be out in Hunterdon, Morris, or western Warren county.

In any event, yes, you'd have to be a clown to get in a wreck on that road with AP working, and a clown to have it running there in the first place.


----------



## NJturtlePower

Needsdecaf said:


> Flemington eh? I dated a girl from Glen Gardener for a number of years and we made the trip down Rt. 31 often as she grew near there and went to Hunterdon Central. I miss that area, such an under appreciated part of the NY Metro area. So close to NY and Philly but when you're out there, it feels a world away. If I was still in that part of the country, I'd definitely be out in Hunterdon, Morris or western Warren county.


Yup, love it out here, five years now! Grew up in Bergen County (northeast NJ), which at the time I thought was a quiet suburban area, but Hunterdon is like the Midwest of Jersey. I have a dairy farm right across the street (not far off Rt. 31) that I can watch from my living room, but I'm just an hour either way to Philly or NYC... meanwhile most people don't even believe there are any farms in NJ. I mean, we are the "Garden State," right? I'm sure people have called us worse....


----------



## jsmay311

Bigriver said:


> I don't see this accident as all that similar to the California one that it is being compared to. *That was in a construction zone* where lanes really weren't marked, and there is the problem of the former and current lines. I drive through construction zones all the time and would NEVER let EAP (in its current form) try to maneuver that mess.


I don't think that's accurate. It's true that a video was taken of the spot where the Mountain View fatality occurred, and there were in fact some orange signs off on the side of the road indicating an HOV lane was closed and "road work ahead," so maybe that's where you're getting this from, but I believe those would have been for possible upcoming construction on Highway 85 if you were to take the ramp.

I haven't found any discussion of the fatal crash mentioning that it was in an actual construction zone, despite the fact that that would have been a very relevant detail to the story. And the lane markings were not temporary, just faded. And the video I mentioned replicated the Autopilot failure that led to the earlier fatality, and it had nothing to do with changes to lane configurations due to construction, or construction cones or signs, or anything else associated with being a "construction zone."

(referenced videos of the Mountain View crash location)


----------



## jsmay311

Here's an aerial view of the location of this crash:
https://goo.gl/maps/VufCWmUjASv

Tesla's Owners Manuals say this about Autosteer:

_"Autosteer is intended for use only on *highways *and limited-access roads with a fully attentive driver. [...] Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present."_

This is literally a highway, and I would not consider it a "city street". Now I know many Tesla owners would argue AP shouldn't be used on this road (as evidenced above) since there are at-grade intersections and street lights (although there are also plenty of highway/expressway style on/off-ramps), and I wouldn't argue that they're necessarily wrong about that. But I bet many more Tesla owners _would_ use AP on this kind of road anyway. And I don't see anywhere that Tesla explicitly warns NOT to use AP on this type of road in the manual.

Bottom line: if Tesla wants to be more prescriptive about where Autosteer/Autopilot should NOT be used, and err on the side of safety, they should say so clearly in their manuals.


----------



## jsmay311

Needsdecaf said:


> That's not what happened with the accident. The car proceeded all the way to the exit ramp, went straight through the middle of the gores (the striped lines) between the ramp and the travel lanes, and then proceeded up onto the grass clipping some manner of poles, objects, etc.
> [...]
> I agree that Autopilot might have gotten confused at the initial lane split. But that happened WAY before the actual ramp. And as you said, Autopilot would have chosen one lane or the other and jerked into it quickly, as you said. But at that point, there's still 1/4 mile or so to go before the crash site, and the car would be in a lane.
> 
> *Doesn't explain why it would have crashed where the exit ramp broke off, as that ramp is an exit only and that lane leading to the exit was way back, where I showed in the picture.* If by the "first picture" you mean the first one in the thread showing the ramp. the actual point where EAP would have gotten confused was way prior to that.


Looks to me like the exit lane in this spot veers off too sharply for AP to handle at highway speeds (the speed limit here is 55 mph), unless AP anticipated the turn and slowed down on its own in advance.


----------



## Bigriver

@jsmay311, yes, upon watching those videos I agree the California accident was not in a construction zone. But as you note, the lanes had issues that could make them hard to distinguish. I encounter that a lot where I live, and it is usually a construction zone, which is probably why I made that erroneous connection. Regardless, I think it is always the driver's responsibility to be "supervising" the car, especially in those situations. If my car wrecks while I'm on Autopilot, it's my fault.


----------



## JWardell

jsmay311 said:


> https://goo.gl/maps/VufCWmUjASv
> 
> This is literally a highway


We have a big problem in the US: different regions have different names for the same things. That is most definitely NOT a highway if it has a traffic light, at least by the definition of a highway 'round here!

And of course we don't say "Highway 1" we say "Route 1." 
But of course as shown in your screenshot Google is smart enough to name things with the area's vernacular.

Tesla's documented use of "highway" certainly agrees with my definition. Autopilot is still not intended for use on roads with traffic lights; it's for controlled-access roads without lights and stop signs.

I do agree Tesla needs to better educate the simple-minded with a more in-depth video about Autopilot, except of course they would have to remake the video with every update and improvement.
Understanding or not, though, control of a vehicle is the driver's responsibility no matter what methods they use to operate it.


----------



## JasonF

I'm completely shocked at how some people have so little sense and self-preservation that they're going to play with technicalities in the rules and risk crashing their car because of it. "Well, technically, this is a highway"...What are you going to do, engrave that on your tombstone after you hit a pole face-first at 60 mph?

Back to the original subject, there are only three possible explanations for engaging Autopilot on a road like that: 1) the driver wanted to sit back and film Autopilot inevitably crashing for YouTube views; 2) the driver turned on AP to get some texting and/or Facebook and/or movie watching done, lost track of time, and completely missed the cue to turn AP off before the traffic light; or 3) the driver actually believed that AP is full self-driving and would take them home without any human participation (unlikely if they've had the car for more than an hour or so).


----------



## $ Trillion Musk

US-1 is definitely a highway, and it stretches the entire East Coast: https://en.m.wikipedia.org/wiki/U.S._Route_1

Have the Tesla log details about the accident been made public yet? The driver may have been a clown, but after reading through this thread I'm still wondering why EAP would have trouble in this circumstance. As a Jersey boy myself, I've seen much worse lane markings than these.


----------



## Love

jsmay311 said:


> Here's an aerial view of the location of this crash :
> https://goo.gl/maps/VufCWmUjASv
> View attachment 21919
> 
> 
> Tesla's Owners Manuals say this about Autosteer:
> 
> _"Autosteer is intended for use only on *highways *and limited-access roads with a fully attentive driver. [...] Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present."_
> 
> This is literally a highway, and I would not consider it a "city street". Now I know many Tesla owners would argue AP shouldn't be used on this road (as evidenced above) since there are at-grade intersections and street lights (although there are also plenty of highway/expressway style on/off-ramps), and I wouldn't argue that they're necessarily wrong about that. But I bet many more Tesla owners _would_ use AP on this kind of road anyway. And I don't see anywhere that Tesla explicitly warns NOT to use AP on this type of road in the manual.
> 
> Bottomline: if Tesla wants to be more prescriptive about where Autosteer/Autopilot should NOT be used and err on the side of safety, they should say so clearly in their manuals.


While I do agree that a name like "Autopilot" does invite some ambiguity, and I do appreciate your well-thought-out responses here, I have to slightly disagree, or at least offer a different take for what is _my_ "bottom line." Actually, having typed it below, I don't think you and I have differing views at all... but rather ones that could stand alone and support each other. One doesn't cancel out the other.

Bottom Line for me: At any given time while utilizing EAP (at its current level 2 of 5), regardless of road type, the driver needs to be ready to take over immediately. I do not believe this is the case here.

So, with our two bottom lines: Tesla could do a better job, and so could the driver.


----------



## MelindaV

jsmay311 said:


> Here's an aerial view of the location of this crash :
> https://goo.gl/maps/VufCWmUjASv
> View attachment 21919
> 
> 
> Tesla's Owners Manuals say this about Autosteer:
> 
> _"Autosteer is intended for use only on *highways *and limited-access roads with a fully attentive driver. [...] Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present."_
> 
> This is literally a highway, and I would not consider it a "city street". Now I know many Tesla owners would argue AP shouldn't be used on this road (as evidenced above) since there are at-grade intersections and street lights (although there are also plenty of highway/expressway style on/off-ramps), and I wouldn't argue that they're necessarily wrong about that. But I bet many more Tesla owners _would_ use AP on this kind of road anyway. And I don't see anywhere that Tesla explicitly warns NOT to use AP on this type of road in the manual.
> 
> Bottomline: if Tesla wants to be more prescriptive about where Autosteer/Autopilot should NOT be used and err on the side of safety, they should say so clearly in their manuals.


This may be a highway, but it is not limited access. It has cross traffic right in the image you posted.


----------



## jsmay311

MelindaV said:


> This may be a highway, but it is not limited access. It has cross traffic right in the image you posted.


Can't/Won't argue with that.

But, again, the relevant quote from the manual is: _ "Autosteer is intended for use only on highways *and *limited-access roads[...]"_

I don't want to get into a Bill Clinton-esque debate over what the meaning of the word "and" is... :tonguewink: but I'm pretty certain that that sentence does _not _actually warn against using Autosteer on non-limited-access highways.

If Tesla wants to say that Autosteer is *only* for use on limited-access roads, they need to relocate the compound adjective in that sentence: "_Autosteer is intended for use only on *limited-access* highways and limited-access roads[...]" :smirk: _

But I'm pretty sure a lot of thought went into precisely how to craft that sentence in the manual and that Tesla wrote it exactly as they intended. Otherwise, every country highway with even an occasional at-grade intersection would be off-limits.


----------

