# The Elephant in the Room: Autopilot and the NHTSA



## JasonF (Oct 26, 2018)

I read this article this morning:

> **NHTSA data shows Teslas using Autopilot crashed 273 times in less than a year**
> Honda had the second-highest number of crashes, with 90 over the same period.
> — arstechnica.com
Combined with other articles I've been reading about the NHTSA closely eyeing Autopilot as being a problem child, there is a rapidly increasing chance that they will take some action before Tesla can move to prevent it. And that action is likely going to be an order to disable Autopilot until it can be fixed _and proven not to cause crashes_.

The reason for this thread, though, is that I'm concerned about the path this might take immediately after that.

First possibility: Tesla is more invested in Full Self Drive than Autopilot. It's likely that if they're ordered to disable AP and start over, they'll decide that the development time is better spent on finishing FSD, and making _that_ the new Autopilot. If that occurs, will that mean those of us who paid for Autopilot (even on sale) will just lose what we paid? Or will Tesla do something like "We'll give you $2,000 off the $12,000 FSD package"?

Second possibility: Tesla would still focus on FSD, but then make Autopilot available as a "watered down" version. Unfortunately, that code requires at least HW3, and the old code will have been deprecated by the NHTSA order. Could that mean Tesla at some point says pre-HW3 cars can only keep Autopilot if we pay $1,500 to upgrade to HW3?

Third possibility: Because I need at least one semi-positive possibility: since the minimum price AP has ever sold at is $2,000, Tesla could still do as above (require an HW3 upgrade) but eat the hardware cost for those who paid, and ask only the small number of people who got free AP with HW2.5 to pay a discounted rate for HW3. And the new AP wouldn't really be AP anymore, but a limited FSD that behaves like Enhanced Autopilot. But would Tesla actually do that, or would they consider it watering down the mission of FSD too much?


----------



## shareef777 (Mar 10, 2019)

Why do you think a HW upgrade is required to change AP functionality? If anything, AP would be lowered to become just adaptive (or basic) cruise control, something even HW2 can support. At the end of the day the issue isn't with AP, it's with people THINKING AP is more advanced than it really is.

The way I see it, if the NHTSA forces Tesla to pull AP, Tesla will just tell everyone to talk to the NHTSA. They'll deflect blame (pretty much par for the course with Tesla).


----------



## Madmolecule (Oct 8, 2018)

People have no idea what to think. After four years I am still waiting for Tesla to DEFINE the following products. This is the standard phase in a Kickstarter campaign: when you can't deliver the product you sold, you must define what you really meant. As long as they keep it BETA they never have to define what the product is. I'm certainly not expecting what Elon has PROMISED over the years, but I sure would like to know what they intend to provide. I don't think that's asking too much. He has already figured out who he's gonna vote for, so now he can focus on what in the world we bought, and what he is currently selling. I think the only reason they stopped selling the Cybertruck in Australia is that Australia is probably against scam Kickstarter-type product sales.

I personally think Elon has been waiting for the NHTSA to step in before now, so they can blame them for why they can't provide the automated products they sold. But the reality is that this is ALL Tesla's responsibility: provide what they sold, or compensate the customers they misled.

This applies to:
FSD
Autopilot
Navigate on City Streets
Summon
Advanced Summon
Self Park
Cat Quest

They also need to put out a statement as to what the driver's responsibility will be during each of these modes. If we will always be responsible for keeping our eyes on the road and taking control at any second, then it is certainly not what I thought I was getting.

They also need to put out a statement on who they feel the responsible party is when the car is operated in an automatic mode. I know the answer is me with a beta product, but that is not what I paid for. I want to know the answer for when they deliver my product, which I'm sure will be any day now.


I feel we are the forgotten Tesla longs: the investors who bought the product, believed the hype, and made them a success. No, I'm not happy enough just because I got a reliable EV. I feel they even took my tax credit back in the day, because they lowered the price as soon as it went away.


----------



## francoisp (Sep 28, 2018)

It's not clear to me what's the scope of the NHTSA review.

As far as I know, Autopilot was never sold as a hands-off system. If a driver uses it in a responsible manner, there is absolutely no reason why AP would be the main cause of an accident. All the accidents I've read about in the news involved inattentive or inebriated drivers.

My only expectations are that when AP is engaged it will safely follow traffic and keep me in my lane 100% of the time without fault. If the NHTSA's review finds that AP has failed at this, then I would want it to come down hard on Tesla to get it fixed. Other than that the NHTSA can make recommendations to make AP better but it should leave Tesla alone.


----------



## Klaus-rf (Mar 6, 2019)

Even Tesla doesn't believe in AP/FSD. The Safety Score used to calculate the price of Tesla Insurance includes counts of "Forced Autopilot Disengagements". So when the human MUST take over because AP/FSD did something stupid and/or dangerous, the human gets penalized for it. Where is the trust? Certainly not in AP/FSD.



> As long as they keep it BETA they never have to define what the product is


One of the definitions of βeta version software is that it MUST be feature complete. Not all features need to be working 100%, but there must not be any new features added after βeta is declared. If features are still being added, then that's αlpha code.


----------



## JasonF (Oct 26, 2018)

francoisp said:


> My only expectations are that when AP is engaged it will safely follow traffic and keep me in my lane 100% of the time without fault. If the NHTSA's review finds that AP has failed at this, then I would want it to come down hard on Tesla to get it fixed. Other than that the NHTSA can make recommendations to make AP better but it should leave Tesla alone.


Most of the review started with Autopilot crashing into stationary vehicles, especially emergency vehicles. Note that nearly all of these would require the driver to either be not looking, or sleeping, because you can kind of tell when a vehicle with flashing lights is stopped directly ahead of you.

That means, for most purposes, that NHTSA might ban the use of Autopilot until Tesla can make sure it notices _every single time_ the driver takes their attention off the road. Stuff like eye tracking would become required. Or, alternately, that the car doesn't require the driver's attention at all, anymore. That's why I suggested maybe Tesla might scrap Autopilot in this case, and fully invest in FSD.


----------



## Klaus-rf (Mar 6, 2019)

JasonF said:


> Most of the review started with Autopilot crashing into stationary vehicles, especially emergency vehicles. Note that nearly all of these would require the driver to either be not looking, or sleeping, because you can kind of tell when a vehicle with flashing lights is stopped directly ahead of you.
> 
> That means, for most purposes, that NHTSA might ban the use of Autopilot until Tesla can make sure it notices _every single time_ the driver takes their attention off the road. Stuff like eye tracking would become required. Or, alternately, that the car doesn't require the driver's attention at all, anymore. That's why I suggested maybe Tesla might scrap Autopilot in this case, and fully invest in FSD.


Or maybe the driver(s) paid attention fully, expected AP to stop, slow down, or move around the emergency vehicle, and waited too long for AP to respond, IOW the driver waited too long before doing the infamous "*Forced Autopilot Disengagement*".

In any case - AP aside - shouldn't TACC have stopped?


----------



## shareef777 (Mar 10, 2019)

JasonF said:


> Most of the review started with Autopilot crashing into stationary vehicles, especially emergency vehicles. Note that nearly all of these would require the driver to either be not looking, or sleeping, because you can kind of tell when a vehicle with flashing lights is stopped directly ahead of you.
> 
> That means, for most purposes, that NHTSA might ban the use of Autopilot until Tesla can make sure it notices _every single time_ the driver takes their attention off the road. Stuff like eye tracking would become required. Or, alternately, that the car doesn't require the driver's attention at all, anymore. That's why I suggested maybe Tesla might scrap Autopilot in this case, and fully invest in FSD.


AFAIK, FSD has the same limits. I can take my eyes/hands off for 30sec or so. That's a long time for an accident to occur. This would be the same issue with any other assist system. The key is in the name itself. It's an ASSIST system, not meant to be a fully autonomous system.

Maybe Tesla's solution is just to rebrand it as Assist Pilot.


----------



## JasonF (Oct 26, 2018)

Klaus-rf said:


> Or maybe the driver(s) paid attention fully and expected AP to stop. slow down or move around teh emergency vehicle and waited for AP to respond too long, IOW the driver waited too long before doing the infamous "*Forced Autopilot Disengagement*".


I don't buy that, because the natural survival instinct when headed toward a stopped fire truck at 75 mph and staring right at it is to swerve away from it. Youtubers crashing their cars for views are the exception.



Klaus-rf said:


> In any case - AP aside - shouldn't TACC have stopped?


Autopilot is TACC plus autosteer, so no.



shareef777 said:


> AFAIK, FSD has the same limits. I can take my eyes/hands off for 30sec or so.


For now, yes, but I mean future FSD.


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> In any case - AP aside - shouldn't TACC have stopped?


Like anything else, manufacturers can package TACC with other features, which creates confusion as to what it does (and doesn't do), but TACC's basic function is to allow a car to adjust its speed based on traffic in its lane.

Here's what the manual says about TACC:


> Traffic-Aware Cruise Control is designed for your driving comfort and convenience and is not a collision warning or avoidance system. It is your responsibility to stay alert, drive safely, and be in control of the vehicle at all times.


Here's what the manual says about Autosteer:


> Never depend on Autopilot features to determine the presence of emergency vehicles. Model 3 may not detect lights from emergency vehicles in all situations. Keep your eyes on your driving path and always be prepared to take immediate action.
> 
> Autosteer is not designed to, and will not, steer Model 3 around objects partially in a driving lane and in some cases, may not stop for objects that are completely blocking the driving lane.


This is pretty clear to me. I don't think the NHTSA could fault TACC and Autosteer for not doing what they weren't designed to do in the first place.


----------



## francoisp (Sep 28, 2018)

JasonF said:


> That means, for most purposes, that NHTSA might ban the use of Autopilot until Tesla can make sure it notices _every single time_ the driver takes their attention off the road.


It's possible but I doubt it very much for the simple reason that that kind of distraction happens every minute of every day with cellphones and they're still allowed to operate while driving (of course it's only the passengers that operate them wink wink). How many accidents are due to cellphone distractions? Must be in the tens of thousands. Compare that to accidents caused by AP distractions.


----------



## shareef777 (Mar 10, 2019)

francoisp said:


> This is pretty clear to me. I don't think the NHTSA could fault TACC and Autosteer for not doing what it hasn't been designed to do in the first place.


Therein lies the problem. I don't think the fault will be Tesla's (any more than blaming EVERY manufacturer for drunk-driving incidents involving their vehicles). It seems that the NHTSA is beginning to understand that there's no technology to ensure 100% certainty of driver attention, and therefore they may ban AP until such technology evolves (which is likely to be never).


----------



## JasonF (Oct 26, 2018)

shareef777 said:


> Therein lies the problem. I don't think the fault will be of Tesla (any more then blaming EVERY manufacturer for drunk driving incidents involved with their vehicles). Seems that the NHTSA is beginning to understand that there's no technology to allow 100% certainty of driver attention. And therefore they may ban AP until such technology evolves (which is likely to be never).


I also think that's a possibility, just because we live in a country that believes in "how can we absolutely make sure this will never happen again?" (referring to AP caused crashes). And the answer is, they can't, unless AP isn't allowed. Then people can feel free to crash their cars while texting, as long as it's not caused by anything automated.


----------



## shareef777 (Mar 10, 2019)

francoisp said:


> It's possible but I doubt it very much for the simple reason that that kind of distraction happens every minute of every day with cellphones and they're still allowed to operate while driving (of course it's only the passengers that operate them wink wink). How many accidents are due to cellphone distractions? Must be in the tens of thousands. Compare that to accidents caused by AP distractions.


Uh, I'm not aware of ANY state that allows you to operate a cell phone while driving. Sure, there're plenty of driver accidents due to cellphones, which is why it's illegal to use cellphones while driving. And it seems that the NHTSA is treating AP like cellphones now, as in relying on it will cause drivers to be distracted (thinking they don't have to pay attention to the road).


----------



## shareef777 (Mar 10, 2019)

JasonF said:


> I also think that's a possibility, just because we live in a country that believes in "how can we absolutely make sure this will never happen again?" (referring to AP caused crashes). And the answer is, they can't, unless AP isn't allowed. Then people can feel free to crash their cars while texting, as long as it's not caused by anything automated.


Unfortunately, there's no technology to stop a driver from using a phone while driving. The rare con of OTA is that it IS a technology that allows stopping EVERY driver from using AP, and as you said, we're a country of catering to the lowest common denominator.


----------



## francoisp (Sep 28, 2018)

shareef777 said:


> Therein lies the problem. I don't think the fault will be of Tesla (any more then blaming EVERY manufacturer for drunk driving incidents involved with their vehicles). Seems that the NHTSA is beginning to understand that there's no technology to allow 100% certainty of driver attention. And therefore they may ban AP until such technology evolves (which is likely to be never).


I have a more positive view of the same. The question the NHTSA has to answer is whether the driver and the public are safer with or without AP. To me it's pretty obvious: if a driver is going to use his mobile phone while driving (and there's nothing right now to prevent it), I'd rather have him doing it while under AP.


----------



## Klaus-rf (Mar 6, 2019)

francoisp said:


> Like anything else, manufacturers can package TACC with other features which creates confusion as to what it does (and doesn't do) but TACC's basic functionality is to allow a car to adjust its speed based on traffic in it's lane.


No matter what it's bundled with, TACC's ONLY function is to control speed according to traffic in its lane: from the max speed set [by the driver], down to zero in stopped traffic, and all places in between according to traffic speed. Perhaps a side task for TACC is to know where its lane is. AP adds lane keeping to the traffic-aware speed control to [attempt to] keep it in the same lane.

Apparently neither AP nor TACC do object avoidance. And FSD isn't much better.


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> Apparently neither AP nor TACC do object avoidance. And FSD isn't much better.


I'm far from being a Tesla apologist, but I will say that based on many recent YouTube videos, FSD does avoid "objects" such as a parked car's door opening in its lane. It will wait for pedestrians if they appear ready to cross the road, it will allow additional space around cyclists, and it goes around stopped vehicles. In fact, I think the most recent release of the FSD software is quite impressive. That said, AP was never designed to do any of that, because these situations aren't expected on major freeways and interstates.


----------



## JasonF (Oct 26, 2018)

francoisp said:


> I'm far from being a Tesla apologist but I will say that based on many recent YouTube videos, FSD does avoid "objects" such as a parked car door opening in its lane. It will wait for pedestrians if they appear to be ready to cross the road, it will allow for additional space around cyclists, it goes around stopped vehicle. In fact I think the most recent release of the FSD software is quite impressive. That said, AP was never designed to do any of that because these situations aren't expected on major freeways and interstates.


That just increases the chance that if the NHTSA bans Autopilot, Tesla might just abandon it and have everyone either upgrade to FSD or lose autosteer (and money if you paid for it).


----------



## Bigriver (Jan 26, 2018)

It seems to me that this thread is going down far-fetched rabbit holes, with "possibilities" of what might be done that I don't think are possibilities at all. The NHTSA is investigating crashes because they are a regulatory body, and that is what they do. They are no more likely to ban Autopilot than they are to require that we queue up to have our cars crushed (GM, I'm looking at you). They will not solve any problems by banning the product; they just may want to set some rules. As this is an area with little guidance, it's not hard to believe they may find something to do a little rulemaking on.

While Tesla gets highlighted in this report, I am confident that as soon as they present the data versus miles driven, Tesla falls to the bottom. I think they are also probably lacking data from other manufacturers, because not all cars are connected to a mother ship like Teslas are.
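Bigriver's per-mile point can be made concrete with a toy calculation. The 273 and 90 crash counts come from the article quoted at the top of the thread; the fleet-mileage figures below are entirely made-up assumptions, just to show how raw counts can invert once exposure is factored in:

```python
# Hypothetical illustration: raw crash counts vs crashes per million miles.
# The crash counts are from the article; the mileage figures are invented
# for this example, since NHTSA's report gives counts without exposure.

crashes = {"Brand A": 273, "Brand B": 90}               # raw counts (A >> B)
miles_millions = {"Brand A": 3000.0, "Brand B": 400.0}  # ASSUMED exposure

for brand in crashes:
    rate = crashes[brand] / miles_millions[brand]
    print(f"{brand}: {crashes[brand]} crashes, {rate:.3f} per million miles")
```

With these assumed mileages, the brand with three times the crashes ends up with less than half the per-mile rate, which is why raw counts alone say very little.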


----------



## Bigriver (Jan 26, 2018)

Klaus-rf said:


> The Safety Score used to calculate the price of Tesla Insurnace includes counts of "Forced AutoPilot Disengagements". So when the human MUST take over because AP/FSD did something stupid and/or dangerous, the human gets penalized for it.


This is not what the forced AP disengagements component is about. The score is only pinged if the driver wasn't responding to the steering-wheel nags and got put in AP time out. If AP does something wrong and the driver takes over, there is a 3-second lag before any driving characteristics are factored into the score.
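The distinction Bigriver draws can be sketched as a toy model. The three-nag limit below is an illustrative assumption, not Tesla's actual Safety Score formula; only the 3-second grace period is taken from the post:

```python
from dataclasses import dataclass

# Toy model of the two score components described above. The nag limit
# is a made-up threshold; the grace period mirrors the 3-second lag
# mentioned in the post. Neither is Tesla's real formula.

NAG_LIMIT = 3          # ignored nags before an "AP time out" (assumed)
GRACE_SECONDS = 3.0    # lag after takeover before events count

@dataclass
class Event:
    ignored_nags: int              # wheel-torque nags the driver ignored
    seconds_after_takeover: float  # when a harsh input occurred

def forced_disengagement_counted(e: Event) -> bool:
    """Pinged only if the driver ignored enough nags to be timed out."""
    return e.ignored_nags >= NAG_LIMIT

def driving_event_counted(e: Event) -> bool:
    """Harsh braking/steering right after a takeover is ignored."""
    return e.seconds_after_takeover > GRACE_SECONDS

# Driver takes over immediately because AP misbehaved: no penalty either way.
quick_save = Event(ignored_nags=0, seconds_after_takeover=1.0)
print(forced_disengagement_counted(quick_save))  # False
print(driving_event_counted(quick_save))         # False
```

The point of the model: a driver who rescues the car from AP promptly trips neither counter, which is the opposite of the "penalized for taking over" reading.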


----------



## garsh (Apr 4, 2016)

Klaus-rf said:


> In any case - AP aside - shouldn't TACC have stopped?


You're thinking of AEB (Automatic Emergency Braking).
But no, AEB is not guaranteed to avoid crashes - it's just meant to lower the speed at impact.
We have another thread where AEB is discussed:



garsh said:


> AEB is designed to minimize an unavoidable impact. It's not guaranteed to eliminate impacts.
> 
> Here's the relevant excerpt from the Model 3 owner's manual:
> 
> _[attachment: Model 3 owner's manual excerpt]_





garsh said:


> No automaker's AEB system is guaranteed to prevent collisions.
> Car & Driver performed testing of AEB systems on four cars.
> 
> We Crash Four Cars Repeatedly to Test the Latest Automatic Braking Safety Systems
> ...


----------



## JasonF (Oct 26, 2018)

Bigriver said:


> This thread seems to me that it is going down far-fetched rabbit holes with “possibilities” of what might be done which I don’t think are possibilities. NHTSA is investigating crashes because they are a regulatory body, and that is what they do. They are no more likely to ban autopilot than they are to require that we queue up to have our cars crushed (GM, I’m looking at you). They will not solve any problems by banning the product, they just may want to set some rules. As this is an area with little guidance, it’s not hard to believe they may find something to do a little rulemaking on.


They're not likely to ban Autopilot outright, but they are likely to ban it until it meets certain requirements.

The issue is the way that's analyzed by organizations like the NHTSA. They would compare it to "similar" driver-assist systems and how their features differ, versus how many crashes have been recorded. Some driver-assist systems use eye tracking, and some use wheel torque like Tesla does. If the NHTSA finds a correlation between, say, eye tracking and a low number of crashes, they might require that all driver-assist systems have an eye-tracking feature to make sure the driver is watching, or be disabled. Even though that could be a false correlation: it might also depend on one type having far fewer miles on it, being more expensive and therefore not selected very often, or simply not being used by drivers because it's annoying.

And that's how Autopilot could end up banned. It doesn't have an eye tracking feature, so it would have to be disabled until one can be added. And if that requires additional hardware (if the system must be infrared and not visual for instance) then all current AP equipped cars are out of luck, because Tesla is not going to retrofit them. And then, they might not even want to retrofit new ones, they might decide to abandon AP and stake the future on FSD.

That's where the scenarios I came up with are from.


----------



## Klaus-rf (Mar 6, 2019)

garsh said:


> You're thinking of AEB (Automatic Emergency Braking).
> But no, AEB is not guaranteed to avoid crashes - it's just meant to lower the speed at impact.
> We have another thread where AEB is discussed:


 No, TACC.

Its job is to control the distance and speed relative to forward vehicles, like when in traffic. If a vehicle ahead in its lane is STOPPED, it should also STOP.

AEB doesn't regulate speed and distance to upcoming traffic (whether in its lane or not). In my experience, AEB mostly just makes blaring loud noises when there's absolutely nothing in the way, loud enough I'm sure pedestrians a quarter mile away can hear it. AEB has NEVER given me an instance that I thought needed ANY attention, especially immediate emergency corrections.


----------



## garsh (Apr 4, 2016)

Klaus-rf said:


> Its job is to control the distance and speed of forward vehicles. Like when in traffic. If a vehicle ahead in its lane is STOPPED, it should also STOP.


TACC is not guaranteed to detect stopped vehicles. This is true of all vehicles that implement TACC based on radar, including non-Teslas; they ALL have this issue. If there's no movement detected relative to the roadway, the radar doesn't "see" a vehicle. Hopefully Tesla's switch to vision-only will improve this.
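A rough sketch of why a Doppler-only radar filter drops stopped vehicles (heavily simplified; real trackers fuse radar with camera data, and the threshold below is an assumption): a return whose speed over the ground is near zero has the same signature as an overhead sign or a bridge, so a naive tracker discards it as clutter.

```python
# Simplified sketch of radar target filtering for a radar-based TACC.
# Real systems are far more involved; this only illustrates why a purely
# Doppler-based filter can drop a stopped vehicle as roadside "clutter".

EGO_SPEED = 30.0     # m/s, our own car's speed
CLUTTER_BAND = 1.0   # m/s, ground-speed threshold (assumed value)

def ground_speed(relative_speed: float) -> float:
    """Radar measures closing speed; adding ego speed gives the target's
    speed over the ground."""
    return EGO_SPEED + relative_speed

def is_tracked(relative_speed: float) -> bool:
    """Discard returns that aren't moving over the ground, since signs
    and bridges produce the same signature as a stopped car."""
    return abs(ground_speed(relative_speed)) > CLUTTER_BAND

print(is_tracked(-5.0))   # slower lead car (25 m/s over ground): True
print(is_tracked(-30.0))  # stopped fire truck (0 m/s over ground): False
```

This is why a moving lead car that gradually slows is handled fine, while a vehicle that is already stationary when it enters the radar's view may be filtered out entirely.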

The problem is that people make assumptions about how a feature works and don't understand its limitations. They get used to how well the feature works in normal conditions and get complacent. Then they aren't paying attention when they finally encounter a situation the vehicle can't handle.


----------



## Bigriver (Jan 26, 2018)

JasonF said:


> They're not likely to ban Autopilot outright, but they are likely to ban it until it meets certain requirements.


I still think you are WAY off the path of “likely”. As cars have become safer over the years, new requirements are imposed on NEW cars. They don’t ban the existence of cars without the new requirements. We have history to look at: seat belts, air bags, backup cameras…. Each becomes required on new cars at some point, but there are always the older cars without them. I think within the next 10 years, adaptive cruise control and lane keeping will be added to that list of features required on new cars. 

I do not have any experience with the regulatory environment of the NHTSA, but I do have experience with other regulatory segments of the federal government. As much as people like to rag on the government and regulation, the rules are not arbitrarily made in a vacuum, nor are they written to be prescriptive, choosing one vendor's approach over another. The NHTSA is not trying to squash the advances being made in automated driving systems. They are on board that this is the future. Here is what they say:

> **Automated Vehicles for Safety | NHTSA**
> Get info on automated driving systems, also referred to as automated vehicles and "self-driving" cars, and learn about their safety potential.
> — www.nhtsa.gov


----------



## JasonF (Oct 26, 2018)

Bigriver said:


> I still think you are WAY off the path of “likely”. As cars have become safer over the years, new requirements are imposed on NEW cars. They don’t ban the existence of cars without the new requirements. We have history to look at: seat belts, air bags, backup cameras…. Each becomes required on new cars at some point, but there are always the older cars without them. I think within the next 10 years, adaptive cruise control and lane keeping will be added to that list of features required on new cars.


There is an important difference though: the NHTSA approved the older cars at some point before new things like seatbelts and airbags were introduced, so they wouldn't retroactively ban something they approved. However, they never explicitly approved Autopilot; it's just something Tesla added on its own. One of the big criticisms is that Tesla didn't use NHTSA-approved trials to prove it before introducing it. So that fully leaves them the option to approve it, ban it, or attach specific requirements for approval.

Now it's also possible the NHTSA might say that because it's an unapproved feature, Tesla has X number of days/months/etc. to submit for approval with the proper proving data to show it meets all of the requirements, _or_ it will be summarily banned if they don't.


----------

