# NTSB public hearing via Webcast 1 PM (EST)



## bwilson4web

Source: http://ntsb.windrosemedia.com

Playback of webcasts is available for three months.

Bob Wilson


----------



## bwilson4web

Most of the hearing was technically OK but they really want Autopilot restricted. Yet I noticed they made no reference to the built-in radar. They treat Autopilot as if it is only an optical sensor system. Was there any time when Autopilot did not have radar? Perhaps in the old Mobileye days?

A lot of attention was paid to Tesla not replying to the NTSB's earliest recommendations for 881 days (issued 9/27/2017). My understanding is Tesla dropped Mobileye a year earlier. Yet "NTSB Revokes Tesla's Party Status" came on 04/12/2018, 684 days ago. This is symptomatic of a bad working relationship between NTSB and Tesla, one that appears to have deep roots.

Bob Wilson


----------



## JasonF

At this point the relationship is so sour that it's probably in the "I'm not saying anything without my lawyer present" stage. Tesla might just keep quiet until the whole thing is dragged into court.


----------



## bwilson4web

Source: https://www.ntsb.gov/news/press-releases/Pages/NR20200225.aspx

_Seven safety issues were identified in the crash investigation:_

- _Driver Distraction_
- _Risk Mitigation Pertaining to Monitoring Driver Engagement_
- _Risk Assessment Pertaining to Operational Design Domain (the operating conditions under which a driving automation system is designed to function)_
- _Limitations of Collision Avoidance Systems_
- _Insufficient Federal Oversight of Partial Driving Automation Systems_
- _Need for Event Data Recording Requirements for Driving Automation Systems_
- _Highway Infrastructure Issues_

_To address these safety issues the NTSB made nine safety recommendations that seek:_

- _Expansion of NHTSA's New Car Assessment Program testing of forward collision avoidance system performance._
- _Evaluation of Tesla "Autopilot"-equipped vehicles to determine if the system's operating limitations, foreseeability of misuse, and ability to operate vehicles outside the intended operational design domain pose an unreasonable risk to safety._
- _Collaborative development of standards for driver monitoring systems to minimize driver disengagement, prevent automation complacency and account for foreseeable misuse of the automation._
- _Review and revision of distracted driving initiatives to increase employers' awareness of the need for strong cell phone policies prohibiting portable electronic device use while driving._
- _Modification of enforcement strategies for employers who fail to address the hazards of distracted driving._
- _Development of a distracted driving lock-out mechanism or application for portable electronic devices that will automatically disable any driver-distracting functions when a vehicle is in motion._
- _Development of policy that bans nonemergency use of portable electronic devices while driving by all employees and contractors driving company vehicles, operating company issued portable electronic devices or when using a portable electronic device to engage in work-related communications._
The NTSB also reiterated seven previously issued safety recommendations issued to: the National Highway Traffic Safety Administration (H-15-4, H-17-39 and H-17-38); the Department of Transportation (H-17-37); and Tesla (H-17-41 and H-17-42). The reiterated safety recommendations issued to Tesla (H-17-41 and H-17-42) were also reclassified from "Open―Await Response" to "Open―Unacceptable Response," as were two reiterated safety recommendations issued to NHTSA (H-17-39 and H-17-40) and one (H-17-37) issued to DOT.

As a result of the investigation the NTSB reclassified two other safety recommendations: H-11-47, issued to the Consumer Electronics Association (now the Consumer Technology Association), reclassified as "Closed―No Longer Applicable," and H-19-13, issued to the California State Transportation Authority, reclassified as "Open―Acceptable Response."

An abstract of the final report for the NTSB's investigation of the crash is available online at https://go.usa.gov/xdyHM and contains the probable cause, findings and safety recommendations. The full final report is expected to be published online in the next few weeks. Previously released information about the investigation is available online at http://go.usa.gov/xqag4.
Bob Wilson


----------



## Bigriver

bwilson4web said:


> Most of the hearing was technically OK but they really want Autopilot restricted. Yet I noticed they made no reference to the built-in radar. They treat Autopilot as if it is only an optical sensor system. Was there any time when Autopilot did not have radar? Perhaps in the old Mobileye days?


I don't know the answers to your specific questions. But I very much remember this accident being investigated, as it occurred within months of us getting our Model X, and the wrecked Model X was within months of the same build as ours (HW2.5). At that time, Autopilot was horrible. Absolutely horrible. It required more monitoring than driving without it. It was unstable and certainly didn't handle any lane transition well. And now, hearing that he actively had game apps running on his phone... I'm sorry he died, but this accident that happened two years ago says nothing about the safety of Autopilot today, except perhaps that the driver must always remain responsible.


----------



## Klaus-rf

bwilson4web said:


> Source: https://www.ntsb.gov/news/press-releases/Pages/NR20200225.aspx
> 
> _To address these safety issues the NTSB made nine safety recommendations that seek:_
> _Review and revision of distracted driving initiatives to increase employers' awareness of the need for strong cell phone policies prohibiting portable electronic device use while driving._
> _Modification of enforcement strategies for employers who fail to address the hazards of distracted driving._
> _Development of a distracted driving lock-out mechanism or application for portable electronic devices that will automatically disable any driver-distracting functions when a vehicle is in motion._
> _Development of policy that bans nonemergency use of portable electronic devices while driving by all employees and contractors driving company vehicles, operating company issued portable electronic devices or when using a portable electronic device to engage in work-related communications._


Seems to me that the suggestions are not at all related to Tesla's AP system. Many states already have laws regarding cell phone/electronic device usage while driving. While one can be cited and/or fined for "distracted driving," the laws do not force lockout of these devices. And, I would venture to say, current cell phones can't be locked out by the car - they don't interact with the vehicles that way yet. And you ain't touching my Game Boy!


----------



## msjulie

This accident didn't happen too far from where I work. 

Worst thing in this case, to me, is that this was not the first time the car 'misbehaved' at this location, and the driver was well aware of that fact, filing reports and everything. If he was indeed playing a game at that same location, well - speechless.

Complex situation but I always come back to 1 fact - the driver is supposed to be responsible and attentive. I know that doesn't happen as often as it should, sad, but the buck stops with the driver I believe and this accident was so very preventable.

So sad, so preventable.

Unfortunately, deserved or not, Tesla gets a potentially unfair amount of negative press.


----------



## JasonF

msjulie said:


> Unfortunately, due or not, Tesla gets a potentially unfair bunch of negative press


Every new piece of technology that can possibly have accidents is subject to government asking, "How can we make absolutely sure this will never happen again?" They ask that question because they aren't tech savvy; they're legal savvy, and their aim is to promote safety in a way that can be legally specified.

With Autopilot, the answer to that question isn't clear, because there are too many factors - especially that humans are still operating the car - and the technology to babysit _all_ of those factors isn't available.

So there is a very real risk that when a bunch of non-tech-savvy bureaucrats want an answer of how Tesla is going to make sure that Autopilot will never be involved in another accident, they won't be satisfied, and their response will be to ban Autopilot from use in the U.S. until Tesla can prove that it will be accident-proof.

And then Tesla might either defy them and fight it in court (because there's so much money involved) or possibly be forced to implement some ridiculous feedback mechanism like the dead-man switch that trains have, forcing you to click it every x number of seconds to prove you're paying attention.

Tesla made a smart move including Autopilot starting in 2019 cars, because if they are forced to disable Autopilot for a while, that leaves them with much less liability to the people who paid for it.


----------



## bwilson4web

JasonF said:


> or possibly be forced to implement some ridiculous feedback mechanism like the dead-man switch that trains have, forcing you to click it every x number of seconds to prove you're paying attention.


In the hearing, it turns out the frequency of operator response (i.e., torque input) varies with speed. From memory, 'one minute at 25 mph and 10 seconds at 90 mph.'
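For illustration only, that speed-dependent schedule could be modeled as a simple linear interpolation between the two data points recalled above (~60 s at 25 mph, ~10 s at 90 mph). This is a toy sketch; Tesla's actual nag schedule has not been published, and the function name and clamping behavior here are my own assumptions.

```python
def nag_interval_s(speed_mph: float) -> float:
    """Hypothetical hands-on-wheel prompt interval, linearly interpolated
    between the two points recalled from the hearing. NOT Tesla's real schedule."""
    lo_speed, lo_interval = 25.0, 60.0   # ~60 seconds at 25 mph
    hi_speed, hi_interval = 90.0, 10.0   # ~10 seconds at 90 mph
    if speed_mph <= lo_speed:
        return lo_interval
    if speed_mph >= hi_speed:
        return hi_interval
    frac = (speed_mph - lo_speed) / (hi_speed - lo_speed)
    return lo_interval + frac * (hi_interval - lo_interval)
```

Under this assumed model, a highway cruise at 65 mph would prompt roughly every half minute, which matches the general shape described in the hearing: faster speed, shorter leash.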

Bob Wilson


----------



## bwilson4web

Bigriver said:


> At that time, autopilot was horrible. Absolutely horrible. It required more monitoring than driving without it. It was unstable and certainly didn't handle any lane transition well.


My first experiences started on March 26, 2019, and it took some getting used to. There is a fairly steep learning curve to find out where it works and where it doesn't. Autopilot has gotten better, but there are a few reproducible cases that remain a challenge. I know where they are in town and what they look like when out of town.

Bob Wilson


----------



## M3OC Rules

Reading the safety recommendations, they seem reasonable and mostly not Tesla-specific. What I find very disturbing is the relationship issue with Tesla.


----------



## Ct200h

It seems Tesla could easily utilize the cabin camera in 3's and Y's to monitor driver attention. I know some people don't like a camera trained on them while driving, but if it just monitored driver attention and focus and didn't record, I would be OK with that. It could also monitor for a drowsy driver. All without any new hardware or expense besides software.


----------



## iChris93

https://twitter.com/i/web/status/996102919811350528


----------



## Needsdecaf

JasonF said:


> With Autopilot, the answer to that question isn't clear, because there are too many factors - especially that humans are still operating the car - and the technology to babysit _all_ of those factors isn't available.
> .
> .
> .
> And then Tesla might either defy them and fight it in court (because there's so much money involved) or possibly be forced to implement some ridiculous feedback mechanism like the dead-man switch that trains have, forcing you to click it every x number of seconds to prove you're paying attention.


Or, you know, you could use an optical sensor like other manufacturers do, like BMW, Cadillac and others. That's why Super Cruise is truly "hands free", because it's watching you to make sure your eyes are on the road.

Tesla's torque sensor is a bunch of crap. It is more easily defeated by hanging a weight on the wheel than it is capable of registering the fact that my hands are _actually on the wheel_ when I am driving. Literally, I get probably 10 messages a day to "provide input" or whatever it says when my hands are already on the wheel. Sometimes BOTH. It's the absolute bare minimum you can call a safety feature.

Optical sensors scanning driver's faces ARE available. They are deployed, and working, today.



Ct200h said:


> it seems Tesla could easily utilize the cabin camera in 3's and Y's to monitor driver attention. I know some people dont like a camera tuned in on them while driving , but if it just monitored driver attention and focus and didnt record I would be ok with that. It could also offer a monitor for a drousy driver. All without any new hardware or expense beside software.


I don't believe that camera is positioned properly to accomplish this, or has the correct technology to do so. I could be wrong, but I don't think I am.



iChris93 said:


> https://twitter.com/i/web/status/996102919811350528


Right. So they're ineffective, but deployed in the thousands of vehicles already. Erm, ok Elon.  And throwing in the whole "Tesla is safest car" argument proves what again, exactly?

I'll tell you what's ineffective...the torque sensor. It proves literally nothing more than the fact that I am pulling on the wheel with a certain force. Something I can do easily while staring down at my phone. Or, you know, something it can't sense while I am actually paying attention.

Try again Elon.


----------



## MelindaV

Ct200h said:


> It could also offer a monitor for a drousy driver. All without any new hardware or expense beside software.


This possible ability is specifically mentioned in a Tesla patent already on the books.


MelindaV said:


> came upon this patent and sounds like they are getting close to releasing some functions for the cabin camera
> 
> https://patents.justia.com/patent/20190176837
> 
> tuned HVAC airflow based on occupant locations, tuned audio based on occupant locations & recognize driver distress (alert emergency response, aid distressed driver in reaching a pre-designated emergency medical facility)


----------



## JasonF

Needsdecaf said:


> Or, you know, you could use an optical sensor like other manufacturers do, like BMW, Cadillac and others. That's why Super Cruise is truly "hands free", because it's watching you to make sure your eyes are on the road.


I think the reason why Tesla doesn't want to add a system like that is it doesn't complement the ultimate goal: Full self-drive. Once that's working, there's no reason to make sure the driver is paying attention anymore. So from their point of view, there is no point in spending time on it, and there is certainly no point in adding extra equipment to support it.

Tesla likes iterative design so much that I'm hoping they _have_ demonstrated and tested something internally using eye or face tracking with the interior camera - and then set it aside. Because if they haven't, the U.S. government has a long history of requiring very specific hardware on cars to be compliant with the law. And that would mean Tesla would either have to retrofit every car with AP it's ever sold, or disable Autopilot on all current U.S. cars and only make it available on modified newly produced models.


----------



## Needsdecaf

JasonF said:


> I think the reason why Tesla doesn't want to add a system like that is it doesn't complement the ultimate goal: Full self-drive. Once that's working, there's no reason to make sure the driver is paying attention anymore. So from their point of view, there is no point in spending time on it, and there is certainly no point in adding extra equipment to support it.
> 
> I'm hoping Tesla likes iterative design so much, that they probably _have_ demonstrated and tested something internally using eye or face tracking using the interior camera and software - and then set it aside. Because if they haven't, the U.S. government has a long history of requiring very specific hardware on cars to be compliant with the law. And that would mean Tesla would either have to retrofit every car with AP it's ever sold, or disable Autopilot on all current U.S. cars, and only make it available on modified newly produced models.


I don't want to get too far into conspiracy theories with this one, but sometimes I do wonder. 1. Why isn't Autopilot locked out on roads it's clearly not designed for? There are use cases all along of people using it on regular roads that it's not designed for. From the manual:

*(manual excerpt image not preserved)*

So why don't they lock out engagement on anything other than a highway or a limited-access road? And 2. Why don't they have a more draconian device which ensures the driver is "fully attentive"?

One wonders why....


----------



## garsh

Needsdecaf said:


> So why don't they lock out engagement on anything other than a highway or a limited access road?


Considering how often autopilot tries to drop my speed from 65 mph to 35 mph because the interstate happens to be passing over some local road, I think it's obvious that they are trusting the user to know the situation rather than their flawed maps.

Elon has said many times that maps are a "crutch" and that Autopilot shouldn't rely on them.


----------



## JasonF

Needsdecaf said:


> I don't want to get to into the conspiracy theories with this one but sometimes I do wonder. 1. Why isn't autopilot locked out on roads it's clearly not designed for?


There's a very simple reason: There is no way to "lock out" roads that aren't highways because the car doesn't know the difference between a road and a limited access highway. GM Cruise has a list of roads it's _allowed_ on instead, which is a much shorter list. If your local highways aren't on it, you're out of luck. Getting a computer to understand visually the difference between a highway and just a large road requires context that's beyond the ability of software to handle right now.
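The contrast between the two gating strategies can be sketched in a few lines. Everything here is hypothetical - the road IDs are made up, and neither GM's Super Cruise map format nor Tesla's engagement logic is public - but it shows why a whitelist is easy to enforce while perception-based gating has no road list to check at all:

```python
# Hypothetical sketch of the two engagement-gating strategies discussed above.
# Road IDs are invented; neither GM's nor Tesla's real data/logic is public.

ALLOWED_ROADS = {"I-95", "I-4", "FL-408"}  # curated, finite whitelist

def whitelist_style_allowed(road_id: str) -> bool:
    # Super Cruise-style: trivially safe to enforce, but any road
    # missing from the pre-mapped list is simply out of luck.
    return road_id in ALLOWED_ROADS

def perception_style_allowed(lane_lines_visible: bool) -> bool:
    # Autopilot-style: no map gate. Engage wherever perception finds
    # usable lane lines, and trust the driver to judge the road.
    return lane_lines_visible
```

The trade-off follows directly: the whitelist approach cannot engage on an unmapped highway, while the perception approach will happily engage on a large surface street that merely looks like a highway.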



Needsdecaf said:


> And 2. And why don't they have a more draconian device which protects against being "fully attentive"?


Because there isn't one.


----------



## Needsdecaf

JasonF said:


> GM Cruise has a list of roads it's _allowed_ on instead, which is a much shorter list..


That's because SuperCruise relies, in part, on LIDAR mapping that GM has previously done in order to facilitate the system. I think this is belt and suspenders personally.



JasonF said:


> Because there isn't one.


Sure there is:

https://www.autonews.com/article/20181001/OEM06/181009966/bmw-camera-keeps-an-eye-on-the-driver

https://www.cnbc.com/2020/02/05/gm-...ver-assist-system-to-22-vehicles-by-2022.html



garsh said:


> Considering how often autopilot tries to drop my speed from 65 mph to 35 mph because the interstate happens to be passing over some local road, I think it's obvious that they are trusting the user to know the situation rather than their flawed maps.
> 
> Elon has said many times that maps are a "crutch" and that Autopilot shouldn't rely on them.


Eh, I don't know about that. Is it really all that hard to have a map know what a limited-access road is and say "that's where you can use it"?

Also, I believe Elon's comment about maps being a crutch was used in context of using them for actually driving, which I agree, the system should be able to drive without mapping. I don't believe it was in context of where to allow AutoPilot or not.


----------



## iChris93

Needsdecaf said:


> Eh, I don't know about that. Is it really all that hard to have a map know what a limited-access road is and say "that's where you can use it"?


In fact, the car used to distinguish between them by disabling auto lane change. Even now, they do not allow NoA everywhere.


----------



## JasonF

Needsdecaf said:


> That's because SuperCruise relies, in part, on LIDAR mapping that GM has previously done in order to facilitate the system. I think this is belt and suspenders personally.


Possibly, but there are issues with that: First, it won't understand construction, accidents, traffic reroutes, GPS outages, etc, while Tesla's system would because it treats every road the same and doesn't depend on maps and GPS. Second, during development of the systems, General Motors had the staff and infrastructure to map every single controlled-access road in the entire world, but Tesla did not. So Tesla had to forgo the suspenders and try to make the belt a little better.

Third, and much more controversial, is that Tesla had to bring out Autopilot _first_, which means they had to be clever. Since they didn't have the data gathering capability of General Motors, or even BMW, they had to design a system that could handle most of what the roads tossed at it. Because if they didn't, they would have had to sell the Model 3 _without_ Autopilot, and then introduce it 5 years later, after even a low-end Kia has auto-steer.

Was it the right decision? I guess we'll find out when the NTSB finally makes a decision. It could have worked out, or perhaps it will be one of the country's worst examples of technology-too-soon. The good news is, a year and a half ago a negative decision against Tesla by the NTSB would have destroyed the company, now it would just be a nuisance.

And that brings us to the next related topic:



Needsdecaf said:


> Sure there is:
> 
> https://www.autonews.com/article/20181001/OEM06/181009966/bmw-camera-keeps-an-eye-on-the-driver
> 
> https://www.cnbc.com/2020/02/05/gm-...ver-assist-system-to-22-vehicles-by-2022.html


See, feeding me links like that implies that there is exactly one way to do driver monitoring. Whether that's true or not is up for debate until Tesla comes up with something. But the point I'd like to make is, if the NTSB believes the same thing - that those links are _the_ way to do driver monitoring - it leads directly to that particular system becoming mandatory for auto-steer, hardware and all. Which means Tesla would be required to disable Autopilot in all existing cars.

As I noted above, if Tesla is required to remove Autopilot until they install specific driver monitoring hardware, they will then be stuck with the choice of apologizing to previous customers and only making it available on newly produced cars, or paying out a fortune to retrofit every Tesla that was ever Autopilot capable with a new piece of hardware. Whenever a feature you paid for is in the hands of corporate numbers people (does it cost more to stiff them, or to retrofit?), it's never good.

Adding to that is the fact that Tesla really would like to make the leap to Full Self Drive. If Autopilot becomes enough of a nuisance and impediment to making cars and producing FSD, there's a strong possibility they'll turn it off and leave it off, encouraging their customers to wait for FSD ("right around the corner!") instead. It may be cheaper to do that as well - just give everyone who had AP active a credit toward FSD.



Needsdecaf said:


> Eh, I don't know about that. Is it really all that hard to have a map know what a limited-access road is and say "that's where you can use it"?


Every time traffic is heavy on the main road home from work, the GPS tries to reroute me onto this small road beside it for a few blocks. That small road is a private road owned by the JW Marriott hotel, and they now invite police to park alongside it and ticket people who drive through and have no business with the hotel. So now what does it look like the answer to that question is?


----------



## John

My thoughts-and I do have some.

The argument that Tesla needs to modify Autopilot because "The Tesla Autopilot system did not provide an effective means of monitoring the driver's level of engagement with the driving task" is correct but lacks proper perspective:

1. 25% of crashes of all other cars are caused by inattention
2. No one is proposing that all other cars have attention monitoring systems
3. There have been relatively fewer crashes while on Autopilot than in other cars (accidents per mile driven)

Yes, inattention is a big problem.

Starting to address that by going after Autopilot first-rather than the more dangerous population of all other cars-is a public disservice, because it will slow adoption of one thing that's already HELPING the problem, and distracts from the larger problem.

One of my neighbors getting into my car last year: "Don't put it on that Autopilot thing while I'm in the car! I heard how dangerous that is!"

(She changed her mind after actually experiencing it while she was driving it herself.)

NOTE
The report also states: "The crash attenuator was in a damaged and nonoperational condition at the time of the collision due to the California Highway Patrol's failure to report the damage following a previous crash and systemic problems with the California Department of Transportation's maintenance division in repairing traffic safety hardware in a timely manner... If the crash attenuator at the US Highway 101−State Route 85 interchange had been repaired in a timely manner and in a functional condition before the March 23, 2018, crash, the Tesla driver most likely would have survived the collision."

Was the attenuator taken out previously by a Tesla? No.
What percentage of attenuators are damaged by Teslas on Autopilot? Essentially none.

OBSERVATIONAL RANT
The NTSB is proposing restrictions on Tesla (the car maker) and Apple (the maker of the phone AND the employer that bought the phone for the driver) presumably because unlike the California Highway Patrol and California Department of Transportation they are politically-acceptable targets for criticism.


----------



## JasonF

John said:


> Yes, inattention is a big problem.


That's actually why I considered that Tesla might ignore any attempt to "fix" Autopilot right now; because the root problem is inattentive drivers.

_Extremely_ inattentive ones. And that problem is getting worse, to the point where I see drivers without any sort of auto-steer or collision avoidance systems watching television shows on their phones in heavy traffic.

So Tesla might be thinking Autopilot isn't fixable because the drivers aren't fixable - and they really need to concentrate on making the car do 100% of the driving, so people can watch Netflix, Facebook, and text to their heart's content without crashing.


----------



## John

JasonF said:


> And that problem is getting worse, to the point where I see drivers without any sort of auto-steer or collision avoidance systems watching television shows on their phones in heavy traffic.


Tru dat.

Autopilot lets me pay more attention to other cars.

In the California Bay Area traffic sometimes it appears that roughly 10% of people are holding their phone up in front of the steering wheel. Kinda horrifying. Go after that, NTSB!

(Oh, wait. There's already a law about that. Oh, well. Let's go after Tesla.)

My last car was totaled by someone not paying attention, probably on their phone. I was stopped and the driver rear-ended me at full speed.


----------

