# Book review: Robot, Take The Wheel



## bwilson4web (Mar 4, 2019)

_Robot, Take The Wheel_ by Jason Torchinsky (a Jalopnik reporter) is critical of Level 2 driver aids such as, but not limited to, Tesla's Autopilot. Citing three fatal accidents, he concedes that in theory Autopilot can make driving safer, but he sees a severe problem with 'those other drivers' who use it as if it were full autonomy, rather like someone with a 500 hp pony car driving public streets as if they were a drag strip or a NASCAR track. But beyond those three fatal accidents, he missed Tesla's quarterly safety reports: for Q2 2019, Tesla reported one accident per 3.2 million miles with Autopilot engaged, one per 2.19 million miles with active safety features only, and one per 1.41 million miles with neither.
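To compare those figures directly, the reported millions-of-miles-per-accident can be inverted into accidents per million miles. A minimal Python sketch (the numbers are from the quarterly figures cited above; the category labels are my own shorthand):

```python
# Tesla's Q2 2019 safety report, as cited above: millions of miles
# driven per accident, by driving mode.
miles_per_accident = {
    "Autopilot engaged": 3.2,
    "Active safety only": 2.19,
    "No driver aids": 1.41,
}

# Invert into accidents per million miles (lower is safer).
rates = {mode: 1.0 / m for mode, m in miles_per_accident.items()}
for mode, rate in rates.items():
    print(f"{mode}: {rate:.2f} accidents per million miles")

# Relative rate: cars with no driver aids vs. Autopilot engaged.
ratio = miles_per_accident["Autopilot engaged"] / miles_per_accident["No driver aids"]
print(f"Cars with no driver aids crashed about {ratio:.1f}x as often")
```

By this arithmetic, cars without driver aids crashed a bit more than twice as often per mile, which is the point Bob is making against judging Autopilot on three fatalities alone.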

Inadvertently, Jason gives a powerful description of why Level 2 driving is uncomfortable:

_"... when you get behind the wheel and pedals of a powerful car; you feel powerful yourself, because all those 700 or so insane horsepowers are directly controlled by your own body; … many people who love to drive and drive aggressively are the most uncomfortable being driven fast and aggressively because they are no longer in control, and it feels wrong somehow." (p. 32)_
Far from being an aggressive driver myself, I've ridden with three aggressive drivers and it wasn't fun. One was a former Navy pilot who did 'formation driving': either maximum brake or maximum accelerator in bumper-to-bumper traffic. The second broke a 12-hour drive into two segments at 15 mph over the limit with a single meal-and-bathroom break; in contrast, I normally took 14 hours with three breaks. I married the third, who delighted in talking her way out of speeding tickets and waking me up with high-speed tire and wind noise. Whether the driver is Level 2 Autopilot or another human, driving style determines the rider's comfort level.

Now Jason suggests autonomous systems could:

_"... download the fastest lap ever recorded on that track, and "replay" that lap. … download entire, curated road trips. ..." (pp. 193-194)_​
In effect, full autonomy could be a hoot combined with playback. Unskilled drivers could experience the joy of riding 'second seat' around Laguna Seca Raceway.

Can I recommend this book? It has a nice history of cars and autonomous systems (pp. 1-92). He correctly calls a car a prosthetic with an emotional appeal and writes with a dry wit. But reading a book is different from forum posting: each book page is like a three-paragraph forum post, and there are 240 pages with no breaks or commentary but your own.

Still, I appreciate Jason because he was the only reporter at a Prius "press day" who had the good sense to run race car tuning software on his phone to get metrics Toyota was not sharing. Would that every reporter did that!

Bob Wilson

ps. I took the Tesla quarterly accident reports and generated this chart:

[Chart: Tesla quarterly accident rate, accidents per million miles, by quarter]

- Initial Model 3 production shows a decrease in the safety rate (accidents per million miles).
- Since Q4 2018, the rate has improved, reflecting driver experience and software improvements.
- The severe winter of Q1 2019 may have affected the accident rate.


----------



## John Tomson (Sep 7, 2019)

Three fatal accidents... Sounds like too much already. You'd better be careful if you take your hands off the wheel. At least there has to be an option to brake manually or turn the wheel. I heard one driver put the car into Autopilot, then didn't pay attention and looked somewhere else. That is when the car hit something. Pay attention when you put your life in the hands of a bot. Good luck.


----------



## Dr. J (Sep 1, 2017)

Three fatal accidents sounds like insufficient data to draw any conclusion.
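Dr. J's point can be made concrete with a small simulation: with only three observed events, a wide range of "true" underlying rates would make that observation unsurprising. A stdlib-only sketch (the rate values are purely illustrative, not Tesla data):

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's algorithm: count uniform draws until their product
    # falls below e^-lam.
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(42)
# Try several hypothetical "true" expected accident counts and see how
# often a simulated world produces exactly 3 events.
for true_mean in (1.0, 3.0, 6.0, 8.0):
    draws = [poisson_sample(true_mean, rng) for _ in range(20000)]
    frac = sum(1 for d in draws if d == 3) / len(draws)
    print(f"true mean {true_mean}: P(observe exactly 3) ~ {frac:.3f}")
```

Exactly three events is reasonably likely whether the true expected count is 1 or 8, an eight-fold range, which is what "insufficient data to draw any conclusion" means in practice.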


----------



## garsh (Apr 4, 2016)

John Tomson said:


> Three fatal accidents... Sounds like already too much.


If that really was the cutoff point, then humans would have been banned from driving back in the 1800's.


----------



## Mr. Spacely (Feb 28, 2019)

John Tomson said:


> Three fatal accidents... Sounds like already too much.


I agree with @Dr. J that three accidents in millions of miles driven is not statistically significant.


----------



## bwilson4web (Mar 4, 2019)

Lapido said:


> Well it's an indicator that you have to be careful with it and it's better to drive yourself.


Careful, Yes. 

I drive myself with it.

Bob Wilson


----------



## Klaus-rf (Mar 6, 2019)

I seem to recall many more than single digits during the first few weeks after vehicles first came with cruise control, some 60+ years ago. We've all heard the stories about someone driving the motorhome, setting cruise control, and then going into the back for a nap.

The ONE common factor here is humans not understanding or using technology properly.


----------



## DocScott (Mar 6, 2019)

Lapido said:


> Well it's an indicator that you have to be careful with it and it's better to drive yourself.


And if there were three fatal accidents for people with cars that don't have driver assist systems, would that mean it's better to have a driver assist system?

And if three vegetarians who ate low-fat, whole-food diets died of heart attacks, would that mean it was better to eat a lot of junk food?


----------



## Klaus-rf (Mar 6, 2019)

DocScott said:


> And if there were three fatal accidents for people with cars that don't have driver assist systems, would that mean it's better to have a driver assist system?
> 
> And if three vegetarians who ate low-fat, whole-food diets died of heart attacks, would that mean it was better to eat a lot of junk food?


 Or if we compare to the original, actually working FSD system - the horse - I seem to recall there have been more than a few fatal incidents with that system too.


----------



## francoisp (Sep 28, 2018)

Airplanes rarely have accidents, but each one is investigated thoroughly to understand its root cause. That's how we find out about metal fatigue, worn-out electrical wiring, and bad software.

Three accidents with loss of life may not seem significant, but how many other accidents happened under ADAS? Considering the novelty of ADAS, I think it's not too much to ask that we understand and document the root cause of each accident so we can learn from them. If it's human behavior, then fine. But what if it's something technical?


----------



## Klaus-rf (Mar 6, 2019)

francoisp said:


> Airplanes rarely have accidents, but each one is investigated thoroughly to understand its root cause. That's how we find out about metal fatigue, worn-out electrical wiring, and bad software.
> 
> Three accidents with loss of life may not seem significant, but how many other accidents happened under ADAS? Considering the novelty of ADAS, I think it's not too much to ask that we understand and document the root cause of each accident so we can learn from them. If it's human behavior, then fine. But what if it's something technical?


 The whole issue is tainted by Tesla stating that AP has prevented >50 "accidents" every day. Which is total BS.


----------



## bwilson4web (Mar 4, 2019)

Klaus-rf said:


> The whole issue is tainted by Tesla stating that AP has prevented >50 "accidents" every day. Which is total BS.


How many has it prevented?

Bob Wilson


----------



## Power Surge (Jan 6, 2022)

Klaus-rf said:


> The whole issue is tainted by Tesla stating that AP has prevented >50 "accidents" every day. Which is total BS.


I could easily see Autopilot preventing hundreds, or even more than a thousand, accidents a day: not just avoiding accidents caused by the negligence of others, but also those caused by the Tesla owner's own issues, such as texting while driving or some other distraction. My car doesn't even have AP enabled, but I still have the AP safety features, such as correcting for drifting out of lane or reacting faster than you can when someone stops short in front of you. These have kicked in several times since I've had my car (8 months now).


----------



## Klaus-rf (Mar 6, 2019)

bwilson4web said:


> How many has it prevented?
> 
> Bob Wilson


There are two kinds of people in the world: those who can extrapolate from incomplete data.

We don't have ANY evidence one way or another. But current AP/EAP/FSD is NOT as good as an attentive driver.


> but I still have the AP safety features, such as correcting for drifting out of lane or reacting faster than you can when someone stops short in front of you. These have kicked in several times since I've had my car (8 months now).


My car also has those features. Never ONCE has it correctly warned or corrected for an actual upcoming "incident," although it has falsely warned hundreds of times, if not more, that a forward collision was imminent, with no traffic ahead for hundreds of feet at city speeds (40 MPH or less).

Poor automation is not better than no automation.

It's EXTREMELY important to note that it's NEVER just AP as AP always requires an "attentive" driver. Tesla has absolutely ZERO data showing safety of AP / EAP / FSD operating alone.


----------



## bwilson4web (Mar 4, 2019)

Klaus-rf said:


> We don't have ANY evidence one way or another. But current AP/EAP/FSD is NOT as good as an attentive driver.


Too bad there are so few ‘attentive drivers’ versus the other kind, for example those who drive beyond their physical limits in single-car accidents. I hear news reports of these several times a week.

Automated driving assistance is for a broad mix of human drivers. Not the smaller number of ‘attentive drivers’.

Bob Wilson


----------



## Klaus-rf (Mar 6, 2019)

bwilson4web said:


> Automated driving assistance is for a broad mix of human drivers. Not the smaller number of ‘attentive drivers’.


 Yes - that's the goal. But we're not there now.


----------



## bwilson4web (Mar 4, 2019)

Klaus-rf said:


> Yes - that's the goal. But we're not there now.


Any speculation on how our species will get there?

As with most human endeavors, I prefer experimenting and incremental improvement. For example, the Wright brothers did not start with an SR-71. We should never let perfect become the enemy of good enough.

Bob Wilson


----------



## Klaus-rf (Mar 6, 2019)

bwilson4web said:


> We should never let perfect become the enemy of good enough.


Agree. But, imho, we don't need any more AMC Pacers or another version of Microsoft / IBM OS/2.
<g>


----------



## bwilson4web (Mar 4, 2019)

Having driven Autopilot since 2019, I've seen it significantly improve. FSD beta brings that capability to urban streets. Not perfect, but that is why I test it, looking for edge cases: (1) left turns at controlled intersections with multiple, different, dashed, curved lines; and (2) driving directly into the Sun.

I look for reproducible cases when FSD allows me to test it. The ‘strike out’ feature limits my ability to test, but I’m not anxious about it since I am an unpaid tester. 😁

Bob Wilson


----------



## Klaus-rf (Mar 6, 2019)

bwilson4web said:


> Not perfect, but that is why I test it, looking for edge cases: (1) left turns at controlled intersections with multiple, different, dashed, curved lines; and (2) driving directly into the Sun.


IMHO those aren't "edge cases". That's normal, everyday driving; the sun thing happens twice every day (except in heavy fog, rain, etc., the same cases where FSD isn't particularly useful).

To me an edge case would be driving on a 4-lane limited-access highway between two semi trucks (with trailers) and coming up to an overpass where a vehicle crashes into the barrier and pieces of barrier (and car) fall down on the roadway in front of all of us. 

Yes, FSD has improved over the past 4 years but I keep hoping that maybe someday FSD will finally be able to just get out of my neighborhood. Not there yet.


----------



## bwilson4web (Mar 4, 2019)

Klaus-rf said:


> To me an edge case would be driving on a 4-lane limited-access highway between two semi trucks (with trailers) and coming up to an overpass where a vehicle crashes into the barrier and pieces of barrier (and car) fall down on the roadway in front of all of us.


Entertaining but I was looking for reproducible problems like the traffic intersections and driving into the sun.



Klaus-rf said:


> Yes, FSD has improved over the past 4 years but I keep hoping that maybe someday FSD will finally be able to just get out of my neighborhood. Not there yet.


I only have one neighborhood and when working, FSD has handled it well to the cross-town intersections. When my 'strike out' is reset, I'll see if I can video record it.

Bob Wilson


----------



## Klaus-rf (Mar 6, 2019)

bwilson4web said:


> Entertaining but I was looking for reproducible problems like the traffic intersections and driving into the sun.


But those aren't "edge cases", no? They're just normal, everyday driving (motoring).



> I only have one neighborhood and when working, FSD has handled it well to the cross-town intersections. When my 'strike out' is reset, I'll see if I can video record it.


 I realize it's a long drive, but you're always welcome to try my neighborhood. I'll keep a light on for you.


----------



## bwilson4web (Mar 4, 2019)

Klaus-rf said:


> I realize it's a long drive, but you're always welcome to try my neighborhood. I'll keep a light on for you.


Just take a screenshot from Google Maps/Apple Maps showing your neighborhood with the preferred route. I will need the scale but not the street names. I'll see if something like that exists in Huntsville that I can test after the "strike out" is removed.

Bob


----------

