# Tesla crash ... with no driver?



## tivoboy

Something tells me April, May, and probably June are off the table now for the full FSD beta rollout

https://www.theverge.com/2021/4/18/22390612/two-people-killed-fiery-tesla-crash-no-driver


----------



## slacker775

This one sounds like yet another collection of idiots that shouldn’t be allowed near any type of vehicle. Naturally the media will jump all over it but from what I’ve read, it’s no different than a dumbass putting his Ford Taurus on cruise control and tying a belt around the steering wheel and trying to surf on the roof. Maybe it’s just me...


----------



## lance.bailey

it's not just you. I think that Tesla should have an agreement with the Darwin awards.


----------



## Mike

Welp...

I'll assume they defeated the sensors in the driver's seat, so it will be interesting to see how the software will be updated to deal with idiots.


----------



## tivoboy

Mike said:


> Welp...
> 
> I'll assume they defeated the sensors in the driver's seat, so it will be interesting to see how the software will be updated to deal with idiots.


Frankly there are a LOT of sensors: the seat, the wheel, the camera in the cabin. I'm sort of surprised that Tesla hasn't included some sort of limp mode in the FSD beta that would quickly decelerate the car and move it to the side of the road in the event that A) nobody was in the driver's seat, B) no wheel input was given for more than two rounds of warnings, or C) the in-car AI, using the interior-mounted camera (yes, not in older S and X models for now, of course), indicated the driver was asleep or completely inattentive for X seconds. Just put it in limp mode, move to the side of the road, and stop.
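The conditions above boil down to a simple OR of three checks. Here's a minimal sketch of that trigger logic; every name, threshold, and signal here is invented for illustration and is in no way Tesla's actual implementation:

```python
# Hypothetical sketch of the "limp mode" trigger described above.
# All names and thresholds are made up; not Tesla's actual logic.
from dataclasses import dataclass

@dataclass
class CabinState:
    driver_in_seat: bool
    missed_wheel_nags: int      # consecutive "apply torque" warnings ignored
    driver_attentive: bool      # from the interior camera, where fitted
    inattentive_seconds: float  # how long the camera has seen no attention

def should_limp(state: CabinState, nag_limit: int = 2,
                inattentive_limit: float = 10.0) -> bool:
    """True if the car should decelerate and pull to the roadside."""
    return (
        not state.driver_in_seat                       # A) seat empty
        or state.missed_wheel_nags > nag_limit         # B) nags ignored
        or (not state.driver_attentive                 # C) camera says
            and state.inattentive_seconds > inattentive_limit)  # inattentive
    )
```

Any one condition tripping would be enough to start the slow-down-and-pull-over sequence.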


----------



## slacker775

It actually already does have that and has for quite some time. I think in this particular case, everyone is assuming that FSD is in use (pick your acronym) and it really sounds like it wasn’t at all. Maybe just TACC. Sounds like a couple of goofballs who had no idea what they were doing were trying to ‘be cool’ and failed miserably. Tesla will get all of the blame in the media because so few people understand how Teslas - and most specifically FSD in all its incarnations - operate.


----------



## tivoboy

slacker775 said:


> It actually already does have that and has for quite some time. I think in this particular case, everyone is assuming that FSD is in use (pick your acronym) and it really sounds like it wasn't at all. Maybe just TACC. Sounds like a couple of goofballs that had no idea what they were doing were trying to 'be cool' and failed miserably. Tesla will get all of the blame in the media because so few people understand how Tesla's - and most specifically FSD in all its incarnations - operate.


I have to say, and hopefully no offense to the dead, but if someone thought they would get the car to drive itself and only engaged plain set-speed cruise control, with no steering, well, they shouldn't be driving even if they were in the front seat at the wheel.


----------



## DocScott

We don't know from the story what mode they were in, of course. I very much doubt they had FSD beta. My _guess_ would be Autosteer was engaged, at least at first, because otherwise they were on a suicide mission from the start.

Here's a real possibility: they defeated the sensors and engaged TACC and Autosteer as a stunt. At some point, Autosteer realized it had something it couldn't figure out, signalled "take over immediately," and because no one was in the driver's seat, they couldn't. On top of that, the measures used to defeat the steering-wheel-torque sensor likely would turn the car once AP disengaged, making matters worse.

I would not be at all surprised if Tesla ends up reporting that "Autopilot disengaged 18 seconds before the crash" or something like that. If you're driving and AP disengages, 18 seconds is plenty of time to take control and proceed without a problem. If you're in the passenger seat with a weight tied to the wheel? Not so much...


----------



## FRC

Isn't TACC supposed to turn on flashers and pull over if no driver input is detected?


----------



## shareef777

FRC said:


> Isn't TACC supposed to turn on flashers and pull over if no driver input is detected?


Really all it takes is an ankle weight on the steering wheel and a dumbbell on the seat, and the vehicle won't know it's not a person.

But just like ANY technology, there'll be people that will bypass safety measures to the extreme that'll end in a fatality. It's unfortunate, but it's also why the Darwin Awards exist.


----------



## jsmay311

You don’t even need to defeat the seat occupancy sensor to prevent Autopilot from disengaging as long as the seatbelt remains buckled, right?


----------



## garsh

This accident occurred in a housing plan cul-de-sac. There are no lane lines, and autopilot currently will not engage unless there is at least one lane line. And given that it was a housing plan, the speed limit was probably 25 mph, and autopilot will refuse to go more than 5mph over the speed limit on local roads.

Therefore, I don't think autopilot was involved in this accident at all.

My current theory is that the driver fled the scene after the accident. His two passengers were unconscious and died in the fire that erupted after he fled.


----------



## iChris93

garsh said:


> My current theory is that the driver fled the scene after the accident. His two passengers were unconscious and died in the fire that erupted after he fled.


Is there any way to know? Tesla's black box is certainly destroyed.


----------



## francoisp

Apparently the firefighters had no idea how to handle the fire. With all the electric cars hitting the roads in the next few years, I think it's time for a refresher. By the way, 32,000 gallons of water were used.


----------



## slacker775

francoisp said:


> By the way 32,000 gallons of water were used.


This sounds like the next thing the oil industry is gonna use to add to their reasons not to go EV


----------



## francoisp

slacker775 said:


> This sounds like the next thing the oil industry is gonna use to add to their reasons not to go EV


Or a more appropriate fire retardant needs to be used or developed.


----------



## iChris93

slacker775 said:


> This sounds like the next thing the oil industry is gonna use to add to their reasons not to go EV


Or this 
"Reignition of the battery can be a problem, because unlike gas-powered vehicles, even if the fire is extinguished, an EV battery still has stored energy."
Gasoline still has stored energy when extinguished!


----------



## garsh

francoisp said:


> By the way 32,000 gallons of water were used.





francoisp said:


> Or a more appropriate fire retardant needs to be used or developed.


Water is pretty much the ideal way to fight a battery fire. The problem here is the way in which the water is being delivered.

For a battery fire, you use water to cool the battery. There's nothing to be done for the cells that are already on fire other than to let them burn up completely. The water just keeps nearby cells from reaching a temperature where they will ignite as well.

So you want a constant flow of water - not a high-pressure, high-volume stream of water. If the firefighters didn't realize that their objective was just to keep the rest of the cells cool, then they were probably blasting the car at full force, and using way more water than needed.


----------



## garsh

iChris93 said:


> Is there anyway to know? Tesla's black box is certainly destroyed.


I'll be curious to find out. Do Teslas automatically upload some telemetry when there's a bad accident?


----------



## GDN

If the car had a cellular connection, there should have been a constant live flow of data back to Tesla. It all just depends on how quickly the events happened. If he floored it two blocks back, he could have hit 100 in just a few seconds; hopefully the data all got uploaded.

I also know nothing about battery fires, but it seems logical that foam would also help smother the fire. Halon or the chemicals used in data centers could help, though they may not be effective in an open area. But even a battery fire should dwindle if you rob it of oxygen. There could be other factors at play.


----------



## lance.bailey

francoisp said:


> Apparently the firefighters had no idea how to handle the fire. With all the electric cars hitting the roads in the next few years, I think it's time for a refresher. By the way 32,000 gallons of water were used.


How does that compare to the amount of water usually used on a car fire? I have no idea if that is less, a bit more, or an order of magnitude more.


----------



## M3OC Rules

Coincidentally, last night right before I saw the Tesla story, I saw this story with some horrific video but you probably didn't see it.

https://minnesota.cbslocal.com/2021...-1-dead-in-lowry-tunnel-crash-in-minneapolis/


----------



## slacker775

M3OC Rules said:


> Coincidentally, last night right before I saw the Tesla story, I saw this story with some horrific video but you probably didn't see it.
> 
> https://minnesota.cbslocal.com/2021...-1-dead-in-lowry-tunnel-crash-in-minneapolis/


I bet there was a Tesla using Autopilot - probably nobody even in the car at all - that caused that accident but it fled the scene. Only plausible scenario!


----------



## garsh

GDN said:


> I also know nothing about battery fires, but it seems logical that foam would also help smother the fire.


You can't smother a battery fire. Starving it of oxygen does NOTHING.

Most "fires" are "rapid oxidation". You bring a fuel and oxygen together, give it a little spark or heat, and the material oxidizes. This also releases a lot of heat, which helps the oxidation spread.

Battery fires are different. There's (usually) no oxidation occurring. Oxygen isn't even required - this can happen in a vacuum. Instead, you have some other chemical reaction occurring (often due to a physical "short" in a cell) which releases a lot of heat. That heat makes it likely that other nearby cells start to undergo the same chemical reaction.

The only thing you can really do is try to keep all of the nearby cells cool until the failed cells have completed burning and cooled down.


----------



## Long Ranger

iChris93 said:


> Gasoline still has stored energy when extinguished!


Yeah, the energy in a fully charged 100 kWh battery pack is roughly equivalent to 3 gallons of gasoline. The way it burns is different, but there is far more stored energy in a typical gas tank.

EDIT: It looks like an estimate of 5 or 6 gallons of gasoline would be more accurate. I was only looking at the stored electrical energy in the pack. Factoring in the reaction energy from the cells themselves actually combusting, a fully charged 100 kWh pack might release ~150-200 kWh of energy based upon a quick look at some battery failure studies.
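The arithmetic behind both estimates is quick to check. This sketch uses the EPA's 33.7 kWh-per-gallon gasoline equivalence figure; the 150-200 kWh combustion range is the estimate from the post above, not a measured value:

```python
# Back-of-envelope check of the gasoline-equivalence numbers above.
KWH_PER_GALLON = 33.7  # EPA gasoline gallon equivalent, kWh per gallon

stored_electrical_kwh = 100  # fully charged pack, electrical energy only

# Electrical energy alone: roughly 3 gallons of gasoline.
gallons_electrical = stored_electrical_kwh / KWH_PER_GALLON
print(f"{gallons_electrical:.1f} gallons (electrical only)")

# Adding the cells' own combustion energy (the post's 150-200 kWh
# estimate): roughly 4.5 to 6 gallons of gasoline.
low = 150 / KWH_PER_GALLON
high = 200 / KWH_PER_GALLON
print(f"{low:.1f} to {high:.1f} gallons (including combustion)")
```

Either way, it's well under the energy of a typical full gas tank, which is the post's point.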


----------



## Long Ranger

Long Ranger said:


> Yeah, the energy in a fully charged 100 kWh battery pack is roughly equivalent to 3 gallons of gasoline. The way it burns is different, but there is far more stored energy in a typical gas tank.


Thinking about it a little more, maybe my statement isn't accurate. I was using 100 kWh as the energy stored in the battery. That's true when talking the normal battery chemical reaction, but when the lithium is combusting there's probably a lot more energy there.

EDIT: Edited above post to reflect this.


----------



## sterickson

https://www.cnn.com/2021/04/19/business/tesla-fatal-crash-no-one-in-drivers-seat/index.html
If this article is to be believed, why on Earth would someone think they could do this and *NOT* get killed?

(I don't know if this is the right place for this post, so feel free to move it, moderators.)


----------



## francoisp

lance.bailey said:


> how does that compare the amount of water usually used on a care fire? i have no idea if that is less, a bit more, or an order of magnitude more.


According to Tesla (see link below) 3000 gallons should be enough.

https://wildfiretoday.com/2018/05/1...take-3000-gallons-and-24-hours-to-extinguish/


----------



## lance.bailey

Ah, so they used an order of magnitude more. Well, better soaked than sorry.


----------



## Klaus-rf

Not much for useful info in that article. 

Although I find it curious that Texas has "Constables" instead of the more customary sheriffs, deputies, Rangers, etc.


----------



## fritter63

garsh said:


> This accident occurred in a housing plan cul-de-sac. There are no lane lines, and autopilot currently will not engage unless there is at least one lane line. And given that it was a housing plan, the speed limit was probably 25 mph, and autopilot will refuse to go more than 5mph over the speed limit on local roads.
> 
> Therefore, I don't think autopilot was involved in this accident at all.
> 
> My current theory is that the driver fled the scene after the accident. His two passengers were unconscious and died in the fire that erupted after he fled.


This. I can think of two scenarios more plausible than what they are claiming: the one you mentioned, or the driver climbed into the back seat to try to get out and then couldn't.

Will be interested to hear why they insist there was no driver, other than not finding anyone in the seat.

Have they never watched CSI?


----------



## slacker775

I saw some video on Twitter trying to prove that AP would enable without lines, but they failed to mention the road they were trying it on was 45mph speed limit. Vastly different than a neighborhood street. And I’m sure you’ve all seen Elon’s tweet that AP was not engaged and that vehicle didn’t even have FSD. Does make you wonder what these knuckleheads were even thinking!


----------



## iChris93

slacker775 said:


> but they failed to mention the road they were trying it on was 45mph speed limit. Vastly different than a neighborhood street.


My car thinks the streets in my neighborhood are 45 mph streets even though they're 25.


----------



## fritter63

slacker775 said:


> . And I'm sure you've all seen Elon's tweet that AP was not engaged and that vehicle didn't even have FSD.


there it is! Looks like those Texas cops have a "jump to conclusions" mat! 😎


----------



## shareef777

iChris93 said:


> My car thinks the streets in my neighborhood are 45 mph streets even though they're 25.


I'd rather take that than my streets being 45 and the car thinking they're 25. Sudden and unexpected deceleration is a sure recipe for a rear-end accident if the person behind you isn't paying attention.


----------



## victor

francoisp said:


> Apparently the firefighters had no idea how to handle the fire. With all the electric cars hitting the roads in the next few years, I think it's time for a refresher. By the way 32,000 gallons of water were used.




https://twitter.com/i/web/status/1384323386373906434


----------



## shareef777

victor said:


> https://twitter.com/i/web/status/1384323386373906434


Looking at the picture it seems that the fire would have gone out on its own. THERE WAS NOTHING LEFT TO BURN!

Either the car was burning for hours before the FD got there or they were trying to put it out for more than a few minutes. You don't go from a whole car to less than a chassis/frame in an hour, let alone a few minutes.


----------



## francoisp

slacker775 said:


> I saw some video on Twitter trying to prove that AP would enable without lines, but they failed to mention the road they were trying it on was 45mph speed limit. Vastly different than a neighborhood street. And I'm sure you've all seen Elon's tweet that AP was not engaged and that vehicle didn't even have FSD. Does make you wonder what these knuckleheads were even thinking!


Regarding the lane-lines thing: after I've engaged Autopilot, if the lines disappear, Autopilot stays engaged. In my area this is not a common occurrence, but it does happen, mostly on newly redone road surfaces. So what I'm saying is that it's possible Autopilot was engaged before the car reached that stretch of road.

Now, Musk says the car wasn't on Autopilot, and I believe him. Someone posted the scenario that after the accident the driver moved to the back to try exiting that way, which seems a real possibility.


----------



## M3OC Rules

Regardless of facts or any sort of analysis we get this:

https://www.latimes.com/business/st...-autopilot-kills-two-where-are-the-regulators
Note that the link has what I assume is the original headline, which may be factually false. Yet that's what the article is about anyway.


----------



## JasonF

Here's my scenario:

There was a driver who enabled what he thought was Autopilot on a straight stretch of road, and set the speed to something high like 70 mph. Then the driver climbed over to the passenger seat or rear seat. Autopilot didn't complain or disengage because it wasn't Autopilot - it was cruise control (TACC). At the next bend in the road, with no autosteer, the car went airborne over a square curb at speed and smashed head-on into a pole. Because it was a neighborhood street, even if the car did deliver a warning about the driver not being detected, it simply couldn't slow in time to not smash into the pole.

Why do I think that?

1. It's easy to ignore the "beep-beep-beep" that warns you Autopilot can't be engaged because the street doesn't have markings or whatever - but that means the car is now in TACC mode and will still accelerate up to speed.

2. Emergency braking generally won't jam full on the brakes unless an imminent obstacle is detected. Simply detecting that the driver has left the driver's seat would cause a slow deceleration and stop. Unfortunately, decelerating from something like 70 mph to a speed a residential street curve is designed for (25-30 mph) is not going to happen fast enough.

3. By the time emergency braking realizes a crash with a pole is imminent, two of the wheels are either not touching the ground anymore, or are on soft material like grass. So the car is going to hit at nearly full force.

If I'm close to reality, it means _any vehicle with cruise control built in_ can kill its passengers in exactly the same way.


----------



## Needsdecaf

I would like to add some context since I live very close as the crow flies from the crash site, and in a neighborhood with similar road design and construction.



Klaus-rf said:


> Not much for useful info in that article.
> 
> Although I find it curious that Texas has "Constables" instead of the more customary sheriffs, deputies, Rangers, etc.


It's complicated. It has to do with the fact that our counties are divided into various political precincts that do not correspond with individual towns or cities. Both the Harris County Sheriff's Office and the Harris County Precinct 4 Constable's office patrol this area. The Precinct 4 Constables have primary authority, followed by HCSO.



fritter63 said:


> This. I can think of two scenarios more plausible than what they are claiming: the one you mentioned, or the driver climbed into the back seat to try to get out and then couldn't.
> 
> Will be interested to hear why they insist there was no driver, other than not finding anyone in the seat.
> 
> Have they never watched CSI?


The two passengers were 59 and 69 respectively. No one was hopping over the seat after setting AP.



slacker775 said:


> I saw some video on Twitter trying to prove that AP would enable without lines, but they failed to mention the road they were trying it on was 45mph speed limit. Vastly different than a neighborhood street. And I'm sure you've all seen Elon's tweet that AP was not engaged and that vehicle didn't even have FSD. Does make you wonder what these knuckleheads were even thinking!


Yes. This video someone posted is bunk. The streets in this (gated) community are exactly the same as the ones in mine. When you're on those slower streets, AutoSteer won't engage as there are no lane lines.



victor said:


> https://twitter.com/i/web/status/1384323386373906434


I trust the fire chief here.


JasonF said:


> Here's my scenario:
> 
> There was a driver who enabled what he thought was Autopilot on a straight stretch of road, and set the speed to something high like 70 mph. Then the driver climbed over to the passenger seat or rear seat. Autopilot didn't complain or disengage because it wasn't Autopilot - it was cruise control (TACC). At the next bend in the road, with no autosteer, the car went airborne over a square curb at speed and smashed head-on into a pole. Because it was a neighborhood street, even if the car did deliver a warning about the driver not being detected, it simply couldn't slow in time to not smash into the pole.
> 
> Why do I think that?
> 
> 1. It's easy to ignore the "beep-beep-beep" that warns you Autopilot can't be engaged because the street doesn't have markings or whatever - but that means the car is now in TACC mode and will still accelerate up to speed.
> 
> 2. Emergency braking generally won't jam full on the brakes unless an imminent obstacle is detected. Simply detecting that the driver has left the driver's seat would cause a slow deceleration and stop. Unfortunately, decelerating from something like 70 mph to a speed a residential street curve is designed for (25-30 mph) is not going to happen fast enough.
> 
> 3. By the time emergency braking realizes a crash with a pole is imminent, two of the wheels are either not touching the ground anymore, or are on soft material like grass. So the car is going to hit at nearly full force.
> 
> If I'm close to reality, it means _any vehicle with cruise control built in_ can kill its passengers in exactly the same way.


This makes no sense as soon as you look at the actual street. The vehicle was traveling east to west and crashed into the trees on the north side of the road between the two lakes.










There was zero chance that TACC was set and the driver tried to climb out. Moreover, if TACC was somehow accidentally engaged, the driver would have mashed the brakes long before it careened into the trees.


----------



## TrevP

Tesla has pulled the logs, and they show AP was NOT engaged. Furthermore, the car did not have FSD purchased either. This is a case of driver error and going too fast.


https://twitter.com/i/web/status/1384254194975010826


----------



## Long Ranger

Needsdecaf said:


> The two passengers were 59 and 69 respectively. No one was hopping over the seat after setting AP.


I don't think @fritter63 was suggesting anything about AP. I think the suggested scenario was 1) driver loses control and crashes 2) driver can't exit through door 3) driver climbs into back seat in attempt to escape.


----------



## tivoboy

I was thinking about this yesterday. Assuming no AP was engaged and this wasn't a case of trying to trick FSD (everything points that way, since apparently the car did not have FSD purchased and Tesla says AP was not engaged), it could come down to driver error: too much speed, trying out Ludicrous mode on a curve, and the car getting out of control and crashing into the tree at high speed. After that, it's possible that the integrity of the A/B pillar was so compromised that the driver couldn't get out through the front door, and if there was a passenger in the passenger seat, the driver may have tried to get out through the back doors. At that point, those too may have been jammed or forced shut, or the driver could have been entangled in the seat belt, or the fire could already have been too much, so the driver couldn't get out at ALL and both perished in the fire.


----------



## JasonF

Needsdecaf said:


> This makes no sense as soon as you look at the actual street. The vehicle was traveling E to West and crashed in the trees on the North side of the road between the two lakes.


I took my best guess based on how I use AP and know that it operates. But to paraphrase the old Sherlock Holmes saying: once all reasonable explanations are exhausted, whatever remains, however improbable, must be the answer. Which would mean those two somehow used an exploit to activate Autopilot and get it to drive the car at high speed on a residential street while no one was in the driver's seat.

If that's true, though, that's really bad. Because as I've mentioned before, govt agencies tend to fix things with a sledgehammer, using phrases like "How do we make certain this can never happen again?". And that means the NHTSA might require Tesla to deactivate Autopilot across its entire fleet until they can make it impossible to exploit. And Tesla, placing its bets on FSD, and faced with possibly high costs to fix AP, might elect just to leave it disabled until FSD is available, rather than spend resources adapting AP and possibly having to install new hardware on all of their vehicles.


----------



## fritter63

JasonF said:


> Which would mean those two somehow used an exploit to activate Autopilot and get it to drive the car at high speed on a residential street while no one was in the driver's seat.


ha! I actually had the same thought...... was the other victim a software engineer?


----------



## JasonF

fritter63 said:


> ha! I actually had the same thought...... was the other victim a software engineer?


I am a software engineer...but that's not really necessary to exploit it. It's just getting the steps right.

I thought about it some more, and I have a new theory. I think one of them noticed that if they drove past a specific spot on their street, it would allow Autopilot to activate. I think they meant to be safe about it and were going very slowly, like less than 5 mph. Then one of them left the driver's seat, in no particular hurry, because the car wasn't going very fast and was steering OK. I now think that where it went wrong is the car veered toward the curb, one passenger reached over to grab the steering wheel and accidentally bumped the controls, and then it all went horribly wrong in an instant, before any of them had time to react.

I also believe in the possibility that the kind of showing off they did was because they had been drinking, so it seemed like a perfectly logical idea to climb out of the driver's seat while the car is moving.

This theory is based on one particular possible exploit that I haven't tested, but might work, based on the fact that it pits Autopilot's safety systems against each other. I don't want to get too specific about that mostly because I don't really want a bunch of people trying it out and crashing this week. If it doesn't work, I'll definitely be happy that Tesla thought of it already!


----------



## garsh

garsh said:


> Water is pretty much the ideal way to fight a battery fire. The problem here is the way in which the water is begin delivered.
> 
> For a battery fire, you use water to cool the battery. There's nothing to be done for the cells that are already on fire other than to let them burn up completely. The water just helps nearby cells from reaching a temperature where they will ignite as well.
> 
> So you want a constant flow of water - not a high-pressure, high-volume stream of water. If the firefighters didn't realize that their objective was just to keep the rest of the cells cool, then they were probably blasting the car with full-force, and using way more water than needed.


It appears the media reports of the firefighters' incompetence in fighting this particular fire were also false.
Karen Rei (who doesn't hang out here much anymore, but still posts on Slashdot) posted the following.

To quote the fire chief:

> "(It) was heavily involved in flames. When the fire was put out, it was noticed there were two bodies (inside) and they were deceased," Buck added. "They continued extinguishment of the woods around (the car), putting out the trees and pine needles and what have you. I was there probably five to 10 minutes after that and at that point, every once in a while, the (battery) reaction would flame and it was mainly keeping water pouring on the battery."
>
> "With respect to the fire fight, unfortunately, those rumors grew way out of control. It did not take us four hours to put out the blaze. Our guys got there and put down the fire within two to three minutes, enough to see the vehicle had occupants," Buck said of inaccurate claims the vehicle burned for hours. "After that, it was simply cooling the car as the batteries continued to have a chain reaction due to damage."
>
> Buck said what is termed in the firefighting profession as "final extinguishment" of the vehicle - a 2019 Tesla - took several hours, but that classification does not mean the vehicle was out of control or had live flames. The term is mostly used in relation to structure or wildland forest fires, where hot ash that seems extinguished or is buried can later reignite other material and begin burning again.
>
> "We could not tear it apart or move it around to get 'final extinguishment' because of the fact that we had two bodies in there and it was then an investigation-slash-crime scene," Buck explained. "We had to keep it cool, were on scene for four hours, but we were simply pouring a little bit of water on it. It was not because flames were coming out. It was a reaction in the battery pan. It was not an active fire."
>
> He also noted:
>
> "It is our job to keep up with the newest technologies, whether it is electric cars or other newer vehicles. They have strengthened unibodies; some of the framework they use is (high-tech) steel. The old 'jaws of life' will not cut through that. The 'jaws of life' would not have even made a dent in this car," Buck said of the Tesla. "We have had to upgrade tools and upgrade our training and processes."

So it sounds like the firefighters had no problems with this particular vehicle fire, and knew exactly how to handle it correctly.
I wonder where these reports of using a swimming pool's volume of water came from?

Here's her original post on Slashdot:
https://tech.slashdot.org/comments.pl?sid=18717142&cid=61293400


----------



## Maxpilot

JasonF said:


> I am a software engineer...but that's not really necessary to exploit it. It's just getting the steps right.
> 
> I thought about it some more, and I have a new theory. I think one of them noticed that if they drive by a specific spot on their street, it would allow Autopilot to activate. I think they meant to be safe about it, and were going very slow, like less than 5 mph. Then one of them left the driver's seat, in no particular hurry, because the car wasn't going very fast, was steering ok. I now think that where it went wrong is the car veered toward the curb, and one passenger reached over to grab the steering wheel, and accidentally spun the scroll wheel. And then it all went horribly wrong in an instant, before any of them had time to react.
> 
> I also believe in the possibility that the kind of showing off they did was because they had been drinking, so it seemed like a perfectly logical idea to climb out of the driver's seat while the car is moving.
> 
> This theory is based on one particular possible exploit that I haven't tested, but might work, based on the fact that it pits Autopilot's safety systems against each other. I don't want to get too specific about that mostly because I don't really want a bunch of people trying it out and crashing this week. If it doesn't work, I'll definitely be happy that Tesla thought of it already!


The Model S does not use a scroll wheel to change the set speed. There is a lever on the left side, in front of the turn-signal stalk, that you move up or down to change the set speed. It would be very difficult to reach it from any seat other than the driver's seat.


----------



## JasonF

Maxpilot said:


> The Model S does not use the scroll wheel to change set speeds. There is a lever on the left side, in front of the turn signal lever, that moves up or down to change the set speed. It would be very difficult to reach it from any seat other than the driver's seat.


That only changes the scenario so that it was probably the back seat passenger that reached for the wheel. If they reached partially through it and bumped the stalks, that could have adjusted the speed.


----------



## Taney71

I hate this story. Unfortunately I got sucked into a Reddit conversation about Tesla and autopilot, etc. that used an article about the need to regulate these self-driving systems. The article itself is trash, and the Reddit conversation was equal parts Tesla bashing and misguided comments based on a poor understanding of the Tesla autopilot system and the recent Tesla car crash.


----------



## Maxpilot

This accident appears more like the owner showing off ludicrous mode and losing control... probably on their way out of the neighborhood to play with autopilot. I am more concerned with how the battery exploded and caused such an intense fire in a short amount of time. I think that should be Tesla's main concern. I have had more friends ask about that than about weird autopilot tricks.


----------



## atebit

Taney71 said:


> The article itself is trash and the Reddit conversation was equal parts Tesla bashing and misguided comments based on a poor understanding of the Tesla autopilot system and the recent Tesla car crash.


Interesting how social media and knee-jerk media reactions are "bad" when you're on the other side of an issue.


----------



## JasonF

Taney71 said:


> I hate this story. Unfortunately I got sucked into a Reddit conversation about Tesla and autopilot, etc. that used an article about the need to regulate these self-driving systems. The article itself is trash, and the Reddit conversation was equal parts Tesla bashing and misguided comments based on a poor understanding of the Tesla autopilot system and the recent Tesla car crash.


There are two issues here to contend with:

One is an old issue, that Americans tend to see a tragedy and then immediately ask the question, "How can we make certain this never ever happens again?" The reality answer is, you can't, unless you stop people from driving entirely. But when a tragedy becomes high profile enough, people ask that question loudly, and the government feels it has to step in and show that they have a way to prevent it from ever happening. That often happens with a sledgehammer - which is why since early on, I kept saying that if crashes caused by idiots keep happening, we could see a time where Self Driving and Autosteer are common in Europe but banned in the United States. Because in the U.S., we protect idiots at all costs!

The second issue is a more recent one. As a society right now we're moving in the direction of China, not politically, but ideologically. More and more people are adopting the idea that government and big corporations are always right. If government is looking into whether Tesla's Autopilot is unsafe, then it _must be unsafe_, and should be banned immediately with no debate or discussion. Not because discussion or debate are illegal, but because it puts you on the _wrong side._ What might help in this respect is that General Motors has SuperCruise, and as a big company, they're also always right - so the government might think twice about banning all self driving technology, because GM might just be a big enough corporation to be more right than the government.


----------



## Klaus-rf

JasonF said:


> 1. It's easy to ignore the "beep-beep-beep" that warns you Autopilot can't be engaged because the street doesn't have markings or whatever - but that means the car is now in TACC mode and will still accelerate up to speed.


My M3 often will allow AP to be engaged in my neighborhood without lane markings (however there is a hard curb on the right side, but no center or side paint markings) but it often confuses the speed limit.

If I have AP engaged on the feeder street (proper lane markings and 45MPH posted), then disengage to enter the neighborhood "side streets", then re-engage AP, it will re-engage at 45MPH. With recent software updates it will detect the 25MPH limit sign and adjust speed properly (after it passes the sign). But AP (auto-steer) remains engaged - with no lane markings.


----------



## Needsdecaf

JasonF said:


> I am a software engineer...but that's not really necessary to exploit it. It's just getting the steps right.
> 
> I thought about it some more, and I have a new theory. I think one of them noticed that if they drive by a specific spot on their street, it would allow Autopilot to activate. I think they meant to be safe about it, and were going very slow, like less than 5 mph. Then one of them left the driver's seat, in no particular hurry, because the car wasn't going very fast, was steering ok. I now think that where it went wrong is the car veered toward the curb, and one passenger reached over to grab the steering wheel, and accidentally spun the scroll wheel. And then it all went horribly wrong in an instant, before any of them had time to react.
> 
> I also believe in the possibility that the kind of showing off they did was because they had been drinking, so it seemed like a perfectly logical idea to climb out of the driver's seat while the car is moving.
> 
> This theory is based on one particular possible exploit that I haven't tested, but might work, based on the fact that it pits Autopilot's safety systems against each other. I don't want to get too specific about that mostly because I don't really want a bunch of people trying it out and crashing this week. If it doesn't work, I'll definitely be happy that Tesla thought of it already!


You're grasping at straws.

Autopilot will not activate on these streets. Period. I live in the same village within The Woodlands, with IDENTICAL streets, that are unmarked. AP will not activate. You get the "unavailable at this time" message with the yellow warning and the beeping if you try.

Moreover, there is less than 400' from the home to the crash site. There would be NO time for even a Chinese gymnast to set AP (if it could be, which it couldn't) and then climb into the back of the car while the car accelerated 400' up to a speed high enough to crash into a tree and catch fire.

And that's if you tricked AP into activating on unmarked streets. And also tricked the car into thinking you were driving.

This was a 59 year old Dr. and a 69 year old financial planner. They had just dropped their wives off at home. You mean to tell me that two wealthy white dudes decided after a dinner with their wives on a Saturday night around 9:00, that they were going to go out and trick autopilot, on a road where autopilot couldn't be activated, and in the middle of a curve in the middle of their gated neighborhood? Come on. Even if they were going to do that, they would have chosen a straight road!

People's husbands, fathers and grandfathers are now deceased. Can we please stop with the silly theories that they somehow tricked Autopilot.


----------



## JasonF

Needsdecaf said:


> People's husbands, fathers and grandfathers are now deceased. Can we please stop with the silly theories that they somehow tricked Autopilot.


There's a problem with just accepting that it happened through forces beyond our understanding and moving on, though. That problem is it comes with consequences to all of us who have Autopilot and/or paid for it. If it's labeled as an unsolvable problem that happened but can't or shouldn't be explained, and people died because of it, that means it's by nature a _defect_. And if it's an unfixable defect, that means Tesla has to disable it for everyone.


----------



## victor

New/additional information

https://threadreaderapp.com/thread/1384562198987501573.html


----------



## Madmolecule

Needsdecaf said:


> You're grasping at straws..
> 
> This was a 59 year old Dr. and a 69 year old financial planner. They had just dropped their wives off at home. You mean to tell me that two wealthy white dudes decided after a dinner with their wives on a Saturday night around 9:00, that they were going to go out and trick autopilot, on a road where autopilot couldn't be activated, and in the middle of a curve in the middle of their gated neighborhood? Come on. Even if they were going to do that, they would have chosen a straight road!
> 
> People's husbands, fathers and grandfathers are now deceased. Can we please stop with the silly theories that they somehow tricked Autopilot.


The investigators are very clear that no one was in the driver's seat. Autopilot or not, the thought that the two drunk rich white guys might be doing something stupid after they drop their wives off is not hard to believe. The fact that it happened in Texas makes it even easier to believe. The wife is usually the accident-avoidance mode that keeps you from following through with your great ideas.
I doubt it's the first time the investigators have seen an accident. Bodies do get tossed during an accident, but they seem very confident about the people's locations. They may not know how to put out an electrical fire, and they don't know much about autopilot, but I think they can tell where somebody was sitting, especially if they're still strapped in.

All Darwin award winners have parents. Many of them have actually figured out how to breed.


----------



## GDN

JasonF said:


> There are two issues here to contend with:
> 
> One is an old issue, that Americans tend to see a tragedy and then immediately ask the question, "How can we make certain this never ever happens again?" The reality answer is, you can't, unless you stop people from driving entirely. But when a tragedy becomes high profile enough, people ask that question loudly, and the government feels it has to step in and show that they have a way to prevent it from ever happening. That often happens with a sledgehammer - which is why since early on, I kept saying that if crashes caused by idiots keep happening, we could see a time where Self Driving and Autosteer are common in Europe but banned in the United States. Because in the U.S., we protect idiots at all costs!
> 
> The second issue is a more recent one. As a society right now we're moving in the direction of China, not politically, but ideologically. More and more people are adopting the idea that government and big corporations are always right. If government is looking into whether Tesla's Autopilot is unsafe, then it _must be unsafe_, and should be banned immediately with no debate or discussion. Not because discussion or debate are illegal, but because it puts you on the _wrong side. _What might help in this respect is that General Motors has SuperCruise, and as a big company, they're also always right - so the government might think twice about banning all self driving technology, because GM might just be a big enough corporation to be more right than the government.


I just knew that one of your two issues had to be inept journalism and the ability for anyone to write a story and get it to the masses with little to no facts. Blame autopilot - you must be right, and make someone prove you wrong. With two dead people and a car that far gone, there is one and only one way of knowing if AP/EAP/FSD or anything else was engaged, and that is Tesla. That doesn't stop the speculation, however, and once it's written in an article like that it just becomes gospel. No one ever reads the follow-up to learn it wasn't true. I think we should have easier libel laws to go after articles like that.


----------



## Bigriver

JasonF said:


> There's a problem with just accepting that it happened through forces beyond our understanding and moving on, though. That problem is it comes with consequences to all of us who have Autopilot and/or paid for it.


I don't think anyone is trying to dismiss this horrible accident. I just think there are a number of us who believe it had absolutely nothing to do with autopilot.

Elon said autopilot wasn't engaged.
It was a residential road in which autopilot (autosteer) cannot be engaged.
The accident happened only 500 ft from the house. They were estimated to be going over 60 mph at the time of the crash.
It was a model S where the TACC/autopilot controls are very well tucked away and not accessible from anywhere but the driver's seat.
The owner had not purchased FSD. This tells me he is not likely to be smitten by autopilot and not likely showing off any aspect of it.
However, it was a performance model S with ludicrous mode, that also has a cheetah launch mode. This seems a much more likely aspect that was being played with. If the owner was seatbelted in somewhere other than the driver seat, it seems most plausible to me that a non-experienced friend was trying ludicrous and/or cheetah for the first time. Or maybe even just the amazing basic acceleration of a performance model S. And he lost control. And he managed to get out after the crash. Yes, I'm just speculating. But there is only one important unknown here - who was driving? We know someone had to be. And that detail will be figured out, then there will be no mystery and this will simply be one of many tragic accidents caused by human error.


----------



## JasonF

GDN said:


> I just knew that one of your two issues had to be inept journalism and the ability for anyone to write a story and get it to the masses with little to no facts. Blame autopilot - you must be right, and make someone prove you wrong. With two dead people and a car that far gone, there is one and only one way of knowing if AP/EAP/FSD or anything else was engaged, and that is Tesla. That doesn't stop the speculation, however, and once it's written in an article like that it just becomes gospel. No one ever reads the follow-up to learn it wasn't true. I think we should have easier libel laws to go after articles like that.


Part of it was also that big corporations are always right. Tesla is still a relatively small company in the automotive world, and therefore is always perceived as being in the wrong or unsafe or scary. After all, it's the company that not only makes electric cars that catch fire all the time and can't be put out, but then takes even more chances with even scarier tech like Autopilot, which kills people all the time. They want more people to see that their own fears are true, that EVs and Tesla are dangerous, and need to be banned, and we should go back to nice, safe gasoline powered cars like everyone else.

I'm stating it that way only to emphasize why none of us could ever win an argument with some of those people. It's because they believe they're on the side of right and safety, and as Tesla buyers/fans, we're in the wrong and getting people hurt.


----------



## JasonF

Bigriver said:


> I don't think anyone is trying to dismiss this horrible accident. I just think there are a number of us who believe it had absolutely nothing to do with autopilot.


I don't think it did, either. The only thing I'm very certain of is that it had to have involved a lot of alcohol, which would explain in any case why the driver's seat was empty. Where the driver went isn't explained, nor whether they left or moved seats, nor why the driver's airbag didn't go off.


----------



## Needsdecaf

JasonF said:


> There's a problem with just accepting that it happened through forces beyond our understanding and moving on, though. That problem is it comes with consequences to all of us who have Autopilot and/or paid for it. If it's labeled as an unsolvable problem that happened but can't or shouldn't be explained, and people died because of it, that means it's by nature a _defect_. And if it's an unfixable defect, that means Tesla has to disable it for everyone.


Yes, so we should be shouting from the rooftops all the reasons why it WASN'T Autopilot. Which almost 100% certainly it wasn't.

What we should not be doing is inventing silly ways in which it could have possibly been Autopilot's fault. Which, if you look at it would have required:

1. A failure of Autopilot in that it engaged on a road where it wasn't supposed to and
2. A complete bypass of all the other Autopilot safety locks (butt in seat, minimum speed etc.) and
3. A 59 year old to have jumped from the front seat of a moving vehicle into the rear in less than say 5-6 seconds.

This is all so impossible that I don't understand why people are trying to say "it could have happened this way". And making it worse for Tesla and owners?

No one is "just accepting" anything. It's been thought through and analyzed.


----------



## Needsdecaf

Madmolecule said:


> The investigators are very clear that no one was in the driver's seat. Autopilot or not, the thought that the two drunk rich white guys might be doing something stupid after they drop their wives off is not hard to believe. The fact that it happened in Texas makes it even easier to believe. The wife is usually the accident-avoidance mode that keeps you from following through with your great ideas.
> I doubt it's the first time the investigators have seen an accident. Bodies do get tossed during an accident, but they seem very confident about the people's locations. They may not know how to put out an electrical fire, and they don't know much about autopilot, but I think they can tell where somebody was sitting, especially if they're still strapped in.
> 
> All Darwin award winners have parents. Many of them have actually figured out how to breed.


Except that I live less than a mile from the crash site. I am familiar with the subdivision, and the type of people who live there. I'm familiar with the road construction. The Constable is my constable.

The "investigators" (read: Constable Hermann, who spoke before any investigation was done) said they concluded no one was driving because no one was in the driver's seat, and the deceased in the rear seat was in a seated position. It's pretty clear that he, and the rest of the Constable's office, jumped to conclusions based on a false understanding of what Teslas can do.

Also, it's already been stated that the "fact" regarding the intensity of the fire was mis-stated. This has been corrected by the fire department representative Palmer Buck. I have corroborated it through other sources.

Finally, if you look at the actual location of the crash, it happened less than 500 feet from the owner's driveway. There would not have been enough time to fool autopilot.

Did they get in trouble by doing something stupid? Obviously, they managed to crash in a residential subdivision and die. Whatever they did it was monumentally stupid. But why we keep speculating that Autopilot was involved and these guys somehow tricked it is the point I'm trying to make. It was NOT.


----------



## Needsdecaf

JasonF said:


> I don't think it did, either. The only thing I'm very certain of is that it had to have involved a lot of alcohol, which would explain in any case why the driver's seat was empty. Where the driver went isn't explained, and if they left or moved seats isn't, and why the driver's airbag didn't go off isn't.


You're speculating again. You have no idea how much, if any alcohol, was involved. You mean to tell me that you've never had a car get away from you?

And where are you getting information that the driver's airbag didn't go off? I haven't seen that reported at all.


----------



## Madmolecule

It is also very possible that the owner did not pay his monthly premium connectivity subscription and the car entered a self-destruct mode. Sometimes a breathalyzer isn't needed. I am not concerned with which substance, but it is hard to argue that they were operating at full capacity.
There have been many supercar wrecks in their neighborhood. Tiger Woods has actually become pretty competent at wrecking in a subdivision and walking away. I am more concerned that Tesla will use it as an excuse not to release FSD because we can't behave.


----------



## M3OC Rules

JasonF said:


> One is an old issue, that Americans tend to see a tragedy and then immediately ask the question, "How can we make certain this never ever happens again?"


I disagree. On the same night, an ICE BMW in MN hit a guard rail and burst into flames, killing 2 people. That didn't make the national news, and I hear no one talking about making better gas tanks. This is a story because it's Tesla. Journalists make money by writing about Tesla. The problem is people use this information as confirmation bias. If they are negative towards Tesla, autopilot, or EVs, this gives them arguments against the brand/technologies without knowing the actual facts. The facts will come out later, but the damage is done and the journalists made their money.


----------



## JasonF

Needsdecaf said:


> Yes, so we should be shouting from the rooftops all the reasons why it WASN'T Autopilot. Which almost 100% certainly it wasn't.


I'm not insisting it was, I'm guessing at possible holes that could have been exploited. If they fail testing, that's a good thing. It carries more weight to say all of the possibilities have been gone through.



Needsdecaf said:


> You're speculating again. You have no idea how much, if any alcohol, was involved. You mean to tell me that you've never had a car get away from you?
> 
> And where are you getting information that the driver's airbag didn't go off? I haven't seen that reported at all.


I haven't had a car get away from me, fortunately. What says alcohol to me though is the fact that they left a house after a supposed gathering, and thought it would be fun to drive crazy speeds on a road that's not designed for it.

It was either someone above in this thread, or one of the links that were posted which mentioned that the driver's airbag didn't go off and that it was evidence that no one was in that seat at the time of the crash.


----------



## Mike

victor said:


> New/additional information
> 
> https://threadreaderapp.com/thread/1384562198987501573.html


IMO this is more logical than all the theories about tricking the car to drive with no one in the drivers seat.


----------



## M3OC Rules

Mike said:


> IMO this is more logical than all the theories about tricking the car to drive with no one in the drivers seat.


Am I missing something? That thread doesn't seem to provide a lot of conclusive info or a theory. The only info is "it is thought that based on the crash site, condition of the vehicle, and positioning of the victims that they WERE belted in." If that's true it would mean no one was in the driver seat when it crashed or they put on their seatbelt in the backseat after the crash after moving from the front seat to the backseat. I guess I'm not sure if they have ruled out a third person in the car. But that doesn't sound like very conclusive evidence that they were belted in.


----------



## TomT

https://arstechnica.com/cars/2021/0...opilot-works-with-no-one-in-the-drivers-seat/


----------



## Mike

M3OC Rules said:


> Am I missing something? That thread doesn't seem to provide a lot of conclusive info or a theory. The only info is "it is thought that based on the crash site, condition of the vehicle, and positioning of the victims that they WERE belted in." If that's true it would mean no one was in the driver seat when it crashed or they put on their seatbelt in the backseat after the crash after moving from the front seat to the backseat. I guess I'm not sure if they have ruled out a third person in the car. But that doesn't sound like very conclusive evidence that they were belted in.


It's the concept of Occam's Razor being implied: that there was a third person behind the wheel and that they escaped, versus the need for all sorts of fantastical chains of events... kind of like the "60 Minutes" hit piece on "unintended acceleration" of early 1980s Audis.

https://threadreaderapp.com/thread/1384562198987501573.html
The Tesla logs should show whether the driver's seat was occupied for what was essentially a 1/10-mile-long trip.


----------



## lance.bailey

no one, Tesla included, can protect from purposeful stupid.

I am not saying that is what happened in this horrible accident.

I am saying that the ability to purposefully and stupidly defeat safety mechanisms in place around a potentially dangerous technology can never be stopped by more technology. Let Darwin do his work.


----------



## JasonF

I did a test today with Autopilot just to see how far it could be exploited. I did it completely safely, but there were some aspects of what I got away with that I probably should report to Tesla as being a _potential_ issue.

But that issue is not related to this crash! Because as a result of that test, I discovered that it's very difficult to impossible to drive dangerously on an average residential street* with an Autopilot equipped car without human intervention. It will simply refuse to go fast enough to crash itself. I now believe there's no way to have caused a crash like that without a human foot on the pedal, and I take back everything I said about using the scroll wheel/speed setting to make it become dangerous.

* Those 8-car wide California neighborhood streets don't count. I'm sure AP might go just as fast as human drivers do on those.


----------



## fritter63

lance.bailey said:


> no one, Tesla included, can protect from purposeful stupid.
> 
> I am not saying that is what happened in this horrible accident.
> 
> I am saying that the ability to purposefully and stupidly defeat safety mechanisms in place around a potentially dangerous technology can never be stopped by more technology. Let Darwin do his work.


And we're to the point where what's most important in this debate is that *FSD was not involved. *So the entire premise (that FSD was at fault) is null.


----------



## tivoboy

TomT said:


> https://arstechnica.com/cars/2021/0...opilot-works-with-no-one-in-the-drivers-seat/


That video pretty much sums up clear weaknesses and gaps in the safety features that SHOULD currently still be in place.


----------



## gaduser

I have much respect for Consumer Reports.
However, it seems clear that the 'driverless Y' test could have been of the car, the driver, or both.

Is there a faultless system that protects an ill-equipped driver from himself?

A video screen, with dancing balls, might be implemented to satisfy optical detection.

Causing just about any car to move, WITHOUT OCCUPANTS, seems like an interesting, though not enormously challenging game.

That summoned, to mind, the SUMMON command.


----------



## fritter63

gaduser said:


> That summoned, to mind, the SUMMON command.


Yes, the very command that, when used by a Model S owner FOR THE FIRST DAMNED TIME EVER (who does that in a full parking lot?), got us a trip to the body shop when that driverless Model S rubbed the front fender of our relatively new Model 3.

And the idiot was an insurance agent on top of that.


----------



## JasonF

tivoboy said:


> That video pretty much sums up clear weaknesses and gaps in the safety features that SHOULD currently still be in place.


What Consumer Reports did is pretty close to what I did, except I didn't leave the driver's area or use a weighted chain. What they left out, though, possibly for sensationalism, is that alone won't cause a crash.

The rest of what I did was set the speed at 48 mph, which was far above the 30 mph neighborhood speed limit, but still safe without other drivers coming. AP got up to about 38 mph before it slowed for a curve, and then continued going less than 30 mph the rest of the way, due to trees and parked cars, until I shut it off. That's where I concluded that it would take a human foot to make the AP go dangerously fast - I still had plenty of margin, but AP simply didn't want to go that fast.

And if you haven't guessed, tricking the speed setting is the part I'd like to report to Tesla.


----------



## DocScott

JasonF said:


> What Consumer Reports did is pretty close to what I did, except I didn't leave the driver's area or use a weighted chain. What they left out, though, possibly for sensationalism, is that alone won't cause a crash.
> 
> The rest of what I did was set the speed at 48 mph, which was far above the 30 mph neighborhood speed limit, but still safe without other drivers coming. AP got up to about 38 mph before it slowed for a curve, and then continued going less than 30 mph the rest of the way, due to trees and parked cars, until I shut it off. That's where I concluded that it would take a human foot to make the AP go dangerously fast - I still had plenty of margin, but AP simply didn't want to go that fast.
> 
> And if you haven't guessed, tricking the speed setting is the part I'd like to report to Tesla.


It doesn't actually require a human foot though, right? If you really want to play the "can I make my car do dumb things" game, you can put a weight of some kind on the accelerator pedal as well.

Of course, at that point it's hard to distinguish that from what you could do with an ICE car with no autonomy whatsoever. By mechanically forcing the accelerator down, you're essentially overriding TACC. It's not unlikely that you'll make the car upset enough that it will disengage Autosteer as well, at which point the weight on the steering wheel will actually cause the car to turn.

In another thread, there was a lively discussion as to whether Teslas _should_ override the inputs when a depressed accelerator is going to cause an imminent crash, with some good arguments on both sides. The fact is, though, that they don't.

So one possible scenario for what transpired is something like this:

The Tesla owner knew a spot where he could coax AP to engage on his street. Yes, I know it won't engage on most of the very similar streets around there, but sometimes just the right combination of cues can allow AP to be engaged somewhere where it seems like it wouldn't. After discovering that, the owner could have, cautiously at first, played around with how well AP could handle the bend in the road while his foot was on the pedal--perhaps going, say, 35 mph. Fast enough to feel daring, but not so fast that the car loses control.

So then, the attempt at a "stunt": rig up something to depress the accelerator a little bit, so the car can go 35 mph or so. Weight the steering wheel. Figure out how to get the car to start while not in the driver's seat. I can think of several things I might try if I wanted to pursue such a reckless scheme. Transitioning from Summon, smart or dumb, if the car had Enhanced Autopilot? Shifting in and out of neutral? If all else failed, rigging up the accelerator weight so that I could deploy it from the passenger seat? It would all be a big puzzle, like designing your own explosive display for a gender-reveal party.

I can sort of imagine doing all that and kinda sorta convincing myself that I had it worked out so I'd be successful, and maybe get a cool video taken by my friend who would sit in the back seat.

The problem, of course, is that it would only take one thing to go wrong to have the whole thing cascade in to disaster before I could intervene. That disaster would involve AP disengaging well before the crash itself, which would be consistent with Musk's tweet.

* * *
I certainly don't know that's what happened. The other theories put forward in this thread are also plausible: a driver that fled the scene, perhaps, or a driver that clambered to a different seat in an attempt to escape.

All the theories, including mine, should leave considerable physical evidence, even after the fire destroyed much of the car. In addition, there are reportedly witnesses to at least the beginning of the sequence, and if that was the case it seems likely there were witnesses for parts of what the car did after that, too.

It's an odd mystery, and one that I think law enforcement could get to the bottom of if they make a concerted effort. Most of the media doesn't treat it that way, though, instead covering it as a referendum on Tesla, when it's clear that whatever happened was considerably out of the ordinary.


----------



## Madmolecule

So if the car was fooled into believing there was a driver, in whatever mode, why didn't the airbag go off? I was speaking with Colombo and he felt the pool boy jumped out right before impact.

A separate question is why I own a vehicle with a high-definition camera inside the cabin but only Elon has access to it. I paid for the vehicle; I even pay for something called premium connectivity. Why can't I connect to what I already own and pay for?

If a Tesla ran over 100 people I guarantee Tesla could come up with the video to prove it wasn’t their fault.

Elon electrify Cuba!


----------



## iChris93

Madmolecule said:


> So if the car was fooled into believing there was a driver, in whatever mode, why didn't the airbag go off?


Citation please


----------



## Madmolecule

iChris93 said:


> Citation please


I did misread the article. It said they were uncertain whether the airbags were deployed or not. It would be great to see some early footage. I do believe it was pretty obvious to the first responders. If you look at the second article, the fire was not that severe to begin with. They seem to have put it out quickly, but the smoldering is what left the final images. I don't know if the airbags were deployed, but it should be clear, or at least clearer, to the first responders. They did not see what we're seeing in the pictures. That's how they were 99.9% sure on where the people were located. Sorry to mislead.

Airbags deploy uncertain

Article on the fire

still electrify Cuba Elon


----------



## Madmolecule

From TMZ

According to authorities ... the 2 men who were killed in the Saturday night accident were talking to their wives about taking the Tesla for a drive and testing the driver-assistance technology just minutes before they took off and crashed.

This is the first time I remember seeing it mentioned that they were talking about using the driver-assistance technology. It is TMZ though.


----------



## JasonF

Madmolecule said:


> According to authorities ... the 2 men who were killed in the Saturday night accident were talking to their wives about taking the Tesla for a drive and testing the driver-assistance technology just minutes before they took off and crashed.


Hopefully they weren't testing to see if the car would automatically stop if they aimed it at a tree...


----------



## Madmolecule

iChris93 said:


> Citation please


Sorry, I thought you were talking about the airbags. Photo of the Woodlands pool boy. It might look like I'm making light of the tragedy, but I'm just frustrated that these idiots are delaying my FSD.


----------



## GDN

JasonF said:


> Hopefully they weren't testing to see if the car would automatically stop if they aimed it at a tree...


If you need an answer - It does !!


----------



## Madmolecule

I think this is a great opportunity for Tesla to evolve their technology to become more customer-centric. I feel that will become the new focus of Tesla's AI technology. I have named it Life-Pilot. I think Tesla should set up a group called Live Sentry to manage accidents and medical emergencies. I envision the system would have the following features.


When an emergency is detected, the car would notify the Live Sentry group (humans on a phone - true premium connectivity).
This person would receive information from all working cameras and sensors, as well as audio from inside the cabin and outside the vehicle.
This telemetry information would then be patched to 911 operators. The Tesla support group could give the operators a five-digit code that would allow them to pull up the information on an app. This way we do not have to wait for 911 services to update their equipment.
I also think they should create a Tesla bracelet, not a watch, with no display. It would be the interface to the Tesla phone and the car's telemetry, providing other vital signs that the camera cannot discern.
The vehicles, through the Tesla cloud, would also send out a mayday beacon to Teslas within a local vicinity: 1 mile in urban areas and up to 50 miles in remote areas. This beacon would go away once first responders were on scene.

Of course this would also be helpful for flat tires etc., and even a lower level of support could help out with alternate routes, waypoints and restaurant recommendations. It would even pre-plan your Supercharging entertainment.

Until we can get past the technical challenges and liability of FSD, the best use of the AI is to create an augmented reality and make the driving experience better, safer and, more importantly, more fun. It would also help you drive this beast of a vehicle, with pre-programmed show-off modes. We've all been the guy showing off the vehicle's capability to friends. Luckily I was able to do it without finding a tree. Maybe some of you have resisted the urge to show off the Tesla's capabilities, but I have not.

I don't even care about privacy issues anymore. We are so far past that, let's use the technology to benefit us.

Elon Electrify Cuba! Earth Day is over, it's time for Earthling Day.

I want my life pilot with Moto Conte design

maybe we should also add biological AI
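Purely as a sketch of the beacon-radius rule proposed above (1 mile urban, up to 50 miles remote) - every name and threshold here is invented for illustration; none of this is a real Tesla API:

```python
from dataclasses import dataclass

# Hypothetical sketch of the mayday-beacon radius rule described above.
# The class name, thresholds, and the use of nearby fleet density as a
# proxy for "urban" vs "remote" are all assumptions, not Tesla behavior.

@dataclass
class AreaDensity:
    vehicles_within_10_miles: int  # nearby fleet count as a density proxy

def beacon_radius_miles(density: AreaDensity) -> float:
    """Pick a notification radius: tight in dense areas, wide in remote ones."""
    if density.vehicles_within_10_miles >= 100:
        return 1.0    # urban: plenty of nearby cars can respond
    if density.vehicles_within_10_miles >= 10:
        return 10.0   # suburban: widen the net
    return 50.0       # remote: cast as wide as the proposal suggests

print(beacon_radius_miles(AreaDensity(250)))  # urban -> 1.0
print(beacon_radius_miles(AreaDensity(3)))    # remote -> 50.0
```

The tiered lookup is just one way to express "scale the alert radius with how remote the car is"; a real system would presumably use map data rather than fleet density.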


----------



## M3OC Rules

Madmolecule said:


> I think this is a great opportunity for Tesla to evolve their technology to become more customer-centric. I feel that will become the new focus of Tesla's AI technology. I have named it Life-Pilot. I think Tesla should set up a group called Live Sentry to manage accidents and medical emergencies. I envision the system would have the following features.
> 
> 
> When an emergency is detected, the car would notify the Live Sentry group (humans on a phone - true premium connectivity).
> This person would receive information from all working cameras and sensors, as well as audio from inside the cabin and outside the vehicle.
> This telemetry information would then be patched to 911 operators. The Tesla support group could give the operators a five-digit code that would allow them to pull up the information on an app. This way we do not have to wait for 911 services to update their equipment.
> I also think they should create a Tesla bracelet, not a watch, with no display. It would be the interface to the Tesla phone and the car's telemetry, providing other vital signs that the camera cannot discern.
> The vehicles, through the Tesla cloud, would also send out a mayday beacon to Teslas within a local vicinity: 1 mile in urban areas and up to 50 miles in remote areas. This beacon would go away once first responders were on scene.
> 
> Of course this would also be helpful for flat tires etc. and even lower level of the support could help out with alternate routes, waypoints and restaurant recommendations. It would even pre-plan you're supercharging entertainment


What you're describing sounds a lot like OnStar, which ranges from $15/month to $50/month. It sounds like GM has over a million subscribers, largely due to providing in-car wifi.

https://plants.gm.com/media/us/en/g...tent/Pages/news/us/en/2020/oct/1020-wifi.html
https://www.onstar.com/ca/en/plans-pricing/


----------



## Madmolecule

M3OC Rules said:


> What you're describing sounds a lot like OnStar, which ranges from $15/month to $50/month. It sounds like GM has over a million subscribers, largely due to providing in-car wifi.
> 
> https://plants.gm.com/media/us/en/g...tent/Pages/news/us/en/2020/oct/1020-wifi.html
> https://www.onstar.com/ca/en/plans-pricing/


Slightly like OnStar, but much more useful and personal. I think this is what OnStar aspired to be with the navigation, but I am talking about a true augmented-reality experience that would weave into every aspect of your life, including emergency medical and crash assistance: taking all this data, creating useful information, and showing the world what big data really can do for the individual. I think it should also continue with the individual when they leave the vehicle and be a full global assist, taking AMEX and Garmin to the next level. I am just trying to show a very useful and immediately needed feature of a much broader reengineering of the user interface and experience.


----------



## M3OC Rules

Madmolecule said:


> Slightly like OnStar, but much more useful and personal. I think this is what OnStar aspired to be with the navigation, but I am talking about a true augmented-reality experience that would weave into every aspect of your life, including emergency medical and crash assistance: taking all this data, creating useful information, and showing the world what big data really can do for the individual. I think it should also continue with the individual when they leave the vehicle and be a full global assist, taking AMEX and Garmin to the next level. I am just trying to show a very useful and immediately needed feature of a much broader reengineering of the user interface and experience.


You should really read their webpage. I've never used OnStar, but what you describe is all on their webpage. They have an app that you can use outside the car. They also have automated crash response with emergency medical assistance.

Tesla has mentioned some things like automatically scheduling repairs when the car senses a part failure, and automated response in accidents. It's mostly aspirational. I think this is part of the problem with trying to be vertically integrated: trying to do all this stuff at the same time is difficult. Tesla even talked about their own music service at one point. I'm a little torn, because I like the interface to be fully integrated, but at the same time I realize CarPlay/Android Auto would probably work better on some stuff until Tesla gets around to fixing/adding things.


----------



## Madmolecule

It's time to move way past aspirational. I have used OnStar before and I drive a Tesla, and what I'm talking about is totally different. The operator interface and infotainment, I believe, has been a hole in the Tesla product line. Originally I thought it was because they were waiting to merge with Apple, Microsoft or Google, but now I believe they are developing their own competitive phone. I think it is the missing link for Tesla and will be the key that will continue to provide value down the road.

As electric vehicles evolve into software-driven computers, it is becoming more difficult to sell on the high luxury end. Computers and electronics do not appreciate and are not an investment, historically. This is a huge problem for Tesla to solve, especially when trying to come out with a Roadster. I don't care how cool a $250,000 Tesla Roadster is the day it comes out; in five years, if it's nothing but an outdated computer, it'll be as cool as a 10-year-old plasma display. The reason some of the supercars maintain their value, or even appreciate over time, is the artful mechanical craftsmanship. Most of that goes away in the EV model. It will become very difficult to differentiate between the sub-2-second supercars. The challenge for these vehicles is to continue to provide value to the consumer, to make them feel good about their extremely expensive purchase. Otherwise the market will be driven to the $25,000 commuter vehicle with limited intelligence. A five-year-old Autopilot, after they stopped upgrading it because there was no market in it, will be as beneficial as a backup camera in the outback.

Based on their track record, there is no chance that Tesla would actually use a third-party service like OnStar, CarPlay or Waze. I feel it is a combination of pride and license fees, but most importantly not wanting to share information with a future competitor. Because of this, we need Tesla to build more than just a great vehicle; they need to work on their soft skills.

Here are some features I posted that I thought this missing link should include



Madmolecule said:


> Since my wildest prediction about Tesla and Bitcoin came somewhat true, I will continue with the real "made-up" reason the app has stagnated and does not have the capabilities of some 3rd-party apps. Also why they have seemed to spend more time on game development than Tesla App development.
> 
> The Tesla phone or as I call it Musk Mobile Communicator and Life Pilot.
> 
> No, it's not just a phone or like any phone you have ever used. A few of the features:
> 
> It is built using the same Tesla-built processor as the new Autopilot computer.
> Has Starlink as well as up to two additional SIM cards for simultaneous multi-carrier communication.
> It is a true coprocessor to your car. The car will not need it to drive, but it will be integral to the advanced experience.
> It will handle all infotainment processing, aggregating all media, navigation, advanced routing, and communication from messaging, phone calls and social media.
> Multiband WiFi with managed routing
> 64-channel bidirectional Bluetooth
> It will instantly link up to the additional displays in the new Teslas, and the displays will become a touch-screen interface to the communicator.
> It will plan recommended playlists, podcasts etc. for your trip, which it will optimize over machine learning. The same will take place in the home. This will be multi-display, based on the viewer, so the kids' rear displays might have a separate route experience. It will also provide rear-seat dog/kid noise cancellation for conference calls.
> This will continue when you take the communicator in the house, or leave it docked in the car with the home extender, and your curated entertainment experience will continue on your home TVs. It will replace the home computer, TiVos and cable internet.
> It will be the ultimate gaming console in the car and the home. (Also the reason GTA has been delayed - it will be Tesla-only.)
> Seamless application adaptation for iOS and Android devices
> It will also provide real-time driving assistance by measuring your reaction time, studying your eye movements, and providing assistance as well as efficiency ratings on not only the power consumption but your driving skills. Advanced track and new-driver training will be additional add-on modules in the app.
> You will finally have that truly immersive informational experience. Now that we have the iPhone and the Tesla, it is not much of a stretch to see what it could be.
> It will interface to all the vehicle cameras as well as home cameras to provide a true Sentry system with artificial threat detection.
> Health: It will turn your car and home into a real-time doctor's office, not only continually monitoring O2, EKG and blood pressure but also advanced modes such as reaction and speech detection for Alzheimer's etc.
> Interface to Tesla home and vehicle HVAC and biological air-quality analysis, including virus detection
> Size: Bad news - like a brick or an old Motorola bag phone, but it is in three main pieces: the base ("Elon's Brain" as it will be called), phone, and camera.
> The phone and camera will be small and detachable - basically a thin client to the base. All the AI and augmented reality will still be processed by the base. The camera will be separate for high-resolution digital-gimbal photos.
> The camera will also attach to the Tesla drone. Tesla will be buying DJI to make this happen. The drone will be used for traffic-condition acquisition, launched from the vehicle, but more importantly selfie mode when driving. It will also be used in the advanced driving modes such as track and snow to assist the FSD. The voice recognition will be so advanced it will be able to navigate you to Home Depot instead of Home.
> Pricing: one will be included with the new vehicle. Additional phones may be purchased for $3,500 to $7,000. This will vary based on whether the advanced auto-Life-Pilot features are purchased or the Musk advanced glasses add-on is selected.
> You can also drive the car remotely with it, of course, and be able to remotely monitor everything in and outside the vehicle while it is being used as a robo-taxi.
> It will be integral to the new Tesla flightsuit, using DJI tech, which will come standard with the SpaceX Roadster (Elon can't let BMW get the flightsuit out first).
> It will be announced in March and ship before the end of the year.
> 
> Biggest delay has been the naming. They are frustrated with Sexy and Cybertruck's name, which will be dated by the time it comes out. DJI will be SpaceX Personal, SkyX. GlobeX - Tesla and Boring will be SubterraneanX to go with Starlink. They want homelink or carlink, neuralink; the phone will be the "missing link", but most universal names are taken.
> 
> Tesla will also start up a grass-roots company called "Conte Moto". It will focus on Italian-inspired active vegan apparel. They will start with combination jackets/backpacks for the Tesla ATV, bikes, snowboarding, skydiving, scuba and other active sports, in addition to an executive line. Also a construction apparel line as a working partner for Cybertruck, making work safer, more intelligent and more efficient. The Tesla Conte Moto onesie will be an integral component for new babies. It will not only monitor the baby's health but will provide early education and let the mother sleep. It will house the base processor, and the smart apparel will provide health and body-position information to the base brain. The line will expand to wingsuit and SpaceX flight-suit designs. Once the line is established they will become a major design arm for Tesla, responsible for bringing true sexy to well-engineered vehicles.
> 
> The phone will make Cook wish he had taken the call, and Conte Moto will put Ive into permanent retirement.
> A little different than the iPhone 13 rumors of whether it will have a charge port or not.
> 
> the phone is the "missing link" to the evolution of all Tesla products. It will be highly desired, even from people even without teslas vehicles.
> 
> The phone slogan will be: you can tell your future by looking at your hand.
> 
> View attachment 37086
> 
> Photoshopped real photo in the wild
> 
> BMW electric flightsuit


----------



## JasonF

Madmolecule said:


> Otherwise the market will be driven to the $25,000 commuter vehicle with limited intelligence.


That's kind of the direction Tesla is heading in - they're working on a $25k EV. You're talking about pushing them back in the other direction, into the money-is-no-object ultra-luxury segment. I don't think Tesla is going to go back there unless other automakers get very innovative very quickly, and take the lower-end segment away from them. On a personal note, I wouldn't want them to head back to ultra-luxury territory, because that comes with a hefty price increase across the product line, and I would be forced to buy future vehicles elsewhere. And if they did that before they achieved their mission of bringing EV's into the mainstream, that would force me (and the majority of other drivers) back to ICE vehicles.

Also, there has always been a problem in the ultra-luxury market for automakers, because the buyers in that segment are very few and very fickle. Fashion in that market changes very suddenly, and if you can't keep up, you drown - and many ultra-luxury makers have. If you don't believe that, ask Ferrari - at one point, the company almost died because they stagnated, and the segment of customers they served decided their cars were too common and went for more rare and exclusive brands like Koenigsegg. So there's a very good chance that Tesla entered the mainstream market before it lost its ultra-luxury buyers to another more fashionable luxury manufacturer.


----------



## M3OC Rules

Madmolecule said:


> It's time to move way past aspirational. I have used OnStar before and I drive a Tesla, and what I'm talking about is totally different. The operator interface and infotainment, I believe, has been a hole in the Tesla product line. Originally I thought it was because they were waiting to merge with Apple, Microsoft or Google, but now I believe they are developing their own competitive phone. I think it is the missing link for Tesla and will be the key that will continue to provide value down the road.
> 
> As electric vehicles evolve into software-driven computers, it is becoming more difficult to sell on the high luxury end. Computers and electronics do not appreciate and are not an investment, historically. This is a huge problem for Tesla to solve, especially when trying to come out with a Roadster. I don't care how cool a $250,000 Tesla Roadster is the day it comes out; in five years, if it's nothing but an outdated computer, it'll be as cool as a 10-year-old plasma display. The reason some of the supercars maintain their value, or even appreciate over time, is the artful mechanical craftsmanship. Most of that goes away in the EV model. It will become very difficult to differentiate between the sub-2-second supercars. The challenge for these vehicles is to continue to provide value to the consumer, to make them feel good about their extremely expensive purchase. Otherwise the market will be driven to the $25,000 commuter vehicle with limited intelligence. A five-year-old Autopilot, after they stopped upgrading it because there was no market in it, will be as beneficial as a backup camera in the outback.
> 
> Based on their track record, there is no chance that Tesla would actually use a third-party service like OnStar, CarPlay or Waze. I feel it is a combination of pride and license fees, but most importantly not wanting to share information with a future competitor. Because of this, we need Tesla to build more than just a great vehicle; they need to work on their soft skills.
> 
> Here are some features I posted that I thought this missing link should include


Ok, I see. The whole app ecosystem is a nice moat for Apple and Google, but Tesla did talk about making an in-car app store at one point, so you never know what they might try to do.

I think getting a person on the line would be nice. I had a situation with a bad update right before taking a trip on the weekend. Going through their phone menus, I believe the only option to try to get hold of anyone was roadside assistance. But that made it seem like they were just going to send someone over to tow the car or something. I never got hold of anyone, and luckily, after many, many reboots, it deleted my profiles and was happy again.


----------



## Madmolecule

JasonF said:


> That's kind of the direction Tesla is heading in - they're working on a $25k EV. You're talking about pushing them back in the other direction, into the money-is-no-object ultra-luxury segment. I don't think Tesla is going to go back there unless other automakers get very innovative very quickly, and take the lower-end segment away from them. On a personal note, I wouldn't want them to head back to ultra-luxury territory, because that comes with a hefty price increase across the product line, and I would be forced to buy future vehicles elsewhere. And if they did that before they achieved their mission of bringing EV's into the mainstream, that would force me (and the majority of other drivers) back to ICE vehicles.
> 
> Also, there has always been a problem in the ultra-luxury market for automakers, because the buyers in that segment are very few and very fickle. Fashion in that market changes very suddenly, and if you can't keep up, you drown - and many ultra-luxury makers have. If you don't believe that, ask Ferrari - at one point, the company almost died because they stagnated, and the segment of customers they served decided their cars were too common and went for more rare and exclusive brands like Koenigsegg. So there's a very good chance that Tesla entered the mainstream market before it lost its ultra-luxury buyers to another more fashionable luxury manufacturer.


Ferrari and Koenigsegg are great examples of supercars where, if you're lucky enough to ever be able to buy one - which I will never be - no matter how much money you spend, in another year they will make one better and more expensive. But they are more than just an automobile: they are a work of art, a sculpture, and a feat of engineering that can be appreciated long after its technology has become outdated. Sadly, computers just don't have that same reverence. Aerodynamics don't mean as much, and don't need to be a trade-off for volume anymore; that's why SpaceX is a flying brick.

I don't know if EVs can be profitable at the $25,000 level; maybe you've got to sell a little vaporware to keep the profit margin survivable. I hope they will focus on how to bring value and provide market separation. Also, small companies like Ferrari would love to develop their own software or even stereo systems; they are just too small, so they have to play well with others. If you're not going to create navigation better than Waze and infotainment better than what's already on the market, you must learn to play with the big boys, or we're back to crappy radios. There was a time when car manufacturers did not think the radio was that important, figuring people were buying them for the great vehicle. It created a tremendous aftermarket, and it took years for the manufacturers to respond. For decades they supplied AM radios with the antenna on the driver side. This was done because the cable run was a couple of inches shorter - the cars were so wide and the radios were not in the center. They didn't care that it obstructed the driver's view; it saved a couple of inches of antenna cable. That was how decisions were made: not for the consumer experience. I'm just saying Tesla needs to change this mindset and focus on us.


----------



## lance.bailey

Yes, Tesla is the only company whose product I own that I cannot talk to on the phone. This dates back to Nov 2018 when I started to buy the car. There was an ... inability ... for the sales person to understand my question, and I kept asking her to call me, but it was all email until I gave up and went with a different car. Such is life. At the end of the day both cars got sold, so Tesla likely doesn't care.

But that also bought 27 months of bitter taste, and with sales/delivery no better these days but sales still increasing .... well, you know.

When the tides turn and Tesla has real competition, things will be different. Right now everyone compares against Tesla; it's a nice position to be in.


----------



## Madmolecule

lance.bailey said:


> Yes, Tesla is the only company whose product I own that I cannot talk to on the phone. This dates back to Nov 2018 when I started to buy the car. There was an ... inability ... for the sales person to understand my question, and I kept asking her to call me, but it was all email until I gave up and went with a different car. Such is life. At the end of the day both cars got sold, so Tesla likely doesn't care.
> 
> But that also bought 27 months of bitter taste, and with sales/delivery no better these days but sales still increasing .... well, you know.
> 
> When the tides turn and Tesla has real competition, things will be different. Right now everyone compares against Tesla; it's a nice position to be in.


I actually got eBay on the phone last week for an email-conflict issue. Talking to a professional person who could look into my problem, see that there were no proper checkboxes for me to select online, and handle the situation. It has not always been like this with eBay, but they've learned that sometimes you have to talk to your customer. You would think Tesla would at least fake it during the sales part and then stick you with an automated attendant once you were on the hook. They are consistent, and have that don't-call-us-we'll-call-you attitude from the very beginning.


----------



## JWardell

Tesla just made a statement on the accident on their earnings call.

Tesla believes someone WAS driving, as the steering wheel was bent in as if someone hit it.
Tesla also said NO seatbelts were buckled.
They reiterated that Autosteer could NOT have been activated on that road, and that AP or TACC could not have reached more than 30 mph in that short distance no matter what (which would have meant a much less severe impact).

The simplest explanation is usually the best one:
The driver probably was showing off their Model S's full acceleration and lost control at the curve. After impact, the door might have been crushed so they tried to exit out the back.
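The 30 mph point above can be sanity-checked with constant-acceleration kinematics. The distance and acceleration figures below are assumptions chosen for illustration, not numbers from Tesla or the investigation:

```python
import math

MPH_PER_MS = 2.23694  # m/s to mph conversion

# Hypothetical inputs: ~200 ft of driveable road before the curve, and a
# gentle cruise-control-style acceleration vs. a hard manual launch.
distance_m = 60.0          # assumed available distance (~200 ft)
tacc_accel = 1.5           # assumed gentle TACC-style acceleration, m/s^2
manual_accel = 9.0         # assumed full-throttle performance EV, m/s^2

def final_speed_mph(accel_ms2: float, dist_m: float) -> float:
    """v^2 = 2*a*d for constant acceleration from rest; returns mph."""
    return math.sqrt(2 * accel_ms2 * dist_m) * MPH_PER_MS

print(f"Gentle accel over {distance_m:.0f} m: "
      f"{final_speed_mph(tacc_accel, distance_m):.0f} mph")
print(f"Full manual accel over {distance_m:.0f} m: "
      f"{final_speed_mph(manual_accel, distance_m):.0f} mph")
```

Under these assumed numbers, the gentle case lands right around 30 mph while a hard manual launch over the same stretch reaches well past 70 mph, which is the shape of the argument being made: a driver flooring it explains a severe impact in a way a capped driver-assist mode does not.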


----------



## lance.bailey

this is incredibly tragic


----------



## lance.bailey

with Tesla making a statement, that kind of closes the loop. I suggest that the most respectful thing to do for the families is to close this thread.


----------



## DocScott

lance.bailey said:


> with Tesla making a statement, that kind of closes the loop. I suggest that the most respectful thing to do for the families is to close this thread.


I understand where you're coming from, but right now we have Tesla and a representative of law enforcement saying things that outright contradict each other. That's kind of a big deal, one way or the other.


----------



## garsh

lance.bailey said:


> with Tesla making a statement, that kind of closes the loop. I suggest that the most respectful thing to do for the families is to close this thread.


We've only heard Tesla's information. We still haven't heard an update from first responders as to why the data contradicts their initial statements.

If you find that this topic is too sad to revisit, there's an Ignore button at the top of the thread that I believe will stop showing it to you when posts are added.


----------



## garsh

JWardell said:


> Tesla just made a statement on the accident on their earnings call.
> 
> Tesla believes someone WAS driving, as the steering wheel was bent in as if someone hit it.
> Tesla also said NO seatbelts were buckled.
> They reiterated that Autosteer could NOT have been activated on that road, and that AP or TACC could not have reached more than 30 mph in that short distance no matter what (which would have meant a much less severe impact).
> 
> The simplest explanation is usually the best one:
> The driver probably was showing off their Model S's full acceleration and lost control at the curve. After impact, the door might have been crushed so they tried to exit out the back.


Thanks for explaining, Josh.
Even after reading through the earnings call transcript, I couldn't make sense of Lars' broken English explanation.


----------



## Madmolecule

garsh said:


> Thanks for explaining Josh.
> Even after reading through the earnings call transcript, I couldn't make sense of Lars' broken English explanation.


I told you guys, pool boy from the beginning. I am sorry about the tragedy and hopefully it will turn into a positive event for Tesla in the long run.


----------



## JasonF

DocScott said:


> I understand where you're coming from, but right now we have Tesla and a representative of law enforcement saying things that outright contradict each other. That's kind of a big deal, one way or the other.


Not to mention the government investigation. While they will collect evidence from both the police statements and Tesla's, their investigation is going to be much more independent, because they have their own rules and requirements.

I enjoy figuring things out. It's getting tougher to do since the general consensus changed sometime last year to "The experts have spoken. You're not an expert, so you have no right to even discuss it," or such discussions being considered too political, or disrespectful of the dead, or in some cases even hate speech. Americans are becoming as fearful to discuss things as any people living under an authoritarian regime, except it's not government retribution we're afraid of; it's each other's. If we're not careful, we're going to become yet another place in the world where people wish they could talk about something that happened, but instead we'll all just lower our heads and walk away.

The counterpoint might be that it's useless to discuss it anyway, because the government will do its investigation without our input, and they'll impose whatever penalties without even reading these threads. But there is also something called _collective knowledge_. Maybe, as we're looking at things, someone else out there is coming to the same conclusions. And that means reading about _our_ discoveries here gives us hope that maybe things are going the right direction after all.


----------



## lance.bailey

garsh said:


> If you find that this topic is too sad to revisit, there's an Ignore button at the top of the thread that I believe will stop showing it to you when posts are added.


This has nothing to do with sadness and everything to do with respect for lives tragically lost. Don't attribute to emotion what is actually good behaviour.


----------



## M3OC Rules

DocScott said:


> I understand where you're coming from, but right now we have Tesla and a representative of law enforcement saying things that outright contradict each other.


I'm not sure what you're referring to exactly but it's very common for news articles to incorrectly quote people and it sounded like they were interviewing someone who wasn't directly involved. If you're concerned about the state of news in this country then I agree.


----------



## DocScott

M3OC Rules said:


> I'm not sure what you're referring to exactly but it's very common for news articles to incorrectly quote people and it sounded like they were interviewing someone who wasn't directly involved. If you're concerned about the state of news in this country then I agree.


Yes, it appears the constable who has been widely quoted was not directly involved. But even if it had been a chief of police in some small town in another state, I think it's newsworthy when someone in their capacity as a representative of law enforcement flat out says there was no one in the driver's seat, and Tesla in their earnings call flat out says otherwise. I agree that the news stories of this event haven't generally been doing a good job, but I don't think there's any evidence at this point that the constable was quoted incorrectly--if he was, I would expect to have seen some follow up coverage indicating that somewhere.

It's all very troubling. I actually think this should be a _bigger_ story than it is, but also a different kind of story. Someone is wrong here, and it ought to be possible to figure out who. And whether it's Tesla that's wrong or that constable, both are in a position of enough authority that the one that is wrong should be held to account.

That's my two cents, anyway...


----------



## GDN

I think one of the big problems is that the follow-ups to this story, as the facts are gathered, will never get the wide attention that the original headline got, a headline Elon already knows or believes to be false. If the telemetry was flowing, Tesla knows 100% which seats were occupied and whether the accelerator was being pressed. In a wreck that violent, a body can be thrown around anywhere in the car.

This brings up a question, however: in our purchase agreement, have we given Tesla a release to share information about our driving, whether in a wreck or not? I consider this to be private information. I know Tesla has access, but I've never explicitly given them approval to share it. I know it could be subpoenaed, but in the case of this wreck, while I'd like Tesla to clear their name if possible, I don't know that I agree with them releasing any information without written approval from the owner, who may be dead. Perhaps his wife or a court could give that, or Tesla could use it in court to prove they are in the clear, but just releasing it to the public the way it is, I'm not sure I agree with.


----------



## Klaus-rf

DocScott said:


> ... I think it's newsworthy when someone in their capacity as a representative of law enforcement flat out says there was no one in the driver's seat, ...


... at the time the constable inspected the vehicle.


----------



## bwilson4web

I'm just hoping the memory chip in the HW 3.0 controller of my Model 3 is well protected from fire and water. Perhaps there are ruggedized USB thumb drives fast enough for video.

Bob Wilson


----------



## garsh

GDN said:


> This brings up a question however, in our purchase agreement have we given Tesla a release to share information about our driving, whether in a wreck or not? I consider this to be private information.


That's a good question.
This telemetry information is obviously traceable to an individual vehicle.
And there are several reports of Tesla trying to call people when they've detected that a car has been in an accident severe enough to fire the airbags.

There's a dialog in the car's configuration screen where you can choose to share video clips, but that only mentions video, and it explicitly says that those clips are not traceable to any particular car (link).
But the order agreement doesn't appear to cover this aspect (link).


----------



## Madmolecule

I had always been a big fan of CR, but it seems like they're just a shill like everyone else.


----------



## Madmolecule

bwilson4web said:


> I'm just hoping the memory chip in the HW 3.0 controller of my Model 3 is well protected from fire and water. Perhaps there are ruggedized, USB thumb drives fast enough for video.
> 
> Bob Wilson


I don't think there can be a safe, secure on-board solution. It should be fairly easy for Tesla to offer a subscription service that continually uploads telemetry data as well as screen snapshots to a secure server. Once you are docked and connected to Wi-Fi, all video would be uploaded. Doorbell cameras have been doing this very successfully, and there is no incentive for the thief to steal the camera anymore. All the telemetry data takes very little cellular bandwidth. If a critical event is detected, all bandwidth should be dedicated to uploading video and telemetry data. It's time for Tesla TiVo.
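The upload scheme described above ("critical event jumps ahead of everything else; bulk video waits") is essentially a priority queue. A minimal sketch, assuming nothing about Tesla's actual software; all names, priority levels, and example items here are hypothetical:

```python
import heapq

# Hypothetical priority levels: lower number = uploaded first.
PRIORITY_CRITICAL_EVENT = 0  # crash telemetry + pre-crash video, preempts everything
PRIORITY_TELEMETRY = 1       # continuous low-bandwidth telemetry samples
PRIORITY_VIDEO = 2           # bulk dashcam video, deferred until Wi-Fi

class UploadQueue:
    """Orders pending uploads so a detected critical event is sent first."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps FIFO order within a priority level

    def enqueue(self, priority, item):
        heapq.heappush(self._heap, (priority, self._counter, item))
        self._counter += 1

    def next_upload(self):
        # Pop the highest-priority (lowest-numbered) pending item, or None.
        return heapq.heappop(self._heap)[2] if self._heap else None

q = UploadQueue()
q.enqueue(PRIORITY_VIDEO, "dashcam_clip_0441.mp4")
q.enqueue(PRIORITY_TELEMETRY, "speed/seat-occupancy sample")
q.enqueue(PRIORITY_CRITICAL_EVENT, "airbag-deployment event + pre-crash video")
print(q.next_upload())  # the critical event is sent first
```

In a real system the "dock on Wi-Fi" behavior would just mean draining the `PRIORITY_VIDEO` tier only when an unmetered connection is available, while the telemetry tier trickles out over cellular continuously.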


----------



## Needsdecaf

Klaus-rf said:


> ... at the time the constable inspected the vehicle.


That constable reportedly was not part of the responding force.

It's complicated here. There are multiple agencies covering the same jurisdiction.


----------



## Klaus-rf

If the S is similar to the M3, then once 12V power is cut, the rear doors will not open from inside or outside. I, personally, consider that a serious [negative] design flaw.


----------



## jsmay311

Does anyone know... if the steering wheel got bent by the driver hitting it, why didn't the airbag in the steering wheel prevent this?

Have there been any public statements about whether or not the airbags deployed?

(I just skimmed the "Airbags" section of the manual, and while there are lots of statements about the importance of wearing seatbelts, it doesn't say anything about the _driver's_ side airbag requiring the driver's seatbelt to be buckled in order to deploy. So I don't believe an unbuckled driver should prevent airbag deployment. But not 100% sure of that. )


----------



## lance.bailey

Airbags are typically labelled a supplemental restraint system. Does that mean supplemental to the existing primary (seatbelt) when it's used, or supplemental for the event of the primary failing/not being used? I think the latter: airbags do not require seatbelts to be active in order for the airbag to deploy in an accident.

Just how I read things.


----------



## garsh

There CONTINUE to be misleading, incorrect articles being published about this crash.

https://www.cnn.com/2021/04/28/cars/tesla-texas-crash-autopilot/index.html
_"Tesla said Monday that one of Autopilot's features was active during the April 17 crash that killed two men in Spring, Texas."_​
No, that's not at all what Tesla said. Sigh.


----------



## Long Ranger

Klaus-rf said:


> If the S is similar to the M3, then once 12V power is cut, the rear doors will not open from inside or outside. I, personally, consider that a serious [negative] design flaw.


The S has a mechanical release cable to open the rear doors. It's located under the carpet, so not too useful if you don't already know about it.


----------



## garsh

Long Ranger said:


> The S has a mechanical release cable to open the rear doors. It's located under the carpet, so not too useful if you don't already know about it.


The Y does as well. Same caveat - not discoverable, so only useful if you already know about it.
I don't know if newer 3s have been updated to add the access for the mechanical release.


----------



## Long Ranger

garsh said:


> The Y does as well. Same caveat - not discoverable, so only useful if you already know about it.
> I don't know if newer 3s have been updated to add the access for the mechanical release.


Wow, that's even more obscure than the release in the S, and I don't even see it documented in the manual. At least the one for the S is in the manual and not as difficult to access once you know about it.


----------



## jsmay311

garsh said:


> There CONTINUE to be misleading, incorrect articles being published about this crash.
> 
> https://www.cnn.com/2021/04/28/cars/tesla-texas-crash-autopilot/index.html
> _"Tesla said Monday that one of Autopilot's features was active during the April 17 crash that killed two men in Spring, Texas."_​
> No, that's not at all what Tesla said. Sigh.


I wonder if whoever wrote the CNN article also read the Electrek article that made the exact same reporting error based on an incorrect transcription of the earnings call.

https://electrek.co/2021/04/26/tesl...atal-crash-in-texas-but-questions-unanswered/


----------



## Klaus-rf

Long Ranger said:


> Wow, that's even more obscure than the release in the S, and I don't even see it documented in the manual. At least the one for the S is in the manual and not as difficult to access once you know about it.


If opening the door(s) is not READILY OBVIOUS to the average user, it's dangerous, more so in a panic situation like a fire. Fortunately, seat belt releases have been standardized in street cars for many decades, so that part is very intuitive. Door releases need to be the same and must NOT require power to actuate [in an emergency such as this one].


----------



## GDN

We live in such a messed up dysfunctional world of US politics. Now we have a US Senator wanting to admonish Musk for commenting on the crash. I've gone on record myself in an earlier post saying I'm not sure I agree with Musk commenting publicly, but I get why he does it. He does it to clear Tesla as much as he can because we have rogue "writers" that get published and publish false and unverified information and as noted in this article "The NHTSA has about 24 ongoing investigations into Tesla crashes, while the NTSB has opened eight". This senator is clueless and if our government organizations can't open a case, investigate it and close it within a few weeks/months then that is the bigger problem.

https://www.yahoo.com/news/sen-richard-blumenthal-says-hes-153001874.html


----------



## Needsdecaf

GDN said:


> We live in such a messed up dysfunctional world of US politics. Now we have a US Senator wanting to admonish Musk for commenting on the crash. I've gone on record myself in an earlier post saying I'm not sure I agree with Musk commenting publicly, but I get why he does it. He does it to clear Tesla as much as he can because we have rogue "writers" that get published and publish false and unverified information and as noted in this article "The NHTSA has about 24 ongoing investigations into Tesla crashes, while the NTSB has opened eight". This senator is clueless and if our government organizations can't open a case, investigate it and close it within a few weeks/months then that is the bigger problem.
> 
> https://www.yahoo.com/news/sen-richard-blumenthal-says-hes-153001874.html


If they had a press office, this would be avoided and unnecessary.


----------



## Long Ranger

Needsdecaf said:


> If they had a press office, this would be avoided and unnecessary.


Yeah, that would make a big difference on this stuff. What's ironic to me is that in the earnings call, the investor question wasn't about the accident, it was asking what proactive PR steps Tesla is taking to tackle the media's deceptive reporting on the safety of Autopilot. In response, Tesla just talked about the details of this accident, but their phrasing on the call was so poor that it actually resulted in more bogus articles about how Tesla had now admitted that TACC was engaged during the accident.


----------



## lance.bailey

I watch a guilty pleasure show called "Blue Bloods" with Tom Selleck as the Police Commissioner of NYC. One of the fascinating things in the show is the portrayal of the role of the NYPD press secretary as confrontational with the PC often fighting him on points of presentation of the facts or what to say and how to say it to the media. 

Obviously the show is a work of fiction but it does give an interesting take on how the people running the show need a filter/mouthpiece/whatever between them and the media hounds.


----------



## Klaus-rf

GDN said:


> We live in such a messed up dysfunctional world of US politics. Now we have a US Senator wanting to admonish Musk for commenting on the crash. I've gone on record myself in an earlier post saying I'm not sure I agree with Musk commenting publicly, but I get why he does it. He does it to clear Tesla as much as he can because we have rogue "writers" that get published and publish false and unverified information and as noted in this article "The NHTSA has about 24 ongoing investigations into Tesla crashes, while the NTSB has opened eight". This senator is clueless and if our government organizations can't open a case, investigate it and close it within a few weeks/months then that is the bigger problem.
> 
> https://www.yahoo.com/news/sen-richard-blumenthal-says-hes-153001874.html


Wow - all of 32 ongoing cases. That's a relatively TINY number. Let's compare that to GM cases and [millions of] vehicles involved:

https://www.google.com/search?q=nht...7j33i299l2.11487j0j7&sourceid=chrome&ie=UTF-8
So much hate.


----------



## GDN

Klaus-rf said:


> Wow - all of 32 ongoing cases. That's a relatively TINY number. Let's compare that to GM cases and [millions of] vehicles involved:
> 
> https://www.google.com/search?q=nht...7j33i299l2.11487j0j7&sourceid=chrome&ie=UTF-8
> So much hate.


Hate for who? I'm just pointing out the inefficiencies of our government departments.

I clicked on your link and just hit one of the results - https://www.bloomberg.com/news/arti...-potential-airbag-flaw-in-749-312-gm-vehicles. Let's be freaking real here - if there truly are airbag issues in 749,312 cars and they don't get investigated, with action taken, within a matter of a few weeks, you've made my point better than I have. Why is the gov't letting GM endanger that many people? It is likely obvious, and the gov't is just lax about doing their job. Point the finger, call a spade a spade, and get the problem fixed; quit letting them get away with it.


----------



## JasonF

Klaus-rf said:


> So much hate.


Some of the hesitancy toward the covid vaccines taught me a lot about the way Americans think as a group. That doesn't sound related, but it is - Tesla is a company that moves fast, brings futuristic things to market quickly, and as a result finds failures and bumps in the road along the way. And just like with the fear from certain people that the vaccines were brought out too fast and are therefore dangerous, Tesla also brings out technologies too fast, and is also therefore dangerous.

So what we have then is people on social media, the press, and even the government looking to prove their own fears. Every single Tesla crash is scrutinized, focusing especially on the two "scariest" elements - the battery and Autopilot. General Motors and Ford might have a hundred times more crashes and fires, but they don't introduce technology at a pace that's considered scary, so everyone lets it slide.


----------



## garsh

GDN said:


> Hate for who?


I think @Klaus-rf was agreeing with you.


----------



## Klaus-rf

garsh said:


> I think @Klaus-rf was agreeing with you.


Yes, indeed.

And it's not necessarily the gov't that's behind here. After all, NHTSA doesn't actually fix any cars - they just define that XXX cars need to be fixed for x.y.z failures and it's entirely up to the vehicle mfgrs to implement, schedule, repair and then notify NHTSA that xxx vehicles were repaired or inspected.

And, often, mfgrs aren't entirely forthcoming in releasing data to determine fault/needed repairs (the investigation phase) because it costs them money (and in the case of Tesla, reputation - people are already numb to GM, Ford, etc. safety recalls).

It's complex at times. At least now we have paper trails of which cars (VINs, dates of production, mfgrs) have which series of (Serial Numbers, batch numbers) parts in them. MUCH better than we had 30 years back.


----------



## M3OC Rules

Needsdecaf said:


> If they had a press office, this would be avoided and unnecessary


I'm not sure about that. The brilliant senator said they should not be commenting before the investigation. The senator has no problem with the story being misreported or investigators making bad assumptions. He only has problems with Tesla providing facts. The senator doesn't care about safety or truth. He cares about Tesla undermining his argument.


----------



## Klaus-rf

^
One thing that same senator can claim 100% is that GM's Super Cruise system has NEVER caused a Tesla to crash. Thus GM isn't at fault for Tesla incidents. And we're not going to talk about those 400,000+ fires in GM cars.


----------



## bwilson4web

Just publish the engineering log data with no comment other than, “where is the same data for named, non-Tesla fatal accidents.”

Treat it like body cam footage. The comparison to other manufacturers and investigation agencies would be a welcome fish-slap in their faces.

Bob Wilson


----------



## Needsdecaf

M3OC Rules said:


> I'm not sure about that. The brilliant senator said they should not be commenting before the investigation. The senator has no problem with the story being misreported or investigators making bad assumptions. He only has problems with Tesla providing facts. The senator doesn't care about safety or truth. He cares about Tesla undermining his argument.


Having worked for a large public corp, and having been on the side of negative publicity, I will tell you that what the senator says has some element of truth to it.

However, Elon firing off random tweets is ineffective. I guarantee you that if Tesla had issued a press release that said "we're aware of the crash, we are cooperating with the authorities, and at this time we cannot release any of our findings. However, we can say that this particular car did not have the FSD package purchased," that would have gone over much, much better.


----------



## Madmolecule

I think I have solved it.

Tequila 
ludicrous mode
old man with cane
Unforgiving quercus


----------



## JWardell

Initial NTSB report is out:
https://www.ntsb.gov/investigations/AccidentReports/Pages/HWY21FH007-preliminary-report.aspx
"Footage from the owner's home security camera shows the owner entering the car's driver's seat and the passenger entering the front passenger seat. The car leaves and travels about 550 feet before departing the road on a curve, driving over the curb, and hitting a drainage culvert, a raised manhole, and a tree."

So it's all on video, driver gets in driver's seat and launches it. Like I've been saying, probably crawled into the back after impact, and all this media stir is for nothing but lies.


----------



## Needsdecaf

JWardell said:


> Initial NTSB report is out:
> https://www.ntsb.gov/investigations/AccidentReports/Pages/HWY21FH007-preliminary-report.aspx
> "Footage from the owner's home security camera shows the owner entering the car's driver's seat and the passenger entering the front passenger seat. The car leaves and travels about 550 feet before departing the road on a curve, driving over the curb, and hitting a drainage culvert, a raised manhole, and a tree."
> 
> So it's all on video, driver gets in driver's seat and launches it. Like I've been saying, probably crawled into the back after impact, and all this media stir is for nothing but lies.


It's not all media lies.

Important to note that the Harris County Precinct 4 Constable, Mark Hermann, said that they were "100% sure" there was no one driving the car. So either he was the one lying, or he's an idiot. Or his people are idiots. Based on living here, I'm betting one of the latter two is likely.

Again, this is where Tesla needs a press office.

Also of note: the NTSB tried to get a representative Model S to engage Autosteer on these roads and could not. As I have been saying, it does not work on these roads.


----------



## GDN

It's great to see these reports; however, don't hold your breath waiting on any of the people or outlets to update or retract their stories. Might be time for a little letter-writing campaign to ask them to correct their misinformation and "guesses" that attempt to throw Tesla under the bus.


----------



## lance.bailey

I know it has been discussed whether or not FSD was enabled, but for an additional data point: I double-tapped the wand by accident when sitting at (H) in the driveway and got an alert/warning/finger-scold that FSD will not engage if the seatbelt is not done up.

FWIW


----------



## M3OC Rules

Needsdecaf said:


> It's not all media lies.
> 
> Important to note that the Harris County Precinct 4 Constable, Mark Hermann, said that they were "100% sure" there was no one driving the car. So either he was the one lying, or he's an idiot. Or his people are idiots. Based on living here, I'm betting one of the latter two is likely.
> 
> Again, this is where Tesla needs a press office.
> 
> Also of note, that the NTSB tried to get a representative Model S to engage Auto Steer on these roads and could not. As I have been saying, it does not work on these roads.


That Constable is on video saying those things, so it's not a misquote. Not sure how that happened, but he was clearly wrong, and whoever messed that up needs some more training. But the media is self-serving. Is Newsweek a reputable media outlet? Read this article from 5/18/2021 that is full of obvious mistakes, including referencing this Houston crash as a "similar situation":

https://www.newsweek.com/tesla-autopilot-smashes-deputys-vehicle-1592560
This one is my favorite "California Highway Patrol said that a preliminary investigation showed that autopilot "was engaged" in Hendrickson's Tesla Model 3 truck."

But I can't even tell what the point of the article is. They talk about one issue with hitting a car on the side of the road and then reference two other cases where it's not clear they had anything to do with Autopilot. They do acknowledge that, but then why reference those cases at all? It almost seems like the article is computer-generated.

I'm curious what a press office would do. Tesla released relevant factual information fairly quickly via Elon, which was later added to by Tesla and the authorities. Some ignored it, and some included it but still ran with the story like the one above. Perhaps it did stop some people with ethics. But even after it's obvious the initial statements by police were wrong, they still get referenced. Trying to stop that would be like playing whack-a-mole. And I can't imagine Tesla wants to make a big press blitz themselves, because at the end of the day two people died and burned in a Tesla. And then a senator scolded them for not waiting for the investigation to be completed.


----------



## Madmolecule

My takeaway is that they will not be releasing FSD in anything past basic Level 2 autonomy, with the driver still in full control and responsible. Whether Autopilot or Full Self-Driving was engaged or not, we purchased these vehicles believing in the FSD dream. I know I at least believed that the car would take over some level of the driving responsibility at some point.
I don't see this happening even with the greatest vision AI anytime soon. As soon as any vehicle in autonomous mode runs over a small child and a puppy while the driver's attention is elsewhere, it will be game over. When this accident or a similar one occurs, whose fault will it be? Will it go against the driver's record? How will insurance companies mitigate this risk? Can the driver be sued for not having radar?
Once the fantasy faded away, I came to realize that these are very heavy, very fast killing machines, and it will be a while before the computer can be responsible.
I think the AI is better suited for route recommendation, driving assistance, and infotainment curation. I do have faith that one day Tesla's AI will be so advanced it will be able to control the windshield wipers, headlights, and climate control. It might even be able to display the outside temperature properly, since I gave up on it ever being able to park itself in my garage.


----------



## Needsdecaf

Madmolecule said:


> My takeaway is that they will not be releasing FSD in anything past basic Level 2 autonomy, with the driver still in full control and responsible. Whether Autopilot or Full Self-Driving was engaged or not, we purchased these vehicles believing in the FSD dream. I know I at least believed that the car would take over some level of the driving responsibility at some point.
> I don't see this happening even with the greatest vision AI anytime soon. As soon as any vehicle in autonomous mode runs over a small child and a puppy while the driver's attention is elsewhere, it will be game over. When this accident or a similar one occurs, whose fault will it be? Will it go against the driver's record? How will insurance companies mitigate this risk? Can the driver be sued for not having radar?
> Once the fantasy faded away, I came to realize that these are very heavy, very fast killing machines, and it will be a while before the computer can be responsible.
> I think the AI is better suited for route recommendation, driving assistance, and infotainment curation. I do have faith that one day Tesla's AI will be so advanced it will be able to control the windshield wipers, headlights, and climate control. It might even be able to display the outside temperature properly, since I gave up on it ever being able to park itself in my garage.


Once people stop and think about any autonomous driving system that does not require the driver to be an active monitor, they realize that this gets into a hugely murky legal area, and that for a car to be FULLY autonomous, the car's manufacturer has to be the one legally responsible. Otherwise, if someone was told, "yeah, you can let the car drive, but if you crash, it's on you", who in their right mind would buy it?

Once you realize just how much risk and liability a company assumes with full autonomy, I think the realization sets in fast that this isn't happening any time soon. Too many "believers" haven't reached this conclusion as you have.


----------



## garsh

Had a family get-together on memorial day. One of the relatives asked me about the Tesla that was on Autopilot and crashed.

Sigh. It really is unethical for the media to run stories like this without fact-checking.
Remember the movie "All the President's Men"? Did reporters actually try to get at least two credible sources before publishing information? It's all just a memory now.


----------



## lance.bailey

There is no second place in the race to be "first to press"; unfortunately, that means truth is an interesting side effect at best.


----------



## iChris93

Needsdecaf said:


> Once people stop and think about any autonomous driving system that does not require the driver to be an active monitor, they realize that this gets into a hugely murky legal area, and that for a car to be FULLY autonomous, the car's manufacturer has to be the one legally responsible. Otherwise, if someone was told, "yeah, you can let the car drive, but if you crash, it's on you", who in their right mind would buy it?
> 
> Once you realize just how much risk and liability a company assumes with full autonomy, I think the realization sets in fast that this isn't happening any time soon. Too many "believers" haven't reached this conclusion as you have.


Tesla should do better, then, and stop selling "full self driving".


----------



## Needsdecaf

garsh said:


> Had a family get-together on memorial day. One of the relatives asked me about the Tesla that was on Autopilot and crashed.
> 
> Sigh. It really is unethical for the media to run stories like this without fact-checking.
> Remember the movie "All the President's Men"? Did reporters actually try to get at least two, credible sources before publishing information? It's all just a memory now.


One of my favorite movies. If I recall correctly, yes, they did.

I blame the press to a certain extent. But I also blame our dopey Constable who said "we are 100% sure no one was driving" the night of the crash before any investigation had taken place. I mean, if you were the press, and a law enforcement official having jurisdiction over the crash made that statement, wouldn't you run with it?


----------



## JasonF

Needsdecaf said:


> Once you realize just how much risk and liability a company assumes with full autonomy, I think the realization sets in fast that this isn't happening any time soon. Too many "believers" haven't reached this conclusion as you have.


That's not necessarily true. I'm not a lawyer, but I recently had to study up on how liability insurance works to understand why solar panels on your own house require so much insurance.

The answer was that as long as your solar panels push energy into the grid, _anyone_ who has knowledge of your solar array's existence and gets shocked from power lines can theoretically sue you for damages. In fact, if someone sues the power company for being shocked by power lines, the power company itself can turn and sue you for being the cause - and a lot of the time, it's harder to fight it than to file a claim with liability insurance.

I think the same would be true for self-driving cars. Even if you're not physically _in_ the car, you would be financially liable for anything that happens in or around it. Same as if your car catches fire and burns while parked: the owners of nearby cars can file claims with your insurance. So I would guess, in the end, when Full Self-Driving becomes viable, insurance carriers might initially require extra liability insurance to cover anything it does while it's driving itself. At least until they're thoroughly convinced that fewer things happen when the car is self-driving than while you're driving.

The scary part to that, though, is if you plan on letting your car drive by itself early on, you had better have deep pockets. Because with all of the fear surrounding self-driving, it's likely that _any_ crash your car gets into while driving itself - even if you're a passenger - will be assumed to be a fault in the self-driving system, and your liability insurance takes a hit.

Where it gets really tricky is if the self-driving car violates traffic laws - because generally, a cop can't cite for that unless they can prove you were driving, and they're generally required to serve the ticket to the driver directly (which is why you sign it). Even being in the car in a passenger seat doesn't count (they can tell if you hopped out of the driver's seat after being pulled over, though!). There's a possibility some departments will get around that restriction by impounding any self-driving car that violates any traffic law so they can physically serve the person who picks it up with the ticket.


----------



## Madmolecule

JasonF said:


> That's not necessarily true. I'm not a lawyer, but I recently had to study up on how liability insurance works to understand why solar panels on your own house require so much insurance.
> 
> The answer was that as long as your solar panels push energy into the grid, _anyone_ who has knowledge of your solar array's existence and gets shocked from power lines can theoretically sue you for damages. In fact, if someone sues the power company for being shocked by power lines, the power company itself can turn and sue you for being the cause - and a lot of the time, it's harder to fight it than to file a claim with liability insurance.
> 
> I think the same would be true for self driving cars. Even if you're not physically _in_ the car, you would be financially liable for anything that happens in or around it. Same as if your car catches fire and burns while parked, the owners of nearby cars can file insurance claims with your insurance. So I would guess, in the end, when Full Self Drive becomes viable, insurance carriers might initially require extra liability insurance to cover anything it does while it's driving itself. At least until they're thoroughly convinced that less things happen when the car is self driving than while you're driving.
> 
> The scary part to that, though, is if you plan on letting your car drive by itself early on, you had better have deep pockets. Because with all of the fear surrounding self-driving, it's likely that _any_ crash your car gets into while driving itself - even if you're a passenger - will be assumed to be a fault in the self-driving system, and your liability insurance takes a hit.
> 
> Where it gets really tricky is if the self-driving car violates traffic laws - because generally, a cop can't cite for that unless they can prove you were driving, and they're generally required to serve the ticket to the driver directly (which is why you sign it). Even being in the car in a passenger seat doesn't count (they can tell if you hopped out of the driver's seat after being pulled over, though!). There's a possibility some departments will get around that restriction by impounding any self-driving car that violates any traffic law so they can physically serve the person who picks it up with the ticket.


All valid points, but Tesla has been selling the product for two years. You would think by now they would have addressed the liability, the responsibility, and the path toward regulatory approval. I've also stated earlier that I think Tesla needs to provide a reinsurance or umbrella policy to other insurance providers; otherwise we will pay enormous premiums until the insurance companies can properly assess the risk.
If Tesla believes in their product and their data, it should be a no-brainer. I now wonder how much of the FSD price I paid was for future Tesla liability rather than for R&D. If you feel the answer is zero dollars, that's even scarier to me. I hope they thought this through.

There have been a lot of lies and problems from the media and the local responders, but Tesla has not been overly transparent in the past either, and I feel they have been more worried about their stock price than their customers. Because the Tesla software is proprietary, we know very little about how it works, and how well it works. If it were open-source code, I think they would have a better chance of passing the liability down to others. But with a closed system, I feel they accept all liability for the functionality of the system. I am also starting to get frustrated that I am not just a data collector; I have become their active R&D. Some of the recent complaints about the software suggest it wasn't tested much by in-house Tesla personnel. We all now realize that the label "beta" means you're responsible and Tesla is not. I don't see beta going away anytime soon. It's very evident to me that nothing non-critical is ever labeled beta. My best example is Mars lander mode: it does nothing, the car doesn't even move on the map, but it is not beta. I guess it's the final version of a polished program.

I know an attorney for the military who focuses on drone software liability. It is very complicated to write code that allows a computer to take someone's life. Before you say it is always a human that gives the command to kill with a drone, that is not what I'm talking about. What I'm saying is: if the drone's vision system believes it has the approved target within its sights and can make the kill, but instead misidentifies (the drone equivalent of phantom braking) and takes out the family instead, the computer just made the decision to kill innocent people. What confidence level does the computer need before it can decide it has a valid target and can take the shot? If it were 100%, we would never be able to take a shot. It's a battle of nines. I see the exact same problem with full self driving. How do you write code where the car decides whether to run over one person or three people? That is not the typical driving scenario, but at some point this horrific decision has had to be made by human drivers, so computers will also have to make it. I don't think Elon on the stand, talking about the statistical benefits of taking out your loved one while sparing three, will add much compassion for the victims' families.
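The "battle of nines" above can be made concrete with a minimal sketch. This is purely illustrative (not taken from any real targeting or FSD code); the `should_act` function and the `0.999` threshold are hypothetical values chosen to show the tradeoff:

```python
# Illustrative only: a decision rule that acts only when the classifier's
# confidence clears a threshold. Raising the threshold (adding "nines")
# reduces wrongful actions but also rejects more valid detections --
# at 100% confidence required, the system would never act at all.

def should_act(confidence: float, threshold: float = 0.999) -> bool:
    """Act only if the vision system's confidence exceeds the threshold."""
    return confidence >= threshold

# At a 99.9% threshold, the system still acts on ~1-in-1000 mistaken
# identifications; each extra "nine" cuts that rate tenfold.
print(should_act(0.9995))  # True  -- clears the threshold, system acts
print(should_act(0.95))    # False -- below threshold, system holds
```

Whoever picks that threshold is, in effect, encoding the ethical tradeoff in a single number, which is exactly why the liability question is so hard.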


----------



## Klaus-rf

> Once people stop and think and realize that any Autonomous Driving system that does not require the driver to be an active monitor, you realize that then this gets into a hugely murky legal area. And that really in order for a car to be FULLY autonomous, that the car's manufacturer has to be the one legally responsible. Otherwise, if someone was told, "yeah, you can let the car drive, but if you crash, it's on you", who in their right mind would buy it?


And you can place your bets now that ANY modification(s) to the vehicle, will void that mfgr liability. Perhaps not having the latest/current firmware installed will also void or limit their liability.



> The answer was that as long as your solar panels push energy into the grid, anyone who has knowledge of your solar array's existence and gets shocked from power lines can theoretically sue you for damages. In fact, if someone sues the power company for being shocked by power lines, the power company itself can turn and sue you for being the cause - and a lot of the time, it's harder to fight it than to file a claim with liability insurance.


That settles it. I'm not moving to Florida.


----------



## Needsdecaf

JasonF said:


> That's not necessarily true. I'm not a lawyer, but I recently had to study up on how liability insurance works to understand why solar panels on your own house require so much insurance.
> 
> The answer was that as long as your solar panels push energy into the grid, _anyone_ who has knowledge of your solar array's existence and gets shocked from power lines can theoretically sue you for damages. In fact, if someone sues the power company for being shocked by power lines, the power company itself can turn and sue you for being the cause - and a lot of the time, it's harder to fight it than to file a claim with liability insurance.
> 
> I think the same would be true for self driving cars. Even if you're not physically _in_ the car, you would be financially liable for anything that happens in or around it. Same as if your car catches fire and burns while parked, the owners of nearby cars can file insurance claims with your insurance. So I would guess, in the end, when Full Self Drive becomes viable, insurance carriers might initially require extra liability insurance to cover anything it does while it's driving itself. At least until they're thoroughly convinced that less things happen when the car is self driving than while you're driving.
> 
> The scary part to that, though, is if you plan on letting your car drive by itself early on, you had better have deep pockets. Because with all of the fear surrounding self-driving, it's likely that _any_ crash your car gets into while driving itself - even if you're a passenger - will be assumed to be a fault in the self-driving system, and your liability insurance takes a hit.
> 
> Where it gets really tricky is if the self-driving car violates traffic laws - because generally, a cop can't cite for that unless they can prove you were driving, and they're generally required to serve the ticket to the driver directly (which is why you sign it). Even being in the car in a passenger seat doesn't count (they can tell if you hopped out of the driver's seat after being pulled over, though!). There's a possibility some departments will get around that restriction by impounding any self-driving car that violates any traffic law so they can physically serve the person who picks it up with the ticket.


Uh, sure it's true. I never said that the car manufacturer assumes full liability. They should, but that's not the way our legal system works. Of course the driver is still liable. It's their car. HOWEVER, it's really easy to say to a defense attorney, as well as a plaintiff's attorney, "Look, this car has level X autonomy. That means I'm not operating the car, nor am I required to monitor the car, per the manufacturer's instructions, while the car is in motion. So while yes, this is my car, I wasn't driving. Also, I don't have much money, and the company that built and sold me this car has hundreds of millions, if not billions, in cash."

What do you think is going to happen?

Moreover...these Robotaxis Elon has been promising? That's even more liability on the automaker.

You say it's not necessarily true, but what you wrote more or less makes my point. It's a huge legal morass that will need to be solved before ANY manufacturer is brave enough to say "it'll be on us if you crash," or law is drafted stating, and limiting, liability.


----------



## Klaus-rf

Needsdecaf said:


> Uh, sure it's true. I never said that the car manufacturer assumes full liability. They should, but that's not the way our legal system works. Of course the driver is still liable. It's their car.


The driver is responsible for any criminal actions. It is always the vehicle owner that is responsible for damage, loss, and any injury/death liability. Thus the owner maintains [some] insurance on the vehicle, NOT on the driver.


----------



## Needsdecaf

Klaus-rf said:


> The driver is responsible for any criminal actions. It is always the vehicle owner that is responsible for damage, loss, and any injury/death liability. Thus the owner maintains [some] insurance on the vehicle, NOT on the driver.


So what happens when the "car" is driving, and the human is removed from the equation? Why should they be held responsible for actions beyond their control?


----------



## Klaus-rf

Needsdecaf said:


> So what happens when the "car" is driving, and the human is removed from the equation? Why should they be held responsible for actions beyond their control?


The OWNER is still the responsible party - unless other risk acceptance has been assigned. Just like someone tripping, falling, and being injured on the sidewalk in front of your house. Even though the city demanded the access and easement, and then installed the sidewalk, the property owner is the one that gets sued.


----------



## JasonF

Needsdecaf said:


> So what happens when the "car" is driving, and the human is removed from the equation? Why should they be held responsible for actions beyond their control?


It's not so much who controls it as who owns the property. That's why drivers hit by a company-owned van are able to sue a company behind it. If it was restricted to the person driving, then the liability would be limited to the employee driving the van. Or worse yet, it could create a "it's nobody's fault" situation because the driver doesn't own the car, and the company wasn't in control of it.


----------



## DocScott

Maybe another analogy will help: if you buy a dog, and then the next day your dog attacks the guy who comes to read the gas meter, you could get sued. You didn't train the dog. You didn't order the dog to attack the gas meter guy. You weren't even home when it happened. But you chose to buy the dog and keep it in your yard, so since we don't let people sue the dog, you're liable. Sure, if the dog's training was misrepresented by the person who sold the dog, they could conceivably be sued too. But the first target of the lawsuit would probably be the owner.

And yet that doesn't keep people from buying dogs. As long as the risk of a lawsuit is low, most people don't worry about it.


----------



## Long Ranger

Klaus-rf said:


> The driver is responsible for any criminal actions. It is always the vehicle owner that is responsible for damage, loss, and any injury/death liability. Thus the owner maintains [some] insurance on the vehicle, NOT on the driver.


The owner isn't always liable; it's more complex than that. Here's a case where the owner wasn't:
Car owner not liable
This law firm states:
"A vehicle owner may be liable for the negligence of a driver if the driver was acting as the owner's agent, and the owner controlled or had the right to control details of the physical movement of the agent. Both parties must consent to the principal-agent relationship."

With a self-driving car, the owner may very well be held liable. I'd also expect a lot of successful claims against the car manufacturers. Our society isn't too tolerant when a known product defect, or inadequate product testing results in personal injury.


----------



## JasonF

Long Ranger said:


> With a self-driving car, the owner may very well be held liable. I'd also expect a lot of successful claims against the car manufacturers. Our society isn't too tolerant when a known product defect, or inadequate product testing results in personal injury.


This is why I often talk about Americans' tendency to push courts and/or government bodies to take any and all measures that ensure that whatever happened can never happen again. But that demand depends largely on what the magnitude of the event was, and how people feel about it.

For instance if Tesla Full Self Drive causes a crash, it might make news, and generate lawsuits, but not much else would happen. If FSD runs over an adult, the outcome would depend on whether that adult was crossing the street properly. FSD runs over a child, on the other hand, and no matter where that child was crossing, FSD's days are likely numbered. Which is where I've predicted a few times that something might happen to make the U.S. the only place where self driving cars are banned, while Europe and the rest of the world make it standard.

Then again, sometimes there are strange outcomes. Like I would have been certain that the first time there was a large scale event involving guns in the U.S. where children were killed, all guns would become illegal to own. But it happened already, and they're still legal. So I guess sometimes those emotionally driven reactions don't happen.


----------



## Needsdecaf

You guys are sugar coating the liability issue. It's a huge liability for the manufacturer. 

Have any of you ever actually read the full definition of the SAE levels of automation in detail and understand what is expected of the vehicle and the occupant?


----------



## Madmolecule

Needsdecaf said:


> You guys are sugar coating the liability issue. It's a huge liability for the manufacturer.
> 
> Have any of you ever actually read the full definition of the SAE levels of automation in detail and understand what is expected of the vehicle and the occupant?


This is why I felt that, regardless of what Tesla's engineers' confidence level is and what Elon believes, there is no way the board of directors will let them release FSD without the driver bearing full responsibility at all times. When I first got mine, I was led to believe that as Tesla received more data from its fleet of automated vehicles, the nag would be lengthened and then disappear. Tesla is proven now. They are building Gigafactories all over the place, they can't manufacture fast enough to keep up with demand, and the only way they can lose their commanding lead is by taking responsibility for the driving of your vehicle. But it is time for Tesla to be honest. If it takes an attention score above 92, as determined by Tesla's AI, for you to be able to use automated functions, that should be clear when you purchase the vehicle. Because I don't know what the definition of "full self driving" is. Does it mean that I will be doing the driving, fully? I am told later this year we will have city streets. I've driven on a city street before, so I think I know what that means, but I'm sure I'll find out later that I have no idea what "city streets" means.
If it is not full self driving and will never be full self driving, and I am only the manager of an autonomous-driving AI system, I think I might just rather drive the car myself. I know how to drive a car, maybe not all that well, but managing a teenager learning how to drive, even if it is a computer, is a bit stressful and removes a lot of the benefit of automation.


----------



## JasonF

Needsdecaf said:


> You guys are sugar coating the liability issue. It's a huge liability for the manufacturer.
> 
> Have any of you ever actually read the full definition of the SAE levels of automation in detail and understand what is expected of the vehicle and the occupant?


You might be confusing liability with legal regulation.

Tesla itself gives up all rights to the vehicle and what happens to it the moment you sign the delivery paperwork. It involves a legal concept I believe is called _assignment of title._ It's the same thing that happens when you sell a house to someone. Once that signature hits paper - and assuming no fraud has been committed in the deal - all of Tesla's liability is gone.

Where legal regulation comes in, FSD is an unregulated grey area right now. People might be able to sue Tesla on the grounds that they claimed FSD was safer than human drivers. Until self driving tech gets an official stamp of approval from government regulators rather than just letting it be, any automotive company with self-driving tech is at risk of that. We might even see a strange partnership someday where GM and Ford are fighting a lawsuit alongside Tesla to avoid having all of the self driving tech deemed dangerous.

FYI, I'm still not a lawyer, I learned this stuff from reading and watching Lehto's Law videos.


----------



## Klaus-rf

Note that these "laws" and "liabilities" vary state-to-state. Eventually (decades away??) there may be federal laws that apply to Auto-Driving, but the liability issues will still be state defined. Just as they are now.


----------



## Long Ranger

JasonF said:


> Tesla itself gives up all rights to the vehicle and what happens to it the moment you sign the delivery paperwork. It involves a legal concept I believe is called _assignment of title._ It's the same thing that happens when you sell a house to someone. Once that signature hits paper - and assuming no fraud has been committed in the deal - all of Tesla's liability is gone.


I'm no lawyer, but I know this isn't true. Product manufacturers are held liable for injuries caused by their products all the time. When a self driving car hits someone, I think it will be pretty easy to argue that it was due to a product defect.


----------



## JasonF

Long Ranger said:


> I'm no lawyer, but I know this isn't true. Product manufacturers are held liable for injuries caused by their products all the time. When a self driving car hits someone, I think it will be pretty easy to argue that it was due to a product defect.


That's a different legal concept. If you pick up your Tesla delivery and then immediately crash into another car as you're leaving, that other driver can't _successfully_ sue Tesla for causing the crash, whether a human or the car is driving. What they might try to do is prove that Full Self Drive, or the car itself, had a defect that caused the crash, but as long as it's working the way it's supposed to, that's really difficult. It's slightly easier now because the government hasn't officially certified any self-driving systems, but once they do, it'll be close to impossible to prove a defect unless it happens often.

I emphasized _successfully_ above, because these days anyone can try to sue any person or company for pretty much any reason. With that same crash example above, it's entirely possible the other driver would sue you, Tesla, and also whatever company you work for because they allowed you to take time off to pick up the car.


----------



## Long Ranger

JasonF said:


> What they might try to do is prove that Full Self Drive, or the car itself, had a defect that caused the crash, but as long as it's working the way it's supposed to, that's really difficult. It's slightly easier now because the government hasn't officially certified any self-driving systems, but once they do, it'll be close to impossible to prove a defect unless it happens often.


We shall see. I agree that liability claims are more complex, difficult, and expensive to pursue than driver negligence claims. However, I think it will be relatively easy to find liability for crashes which would have been avoided a) by a typical human, b) by a competitor's system, or c) if the manufacturer's software had simply assigned more weight to one of the safety decisions. "Are you saying your system chose to drive over this child by design, or because of a defect in the design?"

There are numerous papers predicting this likely shift from driver negligence to product liability with the eventual transition to autonomous vehicles. Here's one example:
Autonomous Driving and Product Liability
Note that this author makes an admittedly rough calculation that covering product liability costs could add between $6k and $30k to the price of an autonomous vehicle (see page 60).


----------



## Madmolecule

Long Ranger said:


> We shall see. I agree that liability claims are more complex, difficult, and expensive to pursue than driver negligence claims. However, I think it will be relatively easy to find liability for crashes which would have been avoided a) by a typical human, b) by a competitor's system, or c) if the manufacturer's software had simply assigned more weight to one of the safety decisions. "Are you saying your system chose to drive over this child by design, or because of a defect in the design?"
> 
> There are numerous papers predicting this likely shift from driver negligence to product liability with the eventual transition to autonomous vehicles. Here's one example:
> Autonomous Driving and Product Liability
> Note that this author makes an admittedly rough calculation that covering product liability costs could add between $6k to $30k to the price of an autonomous vehicle (see page 60).


I totally agree. My point earlier was that if Tesla did not include that $6k-$30k in the cost of the $10,000 FSD upgrade, I can't imagine they will release a version where they take any responsibility. In fact, I think we will see subsequent upgrades include a lot more checkboxes making it very clear that we are responsible and that the product has not been certified for unattended use at all. I just feel the nags and monitoring will make the driver just drive the car themselves. We might even see insurance companies start adding verbiage as to whether or not you can use FSD, and if so, that you are still responsible for everything it does. They might even adjust some of your limits of coverage if you can't prove you were fully in control and aware of everything that was happening.


----------



## Needsdecaf

JasonF said:


> You might be confusing liability with legal regulation.
> 
> Tesla itself gives up all rights to the vehicle and what happens to it the moment you sign the delivery paperwork. It involves a legal concept I believe is called _assignment of title._ It's the same thing that happens when you sell a house to someone. Once that signature hits paper - and assuming no fraud has been committed in the deal - all of Tesla's liability is gone.
> 
> Where legal regulation comes in, FSD is an unregulated grey area right now. People might be able to sue Tesla on the grounds that they claimed FSD was safer than human drivers. Until self driving tech gets an official stamp of approval from government regulators rather than just letting it be, any automotive company with self-driving tech is at risk of that. We might even see a strange partnership someday where GM and Ford are fighting a lawsuit alongside Tesla to avoid having all of the self driving tech deemed dangerous.
> 
> FYI, I'm still not a lawyer, I learned this stuff from reading and watching Lehto's Law videos.


No, I'm not, and I'm done arguing. Go read the entire text on the SAE Taxonomy of Automated systems and get back to me.

Level 4 and 5 involve no driver needed. They don't even need to be in the car. Yet you expect the owner to maintain liability for an autonomous vehicle that they aren't even in? You know, the Robotaxi that Elon keeps promising?

LOL, come on.


----------



## JasonF

Needsdecaf said:


> No, I'm not, and I'm done arguing. Go read the entire text on the SAE Taxonomy of Automated systems and get back to me.
> 
> Level 4 and 5 involve no driver needed. They don't even need to be in the car. Yet you expect the owner to maintain liability for an autonomous vehicle that they aren't even in? You know, the Robotaxi that Elon keeps promising?
> 
> LOL, come on.


I often get myself in trouble by attempting to make an interesting discussion out of something that's already been decided, and then I end up in serious trouble. So I'm going to take what I hope is the smarter route and bail out while I'm still safe.


----------



## Needsdecaf

JasonF said:


> I often get myself in trouble by attempting to make an interesting discussion out of something that's already been decided, and then I end up in serious trouble. So I'm going to take what I hope is the smarter route and bail out while I'm still safe.


I haven't decided anything. But your discussion points ignore the reality that someone else is in control of the vehicle. If you can't at least see that that enters into the equation, I don't know what to tell you.

And please, read the SAE paper. It's worth your time regardless.


----------



## garsh

Needsdecaf said:


> Level 4 and 5 involve no driver needed. They don't even need to be in the car. Yet you expect the owner to maintain liability for an autonomous vehicle that they aren't even in? You know, the Robotaxi that Elon keeps promising?
> 
> LOL, come on.


If I understand @JasonF 's point, it's just that current U.S. laws mean that the owner will be legally liable for any accidents that the car they own gets into, regardless of any autonomous ability.

It sounds like you're talking about liability in a real-world sense rather than in a legal sense.


----------



## DocScott

Needsdecaf said:


> Level 4 and 5 involve no driver needed. They don't even need to be in the car. Yet you expect the owner to maintain liability for an autonomous vehicle that they aren't even in? You know, the Robotaxi that Elon keeps promising?
> 
> LOL, come on.


Yes, I do.

Let's move this away from AI for a moment.

Suppose I run a school bus company. I buy the school busses, hire drivers, and then one of my school busses gets in an accident where the driver I hired is at fault.

Do you really think my insurance doesn't have to pay out because I wasn't driving, and wasn't even in the bus? Or that I couldn't be successfully sued for damages? That would be crazy. Of course the owner maintains liability for their vehicles, even when they're not behind the wheel, whether the actual driver is an AI, someone you hired, or a friend you loaned the car to.

I can _imagine_ a legal system where it's the driver who has to carry the insurance, but it's not the system that's currently in place in the US--it's the owner who needs to insure the vehicle, and who is liable for accidents. Since that's the system, L4 or L5 doesn't change anything.

(And yes, Tesla can be sued as well, if FSD misbehaves in some way, just as they can be sued if a defect causes the brakes to fail. But there's still nothing special about the autonomous aspect.)


----------



## Needsdecaf

DocScott said:


> Yes, I do.
> 
> Let's move this away from AI for a moment.
> 
> Suppose I run a school bus company. I buy the school busses, hire drivers, and then one of my school busses gets in an accident where the driver I hired is at fault.
> 
> Do you really think my insurance doesn't have to pay out because I wasn't driving, and wasn't even in the bus? Or that I couldn't be successfully sued for damages? That would be crazy. Of course the owner maintains liability for their vehicles, even when they're not behind the wheel, whether the actual driver is an AI, someone you hired, or a friend you loaned the car to.
> 
> I can _imagine_ a legal system where it's the driver who has to carry the insurance, but it's not the system that's currently in place in the US--it's the owner who needs to insure the vehicle, and who is liable for accidents. Since that's the system, L4 or L5 doesn't change anything.
> 
> (And yes, Tesla can be sued as well, if FSD misbehaves in some way, just as they can be sued if a defect causes the brakes to fail. But there's still nothing special about the autonomous aspect.)


Your insurance pays for that because that's what your insurance policy is for. It covers your business, and your operators.

But, say the driver gets in an accident and kills someone. Who is getting sued and / or prosecuted for involuntary manslaughter? You as the school bus owner? Or the driver?

What happened to the Uber safety monitor driver in AZ? I can't find anything on that. Edit, found it, she was charged with negligent homicide. Apparently because she was there to be the monitor and she wasn't.

But I'm not the only one asking this question: https://slate.com/technology/2020/10/uber-self-driving-car-death-arizona-vs-vasquez.html

Here's a lawyer already targeting Waymo:

https://www.baumhedlundlaw.com/car-accidents/self-driving-car-accidents/waymo-accidents/


----------



## Madmolecule

Everyone will probably be sued. The actual driver will probably get the manslaughter conviction. The school district will probably have to pay massive civil penalties. If the brakes were deemed faulty, or the district had not kept up on the recommended maintenance, then it is very possible the driver could get off and the district would end up with the manslaughter charges. But honestly, I think this is a fairly simple case.
Here's where it gets more complicated: the driver and the owner of the vehicle assumed that the vehicle has the capability of autonomous driving, let's say with limited driver attention. If that window is such that the driver can go for 10 seconds without full attention to everything in his surroundings, that is a long time, and a lot of bad things can happen. My belief is that unless the driver is required to have the same level of attention whether autonomy is used or not, the liability has to move to the manufacturer and the supplier of the control systems that are now in control.
It is clear that Tesla is already coming up with new criteria for a successful manager of their automated system. Through their in-cabin camera monitoring, they are determining whether you pay enough attention to use an automated system. The reverse could also be true: if the system has certified that you are paying enough attention to allow automated functions, and something bad happens, shouldn't Tesla bear some of the responsibility? And where does the insurance company come in? How does my insurance company assess my ability to manage an automated system, as opposed to driving a conventional vehicle?
Every manufacturer has really backed off their claims of full self driving, and I think Tesla will shortly. They are focused on automated functions such as advanced cruise control and self-parking, which make the driving experience better while you are still in full control.
I probably get poor attention points from Tesla's AI camera for butchering songs I am singing along to, and I would assume air drumming is out of the question.


----------



## DocScott

Needsdecaf said:


> Your insurance pays for that because that's what your insurance policy is for. It covers your business, and your operators.
> 
> But, say the driver gets in an accident and kills someone. Who is getting sued and / or prosecuted for involuntary manslaughter? You as the school bus owner? Or the driver?


I really didn't have a particular case in mind when I described my scenario. But since you asked, I Googled it. Here's an example where the bus driver _and_ the bus owner lost a lawsuit when a pedestrian was hit and injured. The legal theory was a little different than I was describing, because it had to do with the driver being an employee rather than the bus being owned by the bus company, but I still expect something like that would hold with autonomous vehicles.

It does seem to me that you're talking about two different things, and on one I pretty much agree with you and on the other I don't.

For _criminal_ liability, I don't think they'd charge the owner. So if your L5 Tesla, operated as Tesla told you to operate it, hits a pedestrian while there's no one in the vehicle, I don't think you'd get charged with manslaughter. But if it could be shown that Tesla executives knew about the issue and sold the car anyway, they might get charged with some sort of criminally-negligent manslaughter.

For _civil_ liability, though, I bet the owner _would_ be included. Your car caused the problem; that makes you partially responsible.

The two articles you link to show that distinction. The Uber one is about criminal charges; the Waymo one is about civil lawsuits.


----------



## Madmolecule

The first question every attorney asks after an accident is what the extent of your coverage is. They will then go after all pockets, but especially the deepest pockets. The manufacturer will certainly be brought into the lawsuit if there's any possibility that the car's automation was at fault. The famous McDonald's coffee lawsuit succeeded not so much because of the injuries to the woman's body, but because McDonald's was held at fault for not installing a cheap thermostat on the coffee maker. Even if it wasn't a corporate cover-up, if it could be proved that the software was not tested enough and was not suitable for the conditions it was being operated in, I can assure you Tesla will be sued.

This is why I thought they should release this product in a country with looser laws. Once they have the data that autonomous driving improves safety, then it should be brought into other countries, like the US. But I don't see how they can build up the data in the United States showing that an automated driving car will, over time, kill fewer children than a human-controlled vehicle. Until they have this data, the first child killed will be game over. Juries have sympathy for a human who says "I made a mistake, it was a really poor judgment call, I was tired from working a long shift, but I did the best I could at the time." I don't think they have the same sympathy for computers and their programmers, who claim that with one more update, due out in another two weeks, this never would have happened. And when it does come out it will be amazing, two fire emoji.

Do any insurance policies allow for unattended autonomous driving, even for seconds? Does Tesla insurance?
I guess I thought it was driver's insurance, not automated-system-management insurance.

Elon: electrify Cuba and prove you can not only sell a car with full self driving, but actually provide one.


----------



## DocScott

Madmolecule said:


> Do any insurance policies allow for unattended, even for seconds, autonomous driving? Does Tesla insurance?
> I guess I thought it was drivers insurance, not automated system management insurance.


I just looked through my insurance policy (GEICO). It had quite a few restrictions, but that wasn't one of them. They may change their policies in the future, but if I miraculously got L3 firmware in my Tesla before the next time I renew my insurance (at which point GEICO could change the policy), I'd be covered.

And in fact, it's not "driver's insurance." It is described as "automobile insurance." Here is the key line in the liability section:

"We will pay damages which an insured becomes legally obligated to pay because of bodily injury sustained by a person; and property damage arising out of the ownership, maintenance, or use (including loading or unloading) of the owned auto..."

Yes, it also has some limited extensions to when I'm driving some other car I don't own, but that's more restricted. The main insurance is for when my car causes trouble, whether I'm in it or not.


----------



## Klaus-rf

Madmolecule said:


> This is why I thought they should release this product in a country with looser laws. Once they have the data that autonomous driving, improves safety then it should be brought into other countries, like the US. But I don't see how they can build up the data in the United States that an automated driving car overtime will kill less children then a human controlled vehicle. Until they have this data the first child killed, will be game over. Jury's have sympathy for a human I understand that said I made a mistake, I am really poor judgment call, I was tired from working a long shift, but I did the best I could at the time. I don't think they have the same sympathy for computers and their programmers, that claim that in one more update that would be out in another two weeks, this would've never happened. And when it does come out it will be amazing, two fire emoji.


In the human driver case above, perhaps that ONE person [after killing someone because they drove drunk / too tired / distracted / etc.] will learn and correct future driving operations. However, only one driver will learn from that incident. With some level of updatable automation, all partially- or fully-automated vehicles can learn from one incident in a future update. Updating all future [human] drivers will take generations - think seatbelts.

( And why are we talking about autonomous cars? We don't have those yet. We have driver aids, and ALL of those aids are still ßeta. Any real definition of ßeta means "feature complete, with bugs to be resolved," and we're not even close to feature complete, so in reality what we have now is αlpha code. )


----------



## Madmolecule

Klaus-rf said:


> ( And why are we talking about autonomous cars? We don't have those yet. We have driver aids - and ALL of those aids are still ßeta (and any real definition of ßeta means "feature complete". with bugs to be resolved - and we're not even close to "feature complete" so, in reality, what we have now is αlpha code. )


We are only talking about autonomous/full self driving because that is what I was sold years ago. I agree; even to call it Beta, you would at least think they would have to define the capabilities of the software, and at least release the specs.
All the data that Tesla touts as proving their vehicles are safer with Autopilot than without it is somewhat misleading. The way they should present the data is that the combination of a driver fully in control along with Autopilot is safer than the driver alone. I don't know that they have any Autopilot data that is representative.
What will the City Streets functionality be? We all have Fantasy, Uncertainty, and Doubt about its capabilities, but none of us know what it will actually do. Again, this is something I paid for almost two years ago.
I've been laughing at the Roadster buzz generated by a little plaque they put up at a museum. Not a video of a car actually doing 0 to 60 in 1.1 seconds. Instantly all the bloggers come out and say it's been confirmed: the Roadster really can do 1.1 seconds. Nothing has been confirmed; they put up a plaque at a museum, and Elon tweeted that it will have fart thrusters. They have taken the corporate mentality of a plaque and a promise and brought it into modern times by adding a follow-up tweet from Elon. Yeah, they don't need a PR department. They get all the marketing they need from existing customers, and by giving away free Roadster fantasies.

Also, removing radar was a terrible move. It's one thing if you never had it. The first time a Tesla runs over a kid without radar, I would imagine the attorneys for the victim would certainly ask whether there is any way this tragedy could've been prevented if there were radar in the car. It's one thing to have never had radar and be able to honestly answer "we don't know, but we don't think so." It's another thing to have millions of miles of radar data showing that it might in fact have helped, even in the slightest way, to save the kid's life. You would think that if there were a supply chain issue, Tesla would've recalled early-model Teslas whose owners did not purchase full self driving and pulled the radar out of them. I still think it is a patent infringement issue, where Tesla wants to be the only vision-only supplier; if they can make that work, they would have a lock on it. It's only my belief, but I'm trying to find something that makes sense. But they did spend money on aero wheels, and cut weight by reducing insulation. I've got extended range but terrible road noise, which sadly my immersive sound can sometimes just barely overcome.

Elon Electrify Cuba


----------



## Klaus-rf

Madmolecule said:


> All the data that Tesla touts that proves their vehicles are safer when an auto pilot then without it's somewhat misleading. Basically the way they should present the data is that the combination of the driver fully in control along with the automated auto pilot is safer than the driver alone. I don't know that they have any auto-pilot data that is representative.


You bring up a very good point here: there is ZERO data showing that AP/FSD all by itself (since it doesn't exist) is safer than human-only drivers, because there is no non-human-driver data to compare it with.

Reminds me of my favorite t-shirt logo:

"There are two kinds of people in the world: Those that can extrapolate from incomplete data"


----------



## M3OC Rules

Liability is something that needs to be worked out, but I don't believe it's a major limiting factor. Tesla providing insurance will go a long way toward alleviating that. But other insurance companies will probably want in on the game, so they will figure it out. Look at Waymo: liability is not their biggest problem. It's capability and scaling. It doesn't run people over, but it might get stuck somewhere. Basically engineering problems. I think once they get to the point where it's competent to run on its own, the number of overall accidents will be lower; more importantly, the number of fatal accidents caused by FSD will be much lower than with human drivers.


----------



## Tesla4Me!

Madmolecule said:


> We are only talking about autonomous/full self driving because that is what I was sold years ago. I agree to call it Beta you would at least think they would have to defined the capabilities of the software, And at least release the specs.
> All the data that Tesla touts that proves their vehicles are safer when an auto pilot then without it's somewhat misleading. Basically the way they should present the data is that the combination of the driver fully in control along with the automated auto pilot is safer than the driver alone. I don't know that they have any auto-pilot data that is representative.
> What will the city streets functionality be, we all have Fantasy Uncertainty and Doubt about his capabilities But none of us know what it will actually do. Again this is something I paid for almost 2 years ago.
> I've been laughing at the roadster buzz from a little plaque they put up at a museum. Not a video of even a car doing 0 to 60 in 1.1 seconds. Instantly all the bloggers come out and say it's been confirmed, the roadster I can really do 1.1 seconds. Nothing has been confirmed they put up a plaque at a museum, and Elon said it will have fart thrusters via tweet. They have taken the corporate mentality of a plaque and a promise. They brought it into modern times by adding a follow up tweet by Elon. Yeah they don't need a PR department. They get all the marketing they need from existing customers, and by giving away free roadsters fantasies
> 
> Also removing radar was a terrible move. It's one thing if you never had it. The first time a Tesla runs over a kid without a radar, I would imagine the Defense for the victim would certainly ask is there anyway this tragedy could've been prevented if there was radar in the car. It's one thing I have never had radar and be able to honestly answer we don't know but we don't think so. It's another thing to have millions of miles of radar data showing that in fact it might've helped even in the slightest way to save the kids life. You would think if there was a supply chain issue tesla would've recalled early model teslas they did not purchase full self driving and Rob the radar is out of them. I still think it is a patent infringement issue where Tesla wants to be the only vision only Supplier and if they can make that work they would have a lock on it. It's only my belief but I'm trying to find something that makes sense. But they did spend money on aero wheels, and cut weight by reducing insulation. I've got extended range but terrible road noise, which sadly sometimes my immersive sound can just barely overcome.
> 
> Elon Electrify Cuba


Umm, I don't think radar can see kids since they are not metal.


----------



## DocScott

Tesla4Me! said:


> Umm, I don't think radar can see kids since they are not metal.


Radar visibility has nothing to do with whether an object is metal.

Think about it: weather radars see rain!

EDIT: My statement was too strong. Radar visibility is affected by the composition of an object, and metals are particularly reflective. The same is true for visible light: metals are shiny, and you've got a better chance of catching a glint of sunlight off a distant metal object than a matte one! But it's certainly not the case that an object has to be metal in order for radar to see it.


----------



## MelindaV

Lidar is the one that doesn't do well with people (liquids).


----------



## garsh

MelindaV said:


> Lidar is the one that doesnt do well with people (liquids).


Lidar just uses light. Raindrops can reflect it making it useless, but it should see people just fine.


----------



## Tesla4Me!

DocScott said:


> Radar visibility has nothing to do with whether an object is metal.
> 
> Think about it: weather radars see rain!


Our car radar can see metal objects THROUGH rain. If it stopped at rain, it would not be able to see anything beyond the rain. I am pretty sure our car radar does not see people.


----------



## Madmolecule

Tesla4Me! said:


> Radar can see metal objects THROUGH rain. And through people. It does not see people.


Radar can reflect off people, unless they're extremely dehydrated; it turns out they have a little bit of water in them. Radar is actually a great tool for measuring water level: it has the ability to see through less dense things like foam and bring back an accurate water level. It actually sees the raindrops, but it also sees past where there are no drops, which is the majority of the volume. That might help them make a windshield wiper that works. LiDAR can do this too; it just doesn't reflect back readings that are as accurate.

A major advantage of radar is that it is not as susceptible to dirt and pollen as a camera. On a vision-only system, I would hate for them to use the excuse that they ran over the kid because the camera was dirty. What is the driver's responsibility for camera cleaning? Am I responsible for making sure it has at least 97% transmittance? How would I know that?


----------



## Tesla4Me!

Tesla4Me! said:


> Our car radar can see metal objects THROUGH rain. If it stopped at rain, it would not be able to see anything beyond the rain. I am pretty sure our car radar does not see people.





> *Q. Can radar detect people, or will nearby people interfere with sensing the target?*
> A higher frequency radar sensor (122 GHz) will more reliably detect people but is *not* intended to be used in personnel detection as a safety rated device. Depending on sensor sensitivity settings, a person could interfere with sensing the intended target if a person is near the sensor and within the sensor's direct field of view.


Tesla's radar is reported to be in the 76-77 GHz range, so it is not as sensitive to water or people. A sample reference: https://www.bannerengineering.com/z.../radar-sensor-frequently-asked-questions.html
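For a rough sense of scale, the wavelengths at those two frequencies can be computed directly. A back-of-the-envelope Python sketch (the frequencies come from the FAQ above; everything else here is just the standard wavelength formula):

```python
# Wavelength comparison: automotive-band radar vs. the higher-frequency
# sensor the FAQ says detects people more reliably. Shorter wavelengths
# interact more strongly with small, watery targets.
C = 299_792_458.0  # speed of light, m/s

def wavelength_mm(freq_ghz: float) -> float:
    """Wavelength in millimetres for a carrier frequency given in GHz."""
    return C / (freq_ghz * 1e9) * 1000

print(f"76.5 GHz automotive radar: {wavelength_mm(76.5):.2f} mm")
print(f"122 GHz sensor:            {wavelength_mm(122):.2f} mm")
```

So the automotive band sits near 4 mm, while the 122 GHz sensor is closer to 2.5 mm.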


----------



## DocScott

Tesla4Me! said:


> Tesla's radar is reported to be in the 76 -77 GHz range, so not as sensitive to water or people. A sample reference: https://www.bannerengineering.com/z.../radar-sensor-frequently-asked-questions.html


I agree that Tesla radars and the AI interpreting them are not good at detecting pedestrians. But I think you're putting way too much emphasis on people not being metal. The radar reflectivity of things that are mainly water, like people, isn't quite as good as metal, but is still quite good. (In reply to your earlier comment: of course the radar is not _stopped_ by rain. Even weather radar isn't stopped by rain, unless it's insanely heavy--if it were, all the radar could see would be the rain nearest to it, and if it were raining at the radar site, that would be pretty useless!)

There are two main reasons Tesla radar has trouble with pedestrians:

1) Pedestrians are much smaller than cars. That creates resolution issues.

2) Pedestrians move much more slowly than cars moving on a road.

#2 is the crucial one. Tesla radar is going to be getting returns from the environment all the time...ground level is a crowded place! Even the road surface itself may generate returns. So the radar needs a way of identifying what objects are important. It turns out with radar it's pretty easy to tell if an object is moving, via the Doppler shift. When radar bounces off a moving object, it comes back at a different frequency than it went out at. That's how police radar guns determine a vehicle's speed for instance (same with pitch speed in baseball). It's easy to note the different frequency, in the radar return, and thus know there's a moving object out there and how fast it's moving.
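To put rough numbers on that Doppler shift, here's a back-of-the-envelope Python sketch. It uses the standard monostatic-radar formula f_d = 2·v·f0/c for a target moving along the beam; the 77 GHz carrier is an assumption based on the reported frequency band, not a confirmed Tesla spec:

```python
# Doppler shift of a radar return from a target closing along the beam.
C = 299_792_458.0   # speed of light, m/s
F0 = 77e9           # assumed 77 GHz automotive radar carrier

def doppler_shift_hz(speed_mps: float) -> float:
    """Frequency shift (Hz) of the echo from a target closing at speed_mps."""
    return 2 * speed_mps * F0 / C

print(f"car at 30 m/s (~67 mph):     {doppler_shift_hz(30):,.0f} Hz")
print(f"walking pedestrian, 1.4 m/s: {doppler_shift_hz(1.4):,.0f} Hz")
print(f"stationary wall:             {doppler_shift_hz(0):,.0f} Hz")
```

A moving car shifts the return by tens of kilohertz, a walking pedestrian by only a few hundred hertz, and a wall returns the carrier unshifted, which is exactly why stationary objects blend into the clutter.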

That means that Tesla radar is bad at detecting stationary objects, whether pedestrians, cars, trucks, or walls. It's programmed to ignore all those returns, because it's almost always getting returns like that. Most of the bad collisions Teslas have had while on AP have been with stationary objects.

A pedestrian is nearly stationary, and thus Tesla radar/AI will generally ignore it, relying on vision instead.

In short, I agree with your main point: radar isn't being used in Teslas to detect pedestrians. But I disagree with your reason as to why; whether something is metal or not isn't the main issue. It's primarily how fast it's moving and to a lesser degree its size.


----------



## JasonF

Weather Doppler radar returns reflectivity, velocity, range, and height measurements. Where that kind of applies here is that they filter out ground interference by removing detected objects with very high reflectivity and zero velocity. 

I don’t know how similar the radar Tesla uses is to that, but if it is, it sort of explains how Autopilot ends up hitting stationary objects in the road. It doesn’t see them because of how the filtering works.

Pedestrians would be hard to detect the same way pop up showers are hard to detect in weather - because they have very low velocity, brief reflectivity, and a very small cross section. So radar would see, then not see, then see again that pedestrian.
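As a toy illustration of that kind of clutter filtering (every name, field, and threshold below is made up for the sketch; this is not anything from Tesla's actual stack):

```python
# Toy clutter filter: drop returns whose radial velocity is near zero,
# the way ground interference is removed from Doppler radar data.
from dataclasses import dataclass

@dataclass
class Return:
    label: str
    radial_velocity_mps: float  # Doppler-derived closing speed
    cross_section_m2: float     # rough radar cross section

VELOCITY_FLOOR = 2.0  # m/s; anything slower is treated as ground clutter

def filter_clutter(returns):
    """Keep only returns moving fast enough to stand out from clutter."""
    return [r for r in returns if abs(r.radial_velocity_mps) >= VELOCITY_FLOOR]

scene = [
    Return("oncoming car",  25.0, 10.0),
    Return("overhead sign",  0.0, 15.0),  # big, bright, stationary: filtered
    Return("pedestrian",     1.4,  0.5),  # slow and small: also filtered
]
for r in filter_clutter(scene):
    print(r.label)
```

The troubling part is visible in the toy scene: a slow-walking pedestrian falls below the same velocity floor that removes the overhead sign, so only the oncoming car survives the filter.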


----------



## Tesla4Me!

DocScott said:


> I agree that Tesla radars and the AI interpreting them are not good at detecting pedestrians. But I think you're putting way too much emphasis on people not being metal. The radar reflectivity of things that are mainly water, like people, isn't quite as good as metal, but is still quite good. (In reply to your earlier comment: of course the radar is not _stopped_ by rain. Even weather radar isn't stopped by rain, unless it's insanely heavy--if it were, all the radar could see would be the rain nearest to it, and if it were raining at the radar site, that would be pretty useless!)
> 
> There are two main reasons Tesla radar has trouble with pedestrians:
> 
> 1) Pedestrians are much smaller than cars. That creates resolution issues.
> 
> 2) Pedestrians move much more slowly than cars moving on a road.
> 
> #2 is the crucial one. Tesla radar is going to be getting returns from the environment all the time...ground level is a crowded place! Even the road surface itself may generate returns. So the radar needs a way of identifying what objects are important. It turns out with radar it's pretty easy to tell if an object is moving, via the Doppler shift. When radar bounces off a moving object, it comes back at a different frequency than it went out at. That's how police radar guns determine a vehicle's speed for instance (same with pitch speed in baseball). It's easy to note the different frequency, in the radar return, and thus know there's a moving object out there and how fast it's moving.
> 
> That means that Tesla radar is bad at detecting stationary objects, whether pedestrians, cars, trucks, or walls. It's programmed to ignore all those returns, because it's almost always getting returns like that. Most of the bad collisions Teslsas have had while on AP have been with stationary objects.
> 
> A pedestrian is nearly stationary, and thus Tesla radar/AI will generally ignore it, relying on vision instead.
> 
> In short, I agree with your main point: radar isn't being used in Teslas to detect pedestrians. But I disagree with your reason as to why; whether something is metal or not isn't the main issue. It's primarily how fast it's moving and to a lesser degree its size.


I stand corrected on the details. Thanks for the detailed response.


----------



## gaduser

Klaus-rf said:


> You bring up a very good point here - there is ZERO data to show AP/FSD all by itself (since it doesn't exist) is safer than human-only drivers. Since there is no non-human-driver data to compare it with.
> 
> Reminds me of my favorite t-shirt logo:
> 
> "There are two kinds of people in the world: Those that can extrapolate from incomplete data"


That logo reminds me of another - "*Beam me up Scotty, there's no sign of intelligent life down here*".


----------

