Tomorrow, Tuesday, October 20, is the day. I'm as anxious to find out who is considered an expert and careful driver as I am to hear how the new FSD rewrite performs.
Excellent input, thanks. I had another drive on 10.4, this time during the day (and on my regular route). It felt much better. Drove back during the night and had less repetitive braking, though perhaps having a lead car helped.
I had an interesting experience. After repeated messages about degraded FSD, at a stop light, the car cleaned the windshield and then flashed a Take Over Immediately warning (with the red steering wheel) at me. I disengaged, then reengaged, and then the light turned green and all was well.
I think it will slow when it sees the suddenly-visible object. (But I doubt it would slow, as it should, due to the abrupt lane change itself.)

"You are following a car… the lead car comes up on a stationary object and moves into the next lane at the last second. Your car is blind to the object for some reason that is baked into the algorithm and doesn't even slow."
Been thinking about this and realized there's A LOT of stuff that's legal yet causes untold numbers of deaths of parties not involved (alcohol comes to mind), and the government is completely OK with it because a majority of the people want it. I'm sure once FSD is closer to being 99% safe (not before 2030, considering it's been in development for 5 years and is probably closer to 10% safe right now, with some of the idiotic decisions it makes), they'll work on their image and on getting people to accept it.

"Part of the problem is that unlike skydiving, a roller coaster, or MMA, your autonomous car might kill someone else who didn't sign a release."
Kind of like when the original horseless carriages started taking over the roads. Speaking of causing untold numbers of deaths that is almost taken for granted.

"Been thinking about this and realized there's A LOT of stuff that's legal that causes untold numbers of deaths of parties not involved (alcohol comes to mind) that the government is completely OK with because a majority of the people want it. I'm sure once FSD is closer to being 99% safe (not before 2030, considering it's been in development for 5 years and probably closer to 10% safe right now with some of the idiotic decisions it makes), they'll work on their image and getting people to accept it."
I've done that too, more frequently than I expected. Sometimes I even catch myself carrying on a conversation with the car, usually along these lines: "OK Max, this intersection is going to be tricky, let's see how you do... WHOA, that wasn't the greatest choice, let's report that to the mothership..."

"There have been some situations the car handled perfectly where I openly exclaimed 'Not bad!' to my empty passenger seat."
Middy has her Sottozero3s on; we're ready for some northern road trips to try just that.

"Anyone with FSD beta had a chance to ride in snow or ice yet? While I've had many successful trips without intervention, almost all have had a moment of sudden, jerky behavior. I would think it could cause issues in snow..."
Agree. At some point FSD will be statistically safe enough to be approved at Level 4 or 5. But from my perspective it will always be driver-assistance technology, and I will continue to believe that FSD with my oversight will be safer than FSD alone. No napping in the backseat for me. Just hoping, down the road, steering wheels will still be available as an option…

"Any notion that self-driving technology must be 100% safe before it can be deployed is a non-starter and quite frankly, a waste of breath. It will not ever be perfect, just as human drivers have never been perfect. There WILL be instances where an autonomous vehicle doesn't handle something properly and people get hurt or killed. Of course we want to minimize the chances of that in every way possible, but the risk will never be reduced to zero. Even any kind of measurements of being 100%, 1000% or 100000% 'safer' than a human driver is ultimately just going to be dumb statistical gamesmanship. Pick your scenarios and data sets that support your marketing blurb and run with it. Those numbers will never matter to the person/people that get hurt or have property damaged. Nothing is perfect in this world, and self-driving will not ever be the first thing to challenge that notion. However, it will get to a point of 'good enough' or 'safe enough' and that is likely to be much sooner than a lot of people think. Even an imperfect self-driving technology will be better than a flawed or impaired driver in the very near future."
Even if an event has only a 0.00002% chance of happening, it's 100% when it happens to you.
Of course they can tell the car no passing if there is less than a mile. (I'm assuming your vehicle mapping changed to CityStreets on the offramp. Right?)

"One of my usual routes involves exiting the freeway onto a 2-lane exit road that eventually ends up at a traffic light. FSD takes over on that long exit road. Almost always I'm turning right at the end. The navigation shows it's a right turn. But the car keeps wanting to pass into the left lane before the right turn. I don't see why they can't just program it to stay right if you are about to turn right in less than a mile. I see no reason to try and pass a car to get right back over again. Seems like they could just tell the car no passing if there's less than a mile! Or maybe have an option in the settings? No passing for 1 or 2 miles or something like that."