Tesla Owners Online Forum

Have you been selected to participate in the new FSD SW release?

FSDBeta MEGATHREAD <- For all FSD Beta discussions

Official 
333K views 4K replies 237 participants last post by  Shilliard528 
#1 ·
Tomorrow, Tuesday, October 20, is the day. I'm as anxious to find out who is considered an expert and careful driver as I am to hear how the new FSD rewrite performs.

#2,112 ·
I had another drive on 10.4—this time during the day (and on my regular route). It felt much better. Drove back during the night and I had less repetitive braking, though perhaps having a lead car helped.

I had an interesting experience. After repeated messages about degraded FSD, at a stop light the car cleaned the windshield and then threw a Take Over Immediately alert (with the red steering wheel) at me. I disengaged and reengaged, then the light turned green and all was well.
 
#2,113 ·
I had another drive on 10.4—this time during the day (and on my regular route). It felt much better. …
Excellent input, thanks.
 
#2,114 ·
FSD was doing 35mph on a two lane road at night, and a car backed out onto the road. FSD slowed down, nice, but then it started speeding back up. The other car had backed out so far I think FSD thought the way was clear again. So I hit the brake. Perhaps FSD would have figured it out, but with an unnecessarily hard brake. (I pressed the camera button too, for the first time.)
 
#2,115 ·
I’m wondering if anyone knows whether FSD has cracked that notorious hard nut:

You are following a car…the lead car comes up on a stationary object and moves into the next lane at the last second. Your car is blind to the object for some reason that is baked into the algorithm and doesn't even slow.

I know at some point every developer was grappling with this. Just curious what progress has been made.
 
#2,116 ·
You are following a car…the lead car comes up on a stationary object and moves into the next lane at the last second. Your car is blind to the object for some reason that is baked into the algorithm and doesn't even slow.
I think it will slow when it sees the suddenly-visible object. (But I doubt it would slow, as it should, due to the abrupt lane change itself.)

A related problem I've seen in the FSD videos: FSD stops at T intersection. A car approaching from the right turns, and as it turns the road coming from the left is occluded, along with a car approaching from the left. FSD now thinks this is not a T intersection, and proceeds to turn right. Oops!
 
#2,117 ·
Has anyone seen an issue where FSD is enabled without navigation and the car doesn't follow your turn signal indicator?
 
#2,121 ·
Sorry, should have been clearer. It does change lanes when I engage a turn signal. I meant that it should turn the corner when the signal is engaged in the picture above; instead it wants to make a left turn (I would assume the default choice for FSD would be to go straight when no destination is entered). I figured that if I engaged the right turn signal it would make a right instead (i.e., skip navigation and manually tell FSD where to go), but it doesn't. It continues to try to make the left turn (when FSD is engaged).

*insert AI apocalypse reference here*
 
#2,123 ·
Anyone else heavily use FSD after a release, and then avoid it altogether after a week?

I find myself downloading the latest versions immediately and testing them heavily the first few days, reporting any and all bugs. By mid-week I usually just press the report button and stop sending the emails (if they didn't notice the previous dozen emails, why make my life more difficult trying to keep track of the issues?). Then by the end of the week I stop using it altogether, since I already know the very frequent issues it'll have (especially the horrible phantom braking), and just relegate myself to driving like a commoner :fearscream:
 
#2,142 ·
It's called FSD Beta Tester Fatigue!

I think it's good to take a break from being the tester/trainer/safety officer from time to time.
 
#2,130 ·
I stopped testing the FSD Beta about 95% of the time. When I have passengers it scares the crap out of them, and when I'm driving solo it's either too erratic, embarrassing to other drivers, or dangerous. It's way less stressful to drive myself, so that's what I end up doing.
It's amazing what the beta can do, but I find myself not using it much in my city, which is Atlanta.
 
#2,135 ·
Curious: 2021.36.5.1 is downloading now, as I was thinking about opening a case about a false "Collision Alert." I found an overhead sign above an overpass that has triggered the alert at least twice. I was going to replicate the problem, make a YouTube video, and report it.

Bob Wilson
 
#2,137 ·
Part of the problem is that unlike sky diving, a rollercoaster, or MMA, your autonomous car might kill someone else who didn't sign a release.
Been thinking about this and realized there's A LOT of stuff that's legal that causes untold numbers of deaths of uninvolved parties (alcohol comes to mind) that the government is completely OK with because a majority of the people want it. I'm sure once FSD is closer to being 99% safe (not before 2030, considering it's been in development for 5 years and is probably closer to 10% safe right now, given some of the idiotic decisions it makes), they'll work on their image and on getting people to accept it.

Then again a majority of the people want healthcare and education reform, but politics/$$$ have a bigger influence over what people want.
 
#2,139 ·
DocScott said:
Part of the problem is that unlike sky diving, a rollercoaster, or MMA, your autonomous car might kill someone else who didn't sign a release.

Been thinking about this and realized there's A LOT of stuff that's legal that causes untold numbers of deaths of parties not involved (alcohol comes to mind) that the government is completely ok with because a majority of the people want it. …
Kind of like when the original horseless carriages started taking over the roads. Speaking of causing untold numbers of deaths that is almost taken for granted.

If they want to improve safety, they should get rid of cars entirely.
 
#2,138 ·
Last night I ran an errand which brought me home on a slightly different route. It seemed straightforward. I would transition onto a frontage-type street, then make a left and immediate right onto a freeway onramp. The map represented the route accurately. Instead, I ended up dealing with two consecutive failures due to the same root cause.

On the first attempt, two full blocks prior, FSD changed lanes *away* from the necessary lane, announcing "changing lanes to follow route." It had been in the correct lane already, but by changing lanes we got boxed out of any possibility of making the move onto the onramp. At the last minute, the car realized it needed to get over, so the turn signal activated. But it was too late. So we went straight instead, and rerouted to the next onramp.

On the second try, we turned left successfully but ended up in the left of two lanes, without enough time to move right to the onramp. Not wanting to spend the entire evening missing onramps, I grabbed the wheel and swerved onto the onramp (I was sure the path was clear).

In both cases, it was an incorrect lane placement that resulted in the failures. I know we complain a lot about awkward behavior, but often it is awkward and also sufficiently correct to achieve a goal. I am really hoping focus is being placed on these lane choice issues which are just straight up errors.

In a similar vein, a couple of days ago we navigated two two-lane roundabouts. Success is guaranteed if you just pick a lane and stay in it (yielding to traffic already in the roundabout, of course). The car changed lanes twice, inside the roundabout, again to "follow route." Fortunately there was no other traffic.
 
#2,140 ·
Any notion that self-driving technology must be 100% safe before it can be deployed is a non-starter and quite frankly, a waste of breath. It will not ever be perfect, just as human drivers have never been perfect. There WILL be instances where an autonomous vehicle doesn't handle something properly and people get hurt or killed. Of course we want to minimize the chances of that in every way possible, but the risk will never be reduced to zero.

Even any kind of measurements of being 100%, 1000% or 100000% 'safer' than a human driver is ultimately just going to be dumb statistical gamesmanship. Pick your scenarios and data sets that support your marketing blurb and run with it. Those numbers will never matter to the person/people that get hurt or have property damaged.

Nothing is perfect in this world, and self-driving will not ever be the first thing to challenge that notion. However, it will get to a point of 'good enough' or 'safe enough' and that is likely to be much sooner than a lot of people think. Even an imperfect self-driving technology will be better than a flawed or impaired driver in the very near future.
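The "statistical gamesmanship" point can be made concrete with a toy Simpson's-paradox example. All numbers below are invented for illustration (not real crash data): if automated miles skew toward easy roads, a highway-only comparison can favor automation while the pooled rates say the opposite.

```python
# Hypothetical (crashes, miles) per road type. Every figure is invented.
human = {"highway": (100, 1_000_000), "city": (500, 1_000_000)}
auto = {"highway": (90, 1_000_000), "city": (600, 1_000_000)}

def per_million(crashes: int, miles: int) -> float:
    """Crash rate per million miles driven."""
    return crashes / miles * 1_000_000

# Cherry-picked cut: highway miles only -> automation looks safer.
print(per_million(*auto["highway"]) < per_million(*human["highway"]))  # True

# Pooled across road types -> automation looks worse overall.
human_all = per_million(*map(sum, zip(*human.values())))  # 600 / 2M miles
auto_all = per_million(*map(sum, zip(*auto.values())))    # 690 / 2M miles
print(auto_all < human_all)  # False
```

Same fleet, same data, opposite headline depending on which slice gets quoted, which is exactly why "X% safer" claims need the denominator spelled out.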
 
#2,143 ·
I still find I have enough events to report on each drive that trying to keep track and send an email to the FSD team about each one is overwhelming. So I keep pushing the record button and hope they figure it out without my emails.
 
#2,144 ·
They're not looking for edge cases yet because it can't do straightforward things correctly. Until then, I'm sure they're also overwhelmed with data.
 
#2,150 ·
This video popped up in my feed recently.

There is a lot of stuff there that I'm not smart enough to figure out, but one question I had was answered, namely: does the car continually maintain the state of the adjacent lane, or does it clear the area on an as-needed basis? I'm assuming the "Side Obstacle" value answers this question. It seems to update as vehicles pass by, and it changes to True even when nothing is apparent, so it's evaluating more data than we see on the visualization.

This is meaningful, to me at least, because when the car suddenly decides to change lanes (FSD stack) without the warning that we're used to (in the highway stack), it makes me wonder if that action has been well considered. I'm going to rest a little easier knowing that it has.
 
#2,152 ·
I suspect that they are leveraging telemetry that they add in each release to get them data about certain scenarios that they are focusing on. Of course, they are also using these betas so they should be seeing most of the same issues that we are seeing. User reports are just additional data that they may or may not do anything with.
 
#2,153 ·
Last night I finally received FSD Beta 10.4 after qualifying with a 99, once some bad days from 30 days ago dropped out of the scoring window. This morning I took it for a long spin in the Boston area: from Newton out to Framingham on I-90, then back on Rte 9, and then a few miles on side roads back home.

Before I started out, I had watched DirtyTesla's video of this morning, where he performs a camera calibration that ends up fixing lots of the problems he's had with 10.4. I decided to do that even before enabling FSD Beta. At the westbound Framingham rest area, calibration was done and I enabled everything. I have to tell you, I was nervous. I'm a lifelong hardware/software engineer who has scrutinized every move of every early access version since getting my Model 3 in 2018, and I have a good feeling how things have progressed since then. However, my wait for FSD Beta has been so long that I've read too many negative reports of close calls (like on TOO!) to not be extremely nervous.

Venturing back onto I-90 with FSD Beta for the first time, I had an immediate experience of far more smoothness than NOA had given me before! Following and passing cars was smoother, as was getting off onto the long circulars to Rte 9. This was a really pleasant surprise, which I can only attribute to vision-only and no contribution from radar. No jerkiness or the sudden braking I've read so much about.

Rte 9 all the way back to Newton was pretty flawless. Lane changes were really good - and peaceful. Some hesitancy here and there, but frankly, I felt like the car was actually channeling some of the caution I always have for cars that look like they're going to dart into my lane. In fact, I felt the car immediately react to a truck next to me that veered slightly into my lane at about the 5 o'clock position - moving away from it. There was no way I would have known that even happened without 10.4. It was a smooth, matter-of-fact, slight change of position. Very impressive.

I was blown away that my mood quickly changed from nervousness to even greater trust that the car was protecting me than I had had with my latest AP. Did not expect that at all. When I did my first left turns, yes it felt a little jerky, but darn if it didn't figure out where to go each time. After Rte 9, I went through lots of residential back streets, with piles of leaves and garbage cans. It navigated the mess with ease. Jerky here and there again, but it did it.

Final verdict after drive #1 - quite amazing! It certainly removed misgivings I'd been having lately about whether true FSD would ever be achieved. It will, no doubt about it. But who really cares - the automated assistance we're getting even with 10.4 is already pretty amazing, and it will only get better. Well done Tesla!
 
#2,157 ·
What I'd like is a second snapshot button that lets me pat my car on the back. :grinning: Despite the real struggles we've all experienced with the current FSDB, there are still plenty of things it does right, and sometimes even very well!

There have been some situations the car handled perfectly where I openly exclaimed "Not bad!" to my empty passenger seat. Hopefully no one was looking at the crazy person...

I'm not sure if any kind of positive reinforcement training would fit into the NN modeling they do, but I'd certainly be willing to report those cases in my travels.
 
#2,158 ·
There have been some situations the car handled perfectly where I openly exclaimed "Not bad!" to my empty passenger seat.
I've done that too, more frequently than I expected. Sometimes I even catch myself carrying on a conversation with the car but usually more along these lines: "OK Max, this intersection is going to be tricky, let's see how you do... WHOA, that wasn't the greatest choice, let's report that to the mothership..."
 
#2,169 ·
Okay everyone, take a breath.

It was your choice to request and enable FSD.
You requested to participate in the Beta testing program.
You worked hard or gamed the system to get your Safety Score.
You downloaded the software.
You flipped the FSD Beta switch on the dashboard.
You agreed to the conditions.
And finally, you double clicked the stalk.

You asked for this, despite knowing in advance that it would be a sh*t show. If you don't like the game, you don't have to play.

Over the last 24 hours, the car has made many of the same horrible errors that we have been talking about. There's no need to rehash. I did want to describe a very good response at a 4-way stop sign, where we wanted to turn left. All 4 directions were full. The car did a good job of knowing when its turn was, much better in fact than the guy facing us, who despite arriving after us, decided to go, just as we pulled out. The car slowed, noticed the other car was also turning left (he was not signaling), somehow deduced that its own path was clear, and proceeded anyway. Right alongside the other left turner. That's about as fault tolerant as I've seen it be, ever.
 
#2,173 ·
Has anyone with FSD Beta had a chance to drive in snow or ice yet? While I've had many successful trips without intervention, almost all have had a moment of sudden, jerky behavior. I would think that could cause issues in snow...
Middy has her Sottozero3s on, we're ready for some northern roadtrips to try just that.
 
#2,181 ·
I noticed 33% got FSD and 66% of us, including me, are still denied. Eventually the risk to health and safety from withholding fully-paid FSD will become an embarrassment that can no longer be ignored. Given the usual holiday carnage on the roads, sooner is better than later.

Bob Wilson
 
#2,183 ·
“Any notion that self-driving technology must be 100% safe before it can be deployed is a non-starter and quite frankly, a waste of breath. … Even an imperfect self-driving technology will be better than a flawed or impaired driver in the very near future.”

Even if an event has only a 0.00002% chance of happening, it’s 100% when it happens to you.
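To put the rare-event point in numbers: odds that look negligible per drive compound over repeated exposure and across a fleet. A rough sketch, treating the post's 0.00002% figure as a per-drive probability purely for illustration (the driver and fleet counts are likewise invented):

```python
def at_least_once(p: float, n: int) -> float:
    """Probability of at least one occurrence in n independent trials:
    1 - (1 - p)^n."""
    return 1 - (1 - p) ** n

p = 0.00002 / 100        # "0.00002% chance", taken as per-drive probability
drives = 2 * 365 * 10    # one driver, two drives a day for a decade

print(at_least_once(p, drives))                  # ~0.0015 for one driver
print(at_least_once(p, drives * 1_000_000))      # ~1.0 across a million drivers
```

The individual risk stays tiny, but across enough drivers it becomes near-certain that somebody hits the event, and that somebody may be an uninvolved third party.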
 
#2,198 ·
"Any notion that self-driving technology must be 100% safe before it can be deployed is a non-starter and quite frankly, a waste of breath. … Even an imperfect self-driving technology will be better than a flawed or impaired driver in the very near future."

Even if an event has only a 0.00002% chance of happening, it's 100% when it happens to you.
Agree. At some point FSD will be statistically safe enough to be approved at Level 4 or 5. But from my perspective it will always be driver-assistance technology and I will continue to believe that FSD with my oversight will be safer than FSD alone. No napping in the backseat for me. Just hoping, down the road, steering wheels will still be available as an option…
 
#2,187 ·
The event of the morning was fog related. On the highway, NOA announced it had detected limited visibility and was reducing speed. Fair enough, but I could see almost a quarter mile, so I kept my foot on the pedal to maintain 70 mph. In short order NOA panicked, screamed, showed me a red steering wheel, and asked me to please drive. I thought it might have been a fluke but the events repeated themselves a mile or so later.

I'm not criticizing the car for throwing in the towel if it can't see, or if I ask it to overdrive its capabilities. The puzzling thing is that our visibility in the fog was a good 10 times the visibility headlights give at night. If night driving is OK (and, amazingly enough, I have not noticed a difference between day and night driving), that fog should not have been an issue. I wonder if the fog sensor is still set at "super conservative."
 
#2,192 ·
I'm not convinced that fog is the issue in these scenarios. I've experienced the same situation.

Teslas seem to be driving via cameras now and I agree that night time and day time driving appears to be different.

I have found that it's not all about the cameras. If fog presents itself as moisture on the windshield, the cameras have a more difficult time seeing long distances. Sometimes the windshield wipers come on to try to clear the fog, and sometimes they don't. The same thing happens in a very light rain.

I have also found that when FSD encounters light, the cameras see what is on the windshield much better. I experimented with this with a simple floodlight flashlight. One night it was raining, and I purposely parked in a very dark parking lot with my wipers set to auto. The rain was coming down, but the wipers didn't come on. I then stood about 10 feet from the car and shined the floodlight onto the front windshield, and the wipers started moving.

Since all we have now are cameras, I wonder if FSD is being impeded by a lack of light or by mist on the windshield.

What does FSD do on the same road without fog (mist on the windshield)?
 
#2,195 ·
One of my usual routes involves exiting the freeway onto a 2-lane exit road that eventually ends up at a traffic light. FSD takes over on that long exit road. Almost always I'm turning right at the end, and the navigation shows it's a right turn. But the car keeps wanting to pass into the left lane before the right turn. I don't see why they can't just program it to stay right if you are about to turn right in less than a mile. I see no reason to try to pass a car only to get right back over again. Seems like they could just tell the car no passing if there's less than a mile left! Or maybe have an option in the settings: no passing for 1 or 2 miles, or something like that.
 
#2,196 · (Edited)
One of my usual routes involves exiting the freeway onto a 2 lane exit road that eventually ends up at a traffic light. FSD takes over on that long exit road. … Seems like they could just tell the car no passing if there's less than a mile!
Of course they can tell the car no passing if there is less than a mile. (I'm assuming your vehicle mapping changed to CityStreets on the offramp. Right?)

FSD - Highway used to do the same thing and now it doesn't.

Send them your suggestion. That's what is supposed to happen right now.

You were deemed a safe driver with your safety score....so suggest away.
 
#2,205 ·
From what I have read, his 10 days out would strike tonight. Has any 98 received the email invite (or whatever) to FSD Beta 10.5?
 