# Vision Only AP still sucks



## iChris93 (Feb 3, 2017)

iChris93 said:


> Vision only sucks.


Still…

I’m on my first road trip with the Model Y that does not have a radar unit. I’ve noticed, on stretches of interstate where the vehicle in front of me is very far away and looks like just a little sliver, that AP will sporadically interpret the sliver as a motorcycle that is much closer, so it will then either phantom brake or try to change lanes if on NoA.

If the vehicle still had radar, it would know the distances much better.


----------



## TomT (Apr 1, 2019)

Yep, I've noted the same thing. I've also had it misjudge how far away I was from a small vehicle in front of me and approach it at a dangerously high speed, too close. I had to brake manually. My 2019 with radar did much better.


----------



## iChris93 (Feb 3, 2017)

TomT said:


> Yep, I've noted the same thing. I've also had it misjudge how far away I was from a small vehicle in front of me and approach it at a dangerously high speed, too close. I had to brake manually. My 2019 with radar did much better.


Did you notice any issues before the 85 mph max increase? I was traveling above 80 mph and wonder if it would be better at lower speeds.


----------



## Shilliard528 (May 29, 2021)

iChris93 said:


> Still…
> 
> I’m on my first road trip with the Model Y that does not have a radar unit. I’ve noticed on stretches of interstate where the vehicle in front of me is very far away and looks like just a little sliver, that AP will sporadically interpret the sliver as a motorcycle that is much closer so it will then either phantom brake or try to change lanes if on NoA.
> 
> If the vehicle still had radar, it would know the distances much better.


My Model X has radar, which I wish were activated as a backup.


----------



## iChris93 (Feb 3, 2017)

Shilliard528 said:


> My Model X has radar, which I wish were activated as a backup.


My 3 does too. It’s not totally clear that vehicles with radar have completely abandoned the radar.

https://twitter.com/i/web/status/1537546010314125317


----------



## Ed Woodrick (May 26, 2018)

iChris93 said:


> Still…
> 
> I’m on my first road trip with the Model Y that does not have a radar unit. I’ve noticed on stretches of interstate where the vehicle in front of me is very far away and looks like just a little sliver, that AP will sporadically interpret the sliver as a motorcycle that is much closer so it will then either phantom brake or try to change lanes if on NoA.
> 
> If the vehicle still had radar, it would know the distances much better.


How were you able to determine that it was a vehicle very far away? Did you use echolocation like a bat?

Or did you use your built-in vision system?

It's amazing the number of people who think that radar is required for self-driving, yet they drive every day with their own two eyes.


----------



## iChris93 (Feb 3, 2017)

Ed Woodrick said:


> How were you able to determine that it was a vehicle very far away? Did you use echolocation like a bat?
> 
> Or did you use your built-in vision system?
> 
> It's amazing the number of people who think that radar is required for self-driving, yet they drive every day with their own two eyes.


Not once did I say that radar was required. Tesla’s vision-only system just still sucks and would be better, in its current state, if they still used radar. The fact that I can tell, with my own two eyes, that it wasn’t a motorcycle to be concerned about gives me hope that it can be better.


----------



## FRC (Aug 4, 2018)

Ed Woodrick said:


> It's amazing the number of people who think that radar is required for self-driving


It seems to me that even the great and powerful Elon believed this until quite recently.


----------



## shareef777 (Mar 10, 2019)

Ed Woodrick said:


> How were you able to determine that it was a vehicle very far away? Did you use echolocation like a bat?
> 
> Or did you use your built-in vision system?
> 
> It's amazing the number of people who think that radar is required for self-driving, yet they drive every day with their own two eyes.


Last I checked, my vision was attached to my brain, which, as far as I know, still hasn’t been replicated into software.


----------



## Madmolecule (Oct 8, 2018)

There are also a couple of other senses involved. Somehow blind people don’t die. Elon’s latest comment that radar was a crutch masking how bad vision was performing is extremely alarming to a controls engineer. There is no chance of FSD working on the currently provided hardware and sensors, in my opinion. I don’t care how many stacks or brain cycles you dedicate, it ain’t happening. In my opinion. I hope this is child-friendly enough; it’s just my opinion.


----------



## TomT (Apr 1, 2019)

iChris93 said:


> Did you notice any issues before the 85 mph max increase? I was traveling above 80 mph and wonder if it would be better at lower speeds?


These are between 65 and 70!


----------



## iChris93 (Feb 3, 2017)

TomT said:


> These are between 65 and 70!


Thanks!


----------



## Ed Woodrick (May 26, 2018)

Madmolecule said:


> There are also a couple of other senses involved. Somehow blind people don’t die. Elon’s latest comment that radar was a crutch masking how bad vision was performing is extremely alarming to a controls engineer. There is no chance of FSD working on the currently provided hardware and sensors, in my opinion. I don’t care how many stacks or brain cycles you dedicate, it ain’t happening. In my opinion. I hope this is child-friendly enough; it’s just my opinion.


So, you believe that if all you had was a TV screen (with 360° vision), a steering wheel, an accelerator, and a brake on a remotely operated vehicle, you couldn't operate the vehicle?
Actually, I feel that if I had a radar sending only one position report a second on 50 vehicles ahead of me, it would overload my brain.

And AFAIK, the problems with FSD have nothing to do with the data coming in; it's figuring out what to do once you understand the data.

And specific to this thread, phantom braking existed even with radar.


----------



## Ed Woodrick (May 26, 2018)

shareef777 said:


> Last I checked, my vision was attached to my brain, which, as far as I know, still hasn’t been replicated into software.


That's where you may be wrong.

How long did it take you to learn to drive? Elon may very possibly be doing it in less time.


----------



## shareef777 (Mar 10, 2019)

Ed Woodrick said:


> That's where you may be wrong.
> 
> How long did it take you to learn to drive? Elon may very possibly be doing it in less time.


Uh, took me about a year to learn to drive. Got my permit at 15 and was licensed a year later. FSD going on 6 years.


----------



## Klaus-rf (Mar 6, 2019)

Ed Woodrick said:


> And specific to this thread, phantom braking existed even with RADAR.


 And that RADAR was afraid of shadows so, obviously, it had to go.


----------



## Ed Woodrick (May 26, 2018)

shareef777 said:


> Uh, took me about a year to learn to drive. Got my permit at 15 and was licensed a year later. FSD going on 6 years.


You were driving at 1 year old????

Or maybe you got your permit after 15 years of life. While you may not think that it is a fair comparison, it absolutely is. Before you got your permit, you already had a lot of things figured out, such as the difference between bicycles and cars. You knew what a road was. Tesla has had to teach the computer all of this, just as it took you many years to learn it.


----------



## DocScott (Mar 6, 2019)

Ed Woodrick said:


> So, you believe that if all you had was a TV screen (with 360 vision), steering wheel, accelerator and brake on a remotely operated vehicle, you couldn't operate the vehicle?


Not well, no. Probably not well enough to drive the car both confidently and safely, at least without a lot of practice.

I think vestibular-type senses are probably pretty important to me driving effectively--feeling whether the car is speeding up, slowing down, etc. It's helpful to be able to feel when a road surface gets rough, too.

I'd likely have been in several accidents during my life that I avoided because I heard someone honk at me.

Botts' dots are useful to know if I've drifted out of a lane.

I suspect I'm using sound in more subtle ways than just horns, too--it certainly gives me a sense of my speed, but I might be able to get a sense of cars nearby (e.g. in my blindspot), even if subliminally.

Oh, and you're really asking about vision only, right? So no speedometer. Yes, we use our eyes to look at the speedometer, but that's actually a different sensor making the measurement.

Individually, each of those things might be pretty minor. But together, I think I'd have a hard time driving well with the set-up you describe.


----------



## Madmolecule (Oct 8, 2018)

Ed Woodrick said:


> So, you believe that if all you had was a TV screen (with 360 vision), steering wheel, accelerator and brake on a remotely operated vehicle, you couldn't operate the vehicle?
> Actually, I feel that if I had a RADAR sending only one position report a second on 50 vehicles ahead of me, that it would overload my brain.
> 
> And AFAIK, the problems with FSD have nothing to do with the data coming in; it's figuring out what to do once you understand the data.
> ...


I do think I could operate it in your scenario better than Tesla‘s AI. Driving in virtual-reality 360° is very difficult, especially without sound feedback. You should try it to get a real sense of what single-sense operation is all about. You would not have your sense of hearing, for example, to hear other cars approaching or someone slamming their brakes. Radar is not perfect and ultrasonics are not perfect; at most they should all be used as trim biases. But I believe the biggest problem Tesla is running into is that, using the multiple signals and sensors, the computer cannot compute fast enough to provide an output as quickly as a vehicle needs. I do agree that the problem is not data but information and knowledge. All the data Tesla has been collecting has only been used to tune their model. Once they develop the perfect model, then whatever input comes into it, an appropriate output would come out of it. This is extremely difficult even in a semi-closed-loop system. My experience implementing machine learning has been, for example, on an assembly line. Even this is very difficult, and the data sets are massive. But it is a very closed system with very few variables. Nothing like driving at 90 miles an hour on a six-lane, badly maintained road.

But I do think Tesla can achieve sophisticated cruise control one day.

You can try to program a robot not to step in a puddle of water, but what does it do when it runs across a puddle of milk? That’s where a couple of those other senses and life experience might make sense. One day it might be safer than humans driving. It should also be a more relaxing and enjoyable experience than human driving, not only for the passengers but for the other cars around you. That is a much higher hurdle to overcome.


----------



## Ed Woodrick (May 26, 2018)

DocScott said:


> Not well, no. Probably not well enough to drive the car both confidently and safely, at least without a lot of practice.
> 
> I think vestibular-type senses are probably pretty important to me driving effectively--feeling whether the car is speeding up, slowing down, etc. It's helpful to be able to feel when a road surface gets rough, too.
> 
> ...


You can have a speedometer and an accelerometer; that car has those.
Basically, anything that the car has, except radar.

Not quite sure about sound, but I'd basically say that the existing computers do as good a job as other cars honking at you.
And you bring up a really good point: people stray. They don't constantly know or watch their speed. Their foot gets light or heavy. You start daydreaming and start drifting lanes. At all of these, the computers do MUCH better than people.


----------



## Madmolecule (Oct 8, 2018)

Oh, vision only means with speed and inertia sensors, just no radar or ultrasonics. I guess that’s like full self-driving, where you will be driving the car and taking full responsibility. No sound for siren recognition.
Bottom line: I can prove that I can drive the car from point A to point B. Tesla still has a long way to go to prove that they can do it, or to convince people that they were really sold what they were delivered and should be happy.

They still have not been able to address some of the major software bugs and annoyances in the HMI, and it doesn’t seem like they’re working on them. It seems very apparent to me that they test the software only with a simulator, which creates a lot of frustration when you put it in the hands of real people. It appears that very few paid Tesla employees are testing the full complement of software on actual vehicles in real-world situations.


----------



## Ed Woodrick (May 26, 2018)

I disagree slightly; I don't think that the problem is an overabundance of data, I think that it is simply knowing what to do with the data. 
Also, I think that there is a problem averaging out the reactions. I'm pretty sure that's what is causing the phantom braking: one or two frames think that something is happening, the car reacts by slowing down, but the next frames analyze differently. Actually it's probably more like 20-30+ frames, which is often the timeframe that I feel the phantom event lasts.
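The one-or-two-noisy-frames failure mode described above is essentially a debouncing problem: don't act on a detection until it has persisted across most of a recent window of frames. Here is a minimal sketch of that idea; the class name, window size, and threshold are my own illustrative choices, not anything from Tesla's actual stack.

```python
from collections import deque

class BrakeDebouncer:
    """Only confirm an obstacle once most recent frames agree on it."""

    def __init__(self, window=12, threshold=0.75):
        self.window = deque(maxlen=window)  # recent per-frame votes
        self.threshold = threshold          # fraction that must agree

    def update(self, frame_says_obstacle: bool) -> bool:
        self.window.append(frame_says_obstacle)
        # Brake only when the window is full and a clear majority agrees.
        return (len(self.window) == self.window.maxlen and
                sum(self.window) / len(self.window) >= self.threshold)

deb = BrakeDebouncer(window=5, threshold=0.8)
# One noisy frame out of five delays the decision but doesn't flip it.
votes = [True, True, False, True, True]
print([deb.update(v) for v in votes])  # → [False, False, False, False, True]
```

The trade-off is exactly the one in the post: a longer window suppresses phantom events but also slows the reaction to real obstacles.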

And it becomes more obvious with FSD Beta, as the car creeps up to an intersection in multiple jerky actions and may turn the steering wheel in different directions maybe 10+ times during a turn.

It reminds me of a story from Peavey Electronics (speakers and amplifiers). They created their first transistorized amplifier and were extremely proud of its effectively flat response, just like a design engineer wants.
The market hated it; it turned out that humans had become used to the distortion of tube amplifiers.

Right now the car is kinda doing too much processing too fast. All of the data is being interpreted correctly, but indeed it is too much (the car can indeed process it fast enough, not quite the same as you indicated), and it just doesn't work well. Effectively slowing the system down will probably help.

Think about it as you are driving down the road next time. Did you count all of the cars at any point in time? Did you mentally calculate their position and direction and determine whether each car was on an intercept course with you? Did you notice the position of every building and every traffic sign along the way? Did you determine where every white, yellow, and dotted line was?
I know that I don't. I know that if I do even a slight majority of this, I have to slow down to about 10 mph. We were just driving through the North Georgia city of Helen. It's got a main strip that's a tourist attraction. I still couldn't see every business or person going 10 mph. The car pretty much did, though. I could see it on the screen.

Our brain has an immense capability to filter out crap that we don't need. I think that's why Tesla switched over to AI for the actual car actions, as opposed to discrete code. It just still needs some more direction and training.


----------



## DocScott (Mar 6, 2019)

Ed Woodrick said:


> You can have speedometer and accelerometer, that car has those.
> Basically, anything that the car has, except radar.


Including the sonar? 

I'd love it if Tesla would incorporate the sonar into Autopilot and FSD, but it doesn't. 

Even Smart Summon doesn't use the sonar, which is just bizarre, because a person driving a Tesla performing the same tasks is quite likely to.

Do AP and FSD use the accelerometer? I honestly am not sure.

Tesla seems to have taken "vision-only" too seriously. Even setting aside the radar, it should make better use of the non-visual sensors it has.


----------



## Madmolecule (Oct 8, 2018)

Ed Woodrick said:


> So, you believe that if all you had was a TV screen (with 360 vision), steering wheel, accelerator and brake on a remotely operated vehicle, you couldn't operate the vehicle?
> Actually, I feel that if I had a RADAR sending only one position report a second on 50 vehicles ahead of me, that it would overload my brain.
> 
> And AFAIK, the problems with FSD have nothing to do with the data coming in; it's figuring out what to do once you understand the data.
> ...


I have been selling, designing, and implementing control systems for a while. A picture of me in the late '80s: selling future-proof computers with state-of-the-art operator interfaces, computing power, and security. It could eliminate most human labor with its SX coprocessor. We were so smart we even had an ashtray in the booth, so people could smoke while they learned about the future.

I have now experienced machine learning, and it still is not as impressive as a human with experience of the process or equipment tuning the automation. I have seen many manufacturers apply auto-tuning to their PLCs, and it has had the same quality impact as in music. My experience is in wastewater treatment, which is a very harsh environment for sensors. Many well-engineered control systems failed because the sensors were not reliable. So we came up with multi-variable criteria control, where you gauge the quality of a sensor by its historical data relations to the other sensors' data. None of this was the magic bullet, so operator-enhanced control is what I have witnessed has had the most success: your automated SCADA system accumulates and analyzes historical data, runs optimization and prediction algorithms, compares that to historical performance for that device or process, and provides tuning and operations recommendations, alarm reporting, and event trending to an experienced operator who makes the final decision.

In my experience, closed-loop automation works best on PowerPoint. But most important is liability: courts and juries are much more sympathetic to stupid humans than to stupid computers.
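The multi-variable criteria control idea above, gauging one sensor's health by its historical relationship to a companion sensor, can be sketched in a few lines. Everything here (the function name, window length, threshold, and sample data) is an illustrative assumption, not code from any real SCADA product.

```python
import statistics

def flag_bad_readings(primary, companion, history=20, k=3.0):
    """Flag indices where (primary - companion) deviates from its
    recent historical offset by more than k standard deviations."""
    flags = []
    for i in range(history, len(primary)):
        diffs = [primary[j] - companion[j] for j in range(i - history, i)]
        mean, sd = statistics.mean(diffs), statistics.pstdev(diffs)
        residual = abs((primary[i] - companion[i]) - mean)
        if sd > 0 and residual > k * sd:
            flags.append(i)
    return flags

# Two redundant level sensors that normally track each other with a
# fixed ~0.5 offset plus tiny noise; the primary sensor glitches at t=25.
companion = [float(t) for t in range(30)]
primary = [c + 0.5 + 0.01 * (-1) ** t for t, c in enumerate(companion)]
primary[25] = companion[25] + 5.0  # injected fault
print(flag_bad_readings(primary, companion))  # → [25]
```

The point of the cross-check is that neither sensor needs to be trusted absolutely; only a break in their historical relationship triggers an alarm, which is then handed to an operator rather than acted on blindly.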


----------



## Ed Woodrick (May 26, 2018)

DocScott said:


> Including the sonar?
> 
> I'd love it if Tesla would incorporate the sonar into Autopilot and FSD, but it doesn't.
> 
> ...


Not sure why you would think that someone mimicking Summon would use sonar. They'd be inside the vehicle and can't hear anything.

I doubt the car uses the accelerometer for FSD. I believe that it has one, because that's what some of the safety score is based on, and honestly they are too cheap not to just include them. The only reason I mentioned them is because a poster said that he could drive better with them.


----------



## DocScott (Mar 6, 2019)

Ed Woodrick said:


> Not sure why you would think that someone mimicking Summon would use Sonar. They'd be inside the vehicle and can't hear anything.


Not mimicking sonar--using the feedback the car already provides the driver based on its sonar. 

It's really strange that if you're pulling into or out of a parking space, garage, etc., a Tesla will hit you with a barrage of tones indicating how close you are to stuff (unless you've turned off that feedback), and will also show colored arcs on the visualization, but that Smart Summon itself doesn't use that information.


----------



## BrianC (Aug 14, 2021)

This post is a mix of AP and FSD observations, so apologies if the FSD pieces are off-topic.

I just got back from a 4500-mile road trip, and I paid for a month of FSD to help out. This was 99% freeway travel, so NoA was a non-factor for me. It was not FSD Beta, but the year-plus-old version because... 1-month subscription. Regardless of FSD, I agree that vision AP is not close to ready. Whether it's the inability to process fast enough or the inability to make the correct decisions given all the data, I can't say. I can offer some observations though...

Autopilot generally did a better job at night than during the day, except for the highly annoying, way-too-spastic auto high beams. It was better even during a moderately windy rainstorm at night than during the day with similar traffic. I interpret that as fewer inputs to distract it from the task at hand, but I'm not an AI engineer, just a lowly computer science engineer, so I don't know.

I noticed some patterns to the phantom braking. First, on freeway stretches with gentle rolling hills, during the middle of the day with the sun out, there are lots of uncalled-for braking events and what I call almost-braking events, where you feel the car very briefly start to brake but then cancel it after some further evaluation. Second, it seems to interpret mirages, even those maybe a half mile down the road, as something blocking your lane and very close to you: severe braking when it sees a mirage, as if a brick wall just sprang up in front of you. This was most prevalent along I-40 all throughout New Mexico; where it was overcast, this was not a problem. Third, breaching the top of one of these gentle, and even not-so-gentle, rises triggered a braking or almost-braking event more often than it should have, which is never if there's nothing in front of the car.

Curves cause way too many corrections to steering. When you or I take a turn, we set a turning amount, correct it a little bit as we get fully into the curve, and pretty much hold it there when the curve is of a constant radius. Autosteer makes many, many corrections to get through that constant-radius curve. I described it to someone as "Autosteer works like it's a kid who's had his learner's permit for about two weeks or so." I stick with this description. Same thing for coming out of the curve.

Accel and decel when cars are moving into or out of your lane is just flaky. When following a car in the left lane of a two-lane divided highway, and that car moves over to let the faster car pass (believe it or not, some people still do that), it takes way too long to decide to speed back up to your set speed (follow distance 2 here), and it seems to decide and change its mind at least twice every time. You get a little speed-up, a hesitation, some more speed-up, a bit more hesitation, and then it finally gets it in gear for real. Kind of dumb; it's either clear ahead or it's not. Crosswinds and tractor-trailers are the bane of the car's existence. The moment those trailer tires cross the dotted line into your lane: big bada-brake, even where I can clearly see the driver has already corrected and the incursion will be momentary. No reason to correct steering or speed.

Stop-and-go traffic... just nope. I disabled all assistance when in stop-and-go on the freeway. Braking and creep-forward are both too jerky; it's just not smooth at all.

Lane changes... I love this feature, but it still has that learner's-permit vibe. It's as though it picks a steering angle to make the lane change and sticks with that until it senses the far line, then adjusts steering to get back to straight down the lane. It's not smooth entering or exiting the lane change.

On-ramps: it needs to understand that when there's an on-ramp merging into the main lane, and no dotted line demarcating the main lane from the ramp area, you don't move over to drive down the center of the now much wider lane. The insistence on doing that, instead of just hanging the same distance from the perfectly good dotted line it's already tracking... silly.

Construction cones and zones. First, it would try to move out of the lane that the cones border, even when doing so was, in my eyes, a poor option (merging cars, traffic that will require a big adjustment in speed quickly, etc.). It's understandable that a stretch of road with curves, cones, merging lanes, ending lanes, and everything else under the sun (I'm looking at you, St. Louis) is a massively difficult task to navigate, but hey, deciding to abort in the middle of a curve on an overpass with a concrete barrier on one side and cones on the other with ZERO notice... VERY not cool. I'm glad I was paying close attention, and even then it was pretty close to putting some wall rash on the car. I will say that I probably should have disabled it when it started turning into a cone-pocalypse, but it was a steady curve. That was the last time I let it drive through anything more than the basic one-lane-roadwork-on-the-freeway scenario.

Taking all of that into consideration, it was better to have all this functionality than not to have it. When it was all doing its thing, it was much less stressful than driving it all without the help. I will add that an extreme phantom-braking event did cut a travel day in half for us, because I got extremely motion sick from it (my head was turned to the side at the time, and that makes it ten times worse for me). A rest stop was within a couple of miles, and after trying all my usual coping techniques there for about 30 minutes in hopes of resuming the journey, I just couldn't go on, so that day's hotel stop was only ten miles down the road, with a lot of deep breathing and the "do not puke" mantra in my head. For those who suffer from motion sickness, you get my pain.

Disclaimer, these are just my own experiences and opinions, and everyone has their own so you are welcome to agree or disagree as you see fit.


----------



## Klaus-rf (Mar 6, 2019)

^
So in conclusion, FSD is not ready for Prime Time.


----------



## BrianC (Aug 14, 2021)

Klaus-rf said:


> ^
> So in conclusion, FSD is not ready for Prime Time.


Hey, it took me a long time to write all that up, and you just go and blurt out the tl;dr.


----------



## Klaus-rf (Mar 6, 2019)

BrianC said:


> Hey, it took me a long time to write all that up, and you just go and blurt out the tl;dr.


My bad. Sorry.


----------



## ateslik (Apr 13, 2018)

Klaus-rf said:


> ^
> So in conclusion, FSD is not ready for Prime Time.


I’d say it’s not even ready for Sunday at midnight.


----------



## Madmolecule (Oct 8, 2018)

Vision only will be even worse for the bot. If you think there are a lot of corner cases on different roads in different countries, think about the real-world corner cases in people’s homes and factories. Vision only can’t even handle different people’s garages.


----------



## Mike (Apr 4, 2016)

Vision-only TACC: still does the phantom braking; this is the major drawback of this vehicle. Still, after almost 4.5 years of ownership, I would sign any legal waiver to have "dumb" cruise control as an option.


----------



## Shilliard528 (May 29, 2021)

And now they are removing USS (ultrasonic sensors) in cars in favor of vision - NFW. The car will need more cameras to accomplish what cars with USS can do. Despite my Model X having USS, it is still terrible at rear cross traffic and at sensing objects such as a basketball-sized rock in front (which I happened to hit when turning into a driveway). I am not seeing it (pun intended!!).


----------



## Klaus-rf (Mar 6, 2019)

Madmolecule said:


> Vision only will be even worse for the bot. If you think there are a lot of corner cases on different roads in different countries, think about the real-world corner cases in people’s homes and factories. Vision only can’t even handle different people’s garages.


Yeah. What's that bot gonna do when it detects (if it even detects them!) cobwebs and spiders hanging down in doorways?? Yes, it's that time of year again.


----------



## jsmay311 (Oct 2, 2017)

Shilliard528 said:


> And now they are removing USS (ultrasonic sensors) in cars in favor of vision - NFW. The car will need more cameras to accomplish what cars with USS can do. Despite my Model X having USS, it is still terrible at rear cross traffic and at sensing objects such as a basketball-sized rock in front (which I happened to hit when turning into a driveway). I am not seeing it (pun intended!!).


JFC… Elon might be the dumbest smart person in the world. I wish he’d just quit Tesla already and go full time into managing Twitter. Or literally anything else that doesn’t involve actively making Tesla worse.


----------



## francoisp (Sep 28, 2018)

Shilliard528 said:


> And now they are removing USS (ultrasonic sensors) in cars in favor of vision - NFW. The car will need more cameras to accomplish what cars with USS can do. Despite my Model X having USS, it is still terrible at rear cross traffic and at sensing objects such as a basketball-sized rock in front (which I happened to hit when turning into a driveway). I am not seeing it (pun intended!!).


Removing the ultrasonic sensors in favor of vision is not necessarily a bad thing if Tesla can pull it off as it believes it can. What disappoints me is that the software won't be fully functional before they do it, giving the owners of these new cars a worse experience.


----------



## Mike (Apr 4, 2016)

francoisp said:


> Removing the ultrasonic sensors in favor of vision is not necessarily a bad thing if Tesla can pull it off as it believes it can. What disappoints me is that the software won't be fully functional before they do it, giving the owners of these new cars a worse experience.


I suspect I will end up not updating my software once it becomes obvious that a TBD future software version will remove the USS functionality from my car.


----------



## francoisp (Sep 28, 2018)

Mike said:


> I suspect I will end up not updating my software once it becomes obvious that a TBD future software version will remove the USS functionality from my car.


Let's hope that Tesla will come out with a working software solution faster than what happened with its replacement of Mobileye several years ago.


----------



## Klaus-rf (Mar 6, 2019)

francoisp said:


> Let's hope that Tesla will come out with a working software solution faster than what happened with its replacement of Mobileye several years ago.


Dream On, Wayne!!


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> Dream On, Wayne!!


Is that a quote? I'm not familiar with this expression.


----------



## Klaus-rf (Mar 6, 2019)

francoisp said:


> Is that a quote? I'm not familiar with this expression.


 It's a construct melding "Dream on, Whiteboy" and "Party On, Wayne!". I made it up (and tequila was not involved).

Both very commonly used terms in the hood I grew up in - when I was younger, now that I'm old.

<g>


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> It's a construct melding "Dream on, Whiteboy" and "Party On, Wayne!". I made it up (and tequila was not involved).
> 
> Both very commonly used terms in the hood I grew up in - when I was younger, now that I'm old.
> 
> <g>


"Old" is a frame of mind. That's what I keep telling myself.


----------



## Shilliard528 (May 29, 2021)

I still do not understand: if it is vision only, why can't the "system" see as far as a human? It does not see, or perhaps sees but does not react to, the speed limit change, the lane squeeze, or the fact that a lane is a merge lane and not a new lane. I see signal light colors well before the "car" shows it as a light with color. I see 20/20; what does vision have for visual acuity? My 2 cents.
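On the visual acuity question, a rough back-of-the-envelope comparison helps. The camera figures below (about 1280 pixels across a roughly 50° horizontal field of view) are assumptions for illustration, not confirmed Tesla hardware specs:

```python
# Angular resolution of an assumed forward camera vs. 20/20 human vision.
fov_deg = 50       # assumed horizontal field of view (illustrative)
width_px = 1280    # assumed horizontal pixel count (illustrative)

arcmin_per_px = fov_deg * 60 / width_px   # arcminutes spanned by one pixel
human_20_20 = 1.0                         # 20/20 vision resolves ~1 arcminute

print(f"camera: {arcmin_per_px:.2f} arcmin/pixel, "
      f"about {arcmin_per_px / human_20_20:.1f}x coarser than 20/20 vision")
```

Under these assumptions the camera resolves roughly 2.3 arcminutes per pixel, a few times coarser than 20/20 vision, which would mean a distant vehicle collapses into a few-pixel sliver sooner for the camera than for a human eye.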


----------



## francoisp (Sep 28, 2018)

Shilliard528 said:


> I still do not understand: if it is vision only, why can't the "system" see as far as a human? It does not see, or perhaps sees but does not react to, the speed limit change, the lane squeeze, or the fact that a lane is a merge lane and not a new lane. I see signal light colors well before the "car" shows it as a light with color. I see 20/20; what does vision have for visual acuity? My 2 cents.


What is rendered on the screen is only a subset of what the car sees. If it doesn't display a traffic light, it's likely because it doesn't matter at that point. Same as me seeing a traffic light way ahead: I don't care until I'm within braking distance. By the way, there are a number of older videos on YouTube by "Green the Only" showing what the car sees, and that's a lot. And I think the software has only gotten better since. As for failing to recognize speed limit signs, these "objects" are seen, but they are either misunderstood or ignored because they don't conform to the standardized objects in the system database. In my area that's usually the case.


----------



## Kizzy (Jul 25, 2016)

francoisp said:


> Let's hope that Tesla will come out with a working software solution faster than what happened with its replacement of Mobileye several years ago.


The Mobileye thing was kind of sudden/unexpected, no?

Removing the radar/ultrasonic sensors was also unforeseen?


----------



## francoisp (Sep 28, 2018)

Kizzy said:


> The Mobileye thing was kind of sudden/unexpected, no?


Why do you think that? Unless I'm mistaken, the reason is simply that Tesla didn't want to use someone else's ADAS. Tesla wants to be vertically integrated and I agree with that.


----------



## Shilliard528 (May 29, 2021)

francoisp said:


> What is rendered on the screen is only a subset of what the car sees. If it doesn't display a traffic light, it's likely because it doesn't matter at that point. Same as me seeing a traffic light way ahead: I don't care until I'm within braking distance. By the way, there are a number of older videos on YouTube by "Green the Only" showing what the car sees, and that's a lot. And I think the software has only gotten better since. As for failing to recognize speed limit signs, those "objects" are seen but are either misunderstood or ignored because they don't conform to the standardized objects in the system's database. In my area that's usually the case.


Yeah, I do recall that it sees a lot more. I guess I would like to see what it sees.


----------



## Shilliard528 (May 29, 2021)

Shilliard528 said:


> Yeah, I do recall that it sees a lot more. I guess I would like to see what it sees.


Drove with the latest FSD (2022.20.18, FSD Beta 69.2.3) and did not see much difference. The car still tries to get into an exit lane on a two-lane highway, or tries to move into what it thinks is the middle lane of a three-lane road when the third lane is actually an off-ramp. If I can see that the lane ends and is not a true lane, it should see that both visually and via mapping. It also still does random lane changes for no reason that I can see. Put it on a true highway/freeway, though, and it is very good, as is single-lane road driving.


----------



## iChris93 (Feb 3, 2017)

francoisp said:


> If it doesn't display a traffic light it's likely because it doesn't matter at that point. Same as me seeing a traffic light way ahead: I don't care until I'm within braking distance.


I disagree. There’s a time when you stop accelerating and start coasting before you’re in braking distance. FSD beta drives as you describe and it’s uncomfortable and inefficient.
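The comfort gap here can be put in rough numbers: regen-only coasting must begin well before the point where braking-distance logic first needs to react. A minimal sketch with assumed deceleration rates (regen ~2.0 m/s², comfortable friction braking ~3.0 m/s²; both are illustrative guesses, not measured values):

```python
def stopping_distance(speed_mps, decel_mps2):
    """Distance to stop from speed_mps at a constant deceleration."""
    return speed_mps ** 2 / (2 * decel_mps2)

v = 25.0                             # ~56 mph
coast = stopping_distance(v, 2.0)    # regen-only coasting to a stop
brake = stopping_distance(v, 3.0)    # comfortable friction braking

# Under these assumptions you must lift off roughly 52 m before the
# point where "wait until braking distance" logic would first act.
print(round(coast), round(brake), round(coast - brake))
```

A controller that only reacts at braking distance therefore always ends up on the brakes, which matches the uncomfortable, inefficient behavior described above.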


----------



## Shilliard528 (May 29, 2021)

Theory on FSD Beta: I find when using FSD Beta that it still speeds up to stop lights instead of letting regen slow the car (or lightly accelerating while coasting to overcome regen), and the same in other areas, such as changing lanes in an intersection. Is it possible that the generation developing the code is of a different generation than the majority of us with FSD? Thoughts? Or am I all wet? LOL


----------



## iChris93 (Feb 3, 2017)

Shilliard528 said:


> Is it possible that the generation developing the code is of a different generation than the majority of us with FSD? Thoughts?


I think it’s more that California drivers are training it and less to do with generation.


----------



## Kizzy (Jul 25, 2016)

francoisp said:


> Why do you think that? Unless I'm mistaken, the reason is simply that Tesla didn't want to use someone else's ADAS. Tesla wants to be vertically integrated and I agree with that.


Tesla did want to switch, and I recall reading that they wanted more control of the Mobileye tech to start testing their own system, and Mobileye declined (this is the closest I could find). Additionally, after that fatal accident on AP HW 1.0 hardware, Mobileye cut off their hardware supply, period.


----------

