# Karpathy comments on removal of radar and ultrasonic sensors (Oct 29, 2022)



## francoisp (Sep 28, 2018)

A long interview of Andrej Karpathy by Lex Fridman. The radar/sensor comments start at 1:27:58. Enjoy!


----------



## DocScott (Mar 6, 2019)

I fundamentally disagree with the premise that people move around the world using vision only.

Over time, evolution is pretty good at nerfing systems creatures aren't using, or where the "entropy cost" is too high. But there isn't a living creature on the planet that navigates and moves using vision only. Vision may be the most important sense for that task for a lot of creatures, but it's never the only one.

We can certainly have the discussion as to whether the trade-offs of including radar, sonar, and/or lidar are worth it. But please, I wish we'd dispose of the rubbish that people do fine with vision only.


----------



## francoisp (Sep 28, 2018)

DocScott said:


> I fundamentally disagree with the premise that people move around the world using vision only.
> 
> Over time, evolution is pretty good at nerfing systems creatures aren't using, or where the "entropy cost" is too high. But there isn't a living creature on the planet that navigates and moves using vision only. Vision may be the most important sense for that task for a lot of creatures, but it's never the only one.


Are we talking about survival or navigation? Sensory needs are different for each.


----------



## skygraff (Jun 2, 2017)

It always surprises me how these companies ignore auditory cues in the driving environment.

Until all emergency vehicles are using transponders (or some other technology) to communicate their approach from behind or cross streets to all receiving vehicles, at the very least automation requires sonic (not ultra) sensors. There are other reasons to be able to hear when driving but that’s probably the most critical.


----------



## francoisp (Sep 28, 2018)

skygraff said:


> It always surprises me how these companies ignore auditory cues in the driving environment.
> 
> Until all emergency vehicles are using transponders (or some other technology) to communicate their approach from behind or cross streets to all receiving vehicles, at the very least automation requires sonic (not ultra) sensors. There are other reasons to be able to hear when driving but that’s probably the most critical.


So often I hear a siren and I can't figure out where it's coming from until I see the emergency vehicle. Seems to me a car with cameras all around is well equipped to see the lights of an emergency vehicle. As Karpathy said, simplify.


----------



## Klaus-rf (Mar 6, 2019)

francoisp said:


> Seems to me a car with cameras all around is well equipped to see the lights of an emergency vehicle. As Karpathy said, simplify.


 Teslas do NOT have "cameras all around". There are MANY blind spots.


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> Teslas do NOT have "cameras all around". There are MANY blind spots.


This picture, assuming that it is accurate, shows that cameras have a 360-degree view of the surroundings.


----------



## Klaus-rf (Mar 6, 2019)

It's NOT accurate. It is merely an artist's representation. Some would call it false advertising.


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> It's NOT accurate. It is merely an artist's representation. Some would call it false advertising.


Are you accusing Tesla of lying, of willful deception? The Tesla website page linked below does mention 360 degrees of coverage.



> *Hardware
> Exceptional Awareness*
> 
> Eight cameras and powerful vision processing provide 360 degrees of visibility, detecting nearby objects like pedestrians, bicyclists and vehicles.


Tesla Safety


----------



## Klaus-rf (Mar 6, 2019)

francoisp said:


> Are you accusing Tesla of lying, of willful deception? The Tesla website page linked below does mention 360 degrees of coverage.


Yes. Marketing BS - aka A LIE.

Park your car in an open parking lot and have someone - 100 feet away - walk a circle around your car. Then tell me how much of that 360 degrees is captured by your car's cameras.


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> Yes. Marketing BS - aka A LIE.
> 
> Park your car in an open parking lot and have someone - 100 feet away - walk a circle around your car. Then tell me how much of that 360 degrees is captured by your car's cameras.


Are you formulating your opinion based on the 4 dashcam views? Because those only include the fender cameras, not the door frame cameras.


----------



## garsh (Apr 4, 2016)

Klaus-rf said:


> Teslas do NOT have "cameras all around". There are MANY blind spots.


Many?

The only blind spots I know of are some areas right next to the car. Everything more than a foot away is covered by at least one camera, and often two.


----------



## Klaus-rf (Mar 6, 2019)

garsh said:


> Many?
> 
> The only blind spots I know of are some areas right next to the car. Everything more than a foot away is covered by at least one camera, and often two.


Draw an imaginary line at the center of both front doors, perpendicular to the forward line of travel of the car. With a stationary car, place objects along the line at 2, 5, 10, 20, 50, and 100 feet away from the car and tell me which of those objects are seen by the cameras. You may be surprised.


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> Draw an imaginary line at the center of both front doors, perpendicular to the forward line of travel of the car. With a stationary car, place objects along the line at 2, 5, 10, 20, 50, and 100 feet away from the car and tell me which of those objects are seen by the cameras. You may be surprised.


Do you have a way to access all 8 cameras that I'm not aware of? I see only 4.


----------



## Klaus-rf (Mar 6, 2019)

You can easily draw chalk lines on the pavement showing the angles of view that the cameras "see" and easily detect blind spots. The rear camera is fish-eye wide angle, so whether it "sees" something is yuuugely dependent on the distance to the object. It pretty much can't see anything past 100 feet.

Since you can easily "see" what the rear side cameras see, you can extrapolate the angles that the front side cameras see. More chalk lines.

Would be a good use for a drone.
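The chalk-line test above amounts to checking whether a point falls inside any camera's horizontal field of view. A minimal sketch of that geometry check is below; the camera positions, headings, and FOV angles are illustrative assumptions for the sake of the example, not Tesla's published specs.

```python
import math

# Hypothetical left-side camera layout (positions in feet relative to car
# center, headings in degrees where 0 = straight ahead). These numbers are
# illustrative assumptions, NOT Tesla's actual specifications.
CAMERAS = [
    {"name": "left repeater", "pos": (-3.0, 2.0), "heading": -150.0, "hfov": 80.0},
    {"name": "left B-pillar", "pos": (-3.0, 0.0), "heading": -90.0,  "hfov": 90.0},
    {"name": "front wide",    "pos": (0.0, 6.0),  "heading": 0.0,    "hfov": 120.0},
]

def visible_by(cameras, point):
    """Return the names of cameras whose horizontal FOV contains (x, y)."""
    hits = []
    for cam in cameras:
        dx = point[0] - cam["pos"][0]
        dy = point[1] - cam["pos"][1]
        bearing = math.degrees(math.atan2(dx, dy))  # 0 deg = +y (forward)
        # Smallest signed angle between the bearing and the camera heading
        diff = (bearing - cam["heading"] + 180.0) % 360.0 - 180.0
        if abs(diff) <= cam["hfov"] / 2.0:
            hits.append(cam["name"])
    return hits

# Klaus-rf's test: objects on a line through the front doors, straight out
# from the driver's side, at increasing distances (car half-width ~3 ft).
for d in (2, 5, 10, 20, 50, 100):
    point = (-(3.0 + d), 0.0)  # d feet straight out from the driver's door
    print(f"{d:>3} ft: {visible_by(CAMERAS, point) or 'not covered'}")
```

With these assumed numbers the B-pillar camera covers the whole door line and the repeater drops out beyond roughly 40 feet, which is exactly the kind of overlap (or gap) the chalk lines would reveal for the real hardware.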


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> You can easily draw chalk lines on the pavement showing the angles of view that the cameras "see" and easily detect blind spots. The rear camera is fish-eye wide angle, so whether it "sees" something is yuuugely dependent on the distance to the object. It pretty much can't see anything past 100 feet.
> 
> Since you can easily "see" what the rear side cameras see, you can extrapolate the angles that the front side cameras see. More chalk lines.
> 
> Would be a good use for a drone.


Green the Only has made recordings by hacking the camera streams and shown that the camera views do indeed overlap. I remember posting something about this a few years ago.


----------



## Klaus-rf (Mar 6, 2019)

francoisp said:


> Are we talking about survival or navigation? Sensory needs are different for each.


 Both. Bats (sightless) navigate hundreds of miles and somehow manage to catch (and eat) small flying and stationary things quite easily while having no vision. 

Vision is not required for navigation. Lots of blind folks get around just fine without issue (although they may be slower at it). The Supreme Court ruled that blind people can have guns and CC permits.

Vision is only one of many sensory inputs. Seems irresponsible for Tesla to exclude all other inputs.


----------



## francoisp (Sep 28, 2018)

Klaus-rf said:


> You can easily draw chalk lines on the pavement showing the angles of view that the cameras "see" and easily detect blind spots. The rear camera is fish-eye wide angle, so whether it "sees" something is yuuugely dependent on the distance to the object. It pretty much can't see anything past 100 feet.
> 
> Since you can easily "see" what the rear side cameras see, you can extrapolate the angles that the front side cameras see. More chalk lines.
> 
> Would be a good use for a drone.


Here's a recording from *Green the Only* that I just pulled. That recording shows the views from 7 cameras. They clearly show the tractor trailer the car is passing in overlapping views. The back of the trailer is simultaneously visible on both the top view and the side repeater. Then the front of the truck becomes visible on the side repeater and then on the fender camera and the back camera.

✂ Tesla passing truck clearly showing cameras overlap


----------



## SimonMatthews (Apr 20, 2018)

Klaus-rf said:


> Lots of blind folks get around just fine without issue (although they may be slower at it). The Supreme Court ruled that blind people can have guns and CC permits.


And yet lots of deaf people drive.


----------



## francoisp (Sep 28, 2018)

SimonMatthews said:


> And yet lots of deaf people drive.


And no blind people do. 🤪


----------



## Klaus-rf (Mar 6, 2019)

SimonMatthews said:


> And yet lots of deaf people drive.


 True. I've had (used, experienced?) several deaf Uber drivers.


> And no blind people do. 🤪


 That may be more of a state licensing issue. 

You ever seen those crazy (imho) folks riding motorcycles inside the wire cages at a carnival? There was a group of blind riders that did that (it was a while back; I think I was younger then). Balance and timing.


----------



## NR4P (Jul 14, 2018)

I am still surprised how a small rainstorm disables Nav on AP and FSD. What does the car do in fog or a good snowstorm if it can't handle a typical rainstorm here in the SE six months of the year? My Tesla will never be able to robo-drive if it can't solve these problems.

And now that the ultrasonic sensors are gone and we don't have front camera views in the bumpers, how do I know how far to pull into my garage before the car hits the wall? Yes, it's a tight spot between the garage wall and the door clearance when it comes down behind the car. (And no, I am not backing into the garage to solve Tesla's problem of removing ultrasonics with no solution.)


----------



## Klaus-rf (Mar 6, 2019)

NR4P said:


> I am still surprised how a small rainstorm disables Nav on AP and FSD. What does the car do in fog or a good snowstorm if it can't handle a typical rainstorm here in the SE six months of the year? My Tesla will never be able to robo-drive if it can't solve these problems.
> 
> And now that the ultrasonic sensors are gone and we don't have front camera views in the bumpers, how do I know how far to pull into my garage before the car hits the wall? Yes, it's a tight spot between the garage wall and the door clearance when it comes down behind the car. (And no, I am not backing into the garage to solve Tesla's problem of removing ultrasonics with no solution.)


And nighttime can also disable FSD. If it doesn't "see" something in the side cameras, it assumes failed hardware and shuts off. Not useful.

And to your other point - I would never let FSD or Summon attempt to maneuver itself into or out of my garage. I have things within inches on three sides and the US alarms are always triggering. They used to show the distance at the front and sides, and now (past 2-3 "updates") they just show red lines with no numbers.


----------



## Nom (Oct 30, 2018)

@Klaus-rf - you seem to be ignoring the Green the Only video @francoisp has posted. Why do you feel that the door pillar cameras would not see things basically straight out from them?


----------



## DocScott (Mar 6, 2019)

francoisp said:


> Are we talking about survival or navigation? Sensory needs are different for each.


Both.

And yes, there are plenty of deaf people who drive. The question isn't if it's possible; it's if it's optimal. In fact, in some cases deaf people use devices to listen for them, so that they have a visual indicator of horns and sirens. That's right--they add sensors other than vision to optimize driving!


----------



## garsh (Apr 4, 2016)

Klaus-rf said:


> Since you can easily "see" what the rear side cameras see, you can extrapolate the angles that the front side cameras see. More chalk lines.


No need to extrapolate. There are plenty of captures from those cameras on YouTube. They all overlap.



Klaus-rf said:


> The rear camera is fish-eye wide angle so it "seeing" something is yuuugely dependent on the distance to the object. It pretty much can't see anything past 100 feet.


Agreed, the wide angle will reduce resolution at longer distances. But the rear camera is the least important for high-speed driving. And the two fender cameras overlap to show the majority of the rearward view.


----------



## garsh (Apr 4, 2016)

DocScott said:


> And yes, there are plenty of deaf people who drive. The question isn't if it's possible; it's if it's optimal.


I disagree with your assertion of the question. Tesla isn't attempting "optimal". They're attempting to create "better than average human".


----------



## TrevP (Oct 20, 2015)

DocScott said:


> I fundamentally disagree with the premise that people move around the world using vision only.
> 
> Over time, evolution is pretty good at nerfing systems creatures aren't using, or where the "entropy cost" is too high. But there isn't a living creature on the planet that navigates and moves using vision only. Vision may be the most important sense for that task for a lot of creatures, but it's never the only one.
> 
> We can certainly have the discussion as to whether the trade-offs of including radar, sonar, and/or lidar are worth it. But please, I wish we'd dispose of the rubbish that people do fine with vision only.


Vision-impaired people get around pretty well by relying on their senses of hearing and touch. Cars? Time will tell if they can do the same without any other sensory inputs. Remember: biological senses ≠ electronic senses.


----------



## Klaus-rf (Mar 6, 2019)

garsh said:


> No need to extrapolate. There are plenty of captures from those cameras on YouTube. They all overlap.


Then why does the car (FSD?) sharply veer left/right at stop sign intersections - if it can already "see" everything?


----------



## skygraff (Jun 2, 2017)

francoisp said:


> So often I hear a siren and I can't figure out where it's coming from until I see the emergency vehicle. Seems to me a car with cameras all around is well equipped to see the lights of an emergency vehicle. As Karpathy said, simplify.


The sirens are to get our attention so we start looking. Inside a vehicle with windows closed and surrounded by reflective surfaces (buildings, other vehicles, etc.), there will be some confusion and difficulty triangulating (our ears are within a foot of each other so...). Once we start looking, however, we turn our ears at the same time we move our eyes (and see where other drivers are looking) so, in short order, we are usually able to discern a likely vector.
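The triangulation difficulty described above can be put in rough numbers. With two "ears" about 0.22 m apart, the far-field approximation sin θ = cΔt/d turns even a 0.1 ms timing error (easy to pick up from reflections off buildings and other cars) into roughly a 9-degree shift in apparent bearing. A back-of-the-envelope sketch, where the ear spacing and the far-field model are simplifying assumptions:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C
EAR_SPACING = 0.22      # m, roughly the width of a human head (assumption)

def bearing_from_itd(delta_t, spacing=EAR_SPACING, c=SPEED_OF_SOUND):
    """Estimate a sound source's angle off the straight-ahead axis from the
    interaural time difference, using the far-field approximation
    sin(theta) = c * delta_t / spacing. Sign indicates left vs right."""
    s = max(-1.0, min(1.0, c * delta_t / spacing))
    return math.degrees(math.asin(s))

# The largest possible delay: sound arriving from directly to one side.
max_itd = EAR_SPACING / SPEED_OF_SOUND
print(f"max ITD ~ {max_itd * 1e3:.2f} ms")  # about 0.64 ms

# A reflection that shifts arrival time by only 0.1 ms moves the
# perceived bearing noticeably:
print(f"{bearing_from_itd(0.0):.0f} deg")   # straight ahead: 0 deg
print(f"{bearing_from_itd(1e-4):.0f} deg")  # ~9 deg off axis
```

Since the entire usable range of delays is under a millisecond, small echoes swamp the signal, which is why a siren's direction inside a closed car is so hard to pin down until you can see the lights.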

I don't know about you, but I have a much deeper field of view behind my vehicle (especially when stopped and free to turn my head) than the rear camera and repeaters. That's why I'm able to avoid triggering road rage by stopping the car from changing lanes in front of an approaching speeder.

Yes, it's possible to drive without auditory cues just like it's possible to do so without looking well behind you before changing lanes. It just isn't necessarily ideal and doesn't raise my hopes of FSD success let alone an improvement in safety compared to human drivers.

By the way, overlapping cameras are great until one gets blinded by some condensation or glare. We can turn our heads, squint, put on sunglasses, blink, wipe our eyes, open a window, etc., etc.


----------



## DocScott (Mar 6, 2019)

garsh said:


> I disagree with your assertion of the question. Tesla isn't attempting "optimal". They're attempting to create "better than average human".


I disagree.

That's what some of the other attempts to create self-driving cars are doing. But Musk is quite insistent that he doesn't want sensors that aren't "needed," like lidar, because they add unjustified expense. He's trying to skip the "proof of principle" step of building a car that can drive itself better than the average human, and go straight to building a car that can do so in an efficient and inexpensive way. That is, he's aiming for an optimal solution.

That's also what Karpathy is talking about in this clip. He's not at all focused on what will help a car drive better than the average human; he's already moving a step beyond that and trying to figure out how to optimize it.

It's a bold approach. Maybe it will pay off in the end, and maybe it won't. But it _is_ the approach.


----------



## DocScott (Mar 6, 2019)

TrevP said:


> Vision impaired people get around pretty well by relying on their sense of hearing and touch. Cars? Time will tell if they can do the same without any other sensory inputs. Remember biological senses ≠ electronic senses


Sure. 

But it's Musk and Karpathy who repeatedly are making their argument for vision only cars based on human capabilities!


----------

