You probably have already heard that Elon Musk, noted Tesla boss and sayer of things, said a whole lot of things at yesterday's Tesla Autonomy Day event, many of which, I’d say, are pretty questionable. I think the most questionable thing he said, though, has to do with his take on lidar, which stands for Light Detection and Ranging, and is used by many companies developing autonomous vehicles, though, notably, not Tesla.
As you may know, lidar uses low-intensity, non-harmful, and invisible (to our eyes, at least—aliens may beg to differ) laser beams, which are pulsed at a target (or, in the case of most autonomous cars, all around, in a full 360° dome) and the reflected pulses are measured for return time and wavelength to compute the distance of the object from the sender.
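If you want a sense of the basic math involved, here’s a tiny Python sketch of that time-of-flight calculation; the function name and sample numbers are made up purely for illustration.

```python
# Rough sketch of the time-of-flight math behind a lidar range measurement.
# The pulse travels to the target and back, so the distance is half of
# (speed of light * round-trip time). Names and sample values are illustrative.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_return_time(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, in meters."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# A pulse that comes back after ~200 nanoseconds hit something about 30 meters away.
print(range_from_return_time(200e-9))  # ~29.98 meters
```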
In practice, lidar can produce some very detailed, high-resolution visualizations of the environment around a self-driving car. That’s why so many companies developing autonomous vehicles use it.
You can tell which autonomous vehicles use these systems because they usually have some sort of hardware on their roof to house the lidar sensors. You can see Waymo’s system explained in detail here, for example.
And as Tesla promises an extremely aggressive expansion of autonomous technology (things like 1 million “robotaxis” on the road by next year, and cars with Level 5 autonomy), it’s very hard to square all that with the complete lack of lidar tech.
Here’s what Musk had to say:
“Lidar is a fool’s errand. Anyone relying on lidar is doomed. Doomed! [They are] expensive sensors that are unnecessary. It’s like having a whole bunch of expensive appendices. Like, one appendix is bad, well now you have a whole bunch of them, it’s ridiculous, you’ll see.”
And he said more along those lines, such as:
“In my view, it’s a crutch that will drive companies to a local maximum that they will find very hard to get out of. Perhaps I am wrong, and I will look like a fool. But I am quite certain that I am not.”
I don’t think the appendix analogy holds up, because unlike an appendix, lidar actually does something: it helps autonomous vehicles generate a very detailed and comprehensive 360 degree view of the world. Appendixes just get inflamed and make adults cry in pain.
I think Musk has, fundamentally, two real problems with lidar: first, he’s stated he doesn’t like the way the sensors look on a car, and second, more importantly for him, no Teslas currently incorporate any lidar hardware. It seems it is very important to him that one day he can make all (or nearly all) Teslas currently on the road fully autonomous with an over-the-air software update, now that the so-called Full Self-Driving computer is in pretty much all new Model S, X and 3 cars you can buy from here on out.
Musk has called lidar a “crutch” in the past, and at the Tesla Autonomy Day event yesterday, Tesla AI director Andrej Karpathy said that lidar is a “shortcut” that “sidesteps the problem of visual recognition” and “gives a false sense of progress.”
Karpathy put it this way:
“You were not shooting lasers out of your eyes to get here,”
which, while not wrong, is pretty revealing of Tesla’s thinking on this matter.
These criticisms seem to suggest that we should be building, effectively, mechanical analogues of how humans perceive the world in order to drive, which is an oddly limiting idea. If we’re starting from scratch, why wouldn’t we want to go beyond what we ourselves are capable of, even if that means metaphorically shooting lasers out of our eyes?
Think about these criticisms: calling lidar a crutch that sidesteps the problem of interpreting three-dimensional data from a camera feed assumes that, somehow, there’s a “right” way to perceive the world, and that the right way is the same way humans do it, through visible-light optical sensing.
The truth is that lidar is not a shortcut; it’s another very effective way to perceive the complex, three-dimensional space that an autonomous vehicle has to exist in.
Lidar generates very detailed and comprehensive views of the surrounding environment, in three dimensions, and while it is still weather-dependent, as optical systems are, there are methods to work around those limitations.
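To give a rough sense of how those 3D views come together, here’s a small, hypothetical Python sketch that turns a few made-up lidar returns (a range plus the beam’s azimuth and elevation angles) into x/y/z points; real units do this for enormous numbers of returns every second.

```python
import math

# Each lidar return is a range (meters) plus the direction the beam was fired:
# azimuth (rotation around the vertical axis) and elevation (tilt up or down).
# Converting those polar measurements to x/y/z gives the familiar point cloud.
# The sample returns below are invented for illustration.

def return_to_point(range_m, azimuth_deg, elevation_deg):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)   # forward
    y = range_m * math.cos(el) * math.sin(az)   # left/right
    z = range_m * math.sin(el)                  # up/down
    return (x, y, z)

sample_returns = [(12.4, 0.0, -2.0), (30.1, 45.0, 1.5), (7.8, 180.0, -5.0)]
point_cloud = [return_to_point(*r) for r in sample_returns]
for point in point_cloud:
    print(tuple(round(c, 2) for c in point))
```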
This strange implication that, somehow, using lidar is cheating makes no sense at all. The fact that lidar-based autonomous vehicles have given, as Karpathy says, “really fast demos” doesn’t mean there’s some kind of parlor trick happening here; it means that lidar actually works well at giving an AV a good, comprehensive sense of its environment.
Nobody ever said that the “rules” for building autonomous vehicles require them to use only the same basic sensory systems that humans have. If there’s a way to improve on what humans use to drive, then why wouldn’t you take advantage of that?
Optical vision has worked well for driving for over a century, but that doesn’t mean it couldn’t be better, arguably much better. If humans could perceive a 360-degree view of their surroundings while driving, that would be fantastic: our clumsy rearview mirrors and blind-spot sensors would no longer be necessary. And if that 360-degree view were independent of light conditions or weather, you can be damn sure we’d like to have it.
Camera-based systems have the same limitations our eyes do: when it’s dark, they can’t see very well, especially dark things, like dark dogs or people in navy blue sweatshirts on bicycles. Weather is a huge factor as well.
Plus, where our brains have millions of years of evolution behind our binocular optical systems to help us understand three dimensions, two-dimensional cameras rely on complex and advanced computation to understand the world.
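For a taste of what that computation involves, here’s a toy Python sketch of the classic stereo-disparity relationship (depth equals focal length times baseline over disparity) that binocular setups rely on; the numbers are invented, and Tesla’s neural-net approach is, of course, far more involved than this.

```python
# Toy example of recovering depth from two cameras, the way binocular vision does.
# A point that appears shifted ("disparity") between the left and right images
# is closer the larger the shift. All numbers here are invented for illustration.

def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Depth in meters from the standard pinhole stereo relationship."""
    if disparity_px <= 0:
        raise ValueError("Zero or negative disparity: point is effectively at infinity.")
    return focal_length_px * baseline_m / disparity_px

# Cameras 30 cm apart, 1000-pixel focal length, object shifted 25 pixels between views:
print(depth_from_disparity(1000.0, 0.30, 25.0))  # 12.0 meters away
```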
Tesla is making real strides in interpreting camera data with neural networks, without question, something they discussed at length at their event. But there are still inherent limitations to that approach, such as the trouble Teslas have had with pictures of bicycles on the back of a van.
The solution to all of this is, as you may guess, to not skimp when it comes to letting your 4,000-pound robot see the world. Incorporate as many systems as possible, and let them be redundant so they can cover for one another if one method is having issues. This is the only way that makes sense. Not incorporating a method of sensing because it’s not one humans use is insane; humans also don’t move around by spinning rotors in stators with electricity, after all. We don’t have wheels, either.
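Here’s a deliberately simplified, hypothetical Python sketch of what that kind of redundancy might look like: several sensors report a distance to the same obstacle, any sensor without a usable reading gets dropped, and the car acts on the combined estimate. Real fusion stacks are vastly more sophisticated, but this is the shape of the idea.

```python
from statistics import median
from typing import Optional

# Hypothetical, simplified sensor-fusion sketch: each sensor reports a distance
# to the same obstacle (or None if it currently has no usable reading), and the
# car acts on the combined estimate. Dropping unhealthy sensors and taking the
# median means one misbehaving modality can't single-handedly blind the vehicle.

def fused_obstacle_distance(readings: dict[str, Optional[float]]) -> float:
    usable = [dist for dist in readings.values() if dist is not None]
    if not usable:
        raise RuntimeError("No sensor has a usable reading; fail safe and stop.")
    return median(usable)

# Camera washed out by low sun, but lidar and radar still see the obstacle:
print(fused_obstacle_distance({"camera": None, "lidar": 41.8, "radar": 43.2}))  # 42.5
```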
Optical recognition is crucial to self-driving cars, no question. But coupling optical systems with true three-dimensional lidar imagery is crucial as well. There’s just no rational reason to dismiss out of hand one of the most advanced environmental sensing technologies available.
Sure, right now lidar is expensive and requires funny-looking domes on the roof, but the cost should come down as the technology gets developed, and there’s little reason to think lidar would be any different than any other technology in that regard.
It’s certain that adding lidar to Teslas would increase the cost of the cars: it’s additional hardware, and anything you add has to be paid for. There’s real value in figuring out how to do more with less, no question, but when it comes to the fundamental way these machines will see the world and react to it, redundancy and better perception should win out over trying to be cheap.
Looking at all of the potential benefits of lidar, benefits that could literally save lives, and then looking at Elon’s blanket disparagement can really only suggest one explanation: Teslas don’t have lidar, so Elon needs to convince everyone it’s stupid and a big waste of money.
We’ll see if he’s right in the coming year, but it’s a view that may prove extremely short-sighted.