The mind salivates at the idea of sub-$100 and, soon after, sub-$10 lidar. We could build spatial awareness into damn near everything. It'll be a Cambrian explosion of autonomous robots.
There are already very good sub-$100 lidars, especially for 2D, since they were made en masse for vacuum cleaners. E.g. the LD19, or STL-19P as they're calling it now for some reason. You need to pair them with serious compute to run AMCL, plus actuation (though ST3215s are cheap and easy to integrate now too) and control for that actuation, which also wants its own compute, plus a battery, etc. The costs quickly add up. Robotics is expensive regardless of how cheap components get.
Even back when Snowden was current news, we'd reached the point where laser microphones could cover every window in London for a bill of materials* less than the annual budget of London's police force.
* I have no way to estimate installation costs, but smartphones show that manufacturing at this scale doesn't need to increase total cost 10x more than the B.o.M.
I’d definitely feel much better if most cameras in the world were replaced by LIDAR. I feel like it would be much tougher to build a flawless facial recognition system from LIDAR alone.
The minute internet became widespread it was game over.
Pros and cons. :/
It'll never happen, but we need a bill of rights for privacy. The laypeople aren't well-versed or pained enough to ask for this, and big interest donors oppose it.
Maybe the EU and states like California will pioneer something here, though?
Edit: in general, I'm far more excited by cheap lidar tech than I am afraid of the downsides. We just need to be vigilant.
I'd say the numbers listed here prove the GPs point of poor enforcement. The largest fine is roughly 0.97% of Meta's 2023 revenue, the equivalent of a $600 fine for somebody making 60k / year. It's a tiny-tiny cost of doing business at best, definitely not a deterrent, given Meta's blatant disregard for GDPR since then.
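The proportionality claim is easy to sanity-check with quick arithmetic. The fine is the €1.2 billion figure from the thread; the revenue figure (and treating EUR and USD as roughly comparable at this scale) is my own approximation, so treat the output as order-of-magnitude only:

```python
# Rough check of the "fine as a share of revenue" argument.
# Currency differences (EUR fine vs USD revenue) are ignored at this scale.
meta_fine = 1.2e9            # largest GDPR fine cited in the thread
meta_revenue_2023 = 134.9e9  # Meta's 2023 revenue, approximate

share = meta_fine / meta_revenue_2023
print(f"fine is {share:.2%} of annual revenue")

# Scaled down to someone making 60k/year:
print(f"equivalent personal fine: ${share * 60_000:.0f}")
```

Depending on the exchange rate assumed, this lands a bit under 1% of revenue, i.e. in the ballpark of the $600-on-$60k comparison above.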
Interestingly, there have been people in the LIDAR industry predicting costs like this for many years. I heard numbers like $250 per vehicle back in 2012 [1]
Of course, ambitious pricing like this is all about economies of scale - sensors that are used in production vehicles are ordered by the million, and that lowers the costs massively. When the huge orders didn't materialise, the economies of scale and low prices didn't materialise either.
Also: 'Luminar Technologies, a prominent U.S. lidar manufacturer, filed for Chapter 11 bankruptcy in December 2025'. LIDAR is useful in a small set of scenarios (calibration and validation), but don't bet the farm on it or make it the centrepiece of your sensor suite.
This is very wrong.
LIDAR scanners have revolutionized surveying by
enabling rapid, high-precision 3D mapping of terrain and infrastructure, capturing millions of data points per second. LIDAR can penetrate dense vegetation, allowing accurate ground-level mapping in forested or obstructed areas. Drone-mounted LIDAR has become very popular. Tripod-mounted LIDAR scanners are very commonly used on construction sites. Handheld LIDAR scanners can map the inside of buildings with incredible accuracy. This is very commonly used to create digital twins of factories.
Lidar is critical for any autonomous vehicle. It turns out a very accurate 3D point cloud of the environment is very useful for self driving. Crazy, I know.
Like Waymo? (https://dmnews.co.uk/waymo-robotaxi-spotted-unable-to-cross-...) 17 years after betting the farm on LIDAR, the solution fails to navigate a puddle. Sorry, but they bet on the wrong technology; Tesla has overtaken them with a multi-camera and NN solution.
I'm not well versed in RF physics. I had the feeling that light-wave coherency in lasers had to be created at a single source (or amplified as it passes by). This is the first time I've heard of phased-array lasers.
The beam is split and re-emitted at multiple points. By controlling the optical path length (via refractive index, or just the physical length of the waveguide using optical junctions) leading to each emitter, the phase can be adjusted.
In practice, this can be done with phase-change materials (heating/cooling materials to change their index), or micro-ring resonators (to divert light from one waveguide to another).
The beams then interfere, and the resulting interference pattern (constructive/destructive depending on direction) is used to steer the beam.
You are right that a single source is needed, though I imagine that you can also use a laser source and shine it at another "pumped" material to have it emit more coherent light.
I've been thinking about possible use-cases for this technology besides LIDAR. Point-to-point laser communication could be an interesting application: satellite-to-satellite communication, or drone-to-drone in high-EMI settings (battlefield with jammers). This would make mounting laser designators on small drones a lot easier. Here you go, free startup ideas ;)
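The steering idea in the comments above can be sketched numerically: for N coherent emitters with a linear phase ramp applied across them, the far-field pattern peaks in the direction where the per-element path phase cancels the ramp. A toy sketch, with purely illustrative element count, spacing, and wavelength:

```python
import math

# Far-field "array factor" of an N-element linear optical phased array.
# A linear phase ramp across the emitters steers the main lobe.
N = 16                  # number of emitters
WAVELENGTH = 905e-9     # a common lidar wavelength (illustrative)
D = WAVELENGTH / 2      # half-wavelength emitter spacing
K = 2 * math.pi / WAVELENGTH

def array_factor(theta_deg: float, steer_deg: float) -> float:
    """Magnitude of the summed emitter fields in direction theta."""
    ramp = -K * D * math.sin(math.radians(steer_deg))  # per-element phase
    re = im = 0.0
    for n in range(N):
        phase = n * (K * D * math.sin(math.radians(theta_deg)) + ramp)
        re += math.cos(phase)
        im += math.sin(phase)
    return math.hypot(re, im)

# Sweep the far field in 0.1-degree steps; the main lobe should sit
# exactly at the steering angle, where all emitters add in phase.
angles = [a / 10.0 for a in range(-900, 901)]
peak = max(angles, key=lambda a: array_factor(a, steer_deg=20.0))
print(f"main lobe at {peak:.1f} degrees")  # 20.0
```

The same formula is the classic radio phased-array result; nothing here is specific to light except the tiny wavelength, which is what makes the phase control so hard in practice.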
In principle, as the sibling comment says, you could measure just the phase difference on the receiver end. The trick is that it's much harder at light frequencies than at radar frequencies. I'm not even sure we can measure the phase of a light beam directly, and even if we could, the Nyquist rate is incredibly high: sampling at twice the optical frequency takes us into PHz territory.
There might be something cute you can do with interference patterns but no idea about that. We do sort of similar things with astronomic observations.
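The back-of-envelope numbers behind the "PHz" remark, using a typical near-IR lidar wavelength (coherent/FMCW lidar sidesteps this by beating the return against a local oscillator and measuring only the much lower difference frequency):

```python
# Why directly sampling an optical field is infeasible: rough estimate.
C = 3.0e8              # speed of light, m/s
WAVELENGTH = 905e-9    # typical near-IR lidar wavelength

optical_freq = C / WAVELENGTH    # hundreds of THz
nyquist_rate = 2 * optical_freq  # sampling rate for direct capture

print(f"optical frequency: {optical_freq / 1e12:.0f} THz")
print(f"Nyquist rate:      {nyquist_rate / 1e15:.2f} PHz")
```

That's roughly 331 THz carrier, i.e. a ~0.66 PHz sampling rate, orders of magnitude beyond any ADC.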
Not an expert, but the main challenges with laser coherency arise when shaping the output using multiple transmitters.
For lidar you transmit a pulse from a single source and receive its reflection at multiple points. Mentioning phased array with lidar almost always means receiving.
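For context, the pulse ranging described here is plain time-of-flight: the pulse travels out and back, so range falls straight out of the round-trip delay. A minimal sketch:

```python
# Pulsed-lidar time-of-flight ranging: range = c * t / 2
# (divide by 2 because the pulse travels to the target and back).
C = 3.0e8  # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Target distance from a pulse's measured round-trip time."""
    return C * round_trip_s / 2.0

# A target 150 m away returns the pulse after 1 microsecond.
print(f"{tof_range_m(1e-6):.0f} m")  # 150 m
```

The microsecond-scale timing is also why receiver electronics, not the laser, dominate the precision budget.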
A phased array is an antenna composed of multiple smaller antennas within the same plane that can constructively/destructively aim its radio beam within any direction it is facing. I'm no radio engineer but I think it works via an interference pattern being strongest in the direction you want the beam aimed. This is mostly used in radar arrays though I suppose it could work with light too since it is also a wave.
Interesting to see the cost curve drop ... this always changes the market.
I have been watching the sensor space for a while. Cheap LIDAR units could open up weird DIY uses, not just cars. Regulatory and mapping integration will matter too; I tried to work with public datasets and it's messy. The hardware is only one part! But it's exciting to see multiple vendors in the space; competition might push them to refine the software stack as well as the hardware. I'm keeping an eye on how these systems handle edge cases in bad weather, though. I don't think we have seen enough data yet...
> Cheap LIDAR units could open up weird DIY uses and not just cars.
Interestingly, there are already some comparatively cheap LIDAR units on the market.
In the automotive market, ideally you need a 200m+ range (or whatever the stopping distance of your vehicle is) and you need to operate in bright direct sunlight (good luck making an eye-safe laser that doesn't get washed out by the sun) and you need more than one scanning plane (for when the car goes over bumps).
On the other hand, for indoor robotics where a 10m range is enough and there's much less direct sunlight? Your local robotics stockist probably already has something <$400
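The "200m+ or whatever your stopping distance is" rule of thumb is easy to sketch: stopping distance is reaction distance plus braking distance v²/(2μg). The reaction time and friction coefficient below are illustrative assumptions, not standardized values:

```python
# Rough highway stopping distance, motivating the ~200 m automotive
# lidar range requirement. Assumed: 1.5 s reaction, dry-road mu = 0.7.
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_kmh: float,
                        reaction_s: float = 1.5,
                        mu: float = 0.7) -> float:
    """Reaction distance plus braking distance v^2 / (2*mu*g)."""
    v = speed_kmh / 3.6  # convert to m/s
    return v * reaction_s + v**2 / (2 * mu * G)

for kmh in (50, 100, 130):
    print(f"{kmh} km/h -> {stopping_distance_m(kmh):.0f} m")
```

At 130 km/h this already approaches 150 m on a dry road; drop μ for rain and the 200m+ sensor range stops looking like overkill.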
Sounds like the quality isn't all that great, but LD06 sensors look like they're about $20. Someone who works on libraries for these suggested the STL27L, which seems to be about $160; here's an outdoor scan from it: https://sketchfab.com/3d-models/pidar-scan-240901-0647-7997b...
Not sure if the LD06 is a scanner like this or just a single line (like you'd use for a cheaper robot vac).
@dang... do these comments seem organic to you? Old accounts with almost zero karma going out of their way to use the same verbiage to compliment Waymo 18 minutes after an article gets posted?... Dead internet at work.
Please don't post like this. If you suspect something, please email us (hn@ycombinator.com) with links to specific comments. The guidelines are clear about this:
Please don't post insinuations about astroturfing, shilling, brigading, foreign agents, and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data.
Anytime a Tesla or Elon-related article is posted, it gets a barrage of negative, usually FUD-like comments, and any neutral or positive comment gets downvoted heavily. A bit suspicious to say the least; the pattern is very clear. They aren't doing it very well, though; it should be a bit more nuanced.
There is no evidence of any such organised campaign. The critical comments we see against that company and person are generally from known, established HN users, and align with frequently-expressed sentiments among the general public. And the complaint is just as often made that "anything remotely critical" about that company and person is flagged. If posts about the topic are being downvoted and flagged, it's mostly because that person and company are in the news so frequently that most commentary about them is repetitive, sensationalist and uninteresting, and thus off topic for HN.
Is this human-safe at these volumes? There was a time you could get your feet sized by putting them into an X-ray box at the shoe store. Those were removed from stores once the harm was known.
I saw a Waymo in Seattle today. If Waymo can get Seattle right, that gives me a lot of confidence that their stack is very capable of handling difficult road conditions.
Note: I have not had the pleasure of riding in one yet, but from what my friend in SJ says, it’s very convenient and confidence-inspiring.
I took the Waymo from San Jose airport to home on the peninsula. It took the 101 highway back for the most part, driving very conservatively at 55-65 mph in the rightmost lane. It still has a few quirks, though. When there aren't any cars around it will speed up to 65 mph, but at on-ramps it will slow down to 55 and then speed up once past. It will get stuck behind slow drivers in the rightmost lane, patiently following a few car lengths behind them. On the plus side, the lidar stack's field of view, as shown on the internal display, seems to see pretty far down the highway.
Why wouldn't you trust a Tesla? Millions of people let their Teslas drive them all over the USA (not geofenced like Waymo) without touching the wheel, from parking spot to parking spot, every day. Have you tried it?
Maybe because of the multiple investigations Tesla has currently due to crashes, deaths, injuries, etc. all caused by "whoops our cameras were fooled by some glare/fog and accelerated into a truck/pole"
Those are mainly Autopilot, which people conflate with FSD, and a high percentage are human-caused accidents (Autopilot requires full attention and the driver is liable).
Autopilot is Tesla’s brand name for adaptive cruise control with lane centering. It’s a common feature on a huge number of cars from nearly every brand. Other companies have their own brand names for this, e.g. ProPilot or BlueCruise.
People died when adaptive cruise control was misused by drivers. Memes aside, there’s no good evidence that fatal abuse of adaptive cruise control is more prevalent than in other brands.
(Similarly, there was a rash of reporting of EV fires, yet statistics do conclusively show that ICE vehicles catch fire more often than EVs.)
Because it is not real autonomous driving? Being liable for software that you can neither verify nor trust is THE dealbreaker. Once Tesla says "we are liable for all accidents with FSD" at a higher level of autonomous driving, the game changes. But Waymo is just way more reliable.
Even more fury-inducing: they disabled the ultrasonic parking sensors on cars that physically have them, to move to a vision-only stack that is nowhere near as accurate and which categorically cannot tell that a change in ground truth has occurred in its blind spot. But hey, all _people_ need are two cameras, right?
I mean, it doesn't. If you actually look at it, comma.ai proves that level 2 doesn't require lidar. That's not the same as full-speed safe autonomy.
Whilst it is possible to drive vision-only (assuming the right array of cameras, i.e. not the way Tesla has done it), lidar gives you a low-latency source of depth that can correct vision mistakes. It's also much less energy-intensive for working out whether an object is dangerous and on a collision course.
To do that in vision, you need to work out what the object is (i.e. is it a shadow?) and then triangulate it. That requires continuous camera calibration, and isn't all that easy. If you have a depth "prior" (yes, it's real; yes, it's large; yes, it's on a collision course), it's much, much simpler to use vision to work out what to do.
It's fair to point out that comma.ai is an SAE level 2 system; however, it's not geofenced at all, which is an SAE level 5 requirement. But really that brings up the fact that SAE's levels aren't the right ones, merely the ones they chose to define since they're the standards body. A better set of levels are the seven I go into more detail about on my blog.
As far as distinguishing shadows on the road, that's what radar is for. Shadows on the road as seen by the vision system don't show up on radar as something the vehicle will run into.
Yes, silly using just cameras. I mean, humans have lidar sensors; that's why they can drive. Why didn't we just copy that... oh wait.
In all seriousness though, Tesla is producing Cybercabs now, which are a tenth the price of Waymo's vehicles and can drive autonomously anywhere in the world. I think we can see where this is going. (Hint: not well for Waymo.)
Also the article is speculative 'MicroVision says its sensor could one day break the $100 barrier'. One day...
Humans also don't have wheels, but we build objects with wheels. It is as if we can build objects that don't resemble humans for specific purposes. Crazy...
> Tesla are producing cyber cabs now which are 10th the price of Waymo's and can drive autonomously anywhere in the world.
Wait what? when did they actually enter mass production?
> I mean humans have Lidar sensors
Real-time SLAM is actually pretty good; the hard part is reliable object detection using just vision. Tesla's forward-facing cameras are effectively monocular, which means it's much, much harder to get depth (it's not impossible, but moving objects are much more difficult to observe if you only have cameras aligned on the same plane with no real parallax).
Ultimately Musk is right: you probably don't need lidar to drive safely, but it's far simpler and easier if you have lidar. It's also safer. Musk said "lidars are a crutch" not because he is some sort of genius; it's been obvious since the mid-'00s (if not earlier) that SLAM-only driving is the way forward. The reason he said it is that he thought he could save money by not having lidar. The problem for him is that he didn't do the research into how far machine perception still is from the last 1% of accuracy needed to make vision-only safe and reliable.
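The parallax point above is the standard stereo relation: with focal length f (in pixels) and baseline b, disparity d gives depth Z = f·b/d. Shrink the baseline toward zero (cameras effectively on the same axis) and the disparity for distant objects drops below a pixel, so depth becomes unobservable from a single frame. A toy sketch with illustrative numbers:

```python
# Stereo depth from disparity: Z = f * b / d. With a tiny baseline,
# disparity for distant objects falls below pixel noise, which is
# why near-monocular camera layouts give poor direct depth.
def depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

def disparity_px(focal_px: float, baseline_m: float, depth: float) -> float:
    return focal_px * baseline_m / depth

F = 1000.0  # focal length in pixels (illustrative)

# Human-like 6.5 cm baseline vs an almost-monocular 5 mm baseline,
# both looking at an object 50 m away:
print(disparity_px(F, 0.065, 50.0))  # 1.3 px
print(disparity_px(F, 0.005, 50.0))  # 0.1 px, well below pixel noise
```

At sub-pixel disparities the depth estimate's relative error blows up, so monocular systems have to fall back on motion parallax or learned depth priors instead.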
This is a weirdly tired counterpoint that Elon and Elonstans like to bandy about as if it's an apples to apples comparison. Humans have a weirdly ultra-high-dynamic-range binocular vision system mounted on an advanced ptz/swivel gimbal that allows for a great degree of freedom of movement, parallax effects, and a complex heuristic system for analyzing vision data.
The Tesla FSD system has... well, sure, a few more cameras, but they're low resolution, and in inconveniently fixed locations.
My alley has an occlusion at the corner where it connects to the main road: a very tall, very ample bush that basically makes it impossible to authoritatively check oncoming traffic to my left. I, a human, can determine, if I see the light flicker even slightly as it filters through the bushes, that the path is not clear: a car is likely causing that very slight change in light. My Tesla has no clue at all that that's happening. And worse, the perpendicular camera responsible for checking cross-traffic is mounted _behind my head_ on the b-pillar, in a fixed location that means that without nosing my car _into_ the travel lane, there is literally no way for it to be sure the path is clear.
This edge case is navigated near-perfectly by Waymo, since its roof-mounted lidar can see above and beyond the bush and determine that the path is clear. And to hit back on the "Tesla is making cheaper cars that can drive autonomously anywhere in the world": I mean, they still aren't? Not authoritatively. Not authoritatively enough that they aren't seeing all sorts of interventions in the few "driverless" trials they're doing in Austin. Not authoritatively enough when I let my own Tesla's FSD try. It works well enough on the fat part of the bell curve, but those edges will get you, and a vision-only system is extremely brittle in certain conditions and certain failure modes that a lidar/radar backup helps mitigate.
Moreover, Waymo has brought lidar development in-house, they're working to dramatically reduce their vehicle platform cost by reducing some redundant sensors, and they can now simulate a ground truth model of an absurd number of edge cases and odd scenarios, as well as simulate different conditions for real-world locations in parallel with their new world modeling systems.
None of which reads to me as "not going well for Waymo." Waymo completes over 450,000 fully autonomous rides per week right now. They're dramatically lowering their own barriers to new cities/geographies/conditions, and they're pushing down the cost per unit substantially. Yeah, it won't get to be as cheap as Tesla, which owns the entire means of production, but I'm still extremely bullish on Waymo being the frontrunner for autonomous driving for the foreseeable future.
Waymos are still making lots of errors that a human wouldn't (stopping in the middle of a road due to a puddle was a recent one: https://dmnews.co.uk/waymo-robotaxi-spotted-unable-to-cross-...). 17 years after betting on LIDAR, I think Tesla is ahead now in most respects. I could be wrong though; we will probably know by the end of this year.
There are SLAM cameras that only select "interesting" points, which are privacy preserving. They are also very low power.
Top 5 fines:
1 - Meta - Ireland - €1.2 billion
2 - Amazon Europe - Luxembourg - €746 million
3 - WhatsApp - Ireland - €225 million
4 - British Airways - UK - £183 million
5 - Google - France - €60 million
I wish every law barely got enforced this way.
[1] https://web.archive.org/web/20161013165833/http://content.us...
https://www.forbes.com/sites/bradtempleton/2025/03/17/youtub...
Radar is just cheaper than the equivalent set of cameras and compute; it's also not really a strict requirement.
Look at how the current cars fuck up: it's mostly navigation, context understanding, and tight manoeuvres. Lidar gives you very little in these areas.
Glad to see someone lowering the cost of this technology, and hope to see lots of engineers using this tech as a result.
We might even see a boom in LIDAR tech as a result
No no it's the cabal...
The drive was delightful and felt really safe. It handled the SF terrain, traffic and mixed traffic like trams very well.
I wouldn't trust a self-driving Tesla (or any camera-only system) though!
They might not use them for Autopilot, but maybe for some emergency-braking stuff, when everything else has failed.
Part of that is that humans are distractible, and their performance can be degraded in many ways, and that silicon thinks faster than meat.
But part of it is the sensor suite. Look at Waymo vs Tesla robotaxi accident rates.
My understanding is that cyber cabs still need safety drivers to operate, is that not the case?