Continental, the big European auto parts company, has a practical solid state LIDAR.[1] It's the Advanced Scientific Concepts flash LIDAR technology, which works well but cost too much when ASC was making it by hand in Santa Barbara, CA. What they don't have is a car company ready to buy enough units to make volume production worthwhile. In the meantime, they're selling a few units to commercial drone operators.[2]
If I hadn't seen an Animats comment here, I would have copy pasted it from your history just to beat you to it.
I checked out of curiosity -- you've got at least 35 posts that directly reference Continental and their Flash LIDAR based on tech from ASC that they're going to build in volume eventually. It's a valid analysis, but why post it every time LIDAR is referenced?
I don't have any connection with Continental or ASC. I did see ASC's flash LIDAR prototype back in 2004 when it was on an optical bench. I thought back then they had the right idea. It cost too much in small quantities. It's made in an IC fab, so cost should come down with volume.
Leddar ships a flash LIDAR, but it's 16x1 pixels. Useful for near-obstacle detection, but not enough to create a point cloud. Quanergy seems to have problems, but they have at least demoed units.[1] ASC has a 128x128 pixel unit, expensive but good. Those are the main flash LIDAR players.
MEMS mirror scanning looked promising. It worked for video projectors. But it has problems in the automotive environment, apparently.[2]
Liquid crystal beam steering goes back to at least 2004.[3]
The mechanical scanning people have real products, but the ones that can rapidly collect enough data for a point cloud still cost too much.
So Lumotive isn't that novel. The video on the Lumotive site is a render, not a demo.[4] If they had a price, a "buy" button, and reviews, now that would be progress. The vaporware to product ratio in this industry is far too high.
Somebody needs to get this right so we can have vehicle anticollision systems that profile the ground and don't plow into stationary objects.
I find it odd to state that ASC's lidars work, and just cost too much until there is sufficient volume, but then turn around and say that the pulsed lidar groups with working products (that work better than flash lidar, in fact) are too expensive. Presumably the same applies to them. ASC's focal plane arrays may be ICs that scale well, but they still need lasers, electronics, housings, lenses, and more. Just like everyone else.
It also seems odd to state that Lumotive isn't novel. The fact that people have made SLM-based phased-array beam steering is hardly germane to Lumotive being able to put together a robust, wide-aperture, high-speed, high-resolution beam-steering technology, let alone a complete lidar. There is simply a lot more to getting this stuff to work than just reading some paper from 2004 about SLM-based steering in the lab.
Also, no lidar OEMs have a "buy" button. You're suggesting that this is because it's all a scam? I think it's because they don't need or want to sell to you. They are all out making partnerships with large vendors. I'll grant that some companies do appear to have vaporware products (most famously Quanergy's solid state product), but the reason there are many companies working at it is that there is a need and lots of commercial potential. The commercial potential in this case comes from the automotive industry, so there is basically zero incentive to sell at the consumer level. Not even Velodyne, which definitely has real products, sells over the web. Even Ouster, which prides itself on being the "available now" high-performance lidar company, doesn't have a "buy" button. Perhaps if you email their sales people and then send them $4k they will send you a unit, but the business model isn't sustained on individual sales.
Finally, the Continental lidar you linked has pretty bad angular resolution (~1 degree) and no stated range. The latter is probably because flash lidar is at a fundamental disadvantage to scanned lidar: instead of all the photons going to one place, they go everywhere. This scales very badly with range, so flash units will be power (SNR) limited. The only way to overcome this would be correspondingly more sensitive detectors, which I do not believe they have. Even if they gang together many small flash lidars that each look at a narrow FOV, those lidars would probably have to be close to one pixel wide to compete on SNR while staying eye safe, in which case you lose the "scales on a chip" economics. This might be one reason why Ouster still spins a pixel-wide array rather than strobing many.
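The photon-budget argument above can be sketched with a toy radiometric model. All numbers below (power, pulse length, receiver aperture, reflectivity) are illustrative placeholders, not specs from any datasheet; the point is only the ratio between the two configurations:

```python
import math

def photons_per_pixel(tx_power_w, pulse_s, wavelength_m, range_m,
                      n_illuminated_pixels, aperture_m2, reflectivity=0.1):
    """Photons returned to one detector pixel from a diffuse target."""
    h, c = 6.626e-34, 3.0e8
    photon_energy_j = h * c / wavelength_m
    tx_photons = tx_power_w * pulse_s / photon_energy_j
    # Flash splits the pulse across every illuminated pixel at once;
    # a scanned beam puts it all into one (n_illuminated_pixels = 1).
    per_pixel = tx_photons / n_illuminated_pixels
    # The target scatters into a hemisphere; the receiver captures only
    # a small solid angle, shrinking with the square of the range.
    capture = aperture_m2 / (2 * math.pi * range_m**2)
    return per_pixel * reflectivity * capture

scanned = photons_per_pixel(100, 5e-9, 905e-9, 100, 1, 1e-4)
flash   = photons_per_pixel(100, 5e-9, 905e-9, 100, 128 * 128, 1e-4)
print(scanned / flash)  # flash gets 128*128 = 16,384x fewer photons per pixel
```

Both configurations suffer the same 1/R² capture loss; the flash unit additionally divides its pulse across the whole 128x128 array, which is the fundamental disadvantage the comment describes.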
I always paste a link to Ignition! in every rocket engine thread, and there are always people surprised that such a book has been written. Relevant: https://xkcd.com/1053/
Tesla justly deserves all of the negative criticism they get.
Their marketing is fraudulent and their lax approach to safety is going to kill people and directly harms the overall industry. If you care about self driving cars you should be criticising Tesla at every opportunity.
No, they don't. Flash LIDAR suffers from too little range: if you want an angular resolution of, say, 0.01 degrees, you're looking at 36 million points! That means you need 36 million times as much energy to match the SNR of a conventional scanned LIDAR.
Even with 1550 nm light and single-photon avalanche diodes (SPADs), this is clearly infeasible. Even Princeton Lightwave (now bought by Ford/Argo), arguably the fathers of modern 1550 nm/SPAD flash LIDAR, have moved to a hybrid configuration for autonomous vehicles.
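The arithmetic behind a figure like 36 million can be sketched as follows. The 60° x 60° field of view is an assumed value, since the comment doesn't state one:

```python
# Pixel count for a 60 x 60 degree field of view at 0.01 degree resolution.
points_per_degree = 100            # 1 / 0.01 degrees
h_points = 60 * points_per_degree  # 6,000 columns
v_points = 60 * points_per_degree  # 6,000 rows
total = h_points * v_points
print(total)                       # 36,000,000 points

# A scanned lidar concentrates each pulse into one point; a flash unit
# illuminating the whole scene at once therefore needs ~total times the
# pulse energy to deliver the same photons per point.
energy_factor = total
```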
128x128 is as far as ASC got. There's a tradeoff between range and beam width for flash LIDAR. The idea is to get the price down, so you can have multiple wide-angle short range ones around the perimeter of the vehicle, and a narrow beam long range one pointed ahead.
The brutal thing about flash is that it doesn't matter how many pixels you collect; the transmitted light still spreads out over the whole field of view. Multipoint flash like Ouster's is better in this respect.
The output side can be spread out to allow for higher laser power. The safety issue is how much energy goes through a hole about 0.25" in diameter, the size of the human eye, at the closest you can get to the emitter. If you spread out the outgoing beam, this is less of a problem.
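That geometric argument can be put in rough numbers, assuming a uniform (top-hat) beam profile and the 0.25 in (~6.35 mm) aperture mentioned above. The beam diameters are illustrative values:

```python
def pupil_fraction(beam_diameter_m, pupil_diameter_m=6.35e-3):
    """Fraction of a uniform beam's energy that fits through an
    eye-sized aperture (~0.25 in / 6.35 mm)."""
    if beam_diameter_m <= pupil_diameter_m:
        return 1.0  # the entire beam can enter the eye
    # For a top-hat beam, the captured fraction is the area ratio.
    return (pupil_diameter_m / beam_diameter_m) ** 2

print(pupil_fraction(0.005))  # 5 mm beam: all of the energy through the pupil
print(pupil_fraction(0.10))   # spread over 10 cm: only ~0.4% enters the eye
```

Spreading the emitter over a wider aperture thus buys roughly the square of the diameter ratio in allowable transmit energy at the same eye exposure.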
Ouster is interesting. But it's not a no moving parts system. It rotates. It's like Velodyne's spinners, but with more integration of the lasers and sensors onto single ICs.
I'm not sure exactly what you're looking for, but if you Google "SPAD array" you'll find plenty of results. I'm sure Ouster would be willing to sell you one, too.
I can't speak to the profitability of competitors like Velodyne and their existing lidar tech, but isn't it possible that the risk of waiting for sales volume to reach a certain threshold, thereby creating opportunities for other companies to bring their tech up to the same standard, outweighs the risk of selling units below cost to capture market share in the short term?
If more of these companies had real products, I'd expect them to at least be selling development kits. There are lots of industrial applications at the $1000 price point.
You're right; any real competitor would at least be sending out dev kits if they were road ready.
I guess what I'm saying is that every day Continental delays is a day closer those companies get to having a product that's dev-kit ready, and thus a day closer to Continental bleeding market share to them.
Resolution of the one listed at Continental is about 9x lower than what Lumotive is claiming, but at about double the sampling rate; though flash acquisition has advantages over scanning acquisition.
> The device can thus see far without having to turn up the brightness. That’s important because the sensor works at 905 nanometers, an eye-sensitive wavelength the company chose because it works with silicon. You need exotic compound semiconductors to make and detect laser light at 1550 nm, a wavelength that’s easier on the eyes
905 nm is infrared, outside the visible range for humans. If someone made a LIDAR at that wavelength, and did "turn up the brightness", what kind of damage would it do to people? Is the danger increased because it is not visible, so you don't have a clue that you are looking at something dangerously bright?
> The retina does not respond to the 905-nm infrared light used in current car lidars, so we can’t see it. But the eye transmits 905 nm to the retina, so it’s subject to the same restrictions as visible light. In fact, it’s even more hazardous because the eye cannot automatically turn away from a bright source that the retina can’t sense.
The eye is opaque to 1550 nm light, and so even if you are looking right at it none of it makes it to the retina.
Which isn't to say that 1550 nm light is safer -- the mode of destruction is just different, e.g. corneal damage leading to cataracts or surface burns rather than retinal damage.
As I understand it, it’s largely a focusing issue. Close to the visible range, a distant point source gets focused to a point on the retina. At 1550nm, the energy is deposited uniformly on the cornea unless the laser is focused to a tiny spot on the cornea.
edit: I don’t know to what extent this is relevant, but humans can regenerate the corneal epithelium. Think about all the sand you’ve gotten in your eye as a kid, and the fact that it probably didn’t accumulate enough damage to blind you.
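The focusing hazard can be put in rough numbers. The ~7 mm pupil and ~20 µm retinal spot are assumed order-of-magnitude values, not measured figures:

```python
# A collimated near-IR beam entering the pupil gets focused by the eye
# to a tiny spot on the retina, concentrating the irradiance enormously.
pupil_d_m = 7e-3          # fully dilated pupil diameter (assumed)
retinal_spot_d_m = 20e-6  # focused spot diameter (assumed order of magnitude)

concentration = (pupil_d_m / retinal_spot_d_m) ** 2
print(round(concentration))  # irradiance gain on the order of 100,000x

# At 1550 nm the cornea and lens absorb the light before it is focused,
# so this gain never applies: the energy stays spread over the cornea.
```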
Typically you select a class of laser safe for the environment it will be used in. Class 1, which poses no eye-damage risk under normal operation, is typical of LIDAR systems for this reason.
The lasers in these things are often not class 1, but the whole system is. Perhaps it's fairer to say that laser classification is not just about the power of the laser, it's about how it's used and under what conditions. Class 1 basically means that it's safe under normal operation. You can look at the certifications for scanning stations (e.g. Faro put theirs online) and they usually get round it by specifying that the system is spinning, so you don't get hit with a laser that often, and that you're standing far enough away that the divergence is large.
If you somehow fixed the laser and stared into it, it would probably hurt.
According to the test report, if the mirror fails, you would hit dangerous levels of deposited energy within 0.2 s, and they recommend that the laser be turned off within 100 ms to avoid risk of eye damage.
IV was explicitly founded as a patent troll. I find their processes so nefarious that I avoid talking to their employees when I run into them at conferences.
I don’t know anything about IV or their business practices, but it sounds like in this case they are funding research in order to patent useful inventions that they could then presumably license to manufacturers. Doesn’t sound like patent trolling.
IV has always had a few things like this; google "mosquito laser". It always seemed like they wanted just enough of them that a few people wouldn't think they're trolls. But they're trolls.
Huh?! Your description basically accuses many universities and VCs of being patent trolls, as they develop/fund and then license technology but don't productise it...
Patent trolling is a red herring - just a form of economic specialization. Selling your patents to a troll or hiring an attorney to enforce them for you are just different labels for the same transaction.
It's just that, in software anyway, many more things are patentable than ought to be. And arguably the whole concept behind patent law has proved unnecessary in our field: "good" actors deduplicate effort and share mutually beneficial IP via the open source community.
Someone in software using the patent system at all is probably acting in bad faith, or else building a moat to defend themselves from bad-faith actors.
Not just that, but even if it turns out that slog is decades long, those miners will still need those shovels. It’s brilliant, because barring a sudden implosion of SDVA research, this is a guaranteed income stream independent of the success of SDV tech.
Please correct me if I am wrong - I am under the impression that camera-based systems were getting good enough to make lidar moot, at least for self-driving car applications.
Assuming this comment is in good faith - I have so far only seen Tesla make that claim. And they are also the only ones trying to do self driving without LIDAR.
Disclaimer: If I was a betting man, I'd short the hell out of TSLA.
They're definitely not the only ones; I saw a presentation on a university team working on this just yesterday. After all, vision is the only system we know can be used to drive a vehicle, since that's what we've been using all along. Also, Teslas do have radar to detect the most important obstacles in front.
No self driving program is relying entirely on camera based systems otherwise they would struggle at night. Tesla for example is augmenting it with radar.
I am not an expert, but I wouldn't trust a camera-only solution, purely because you could never keep the lens completely clean, so the car could mistake dust for an object.
With LIDAR you would at least know that there's missing data, and the system could react accordingly (eg. sound an alert or stop when the road is no longer detected)
The problem with camera based systems is that they don't know that they are missing data, and the system just thinks the path is clear...
Why shouldn't a camera-based system be able to do that? Dirt, for example, won't easily match up in a stereo vision system, and even then, dirt on a lens has a pretty specific look.
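One concrete way a stereo system can notice that something (like dirt on one lens) doesn't match between views is a left-right disparity consistency check. This is a minimal sketch on synthetic 1-D disparity rows, not production stereo code:

```python
import numpy as np

def lr_consistency(disp_left, disp_right, tol=1):
    """Mark pixels whose left- and right-image disparity estimates agree."""
    w = disp_left.shape[0]
    valid = np.zeros(w, dtype=bool)
    for x in range(w):
        d = disp_left[x]
        xr = x - d                       # matching column in the right image
        if 0 <= xr < w and abs(disp_right[xr] - d) <= tol:
            valid[x] = True
    return valid

# True disparity is 2 everywhere; pixel 3 is corrupted (e.g. dirt visible
# only in the left image produced a bogus disparity of 0).
disp_l = np.array([2, 2, 2, 0, 2, 2])
disp_r = np.array([2, 2, 2, 2, 2, 2])
flags = lr_consistency(disp_l, disp_r)
# Pixels 0-1 fail only because they map off the image edge; pixel 3 fails
# the actual consistency test and can be treated as "unknown" rather than
# silently assumed clear.
print(flags)
```

This is exactly the "know that you're missing data" property the LIDAR comment above asks for: inconsistent pixels are flagged instead of fed to the planner as valid depth.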
If these are the same engineers that have moved on to writing Python because you can get fairly close to math-y notation, well... Let's just say it can't be a thing.
[1] https://www.continental-automotive.com/en-gl/Passenger-Cars/...
[2] https://brashtech.com/data-capture-and-fusion