
The latter is a problem with LIDAR as well. In any system, you're going to need sensor fusion to get a robust result.


With LIDAR you would at least know that there's missing data, and the system could react accordingly (e.g. sound an alert or stop when the road is no longer detected).

The problem with camera-based systems is that they don't know that they are missing data, and the system just thinks the path is clear...


Why shouldn't a camera-based system be able to do that? Dirt, for example, won't easily match up in a stereo vision system, and even then, dirt on a lens has a pretty specific look.
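To make that concrete, here's a toy sketch of the stereo-mismatch idea (function name, threshold, and disparity range are all made up for illustration): a blob that exists on only one lens matches nothing in the other view at any tested disparity, so its matching error stays high and it can be flagged as a likely occlusion.

```python
import numpy as np

def occlusion_mask(left, right, max_disparity=8, threshold=0.2):
    """Flag pixels in the left image that match nothing in the right
    image at any tested disparity -- a crude cue for dirt on one lens.
    (Hypothetical helper; real pipelines use block matching, e.g. SGM.)"""
    h, w = left.shape
    best_err = np.full((h, w), np.inf)
    for d in range(max_disparity + 1):
        # shift the right image by d pixels and compare per-pixel
        err = np.full((h, w), np.inf)
        err[:, d:] = np.abs(left[:, d:] - right[:, :w - d])
        best_err = np.minimum(best_err, err)
    return best_err > threshold

# synthetic scene: a flat horizontal gradient both cameras can see
x = np.tile(np.linspace(0, 1, 64), (64, 1))
left, right = x.copy(), x.copy()
# simulated "dirt" blob on the left lens only
left[20:30, 20:30] = 1.0

mask = occlusion_mask(left, right)
# the blob is flagged; clean regions are not
```

Real systems would of course use proper block matching and temporal consistency rather than raw pixel differences, but the principle is the same: dirt on one lens produces a patch with no plausible correspondence.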


Do you notice when a butterfly covers your eye? Without LIDAR?



