Hacker News

> I fail to see the point of autopilot at all if you're supposed to be able to correct it at any instant in real-world driving conditions.

The cynic in me suggests we need autopilot as a testbed on the way to the holy grail of Level 5 autonomous vehicles.

The engineer in me fears that problem may be a tad too difficult to solve given existing infrastructure - that is, we'd probably need to retrofit roads with all sorts of sensors, beacons, and whatnot in order to help the vehicles travelling on them.



Road sensors ain't gonna fix the long tail of L5. We can't even keep roads maintained as is - like the crash attenuator that would have mitigated the fatality in the article.

Also, highway lane splits are very dangerous in general. It's a concrete spear with 70mph cars whizzing right towards it. Around here, they just use barrels of material, sand I believe. Somebody crashes into one, they clean the wreck, and lug out some more sand barrels. Easy and quick.


It isn't the SOLE action for L5 to be feasible, but I believe it is a REQUIRED action. (Emphasis added not to insinuate you'd need it, but rather to show, well, my emphasis. :))

For the foreseeable future, there are simply too many variables outside autopilot manufacturers' control; I cannot see how car-borne sensors alone will be able to provide the level of confidence needed to do L5 safely.

Oh, and a mix of self-driving vehicles and ones piloted by bipedal, carbon-based drivers does not do anything to make it simpler, as those bipedal, carbon-based drivers tend to do unpredictable things every now and then. It'll probably be easier when (if) all cars are L5.


I see this stated often, that humans are unpredictable drivers. What's the proof that automated systems will be predictable? They, too, will be dealing with a huge number of variables, and trying to interpret things like intent.


Yes, automated systems will also do unpredictable things. The point I was (poorly, as it were) trying to make was that the mix of autopilots and humans is likely to create new problems. Without being able to dig it out now, I remember a study which found that humans had trouble interacting with autonomous vehicles because the latter never fudged their way through traffic like a human would: approaching a traffic light and seeing it turn yellow, an autonomous vehicle would come to a hard stop, whereas a human driver would likely just scoot through the intersection. As a result, autonomous vehicles got rear-ended much more frequently than normal ones.
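That yellow-light tuning gap can be caricatured as a stopping-distance check. A made-up sketch - the function name and deceleration figures are mine, not from any real controller:

```python
def brake_or_go(speed_mps, dist_to_line_m, max_decel=3.0):
    """Return "stop" if the car can halt before the stop line at the
    given deceleration limit, else "go" (continue through on yellow)."""
    stopping_dist = speed_mps ** 2 / (2 * max_decel)  # v^2 / (2a)
    return "stop" if stopping_dist <= dist_to_line_m else "go"

# At 20 m/s (~45 mph), 40 m from the stop line:
print(brake_or_go(20.0, 40.0))                 # human-comfortable 3 m/s^2: "go"
print(brake_or_go(20.0, 40.0, max_decel=6.0))  # machine willing to brake at 6 m/s^2: "stop"
```

The two calls disagree exactly in the zone where the rear-endings happen: the automated car slams the brakes where the trailing human expected everyone to roll through.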

So - humans need to adapt to new behaviour from other vehicles on the road.

When ALL vehicles are L5, though, they (hopefully) will all obey the same rules and be able to communicate intent and negotiate who goes where when /prior/ to occupying the same space at the same time...
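That kind of negotiation could in principle look like reservation-based intersection management: each vehicle requests a time slot in the conflict zone, and an arbiter grants the earliest non-overlapping start. A toy sketch, loosely inspired by published reservation-based schemes - the class, method names, and protocol details here are all hypothetical:

```python
class IntersectionManager:
    """Toy slot-negotiation arbiter for a shared conflict zone."""

    def __init__(self):
        self.reservations = []  # list of (start, end) time windows

    def request(self, vehicle_id, start, duration):
        """Grant the requested start time, or the earliest later start
        that does not overlap an existing reservation."""
        t = start
        for s, e in sorted(self.reservations):
            if t + duration <= s:
                break       # fits entirely before this reservation
            if t < e:
                t = e       # shift past the conflicting slot
        self.reservations.append((t, t + duration))
        return vehicle_id, t

mgr = IntersectionManager()
print(mgr.request("A", 0.0, 2.0))  # ("A", 0.0) - zone is free
print(mgr.request("B", 1.0, 2.0))  # ("B", 2.0) - pushed past A's slot
print(mgr.request("C", 5.0, 1.0))  # ("C", 5.0) - no conflict
```

The point of the sketch is the negotiation itself: B doesn't guess at A's intent, it is told /prior/ to arrival when the space will be free.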


I think that unless a single form of AI is dictated for all vehicles, we can't safely make the assumption that autonomous vehicles will obey the same rules. Hell, we can't even get computers to obey the same rules now, either programmatically or at a physical level.


That is a very valid point.

And, of course, they should all obey the same rules - traffic regulations being one part, but also how they handle the unexpected. It would be a tough sell for a manufacturer whose vehicles would rather damage themselves than other objects in the vicinity in the event of a pending collision, if other manufacturers didn't follow suit...

Autonomous Mad Max-style vehicles probably aren't a good thing. :/



