
Actually, one of the biggest obstacles to Level 5 is the expectation of zero deaths for Level 5 autonomous operations. The standard should be fewer deaths than human-operated (or even auto-assisted) cars.

When you have thousands of machines traveling in close proximity at speeds exceeding 50 mph, there will be deaths; this is unavoidable. We need to reduce those as much as possible, but demanding ZERO before the technology can be used is just unrealistic.

That said, just because some people are working toward Level 5 does not mean all of the other things you are asking for are not also being worked on; it is not a zero-sum game. There are enough people that we can have teams working on both.

This complaint is repeated for everything: "Well, if people were not working on X drug that I don't care about, they could cure cancer."

We can have better braking systems, better frames, etc., AND still try to achieve Level 5 autonomous driving. It is not an either-or proposition.



One of the problems with merely "fewer deaths than a human-operated car" is that technology tends to fail in 'silly' ways.

Also, at the very least, a self-driving car should reach the level of a good driver; having self-driving cars cause as many deaths as drunk or inattentive drivers do today isn't defensible. Especially since there's usually no explanation and nobody to hold accountable.


> technology tends to fail in 'silly' ways

And a scenario we can easily imagine is that a buggy update goes out to the whole fleet overnight that starts killing people all over the place.

The common case of accidents being on par with manual human driving goes out the window until the software is rolled back, and for 12 hours, 24 hours, however long, we get a number of deaths that far outpaces what humans are capable of. The "worst case" would never apply to a manual/human population as a whole, all at once.


This might be one of the few cases where it'd be better not to have every device on the latest update all the time.
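One way this idea usually gets operationalized is a staged (canary) rollout: expose a small slice of the fleet first and halt if an incident metric spikes, so a bad build never reaches everyone at once. A minimal sketch, with all names, stages, and thresholds invented for illustration:

```python
# Hypothetical staged-rollout sketch (all numbers and names invented).
# Push an update to a small fraction of the fleet first, watch an
# incident-rate metric, and halt before the whole fleet is exposed.

def staged_rollout(incident_rate_of, stages=(0.01, 0.05, 0.25, 1.0),
                   max_rate=0.001):
    """Return the fraction of the fleet updated before a halt (1.0 = full)."""
    updated = 0.0
    for frac in stages:
        if incident_rate_of(frac) > max_rate:
            return updated  # halt: the rest of the fleet keeps the old build
        updated = frac
    return updated

# A buggy build showing elevated incidents at any exposure level:
buggy = lambda frac: 0.01
# A healthy build:
healthy = lambda frac: 0.0001

print(staged_rollout(buggy))    # halted before any stage completed: 0.0
print(staged_rollout(healthy))  # fully rolled out: 1.0
```

The point is only that the blast radius of the overnight-bad-update scenario is bounded by the first canary stage, not the whole fleet.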


>nobody to hold accountable

Well, one of the issues is that someone more or less has to be held accountable. And that someone pretty much has to be the manufacturer. No one is going to hand over full control to a vehicle and accept the responsibility if that vehicle commits vehicular manslaughter because "software isn't perfect."

It's actually an interesting legal situation because, other than maybe drug side effects, there aren't a whole lot of consumer products that, properly used and maintained, sometimes randomly kill people, and we're OK with that because sometimes stuff just happens.


What about simple speeding tickets? According to Wikipedia[1], Tesla Autopilot's max speed is a whopping 90 mph!

Who's responsible if you get pulled over for going 75 in a 65 mph zone?

[1] https://en.wikipedia.org/wiki/Tesla_Autopilot


>The standard should be fewer deaths than human-operated (or even auto assisted ) cars.

How do we go about testing this? By tallying up autonomous deaths until there are fewer per year than human drivers?

>We need to reduce those as much as possible but to demand ZERO before the technology can be used is just unrealistic

Human driver skill varies immensely by person. Anyone who is (or even considers themselves to be) a "good" driver will never accept "average death rates" as a risk when getting in an autonomous car. I know I wouldn't.

The goal has to be zero or it will never be accepted by the public.


Everyone thinks they are a "good" driver, even the person weaving in and out of lanes, even the person whose car has 500 dents all over it from previous impacts that were all caused by "other bad drivers, not me."

What will happen is that human-controlled cars will become $$$$$$ to insure once Level 5 is better than humans. At that point, if you can afford it, sure, you can reject it, but get out your wallet.


Why should insurance cost more than it does today? Unless you're arguing that safety systems in other vehicles make driving one without those systems more dangerous.


Insurance is about risk pools. If Level 5 becomes a reality, the relative risk of a human driver will go up, and as more and more people adopt Level 5 (which they will, contrary to what people on here think), the number of human drivers to spread that risk over will go down. A small risk pool with an increasing amount of risk means higher premiums.
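The arithmetic behind that argument is easy to sketch. Assuming a break-even premium of (expected claims + fixed overhead) / pool size, with all numbers below invented for illustration:

```python
# Hypothetical illustration of the risk-pool argument (all numbers invented).
# As the human-driver pool shrinks and the remaining drivers skew riskier,
# the break-even premium per driver rises.

def premium(pool_size, avg_claim_cost, claim_rate, fixed_overhead):
    """Break-even annual premium per driver."""
    expected_claims = pool_size * claim_rate * avg_claim_cost
    return (expected_claims + fixed_overhead) / pool_size

# Today: a large pool of mixed-skill human drivers.
today = premium(pool_size=1_000_000, avg_claim_cost=20_000,
                claim_rate=0.05, fixed_overhead=50_000_000)

# Later: most drivers have switched to Level 5; the holdouts are fewer
# and (by adverse selection) riskier on average.
later = premium(pool_size=100_000, avg_claim_cost=20_000,
                claim_rate=0.08, fixed_overhead=50_000_000)

print(round(today), round(later))  # 1050 2100
```

Even with these made-up numbers, shrinking the pool 10x and nudging the claim rate up doubles the premium: the fixed overhead is spread over fewer drivers and each remaining driver is costlier to cover.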


It is laughable that you think the goal wasn't already zero. It has been zero the whole time.

Real life and ideals are different things. You can't promise that accidents will never happen, but you can promise that accidents will be substantially reduced.

In the US, about 35k people die per year in motor-vehicle-related deaths. If you got it down to 10k, that would be a major success. Of course, you would keep fine-tuning until you got below 1,000 and as close to 0 as possible.


> In the US, about 35k people die per year from motor vehicle related deaths. If you get it down to 10k, then that would be a major success.

If we were actually serious about reducing motor vehicle deaths, we would mandate that every car be equipped with a breathalyzer device. No fancy new technology is necessary, and there's plenty of low-hanging fruit (impaired driving) that we can deal with.

For some reason, though, the religion of autonomous driving does not consider this as a solution to minimizing road fatalities.


To take this further: every year the numbers are released, they show that of those ~35k motor-vehicle-related deaths, ~20k are alcohol-related.

On average, humans are actually pretty good at not dying in motor vehicle related accidents - or avoiding them altogether, given the sheer number of miles traveled per day in the US.

That, however, just isn't the narrative self-driving followers want everyone to know.


The goal is always zero, but we all know that will never happen. Nothing is perfect. And assistance technologies may be more dangerous than Level 5 because we physically cannot maintain concentration when few actions or decisions are required from us. Some studies even indicate manual transmissions make us safer, possibly for that reason.

When Level 5 is available, and it will be, because forever is a long time, good drivers shouldn't have to buy those cars, assuming money is still a thing. Insurance companies may start pushing people financially toward self-drive-only cars by hiking rates on humans, but if you give up your privacy with a driving monitor and they assess that you are safer, they would probably rather you drive and waive the hike.


It isn't hard to show an order of magnitude fewer deaths even with non-zero deaths. At that point, expect governments to mandate Level 5 on all cars.



