
I believe he's alluding to the nature of the accidents. They're high-intensity events that are more likely to be fatal (speeds were ~70 mph). When Autopilot fails, these aren't fender benders.


But isn't that exactly what Autopilot is used for? High-speed highway traffic? I don't trust AI cars yet, but I'd like to know whether my instinct is correct on this, or whether it's just my natural inclination to avoid unfamiliar tech.

These are high-risk situations. If Autopilot is "failing hard" at a rate equal to or higher than normal driving, then that would be good to demonstrate with stats. Guessing Tesla doesn't really release that info?


An autopilot that drives above the speed limit ought to make its scope clear to you.

Still, it seems like people treat autopilot like auto-drive and die as a result.

That's not a tech fail, imho.



