Liability should always lie with the self-driving car. In the real world, weird shit happens on roads; people adapt and are expected to adapt. What good is a car that I can't trust to be at least as reliable as an overly defensive version of myself?
A much higher percentage of autopilot cars had fatal collisions at that location than non-autopilot cars. That’s cause for concern. And for all we know the previous driver was distracted. Being not-worse than a distracted human isn’t very confidence-inspiring.
Yes, because responsibility is not a zero-sum game. If the system cannot cope well with sub-optimal roads, it should not be in use wherever one might encounter sub-optimal roads.
Responsibility being a zero-sum game is literally what a lot of these conversations and the emerging laws are about. If users are making a mistake often enough to be noticeable, then the system and its interface are encouraging that behavior. If the autopilot is being used in ill-advised situations, then the question becomes why the system wasn't more comprehensible to the user; if it had been, they wouldn't have made a bad judgement, because they would have understood the limitations of the system. This is just another instance of blaming users for bad design.