Autonomous vehicles are still in the development stage, with Tesla the apparent leaders. It would be unreasonable to expect an entirely safe system just yet, and there have been fewer reported collisions than one might expect, but where does the blame really lie when things go wrong?
Don't fancy that at all ... all a bit 'Jetsons' to me. Having said that, the Docklands Light Railway is a bit like that on its own ... not sure I'd get on that if there were other DLR vehicles toe to heel!
2013 XF PL Polaris White with Light Grey/Charcoal Interior
Tesla have a get-out clause: they do state that the driver must keep their hands on the wheel and be ready to take over if the system goes wrong.
What I can't work out is why this driver used the autonomous mode on this stretch of road. The article states that the driver had previously used this mode on this stretch and found the car veering towards the concrete barrier. He had complained to Tesla about it, he had complained to his family about it - BUT despite knowing it was a bit dodgy, he persisted in using the autonomous mode on THAT bit of road (turning it on just a minute before the accident) and then seemingly wasn't ready to take over control when it malfunctioned again.
In short, this driver was a complete idiot and he paid the price for his stupidity. It's akin to sticking your hand in a fire and burning it, then going back and doing it time and time again to see if it still burns.