[LINK] U.S. opens investigation into Tesla after fatal crash in Autopilot mode
Kim Holburn
kim at holburn.net
Sun Jul 10 12:04:00 AEST 2016
The trouble is that any system that takes over enough of the burden of staying alert while driving that you can stop concentrating is a recipe for disaster. Either it has to take over driving completely or not at all. If you’re letting it drive, especially on a long drive, you can’t suddenly stop watching Harry Potter and be aware of the road in time to understand the situation and make decisions.
> On 2016/Jul/10, at 11:11 AM, David Boxall <linkdb at boxall.name> wrote:
>
> On 1/07/2016 12:23 PM, Jim Birch wrote:
>> Tesla said this is the first fatality in 130 million miles of autopilot, ...
>
> <http://www.iflscience.com/technology/second-tesla-crash-raises-unnecessary-questions-about-selfdriving-cars/>
>> Brown’s crash, for example, is the first fatality in 210 million kilometers (130 million miles) driven on Tesla’s autopilot mode. The average for human drivers in the US is 160 million kilometers (100 million miles).
>
> A bit of background:
> <http://www.nytimes.com/interactive/2016/07/01/business/inside-tesla-accident.html>
> The Tesla going under a semi that turned in front of it would have been messy, to say the least.
>
> --
> David Boxall | Drink no longer water,
> | but use a little wine
> http://david.boxall.id.au | for thy stomach's sake ...
> | King James Bible
> | 1 Timothy 5:23
> _______________________________________________
> Link mailing list
> Link at mailman.anu.edu.au
> http://mailman.anu.edu.au/mailman/listinfo/link
--
Kim Holburn
IT Network & Security Consultant
T: +61 2 61402408 M: +61 404072753
mailto:kim at holburn.net aim://kimholburn
skype://kholburn - PGP Public Key on request