[LINK] Robot cars and the fear gap
Brendan
brendansweb at optusnet.com.au
Thu Jul 21 17:46:57 AEST 2016
On 07/21/2016 05:20 PM, andrew clarke wrote:
> On Thu 2016-07-21 16:59:02 UTC+1000, Jim Birch (planetjim at gmail.com) wrote:
>
>>> Do you suppose Tesla will be required to make their source code
>>> available for scrutiny if things get to court?
>>
>>
>> Do you suppose that anyone could understand it? A multilayer neural
>> network is essentially a black box. Presumably Tesla's cars have a bunch
>> of virtual neural networks in their system. It's the training that makes
>> this kind of system work. It might be possible to assess the quality of
>> the training applied to the system, but analysis in the old sense is not
>> going to work.
>
> "[Potentially] nobody will understand it" seems to me like a poor
> rationale for not making the code available to independent auditors to
> scrutinise.
While access to the source code would be good, I'm not sure it's all that relevant in court. If the car crashes when it ought not to have, the manufacturer should be liable; the reason why it crashed is beside the point.
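
To make the black-box point above concrete, a minimal sketch (hypothetical code, not Tesla's actual system): the readable part of a neural-network controller is only a few lines of arithmetic, and what the system actually does is determined by the trained weight values, which can't be read the way ordinary program logic can.

import numpy as np

# Hypothetical illustration only: a tiny two-layer network standing in for
# part of a driving controller. The source is trivially readable, but the
# behaviour lives entirely in the learned weight matrices W1 and W2.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 16))   # in a real system: millions of trained values
W2 = rng.normal(size=(16, 1))

def brake_decision(sensor_inputs):
    """Map raw sensor readings to a brake/no-brake score."""
    hidden = np.maximum(sensor_inputs @ W1, 0.0)   # ReLU layer
    return hidden @ W2                             # output score

# Auditing this "source code" reveals the architecture, but whether the car
# brakes for a given obstacle depends on the numbers in W1 and W2 produced
# by training, not on any logic a reviewer can trace by reading the code.
print(brake_decision(np.array([0.2, 1.0, 0.0, 0.5])))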