[LINK] Killer robots and military ethics

Craig Sanders cas at taz.net.au
Thu Jul 18 15:43:39 AEST 2013


On Thu, Jul 18, 2013 at 12:18:05PM +1000, Jan Whitaker wrote:
> At 12:07 PM 18/07/2013, Jim Birch wrote:
> > was that a robot might exercise a lot more restraint than a human
> > because it is not hormonally concerned about its own safety.
>
> ....or aggressive nature (testosterone/criminal gung ho) or
> territorial protection.

robots aren't anywhere near smart enough for that, and likely won't be
for many decades. they're not really smart at all, they're thoughtless
programmable devices. military robots would have good sensors (visual,
infrared, sound, etc) and weaponry.

restraint or aggression don't come into it. robot weapons do what
they're programmed to do. if they're programmed to shoot anyone who
comes into a certain area (say, without wearing a uniform with an
encrypted RFID tag), or fly over an area and shoot anyone carrying an
AK-47, then that is exactly what they'll do.
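to illustrate that point, here's a minimal sketch (a hypothetical rule set and tag whitelist, not any real system) of how a rule-based platform just applies its programmed predicate -- no restraint, no judgment, just the rule:

```python
def authorised(tag, valid_tags):
    """True if the detected RFID tag is on the whitelist."""
    return tag is not None and tag in valid_tags

def engage(detections, valid_tags):
    """Return the subset of detections the rule says to fire on.

    Each detection is a (label, rfid_tag) pair. The rule is exactly
    'shoot anyone in the area without a valid tag' -- which includes
    a civilian with no tag, or a broomstick the vision system has
    misclassified as a rifle.
    """
    return [d for d in detections if not authorised(d[1], valid_tags)]

detections = [
    ("soldier", "TAG-001"),         # friendly, valid tag: ignored
    ("civilian", None),             # no tag: rule says engage
    ("broomstick-as-rifle", None),  # misclassified object: still engaged
]
targets = engage(detections, {"TAG-001"})
```

the rule fires on the civilian and the misclassified broomstick exactly as written -- which is the whole problem.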

of course, robot vision isn't great at the moment. it's possible to
confuse a broomstick with a rifle.


one of the real dangers, IMO, is that it opens up the possibility
to blame massacres on "computer error" in an attempt to avoid
responsibility. which is bullshit, of course - a person decided to
deploy them, but it gives plausible deniability.

craig

ps: it wouldn't surprise me to see survivalists, gun-nuts, and even
criminal gangs experimenting with robot weaponry in the not too distant
future. controlling a gun with an arduino or similar might require more
mechanical engineering than controlling a heater or air-conditioning
system, but it isn't any more difficult...especially if you don't care
all that much about the reliability or accuracy of current computer
vision software.

-- 
craig sanders <cas at taz.net.au>

BOFH excuse #130:

new management
