
Ban 'killer robots,' rights group urges

21 Comments

The requested article has expired and is no longer available. Any related articles and user comments are shown below.


And it begins... Skynet!

1 ( +2 / -1 )

Ban drones at the same time.

-1 ( +0 / -1 )

What happened to Asimov's laws? "A robot may not harm a human, or through inaction allow a human to come to harm" (more or less paraphrased).

0 ( +0 / -0 )

I think there is no question that, as AI keeps improving, these machines will come. Not like in Hollywood movies, but in the form of military machines.

For example, it is only a small step from a remote-controlled killer drone (which is in danger of being hijacked, as happened recently in Iran) to a fully autonomous "smart" drone.

And "dumb" machines that kill have existed for a long time, for example in form of land mines. So the philosophical argument is moot.

1 ( +2 / -1 )

Agree that this is the beginning of a process that ends with Arnie appearing naked in the middle of an electric vortex and then offing lots of people.

0 ( +0 / -0 )

There's already a Cyberdyne Inc. based locally in Tsukuba.

0 ( +0 / -0 )

@Frank Vaughn

They're more worried about when the machine ends up with a defect, like ignoring the three laws of robotics.

1. A robot cannot harm a human, or through its inaction allow a human to come to harm.
2. A robot must follow a human's orders, unless they conflict with Law 1.
3. A robot must protect itself, unless doing so conflicts with Law 1 or Law 2.

Also, how smart do these robots need to be? If Robot A were a medical assistant, would Robot A know that Human A has a latex allergy that could kill them? If Human B tells Robot A to rub Human A with latex, will Robot A know that such an action will kill that human?

Either way, humans throw those laws away anyway. We want robots to kill the humans we want them to kill. We want robots to follow only their owner's commands, not every human's commands. And we want robots to die for us while carrying out those orders, which told them to go kill other humans.

0 ( +0 / -0 )

Drop a drone armed with automatic guns and motion sensors into the middle of the enemy's territory: it shoots anything that comes close and self-detonates when out of bullets. You do not need to wait 20-30 years. This can be done with current technology and just some imagination.

0 ( +0 / -0 )

They can sign all the treaties they want to. Countries will always act in their best interest, and limiting weapons development has never been considered to be in a country's "best interest". They'll sign the treaty and shake hands all around for the news cameras, while their countries secretly continue development.

In the late '30s, President Roosevelt wrote a letter condemning the Nazi bombing attacks on English cities because the specific targeting of civilians was a "war crime". Just five years later... German cities are being bombed out... by the American Army Air Corps and RAF. Ditto for Japan. I guess even committing war crimes doesn't overrule "acting in their best interest".

1 ( +1 / -0 )

If you want to split hairs, this has already happened. The South African military was running a drill with an automated AA gun when it just started shooting the soldiers. Nine died.

0 ( +0 / -0 )

Arming semi-autonomous robots is happening right now. The Phalanx systems (R2-D2-like, with a mini-gun poking out) have been in existence for some time now and will shoot at nearly anything that is a potential threat: boats, missiles, planes. Each is actually designed not to involve humans, since it is a point-defense weapon against fast missiles. They have to be armed by humans, though.

There are also robo-submersibles (made by iRobot, of all companies) and boats pulling harbor patrol duties. The boats have a light machine gun that can be controlled remotely, and there are ATVs with a light machine gun mounted on them that can also be controlled remotely. These are much better systems than having a bleary-eyed sentry out in the elements.

I highly doubt we will truly have autonomous kill-bots. The closest will be a fighter drone with an operator, or the Phalanx systems with an operator; each of these can easily be overridden or turned off. If you are still worried: the Phalanx systems are at least 25 years old, and while there may have been accidents, I have never heard of one.

0 ( +0 / -0 )

"...that’s very important for the laws of war."

The "laws of war" is when there are no laws. Such robots have already been used for ages. So far, they are called bombs,mines, dogs, etc. That they attack when you step on them or when they see you coming is a detail, dead is dead. The new models can be more efficient, but that doesn't change the nature of the problem.

“The robot could take a bullet in its computer and go berserk. So there’s no way of really determining who’s accountable and..."

And who's accountable for all the stuff now? The loser is accountable to the winner. That's reality.

0 ( +0 / -0 )

The new age of warfare need not involve physical machines. You can do more damage with cyber warfare, or by just hacking/cracking into a utility and blowing it up. Just wait till the smart grid becomes mainstream and you can remotely access household devices from anywhere in the world, remotely control your car (your iPhone can do that), or disrupt somebody's pacemaker via Wi-Fi.

0 ( +0 / -0 )

@viking68

The Phalanx definitely has had accidents, but really only two worth mentioning. One was the killing of a sailor when it shot at a drone and also hit the USS Iwo Jima; Phalanx isn't designed to think about anything past its intended target. The other was the downing of an A-6 Intruder, but that was an operator error, not a system error.

There have been issues with Patriot missile batteries shooting down or shooting at friendly planes. A Patriot battery shot down a British Tornado fighter plane. In another incident, a US F-16 was targeted by a Patriot battery, but the pilot decided to blow up the battery before it could launch.

There will always be issues with any automated system (fully automated or with an operator kill switch), because it's built by humans. We ourselves aren't perfect, and what we create definitely isn't perfect either. That's not to say we shouldn't continue moving forward.

@cwhite

Everything you are talking about still involves physical machines. You're simply accessing someone else's physical machine via hacking, then disrupting or destroying it.

0 ( +0 / -0 )

And "dumb" machines that kill have existed for a long time, for example in form of land mines. So the philosophical argument is moot.

Landmines are a passive, defensive system. Weaponized mobile robots are a completely different scenario.

0 ( +0 / -0 )

ubikvit:

" Weaponized mobile robots are a completely different scenario. "

The difference is only quantitative, not qualitative.

0 ( +0 / -0 )

The difference is only quantitative, not qualitative.

Don't know about that. Sometimes a massive increase in complexity means you're dealing with something altogether different. Would you say that what the simplest brain, in say a worm, can do is only quantitatively different from the things a human brain can do?

1 ( +1 / -0 )

It's ok, it won't kill you unless your name is Sarah or John Connor.

2 ( +2 / -0 )

How about banning killer humans?

-2 ( +0 / -2 )

I see some people are in favor of murder, lol.

0 ( +0 / -0 )

I can't believe this is a real concern. Creating truly intelligent machines is so far off, and that's assuming such a thing is even possible.

0 ( +0 / -0 )
