Rise of the Killer Robots: Hint–This is Not a Movie

The Fiscal Times – by DAVID FRANCIS

Last month, Boston Dynamics posted a video update of its AlphaDog robot, developed to carry heavy military equipment for soldiers. The company had already released video showing AlphaDog traversing rough terrain and gaining significant speed.

But the March video shows something different: The robot now has a mechanical arm attached to its front. The arm picks up a cinder block, which likely weighs about 30 pounds, as the robot moves its feet rapidly up and down. It crouches low, swings the arm to its left and then hurls the cinder block some 20 feet over its right shoulder, much as a hammer thrower launches the hammer.

AlphaDog’s ability to hurl cinder blocks is a significant step that comes at a time when the use of robots in warfare is quickly evolving. In throwing the cinder block, AlphaDog is performing an aggressive act as opposed to a passive one, such as carrying equipment. It will also soon be able to process voice commands.

Given these advances, it’s easy to imagine AlphaDog as a military Sherpa – lugging heavy equipment to soldiers in need. It’s just as easy to imagine the robot charging into a battlefield and throwing explosives over the enemy’s defenses.

AlphaDog is only the beginning: In the future, robots will play a massive role in how the United States wages war. Their use is expected to cut down on casualties and related long-term medical costs. The Pentagon has even hired a scientist who is attempting to program robots to obey the Geneva Conventions.

But Noel Sharkey, an ethicist at the University of Sheffield in the United Kingdom, warns that the rise of robot soldiers removes the human moral element from warfare, a scenario he likens to "The Terminator." He, along with a number of other scientists and advocacy groups, is expected to launch the "Stop the Killer Robots" initiative in the UK this month.

“There are a lot of people very excited about this technology… this is going to be big, big money. But actually there is no transparency, no legal process. The laws of war allow for rights of surrender, for prisoner of war rights, for a human face to take judgments on collateral damage,” Sharkey said recently. “Humans are thinking, sentient beings. If a robot goes wrong, who is accountable? Certainly not the robot.”

Robot soldiers have already arrived: Beyond drones, the United States is using 2,000 robots in Afghanistan right now. They do everything from sniffing out bombs to inspecting suspicious vehicles, although according to reports they don't do either very well.

These robots are just the beginning: The secretive Defense Advanced Research Projects Agency (DARPA) has already invested millions in robotic fighting technology. In 2013, it spent $7 million on the Avatar Program, which is exploring the possibility of uploading a soldier's brain to a surrogate robot. It has also committed $11 million to a program that aims to create robots capable of acting autonomously (DARPA refused to comment for this story).

DARPA has also invested $14 million in the Autonomous Robotic Manipulation program, which aims to create “autonomous (unmanned) mobile platforms to manipulate objects without human control or intervention,” according to the agency’s 2013 unclassified budget request. This program includes work on a robot that can treat wounded soldiers on the battlefield, and then extract them to combat hospitals. It’s also invested $14 million in its Biometric Computing program, which aims to teach robots how to recognize and react to objects.

DARPA's programs might seem like science fiction, but life is truly imitating art: Boston Dynamics has created a robot that looks remarkably like the ones in the Terminator movies. According to the Army, the robot will only be used to test suits designed to protect against chemical weapons. But as company video shows, the Protection Ensemble Test Mannequin, or PETMAN, can run, kneel and do push-ups. It's not hard to imagine the robot equipped with a weapon.

The Pentagon has also launched the Future Soldier 2030 initiative, aimed at integrating robotic technology with traditional soldiers. This includes robotic exoskeletons that make soldiers stronger and faster, as well as integrated optic interfaces that allow soldiers to control robots with their eyes.

Drones are only the start of the Pentagon's airborne robotic arsenal: The Navy has paid Northrop Grumman $813 million to develop the X-47B, an unmanned plane that can take off and land on an aircraft carrier. It has both surveillance and strike capabilities and can maneuver at speeds that would be harmful to a human pilot. It also acts autonomously, receiving mission instructions from an operator and then executing them without further oversight.

Moore's Law observes that the number of transistors on a chip doubles roughly every two years. If computing capability keeps pace, the arrival of robot soldiers may be a lot closer than many people realize.
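As a back-of-the-envelope illustration (not from the article), the compounding implied by a two-year doubling period can be sketched in a few lines; the doubling period and time horizon are assumptions, and projecting robot capability from transistor counts is a rough heuristic, not a prediction.

```python
def doublings(years: float, period: float = 2.0) -> float:
    """Number of doublings over `years`, given one doubling per `period` years."""
    return years / period

def growth_factor(years: float, period: float = 2.0) -> float:
    """Relative growth implied by repeated doubling (Moore's Law heuristic)."""
    return 2 ** doublings(years, period)

# A decade of doubling every two years implies 2**5 = 32x growth.
print(growth_factor(10))
```

The point of the arithmetic is simply that exponential trends compound quickly: a single decade at this pace yields a 32-fold increase, which is why forecasters treat even distant-sounding capabilities as nearer than intuition suggests.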

This has alarmed scientists and ethicists. Writing in The Wall Street Journal recently, Jonathan Moreno warned not to allow autonomous robots to wage war, and urged the creation of treaties banning the practice.

“Given the obvious dangers to human society, fully autonomous offensive lethal weapons should never be permitted,” Moreno wrote. “And though the technical possibilities and operational practicalities may take decades to emerge, there is no excuse for not starting to develop new international conventions, which themselves require many years to craft and negotiate before they may be ratified by sovereign states.”

To date, the most coordinated effort comes from Sharkey and his Stop the Killer Robots campaign. Despite the hokey name, it has powerful backers, including Jody Williams, the political activist, Nobel Peace Prize winner and founder of the International Campaign to Ban Landmines. The two are expected to launch the anti-robot initiative at the House of Commons this month.

“Killer robots loom over our future if we do not take action to ban them now,” Williams told the Guardian. “The six Nobel peace laureates involved in the Nobel Women’s Initiative fully support the call for an international treaty to ban fully autonomous weaponised robots.”
Read more at http://www.thefiscaltimes.com/Articles/2013/04/02/Rise-of-the-Killer-Robots-Hint-This-is-Not-a-Movie.aspx

9 thoughts on “Rise of the Killer Robots: Hint–This is Not a Movie”

  1. One well placed trip wire or 300 lb test fishing line will take care of these bastards. I bet they will fall hard and become useless fast.

  2. The bottom line on all of this research is that the ruling class is using our tax money to develop methods of killing us so heinous that they can’t find enough normal soldiers to carry out the orders. The stated reason for the research, cutting down casualties, is obviously bogus: when in history have they ever cared about that before? It took many years and hundreds of thousands of deaths after the invention of the machine gun to finally decide that ordering massed charges into them was counter-productive from a military standpoint. The dead mothers’ sons didn’t figure much into the equation.

    This research signals an abandonment of warfare that contains at least enough internal logic and credible morality to get real humans to carry it out, and is now moving on to pure mechanized slaughter to remove all humans within a geographic area.

    Mark Schumacher is correct: the one thing they are forgetting in this research is how fast the intended victims will figure out ways of countering the threat. I suggest we all put on our thinking caps now, and be prepared with whatever hardware we might need.

  3. I guess the 3 Laws of Robotics (I, Robot, the Will Smith movie) have been scrapped, ya know, the one about never ever ever bringing harm to a human. Yes my friends, these are some sick COWARDLY pups wishing us harm, and their day of comeuppance (I know it’s an England word but it is so appropriate) is getting near. Run for your hidey holes, you treasonous prix, run and hide as if you can go far enough to escape your day of justice.

  4. Uh-oh… He just pushed that robot. (see 0:17 second mark)

    I see a T-1000 in the future making him its first target to terminate.

  5. “There are a lot of people very excited about this technology… this is going to be big, big money.”

    As always, corporations think about MONEY FIRST and ethics and morality second.

  6. Obviously the “dog” has to start stamping its feet to adjust its balance while shifting weight. Start tossing bricks at its feet and see how well it dances. Not to mention, it didn’t seem to toss that block with any accuracy…just heaved it. Not very threatening if you ask me.

    1. Drutch, the brick is just for research purposes. As soon as I saw this video a couple of months ago, I knew that robotic arm could wield a machine gun as easily as if it were an orchestra conductor’s baton. Maybe they put a video feed to a real human through the MG’s sights to make the decision about when to pull the trigger, or maybe they just give it autonomous software to kill anything that moves, depending on how they feel that day.

  7. “It’s not hard to imagine the robot equipped with a weapon.”

    Time for a “Marshall McLuhan” epiphany – The robot IS the weapon.

    Next stop: EMP area denial weapons, paid with taxes on surviving roaches.
