In Pennsauken, N.J., Lockheed Martin engineers design a robot for a Pentagon competition in June.

WASHINGTON — It’s 6-foot-2, with laser eyes and vise-grip hands. It can walk over a mess of jagged cinder blocks, cut a hole in a wall, even drive a car.

And soon, Leo — Lockheed Martin’s humanoid robot — will move from the development lab to a boot camp for robots, where a platoon’s worth of the semiautonomous mechanical species will be tested to see whether they will be all they can be.

Next month, the Pentagon is hosting a $3.5 million, international competition that will pit robot against robot in an obstacle course designed to test their physical prowess, agility and even their awareness and cognition.

Galvanized by Japan’s Fukushima Daiichi nuclear power disaster in 2011, the Defense Advanced Research Projects Agency — the Pentagon’s band of mad scientists who have developed the “artificial spleen,” bullets that can change course midair and the Internet — has invested nearly $100 million in developing robots that could head into disaster zones off-limits to humans.

“We don’t know what the next disaster will be, but we know we have to develop the technology to help us to address these kinds of disaster,” Gill Pratt, DARPA’s program manager, said in a recent call with reporters.

The competition comes at a time when weapons technology is advancing quickly. With lasers that can shoot small planes out of the sky and drones that can land on aircraft carriers, that technology is edging into the realm of science fiction.

But some fear that the technological advancements in weapons systems are outpacing the policy that should guide their use.

At a meeting last month, the U.N. Office at Geneva sponsored a multinational discussion on the development of “lethal autonomous weapons systems,” the legal questions they raise and the implications for human rights.

While those details are being hashed out, Christof Heyns, the U.N.’s special rapporteur, called in 2013 for a ban on the development of what he called “lethal autonomous robots.” He said that “in addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill — and their execution.”

Mary Wareham, global coordinator for the Campaign to Stop Killer Robots, a consortium of human rights groups, said the international community needs to ensure that when it comes to decisions of life and death on the battlefield, the humans are in charge.

“We want to talk to the governments about how (the robots) function and understand the human control of the targeting and attack decisions,” she said. “We want assurances that a human is in the loop.”

Organizers of the DARPA Robotics Challenge are quick to point out that the robots are designed for humanitarian purposes, not war. The challenge course represents a disaster zone, not a battlefield.

In all, 25 teams from around the world, representing companies, academic institutions and government agencies, are vying for the $2 million first prize. (Second place is $1 million; third is $500,000.)

Although the robots might look like the Terminator and move with the rigidity of Frankenstein’s monster, they are harmless noncombatants, with the general dexterity of a teetering 1-year-old.

During the challenge, DARPA officials expect a few of the robots to end up on their keisters looking more helpless than threatening.

Their cognitive ability is not very advanced, either. Even though they are loaded with thousands of lines of code and able to communicate wirelessly with their human overseers, the robots are limited to simple tasks such as opening doors or walking up stairs.

Although the aim of the contest is to help develop robots for humanitarian missions, such as sifting through the rubble after the earthquake in Nepal, officials acknowledge that as the technology advances, the machines could one day be used for all sorts of tasks, from helping the elderly to manufacturing and, yes, even serving as soldiers.

“As with any technology, we cannot control what it is going to be used for,” Pratt said. “We do believe it is important to have those discussions as to what they’re going to be used for, and it’s really up to society to decide that. But to not develop the technology is to deny yourself the capability to respond effectively, in this case to disasters, and we think it’s very important to do that.”
