By The Associated Press

PALO ALTO, Calif. — Eric Horvitz illustrates the potential dilemmas of living with robots by telling how he once got stuck in an elevator at Stanford Hospital with a droid the size of a washing machine.

“I remembered thinking, ‘Whoa, this is scary,’ as it whirled around, almost knocking me down,” the Microsoft researcher recalled. “Then, I thought, ‘What if I were a patient?’ There could be big issues here.”

We’re still far from the sci-fi dream of having robots catering to our every need. But little by little, we’ll be sharing more of our space with robots as prices drop and new technology creates specialized machines that clean up spilled milk or even provide comfort for an elderly parent.

Now scientists and legal scholars are exploring the likely effects. What happens if a robot crushes your foot, chases your cat off a ledge or smacks your baby? While experts don’t expect a band of Terminators to attack or a “2001: A Space Odyssey” computer that takes control, even simpler, benign robots will have legal, social and ethical consequences.

“As we rely more and more on automated systems, we have to think of the implications. It is part of being a responsible scientist,” Horvitz said.

Horvitz assembled a team of scientists this year when he was president of the Association for the Advancement of Artificial Intelligence and asked them to explore the future of human-robot interactions. A report on their discussions is due next year.

For years, robots have been used outside the home. They detect bombs on the battlefield, build cars in factories, and deliver supplies and visit patients in hospitals.

But the past few years have seen the rise of home robots. Mainly they are used for tasks like vacuuming (think Roomba). There are also robotic lawn mowers, duct cleaners, surveillance systems and alarm clocks.

By 2015, personal robot sales in the U.S. will exceed $5 billion, more than quadrupling current sales, according to ABI Research, which analyzes technology trends.

As such ‘bots become more sophisticated, they could complicate questions about product liability. Ryan Calo, a fellow with Stanford’s Center for Internet and Society, pointed out in a recent panel discussion at Stanford Law School that the original manufacturer might not always be liable if a robot went haywire.

“Robots are not just things the manufacturer builds and you go out and use them in a specific way. Robots can often be instructed, they can be programmed, you can have software that is built upon by others,” he said.

While ethicists, lawyers and roboticists ponder how to best integrate humans and autonomous machines, there is some evidence that a balance is already beginning to be struck. After returning to visit Stanford Hospital years later, Horvitz noticed a sign. It read: “Please Do Not Board The Elevator With The Robot.”
