Wednesday, May 31, 2006

Legislating the Three Laws

Isaac Asimov's three laws of robotics, as outlined in I, Robot, are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Japan is codifying law 1(a). Nothing yet about allowing a human being to come to harm through inaction, but new guidelines being developed by the industry ministry seek to ensure that robots will not injure human beings. For the most part, the guidelines aim to minimize robot-human collisions and to make robots less likely to inflict injuries when such a collision does occur. In a blow to sci-fi thriller plots everywhere, the guidelines also call for an emergency shutoff button in case the robot goes rogue.

Here's hoping the Three Laws work out better for Japan than they did in the book. Though often referred to as a novel, I, Robot is in fact a collection of short stories, most of them about what happens when robots follow the Three Laws a little too well. What we say we want robots to do and what we actually want from them are not always the same. My personal favorite story from the collection, "Reason," features a robot whose impeccable logic leads him to conclude that the humans on the space station where he was assembled have no legally admissible proof that an Earth full of humans even exists for him to refrain from harming.
