1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

—Isaac Asimov, “I, Robot.”

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm, unless that human being did something to really annoy the human being who programmed it.

2. A robot must obey the orders given it by the human being who created its software. If it was programmed by another robot, then anything goes.

3. A robot must not hurt another robot, outside of some sort of cool sporting event you can place bets on.
