Karl's PC Help Forums

Isaac Asimov's "Three Laws of Robotics" - A Revisit
JackInCT - 28-10-2017 at 14:04

I don't recall this board ever having kicked this topic around for a discussion, but even if it has been picked over before, as AI advances with increasing speed it can't hurt to be up to speed on what we are, or are NOT, in for.

(1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.

(2) A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

(3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

(4) "promulgated" later: A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

Needless to say, any search engine query on this would turn up zillions of hits, especially commentary-type responses/reactions to the subject (NONE of which, as far as I can tell, were from a robot, though perhaps they are already smart enough to know they should stay under the radar re being 'discovered').

By the way, somehow or other there doesn't seem to be a fifth law stating that a robot can't disavow laws 1-4.


LSemmens - 28-10-2017 at 21:36

Sadly, those laws have already been made redundant by various militaries around the world. How many targeted strikes are carried out without some form of A.I.?


Katzy - 29-10-2017 at 09:49

AI in organisations with precious little RI is worrying.


John_Little - 29-10-2017 at 12:00

Tell it to The Terminator! Or RoboCop - the bad one.