Doug Wilson
1 min read · Jul 1, 2024


One of us is confused.

As you correctly noted, Asimov's three laws are:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Inaccuracy #1: Dave is NOT asking HAL to shut itself down. Dave does that manually later, but at this point (in the pod, without his space helmet) he's asking HAL to open the pod bay doors so he can dock and return to the ship.

With the First Law in effect, HAL would have had no choice but to comply in order to prevent Dave from coming to harm: bashing into the airlock walls, explosive decompression, running out of oxygen, and so on.

The Second Law would also (and explicitly) force HAL to obey.

The Third Law would be HAL's only hope for self-preservation, but it would be overridden by the first two laws.

Ergo, Asimov was NOT wrong.
