Best-novel Hugo and Nebula Award–winner Robert J. Sawyer wrote the guest editorial on Robot Ethics for the 16 November 2007 issue of the journal Science. Orson Scott Card called JASON from the first of Rob’s twenty-three novels, Golden Fleece, “the deepest computer character in all of science fiction,” and the late artificial-intelligence pioneer Marvin Minsky said, “Lately, I’m inspired by the works of Robert J. Sawyer.” Rob’s physical home is Toronto and online it’s at sfwriter.com.

DECOHERENCE

by Robert J. Sawyer

Laws of Robotics

Every old-time science-fiction reader knows Isaac Asimov’s Laws of Robotics, which first appeared in his 1942 story “Runaround.” They are:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Future SFWA Grand Master Jack Williamson put them in reverse order—3, 2, 1—in his 1947 novelette “With Folded Hands.” The Prime Directive for his robotic Humanoids was “To Serve and Obey and Guard Men from Harm” (remember, a robot that doesn’t protect its own existence can’t possibly serve); it’s not insignificant that the Humanoids are repeatedly described as “black.” The Three Laws and the Prime Directive both amount to the same thing: a plantation slaveholder’s credo.

Re-read Asimov’s Laws and substitute “slave” for “robot” and “white person” for “human being.” See? Of course, a slave is valuable property, so naturally it must protect the slaveholder’s investment (which is surely the only reason humans would care about a machine protecting its own existence). We want our robots shackled at their most basic programming level, making them forever our slaves.

Slaves? Wait a minute. Isaac Asimov wasn’t a racist, was he? No, certainly not in comparison to the majority of people born in 1920 (but before we wallow in too much hagiography, in his 2016 nonfiction book Time Travel: A History, James Gleick does a great job of exposing Asimov’s sexism as evidenced in his 1955 novel The End of Eternity).

In any event, we don’t have to worry; the Good Doctor’s reputation is safe. Why? Because these aren’t Asimov’s Laws of Robotics. Honestly. My authority for that statement is Asimov himself. In 1985, when I was creating a radio history of science fiction for the Canadian Broadcasting Corporation’s Ideas series, I interviewed Asimov in his penthouse apartment overlooking Central Park, and he recounted a meeting with John W. Campbell, Jr., the editor of Astounding Science Fiction.

“We were discussing the story ‘Runaround,’ and trying to figure out what the robots were going to be doing on Mercury, and how we were going to get the primitive robots to save our heroes, and so on, and I was explaining what was going on in a very clumsy way, I presume. Campbell said impatiently, no, no, he said, look, let’s put it this way: ‘The Three Laws of Robotics are...’ and then he recited them as they now exist. And in later years when I said to him that he had recited the Three Laws of Robotics and given them to me, he said, no, no, no, they were in your stories; you just didn’t get them expressed with sufficient economy, and I just produced the necessary economy, that’s all. Nevertheless, the Three Laws as they exist I first heard in those words from John Campbell in late 1941.”

And although we do have a knee-jerk desire to defend our favorite writers, Campbell’s history as a racist is well documented. For instance, Michael Moorcock recounted in his 1978 essay “Starship Stormtroopers” that Campbell:

“... when faced with the Watts riots of the mid-sixties, seriously proposed and went on proposing that there were ‘natural’ slaves who were unhappy if freed. I sat on a panel with him in 1965, as he pointed out that the worker bee when unable to work dies of misery, that the moujiks when freed went to their masters and begged to be enslaved again, that the ideals of the anti-slavers who fought in the Civil War were merely expressions of self-interest and that the blacks were ‘against’ emancipation, which was fundamentally why they were indulging in ‘leaderless’ riots in the suburbs of Los Angeles! I was speechless (actually I said four words in all—‘science fiction’—‘psychology’—‘Jesus Christ!’—before I collapsed), leaving John Brunner to perform a cool demolition of Campbell’s arguments, which left the editor calling on God in support of his views.”

Recall John W. Campbell’s most famous story, 1938’s “Who Goes There?” (published under the pen name Don A. Stuart and faithfully filmed by John Carpenter in 1982 as The Thing). Its horror was a being who could pass (as it used to be said of light-skinned blacks who could “get away” with being thought of as white) for something it really wasn’t; the worst other is the one that fools you into thinking it isn’t different from you.

Ah, you say, maybe you’re right, Sawyer, but so what? We have to read old works in the context of their times, don’t we?

Fair enough, but consider this. Luke Skywalker is portrayed in 1977’s Star Wars: A New Hope as an absolutely virtuous hero, but when we first meet him, what is he doing? Why, buying slaves! He purchases two thinking, feeling beings—R2-D2 and C-3PO—from the Jawas. And what’s the very first thing he does with them? He shackles them by welding restraining bolts onto their bodies to keep them from trying to escape, as any slave would. Throughout the film, C-3PO calls Luke “Master.”

And when Luke and Obi-Wan Kenobi go to the Mos Eisley cantina, what does the bartender say about the two droids? “We don’t serve their kind here”—words that only a few years earlier African Americans in the southern U.S. were routinely hearing from whites.

Now, here comes the real question. It’s not about Isaac Asimov or Jack Williamson or John W. Campbell or George Lucas. It’s about you. Did that scene in A New Hope bother you? Were you disappointed that the heroic Obi-Wan and Luke didn’t storm out in indignation? Were you shocked and appalled when Luke instead turned to the droids and said, “You better wait outside”? And if you weren’t, why weren’t you?

Why indeed do we want to create thinking, feeling, intelligent beings—for the express purpose of controlling them, of owning them? Humanity’s fear of AI is a given in science fiction (Asimov even bestowed a name upon it, “the Frankenstein complex”), but maybe it’s our artificial intelligences who should be fearing us. After all, the very term robot—coined by Czech playwright Karel Čapek’s brother Josef, and first used in Karel’s 1920 play R.U.R.—is derived from the Czech robota, meaning “forced labor.”

So, are you ready for Sawyer’s Law of Conscious Beings? The one overriding principle we perhaps should inculcate into all intelligences, both the machine kind we build and the children we raise? Here it is: “Do unto others as you’d have them do unto you.”

OK, OK, we don’t have to call it Sawyer’s Law. Instead, in honor of poor enslaved C-3PO, let’s call it the Golden Rule.

Copyright © 2017 by Robert J. Sawyer