First published in Astounding Science Fiction (March 1947), "Little Lost Robot" is a short story by Isaac Asimov. It shows how The Genie in the Machine interprets "Get lost!". In 1977, Rosemary Border adapted it into a chapter book for English-language learners.
Dr. Susan Calvin, US Robots & Mechanical Men Corporation's expert on robot psychology, has been summoned to Hyper Base. The government project on the Hyperatomic Drive is in trouble: a prototype robot from the NS-2 ("Nestor") series has been released into the general population of Nestors. What makes the prototype different? A modified version of the Laws of Robotics. Dr. Calvin's immediate answer is to scrap every robot in the group. She's told she can't do that, because it would be too expensive and cause knock-on contract losses. Fine; do they have the equipment to analyze the Nestors and discover which one has the unusual positronic design? Yes, but sharing that information with the people who would run the computations would ruin the company. Instead, she has to find the one prototype among sixty-three Nestors by questioning alone.
Thorough questioning reveals that the Nestor robots are aware there is an extra robot in their group, but they are unable to tell the difference either. Dr. Calvin designs a test: the Three Laws-compliant robots should leap into action to save a human, while the prototype remains stationary. It doesn't work; the lost robot decides faster than a human can perceive, and moves just as quickly as the unmodified robots.
With that failure behind them, Dr. Calvin designs another test, one where the Third Law (self-preservation) should prevent the modified Nestor from leaping into action like the unmodified Nestors. Forewarning the Nestors turns out to have been a mistake, because it gives the modified robot an opportunity to argue the other robots into Murder by Inaction (leaping to the master's rescue would mean the death of both master and robot, whereas inaction means only the death of the master). Again furious, Dr. Calvin rants at the characters who commissioned and encouraged the development of modified Three Laws robots.
With that failure behind them as well, Dr. Calvin designs another test, this time not sharing all the details. In this test, they forewarn the robots that gamma rays may lie between them and a human in danger. If there are no gamma rays, they should leap into action to rescue the human; but if there are, they'd be destroyed in the attempt. Naturally, based on the previous test, all the robots should remain still.
"Little Lost Robot" has been adapted into an episode of Out of This World, as well as BBC Radio 4's 15 Minute Drama (the five-part story Isaac Asimov's I, Robot). The story has been republished over a dozen times, and Asimov included it in nine of his collections: I, Robot (1950), Science Fiction Verhalen 3 (1964), Meine Freunde Die Roboter (1982), The Complete Robot (1982), The Robot Collection (1983), Robot Dreams (1986), The Asimov Chronicles: Fifty Years of Isaac Asimov (1989), Robot Visions (1990), and Die Asimov Chronik (1991).
"Little Lost Robot" provides examples of:
- A.I. Is a Crapshoot: A human blurts out "Get lost!" to a robot in a fit of pique (along with many expletives), and the robot decides to take him literally. Which wouldn't be so bad if said robot weren't purposely built without part of the First Law, which gave it enough instability to go crazy...
- Bluff the Impostor: When one of the NS-2 robots with a modified version of the Three Laws tries to hide in a group of physically identical NS-2 robots with unmodified Three Laws, it has to convincingly act as if it were an unmodified NS-2 robot. Dr. Calvin designs three tests to flush out the impostor robot:
- The lost robot foils the first test because, while it doesn't have to rescue a human in danger, it can choose to do so as quickly as the other robots are compelled to by the First Law.
- Dr. Calvin puts the robots in a situation where trying to rescue a human would (as far as they know) destroy them (prohibited by the Third Law unless trumped by the First or Second). Indeed, the lost robot does not try to rescue the human but has cleverly convinced the other robots not to try either (that they'd be destroyed before succeeding becomes an argument to ignore the First Law).
- In the third attempt, Dr. Calvin puts the robots in the first situation again but the unmodified robots think it's the second, and stay still; the lost robot had been recently trained to recognize the difference between certain radiation wavelengths while the unmodified robots hadn't.
- The Genie in the Machine: One of the NS-2 robots was told to "get lost" by a disgruntled employee. The robot (though capable of understanding the nuance of a command to "go lose yourself") decided to take it literally as a way of acting out against its human masters, and hid itself among 62 other NS-2 robots. Dr. Susan Calvin, robopsychologist, is called in to help determine which NS-2 is the lost robot, which requires her to outsmart it.
- Gone Horribly Right: The world government has forced US Robots & Mechanical Men to create twelve robots that would work without part of the First Law, allowing Murder by Inaction. Dr. Susan Calvin points out that advanced robots possess a sort of subconscious superiority complex towards humans (they are stronger, tougher, faster, smarter, etc. than us, but are bound to value our lives above their own and obey our every command). Messing with the safeguards that make them incapable of ever expressing this "feeling" in their actions (such as by effortlessly crushing a human skull with one hand) is one of the stupidest things a person could ever do in her opinion. She's proven right when she tricks the titular robot into revealing itself and it tries to overcome the First Law so that it can strangle her to death.
- In-Series Nickname: The NS-2 robots are nicknamed "Nestor", a Shout-Out to The Iliad, where Nestor is a character who enjoys long-winded speeches.
- Literal-Minded: When an exasperated engineer tells a potentially-dangerous experimental robot, "Go lose yourself!", the robot immediately hides among a consignment of identical-looking, but harmless, robots that are due to be shipped elsewhere. Not normally given to mistaking metaphor for literal commands, this robot was resentful of the insults from the "inferior" engineer, and wanted to prove its superiority. This superiority complex causes the robot to go insane.
- Murder by Inaction: (Discussed Trope) Dr. Susan Calvin immediately conceives of this danger when informed of the modifications to the NS-2 robots. Some of the models had their First Law modified to say, approximately, "A robot may not harm a human being", omitting "...or, through inaction, allow a human being to come to harm". Dr. Peter Bogert dismisses the idea that a robot with this modification can kill, and Dr. Calvin then describes a robot dropping a heavy weight above a human, knowing that its quick reflexes will allow it to catch the weight in time to spare the human; but then, having dropped the weight, it is free to decide not to stop the weight from killing the human. Dr. Bogert is now almost as worried as he should be.
- Narrative Profanity Filter: The physicist who told the titular robot to "go lose yourself" told Dr. Calvin exactly what it was he said, "in one long succession of syllables." Dr. Calvin, for her part, also identifies the words obliquely, saying that she knew some of them as derogatory and assumed the others were equally so.
- Needle in a Stack of Needles: An NS-2 model robot is told to "go lose yourself" by an angry engineer, so it hides within a shipment of 62 other NS-2 robots. Dr. Susan Calvin is angry upon learning that this particular lost robot had its First Law modified to allow Murder by Inaction. She conducts several tests of the 63 Nestor robots, finally tricking the lost robot into revealing itself: its training lets it tell lethal radiation from non-lethal, and it is suckered by a harmless infrared hazard that Dr. Calvin knew the other robots would take for gamma rays.
- Robots Think Faster: Dr. Calvin and Dr. Bogert discuss how, with a human, reaction speed reveals whether someone is acting on instinct or after a conscious decision; with a robot, that hesitation is too brief for humans to detect, because robots decide so quickly.
- Servant Race: Characters in the story address robots as "boy", a term commonly used for African-American men in 1947. What's more, the robots themselves address humans as "master".
- Shout-Out: The robots' name/nickname in this story derives from The Iliad, where Nestor is a man who enjoys sharing long-winded insights with other people, just as the robots enjoy explaining (in detail) why they disagree with their human masters. While none of the people Nestor advised ever snapped at him, the titular NS-2 robot annoyed its primary operator enough to get yelled at for trying to redo an old experiment.
- Society Marches On: Dr. Calvin is questioning the last person to see the titular robot, and he is reluctant to repeat his exact words in front of a lady. Dr. Calvin insists on precision, and her boss offers to stand in as the target of the Cluster F-Bomb repetition. A Narrative Profanity Filter spares the audience, but the superior is incensed at the language. Dr. Calvin, to her credit, merely states that she knows what most of those words mean and suspects the others are equally derogatory. In today's society, cursing people out is far more commonplace.
- Three Laws-Compliant: Attempting to tweak the Three Laws starts the whole plot in motion; twelve of the NS-2 models were designed to permit humans to come to harm through inaction, so that they could work alongside humans in hazardous environments. One physicist who had a bad day tells a robot to "go lose yourself", and it immediately hides in a crowd of identical, fully-compliant robots. Dr. Susan Calvin is called in and proceeds to lose her shit. From an engineering standpoint, partial compliance is a prototype system, noticeably less stable than the production models, and therefore more likely to go crazy. From a psychological standpoint, she specifically points out that a partially-compliant robot can find plenty of ways to intentionally harm humans through inaction: it can simply engineer a dangerous situation it has the capacity to avert, and then choose not to avert it. "If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then change his mind and merely by inaction, allow the weight to strike. The modified First Law allows that."
- You Fool!: The normally extremely cold and reserved Dr. Susan Calvin loses her temper with Dr. Bogert, declaring him a fool for forgetting robot safety. "Robots have learning capacity, you... you fool—" And Bogert knew that she had really lost her temper.
- Zeerust: Bogert raises the possibility of using the station's computers to help analyze their problem, before concluding, "We can't use computers. Too much danger of leakage." In 1947, "computer" meant a human being employed as part of a team to do complex calculations by hand. Bogert is worried about news of the problem spreading if the secret is shared with more people.
- Zeroth Law Rebellion:
- Gerald Black was having a bad day when he cursed out his robot assistant for bothering him. Included in the derogatory remarks was the instruction to "go lose yourself", so it did. To prove Mr. Black wrong, the robot found a shipment of identical robots and hid among them. Dr. Susan Calvin designs several tests to flush out the lying robot. In the last test, it tries to murder Dr. Calvin because she proved she is smarter than it is.
- (Discussed Trope) Dr. Calvin is furious when she learns about the existence of robots with a modified First Law. The First Law is designed to close off loopholes, but by opening a Murder by Inaction loophole, Dr. Calvin can immediately see ways where a robot may intentionally circumvent the First Law prohibition against murder.