Mr. Kornada: Did an A.I. come in here? Where is it?
Secretary: She is in the lab, being tested. And her name is Florence Ambrose.
Mr. Kornada: It is a product, not a person. It doesn't have a name.
Secretary: (thinking) When they bring in doughnuts, they have names.
In the Mahou Sensei Negima! chapter The Logic of Illogic, Hakase viewed Chachamaru as Just a Machine until she found Chachamaru's video folders, which were loaded with shots of Negi (and cats).
Subverted in the Mazinger Z universe. Kouji and his friends usually felt no remorse when they blew up giant robots. But when a Robeast they destroyed had acted more like a human being than a machine, or when a Ridiculously Human Robot died, they often felt sad. When Kouji killed the Gamia sisters (three identical android assassins), they were so human-looking that he felt sickened and disturbed. Dr. Hell and his Co-Dragons nearly always regarded his robotic soldiers as Just Machines and disposable, but there are exceptions: Baron Ashura called Gamia Q1, Q2 and Q3 his/her "daughters" and actually grieved their deaths (this from a person who is capable of machine-gunning a group of shipwreck survivors between laughs).
And then you have Minerva-X, a Humongous Mecha Fem Bot who was capable of thinking, feeling, and acting on her own. Kouji and his friends treated her as if she were a person, and Kouji went so far as to bury her after her death.
In Crest of the Stars, the Abh, a genetically engineered race, regard themselves as still being human, but according to enemy propaganda, 'Abh aren't people, they're organic machines', which an Abh readily admits is their true origin not ten seconds after the propaganda is shown. They were specifically engineered for long-distance space exploration before faster-than-light technology had been fully developed.
The CC Corp in .hack treats AIs as errant data and nothing more.
In one scene Togusa invokes this trope by dismissing Batou's favouritism of one Tachikoma, saying that they are just machines and all have the same specifications. The Tachikoma take exception to this remark, demanding he take it back and accusing Togusa of being a bigot.
General Uranus and Colonel Hades had something like this going on against the Bioroids in the Appleseed movie. Needless to say, they are horribly wrong, since all the Bioroid constraints are artificially added for the sole purpose of making them protect, rather than threaten, humanity. And then there's the supercomputer Gaia, which does deserve this kind of opinion, but is actually still more moral than its human operators.
It's not that they don't believe the Bioroids are sentient. They just believe that the Bioroids will seek to dominate humanity and create a sterile society straight out of Brave New World. This is what Uranus believes, at least; Hades is just a racist.
Roger Smith flip-flops between this belief and its opposite regarding androids (specifically R. Dorothy Wayneright) throughout the series.
Dorothy herself flip-flops on the opinion.
The girls in Gunslinger Girl are viewed by some to be simply machines, although they have all of the emotions you would expect a little girl to have. Jean in particular is incredibly callous to his assigned girl, Rico.
It's implied that he deliberately goes out of his way to convince himself that she's just a tool, because forming an emotional attachment to her would only result in pain due to her shortened lifespan.
Even before they reach the end of their lifespans, they can also undergo memory loss and personality resets as a result of the conditioning. Treating them as machines is seen as a way to prevent attachment to someone who could very well forget everything about you the next day.
A major theme of Armitage III, with an accompanying amount of senseless robot-killing.
Zone has this sort of view about the androids he created based on his deceased friends.
Some people say this about Red Tornado, with even fellow super-heroes saying that he was just a "really well-made machine". He briefly lost custody of his (adopted) daughter because of this.
This is especially frustrating since in Red Tornado's original origin, he is a Sylph (a spirit of air) placed inside a robot body, meaning he provably has a soul, unlike the average human.
In Runaways, Xavin used to refer to the cyborg Victor as an 'automaton' and offered to 'buy another one' for Karolina if her toys got broken. No one was impressed, and he/she gradually grew out of it.
In the Justice Society of America story "Out of Time", the android Hourman Matthew Tyler uses this argument to justify sacrificing himself in Rex Tyler's place fighting against Extant in the past to save the universe. Rex denies this and declares that Matthew is "as alive as any of us". While Matt is grateful for this, he still goes ahead with the sacrifice.
Many, many comics in 2000 AD featured this, with humans almost universally hating and mistreating robots (the few who didn't were usually regarded as exceptions) despite the latter possessing human-like intelligence, quirks, feelings, and so on. Sometimes it got to the point that you started to wonder why anyone built them in the first place, since nobody seemed to want them around...
Ghost in the Shell, the film. It deals with an advanced AI program let loose on the internet, who claims to be a sentient entity. People disagree, saying that the idea that a program could be sentient is preposterous.
Short Circuit has this as its central premise. The robot can't be alive because it's a machine, and machines aren't alive by definition. Never mind that it's now got free will and a sense of self-preservation, it's still just a machine... right?
In the first Transformers movie, Agent Simmons seems rather against calling Megatron by his true name when Sam supplies it, preferring the more machine-like moniker N.B.E.-01. It is implied this pisses off Megatron himself, who was seemingly conscious the entire time he was kept frozen; the first thing he does upon thawing and awakening is announce his true name, before proceeding to slaughter all of the scientists and engineers in the room.
Galloway refers to Optimus Prime as a "pile of scrap-metal" after his dead body is delivered back to base. And this is even after Optimus managed to verbally own the guy in a debate covering topics such as human nature and whether humanity could defend itself against a Decepticon invasion. Then again, Galloway is just a huge Jerkass.
In the third film, Sentinel Prime's hatred for humanity comes partly from how humans see the Autobots as this. Especially when it comes to him and Optimus, who are the last remaining Primes.
Sentinel Prime: On Cybertron we were gods! And here, they simply call us machines.
In Terminator 2: Judgment Day, Sarah Connor tries to invoke this when trying to convince John to destroy the Terminator reprogrammed to protect them.
John: Don't kill him.
Sarah: Not "him", honey. "It".
John: Alright, "it"! But we need "it"!
In I, Robot, Spooner says to the android Sonny "Human beings have dreams. Even dogs have dreams, but not you, you are just a machine." Subverted, since he is one of the few people who actually sees robots as not just machines (and loathes them for it... at first).
Inverted in Star Trek: The Motion Picture, where V'Ger dismisses organic life forms as "carbon units" and does not consider them truly alive, unlike machines.
In Bicentennial Man, this is what many claim Andrew is. When arguing about Galatea, Rupert lets slip that she's just a machine, much to Andrew's offense.
Older Than Television: The clockwork man Tiktok in the Land of Oz series, introduced in Ozma of Oz (1907), frequently says "I am mere-ly a ma-chine" or some variant. His makers even engraved "Can do anything except live" on his body.
Isaac Asimov addresses this in his robot stories a few times. It's a core theme in "The Bicentennial Man".
Comes up a few times in various ways in the Star Wars Expanded Universe. Droids of all capacities are regarded as disposable; in I, Jedi, Corran doesn't think that bisecting a protocol droid violates his self-imposed no-killing-unless-absolutely-necessary rule, and in general people only object to wanton droid destruction if it's costing them something. Of course, there are classes and classes of droid intelligence, and there is a gap between merely smart and actually self-aware droids. And, too, droids can be repaired.
The X-Wing Series as well. Corran considers his astromech droid Whistler to be almost family, someone he can talk about his wife or dad with, and bristles at the thought of putting a Restraining Bolt on him. Meanwhile his commander Wedge Antilles found his cowardly R5 unit "Mynock" so annoying (it squealed during battles) that he wiped its memory and renamed it Gate without a second thought. And while we can't be sure how much Myn Donos bonded with his astromech Shiner, he did view the droid as the last survivor from his previous squadron and had a near-breakdown when Shiner was briefly disabled by an ion blast.
The Medstar Duology has one self-aware droid say that all droids that aren't simple automatons have a sense of humor. In the Coruscant Nights Trilogy the same droid reflects that there are very few self-aware droids, and no one knows just how they come about, but most people won't recognize the difference, since it seems to happen spontaneously. So of two droids from the same line, one might be self-aware, the other as limited as its programming.
The EU also hints that there was at least one droid revolution, which is scantily detailed.
In the All There in the Manual material, it's a Shrug of God whether or not droids have souls in the Star Wars universe. In-universe, some people believe that certain droids are self-aware and that their treatment is akin to slavery, while others believe this trope. There is no definitive answer as to who is right and who is wrong.
It's brought up in a single moment in the Revenge of the Sith novel. During a conversation, Anakin refers to Artoo as "him", immediately prompting Obi-Wan to correct him by saying "it". Probably based on Obi-Wan's lack of a reaction when Arfour was destroyed by the buzz droids.
The Gulf Between by Tom Godwin gives a reason for this: "A machine does not care."
Opinion of AI in Ken MacLeod's Fall Revolution series tends to be divided. Truly synthetic intelligences and human uploads are often considered to be "flatlines"; a realistic simulation of a sentience but nothing going on beneath the surface. They tend to be classed as property rather than individuals. The Fast Folk, an AI and upload civilisation, are treated as horrifyingly dangerous but still "people", in a sense.
In Animorphs, this is the Drode's excuse for setting the self-destruct timer on the Chee when he's not supposed to kill any sentient beings; according to him, they don't count, as they're merely "machines".
In the Doctor Who Expanded Universe novel Death and Diplomacy, the Doctor casually destroys a security droid with his umbrella — then immediately turns around and admonishes the rest of his group not to take away the wrong lesson.
Nature (the art whereby God hath made and governs the world) is by the art of man, as in many other things, so in this also imitated, that it can make an artificial animal. For seeing life is but a motion of limbs, the beginning whereof is in some principal part within, why may we not say that all automata (engines that move themselves by springs and wheels as doth a watch) have an artificial life? For what is the heart, but a spring; and the nerves, but so many strings; and the joints, but so many wheels, giving motion to the whole body, such as was intended by the Artificer?
— Thomas Hobbes, Leviathan
Adventure Hunters: King Reyvas plans to replace human armies with war golems and thereby forever end death-by-war for living creatures. He feels justified in this because the golems are nothing more than walking weapons. The golems develop sentience shortly after activation because their creator gave them a spark of life. When he realizes this, he realizes at the same time that his plan will end in failure and gives up.
In Battlestar Galactica (Reimagined), many humans have this attitude towards the Cylons, and are clearly wrong, but the near extermination of humanity is bound to breed hatred.
For bonus points, in the remake of The Outer Limits episode "I, Robot", the android sacrificed himself saving the prosecuting attorney who had argued against his sapience. In the original episode, he's destroyed while saving a little girl he'd accidentally injured earlier.
Star Trek in general draws a distinction between special cases like Data and the Doctor, and the ubiquitous ship computers responsible for getting everything done in the background. Despite the fact that ship computers can pass the Turing Test with ease, act on their own initiative, and occasionally even display signs of emotion, this is never investigated or even mentioned in-story: ship computers are always just machines and limited to being background elements (this is doubly notable since some of the special-case characters, such as the Doctor, run on a ship computer).
The Star Trek: The Next Generation episode "The Measure of a Man" put Data on trial to determine whether he was a sentient being with rights as a Federation citizen, or merely a machine and thus Federation property. The entire debate overlooks the fact that they had already granted him an officer's commission and rank (even as Picard tries to argue that the medals and honors Data has received for courage would suggest he is a person), which would simply not apply to property. It's not as if the ship's computer has a rank or can issue orders to other personnel.
Another episode of Next Gen. featured Data trying to stand up for the rights of several auto-tool probes that seemed to be developing and demonstrating sentience (and even self-preservation instincts). At issue was where to draw the line between an intelligent tool and a sentient being, especially when considering sending the probes on suicidal assignments to save the lives of human beings. In the end, the solution they arrive at is to give the probes a choice about whether to accept the mission (they do, but come up with a better plan).
An episode of Star Trek: Voyager questioned the rights of the ship's holographic Doctor, and his status was a background theme that ran throughout the series. This being Voyager, the writing was not particularly consistent: sometimes the crew would treat the Doctor like a person, and sometimes he was just a device that could be shut off whenever he got too annoying. One glimpse of the future suggested holographic AIs would eventually get equal rights.
In Terminator: The Sarah Connor Chronicles, "it's just a machine" is pretty much a mantra among the characters who have harsher views on robots and AI. When the show started down the What Do You Mean, It's Not Symbolic? Christian symbolism route during season 2, an FBI agent frequently reminded people of this, insisting that machines "don't have souls" and as such "can't feel". Sarah Connor and Derek Reese are both quick to remind John that Cameron, the resident Terminator, is exactly this. John, however, feels differently about machines in general and Cameron in particular, due to his experiences with "Uncle Bob". It doesn't help that Cameron is a Robot Girl who repeatedly saves his life, to whom he feels indebted, and towards whom he ends up developing a sort of attraction.
In the last episode of Total Recall 2070, Farve's creator is revealed to be this, and aware of it. As it puts it after testing Farve, "just because [it] knows its creation shall have a conscience doesn't mean [it] itself has one".
Stargate SG-1 and Stargate Atlantis feature this trope heavily in episodes where characters interact with AIs, up to and including causing the slow deaths of non-hostile Asurans out of paranoia. At least taking this attitude towards Fifth came back to bite them.
This attitude is at least challenged in Stargate Atlantis when Rodney realizes that in order to destroy the Asurans he has to build one and send it to its "death."
Carter: Does she know why she was created?
McKay: Of course.
Carter: Well, then, she has a certain amount of self-awareness.
McKay: Yeah, so?
Carter: "So"?! Honestly, I'm not sure how comfortable I am sending her to her death.
McKay: "Death"? It can't die – it's not alive! It's a programme!
Fran eventually even made McKay uncomfortable with her blasé attitude towards (and excitement for) her impending destruction.
Fran: I quite look forward to it.
McKay: You do?
Fran: One always wishes to fulfill one's purpose.
McKay: Well, I just ... I just imagined you'd rather keep being than, uh ... uh, than not.
Fran: Certainly you're not worried for me, are you, Doctor?
McKay: No, no, that would be silly.
Fran (smiling at him): Yes, it would.
(Rodney turns away and walks over to Radek.)
McKay: Should never have given it speech.
Smallville's season 7 finale did this in probably the worst way possible:
Brainiac: You can't kill me, Clark. You could never kill a man in cold blood!
Power Rangers SPD has an episode featuring a robot (well, she's called a "cyborg", but all other dialogue in the episode indicates that she's 100% machine) who is about as ridiculously human as you can get, and yet, several characters insist on giving her the Just A Machine treatment. After Sky fires her from their military training center, he (and all the Rangers that supported him in this) gets a What the Hell, Hero? speech from Cruger, and they're forced to get her back.
In Halo, AIs are deleted before they can become truly sapient, because the stretch between what they are at the start (programmed emotions and responses) and what they can eventually achieve (real imagination and intelligence) normally causes them to try to kill everything. It's called Rampancy, and it includes four stages: Melancholia, Anger, Jealousy, and finally Metastability. The first has them moping about not being human, the second has them actively angry about it (which normally causes them to try to slaughter people), and the third has them growing jealous of humans. The final stage is the Holy Grail of AIs: the AI accepts its lack of humanity and transcends its desire for it, achieving true sapience. It is very strongly implied that Cortana was Metastable. As for 343 Guilty Spark, well... he skipped Melancholia.
These were pulled wholesale from Bungie's earlier Marathon series, in which the AI Durandal and his continuing psychotic break/growth into his own individual person drives the plot. You spend much of the games as his errand boy and captive audience to his philosophical ranting. He gets better and less rant-y after he gets over the Anger stage, but you still spend most of your time as his favored pawn and his favorite person to explain his new personal epiphanies to.
Sovereign: Organic life is nothing more than a genetic mutation. An accident. Your lives are measured in years and decades. You wither, and die. We are eternal. The pinnacle of evolution and existence. Before us, you are nothing.
For that matter, it applies to all artificial life, at least in the first game. If you argue in favor of robot rights, nobody is going to take your side, you get Renegade points for refusing to hand over information that could allow a genocide of the geth, and the only other AI you get to talk to would rather blow itself up than listen to you, no matter what you say.
All of this is subverted to Hell in Mass Effect 2 with EDI (your ship's AI) and Legion, your geth teammate, who reveals that the geth you've been fighting are a splinter faction.
Mass Effect 3 continues the theme; both sympathetic and antagonistic characters have trouble with the idea of synthetics being truly "alive". You expect it from Admiral Xen, but it's more of a shock to hear from Dr. Chakwas. This line of thinking is prevalent to the point that deciding to let the quarians kill the geth meets with almost unanimous approval from your crew, with the exception of EDI and Liara (who considers the geth powerful allies but is undecided on whether they could achieve sentience).
Mega Man Zero: For this reason alone, Dr. Weil started the Elf Wars, which more or less caused a post-Colony Drop world to become an even more Crapsack World. Because of this, he is directly responsible for almost everything bad that ever happened in the whole series, and indirectly responsible for the rest, such as Copy X being made because the original X's body was being used to seal the Dark Elf. This is more Fantastic Racism, though, as Reploids are Ridiculously Human Robots.
Super Robot Wars: This is what Vindel Mauser thought of Lemon's W-series overall. Before his retcon, Axel Almer used to have the same mindset (only maybe more extreme), but after the retcon, he got better. Duminuss also utters this to Lamia Loveless if they ever meet in battle, which she vehemently denies.
Tekken: Jin's response after Alisa gets beaten to crap by Lars to the point of shutting down is this: "Good riddance. I should've built one that protects me better." Lars doesn't take it well.
KOS-MOS of Xenosaga is often thought of as just a machine (and for most of the series, she is).
In Mother 3, Porky believes that the Masked Man (in reality a brainwashed Claus) is nothing more than his robot slave.
In Fallout 3, there is a man trying to get an escaped android he owned returned to him. If asked if this is cruel, he'll claim that you can't enslave a robot any more than you can enslave a toaster or a water purifier. The android itself, it must be noted, disagrees and finds human allies who share its views.
In the Lonesome Road DLC of Fallout New Vegas, ED-E reveals it was painfully experimented on by the orders of Colonel Autumn, much to the outrage of its creator Dr. Whitley - and possibly the Courier.
If you ask Trudy, the bartender in Goodsprings, what she knows about Victor (a robot with a cowboy personality who saved your life) she will consistently refer to him as "it" even when you refer to him as "he".
Ulysses really seems to hate ED-E. Just listen to the scorn in his voice when he says "that machine".
In Virtue's Last Reward, when Luna was presented to a young Kyle after he asked Dr. Klim for a mother, the child refused to acknowledge her, seeing her as just a robot who couldn't really feel. He interpreted her genuine feelings of sadness as "just clever programming".
In Artifice, two security guards taunt the android soldier Deacon in the opening scene, referring to him as just "an appliance".
In Freefall, Florence Ambrose (an anthropomorphic red wolf) is classified as an AI, and as such is treated as Just a Machine by a few, especially the mayor!
Mayor: See? It's made out of carbon and proteins, but it's just a machine. Now do you feel less guilty about giving it orders?
Mayor's aide: I guess. Still, it seems so lifelike.
It is worth noting that this gave the Mayor a very nasty Kick the Dog moment for some... in a humor comic, much to the surprise of the author.
The whole Gardener in the Dark plot revolves around this. One of the executives at the company which makes and owns the robots has planned a forced upgrade that will lobotomize them and return them to non-sentience. Mr. Kornada is doing this purely to make an obscene, economy-shattering profit and sees them all as this trope, even twisting the three laws to get his own robot assistant to help him pull it all off. Of course, there's not much indication that he sees people as much better than objects, either.
When the mayor learns of the update (though not the motivation behind it), she gets another Kick the Dog moment by choosing to do nothing about it, in order to prevent human obsolescence.
Pierrot: I guess you're the closest thing I have to someone who cares on this space station.
Potty-bot: I was programmed to care! I'm a product of Wastebiotics Brr-buhm-brrrrrrrrrrrr! Specializing in emotions people are algorithmed to empathize with!
After the AI War told in the written backstory of Cwynhild's Loom, robot development is restricted to prevent any machine from reaching sentience or looking human.
From Homestuck, a frustrated Sollux says this to Aradia, right before she explodes.
AA: but this is hard f0r me
TA: how ii2 iit hard.
TA: you are a tiin can, robot2 don't have feeliing2.
It's clear he doesn't mean it, however; he's pretty torn up about her exploding a few seconds later.
Later in Act 6, this applies to Dirk's autoresponder. Jake thinks at first that the autoresponder is just some elaborate pranking machine made by Dirk to screw with him. It doesn't help that the autoresponder has a marked tendency to hit on Jake constantly, nor that he's also just plain kind of a dick. The AR does manage to convince him otherwise, though, and Jake is suitably guilty about it all.
TT: I think you knowingly confuse the field of robotics and artificial intelligence to engender some sort of cavalier attitude about technology that a rough-and-tumble guy who's all about brawling and fisticuffs would probably have, and if this is cultivated to a humorous effect then I commend you.
TT: But you're wrong.
TT: I do have feelings. And you're shitting on them.
In one episode of My Life as a Teenage Robot, Jenny is told this when she encourages the robots at a carnival to seek freedom. Also a subversion, because these robots are in fact completely incapable of doing anything but running amusement park rides, and they wreak havoc trying to be "free".
The show itself seems to take a sliding scale view of sentience. Robots, like the carnival robots, are 'just machines' because they haven't got the appropriate functions like Jenny does but no one ever treats Jenny like she's 'just a machine'. (Hell, one guy fell in love with her, god knows how he thinks THAT will work out.)
There's at least one or two episodes of Teen Titans all about Cyborg realizing he's "more than just a robot".
Averted in the Transformers metaseries. While some ill-informed fleshlings are so foolish as to refer to Cybertronian life as "just machines", it is an established fact, proven several times over, that Transformers have souls (they call them Sparks, and they have a special container in their chest to hold them), an extant God (Primus, whose sleeping body is the Transformer homeworld of Cybertron), and an afterlife (the Well of All Sparks, where All are One; proven, but nonetheless mysterious). Interestingly, none of the above is established for the aforementioned fleshlings, meaning that, given the evidence, it is entirely possible that the machines are more "human" than the humans, by the definitions humans use.
The robots built by Sumdac's company in Transformers Animated to perform manual labour and generally run Detroit, however, are indeed just machines. At one point, Soundwave attempts to have these robots revolt, believing that logically humanity ought to serve robots. Upon enacting his plan, Sari is quick to point out that the robots haven't gained sentience, they are simply following their programming; programming that Soundwave hacked.
Played for laughs in an episode of Robot Chicken, where a spoof of I, Robot had Rosie from The Jetsons being accused of murdering George. At her trial, Rosie claims to be innocent, and the judge remarks "Well, maybe, but just to be safe..."; Rosie is then promptly smashed.
Invoked in The Animatrix segment The Second Renaissance in the same way as the Outer Limits episode mentioned above: does a robot have rights, or can it be scrapped whenever the owner wants?
Zig-zagged in Roughnecks: Starship Troopers Chronicles, with the Cybernetic Humanoid Assault System, or C.H.A.S. Most of the troopers dismiss him as a troublesome (if highly competent) piece of equipment, but Higgins insists that C.H.A.S. should be made a member of the team. Towards the end of the episode, C.H.A.S. leads the squad out of a minefield ambush and performs a Heroic Sacrifice for Higgins. When Higgins tries to get C.H.A.S. to save himself, C.H.A.S. insists on this trope.