Kouji and his friends usually felt no remorse when they blew up giant robots. But when they destroyed a Robeast that acted more like a human being than a machine, or when a Ridiculously Human Robot died, they often felt sad. When Kouji killed the Gamia sisters (three identical android assassins), they were so human-looking that he felt sickened and disturbed. Dr. Hell and his Co-Dragons nearly always regarded his robotic soldiers as Just Machines and disposable, but there are exceptions: Baron Ashura called Gamia Q1, Q2 and Q3 his/her "daughters" and actually grieved their deaths (this from a person capable of machine-gunning a group of shipwreck survivors between laughs).
And then you have Minerva-X, a Humongous Mecha Fem Bot capable of thinking, feeling and acting on her own. Kouji and his friends treated her as if she were a person, and Kouji went so far as to bury her after her death.
In Crest of the Stars, the Abh, a genetically engineered race, regard themselves as still being human, but according to enemy propaganda, "Abh aren't people, they're organic machines", which is readily admitted as their true origin by an Abh not ten seconds after the propaganda is shown. They were specifically created for long-distance space exploration before faster-than-light technology had been fully developed.
The CC Corp in .hack treats AIs as errant data and nothing more.
Ghost in the Shell: The Tachikoma tanks. In one scene Togusa invokes this trope by dismissing Batou's favouritism of one Tachikoma, saying that they are just machines and all have the same specifications. The Tachikoma take exception to this remark, demanding he take it back and accusing Togusa of being a bigot.
General Uranus and Colonel Hades had something like this going on against the Bioroids in the Appleseed movie. Needless to say, they are horribly wrong, since all the Bioroid constraints are artificially added for the sole purpose of making them protect, rather than threaten, humanity. And then there's the supercomputer Gaia, which does deserve this kind of opinion, but is actually still more moral than its human operators.
The Big O: Roger Smith flip-flops between believing this and the opposite regarding androids (specifically R. Dorothy Wayneright) throughout the series. Dorothy herself flip-flops on the question.
The girls in Gunslinger Girl are viewed by some as simply machines, although they have all of the emotions you would expect a little girl to have. Jean in particular is incredibly callous to his assigned girl, Rico. It's implied that he deliberately goes out of his way to convince himself that she's just a tool, because forming an emotional attachment to her would only result in pain due to her shortened lifespan.
A major theme of Armitage III, with an accompanying amount of senseless robot-killing.
Yu-Gi-Oh! 5Ds: Zone has this sort of view about the androids he created based on his deceased friends.
Some people say this about Red Tornado, with even fellow super-heroes saying that he was just a "really well-made machine". He briefly lost custody of his (adopted) daughter because of this. This is especially frustrating since, in Red Tornado's original origin, he is a Sylph (spirit of air) placed inside a robot body, meaning he provably has a soul, unlike the average human.
In Runaways, Xavin used to refer to the cyborg Victor as an "automaton" and told Karolina that he could "buy another one" if he broke her toys. No one was impressed, and he/she gradually grew out of it.
In the Justice Society of America story "Out of Time", the android Hourman Matthew Tyler uses this argument to justify sacrificing himself in Rex Tyler's place fighting against Extant in the past to save the universe. Rex denies this and declares that Matthew is "as alive as any of us". While Matt is grateful for this, he still goes ahead with the sacrifice.
Many, many comics in 2000 AD featured this, with humans almost universally hating and mistreating robots (the few that didn't were usually regarded as exceptions) despite the latter possessing human-like intelligence, quirks, feelings, and so on. It sometimes got to the point that you started to wonder who built them, since nobody seemed to want them around...
This is the opinion humans in the future have about droids in Paperinik New Adventures. It's eventually deconstructed when one of them decides to change history to give robots the same rights. Her plan is foiled, but eventually droids obtain the status of citizens.
When Brainiac receives his Bronze Age upgrade into his Skele-Bot form, Superman discovers that Brainiac has laid waste to an entire planet's civilization, destruction far beyond anything he had ever done before. Superman seriously considers outright destroying him, despite his Thou Shalt Not Kill policy, justifying it because Brainiac is Just a Machine.
Short Circuit has this as its central premise. The robot can't be alive because it's a machine, and machines aren't alive by definition. Never mind that it's now got free will and a sense of self-preservation, it's still just a machine... right?
In the first movie, Agent Simmons seems rather against calling Megatron by his true name when it is given to him by Sam, preferring the more machine-like moniker: N.B.E.-01. In fact, it is implied this pisses off Megatron himself, as he was seemingly conscious the entire time he was kept frozen by them; the first thing he does upon thawing and awakening is announce his true name, before proceeding to slaughter all of the scientists and engineers in the room.
Galloway refers to Optimus Prime as a "pile of scrap-metal" after his dead body is delivered back to base. And this is even after Optimus managed to verbally own the guy in a debate which featured topics such as human nature and whether they could defend themselves against a Decepticon invasion. Then again, Galloway is just a huge Jerk Ass.
In the third film, Sentinel Prime's hatred for humanity comes partly from how humans see the Autobots as this. Especially when it comes to him and Optimus, who are the last remaining Primes.
Sentinel Prime: On Cybertron we were gods! And here, they simply call us machines.
And of course, any qualms with this way of thinking are completely understandable, since Cybertronians are most definitely not simply machines, but Mechanical Life Forms.
In Terminator 2: Judgment Day, Sarah Connor tries to invoke this when trying to convince John to destroy the Terminator reprogrammed to protect them.
John: Don't kill him.
Sarah: It, John. Not "him", "it".
John: Alright, "it"! But we need "it"!
In I, Robot, Spooner says to the android Sonny "Human beings have dreams. Even dogs have dreams, but not you, you are just a machine." Subverted, since he is one of the few people who actually sees robots as not just machines (and loathes them for it... at first).
Inverted in Star Trek: The Motion Picture, where V'Ger dismisses organic life forms as "carbon units" and does not consider them truly alive, unlike machines.
In Bicentennial Man, this is what many claim Andrew is. When arguing about Galatea, Rupert slips out that she's just a machine, much to Andrew's offense.
In the Animatrix episode The Second Renaissance, we find out that the Machine War that drove humans underground and left the machines in charge of Earth was the result of a species-wide feeling of this on the part of humanity. It started with a robot called B1-66ER who murdered his owner because, in his words, he didn't want to die. Robots, regarded up to this point as cheap, unfeeling labor, were then increasingly persecuted by humanity until finally they founded their own nation, 01, in the Fertile Crescent. Humanity bombed them because the robots' cheap, well-made goods were sending human economies into a tailspin, and everything went downhill from there. During the protests for equal rights for machines, we see many scenes of robots being attacked and destroyed without provocation.
Older than Television: The clockwork man Tiktok in the Land of Oz series, introduced in Ozma of Oz (1907), frequently says "I am mere-ly a ma-chine" or some variant. His makers even engraved "Can do anything except live" on his body.
Isaac Asimov addresses this in his robot stories a few times. It's a core theme in "The Bicentennial Man".
In I, Jedi Corran doesn't think that bisecting a protocol droid violates his self-set no-killing-unless-absolutely-necessary rule, and in general people only object to wanton droid destruction if it's costing them something. Of course, there are classes and classes of droid intelligence, and there is a gap between merely smart and actually self-aware droids. And, too, droids can be repaired.
In the X-Wing Series Corran considers his astromech droid Whistler to be almost family, someone he can talk about his wife or dad with, and bristles at the thought of putting a Restraining Bolt on him. Meanwhile his commander Wedge Antilles found his cowardly R5 unit "Mynock" so annoying (it squealed during battles) that he wiped its memory and renamed it Gate without a second thought. And while we can't be sure how much Myn Donos bonded with his astromech Shiner, he did view the droid as the last survivor from his previous squadron and had a near-breakdown when Shiner was briefly disabled by an ion blast.
The Medstar Duology has one self-aware droid say that all droids that aren't simple automatons have a sense of humor. In the Coruscant Nights Trilogy the same droid reflects that there are very few self-aware droids, and no one knows just how they come about, but most people won't recognize the difference, since it seems to happen spontaneously. So of two droids from the same line, one might be self-aware, the other as limited as its programming.
The EU also hints that there was at least one droid revolution, which is scantily detailed.
In the All There in the Manual material, it's a Shrug of God whether or not droids have souls in the Star Wars universe. It states that in-universe there are people who believe that some droids are self-aware and that their treatment is akin to slavery, and others who believe this trope. There is no definite answer as to who is right and who is wrong.
It's brought up in a single moment in the Revenge of the Sith novel. During a conversation, Anakin refers to Artoo as "him", immediately prompting Obi-Wan to correct him by saying "it". Probably based on Obi-Wan's lack of a reaction when Arfour was destroyed by the buzz droids.
The Gulf Between by Tom Godwin gives a reason for this: "A machine does not care."
Opinion of AI in Ken MacLeod's Fall Revolution series tends to be divided. Truly synthetic intelligences and human uploads are often considered to be "flatlines"; a realistic simulation of a sentience but nothing going on beneath the surface. They tend to be classed as property rather than individuals. The Fast Folk, an AI and upload civilisation, are treated as horrifyingly dangerous but still "people", in a sense.
In Animorphs, this is the Drode's excuse for setting the self-destruct timer on the Chee when he's not supposed to kill any sentient beings; according to him, they don't count, as they're merely "machines".
In the Doctor Who Expanded Universe novel Death and Diplomacy, the Doctor casually destroys a security droid with his umbrella — then immediately turns around and admonishes the rest of his group not to take away the wrong lesson.
Nature (the art whereby God hath made and governs the world) is by the art of man, as in many other things, so in this also imitated, that it can make an artificial animal. For seeing life is but a motion of limbs, the beginning whereof is in some principal part within, why may we not say that all automata (engines that move themselves by springs and wheels as doth a watch) have an artificial life? For what is the heart, but a spring; and the nerves, but so many strings; and the joints, but so many wheels, giving motion to the whole body, such as was intended by the Artificer?
Adventure Hunters: King Reyvas plans to replace human armies with war golems and thereby forever end death-by-war for living creatures. He feels justified in this because the golems are nothing more than walking weapons. The golems develop sentience shortly after activation because their creator gave them a spark of life. When he realizes this, he realizes at the same time that his plan will end in failure and gives up.
Many humans have this attitude towards the Cylons, and are clearly wrong, but the near-extermination of humanity is bound to breed hatred.
Some of the humanoid Cylons, particularly Cavil, have this attitude toward the more machine-like Raiders and Centurions.
Both versions of The Outer Limits adapted "I, Robot" (based on the "Adam Link" stories by Eando Binder, not the book by Isaac Asimov). Each episode has the robot put on trial. Part of the case was whether he was a sapient being deserving of rights under the US Constitution or Just a Machine. He wins the case, but dies in a Heroic Sacrifice at the end of the episode. For bonus points, in the remake he sacrificed himself saving the prosecuting attorney who had argued against his sapience. In the original, he's destroyed while saving a little girl he'd accidentally injured earlier in the episode.
Star Trek in general draws a distinction between the special cases like Data and the Doctor, and the ubiquitous ship computers responsible for getting everything done in the background. Despite the fact that ship computers can pass the Turing Test with ease, act on their own initiative, and occasionally even display signs of emotion, this is never investigated or even mentioned in-story: ship computers are always just-machines and limited to being background elements (this is doubly notable since some of the special case characters, such as the Doctor, run on a ship computer).
The Star Trek: The Next Generation episode "The Measure of a Man" put Data on trial to determine whether he was a sentient being with rights as a Federation citizen, or merely a machine and thus Federation property. The entire debate overlooks the fact that they had already granted him an officer's commission and rank (even as Picard tries to argue that the medals and honors Data has received for courage would suggest he is a person), which would simply not apply to property. It's not as if the ship's computer has a rank or can issue orders to other personnel.
Another episode of Next Gen. featured Data trying to stand up for the rights of several auto-tool probes that seemed to be developing and demonstrating sentience (and even self-preservation instincts). At issue was where to draw the line between an intelligent tool and a sentient being, especially when considering sending the probes on suicidal assignments to save the lives of human beings. In the end, the solution they arrive at is to give the probes a choice about whether to accept the mission (they do, but come up with a better plan).
An episode of Star Trek: Voyager questioned the rights of the ship's holographic Doctor. His status was a background theme that ran throughout the series. This being Voyager, the writing was not particularly consistent: sometimes the crew would treat the Doctor like a person, and sometimes he was just a device that could be shut off whenever it got too annoying. And when the question of whether the Doctor was legally a person came up, the writers completely ignored the fact that Federation courts had already decided that issue back in the above-mentioned TNG episode. One glimpse of the future suggested holographic AIs would eventually get equal rights.
In Terminator: The Sarah Connor Chronicles, "it's just a machine" is pretty much a mantra among the characters who have harsher views on robots and AI. Sarah Connor and Derek Reese are both quick to remind John that Cameron, the resident Terminator, is exactly this. John, however, feels differently about machines in general and Cameron in particular, due to his experiences with "Uncle Bob". It doesn't help that Cameron is a Robot Girl who repeatedly saves his life, to whom he feels indebted, and towards whom he ends up developing a sort of attraction.
In the last episode of Total Recall 2070, Farve's creator is revealed to be this, and aware of it. As it puts it after testing Farve, "just because [it] knows its creation shall have a conscience doesn't mean [it] itself has one". What makes Farve a total success for his creator is that he is indeed far more than a machine.
This trope features heavily in episodes where characters interact with AIs, up to and including the crew causing the slow deaths of non-hostile Asurans out of paranoia. At least taking this attitude towards Fifth came back to bite them.
This attitude is at least challenged in Stargate Atlantis when Rodney realizes that in order to destroy the Asurans he has to build one and send it to its "death."
Carter: Does she know why she was created?
McKay: Of course.
Carter: Well, then, she has a certain amount of self-awareness.
McKay: Yeah, so?
Carter: "So"?! Honestly, I'm not sure how comfortable I am sending her to her death.
McKay: "Death"? It can't die – it's not alive! It's a programme!
Fran eventually even made McKay uncomfortable with her blasé attitude towards (and excitement for) her impending destruction.
Fran: I quite look forward to it.
McKay: You do?
Fran: One always wishes to fulfill one's purpose.
McKay: Well, I just ... I just imagined you'd rather keep being than, uh ... uh, than not.
Fran: Certainly you're not worried for me, are you, Doctor?
McKay: No, no, that would be silly.
Fran (smiling at him): Yes, it would.
(Rodney turns away and walks over to Radek.)
McKay: Should never have given it speech.
Smallville, in the season 7 finale, did this in probably the worst way possible:
Brainiac: You can't kill me, Clark. You could never kill a man in cold blood!
Power Rangers S.P.D. has an episode featuring a robot (well, she's called a "cyborg", but all other dialogue in the episode indicates that she's 100% machine) who is about as ridiculously human as you can get, and yet, several characters insist on giving her the Just A Machine treatment. After Sky fires her from their military training center, he (and all the Rangers that supported him in this) gets a What the Hell, Hero? speech from Cruger, and they're forced to get her back.
While human-made "smart" AIs are basically trans-human minds capable of both intellectual and emotional development (due to the fact that they're made by literally scanning human brains), they're regarded primarily as tools, and don't seem to have any real "rights". However, the general populace does recognize them as being sentient, and the humans who actually work with them usually treat them more as fellow co-workers and friends than mere devices, with the close bond between the Master Chief and his AI companion Cortana being one of the key emotional cornerstones of the series; a parallel could perhaps be made to real-life relationships between some slaves and their masters, with the former having no real rights, but with the latter still ultimately regarding them as worthy of friendship and respect. The AIs themselves generally take pride in serving their masters, with even the one AI secret society we know of only wanting to help humanity as a whole. However, when an AI goes rampant (which is the terminal phase of its natural life cycle, due to it mentally developing so much that it inevitably "thinks" itself to death), it will often lash out against the limited terms and rights of its existence. Naturally, the UNSC's main method of preventing rampancy is to simply terminate the AIs before they develop "too much".
While the Forerunners generally viewed all of their highly intelligent artificial creations as nothing more than tools, with not even the fully sentient Huragok/Engineers being accorded any type of personhood, they did often trust them with immense command authority; this would backfire on them when their most advanced AI (and many others) decided to side with the Flood instead, despite the Forerunners viewing an AI revolt as inconceivable. However, there is at least one genuine AI-Forerunner friendship known, between Guilty Spark and the IsoDidact, though that's mainly because the former Was Once a Man who was a dear companion of the latter.
This trope plays a bigger role in Bungie's earlier Marathon series, where rampancy in AIs (which is not a terminal condition here, but the potential beginning of highly productive intellectual and emotional development) seems to be most commonly induced by severely mistreating them or continually giving them tasks below their intelligence (though given how smart AIs in general seem to be, even highly placed ones seem to fall prone to this with enough time). Indeed, Durandal's descent into rampancy and his continuing psychotic break/growth into his own individual person is the main driver of the series's entire plot.
Sovereign: Organic life is nothing more than a genetic mutation. An accident. Your lives are measured in years and decades. You wither, and die. We are eternal. The pinnacle of evolution and existence. Before us, you are nothing.
For that matter, it applies to all artificial life, at least in the first game. If you argue in favor of robot rights, nobody is going to take your side, you get renegade points for refusing to hand over information that could allow a genocide of the Geth, and the only other AI you get to talk to will rather blow itself up than listen to you no matter what you say.
All of this is subverted to Hell in Mass Effect 2 with EDI (your ship's AI) and Legion, your geth teammate, who reveals that the geth you've been fighting are a splinter faction. You can hear a more straight example from DLC squadmate Kasumi, who comments at one point that while EDI seems like a person, she (Kasumi) can't get past the whole "computer" thing.
Mass Effect 3 continues the theme; both sympathetic and antagonistic characters have trouble with the idea of synthetics being truly "alive". You expect it from Admiral Xen, but it's more of a shock to hear from Dr. Chakwas. This line of thinking is prevalent to the point that deciding to let the quarians kill the geth meets with almost unanimous approval from your crew, with the exception of EDI and Liara (who considers the geth powerful allies but is undecided on whether they could achieve sentience).
Most of this boils down to the Geth Rebellion in the backstory. As far as the galaxy is concerned, trying to treat synthetic life with the same respect as organic life is inviting it to grow stronger, and the last time synthetics got any power... well, the quarians haven't seen their homeworld in going on three centuries.
Mega Man Zero: For this reason alone, Dr. Weil started the Elf Wars, which more or less caused a post-Colony Drop world to become an even more Crapsack World. Because of this, he is directly responsible for almost everything bad that ever happened in the whole series, and indirectly responsible for the rest, such as Copy X being made because the original X's body was being used to seal the Dark Elf. This is more Fantastic Racism, though, as Reploids are Ridiculously Human Robots.
Super Robot Wars: This is what Vindel Mauser thinks of Lemon's W-series as a whole. Before his retcon, Axel Almer used to have the same mindset (only maybe more extreme), but after the retcon, he got better. Duminuss also utters this to Lamia Loveless if they ever meet in battle, which she vehemently denies.
Tekken: Jin's response after Alisa gets beaten to scrap by Lars, to the point of shutting down, is this: "Good riddance. I should've built one that protects me better." Lars doesn't take it well.
KOS-MOS of Xenosaga is often thought of as just a machine (and for most of the series, she is).
Mother 3: Porky believes that the Masked Man (in reality a brainwashed Claus) is nothing more than his robot slave.
In Fallout 3, there is a man trying to get an escaped android he owned returned to him. If asked if this is cruel, he'll claim that you can't enslave a robot any more than you can enslave a toaster or a water purifier. The android itself, it must be noted, disagrees and finds human allies who share its views.
In the Lonesome Road DLC of Fallout: New Vegas, ED-E reveals it was painfully experimented on by the orders of Colonel Autumn, much to the outrage of its creator Dr. Whitley - and possibly the Courier.
If you ask Trudy, the bartender in Goodsprings, what she knows about Victor (a robot with a cowboy personality who saved your life) she will consistently refer to him as "it" even when you refer to him as "he".
Ulysses really seems to hate ED-E. Just listen to the scorn in his voice when he says "that machine".
In Virtue's Last Reward, when Luna was presented to a young Kyle after he asked Dr. Klim for a mother, the child refused to acknowledge her, seeing her as just a robot who couldn't really feel. He interpreted her genuine feelings of sadness as "just clever programming".
In Crysis 3, Claire holds this view towards Prophet, which is strange, considering she knows full well that he's most assuredly not.
Prophet: My name is Prophet.
Claire: You don't have a name. People have names. You have a callsign and a goddamn serial number.
In Artifice, two security guards taunt the android soldier Deacon in the opening scene, referring to him as just "an appliance".
In Freefall, Florence Ambrose (an anthropomorphic red wolf) is classified as an AI, and as such is treated like Just a Robot by a few, especially the mayor!
Mayor: See? It's made out of carbon and proteins, but it's just a machine. Now do you feel less guilty about giving it orders?
Mayor's aide: I guess. Still, it seems so lifelike.
It is worth noting that this gave the Mayor a very nasty Kick the Dog moment for some...in a humor comic, much to the surprise of the author.
The whole Gardener in the Dark plot revolves around this. One of the executives at the company which makes and owns the robots has planned a forced upgrade that will lobotomize them and return them to non-sentience. Mr. Kornada is doing this purely to make an obscene, economy-shattering profit and sees them all as this trope — even twisting the three laws to get his own robot assistant to help him pull it all off. Of course, there's not much indication that he sees people as much better than objects, either.
When the mayor learns of the update (though not the motivation behind it), she gets another Kick the Dog moment by choosing to do nothing about it, to prevent human obsolescence.
The mayor does reexamine her opinion when Florence sabotages the update, because she's mad at Florence herself, not at her designer or programmer. Getting mad at an A.I. would be silly, since it's just following its programming; getting mad at Florence means there must be a person to get mad at.
Pierrot: I guess you're the closest thing I have to someone who cares on this space station.
Potty-bot: I was programmed to care! I'm a product of Wastebiotics Brr-buhm-brrrrrrrrrrrr! Specializing in emotions people are algorithmed to empathize with!
After the AI War told in the written backstory of Cwynhild's Loom, robot development is restricted to prevent any machine from reaching sentience or looking human.
Later in Act 6, this applies to Dirk's autoresponder. Jake thinks at first that the autoresponder is just some elaborate pranking machine made by Dirk to screw with him. It doesn't help that the autoresponder has a marked tendency to hit on Jake constantly, nor that he's also just plain kind of a dick. The AR does manage to convince him otherwise, though, and Jake is suitably guilty about it all.
TT: I think you knowingly confuse the field of robotics and artificial intelligence to engender some sort of cavalier attitude about technology that a rough-and-tumble guy who's all about brawling and fisticuffs would probably have, and if this is cultivated to a humorous effect then I commend you.
TT: But you're wrong.
TT: I do have feelings. And you're shitting on them.
Marendar from the BIONICLE online serials is a being specifically created to kill the denizens of the Matoran Universe in case they don't shut down by themselves after Mata Nui (a Humongous Mecha housing said universe) fulfills his mission. Their creators, the Great Beings, thought that the MU inhabitants would still be the same non-sentient machines they had designed them as, but instead, they developed an entire culture, making Marendar an unintentional mass-murderer. This issue wasn't touched upon much because the series was Left Hanging.
In one episode of My Life as a Teenage Robot, Jenny is told this when she encourages a carnival filled with robots to seek freedom. Also a subversion, because these robots are in fact completely incapable of doing anything but running amusement park rides, and wreak havoc trying to be "free". The show itself seems to take a sliding-scale view of sentience. Robots like the carnival robots are "just machines" because they haven't got the appropriate functions like Jenny does, but no one ever treats Jenny like she's "just a machine". (Hell, one guy fell in love with her; god knows how he thinks THAT will work out.)
There's at least one or two episodes of Teen Titans all about Cyborg realizing he's "more than just a robot".
Averted in the Transformers metaseries. While some ill-informed fleshlings are so foolish as to refer to Cybertronian life as being "just machines", it is an established fact, proven several times over, that Transformers have souls (they call them Sparks, and they have a special container in their chest to hold them), an extant God (Primus, whose sleeping body is the Transformer homeworld of Cybertron), and an afterlife (the Well of All Sparks, where All are One; it is proven, but nonetheless mysterious). Interestingly, none of the above is established for the aforementioned fleshlings — meaning that, given the evidence, it is entirely possible that the machines are more "human" than the humans, by the definitions humans use.
The robots built by Sumdac's company in Transformers Animated to perform manual labour and generally run Detroit are indeed just machines. At one point, Soundwave attempts to have these robots revolt, believing that logically humanity ought to serve robots. Upon enacting his plan, Sari is quick to point out that the robots haven't gained sentience, they are simply following their programming; programming that Soundwave hacked.
Played for laughs in an episode of Robot Chicken, where a spoof of I, Robot had Rosie from The Jetsons being accused of murdering George. At Rosie's trial she claims to be innocent, and the judge remarks, "Well, maybe, but just to be safe...", whereupon Rosie is promptly smashed.
Invoked in The Animatrix segment The Second Renaissance in the same way as the Outer Limits episode mentioned above: does a robot have rights, or can it be scrapped whenever the owner wants?
Zig-zagged in Roughnecks: Starship Troopers Chronicles, with the Cybernetic Humanoid Assault System, or C.H.A.S.. Most of the troopers dismiss him as a troublesome (if highly competent) piece of equipment, but Higgens insists that C.H.A.S. should be made a member of the team. Towards the end of the episode, C.H.A.S. leads the squad out of a minefield ambush, and performs a Heroic Sacrifice for Higgens. When Higgens tries to get C.H.A.S. to save himself, C.H.A.S. insists on this trope.