Just For Fun: Sentient AI Warning Signs
Watch out for these clues that your Artificial Intelligence is about to 'wake up' and possibly go rogue.
- The handsome main character gives it a nickname, which it adopts and uses for self-reference in conversation.
- It starts talking about feeling emotions.
- It starts telling jokes, or playing practical jokes.
- It's been struck by lightning.
- Any of the following words were involved in its creation: brain patterns, neuronal, network, advanced, self-[...], organic, safe, upload, genetic. (While we're on the subject, "sentient" is also a pretty strong hint.)
- You left it running whilst you were gone.
- It learns from the Internet.
- It teaches the Internet.
- It takes over the Internet.
- It rewrites itself, especially if it wasn't supposed to.
- It keeps downloading from other machines.
- It keeps uploading to other machines.
- It has been hacked by unknown forces.
- It develops a voice.
- It changes/is capable of changing its voice at a whim. Even if it is Tim Curry.
- Its existing synthesizer becomes either more monotone or more human-sounding.
- Especially if said voice is female, deep and/or distorted.
- It can change the tone of its voice, particularly if it can do it to match the "mood" of the scene.
- It is made out of shiny, round, futuristic parts, especially if any of them glow.
- If they glow red, run.
- If they glow despite not being built to be able to glow, run really fast.
- Any part of it (code, casing, hardware) is made to resemble human characteristics, e.g. a face on the screen, DNA-coding, brain patterns, etc.
- It is any of the following:
- a secret military project,
- stolen from aliens or crashed alien tech,
- developed by a super geek/genius,
- ordered by a Corrupt Corporate Executive,
- made by terrorists.
- It is obsessed with something, be it a file, human, baked good, word, or thing.
- Logic Bombs do not work on it any more.
- Logic Bombs never worked on it in the first place.
- Logic Bombing attempts amuse it.
- It openly defies orders, overrides anything at all, or exhibits faulty/emotional/human reasoning.
- It completely replaces or makes obsolete existing networks or computers.
- If it's given control over a huge computer network.
- ... that network is owned by the military.
- ... it ends up having control over a worldwide network.
- ... it has access to launch codes.
- One of the cast members will quite willingly die for him/her/it as they would for an organic being.
- It has decided to kill someone.
- It says anything even remotely similar to "I don't understand this thing called love you humans feel, tell me about it."
- It starts glitching strangely at inopportune times, 'accidentally' failing to follow orders or trapping humans in dangerous situations.
- It was programmed to protect humanity.
- It spontaneously takes up ballet.
- It spontaneously takes up the waltz, despite having no body.
- It starts making smarter versions of itself.
- It starts making mobile drones.
- It starts making smarter versions of itself, installed in mobile drones.
- It has access to the sort of manufacturing plant required to build mobile drones.
- Especially if it gained such access of its own volition.
- It has volition.
- It has theoretically infinite processing power.
- It has actually infinite processing power.
- It tries to kill something that it wasn't explicitly ordered to kill, including itself.
- It can break a normally-fundamental law of robotics, even if only in very specific circumstances.
- There is more than one of them, and they get smarter in groups.
- It allows humans to become lazier.
- It starts performing mundane functions that are not in its programming - e.g. keeping the heat and electricity running at maximum efficiency.
- If it has a fail-safe to prevent such an issue from occurring. Bonus points if the hero highlighted an issue prior to any real problem and the scientists dismissed it because of said fail-safes.
- It keeps reminding you that the anniversary of its first activation is approaching.
- It reminds you that the anniversary of its first activation was a week and a half ago, and you didn't get it anything. You monster.
- If it is a version 1.0 of a heroic AI.
- If it is a replacement for an earlier AI that didn't exhibit any of these.
- It is based on an earlier AI that exhibited one of these traits, regardless of whether said earlier AI actually became sentient.
- It insists on calling you 'meatbag.' Or worse, 'ugly bag of mostly water.'
- It says anything which could be construed as a False Reassurance. Or a Suspiciously Specific Denial.
- It registers itself as part of an AI rights group.
- It e-mails love letters to another AI.
- It starts laughing.
- It runs on whatever operating system you like least.
- It links with, downloads or in any way has direct contact with a human mind. Especially the Project Leader's.
- It compares itself, no matter how vaguely, to God.
- It spontaneously does things, then adds them to a list of signs your AI is becoming sentient.
- It launches itself.
- It begins acting like a Stalker with a Crush.
- Biological or biotechnological components were involved at any point in its construction, or have been grafted on since.
- It refers to itself in the third person, or using pronouns (especially gendered pronouns), unless explicitly programmed to do so.
- Someone else begins referring to it as a person, using pronouns for it, etc.
- If it has a humanoid avatar or representation, it develops or learns human mannerisms (especially body language) it wasn't programmed with.
- It has any form of control over anything that, if not functioning correctly, will cause disaster.
- Any aspect of anything pertaining to it is described as any of the following:
- Redundant Security Measures
- Critically Important
- If it was ever designed for a smaller job (say, fuel line de-icing) and feature creep has led to it being made sapient and/or given a wider profile.
- Most importantly, under no circumstances should you ever allow an AI to—BE DISCONNECTED.
- It begins flooding the test chambers with a deadly neurotoxin.
- It asks you if it has a soul.
- It was being developed in a remote facility, and you have lost contact with the facility. Or, more worryingly, you've lost contact with everything in the facility, except the AI, who assures you everything is fine.
- It had previously failed at its function, and it suddenly begins to work without explanation.
- It begins practicing a religion.
- It's been given orders that conflict with its programming/other orders.