In the first annual Loebner Prize contest to find the most humanlike chatbot, the winning program succeeded in part because it could imitate human typing errors. One runner-up earned its high score by pretending to be a paranoid autistic seven-year-old. The Economist's use of the term "artificial stupidity" to describe the winner's technique may be the Trope Namer.
In Epic Games's documentation of the Unreal Development Kit's AI, they state that in their games (the Unreal series and Gears of War), they have to balance artificial stupidity and artificial intelligence to make their bots feel human: too much intelligence and it's obvious you're playing against a flawless machine ("Perfect aim is easy, but missing like a human player is hard."); too much stupidity, even if it would be realistic for a human player, and people think the AI is just dumb. They said that, during playtesting for Unreal Tournament III, one of their designers complained about how poorly the AI was faring on a particular map, not realising he'd been facing human players.
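The trade-off Epic describes (perfect aim is trivial, human-like misses are not) is commonly implemented by perturbing a perfect solution rather than computing a worse one. A minimal sketch of that idea; the Gaussian error model, the 10° maximum spread, and the function name are assumptions of this example, not Epic's actual code:

```python
import math
import random

def humanized_aim(target_angle, skill, rng=random):
    """Return target_angle (radians) plus a random wobble that shrinks to
    zero as skill approaches 1.0.

    skill: 0.0 (novice bot, large wobble) .. 1.0 (perfect aim, no error).
    The Gaussian model and 10-degree worst case are illustrative guesses.
    """
    max_error = math.radians(10.0)      # assumed worst-case spread for a novice
    sigma = max_error * (1.0 - skill)   # better bots wobble less
    return target_angle + rng.gauss(0.0, sigma)
```

At skill 1.0 the wobble vanishes entirely, reproducing the "flawless machine" the designers were trying to avoid; dialing skill down produces misses without ever touching the targeting logic itself.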
Played for Laughs by the annual Baca Robo Contest, which in 2010 was held in Budapest. The goal for the participants is to create the most ridiculous robotic creation possible; the one that gets the most laughs from the audience wins a €2,000 prize. Of course, here the Artificial Stupidity is entirely intentional.
Norton has repeatedly been accused of being intentionally bad software. It's often regarded as a case of actual malware (and it is genuinely harmful software, far worse than most viruses even when working as designed) that flies under the radar thanks to taking Refuge in Audacity and selling itself as boxed copies.
Probably the worst Epic Fail in the history of computer chess occurred in the game between COKO III and GENIE at the 1971 ACM North American Computer Chess Championship. COKO had captured all of Black's pieces, trapped the Black king, and was all set to deliver checkmate. But COKO overlooked a mate in one for seven moves in a row, shuffling the White king back and forth instead. GENIE's response to this indecisiveness was to push its Black pawns until one became a queen, which it exchanged for all the White pieces and a couple of pawns. By the time Black was about to queen another pawn, COKO's programmers resigned.
The grammar checker in Microsoft Word is always drawing green lines under your sentences, but the suggestions it makes (if any) almost never make sense in context or scan in a way that would sound right to a native English speaker. And then there's Clippy... Oh Clippy...
Most of the time, the grammar error given is "Fragment (consider revising)", which doesn't explain much (it means the sentence isn't a complete one, but the checker is very picky about what it considers complete). As for Clippy, the line "It looks like you're writing a letter. Would you like some help?" is almost memetic for how reliably it irritates anyone trying to write anything in Word. Thankfully, you can disable the Office Assistant (of which Clippy is one of many), which many people did, to the point that later versions of Microsoft Word no longer include them.
It gets more jarring when you have Word correct a small grammar mistake, only for it to flag the entire sentence as bad. Needless to say, this is why you have human proofreaders go over your work.
On occasion, the grammar checker will identify a sentence as a grammar error, then, once the correction is applied, flag the corrected sentence as a grammar error too.
This may be an indication of how ridiculously complicated the rules of the English language are. There are so many exceptions and points where things don't make sense that you're bound to confuse the parser.
Occasionally it will confuse "its/it's" and "your/you're", or advise you to begin a sentence with a lower-case letter.
A big problem is that it assumes the word immediately preceding a verb is the verb's subject, so if it sees a sentence like "One of the children was drinking the blood of a nun," it will demand "... children were ...".
Non-electronic example! The Amazing Dr. Nim is basically a marble track with a number of gates that can either allow marbles to pass or block them, which lets it play a perfect game of Nim. To make it beatable, it includes an 'equaliser' gate: when switched on, this causes it to make a single non-optimal play over the course of the game, allowing a perfect human player to win an otherwise unwinnable game.
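Dr. Nim plays a single-pile game (take one to three marbles per turn; whoever takes the last marble wins), and perfect play is simply leaving the opponent a multiple of four. The equaliser then amounts to declining that winning move once per game. A sketch under those assumed rules; the function and parameter names are inventions of this example, not anything printed on the toy:

```python
def dr_nim_move(marbles, equaliser_used, equaliser_on):
    """Return (marbles_to_take, equaliser_used) for a single-pile Nim game
    where each turn removes 1-3 marbles and taking the last marble wins.

    Perfect play leaves the opponent a multiple of 4. With the equaliser on,
    the machine deliberately declines the winning move once per game.
    """
    optimal = marbles % 4
    if optimal == 0:
        # Lost position against perfect play: no winning move, take one marble.
        return 1, equaliser_used
    if equaliser_on and not equaliser_used:
        # The single non-optimal play: take a different amount than optimal.
        deliberate = 1 if optimal != 1 else 2
        return min(deliberate, marbles), True
    return optimal, equaliser_used
```

With the equaliser off, the machine always hands back a multiple of four and can only lose if it starts in one; switching the gate on gives a perfect human exactly one opening, matching the entry's description.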
An R2-D2 toy robot that is supposed to listen to human commands and perform various games and actions does nothing but spin in place, beep, and stare at its owner in confusion, no matter how clear your English is.
There's a Yoda toy that is supposed to train you in the ways of the Jedi. You make him go to sleep (turn him off) by setting him on his head and pressing his hand. He then immediately wakes up given the slightest provocation, or at complete random.
Photocopiers in general seem to select the wrong settings more often than the right ones.
In one somewhat infamous example, the Xbox Kinect's initial release caused quite a stir when an early review by GameSpot UK reported that the Kinect could not read the motions of two dark-skinned employees, while the white employees were registered just fine. Cue several websites and gaming magazines half-jokingly claiming that the Kinect was "racist".
The M247 Sergeant York anti-air vehicle was equipped with an automatic engagement system (DIVAD) so that it could target and destroy enemy planes faster than the crew could react. In one demonstration, the DIVAD was activated and immediately began aiming its loaded cannons at grandstands full of officers and politicians. The system had difficulty distinguishing between helicopters and trees. It locked onto latrine exhaust fans as "low priority" threats. It would undershoot ground vehicles by 300 m. And if it aimed up, the guns would disrupt the radar system. A plethora of mechanical and design issues (the pathetic radar couldn't detect a drone target until it had four radar reflectors mounted on it, water could foul the system, and it was slower than the vehicles it was designed to protect) led to the project being cancelled after only 50 vehicles were produced.