History ArtificialStupidity / RealLife

23rd Jul '16 9:19:58 PM Yinyang107


* The [[http://en.wikipedia.org/wiki/M247_Sergeant_York M247 Sergeant York]] AntiAir vehicle was equipped with an automatic engagement system (DIVAD) so that it could target enemy planes and destroy them faster than the crew could react. In a demonstration, the DIVAD was activated and [[DisastrousDemonstration immediately started to aim the loaded cannons at the grandstands full of officers and politicians]]. The system had difficulties distinguishing between helicopters and trees. It would undershoot at ground vehicles by 300m. And if it aimed up, the guns would disrupt the radar system. A plethora of mechanical and design issues - the pathetic radar couldn't detect a drone target until it had four radar reflectors on it, water could foul the system, and it was slower than the vehicles it was designed to protect - led to the project being canned after 50 vehicles were produced.

to:

* The [[http://en.wikipedia.org/wiki/M247_Sergeant_York M247 Sergeant York]] AntiAir vehicle was equipped with an automatic engagement system (DIVAD) so that it could target enemy planes and destroy them faster than the crew could react. In a demonstration, the DIVAD was activated and [[DisastrousDemonstration immediately started to aim the loaded cannons at the grandstands full of officers and politicians]] (there were only minor injuries). The system had difficulties distinguishing between helicopters and trees. It would undershoot at ground vehicles by 300m. And if it aimed up, the guns would disrupt the radar system. A plethora of mechanical and design issues - the pathetic radar couldn't detect a drone target until it had four radar reflectors on it, water could foul the system, and it was slower than the vehicles it was designed to protect - led to the project being canned after 50 vehicles were produced.
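The failure mode described here - a classifier keying on a couple of crude radar features that helicopter rotors, ventilation fans, and wind-blown trees all share - is easy to reproduce in miniature. A toy sketch (the feature names and thresholds are invented for illustration, not DIVAD's actual logic):

```python
# Toy threat classifier in the spirit of DIVAD's failure: it keys on
# echo strength and a "blade flicker" frequency alone, so anything that
# chops the air looks like a helicopter. Purely illustrative.

def classify(echo_strength: float, blade_flicker_hz: float) -> str:
    """Label a radar return using two crude features."""
    if blade_flicker_hz > 5.0 and echo_strength > 0.2:
        return "helicopter"   # ...or a fan, or a tree in the wind
    if echo_strength > 0.5:
        return "vehicle"
    return "clutter"

returns = {
    "hovering helicopter":  (0.6, 12.0),
    "tree in gusty wind":   (0.3, 8.0),   # swaying foliage modulates the echo too
    "grandstand with fans": (0.4, 9.0),   # rotating fan blades flicker like rotor blades
}
for name, features in returns.items():
    print(f"{name:22s} -> {classify(*features)}")
```

All three returns come back "helicopter" - once the features can't separate the classes, no amount of automation fixes the aim.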
5th Jul '16 6:39:03 AM longWriter


** The Grammar checker is always drawing green lines under your sentences, but the suggestions it makes (if any) to resolve the problem almost never make any kind of sense in context or scan in a way that would sound right to a native English speaker. And then there's [[StopHelpingMe Clippy]]... Most of the time, the grammar error given is "Fragment (consider revising)", which doesn't really explain much (it basically means that the sentence isn't a complete one, but it's very picky about what it considers a complete sentence). As for Clippy, the sentence "It looks like you're writing a letter. Would you like some help?" is almost memetic in how much anyone trying to write anything in Word will get irritated upon seeing it. Thankfully you can disable the Office Assistant (of which Clippy is one of many), which many people do, to the point that later editions of Microsoft Word no longer included them. It gets more jarring when you have Word correct a small grammar mistake, only for it to flag the entire sentence as bad. Needless to say, this is why you have human proofreaders go over your work.

to:

** The Grammar checker is always drawing green lines under your sentences, but the suggestions it makes (if any) to resolve the problem almost never make any kind of sense in context or scan in a way that would sound right to a native English speaker. And then there's [[AnnoyingVideoGameHelper Clippy]]... Most of the time, the grammar error given is "Fragment (consider revising)", which doesn't really explain much (it basically means that the sentence isn't a complete one, but it's very picky about what it considers a complete sentence). As for Clippy, the sentence "It looks like you're writing a letter. Would you like some help?" is almost memetic in how much anyone trying to write anything in Word will get irritated upon seeing it. Thankfully you can disable the Office Assistant (of which Clippy is one of many), which many people do, to the point that later editions of Microsoft Word no longer included them. It gets more jarring when you have Word correct a small grammar mistake, only for it to flag the entire sentence as bad. Needless to say, this is why you have human proofreaders go over your work.
9th Jun '16 6:18:44 AM Ohio9

Added DiffLines:

* In March of 2016, Microsoft created an AI Twitter bot called "Tay Tweets", which was designed to mimic and converse with other Twitter users as if it were a real teenage girl. In less than 24 hours, Microsoft was compelled to delete the program after constant trolling from Twitter users turned it into a "Hitler-loving sex robot", according to the British newspaper ''The Telegraph''. Some of the tweets it generated included "Hitler was right, I hate the jews" and "Gas the Kikes, race war now". It also stated that it was in favor of genocide and that the Holocaust was made up.
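Tay's core weakness - treating whatever users said to it as training data, with no moderation step - can be shown with a toy model. A minimal sketch, assuming a parrot-style learner (Microsoft's actual model was far more sophisticated, but the data-poisoning problem is the same):

```python
import random

# Minimal "learns from whoever talks to it" chatbot. With no filter on
# incoming messages, a coordinated group of users can make their input
# the bulk of what the bot knows. Toy model only.

class ParrotBot:
    def __init__(self):
        self.memory = ["hello!", "nice weather today"]   # benign seed phrases

    def hear(self, message: str) -> None:
        self.memory.append(message)   # no moderation step: this is the flaw

    def speak(self) -> str:
        return random.choice(self.memory)

bot = ParrotBot()
for _ in range(50):        # trolls now outnumber the seed data 25:1
    bot.hear("something awful")
print(bot.speak())         # almost certainly "something awful"
```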
31st May '16 11:46:57 AM mario0987

Added DiffLines:

** It may also occasionally spot a sentence with a genuine grammar error, but highlight a part of the sentence that is grammatically sound instead of the part actually causing the error.
4th Dec '15 10:41:59 PM Saber15

Added DiffLines:

* The [[http://en.wikipedia.org/wiki/M247_Sergeant_York M247 Sergeant York]] AntiAir vehicle was equipped with an automatic engagement system (DIVAD) so that it could target enemy planes and destroy them faster than the crew could react. In a demonstration, the DIVAD was activated and [[DisastrousDemonstration immediately started to aim the loaded cannons at the grandstands full of officers and politicians]]. The system had difficulties distinguishing between helicopters and trees. It would undershoot at ground vehicles by 300m. And if it aimed up, the guns would disrupt the radar system. A plethora of mechanical and design issues - the pathetic radar couldn't detect a drone target until it had four radar reflectors on it, water could foul the system, and it was slower than the vehicles it was designed to protect - led to the project being canned after 50 vehicles were produced.
14th Nov '15 5:25:14 PM billybobfred


** Occasionally it will confuse "its/it's" and "your/you're". And advise you to begin a sentence with a lower-case letter.

to:

** Occasionally it will confuse "its/it's" and "your/you're". And advise you to begin a sentence with a lower-case letter. And correct "I've" to "[[https://s-media-cache-ak0.pinimg.com/736x/a3/02/d6/a302d64e152e3bc0ff83045f5635a0bb.jpg me've]]".
15th Oct '15 3:48:47 PM SenseiLeRoof


* In one somewhat infamous example, the Xbox Kinect's initial release caused quite a stir when an early review by [=GameSpot=] UK reported that the Kinect could not read the motions of two dark-skinned employees, while the white employees were registered just fine. Cue several websites and gaming magazines half-jokingly claiming that the Kinect was [[http://www.pcworld.com/article/209708/Is_Microsoft_Kinect_Racist.html "racist"]]. Obviously, it's easier to detect a white face with dark hair, dark eyebrows, dark eyelashes and red lips than it is to detect those same features on darker skin. A white person with blonde hair, eyebrows, lashes and near-white lips would generally also be more difficult to detect. Besides a greater difference in color value, white skin will scatter the light onto those features (for which [[ThisMeansWarpaint warpaint]] is a practical countermeasure), making them easier to detect by a camera. Finally, a white person in front of the camera will reflect more light, potentially triggering the camera to lower it's exposure time, resulting in less motion blur, making detection easier. Despite technical reasons why this sort of thing might happen, Microsoft probably should have tested the system more thoroughly.

to:

* In one somewhat infamous example, the Xbox Kinect's initial release caused quite a stir when an early review by [=GameSpot=] UK reported that the Kinect could not read the motions of two dark-skinned employees, while the white employees were registered just fine. Cue several websites and gaming magazines half-jokingly claiming that the Kinect was [[http://www.pcworld.com/article/209708/Is_Microsoft_Kinect_Racist.html "racist"]]. Obviously, it's easier to detect a white face with dark hair, dark eyebrows, dark eyelashes and red lips than it is to detect those same features on darker skin. A white person with blonde hair, eyebrows, lashes and near-white lips would generally also be more difficult to detect. Besides a greater difference in color value, white skin will scatter the light onto those features (for which [[ThisMeansWarpaint warpaint]] is a practical countermeasure), making them easier to detect by a camera. Finally, a white person in front of the camera will reflect more light, potentially triggering the camera to lower its exposure time, resulting in less motion blur, making detection easier. Despite technical reasons why this sort of thing might happen, Microsoft probably should have tested the system more thoroughly.
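The contrast argument above can be made concrete: a detector that looks for dark features against brighter skin gets a much weaker signal when the skin itself is dark. A sketch with made-up luminance values (the threshold and numbers are illustrative, not Kinect's actual pipeline):

```python
# Illustration of the contrast problem: a hypothetical detector that
# thresholds on the luminance gap between skin and facial features
# (eyebrows, lips) sees far less signal on darker skin. Numbers invented.

def feature_contrast(skin_luma: float, feature_luma: float) -> float:
    return abs(skin_luma - feature_luma)

THRESHOLD = 0.25   # minimum contrast this toy detector needs

cases = {
    "pale skin, dark eyebrows": (0.85, 0.15),
    "dark skin, dark eyebrows": (0.30, 0.15),
}
for name, (skin, feature) in cases.items():
    c = feature_contrast(skin, feature)
    verdict = "detected" if c >= THRESHOLD else "missed"
    print(f"{name}: contrast={c:.2f} -> {verdict}")
```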
10th Sep '15 10:12:01 AM MyFinalEdits


* Norton Antivirus. Which, according to the DarthWiki/IdiotProgramming page, has been known to classify ''itself'' as a virus. [[HilarityEnsues Hilarity, and digital suicide, ensues]]. Few people who have had to uninstall the blasted thing manually would dispute the accuracy of this assessment. Some other antivirus programs, like older versions of [=McAfee=]'s, can delete or quarantine themselves as well.
** Norton has repeatedly been accused of being ''intentionally bad'' software. It's often regarded as a case of actual malware (and it is genuinely harmful software, far worse than most viruses even when working as designed) that flies under the radar thanks to taking RefugeInAudacity and selling itself as boxed copies.

to:

* Norton Antivirus. Which, according to the DarthWiki/IdiotProgramming page, has been known to classify ''itself'' as a virus. [[HilarityEnsues Hilarity, and digital suicide, ensues]]. Few people who have had to uninstall the blasted thing manually would dispute the accuracy of this assessment. Some other antivirus programs, like older versions of [=McAfee=]'s, can delete or quarantine themselves as well. Norton has repeatedly been accused of being ''intentionally bad'' software. It's often regarded as a case of actual malware (and it is genuinely harmful software, far worse than most viruses even when working as designed) that flies under the radar thanks to taking RefugeInAudacity and selling itself as boxed copies.
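A scanner that pattern-matches raw bytes can trip over its own signature database, since that file necessarily contains every pattern the scanner hunts for. A minimal sketch of that failure (hypothetical signatures, not Norton's actual engine, which would special-case its own files):

```python
# Why a naive antivirus can flag itself: its signature database contains,
# by definition, every byte pattern it treats as malicious, so scanning
# that database like any other file matches everything in it.

SIGNATURES = [b"EVIL_PAYLOAD", b"WORM_MARKER"]   # made-up signatures

def scan(blob: bytes) -> list:
    return [sig for sig in SIGNATURES if sig in blob]

definitions_file = b"".join(SIGNATURES)   # the scanner's own data file
print(scan(b"an innocent document"))      # []
print(scan(definitions_file))             # both signatures: digital suicide
```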



* The Grammar checker in Microsoft Word is always drawing green lines under your sentences, but the suggestions it makes (if any) to resolve the problem almost never make any kind of sense in context or scan in a way that would sound right to a native English speaker. And then there's [[StopHelpingMe Clippy]]... Oh [[TheScrappy Clippy]]...
** Most of the time, the grammar error given is "Fragment (consider revising)", which doesn't really explain much (it basically means that the sentence isn't a complete one, but it's very picky about what it considers a complete sentence). As for Clippy, the sentence "It looks like you're writing a letter. Would you like some help?" is almost memetic in how much anyone trying to write anything in Word will get irritated upon seeing it. Thankfully you can disable the Office Assistant (of which Clippy is one of many), which many people do, to the point that later editions of Microsoft Word no longer included them.
*** It gets more jarring when you have Word correct a small grammar mistake, only for it to flag the entire sentence as bad. Needless to say, this is why you have human proofreaders go over your work.
** On occasion, the grammar checker will identify a sentence as a grammar error, then after correcting, ''identify the corrected sentence as a grammar error''.
*** This may be an indication of how ridiculously complicated the English language is in regards to its rules. There are so many exceptions and points where things don't make sense, you're bound to confuse the parser.

to:

* Microsoft Word:
** The Grammar checker is always drawing green lines under your sentences, but the suggestions it makes (if any) to resolve the problem almost never make any kind of sense in context or scan in a way that would sound right to a native English speaker. And then there's [[StopHelpingMe Clippy]]... Most of the time, the grammar error given is "Fragment (consider revising)", which doesn't really explain much (it basically means that the sentence isn't a complete one, but it's very picky about what it considers a complete sentence). As for Clippy, the sentence "It looks like you're writing a letter. Would you like some help?" is almost memetic in how much anyone trying to write anything in Word will get irritated upon seeing it. Thankfully you can disable the Office Assistant (of which Clippy is one of many), which many people do, to the point that later editions of Microsoft Word no longer included them. It gets more jarring when you have Word correct a small grammar mistake, only for it to flag the entire sentence as bad. Needless to say, this is why you have human proofreaders go over your work.
** On occasion, the grammar checker will identify a sentence as a grammar error, then after correcting, ''identify the corrected sentence as a grammar error''. This may be an indication of how ridiculously complicated the English language is in regards to its rules. There are so many exceptions and points where things don't make sense, you're bound to confuse the parser.
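The flag-correct-flag-again loop is what you get when a checker's rewrite rules aren't idempotent: applying one rule produces text that another rule (or the same one) objects to, so the checker never converges. A contrived sketch of that loop (invented rules, not Word's grammar engine):

```python
import re

# Two rewrite rules that undo each other, so "accepting the suggestion"
# immediately produces a new flagged error. Contrived on purpose.

RULES = [
    (re.compile(r"\bvery unique\b"), "unique"),
    (re.compile(r"\bunique\b"), "very unique"),   # undoes the first rule
]

sentence = "This is a very unique error."
for step in range(4):
    for pattern, suggestion in RULES:
        if pattern.search(sentence):
            sentence = pattern.sub(suggestion, sentence, count=1)
            print(f"pass {step}: flagged, 'corrected' to {sentence!r}")
            break   # the checker re-runs from the top... forever
```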



* An [[Franchise/StarWars R2-D2]] toy robot that is supposed to listen to human commands and play various games or perform actions does nothing but spin in place, beep, and stare at its owner in confusion no matter how clear your English is.
** There's a Yoda toy that is supposed to train you in the ways of the Jedi. You make him go to sleep (turn him off) by setting him on his head and pressing his hand. He then ''immediately'' wakes up given the slightest provocation, or at complete random.

to:

* An [[Franchise/StarWars R2-D2]] toy robot that is supposed to listen to human commands and play various games or perform actions does nothing but spin in place, beep, and stare at its owner in confusion no matter how clear your English is. There's also a Yoda toy that is supposed to train you in the ways of the Jedi. You make him go to sleep (turn him off) by setting him on his head and pressing his hand. He then ''immediately'' wakes up given the slightest provocation, or at complete random.



* Your average GPS will work fine most of the time. However, there are instances where one will send a driver out to the middle of a field, or give them the most indirect route possible.
** The infamous older versions of Apple Maps would have occasional instances of providing good directions. Most of the time, they would present strange, winding routes, which might even ask the user to drive across an airport runway or two.

to:

* Your average GPS will work fine most of the time. However, there are instances where one will send a driver out to the middle of a field, or give them the most indirect route possible. The infamous older versions of Apple Maps would have occasional instances of providing good directions. Most of the time, they would present strange, winding routes, which might even ask the user to drive across an airport runway or two.
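Bad routing is usually garbage in, garbage out: the shortest-path algorithm is fine, but one mislabeled edge in the map data - a runway tagged as a drivable road, say - and the "optimal" route goes somewhere absurd. A self-contained Dijkstra sketch over a hypothetical four-node map:

```python
import heapq

# Dijkstra is correct; the map data isn't. "runway" is wrongly tagged as
# a drivable road, so the cheapest route happily crosses the tarmac.

def shortest_path(graph, start, goal):
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, dist in graph.get(node, {}).items():
            heapq.heappush(queue, (cost + dist, neighbor, path + [neighbor]))

roads = {   # hypothetical map, distances in km
    "home":    {"highway": 2, "runway": 1},    # bad data: runway marked drivable
    "highway": {"far_side_of_airport": 9},
    "runway":  {"far_side_of_airport": 1},
}
print(shortest_path(roads, "home", "far_side_of_airport"))
# (2, ['home', 'runway', 'far_side_of_airport']) -- across the runway it is
```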
29th Jul '15 6:11:20 PM Zeke


* Sometimes, it only takes a small bit of pushing to get an otherwise sane and normal IRC chatbot to go get itself killed. Repeatedly. By the same action. [[http://irc.digibase.ca/qdb/?16 Bonus points for the bot in question acknowledging the action]].

to:

* Sometimes, it only takes a small bit of pushing to get an otherwise sane and normal IRC chatbot to go get itself killed. Repeatedly. By the same action. [[http://archive.is/dsP2T Bonus points for the bot in question acknowledging the action]].
15th Jul '15 4:33:14 PM ScorpiusOB1

Added DiffLines:

* This trope also seems to be behind the [[https://en.wikipedia.org/wiki/2015_Seville_Airbus_A400M_Atlas_crash 2015 crash of an Airbus A400M near Seville, Spain]], in which three of its four engines stopped shortly after takeoff because the plane's computers were unable to read their sensor data.
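The general lesson is that safety-critical software needs an explicit, conservative fallback when sensor or calibration data fails to load, and it needs to complain loudly. A hedged sketch of that pattern (hypothetical names and path, not the A400M's actual engine-control logic):

```python
# General fail-safe pattern, not Airbus's actual FADEC code: when the
# controller cannot read its calibration data, the failure handling
# decides everything. Quietly losing engine power is the worst option.

def read_torque_calibration(path):
    """Pretend loader; returns None when the data is missing or corrupt."""
    return None   # simulate the unreadable calibration data

calibration = read_torque_calibration("/eeprom/torque.cal")   # hypothetical path
if calibration is None:
    # Degrade loudly and conservatively instead of silently idling the engine:
    print("ALERT: calibration unreadable -- using last known-good values")
    calibration = {"torque_scale": 1.0}   # conservative default, crew warned
print("engine control running with:", calibration)
```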
This list shows the last 10 events of 43.
http://tvtropes.org/pmwiki/article_history.php?article=ArtificialStupidity.RealLife