History ArtificialStupidity / RealLife

28th Dec '16 12:10:21 PM hyphz


* Programming language editors are also notorious for this kind of behavior. They will indicate an error that a symbol is missing and offer to insert it, then raise another error with the fix they just performed. Many IDEs including GUI code generators will generate code ''that they themselves then fail to compile'' - which usually cannot be debugged properly, because generated code is not comfortable for humans to read.

to:

* Programming language editors are also notorious for this kind of behavior. They will indicate an error that a symbol is missing and offer to insert it, then raise another error with the fix they just performed. Many [=IDEs=] including GUI code generators will generate code, then fail to compile with a compilation error ''in the code they generated'' - which usually cannot be debugged properly, because generated code is not comfortable for humans to read.
28th Dec '16 12:09:47 PM hyphz

Added DiffLines:

** Jason Hutchens infamously won the Loebner prize by taking a relatively stupid AI, MegaHAL, and fitting a shell around it that attempted to detect the most common questioning patterns used by ''judges'' and respond to them in the ways that had previously got the best responses from those judges. His resulting paper was titled "How to pass the Turing Test by cheating".
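The underlying chatbot here was built on Markov models of its training text. A minimal sketch of that kind of babbler (first-order only and purely illustrative; the real MegaHAL used higher-order models in both directions plus keyword matching against the user's input, and the function names below are our own) might look like:

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    chain = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def babble(chain, start, length=8, rng=None):
    """Random-walk the chain from a starting word; stop at a dead end."""
    rng = rng or random.Random(0)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the judge asks a question and the bot asks the judge a question"
chain = build_chain(corpus)
print(babble(chain, "the"))
```

Output like this is locally plausible but globally meaningless, which is exactly why wrapping such a model in a judge-detecting shell was a more effective contest strategy than improving the model itself.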


Added DiffLines:

* Programming language editors are also notorious for this kind of behavior. They will indicate an error that a symbol is missing and offer to insert it, then raise another error with the fix they just performed. Many IDEs including GUI code generators will generate code ''that they themselves then fail to compile'' - which usually cannot be debugged properly, because generated code is not comfortable for humans to read.
15th Dec '16 6:17:46 AM ScorpiusOB1


* The "intelligent" mode that some cameras have and that select automatically the supposedly best picture mode for a given scene. While for easy subjects it tends to work well, try it with a more unusual one (ie: a photo at the longest zoom, if the camera has a long one, of the Moon, part of a car...) and watch.

to:

* The "intelligent" mode that some cameras have, which automatically selects the supposedly best picture mode for a given scene, often leaves a lot to be desired. While for typical subjects it tends to work well, try it with a more unusual one (i.e. a photo at the longest zoom, if the camera has a long one, of the Moon, part of a car...) and watch. Other times, the camera will happily select high ISOs (= more noise) when they're unnecessary.
14th Dec '16 6:23:34 PM MyFinalEdits


* Similarly to the above example, the "intelligent" mode that some cameras have and that select automatically the supposedly best picture mode for a given scene. While for easy subjects it tends to work well, try it with a more unusual one (ie: a photo at the longest zoom, if the camera has a long one, of the Moon, part of a car...) and watch.

to:

* The "intelligent" mode that some cameras have and that select automatically the supposedly best picture mode for a given scene. While for easy subjects it tends to work well, try it with a more unusual one (ie: a photo at the longest zoom, if the camera has a long one, of the Moon, part of a car...) and watch.
14th Dec '16 2:57:03 PM ScorpiusOB1


* Similarly to the above example the "intelligent" mode that some cameras have and that supposedly select automatically the best picture mode. While foreasy subjects it tends to work well, try it with a more unusual one (ie: a photo at the longest zoomif the camera has it of the Moon) and watch.

to:

* Similarly to the above example, the "intelligent" mode that some cameras have and that supposedly select automatically the best picture mode for a given scene. While for easy subjects it tends to work well, try it with a more unusual one (ie: a photo at the longest zoom, if the camera has a long one, of the Moon, part of a car...) and watch.
14th Dec '16 2:53:37 PM ScorpiusOB1

Added DiffLines:

* Similarly to the above example the "intelligent" mode that some cameras have and that supposedly select automatically the best picture mode. While foreasy subjects it tends to work well, try it with a more unusual one (ie: a photo at the longest zoomif the camera has it of the Moon) and watch.
7th Aug '16 12:37:40 AM Kazmahu


* In March of 2016, Microsoft created an AI Twitter bot called "Tay Tweets", which was designed to mimic and converse with other twitter users as if it was a real teenage girl. In less then 24 hours, Microsoft was compelled to delete the program after constant trolling from twitter users turned it into a "Hitler-loving sex robot", according to the British newspaper The Telegraph. Some of the tweets it generated included "Hitler was right, I hate the jews", and "Gas the Kikes, race war now". It also stated it was in favor of genocide and the holocaust was made up.

to:

* In March of 2016, Microsoft created an AI Twitter bot called "Tay Tweets", which was designed to mimic and converse with other Twitter users as if it were a real teenage girl. In less than 24 hours, Microsoft was compelled to delete the program after constant trolling from Twitter users turned it into a "Hitler-loving sex robot", according to the British newspaper The Telegraph. Some of the tweets it generated included "Hitler was right, I hate the jews", and "Gas the Kikes, race war now". It also stated it was in favor of genocide and that the Holocaust was made up. (Of course, from a certain point of view, the bot was functioning [[GoneHorriblyRight exactly as intended]].)
23rd Jul '16 9:19:58 PM Yinyang107


* The [[http://en.wikipedia.org/wiki/M247_Sergeant_York M247 Sergeant York]] AntiAir vehicle was equipped with an automatic engagement system (DIVAD) so that it could target enemy planes and destroy them faster than the crew could react. In a demonstration, the DIVAD was activated and [[DisastrousDemonstration immediately started to aim the loaded cannons at the grandstands full of officers and politicians]]. The system had difficulties distinguishing between helicopters and trees. It would undershoot at ground vehicles by 300m. And if it aimed up, the guns would disrupt the radar system. A plethora of mechanical and design issues - the pathetic radar couldn't detect a drone target until it had four radar reflectors on it, water could foul the system, and it was slower than the vehicles it was designed to protect - lead to the project being canned after 50 vehicles were produced.

to:

* The [[http://en.wikipedia.org/wiki/M247_Sergeant_York M247 Sergeant York]] AntiAir vehicle was equipped with an automatic engagement system (DIVAD) so that it could target enemy planes and destroy them faster than the crew could react. In a demonstration, the DIVAD was activated and [[DisastrousDemonstration immediately started to aim the loaded cannons at the grandstands full of officers and politicians]] (there were only minor injuries). The system had difficulties distinguishing between helicopters and trees. It would undershoot at ground vehicles by 300m. And if it aimed up, the guns would disrupt the radar system. A plethora of mechanical and design issues - the pathetic radar couldn't detect a drone target until it had four radar reflectors on it, water could foul the system, and it was slower than the vehicles it was designed to protect - led to the project being canned after 50 vehicles were produced.
5th Jul '16 6:39:03 AM longWriter


** The Grammar checker is always drawing green lines under your sentences, but the suggestions it makes (if any) to resolve the problem almost never make any kind of sense in context or scan in a way that would sound right to a native English speaker. And then there's [[StopHelpingMe Clippy]]... Most of the time, the grammar error given is "Fragment (consider revising)", which doesn't really explain much (it basically means that the sentence isn't a complete one, but it's very picky about what it considers a complete sentence). As for Clippy, the sentence "It looks like you're writing a letter. Would you like some help?" is almost memetic in how much anyone trying to write anything in Word will get irritated upon seeing it. Thankfully you can disable the Office Assistant (of which Clippy is one of many), which many people do, to the point that later editions of Microsoft Word no longer included them. It gets more jarring when you have Word correct a small grammar mistake, only for it to flag the entire sentence as bad. Needless to say, this is why you have human proofreaders go over your work.

to:

** The Grammar checker is always drawing green lines under your sentences, but the suggestions it makes (if any) to resolve the problem almost never make any kind of sense in context or scan in a way that would sound right to a native English speaker. And then there's [[AnnoyingVideoGameHelper Clippy]]... Most of the time, the grammar error given is "Fragment (consider revising)", which doesn't really explain much (it basically means that the sentence isn't a complete one, but it's very picky about what it considers a complete sentence). As for Clippy, the sentence "It looks like you're writing a letter. Would you like some help?" is almost memetic in how much anyone trying to write anything in Word will get irritated upon seeing it. Thankfully you can disable the Office Assistant (of which Clippy is one of many), which many people do, to the point that later editions of Microsoft Word no longer included them. It gets more jarring when you have Word correct a small grammar mistake, only for it to flag the entire sentence as bad. Needless to say, this is why you have human proofreaders go over your work.
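That pickiness about what counts as a complete sentence is easy to reproduce. The toy sketch below (entirely hypothetical, not Word's actual algorithm; the verb list and function name are ours) flags any sentence that lacks a verb from a small hard-coded list, which makes it exactly as over-eager as the "Fragment (consider revising)" warning described above:

```python
import re

# A deliberately tiny list of finite verbs; anything the list misses
# gets its sentence flagged as a "fragment", however complete it is.
FINITE_VERBS = {"is", "are", "was", "were", "has", "have", "looks", "writes"}

def is_fragment(sentence):
    """Flag the sentence unless it contains a verb we know about."""
    words = re.findall(r"[a-z']+", sentence.lower())
    return not any(w in FINITE_VERBS for w in words)

print(is_fragment("It looks like you're writing a letter."))  # False: "looks"
print(is_fragment("Fragment, consider revising."))            # True: no listed verb
```

Any real checker uses a far richer grammar than this, but the failure mode is the same: a fixed model of "complete sentence" that flags everything outside it without explaining why.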
9th Jun '16 6:18:44 AM Ohio9

Added DiffLines:

* In March of 2016, Microsoft created an AI Twitter bot called "Tay Tweets", which was designed to mimic and converse with other twitter users as if it was a real teenage girl. In less then 24 hours, Microsoft was compelled to delete the program after constant trolling from twitter users turned it into a "Hitler-loving sex robot", according to the British newspaper The Telegraph. Some of the tweets it generated included "Hitler was right, I hate the jews", and "Gas the Kikes, race war now". It also stated it was in favor of genocide and the holocaust was made up.
This list shows the last 10 events of 50.
http://tvtropes.org/pmwiki/article_history.php?article=ArtificialStupidity.RealLife