"Uncertainty exists in the map, not in the territory."
Less Wrong
- Three Worlds Collide is hosted here.
- Harry Potter and the Methods of Rationality is occasionally discussed here.
- Friendship Is Optimal originated there.
Tropes:
- Anti-Advice: Called out as fallacious; reversed stupidity is not intelligence.
- Back from the Dead: Some in the Less Wrong community hope to achieve this through cryonics.
- Ban on Politics: It's generally agreed that talking about contemporary politics leads to Flame Wars and little else. See Phrase Catcher, below.
- Blue and Orange Morality: One of the core concepts of Friendly AI is that it's entirely possible to make something as capable as a human being that has completely alien goals. Luckily, there's already an example of an 'optimization process' completely unlike a human mind right here on Earth that we can use to see how good we are at truly understanding the concept. "The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else."
- The community itself. Given its focus on cold logic and its disdain for public opinion, it occasionally develops ideas that strike outsiders as tactless or simply absurd (Roko's basilisk, musings on the utilitarian value of racism, cryonics as an ethical necessity, etc.).
- Brown Note: Roko's Basilisk, to the point that any mention of it on Less Wrong's forums is deleted. Learn about it (at your own risk) here.
- Concepts Are Cheap: Applause Lights.
- Deus Est Machina: Yudkowsky and some other members of Less Wrong, through the Singularity Institute for Artificial Intelligence, are working on making one. The Singularity is eagerly awaited.
- Humans Are Flawed: Explained as a result of having been 'designed' slowly and very much imperfectly by the 'idiot god' that is evolution.
- The Horseshoe Effect: Frequently mentioned and discussed.
- Living Forever Is Awesome: Almost everyone on Less Wrong. Hence, the strong Transhumanist bent.
- Logical Fallacies: Revealed to be shockingly common in normal human minds, and something for rationalists to avoid.
- Phrase Catcher: The Flame Bait topic of politics is met with "politics is the mind-killer".
- Straw Vulcan: Averted. Less Wrong community members do not consider rationality to *necessarily* be at odds with emotion. Also, Spock is a terrible rationalist.
- Talking Your Way Out: The AI-Box Experiment is a thought experiment intended to show how a superhuman intellect (like a hyper-intelligent AI) could talk its captors into anything, in particular releasing it into the world.
- Transhumanism: Their philosophy and goal, though it should be noted that their emphasis on why is somewhat skewed compared to other Transhumanists; see Living Forever Is Awesome. Most Transhumanists are in it more to make themselves and others better.
- Wiki Walk: It is fairly easy to go on one due to the links in the articles to other articles. Also, certain lines of thought about similar issues are organized into 'sequences' which make them more conveniently accessible.