To resist the Dunning–Kruger effect (mistakenly believing you know more about a subject than you do, possibly because you simply weren't clued in to how vast the subject is), and to make the depth and breadth of this subculture seem more real to you, you could begin by browsing the titles of articles in The Sequences. That's the closest thing there currently is to an index of the subculture. It can be found here: The Sequences.
5. Don't expect people to be perfect rationalists, not even yourself.
Above all, remember that nobody is a perfect rationalist. You're going to make mistakes, and you're going to find reasoning errors that other members have made. You may not be able to fix other people's irrationality, but you can keep an eye out for your own mistakes. None of us were taught to think rationally in school, and we've all been steeped in beliefs that were grown, defended, selected, and mutated by countless irrational decision-makers. Becoming a group of perfect rationalists would take a very long time and may not be a realistic goal. Our common goal is to refine ourselves, by working together, to become less and less wrong. If you get the urge to tear someone's reputation to little bits, please remember this: we've all inherited quite the ideological mess, and we're all working on it together. Don't expect others to be perfect rationalists. They can't be perfect, but most of them do desire to become more rational.
6. Don't help us be less wrong too much.
Although it can be, for a variety of reasons, extremely tempting to go around telling people that they're wrong or starting debates, you should be aware that this behavior is likely to be interpreted as status seeking. Many members frown on social status games. Maybe you feel motivated by some form of altruism along the lines of Randall Munroe's call to "duty" to step in because "someone is wrong on the Internet" and you want them to be right. Maybe you really do enjoy showing off while making other people feel publicly humiliated. Regardless of whether your motives are altruistic, selfish, or otherwise, please be aware that behaviors that even resemble these are likely to be perceived as part of a social status game, an attack, or trolling. LessWrong members are of course interested in learning from their mistakes, but they're also human. If you say things that could insult them, many will feel and/or behave the way that insulted humans do. Simply put: this is one of the fastest ways to make yourself unpopular. If you want to increase your status, consider this research instead: [Political Skills which Increase Income](http://lesswrong.com/lw/jsp/political_skills_which_increase_income/)
The main document that has influenced LessWrong subculture is The Sequences. The main theme of The Sequences may be rationality but there are many other themes in The Sequences which have influenced the subculture as well. These other themes may be why the subculture has attracted a disproportionate number of software engineers, math and science oriented individuals, people with an interest in artificial intelligence, atheists, etc. More importantly, The Sequences also contain a lot of articles with Eliezer's ideas about rationalist culture. If you have no familiarity with the cultural articles and other themes before you begin interacting, your social experiences are likely to be highly awkward. The rationalist way of thinking and subculture is extremely, extremely complex. To give you a gist of how complex it is and what kind of complexity you'll encounter:
3. It takes a very long time to become good at being rational. To be a really good reasoner, you need to patch over a hundred cognitive biases. Rationalists work on their rationality because it's necessary for making good decisions. Good decisions are, of course, necessary if you want a high degree of success in life, and learning about biases is necessary just to help you avoid self-destructive decisions. Becoming more rational requires an investment. There is no quick fix. There is a lot to learn. Until you've invested a lot into learning, many of the people you'll encounter in the subculture will know a lot more about this than you do. Interacting with this subculture isn't like talking about a couple dozen bands and enjoying the same music. The book "Judgment under Uncertainty: Heuristics and Biases", co-edited by Daniel Kahneman, is around 600 pages long. Becoming knowledgeable about rationality is an investment. The Sequences would take in the ballpark of 80 hours to read at the average reading speed. Becoming knowledgeable about this specific subculture is an investment.
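To see where an estimate like "80 hours" comes from, here is a back-of-the-envelope sketch. The word count and reading speed below are assumptions chosen for illustration, not official figures for The Sequences:

```python
# Rough reading-time estimate. Both numbers are hypothetical:
# a ~1.2 million word corpus read at ~250 words per minute
# (a common estimate of average adult reading speed).

words = 1_200_000        # assumed total word count
words_per_minute = 250   # assumed average reading speed

hours = words / words_per_minute / 60
print(round(hours))      # → 80
```

Change either assumption and the estimate shifts proportionally; the point is only that a corpus of this size is a serious time investment.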
There is no holy book, authority, set of political agendas, or set of popular beliefs that we can point you to in order to tell you which beliefs rationalists have. It would not be in a rationalist's best interest to commit to a fixed set of specific beliefs by making them part of their identity. Instead, we can point you to various methods we might use for choosing beliefs, like Bayesianism. Using Bayesian probabilities is considered by many in this subculture to be one of the most fundamental and most prominent parts of the reasoning toolbox.
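As a loose illustration of the Bayesian method mentioned above, here is a minimal Bayes'-theorem update. The probabilities are invented for the example, not drawn from The Sequences:

```python
# A minimal Bayes-rule update: revising credence in a hypothesis
# after seeing a piece of evidence. All numbers are hypothetical.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Hypothesis: "this claim I just read is true."
prior = 0.30                  # initial credence in the claim
p_evidence_if_true = 0.80     # P(seeing this evidence | claim true)
p_evidence_if_false = 0.20    # P(seeing this evidence | claim false)

posterior = bayes_update(prior, p_evidence_if_true, p_evidence_if_false)
print(round(posterior, 3))    # → 0.632
```

The evidence raises the credence from 0.30 to about 0.63 rather than to certainty, which is the characteristic Bayesian move: beliefs are held as degrees of confidence that shift with evidence, not as fixed commitments.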
1. Unlike subcultures that form around politically-oriented positions, rationalists are wary of making commitments to beliefs. If one wants to be rational, one should accept an idea only because there is good evidence that the idea is likely to be true, not because one had previously chosen to be "on" a certain "side". Unlike people in religious groups, rationalists do not accept ideas on faith, even if they are presented by an authority figure. Instead, they learn to consider the specific supports for each idea and determine which ones are most likely to be correct. Unlike many people in the mainstream, rationalists are wary of conforming to beliefs merely because other rationalists promote the beliefs. There is no body of knowledge that rationalists cling to and defend as if "Guarding the Truth". Instead, rationalist subculture is about discovering and making progress.
The main document that has influenced LessWrong subculture is called "The Sequences", written by a variety of authors but mostly by Eliezer Yudkowsky. The main theme of The Sequences is rationality. The Sequences reflect a lot of the research done on reasoning mistakes (like cognitive biases) by people such as Daniel Kahneman (a prominent cognitive bias researcher). The two main differences between The Sequences and Daniel Kahneman's work are that Eliezer has a very engaging writing style (Kahneman is notoriously dry), and that Eliezer has taken care to warn readers about a variety of [pitfalls involved in learning about cognitive biases](http://lesswrong.com/lw/he/knowing_about_biases_can_hurt_people/). Other themes in The Sequences include artificial intelligence, software engineering, math, science, and atheism.
We don't want new users to be downvoted a lot, and most of you don't like it, either. This guide was created to help new users quickly get the gist of what LessWrong subculture is about and how website participation works, so that new users can gain some orientation. If you came here on your own, that's excellent, because if you attempt to participate on the website without the information in this introduction, there's a pretty good chance that you'll be pretty lost.
LessWrong refers to a website for a specific rationalist subculture. The main website feature is a blog. In addition to hosting user-generated posts and articles, the LessWrong blog also hosts LessWrong's main collection of writings. This collection, called "The Sequences", is about rationality and was written by a variety of authors. The term "LessWrong" is also used to describe the related in-person meetups ("LessWrong Meetups").
Imagine being transported to a different country without ever having heard of that country before. You would have little hope of success in that society without first learning about the many differences between your cultures. That...
This page was written in 2015 and imported from the old LessWrong wiki in 2020. If this page is your first exposure to LessWrong, we recommend starting with Welcome to LessWrong!, which serves as the up-to-date About and Welcome page.