The Martial Art of Rationality
Rationality is a technique to be trained.
(alternate summary:)
Rationality is the martial art of the mind, building on universally human mental machinery. But developing rationality is more difficult than developing a physical martial art, in part because rationality skill is harder to verify. In recent decades, scientific fields such as heuristics and biases, Bayesian probability theory, evolutionary psychology, and social psychology have given us a theoretical body of work on which to build the martial art of rationality. It remains to develop, and especially to communicate, techniques that apply this theoretical work introspectively to our own minds.
(alternate summary:)
A basic introduction to the metaphor and some of its consequences.
Truth can be instrumentally useful and intrinsically satisfying.
(alternate summary:)
Why should we seek truth? Pure curiosity is an emotion, but not therefore irrational. Instrumental value is another reason, with the advantage of giving an outside verification criterion. A third reason is conceiving of truth as a moral duty, but this might invite moralizing about "proper" modes of thinking that don't work. Still, we need to figure out how to think properly. That means avoiding biases, for which see the next post.
(alternate summary:)
You have an instrumental motive to care about the truth of your beliefs about anything you care about.
Biases are obstacles to truth seeking caused by one's own mental machinery.
(alternate summary:)
There are many more ways to miss the truth than to find it. Avoiding the things we call "biases" matters because they are obstacles to finding the truth: specifically, the cluster of obstacles that arise from the structure of the human mind itself, rather than from insufficient information or computing power, from brain damage, or from bad learned habits or beliefs. Ultimately, though, exactly what gets labeled a "bias" matters less than noticing and correcting the obstacle, whatever we call it.
Use humility to justify further action, not as an excuse for laziness and ignorance.
(alternate summary:)
There are good and bad kinds of humility. Proper humility is not selective underconfidence about uncomfortable truths, and it is not the same as social modesty, which can be an excuse for not even trying to be right. Proper scientific humility means not just acknowledging one's uncertainty in words, but taking specific actions to plan for the case that one is wrong.
Factor in what other people think, but not symmetrically, if they are not epistemic peers.
(alternate summary:)
The Modesty Argument states that any two honest Bayesian reasoners who disagree should each take the other's beliefs into account, both arriving at a probability distribution that is the average of the ones they started with. Robin Hanson seems to accept the argument, but Eliezer does not. Eliezer offers the example of his own disagreement with a creationist to show how following the Modesty Argument could decrease individual rationality. He also accuses those who endorse the argument of failing to take it into account when planning their actions.
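(As a hedged illustration, not drawn from the original post: taken literally, the averaging rule says that if you assign probability p to a claim and an apparent epistemic peer assigns q, you should both move to (p + q) / 2, regardless of who actually reasoned better.)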
You can pragmatically say "I don't know", but you rationally should have a probability distribution.
(alternate summary:)
An edited instant messaging conversation about the use of the phrase "I don't know". "I don't know" is a useful phrase if you want to avoid getting into trouble or to convey that you don't have access to privileged information.
A Fable of Science and Politics
People respond in different ways to clear evidence they're wrong, not always by updating and moving on.
(alternate summary:)
A story about an underground society divided into two factions: one that believes the sky is blue and one that believes the sky is green. The story ends by describing how various citizens react when they discover the outside world and finally see the color of the sky.