Over the last few decades, new experiments have changed science's picture of the way we think — how we succeed or fail to obtain the truth and achieve our goals. The heuristics and biases program, in cognitive psychology, has exposed dozens of major flaws in human reasoning. Social psychology has learned about how groups succeed or fail. Behavioral economists have measured the way humans decide against models of optimal decision-makers, and discovered that we often decide suboptimally.
Less Wrong is a site for people who want to apply these findings to their own thinking.
What could have prevented me from making such a ridiculous error? How could I have noticed that I was being insanely overconfident? There are a few signs that should have tipped me off:
Lack of expertise: The physics of heat and fluids is outside my domain of expertise. I should have recognized that this is not a subject I know much about, and accordingly been more skeptical of my own opinions.
A simple model: I was employing a very simple model of the situation (involving just the temperature of the water). While beautiful and easy to work with, very simple models rarely capture all the details of a situation. For instance, I didn’t consider the possibility of evaporation or frost, because those variables weren’t included in my model. If you’re using a simple model, you should consider whether you’re missing important factors.
Insufficient time: If you haven’t thought about something for very long but are already very opinionated, you might want to think about it longer.
Didn’t consider alternatives: I didn’t even try to think of ways the effect could be real. Instead, I came up with an argument for why it couldn’t be real, and that argument sounded convincing to me, so I stopped thinking. This problem is the big one. So here’s a free 30-minute mini-course I designed to train you to avoid exactly this problem.