Human minds are irrational. A rational mind would (1) make accurate everyday judgements about the world and itself and (2) be able to decide on a course of behavior it thinks will fulfill its goals and definitely follow through on it. I hope to convince you that our minds are not like this by default.1

I'm honestly not sure how most people conceptualize minds. Perhaps my statement above is obvious to most people, but I've found myself explaining/debating this viewpoint multiple times, and decided I would like to articulate my thoughts here, as it's a pretty central mental model of mine.

Examples of minds making bad judgements/decisions

To provide some evidence for this and build an intuition for what I mean, here are some examples. Cognitive science is littered with biases that we all seem to have, but I'll just list the highlights that I think are most relevant for everyday decisions.

  • Our prediction machinery is flawed
    • Our minds are really bad at estimating how long things will take. If you ask people to estimate when they'll be done with something, and also ask them when they would be done with it in the worst case, most of them finish after the worst-case date they gave! I notice this practically every day in myself. This is because the future we imagine tends to be a best-case version where everything goes right.
    • Our minds suck at appreciating how we will feel and behave in different mental states. When we're angry, we don't realize how much our behavior is influenced by that temporary mental state. When we're calm, we find it hard to imagine ourselves angry. A big way this pops up for me is while planning vs. executing. While planning, I'm typically in a motivated state, such that I can easily imagine myself following through on the tasks that I set for myself. Later, when I'm in a less energetic state, I find it much harder to actually do the tasks.
  • Reasoning about what to do is flawed
    • We weight near-term rewards much more heavily than longer-term rewards. This means we'll often make decisions in the moment that we wouldn't want our future self to make. I like to eat ice cream in the moment, but I would prefer my future self to not eat ice cream as I don't want to get diabetes.
    • We continue with things beyond the point when it would be best to quit. For me, this often manifests while programming, when implementing a feature. I might underestimate how hard it will be to add something to a project and so start working on it. An hour in, I'll realize that the lift is significantly greater than I thought. At this point, if the feature is a nice-to-have, it often makes sense to drop it and move on to something with a better bang-for-my-unit-of-time-buck, but this just feels really hard to do. I mean, I've spent all that time already and I have all the relevant context now, so shouldn't I just continue? (No!)
    • Adding more options to a choice can decrease the quality of our decisions and our level of satisfaction. I was recently applying to something that had many sub-applications, where you could decide how many and which ones to apply to. This was initially somewhat stressful, as I viewed this application as an important thing to do and wanted to make good decisions. For a while, this resulted in me either procrastinating or spending too much time evaluating each possible option, time that would have been better spent just filling out sub-applications for options that were above some bar of adequacy. Once I switched to this strategy, it was a lot less stressful and I got a lot more done per unit time.
  • Our understanding of the past is flawed
    • How much we remember enjoying an experience is often fairly different from how much we actually enjoyed the experience. In particular, the amount of enjoyment we remember is heavily weighted by the emotional peaks and by how we felt at the end. An example I like is from this book, whose author uses a tool to ping himself throughout the day to collect data on his experiential happiness. He remembered having greatly enjoyed a whitewater rafting trip he had gone on, but when he checked the data, the trip consisted of a multi-hour ride in a stuffy van that was fairly unenjoyable, followed by only 20 minutes or so of fun. Before looking at this data, he had mostly forgotten about the van ride and was thinking of doing the trip again.
    • Our memories of what we did and how long we spent doing it are often very inaccurate. For example, once I started tracking my time, I realized my judgements of how long I spent working on a programming project were often way under the actual time spent (on the order of 3h vs. 5h).

What are emotions?

In many of these examples, mental states play a big role in affecting our perception of the world/ourselves and our decisions.

The world is complex and contains a lot of information, too much to reason with exhaustively, and too much for our conscious mind to reason with at all.2 Because of this, most of the machinery of our brain is a bundle of heuristics that is hidden from us. This is what we call the "unconscious". Our unconscious mind is able to aggregate and handle much more information than our conscious mind, and it communicates with our conscious mind through impressions, new thoughts, and emotions.

Examples:

  • When we meet someone new, we'll very quickly get a prediction of their personality and an impression of whether we like them or not. This is the result we get from the unconscious mind after it has used heuristics to aggregate the information we just got about this person with our past experiences. Notice that we never get to see how these impressions are put together here.3
  • You've probably had the experience of reading or hearing a complex argument and having a feeling that it was wrong somehow, without having consciously thought about it much, and without being able to fully articulate yet why you didn't like it. This again comes from your unconscious doing its magic to interpret the information in light of the knowledge you have.

Why are our minds like this?

It's useful to take a second to consider why our minds have all these quirks. Since we all have them, they must be a product of evolution, but why would evolution leave us with so many bugs?

The answer is that these aren't bugs from evolution's point of view. They're features. To evolution, our brains are decision-making organs that are designed with the end goal of maximizing the number of genes we propagate (in healthy offspring/relatives) to the next generation.

These issues then arise from a few sources:

  • The world is really complicated, and there are efficiency tradeoffs while making decisions. If it were optimal (for evolution) to give us minds that produce perfect decisions no matter how many resources this consumed, then those are the minds we would have. But the point on the spectrum that evolution decided was optimal uses heuristics that work pretty well most of the time and save on energy. Importantly, while this point on the spectrum is optimal for evolution, it's not necessarily optimal for our preferences as individuals, which brings me to the second source of issues.
  • The preferences of evolution sometimes diverge from our preferences as individuals. Because it was important to maintain our social status, evolution made us care a lot about others' perceptions of us, which can result in social anxiety. While caring about others' perceptions of us can be useful, my guess is that our preferences as individuals would set that slider at a lower point than the one evolution has picked.
  • The modern world is fairly different from the environment in which we evolved. Just like a sweet tooth is a remnant from times when food was scarcer, a tendency to overweight present rewards is a remnant from times when the future was highly unpredictable; strategies that were sound then don't always serve us well now.

So, can we fix this?

Unfortunately, the general consensus in cognitive science is that just knowing that our minds screw up in these ways does essentially zilch by default to help us spot and prevent these errors in ourselves (though we are decent at recognizing them in others). We're, by default, blind to these issues, even if we know at a conscious level that they might happen.

Yet I think it's still very useful to have this frame. As a starting point, it's super useful to realize that a failure to perfectly control your behavior/emotions isn't an issue with you; it's almost certainly an issue with your strategy. For example, just trying harder to accomplish a goal is not a good strategy. It's much better to set up a system that makes it more likely for you to follow through (and to understand that you'll still drastically overestimate how likely it is that you will).

Here are some other concrete ideas:

  • To combat choice overload (where more choices lead to dissatisfaction/anxiety and worse decision quality), you can make it a habit to satisfice for more decisions. This entails looking for a good-enough option and then just going with it (see the first sketch after this list). When I've been successful at implementing this, it has definitely reduced anxiety and allowed me to spend more time doing things rather than thinking.
  • To combat the planning fallacy (where we greatly underestimate how long it will take to do something), you can begin to consciously calibrate (by tracking your time, not relying on memory!) how long it takes you to do tasks of a certain level of complexity. For example, when estimating how many hours something will take, you can average over how long it took you to finish similar tasks in the past (see the second sketch after this list).
  • Because others are better at spotting the biases that we commit, it can be useful to talk to others about decisions and specifically ask for their feedback on where it seems like you're missing something obvious. You could also approach a situation from the perspective of another, perhaps by imagining what advice you would give to a friend in a similar situation.
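As a minimal sketch of the satisficing idea, the toy code below (option names and scores are made up) contrasts a maximizing strategy, which scores every option, with a satisficing one, which just takes the first option that clears a good-enough bar:

```python
# Satisficing vs. maximizing: a toy illustration (all names and scores are made up).

def maximize(options, score):
    """Evaluate every option and return the highest-scoring one (slow, stress-prone)."""
    return max(options, key=score)

def satisfice(options, score, good_enough):
    """Return the first option whose score clears the bar (fast, usually fine)."""
    for option in options:
        if score(option) >= good_enough:
            return option
    return None  # nothing cleared the bar; lower the bar or fall back to maximizing

# Example: picking a sub-application to fill out, scored 0-10 on rough fit.
fit = {"A": 6, "B": 8, "C": 9, "D": 7}
print(satisfice(fit, fit.get, good_enough=7))  # "B": good enough, even though "C" scores higher
print(maximize(fit, fit.get))                  # "C": the best, but only after scoring everything
```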
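And as a rough sketch of the calibration idea, the snippet below (with made-up numbers) takes the ratio of actual to estimated time over similar past tasks from a time log and uses the average to correct a new gut estimate:

```python
# Calibrating time estimates from a time log instead of memory (numbers are made up).
from statistics import mean

# (estimated hours, actual hours) for past tasks of similar complexity.
past_tasks = [(3, 5), (2, 3.5), (4, 6), (3, 4.5)]

# How far off my estimates typically are for this kind of task.
fudge_factor = mean(actual / estimate for estimate, actual in past_tasks)

gut_estimate = 3  # hours; what "feels" right for the new task
print(f"Typical correction: x{fudge_factor:.1f}")                   # x1.6
print(f"Calibrated estimate: {gut_estimate * fudge_factor:.1f}h")   # 4.8h
```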
1

By "rational", people sometimes mean something like "doesn't make probabilistic mistakes, such as thinking the conjunction of two events is more likely than just one of the events." While it's true that our minds make these sorts of mistakes, I don't think this view of rationality is useful in an everyday context, and therefore I am defining rationality as making good "everyday" judgements about oneself and being able to control one's behavior robustly into the future.

2

I'm honestly curious why a massive part of the machinery of our mind is opaque/hidden from us. Our conscious mind definitely has pretty hard limits on capacity. If this is an inherent limit, perhaps making more of the mind transparent to the conscious mind would only nerf it?

3

In fact, another bias is that people think they do understand how their mind arrives at conclusions like these!