Pop psychology is killing your ability to make good decisions
Why you’re getting this: Hey, I’m Tom, a startup founder and thinkboi extraordinaire building medley.gg. You’re getting my personal newsletter because you subscribed at some point (likely from my YouTube channel). I send this out semi-regularly to 7,000+ curious people and share my thoughts on building startups, productivity and other bits. You can unsubscribe at any time, and I periodically remove people who aren’t regularly reading (I removed nearly 4000 today hehe).
Introduction
As a child, I was indoctrinated into the world of team sports. Not a bad thing, but maybe not the paragon of good most think it is. As part of team sports, you routinely exhaust yourself, sometimes voluntarily, sometimes under the always aggressive and occasionally motivational verbiage of a supervising adult. While I never seemed to mind being shouted at by all 5’6” of my rugby coach, Mr. Johnson, to ‘push through the pain’, one thing I did mind was being told how to recover.
If you’ve ever physically pushed yourself, you know the position: after the exertion, your body naturally comes into a resting position where the palms of your hands drop to just above your knees, you sort of keel over, and you voraciously gulp as much oxygen as possible.
No one ever teaches you to do this; it just feels like the right thing to do. This pose is almost completely universal, as ubiquitous as that slight involuntary upward head raise when passing an acquaintance in a corridor.
At around the age of 12, this pose became very unpopular with our school teachers and club coaches alike. Apparently ‘new science’ had proven it was an ‘inefficient’ way to get oxygen into your body. We were instructed, instead of keeling over like tired dogs, to posture up like proud gazelles, putting our hands on our heads. This was meant to ‘open up the surface area of our lungs’, facilitating the diffusion of CO2 from our blood.
Don’t let the grin on this chap’s face fool you. Anyone who has ever held this position after a serious bout of exercise can tell you the pain is excruciating. You get this awful cramp in your lats and feel like no air is going into your lungs. Every instinct is telling you to adopt the keel-over position. But, for whatever reason, our teachers had got it into their heads that the fate of our rugby team making the Daily Mail knockouts rested on us adopting this unnatural resting position. Anyone not adhering was barked at.
Of course, several years later, science confirmed what we already knew. The original theory which had us all standing straight was complete pseudoscience. Gazelle posture was a very inefficient way to rest, and a generation of youngsters across the globe had been denying their bodies precious oxygen, going against millennia of instinctual biology in the name of scientific progress.
This article is not about optimal recovery positions. This is an article about the dangers of questioning our nature.
On logical fallacies and human irrationality
All true thinkbois have gone through certain reading phases in their life. You’ll have the classics phase (nothing but Seneca, Dostoyevsky, Nietzsche), you’ll have the Economics phase (Graham, Friedberg) and inevitably you’ll also have the Behavioural Science phase. In this phase, you’ll read the works of Kahneman, Tversky and Thaler and become convinced of the theory of logical fallacies.
Logical fallacies, within the context of behavioural science, refer to errors in reasoning that often occur as a result of cognitive biases. These biases are systematic patterns of deviation from norm or rationality in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion. Logical fallacies in behavioural science often highlight how our mental shortcuts and heuristics, developed for efficiency, can lead to flawed or irrational thinking in certain situations.
Basically, this branch of ‘science’ can be summarised with a simple idea: human brains are not rational. These books make it crystal clear why this is so. They’ll outline different cognitive biases, from loss aversion to the sunk cost fallacy, and then show, using lab experiments, how these fallacies cause people to make sub-optimal decisions.
For years, I imbibed this view. I would become that dickhead in meetings questioning decisions by saying ‘could we be suffering from the sunk-cost fallacy here?’. What’s worse, people seemed to think I was smart for doing so.
I now think this theory of human irrationality is a harmful one. Just as it’s harmful to go against nature with our bodies when they are instructing us to keel over to recover, I think it’s harmful to second-guess what our brains are instructing us to be true based on some pseudo-scientific theory we’ve read in some book.
I believe that in the future, science will again prove much of what we know about logical fallacies to be wrong, but why wait for that? Let’s do some armchair hypothesising about some of the most popular and influential cognitive biases.
Confirmation Bias
The first logical fallacy we’ll look at is one you are likely familiar with, good old confirmation bias.
People tend to seek, interpret, and remember information that confirms their preexisting beliefs. In the classic experiment, participants are asked to read two studies, one supporting and one contradicting their beliefs; they often rate the study aligning with their beliefs as more convincing. Confirmation bias can lead to poor or misinformed decisions, as it limits the individual’s ability to view situations objectively and consider alternative perspectives.
Confirmation bias is one of those things that perhaps well-meaning colleagues or friends will shout at you in debates, pointing out that you are just looking for evidence to support your own argument. The trouble with this line of thinking is that it rests on the idea that evidence contradicting your pre-existing belief should be held in equal regard to evidence supporting it. On the surface, this may seem rational, but it’s anything but.
The argument of confirmation bias assumes that your opinion exists a priori, that is, independently of the evidence it took to form that opinion in the first place. But that’s not how opinions are formed. You believe that if you eat your vegetables and plenty of protein, your health and physique will be better, because you’ve seen the success of adhering to that over time. When someone comes along and tells you to eat raw liver and that vegetables are bullshit, you shouldn’t hold that evidence as equal to the evidence supporting what you already believe to be true. As a neophile, I sincerely struggle with this, adopting any meme-driven diet that happens to appear on my Instagram feed with the logic of ‘let’s try it and find out’. This is not logical, nor good reasoning.
Perhaps the issue most of us face when making good decisions is not that we fail to take a broad enough array of counter-evidence into account, but the opposite: we give too much weight to opinions or arguments we don’t believe to be true. How many times have you been talked out of something you initially felt strongly about because you gave too much weight to the counter-evidence?
Ignoring confirmation bias doesn’t mean we never change our minds; it just means that the quality and volume of evidence or opinion required to sway our beliefs must be much higher. It is perfectly rational, when given two pieces of evidence, to put much more faith in the one that already supports what you know. If you are a racist, ignoring confirmation bias doesn’t mean you’ll stay a racist; the moral and social evidence for why racism is bad is completely one-sided.
Loss Aversion
One of the most oft-quoted logical fallacies is that of loss aversion.
Loss aversion is when people feel the pain of losing more intensely than the pleasure of gaining an equivalent amount. For instance, the displeasure of losing $100 is stronger than the pleasure of winning $100. In the irrational brain theory, this is seen as a cognitive bias leading to irrational decisions.
Loss aversion affects decision-making processes, particularly in situations involving risk. Faced with a potential gain or loss, individuals are more likely to act to avoid a loss than to achieve a gain. This can lead to risk-averse behaviour when potential losses are involved, but risk-seeking behaviour when people are trying to avoid a sure loss.
In economics, loss aversion can explain various consumer and market behaviours. For example, investors might irrationally hold on to losing stocks to avoid realizing a loss, or consumers might stick with a suboptimal service to avoid the perceived loss of switching to another.
Honestly, reading over this summary again, I can’t believe I ever fell for such nonsense. How can it be irrational to value a loss over a potential gain of the same amount? Probably the first bit of maths you learn when you start investing is that if you lose 50% of your cash, you need to make 100% just to get back to even. Of course we should be more careful about protecting against losses than chasing gains; of course we should feel the pain of a loss more. It’s our body telling us something.
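If you want to see that asymmetry spelled out, here’s a quick illustrative sketch in Python (the function name is mine, the maths is just the standard break-even calculation):

```python
def required_gain_to_recover(loss_fraction: float) -> float:
    """Fractional gain needed to get back to even after a fractional loss.

    A gain only compounds on what's left after the loss, so we need
    (1 - loss) * (1 + gain) = 1, which gives gain = loss / (1 - loss).
    """
    return loss_fraction / (1 - loss_fraction)

for loss in (0.10, 0.25, 0.50, 0.75):
    print(f"lose {loss:.0%} -> need +{required_gain_to_recover(loss):.0%} to break even")
```

Lose 10% and you need 11% back; lose 50% and you need 100%; lose 75% and you need 300%. The asymmetry is baked into the arithmetic, which is exactly why weighting losses more heavily than equivalent gains looks less like a bug and more like sense.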
You can go far further than money with this argument. Losing an old friend you’ve built a relationship with is, and should be, much more painful than the pleasure of making a new one. It’s also perfectly rational to be more affected by getting laid off than by getting a promotion.
We don’t need to be aware of this ‘cognitive bias’ when making decisions. We don’t need to question the irrational nature of our brains. If we feel ourselves pulled not to do something because we feel averse to the loss, that’s probably the voice we should be listening to. Counter to that, when we feel pulled to take a risk, that’s likely our brain telling us the upside is worth it. I do think that only harm can come from trying to over-intellectualise our decision-making processes.
Sunk Cost Fallacy
Another cognitive bias that is in some ways similar to loss aversion is the sunk cost fallacy.
In simple terms, this fallacy describes people continuing an endeavour because of previously invested resources, even when the further costs outweigh the benefits.
The example always used in pop psychology books is that participants are told they have spent money on a ticket to an event. Later, they learn of a better event on the same day but can’t refund the first ticket. Many stick to the first event. This is framed as irrational: supposedly, we shouldn’t value things we have sunk effort or money into any more than new initiatives.
I couldn’t agree less.
It’s my experience that, especially at the start of any endeavour, you do just need blind faith that it will work. With nearly every single project I’ve started, whether that’s a new career, a startup, or anything else, you never see any results at the outset.
If you were to apply the sunk cost fallacy to starting a YouTube channel, for example, then after 6 months of consistently uploading videos you’d look at the effort put in (say 10 hours per video at 2 videos per week, roughly 500 hours in total) and the upside from that effort (say 200 subscribers and absolutely zero revenue), conclude that you are suffering from the sunk cost fallacy, and choose to move on. But if I hadn’t stuck with my YouTube channel, I’d have made none of the cash I have online, I’d not be writing this newsletter, and I’d likely not have my own startup serving the creator economy.
Indeed, the more cost you’ve sunk into something, the more subconscious conviction you must have in your pursuit. If you’ve worked on something for 3 years with no results and are still not at the point of quitting, you likely just need to tough it out a little longer.
This isn’t just true for business endeavours but also for relationships. Of course there are examples of people staying in relationships longer than they should, but generally the weight we give to the sunk cost of a relationship is justified: you’ve invested time and energy, you’ve built experiences together, and, while it may not be perfect, you should of course put a lot of weight on that history rather than viewing an alternative with equal credence.
Why then do we read these books and believe their arguments? I think it comes down to something behavioural scientists do with their experiments.
In the example of the sunk cost fallacy, the experiment chosen (valuing a ticket you’ve been told you’ve spent money on versus a better event on the same day) is so limited in its scope that it does nothing to explain how human brains think about complex problems. It isn’t science to take an incredibly reductionist example of decision-making, label it as a cognitive bias, and then recommend people apply the learnings of this bias to the much more complex system of real-life decisions. It is the same flimsy logic that has given us arguments such as Richard Dawkins’ The Selfish Gene and Yuval Noah Harari’s storytelling-as-evolution.
Against nature
Hopefully, I’ve made my stance on Behavioural Science and logical fallacies clear. I think they are mostly bullshit. They attempt to explain human decision-making, an incredibly complex phenomenon with billions of variables, through reductionist theories and simplistic experiments. These theories are then packaged up in legitimate-sounding books and peer-reviewed studies and sold to the masses in an attempt to shine a light on how our own irrational brains are not suited to this complex, scientific modern world.
It’s arrogant for academics and pseudo-scientists to sit together and create these rules and heuristics for how the human brain works. We know less about the brain than we do about the far reaches of space.
I’m no humanist, and I certainly don’t think human beings are perfect. Maybe one day we’ll have an answer to why we behave in ways that often seem destructive and harmful, but until that day comes, I’m trusting my gut.
Big ups
Tom x