The Invisible Strings

Biases. We all have them, but we don’t choose them. They subconsciously colour everything we know, learn, and perceive. The books we read, what we see on the news, the jokes we hear, and even what we come to understand as facts: all are filtered and distorted through the internal lens of bias.

Yet we like to think we’re rational people.

In economic schools of thought, humans are often portrayed as rational beings, able to control their impulses and suffer short-term losses for long-term gains. In truth, nothing could be further from reality. We lack control of our impulses, choosing small present pleasures over long-term success and forsaking our future selves. We often make poor decisions regarding our health, money, and happiness. How frequently do we choose an expensive night out while student loans loom over us, or make dietary choices we know we will regret later?

This ‘Economic Man’ does not exist. Instead, we face a constant struggle for control of ourselves against inherent, unconscious biases.

These ‘Cognitive Biases’ pervade everything we do to an uncomfortable degree. We like to think of ourselves as tiny pilots inside these large bodies, meticulously controlling each and every thought, word, and action. In truth, we are asleep at the wheel, letting the autopilot make our journey for us. And in the fleeting moments we really do take the helm, we don’t even have the luxury of an instruction manual. We are often lost, confused, and almost always making mistakes.

But we are not hopeless.

Psychologist Daniel Kahneman tackles the subject of cognitive biases in his book Thinking, Fast and Slow, where he reveals just how easy it is for us to make simple errors in reasoning. We are prone to overconfidence, blind to our own mistakes, quick to detect imaginary patterns, and apt to construct subjective realities we see as anything but subjective.

He illustrates this with the ‘Linda Problem’. Imagine a young woman named Linda, who is outspoken, intelligent, and concerned with discrimination and other social problems. Is it more likely that Linda is (1) a bank teller, or (2) a bank teller who is active in the feminist movement? In a clear demonstration of flawed reasoning, almost everyone selects option two. Logically, every feminist bank teller is also a bank teller, so option one must be at least as likely as option two. Don’t feel too bad if you picked the second option; so did over 85 per cent of students at Stanford’s Graduate School of Business, all of whom had studied probability in depth.
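To see the arithmetic concretely, here is a minimal sketch in Python; the probabilities are invented purely for illustration and are not taken from Kahneman.

```python
# A minimal sketch of the conjunction rule behind the Linda Problem.
# The probabilities below are invented purely for illustration.

p_teller = 0.05                 # assumed chance Linda is a bank teller
p_feminist_given_teller = 0.95  # assumed chance she is a feminist, given she is a teller

# The chance of two things both being true can never exceed
# the chance of either one alone.
p_teller_and_feminist = p_teller * p_feminist_given_teller

print(p_teller)                          # 0.05
print(round(p_teller_and_feminist, 4))   # 0.0475, always less than or equal to 0.05
```

However likely the description makes the feminist detail feel, the combined option can never be the more probable one.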

My personal favourite of these unfortunate mental quirks is the ‘Backfire Effect’. In a 2006 study by political scientists Brendan Nyhan and Jason Reifler, people from across the political spectrum were given news articles on polarizing issues. Immediately afterward, they were given another article from the same source correcting the first. For example, both liberals and conservatives read an article suggesting the United States had found weapons of mass destruction in Iraq, followed by a correction stating that no such weapons had been found. The effect on conservatives was startling: exposure to the correction actually made them more likely to believe the false claim. Around 30 per cent of conservatives who read only the first article believed the U.S. had found WMDs in Iraq, but among those who also read the correction, belief jumped to 60 per cent.

They repeated the experiment with a wide variety of issues and found that evidence against strongly held beliefs can actually increase the strength of those beliefs, regardless of political ideology.

Being proven wrong just makes us believe even harder.

Just as Confirmation Bias stops you from seeking out information that challenges your world view, the Backfire Effect protects the beliefs held closest to your identity, no matter how irrational or unjustified they may be.

The existence of these biases poses an immense problem for us on a societal scale. How are we to engage in meaningful discourse if our minds are against us from the start? We are wired to avoid thinking objectively and critically; instead, we emotionally identify with certain beliefs, attitudes, and ideologies, holding them close to how we perceive ourselves as people and shielding them from the dangers of evidence and reason.

We see this time and time again in all manner of beliefs, trivial and important, political and religious. We are so averse to seeing our beliefs challenged and overturned that we mentally shut down during any such attempt. Nuance becomes our antithesis, and the world is drained of its shades of grey.

Our current political situation reflects this. We are incredibly susceptible to tribal mentalities, identifying with a set of beliefs and refusing to deviate from it. Political advertisements play on our biases, offering emotional arguments built on flawed reasoning and smearing opponents with baseless claims, often questioning their patriotism or insisting they’re ‘just visiting’.

Enormous numbers of people deny the very reality around them, rejecting the consensus of the world’s experts in fields ranging from climate change to vaccination. It is clear that our susceptibility to these biases does not improve our modern situation or aid our collective cause as a species. Yet we remain bound to them, ingrained at a genetic and psychological level and seemingly beyond our capacity to change.

What then, do we do?

In his book A Theory of Justice, philosopher John Rawls presents his readers with a thought experiment designed to derive the principles of an ideal society. He asks them to imagine themselves as part of a group designing a new social contract, a situation he calls the Original Position. To do this effectively, the designers must remove themselves from their current circumstances and adopt the Veil of Ignorance: they must reason not from the perspective of who they are now, but from that of anyone who will be subject to their eventual creation. He is asking them to make of themselves a blank slate.

What this will achieve, Rawls argues, is a truly just society. No one knows where they will end up from behind the Veil of Ignorance; their social status, intelligence, and wealth all remain unknown. Because of this, they will design a society in which social and material goods are distributed equitably.

Rawls is asking his readers to separate themselves from their biases, to abandon their presuppositions and assumptions about the way the world is supposed to be, and to consider everything in a new light. Unburdened by their biases, he asks them to be objective, empirical, and, above all, to think critically about the things they have so far taken for granted.

Adopting the Veil of Ignorance may well be an effective way to reduce bias, although fully shedding our biases is probably impossible. Still, the first step in mitigating them is simply to be aware of them. We must acknowledge that they exist, concede that we are all subject to them, and find ways to work around them and minimize their impact.

Perhaps the most effective solution would be to rework aspects of the social contract. If we change how our political and social institutions function, we may be able to account for at least our most common errors of judgement.

We are constantly pushed and pulled in various directions by all aspects of our society. The things we take in through our media, our friends, and our entertainment have profound and lasting effects on how we see the world, as do our political and social institutions. We are not given the choice as to whether these things affect us; they simply do.

Why, then, shouldn’t we make sure they push and pull us towards good decisions? This, of course, is not to suggest a loss of choice or autonomy, but rather a change in the way choices are presented to us.

This is the argument made by Richard H. Thaler and Cass R. Sunstein in their 2008 book Nudge. The two authors detail various ways that governments and other organizations can help people avoid common mistakes caused by cognitive bias. They suggest an approach that maximizes personal freedom and choice while curating the way those choices are presented, nudging people towards better decisions. They call this ‘Libertarian Paternalism’.

One of their examples deals with Germany and Austria, two countries that are quite similar demographically, culturally, and geographically. And yet only 12 per cent of Germans are registered organ donors, while nearly 99 per cent of Austrians are. What causes this startling difference? The default setting. Germany uses an opt-in system, while Austria uses an opt-out system. In both cases, the vast majority of people simply stick with the default, either because they read it as the ‘correct’ choice or simply out of laziness.
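A rough sketch of the arithmetic, assuming for illustration that 90 per cent of people in either country leave the default untouched (an invented figure, not one from Nudge):

```python
# A rough sketch of how the default setting dominates donor rates when
# most people never change it. The 90% "stick with the default" rate is
# an assumption for illustration, not a figure from Nudge.

def donor_rate(default_is_donor: bool, stick_with_default: float = 0.9) -> float:
    """Share of the population registered as donors under a given default."""
    if default_is_donor:
        # Opt-out (Austria-style): donors are everyone who never touches the form.
        return stick_with_default
    # Opt-in (Germany-style): donors are only the minority who actively opt in.
    return 1.0 - stick_with_default

print(f"Opt-in default:  {donor_rate(False):.0%}")  # 10%
print(f"Opt-out default: {donor_rate(True):.0%}")   # 90%
```

The same population, with the same share of people who never act, ends up with wildly different donor rates depending only on which box is pre-ticked.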

Simply changing the German default to opt-out is one of the book’s suggested nudges. This small but powerful change in choice architecture would be immensely valuable, saving both lives and money.

Another such nudge is taking place in school cafeterias around the world. Food selections are being rearranged, not changed, to encourage healthier choices. Many studies have found that simply making fruits and vegetables the first option students see dramatically increases their sales. Going further and making healthier choices easier to reach, while displaying them more prominently than less healthy options, measurably changes students’ eating habits. This little nudge goes a long way in combating health problems such as obesity, while also avoiding the massive costs associated with them down the road.

These small changes play on our cognitive biases. By harnessing them, we can nudge ourselves into making better choices in all aspects of our lives.

The United Kingdom’s government went so far as to create a Behavioural Insights Team to apply the principles of Nudge to public policy. The team, largely composed of psychologists and economists, applies what we know about cognitive biases to nudge people towards better choices, often by shaping the way those choices are presented.

The Behavioural Insights Team and its nudges have already saved the U.K. government hundreds of millions of pounds. The team has quadrupled in size since its inception and now advises foreign governments as well as international organizations such as the World Bank.

Inspired by the U.K.’s success, the Ontario government has decided to set up its own Behavioural Insights Unit. A few pilot projects are already under way, one of which tackles organ donation.

Everyone is susceptible to these biases. They lower our quality of life, causing us to make poorer choices about our health, our money, and our happiness. We fall for poor arguments, accept the flawed logic of demagogues, and at times even deny the reality around us. These small errors in reasoning have dire consequences. All the more reason to acknowledge and combat them.

Exercising the capacity to think critically and objectively is one method. Taking a more critical look at the assumptions we take for granted, the things we hold nearest to our identities as human beings, goes a long way. Letting go of these presuppositions and starting from a blank slate allows new perspectives to shine through. We learn more about ourselves and the world around us, and perhaps may even overcome the cognitive quirks of our minds.

We can go even further, making systemic changes to the ways we go about our lives. Small nudges in the ways we make choices about our health and money, and the ways we interact with businesses, government, and each other, will do much of the work for us.

But above all, we must work to engage on a deeper level with the ideas we are exposed to every day. Being more considerate of the world around us, and of the ways in which it shapes us and we in turn shape it, will help us be a little better than we are today.

We are not rational. But that doesn’t mean we shouldn’t try.
