Evaluate Better

Raul Popadineți
17 Dec, 2020 • 3 min read
Photo by Scot Graham on Unsplash

We seek to be right in our arguments. We want people to believe us when we present a fact or a story. The problem is that we're biased. It's hard to present an indisputable truth when we talk only about what we know and leave out what we don't.

The Confirmation Error

We rely too much on what we know. Cicero, the great Roman orator, once showed someone a picture of people on a boat who prayed to God during a storm and survived. The inference: if you pray, you'll receive the mercy of God. But the person replied:

Show me the pictures of all those who prayed to God but died.

And that's impossible because they did not live to tell the story.

To prove our point, we always seek confirmation. We look for a few people, books, or resources that confirm whatever we say. This hardly proves truthfulness: one opposing argument can bring all our beliefs crumbling down.

A better way to rewire our brain is to look for things that contradict what we consider right. It offers new perspectives and helps us refine our initial thinking.

The Narrative Error

We tend to believe something that has a cause or a story behind it; it makes it more appealing to our brain. The mind uses narratives to memorize things, so it's easy to lean towards these stories. A good example of this is:

  1. Bob seemed to have a good day at work. He quit his job.
  2. Bob seemed to have a good day at work. He quit his job to open a woodworking shop.

Our first thought is that #2 is more plausible than #1 because it comes with a cause. But #1 is more general: it includes #2 as just one possibility among a thousand other reasons Bob might have quit, such as having to move with his family to a different city, or having done something that would bankrupt the company in the following weeks. The more specific story can never be more probable than the general statement that contains it.

We can also look at the stories that appear in newspapers and online. They string together many events, not all of them necessarily true, yet we take them for granted because they are told coherently. We won't launch an investigation to determine whether a story is fake; we usually assume it's genuine and move on to the next one. With the abundance of content we find online nowadays, this happens more and more often.

More Rabbit Holes

The trickiest part is that both brain hemispheres fall into the confirmation or narrative error trap when trying to prove something is indisputable. The right hemisphere has an affinity for novelty and works with particulars (e.g., a tree), whereas the left has an affinity for patterns and structures (e.g., a forest). One will use a story to prove it's right; the other will look for a few confirming facts.

Another interesting thing is that we can't stop theorizing. The more we think about something, the more it becomes subject to our interpretation. The question is: what do you think consumes more energy, theorizing or not theorizing?

The answer is probably not theorizing, because suspending our interpretations is unnatural to the human being and takes deliberate effort.

What now?

We have to realize that our beliefs must stay adaptable to keep up with an ever-changing world. Defending something zealously by referring solely to what we know is an act of ignorance and foolishness. Thinking about what we don't know will make us evaluate things better. It opens a whole new world, one where we can show more empathy and compassion for one another. Who knows? Maybe the person we thought was completely wrong has a valid point after all.



All these are excerpts and conclusions I took from The Black Swan by Nassim Nicholas Taleb, probably one of the most eye-opening and insightful books I've read in the past couple of years. Its ideas apply in many domains, especially business, where we deal with improbability all the time.
