One of the most painful moments in my life was accepting as a young teen that new information does not necessarily change people’s minds. Including mine.
Before then, I approached disagreements with the view that if I could only get the other person to grasp some information I had, they would reach the same obvious conclusions I had. If only I could show them what I understood, I thought, surely they were reasonable enough to see it?
Part of really growing up was learning that it’s not that simple, and that logic is not actually the most powerful tool for convincing people.
And—this took me longer to accept—that isn’t a bad thing.
If you’re anything like many people I’ve talked to about this, it might be hard for you to accept as well. How is it not a bad thing that people aren’t convinced by facts and logic? How is it okay that emotions and stories have so much power?
Let’s talk about it.
Right now in the US, there are people to whom it is obvious that the most recent elections were free and fair and a new president is on the horizon. And there are apparently just as many to whom the obvious reality is that fraud occurred. One group is certain that the US is about to start heading away from certain disaster, and the other is just as certain there’s a threat to the most important force keeping the republic from dropping off a cliff.
Both can’t be right. Right?
Thanks to the cultural power of US politics, this dispute over their elections is an inescapable subject, but of course it’s just one of myriad such disputes that play out across our world. (Another would be the arguments I get into now and again with people convinced beyond doubt that Apple is a sheep-cultivating company selling overpriced, under-featured devices to stupid people.)
We’ve all seen people on both sides throw facts at each other, certain that any reasonable person must draw only one conclusion from these facts.
Which in turn implies that those who do not draw said obvious conclusion must not be reasonable. Or just evil. Maybe not even human.
How else would they not see something so obvious?
Well…maybe because it’s not that obvious?
Now and then the internet is hit with some perception dilemma: do you hear Laurel or Yanny? Does the dress look white and gold or blue and black? And we joke about how incredible it is that people of sound mind can see or hear the same things entirely differently.
Things we find obvious seem just as obviously different to others.
And so, when we wonder why facts seem so often in dispute in our day, it would serve us to remember something our forebears always knew, and which I’ve written about in a couple of previous essays: we are moved, not by information, but by narrative.
Put differently, we don’t relate to facts but to what those facts mean to us. You might even say that to be human is to be a being defined by stories.
In other words, facts perhaps aren’t so much in dispute as they never held such a big place to begin with.
This isn’t to say facts don’t matter. Rather, the fact is, how much facts matter is defined by the narratives we fit them into.
And make no mistake: not even the most intelligent or intellectual among us is exempt. We can (and should) build our narratives around facts as much as we can, but we cannot escape that need for narratives.
Speaking of narratives: a word on beliefs.
What people believe is more important for predicting their actions than what they know. That’s why someone can be super knowledgeable and otherwise smart and yet believe and do absurd things, like hold on to superstitions despite knowing they’re superstitions.
We have a tendency to assume that what people believe is based on what they know. And it is. But not in the way we often think. What people believe is based on what feels true to them, and that’s in turn based on what they know.
But that bit about “what feels true,” the part of the equation we tend to overlook, is precisely the most crucial part.
Let’s try putting it in a little equation of sorts, shall we?
What we know → What seems true → What we believe
That’s how things really stand. Let’s call it model #1. But our typical assumption (let’s call it model #2) is:
What we know → What we believe
But model #2 doesn’t fit with experience. People we think should know better surprise us with their full investment in all sorts of unexpected and seemingly unexplainable beliefs.
They seem unexplainable because we overlook the middle element (what seems true), which holds it all together. The unexpected part can’t be helped, but recognising the middle element might help us be less surprised at people’s beliefs.
So let’s talk about that middle bit.
What determines what seems true?
The term for this, coined by none other than American late-show host Stephen Colbert, is truthiness. To be fair, Colbert used the word to describe a tendency to give more weight to what’s perceived as true while ignoring objective reality. I am, however, using the slightly broader meaning of what feels true, whether it is or not. And my point is that our beliefs are based on what feels true. But it’s an entirely different question whether we have done the personal work of training ourselves to feel truth as truthy.
Because, let’s face it, many truths don’t feel truthy, especially at first blush.
Beliefs, too, are ultimately about narratives. And narratives aren’t really about logic (even for the most logically inclined people) so much as they’re about meaning and truthiness.
In the meantime, truly intelligent people can carry on with seemingly absurd beliefs.