How to detach yourself from a bad idea (w/ Prof. Adam Grant, #73)
Wharton's top-rated professor on detaching from wrong ideas
“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”
That phrase may or may not have been said by Mark Twain,1 but its truth is obvious—wedding yourself to a wrong idea makes you vulnerable.
That’s because you tend to rely on, and act upon, things you believe are true. If you “know” there’s a 0% chance of rain, you might walk to work in a suit. If the sky opens up, it’s not going to be a comfy trot home.
Though Adam Grant didn’t necessarily have rainstorms in mind when the Wharton psychologist wrote his 2021 best-seller, Think Again: The Power of Knowing What You Don't Know, the downside is analogous: rely too much on a belief or opinion and you may find yourself caught in a storm.

Throughout the book, Grant teaches many science-backed lessons, but my favorite—perhaps the single best question you can ask yourself to stress-test your ideas—is one I’ve used not only on myself, but on others, to dislodge a belief.
And it’s the subject of today’s OGT.
“What would have to be true?”
My favorite chapter in the book is about how to detach yourself from ideas.
Attachment. That's what keeps us from recognizing when our opinions are off the mark and rethinking them. To unlock the joy of being wrong, we need to detach. I've learned that two kinds of detachment are especially useful: detaching your present from your past, and detaching your opinions from your identity.
Here, we’ll focus on the latter: prying yourself loose from beliefs.
To show this, Grant uses the example of people called “superforecasters.”2 These are regular people who, via various tests and in competition, show an extraordinary ability to predict future sociopolitical events—like Trump’s election.
But it turns out they aren’t just genetically gifted. They use teachable tactics—like changing their minds.
The single most important driver of forecasters’ success was how often they updated their beliefs.3
But how? That’s the hard part.

One superforecaster Grant highlights has a method that turns out to be research-backed:
[He] has a favorite trick for catching himself when he's wrong. When he makes a forecast, he also makes a list of conditions in which it should hold true, as well as the conditions under which he would change his mind.
Grant elaborates with the key question:
What forecasters do in tournaments is a good practice for life. When you form an opinion, ask yourself, “What would have to happen to prove it false?”
Research suggests that identifying even a single reason why we might be wrong can be enough to curb our overconfidence.
The OGT: The Key Question
Another way of saying this question: What would have to be true for you to change your mind?
It forces you to make the list: to actually measure your thoughts against a standard, rather than letting an abstract view go untethered.
If you force yourself to think about what it would take to think differently, you might find one of two things is true:
Either (a) there actually is nothing that would make you change your mind. In that case, you’re not holding the belief the way a scientist holds a hypothesis; you’re being dogmatic. And you have to ask yourself: am I OK with that?
Or (b) you might find you’re already quite close to that threshold, and the idea wasn’t as sacred as you once thought.
Give it a try.
Let me know if anything changes.
https://quoteinvestigator.com/2018/11/18/know-trouble/
If you want to know more about this, you should read Philip Tetlock—the architect of superforecasting—and his book of the same title.
On tests of prediction, for instance, the superforecasters would update their beliefs (think erase and change) twice as much as average predictors.