Think Again: The Power of Knowing What You Don’t Know by Adam Grant (2021) is a new addition to the growing body of mainstream books about mental blind spots, cognitive biases, and thinking errors. Every individual possesses cognitive tools and accumulated knowledge that they regularly rely upon. But we rarely question or examine this knowledge, which includes beliefs, assumptions, opinions, and prejudices. Blind adherence to these tools can result in poor outcomes: inflexible overconfidence, bad decision-making, avoidable errors, and failures to learn and grow. How do we know what we know, and how do we know if we’re right? Grant’s solution is an idea he calls “rethinking.” Rethinking is the process of doubting what you know, being curious about what you don’t know, and updating your thinking based on new evidence (in other words, the scientific method).
In the first chapter of the book, Grant outlines three common mindsets coined by political scientist Phil Tetlock: preacher, prosecutor, and politician. We routinely fall into one or more of these roles when we engage with others and in our solitary conversations with ourselves. When we’re locked in preacher mode, we are set on promoting our ideas (at the expense of listening to others). When we’re in prosecutor mode, we actively attack the ideas of others in an effort to win an argument. Politician mode seeks the approval of others and has little conviction for the truth. In each of the three mindsets, the truth takes a back seat to other considerations: being right, defending your beliefs, and currying favor.
Grant recommends a fourth role to offset those found in Tetlock’s model: the mindset of the scientist, which embraces Grant’s idea of rethinking. Those with a scientific mindset search for truth by testing hypotheses, regularly running experiments, and continuously revising their thinking as they uncover new truths. In this mode of thinking, changing your mind is a sign of intellectual integrity, not moral weakness or a failure of conviction. The scientist mindset is a key through line in the book; it offers a superior path to improved thinking, true knowledge, and lifelong learning.
Think Again is structured into three main parts. The first part considers rethinking at the individual level. The second part explores how to encourage and influence other individuals to engage in rethinking. The final part looks at rethinking at the institutional or group level.
Prologue
- Opening story: Smokejumpers and the Mann Gulch fire (Montana) of 1949. The team was inserted into challenging conditions and the fire quickly overtook them. Some smokejumpers held on to their equipment (as they were trained to do) despite the added weight (possibly) preventing them from surviving. Wagner Dodge made a quick decision to build an escape fire and lie down in the charred area while the wildfire raged around him. Most of the other smokejumpers perished.
- In addition, the mission was based on mistaken assumptions about wildfires—that immediate suppression was the optimal strategy. We’ve since come to rethink our approach to remote wildfires.
- Conventional vs. new views of intelligence:
- Conventional view: intelligence is the ability to think and learn.
- Alternative view: intelligence is the ability to rethink and unlearn, i.e. flexible thinking. Grant argues these cognitive skills are essential in a turbulent and changing world.
- Psychologists find that test takers who second-guess their answers usually have better outcomes with their revised answers.
- First-instinct fallacy: Your gut instinct or first response isn’t always right (even though we are taught to trust our gut).
- “We don’t just hesitate to rethink our answers. We hesitate at the very idea of rethinking.”
- “Questioning ourselves makes the world more unpredictable. It requires us to admit that the facts may have changed, that what was once right may now be wrong.”
- Seizing and freezing: the phenomenon where we “stick to our guns.” In part, we do this for psychological comfort.
- Everyone carries cognitive tools that are regularly used and seldom questioned or subject to reflection or scrutiny. These include beliefs, assumptions, opinions, and more.
Part I: Individual Rethinking
Chapter 1: A Preacher, a Prosecutor, a Politician, and a Scientist Walk into Your Mind
- Opening story: Mike Lazaridis, the founder of the BlackBerry smartphone. Lazaridis was brilliant and turned BlackBerry into a popular business tool. But when the iPhone was released, Lazaridis failed to change his thinking to respond to a rapidly changing mobile device market. He stubbornly clung to the idea that people wouldn’t want to use smartphones for games, entertainment, and other tasks (beyond email, phone calls, and texting).
- It’s easy to notice when others need to change their opinions, but difficult to develop the same habit for ourselves.
- “When it comes to our own knowledge and opinions, we often favor feeling right over being right.”
- Political scientist Phil Tetlock’s mindset model: preachers, prosecutors, and politicians.
- Preachers: We pontificate and promote our ideas (sometimes to defend our ideas from attack). Changing your mind is a sign of moral weakness.
- Prosecutors: We attack the ideas of others, often to win an argument. Being persuaded is defeat.
- Politicians: We try to win the support of others, optimizing for approval and agreement (over personal conviction). We change our opinion opportunistically.
- Scientist: Grant appends this professional worldview to Tetlock’s mindset models.
- Rethinking is fundamental to scientific thinking.
- “You’re expected to doubt what you know, be curious about what you don’t know, and update your views based on new data.”
- Search for truth through testing hypotheses, running experiments, and uncovering new truths.
- Changing your mind is a sign of intellectual integrity and a response to evidence.
- “Hypotheses have as much of a place in our lives as they do in the lab. Experiments can inform our daily decisions.”
- In a study of entrepreneurs, a test group was encouraged to use scientific thinking to develop a business strategy. The test group outperformed the control group significantly and tended to pivot twice as often.
- Remember: real-life scientists can easily fall into preacher, prosecutor, or politician mode too. They are just as prone to forgetting their professional tools.
- Confirmation bias: Seeing what we expect to see.
- Desirability bias: Seeing what we want to see.
- Instead of searching for reasons why we are right, search for reasons for why we are wrong.
- “The purpose of learning isn’t to affirm our beliefs; it’s to evolve our beliefs.”
- The rethinking cycle: Humility => Doubt => Curiosity => Discovery
- The overconfidence cycle: Pride => Conviction => Confirmation and Desirability Biases => Validation
Chapter 2: The Armchair Quarterback and the Imposter
- Opening story: Ursula Mercz, a patient in the late 1800s, lost her sight but insisted she could see; she was completely unaware of her blindness. Researchers in the 20th century reported similar findings: patients unaware of their condition and unable to learn from experience.
- Anton’s syndrome is a condition whereby an individual is oblivious to a physical disability due to damage to the occipital lobe of the brain.
- “In theory, confidence and competence go hand in hand. In practice, they often diverge.”
- Armchair quarterback syndrome: Phenomenon where confidence exceeds competence.
- Imposter syndrome: Phenomenon where competence exceeds confidence.
- The Dunning-Kruger effect: Identifies the disconnect between competence and confidence. The most confident are often the least competent.
- We routinely overestimate our abilities.
- David Dunning: “The first rule of the Dunning-Kruger club is that you don’t know you’re a member of the Dunning-Kruger club.”
- Competence and confidence don’t progress at the same rate:
- We mistake experience for expertise.
- Beginners rarely make Dunning-Kruger errors.
- But a small amount of knowledge can create big problems: in the Dunning-Kruger trap, confidence climbs faster than competence.
- “Humility is often misunderstood. It’s not a matter of having low self-confidence. One of the Latin roots of humility means ‘from the earth.’ It’s about being grounded—recognizing that we’re flawed and fallible.”
- Confident humility: An ideal wherein the individual has faith in their abilities but retains sufficient doubt and flexibility to recognize they could be wrong. Because of this they remain curious and flexible, always seeking the truth.
- Be confident in your ability to learn more than in your knowledge (which is malleable).
- “A mark of lifelong learners is recognizing that they can learn something from everyone they meet.”
Chapter 3: The Joy of Being Wrong
- Opening story: 1959 Harvard study by Henry Murray (psychologist). Murray designed a test in which subjects (Harvard students) were interrogated. The interrogators would aggressively assault the subjects’ world-views (the goal was to mentally stress the participants). Students whose identities and ideologies were strongly intertwined (non-flexible thinkers) cracked. Those who embraced flexible thinking did not. One of the subjects was Ted Kaczynski (The Unabomber); he had one of the strongest negative responses to the study.
- View being wrong as a good thing; an opportunity to learn something new.
- Isaac Asimov: “Great discoveries often begin not with ‘Eureka!’ but with ‘That’s funny…’”
- Totalitarian ego: Psychological term for the mental gate-keeper that keeps threatening information out of our heads. Our mini internal dictator.
- Richard Feynman (physicist): “You must not fool yourself—and you are the easiest person to fool.”
- Decouple your identity from your beliefs. Better yet, make your identity one in which you actively seek truth and knowledge—this opens you up to curiosity and rethinking.
- Two types of detachment:
- Detaching your present from your past.
- Detaching your opinions from your identity.
- “Who you are should be a question of what you value, not what you believe.”
- Values are core principles like excellence, generosity, freedom, fairness, integrity, etc.
- Values retain flexibility that opinions do not.
- “Better judgment doesn’t necessarily require hundreds or even dozens of updates. Just a few more efforts at rethinking can move the needle.”
- Actively seek out reasons why you might be wrong. Even a single idea can curb overconfidence.
- Jeff Bezos: “People who are right a lot listen a lot, and they change their mind a lot. If you don’t change your mind frequently, you’re going to be wrong a lot.”
- Jean-Pierre Beugoms (forecaster):
- Make a list of conditions in which your forecast holds true.
- Make a list of conditions under which you would change your mind.
Chapter 4: The Good Fight Club
- Opening story: Orville and Wilbur Wright and the chemistry the two brothers had as intellectual partners. They challenged each other’s thinking and this allowed them to improve their ideas through a continuous feedback loop.
- Wilbur Wright: “Honest argument is merely a process of mutually picking the beams and motes out of each other’s eyes so both can see clearly.”
- Relationship conflict: Personal feuds and arguments (e.g. “I hate you!”).
- Task conflict: Arguments over specific ideas and opinions (e.g. “What should we eat for dinner?”).
- Task conflict can be beneficial and generate better outcomes.
- Be careful to avoid letting task conflict turn into relationship conflict.
- Challenge network: A trusted group of peers to point out blind spots and errors in our thinking.
- The illusion of explanatory depth: We think we know more about things than we really do.
- When pressed to explain how something works, our understanding breaks down.
- Example: How does a bicycle, piano or appliance work? Exploring these questions reveals the limits of our knowledge.
Part II: Interpersonal Rethinking
Chapter 5: Dances with Foes
- Opening story: International debate champion Harish Natarajan vs. Debra Jo Prectet (later revealed to be a computer AI). Debate topic: Should preschools be subsidized by the government? Harish must argue the unpopular position against subsidies (most of the audience starts out in favor of them). Harish uses a powerful combination of techniques—common understandings, non-judgmental questions, flexible thinking—to win over some in the audience.
- Adversarial vs. collaborative approach:
- Adversarial approach: Common tendency to go into preacher or prosecutor mode without listening to the other party.
- Collaborative approach: Leads with humility and curiosity. Invites the other party to think like scientists.
- Logic bully: Someone who overwhelms others with rational arguments. The others might not agree with those arguments, but they are left defenseless and bitter.
- Tactics of expert negotiators:
- Plan ahead to determine where they can find common ground.
- Present fewer reasons to support their case. Weak arguments dilute strong ones.
- Express curiosity with questions like “So you don’t see any merit in this proposal at all?”
- Express their feelings about the process and ask about their opponent’s feelings, e.g. “I’m disappointed in the way this has unfolded; are you frustrated with it?”
- “We won’t have much luck changing other people’s minds if we refuse to change ours. We can demonstrate openness by acknowledging where we agree with our critics and even what we’ve learned from them.”
- Use a steel man (instead of a straw man) and consider your opponent’s strongest argument.
- The stronger a person’s belief, the more important the quality of the reasons or justifications.
- “The person most likely to persuade you to change your mind is you. You get to pick the reasons you find most compelling, and you come away with a real sense of ownership over them.”
- Posing questions and letting the other person draw their own conclusions is more powerful than trying to give them your answer.
- Stop trying to convince others about the right answer. Open their mind to the possibility they might be wrong and let them work their way to the solution.
Chapter 6: Bad Blood on the Diamond
- Opening story: Daryl Davis is a musician and a Black man. In 1983, he was playing a gig. He struck up a conversation with a white man who was a member of the Ku Klux Klan. Rather than respond with hostility, Daryl was curious. He asked the man “How can you hate me when you don’t even know me?” The men became friends and the KKK member eventually renounced his membership. Daryl has gone on to befriend a number of former members who have similarly disavowed their past beliefs.
- Strong opinions like stereotypes and prejudice are less likely to be reconsidered.
- “A rivalry exists whenever we reserve special animosity for a group we see as competing with us for resources or threatening our identities.”
- We identify with our group or tribe. We distinguish ourselves from our adversaries—they are everything we are not.
- We preach the virtues of our side.
- We prosecute the vices of our rivals.
- As social beings, we are motivated to seek belonging and status. Group identification helps us achieve these goals.
- Group polarization: The phenomenon whereby interacting mainly with people like us makes our beliefs more extreme.
- Conformity with group orthodoxy maintains cohesion.
- Status is gained by holding the purest expression of these views.
- Many beliefs are arbitrary and based on flimsy foundations.
- The overview effect: Astronauts who experience space travel gain a unique understanding of humanity. After seeing Earth from above, their perspective changes and they see the commonality of our existence.
- Counterfactual thinking: considering alternative realities, imagining different circumstances and outcomes.
- “The very notion of applying group stereotypes to individuals is absurd.”
Chapter 7: Vaccine Whisperers and Mild-Mannered Interrogators
- Opening story: Marie-Helene Etienne-Rousseau of Quebec gives birth to a premature child who would benefit from a measles vaccine, but Marie-Helene is against vaccines. A “vaccine whisperer” is called in. He exhibits many of the characteristics of the skilled negotiators from Chapter 5 and leads Marie-Helene to decide for herself to vaccinate her child.
- Motivational interviewing: An approach built on the premise that the best way to change someone’s mind is to help that person make the change on their own.
- “We don’t know what might motivate someone else to change, but we’re generally eager to find out.”
- Three key techniques are used:
- Asking open-ended questions.
- Engaging in reflective listening.
- Affirming the person’s desire and ability to change.
- Gentle recommendations are offered that allow the other person to maintain agency, e.g. “Here are a few things that have helped me—do you think any of them might work for you?”
- The interviewer serves as a guide, not a leader or advisor.
- “Listening well is more than a matter of talking less. It’s a set of skills in asking and responding. It starts with showing more interest in other people’s interests rather than trying to judge their status or prove our own.”
- “Many communicators try to make themselves look smart. Great listeners are more interested in making their audiences feel smart.”
Part III: Collective Rethinking
Chapter 8: Charged Conversations
- Opening story: Columbia University’s Difficult Conversations Lab. Psychologist Peter T. Coleman runs experiments to reverse-engineer successful conversations between people about polarizing issues. One finding: framing issues as binary (i.e. black and white) leads to polarization, but presenting issues as complex, with many gradations of viewpoints, leads to greater cooperation.
- Binary bias: The human tendency to seek clarity by reducing a spectrum of categories to two opposites.
- Presumes the world is divided into two sides: believers and non-believers. Only one side can be right because there is only one truth.
- Binary bias promotes us vs. them hostility and stereotyping.
- The antidote is to “complexify” by showing the range of views for a given topic.
- Binary thinking results in fewer opportunities for finding common ground.
- “Resisting the impulse to simplify is a step toward becoming more argument literate.”
- Most people believe (wrongly) that preaching with passion and conviction is the best way to persuade others. The author repeatedly refutes this idea.
- Skeptics vs. deniers:
- They are not the same.
- Skeptics are those who don’t believe everything they hear. They look for information to update their thinking.
- Deniers reject anything from the “other side.” They revert to preacher, prosecutor, and politician modes. Their conclusions are predetermined.
- “Skepticism is foundational to the scientific method, whereas denial is the a priori rejection of ideas without objective consideration.”
- Practical tip: Favor content that presents many sides of an issue rather than a singular or binary view. “Recognize complexity as a signal of credibility.”
- Nuance is not rewarded by the attention economy.
- Outrage goes viral and makes for better sound bites.
- “Psychologists find that people will ignore or even deny the existence of a problem if they’re not fond of the solution.”
- Think about how this plays out in politics.
- Conservatives are more receptive to climate solutions that involve green-tech innovation than to those that entail restrictions (e.g. caps on vehicle emissions).
- Perspective-seeking is more useful than perspective-taking. Rather than trying to imagine things from someone else’s point of view, talk to those people and learn directly from them.
Chapter 9: Rewriting the Textbook
- Opening story: Teacher Erin McCarthy assigned her 8th grade students a textbook from 1940 to see if they accepted the information without question or if they noticed any problematic anachronisms. The exercise led her students to question what they were learning and discuss what was included and what was excluded. McCarthy taught her students that knowledge evolves and that it continues to evolve today.
- Learn to ask questions that don’t have a single right answer.
- Three steps to thinking more critically:
- Interrogate information instead of simply consuming it.
- Reject rank and popularity as proxies for reliability.
- Recognize that the sender of information is often not its source.
- Schooling today still relies heavily on the lecture. This approach to teaching is problematic because it involves passive transmission of ideas from expert to student.
- This approach doesn’t foster rethinking.
- “Good teachers introduce new thoughts, but great teachers introduce new ways of thinking.”
- “Education is more than the information we accumulate in our heads. It’s the habits we develop as we keep revising our drafts and the skills we build to keep learning.”
Chapter 10: That’s Not the Way We’ve Always Done It
- Opening story: Luca Parmitano, Italian astronaut who visited the International Space Station in 2013. During a spacewalk, Luca felt water in his helmet. The mission was aborted and Luca barely escaped drowning in his spacesuit due to a mechanical failure that wasn’t properly diagnosed. Luca assumed the problem was a leak with his drinking bag (it wasn’t). NASA took Luca’s explanation at face value. The incident was a powerful reminder that we need to reevaluate our assumptions and determine how we arrived at them.
- Rethinking is not only an individual skill, it’s also an organizational one.
- Organizational culture can either foster or inhibit rethinking.
- Psychological safety: The ability to take risks without fear of punishment or reprisal.
- In environments with psychological safety, teams will report more problems and errors (because they are comfortable doing so).
- Psychologically unsafe settings hide errors to avoid penalties.
- “Psychological safety is not a matter of relaxing standards…it’s fostering a climate of respect, trust, and openness…it’s the foundation of a learning culture.”
- Ellen Ochoa (NASA astronaut and director) kept a 3×5 note card that reminded her to ask these questions:
- What leads you to that assumption? Why do you think it’s correct? What might happen if it’s wrong?
- What are the uncertainties in your analysis?
- I understand the advantages of your recommendation. What are the disadvantages?
- “How do you know?” is an important question to ask both of ourselves and of others.
- Author sees the idea of “best practices” as misguided.
- It implies that we have arrived at an optimal solution.
- It may inhibit further questioning and the search for improvements.
- Performance accountability evaluates projects, individuals and teams based on outcomes.
- Good outcomes aren’t always the result of good decisions.
- “Focusing on results might be good for short-term performance, but it can be an obstacle to long-term learning.”
- Process accountability evaluates projects, individuals and teams based on the decision-making process.
Part IV: Conclusion
Chapter 11: Escaping Tunnel Vision
- Opening story: Grant’s cousin, Ryan, spent many years studying and training to become a neurosurgeon only to realize later that he wasn’t thrilled with his career choice or the time he had invested in it. The lesson: he lacked flexibility in his thinking.
- “When we dedicate ourselves to a plan and it isn’t going as we hoped, our first instinct isn’t usually to rethink it. Instead, we tend to double down and sink more resources into the plan.”
- Sunk costs are one explanation (an economic factor).
- Escalation of commitment is another (psychological factor). We constantly rationalize and justify our beliefs.
- Grit is essential for motivation (passion and perseverance), but it can also blind us to rethinking.
- Michelle Obama on asking a child what they want to be when they grow up: “It’s one of the most useless questions an adult can ask a child. What do you want to be when you grow up? As if growing up is finite. As if at some point you become something and that’s the end.”
- “Kids might be better off learning about careers as actions to take rather than as identities to claim.”
- Thinking like a politician—seeking to please others—can lead us astray. We end up trading our true calling for status and prestige.
- Author recommends twice-yearly personal checkups: opportunities to reassess your current pursuits, whether your desires still align with your plans, and whether it’s time to pivot.
- Overemphasizing happiness can backfire:
- “When we’re searching for happiness, we get too busy evaluating life to actually experience it.”
- We seek peak happiness (intensity) rather than small, steady moments of positive happiness (frequency).
- We risk overemphasizing pleasure at the expense of purpose.
- Western society views happiness at the individual level rather than the communal or societal level (interconnectedness).
- Ernest Hemingway: “You can’t get away from yourself by moving from one place to another.”
- “Our identities are open systems, and so are our lives. We don’t have to stay tethered to old images of where we want to go or who we want to be. The simplest way to start rethinking our options is to question what we do daily.”