By Vikram Mansharamani, Harvard Business Review Press, 2020, 304 pp, $13.99
What do financial bubbles and cancer have in common? Vikram Mansharamani believes they both reflect our current moment, in which experts, technologies and information overload complicate all our choices. They also provided the inspiration for his new book, Think For Yourself.
In 2011, Mansharamani, a Harvard lecturer who consults on global business trends, had just published a book about spotting financial bubbles by looking at them through the lenses of several different disciplines, including economics, politics and psychology. “When you’re looking at something uncertain, like a financial bubble,” he says, “using multiple lenses increases the probability that you’ll identify one.”
At a talk the author gave about his book, an older gentleman asked for his business card. “A couple years later,” Mansharamani says, “he calls me back up, he says, ‘I want to thank you. You and your book and your ideas helped my wife and me navigate a cancer diagnosis.’ I said, ‘I think you got it wrong, I was talking about finance.’ He said, ‘No you weren’t. Your idea of using multiple lenses to navigate uncertainty was helpful in allowing us to interact with all the experts who were around us.’”
In both finance and medicine, information overload can make it difficult to choose. “We live in a world where expertise is everywhere, embedded in technologies and individuals,” Mansharamani says, “and so a lot of us have come to conclude that there is a perfect answer to every question that exists. But we also know that we don’t know everything about everything, so what do we do? We find the expert! And as a result, many of us stop thinking, and we just follow blindly.”
Of course, Mansharamani, who holds a Ph.D. in management as well as two master’s degrees from MIT, is something of an expert himself. He doesn’t advocate disregarding sources of information like experts and artificial intelligence, but instead looks at the best ways to use that expertise. The result is a guide to thinking about thinking — specifically how to make decisions, consult experts and imagine the future. “My hope is that this is a book that empowers individuals to stand up and think for themselves,” says Mansharamani. “That doesn’t mean dismissing experts. What I do want is for people to feel empowered to ask questions of experts.”
One clue to Mansharamani’s approach is the picture of a fox hanging above his desk. It’s a reminder of a quote from the Greek poet Archilochus, who wrote, “The fox knows many things, but the hedgehog knows one big thing.” Mansharamani sees himself as the fox in this scenario — a generalist with a breadth of knowledge, as opposed to the hedgehog, a specialist with deep knowledge in one small area. “I’m a big believer that ‘generalist’ is a good word,” he says. “A lot of people say that a generalist is a jack of all trades and master of none. But I do think when you’re navigating uncertainty and making tough decisions, it often is better to be a fox.”
How does being a fox help you make better decisions? According to the author, “In uncertain domains, I believe breadth of perspective may trump depth of knowledge.” He cites research by Philip Tetlock, a professor at the University of Pennsylvania, who found that expert forecasters are often no more accurate than well-informed non-experts in predicting developments within the experts’ own fields.
Even if you are a wily fox, you’ll still need to ask expert hedgehogs for advice sometimes. Mansharamani’s catchy slogan for this situation is: “Keep experts on tap, not on top.” This strategy, he writes, “means using experts and technologists strategically. They may have a narrow focus, but we can combine their guidance with our broad perspective.” That broad perspective includes one piece of vital information. “There’s only one person that understands your decision context,” Mansharamani says, “and that’s you. And so, you need to think for yourself. And you bring in experts to help you do that.”
The road to thinking for yourself, however, is lined with some predictable cognitive potholes. The book surveys some of the worst mental obstacles that independent thinkers need to avoid, including framing and focus. Framing refers to the way our choices can be affected by how those choices are presented to us — often to a surprising degree. In one experiment the author cites, doctors were given statistics for two different theoretical lung cancer treatments: surgery and radiation. According to Mansharamani, “When the doctors were told that for surgery ‘the one-month survival rate is 90%,’ 84% of them preferred it to radiation. By contrast, when they were told ‘there is 10% mortality in the first month,’ only 50% favored it.”
The effects of framing can be hard to spot. The author reminds readers to pay attention to default choices, citing studies on organ donation. In countries where people have to actively choose to become an organ donor by checking a box on a form, between 4% and 28% of people sign up to become donors. But in places where the opposite is true — where people have to check a box to opt out of becoming donors — between 86% and 100% of people become donors.
The problem of focus centers on the question: What should I pay attention to? For example, being presented with three choices often leads to focusing on the pros and cons of each choice. But Mansharamani wants readers to pay attention to situations, such as organ donation, where their focus can be manipulated. He recommends remembering that a menu of three choices actually puts five options on the table: the three choices themselves, seeking additional choices, and simply doing nothing.
What are some steps that readers can take to think for themselves? Mansharamani says the key is to triangulate. “Understand every perspective is limited, every perspective is biased, and every perspective is incomplete,” he says. “If that’s the case, don’t rely on one perspective — or one expert. The idea is that you need to get the contrarian view. If you think ‘up,’ somebody should explain to you why ‘down’ may be the right answer. I think it results in better understanding, and therefore better decision making.”
In today’s political climate, Mansharamani feels that seeking out different views is more important than ever. “If you’re watching Fox News, please also turn on CNN,” he says. “I’m not asking you to agree with everything; I’m asking you to appreciate that there are different views. Or if you’re on Twitter and you happen to pay attention to some very liberal commentators, also please turn around and follow a conservative commentator or two, just to get their perspective … I think that logic, of piercing the filter bubbles and echo chambers in which many of us live, is constructive and would help us overcome many of these polarizing dynamics that are affecting us.”
The author also believes that a diversified media diet can help fight the current plague of disinformation. “The ideal, which is really a problematic process to get to the ideal, is that the disinformation is taken down,” he says. “And that is really, really hard because everyone’s version of reality may be slightly different … the worst-case scenario is, you just consume the information put in front of you, and it’s wrong, and you believe it. But if I tell you to diversify your media diet, you’re still going to get that misinformation, but you’ll see that there’s also a different view. There’s a debate. Now you can think for yourself.”
But even the best media sources can begin to distort a reader’s thinking, through the subtle effects of filtering. Mansharamani advocates reading newspapers and magazines in the print version to avoid digital filters and get the broadest views of what’s in the news. Web sites usually display only a small percentage of the articles a printed paper contains; read the paper on your phone and you’ll see even fewer.
“It’s not just that there’s a filter,” says Mansharamani, “but that the filter is dynamic, based on what you’ve clicked on before … so if you click on stories relating to civil war in West Africa, you know what? You’re more likely to get stories about West Africa and civil war. Then, my guess is, you’re going to believe that Africa is filled with war all the time.” Instead of challenging our thinking with other possibilities, filtering can lead to confirmatory bias — a process in which our own beliefs are constantly confirmed as correct. According to the author, “When these algorithms channel [us] and focus [our] energy and attention on particular topics, our minds seem to adjust to believing they are either more frequent or more regular or more recent than they actually are. And that’s where we lose our ability to think for ourselves, because the whole world seems to confirm what we already believe.”
Quick test: Which is more dangerous, taking a selfie or swimming in places where sharks live? According to the book, in most years, more people die in selfie accidents than in shark attacks. Yet when rare events, like shark attacks, attract a great deal of media attention, confirmatory bias can tell our brains that those events are common.
An advisor to several Fortune 500 CEOs, Mansharamani has come to believe that learning how to navigate uncertainty — by thinking for yourself and seeking out a wide variety of experiences — can pay big dividends in the workplace. “Thinking about how to develop multiple perspectives and skills in your career is really important, and it has big career decision-making implications,” he says. “Some of the biggest corporate stumbles that I’ve witnessed in my career have been leadership decisions, where the person who’s risen came up and was wildly successful as a narrow specialist. And then they end up at the top, and they don’t have the breadth of perspective or experience to be able to navigate uncertainty.”
To grow in your career, the author recommends throwing out the analogy of climbing a corporate ladder, and instead thinking about a career as a trip through a jungle gym. He explains: “The way to the top may not be vertically, going straight up. It may need to be that, OK, you’re in finance today. You might need to take a lateral move. Go into operations for a little bit. Or you’ve been in operations for a little bit, take a downward step and go do a lower-level operations role in a different country, to get a different perspective.”
For himself, Mansharamani has found that being a generalist, like his mascot the fox, has led to career satisfaction. “Every room I walk into, I know I’m not the smartest person on anything, ever,” he says. “And I find that really stimulating. There’s always someone who I can learn from. For me, as someone who’s curious and constantly seeking to learn, that’s really cool. I feel like being a generalist allows me to spend enough time reading and thinking broadly, that I can make connections with people regardless of their area of study. So, I’ve found it personally more fulfilling.”
Laurie McClellan is a freelance writer and photographer living in Arlington, Virginia.
Five Ways to Think For Yourself
- Ask questions.
Whether an opinion comes from your doctor or your financial advisor, some of Mansharamani’s favorite questions include: Why do you think this is the right choice for me, specifically? How did you reach this diagnosis (or recommendation)? Do you think this to be true, or know this to be true? What kind of research is this based on?
- Beware of cognitive crutches.
Shortcuts in thinking, called heuristics, help us quickly make dozens of decisions every day. But these same shortcuts can mislead. For example, the anchoring effect is the tendency to be biased by the first piece of information we hear. In one experiment, audience members watched a researcher spin a number wheel and read the number out loud. Then they were asked to guess what percentage of African countries belong to the United Nations. If the random number was 10, the average guess was 25%. If the random number was 65, the average guess was 45%.
- Read novels.
Says Mansharamani: “Fiction reading forces you to step back and see the big picture … and ask questions about the human condition and how it may affect where you are and what you’re doing.”
- Perform a premortem analysis.
In this exercise, try imagining the future failure of your project — then try to think through what might have caused it. According to Mansharamani, “Research found that the exercise of thinking about how projects may fail in advance increased the ability of correctly identifying possible trajectories by around 30%.”
- Play devil’s advocate.
When making a decision, deeply consider the alternatives. Mansharamani quotes historian Doris Kearns Goodwin from her book Team of Rivals: The Political Genius of Abraham Lincoln. Goodwin wrote that “by putting his rivals in his cabinet, [Lincoln] had access to a wide range of opinions, which he realized would sharpen his own thinking.”