April 15, 2021
What are "scout" and "soldier" mindsets? How can we have productive disagreements even when one person isn't in scout mindset? Is knowing about good rationality habits sufficient to reason well? When do we naturally tend to be in scout mindset or soldier mindset? When is each mindset beneficial or harmful? Are humans "rationally irrational"? What are the two different types of confidence? What are some practical strategies for shifting our mindset in the moment from soldier to scout?
Find out more about Julia at juliagalef.com.
JOSH: Hello, and welcome to Clearer Thinking with Spencer Greenberg, the podcast about ideas that matter. I'm Josh Castle, the producer of the podcast. I'm so glad you joined us today. In this episode, Spencer speaks with Julia Galef about different approaches to argumentation and communication, sources of self-deception and false beliefs, and strategies for increasing confidence.
SPENCER: Julia, welcome.
JULIA: Thank you. It's such a pleasure to be on your podcast, Spencer.
SPENCER: I'm so glad you came on. I'm really excited to talk about your book. Do you want to tell us about it?
JULIA: Yes. So it's coming out on April 13, and it's called The Scout Mindset, which is my term for essentially the motivation to see things as they are and not as you wish they were. Basically, another term for being or trying to be intellectually honest and objective and just curious about what's actually true. It's part of the framing metaphor of the book, in which I argue that we humans are often by default in what I call soldier mindset, which is the motivation to defend your pre-existing beliefs or defend what you want to be true against any evidence that might threaten those beliefs. Scout mindset is an alternative to that. Unlike the soldier, the scout's role is not to attack or defend; the scout's role is to go out and just see what's really there and put together as accurate a map of the situation or the landscape as possible. The book is about why we are often in soldier mindset by default, how to shift from soldier mindset towards scout mindset, and why that's something we should want to do.
SPENCER: Yeah, I really can't wait to read your book because I find that I'm constantly referencing this concept, even though I haven't dug into all the details yet.
JULIA: I'm so glad to hear that.
SPENCER: Yeah, for example, one thing I've been thinking a lot about lately is when you're in a debate with someone, you need to first figure out: are they in scout mindset, and are you in scout mindset? Because until you're both in scout mindset, that conversation is probably not going to go very well. If you realize, first of all, that you're in soldier mindset, then what you're trying to do in that conversation is actually just to beat the other person, not to learn from them or come to an understanding. Second, even if you're in scout mindset, if the other person is in soldier mindset, then your first job, it seems to me, should be to help them get into scout mindset so you can have a productive conversation. I'm just using this idea all the time now.
JULIA: That's lovely to hear. I actually think it's maybe asking too much of everyone I talk to or everyone I disagree with to be in scout mindset. It's wonderful --- you're absolutely right that disagreements are way more productive when the other party is also in scout mindset or at least aspiring to be in scout mindset. But I've also been working on finding ways to get value out of disagreements even when the other person isn't trying very hard.
SPENCER: Oh, really interesting.
JULIA: Yeah, well, it can be kind of a one-sided thing. Having scout-like disagreements in a one-sided way, where you're the one trying and the other person isn't really trying, can still get you part of the value of the full deal.
SPENCER: You can still learn a lot yourself, at least.
JULIA: Yeah, I mean, I can try to do the work myself by looking for the more charitable interpretation of what they're saying. Maybe because they're upset or because they are really passionate about the topic, they're exaggerating, they're overstating their case. If I were in a soldier mindset, it would be very easy for me to poke holes in their argument by focusing on the ways in which they're overstepping the limits of their evidence. Instead, I could choose to focus on the contexts or the ways in which their argument is actually true, maybe in ways that I hadn't really been paying attention to before. Then I can come away from that conversation not having changed my mind 180 degrees, but at least with a bit of a richer perspective on this topic than I had before. That process doesn't technically require that they be in scout mindset during the conversation, although it's nice.
SPENCER: That makes a ton of sense. You can still get a lot of value in that conversation. I also think it's a useful frame to say, "Hey, what mindset is the other person in?" If they're not, do I want them to be? If I do want them to be, there can be different strategies you can use that are very different from just trying to have the conversation at the object level. For example, maybe you need to spend some time showing them that you have things in common with them or that you agree with them about certain aspects of what they're saying. Maybe that can kind of shift the nature of the conversation.
JULIA: Right, right. I also will sometimes work harder to signal to the person that I'm not trying to beat them in the argument, to signal to them that I'm maybe not in the same category of people they have argued with in the past who are just trying to score points against them. Because as frustrating as it is when the person you're talking to is in soldier mindset and just trying to win the argument and not really listening to what you're saying, it's also understandable in a way. The fact that so many other people do that means I think it's a reasonable prior for them to have that, "Okay, we're arguing; this is the situation in which they tried to beat me and I tried to beat them." If that's how it usually works, then I can understand why they wouldn't automatically recognize that I'm trying to do something very different from that.
SPENCER: Yeah, it's funny. I think there really are a lot of times when people sort of just expect you to be in soldier mindset. One thing that I find kind of funny is when people will leave kind of mean critiques of things I write, where they'll just do it in an unnecessarily nasty way. Then I'll write back and I'll be like, "Oh, you know, thanks so much for posting that. I thought your point was really interesting." Then they're just flabbergasted. Sometimes they just have this shocked reaction where they expected me to attack them, and they're like, "Oh, I'm so sorry. Thank you for writing this essay," or something.
JULIA: I find that really heartwarming. I strive for moments like that as well. Even when it's just a situation where they're disagreeing with something I believe, I will often look for honest signals of good faith argument. By an honest signal, I mean something more than just saying, "By the way, I'm really just trying to get at the truth with you. I'm not trying to argue with you," because that's something anyone can say. In fact, a lot of people do say it and don't really mean it, or they talk the talk but don't walk the walk. An honest signal would instead be something you really wouldn't do unless you genuinely were trying to argue in good faith. For example, sometimes I will point out flaws or limitations in my own position, or I will go out of my way to say, "Here's why this might not be the case," or sometimes I will voluntarily bring up reasons that I think support their view, even if I still overall don't think their view is right. I've found that those kinds of signals of good faith can help — not all the time, but can often take someone who started out in soldier mindset and take them at least part of the way towards scout mindset and make the disagreement go better.
SPENCER: I like that also because it takes the competitiveness out of it. It's like imagining you're playing tennis with someone and they start scoring points on themselves.
JULIA: You're like, "Hey, what game have we been playing? I guess we're not playing the tennis I thought we were playing."
SPENCER: Exactly. So what motivated you to write a book about this in particular?
JULIA: I guess there are two answers to that. One answer is just the kind of origin story of the book, which is that I've been focused for the last 10 to 15 years of my career, and I guess in my personal life as well, on understanding how to improve reasoning and decision-making. In 2012, I co-founded an educational nonprofit called the Center for Applied Rationality. We essentially ran workshops, classes, and consulting on how to reason about complicated things in your career and your life and how to make better decisions. For the most part, we were taking principles from cognitive science, sometimes basic economics or philosophy, and helping people apply those principles to their own messy real-world problems. I'm not at CFAR anymore, but I still think all of that work is really important. One way in which my views shifted as a result of trying to do this for several years was that I originally had kind of a naive perspective on what it would mean to improve human rationality or human reasoning. That naive perspective was something like, "Well, we'll educate people about cognitive biases and how they work, and logical fallacies, and how to reason logically, and that will improve their reasoning." As I say that now, it may sound a little naive, but that was the assumption I was operating on. It's kind of obviously not true in retrospect that just knowing a lot about cognitive biases and logical fallacies will by itself improve your reasoning. You can see this just by observing the way people behave online. I'm sure you have seen people on internet forums accusing other people of being biased or reasoning fallaciously, while never actually turning that lens on their own reasoning and noticing, "Oh, actually, I think I was maybe not being logical there." The point being, it started to seem much more to me that the bottleneck for improving reasoning was not so much knowing how to reason, but instead being motivated to use that reasoning in the service of actually figuring things out, as opposed to using it in the service of defending what you already believe.
SPENCER: Right. Or at least the motivation comes before the other pieces, right? It's not that knowing about cognitive biases is totally useless or that knowing how to use logic is totally useless.
JULIA: That stuff is definitely important, but it's necessary rather than sufficient. I guess when I say that it seems like more of a bottleneck to me, I mean it seemed very important and underappreciated. I kind of wanted to write a book that focused people's attention less on "Here are all the ways the human brain is flawed" and "Here are principles of logical reasoning," and more on: how do you get yourself into a state where you're directing your intelligence and your knowledge in the direction of truth, and not in the direction of defending preconceived beliefs?
SPENCER: Because until you have the right motivation, these are just tools to attack your enemies or to support your own confirmation bias.
JULIA: Right. Or just justify your past decisions or justify making the decisions you want to make. We're all very clever, and we can find plenty of ways to come up with a seemingly airtight proof for why the decision we wanted to make all along is actually the rational one. That's the pitfall that I think is important to avoid, and that I'm working on. That is the origin story I tell for how I ended up writing this book. A different explanation is just that I personally love scout mindset. I've always been drawn to people who are unusually good at scout mindset, unusually intellectually honest. I've always found those kinds of conversations where we're both in scout mindset and trying to figure something out together just so satisfying, deep down in my soul. I've often been frustrated when people are not doing the scout mindset thing. That's maybe a simpler explanation for how I wrote this book because it's just a thing that I'm really passionate about and want other people to be passionate about too.
SPENCER: Well, that's very relatable. It seems to me that there are these different kinds of magisteria in our lives that we treat differently with respect to trying to figure out the truth. For example, let's say you're trying to get to an appointment, and you're trying to figure out what the address of that appointment is. There, it seems like we're very truth-oriented.
JULIA: Yes.
SPENCER: You're just like, "Oh, I want to know that information." I don't really care if the person telling me is part of my enemy tribe, as long as I think that they're being trustworthy. Then there are all these other domains where it seems like we're not thinking in that sort of truth-oriented way. I'm curious if you have a comment on when we tend to naturally be in scout mindset and when we tend not to be.
JULIA: Yes, that's a great point. I'm so glad you brought that up. People often tend to, understandably, simplify my view into everyone being a soldier by default and we should try to be scouts instead, or some people are soldiers and some people are scouts. The reality is that we are all mixes of soldier and scout on different days and in different contexts. And about different topics, we might be demonstrating more soldier mindset or more scout mindset. The question you're asking is basically what factors determine whether we end up being in scout mindset or soldier mindset. The example you gave is a very practical situation in which there's a thing you need to know, which is how to get to your appointment. Getting the right answer to that question is immediately useful to you. You have a motivation in that situation to find out the truth. Crucially, you don't really have much motivation to believe a particular other answer, so you don't have a lot of emotion or identity invested in believing some particular thing about the address, like believing that your appointment is at 123 Bias Lane. I think that's a great illustration of what determines whether we tend to be in soldier mindset or scout mindset, which is just: how motivated are we to learn the truth in that particular situation, and how motivated are we to believe some other thing, irrespective of the truth? A nice counterpoint to your example would be politics. As I'm sure everyone is aware, people are often in soldier mindset about political issues, which is part of what makes Twitter and Facebook so unpleasant sometimes. In that situation, you have a strong motivation to defend a particular answer on political topics, because some answers will make you feel really good or will get you accepted by your tribe, by the other people who share your political beliefs. If you agree with them that Trump is terrible or Trump is amazing, you have a strong motivation to believe a particular thing, irrespective of the truth. You don't really have a comparable motivation to find out what the true answer is. Unlike the case of trying to get to your appointment on time, there's no direct benefit to you in your life of having true beliefs about politics. That's why we see people very often in scout mindset when they're trying to get to an appointment and more often than not in soldier mindset about things like politics. Those are kind of the extreme cases. There are cases where it's a bit more of a mix, like some issue that you're thinking about in your own life that's very emotionally fraught, like, "Should I break up with my partner?" That's a question that does have a direct impact on your life. You have a motivation, in some sense, to get the right answer to whether you should break up with them. But there's also this other motivation to reach the answer that is easiest for you right now, the answer that prevents you from having to do something really difficult and painful. Those two motivations can conflict, and that's why we sometimes vacillate back and forth between scout and soldier mindset on important issues in our lives. Does that make sense?
SPENCER: Yeah, absolutely. It feels like you kind of touched on this a little bit, but in the politics example, identity is also a key aspect, right? We want to act in accordance with our identity, our image of ourselves, and we want to avoid acting in accordance with the identity of groups we view as others or as enemies of our group, and so on.
JULIA: That's right. I struggled a lot in explaining this in the book. The explanation for why we are motivated to be in soldier mindset in cases like that can be given at different levels. There's an explanation at the level of what makes you feel good as an individual, like, "I just feel good when I can dunk on the other side," and I feel good when it seems like my tribe is better than the other tribe. But then there's also a more unconscious-level explanation, where the reason we evolved to feel good when we prove that our own side is right is, the argument goes, that holding those beliefs, being able to honestly and sincerely hold the belief that my tribe is right, helps us stay in the tribe, stay a member in good standing of the tribe. Other people will want to associate with us and trust us if we hold beliefs that say our tribe is right and the other tribe is wrong. That's more of an evolutionary explanation for why we're often motivated to be in soldier mindset about political things. The way that manifests on a conscious level is just that we feel really good when we defend those beliefs.
SPENCER: Right. It's dangerous to be thought of as being part of a different enemy tribe. Probably 10,000 years ago, it was literally dangerous; you might get ostracized or killed or something like that.
JULIA: That's right. I'm often wary of using evolutionary arguments just because they're hard to prove. People often argue about them, and they get dismissed as just-so stories. I think this is a pretty plausible argument for why we're in soldier mindset about political issues, but I also don't think it's a necessary argument, because the fact is it does feel really good when you conclude that your tribe is right and the other one is wrong. The question of why it feels really good is a separate question that's worth talking about. All I need for my thesis is the fact that it does feel really good, because that provides us with motivation.
SPENCER: Yeah, that makes sense. Although, it seems really clear that groups do lots of things to differentiate themselves from other groups. You can see this in frat houses, based on the way people dress and the language they use. You can also see it in tribal cultures, where people will wear certain types of jewelry that are different from what the other tribes wear. Every human group seems to differentiate itself from the others. Robin Hanson, the economist who blogs at Overcoming Bias, has a metaphor of beliefs as clothing, where we choose our beliefs in much the same way we choose our clothing, and for similar reasons. We choose our clothing to tell the world, or the people around us, what kind of person we are. You might choose clothing that makes you look sophisticated or clothing that makes you seem edgy. Similarly, we often unconsciously choose beliefs that make us seem sophisticated or edgy or wise or mature, or beliefs that show that we're with the in-group. If all the cool kids or all the smart and prestigious people in your social circles come to believe that socialism is better than capitalism or that deep learning is going to take over the world, then you have a strong motivation to come to believe that as well, in much the same way that people have a strong motivation to wear whatever skirt length is in fashion that season, as opposed to the skirt length that makes you look out of date and behind the times. Unless your identity is strongly tied up in being contrarian, in which case maybe the opposite becoming popular gives you a motivation not to accept it.
Interestingly enough, even contrarians often do kind of standard contrary things, like they become goth or something like that.
JULIA: Exactly. Yeah. There are really not that many genuinely original thinkers. There are a lot of people who are kind of self-styled contrarians, but it often reminds me of that saying about thinking outside the box: there's often a particular other box that you're supposed to think inside, one that's outside the original box, but it's still a box. Not to say that everyone should have completely unique and hard-to-predict views. But I do think that's an interesting phenomenon.
SPENCER: I agree we should be careful about evolutionary arguments because they are often really hard to prove. It does make sense that you can't really survive by yourself in the wilderness, right? You need a group. So you want to be part of some group, right?
JULIA: Right.
[promo]
SPENCER: One critique someone could have of your book, and I know that you kind of addressed this in the book, is: okay, well, if we have a motivation to come across as part of a certain tribe or as having certain beliefs, in order to have people view us positively or to dunk on the other side, then aren't we acting in our own self-interest by adopting the soldier mindset?
JULIA: Yes, good. I'm so glad we are going to talk about this. I spent a lot of time thinking about potential reasons people disagree with me or potential objections to my thesis, and I think this is one of the best. The label for this objection is rational irrationality. The idea has been used by lots of economists and evolutionary psychologists, but the term itself, rational irrationality, comes from Bryan Caplan, who's another economist at George Mason University, like Robin Hanson. The claim that humans are rationally irrational rests on the fact that we use the word rational in two different senses. There's epistemic rationality, which is about forming beliefs that are accurate, and then there's instrumental rationality, which is about making decisions that effectively help you achieve your goals. The idea of humans being rationally irrational is that we intuitively choose just enough epistemic rationality to further our goals. In this model of humans, we're really good at intuitively knowing in which situations we should try to seek the truth and in which situations we should deceive ourselves and defend some particular belief irrespective of the truth. The question is, are humans in fact rationally irrational? Because if we are, then I don't have much to offer in a book about scout mindset. By which I mean, if we were rationally irrational, I couldn't promise people that you should expect to be better off, in expectation, by shifting from soldier mindset towards scout mindset, because in this theory, you're already using just the right amount of soldier mindset in your life. Let me try to explain why I don't find the idea of rational irrationality very compelling. It's premised on the idea that the human mind evolved in a way that's optimized for our own interests, and that therefore, of course, it would have evolved to be good at choosing the right combination of scout and soldier mindset. I don't find that a very compelling claim, for several reasons. First, we're already aware of a number of ways in which the human mind is not optimized for making good decisions, at least not in the modern world. Maybe it was optimized for the environment we evolved in, but it makes systematic mistakes in the modern world. For example, as I'm well aware, and as I'm sure many of your listeners are well aware, we tend to overweight consequences that are immediate relative to consequences that occur in the future, which is why we tend to break our diets or procrastinate and leave our work until the very last minute. The immediate rewards of eating the cupcake now or putting off our paper are just too tempting compared to the abstract, delayed rewards of looking and feeling better or finishing the project on time. Everyone is well aware of this; it's well known that this thing called present bias tends to lead to bad decision-making that's against our own interests. What I'm arguing in the book is that present bias also affects the way we think, not just the way we act. It affects which belief we reach for. Crucially, scout mindset's benefits tend to come more in the future, whereas soldier mindset tends to pay off immediately. If you convince yourself that you are right about something, you feel good right away. But if you acknowledge to yourself, well, actually, maybe I was wrong about that thing, or maybe I made a mistake, you feel a little bit bad right now.
But then in the future, the benefit is that you're less likely to make that mistake again. The fact that we have that bias towards immediate consequences suggests that we are probably by default choosing soldier mindset more often than we should just for our own self-interest, in the same way that we probably eat the cupcake or put off our work more often than we should just for our own self-interest.
SPENCER: It's such a good point. A lot of this stuff around being open to the idea of being wrong means you actually have to be open to experiencing some pain in that moment. It often does hurt in the moment when we discover we're wrong or realize the other person has a better argument than us. That goes along with your idea that we get a potential short-term benefit from being in soldier mindset, whereas scout mindset might have a short-term cost, but then maybe later we make better decisions, like deciding to get the right medical treatment or making choices that lead to better long-term consequences.
JULIA: The case of being in scout versus soldier mindset on political issues is kind of a variant on this argument, where, as I said, there are no direct benefits of getting the right answer on some political issue. But I do think there are indirect benefits, which is that thinking in scout mindset is kind of a habit of mind. Whether you remember to second guess your immediate intuitive answer or whether you are able to admit you were wrong about something, those are generally good habits of thinking. Every time you admit you were wrong about some political issue, even if that has no direct benefit for you, what it's doing is helping reinforce this generally useful habit of thinking that is going to be directly useful to you in other ways, like being willing to question your first assumptions about your business idea or about the relationship you're in. Those actually do have direct consequences for you. It's much easier to think in the scout-like way if you have been doing it as a general practice than if you're only doing it when it seems immediately useful to you.
SPENCER: It reminds me of an argument you sometimes hear around things like alternative forms of medicine that may actually not be that effective, like homeopathy, which maybe just doesn't work at all. People will say, what's the big deal? Someone takes homeopathy when they have a cold because there's not much to do for a cold anyway, and maybe it has a placebo effect, and so on. The response to that says, well, if someone gets into homeopathy, are they really just going to use it for a cold? Or are they going to start actually using it for more and more serious issues, and then maybe they're not going to get the best medical treatment when it really matters? I think there's a similar slippery slope thing going on there. If you're in the habit of flinching away from finding out you're wrong and always dunking on the other side to make yourself feel good in the moment, are you really going to be able to just switch that off when suddenly the stakes are high? There are really important life consequences to having a scout mindset.
JULIA: That's right. Yeah, exactly. Relatedly, it's just hard to predict how false beliefs are going to affect you in the future. We have these networks of beliefs in our minds, and changing one can ripple through the others in unpredictable ways. The example I give in the book is, suppose you deceive yourself about how attractive or likable you are because that feels good. What's the harm? One ripple effect of that is that you might be surprised that more women don't want to date you. Why wouldn't they, if I'm so attractive and likable? Now you have to make sense of that consequence. Maybe what you convince yourself is that they don't want to date me because women are all shallow and only want to date rich guys, or something like that. That's another update to a node in your network of beliefs. That update also has its own ripple effects, because other people are going to try to convince you that you're wrong, that women are not as shallow as you think they are. To make sense of that, now you have to come up with another explanation, like, well, maybe everyone's just lying to me about how the world works. They're just saying whatever the socially acceptable or politically correct thing is about women not being shallow. My point is, there are all of these ripple effects that happen as a result of us trying to make sense of the world given the other beliefs in our network of beliefs. I can't promise that in any given situation, you will definitely be better off in scout mindset instead of soldier mindset. But I think all of these things we've been talking about, all of the ways in which soldier mindset has delayed or unpredictable harms and scout mindset has benefits that don't come right away, suggest that at the very least, we are not in scout mindset as often as we should be, even though I can't guarantee that we should be in scout mindset 100% of the time.
SPENCER: That makes a lot of sense to me. I would also add that a lot of the short-term benefits we get from soldier mindset can be replaced with compensating benefits if we learn to adopt scout mindset in a thoughtful way.
JULIA: Yes.
SPENCER: You probably talked about this in your book, but just as an example: in the same way that you can enjoy eating candy, even though candy has bad long-term consequences if you eat it every day, you could also learn to really enjoy eating healthy foods, and they're healthier for you in the long term. It's sort of like that, where you can get a lot of pleasure momentarily dunking on the other side or trying to prove other people wrong and feeling superior. But you can also get this maybe not-as-candy-like pleasure out of being the sort of person who is willing to change their mind and able to have interesting debates with people where we actually learn from each other, and we both feel really good about the experience.
JULIA: Absolutely. I have a two-pronged argument. One of the prongs is that we tend to overweight soldier mindset relative to scout mindset. The other prong is that we can actually make scout mindset more beneficial and more immediately beneficial to us than it is by default, by, for example, feeling good about ourselves when we change our minds or feeling good about ourselves when we notice that we were wrong about something. It's kind of a way to patch the bug in our native psychology, where the bug is that we are super motivated just by rewards that happen immediately.
SPENCER: If you think about the virtues, soldier mindset's forms of pleasure and of avoiding pain are not very virtuous, right? They involve bashing other people; they involve deluding yourself. I don't want to make soldier mindset seem too terrible or anything, but these are not the most virtuous ways to feel good.
JULIA: I think I was trying really hard to avoid making a moralistic argument in the book because it felt too easy to wag one's finger at people and say they should think better because it's more virtuous. I also think that the case for the selfish benefits of scout mindset, of intellectual honesty, is underappreciated. I really wanted to point out the ways in which it is actually in your self-interest to be intellectually honest more often, even about topics like politics, where it doesn't seem directly useful to you.
SPENCER: To be fair to the soldier mindset side, let's steelman it for a moment. What do you think is the strongest argument in favor of being in soldier mindset quite a bit?
JULIA: There are a few examples of cases where a lot of people argue that you're better off being in soldier mindset. I think they're wrong, but it's understandable why they view things that way, and I think they have some good points.
SPENCER: Spoken like a true scout.
JULIA: I try, I really do try to have scout mindset about my own thesis. In fact, the book changed a lot over the couple of years that I was writing it because I was wrestling with the critiques of my thesis. The arguments I make in the book now are the result of having applied scout mindset to my original thesis. One of the arguments that people make in favor of soldier mindset is that when you're trying to do something hard, like starting a company, you basically need self-delusion in order to force yourself to do this really hard thing. You need to be overconfident about your chances of success; you need to believe that you are definitely going to succeed, even though in reality, it's very uncertain. Even people who are brilliant and work really hard are far from guaranteed a successful outcome. Still, people argue, you need to believe you will succeed against all the evidence. It's only having that false certainty that will motivate you to work hard enough to actually have a decent shot at succeeding. I call this the self-belief model of motivation. The interesting thing about this model is that some of the most successful entrepreneurs are counter-evidence to it. For example, Jeff Bezos, when he was deciding to quit his job on Wall Street and start the company that would become Amazon, tried to think realistically about the probability that his fledgling company would actually succeed. The number he came up with was about 30%, which is very low compared to the way most founders think and talk about their companies. The way he got to that number was he estimated that about 10% of internet startups succeed, and he thought, well, I think I'm really pretty smart, and I have a good idea, and I'm talented, so I should adjust upwards from that base rate. He put himself at about a 30% chance of success. According to the self-belief model, Jeff Bezos would be doomed to failure because he wouldn't have the motivation to actually try. Clearly, that is not true. He worked very hard and effectively and made Amazon one of the most successful companies in the world.
SPENCER: Do you know how he motivated himself in that case?
JULIA: Yes. Bezos, and I would say most of the other examples I looked at, had a particular kind of approach to motivation that is different from the self-belief approach. Their motivation relied on essentially probabilistic thinking. There's a basic concept in probability theory called expected value, which I know you know, but to explain it: the idea is that you multiply the probability of success times the value of success, plus the probability of failure times the value of failure, or how bad it would be if you failed. That's roughly speaking the expected value of whatever the act is. The expected value of an act can be quite positive, even if the probability of success is pretty low, if the value of success is high enough and the value of failure is not that bad. If you were offered a bet where you could roll a die, and if you got a six on the six-sided die, then you get $1,000, and if you don't get a six, then you have to pay $1, that's a bet in which you will probably fail. But it's still a very positive expected value because the value of what you get if you win is so great that it outweighs the small amount you have to pay if you lose the bet. In real life, bets like starting a company are much messier than this kind of artificial die roll. It's hard to estimate the probabilities; it's hard to quantify how good or bad it is if your company succeeds or fails. But you can still often do a very rough back-of-the-envelope estimate in your head. Often what that rough estimate reveals is that this bet has a really positive expected value, even though I think the most likely outcome is failure. For Bezos, the value of success was huge, and the value of failure was not that bad. He thought about being 80 and looking back on the fact that he had tried and failed to create a company when he was younger. He thought, I would feel okay about that; just the fact that I tried, I would be proud of that. I would have gotten to participate in the growth of the internet, and that would be cool. Basically, failure is tolerable, and success would be really great. This is clearly a positive expected value bet to take. You can definitely still be motivated to strive for hard things if what's motivating you is not the guarantee of success, but instead the expected value of the bet you're taking.
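To make the arithmetic concrete, here is a minimal sketch in Python of the expected value calculation Julia walks through, applied to the die-roll bet from the conversation; the function name and the printed commentary are illustrative additions, not anything from the episode:

```python
# Expected value of a two-outcome bet:
#   EV = P(success) * value(success) + P(failure) * value(failure)

def expected_value(p_success: float, value_success: float, value_failure: float) -> float:
    """Return the expected value of a bet with two possible outcomes."""
    return p_success * value_success + (1 - p_success) * value_failure

# Julia's example: roll a six-sided die, win $1,000 on a six, pay $1 otherwise.
ev = expected_value(p_success=1 / 6, value_success=1_000, value_failure=-1)
print(f"Expected value per roll: ${ev:.2f}")  # about $165.83, despite a 5/6 chance of losing
```

The same back-of-the-envelope logic applies to Bezos's decision: a roughly 30% chance of a hugely valuable success, weighed against a tolerable downside, can make the bet clearly positive even though failure is the most likely single outcome.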
SPENCER: This is giving me a flashback to a time I was talking to a VC a number of years ago, a venture capitalist. I was making a similar case to them that you can be motivated by expected value, and they said, oh, that never works. You just have to completely believe in your startup; that's the only way to succeed.
JULIA: Did they really? Interesting. Bezos and Elon Musk were actually even more explicit about the expected value thinking they were doing.
SPENCER: Oh, really?
JULIA: Did you argue with him, by the way?
SPENCER: Oh, no, I just thought that was really interesting that he denied that. There's another way of looking at it that I really like as well, which is if you think about it as a game where you get to roll the dice multiple times, then the more die rolls you do, the higher the chance of success. That doesn't mean that you'll achieve exactly what you originally set out to do. If you're open-minded about what success looks like and you roll the dice a bunch of times, maybe you actually do have a pretty good probability of achieving success in some form that you're happy with, even if it's not exactly what you intended on the first roll.
JULIA: Exactly. What's interesting is that people are already very used to thinking about this in the context of an investment portfolio. They're much more sanguine about the fact that many or even most of their investments will fail, because the portfolio as a whole will still probably go up over time. What you're talking about, and I totally agree, is like a portfolio, but one that's extended in time. You're still making a collection of different bets that has a pretty good shot of success overall, but you're making them staggered, one month or one year after the other. I think this is a less intuitively obvious example of an investment portfolio to people because it's not a bunch of bets you're making all at the same time, but the principle still holds. The principle that your chance of coming out ahead overall goes up the more positive expected value bets you make still holds even if the bets are staggered in time rather than made all at once.
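As a rough illustration of the portfolio-over-time point, here is a short Python sketch of how the chance of at least one success grows with repeated bets. It assumes the bets are independent, which, as Spencer and Julia note next, is a simplification, since you learn something from each roll:

```python
# If each bet succeeds independently with probability p, the chance that
# at least one of n bets succeeds is 1 - (1 - p) ** n.

def p_at_least_one_success(p: float, n: int) -> float:
    """Probability that at least one of n independent bets pays off."""
    return 1 - (1 - p) ** n

for n in (1, 2, 5, 10):
    print(f"{n:>2} bets at 10% each -> {p_at_least_one_success(0.10, n):.0%}")
# Prints roughly: 10%, 19%, 41%, 65%
```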
SPENCER: I really like that way of looking at it. It's especially true if each time you roll the dice, you learn something, so you're a little bit better next time.
JULIA: Exactly. That is exactly true. They're not totally independent of each other. I should also add that this kind of expected value thinking is not what tends to motivate people on a day-to-day basis. On a day-to-day basis, you're motivated by much more immediate things, like the milestone you achieved last week, or the product you're planning to ship at the end of this week, or whatever the latest fire is that you have to put out at work, or the presentation you're trying to put together for tomorrow. Those are the things that motivate you. It's just that in those moments when you're deciding whether to take the plunge and take a risk, or in the moments when you're stepping back and reflecting on the choices you've made in your life, feeling good or bad about them, those are the moments when being motivated by expected value is so powerful, because you don't have to rely on believing with certainty that you're going to succeed.
SPENCER: What was the Elon Musk example? So many people look up to him as their entrepreneurial hero.
JULIA: He's such an interesting case because Elon is someone whom people tend to describe in terms of the self-belief model of success. For example, someone who I think actually worked with Elon in the early days said, and this was a famous quote that got passed around the internet a lot, that the reason Elon Musk is so successful is that he literally cannot conceive of failing. This is exactly the opposite of what Elon has said himself about his own motivation many times throughout the years, in many different contexts. What he says is, I assumed in the beginning that both Tesla and SpaceX were probably going to fail. He gave each of them about a 10% chance of success, even lower than Jeff Bezos's 30% estimate of success for Amazon. Of course, the low odds that he assigned to his own companies' success tended to baffle interviewers, who would ask him things like, well, then why are you doing it? If you think you're probably going to fail, then why do it? Elon answers something along the lines of, well, if it's important enough, then you should try even if the most likely outcome is failure. He also, just like Bezos, asked himself, is failure tolerable? Well, yes, I think I'd still feel proud of myself for trying even if I failed. He thought, okay, if Tesla fails, how bad would that be? Well, I wouldn't be personally ruined, and we would probably make at least a little bit of progress. Even if we didn't become wildly successful, we could change people's perception of electric cars so they see them as something exciting and not something clunky and nerdy, like a golf cart, I think he said. Similarly, if SpaceX failed, it wouldn't be the end of his life or anything, and they would probably make at least a little bit of progress even in failing. Maybe they could pass the baton to some future company, increasing the likelihood that somebody would eventually help make humanity an interplanetary species, which was Elon's motivation with SpaceX. That's his long game. These are both cases where even if there's only a 10% chance of success, success would be really amazing if you got it, and the 90% chance of failure is quite tolerable. Both Tesla and SpaceX were positive expected value bets for Elon. And as you point out, those are only two of the whole collection of different bets that he's going to be able to make in his life. Even if he can't be strongly confident that any one of those bets is going to pay off, he can be much more confident in the collection of bets over the course of his life.
SPENCER: Given how well both of them have done, at least up to today, it actually seems like evidence he was underconfident, if anything.
JULIA: I've wondered about that, actually.
SPENCER: If they were independent events, one in ten for each of them would mean a one in a hundred chance that both of them would do this well.
JULIA: For Bezos, I don't think we have much evidence that he was underconfident just because Amazon was a one-shot. But for Elon Musk, if he put a 10% probability on each one, I don't think Tesla has technically been successful yet. His prediction for SpaceX was that he thought there was a 10% chance they would even make it into orbit. He's clearly won that bet. Given the low odds he assigned, I think that suggests maybe he was underconfident, which is a weird thing to say about Elon Musk. But in this particular sense, I think, yes, he was.
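For reference, the quick independence calculation Spencer mentioned above, sketched in Python with the 10% figures from the conversation (treating the two companies as independent events is the simplifying assumption):

```python
# Under independence, the probability that two bets BOTH pay off is the
# product of their individual probabilities.
p_tesla = 0.10   # Musk's stated estimate for Tesla
p_spacex = 0.10  # Musk's stated estimate for SpaceX
print(f"P(both succeed) = {p_tesla * p_spacex:.0%}")  # 1%, i.e. one in a hundred
```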
SPENCER: Obviously, Tesla has been extremely successful by most people's standards, even if they haven't achieved their mission yet of having electric vehicles everywhere.
JULIA: I don't know that he was ever precise about what he meant by success for Tesla, but at least they haven't failed yet. He was putting a 90% probability on them failing at some point, and that clearly hasn't happened yet.
SPENCER: You draw this distinction between two types of confidence. Do you want to unpack that for us?
JULIA: This is actually a great point at which to talk about this because I brought up the motivation issue as one of the things that people think you need soldier mindset for. I was pointing out that if you're motivated by expected value, you don't actually need to be certain of success. The other claim that tends to go along with that is that you need soldier mindset in order to seem confident. Basically, you need to be certain of your beliefs, you need to speak with certainty in order to seem like a leader or to be charismatic or persuasive, to be influential.
SPENCER: Not just convincing yourself, but now we're talking about convincing other people.
JULIA: That's right. The two tend to go together because both are important for a startup founder. If you're starting a startup, you have to be motivated to do this really hard thing and persevere when things get tough. But you also have to be able to speak about your company with drive and passion and convince everyone else that this is a really good thing to invest in or to work on. You want to convince the media to cover you, and so on. Exuding confidence is actually quite important for a startup founder, in addition to being motivated. People tend to see that as a deal breaker for scout mindset: if I'm a startup CEO or in another kind of similar role, then I need soldier mindset. I used to assume that was true. I figured this is an unfortunate trade-off in the world, where you can either have scout mindset and be intellectually honest, or you can exude confidence and have people look up to you and follow you, and you have to choose. That's too bad. Then I did a bunch of research for my book, looking at academic studies and real-world examples, and now I don't think that's true anymore. I don't think that trade-off exists to anywhere near the extent that people think it does. Here's what I think is going on. I think there are two different types of confidence. There are two different things we mean by the word confidence, and people are conflating them. For lack of better words, I call them social confidence and epistemic confidence, where epistemic confidence is about how much certainty you have in your beliefs. If you say, I'm positive that my company is going to succeed, or I'm 99.99% sure that the Democrats are going to win, those are statements of high epistemic confidence. Social confidence is just about how self-assured you are. Do you speak in a confident tone? Do you have good posture? Do you seem relaxed and at ease in social situations? Do you seem comfortable in leadership roles? Do you go out and take charge and make things happen? Those are all signs of social confidence. Social and epistemic confidence are two different things. They often go together, but they don't have to; they're distinct. What the research suggests is that social confidence is important if you want to seem like a leader, if you want to make people trust you and follow you. But epistemic confidence actually doesn't really matter. Jeff Bezos is a great example of this because, as I mentioned, he gave Amazon about a 30% chance of success. You might assume, of course, even if he thinks that, he wouldn't say that to investors or to the media, because no one wants to hear that. But in fact, he did. In all of his early meetings where he was trying to raise seed capital for Amazon, he told everyone, by the way, I think there's about a 70% chance this is going to fail, so don't invest unless you're willing to lose all of your money. You might assume, who would ever invest? No one is going to trust a startup CEO who says that about his own company. But people did. I went back and found all of the examples I could scrounge up of people who invested in Amazon in those early days. What did they say about why they invested in him? The things people comment on are all about social confidence. People say things like, he spoke with so much passion about his vision for Amazon. Even though he said it wasn't a sure thing, I could tell he was really passionate and driven.
Or they'll say, when I saw him bouncing down the steps of Amazon's headquarters, he had so much energy and charisma, I knew immediately that I wanted to be in business with him. That was one of the VCs from Kleiner Perkins who said that about Bezos. Or they say, he's so smart; he's done all his homework; he knows this market really well. Even though he can't promise this is a sure thing, he still seems like a really good bet to invest in. All of their comments are about Jeff Bezos's social confidence, not about any claims he was making about Amazon being a 100% sure bet. I think this is a really lovely fact to discover about the world: you can be influential without having to sacrifice your ability to see things clearly, be well-calibrated, and have uncertainty about things that you can't justifiably be certain of.
SPENCER: It seems to me you can come across as really confident when you make statements like, I think there's an 85% chance that will happen. Yes, you're literally saying you're not epistemically confident, but there's a way of saying that where people are like, wow, you've really done your homework. You must have really thought about this. I believe you.
JULIA: That is so true. In fact, people sometimes accuse you of being arrogant when you express uncertainty with that much precision.
SPENCER: If you say you're 87%, then you are really arrogant.
JULIA: 87% definitely sounds more arrogant than 85%, which sounds more arrogant than 80%. Although, I guess I should have gone the other way: 82% sounds more arrogant than 85%, and 85% sounds more arrogant than 90%.
SPENCER: And don't you dare add an extra decimal point.
JULIA: Exactly. Yeah, that's also an interesting and important distinction between different kinds of epistemic confidence: you can be quite uncertain about the actual probability you're assigning to something. You can say, I'm only 61% confident, or something. But you can still have a fair amount of epistemic confidence in the way you've analyzed the situation. So, I could be only 61% confident that some bet is going to pay off. But people assume that, in order to get to that number, I've done so much research, and I've really analyzed all of the available information, and I've concluded that 61 is the correct number to put on this prediction. And so that kind of suggests a strong command of the topic, even though you can't claim that you're 100% confident in the bet itself. Does that make sense?
SPENCER: Absolutely. I've also noticed that very slight changes in wording can have almost no difference in how people react to a statement, but can take it from something that's delusional to something that's very honest.
JULIA: What's a good example?
SPENCER: One of my favorite examples of this is when a startup founder will say, "we're going to be the best in the world at X". Right? And that sounds like maybe it's possible, but it also sounds a little bit delusional. And they're probably wrong. You could just change the wording slightly and say, "our goal is to be the best in the world at X". And to the human brain, those two sentences are almost indistinguishable. But the second one is actually a totally 100% honest statement; that really is their goal to be the best in the world at X. Whereas the first one, that you're going to do it, is like, how do you know that? That sounds actually really overconfident.
JULIA: Right, right. I mean, that's the thing; people are not tracking that closely the specifics of exactly what you're claiming with your words. They have a much more impressionistic sense of what you're saying, which is why even people who worked with Elon Musk come away thinking, oh, he can't even conceive of failure. Even though he explicitly says, I think this has a 90% chance of failure, the way he talks about his vision and the way he speaks about the details of the business sound so confident. So, it's kind of ironic, but I think it's a fortunate fact about human psychology.
SPENCER: When people ask me how to speak with more confidence, I give them this advice: find the closest statement to what you want to say that you completely believe, that you 100% actually think is true, and then just state that with confidence.
JULIA: That's great advice. I wish I'd put that in the book. Actually, what's an example of that? Something you could say with a large amount of justified confidence?
SPENCER: Well, for example, you could have justified confidence in how hard you're going to work to make something happen, as opposed to confidence that the thing is definitely going to happen.
JULIA: Actually, that's exactly what Bezos does in his interviews. So, the interviewer will push him on things like, can you be sure that Amazon is going to succeed or whatever? And Bezos will say, no, no, of course, you can't be certain; this is a very uncertain business. And it's hard to predict which companies are going to succeed. But I am confident that we're doing everything we can to have the best shot at success. And so he is, I think, justifiably confident that he's giving it the best possible shot, even if that shot isn't 100% likely to succeed.
[promo]
SPENCER: So you mentioned that there were various ways that people think soldier mindset is beneficial. Were there other ones you wanted to talk about?
JULIA: Yes. There are three main ones I focus on in the book, and I've talked about two of them so far: one being the motivation to do hard things, and the second being the ability to project confidence so people follow you and trust you. The third one I wanted to talk about is essentially the ability to cope with reality. By that I mean the ability to protect your ego from all of the failures, disappointments, or insecurities that could threaten your ability to function in the world, and the ability to feel good about your future. These are all things that are very important to us; I'm not going to deny that. And people tend to assume that you need some amount of self-deception in order to get those valuable things, to get self-esteem and comfort. Again, I totally understand why people feel like they need to prioritize self-esteem and comfort over truth. But again, I don't actually think that trade-off exists. In every instance in which people have claimed, oh, well, you have to deceive yourself in order to feel okay about XYZ, my reaction is: no, you don't, actually. There are plenty of ways people have found to feel okay in a situation like that that don't require self-deception. For example, I was reading a great book by Carol Tavris and Elliot Aronson called Mistakes Were Made (But Not by Me). The book is all about something they call self-justification, which is a kind of soldier mindset in which you're trying to convince yourself that you made the right choice, that you did the right thing. Mostly, the book is about the problems of self-justification. But towards the end, they say, well, of course, humans need some amount of self-justification because without it, we would just torture ourselves with regret over having made the wrong choice. So we have to fool ourselves to some extent into thinking we did the right thing, even if we didn't. I read that, and I was like, wait, do we have to? Are our only two choices deceiving ourselves on the one hand or torturing ourselves with regret on the other? Couldn't we find ways to come to terms with the fact that we maybe didn't do things perfectly, without suffering terribly over that fact? Indeed, I know plenty of people who I think are very good at that. They're able to just shrug and say, yep, I could have done that better, but I've learned something, and now I have a greater chance of doing things better in the future, and I can feel okay with that. In every case where someone claims they need self-deception in order to feel okay about something, I think they're probably wrong. I can't claim that that's true of every single case, but it's been true of basically all the cases that I've seen. Does that ring true to you?
SPENCER: Well, I noticed a pattern in a bunch of what we've talked about, which is that people may have strategies that give them something, strategies that rely on what you call soldier mindset. But there may be other equally good strategies they could use that have better long-term consequences or fewer bad trade-offs. They just may not have taught themselves those strategies yet; they're just using what they've found, right?
JULIA: Exactly. Yeah, the way I think about it is that when we feel the threat of some negative emotion, like insecurity, worry, or fear, it's like we reach into this bucket of coping strategies and just grab the first coping strategy we can latch onto. That coping strategy might be self-deceptive or false, like telling ourselves, no, I didn't make a mistake, or telling ourselves, everything's going to be fine. But there are so many other potential coping strategies in the bucket. If you just rummage around a little more and put a little more effort into finding something at the intersection of comforting and true, you'll find tons of things in that intersection.
SPENCER: You can see why people get stuck on the same coping strategies, even ones with this sort of delusional element, because if a strategy has worked for them a few times, they get this reward for it.
JULIA: Yeah, I think it's partly that they're doing what they've found works in the past, so they don't have much motivation to deviate from that. It's also, I think, a little bit short-sighted in a very understandably human way: we tend to get stuck on locally optimal solutions. Imagine you're dealing with a bully at school who says, you have to hand over your lunch money to me, or I'm going to beat you up. You might just figure, well, yeah, the logical thing to do is to give him my money, because it's better than getting beaten up. That's kind of true locally. But if you zoom out and look at the longer term, there are other things you can do. You can learn to fight, or you can maybe switch schools, or you can find some way to get the bully caught red-handed so that he gets punished or sent away to military school or something like that.
SPENCER: I wouldn't want to bully you.
JULIA: If you're a little more strategic and focused on the long term, and not just on your trade-offs right now, in the moment, you don't necessarily have to accept the trade-off being presented to you, where you either get beaten up or hand over your money. I think this is analogous to the alleged trade-off we face when it comes to soldier mindset and scout mindset, because we tend to think, and people will claim, that the trade-off is that you have to hand over some of your judgment, some of your ability to see things clearly and truthfully, or else you're going to get beaten up, in the sense of suffering a blow to your self-esteem or a blow to your motivation or influence. Maybe to some extent, that's true locally. But there are all these strategies you can use to escape the trade-off and have your cake and eat it too, to have your self-esteem and your motivation without having to sacrifice any of your precious ability to see things clearly.
SPENCER: This topic resonates with me a lot on a personal level, because I hate the idea of deluding myself. I work really hard to try to do it as little as possible. That being said, I know that I do it sometimes, and I'm very annoyed about that. I try to reduce it as much as I can.
JULIA: As do I.
SPENCER: The idea of trying to find strategies that don't involve self-delusion to achieve the same benefits, I really like that. It's something that I try to do in my everyday life. I just wanted to comment that I think a class of strategies that can be really helpful is reframing strategies, as opposed to self-delusion strategies. There's a subtle distinction between them. Just to give you an example, I was talking to a friend of mine, and she was saying how, when she thinks about the fact that she's alive, it feels amazing to her: like, wow, I'm a being, I get to experience life, think about the weird odds that I exist at all. It's just an interesting, kind of funny reframing of existing that makes her feel really happy. There's nothing self-delusional about it, right? It's just a way of looking at being alive. Another way to look at being alive is, oh man, every day I get one day older, and I'm just slowly moving towards death, or something like this, right? That's just depressing as hell, and it's equally valid. Reframing is just looking at things from different directions or different perspectives, none of which are right or wrong; they just make you feel different ways, and they're more or less functional, more or less helpful. Self-delusion is actually about deceiving yourself. Another friend of mine had this mantra she would say to herself: everything that happens to me happens exactly the way it needs to. This made her feel better about the things that happened, but it really bugged me because it was just deluding herself. No, that thing shouldn't have happened to you; you should try to make sure it doesn't happen again. That wasn't what you needed. Even though the mantra made her feel better in the moment, I was worried it would have longer-term consequences if she really started believing it. So anyway, I just wanted to throw that out there: there are two different sorts of strategies.
JULIA: Yeah, exactly. It astonished me to learn that many people weren't really making that distinction between honest versus self-deceptive coping strategies; they just conflated the two. They would say it's really important to be able to delude yourself in order to be happy, and then they would give an example that was just focusing on the positives. That's not deluding yourself; that's just choosing to think about the good parts and choosing not to dwell on the bad parts. You're not actually telling yourself a falsehood there. Then, in the next breath, they would talk about a coping strategy that was actually false. It was clear to me that they weren't even really tracking the difference between those two things. To me, they're very different, importantly different. I use reframes quite a lot and just ask myself, do I have to look at things this way? Is there a different frame that's no less honest or no less accurate but makes me feel a lot better? I also look for silver linings. Anytime something goes badly, a silver lining is that you might learn how to avoid such things in the future. Sometimes when things go badly, a silver lining is that you have a funny story to tell as a result. If you have a disastrous first date, at least you got a story out of it. If you lose your job, a silver lining could be that you can use it as a kick in the pants to explore some things you wouldn't have had the chance to explore otherwise. The important point is that you're not telling yourself this was actually for the best. Sometimes it might be, but you're not claiming that if it's not actually true. The important thing is just that it's a silver lining to the dark cloud that you otherwise wouldn't notice. You're not claiming that the dark cloud doesn't exist; you're just claiming that there is also a silver lining.
SPENCER: I think this leads to a general principle: if you can become more aware of times when you use a self-delusion strategy to get benefits, you can ask yourself, is there a different strategy, maybe a reframing strategy or some other type, that can replace the delusional strategy and get me the same benefits? Then you're going to get better long-term outcomes, because you'll have a truer view of reality and be more likely to make good decisions, because you understand the way things actually are.
JULIA: Exactly. That kind of habit you were talking about is exactly it: you get better at noticing when you're using a self-deceptive strategy to feel good, and you use that as a trigger to reach for a less self-deceptive strategy. I do something like that when I'm in an argument and I start to worry that I might actually be wrong, which is not a pleasant thought. It's very tempting to flinch away from that thought and reach for a coping strategy, like a justification for why I actually am right, or why it wasn't my fault that I got this wrong. That's the kind of self-deceptive coping strategy that I use by default. What I've often replaced it with is just the reminder that if I tell this person that I was wrong, if I concede this point, that makes me more credible in the future, because I've shown that I'm not the kind of person who just sticks to her guns because she's unwilling to ever admit that she was wrong about anything. Basically, it's like I'm investing in my future ability to be convincing and trustworthy. It doesn't entirely take the sting out of being wrong, but it's often enough of a silver lining to make the unpleasant truth tolerable enough that I'm willing to accept it.
SPENCER: You told me about a kind of metaphor many years ago that I found super useful and is really relevant here. I don't remember what game this is, but you told me that there's some ---
JULIA: Oh, Monkey Island. It makes me seem like such a nerd. There was this PC game that my brother and I used to play when we were kids called The Curse of Monkey Island. It was a very silly game, but lovely. One of the silly aspects of it was that you would fight pirates, but you didn't really fight them. I guess you had a sword, but the fights revolved around insults, so you and the pirate would hurl insults back and forth at each other. If you didn't have a good rejoinder to the pirate's insult, he would score a blow against you, and you would be defeated in that battle. On the plus side, when you threw an insult at the pirate and he responded, you got to keep that response in your toolkit of insults and responses. When a future pirate tried to insult you with that same insult, you would now have this rejoinder to parry him. I don't know how well this is coming across to your audience, but that's my memory of it. The point is that another kind of silver lining that comes to mind for me, or that I rely on in arguments when I think I'm going to lose, is that if I lose this argument, then I can take their winning arguments, the points they made that I was forced to admit were correct, and use those in the future to win other arguments. Even if I lose this one, I get to borrow their weapons and use those in future arguments to make me more likely to win. It's a bit like an ironic way to appropriate soldier mindset and use it for good instead of for bad.
SPENCER: I think it's a really powerful reframe on scout mindset, because it says, if you already know the truth, that's awesome. You know the truth. And if it turns out you're wrong, by adopting scout mindset, your arguments are going to be stronger next time. So either way, you win, in some sense.
JULIA: There's this tricky relationship between the desire to be right and actually being right. If you really want to be right, that makes you more likely to be right, because you're going to be more motivated to figure out what the true answer is. But it's also tricky in any given situation, because really wanting to be right makes you less likely to notice the ways in which you're wrong and the ways in which you could improve your map of the world. The goal of this reframe is to replace the desire to be right in that particular argument with the desire to be right in general, in the long term. Acknowledging that you were wrong in this particular case helps with that longer-term goal of being right in general.
SPENCER: It's so interesting because soldier mindset doesn't even achieve its own goal, in some sense, if you take a longer perspective, right? It helps you win right this moment, but then you actually just have weaker arguments in the future.
JULIA: That's right.
SPENCER: Because you tend to have less accurate beliefs, and so you actually do tend to lose more in the war of ideas.
JULIA: It's almost self-reinforcing, I'm now realizing. If you don't allow yourself to notice any of the ways in which your beliefs could be improved, could be made more accurate, then you're kind of forced to rely on soldier mindset more to protect those beliefs, because they're weak and flimsy and not very well backed up. If you want to feel good about yourself for being right, then you're kind of forced to defend those beliefs with bad arguments.
SPENCER: If you really want to feel good about yourself for being right, be right.
JULIA: Exactly.
SPENCER: I think you have some cool ideas around how we can be right more often. You want to talk about some of those, for example, using betting?
JULIA: Yeah, so betting is a really interesting strategy that I talk about in the book for shifting your thinking in the moment away from soldier mindset towards scout mindset. I'm going to give you another analogy, if you're not already sick of my analogies. This one I borrowed from Robert Kurzban, an evolutionary psychologist who wrote a great book called Why Everyone (Else) Is a Hypocrite. In that book, he talks about how the way we think is often analogous to a press secretary at a company, where the role of the press secretary is not to figure out what's actually true; it's just to say things that make the company look good to the public or to the press. The press secretary is not really thinking about whether those things are true; he's just thinking about whether they are plausible enough that he can get away with saying them. In contrast to the press secretary, a company also has a board of directors that has to actually make the crucial decisions about whether to pivot the company or whom to hire or fire. Those decisions have stakes; the company will thrive if the board makes good decisions, and it will fail if they make bad ones. Very often, the things that we tell ourselves are as if we're our own press secretaries. I might tell myself this company is definitely going to succeed, or I might tell myself this is hopeless, so I don't have to try. I might tell myself I was right in that argument, and my partner was being unreasonable. We're only really concerned with whether the things we're telling ourselves are plausible enough that we can get away with claiming them to ourselves. The goal of betting is to shift yourself from being in that press secretary mode towards being more in the role of the board, where you're actually making bets with real stakes. When you actually make a bet, or even just imagine making a bet, it causes you to notice that you're not as confident in the claim as you felt you were when you were making it to yourself before the bet. In the analogy of the press secretary and the board, suppose it's a toothpaste company, and the press secretary tells everyone, we're confident that our toothpaste is the best on the market and makes teeth the whitest out of all the toothpastes that exist. Then suppose the board is approached by a dental school professor who says, hey, I want to do a study where I have people use all these different brands of toothpaste and see which one actually makes their teeth whitest, and then I'm going to publish the results. Does the board feel excited about taking the professor up on his offer or not? Despite what the press secretary says, the board might feel, we're not confident enough that our toothpaste would win this contest to risk looking bad by entering it. So no, actually, we don't think we want to take you up on your offer. That's just an example of how your confidence in a claim can shift when you imagine betting on that claim. I use this principle of at least imagining that I'm betting on a claim to see how confident I really am about it. Suppose I say to myself, I was really in the right in that argument I had with my partner; any reasonable person would say that I was in the right. That feels true, but I could bet on that claim. I could imagine making a bet where I describe the argument to some reasonable third party without revealing which side I was on, and then that third-party judge tells me which side he thinks was more in the right.
If my side wins, then I get $1,000, but if I lose, I have to pay $1,000. How confident do I feel about taking that bet? I might not actually feel that confident that a third party would see my side as being more in the right. That's interesting because I felt so confident up until the point where I had to imagine actually having skin in the game. This basic move of imagining what it would look like to bet on your belief and then imagining facing that bet and noticing how confident or unconfident you feel about it goes a long way towards shifting yourself from soldier mindset into scout mindset.
SPENCER: I like that you don't even have to make the bet; you just have to imagine making the bet. It shifts you into that truth-oriented mindset.
JULIA: Exactly. I think there are a number of things going on here. Part of it is just imagining the stakes, noticing that you're hesitant to risk losing $1,000. But part of it is also the act of operationalizing the belief. If your belief is that our servers are secure, making that concrete enough that you could actually imagine a bet being made on it, and coming out one way or the other, forces you to picture something specific. Suppose I hired a hacker to break into our servers, and I lose $1,000 if the hacker succeeds within five hours. Would I make that bet? Just picturing what it would look like to find out whether I was right about the servers being secure goes a long way towards noticing what I actually believe is going on. The more abstract the claims you make to yourself, the easier it is to get away with BS, if that makes sense.
SPENCER: It reminds me of something that comes up sometimes when people ask me for advice on questions like, how do I show people that I'm really good at XYZ? One thing I've noticed over the years is that often we don't clearly separate the thing itself from the marketing for the thing.
JULIA: What do you mean?
SPENCER: If someone says, how do I show people I'm really good at XYZ, that really splits into two questions. One is, how do I get really good at XYZ, or how do I make sure I'm actually really good at XYZ? Once you've got that locked down, it's like, okay, now, conditional on the fact that I'm actually really good at it, how do I show people that? Often, it seems like what we want to do, just as humans, is do those two things simultaneously, to convince people and be good at the same time. It's sort of like a company telling its marketing department, okay, prove to the world that our toothpaste really is the best. The marketing department can't make the toothpaste the best; all it can do is try to convince people that the toothpaste is the best. Sometimes we're sitting in this weird fusion between marketing and actually trying to do the thing. Some of the techniques you're talking about, like the betting thing, switch you to the mode of, am I really this good? Not blending it with the marketing side.
JULIA: Exactly. That's well put. It reminds me of something you said before about how we kind of conflate what we think is true with what we wish were true. You talked about someone you knew who said... I forget what the belief was.
SPENCER: Oh yeah, I remember this. It's a story from back when I was in high school, and it's kind of stuck with me my whole life. I was talking to a friend of mine, and she told me that she'd become a pagan ---
JULIA: Right.
SPENCER: She believed in a female god of some sort. This really surprised me because I had never met anyone who had this belief system before, and shortly before that, she hadn't believed this thing at all, so I was kind of taken aback by the sudden change in her position. I remember asking her, are you saying that you believe that this deity actually exists and made the world? Or are you saying that you like the idea of this deity existing and having made the world? She thought about it for a minute, and then she said, you know, I'm not really sure ---
JULIA: Points for honesty.
SPENCER: Absolutely. Of course, this was in high school, but I think we do this a lot more than we realize. We're not even clear with ourselves whether we really believe that thing or if we want to believe that thing or want others to think that we believe that. We're blending these things together. If we can peel them apart, like, okay, right now, I'm in the mode of what is actually true. Later, maybe I'll be in the mode of how do I present that to other people? That's a different mode to be in, right?
JULIA: And both of those are separate from what I want to be true.
SPENCER: Exactly.
JULIA: Just being able to think separately about those things is so important. I also understand why people are instinctively resistant to separating those things, because part of the illusion kind of depends on not separating them.
SPENCER: Right. And if you take the idea that we're constantly trying to signal information about ourselves, people might be doing this blending without realizing it, because it's strategic.
JULIA: Right, exactly. The implicit logic of what's going on there is that the more you're able to not notice that you don't really believe the thing, the better you will be able to convince other people that you believe that thing. If you get really good at thinking separately about what you believe versus what you want to convince other people of, then it could be self-undermining. I think that's actually wrong, but I think that's the logic that motivates the unwillingness to separate those out.
SPENCER: We've talked about a few really practical strategies for trying to implement these ideas in our lives. For example, noticing when we're deluding ourselves and switching to reframing or thinking, would I actually bet on this thing being true? Do you have any other practical strategies you want to share before we finish up?
JULIA: One thing we started talking about at the beginning of our conversation is the idea of talking to people you disagree with and hopefully learning something from those disagreements. I think this is something that people often try to do, and it tends to go pretty badly. A liberal will listen to Fox News because they feel like they should hear the other side, and more often than not, they just come away hating the other side even more than they did before they turned on Fox News. This happens on the conservative side as well: conservatives will listen to NPR or read the New York Times and just come away even more polarized. I think what's going wrong here is that, by default, we tend to reach for the most prominent representatives of the other side, the sources or the people that are most popular on the other side, like Fox News. The problem with that strategy becomes clear when you think about what makes someone popular with a particular side. They play to their base, they present things in a really one-sided way that flatters their base's preconceptions, and they mock the other side; Fox News, for example, mocks liberals. All of that is exactly the opposite of what you want. If your goal is to listen to the other side with an open mind and maybe come away with your mind changed, you shouldn't be listening to the source that's going to mock you and present the most biased version of things. What I advocate instead is being more strategic in how you choose the representatives of the other side that you seek out and listen to. Specifically, try to find people on the other side with whom you have at least some common ground. A good example of this that I talk about in the book is Jerry Taylor, who was a climate-change skeptic at the Cato Institute, which is a libertarian think tank. Jerry Taylor's role was essentially to go on talk shows and explain why climate change as a threat was overblown and not actually a problem. I tell the story of how he changed his mind gradually, over several years, and came to think climate change was actually a big deal. Now he's an activist on the other side. The conversation that ultimately changed Jerry Taylor's mind was not with a climate change activist as you might picture one, someone wearing tie-dye and hemp, with dreadlocks in their hair, waving signs on the street. Instead, it was a conversation with a fellow libertarian who used to run a hedge fund and who thought climate change was a serious threat. A friend had set up the meeting, and Taylor had this long conversation with a guy he respected, because the guy was clearly smart and agreed with him about libertarianism. All of that background formed really valuable common ground that made Jerry Taylor much more willing to listen, and it also meant the conversation was happening in language that Jerry Taylor already accepted. This guy made an argument for why climate change was, as he put it, a non-diversifiable risk, the kind of thing that, if you're investing, you can't really hedge against, so it's worth putting a lot of effort into avoiding that downside risk. This was language that Jerry Taylor appreciated; he could think in terms of risk, and he thought that was a valuable way to think. That conversation was far more valuable for him than a thousand conversations with activists who made their case in language about our duty to Mother Earth or something like that.
That's an example of how I think finding someone with whom you share at least some common ground, even if you disagree with them about the particular issue, can lead to much more promising disagreements where you're more likely to actually change your mind.
SPENCER: That's such good advice. Also, just going back to the theme of your whole book, it's so much easier to be in scout mindset with a person who you feel is on your team, where you're just debating this one topic, rather than being in enemy territory, where of course you have to be in soldier mindset.
JULIA: Exactly. There's an emotional component and an intellectual component. The intellectual component is that the arguments they make that they find compelling are actually more likely to be compelling to you too because of the intellectual common ground. Then there's also just the emotional common ground of feeling like this person is someone like you in a lot of ways. That also makes you more willing to listen to them.
SPENCER: This was so fun. Thank you so much for coming on.
JULIA: Thank you so much for letting me talk for so long and for asking such excellent questions. It was such a pleasure coming on your podcast. Thanks, Spencer.
SPENCER: Where can people find your book? Is Amazon the best place?
JULIA: Yes, there's a page for The Scout Mindset on the Penguin Random House site. Also, on Amazon, you can get the Kindle, the hardcover, and the audiobook, which I narrated. If you like my voice, then you should get the audiobook. On my [website](https://juliagalef.com/), I have a page about the book with some descriptions of the principles and some quotes from people who read it. You can go to https://juliagalef.com/ to learn more about the book or check out my podcast or some of my videos.
SPENCER: Oh, yeah, we forgot to mention your podcast. I highly recommend "Rationally Speaking", a fantastic podcast.
JULIA: Thank you.
SPENCER: I also can't wait to read your book. For anyone who's thinking of getting it, I really recommend you do. Supporting first-time authors like Julia is impactful, because buying their book early on improves its sales statistics and can propel the book to success. It's a great way to support Julia.
JULIA: Thank you. I hate begging, but it is true; pre-orders are really valuable. If you're considering getting the book, getting it earlier rather than later is certainly helpful. Thanks.
[outro]