CLEARER THINKING

with Spencer Greenberg
the podcast about ideas that matter

Episode 225: Decision-making and play-testing (with Dan Epstein)


August 29, 2024

What sorts of decisions are we making without even realizing we're making them? Are people aware of their own values? Do they know how those values rank relative to each other? What are all the various parties, interests, and values that have to be addressed and balanced when making decisions in a healthcare context? What does it mean to "play-test" yourself? What are the best strategies for giving feedback? How much energy is required to make various kinds of decisions? How can we practice and get better at decision-making? What is "tabletop exercising"? What are the most effective ways to bring other people into the decision-making process? What are some aspects of games that ought to be put to good use in non-game contexts? Why are educational games usually neither fun nor educational? How can game design features be used in ways that avoid turning metrics into targets? How can we make better decisions about how to divvy up our time?

Dr. Dan Epstein is a practicing medical doctor and academic PhD focusing on decision-making and game design. Dan is the director of The Long Game Project, which helps businesses and leaders improve strategy and decision-making with games and tabletop exercises. Dan is also an ambassador for Giving What We Can, a community of people who pledge to donate a portion of their income to effective causes; and he's a member of High Impact Athletes, a community of current and past athletes who do the same. Follow him on Twitter at @drdanepstein, email him at email@longgameproject.org, connect with him on LinkedIn, or learn more about his work at longgameproject.org.


JOSH: Hello and welcome to Clearer Thinking with Spencer Greenberg, the podcast about ideas that matter. I'm Josh Castle, the producer of the podcast, and I'm so glad you've joined us today. In this episode, Spencer speaks with Dan Epstein about finding the right approach, energy and expertise for good decision making, and evaluating how we spend our time.

SPENCER: Dan, welcome.

DAN: Hi. Thanks for having me, Spencer.

SPENCER: It seems to me that one of the most important life skills we can develop is to make better decisions, because so much of our life is going to be dictated by what we choose. And I know it's a big question, but when you're thinking about how to make better decisions, where would you start?

DAN: That's a really big question. I like to take the approach of really thinking about this from a pragmatic point of view, where we just want to help people kind of recognize that decisions are a massive part of their life to begin with. I think a lot of your audience and you and I probably think quite high level about decision making, and really think about the things we can do on the margins of improving our decision making. But I think, generally, to improve overall decision-making skills for most people, firstly, it's about recognizing you're making decisions — a lot of people don't even recognize where decisions come up — and then, getting to know what that decision actually means, what it means for you, what's the context of it. So probably the biggest thing, I think, that a lot of people can do is just practice their awareness about decisions that come up — situational awareness.

SPENCER: So, what's an example where people don't even realize that they're making a decision?

DAN: I think that a lot of people just make decisions that are based on prior habits, or potentially the defaults that they usually have when they're going about their day and making their choices. So if we take healthy eating as a good example, their default might be that they stop in on the way home to McDonald's, and that's just their habit, that's their routine, and that's the thing that they regularly do. And the decision that they don't recognize that they have is, "What do I want to eat tonight?" They're just stuck in that loop of going through and having a habit or a default option that they usually follow through with. And then it's about recognizing that, a little while ago when you became hungry, you actually had a decision at that point in time as to what to do, instead of just following through on whatever your default behavior or habit is. So I think at the very low level, that's where people need to intervene and think about what decisions they're making.

SPENCER: It reminds me of a metaphor that I like to use for decision making, which is: you're walking down paths trying to get to the top of a mountain. And I think a lot of good decision-making advice, you can kind of embody it in this mountain metaphor. And to bring that metaphor into this situation, imagine you're walking down the path, and you're thinking about just walking straight ahead, and you don't realize, "Wait a minute, I could actually go off the path right now." And maybe going off the path is the best thing to do right now. But it doesn't even occur to you, because you're just kind of walking down this path. Now, in that metaphor, most of the time you probably shouldn't go off the path, right? Most of the time you should just keep doing what you're doing. And it's like, in most daily things, you don't want to be thinking about it all the time. You don't want to be thinking about, "Well, exactly how many strokes in my tooth brushing should I do today?" You know what I mean? But there's some really valuable strategic advantage in occasionally revisiting those paths and saying, "Hmm, maybe I should go off this path today."

DAN: Yeah, exactly right. So in your metaphor, a person just walking the path towards the mountain is just choosing, probably, the path of least resistance. And that's what a lot of us do. A lot of us choose the path of least resistance. It's usually whatever we've done before, or whatever our context or environment suggests, or whatever the other people around us are doing. And a lot of the time, that's when we're making decisions without really being aware that we have choices. We're carrying behaviors and decisions through our life, and we don't really stop to reflect on the other paths that might exist, as in your metaphor.

SPENCER: So where do values fit into this? How can we think about using values to make better decisions?

DAN: Yeah, I really love the work that you've done with Clearer Thinking on really ascertaining what people's values are. I think it's so important because decision making is such a complicated area. It's so unique to the individual. So what might be a good choice or a good decision for one person is not necessarily the best thing to do for another person. And in many really hard situations and really hard decisions, where it's really complicated to figure out what the best thing to do is, making a decision according to your values is usually the best way to go about it, or at least a really good broad-strokes formula to apply. But one of the things that people have a lot of trouble with is that they don't really know what their values are. A lot of people actually don't have a clear idea of what things really mean something to them, or what their particular perspectives are on something. So that's actually a really big thing that I like to work on with people, especially in my role as a healthcare professional: really trying to figure out what their values are, what actually means something to them, or what means more to them than something else. You do a great job thinking a lot about this, Spencer, so I'd really love to hear how you think about values and decision making.

SPENCER: Yeah, I find values most useful when you're making a really thorny decision, where it feels like there's no right answer, that it's impossible to do the right thing. An example of this often happens when two or more of your values are pitted against each other. For example, suppose you're pursuing your career. You really deeply care about your career. It's really meaningful to you, but your dream job just came available, and it would require moving to another city, and yet, your partner, who you deeply love and you want to stay with is really wedded to the place where you currently live. And now suddenly it's like, "Oh my gosh, these values I have are now suddenly in conflict. They used to be harmonized, and now they're in tension." And I find it starts to get useful to think about your values in a situation, saying, "What are the values at stake here to me, and how much of one value am I willing to trade off against another value?" And there's not gonna be a right answer to that, but it begins to kind of let you formulate the problem in a clearer way as, "Oh, what's really happening here is I care about value A and B a lot, and if I move, I get both A and B, but I also care about C a lot, and if I stay, I get C."

DAN: Yeah, I think a lot of people don't have a clear understanding of what their values actually are. I think that some people who spend a lot of time thinking about decision making, like you and me, probably have thought a lot more about this, but a lot of people just don't really think about what their values truly are, or even harder, what values mean more to them than other values. So, people might have an understanding that they really value honesty. But do you value honesty more than you value something else that might be a trade off in a situation?

SPENCER: Right. And if anyone is interested in this, you can take our "Intrinsic Values Test" on our website, clearerthinking.org and it will help teach you about your values and help you figure out what your highest priority values are.

DAN: Yeah, it's great.

SPENCER: I'm interested in your medical work. What's a typical example where you see values come up in the decisions of your patients?

DAN: Decision making in healthcare heavily relies on values, and a lot of them are crucial decisions where there may be one obvious decision that is the correct thing to do, or the preferred option from the medical standpoint, but it may actually not align with the patient's values. So you really have to think about it through the "values of the patient" lens a lot of the time. For example, you might have an older person who falls over and breaks their wrist, and a surgeon may not want to do the operation in the way that preserves the function of the hand as much as possible, because it might be risky if they're on blood thinners or something. But if this person is a violinist or a piano player, and they so highly value the use of their hand, they may be willing to take that risk, more so than the surgeon would advise. That's just one example, but there are so many different examples of where values collide in healthcare. Another one that I can think of would be when you're helping someone in elder care transition from a home to a nursing home. It's a situation where sometimes the values collide between the patient and the family. And you really have to think about this at quite a deep level as to what that person's true values are, what they want for themselves. But then you have the family coming in wanting what's best for them as well, and maybe those things don't align, so it's complicated. But we use values, and thinking about values and decision making, quite a lot in healthcare.

SPENCER: It makes me think about the fact that when you're consulting with an expert, whether it's a doctor or a lawyer or psychologist or what have you, they can bring a lot of expertise to a situation, but they can't tell you how much you care about different things, right? That, you've got to bring. And a good practitioner will try to take that into account, but they won't always do that. They might just give you advice that is standardized for everyone, and you kind of have to take that advice and match it with your values and say, "Okay, well, they think I should do this surgery, but what are the other options and which is actually best based on what I care about?"

DAN: Yeah, and I think a really good doctor, a really good lawyer or any type of expert will try to get to know the person, to try and understand their values. But it's really hard, because you've got to kind of hand hold people through and try to explore their values with them. And I guess the best way to do that, in a practitioner context, is to really outline the obvious things: like the big risks, or the low likelihood risks that have big consequences, or potential benefits against those risks. So outlining the whole broad suite of options and possible future consequences helps inform the person about their decision making, but it's also a way to kind of explore what their values are, and have them kind of, I guess, navigate for themselves what those potential outcomes would mean to them. And through that process, they kind of can figure out what their values are, if they're a bit unsure.

SPENCER: I think a really interesting case of this is end-of-life care, where there's a lot of reasons why a doctor or nurse, et cetera, might want to resuscitate a patient because, you know, partly their duty is to save a life, and there also could be legal ramifications, and so on. But it's not always in the patient's interests, and whether it is or whether it isn't, it will ultimately depend partly on the outcomes if you resuscitate, but also partly on the patient's values. Some people might say, "You know what, if I'm going to basically be a vegetable, I'd rather not be alive." And other people are like, "No, if there's even a chance that I'm going to get my life back, I want to take that chance, even if the outcome could be really horrible."

DAN: With end-of-life care, if we're talking about situations like the ICU, a lot of the time it's relatively medically predictable, to some extent, what the prognosis is going to be. But as you said, what's not predictable is patient preferences and values. I'm a primary care doctor, so I see people before they're in the ICU, but I make sure that everyone who's got a chronic disease or is terminally ill, or even way before that, when they're 70 or something, goes through the process of having what's called, in Australia at least, an advance care directive, which is really thinking about what you want from treatment, not just for resuscitation, but for things like, "Hey, if I get really sick with pneumonia, do I want antibiotics or not?" And that might be an obvious choice to someone who is not at the end of life, but if you've had multiple strokes in the past, and if pneumonia is going to be a really bad outcome for you, and if you might end up in ICU, someone might actually say, "No, look, I just don't want that kind of treatment." And it can be really challenging to explain those things to patients' families, and that's where the values really clash and get a bit heated. But you want to be able to navigate all of those possible options well before the time to make the decision actually comes up, because that's how you avoid family conflict and challenging end-of-life decision making — basically making those decisions far in advance.

SPENCER: Yeah, and you're putting a big burden on your family. If you haven't made your wishes clear, they have to make this incredibly stressful decision on your behalf, right?

DAN: Yeah, it's really challenging when you see families who haven't had that conversation or are disagreeing about that conversation. Because if it isn't clear, then it becomes a he said, she said. "Mum said she wanted this treatment," and the brother is saying, "No, she told me like three times that she wanted us to do everything possible to save her life." And it can become a really, really challenging and really emotional situation: that's a bad death. You can definitely have good deaths, but you can have bad deaths. And a lot of the bad deaths are probably because of miscommunication with healthcare practitioners or rushed, bad decision making in the lead-up. It's one of those things that could definitely be at least mitigated slightly with some kind of pre-thought and better decision making.

SPENCER: I think another reason people can make bad decisions in this context is that they don't really understand what the outcomes are going to be like in certain resuscitation scenarios. And I think people often suspect, "Oh, if you get CPR, then you'll just be totally fine." But actually, they might be seriously impaired for the rest of their life. What's your kind of experience with that?

DAN: It's so hard to be able to tell people what things are going to be like. The range of outcomes is so wide. In the media, we see so often that people have these miracle resuscitations and they're totally fine, but a lot of the time it's months of ICU afterwards, loss of function, pretty significant outcomes. And sometimes people have these premorbid conditions where they're very sick, and if they were to go through a big stroke or a big cardiac arrest, and there was a massive effort to resuscitate them, they may not get the health outcome or the functional outcome they wish for. And then it's also about thinking about what your life is going to be like after that, what your family's lives are going to be like after that. And then, from a doctor's decision-making perspective, we have another, more complicated lens to put on top, and that's making decisions from a population level, or a health-system level. So we have to think about other things like resource involvement or the economic cost of treatment. It is quite complicated, because if we said we're going to keep absolutely everyone alive, I'm pretty sure we'd run out of ICU beds, because we can keep people alive for quite a long period of time, but it might not be the right thing to do from a resource allocation perspective as well. It's really tricky.

SPENCER: Yeah, these are really, really tricky things. But I think a really actionable takeaway from this conversation is, if you don't have an advance directive, think about going and making one. And there are free ones you can download online. They take, I don't know, 20 or 30 minutes to fill out. So fill one out, send an email to your loved ones and say, "Hey, here's my advance directive. If anything should happen, I just wanted to let you know, just in case." Print out a copy and put it in your room somewhere as well.

DAN: Yeah. I think it's a great, great thing for anyone to do, really, no matter how young you are. You never know if there's going to be a car accident around the corner, or if someone's going to need your organs. It's something where you can easily just forecast your values and let people know, and then you can save other people around you from making really challenging decisions. You can save mum and dad or brother and sister the choice of whether or not to give your corneas to someone who needs them, which is going to be a very emotional conversation for them. If you know exactly what you would want, then you should be able to broadcast that.

SPENCER: On a less morbid topic, what do you mean when you talk about playtesting yourself?

DAN: Playtesting yourself comes from the concepts around game design. Basically, when someone is designing a game, you want to create almost like a minimum viable product: you want to create the rules and the context of the game enough to be playable, but you know it's going to have bugs, and you know it's going to have imbalances. Some parts of the game won't be fun, some of them might be broken. So what happens is, you send out an early copy of the game, whether it's a board game, a tabletop game design, or a computer game, and people test it. They play it, and they find places that break, places that don't work. And it's all about really tightening feedback loops. So the feedback loop is the most important part to concentrate on here. The concept of playtesting yourself is basically thinking about accepting feedback and criticism, and about feedback loops, in a more fun way and a more fun context. I kind of came up with this by myself. Everyone could be better at accepting feedback and improving themselves, but I particularly found accepting criticism quite hard when I was younger, and really wanted to get better. Sometimes criticism is something you've received and you don't feel great about it. But I tried to put the lens of playtesting on it. So I try to teach people now to continually playtest themselves. Basically, go out in the world, let people experience you and what you do, do the things you normally do, and then whenever someone comes back to you with playtester comments or feedback, look at that as just a way to receive the feedback in a better format, and use it as a chance to tweak things and improve yourself. So it's really just a fun twist on the way that you could close feedback loops and receive criticism.

SPENCER: So is the basic idea that you kind of think about it as like: okay, if you were designing a game, you would have to playtest it to kind of get the kinks out? And similarly, with a person, the kinks will never all be out, and this feedback lets us kind of get the kinks out over time. And it's no different than any kind of complex system.

DAN: Yeah, it's just a lens, in a way, for you to view the world and how you interact with it. And when people give you feedback or criticism, it's a lens through which you can view that as part of the way of actually improving the end product. So playtesting yourself is a way to actually make yourself, the end product, better, and to receive criticism and feedback in a positive way, not in a way where you take it emotionally or to heart. You can actually use it to your strategic advantage.

SPENCER: Yeah, I think a lot of the challenge people have with feedback is emotional, right? Just to a lot of people, it feels bad. Maybe it makes them doubt themselves, or maybe even can affect them negatively in the future, if now they think, "Oh, there's something wrong with me." I find what's really helpful for me is, I imagine two worlds, like World A: I have this problem or flaw, and I don't know about it, and I go around my whole life not knowing about it; or World B: I have this problem, the same flaw, but find out about it now and then do something about it now. I just so much prefer World B, and I'm just reminding myself that those are really your options. Because everyone has flaws. So the options are either you have it and are blissfully unaware and you're just annoying people with it or doing things less well than you could be doing, or you find out about it as soon as possible, right?

DAN: Yeah. I also think it has to do with how hard it is to fix, or how other people perceive it as desirable or undesirable. Like, if it's something super obvious, like you've got spinach in your teeth or something, that's an obvious one that you're walking around with that you want feedback on. But if it's something like, "The way you gave that presentation, you could do this, you could do that. You could be a bit more charismatic if you added this or that," people are proud of whatever they just presented or want to have done the best job possible, so they feel very emotional or vulnerable in situations where they think they might have done the right thing or given it their best, and it wasn't like unintentional spinach in the teeth. Yeah, it's definitely the emotional part of it. And I think whatever you can do to kind of separate your own rational brain from the emotional response of receiving feedback is good. I guess the way that I do it is through the idea of playtesting myself.

SPENCER: Yeah. And I think another issue is that sometimes people are bad at giving feedback. I would say quite often people are bad at giving feedback, and some people can be a little traumatized from the bad feedback they've gotten.

DAN: How do you give feedback, Spencer? What's your best approach to giving feedback? Do you have any tips or tricks? I think it's quite hard.

SPENCER: Well, one of my main strategies is I try to give positive feedback a lot. And so, if you give positive feedback a lot, then when you occasionally give negative feedback, it just hits less hard. It's sort of like, "Oh, I know this person thinks positively. I know that they, overall, like me and think I'm doing a good job in life. They just want me to improve this one thing," right? Whereas, if you never get positive feedback, and then you throw someone negative, it just comes across, I think, much harsher.

DAN: I've had a lot of people try to give feedback sandwiches, where you have a good bit, a bad bit in the middle, and then a good bit, to try and use a bit of primacy and recency bias to make you feel a bit better, I guess, so the first and last things are what stick and the bit in the middle doesn't hit so hard. But I don't know. I think feedback, as long as you can frame it productively to someone, is good, like, "This is what needs to change, and this is the reason why it needs to change. And this is what I think you could be like if these things are implemented."

SPENCER: Right. Well, I think the feedback sandwich can be helpful for some people, but it's not even really what I mean. I'm talking about just trying to be the sort of person that gives regular positive feedback to people. Like, when you see someone doing something that you like, or you think they're doing a great job, just tell them. I try to do this with my work colleagues, trying to tell them regularly that they're doing things well. And so, you keep that positivity. Because I think one of the reasons that it's hard to get feedback is it makes us think that we're bad, or that overall we're not doing a good job, or that the person doesn't like us, right? And so if you can nip those things in the bud, prevent the person from thinking those things when you're giving them feedback, it can be easier emotionally to bear.

DAN: Yeah, it's just easy course correction. I think more feedback, more tacking like that, makes people more aligned toward the overall strategy, or whatever direction you want them to go in. Especially in a workplace context, people do need that tacking to course correct, and I think positive feedback is the best way to do that, because they're getting signal.

SPENCER: Yeah. I don't know if you know about the hedge fund Bridgewater, one of the biggest hedge funds in the world, which has a famously, brutally honest culture where they literally have an app that you use in meetings to rate people on different attributes as they're doing stuff, and it's available to the whole company, apparently. Everyone can see, "Oh, you just rated someone as bad at creativity," or whatever. And this is happening in real time. For a lot of people, this would be a nightmare; for some people, it's like, "Oh, that's really exciting, what a great learning opportunity." But what struck me as odd about that is it feels like it's more based on negative feedback than positive feedback, which always seems to me like such a missed opportunity, because I just think that positive feedback is so powerful. It's powerful both for building relationships and for increasing positive behaviors. When someone does something well, it's a great way to get them to do it more. One way to think about this is: let's say someone sometimes does things you like in a certain domain, and sometimes they do things you don't like in that domain. Well, one way to handle it is, every time they do the thing you don't like, you tell them, "Oh, I didn't like when you did that." Another way to handle it is, whenever they do the thing you like, you say, "Oh, I really liked that you did that." You're promoting the same outcome, but one of them makes them feel good about you and makes them feel liked by you, and the other makes them feel bad about themselves. But in either case, you're kind of achieving the same goal.

DAN: Yeah. Sometimes I think it confuses people as well, if you're doing disguised criticism, or backhanded compliments, or forehanded insults. I think it's quite challenging for people to understand negative feedback if you're trying to make it nice, or if you're wrapping negative feedback up in a mandated format like this rating system. I don't know, I think people actually find it quite hard.

SPENCER: Another thing I think about is how to make feedback more palatable, because what's so essential for improving almost anything is getting feedback. And there's this framing issue that comes up, which is: one way to look at it is being happy with who you are now and then being excited about who you could become. You're like, "Oh man, I could become even better than I am." And another way to look at it is feeling like who you are is maybe not quite good enough, and then, "Oh no, I might also have these flaws." You know what I mean? "What if I also have these flaws, and then I'm not good enough?" It's just interesting how it could really be framed either way. And there's no right answer to that. But it's so much more rewarding to think of it as an opportunity to be even better than to think of it as, "Oh no, it might turn out that I'm worse than I thought."

DAN: Yeah, in fact, going back to the start of our conversation, this is a great example of a hidden decision: when you get feedback, you have a decision that you may or may not realize you have. You can either stew in the feedback and get a bit self-loathy, or reflect negatively on yourself, or start to dislike the person that gave you the feedback; or you can look at it exactly like that and see it as a choice to act on the feedback, take it on board, and even decide if it means enough to you to want to change something. It's a good example of a hidden choice and opportunity.

[promo]

SPENCER: So changing topics a bit. How do you think about the way that some decisions require energy and others don't? How does that fit into your kind of idea of how to make good decisions?

DAN: Yeah. So when people encounter decisions that they feel are going to require a lot of energy in the form of time or effort, that's when you have another hidden decision as to whether or not to spend that time and energy on the decision at all: "Is this decision big enough for me to worry about? Is it something that requires my time or concentration or thought space, or is it not?" This is a common pitfall for people when they're making decisions — spending too much time and energy on decisions that don't mean much. It might be deciding whether or not to buy a small object with a minimal cost, and they're spending too much time and effort worrying about it or comparing other options, or it might be something that's completely inconsequential, like a future event that may or may not happen. A lot of people who are very, very anxious about a certain thing run it over and over in their head, again and again: whether to do this or that, or whether to book this flight or that flight, because the stopover is longer and they hate being in airports. Knowing when to move away from a decision, or whether to even bother spending time and energy on a decision, is super hard. And I think a lot of people, everyone really, spend time and energy on decisions that they probably don't need to.

SPENCER: I suspect there's an individual difference here, where some people systematically spend too long on decisions while other people systematically spend too little time on decisions. And of course, if we were thinking about what really is the best way to allocate your time for decisions, it's like, well, you'd want to spend more time on more consequential decisions. But even that is not sufficient, right? You'd also want to think about, "Well, on the margin, how much am I getting for extra time?" Because there might be a really important decision, but there's no more benefit from thinking about it longer; you've kind of maxed out. So, you kind of want to allocate your time to the decisions where the marginal extra hour of thinking about it is really high value. But yeah, what do you think about that?

DAN: Yeah. I listened recently to your conversation with Annie Duke, and you were talking about expected outcomes. Expected outcomes are a great way to figure out whether or not it's worth your energy to spend time on a decision. But sometimes it's hard to know what to measure, to calculate what the expected value is. If we're playing with money, those are numbers, and that's easy to quantify. But if we're, for example, talking about something that makes you quite anxious, thinking about the expected outcome in terms of how much anxiety it's going to cause you, or how much psychological distress, is actually something that's really worthwhile to do as well.
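
As a rough illustration of the expected-outcome comparison Dan is gesturing at (none of this is from the episode; the probabilities, ticket prices, and "anxiety cost" below are invented for the example), the flight question might be put into numbers like this:

```python
# A made-up sketch of an expected-value comparison. Every probability and cost
# here is invented; the "anxiety cost" is just a way of putting a number on the
# non-monetary part (dreaded airport time, stress of a missed connection).

def expected_value(outcomes):
    """Sum of probability * value over mutually exclusive outcomes."""
    return sum(p * v for p, v in outcomes)

# Option A: cheaper flight with a long stopover
flight_a = [(0.9, -(350 + 80)),    # 90%: goes smoothly; ticket plus airport misery
            (0.1, -(350 + 300))]   # 10%: missed connection; much worse day

# Option B: pricier direct flight
flight_b = [(0.97, -450),
            (0.03, -(450 + 150))]

print(round(expected_value(flight_a), 2))  # -452.0
print(round(expected_value(flight_b), 2))  # -454.5
```

When the two totals come out this close, the calculation is also telling you something else from later in the conversation: the choice probably isn't worth much more deliberation.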

SPENCER: So for someone who does spend too long analyzing minor decisions, do you have any advice for them in particular?

DAN: Yeah, I think the first thing is situational awareness, or just awareness that you are spending too much time on something, or you are ruminating about something, and the first thing you need to do is really recognize that you have that problem. And if you're someone who has a problem with indecision, or with spending a lot of time on things that are a bit too small, or really trying to weigh up really small things, or holding up other people because of decisions, then when you recognize that you're in a situation where that thing is happening, the first thing to do is just hold a mental stop sign up inside your head and be like, "Alright, stop, take a second. Now, what is the actual choice that I'm making here, instead of just thinking globally about the whole thing with a lot of anxiety? And do I need to spend this energy on it?" If no, then just make a decision. And any decision in that situation is probably a good decision, because you're going to make it for a reason, and that reason is going to be based somewhat loosely on your values. And so, I think that if it's obvious to you that you're ruminating on something for a long time, hold the stop sign up, decide whether it's worth your energy or not. And if it is, then you need to sit down and think about it. But if it isn't, you need to move on.

SPENCER: I think an especially challenging type of decision is one where the two things are about equally good, and what can happen is people can be stuck because they're like, "I can't tell which one's better." And maybe, if they think about it this way, this one seems better, and then they think about a slightly different way, and the other one seems better. And maybe it's even worse if it's a consequential decision, like you're trying to decide between two different places to live and they each have all these different trade offs and they're hard to compare to each other.

DAN: I remember my granddad used to tell me — I can't remember even what the context was, but he said to me — "Any decision is a good decision, because you made it for a reason." And I thought that was really good, because in these situations where you have two equally good options, a decision is going to be good no matter what decision it is, because you probably made it for a particular reason.

SPENCER: Right. I think it's like people get stuck on, "Well, which one's really the better thing?" But if you've really thought about a whole bunch and they seem equally good, it actually doesn't matter in a certain sense, right? From a statistical point of view, it doesn't matter what you choose.

DAN: And a lot of the time it's impossible to know. You actually don't have the information about what the outcome is yet. So no one in your position could make a better decision than you can at the current time, so just do it.

SPENCER: Yeah, and I think an even harder type of decision is between two roughly equally bad choices, right? It's like, "Do I take this really bad option or that really bad option?" And they both seem really bad, and they both seem about equally bad. Thankfully, I think people don't have those that often, but those are kind of nightmare decisions.

DAN: I think in those kinds of contexts, going back to the hidden choices and hidden decisions, it's really about spending time thinking about the option of doing nothing, or the option of not making a decision. I think that's sometimes an undervalued approach when you feel like you're forced to do something in certain situations. There is always the option to explore further options. That is one choice: you just do more research and see if there are other options than two bad ones. And the second thing is the choice to do absolutely nothing and keep the status quo. I think that's actually a completely viable option that people don't explore. It's a bit of a hidden choice sometimes to some people.

SPENCER: It's funny, because sometimes the do-nothing option is your option A, and then you have another option B. And we can kind of be too attached to option A because we're already doing it, just the default bias. And it's like, if you don't decide, you are just kind of forced to option A automatically, right? But other times, you feel like, "Okay, I'm approaching a fork in the road. I've got to go left or right," and it's like, "No, I could just stay where I am."

DAN: Yeah. I think people also don't like inaction, or they want to feel like they've got some kind of agency over the thing that's happening to them or around them. So sometimes, taking an active part in the decision or enacting change in some way feels more powerful, or feels like the better thing to do. But quite often, sitting back and observing, or doing nothing, or at least allowing for more time or more information, is actually a really good move.

SPENCER: How can people practice their decisions to get better at them?

DAN: Practicing decisions is really tricky because you need to have time to do it. I think that the main way you get better at decision making is to reflect on decisions and have that feedback loop. So you need to practice decision making on small things and monitor them, and then keep a log of the big things that you decide. If you don't have feedback loops yet, practicing decision making can be about just getting some data and some information on current and past decisions that you've made. So practicing can actually come from observing decisions you've made in the past and learning things about yourself: what choices were available, what options did you consider, why did you consider those options, why did you choose what you chose, and how did your choice match up to the outcome? So practicing decision making — and when I say practicing, it's not about sitting there doing drills; it's about the actual art of the practice of decision making — is done best when you've got data around it, you've got a formal way of reflecting on your decisions, and then you can kind of naturalistically learn as decisions come and go in your life. In terms of practicing as in doing the repetitions, you know, doing a bunch of layups, then there are other tools. You can do things like making predictions, learning to ask good questions, practicing going through case studies, learning from history, creating simulations for yourself, doing diagrams, and playtesting future situations, which is what I do professionally with tabletop exercising. So in answer to your question, Spencer, it's mostly about building good habits around your own current, natural decisions.

SPENCER: It seems to me that one of the biggest challenges of learning from our decisions is that it's hard to put ourselves back in the position we were actually in. Like, we have this hindsight bias that prevents us from seeing the decision as we saw it in the moment. We're like, "Oh man, I chose this apartment, and it didn't work out. I don't like my new apartment." But then to learn from that decision, you have to put yourself back in the shoes of yourself when you made the decision, and really think about, "What did I know at the time? And what could I have known? Maybe I couldn't have known that I wouldn't like this apartment, or maybe I could have. Maybe if I had just sought out some other information, I would have figured it out."

DAN: Yeah, so Daniel Kahneman obviously had the concept of keeping a decision diary, and I really like this idea. It's not about putting every single detail in. It's just about logging what your mood was like, how you're feeling, what your current context is, all those possible environmental factors, emotional factors, and situational factors that may or may not influence your decision making. Even the act of writing those things down, being aware of them, and actively thinking about what your context is can help you see, "I might be a bit biased, because right now things are stressful at work," or "I'm probably rushing this decision because rent is coming up at the end of the month." So keeping a decision diary and actually spending time writing in it is probably useful. But then, as you mentioned, reflecting on it, having some notes and some documents, really helps.
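
To make the diary idea concrete, here is a minimal sketch of what a single entry might capture. The field names and the example entry are illustrative choices, not a format Kahneman or Dan prescribes:

```python
# A minimal decision diary entry: the fields mirror what Dan mentions (mood,
# context, options, reasoning) plus a slot for reviewing the outcome later.
from dataclasses import dataclass
from datetime import date

@dataclass
class DecisionEntry:
    day: date
    decision: str                  # what you actually decided
    options_considered: list[str]  # the alternatives that were on the table
    reasoning: str                 # why this option, in a sentence or two
    mood: str                      # emotional state at the time
    context: str                   # situational pressures (work stress, rent due, fatigue)
    review_on: date                # when to come back and compare against reality
    outcome: str = ""              # filled in at review time

entry = DecisionEntry(
    day=date(2024, 8, 1),
    decision="Signed the lease on the smaller apartment near work",
    options_considered=["smaller apartment near work", "bigger apartment, long commute"],
    reasoning="Commute time matters more to me right now than extra space",
    mood="rushed, a bit anxious",
    context="rent on the current place is due at the end of the month",
    review_on=date(2025, 2, 1),
)
print(entry.decision, "- review on", entry.review_on)
```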

SPENCER: You mentioned tabletop exercises. So how does this work? And how do you apply that thinking to decision making more broadly?

DAN: That's a really good question. Tabletop exercising is such a broad term. It encompasses anything where you're trying to run a simulation of a situation and figure out what the options are, what the dynamics of the situation are, who the different stakeholders are, what things really move the needle, which levers can be pushed or pulled to make different choices, and what those outcomes are going to be. Traditionally or historically, that's come mostly from military backgrounds, running combat situations, where the scenarios and the simulations really reflect the physics of how many troops there are, how fast it is to get from A to B, and what this cannon does to this ship. As we've moved forward in history, probably the other community that's adopted tabletop exercising is the cyber security community; cyber security and the military are both very defense-minded communities. But tabletop exercising should be done in a lot of different contexts to think about the outcomes of different dynamic situations with lots of stakeholders. So as we've gone through history and the world has become a lot more complicated, tabletop exercising has morphed a bit, and it now includes things like negotiation tactics, geopolitics, and red teaming, which is kind of pretending you're an adversary attacking yourself, or trying to take a directly competitive approach. So it's basically a tool we can use to simulate very dynamic situations with multiple stakeholders, to try and see how things play out. And you can take it back in time, you can rewind it, you can replay it, and you can try to learn how the different dynamics of a situation interact with each other.

SPENCER: So is it basically taking a complex situation in life, like whether it's a military situation or a negotiation situation, kind of boiling away the unnecessary details and turning it into sort of a simple model of it, and then essentially playing it like a game?

DAN: Yeah, that's a really good description. So what you want is a really streamlined situation where you have the moving pieces that you think impact the situation and nothing else, and you can concentrate on those things. If it's in the context of a war game or a battle, you're going to be on a specific geographic area with specific units and teams. But if we're talking about a business setting or a boardroom, we want to isolate it to whatever's moving inside that company, or who the direct stakeholders are, and we're forgetting about all the other noise in the environment and really focusing on the situation at hand: what are the moving pieces, how do they play out against each other, and then simulating it. Mostly, if you're thinking about Dungeons & Dragons (D&D) or Warhammer, you just use a very small and easy combination of probability and maths, and you try to simulate things according to how you imagine they would happen in the world. So a lot of it is to do with just rolling dice to see what the outcomes are, or looking up on a table to see what the probability of something is, and trying to simulate what you imagine those outcomes would be in reality.
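
As a rough sketch of that dice-and-lookup-table mechanic (the actions and success chances below are invented, not taken from any real exercise):

```python
# A toy version of "roll dice against a table" outcome resolution.
import random

# Assumed success chance for each action a player might attempt this turn
SUCCESS_CHANCE = {
    "reroute supply trucks": 0.7,
    "negotiate with the neighboring region": 0.5,
    "ration fuel immediately": 0.9,
}

def resolve(action: str, rng: random.Random) -> str:
    """Resolve one action by comparing a random roll to the table's success chance."""
    roll = rng.random()  # stands in for the dice roll
    result = "succeeds" if roll < SUCCESS_CHANCE[action] else "fails"
    return f"{action!r} {result} (rolled {roll:.2f} against {SUCCESS_CHANCE[action]:.2f})"

# Seeding the generator means the same sequence of rolls can be reproduced,
# which is one simple way to support the "rewind and replay" quality Dan mentions.
rng = random.Random(42)
for action in SUCCESS_CHANCE:
    print(resolve(action, rng))
```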

SPENCER: What's an example where you ran a tabletop exercise that you felt really helped improve decision making in that area?

DAN: Last year, we actually ran a really good tabletop exercise for ALLFED, which is a research organization that looks at disaster scenarios where large parts of the global food supply chain are disrupted. We did it at a conference in Australia, and we had a bunch of conference attendees play stakeholders like the government, farmers in different Australian states, different food groups, supermarkets, the supply chain, and consumers, and we simulated a big volcanic eruption in Indonesia that basically chucked so much soot into the air that it blocked out the sun. We can't grow any food, we don't have enough calories on Earth to feed the population, and the situation is: what do we do? It was really interesting, because as we played it out, immediately it was about the government putting out statements and making sure that the community feels safe. But as soon as the scientific community came in and started to show the data, then we got a bit of panic, and then there were very reactive supply chain problems, and you get to see how these really complicated forces interact with each other and what the outcomes are. And there were some funny things that happened during that exercise. I think Western Australia, which is one of the states in Australia, tried to secede from the rest of Australia. We basically learned that there are a lot of problems that can occur, and a lot of civil unrest, in these situations. And after that exercise, we ended up collaborating with ALLFED to write a letter to our federal government, and it ended up with a meeting with the federal government, with the kind of all-hazards risk team there. So it was a good outcome, a good exercise. We showed that these risks that are not thought about a lot can have big consequences. And the simulation had enough fidelity to at least get the government a little bit interested in thinking about those big risks.

SPENCER: How do you design a tabletop exercise like that? Do you assign different people who are playing different roles and say, "Okay, here's what you care about in this role; here's what your incentives are that [you're] supposed to play to"?

DAN: Every design of every tabletop exercise is all about: what is the aim of this exercise? What are we trying to achieve here? The aim of that exercise was to have people see, from a government perspective and from a population perspective, how we respond to something which no one thinks is possible. These big volcanic eruptions are theoretically possible, but nothing like that has happened in hundreds of thousands of years. So that was the aim: to see how people react. Then you've got to create the scenario and the stakeholders which support that aim. So you've got to think about the stakeholders who are going to be involved, and then with each of those stakeholders, you're quite right, you basically think about: What are those stakeholders' ambitions? What are their goals? What are their mandates in this situation? And you give them that directive in the form of a couple of dot points. It doesn't mean they're going to act exactly according to that script or mandate, but it gives them a bit of directionality as to how their stakeholder might respond in that situation. And so you have a very complicated situation with eight different stakeholders, and they all have different goals. It very quickly gets very dynamic and unpredictable. It's not very much like a game of poker, where you know what everyone's goal is or what everyone's aim is and there's just some hidden information. It's much more complicated, because there are layers of the cake that get uncovered as you go.

SPENCER: Does someone have to play the environment as well? Because presumably there must be some kind of external events that occur throughout the game that give it interesting dynamics relevant to the particular situation.

DAN: Yeah. Basically, if there are stakeholders that we don't have at the table as players, then a central team, or the organizing team, or the game master, will simulate those events and stakeholders. A lot of the time that's done through pre-thought, through random tables: people will have thought up different outcomes or different things that could happen at different points in time, and then roll on a random table to see which thing gets injected into the situation. There may be a point in time where we've designed a complication into the scenario, and that complication might be further flooding, an infectious disease outbreak, some kind of government change, or some other complication. And so, if any of those things happen, or if anyone wants to interact with anything outside of the direct stakeholders at the table, that's usually simulated by the game master or the control team.
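
A control team's pre-thought injection table could look something like the sketch below; the complications and weights are invented purely for illustration:

```python
# A sketch of a game master's "injection" table: at designed points in the
# exercise, the control team rolls to see which complication enters the scenario.
import random

COMPLICATIONS = [
    ("further flooding in the affected region", 3),
    ("infectious disease outbreak", 2),
    ("sudden change of government", 1),
    ("no new complication this round", 4),
]

def inject_complication(rng: random.Random) -> str:
    """Pick one weighted event from the table, standing in for a roll on it."""
    events, weights = zip(*COMPLICATIONS)
    return rng.choices(events, weights=weights, k=1)[0]

rng = random.Random()
print("Injected event:", inject_complication(rng))
```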

SPENCER: And how do you know that it's sufficiently relevant to the real situation? Like, I can imagine playing such a game being like, "That was really interesting," but you might take away the wrong conclusions, because it kind of didn't model the correct aspects of the real situation, or it made some simplifying assumption, so it wasn't actually sufficiently realistic.

DAN: Yeah, that's actually one of my big gripes with the way that tabletop exercising is done currently. I think a lot of organizations that do tabletop exercising at the moment do it as a big kind of spectacle event, or a bit of an "invite important people to the big white table and film the whole thing and make an event out of it." And, you know, it's an N = 1 kind of thing, where maybe the lessons aren't the right ones, or maybe there was a bit that was missed. So, I think there needs to be more quick, dirty, frequent tabletop exercising at all levels, to do more simulations and really uncover things. Because a lot of the time it's not the direct outcomes of the scenario that are the most important thing, like, "Oh, hey, if we put in this plan to ship this many cattle to this area, it's going to prevent this thing." It's actually more about the general culture and decision-making processes, and the chance for the individuals within the game to practice being in those situations. So it's not so much about the content, but about the things that those organizations and people have in place to assist them in making good decisions, to assist them in acting in unusual or unpredictable situations. And it's the practice of doing that that's the most important thing. So, I'd like to see more people doing this as a kind of cultural practice, rather than just doing one-off events and feeling like they beat the pandemic.

SPENCER: Shifting topics again, how do you think about bringing other people into decisions, such as using them for mentorship or brain trust?

DAN: Yeah, I think it's very important to have people who are mentors and brains trusts. Selecting those people can be really tricky, but I think that everyone who is interested in being a better person or making better decisions should really seriously think about who is on their personal board of directors. Who are the mentors you go to with tricky, thorny things? Who is the brains trust that you bring certain types of problems to? Creating those networks and those individual personal connections is really helpful, not only for walking yourself through a situation and describing it to someone else in a logical way, but also just to have someone who can reflect back to you what your values are, and maybe give some extra input and extra experience on the situation. Mentors are very underrated, and so are brains trusts. I think that people really need to take those things a bit more seriously and know who those people are. You should know who you're going to go to if you have a really hard thing come up in the future. If you know you're going to go through something really challenging at work, or something really challenging personally, you might have the default people that are around you at the time. But I think you really should have a bit more of a formal understanding of what your levels of escalation are for yourself, if you were to encounter things that do require a mentor or brains trust.

SPENCER: When trying to make a decision that's really big and important, I think about three different types of people that might be useful to talk to. The first is someone who you just think is wise. Maybe they know nothing about this particular type of decision; like, maybe you're undecided about whether you should get your PhD, and this person doesn't have a PhD, but they're just a person you think is really wise, and they tend to have insights. They probably won't give you an answer, but they may give you a new way of looking at the problem. So that's the first kind of person. The second type I think about talking to is someone who has direct experience. So in that case, it might be someone who did a PhD, maybe someone who thought about it and didn't do it, maybe someone who started it and dropped out, right? They're not necessarily experts in it, but they have kind of the first-hand experience; they've lived through it. And then the third type I think about is someone with genuine expertise, which is not always available, but let's say you knew someone who's a career counselor, or someone whose job is actually to advise people on their career trajectories or whatever. And so, they may not have ever done it themselves, but they kind of bring the expertise to the table. And I find that these three types, while not always easy to access, often have different things to say and different ways that they can help you in a decision.

DAN: Yeah, it's a really good point you make, that you could have people who are at different levels through the process to help you make decisions. The lived experience part is very useful, because it'll help you feel like you have someone listening who has been inside the decision you're making. But having someone who has gone to the other side of that decision, or has helped other people make that decision in the form of the expert you're talking about, is one degree removed, and I think that's also important, because you don't want to get someone who's having a bad time in their PhD to give you advice. Likewise, you don't want someone who needs PhD students because they need a research grant to go through to be the one giving you advice. You really need to know what their own personal stake is, what their emotional state is, and what their own goals and ambitions and biases are. You do need to make sure that you have different varieties of people as mentors and as brains trusts, and know where those people are going to come from, because you need to gather a whole heap of viewpoints from a whole heap of directions, to shine different torches on the thing that you're all looking at, to try and light it up as much as possible.

SPENCER: Yeah, I guess I would say that it is useful talking to the person that had a really bad time in their PhD, and useful talking to a person that had an amazing time in their PhD. If you just talk to one or the other, it can bias you. I think this is actually a really common problem. It's like, people want to know, "Oh, should I go into academia?" So they just talk to people in academia. It's like, no. That is useful; you should talk to people in academia, but also talk to the people that tried and failed, or who decided not to stay at it, who started it and left. Otherwise, you're getting a very bizarre selection bias that's going to kind of point you in one direction.

DAN: Yeah. This is one of the huge problems we all face as a society at the moment: knowing where your information comes from and what the objectives are of the people telling it to you. It's really challenging, but that's why I think we need these trusted people who know you, who might know your values, and who operate as a bit of an advisory team for you. I think that's super important.

SPENCER: I know in your work, one thing you think about is games, more broadly, and how we can use games. What are some underutilized aspects of games?

DAN: Games are a very unique medium because they involve agency, and you can make choices inside of them. That's a really unique thing compared to learning through a lecture or learning through a book. I think the fact that you can do something within an environment, then see the outcome of it and have that feedback loop closed, is the power of what a game is. So for me, a game is a way to simulate an imaginary world, whether it be in your own brain or through some kind of game system, make a choice that has a direct consequence within that simulated environment, and close the feedback loop and learn something from it. Games have some really serious advantages that a lot of other mediums don't. Games don't have to be bound by physics or time. You can simulate things that are far off in the future or that happen really quickly. If you want to simulate an event that's going to happen over five years, you can speed up time and do that through the power of a game. And obviously, one of the main benefits is that there's near zero risk and consequence; they're sandboxes where you can experiment and learn without much repercussion. So games are underused as a tool, because I think people think they have to be fun, or that they have to be shiny or designed with lots of money. But really, at the end of the day, games are just a medium: a system that enables people to make choices, see consequences, and close feedback loops.

[promo]

SPENCER: One thing I've observed is that when companies try to make educational games, it feels like they often are neither that fun nor that educational. It's like this weird middle ground where it's like, okay, you can make a game and you can make it really fun, or you can just try to make something educational. And I'm wondering, would you disagree with what I'm saying? Or do you think that often is the case?

DAN: No, I think that happens a lot. It goes back to my point earlier, that the very first thing you want to do in any type of game design is really answer the question: what is the point? What is the purpose of this? What are we trying to do? You take that, and then you think about, now that we've defined what we're trying to do, how do we put in rules and constraints that get the player, or the people involved in whatever this is, to do that thing, to support that aim? So the aim might be: test if our new evacuation procedure works. But if your game is about getting people to find the staircase, it doesn't really support the aim. It's adjacent, but you haven't really created a game that serves your true aim. So my answer to the question of why educational games don't work is that you really need to define what the aim is, and then think about what rules support that aim as much as possible. In terms of the fun part, that's really tricky, because you need immersion for games to work, and creating a fun game is a unique, magic, secret-sauce thing that is really hard to get right. The best tool, by far, is to playtest and just see. And a lot of people don't do that; despite having lots of money and lots of funding, they don't playtest well. They playtest with a bit of sunk cost, or a bit of trying to confirm that what they've already done is the right thing. But testing is the main way to check if it's fun. And if it's not, find out why and try to improve it.

SPENCER: Would you advocate playtesting as early as possible, like even when you just have the bare bones basics of something?

DAN: Yeah, definitely. The earliest signal you can get is actually putting someone who has no idea what it is into the situation to see if they do the thing you want them to do. So if you have an educational game about teaching kids how to brush their teeth, and they get into it and just don't engage with it at all and are looking out the window, you know you got something wrong. You want to know that as early as possible, before you put time and money into developing artwork or the shiny things that go on top of whatever it is. Really, the game should be a complete skeleton of a concept before you go ahead and do any kind of development. You really need early playtesting.

SPENCER: Why is it that when people try to bolt on things like a point system or a leaderboard or whatever, that doesn't often work?

DAN: Again, it goes back to what the point is and what we're trying to achieve. There's a pretty good saying in economics that you get what you measure. If you're measuring points and leaderboards, what you're going to get is people who optimize for those things, if they care about them. So suppose you're trying to get employees in a hospital to wash their hands. That might be the aim, and you might set up a tag system to measure how often people are washing their hands, and they climb up a leaderboard and get a free coffee or whatever. But you're going to get people who just optimize for washing their hands as many times as they possibly can. Where you've gone wrong is in not defining the real thing you want, which is that infection rates in the hospital go down. So how do you make a game that supports that true aim? That's when you might have to recognize, "Oh, maybe points for how many times a day they wash their hands is a bad way to measure this. Maybe we actually have to start with 100 points and take points away from people if they don't wash their hands after certain events or before certain events, or something." So points, badges and leaderboards can work, but quite often they fall into the trap of being the thing that gets measured, and people just optimize for climbing the leaderboard.
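
To make that contrast concrete, here is a minimal, hypothetical Python sketch of the two scoring schemes: a raw-count leaderboard that rewards washing as often as possible, versus the deduction scheme Dan describes, where everyone starts at 100 points and loses points only when a wash is missed around a defined care event. The names, data, and point values are all invented for illustration, and even the event-based score is still only a proxy for the true aim of lower infection rates.

    # Hypothetical sketch; the names and numbers are illustrative, not from any real system.

    def raw_count_score(washes):
        """Leaderboard that simply rewards volume: more recorded washes, more points."""
        return len(washes)  # optimizing this just means washing as often as possible

    def event_based_score(required_moments, washes, start=100, penalty=5):
        """Start at 100 and deduct points for each required moment
        (e.g. before or after patient contact) with no recorded wash."""
        missed = [moment for moment in required_moments if moment not in washes]
        return start - penalty * len(missed)

    # Example: three required moments, two of them covered, plus 40 extra
    # washes that don't correspond to any required moment.
    required = {"before_patient_A", "after_patient_A", "before_patient_B"}
    washes = {"before_patient_A", "after_patient_A"} | {f"extra_{i}" for i in range(40)}

    print(raw_count_score(washes))              # 42: looks great on a leaderboard
    print(event_based_score(required, washes))  # 95: reveals the missed moment

The only point of the sketch is that the second scheme keys the score to the behavior that actually matters for the aim, rather than to a number that is easy to inflate.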

SPENCER: I suspect that there might be an even bigger problem that occurs a lot of the time, which is that people just don't care at all. Like someone's trying to gamify an app or gamify a behavior, and they tack on points, and then people just don't care about the points. There may be a personality type that just intrinsically cares about points; if they see a number going up, they get excited. But I think for a lot of people, tacking on points gets the thing backwards. The points have to measure something you care about, and if they don't already measure something you care about, why would you care about the points? Right?

DAN: That's exactly right. And you need to really think about who the players are. Some players, as you mentioned, are just not going to care about points, so you need different ways to motivate those people. That's why adding points, badges and leaderboards really isn't creating a game out of something. It's just adding a way for people to measure it, if they care about measuring it. And some people do; some people really love quantifying things in that way and get a little obsessive about hitting those numbers or doing their Duolingo every day. But other people don't care. They just want to learn Spanish, and they get annoyed when the Duolingo bird nags them about keeping their streak going or whatever. So it comes down to knowing who your players are and building things that support what their actual goals are.

SPENCER: Before we wrap up, one more decision-making topic I want to bring up with you: many of our decisions have to do with divvying up our time between things. How much of our time do we put into our job? How much do we put into our relationships? How much do we put into working on our personal projects? How do you think about that kind of decision?

DAN: I think it's a really interesting question, and one that I've spent a lot of time thinking about through my career, because I have a bit of a fragmented, almost portfolio career, where I do different things on different days of the week. I think, Spencer, you're more or less the same, with a few different things that you do throughout the week as well. I like to think about splitting up time, specifically for career and what you want to do with purpose, in terms of what topics you want to work on, what the returns on those topics are, and how you get the most out of yourself in those different areas. A lot of people, I think, focus too much on putting all their eggs in one basket with one career. I actually think it's a really good idea to have different buckets of things you do, for a few reasons. One, it gives you a bit more perspective on different areas. There definitely need to be some full-time thinkers on certain topics. But I also think there need to be people who spend part of their time as a foot soldier on the ground doing the thing, like me in my medical work, and part of their time as a networker or an idea pollinator, going between the thing that they do, which for me is medical work, and the other thing they do, which for me is decision-making work. So I like to take the impact of my medical work and the situations I run into and apply it to that other thing. I think there are not enough pollinators out there who go between two different things. In terms of how I think about the risk and return of those careers: it's very assured what I'm going to get every day in my medical work; I'm helping people one by one. Sometimes it feels a bit like shoveling sand at the beach, and sometimes it feels like you'd want to have more impact, but you know what you're going to get, and you know it's going to be a net positive return. Whereas with my decision-making work, if I can prevent one person from making a really catastrophic decision, that's going to have a huge impact, but I don't know what the chances of that are. My decision-making and tabletop exercising work might prevent something really bad from happening and have huge impact, but I don't know whether it will happen. So I like to hedge my bets in terms of my career impact, spending time on something that I know is going to have a definite return versus something that might have a possible large return.

SPENCER: It reminds me of the barbell strategies people talk about in an investing context, where some people will recommend that you keep a large portion of your portfolio, of your investments or savings, in something really safe, and then also allocate a very small amount to some really risky but potentially very high expected value stuff. The idea is that this might be superior to putting it all in safe stuff or all in risky stuff. Do you see that as analogous, or do you think that's a bit different?

DAN: Yeah, I think it's very analogous. I also think about it like this: a lot of people who want to get the most out of their career, or who look at the places where they can have the absolute most impact, will really look at the margins, at low-likelihood but high-impact things. I think that's a fantastic place to concentrate your highest expected return in terms of an impactful career. But if you take that and do it part time, the expected return doesn't go down by that much. The chance of you having a large impact is still there, but you can also have a guaranteed positive impact doing something else as well. It's probably a little bit personality-based too. I just really like doing something different and having variety in the things that I do. But yeah, it's pretty analogous to having a diversified investment portfolio and thinking about expected return, not just maximizing it, but taking a bit of a balanced approach.
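
As a toy numerical version of that intuition, here is a short Python sketch with entirely made-up numbers, showing how splitting time between a certain-return activity and a long-shot, high-impact one trades some expected value for a guaranteed floor. It is not Dan's model; the payoff, probability, and linear scaling are all assumptions of the sketch.

    # Toy numbers, purely illustrative; nothing here comes from the episode.
    safe_impact = 10        # certain "impact units" from a full year of safe work
    risky_payoff = 10_000   # impact units if the long-shot work pays off
    p_success = 0.002       # chance the long shot pays off per full-time year

    def portfolio(frac_risky):
        """Expected impact and guaranteed floor for one year split between
        safe and risky work, assuming the success chance scales linearly
        with the time invested in the risky work."""
        expected = (1 - frac_risky) * safe_impact + frac_risky * p_success * risky_payoff
        floor = (1 - frac_risky) * safe_impact  # what you keep even if the long shot fails
        return expected, floor

    for frac in (0.0, 0.5, 1.0):
        print(frac, portfolio(frac))
    # 0.0 -> (10.0, 10.0): all safe, no shot at the big outcome
    # 0.5 -> (15.0, 5.0):  a split keeps most of the expected upside plus a floor
    # 1.0 -> (20.0, 0.0):  highest expected value, but nothing if the long shot misses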

SPENCER: Do you worry about lack of focus? This is something that comes to mind for me, because I work on a lot of projects. While there are a lot of advantages to doing that, there's a huge disadvantage, which is that you're kind of not all-in on one thing, you're not putting 100% of your attention on one thing.

DAN: I do, but I think there are people who do that better than me. And I also think that one of the things you can have as a strategic advantage is a different perspective: coming from a different background, or having a different worldview on something. I think that's actually a huge value add, especially if you're part of a team or a group of people who are all doing something. And I think a lot of larger organizations are understanding this now, the value of getting perspectives from people who have lived experience or practical experience, because a lot of the time, big ideas or big projects need to hit the ground, and they need people who understand the integrative part of it. From a personal perspective, sometimes it worries me that there's not 100% focus, but I think that's more about having enough hours in the day or hours in the week. And I counter that argument in my head by knowing that it's more sustainable for me. I think that if I were to do medical work all day, every day, it would be really hard. It's really emotionally straining, and I think I would burn out or become not as good a doctor, compared to if I had time to recharge that emotional energy and get excited about it through my other work. So for me, it's more of a longevity and sustainability thing, and for that, the loss of focus, or the loss of specialization, is probably worth the trade-off; but that's my personal value.

SPENCER: Any parting advice for people who want to learn how to be a better decision maker?

DAN: I think the main three dot points are: be aware of when decisions are coming up, know yourself and know your values, and understand when decisions require your energy and when they don't. I think those three things will make most people much better decision-makers than just going about making the best decision they can based on their gut or whatever comes into their head at the time.

SPENCER: Dan, thanks for coming on.

DAN: Thanks for having me, Spencer.

[outro]

JOSH: A listener asks: "What do you think about Moloch? Are we heading towards a nearly inevitable race to the bottom in everything?"

SPENCER: I think that coordination is really, really difficult when you don't have aligned incentives. Imagine a marketplace where you know someone is selling apples and someone wants to buy apples. You can get really good coordination there, because, okay, you just have an exchange. One person gives the apples, one gets the apples in exchange for money, everyone's better off. When you don't have that nice incentive alignment, getting the coordination to work properly is just so, so difficult. And we see many different areas, many different human endeavors, where it kind of goes off the rails, where nobody really loves the equilibrium, or maybe a few people benefit, but most people don't, but it kind of gets stuck there. A good example of this would be the difficulty of building housing in a lot of cities. It's a really, really big problem. They can't build more housing, and the prices skyrocket, and everyone's unhappy. But then you go try to build housing, and it gets tied up because the people who are in that area, well, they don't want the house right there in their neighborhood, and so they want to block it. And so, that's a fundamentally misaligned incentive. And if you could somehow say, "Well, could we all vote to create housing blind to where it's going to be and then pick at random where it's going to go?" Maybe something like that would work. But I do think we have this really big problem as a species that we're not good at coordinating generally when we don't have aligned incentives. And we might need much better strategies for this in order to guide the world safely in the future, as these kinds of Molochian situations get more and more dangerous, especially with advancing technology.
