June 2, 2022
To what extent does thinking about ethics actually cause a person to behave more ethically? Do ethicists behave more ethically than the average person, or are they just better at justifying their unethical behavior? Why do we sometimes have strong negative reactions to people who seem "too" moral — even if they're genuinely altruistic and not just acting as though they're better than everyone else? Is morality inherently motivating? More specifically, are some kinds of moral beliefs more motivating than others (e.g., beliefs obtained through reasoning vs. beliefs adopted because of social pressures vs. implicit beliefs to which our brains are predisposed for evolutionary reasons, etc.)? In philosophical terms, what is a jerk? How many kinds of jerks are there? Are philosophers mostly trying to find the truth, or are they mostly just playing logic games?
Eric Schwitzgebel is a professor of philosophy at the University of California, Riverside. He has published widely in moral psychology and philosophy of mind, including on the moral behavior of ethics professors, on introspection and consciousness, and on the role of science fiction in philosophical thinking. His most recent book is A Theory of Jerks and Other Philosophical Misadventures. He blogs at The Splintered Mind.
JOSH: Hello, and welcome to Clearer Thinking with Spencer Greenberg, the podcast about ideas that matter. I'm Josh Castle, the producer of the podcast, and I'm so glad you've joined us today. In this episode, Spencer speaks with Eric Schwitzgebel about ethics, hypocrisy, and Eric's theory of jerks.
SPENCER: Eric, welcome!
ERIC: Thanks for having me.
SPENCER: On this podcast, we've talked about ethics a lot in different episodes. But one thing that we haven't really talked about is the way ethics does or doesn't change people's behavior. So, that's the first topic I want to talk to you about: to what extent does thinking about ethics actually make people more ethical?
ERIC: Yeah, it's not clear that it does make people more ethical. There's been a lot of hope that thinking about ethics makes you more ethical. In some sense, that's historically been an important part of the justification for studying ethics. But when you look empirically at the work on this, it doesn't look like the evidence is very good that it has much impact on your behavior overall.
SPENCER: It's interesting because if you take people, for example, in the effective altruism movement, it just seems really clear that many of them (not all of them, but many of them) can pinpoint certain times in their life where they suddenly shifted their ethical stance — for example, reading a philosophical treatise of some kind, or having a conversation — and then it seems like that can radically shift their behavior: maybe they became vegetarian, or maybe they switched career paths, or tried to figure out how to do more good in their lives. Anecdotally, I've just seen so many cases of that. Yet, interestingly enough, it's not clear that the empirical evidence is broadly in line with that. So tell us about some of the evidence you've looked at.
ERIC: Yeah, I agree. Anecdotally, it seems like there's some pretty compelling cases. And I think vegetarianism is one of the more compelling cases because it seems like people sometimes read Peter Singer, or something like that on vegetarianism, and say, "Hey, this seems right, I'm going to try it." And then, they become vegetarian; they change their moral behavior.
SPENCER: This is almost exactly what happened to me. I was in high school, and there was a philosophy class. One of the readings was Jeremy Bentham, discussing the treatment of animals (I don't know the exact line) but there was a line in his work that was something about, when you're thinking about the treatment of animals, what matters is not whether the animal can reason but whether the animal could suffer. That line resonated with me. In high school that day, I decided to become vegetarian. That felt like a really big turning point in my worldview; that sentence or certainly that article.
ERIC: The other anecdotal thing on the other side though, before we get into empirical evidence, is — I don't know how much you've hung around with ethicists over the course of your life — but I think one of the things that most philosophers experience is that ethicists seem just like other ordinary people of their social group. It doesn't seem like ethicists are especially better behaved than others. And often people can think of particularly horrible or creepy people who are ethicists or things done by ethicists. There's that kind of weight on the other side too. There's this kind of sense when you interact with us, you're just interacting with normal people.
SPENCER: It's really interesting. When you think about different fields, what do we actually take their practitioners to be experts in? Is an ethicist someone who's an expert in the different arguments for and against different ethical beliefs? Are they someone who knows how to publish in academic journals? Or are they someone we look to in order to understand how to be more ethical? You could imagine all these different variations on what an ethicist is an expert in.
ERIC: The short version is that the empirical stuff that I've done suggests that ethicists don't behave much differently overall than other people who have similar social backgrounds. Then one response to that is to say something like, "Well, what would you have expected?" In order to be an ethicist, what you need to do is publish journal articles and talk about arguments about ethics. Being an ethicist isn't the same as being a saint or being a role model, so we shouldn't expect ethicists to behave any better.
SPENCER: Well, you could make a counter argument saying, "If ethicists are studying what we should do, they at least should be much more aware of what we should do than other people," right? In some sense, it should be much more salient to them.
ERIC: Right. That's what I think. I think there's something to be said for the line that I just advanced. At the same time, I think it's a little too easy to say that, because you would think (I would think) that thinking in the way that philosophers do for long periods of time about ethical questions ought to have some impact on your behavior. It ought to at least make ethical choices more salient to you than they would otherwise be. A lot of ethicists talk about applied ethics issues: Singer talks about vegetarianism, as we mentioned, Kant talks about honesty, and there's virtue ethics — Aristotle talks about all kinds of aspects of personal virtue. So, at a first pass, you might think that someone who spends a lot of their time thinking about those things would somehow manifest them in their behavior.
SPENCER: I don't know about Kant. I've heard people claim he wasn't that Kantian, but I haven't looked into it. Peter Singer does seem to really try to live by his philosophy — imperfectly, as a human, like we all are — but it does seem like it's really influenced his behavior. He started charities. He's given away a bunch of his money. I believe he's vegetarian, though I'm not positive about that.
ERIC: Yeah, he's a vegetarian. It's an interesting issue. I think you can feel the pull on both sides. One of the things that I find a little frustrating is that people seem to think that it's really obvious, one way or another, that ethicists would or should behave better, or that they wouldn't or shouldn't. I don't think it's obvious. And I don't think it's really been studied very well. That's what I was after in getting into the empirical research, when I did a bunch of studies, mostly collaboratively with Joshua Rust, on the moral behavior of ethicists. Given that you can see anecdotal cases on either side, and plausible arguments on either side, in terms of whether ethicists would behave better or not, we wanted to look as systematically as we could at what's really the case. We did a whole bunch of studies. We found overall that ethicists didn't behave better. We looked at all kinds of different behavior: responding to emails that were designed to look as though they were from undergraduates asking about classes; self-reported behavior, like how recently you've been in contact with your mother; public records of whether they voted in recent public elections (on the assumption that voting is a duty); self-reported charitable giving; and courteous or discourteous behavior at conferences. We went to ethics sessions and non-ethics sessions of the American Philosophical Association. We recorded things like whether people left behind trash at their seats (littering), which would make the next session in the same room messy. Did they let the door slam when they came in late to a session? Did they rudely talk to another member of the audience during the main presentation in a way that would be disruptive? We looked at the rate at which philosophers in the 1930s joined the Nazi Party compared to people in other departments.
We looked at the rate at which ethics books were stolen from academic libraries compared to other books. That was the one case where we found, or seemed to find, that ethicists behaved worse. All told, we've got 19 different main dependent measures, and a bunch of sub-measures as well, that we've looked at. Over and over again, we find basically similar behavior. The one exception, where ethicists seemed to behave worse, was that ethics books were in fact more likely to be missing from academic libraries compared to other books in philosophy that were similar in age and in overall checkout rates.
SPENCER: Maybe they think the ends justify the means.
ERIC: This is one way of thinking about it. You could think that what ethics does for you is give you the capacity to rationalize what you want to do. Here's one view of the moral psychology of ethics. You have some inclination to do something. Then you think to yourself, "Well, I kind of want to do this, but would it be moral to do it?" You think about it a bit. If you're a capable ethicist, you might be able to come up with six different reasons why it would be perfectly moral to do this thing, like steal a library book. A good ethicist could come up with a consequentialist justification in terms of the end justifying the means. You could probably come up with a deontological justification in some Kantian mold: "Well, it's a universalizable maxim." You could come up with an Aristotelian justification in terms of virtue. I had one ethicist tell me that there is no more beautiful crime than stealing a book. This is a view that I call the toxic rationalization view of ethics. I don't think this view is likely to be most of what's going on with ethics, but I think it might be part of what's going on.
SPENCER: But it seems like they're not more unethical, and this view would be more compatible with their being more unethical, right?
ERIC: Right. Maybe for stealing library books, they are more unethical. Maybe in some cases, they are more unethical. I think it's hard to know. But on balance, I'd say they aren't more unethical.
SPENCER: Would you think there's a hypocrisy critique that can be leveled in saying that maybe they are more unethical if they're writing papers about ethics and not behaving in line with them?
ERIC: Maybe. Hypocrisy is a complicated question on which I have a complicated view. Maybe we can get back to that in a minute. But let me give you just a flavor of one of the more striking results that Josh Rust and I found on the vegetarianism questions, so that listeners can get a sense (a little bit) of what we did. In 2009, we sent a questionnaire to about 300 ethicists (by ethicists I mean professional philosophers in philosophy departments, specializing in ethics), about 300 philosophers not specializing in ethics, and about 300 professors in departments other than philosophy, at universities in five US states. In the first part of the questionnaire, we asked them their opinions about a variety of moral issues, and one of the moral issues was vegetarianism. We had people rate various prompts on a nine-point scale from very morally bad to very morally good. One of them was regularly eating the meat of mammals such as beef or pork. Then in the second part of the questionnaire, we had them describe their own behavior on the same issues. One of the issues was meat eating. We asked, "At your previous evening meal (not including snacks), did you eat the meat of a mammal?" What we found there was striking, because the ethicists were much more likely than the other respondents to say it was bad to eat meat. Sixty percent of our ethicist respondents rated regularly eating the meat of mammals such as beef or pork somewhere on the bad side of the scale, compared with 45% of our non-ethicist philosophers, and only 19% of the professors in departments other than philosophy. But then when we asked in the second part of the questionnaire, "Did you eat the meat of a mammal at your previous evening meal (not including snacks)?" there was no statistically detectable difference among the groups.
Overall, 38% of the respondents said yes, and 37% of the ethicists said yes. So there's basically no difference among the groups in their responses to that question about their own behavior, despite this big difference in response about attitude. We do find on several measures that ethicists tend to have more demanding or stringent moral views. On charity, they also recommend that the typical person give more to charity than the professors in other departments and the non-ethicist philosophers recommend, and yet they did not report giving more to charity themselves. So it's this combination of a stricter moral view without, as far as we could tell, better moral behavior.
SPENCER: Out of curiosity, did you look at a correlation between how strong someone's moral view is on the topic and how much meat they ate?
ERIC: Yes. There was a good correlation, especially for the people who rated it at the extreme end. The ones who answered one or two (I'm forgetting whether we anchored the scale as extremely morally bad or very morally bad, but it was one or the other) were much less likely to report eating meat than those who said it was only a little bad, or didn't say it was bad at all.
SPENCER: It's fascinating because across the whole population, you find this relationship between the more bad you think it is, the less likely you are to eat meat. Yet philosophers think it's more morally bad but aren't less likely to eat it, right? How do you make sense of that?
ERIC: Well, I think it's a pretty common view, and it's my own (I eat meat, so this is representative of my own view, too). A pretty common view among philosophers is to think that it's a little bad, but to do it anyway. One of the things that I think sometimes comes out of studying ethics is a sense that our lives are permeated with ethical decisions. We're constantly (I think) making ethical choices, consciously or not, in how we react to people, how we treat people; every product we purchase has an impact on the world. So from that perspective, everything we do falls far short of morally perfect, maybe.
SPENCER: So is it like one of those things where if you are on a diet and one day you just accidentally eat too much and you're like, "Screw it, I'll just eat as much as I want today." It's like I can't be perfect so I should just...it doesn't matter what I do.
ERIC: That's not quite how I think of it, although maybe it's a little in that direction. I wrote a paper, published in 2019, called "Aiming for Moral Mediocrity." I guess I think that, whether most of us acknowledge it or not, most of us aim not to be morally excellent by absolute standards, but rather to be about as morally good as our peers, or maybe a little morally better, so you can feel a little good about yourself. I think people don't really aim to be saints; it's much harder to aim for that. And the more demanding a moral view you have, the more you tend to see the world as permeated with moral choices, and every moral choice as being less than ideal.
SPENCER: I certainly see people have a backlash against people that they perceive as too moral. For example, there was a guy who was in the newspaper for giving away one of his organs. He gave away a bunch of his family's money and people had a really strong negative reaction to this guy. I've also seen this with people who make it really clear that they don't eat meat for ethical reasons, or they'll make it really clear that they don't buy from sweatshops for ethical reasons, and people can really have a negative reaction. I think part of it is that people might feel like they're being judged by it.
ERIC: Yes, I do think there is this phenomenon. Some psychologists call it "do-gooder derogation." People may feel a little threatened by others who are morally excellent, or at least who pose conspicuously as morally excellent.
SPENCER: So do you think that studying philosophy has made you more ethical? Or do you think that you just really are acting the same way you used to?
ERIC: I think I act about the same. It's kind of disappointing to me; I'm so torn about all of these issues.
SPENCER: Well, you're displaying a lot of virtue right now by being honest.
ERIC: Let me finish the moral mediocrity thought with respect to the vegetarianism thing. If people aim to be about as morally good as their peers, what would you expect from someone (an ethicist, say) who decides, who discovers, she thinks, that eating meat is morally bad? Will she decide not to eat meat? Well, not necessarily. That's what you might decide if you're aiming to be morally good by absolute standards. But if you're just aiming to be about as morally good as your peers, well, your peers are still eating meat. So what might happen is that you continue to behave the same as the other people around you, but you just feel like everybody is behaving worse than you thought they were before you discovered that eating meat is morally bad.
SPENCER: Well that leads to a testable prediction. Do philosophers think that their colleagues are more unethical than most people do?
ERIC: That is an interesting prediction. I haven't tested that. I think there are a lot of different psychologies of ethicists. I don't think there's one right psychology about this, but I do think there's a kind of psychology of ethicists that's fairly common that would say, "Look, morality is pretty demanding. Most of us aren't really doing it." Before you engage in a lot of philosophical ethical thinking, you might think, "Well, look, as I'm going through my day, doing my normal stuff, there's nothing morally criticizable in anything I'm doing. I'm just being a normal person." You study enough ethics, and you come to see how all of these issues have impacts on other people that you might not like. You come to see what you do as morally loaded. Buying expensive coffee: that's money you could have given to charity, and maybe there are bad practices that the coffee company engages in. Driving your car to work: you're contributing to global warming. Eating a hamburger: you're contributing to animal suffering. So there's a certain perspective on which, by studying ethics, we've come to think, "Wow, we do all of these things that are not so ethically great all the time." A consequence might be to think, "Boy, you know, we're not so morally great, all of us."
SPENCER: I wonder if another factor could be the incentive to stake out an unusual philosophical position. I've heard philosophers say that all you need to do is find a position that you think there's a 10% chance is true, and then become known for arguing, not that it's true, but that it's more likely to be true than people thought, or something like this.
ERIC: I think that's certainly a dynamic that you see in philosophy sometimes. I think it's probably a good dynamic for the field. Maybe some ethicists do that kind of thing too.
SPENCER: So you're saying that you feel like your behavior maybe is not that influenced by philosophy. We touched on this idea of hypocrisy earlier. Are philosophers being hypocritical if they're writing about ethics but not necessarily changing their behavior? I'm curious to hear what you think about that.
ERIC: That is something that I hear a lot of people say in response to reading my stuff on the moral behavior of ethicists. I'm inclined to think either it's not quite right, or hypocrisy isn't quite as bad as we think, or both. One way of thinking about hypocrisy is that you're presenting yourself as following a moral code, and suggesting that others follow a moral code, that you secretly don't follow yourself. I'm not sure that ethicists necessarily have to be doing that, because when they advocate positions, they might not necessarily be saying, "Hey, personally, in my life, I follow this." It doesn't necessarily follow. Although it's kind of odd to think that they wouldn't follow their own ethical advice, as it were, at the same time, I think it's not always totally clear.
SPENCER: Well, I guess where hypocrisy could potentially come in, from my point of view, is if someone writes a paper saying that such and such is the ethical action in such and such a case, it seems like they're saying that other people should do it. Then if they don't act that way, it feels like it could be a form of hypocrisy.
ERIC: Yeah. Well, here's the other thing that I want to say about hypocrisy. We were talking about rationalization before, and I think I'd rather have a hypocrite than a rationalizer. Suppose that morality demands a certain thing of you and you're not going to do that thing. There are two responses that you could have. One is to say, "Well, look, I recognize that morality demands this of me. I'm not going to do it." The other is to rationalize, and come up with a bunch of excuses and bad arguments for why you don't really have to do that thing. There's a kind of intellectual honesty in being willing to endorse stringent norms that you don't live up to that is missing in someone who rationalizes their way into undemanding norms. So the best person might be someone who says, "Hey, we should all be vegetarians," and then is one. But if you're looking at the non-vegetarians, I'm not sure the one who says, "Hey, we should all be vegetarians, but I'm not," is any worse than the one who comes up with lots of rationalizing excuses: "Oh, the animals aren't really suffering," and "There's nothing I can really do," or whatever.
SPENCER: So is that the kind of approach you try to take in your own life? Like saying, "Well look, I'm not perfect ethically. I think we should do X and I don't always do X. But I don't need to explain that away."
ERIC: I guess so. I mean, I find that a little disappointing in myself. I do think that it should be among the aims of ethical reflection and ethical thought to become better and to live up to your norms. I don't think we should just casually say, "Well, yeah, X is the moral thing, but I'm not gonna do it, and I'm perfectly comfortable with that." I think that's also uncomfortable. So I don't think there's one comfortable right space to be here. Unless you do the thing which I think probably any of us could do, and really just become a saint. I think any of us could easily be vastly morally better than we are. We just choose not to be. Of course you could be more helpful. Of course you could be kinder to your relatives, your spouse, your friends, your co-workers, and do more to help them. Of course you could donate your money or your time to charitable causes, and make the world better that way. You could drop everything right now and devote your life to a good cause and save multiple people's lives; you could donate a kidney, or whatever. There are lots of things that we could do. There's nothing preventing us from doing things that would be morally better, but we choose not to.
SPENCER: Why do you think that is? Why do you think people aren't more influenced by these ethical views they have?
ERIC: I think people are willing to trade away some of their self-interest for what's morally good. But I don't think they want to do it a whole lot more than they see their friends and relatives doing it.
SPENCER: Maybe it feels like they're getting the short end of the stick [laughs]. And they're sacrificing more than other people.
ERIC: Yeah, right. I mean, you can even make a fairness case for that: people don't want to sacrifice more than other people. So far, that's just a kind of description of aiming for middleness or mediocrity. But then you could say, "Well, there's something that would be wrong or unfitting about sacrificing more, or something unfair about that." Then it almost becomes like a moral norm, right? And I'm not sure that's quite right.
SPENCER: Some philosophers talk about morality as though it's inherently motivating. Sometimes it almost sounds like it's part of the definition of morality: what you ought to do, you inherently have reason to do. If you didn't inherently have reason to do it, it would be something else; if you just did it to achieve some other goal, it wouldn't be morality. I'm curious to hear your thoughts on that. Do you think that plays into what we're talking about here?
ERIC: Yeah, it does. I do think that morality is inherently motivating and that you don't necessarily need a self-interested purpose to choose what's moral. I just don't think that it's always highly motivating when self-interest is weighing on the other side, especially when you don't see other people making the same kinds of sacrifices.
SPENCER: In my own life, I'm very far from a perfect person, but I think the 'self-interest versus morality' framing doesn't capture well what happened to me internally. I think, for me, what happened is that I have a whole bunch of different things I value. Some you might put in the moral domain, some you might put in the self-interest domain, but some of them maybe don't fit too clearly in either domain. Maybe they're more like "I like this person, and so I want to do good things for them." It's sort of altruistic, but it's like, "Oh, I like this person, and that's my motivation." So I feel I basically have a bunch of these things that I value, and they're in competition with each other. And I'm trying to balance between them and say, "Well, I really care about making the world a better place. But if it comes too much in conflict with these other values, then I have to find a way to navigate between them."
ERIC: Yes, I agree with that. I presented it a little too simplistically as though it's morality versus self-interest, and they are these two things competing against each other. My actual view was much closer to what you just said, that we have a wide number of things that we value. I think every action we do is some combination, and compromise among a variety of things that we value. One of the best things that we can do in our lives is find ways to arrange happy coincidences, so that we advance many of the things that we value through the same type of action. If I value being a good parent, and I value, say, taking walks in the park, then if I can arrange my life so that taking walks in the park is one way of being a good parent, then I get both of those things I value in the same action.
SPENCER: I think that's a really excellent approach. I think there are a lot of individual differences in how much people care about being ethical. I know some people are extremely ethical and some people are less so. But one thing that I've noticed, even in people who have an extremely strong desire to be ethical, is that there can still be low-hanging fruit, where they could adjust their behavior just a little bit to do significantly more good for the amount they're sacrificing. One example is a friend of mine, who I consider extremely ethical, who would carry recyclable bottles around for two hours just to make sure to throw them into recycling. Sure, that's a nice thing by some viewpoints, but it's such a huge inconvenience. I can almost guarantee that there are much less inconvenient ways to do a lot more good than that. So if you think about it, there's some amount you're giving up for any effort that you're putting into some ethical action. You want to look for the ones where you're getting the most ethical bang for the effort you put in.
ERIC: Also, you can combine that with things that align with other values of yours. So, I enjoy chatting about philosophy, and here I am doing it with you, and maybe our conversations have a positive influence on the world (I hope).
SPENCER: Or maybe it makes people less ethical. [both laugh]
ERIC: Maybe. I hope not. But at least it helps people think things through a little better. So there's some sort of, hopefully, positive influence. But then also, I enjoy it. Also, it advances my career, it helps get my views out there a little bit more, and it's something I can put on my CV later. So there are all these things that I care about, and they're kind of bundled into this one package. The more you can bundle things that you value and care about into positive packages like that, the easier it is to keep doing those things, and the more bang you get for your buck, so to speak.
SPENCER: Additionally, if you make ethical things into something that's really unpleasant, if you just view that from an outside perspective, we can predict you're going to probably do less of it every time. As much as you might want to do it, theoretically, if it's actually punishing, you probably will do less of it. Whereas if you can find ways to bundle doing good with things that make you feel good without sacrificing the effectiveness, that actually seems ideal, just from a motivational and behavioral standpoint. You're probably going to do those things a lot more.
ERIC: But in saying that, I also think we shouldn't lose sight of the fact that there are things we could do that we find highly unpleasant, that we know would have good positive impacts, and we just choose not to do them. It's not like we lack the capacity to do them.
SPENCER: Well, there's another angle for looking at all of this that I want to run by you and see what you think, which is that I think there are many different ways we can believe a thing. One example: let's say that since you were a child, you were told something about society — maybe your parents always told you capitalism is evil, or communism is evil, or whatever. You grow up hearing that phrase, and then you're 10 years old, and someone says, "What do you think of capitalism?" You're like, "Capitalism is evil, right?" That's a type of belief. But that belief may have very little concreteness to it. It's just a phrase that you've memorized. You know that you're supposed to say it's evil, but it doesn't really connect to anything you're anticipating. So that's one type of belief. Another type of belief is what you might think of as a betting belief. Someone comes up to you and says, "Okay, we're about to find out the answer to this question. I want to bet you $100. Are you willing to take the bet? Or at what odds would you be willing to bet me on this thing?" That's a very analytical, probabilistic type of belief, where you're really asking yourself, "Do I think there's a 60% chance or a 70% chance?" Then there are still other types of belief. You can have a sort of implicit emotional belief where, let's say, every time you see a spider, you freak out. If someone asked you, "Do you believe this type of spider is actually dangerous?" on some level you're gonna say no, but there's another level on which you do implicitly think it's very dangerous. Anyway, I've tried to categorize these, and I was able to come up with quite a shocking number — I think I have 16 types of belief. I suspect one thing that goes on here is, if you ask people, "Do you think it's wrong to do X?" or "Do you think it's moral to do Y?"
There are different kinds of beliefs people could be pointing to when they say they do or don't believe it, and some of those types of beliefs may be much more motivating than others. If you have the wrong type of belief, maybe it just doesn't compel you to action, whereas other types of beliefs might actually make you behave differently.
ERIC: Yeah, that's interesting. My own view of belief is pretty close to that, but it almost turns it on its head. In my view, to believe something is to be disposed to act and react generally in the world as though it's the case.
SPENCER: Would that imply, though, that people don't believe the things they claim are wrong, because they're not actually that disposed to act in accordance with what they say? Or does it not contradict that?
ERIC: That's one of the trickier cases for this kind of view. Let me get to that in a minute. Let me first give your listeners a sense of how this plays out for, say, implicit bias cases, which I think are really interesting, and other kinds of value cases. In my view, what is it to believe that women and men are intellectually equal? Well, I think part of it is being disposed to say, "Women and men are intellectually equal," with a feeling of sincerity. And part of it might be being willing to wager on the outcome of a series of journal articles or something like that, if you want to talk about betting belief. But I think the heart of it, or what we should think of the heart of it as being, to the extent that philosophers can massage this concept of belief in thinking about how we want to think about it, is how you react to women and men in the world in your daily life, both implicitly and explicitly, both in your thoughtful actions and in your spontaneous responses. So I think there are plenty of people who might sincerely say women and men are intellectually equal, yet feel surprised if a woman makes a smart remark, or be much less likely to describe a woman as a genius than a man, or require more evidence before they'd hire a woman for an intellectually demanding job than they would a man. With regard to some of our most important beliefs, I think we often have what I describe as mixed-up dispositional profiles. In some respects we act as though we believe; in other respects we don't. Our true attitude is a complex mix of these things. In this sense, having a belief or an attitude is like having a personality trait. Nobody is a 100% pure extrovert. To be extroverted is to be disposed to enjoy parties, to like meeting new people, to be talkative in large groups, and stuff like that. No one's 100% always that way.
So likewise, you can be on a spectrum: a little bit extroverted in these respects, and not so much in these other respects. The same thing happens with belief. The 16 types of belief that you're talking about would be like the equivalent of 16 different dimensions of extraversion. Sometimes you have all of them line up, and then you can say, "Look, this person is way out on the far end of the spectrum of, say, believing in the equality of the sexes, and this other person is way out on the other end," but most of us are more mixed. I think the same thing happens with value beliefs. I think that my children's happiness is more important than their grades. Well, what is it to really say that, to really think that, to really live that way? I think that's a mixed-up, complicated thing. That's my view of belief.
SPENCER: Well, I notice something arising in myself here that I often feel when I talk to philosophers, which is that it sometimes feels like philosophy wants to ask, "What is belief, really? What is knowledge? What is ethics, really?" And my temptation is to say, "Okay, before we have that conversation, why don't we first just disambiguate it: come up with belief type one, belief type two, belief type three. Once we've laid out all the different things it could mean, all these different disambiguations, then we can talk about them one by one, rather than trying to have this conversation about what belief is." Because belief is a lot of things; belief is an ambiguous word; belief means different things in different contexts and for different people, right?
ERIC: Yes, I'm very sympathetic with that. The way I prefaced my comments earlier was sensitive to those issues, in the sense that I do think we use 'belief' in lots of different ways. It's not a univocal concept. Part of what I do with this approach to belief, or what I want to do, is invite you to think about belief this way, to say this is a useful and attractive way of thinking about belief. Of course, there are other ways of thinking about it, too. But this way has, I think, several pragmatic virtues that make it a good choice for a general theory of belief, which is a little different from saying there's some metaphysical fact out there about which is really the right account of belief. It's more like a pragmatic or ethical or value choice about how it's best to think and talk about belief, given its ambiguity, and given what we care about.
SPENCER: Right. Because you may come up with 16 definitions, but given common usage, given our goals in discussing it, given what other people mean by it, that doesn't mean all definitions are equally important or valuable.
ERIC: Right. On an intellectualist definition of belief, which a lot of philosophers have, you can say, "Hey, look, if you say with sincerity, and you inwardly judge, that women and men are intellectually equal, then you believe it." That's all there is to it, and whether your actions accord with that is a separate question. That's a perfectly coherent concept of belief, and I think there's a strand in our culture that thinks of belief that way. You can say, "Okay, that's one way of thinking about belief," and here's another way of thinking about belief, the way that I prefer, on which to believe something you really have to kind of live that way consistently. So then we've got these two accounts, and there are other accounts too, but let's consider these two for now. And we can ask, "Okay, which account of belief should we prefer? Which makes better sense for us, given what we care about?" Do we want to be able to say of the implicit sexist that he really believes that men and women are intellectually equal and just doesn't act that way? Or is it more useful, and more in accord with our values, to say, "Well, look, it's not quite right to say that he believes this. He says it, so there's a part of him that accords with it. But there's another important part that doesn't accord with it. So I don't really want to say that he fully believes it." I think the latter is a more useful way of thinking about it than one that privileges our intellectual self-attributions and the sentences that come out of our mouths over our real-life daily choices.
SPENCER: Sort of saying deeds matter more than claims about what you think, right? If we use this definition, how do we square it with ethical beliefs?
ERIC: So then we take the case of, say, someone who believes in an ethical thing but doesn't act in accord with it. I think part of believing that something is ethically good is having some motivation to act in accord with it. But it doesn't follow that you necessarily do act in accord with it. You might fail to act in accord with it but also feel guilty about failing to act in accord with it. You might feel like, if the situation were different, you would act in accord with it. You might feel admiration and praise for people who do follow those values and disapproval of yourself and others for not following those values. So those are all different ways in which you can genuinely believe that something is morally good despite the fact that you don't do it.
SPENCER: So are you thinking of those as actions, like feeling guilty or feeling good about someone who behaves in that way?
ERIC: Yes. Actually it's a little bit of a simplification, in my view, to think that belief is all about action. I think it's a set of dispositions, some of which are behavioral, some of which are cognitive, and some of which are experiential. So it's a mix of all of those things. Think again of personality traits, what is it to be extroverted? Well, it's partly to behave in certain ways but it's also to feel pleasure in certain kinds of social interactions, and it's also to think a certain way about other people. So I think all of those things belong to belief. It's not just, "How do you act?" It's the overall posture you take toward the world.
SPENCER: Hmm, interesting. I guess what I would say is that, insofar as people believe something is unethical, yet it doesn't drive their motivation very hard, it's not that they have a strong motivation and then other motivations on the other side prevent them. Rather, they don't actually have a strong motivation, even though they "believe" it's wrong. What I would say is not so much that they don't have that belief, but that they have a belief that isn't very motivation-driving. Maybe they just have the sentence "eating animals is wrong" in their head, but they don't really connect with it concretely. Or they don't have a visceral response to the idea of eating animals, or whatever. But I'm wondering, how would you interpret a case where, if you ask them, "Is it wrong?" they say, "Yes, definitely," and give all these words, but they don't seem very driven motivationally?
ERIC: Part of the question is, how wrong do they think it is? If they think it's grievously wrong, that's different from thinking it's just a little bit wrong, maybe about as wrong as driving a car to work rather than taking the bus or something like that. That would be a very different sense of its wrongness, which would presumably connect very differently with behavior. One comparison here: sometimes we say things that really have no connection with us, and it's not really right to say that we believe them, right? This is maybe like the example you started with, the person saying capitalism is bad. Or, in another kind of case, imagine someone sitting on the couch saying to themselves, "I really should eat better and exercise" while they're chowing down on junk food. There's a sense in which they might think that, but there's a sense in which it doesn't really engage them. You can play it out in two different extremes. At one extreme, you might imagine this person laughing at their kale-eating friends, and there are no conditions under which they would actually get out and exercise. They'll gladly load their cart with junk food. Then it's kind of just empty words. On the other hand, you can imagine someone who makes sincere plans to try to behave better, feels really guilty about not exercising more, makes resolutions, tries to get her friends to keep her on the hook, but this past week has been really terrible and she's totally stressed out. Those are two very different ways of consistently sitting on the couch eating chips and not exercising while saying that you really should do more.
So that then would be a spectrum from something like empty words that don't really reflect any belief at all, to something you believe, or a value you have, or a norm that you endorse but that is failing to get into action either because there's something in the way of putting it into action or there's some other counter-motivation that's weighing more strongly.
SPENCER: Changing topics now, I'm really interested to hear about your theory of jerks. Can you tell us about that?
ERIC: According to my theory of jerks, a jerk is someone who culpably fails to appreciate the perspectives of the other people around them, both their intellectual perspectives and their emotional perspectives. They treat the people around them as fools and as tools to be used rather than authentically interacting with them as peers and equals and other human beings whose opinions and values deserve respect.
SPENCER: So we're thinking of this as a trait that people have, rather than a state someone can be in?
ERIC: Yes. Although, I would hesitate to say that anyone's 100% jerk or 100% non-jerk. We all have relatively jerkish and relatively less jerkish moments. Yet I think it is a personality trait to some extent. There are some people who are more toward the jerkish end and other people who are less.
SPENCER: And why choose that particular way of thinking about it?
ERIC: One of the more practical aspects of ethics is how you interact with people on a day-to-day basis. A lot of ethics is about more abstract things than that, or actions done to distant strangers, or how you behave in principle in various types of hypothetical situations. What's somewhat undervalued in contemporary ethics, I guess, is just ethical thinking about our day-to-day, face-to-face interactions: how do you interact with the people around you? And this term 'jerk,' which I think is a valuable term in our culture, captures something worth capturing: something from ordinary language that represents an important kind of moral failing in our day-to-day interactions.
SPENCER: So what's the use case of this definition? Is the idea that it can help us think more carefully ahead and not be jerks or to help us identify jerks or other things?
ERIC: Both. It can be helpful to have a concept or category for thinking about problem people, to think, "Oh, okay, this is that pattern of character, the jerk pattern." When I first started thinking about jerks, I was talking with a friend whose boss was a jerk. We were complaining about this boss, who never seemed to understand other people's perspectives, would always shut people down in meetings, and was completely happy to help himself to all the goodies because he's the one who deserves them. So there's this kind of character trait that we can see in other people, and it's good to have a label for it. But that's in a way a preface to saying that maybe the thing I care more about in the theory of jerks is the question of how to assess whether you yourself, or I myself, might have that particular moral failing. I think that's a pretty difficult question.
SPENCER: It seems like the sort of question jerks would be less likely to ask.
ERIC: Yes, right. [laughs] I think even just thinking to yourself, "Am I being a jerk?" is already kind of not being a jerk. For the pure, diamond-grade, 100% jerk, it just wouldn't even occur to them that they might be being a jerk. There's an unreflectiveness in the jerk's assumption that they're right, and that other people's attitudes aren't worth taking into account. But I don't think it's that easy, because most of us, if we think "Am I being a jerk?" will tend to want to say no. Actually, I suspect the ones who are more disposed and ready to say yes to questions like that are the people who are the least jerks. So think of the opposite of the jerk. The opposite of the jerk, in my view, is the sweetheart: someone who is maybe painfully aware of how she might be wrong and other people might be right, of how the people around her have valuable perspectives and worthwhile projects, and who maybe can even become a little bit of a doormat and be a little misused by the people around her because she's so ready to see the value in other people. So imagine a sweetheart like this. This is the kind of person who might be unintentionally rude and then just full of pained apologies afterwards. The person who's so ready to say, "Oh my, I accidentally cut off that other car. I was being such a jerk. Oh, I feel so bad." So in a way, there's a possibility of a negative correlation between readiness to think of oneself as having done something jerkish and actually being a jerk.
SPENCER: I like this approach to thinking about it. But I want to try to make it actionable for our audience. So suppose you could get everyone listening to this to spend one to two minutes reflecting on a question to think about their jerkiness. What question would you want them to reflect on?
ERIC: So not that question. I don't think the best diagnostic question is to sit there and think, "Am I a jerk?" I think the best way, although still not at all a perfect way (self-knowledge of turpitude is going to be really hard, but this is the best hope we have for it), is through this concept that I think of as jerk goggles. There's a characteristic way of seeing the world, a characteristic phenomenology of turpitude. The jerk tends to see the world as full of people who don't deserve much attention or concern. They see themselves as surrounded by fools and tools. They tend to see the people around them in broad, negative social categories: "Oh, there's the valley girl. Oh, there's the spoiled rich kid. Oh, there's the dumb gardener. Oh, there's the snooty person." Or here's another example, actually the example that got me thinking about jerk goggles. I'd been talking with my friend about this boss, and this boss is full of these ideas like "I'm important" and "I'm surrounded by idiots." That's how the world looks through the jerk goggles that this boss has. That's a good way of justifying to yourself, rationalizing to yourself, your disregard of others' perspectives: "I'm important, and you guys are all idiots." A few days after this, I went to the post office and was waiting in one of those long lines, and the other people were being slow. I remember having negative thoughts about the people in front of me in line, feeling like, "I'm important. I'm in a hurry. Why am I waiting in this line behind all these slow, stupid people?" Then I thought, "I'm seeing the world through jerk goggles, seeing it just like the boss we were criticizing the other day sees it." So I think you can notice, to some extent, when you're seeing the world through jerk goggles.
The sweetheart, or at least someone who's not a jerk, sees people as individuals with interesting ideas that might disagree with their own, with projects that might be very different, and with values that might be very different from their own but are still interesting and worth respecting. The jerk just paints everybody in shades of somewhat negative gray.
SPENCER: So then what can we ask ourselves? Should we ask ourselves, "Am I using jerk goggles?" Or how would you frame it?
ERIC: I think that could be a useful exercise: once in a while, especially when you're around other people, just ask yourself, "Am I seeing the world through jerk goggles? Am I seeing the people around me as fools and tools, as idiots, as mere instances of broad negative social categories? Or am I seeing the humanity and value and individuality in the people around me?" I think we can sometimes notice when we're seeing people in the first way instead of the second way, and correct ourselves.
SPENCER: I'm gonna propose a little mini theory of jerkiness, and I want to see how you map it onto your theory, because I actually think they're pretty compatible. In my own observation, I've tended to notice four types of jerky people. The first are narcissistic jerky people, who basically seem to think that their interests are more important than other people's interests. It's not necessarily that they think nobody else matters at all; it's just that they think what they care about is more important and better than what other people care about, so they put themselves first. I've met a bunch of people like this, and I think we're all familiar with some people like this. The second type is more sociopathic: they just genuinely don't have compassion for others. At an extreme level, they don't care at all. Obviously, these are spectrums. There are people who are a little bit narcissistic and people who are extremely narcissistic, and the same with sociopathic: there are people who are a little bit sociopathic, who just have less compassion, less empathy, and there are people at the complete extreme who literally almost think of other people as video game characters. That's how much they care about them. They might still behave ethically in some cases, but you can imagine why they're much more likely to be jerks. The third jerk category is what I would call clueless: people who have very bad social skills and that kind of thing, so that they're sort of bumbling jerks. They're not intending to be jerks, but they make decisions that make things hard for everyone else, or they take something that other people wanted, just because they're really bad at modeling other people. Then the final type, I would say, you might think of as people who are misled. They have some kind of belief about the world that causes them to be a jerk.
A classic example of this would be someone who just really wants a girlfriend and never seems to be able to get a date. So they get really into pickup artistry, and they get taught, "Oh, this is how to get a girlfriend: make women feel like crap," or something like this. Now they've bought into some wrong theory of how things work, and they're hurting people due to this belief, but really their intentions are fine. It's just that they bought into some false view. So again, that would be narcissistic jerkiness, sociopathic, clueless, and misled. I actually don't think it's at all at odds with what you said, but I'm just curious to hear your reaction to that.
ERIC: Yeah, I like that. I think all of these may fade into each other. It would be hard to be a really horrible version of a pickup artist without also having some of those other vices to some extent. And the true narcissist is a little bit of a sociopath, and that sort of thing. They have a little bit of that disregard; that's part of [inaudible]
SPENCER: Yeah, I think there's a correlation between some of these.
ERIC: So these are somewhat different flavors of the same thing with different emphases. And I agree that those are an interesting set of emphases, so I like what you're suggesting.
SPENCER: Well, maybe one way to put it is: if someone disregards the perspective of another person, we need an explanation for this, because most people do try to take other people's perspectives into account. So what is the explanation? One explanation is that they just think their perspective is more important. Another is that they're actually completely indifferent to other people's perspectives. A third is that they just can't see other people's perspectives. And a fourth is that they're confused about the nature of other people's perspectives. That's where it's coming from. It's asking, "Why do they disregard people's perspectives?" and trying to point to different causes for that.
ERIC: This is interesting. This is reminding me a little of what you were saying about belief, where I have an inverted perspective relative to yours in some ways. You want to find the inner cause that explains the outward behavior, and I'm kind of skeptical about those kinds of inner-cause explanations. I mean, obviously, you've got brains and psychological states that are doing all kinds of things. But I think those are massively complex and tend not to fit well into our simple theories and categories of causes. So the more helpful way to think about these things is in terms of patterns of action and reaction, rather than "here's the real cause." What's the real cause of this person's neglect of somebody else? I'm not sure it has to be one interior thing versus another, that one of those explanations has to be in there doing the causal work. I think what happens is that we've got patterns of virtues and vices that arise from a complex set of interior processes. Some patterns might fit a little closer with the narcissist pattern you described, and some might fit a little closer with the psychopath pattern, and so forth. But in fact, those patterns would probably be multi-causal and substantially overlapping.
SPENCER: I completely agree that the human mind is very complex and there are going to be tons of edge cases. That being said, I think in a lot of instances these categories are reasonable fits. Sure, there are also lots of cases where someone's just a little narcissistic, and a little sociopathic, and also maybe a bit clueless, and so on. But when it comes to the most damaging people, they usually tend to be purer types, I think. For example, I've tried to learn about a lot of different cults. I've looked into maybe 30 different cults, and it's just amazing to me how similar the personality types of the cult leaders are. They almost always are this narcissistic type, and I would differentiate it from the sociopathic type, because although both can behave similarly, the narcissistic type cares a lot more about people worshiping them and feeding their narcissism. A sociopath is like, "I don't really care. I want to feel good, but I don't really care if people worship me. Why would I want that?" Another example: I've seen a bunch of people who had really bad social skills and would hurt people through this. Some people seem to be born with a much poorer ability to read emotions, and until they eventually learn explicitly how to deal with this, they often bumblingly hurt people by accident.
ERIC: Well, yes, I agree there's a difference between a really extreme, canonical case of a narcissist and a really extreme case of a sociopath, and both of them have aspects of jerkiness. But narcissism is not the same as being a jerk; I don't think it's part of being a jerk that you want to be worshiped, for example. Characteristics of sociopathy include risk-taking and not being very good at long-term planning, and those don't have to be part of jerkitude. Jerkitude in a certain sense is a somewhat simpler phenomenon, and you're talking about somewhat more complex phenomena with those things, I think. Also, with respect to accidentally harming other people, I do want to emphasize that on my characterization of a jerk, you have to be culpable: culpably disregarding or failing to appreciate the perspectives of others. So someone who really just makes completely innocent mistakes, maybe because they're on the autism spectrum or something like that, is not being a jerk. There may be a gray area, where you're a little culpable but not as culpable as someone with better social skills would be, and that will take a little of the sting out of the accusation of jerkitude.
SPENCER: How do you think about cases where someone has good intentions but a false belief? A really extreme example, which goes way beyond jerkiness: someone is taught, because of the cult they were raised in, that women should never be allowed outside, or something like that. On the one hand, you might say, "Well, they have a really horrible belief," and you might think of them as a bad person. But on the other hand, you might say, "Well, they were indoctrinated with this. It's just a false belief. They didn't choose to believe that false thing, and if they stopped believing it, they wouldn't actually want to harm women. They just believe that this is best for women," or something like that.
ERIC: I guess I think there are a couple of different ways that could manifest, and a spectrum of cases between the two. One way it could manifest would be completely sweethearted. You can imagine someone who thinks that women shouldn't be allowed outside, thinking that and defending that view in the sweetest possible way, with complete respect for the people around them. That person would probably disagree with, say, a woman who wanted to go outside, but they wouldn't do it in a jerky way. And you can imagine someone who's a complete asshole about it, maybe even someone who believes it because it's a convenient thing to believe, a way to enact their jerkitude. I wouldn't want to say that the explanation always runs from believing X to doing Y. Sometimes, because you want to do Y, you believe X. Sometimes the people who have the most toxic beliefs hold those beliefs for motivated reasons that are not admirable. So it seems to me you have a spectrum between those two kinds of cases. In the first kind of case, we're not talking about a jerk; we're talking about someone who's just misled and maybe isn't going to inflict that belief on other people in a really toxic way. At the other end, we end up with someone who's a pure jerk, and that's how the jerkitude manifests.
SPENCER: Before we wrap up, I just want to mention something, which is, I find that you're very honest and self-reflective in an interesting way. When people come on a podcast, there's a temptation to make themselves look good. A lot of people want to make a positive impression; there's nothing sinister about that. But I feel that you are less motivated to do that than most people, in a really positive way that I admire. So I'm wondering, does that resonate with you? Do you feel like that's true about yourself?
ERIC: I don't know. I mean, I'd like to make a good impression. But...
SPENCER: Well, it just seems like you're more honest about your own thought processes and behaviors in a way that's very refreshing. You're just trying to give an honest account of yourself. That's my impression.
ERIC: I guess it's part of my instincts as a philosopher to think that things are usually more complex than the simple accounts suggest, and to think that my own simple accounts and my own motivations are probably criticizable and not quite what I think they are. So I feel like I have to admit that. Maybe that's part of what's going on.
SPENCER: Hmm. Do you think that most philosophers that you meet are trying to figure out the truth? Or do you think that they're doing something else?
ERIC: I think for the most part, philosophers do want to figure out the truth. But there's also something fun about playing around with ideas. I think there's a gray area between saying, "Well, this isn't really the truth, but I'm going to defend it because it might be true, and it's kind of interesting and fun to defend," and really doing something that's just a pose. I guess I don't think most philosophers really go 100% for a pose. A lot of times, when it might seem like that, it's that they think there's some truth underneath what they're doing, or at least that they've got a position with more truth and more defensibility in it than might be appreciated, even if maybe it won't ultimately win the day.
SPENCER: There's just an interesting way in which we can sometimes not be clear even with ourselves whether we're playing a game or we're trying to achieve something. Like are we just playing chess or are we trying to figure out the way the universe works? Right?
ERIC: Yeah, right.
SPENCER: And you can slide between those two very, very readily.
ERIC: Yeah, absolutely. I do think that's very well put. I think it's not always clear where the one stops and the other begins, and that the motivations can be some mix between those two things sometimes.
SPENCER: One philosopher who really impressed me was someone who wanted to understand Occam's razor. He started looking into this and then started to realize, "Oh, wait, maybe I don't know enough math to understand Occam's razor," and would start teaching himself math, or "Maybe Occam's razor relates to computer science," so he'd start teaching himself computer science. And he's publishing papers that I'm pretty sure his colleagues don't understand, because they're now using math that is not normally taught to philosophers, and so on. It really struck me: that's what it feels like, to me, when someone's trying to figure out the truth. They don't just do exactly what everyone else is doing. The truth leads you every which way and causes you to do things that seem weird, not the thing that everyone does. Chasing the truth is usually a very winding and complicated path, or something like that.
ERIC: Yeah, that might be it. That's a cool story. I like that idea. Sometimes, at the opposite end of that, in philosophy there are some very well-structured debates; the debate about free will is an example. There are various positions that are defined relative to each other. Then you've got various standard objections to those positions, and they have names. Then you have various rebuttals to the standard objections, which themselves have names. There are people who get really deep into the back and forth of that, who I also think are really interested in the truth. But what they're doing is almost the opposite of what you described this other person as doing. They're diving deep into a well-structured area in this really nerdy way. There's a nerdiness that's maybe common between the two things, and I admire nerdiness.
SPENCER: I see what you're saying: mapping out every position you could have and building a mental decision tree for the arguments you could use in each case, that sort of thing.
ERIC: Exactly. Exactly. Here's the giant decision tree, and here's where we philosophers are on it, with the arguments pro and con. I think of a nerd as someone who is interested in something, some intellectual question, to a degree that is on the borderline of being unreasonable.
SPENCER: It's a great definition. [laughs] We will use that as your quote for the podcast.
ERIC: I think it's wonderful that the world is full of people who are more interested in trains from the 19th century than it would be reasonable to be, and they know so much about that stuff. Or this person who's so interested in Occam's razor that he learns all of these things just to figure out more about it. Or this person who really is interested in the tiny little details of the back and forth of these debates about free will. There's something really great about a world that can support this kind of nerdiness. And that kind of nerdiness does, I think, correlate with being interested in the truth about the things you get so nerdy about.
SPENCER: So to wrap up, I want to ask you a really dumb question, the sort of question that maybe a six-year-old would ask, but I feel like maybe there's something important about it: why can't philosophers get in a room and come to a conclusion about these topics? Or at least come to a conclusion about why they can't come to a conclusion, a meta-conclusion? It's sort of a dumb question because I think everyone knows this wouldn't work. But on the other hand, you can imagine a bunch of mathematicians coming into a room and pretty quickly reaching agreement on their disagreements most of the time, or at least agreeing about why they disagree. It's hard to imagine that working in philosophy, and I'm wondering, why is that? Why can't philosophers go into a room and come out saying, "Oh, yeah, here's why we don't agree on what's true about ethics, or whether ethics is even a reasonable thing to talk about, or whether there's objective moral truth, or what consciousness is, or whether there's a hard problem of consciousness," and so on?
ERIC: It is really hard. What I've found is that if I sit down for three hours with a philosopher who disagrees with me about stuff, we can often figure out the shape of our disagreement: what the real issue is on which we disagree, and where we in fact agree. It might not be obvious at the outset what the root of the disagreement is and what the structure is of what we really agree and disagree on.
SPENCER: That sounds promising.
ERIC: So that's something, but that's not yet reaching total agreement; that's discovering the structure of a disagreement. Then imagine that doing that with multiple people at once is going to add combinatorial complexity to the task. I think one of the things that is truer or more extreme in philosophy than in other disciplines (I don't want to say it simply makes philosophy different; that's too simple) is that one question opens up into four questions, which then open up into sixteen, tied together in really complex ways, and it's difficult to really see the heart of one thing without finding yourself tangled up in many other things.
SPENCER: That's really well said. So, you're trying to figure out why you disagree on a particular topic, let's say on ethics. And it turns out, there's all these different branching points and if even one of them you disagree on then that has all these implications down the road. So there actually might be hundreds of positions you could have or something like that.
ERIC: Right. Just thinking back on our own conversation: we were interested in some ethical questions and questions about moral motivation. That brought us into thinking about the nature of belief. Then it turned out that we've got different views, maybe, about the nature of belief, or even about how you answer the question of what a belief is, what the metaphilosophical process is by which you figure out what a belief is. At the same time, that got us into questions in philosophy of mind about the extent to which behaviors come from internal causes versus the extent to which it's better to think in terms of patterns of outward behavior. So even just in our conversation now, we opened up into issues in metaphilosophy, metaphysics, and philosophy of mind in the course of thinking about ethics.
SPENCER: Eric, thanks so much for coming on. It's really fun.
ERIC: Yeah, it's fun chatting. Thanks for having me!
JOSH: Do you consider your current activities to be the most impactful thing you could be doing?
SPENCER: The way that I think about it is I'm looking for an intersection of a bunch of things in order to try to maximize the extent to which my intrinsic values are expressed. So I have intrinsic values around the world being better and people not suffering and people being happy. I have intrinsic values around [inaudible] truth and fighting falsehoods. I have intrinsic values around my own happiness and the happiness of my loved ones and friends. Those are the things I'm trying to create in the world and trying to maximize insofar as I can. But in terms of my career, I'm looking for an intersection of different things at once. One of those is impact; I care a great deal about impact. Another is personal fit: "Am I the right person to do this really well? Do I think I have a competitive advantage at doing this thing?" And a third is, "Will I be happy doing the thing, or will I be miserable doing it?" So if I think something could have a great impact, and I think I'm a really good fit for doing it because I have somewhat unique abilities in that area, and I think I'd be happy doing it, then that is something I'm more likely to work on. So it's not just impact that I'm focused on in my life, although it does play a really major role in how I decide what to do.
JOSH: When and how did improving the world become your focus?
SPENCER: Since I was a child, I've felt that the most important thing for me to do was to try to make the world better. When I was quite young, early high school, maybe middle school, I imagined that I would be an inventor of some sort. I think I had some idea that I would be inventing machines, steampunk machines, to help the world. Then as I got older, when I first went to college, I thought, "Maybe I'll work in robotics; maybe that's how to improve the world." I got a little older and got really into math, and then machine learning and computer science, and I started thinking, "Oh no, maybe actually software is a better way to go." So I've always had this vision that I want to use technology to improve the world; that's my main life's mission and goal, and what that means has just evolved over the years.