The Scout Mindset with Julia Galef, Part One
Summary
In this episode, host Jonathan Cutrell interviews Julia Galef, author of The Scout Mindset. Julia introduces the core concept of her book: the distinction between ‘soldier mindset,’ where we defend our pre-existing beliefs, and ‘scout mindset,’ where we aim to see reality as accurately as possible. She explains that this shift in motivation is key to intellectual honesty and better decision-making.
The conversation explores how belief updating often happens through small, incremental shifts rather than dramatic reversals. Julia uses the analogy of scientific paradigm shifts, as described by Thomas Kuhn, to illustrate how anomalies accumulate before a new, more accurate worldview emerges. They discuss the story of Jerry Taylor, a former climate change skeptic who gradually changed his mind after encountering evidence that challenged his position.
Julia and Jonathan delve into the practical challenges of overcoming cognitive biases, such as confirmation bias and motivated reasoning. They note that even experts like Daniel Kahneman are susceptible to these biases, and that mere awareness is not enough to prevent them. Changing these ingrained thought patterns requires sustained practice and effort, similar to cognitive behavioral therapy.
The episode also touches on Julia’s long-running podcast, Rationally Speaking, where she engages in collaborative debates with thinkers to uncover the root of disagreements. The discussion highlights the importance of intellectual curiosity and the willingness to exist in a state of uncertainty while seeking truth.
Recommendations
Books
- The Scout Mindset — Julia Galef’s book, which introduces the concepts of soldier mindset (defending beliefs) and scout mindset (seeking accurate understanding).
- The Structure of Scientific Revolutions — Thomas Kuhn’s work on paradigm shifts in science, referenced as a metaphor for how individuals update their worldviews.
People
- Daniel Kahneman — Nobel laureate and psychologist known for his work on cognitive biases and judgment; discussed in the context of whether biases can be overcome.
- Tom Gilovich — Cognitive scientist cited for his definition of motivated reasoning: ‘can I accept this?’ vs. ‘must I accept this?’.
- Jerry Taylor — Former climate change skeptic who changed his mind after critically examining evidence, now a climate change activist.
- Rob Reich — Political science professor at Stanford who argued against large-scale philanthropy; discussed as an example of a value-based disagreement on Julia’s podcast.
Podcasts
- Rationally Speaking — Julia Galef’s long-running podcast featuring interviews and debates with thinkers on science, philosophy, and rationality.
Topic Timeline
- 00:02:02 — Introduction to Julia Galef and her podcast Rationally Speaking — Julia Galef is welcomed to the show. She describes her podcast, Rationally Speaking, which she has hosted since 2010. The podcast features interviews with scientists, authors, and philosophers, aiming for collaborative conversations to explore disagreements and seek truth together, rather than engaging in adversarial debates.
- 00:07:56 — The nature of belief updates and paradigm shifts — Julia discusses how changing one’s mind is often a process of small, incremental updates rather than sudden 180-degree flips. She compares this to scientific paradigm shifts, where anomalies accumulate before a new framework makes sense of the data. The goal is to notice and pay attention to these anomalies in our own thinking.
- 00:16:27 — Case study: Jerry Taylor’s shift on climate change — Julia shares the story of Jerry Taylor, a former professional climate change skeptic. After a debate opponent pointed out a misrepresentation in his argument, Taylor began checking his sources more carefully. Encountering repeated poor evidence gradually shifted his view, leading him to become a climate change activist—a rare example of a complete professional identity shift.
- 00:22:41 — Introducing the Scout Mindset vs. Soldier Mindset — Julia explains the core thesis of her book. Soldier mindset is the motivation to defend pre-existing beliefs. Scout mindset is the motivation to see things as they are, to form an accurate map of reality. The book explores why we default to soldier mindset and how we can cultivate scout mindset for better judgment and intellectual honesty.
- 00:27:27 — Asymmetric standards of evidence and motivated reasoning — Julia references cognitive scientist Tom Gilovich’s definition of motivated reasoning: for things we want to believe, we ask ‘can I accept this?’; for things we don’t, we ask ‘must I accept this?’. This creates an asymmetric standard of evidence. She shares a personal example from writing her book, where she realized she was applying more skepticism to studies contradicting her thesis.
- 00:33:04 — Overcoming cognitive biases and Kahneman’s perspective — The discussion turns to Daniel Kahneman’s view that simply knowing about biases doesn’t prevent them. Julia disagrees with Kahneman’s pessimism, arguing that most studies on improving critical thinking are short-term and don’t reflect the long-term practice needed. She compares changing thinking habits to cognitive behavioral therapy, which requires sustained effort.
Episode Info
- Podcast: Developer Tea
- Author: Jonathan Cutrell
- Category: Technology, Business, Careers, Society & Culture
- Published: 2021-04-19T07:00:00Z
- Duration: 00:36:37
References
- URL PocketCasts: https://pocketcasts.com/podcast/developer-tea/cbe9b6c0-7da4-0132-e6ef-5f4c86fd3263/the-scout-mindset-with-julia-galef-part-one/36ef5292-b7ef-490c-85c7-e65ebe3810f1
- Episode UUID: 36ef5292-b7ef-490c-85c7-e65ebe3810f1
Podcast Info
- Name: Developer Tea
- Type: episodic
- Site: http://www.developertea.com
- UUID: cbe9b6c0-7da4-0132-e6ef-5f4c86fd3263
Transcript
[00:00:00] I’m very excited to be joined by today’s guest, Julia Galef.
[00:00:09] Julia is the author of The Scout Mindset, which is out pretty much anywhere that
[00:00:15] you buy books.
[00:00:16] Of course, you can find it on Amazon and local booksellers close to you.
[00:00:22] And Julia is also the host of a long running podcast called Rationally Speaking.
[00:00:27] I’m excited to share this interview with you because Julia is such a clear thinker and
[00:00:36] she has such a unique talent in discussions of challenging what you have to say while
[00:00:43] still inviting you to say more.
[00:00:47] In this episode, we talk about quite a few things.
[00:00:50] For example, updating your beliefs, scientific paradigm shifts.
[00:00:54] We have a quick discussion about Danny Kahneman, who could possibly guess that we would talk
[00:00:59] about Danny Kahneman on this episode, but we did indeed.
[00:01:05] Julia is one of the many authors that we’ve had on this show.
[00:01:11] The reason that we have people like Julia on Developer Tea is because, as you know,
[00:01:17] the big goal of this show is to help driven developers find clarity, perspective, purpose
[00:01:21] in their careers.
[00:01:23] One of the most important things that you can do in finding clarity and finding perspective
[00:01:31] and ultimately those things leading to help you find purpose is to understand how to think
[00:01:39] better, how to build a better thinking machine.
[00:01:42] So much of our content is pointed towards that.
[00:01:45] And the Scout Mindset is yet another kind of way of framing this idea, this thinking
[00:01:52] machine that we have.
[00:01:55] And I’m going to let Julia describe it, but let’s get straight into this interview with
[00:01:59] Julia Galef.
[00:02:02] Julia, welcome to Developer Tea.
[00:02:07] Thank you for coming.
[00:02:08] Thank you.
[00:02:09] Oh, my pleasure.
[00:02:10] Good to be here.
[00:02:11] So we were just talking before the show, and typically when I have a guest on that has
[00:02:16] their own podcast, I tend to be the veteran in the room just because I’ve been doing
[00:02:22] podcasting since, you know, for six years or something, but you totally eclipse me.
[00:02:27] You’ve been doing podcasting for how long?
[00:02:29] Since the beginning of 2010.
[00:02:32] And if my age that, you know, if my podcasting age doesn’t show me just saying doing podcasting,
[00:02:38] I guess, is you’ve been the host of a podcast, I guess, since 2010.
[00:02:44] That podcast being Rationally Speaking, can you tell us just a little bit about that podcast
[00:02:49] and then we’ll talk about some other stuff that you do?
[00:02:53] Yeah.
[00:02:54] So Rationally Speaking is I originally was co-hosting it with a philosopher, professor
[00:02:58] of philosophy named Massimo Pigliucci.
[00:03:01] So we co-founded and co-hosted it together for the first five years or so.
[00:03:06] And then since about 2015, I’ve been hosting it solo.
[00:03:11] And it’s about every two weeks.
[00:03:14] And the each episode is mostly an interview, or for the most part, each episode is an interview
[00:03:19] with a thinker of some kind, a scientist or an author or a philosopher.
[00:03:26] And it’s kind of a conversational episode where I either am trying to explore some topic
[00:03:32] that I’m really curious about or trying to understand, like how does consciousness work
[00:03:36] or what is willpower and why do some people seem to have more of it than others?
[00:03:41] Or the guest comes on with a thesis, you know, they wrote a book making an argument that
[00:03:48] I think is interesting and worth diving into and that I don’t fully agree with.
[00:03:52] And so we have a kind of friendly debate about that.
[00:03:55] And I’m really aiming to focus on guests and issues where we can kind of collaboratively
[00:04:04] figure a thing out together and kind of explore exactly where do our disagreements lie and
[00:04:09] why do we disagree?
[00:04:11] So basically to have the debate feel like more of a collaborative effort to seek truth
[00:04:16] together and less like a battle.
[00:04:18] Right.
[00:04:19] To see who’s going to win this particular discussion that has no way of proving who
[00:04:25] is right, necessarily.
[00:04:26] Yeah, I mean, well, certainly I think sometimes I have evidence on my side to some degree.
[00:04:33] And other times I just have intuitions.
[00:04:36] But yeah, there’s a limit to how much you can do in a podcast.
[00:04:39] So I aim more to just try to get to the bottom of what I call our cruxes of disagreement.
[00:04:46] I don’t know, for example, I had a guest on a couple of years ago who wrote a book that
[00:04:51] I liked but substantially disagreed with.
[00:04:54] It was Rob Reich, who’s a political science professor at Stanford, and he wrote a book.
[00:04:59] I’m blanking on the title now, but it was essentially arguing against large scale philanthropy.
[00:05:05] So he was criticizing billionaires who give away large sums of money to try to do good
[00:05:11] in the world.
[00:05:13] And I’ve worked with and am a big fan of several organizations that give away billionaires’
[00:05:19] money to try to do good in the world.
[00:05:22] And so we had, I think, a really interesting discussion about this and kind of got to the
[00:05:27] bottom of one of our cruxes, which was just that we have different value systems, essentially.
[00:05:34] My value system, my approach to ethics is really pretty consequentialist in the sense
[00:05:39] that I want things to happen that cause good outcomes in the world.
[00:05:44] And I want to prevent actions that have bad outcomes in the world, which maybe sounds
[00:05:51] a little bit obvious when I say it.
[00:05:53] But a lot of people’s moral systems include some consequentialism, but also include other
[00:06:00] things like deontology, like some things are just right or wrong, regardless of their consequences.
[00:06:06] And I think I don’t want to misrepresent Rob Reich’s argument here, but my memory of my
[00:06:13] impression of his argument was that, you know, it’s just kind of wrong in some ways for billionaires
[00:06:19] to get status and praise for giving away money.
[00:06:26] Even if their money is actually doing good, it’s still wrong for them to get that status
[00:06:30] and praise.
[00:06:32] He was more focused on who deserves praise in society and less on what is the consequence
[00:06:40] of them giving away their money.
[00:06:41] Anyway, that was a long explanation, but I was just trying to describe the goal of these
[00:06:46] conversations is to kind of get down to the root of why do we have different views on
[00:06:50] this topic?
[00:06:51] And often it’s a value disagreement.
[00:06:53] Sometimes it’s just about we were using words differently.
[00:06:57] Sometimes we just have different predictions about how the world works, and that’s causing
[00:07:00] our disagreement.
[00:07:02] But I find it really interesting and valuable to get down to those roots, even if we can’t
[00:07:05] resolve the disagreement in one hour.
[00:07:07] Yeah, I mean, I can only imagine that these kinds of conversations have probably had a
[00:07:15] pretty major impact on the way you see the world.
[00:07:19] And I’m wondering, do you have a particular memory or a moment, and it can be on the podcast
[00:07:25] if not, where you felt like you had, the reason I’m asking this question, I think this is
[00:07:33] a fairly rare experience.
[00:07:34] I think this experience is pretty important when it does happen, where you have a sudden
[00:07:41] shift or kind of a brightening moment, where you realize something that you didn’t realize
[00:07:46] before, maybe it was counterintuitive to you, but the lights kind of switched on about something.
[00:07:53] Do you remember a particular moment like that?
[00:07:56] Oh, like a moment when I kind of had insight into something that had been more opaque to
[00:08:04] me before.
[00:08:05] Yeah, I think some people might see this as a change in mind, but I think it’s actually
[00:08:10] more like a shift in or, you know, it’s a learning moment.
[00:08:16] It’s not necessarily that you were convinced for different reasons, it’s that you learned
[00:08:20] something new and you updated your belief about something.
[00:08:25] Yeah, well, that’s I love the way you put that, as you might have expected I would,
[00:08:29] because that that is also how I encourage people to think about changing their mind or
[00:08:34] revising their beliefs as not it doesn’t have to be.
[00:08:39] And most of the time, it’s not going to be a 180 degree flip where you originally believed X
[00:08:45] and now you believe the opposite of X.
[00:08:48] A much more realistic thing to shoot for is these subtle incremental shifts in your way
[00:08:56] of thinking, you know, where someone brings up a hypothesis that or someone makes a claim
[00:09:03] that you don’t necessarily think is true, so you’re not fully convinced, but it’s something
[00:09:07] that hadn’t occurred to you before.
[00:09:08] And so now, for the first time, you’re considering this as a possibility, a possible hypothesis
[00:09:13] that you hadn’t been paying attention to before.
[00:09:15] Or, you know, someone points out a potential caveat or an exception to something you believe
[00:09:23] and you hadn’t noticed that exception before.
[00:09:25] And so it doesn’t mean you don’t hold your belief anymore.
[00:09:27] But, you know, now you’re paying attention to how it may not be true in all cases.
[00:09:31] Maybe it’s only true in most cases.
[00:09:32] And so that’s also a valuable shift.
[00:09:35] So I think these these kinds of shifts are, as I often call them, updates to your thinking
[00:09:41] are really valuable.
[00:09:43] And over time, they will often kind of accumulate to a significant kind of paradigm shift in
[00:09:50] the way you see something.
[00:09:52] But you shouldn’t be only looking for paradigm shifts because they’re really made out of
[00:09:57] these little moments.
[00:09:58] Does that resonate with how you’re seeing this?
[00:10:01] Yes, absolutely.
[00:10:03] Just to kind of add one more layer on it, the model that I use for this that I feel
[00:10:07] like is it’s fairly well understood by most people is seeing Newtonian physics as kind
[00:10:14] of a universal truth.
[00:10:16] And then suddenly we find new evidence to layer on top of it that it doesn’t always
[00:10:22] work exactly.
[00:10:23] You know, it’s not continuous all the way up to infinity.
[00:10:27] There are other things going on.
[00:10:29] So Newtonian physics is going to work in some scenarios and in others not so much.
[00:10:34] Yes.
[00:10:35] And this is something that people can click with it.
[00:10:36] Oh, yeah, that’s right.
[00:10:37] I can see how, you know, once you learn for all practical purposes, when I was, you know,
[00:10:43] when I only knew about Newtonian physics, there was no reason for me to have had any
[00:10:48] kind of shift or updating my belief there.
[00:10:52] But then when there is evidence presented, it’s not necessarily that hard to say, well,
[00:10:57] this isn’t the whole picture, right?
[00:10:59] You’re just seeing part of the picture.
[00:11:02] That’s a great analogy.
[00:11:06] And it’s actually, it made me realize that I should clarify the phrase paradigm shift
[00:11:11] because I think the phrase has become kind of a buzzword in business in the last couple
[00:11:15] decades referring to, you know, any shift in the way you’re doing things, allegedly
[00:11:20] a large shift, but oftentimes a small shift that’s being spun as a large shift.
[00:11:25] But really what I meant to refer to by paradigm shift is the way science changes its mind,
[00:11:30] which follows the process you were kind of describing.
[00:11:33] Oh, yeah, like a meta-paradigm shift almost.
[00:11:35] Well, I don’t know if I would call it a meta-paradigm shift, maybe in some cases a meta-paradigm
[00:11:39] shift.
[00:11:40] But yeah, it’s a phrase that was coined by a philosopher of science named Thomas Kuhn
[00:11:46] in The Structure of Scientific Revolutions.
[00:11:48] And he was describing this process that science follows when it kind of collectively changes
[00:11:52] its mind about something like Newtonian physics or like, well, an earlier paradigm shift was
[00:11:58] whether the sun revolves around the earth or the earth revolves around the sun, where
[00:12:02] there’s this reigning paradigm, like the sun revolves around the earth.
[00:12:07] And then gradually some people start to notice anomalies, like observations that don’t make
[00:12:13] sense assuming the reigning paradigm is true.
[00:12:17] And so one such anomaly in that case was the observation that the path that Mars traced
[00:12:23] across the night sky had kind of a weird kink in it, a retrograde movement where Mars
[00:12:28] seemed to move backwards and then reverse course again and start moving forwards.
[00:12:32] And this did not fit with the reigning paradigm that the sun and all the planets trace circular
[00:12:38] orbits around the earth.
[00:12:41] And so at first what happens is these observations are kind of dismissed or ignored because everyone’s
[00:12:46] so confident that the reigning paradigm is true.
[00:12:49] And then gradually they start to accumulate and science enters into kind of a state of
[00:12:53] confusion where people are no longer sure if the reigning paradigm is true, but they
[00:12:58] don’t have anything great to replace it with.
[00:13:01] And they try to revise the reigning paradigm by saying, okay, well, maybe, yes, planets
[00:13:05] revolve around the earth, but maybe they don’t follow perfect circles.
[00:13:08] Maybe there are these little epicycles that they follow that would produce these weird
[00:13:12] paths across the night sky.
[00:13:14] And then eventually someone comes up with a new paradigm that makes all of your data
[00:13:18] make sense again, which in this case is these planets don’t revolve around the earth.
[00:13:24] The planets revolve around the sun.
[00:13:26] And that produces the paths that we see across the night sky.
[00:13:30] So I like this metaphor in thinking about just how I see the world.
[00:13:36] I’ll have a reigning paradigm having to do with how I view social interactions or how
[00:13:45] I think about what is good or bad and gradually little anomalies will accumulate.
[00:13:52] And I think it’s important to notice and pay attention to those anomalies instead of just
[00:13:55] ignoring them.
[00:13:57] And sometimes I just end up becoming more confused, but other times, you know, those
[00:14:02] anomalies accumulate and eventually I’m able to kind of make better sense of them in a
[00:14:07] new paradigm than I did in my old one.
[00:14:10] So yeah.
[00:14:11] It was a long-winded way of agreeing with you, but I really liked your example.
[00:14:15] That makes total sense to me.
[00:14:17] And I think I’ve experienced a lot of that confusion side of things for myself, especially
[00:14:26] if I don’t invest a lot of time in trying to clarify a position that I have.
[00:14:35] The confusion mounts as I let my brain naturally process through things rather than saying,
[00:14:43] okay, intentionally sit down and try to figure out where do I stand on this particular thing.
[00:14:48] If I instead let stimuli come through, whatever you want to call it, if I experience the world
[00:14:55] or if I read a book, whatever it is that I’m consuming and then I move on with my day,
[00:15:03] sometimes somebody will ask me what I think about something and it’s relevant to whatever
[00:15:06] it is that I was just consuming and I have no idea.
[00:15:10] I do not know how to answer the question.
[00:15:12] It’s like my brain has computed both sides and I can see both sides, right, three sides
[00:15:17] or 50 sides.
[00:15:18] I can see the validity in all of them and I don’t attach myself to any of them.
[00:15:24] And that’s really tough.
[00:15:25] That can be really hard for me, actually.
[00:15:26] Have you ever experienced this?
[00:15:28] Yeah.
[00:15:30] It is tough because allowing those anomalies in, as you nicely put it, it kind of requires
[00:15:39] you to hold your views in this superposition where something seems true to you but simultaneously
[00:15:47] you recognize that it doesn’t fit with other data you have and you have to kind of resist
[00:15:52] the temptation to collapse the superposition and force your, you know, shoehorn the information
[00:15:59] into your pre-existing worldview which it may not actually naturally fit into and just
[00:16:04] kind of exist in that state of not knowing, not being sure how to look at things and just
[00:16:11] being okay with that for a while.
[00:16:13] And you can have, you know, you can still say, well, I’m leaning towards this particular
[00:16:18] view.
[00:16:20] There’s this, the story I love that I actually wrote about in my book about a, his name is
[00:16:27] Jerry Taylor and he was an anti-climate change activist.
[00:16:33] So he worked at this think tank, a libertarian think tank called the Cato Institute and his
[00:16:37] main job was to go on talk shows and downplay the threat of climate change and basically
[00:16:43] say, you know, this has really been overblown and we, you know, the climate change doomsayers
[00:16:50] aren’t giving you the full story and actually the earth isn’t, we don’t have good evidence
[00:16:54] the earth is warming or that it’s due to human activity.
[00:16:58] So this was his main job and then his first anomaly came after one of his talk show appearances
[00:17:04] where the person he was debating, who’s a climate change activist named Joe Romm, said
[00:17:11] to him backstage, you know, something you said seemed wrong to me.
[00:17:14] You, you claimed that the pattern of earth warming has not lived up to the predictions
[00:17:20] made by like in the original testimony given to Congress by people who are worried about
[00:17:25] climate change. But I think that’s not true.
[00:17:26] If you go back and read the testimony, you’ve misremembered or misrepresented what the prediction
[00:17:31] actually was. And so Jerry Taylor kind of expecting himself to be proven right, went
[00:17:38] back to the, to look at the testimony and realized, oh, actually I did misrepresent
[00:17:42] it. That’s interesting. Well, but this, this point was given to me by a scientist who I
[00:17:47] respect who’s skeptical of climate change. And so he went back to talk to the scientist
[00:17:51] and the scientists kind of hemmed and hawed and didn’t have a really good answer for him
[00:17:55] for why he had misrepresented this.
[00:17:58] And so this kind of stymied Jerry Taylor because here was this scientist he had trusted and
[00:18:03] respected who had kind of given him misleading information. And, and so that didn’t on its
[00:18:09] own cause him to suddenly be worried about climate change, but it did make him a little
[00:18:14] bit more confused or concerned about the quality of the evidence that he was relying on. And
[00:18:19] so he just started checking more carefully every time he would make a claim or, or, or
[00:18:25] see a study cited against climate change, he would follow up on it. And he was often
[00:18:30] kind of disappointed in the quality of the evidence that he found. And so this just gradually
[00:18:34] over time started shifting him into being more open to the possibility that maybe actually
[00:18:38] there is something to this climate change issue after all. And I could, I could tell
[00:18:43] the full story if you want, but, but the summary, since I’ve already been monologuing for a
[00:18:47] while is that he eventually did change his mind that the final kind of paradigm shift
[00:18:53] happened and now he’s, he quit the Cato Institute and now he’s a professional climate change
[00:18:59] activist trying to convince people that climate change is actually a big deal and that we
[00:19:03] should be concerned about it. And I think he’s the only professional climate change
[00:19:08] skeptic who actually switched sides.
[00:19:10] Yeah. Well, and it makes sense. There are very few people who certainly deep into their
[00:19:17] professional lives change something as contentious in terms of cultural belief.
[00:19:25] Yeah. And, and central to their kind of identity, certainly their professional identity,
[00:19:30] that I thought that was also really impressive. It wasn’t just a random side belief he had.
[00:19:36] We’ll get right back to our discussion with Julia Galef right after we talk about today’s
[00:19:40] sponsor, LaunchDarkly. LaunchDarkly is today’s leading feature management platform that will
[00:19:52] empower your team to deliver safely and control software through feature flags by separating
[00:19:58] code deployments from feature releases. You can deploy faster, reduce risk and rest easy.
[00:20:04] Whether you’re an established enterprise like Intuit or a small or medium business like
[00:20:10] Glowforge, thousands of companies of all sizes rely on LaunchDarkly right now to control
[00:20:15] their entire feature lifecycle and avoid anxiety fueled sleepless nights. And you’ve
[00:20:21] probably had some of these, these anxiety fueled sleepless nights on a release day.
[00:20:28] Even if you don’t do release days on Fridays, if you do release days on Wednesdays, sometimes
[00:20:33] we have these bugs that don’t pop up until two days later. And so in the middle of your
[00:20:37] weekend, here you are debugging your release or trying to roll it back manually or something
[00:20:44] like that. This is a nightmare. This is a nightmare. LaunchDarkly can help you avoid
[00:20:51] this. They can make releases snoozefests. You can start deploying more frequently and
[00:20:56] securely with less stress. They also have some in kind of in progress features. You
[00:21:04] may, for example, want to automatically roll back. This is something that I’ve talked about
[00:21:09] with them. You want to roll back some of your, some of the work that you’ve done automatically
[00:21:14] based on some kind of error, some specific kind of error. You can do that. There’s integrations
[00:21:20] with some of the tools that you’re already using. Go and check it out at LaunchDarkly.com.
[00:21:28] One example of somebody who’s using LaunchDarkly, IBM. IBM went from deploying twice a week.
[00:21:35] IBM is huge. They went from deploying just twice a week to over a hundred times a day.
[00:21:41] That’s a lot of deployments and they must have a high level of confidence to be able
[00:21:46] to do this. We’ve been able to roll out new features at a pace that would have been unheard
[00:21:50] of a couple of years ago, said IBM’s Kubernetes Delivery Lead, Michael McKay. Once again,
[00:21:56] head over to LaunchDarkly.com to get started today. Thank you again to LaunchDarkly for
[00:22:00] sponsoring today’s episode of Developer Tea.
[00:22:09] Let’s talk about your book for a second. Great. Since we haven’t really even introduced the
[00:22:14] book. So you had a book and the book has come out, correct? It is published. It’s actually
[00:22:19] coming out Tuesday, April 13th. It will be published very soon. Tomorrow it comes out
[00:22:28] as we’re recording this. But by the time this is out, the book will be out. Can you just give us
[00:22:37] a 50,000-foot view and then we’ll get into some of the details of what the book is about?
[00:22:41] Sure. It’s called The Scout Mindset and that is my term for the motivation to see things as they
[00:22:48] are and not as you wish they were. So basically to try to be intellectually honest and objective
[00:22:55] and just to be curious about what’s actually true about a given question or situation.
[00:23:02] The metaphor is it’s part of the framing metaphor of the book in which I say that
[00:23:06] humans are very often by default in what I call soldier mindset where our motivation is to defend
[00:23:11] our pre-existing beliefs or defend things we want to believe against any evidence that might
[00:23:16] threaten them. Scout mindset is an alternative to that because the scout’s role is not to go out
[00:23:24] and attack or defend. It’s to go out and see what’s really there to try to form as accurate a map of
[00:23:31] the territory or of the situation as possible. So the book is about why are we so often in
[00:23:38] soldier mindset as our default and how can we shift towards scout mindset instead and why is
[00:23:45] that a thing we should want to do? As I hear you describe this my brain immediately goes to
[00:23:52] a lot of the kind of canonical literature on the various biases that are related to this
[00:24:00] and I’m sure that you talk about these in the book things like confirmation bias or
[00:24:08] the idea that we’re seeking out information and it’s interesting I noticed myself doing this and
[00:24:13] it’s in subtle ways I think this is the important thing that a lot of people miss
[00:24:19] and I’d like to hear your take on this but subtle ways that I seek information that confirms what
[00:24:24] I’m saying and I’ve noticed this when I google the question that I have if I believe the positive
[00:24:31] side I google the positive side right in other words like if I sometimes I’ll google I’ll give
[00:24:39] a really benign example I’ve been doing power lifting for a couple of months
[00:24:44] oh nice and if I’ve already bought let’s say a particular type of protein
[00:24:48] I’ll google the benefits of that protein rather than googling the downsides to it right right
[00:24:55] it’s it’s uh there I want to make you know and the I guess the um the the what is it the purist
[00:25:05] side of me is saying well I’m learning about this I’m making sure right confirming that I’d
[00:25:11] made a good decision here right yeah go on oh I was just gonna say I’m kind of tricking myself
[00:25:17] into believing that I’m actually seeking out the truth I’m seeking information but because
[00:25:23] I’ve kind of subtly sometimes subtly to the point that I can’t even detect it adjusted
[00:25:31] the search and and when I say the search I don’t just mean on google I mean kind of the
[00:25:37] the search for knowledge on this particular subject right now I’m I’m served those things
[00:25:44] but also kind of making this problem even worse is that google has learned that I like this thing
[00:25:52] so google’s gonna continuously give me the same you know things that they think is going to play
[00:25:57] Have you ever tried opening up an incognito window, just to see if your results are different?
[00:26:02] Oh, absolutely, occasionally. It's so interesting. It's scary, though, right? It's the same
[00:26:07] feeling of, well, I don't want somebody to tell me this, but I don't want to have to think too
[00:26:12] hard either. It would be one thing if I was seeking out this information, and when I say "I," by
[00:26:19] the way, I'm talking about the lazy-brain me, not the cognitive me. It would be one thing if I
[00:26:26] had no information at all, if I was starting from ground zero; then of course, let's weigh both
[00:26:31] sides of this. But I've already spent my money. I've already bought into this. I've taken it, I
[00:26:35] just took it ten minutes ago, and I don't want to feel bad about that. I think that drives so
[00:26:41] much of our behavior, and I just love this concept as the basis for this book. Can you share a
[00:26:46] little bit more? I love talking about things that are counterintuitive or that otherwise were
[00:26:53] shifting for you. What was something that you learned in the process of writing this book?
[00:26:57] Oh, that's a great question, and I'm going to get
[00:27:02] to that in just a minute. I just wanted to applaud your description of soldier mindset in
[00:27:08] yourself, because you said it really perfectly, and it reminded me of the best definition of
[00:27:15] what I call soldier mindset, and what cognitive scientists often call directionally motivated
[00:27:21] reasoning. I thought you might appreciate this definition. It comes from a cognitive scientist
[00:27:27] named Tom Gilovich, and he says that when there's something that we want to believe, we view it
[00:27:33] through the lens of "can I accept this?", but when there's something we don't want to believe,
[00:27:39] we view it through the lens of "must I accept this?" And so our standards of evidence, or the
[00:27:45] amount of time we'll spend investigating a claim and looking for holes in it, those are very
[00:27:51] different, very asymmetric, depending on whether it's something that we want to believe or not.
[00:27:55] But what that ends up feeling like is, even when there's
[00:28:02] something you don't want to believe, and you're asking yourself "must I believe this?" and
[00:28:05] looking at it critically, looking for flaws in it, it feels like you're being a rigorous
[00:28:11] critical thinker. And you are, in a sense; it's just that you're not applying that same standard
[00:28:15] of rigorous critical thinking to the things you want to believe. So the end result is that you
[00:28:19] end up having in your head a bunch of beliefs that are disproportionately things you wanted to
[00:28:25] believe, because you're applying this asymmetric standard of rigor or evidence. Through the
[00:28:30] same mechanical actions, too, which I think exacerbates the problem.
[00:28:36] Right. I'm very much reading white papers on both sides of the equation here, looking for good
[00:28:43] evidence. You're exactly right: on the one hand I'm looking for the flaws in the evidence; on
[00:28:50] the other hand I'm looking for something that I can accept so I can close my tab and go back to
[00:28:54] whatever I was doing before. Right, yeah. And in fact, I noticed myself doing this as I was
[00:29:00] writing the book.
[00:29:07] I came across a study that purported to show that scout mindset made you less successful in
[00:29:13] life. That's my summary of it, but that was the implication of the study. And so of course I
[00:29:18] read this headline and my eyes immediately narrowed, and I was like, all right, let's check out
[00:29:24] their methodology and see if this study actually stands up. And I read the methodology section,
[00:29:29] and actually it was a pretty poorly done study; I don't think anyone should update on it. But
[00:29:33] then I asked myself, well, suppose I had come across the same study with the same methodology,
[00:29:40] but it had found the opposite conclusion. Suppose I found a study showing that scout mindset
[00:29:45] makes you successful in life. What would my reaction have been in that world? And I realized...
[00:29:50] Oh, into the references. Yeah.
[00:29:57] Exactly. I was like, oh, I would set it aside and be like, okay, I'm going to spend three pages
[00:30:02] talking about this study in my book. And so the thought experiment made me realize I kind of
[00:30:07] needed to up my game for how skeptical or critical I was being of the evidence that supported
[00:30:12] my thesis. And so I went back through a bunch of studies I had bookmarked to talk about in the
[00:30:17] book, and I read their methodology sections through the same kind of critical lens, and realized
[00:30:22] that a lot of them were actually pretty bad also, and I did not feel comfortable relying on
[00:30:26] them in my book. So I had to scrap a bunch of sections; this is one of the reasons the book
[00:30:30] took so long to write. So yeah, the point of that being that
[00:30:39] it can be really easy, even if you're well aware of soldier mindset and scout mindset, as
[00:30:44] clearly I am because I'm writing a book about it... it's still a very unconscious habit of mind
[00:30:50] that you have to be on the lookout for and catch yourself doing, and merely knowing about the
[00:30:56] existence of motivated reasoning or soldier mindset is not enough in itself to prevent you
[00:31:00] from doing this. Yeah, yeah. As I believe Danny Kahneman has famously said over and over, he
[00:31:08] falls prey to biases more than his friends do, or something; I can't remember exactly what he
[00:31:17] said. I forget his exact wording too, but he
[00:31:22] said a couple of things. One thing he said is that just knowing about the biases in human
[00:31:30] reasoning is not enough to prevent him from doing it, and he also has some funny stories of
[00:31:35] himself falling prey to biases that he's written textbooks about. Yeah, it seems like one
[00:31:41] about buying furniture or something is in there. I don't
[00:31:48] remember that one. I was thinking of the one where he was working on a textbook with some
[00:31:53] people, and they were trying to predict how long it would take them to finish the first draft
[00:31:57] of the textbook, and their estimate was way overoptimistic. It ended up taking, I think, I
[00:32:04] don't know, six times longer than they thought it would, or maybe they never even finished; I
[00:32:09] don't remember. But his point was, he is well aware of how people's predictions about how long
[00:32:15] things are going to take are overoptimistic, and he knows that you should look at the outside
[00:32:20] view, which is his term for: how long do things like this typically take? Like, when other
[00:32:26] groups write textbooks like this, or when I've done similar things in the past, how long has it
[00:32:30] taken? That's the outside view, whereas the inside view is just: how long does it seem like
[00:32:35] this should take me when I think about doing it? How long do I think it's going to take? And
[00:32:38] the outside view for textbooks was about six times longer than they thought it would take. So
[00:32:44] he was like, if I had just relied on the outside view the way I tell people to, then I would
[00:32:48] have been accurate, but instead I just stuck to the inside view. So that was a funny example of
[00:32:52] how just knowing about these biases is not enough to prevent you from committing them. Right.
[00:32:58] And then another thing that he's said,
[00:33:04] which I kind of have an interesting disagreement with, is that he's pessimistic about our
[00:33:10] ability to overcome these innate biases, and he's pointed to research suggesting that teaching
[00:33:18] people critical thinking doesn't actually make them better critical thinkers. My disagreement
[00:33:24] with this, or I guess the crux of the disagreement, is that the studies testing whether people
[00:33:32] can overcome biases are mostly done on college students who just volunteered for the study for
[00:33:39] half an hour of their time, so there's some kind of selection bias there. They're not people
[00:33:45] who are trying to improve their thinking over a longer period of time, which I think is a much
[00:33:52] more realistic expectation for what it would take to change our innate habits of thinking.
[00:33:57] When you just think about that question,
[00:34:02] I think it's kind of obvious that, no, of course just reading an article isn't going to
[00:34:06] automatically change the way you think. If you go to therapy, if you see a cognitive therapist,
[00:34:10] and you're aware that you have these cognitive distortions in the way you see the world, like
[00:34:15] you tend to jump to conclusions, or you tend to focus only on the negatives and not on the
[00:34:20] positives, or any of the other cognitive distortions that we all have to some degree, you can
[00:34:25] to some extent change those habits of thought. You can get better at recognizing, oh, I'm
[00:34:30] doing that thing again where I completely downplay all of my positives and just focus on the
[00:34:36] negatives, and you can, with practice, to some extent overcome that and learn to step outside
[00:34:41] of it and go, no, I remember I've done this before, and I need to remind myself of the things
[00:34:45] I did right, not just the things I did wrong. But that process takes time. It's not just
[00:34:50] something where you read an article and all of a sudden you can expect yourself to be seeing
[00:34:54] things in a balanced way. And so I don't actually think any of the studies that people like
[00:34:59] Kahneman point to as evidence about our ability to improve our critical thinking actually
[00:35:04] tested the thing that we should expect would make a difference. So yeah, I think this is an
[00:35:15] underappreciated point: these biases are ingrained habits of thought that can be changed, but
[00:35:22] it just takes a little more long-term effort and practice to change them.
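[Editor's illustration] The "outside view" Kahneman recommends, as described above, amounts to estimating from how long comparable projects actually took rather than from how long the work feels like it should take. A minimal numerical sketch of that idea, with a hypothetical reference class (the function name, numbers, and median-overrun heuristic are illustrative, not from the episode):

```python
from statistics import median

def outside_view_estimate(inside_view, past_estimates, past_actuals):
    """Adjust a gut-feel ("inside view") estimate using the median
    overrun ratio observed in a reference class of similar,
    completed projects (the "outside view")."""
    overruns = [actual / est for est, actual in zip(past_estimates, past_actuals)]
    return inside_view * median(overruns)

# Hypothetical reference class: textbook projects that, as in
# Kahneman's anecdote, took several times longer than predicted.
estimates = [2.0, 1.5, 2.5, 2.0]   # years, as initially predicted
actuals   = [7.0, 9.0, 8.0, 12.0]  # years, as actually delivered

# The inside view says 2 years; the reference class roughly quadruples it.
print(outside_view_estimate(2.0, estimates, actuals))  # 9.5
```

The design point is the one made in the conversation: the reference class, not introspection, carries the predictive information.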
[00:35:27] Thank you so much for listening to today's episode, the first part of my interview with Julia
[00:35:33] Galef. If you enjoyed this episode, then make sure you're subscribed in whatever podcasting app
[00:35:39] you're currently using, so you don't miss out on the second part of this interview. Thank you
[00:35:43] again to today's sponsor, LaunchDarkly; head over to LaunchDarkly.com to start resting easy
[00:35:48] when you deploy your features. Thanks so much for listening to this episode. If you want to
[00:35:54] join the Discord community, head over to developertea.com/discord. We discuss things that we
[00:36:01] talk about on these episodes, but we also discuss things that never make it into a Developer
[00:36:07] Tea episode. I give advice, and other great software engineers with even more experience than
[00:36:13] I have are giving advice in there as well. It's a good time; you'll find support, and you'll
[00:36:20] find a lot of people who really want the best for the others in that group. Thanks so much for
[00:36:25] listening, and until next time, enjoy your tea.