Don’t Trust Your Friends


Summary

This episode of Developer Tea delves into the pervasive nature of cognitive biases and how they are amplified when we surround ourselves with people who share our perspectives. Host Jonathan Cutrell explains that we naturally seek ‘cognitive ease’ by associating with those who agree with us, which creates a collective overlap in bias. While this is a normal human tendency, it becomes problematic when we’re trying to make objective decisions or build products, as shared biases blind us to potential errors and alternative viewpoints.

The core argument is that diversity—not just in race, but in background, belief systems, age, and experience—is a powerful tool for combating collective error. By interacting with people who have different biases, we increase our awareness of our own blind spots. The episode connects this principle to software development, suggesting that relying solely on a homogeneous team or user base for feedback can lead to flawed products and missed opportunities.

Cutrell offers three practical strategies to short-circuit shared biases. First, actively survey and understand the users of your products, as they likely possess different biases than your development team. Second, intentionally seek out information and perspectives that disagree with your position to combat confirmation bias. Third, ask a powerful question borrowed from Tim Ferriss: ‘What if I did it exactly the opposite way?’ or ‘What if I believed exactly the opposite thing?’ This thought experiment helps identify where biases have filled logical holes in planning or belief.

The episode concludes by emphasizing that the goal isn’t to stop collaborating with colleagues or friends, but to take personal responsibility for becoming aware of shared biases. The aim is to ‘level up’ by becoming more holistic and correct in one’s approaches, ultimately building better software and making better decisions by acknowledging and working around our inherent cognitive limitations.


Recommendations

Books

  • Thinking, Fast and Slow by Daniel Kahneman — Referenced in the context of ‘cognitive ease,’ the mental state we enter when we are around people who agree with us, reducing mental strain.

People

  • Tim Ferriss — Mentioned as the source of a powerful life-changing question: ‘What if I did it exactly the opposite way?’ which is adapted here as a tool for identifying bias.

Tools

  • Rollbar — Promoted as a sponsor and recommended tool for error tracking in software development. It helps developers find and fix errors in production before users see them, eliminating reliance on potentially biased or inaccurate user error reports.
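    The pattern Rollbar automates can be illustrated with a generic sketch (this is not Rollbar’s actual API, just a minimal stand-in): hook into the application’s exception path and record enough context to diagnose the failure, so developers see the error without waiting for a user report.

    ```python
    import traceback
    from typing import Callable, List

    # Collected reports; a real tracker (like Rollbar) would send these
    # to its backend instead of keeping them in memory.
    _reports: List[dict] = []


    def report_error(exc: BaseException) -> None:
        """Record an exception with enough context to diagnose it later."""
        _reports.append({
            "type": type(exc).__name__,
            "message": str(exc),
            "traceback": traceback.format_exc(),
        })


    def tracked(func: Callable) -> Callable:
        """Wrap a function so any exception is reported, then re-raised."""
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Exception as exc:
                report_error(exc)
                raise
        return wrapper


    @tracked
    def divide(a: float, b: float) -> float:
        return a / b
    ```

    With a wrapper like this in place, a failing call such as `divide(1, 0)` is captured with its type, message, and traceback before it ever reaches a (possibly inaccurate) user bug report — which is the core value proposition described in the episode.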

Topic Timeline

  • 00:00:00 Introduction to the problem of trusting friends and colleagues — Jonathan Cutrell introduces the episode’s central theme: questioning whether we should trust our friends and colleagues due to shared biases. He frames the discussion around eliminating ‘collective error’—the mistakes that arise when a group shares overlapping cognitive biases. The episode aims to explore how these biases affect developers’ work and decision-making processes.
  • 00:02:22 Cognitive ease and our tendency to agree with those around us — Cutrell explains the psychological concept of ‘cognitive ease,’ referencing Daniel Kahneman. Our brains are wired to seek environments and relationships that require less mental strain, which often means surrounding ourselves with people who agree with us. This natural tendency, combined with other environmental and experiential biases, creates groups with significant overlap in how they perceive the world, making it difficult to identify shared blind spots.
  • 00:04:42 The economic case for diversity in eliminating bias — The host makes a direct argument for diversity as a tool for error reduction. If everyone in a room shares similar biases, it becomes nearly impossible to become aware of and eliminate those biases. Diversity in race, background, belief systems, and age brings different sets of biases to the table, increasing overall awareness and allowing a team to pull from a wider range of perspectives, which research shows helps eliminate errors created by bias.
  • 00:08:45 Three practical strategies to short-circuit shared biases — Cutrell presents three actionable steps developers can take. Step one is to survey the users of your products, as they will have different biases than you. Step two is to intentionally seek out information that disagrees with your position to combat self-confirmation bias. Step three is to ask, ‘What if I did it/believed exactly the opposite way?’—a question adapted from Tim Ferriss—to explore opposing positions and identify holes filled by bias.
  • 00:14:56 Conclusion and the responsibility to improve — The host clarifies that the message is not to stop collaborating with colleagues, but to take personal responsibility for becoming aware of shared biases. The goal is to ‘level up’ by becoming more holistic and correct in one’s approaches. He encourages listeners to reach out with their experiences and reiterates the sponsor message for Rollbar as a tool to proactively find errors, rather than relying on biased user reports.

Episode Info

  • Podcast: Developer Tea
  • Author: Jonathan Cutrell
  • Category: Technology, Business, Careers, Society & Culture
  • Published: 2017-03-06T10:00:00Z
  • Duration: 00:16:51

Transcript

[00:00:00] What if I told you not to trust your friends?

[00:00:08] Or perhaps more appropriately for developers,

[00:00:11] what if I told you not to trust your colleagues at work?

[00:00:16] There’s actually a good reason for why I’m asking you this question.

[00:00:22] My name is Jonathan Cutrell. You’re listening to Developer Tea.

[00:00:25] Thank you for joining me today.

[00:00:26] Today we’re talking about the idea of eliminating overlap.

[00:00:33] Or more specifically, we’re eliminating collective error.

[00:00:38] On this show, we’ve talked about a lot of different biases

[00:00:42] that may affect our way of thinking and therefore they affect our work.

[00:00:48] And this is no different in today’s episode.

[00:00:50] We’re talking about a collection of biases.

[00:00:52] We’re not going to name every single one of them.

[00:00:55] But for example,

[00:00:56] we’re talking about selection bias.

[00:01:00] You can go and look up some of the biases

[00:01:02] that we’re talking about in today’s episode.

[00:01:07] But I want to talk more specifically about the effect of these biases.

[00:01:12] As we go along, I may mention a few of them.

[00:01:16] Biases occur for multiple reasons.

[00:01:19] In fact, perhaps an endless number of reasons

[00:01:22] or at least so many reasons that we are not able to quantify them.

[00:01:26] And that’s because biases are entirely based on

[00:01:30] the person who is experiencing the bias.

[00:01:35] Where they grew up.

[00:01:36] What are their genetics?

[00:01:38] What are their experiences?

[00:01:40] And who do they know?

[00:01:42] What kind of food did they eat today?

[00:01:45] There are so many biases

[00:01:46] and it’s impossible to know exactly which biases

[00:01:50] are affecting you at all times.

[00:01:53] Perhaps the reason that it’s impossible is because

[00:01:56] these biases aren’t affecting you at all times.

[00:01:56] These things come and go and our biases

[00:01:58] are highly related to our current environment as well.

[00:02:02] But there’s an interesting thing about the environmental

[00:02:05] and perhaps the longer term environmental biases

[00:02:09] that we end up carrying with us.

[00:02:11] And we have a specific bias

[00:02:14] that is important to recognize

[00:02:16] for the sake of the validity of the rest of the conversation today.

[00:02:20] And that bias is that we simply tend to

[00:02:22] find people who agree with us.

[00:02:25] We befriend,

[00:02:26] or we end up working with people

[00:02:29] who tend to agree with us.

[00:02:31] And this isn’t necessarily a fault.

[00:02:34] This is the way our brain is wired.

[00:02:36] It’s what Daniel Kahneman in his book,

[00:02:38] I mentioned this in the last episode of Developer Tea,

[00:02:41] he calls this cognitive ease.

[00:02:44] The times when we are actually agreeing with the people

[00:02:48] who are around us,

[00:02:50] our brain is at ease

[00:02:51] and it’s not having to work as hard

[00:02:53] as if we were in conflict

[00:02:54] or if we didn’t agree,

[00:02:56] with the people around us.

[00:02:58] So in a very similar way

[00:03:00] that your body is more at ease

[00:03:03] when you’re not struggling and straining,

[00:03:05] for example, on a workout,

[00:03:07] the average pace that you would tend to walk

[00:03:11] is not going to be a sprint, right?

[00:03:13] You are at ease at a certain level

[00:03:16] and your brain is at ease at a certain level.

[00:03:19] And so we unintentionally strive

[00:03:22] for this sense of cognitive ease.

[00:03:25] And one of the things that gives us

[00:03:26] ease is to be around people who agree with us.

[00:03:29] Ultimately what this creates

[00:03:31] in combination with multiple other biases,

[00:03:34] for example,

[00:03:35] even the temperature,

[00:03:37] the average temperature where you live

[00:03:39] can change the way you think about things.

[00:03:41] What we end up with

[00:03:42] is a group of people

[00:03:44] that we’ve surrounded ourselves with

[00:03:46] that share a large overlap in bias.

[00:03:51] Now this is generally okay.

[00:03:53] We actually have found a way to survive

[00:03:56] even though we are collectively wrong

[00:04:00] about many things together

[00:04:01] and often about things that we don’t realize

[00:04:04] that we are wrong about.

[00:04:06] And it may not be necessarily

[00:04:07] the right word to use there, wrong,

[00:04:10] but certainly we have a particular way of perceiving.

[00:04:14] And sometimes that way of perceiving

[00:04:16] may or may not line up with reality.

[00:04:20] So why does this matter?

[00:04:21] Well, if you surround yourself with people

[00:04:23] who have similar biases,

[00:04:25] to you,

[00:04:26] and if you’re attempting,

[00:04:28] if you’re listening to the show,

[00:04:29] hopefully you are attempting to

[00:04:31] eliminate the effect of bias on your decisions

[00:04:34] or at least on the work that you’re collaborating on.

[00:04:38] Unfortunately,

[00:04:39] if everyone in the room shares similar biases,

[00:04:42] then it’s going to be harder

[00:04:44] to eliminate those biases

[00:04:46] because it’s going to be harder

[00:04:47] to become aware of those biases.

[00:04:50] This is one of the most direct

[00:04:52] economical reasons

[00:04:54] that we can say

[00:04:55] diversity is good.

[00:04:57] Assuming you want to eliminate error

[00:04:59] and error that results from bias,

[00:05:04] then diversity increases your awareness

[00:05:08] of multiple biases.

[00:05:11] Diversity allows you to pull

[00:05:13] from different groups of people

[00:05:16] who may or may not share the same biases.

[00:05:21] And obviously right now,

[00:05:22] we’re not talking about a diversity only in race,

[00:05:25] but also in background and ethnicity,

[00:05:28] in belief systems,

[00:05:30] in age,

[00:05:31] all of these things that change the way

[00:05:34] that we think.

[00:05:35] Because contrary to the way

[00:05:37] that we intuitively believe,

[00:05:39] perhaps,

[00:05:40] people are not necessarily

[00:05:43] mentally equal.

[00:05:45] I want to qualify what I’m saying.

[00:05:47] People do not have

[00:05:49] the same exact brains.

[00:05:51] In other words,

[00:05:52] we are all prone to different ways

[00:05:55] of thinking about biases.

[00:05:56] But when we collaborate,

[00:05:59] we actually tend,

[00:06:00] and this is backed up by research,

[00:06:03] when we collaborate,

[00:06:04] our average collaboration

[00:06:06] tends to eliminate

[00:06:08] the errors created by bias.

[00:06:11] Okay, we’ve said the word bias

[00:06:13] probably a hundred times in this episode.

[00:06:16] I know you’re tired of hearing it.

[00:06:17] So let’s take a quick sponsor break

[00:06:19] and talk about today’s incredible sponsor,

[00:06:22] Rollbar.

[00:06:23] By the way,

[00:06:24] if you are doing a lot of research,

[00:06:25] you’re depending on your users

[00:06:27] to report errors to you.

[00:06:30] Believe it or not,

[00:06:31] your users are not going to be able

[00:06:34] to accurately report errors every time.

[00:06:37] And this actually has to do with bias as well.

[00:06:39] Sorry, we had to use that word one more time.

[00:06:42] Errors are going to be in your code.

[00:06:44] They’re going to lurk around in your code

[00:06:46] and detecting and diagnosing those errors

[00:06:49] is really hard.

[00:06:50] Relying on your users to report them

[00:06:51] is very hard.

[00:06:53] They’re not developers.

[00:06:55] So those reports sometimes

[00:06:56] are incredibly hard to understand.

[00:06:59] And on top of that,

[00:06:59] digging through logs,

[00:07:01] trying to debug issues,

[00:07:03] that’s a painful process.

[00:07:05] For anyone who’s ever experienced it,

[00:07:07] you know it’s a painful process.

[00:07:09] Rollbar works with all major languages

[00:07:11] and frameworks to eliminate this problem.

[00:07:13] You can start tracking production errors

[00:07:15] in minutes.

[00:07:15] And basically, you’re going to get the error

[00:07:17] before your users see it.

[00:07:20] You can integrate Rollbar

[00:07:21] into your existing workflow

[00:07:22] and you can send error alerts

[00:07:24] to places like Slack,

[00:07:25] or HipChat.

[00:07:26] And you can link your source code

[00:07:28] in GitHub, Bitbucket, GitLab.

[00:07:30] You can turn errors into issues

[00:07:33] in Jira, and Pivotal Tracker, and Trello.

[00:07:36] Some of their customers, by the way,

[00:07:37] and this should give you

[00:07:38] a pretty good vote of confidence,

[00:07:40] their customers include

[00:07:41] Heroku, Twilio, Kayak, Instacart,

[00:07:44] Zendesk, and Twitch.

[00:07:45] These are huge, huge companies

[00:07:47] that are relying on Rollbar

[00:07:48] to get their errors

[00:07:49] in front of the developers

[00:07:51] before they get their errors

[00:07:52] in front of users.

[00:07:55] Go and check out what Rollbar

[00:07:56] has to offer to you today.

[00:07:59] Rollbar, by the way,

[00:08:00] is providing you

[00:08:01] the bootstrap plan for free.

[00:08:04] Go and check it out.

[00:08:05] Rollbar.com slash Developer Tea.

[00:08:08] Rollbar.com slash Developer Tea.

[00:08:11] Thank you again to Rollbar

[00:08:12] for sponsoring today’s episode

[00:08:14] of Developer Tea.

[00:08:16] Once again, don’t rely on people

[00:08:19] to give you mission-critical stuff

[00:08:23] if you can rely on a system

[00:08:25] like Rollbar.

[00:08:26] This is going to be so much better

[00:08:27] to understand the errors

[00:08:30] that are happening in your application.

[00:08:32] Thank you again to Rollbar.

[00:08:33] So we’re talking about bias today.

[00:08:35] I said that we’ve said that word

[00:08:37] a hundred times in this episode.

[00:08:38] That’s because it’s a real thing.

[00:08:40] And it’s a real thing

[00:08:41] that we often ignore.

[00:08:43] So I’m going to give you three ways

[00:08:45] to hopefully short-circuit these biases.

[00:08:49] Okay, three things that you can do

[00:08:51] in your work every day

[00:08:53] to help eliminate

[00:08:54] the biases that you

[00:08:56] and your co-workers

[00:08:57] and your friends,

[00:08:59] you may all share.

[00:09:01] Step number one,

[00:09:02] survey the people

[00:09:03] who are using the things you build.

[00:09:05] Quite simply,

[00:09:06] the people who are using

[00:09:07] the things you build

[00:09:08] are going to have different biases

[00:09:10] from you in most cases.

[00:09:12] Now, you have to also understand

[00:09:14] that the selection of people

[00:09:16] who are using the things that you build

[00:09:20] may also share a common set of biases.

[00:09:24] But those biases are going to be different

[00:09:26] from the ones that you have.

[00:09:28] And in fact,

[00:09:29] you may need to be aware of those biases.

[00:09:33] Right?

[00:09:33] It’s good for you to understand the biases

[00:09:35] of your larger user base.

[00:09:38] Why is that?

[00:09:39] Well, if you’re going to build a product

[00:09:41] for a group of people

[00:09:43] who share a common bias,

[00:09:45] then building that product

[00:09:46] with the knowledge of the bias

[00:09:48] may allow you to create a better product.

[00:09:51] It’s a very simple equation there.

[00:09:53] The more you know about your customers,

[00:09:54] the better you’re going to be

[00:09:56] at building something for those customers.

[00:09:58] So you need to get to know your customers,

[00:10:00] get to know your users,

[00:10:01] whatever the word is,

[00:10:02] for the people who are using the things

[00:10:04] that you build.

[00:10:06] Understand those people.

[00:10:07] Understand the things that they’re facing,

[00:10:09] the things that you have created for them,

[00:10:12] how they are good or how they are bad.

[00:10:15] Learn from your customer.

[00:10:17] That’s going to allow you

[00:10:18] to eliminate some of the bias that you have

[00:10:20] about whether or not

[00:10:22] the thing you built is good.

[00:10:24] Step number two,

[00:10:24] intentionally seek out

[00:10:27] things that disagree with your position.

[00:10:30] Perhaps the most powerful bias

[00:10:32] that we experience on a day-to-day basis

[00:10:34] is our self-confirmation bias.

[00:10:38] We hate being wrong.

[00:10:39] This is both a psychological factor

[00:10:42] and a physiological factor.

[00:10:45] We hate being wrong.

[00:10:47] We fear being wrong.

[00:10:49] And if you can learn how you are wrong,

[00:10:54] and if you can constantly remind yourself

[00:10:56] that being wrong is a part of being human,

[00:11:00] then you can eliminate this accidental stigma

[00:11:03] that you’ve adopted into your brain

[00:11:05] about being wrong.

[00:11:06] If you can constantly remind yourself

[00:11:10] that there are things that disagree with you,

[00:11:12] that there are people that disagree with you,

[00:11:15] that there are studies,

[00:11:16] perhaps proof that disagrees with you,

[00:11:19] seeking out information

[00:11:21] that doesn’t confirm

[00:11:23] what you already believe

[00:11:25] but instead actually opposes what you believe,

[00:11:29] this is going to help you reform

[00:11:32] or at least become aware of your biases.

[00:11:35] Now a quick point of warning here,

[00:11:38] seeking out this kind of information

[00:11:40] can lead you down a rabbit hole

[00:11:42] of feeling unconfident.

[00:11:45] What I don’t want you to do

[00:11:46] is start believing that

[00:11:48] you are always going to be wrong

[00:11:50] or that the things that you do

[00:11:52] or the things you believe

[00:11:53] somehow limit you

[00:11:55] from becoming better.

[00:11:58] In fact,

[00:11:59] when you seek out this information,

[00:12:01] part of what is happening,

[00:12:02] part of the reason you may feel this way

[00:12:05] is because it is tiring.

[00:12:07] It is quite literally physically

[00:12:09] tiring your brain out

[00:12:11] to seek this information out.

[00:12:13] So remind yourself

[00:12:15] that you are doing this

[00:12:17] because you want to become better,

[00:12:19] not because you want to

[00:12:21] downgrade yourself,

[00:12:23] but instead,

[00:12:23] because you want to,

[00:12:25] as we say at Spec,

[00:12:26] you want to level up in your career.

[00:12:28] Becoming aware of your biases

[00:12:29] is not reminding yourself

[00:12:33] why you’re wrong,

[00:12:33] but instead,

[00:12:34] learning how you can become more right.

[00:12:38] If you view it with that lens,

[00:12:40] it’s an empowering thing

[00:12:41] to learn that you are wrong.

[00:12:43] It’s kind of a catch-22 there.

[00:12:45] Knowing that you’re wrong

[00:12:46] is psychologically difficult to handle,

[00:12:49] but also realizing that you are wrong

[00:12:51] is better than never knowing

[00:12:53] that you’re wrong.

[00:12:54] Assuming that you want to get better.

[00:12:56] Assuming that you want to be right.

[00:12:58] So actively seek out information

[00:12:59] that disagrees with your position.

[00:13:01] Step number three,

[00:13:03] and this is going to be stolen

[00:13:04] directly from Tim Ferriss.

[00:13:06] He has a podcast talking about

[00:13:08] the questions that changed his life.

[00:13:11] And this question is a powerful one.

[00:13:13] It’s actually in a different context here

[00:13:15] than it was on his episode.

[00:13:17] He was talking more about experimentation

[00:13:18] and trying to find things that work.

[00:13:23] And here,

[00:13:23] we’re talking about trying to eliminate bias

[00:13:25] in the work that we’re doing.

[00:13:27] Ask yourself,

[00:13:28] what if I did it exactly the opposite way?

[00:13:32] Or perhaps more applicable to this,

[00:13:34] what if I believed exactly the opposite thing?

[00:13:38] Now, for some things,

[00:13:39] believing the opposite is somewhat impossible.

[00:13:42] So for example,

[00:13:43] I have a strong belief that two plus two equals four.

[00:13:46] And if I asked myself

[00:13:48] what the opposite of two plus two equals four would be,

[00:13:51] the logical opposite,

[00:13:53] would be two plus two does not equal four.

[00:13:55] But a lot of people would also say

[00:13:57] that the opposite would be

[00:13:58] two plus two equals negative four.

[00:14:01] This is an intuitive opposite response.

[00:14:04] So don’t get caught up in the semantics

[00:14:06] of what opposite means,

[00:14:08] but ask yourself,

[00:14:09] what if I believed an opposing position?

[00:14:12] Or what if I implemented an opposing feature set?

[00:14:18] What if I supported a completely different group

[00:14:21] of people in this application?

[00:14:23] These are the kinds of questions

[00:14:25] that allow you to explore

[00:14:27] and hopefully identify holes

[00:14:29] where your biases have filled those holes in.

[00:14:33] Now, these are three practical things

[00:14:35] that I’ve given you today

[00:14:36] to try to start becoming more aware

[00:14:38] of your collective bias.

[00:14:40] And you’ve noticed that I didn’t tell you

[00:14:43] to go and collaborate

[00:14:43] with the people that you’re working with.

[00:14:45] It’s because we’re specifically talking about bias

[00:14:48] that is shared between you and your coworkers,

[00:14:51] you and your friends,

[00:14:52] you and the people,

[00:14:53] that you are directly associated with

[00:14:54] on a regular basis.

[00:14:56] And I want you to hear me clearly.

[00:14:58] This does not mean

[00:14:59] to stop working well with those people.

[00:15:02] Instead, it means

[00:15:03] that you have a responsibility,

[00:15:06] or at least if you’re listening to this show,

[00:15:09] you have the drive to become better.

[00:15:12] You have the drive to eliminate errors

[00:15:15] that you’re making,

[00:15:16] whether that’s cognitive errors

[00:15:17] or quite literally errors in your code.

[00:15:20] You have the desire

[00:15:22] to get better at what you do

[00:15:24] and to be more holistic

[00:15:26] and more correct in your approaches.

[00:15:30] Unfortunately, we’re all limited by biases,

[00:15:33] but these three tips I’ve given you today,

[00:15:35] hopefully they will help you

[00:15:36] overcome some of the biases

[00:15:38] that you are still facing,

[00:15:40] even with a fantastic team that surrounds you

[00:15:43] and keeps you accountable within that team.

[00:15:45] I hope this episode provokes discussion.

[00:15:48] I hope you will reach out to me

[00:15:49] and let me know how these things work out for you.

[00:15:52] Some of the biases that you have,

[00:15:54] I’d love to hear those.

[00:15:55] Reach out to me on Twitter, @developertea.

[00:15:57] Of course, you can email me at developertea@gmail.com.

[00:16:00] Thank you again to today’s incredible sponsor, Rollbar.

[00:16:03] If you don’t want to rely on your users

[00:16:06] and just sit back and wait for them to tell you

[00:16:09] what’s wrong with your application,

[00:16:11] if you want to be proactive

[00:16:12] and fix it before they see it,

[00:16:14] you need to use Rollbar.

[00:16:16] Go and check it out, rollbar.com slash developertea.

[00:16:18] They have the bootstrap plan waiting for you for free.

[00:16:21] It’s very easy.

[00:16:22] It’s easy to set up.

[00:16:23] I have done it on many applications.

[00:16:25] Once again, rollbar.com slash developertea.

[00:16:27] Thank you so much for listening to today’s episode.

[00:16:30] You can find everything related to Developer Tea

[00:16:32] and other awesome shows for designers and developers

[00:16:35] at spec.fm.

[00:16:37] Do something today to level up in your career.

[00:16:40] That’s my challenge to you today.

[00:16:42] Thanks so much for listening.

[00:16:43] And until next time, enjoy your tea.