Self-Awareness and Intellectual Honesty


Summary

The episode begins by distinguishing between outright lies and the more common, often unintentional form of lying that developers engage in regularly: intellectual dishonesty. This occurs when we present our opinions, preferences, or heuristics as objective truths, often to gain credibility or avoid appearing uncertain. The host, Jonathan Cutrell, argues that this behavior stems from a lack of self-awareness about the true origins of our beliefs, which are frequently adopted second- or third-hand from authorities rather than developed through direct evidence or experience.

Cutrell connects intellectual dishonesty to the broader goal of self-awareness, which he defines as the foundation for self-improvement. When we become more self-aware, we better recognize both our weaknesses and our strengths, allowing us to focus our energy more effectively. However, pursuing intellectual honesty in a group setting presents a social risk similar to a prisoner’s dilemma: the first person to admit uncertainty or the shaky foundation of their beliefs risks being ostracized or having their ideas devalued if others continue to maintain facades of certainty.

This dynamic places a significant responsibility on leaders and managers to cultivate a team culture that values and rewards intellectual honesty. The host suggests that such a culture is essential for genuine improvement, as it allows teams to openly identify weaknesses, rely on each other for feedback, and collaboratively seek better versions of reality. Without this foundation of honesty, it becomes difficult to improve processes, estimate work accurately, or build truly effective collaboration.

The episode concludes with practical advice for listeners: increase your sensitivity to moments of intellectual dishonesty, which often arise from feelings of fear or anxiety. Cutrell provides a concrete example of how to be intellectually honest in a stand-up meeting by prefacing an admission of uncertainty with the intent to improve. By consistently operating from a place of honesty, individuals can set a new precedent that, over time, can transform team culture and lead to better outcomes.


Recommendations

Tools

  • Clubhouse — Sponsored the episode. Described as a project management platform for software development that balances simplicity and structure for cross-functional collaboration, featuring an intuitive interface and a robust API for integrations.

Topic Timeline

  • 00:00:00 — Introduction to lying and intellectual dishonesty — The episode opens by asking listeners about the last time they told an outright lie, then distinguishes this from the more common, subtle form of lying the host wants to discuss: intellectual dishonesty. Jonathan Cutrell introduces himself and the show’s mission to help developers connect to their career purpose through greater self-awareness.
  • 00:02:09 — The value and dual effect of self-awareness — Cutrell explains that increased self-awareness has two juxtaposed effects: it makes your weaknesses more apparent while also allowing you to better focus on and hone your strengths. This self-knowledge forms the foundation for self-improvement, enabling you to direct your energy more effectively.
  • 00:03:31 — Example of intellectual dishonesty among developers — The host provides a common scenario: developers advocating for a technical solution not based on thorough vetting or objective evidence, but on personal preference, past experience, or adopted beliefs. They often present these preferences as if they are objectively better, which is a form of intellectual dishonesty.
  • 00:07:08 — Defining intellectual dishonesty and its sources — After a sponsor break, Cutrell delves deeper into intellectual dishonesty, describing it as a loose term for deception that isn’t directly detectable, such as using rhetoric, dodging questions, or employing logical fallacies. He notes that it often stems from convincing ourselves of unvetted beliefs, like ‘best practices’ adopted from authority figures rather than direct evidence.
  • 00:09:42 — The social risk and prisoner’s dilemma of honesty — Cutrell frames the pursuit of intellectual honesty as a social prisoner’s dilemma. The first person in a group to be honest about what they don’t know risks being ostracized or having their ideas devalued if others maintain facades. This highlights the critical role leaders play in establishing a culture where intellectual honesty is safe and valued by all.
  • 00:11:44 — Practical advice for cultivating intellectual honesty — The host encourages listeners to increase their sensitivity to moments of intellectual dishonesty, which are often triggered by fear or anxiety. He gives a practical example: in a stand-up meeting, honestly admitting uncertainty about why work was delayed, but prefacing it with the intent to improve. Consistently practicing this can set a new, healthier precedent for team communication.

Episode Info

  • Podcast: Developer Tea
  • Author: Jonathan Cutrell
  • Category: Technology, Business, Careers, Society & Culture
  • Published: 2019-01-25T10:00:00Z
  • Duration: 00:15:19


Transcript

[00:00:00] When was the last time that you outright lied?

[00:00:09] This is kind of an uncomfortable question for us, but for most people, an outright lie

[00:00:14] is not something that we do regularly.

[00:00:17] Now, I’m not talking about the kind of lie that we often brush off as no big deal.

[00:00:25] I’m talking about bald-faced, directly lying to someone for your own interest.

[00:00:32] This isn’t a common practice.

[00:00:34] Culturally, it’s not really acceptable.

[00:00:37] We burn bridges, especially when we’re found out in lies.

[00:00:42] And most moral systems don’t really support lying as a common good practice.

[00:00:52] And there’s plenty of reasons why, but in today’s episode, I don’t want to talk about

[00:00:56] this kind of lying.

[00:00:58] Instead, I want to talk about the kind of lying that we do all of the time.

[00:01:04] The kind of lying that I’ve probably done countless times on this show.

[00:01:09] The kind of lying that is often unintentional and very rarely condemned in any format.

[00:01:18] My name is Jonathan Cutrell and you’re listening to Developer Tea.

[00:01:21] My goal on this show is to help driven developers connect to their career purpose and do better

[00:01:25] work so they can have a positive influence on the people around them.

[00:01:29] A lot of what we talk about on this show comes down to having a higher degree of self-awareness.

[00:01:36] Being able to recognize your own faults.

[00:01:40] And even when you can’t see them, because there’s going to be plenty of faults that

[00:01:44] you can’t see very well, you realize that there are plenty of faults that you can’t

[00:01:49] see very well.

[00:01:51] And so this means that you’re vulnerable.

[00:01:54] You have weaknesses.

[00:01:56] You have places to improve.

[00:01:58] You have places where other people can speak into whatever you’re doing and that will improve

[00:02:06] the outcomes that you’re seeking.

[00:02:09] When you’re more self-aware, two things happen and they’re kind of juxtaposed.

[00:02:15] The first thing that happens is your weaknesses, your faults, the things that you’re bad at

[00:02:21] become more apparent to you.

[00:02:24] The second thing that happens is the things that you’re good at, you’re able to focus

[00:02:30] on and hone much better.

[00:02:33] Why is this?

[00:02:34] What is this effect that’s happening when you become more self-aware?

[00:02:39] Of course, this is a blanket statement and I don’t want to say that everybody who has

[00:02:43] a higher degree of self-awareness just naturally becomes better.

[00:02:47] There’s no magic pill to becoming better, to improving.

[00:02:51] But when you know what things you’re not very good at and you know what things you are pretty

[00:02:56] good at, then you can kind of focus your energy towards the things that you are pretty good

[00:03:01] at.

[00:03:02] So the self-improvement process starts kind of at a foundation of becoming more self-aware.

[00:03:10] And this seeking for self-awareness is actually a part of a bigger seeking for better versions

[00:03:18] of the truth, better versions of reality from your own vantage point and trying to understand

[00:03:24] reality from others’ vantage points.

[00:03:28] So what does this have to do with lying?

[00:03:31] We’ll start with a very common thing that happens with developers.

[00:03:36] When you’re talking to another developer and you have an opinion.

[00:03:40] You have an opinion about how something is done.

[00:03:43] Often, the opinions that we have are based not in some well-established experience, but

[00:03:52] rather they’re based on what we enjoy, what we think based on our own experiences or based

[00:04:02] on what we thought before.

[00:04:05] Perhaps we adopted these beliefs from someone that we trust.

[00:04:10] Maybe we adopted these beliefs by using these tools or going through a few processes.

[00:04:17] But if it really came down to it, most of us have not thoroughly vetted all of the options

[00:04:24] that are on the table in a given conversation.

[00:04:27] However, very few times do we validate and discuss that reality.

[00:04:33] But the reason that we are presenting one solution over another and advocating for one

[00:04:40] solution over another is not because of some shallow reasoning or because of some heuristics

[00:04:47] that we use.

[00:04:49] We like to present as though this is somehow objectively the better way.

[00:04:56] And this isn’t an unreasonable thing.

[00:04:59] It’s not unreasonable to default to this position, because if you were to tell people that the

[00:05:06] reason that you want to continue using, for example, JavaScript on a project is because

[00:05:11] you like JavaScript, this is often not going to be received as a valid reason.

[00:05:19] This kind of lying is one form, and there’s plenty more, of intellectual dishonesty.

[00:05:26] We’re going to take a break, and then we’re going to come back and talk about other forms

[00:05:30] of intellectual dishonesty that we participate in every day as developers.

[00:05:37] Today’s episode is sponsored by Clubhouse.

[00:05:40] Clubhouse is the first project management platform for software development that brings

[00:05:44] everyone on every team together to build better products.

[00:05:49] Clubhouse provides the perfect balance of simplicity and structure for better cross-functional

[00:05:54] collaboration.

[00:05:55] Its intuitive interface makes it easy for people on any team to focus in on their

[00:06:00] work on a specific task or project, while also being able to zoom out and see how that

[00:06:06] work is contributing towards the bigger picture.

[00:06:09] If you’re like me, as a developer, one of the first questions that you ask is whether

[00:06:13] a given service has an API.

[00:06:16] This is because I like to create my own little integrations and tools and utilities, and

[00:06:21] sometimes I like to add a display to my, for example, my terminal.

[00:06:26] When I open a new tab, maybe I have a to-do list that prints out in my terminal.

[00:06:32] The only way that that’s possible, at least in a sustainable way, is if there’s an API.

[00:06:38] Clubhouse has a simple API and robust set of integrations.

[00:06:43] Clubhouse seamlessly integrates with all the tools that you already use, getting out of

[00:06:46] your way so that you can deliver quality software on time.

[00:06:51] Listeners of Developer Tea get two months of Clubhouse by heading over to clubhouse.io

[00:06:57] slash developertea. That’s all one word, clubhouse.io slash developertea. Thanks again to Clubhouse

[00:07:04] for sponsoring today’s episode of Developer Tea.

[00:07:08] So how does intellectual dishonesty work?

[00:07:13] What exactly is it?

[00:07:16] Intellectual dishonesty is kind of a loose term.

[00:07:19] There’s not any exact definition.

[00:07:21] The idea here is that you’re lying in some way that isn’t directly detectable.

[00:07:30] You’re deceiving someone perhaps using rhetoric or somehow you’re dodging questions or you’re

[00:07:37] making yourself out to be smart in some particular way.

[00:07:43] Maybe you’re using a logical fallacy that makes your point seem more applicable.

[00:07:51] This happens in all kinds of ways as developers and often this is the result not just of us

[00:07:57] trying to get the upper hand in a conversation, which itself is a pretty natural thing to

[00:08:02] do, but it also happens when we convince ourselves of things that are not necessarily true.

[00:08:10] For example, many of us have best practices that are ingrained in our heads.

[00:08:16] If we traced where those best practices come from and why we believe them, if we were totally

[00:08:22] honest most of us would end up either saying we don’t know why we believe that particular

[00:08:29] thing or because someone who had some level of authority in our lives told us that it

[00:08:37] was true.

[00:08:39] That can come from a book, it can come from a professor, but often we are not developing

[00:08:45] our beliefs based on direct evidence, based on direct experience.

[00:08:52] Usually we develop our beliefs second, third, fifth hand.

[00:08:57] We do similar things when trying to, for example, make estimates about the work that we do.

[00:09:03] Humans are not evidence driven, at least naturally, when we estimate things.

[00:09:09] We’re pretty bad at estimation, we’ve talked about this countless times on the show.

[00:09:13] These beliefs that we have often cause us to be intellectually dishonest in ways that

[00:09:20] protect us.

[00:09:22] These are our automatic ways of thinking, our automatic ways of behaving, especially

[00:09:28] in some kind of social context.

[00:09:31] And when I say social here, I’m including work as a social context, any context where

[00:09:37] we have to cooperate with other people.

[00:09:40] This is sort of a prisoner’s dilemma as well.

[00:09:42] If you have a group of friends, a group of coworkers, whoever is the first to be intellectually

[00:09:49] honest about the things they don’t know or the source of their beliefs, the person who

[00:09:55] is seeking that intellectual honesty runs a risk.

[00:10:00] They run the risk of the other people in their group continuing their facades, continuing

[00:10:07] their intellectual dishonesty.

[00:10:10] And that one person who seeks that intellectual honesty may be ostracized.

[00:10:16] Perhaps their beliefs are not necessarily well founded and so their ideas are, even

[00:10:23] if that person is not totally separated from the group, their ideas are not going to be

[00:10:28] as respected necessarily as the supposedly well intentioned or well thought out ideas

[00:10:34] of the other members of the group.

[00:10:37] And so the prisoner’s dilemma applies because really for a pursuit of intellectual honesty,

[00:10:45] it seems that everyone in the group has to pursue intellectual honesty together.

[00:10:51] This is one of the major reasons why leaders of groups, managers or founders of companies,

[00:11:00] they have a big responsibility when it comes to culture because the way that we form and

[00:11:06] share beliefs is so fundamental to the work that we do.

[00:11:12] If we can’t be honest about the source of our beliefs, if we can’t seek greater clarity,

[00:11:20] if we can’t pursue self-awareness, then it’s going to be difficult to improve together.

[00:11:26] It’ll be difficult to see what other people’s weaknesses are.

[00:11:30] It’ll be difficult to rely on others to help us find our own weaknesses.

[00:11:37] I encourage you, today and this week and as you move forward in your career, to

[00:11:44] turn up your sensitivity level for yourself.

[00:11:50] Try to identify moments where you are being intellectually dishonest; this often comes on

[00:11:57] the heels of some kind of fear, some bit of anxiety.

[00:12:03] And the way that you deal with that anxiety is through some intellectual dishonesty.

[00:12:09] For example, imagine that you are in your stand-up meeting or whatever your check-in

[00:12:14] is with the person.

[00:12:17] Maybe it’s a co-worker or maybe it’s a product manager that you work with.

[00:12:21] Maybe it’s a manager, whoever it is, and they ask you how things went last week and you

[00:12:27] know that things didn’t go so well.

[00:12:30] Maybe you moved slower than you expected to.

[00:12:34] Maybe there was some kind of technical hang-up, but you feel wrong or somehow afraid to tell

[00:12:41] them the truth.

[00:12:43] This is a moment where you have an opportunity to grow and to learn how to be intellectually

[00:12:48] honest.

[00:12:50] It’s important to preface your answer because we’re all conditioned to hear these intellectually

[00:12:57] dishonest versions that are wrapped in some kind of explanation that makes sense to us.

[00:13:04] But if you’re intellectually honest, sometimes that can sound abrasive.

[00:13:08] So for example, saying, I’m not really sure what happened last week, but we didn’t get

[00:13:13] as much done as we wanted to get done.

[00:13:15] This is a very common scenario for developers to not really have great information about

[00:13:22] why you didn’t proceed as planned, but we don’t often say it in those terms.

[00:13:29] So if someone hears you say it in those terms, it might catch them off guard.

[00:13:34] It may be a little bit alarming.

[00:13:36] So preface it by saying, I want to be totally clear and honest about this so that we can

[00:13:41] improve.

[00:13:43] Not because this is the new status quo, but instead because I can recognize the weaknesses

[00:13:51] in our own execution, our own behaviors.

[00:13:56] Once you set this precedent and you continue to operate from that place of pure honesty,

[00:14:03] the sticker shock, the feeling that, wow, that was unexpected.

[00:14:08] That feeling hopefully will fade and other people hopefully will catch on to the fact

[00:14:14] that you indeed are improving.

[00:14:19] Thank you so much for listening to today’s episode of Developer Tea. I encourage you to

[00:14:23] again, turn up that knob, that sensitivity to your own intellectual dishonesty, the moments

[00:14:29] where you feel like you’re making something up to get by and to take the time to find

[00:14:36] out how you can be more intellectually honest.

[00:14:42] Thank you so much for listening.

[00:14:43] Thank you again to Clubhouse for sponsoring today’s episode of Developer Tea. Head over

[00:14:47] to clubhouse.io slash developertea to get two months for free on Clubhouse.

[00:14:53] Developer T is a part of the Spec Network.

[00:14:55] The Spec Network is for designers and developers looking to level up.

[00:14:59] There are other podcasts and other content on spec.fm.

[00:15:04] Go and check it out.

[00:15:05] Thank you again to today’s producer and editor, Sarah Jackson.

[00:15:09] I’m Jonathan Cutrell, and until next time, enjoy your tea.