How We Construct Software - Part Three (Decision Variance)
Summary
This episode continues the series on software construction by examining why different developers approach the same problems in varied ways. The host explains that even with similar mental models, developers will produce different solutions because problems are often articulated broadly, leaving room for multiple interpretations. Using the analogy of planting a tree, he illustrates how a simple instruction can lead to numerous questions about implementation details, scope, and methodology.
The discussion then explores specific psychological factors that create variance in decision-making. First, developers naturally evolve their approaches as they gain experience and learn new skills. Second, recent conversations with respected colleagues can temporarily shift opinions and practices, creating volatility in decision patterns. Third, confirmation bias makes it difficult to change publicly stated beliefs even when presented with contradictory evidence.
A particularly relevant phenomenon for developers is the possibility effect - the tendency to treat possible outcomes (no matter how improbable) as significantly different from impossible outcomes. This manifests in software development through excessive optimization, where developers pursue performance improvements simply because they’re possible, without considering whether they’re necessary or beneficial to the overall system.
The episode emphasizes that understanding these psychological factors is crucial for effective software development. By recognizing how biases and effects like confirmation bias and the possibility effect influence decisions, developers can make more deliberate choices about where to focus their optimization efforts and how to approach problem-solving more objectively.
Throughout the discussion, the host encourages developers to research these psychological phenomena further, as developing awareness of decision-making patterns can improve collaboration, project timelines, and overall problem-solving effectiveness in software development careers.
Recommendations
Concepts
- Mental models — Psychological frameworks that shape how developers view software and the world, influencing their problem-solving approaches and decision-making processes.
- Confirmation bias — A well-studied psychological phenomenon where people seek evidence supporting existing beliefs while rejecting contradictory information, particularly problematic when developers have publicly committed to specific approaches.
- Possibility effect — A psychological effect where people treat possible outcomes as significantly different from impossible ones regardless of probability, often leading developers to pursue unnecessary optimizations simply because they’re possible.
Tools
- Sentry — A tool for catching bugs before users see them, providing immediate alerts through channels like Slack, full stack traces, and identification of the responsible code commit. The host recommends it as part of a multi-angle approach to bug management.
Websites
- teabreakchallenge.com — A website offering daily soft skills exercises delivered via email, designed to help developers level up their career skills through regular practice.
- spec.fm — The Spec Network website featuring shows created for designers and developers looking to advance their careers through professional development content.
Topic Timeline
- 00:00:00 — Introduction to software construction and decision variance — The host introduces the continuation of the series on software construction, focusing on how developers make day-to-day design decisions. He explains that even when given the same problem, different developers will produce varied solutions due to the broad nature of problem statements and individual interpretation. This sets up the episode’s exploration of why decision variance occurs in software development.
- 00:03:02 — The tree planting analogy for problem interpretation — Using the analogy of being told to plant a tree, the host illustrates how simple instructions lead to numerous implementation questions. Questions about what type of tree, planting methodology, soil requirements, and success criteria demonstrate how even straightforward problems contain hidden complexity. This shows why different developers naturally arrive at different solutions when faced with the same broad problem statement.
- 00:07:06 — Three sources of decision variance in developers — The host outlines three key reasons why decisions vary even for the same person over time. First, continuous learning and skill development naturally improve decision-making. Second, recent conversations with respected colleagues can create temporary shifts in opinions and practices. Third, confirmation bias makes it difficult to change publicly stated beliefs even when evidence contradicts them.
- 00:11:27 — The possibility effect and optimization in software — The discussion focuses on the possibility effect - treating possible outcomes as significantly different from impossible ones regardless of probability. This manifests in software development through excessive optimization, where developers pursue performance improvements simply because they’re possible. The host warns against treating optimization as a moral imperative and emphasizes considering whether optimizations are actually necessary or beneficial.
Episode Info
- Podcast: Developer Tea
- Author: Jonathan Cutrell
- Category: Technology, Business, Careers, Society & Culture
- Published: 2019-02-13T10:00:00Z
- Duration: 00:16:11
References
- URL PocketCasts: https://pocketcasts.com/podcast/developer-tea/cbe9b6c0-7da4-0132-e6ef-5f4c86fd3263/how-we-construct-software-part-three-decision-variance/aa88690b-e7ac-4e39-b6e2-5fd4c947e06b
- Episode UUID: aa88690b-e7ac-4e39-b6e2-5fd4c947e06b
Podcast Info
- Name: Developer Tea
- Type: episodic
- Site: http://www.developertea.com
- UUID: cbe9b6c0-7da4-0132-e6ef-5f4c86fd3263
Transcript
[00:00:00] In the last few episodes, we’ve been discussing the construction of software, how it happens
[00:00:07] not in its final forms in code, but rather well before that, how we develop beliefs and
[00:00:17] models for the world, and how we derive so much of our action by asking implicit questions
[00:00:25] or answering explicit questions, and even how we trick ourselves by answering questions
[00:00:33] that aren’t being asked.
[00:00:35] In today’s episode, we’re going to continue this discussion on how we develop software,
[00:00:40] and we’re going to dive a little bit further down towards the actual writing of the software
[00:00:46] in today’s episode, and specifically the design of the software itself.
[00:00:50] On a day-to-day basis, how are we choosing how we are going
[00:00:55] to accomplish whatever we need to accomplish in the software that we’re writing?
[00:01:01] That’s what we’re talking about in today’s episode.
[00:01:03] My name is Jonathan Cutrell, and you’re listening to Developer Tea.
[00:01:06] This has been a series on the construction of software.
[00:01:10] My goal on this show is to help driven developers like you connect to your career purpose and
[00:01:14] help you do better work so you can have a positive influence on the people around you.
[00:01:19] We like to believe that if you were to hand the same software problem to three developers,
[00:01:25] at a given company, that there wouldn’t be a lot of variance between how those three
[00:01:32] developers solve that problem.
[00:01:34] Now, this may be more true if the problem is extremely narrow-scoped, and if the problem
[00:01:42] isn’t actually framed as a broad problem, but instead as a list of specific kind of
[00:01:49] leading problems, the kind that result in a specific set of features.
[00:01:54] But this is very…
[00:01:55] unlikely to take place.
[00:01:58] The truth is, given sufficiently complex problems, individuals are going to solve them in complexly
[00:02:05] different ways.
[00:02:07] And part of the reason for this, and perhaps part of the reason for everything we’re talking
[00:02:12] about in this series, is the models that we discussed in the last episode.
[00:02:18] If you didn’t listen to that episode, I encourage you to go back and listen to that, but also
[00:02:23] do a little bit of research on mental models.
[00:02:25] Mental models and beliefs, and how we form our beliefs.
[00:02:29] So that’s kind of the backdrop for all of these discussions.
[00:02:32] But it should be noted that if you zoom in a little bit, and even if we had similar models
[00:02:38] and beliefs in the way that we view the software, in the way that we view the world, we may
[00:02:45] come out with different answers to the same problem.
[00:02:48] This is often because the problem is articulated in a broad way, and the specifics of that
[00:02:55] problem can be expressed in very different ways.
[00:03:00] You can think about it kind of like this.
[00:03:02] A problem that you receive for a given software project is kind of like being told to go and
[00:03:10] plant a tree.
[00:03:12] How do you plant a tree exactly?
[00:03:16] And what tree do you choose?
[00:03:18] How long should you be spending planting the tree?
[00:03:22] Can you plant multiple trees so that,
[00:03:25] when one doesn’t grow properly, you have a fallback tree?
[00:03:29] What kind of soil should you be using?
[00:03:33] And are you actually just cultivating what nature is already doing?
[00:03:37] Because nature also plants trees.
[00:03:40] At what stage are you planting the tree?
[00:03:43] Is it at the stage of the seed, or the seedling, or perhaps even bigger?
[00:03:48] These are just some of the questions for a fairly simple problem like planting a tree.
[00:03:54] And the answers to these questions could lead you down very different paths.
[00:04:00] Now imagine taking a much more complex problem set and walking down the line of reasoning
[00:04:07] for all of the features that would express the solution to that problem.
[00:04:13] Of course, going from one expression of a solution to another is going to result in
[00:04:18] variance.
[00:04:19] This happens even when the same person is solving the same question.
[00:04:24] We’re going to take a quick sponsor break, and then we’re going to come back and talk
[00:04:31] about some of the specific ways that create variance in the way that we make decisions,
[00:04:36] even if it’s the same person making the same kind of decision from one day to the next.
[00:04:52] Today’s episode is sponsored by…
[00:04:54] Sentry.
[00:04:56] Your code is broken, and Sentry is going to help you fix it.
[00:05:00] Relying on customers to report errors that are in your code, this is kind of treating
[00:05:06] customers like an off-site QA team, and you’re not even paying them for that.
[00:05:11] In fact, they’re most likely to leave your product altogether without ever reporting
[00:05:17] any of those problems in the first place.
[00:05:19] Ideally, we could solve this ahead of time.
[00:05:22] We could solve it with great testing.
[00:05:24] We could solve it with a really good QA process.
[00:05:27] But there’s no way we’re going to cover every scenario.
[00:05:31] Our tests are not going to be complete, and we’re not even going to be able to, for example,
[00:05:35] simulate the right kinds of load on our application.
[00:05:40] We can’t simulate the real thing entirely.
[00:05:44] And we can’t predict all the things that people are going to do with our application either.
[00:05:48] And until we can predict the future, responding to real events is one of the best ways that
[00:05:53] we can predict the future.
[00:05:54] So, what are some of the best strategies for dealing with bugs?
[00:05:57] You shouldn’t just have one weapon in your arsenal in the fight against these bugs.
[00:06:03] You have to approach it from many different angles.
[00:06:05] And Sentry provides you with an excellent angle to approach it from.
[00:06:10] Sentry helps you catch bugs before your users see them.
[00:06:14] You’ll get immediate alerts in whatever alert channels that you’re already using.
[00:06:18] Like for example, Slack or push notifications.
[00:06:21] And you can also get more information
[00:06:24] about that error.
[00:06:25] For example, the full stack trace and the commit to the code that is responsible for
[00:06:31] that error so you can fix it quickly.
[00:06:34] Go get started at sentry.io.
[00:06:36] Thank you again to Sentry for sponsoring today’s episode of Developer Tea.
[00:06:54] So, we’re going to talk about some of the ways that our decisions have variance.
[00:07:06] Even a given person’s decisions are going to change from day to day.
[00:07:11] Now one of the most obvious reasons for this is because we’re constantly experiencing new
[00:07:16] things and so we’re constantly learning.
[00:07:20] We adjust and we grow and we have new skills.
[00:07:23] And so what we would have done a year ago, hopefully we’re going to do something better
[00:07:29] now.
[00:07:30] This kind of upward slope in terms of your skill level is something that the industry
[00:07:35] is fairly respectful of.
[00:07:38] We have jobs that follow that upward slope and as you gain experience, you’re kind of
[00:07:45] expected to also gain extra skills.
[00:07:49] So this is a relatively predictable direction
[00:07:53] in how you may make decisions.
[00:07:56] Hopefully you make better decisions as you gain experience.
[00:07:59] But there are other kinds of variance that are less predictable and sometimes less desirable.
[00:08:06] For example, if you have recently had a conversation where someone you respected claimed that a
[00:08:13] particular industry practice was bad, regardless of even their reasoning, if you respect this
[00:08:22] person and hold them in relatively high regard,
[00:08:25] then this opinion that they hold is going to have an effect on you.
[00:08:30] And it’s going to have an effect on you, especially in close proximity to that discussion.
[00:08:36] And so when you return to whatever the work is that you’re doing after that kind of discussion,
[00:08:41] you are likely to break away from what you may have even had an affinity for previously.
[00:08:49] You’re likely to push against that specific
[00:08:52] practice. So in that scenario, you have a lot of volatility, right? You can have
[00:09:00] swinging opinions that change based on conversations that you have. Another more
[00:09:07] long-term or closer to permanent effect that we can observe both as developers and just as humans
[00:09:16] is the confirmation bias. And there’s a lot of other effects and biases. This is a very well
[00:09:22] studied phenomenon in psychology. But the basic idea is that if you already have a belief,
[00:09:30] and especially if you have reinforced that belief, and if you have committed to that belief
[00:09:36] in a somewhat public setting, for example, amongst your co-workers, it is very difficult
[00:09:44] for you to change that belief and then act on that change and go back on those public commitments.
[00:09:52] So if you, for example, believed very strongly in one particular direction or a paradigm for
[00:10:03] solving a given problem, and then you get new information, maybe the problem shifts a little
[00:10:09] bit, maybe you didn’t have all the information up front, or maybe your perspective shifts a
[00:10:15] little bit, right? And you have a new way of thinking about the problem. And your old belief
[00:10:22] is less in line with what the evidence is showing you. You are likely to do two things. One,
[00:10:30] reject that evidence and hold on to that old belief, even though you can cognitively separate
[00:10:36] that it’s probably not the best belief to hang on to. And another thing that you’ll likely do
[00:10:42] is seek out people or evidence that support your previously held belief. And this can obviously
[00:10:50] have major impacts.
[00:10:52] It can have major impacts on software development timelines. It can have major impacts on how well
[00:10:58] you and your team work together. And of course, it’s going to have major impacts on your ability
[00:11:03] to actually solve the problems that are in front of you to solve. The third example,
[00:11:09] and there are plenty more. So it’s very important that you continuously try to learn more about
[00:11:16] these kinds of things, about how we make decisions as developers. But the third phenomenon that
[00:11:22] I want to discuss is called the possibility effect. And we’re talking about some things
[00:11:27] related to the possibility effect. But the basic idea of the possibility effect is that if
[00:11:34] something is possible, even if it is extremely improbable, we still see the difference between
[00:11:43] impossible and possible, no matter how unlikely, as a major difference. The possibility effect is
[00:11:52] relevant for a number of reasons. Specifically, I want to point out one example, and that’s
[00:11:59] optimization. As developers, we are very drawn to the concept of optimizing our code. And this isn’t
[00:12:07] necessarily a bad thing. We are told to learn about how algorithms perform, for example. We’re
[00:12:16] told to understand how to create an adequately optimum program.
[00:12:22] And this happens in a bunch of different stages. But very often, developers apply this concept
[00:12:30] of optimization almost as if it’s a moral rule. And so what we end up with is a number of developers
[00:12:39] working on optimizing code that either doesn’t need to be optimized, or they’re optimizing the
[00:12:46] wrong part of the code, and could stand to gain much better optimization
[00:12:52] in other places of the code. And the reason this is related to the possibility effect,
[00:12:59] there’s a couple of reasons here. One, if a developer sees that there is a route to optimization,
[00:13:06] very often developers are tempted to take that route, even if there are better ways they could
[00:13:11] be spending their time, or if it compromises the integrity of the readability of that
[00:13:16] same piece of code. The other reason this is related to the possibility effect,
[00:13:22] is that developers are often looking at numbers to determine the success of their optimizations.
[00:13:30] So if we go from, let’s say 20 milliseconds to 11 milliseconds, then this seems like a major jump
[00:13:40] in performance. The problem is that we are zoomed in on these numbers, and we’re only looking at the
[00:13:47] numbers within their own context. We need to understand the reasoning behind our
[00:13:50] optimization, and we need to ask, in the context of the whole system,
[00:13:52] what is the optimum number for this piece of code to run at.
[00:13:58] Rather than just saying we know we can make it faster
[00:14:01] and faster is unilaterally good,
[00:14:04] instead we should understand
[00:14:06] is this code that’s going to run once, for example,
[00:14:10] and optimizing it any further
[00:14:12] is a waste of energy and resources.
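The arithmetic behind this point can be sketched in a few lines: the value of an optimization depends on how often the code actually runs, not just on the per-call numbers. This is an illustrative sketch, not from the episode; the function name and call counts are assumptions, echoing the 20 ms to 11 ms example from the discussion.

```python
# Rough sketch: weigh a per-call speedup by how often the code actually runs.
# All numbers are illustrative, echoing the episode's 20 ms -> 11 ms example.

def total_saving_ms(before_ms: float, after_ms: float, calls: int) -> float:
    """Total time saved across all executions, in milliseconds."""
    return (before_ms - after_ms) * calls

# A hot path called a million times: the same 9 ms speedup
# saves 9,000,000 ms (~2.5 hours) in total.
hot = total_saving_ms(20.0, 11.0, 1_000_000)

# A startup script that runs once: the speedup saves 9 ms, ever.
once = total_saving_ms(20.0, 11.0, 1)

print(hot, once)
```

Seen this way, the same 20 ms to 11 ms improvement is either a major win or a rounding error; the numbers only mean something in the context of the whole system.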
[00:14:14] There are a variety of biases
[00:14:17] dealing with calculations and numbers
[00:14:20] that are worth looking at.
[00:14:23] And it’s not just biases,
[00:14:24] it’s also these kinds of psychological effects and phenomena
[00:14:27] that cause us to see numbers in distorted ways.
[00:14:31] I encourage you to go and Google about this,
[00:14:34] read a little bit about it,
[00:14:35] because you’re going to run into many situations
[00:14:38] where you’re dealing with numbers as a developer
[00:14:40] and getting a handle on how to see those numbers more clearly
[00:14:45] is going to help you in the long run.
[00:14:48] Thank you so much for listening
[00:14:49] to today’s episode.
[00:14:50] of Developer Tea.
[00:14:52] I hope you’re enjoying this series
[00:14:53] on how we construct software,
[00:14:55] kind of the mental processes and the models
[00:14:57] and the questions and decisions
[00:14:59] that we have on our plates as developers.
[00:15:03] I encourage you to continue doing some more digging
[00:15:06] on the topics that we bring up in these episodes.
[00:15:10] These are some of my favorite topics
[00:15:11] that we talk about on the show
[00:15:13] and I know that they are going to be valuable to you
[00:15:16] in your career as well.
[00:15:18] Thank you again to today’s sponsor,
[00:15:20] Sentry, to get started finding bugs
[00:15:22] before your users see them.
[00:15:24] Head over to sentry.io to sign up today.
[00:15:27] If you haven’t signed up for the Tea Break Challenge,
[00:15:30] I encourage you to head over to teabreakchallenge.com
[00:15:33] and sign up today.
[00:15:35] The Tea Break Challenge is daily soft skills exercises
[00:15:39] that get delivered to your email.
[00:15:41] Go and check it out, teabreakchallenge.com.
[00:15:44] If you haven’t seen the other shows on spec.fm,
[00:15:48] the Spec Network was
[00:15:50] created for designers and developers like you
[00:15:52] who are looking to level up in your career.
[00:15:55] Go and check it out, spec.fm.
[00:15:56] Thank you so much for listening
[00:15:57] and until next time, enjoy your tea.
[00:16:00] Thank you.