Great Reviews and Terrible Tacos - Sharpening Substitute Questions with Counterfactuals
Summary
The episode explores the concept of substitute questions—the mental shortcuts we use to replace complex, cognitively taxing questions with easier-to-answer proxies. The host explains that we often do this unconsciously, such as substituting “What is a good restaurant?” with “What restaurant has good reviews?” or using past career enjoyment to predict future career desires. While these heuristics are efficient, they can lead us astray if the substitute question isn’t well-correlated with the original concern.
To evaluate and improve our substitute questions, the host introduces counterfactual thinking as a powerful tool. This involves asking “What else could be true?” to generate alternative explanations. For example, a restaurant with many positive reviews might be using review farming, not necessarily serving good food. This technique helps us assess the cohesion—or correlation—between our heuristic and the core question we’re trying to answer.
The discussion applies counterfactual thinking to practical scenarios like hiring. Common interview criteria, such as performance on LeetCode problems, are often poor substitute questions for predicting job success. A counterfactual might be that a candidate who performs poorly hasn’t practiced LeetCode recently because they’ve been doing actual engineering work. The host argues that by using counterfactuals, we can identify higher-signal, higher-cohesion criteria for our evaluations.
The episode also covers how to use thought experiments as a form of counterfactual thinking for more abstract questions, like career goals. By posing unrealistic scenarios (e.g., “Would you want a promotion without a pay raise?”), we can uncover our true underlying desires and refine our goals. The host emphasizes that the goal isn’t to find a perfect percentage likelihood for counterfactuals, but to recognize when an alternative explanation is plausible enough to warrant reconsidering our heuristic.
Finally, the host connects counterfactual thinking to broader applications like debugging and avoiding confirmation bias. By routinely asking “What else might be true?” and considering the likelihood of those alternatives, we can make better decisions, improve our processes, and avoid the pitfalls of biased logic. The episode concludes by encouraging listeners to identify their own daily substitute questions and use counterfactuals to sharpen their thinking.
Topic Timeline
- 00:00:00 — Introduction to substitute questions as cognitive heuristics — The host introduces the concept of substitute questions, where we replace difficult questions with easier proxies. Examples include using restaurant reviews to judge quality or using past career enjoyment to predict future desires. These substitutions happen consciously and unconsciously to reduce cognitive load.
- 00:05:41 — The problem with heuristics and introducing counterfactual thinking — The host explains that heuristics can fail when the substitute question isn’t well-correlated with the original question. He introduces counterfactual thinking as a tool to check these heuristics. The core task is to determine the cohesion between the substitute and the target question.
- 00:07:47 — Applying counterfactuals to restaurant reviews and product ratings — Using the restaurant review example, the host demonstrates counterfactual thinking by asking “What else could be true?” Alternative explanations include review farming or incentives for positive reviews. Platforms use verified buyer badges to increase cohesion by reducing these counterfactuals.
- 00:09:30 — Using thought experiments for abstract career questions — For complex questions like career goals, the host suggests using thought experiments instead of direct counterfactuals. Examples include asking if you’d want a promotion without a pay raise or if you’d accept a promotion requiring 60-hour weeks. These experiments help refine true desires, such as wanting higher pay per hour rather than just a title.
- 00:12:24 — Counterfactuals in hiring and interview criteria — The host applies counterfactual thinking to hiring, where criteria like solving a coding problem or asking good questions are substitutes for predicting job performance. A key example: a candidate failing a LeetCode interview might simply be out of practice, not unskilled. Studies show LeetCode performance poorly predicts job success, making it a low-cohesion substitute question.
- 00:17:38 — Evaluating likelihood and the role of reasonable doubt — After generating counterfactuals, the next step is to assess their likelihood. The host compares this to the justice system’s “beyond a reasonable doubt” standard. The point isn’t to set a specific percentage threshold but to recognize when an alternative explanation is plausible enough to challenge the initial heuristic, especially in hiring decisions.
- 00:21:25 — Broader applications in debugging and avoiding bias — The host extends counterfactual thinking to debugging, where confirmation bias can lead us down wrong paths. By asking “What else might be true?” about a bug’s cause, we can avoid substituting confidence for proper investigation. This kind of thinking can improve decision-making across various domains.
Episode Info
- Podcast: Developer Tea
- Author: Jonathan Cutrell
- Category: Technology, Business, Careers, Society & Culture
- Published: 2025-06-18T07:00:00Z
- Duration: 00:23:28
References
- URL PocketCasts: https://pocketcasts.com/podcast/developer-tea/cbe9b6c0-7da4-0132-e6ef-5f4c86fd3263/great-reviews-and-terrible-tacos-sharpening-substitute-questions-with-counterfactuals/6cbdf8d3-edb9-4a8b-a297-0f295ac035f5
- Episode UUID: 6cbdf8d3-edb9-4a8b-a297-0f295ac035f5
Podcast Info
- Name: Developer Tea
- Type: episodic
- Site: http://www.developertea.com
- UUID: cbe9b6c0-7da4-0132-e6ef-5f4c86fd3263
Transcript
[00:00:00] we’ve talked about substitute questions on the show before the idea of a substitute question
[00:00:16] is that you’ll take a cognitively taxing question and replace it without your own
[00:00:25] realization of this replacement you’ll replace it with an easier to answer question this question
[00:00:32] operates as a heuristic a way of answering something close to something approximating
[00:00:41] or pointing towards the original concern so for example you might ask the question
[00:00:49] what is a good restaurant in my area and then you’ll
[00:00:55] substitute the question what restaurant has good reviews in my area now for many people
[00:01:03] these are synonymous questions they believe that a restaurant that has good reviews
[00:01:12] is entirely going to be representative of a restaurant that is good the difficult part
[00:01:22] is that defining what a good restaurant is
[00:01:25] is cognitively taxing that is it’s very hard to take in all of the variables that might be
[00:01:32] necessary to define good for a given person very often we will substitute questions by
[00:01:42] looking at historical answers to the same question rather than future casting answers
[00:01:48] in other words what do you want out of your career may be answered
[00:01:55] in multiple ways for example what have you enjoyed in your career previously
[00:02:00] or another substitute question might be what have i imagined my career might look like
[00:02:09] this imagination this visioning of your career very often turns into what you expect from your
[00:02:17] career this is true in a lot of our life experiences we begin to desire or
[00:02:25] expect the things that we imagine are most likely to happen there’s a lot of reasons for this
[00:02:31] our desire for stability for example additionally the pain that we experience when something
[00:02:41] unexpected occurs amazingly sometimes this pain is felt even when the unexpected thing
[00:02:49] is a positive thing something that we otherwise may be able to on many
[00:02:55] subjective measures say was a good occurrence these substitutions happen all the time and
[00:03:02] sometimes we do them consciously as well for example we substitute a very difficult or perhaps
[00:03:10] impossible to answer question like will i enjoy this car if i purchase it now will i enjoy it how
[00:03:18] long might i enjoy it another example of this might be will this person this candidate that
[00:03:25] i’m considering hiring will they do well in their role these are questions about the future questions
[00:03:31] about uncertainty and so what we do instead of trying to answer these impossible to answer
[00:03:37] questions is we break them down into various criteria that we hope correlates to an answer
[00:03:45] we try to imagine what kind of thing will i appreciate about a car in the future and does
[00:03:54] this car match that
[00:03:55] what kinds of things predict whether a candidate will be successful all these criteria that we are
[00:04:03] using are various types of substitute questions or a substitute operation multiple questions that
[00:04:12] we’re substituting to try to approximate a belief or an assertion about another question
[00:04:21] so you could kind of formulate
[00:04:25] the substitution as if you had a question x you’re going to instead answer question y
[00:04:35] because it’s much easier to answer and you’re going to say because of answer y i believe the
[00:04:44] answer to x is something right i believe that because there are a lot of reviews positive
[00:04:52] reviews for this restaurant
[00:04:55] right so that’s the substitute question question y
[00:05:00] since there are a bunch of reviews positive reviews then my belief about question x which
[00:05:07] i’m not going to try to answer directly but instead i’m going to use the information from
[00:05:11] question y my belief is that it is a good restaurant right uh my answers to all of the
[00:05:21] various criteria for this candidate are
[00:05:25] xyz right and therefore because of the criteria question answers my belief about their potential
[00:05:35] is that they will do well now of course there are plenty of ways that this can go wrong
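The x-and-y pattern laid out here lends itself to a tiny illustration. The sketch below is not from the episode; the function names, the rating data, and the 4.0-star and 50-review thresholds are all invented, purely to make the shape of the substitution concrete: we never answer x directly, we answer y and project a belief onto x.

```python
# Hypothetical model of a substitute question: we cannot answer
# x ("is this a good restaurant?") directly, so we answer y
# ("does it have good reviews?") and project that answer onto x.

def answer_substitute_y(avg_rating: float, review_count: int) -> bool:
    """Question y: does this restaurant have good reviews?"""
    # Arbitrary illustrative thresholds, not from the episode.
    return avg_rating >= 4.0 and review_count >= 50

def believe_x_from_y(avg_rating: float, review_count: int) -> str:
    """Form a belief about question x using only the answer to y."""
    if answer_substitute_y(avg_rating, review_count):
        return "I believe this is a good restaurant"
    return "I have no strong belief about this restaurant"

print(believe_x_from_y(4.6, 230))
```

The failure mode the host turns to next is exactly that this projection can be miscalibrated: `answer_substitute_y` can return `True` for reasons that have nothing to do with the food.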
[00:05:41] there are plenty of reasons why our heuristics are not always tuned perfectly and i’m going to
[00:05:50] give you a tool a very simple tool that you can use to try to check your
[00:05:55] own heuristics try to check your own substitute questions uh those substitutions to determine
[00:06:02] whether your heuristics are actually meaningful or not right so in effect what you want to do
[00:06:11] is determine how well correlated your substitution is with the thing you’re trying to substitute
[00:06:19] that is the fundamental task at hand you’re trying to determine if your substitute question
[00:06:25] has a high cohesion to the question that you really care about so let’s take our restaurant
[00:06:31] review example this restaurant has a bunch of good reviews and therefore i believe it is a good
[00:06:39] restaurant what i want you to do is ask the question is that necessarily true right so that’s
[00:06:49] going to give you a very clear cohesion if it is a 100 percent
[00:06:55] cohesion right so then really your substitute question isn’t really a substitute question it’s
[00:07:01] more like an evidence question or a measurement question it has a higher cohesion rate because
[00:07:06] it’s not really uh you know asking a different question it’s just asking the same question
[00:07:13] a different way okay this is kind of what our brains are tricking us
[00:07:18] into believing we are doing with all of these substitute questions but uh if if there is
[00:07:25] a possibility that that is not necessarily true right that uh a a restaurant that has a bunch of
[00:07:33] good ratings may not necessarily be a very good restaurant the way you arrive at this
[00:07:40] is called counterfactual thinking all right so this is a technique there’s a bunch of ways you
[00:07:47] could do this but i’m going to talk about counterfactual thinking in today’s episode
[00:07:52] and the basic idea of counterfactual thinking is that you’re going to ask the question
[00:07:55] what else could be true so instead of trying to find why this thing might be false
[00:08:02] you are instead providing an alternative explanation what else could be true what
[00:08:09] is another good explanation for why there may be a bunch of good reviews perhaps the restaurant
[00:08:16] rewards people for leaving a good review maybe they have hired a bunch of review farming
[00:08:25] uh you know a bunch of people to leave reviews um that never even ate at the restaurant there’s
[00:08:31] a bunch of possible counterfactuals um and as it turns out there are plenty of opportunities
[00:08:39] for those sites for example that host those reviews to try to reduce those counterfactuals
[00:08:45] if you’ve ever seen the kind of like verified buyer uh reviews this requires that the person
[00:08:52] who’s leaving the review has actually bought the item so review farmers are not going to be able to do that
[00:08:55] right so uh that’s trying to cut down on some of the counterfactuals to increase the cohesion
[00:09:02] between that uh substitute question is this product a good product with uh the the substitute
[00:09:12] question for that you know core question is how many positive reviews or what are the star ratings
[00:09:18] you know on amazon or whatever right so increasing that cohesion rate we’re going to
[00:09:23] address some of the counterfactuals
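As a hypothetical sketch of the verified-buyer mechanism just described: excluding unverified reviews before computing the rating removes the review-farming counterfactual from the data, which is what raises the cohesion between the star rating and the "is this good" question. The data shape and the numbers here are invented for illustration.

```python
# Sketch: a "verified buyer" badge raises cohesion by excluding
# unverified reviews (the review-farming counterfactual) before
# the rating is computed. Data shape is invented.

reviews = [
    {"stars": 5, "verified": True},
    {"stars": 5, "verified": False},  # could be a farmed review
    {"stars": 2, "verified": True},
    {"stars": 5, "verified": False},  # could be a farmed review
]

def average_rating(reviews, verified_only=False):
    # Keep every review, or only the ones a real buyer could leave.
    kept = [r for r in reviews if r["verified"] or not verified_only]
    return sum(r["stars"] for r in kept) / len(kept)

print(average_rating(reviews))                      # all reviews: 4.25
print(average_rating(reviews, verified_only=True))  # verified only: 3.5
```

The two numbers disagreeing is the whole point: the unfiltered average answers a lower-cohesion substitute question than the filtered one does.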
[00:09:25] what do you want in your career this one’s a little bit more complicated right it’s a little
[00:09:30] bit harder to come up with counterfactuals here because you may say well i want to continue
[00:09:35] making money i want to continue being better at my job i want to uh you know get a get another
[00:09:41] promotion all of these things are hard to say well what else could be true that’s not really
[00:09:47] how we would do a counterfactual in this situation instead what we would do is we would play a few
[00:09:53] thought experiments out
[00:09:55] right so let’s say that your intent is to uh to get a promotion so a thought experiment might be
[00:10:04] would you want a promotion if it did not include a pay raise most people would say well that’s not
[00:10:11] realistic and thought experiments fortunately don’t have to be realistic the whole idea here
[00:10:17] is to produce some kind of counterfactual thinking in other words you are kind of poking at the
[00:10:24] question and you’re saying
[00:10:25] okay you mentioned that you want a promotion is that actually the thing you want is that
[00:10:31] precise enough is the thing that you want a promotion and a pay raise and if you were to
[00:10:38] say okay well i i do want a promotion and i do want a pay raise another thought experiment you
[00:10:43] might run is are you willing to work 60 hours a week for a promotion and a pay raise well the
[00:10:50] answer might be well of course not nobody would ever make me do that once again
[00:10:55] it’s a thought experiment so you can explore to find out more and now you might adjust your
[00:11:01] assertion that you want a promotion a pay raise and a balanced work uh environment such that you
[00:11:10] don’t have to work more than 40 45 hours a week something like that right or it may be that through
[00:11:17] this exploration you realize you know what actually i don’t really care so much about the
[00:11:22] pay raise i don’t care so much about
[00:11:24] the promotion what i really want is to reduce the overall amount of work i do i’m actually okay with
[00:11:31] how much i’m getting paid but i’d like to work less and get paid the same amount there’s a
[00:11:36] totally different career goal and so through this thought experiment exploration you may realize
[00:11:43] that actually what you really want is to get paid more per hour not a total amount
[00:11:50] more so when you’re looking at this thought experiment you’re going to realize that you’re
[00:11:54] looking at these assertions and you’re looking at these substitute questions that you ask yourself
[00:11:58] counterfactual thinking either through that what else might be true frame that’s a little
[00:12:06] bit easier in that first example about you know restaurants or product reviews what else might be
[00:12:12] true and counterfactual thinking through the lens of a thought experiment these are both going to
[00:12:19] provide you insight let’s talk about hiring for example
[00:12:24] the feedback for a good candidate coming in that says he immediately solved the coding problem
[00:12:32] therefore i believe he is a good engineer she asked great questions therefore i believe she
[00:12:40] is a good communicator these things are substitute criteria right you’re trying to evaluate whether
[00:12:49] someone is a good communicator and you’re using a very simple proxy to do it you can’t
[00:12:54] directly measure communication skill in an interview so what you’re really after is some kind of
[00:13:24] signal. You’re looking for some kind of sign that this person is a good communicator. And so you
[00:13:30] might use the heuristic, they asked a question. They asked a good question, a thoughtful question,
[00:13:37] something that wouldn’t necessarily come to mind. They are a thoughtful person.
[00:13:41] They are a good communicator. But all of these are heuristic assumptions. And even the good
[00:13:48] communicator criteria ends up being yet another substitution for, will this person communicate
[00:13:55] well when it really matters in their job? So let’s think about a couple of these interview
[00:14:02] question or outcomes that we could apply a counterfactual to. Let’s say that you had an
[00:14:09] interviewee that did poorly on, let’s say like a LeetCode style question. Now the immediate
[00:14:17] assumption is,
[00:14:18] this person is not going to cut it. They don’t have the chops that they need to have as an
[00:14:24] engineer. But one counterfactual, and actually a pretty compelling counterfactual, is that this
[00:14:30] person has not been practicing LeetCode in quite some time. Instead, they’ve been working. They’ve
[00:14:38] been doing actual engineering work, which very rarely actually requires the kinds of skills that
[00:14:44] you use when doing LeetCode interviews.
[00:14:48] So,
[00:14:48] in an odd way,
[00:14:50] counterfactual thinking may inform us that people who do exceedingly well at LeetCode interviews
[00:14:56] are good at, well, LeetCode interviews. They may actually have a negative signal or
[00:15:03] perhaps even a neutral signal here when somebody is particularly good at LeetCode. Now, it tells
[00:15:11] us something. It doesn’t necessarily tell us only bad or only neutral things. There may also be some
[00:15:17] good things.
[00:15:18] There’s a signal here that the person is willing to put in effort, for example. There’s also signal
[00:15:24] here that might suggest that the person is able to comprehend complex topics. LeetCode can often be
[00:15:32] very complex. And so if they can comprehend LeetCode’s complexity, then they probably can also
[00:15:38] comprehend the complexity of whatever is in your domain model. But the LeetCode measurement itself,
[00:15:45] and this is kind of interesting,
[00:15:48] studies have shown that it doesn’t really predict future job performance.
[00:15:52] So if your goal as an interviewer is to answer that core question, how well will this person do in their job,
[00:16:00] then it turns out that LeetCode as a criterion is probably a bad substitute question.
[00:16:07] Success on a LeetCode interview may not be predictive of success on the job at all.
[00:16:13] So what these counterfactuals do is they help you home in on better substitutes.
[00:16:22] Now, it would be unrealistic for us to say you need to answer the core question.
[00:16:26] The only way to answer the question of whether somebody will succeed on the job is to hire them.
[00:16:32] And the whole purpose of having the criteria that we have in an interviewing process is to actually perform the substitution in a high cohesion manner.
[00:16:43] In other words, we want to find substitute criteria that allow us to test this in advance in the cheapest way and the highest signal way possible.
[00:16:56] That means getting rid of your LeetCode questions, most likely.
[00:17:00] That’s a low signal or a mixed signal way of looking at your candidates.
[00:17:07] And we can use counterfactuals to drive what kinds of things we should be talking about, what kinds of things
[00:17:13] we should discuss.
[00:17:15] If you’re looking at a candidate and you’re trying to understand why they behaved in a particular way, you’re looking at feedback from the candidate, ask what else might be true.
[00:17:25] Is it possible that the thing that is being presented in this feedback about this candidate, that there’s actually more to the story?
[00:17:34] There’s another reason.
[00:17:36] There’s another reason why they got this feedback.
[00:17:38] Now, once you’ve done the counterfactual, here’s a critical part of this.
[00:17:41] And I don’t want you to, you know,
[00:17:43] miss this particular aspect.
[00:17:46] There are potentially many, many, many explanations.
[00:17:52] There are many explanations.
[00:17:53] It’s possible that that person was extremely distracted.
[00:18:00] It’s possible, right?
[00:18:02] If somebody is distracted in an interview, they’re not going to do as well as they would if they were fully focused.
[00:18:09] Now, you should ask yourself next.
[00:18:12] How likely?
[00:18:13] For my counterfactual, how likely is it that that was the case?
[00:18:20] Now, whether or not your likelihood percentage or your ratio or whatever you want to use here in terms of the metric itself.
[00:18:29] If you’re saying, oh, it’s, you know, 20%, 25% likely.
[00:18:34] Or it’s 75% likely.
[00:18:38] These ratios, these thresholds, whatever you want to call these,
[00:18:42] are fairly
[00:18:43] arbitrary.
[00:18:44] It’s not necessarily the case that you should care about any particular percentage.
[00:18:50] You may have something that you care about at a .001%.
[00:18:56] Right?
[00:18:57] Think of the justice system.
[00:19:00] Counterfactuals are very important when you’re sitting on a jury.
[00:19:04] Because of this simple phrase, beyond the shadow of a doubt.
[00:19:09] What percentage would we assign to the shadow?
[00:19:13] If there is some counterfactual that has even a .1% likelihood.
[00:19:20] Then there might be what some would call reasonable doubt in that situation.
[00:19:25] And so the point of this is not to identify a particular percentage.
[00:19:31] But instead to recognize what percentage you begin to care about.
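One way to picture this "what percentage do I begin to care about" step is to keep the likelihood estimates and the threshold separate, and let the threshold vary by decision. Everything in this sketch is invented for illustration — the counterfactuals, the likelihood numbers, and the function name — and the host explicitly says the percentages themselves are arbitrary; the point is only the comparison.

```python
# Sketch: after generating counterfactuals, decide which are plausible
# enough to challenge the heuristic. The threshold is chosen per
# decision, not fixed; all numbers here are invented.

def heuristic_is_doubtful(counterfactuals, care_threshold):
    """Return the counterfactuals plausible enough to challenge the heuristic."""
    return [c for c, likelihood in counterfactuals if likelihood >= care_threshold]

counterfactuals = [
    ("candidate was out of LeetCode practice", 0.30),
    ("candidate was distracted that day", 0.10),
    ("candidate lacks the underlying skill", 0.60),
]

# A hiring decision might use a fairly forgiving bar...
print(heuristic_is_doubtful(counterfactuals, care_threshold=0.25))
# ...while a jury's "reasonable doubt" bar is far lower.
print(heuristic_is_doubtful(counterfactuals, care_threshold=0.001))
```

With the lower bar, every counterfactual survives, which mirrors the jury example: even a tiny likelihood can be enough to matter, depending on what is at stake.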
[00:19:35] If I have an otherwise qualified candidate for a particular role.
[00:19:42] And I run some counterfactuals and I think, okay, this person failed one of, uh, you know, multiple interviews.
[00:19:50] Right?
[00:19:50] They, they failed one of many interviews.
[00:19:52] Otherwise they’re fairly well qualified.
[00:19:55] And I’m going to look at the interview feedback and I’m going to try to determine counterfactuals.
[00:20:00] Is something else possibly true?
[00:20:03] Maybe one out of five, one out of three chance that something else is true.
[00:20:08] And I’m looking at all of the information together.
[00:20:12] I may make the decision that this interview itself used the wrong substitute questions.
[00:20:20] We took something to mean one thing when actually the counterfactual is strong enough that we shouldn’t have taken it to mean that thing.
[00:20:31] We need to ask more higher cohesion questions.
[00:20:35] We need to have closer cohesion between our substitute question, our heuristic.
[00:20:41] And the core.
[00:20:42] Thing we’re trying to figure out the criteria in this case for a candidate we’re trying to hire.
[00:20:48] Thanks so much for listening to today’s episode of Developer Tea.
[00:20:51] I hope you will begin to think more critically.
[00:20:54] About your substitute questions, the cohesion rate between your substitute question, your target question, or your, uh, you know, your underlying evaluation, the things that you actually care about versus the cognitively cheaper thing that you’re asking and try to come up with better.
[00:21:11] Uh,
[00:21:12] better questions by using counterfactuals.
[00:21:14] This is a way to improve your hiring process pretty drastically.
[00:21:19] It’s a way to improve, uh, the way that you think about, uh, you know, your.
[00:21:25] Even debugging, right?
[00:21:27] Uh, we can end up falling prey to a bunch of biases around debugging, uh, you know, confirmation bias.
[00:21:35] We believe that a particular bug is caused by one thing.
[00:21:38] It turns out it’s caused by something totally different, but as we were
[00:21:41] debugging, we substituted, uh, a bunch of kind of debugging steps for our confidence, right?
[00:21:49] Uh, instead of debugging properly, we used confidence.
[00:21:52] We used our preexisting belief to trace down a particular path.
[00:21:57] It turns out that path was wrong.
[00:21:59] And so we fell prey to confirmation bias.
[00:22:01] We could have used counterfactuals in that situation.
[00:22:06] What else might be true?
[00:22:08] And how likely is it that that thing might be true?
[00:22:11] Uh, and, and that could help us avoid these, these poor pathways of, uh, of biased logic.
[00:22:19] Thanks so much for listening.
[00:22:20] Um, hopefully this conversation is thoughtful to you and that you would, uh, adopt some of these ideas into your workflows.
[00:22:30] Um, I’d love to hear about it.
[00:22:32] You can reach out to me on the Developer Tea Discord, head over to developertea.com slash discord.
[00:22:39] If you enjoyed this episode,
[00:22:41] I’d encourage you to share it with somebody that you think will benefit from counterfactual thinking, will benefit from learning about this idea of substitute questions.
[00:22:50] We’ve talked about it on the show before.
[00:22:52] Uh, certainly we will talk about it more in the future.
[00:22:54] These are such powerful leverage points.
[00:22:57] If you can recognize what kinds of substitutes you are using day to day, counterfactuals, in my opinion, can change the way you think entirely.
[00:23:06] They can totally change your career.
[00:23:08] Uh, they can change your, you know, your working department.
[00:23:11] Whatever it is that you’re trying to improve, try using counterfactuals.
[00:23:15] It’s amazing what this kind of thinking can do for you.
[00:23:19] Thanks so much for listening.
[00:23:20] And until next time, enjoy your tea.