If AI can do your classwork, why go to college?


Summary

Sean Illing speaks with journalist James Walsh about his investigative piece on the pervasive use of AI chatbots like ChatGPT by college students to complete assignments. Walsh spent months interviewing students and professors across the country, uncovering a landscape where AI-assisted cheating has become normalized, bending definitions of academic integrity and leaving educators feeling powerless.

The conversation explores the different student perspectives: from those who cheat because they can, to those who feel pressured to keep up with peers, to others who see AI as an essential tool for future careers. Walsh shares specific examples, like a student who wrote an essay about critical pedagogy using AI, and another who openly admitted attending Columbia primarily to network and find a co-founder.

Professors are depicted as demoralized, caught between an inability to effectively police AI use and institutional inertia. Some resort to oral exams or ‘Trojan horse’ prompts, while others contemplate retirement. The discussion questions whether prompt engineering will replace writing as a core skill and what it means for a society if thinking becomes outsourced to machines.

Ultimately, the episode grapples with profound questions about the purpose of education in an AI-saturated world. If writing is thinking, and AI does the writing, what happens to critical thought? The hosts reflect personally on how these tools might have altered their own intellectual development and express concern about a potential ‘post-literate’ future for liberal democracy.


Recommendations

Articles

  • ‘Everyone is Cheating Their Way Through College’ by James Walsh — Walsh’s reported piece for New York Magazine’s Intelligencer, which forms the basis of the conversation. It details months of interviews with students and professors living through the AI cheating crisis.

People

  • Paulo Freire — The Brazilian thinker and critical pedagogue mentioned in the context of a student using AI to write an essay about his theories on learning methods.
  • Sam Altman — CEO of OpenAI referenced in the discussion. Walsh critiques Altman’s comparison of AI to a ‘calculator for words,’ contrasting it with other statements about AI’s potential existential risks.

Tools

  • ChatGPT — OpenAI’s chatbot, cited as the most commonly used tool by students for cheating, often used as shorthand for all AI chatbots.
  • Claude — Anthropic’s AI chatbot, mentioned alongside ChatGPT, Gemini, and Copilot as a platform students use for academic work.
  • AI Detectors — Tools that scan text to estimate the likelihood it was AI-generated. The episode discusses their unreliability and the debate around their effectiveness, including a report that OpenAI has a more effective detector it has not released.

Topic Timeline

  • 00:02:41 Introduction to James Walsh and the scope of AI cheating — Sean Illing introduces guest James Walsh, a features writer for New York Magazine’s Intelligencer, who authored a major piece on AI’s impact on higher education. Walsh explains that his reporting revealed AI cheating is not just about dishonesty but about widespread student and professor ambivalence, disillusionment, and despair as technology outpaces institutional adaptation.
  • 00:07:17 How students are using AI platforms like ChatGPT — Walsh details the primary AI tools students use—ChatGPT, Claude, Gemini, Copilot—and their applications beyond essay writing. Students use AI for note-taking, textbook summarization, creating study guides, and, significantly, for coding in computer science courses. He shares an anecdote about a Berkeley professor whose students use AI on take-home assignments but then fail in-class tests on the same material.
  • 00:11:30 The mechanics of AI-assisted cheating and detection challenges — The discussion turns to the straightforward process of cheating: copying a prompt into ChatGPT and submitting the output. Walsh mentions professors using ‘Trojan horse’ methods (e.g., inserting ‘mention broccoli’ into prompts) to catch inattentive students. He explains the unreliability of AI detectors and the difficulty of proving cheating, especially when students use AI for ideation and then edit the output, making them ‘editors’ rather than original writers.
  • 00:16:43 Student case studies: Roy, Wendy, and varying rationales — Walsh profiles specific students. Roy, a Columbia student, views college as purely transactional for networking and uses AI to ‘cruise’ through coursework. Wendy, who is against blatant copying, uses AI to generate ideas and outlines, dramatically cutting her writing time but not considering it cheating. These examples illustrate the spectrum of student justifications and the blurring ethical lines.
  • 00:29:54 Professor perspectives: Despair, policing, and institutional failure — Walsh describes the professor viewpoint, characterized by despair and a sense of being unsupported by administrators. They struggle to police AI use, face overwhelming volume, and have little recourse even when they suspect cheating. Some adapt by switching to oral exams or blue-book tests, but acknowledge something is lost. The conversation highlights the institutional failure to address the crisis with the urgency seen during COVID-19.
  • 00:47:45 Personal reflections on writing, thinking, and a post-literate future — Illing and Walsh reflect personally. Walsh admits he likely would have used AI as a student despite hating the writing process, but values the struggle for intellectual development. They argue that writing is thinking, and outsourcing it risks creating a ‘post-literate’ society. They dismiss the ‘calculator for words’ analogy, noting AI’s founders warn of its catastrophic potential, making this technological shift qualitatively different.
  • 00:56:53 Will AI amplify or reduce educational inequality? — The final major topic considers AI’s potential to either level the playing field or exacerbate inequality. Walsh acknowledges helpful use cases, like aiding non-native speakers or creating personalized study guides. However, he fears writing and deep thinking could become ‘anachronistic electives’ only accessible to the privileged, creating a deeper intellectual divide in society.

Episode Info

  • Podcast: The Gray Area with Sean Illing
  • Author: Vox
  • Category: Society & Culture, Philosophy, News, Politics, News Commentary
  • Published: 2025-06-30T08:00:00Z
  • Duration: 00:59:21

Transcript

[00:00:00] Support for this show comes from the Working Forest Initiative.

[00:00:04] The working forest industry is committed to planting more trees than they harvest.

[00:00:08] More than 1 billion seedlings are planted in U.S. working forests every year.

[00:00:13] From biologists to GIS analysts, hiring managers, accountants, and more,

[00:00:18] working forest professionals have dedicated their focus towards sustainability,

[00:00:22] using their expertise to help ensure a healthy future for America’s forests.

[00:00:27] They say they don’t just plan for the future, they plan it.

[00:00:31] You can learn more at workingforestinitiative.com.

[00:00:57] ChatGPT, Claude, Copilot, they’re not just study aides anymore.

[00:01:02] They’re doing everything.

[00:01:04] We are living in a cheating utopia, and professors know this.

[00:01:09] They see it, it’s becoming more common, but they can’t always prove it.

[00:01:13] And more often than not, they’re too burned out or unsupported to do anything about it.

[00:01:19] So, what are we doing here?

[00:01:22] What is higher education at this point?

[00:01:27] I’m Sean Illing, and this is The Gray Area.

[00:01:35] My guest today is James Walsh.

[00:01:38] He’s a features writer for New York Magazine’s Intelligencer,

[00:01:42] and the author of the most unsettling piece I’ve read about the impact of AI on higher education.

[00:01:50] Walsh spent months talking to students and professors who are living through this moment.

[00:01:55] And what he found is,

[00:01:57] that AI isn’t just a story about cheating.

[00:02:00] It’s a story about ambivalence, and disillusionment, and despair.

[00:02:05] A story about what happens when technology moves much faster than our institutions can adapt.

[00:02:13] We cover a lot of ground here.

[00:02:15] We talk about the students who are cheating because they can.

[00:02:18] The ones who are cheating because they feel like they have to.

[00:02:22] We talk about the professors who are giving up.

[00:02:25] The administrators who don’t want to confront the problem.

[00:02:29] And what all of this means, not just for the future of college,

[00:02:33] but the future of writing and thinking.

[00:02:39] James Walsh, welcome to the show.

[00:02:41] Thank you for having me.

[00:02:43] I’m glad you’re here.

[00:02:44] I love this piece, and I’m really excited to talk to you about it.

[00:02:48] It resonated with me on a few different levels, which I’m sure we will get to.

[00:02:54] I’ll just start.

[00:02:55] I mean,

[00:02:56] it doesn’t seem like AI is your beat, as it were.

[00:03:00] So how did you get drawn into the wonderful world of AI cheating?

[00:03:06] My editor and I were having just a conversation about cheating and not AI.

[00:03:12] So I started doing these interviews and kind of realized that there was no way to write a cheating story without it being about AI.

[00:03:24] And not only that, it was like, oh, everybody I’m talking to is in some way cheating, even if they don’t think it’s cheating.

[00:03:32] It’s just everywhere.

[00:03:35] And it’s like bending the very definition of cheating.

[00:03:39] I mean, is there any other way people are cheating these days?

[00:03:42] Oh, yeah.

[00:03:43] Are people taking Scantron still and like looking over shoulders and bubbling in circles?

[00:03:47] I did talk.

[00:03:48] Yeah, I talked to students who were like, oh, the kind of old school cheating.

[00:03:52] You know, the tradecraft is still there.

[00:03:54] There’s still, you know, whether it’s people using Apple Watches to pull up PDFs or, you know, scribbling something on the palm of their hands.

[00:04:01] As one student said to me, it’s like, you know, the floor of cheating is still there.

[00:04:06] It’s just that, like, the ceiling’s been blown off.

[00:04:10] I’m both elated and bummed that I missed the golden age.

[00:04:16] I know.

[00:04:17] I would go back and forth like, yeah, I use SparkNotes.

[00:04:20] Of course.

[00:04:21] I use, you know.

[00:04:22] Every once in a while, I peek at SparkNotes.

[00:04:23] I’m actually very glad that I didn’t have this tool when I was in college.

[00:04:28] Yeah, I am, too, ultimately.

[00:04:30] And we will also get to that, I’m sure.

[00:04:33] One thing that really made the piece work is how candid everyone you spoke to was, especially the students.

[00:04:40] And I know it’s hard to get people to talk on the record about something they’re not supposed to be doing.

[00:04:48] How did you go about finding these students?

[00:04:50] And why do you think they were so open?

[00:04:53] The reporting process for finding these students was sort of like, you know, any kind of story.

[00:05:00] I’ll talk to anybody.

[00:05:02] So I would find them through student newspapers was a big one, whether they were mentioned in a story or a few of them had written op-eds.

[00:05:13] So those students were, of course, eager to talk about AI even on the record because they had written op-eds.

[00:05:20] In fact, one memorable interview was with a student who, as a freshman,

[00:05:26] had written kind of an optimistic AI op-ed saying we need to incorporate AI more in the classroom to teach students how to use this for the rest of their lives.

[00:05:38] And then I talked to her, you know, as a sophomore sometime in the past few months.

[00:05:45] And the first thing she said, she was like, I have done a 180.

[00:05:49] We need to get AI.

[00:05:50] We need to get AI out of the classroom.

[00:05:52] And that was just kind of indicative of, like, how many more students over just the past two semesters are using it.

[00:05:59] And then I, you know, another kind of notable one where I reached out to a lot of students who were having these conversations already on Reddit and Discord.

[00:06:11] And one student in Canada, she went on to her school’s thread.

[00:06:19] On Reddit.

[00:06:21] And said, I have a problem.

[00:06:23] Do you guys have any advice for how to, like, kick a chat GPT addiction because I am fully addicted?

[00:06:29] And it was a really earnest kind of request.

[00:06:33] And so I reached out to her and she was like, yeah, let’s talk.

[00:06:37] And then, of course, just, you know, one student can lead to another student.

[00:06:43] And just kind of trying to speak with every sort of student I could find.

[00:06:48] Well, let’s walk through the tools that students are using.

[00:06:54] Because I still think there are a lot of people who just aren’t familiar with any of this, have not yet dipped their toes into the wonderful world of LLMs and chatbots.

[00:07:04] What are the main platforms?

[00:07:06] Are we talking about, you know, chat GPT, Claude, Copilot, Gemini?

[00:07:12] And how are students using them to cheat?

[00:07:16] Sure.

[00:07:17] I mean, the vast majority of students I talked to were using ChatGPT, OpenAI’s chatbot.

[00:07:25] And so much so that sometimes they would even refer to chat GPT and then I’d go back to them and, you know, maybe they were using another platform.

[00:07:34] But chat GPT was just kind of shorthand for a chatbot or AI.

[00:07:38] Right.

[00:07:39] But, of course, students talked about using Anthropic’s Claude, Google’s Gemini, and Copilot.

[00:07:45] Right.

[00:07:46] And they were using it for kind of every facet of their education, really.

[00:07:54] You know, I think it’s my bias as, you know, an English major and a writer where I was really interested in hearing how they were using it to write their essays or outline their essays or generate ideas for their essays.

[00:08:14] But students were also eager to talk about, you know, how they used it to take notes during class for them or summarize textbooks or create study guides and outlines.

[00:08:26] And then one of the biggest use cases is just computer science students who are using it to code.

[00:08:33] Some science students using it, you know, to analyze data.

[00:08:36] But the computer science students using it to code is just a huge use case.

[00:08:42] That’s…

[00:08:43] And I spoke with a professor, a computer science professor at Berkeley, you know, one of the top computer science schools in the country who said, you know, his students have a serious problem.

[00:08:54] Tons of his students are using it on their take-home assignments.

[00:08:59] And then when they’re tested on the same sorts of problems in class, they’re failing.

[00:09:08] Have they realized that if they’re using the chatbots to code, they actually won’t be needed as coders?

[00:09:14] Exactly.

[00:09:15] That’s exactly what this professor told them.

[00:09:17] You know, if you are depending on AI, now you are just training yourself to be an assistant.

[00:09:23] I think he put it, you know, the way he put it was an assistant to an AI platform.

[00:09:28] And you are going to be the most replaceable person in the workforce.

[00:09:31] And he said that, you know, he has had conversations with executives at these tech companies who ask him, why would I hire a coder now anyway?

[00:09:44] And we see that in the workforce, right?

[00:09:46] I think Microsoft just laid off a ton of coders.

[00:09:50] Do they feel that when they’re told that, hey, look, you’re making yourself that much more replaceable?

[00:09:57] Is that something they really feel in their bones or people are just kind of rolling with it?

[00:10:03] You know, that’s a really good question.

[00:10:06] I will say, you know, certainly.

[00:10:09] A good percentage of the students I spoke to have concerns and are aware to some extent of what they’re doing as they rely more and more on AI.

[00:10:27] I think there’s maybe kind of a whiplash going on here in a sense that, like, they’re entering school at a time when, like,

[00:10:38] what can I study now that is going to get me a job later?

[00:10:43] What skills will be necessary in four years?

[00:10:46] This stuff is moving so quickly that, like, I might as well offload this learning or, you know,

[00:10:55] I might as well not work hard at acquiring these skills because who knows if four years from now they’re going to help me.

[00:11:02] And I understand that kind of fear.

[00:11:08] So what’s the simplest example of the process here, of how people are using it?

[00:11:14] I mean, is it as simple as you go in there and you say, hey, I’m reading such and such book.

[00:11:20] You know, I’m going to upload a PDF of said book.

[00:11:24] Here’s my essay prompt.

[00:11:26] Give it to me.

[00:11:27] I mean, is it pretty much that straightforward?

[00:11:30] Yes.

[00:11:31] I mean, it depends on kind of the type of student, the type of class, the type of school you’re in.

[00:11:39] You know, whether or not a student thinks they can get away with that is something,

[00:11:42] or whether or not a student can get away with that is a different question.

[00:11:45] But there are plenty of students who are taking their prompt from their professor,

[00:11:51] copying and pasting it into ChatGPT and saying, I need a four- to five-page essay,

[00:11:58] and copying and pasting that essay without ever reading it.

[00:12:02] One of the funniest examples I came across is a number of professors are using this so-called

[00:12:07] Trojan horse method, where they’re dropping in kind of these non-sequiturs into their prompts.

[00:12:14] Mention broccoli, mention Dua Lipa, say something about Finland into their prompt,

[00:12:21] so that if you were to just copy and paste the prompt into ChatGPT,

[00:12:26] the essay that it spits out will say something about broccoli or Dua Lipa.

[00:12:30] And there are students who are just blindly copying and pasting those essays and handing them in.

[00:12:34] So not only, you know, are they not writing their paper,

[00:12:37] they’re not even reading them.

[00:12:39] Well, it seems like with just a little bit of effort, you could cover that up.

[00:12:43] I mean, unless you’re just so lazy that you just insert the prompt

[00:12:48] and just copy and paste whatever you get back.

[00:12:50] But if you just take a little bit of time to comb through it,

[00:12:53] you should be able to cut that stuff out and cover up the trail.

[00:12:56] And every professor I spoke to said, you know, so many of my students are using AI,

[00:13:01] and I know that so many more students are using it, and I have no idea.

[00:13:05] Yeah.

[00:13:06] Because it can essentially write, you know, 70% of your essay for you.

[00:13:10] And if you do that other 30% to cover your tracks and kind of make it your own,

[00:13:16] you know, it can write you a pretty good essay.

[00:13:19] And, you know, there are these platforms, these AI detectors that, you know,

[00:13:25] there’s a big debate about how effective they are.

[00:13:28] You know, and they essentially will scan an essay and say, you know,

[00:13:32] the essay is assigned some grade: say, a 70% chance that this is AI-generated.

[00:13:38] And that’s really just looking at the language and deciding whether or not that language

[00:13:43] is created by an LLM or an algorithm.

[00:13:46] And, you know, it doesn’t account for big ideas.

[00:13:51] It doesn’t, you know, catch the students who are using AI and saying,

[00:13:54] what should I write this essay about?

[00:13:56] And not doing the actual thinking themselves.

[00:13:58] Yeah.

[00:13:59] And then just kind of writing, you know, it’s sort of like paint by numbers at that point.

[00:14:04] Well, it reduces everyone to editors, right?

[00:14:06] You’re just going in there and manipulating the language that the machine gives you.

[00:14:11] In fact, I had a conversation with, you know, the University of Florida has been eager to adopt,

[00:14:17] a lot of schools have, but University of Florida in particular,

[00:14:21] the administration’s eager to adopt AI.

[00:14:24] And I had a conversation with an administrator very high up there who said to me,

[00:14:30] you know, what does the future of writing look like?

[00:14:33] It probably looks a lot more like editing.

[00:14:36] And he admitted that.

[00:14:39] Did you find that students are relating very differently to all of this, to these changes?

[00:14:45] What was the general vibe you got?

[00:14:48] It was a pretty wide perspective on AI.

[00:14:55] There were students, you know, who I spoke to a student at the University of Wisconsin

[00:15:02] who said,

[00:15:03] you know, I realized AI was a problem when, starting last fall, I would walk into the library

[00:15:07] and, you know, at least half of the students were using ChatGPT.

[00:15:14] And it was at that moment that she sort of started thinking about her classroom discussions

[00:15:19] and some of the peer-reviewed essays she was reading.

[00:15:24] And students were, you know, the one example she gave that really stuck with me was

[00:15:31] she was taking some psych class,

[00:15:34] and they were talking about attachment theories.

[00:15:36] And she was like,

[00:15:37] attachment theory is something that we should all be able to talk about from our own personal experiences.

[00:15:41] We all have our own attachment theory.

[00:15:43] We can talk about, you know, our relationships with our parents.

[00:15:46] That should be a great class discussion.

[00:15:48] And yet I’m sitting here in class and people are referencing studies that we haven’t even covered in class.

[00:15:53] And it just makes for a really boring and unfulfilling class.

[00:15:57] And it was like that realization for her was like,

[00:15:59] oh, we have to put the brakes on here.

[00:16:01] Something is wrong.

[00:16:02] So, you know, there are students like that.

[00:16:05] And then there are students who sort of feel like they have to use AI

[00:16:10] because if they’re not using AI, you know, they’re at a disadvantage.

[00:16:14] And not only that, AI is going to be around no matter what for the rest of their lives.

[00:16:19] So they feel as if college to some extent now is about, you know, training them to use AI.

[00:16:26] And then there are the students who are like, why not?

[00:16:29] It’s something at our disposal.

[00:16:30] And, you know,

[00:16:31] we go to school to learn how to be efficient.

[00:16:35] And, you know, this is sort of, as one student put it, you know, a hackable assignment.

[00:16:41] So I might as well use it to hack.

[00:16:43] There’s this guy, Roy Lee, who’s an interesting character, sort of the protagonist of the piece in some ways.

[00:16:51] And he has this interesting path from community college to Columbia to startup founder.

[00:16:59] What did you make of him?

[00:17:00] And his story?

[00:17:02] Sure.

[00:17:03] Roy is somebody who just has always known he wants to be a founder.

[00:17:09] And he’s been laser focused on that.

[00:17:11] And so, you know, he told me about going to Columbia and using AI to cruise his way through every assignment.

[00:17:22] He just did not care about the assignments.

[00:17:24] And because he had such a winding road to Columbia,

[00:17:28] I had to stop him and say, like, wow.

[00:17:30] Why would you go through so much effort?

[00:17:33] You know, he took a gap year and then did a year at community college to get into an elite school like Columbia

[00:17:40] and then just not take advantage of it at all and not do the work.

[00:17:44] And he said, I’m here to find a co-founder and a wife.

[00:17:49] And I think he was really helpful because it’s not just about his approach to AI.

[00:17:58] It’s about his mindset and his idea of college being this transactional place that, you know, gets you something very specific and is kind of the stepping stone.

[00:18:08] I thought that was really kind of instructive.

[00:18:10] It does kind of give the game away a little bit, right?

[00:18:12] I mean, going to college, it’s, you know, get out of here with this business about, you know, learning.

[00:18:17] Right.

[00:18:18] It’s just it’s a networking activity.

[00:18:20] Right.

[00:18:21] And honestly, you know, as kind of funny or maximalist as his language or like this example is,

[00:18:27] there’s truth to it.

[00:18:28] You know, college, we can get into this.

[00:18:31] It’s just, you know, the idea of college as this place where you just go to grow intellectually is long gone.

[00:18:39] Yeah.

[00:18:40] No, look, it makes me a little sad, but I respect the game.

[00:18:43] I respect the honesty.

[00:18:44] Sure.

[00:18:45] Sure.

[00:18:46] Yeah.

[00:18:47] I mean, was that a pretty common perspective among students, especially at the more elite universities?

[00:18:52] You know, I wouldn’t say it was common with the students I spoke to.

[00:18:56] But certainly.

[00:18:57] But those students often sort of talked derisively, I think, about other students in certain majors who had that perspective; they would talk about their classmates in finance, and some in computer science, who really felt like, you know, they were there for networking.

[00:19:19] Well, Wendy was a different case.

[00:19:21] Right.

[00:19:22] So she is or she says she’s against cheating.

[00:19:26] She’s against copying and pasting.

[00:19:29] But, of course, she’s using chat GPT.

[00:19:32] What was the story she was telling herself to justify that?

[00:19:36] Or if that’s the wrong word, what was her explanation?

[00:19:40] I think Wendy was sort of the best example, kind of spoke to a lot of students’ experiences where, you know, she understands the honor code as it’s written.

[00:19:53] And she.

[00:19:54] Yeah.

[00:19:55] And she views cheating now as anyone who copies and pastes from chat GPT into their Google Doc and then turns it in.

[00:20:07] And she, you know, like I was saying earlier, is somebody who uses ChatGPT to formulate ideas, to come up with topic sentences, and then does that kind of paint-by-numbers essay, sits down, you know, and writes an essay in two hours

[00:20:23] that would normally take somebody six, seven, eight hours.

[00:20:26] So it’s kind of hacking.

[00:20:28] And she, in my conversation with her, I just sort of realized, I think the best part about this assignment was watching students in real time kind of decide where the line is on cheating.

[00:20:41] And she hadn’t really figured it out, you know, where she was nostalgic for the act of actually writing.

[00:20:49] But felt as if she wasn’t cheating by.

[00:20:51] Yeah.

[00:20:52] She wasn’t cheating by outsourcing the deep thinking that essays are meant to provoke, you know, outsourcing that to an AI.

[00:21:03] She just didn’t feel like it was cheating at all.

[00:21:06] I’m not sure anyone knows.

[00:21:08] Where is the line between using AI to assist and using AI to cheat?

[00:21:13] I mean, I’m very sympathetic, especially to her case, right?

[00:21:16] I mean, she’s clearly someone who would rather live in a different world.

[00:21:20] But this is the world.

[00:21:21] But this is the world we got.

[00:21:23] And in this world, for all the reasons we’ve already said, this is the game.

[00:21:27] And people around you are playing it.

[00:21:29] And if you’re not, then you’re going to be at a disadvantage.

[00:21:32] And also, just setting that aside, it’s just incredibly tempting.

[00:21:37] Right.

[00:21:38] I mean, how do you not?

[00:21:40] It’s right there at your fingertips.

[00:21:42] You could be done in 30 minutes and hit the club or go to the ballgame or whatever.

[00:21:49] Right.

[00:21:50] But it’s asking a lot.

[00:21:52] It’s asking a lot of students to not partake.

[00:21:56] It is.

[00:21:57] It is.

[00:21:58] And I am somebody personally who, like, when ChatGPT, the version as we know it, in November 2022 came out,

[00:22:08] you know, shortly after was the first time I really played around with it.

[00:22:11] I was like, ah, it’s a party trick.

[00:22:13] And I think in the course of reporting this story, I played with it a lot more to, you know, familiarize myself with it.

[00:22:19] And it was, like, the first time I realized, oh, if I, you know, had two paragraphs and needed a transition and I couldn’t come up with it, you know, like, it could offer me something.

[00:22:29] And I am, you know, much older than these students.

[00:22:33] And there was an immediate realization of, like, if I start doing this now, I am going to lose something.

[00:22:39] Like, some part of my brain is not going to flex and work.

[00:22:45] And that is really scary.

[00:22:47] That’s scary to me.

[00:22:48] And I think that’s the current that, like, I just don’t want to do that.

[00:22:51] And to put that sort of, like, ask on 18-year-olds, 19-year-olds, 20-year-olds is crazy to me because, like you said, you know, they have clubs to be at.

[00:23:03] Or they have, you know, as one student put it, sometimes an essay just needs to get writ or something, you know.

[00:23:11] Yeah, touche.

[00:23:12] Yeah.

[00:23:13] I’m just curious.

[00:23:16] Did any of Wendy or anyone else share some of their essays with you?

[00:23:21] Did you get a chance to look at them?

[00:23:22] Were they good?

[00:23:23] Were they convincing?

[00:23:24] Would they have fooled you?

[00:23:25] The Wendy moment was, like, the crazy moment for me.

[00:23:29] It was, you know, after our conversation, I asked her to send me the essay she had written that she talked about.

[00:23:35] And I opened the essay and it was about critical pedagogy, this theory, you know, Paulo Freire, the Brazilian thinker, on learning methods.

[00:23:44] And I went back to her and said, you know, you kind of have to see the irony in using AI to write about critical pedagogy.

[00:23:54] And she really first, like, just, I think, quickly flipped it on me.

[00:24:01] She’s like, what do you think?

[00:24:03] And I said, no, I mean, explain this to me.

[00:24:06] And I think, you know, she had lines in the essay about, you know, learning is what makes us human.

[00:24:13] And so I asked her again about it.

[00:24:17] And she said, you know, something about, like, I do think depending on AI, you know, you’re going to lose some critical thinking.

[00:24:23] But, you know, it’s there.

[00:24:26] You know, AI is always going to be there for us.

[00:24:28] So we might as well be using it.

[00:24:30] Is there a case to be made, a depressing case, but true, that students who are good at using AI, and I think you alluded to this earlier, will be more prepared, actually, for the future?

[00:24:42] Yeah.

[00:24:43] That, you know, prompt engineering is just going to be the new writing.

[00:24:46] And better to figure that out now so you can adapt.

[00:24:49] I don’t know.

[00:24:50] I can’t really speak at all to the nuances of that in computer science or in sciences and, you know, what you have to learn in order to get AI to do what you want.

[00:25:05] I just can’t speak to it.

[00:25:07] But in terms of essay writing, it’s just kind of hard.

[00:25:12] It’s just kind of intuitive, I think.

[00:25:15] And I don’t know necessarily what you’re learning by copying and pasting a prompt into ChatGPT.

[00:25:22] I mean, like, you know, anybody can do that.

[00:25:25] So I don’t really understand when people talk about, like, teaching students to use AI in the arts or humanities.

[00:25:33] I still don’t really know what that looks like.


[00:26:11] Wix is packed with actually useful AI features

[00:26:14] and agents built specifically for SMB

[00:26:16] so you can grow your business without burning out.

[00:26:19] You can get a custom, ready-to-use website in minutes

[00:26:21] with Wix’s AI website builder

[00:26:23] or choose from designer-made templates.

[00:26:26] They offer lots of built-in solutions

[00:26:28] tailored to your business, from e-com to services.

[00:26:31] And you can rest easy knowing that your Wix site

[00:26:33] is backed by 99.99% uptime and enterprise-grade security.

[00:26:37] Their data shows 280 million businesses around the world

[00:26:41] rely on Wix for their websites.

[00:26:44] Ready to create your website?

[00:26:46] Go to wix.com.

[00:26:47] That’s wix.com.

[00:26:53] Support for the show comes from Mint Mobile.

[00:26:56] We all have that stubborn friend

[00:26:58] who insists on doing things the hard way,

[00:27:01] like your friend whose car only starts 60% of the time

[00:27:04] or your other friend who never drinks water

[00:27:07] and for some reason,

[00:27:07] always has a headache.

[00:27:09] Well, let’s make sure you’re not the friend

[00:27:12] who’s overpaying for wireless in 2026.

[00:27:15] Go with Mint Mobile instead.

[00:27:17] Same coverage, same speed,

[00:27:19] just without the inflated price tag.

[00:27:22] The premium wireless you expect,

[00:27:24] unlimited talk, text, and data,

[00:27:26] but at a fraction of what others charge.

[00:27:28] Ready to stop paying for more than you have to?

[00:27:31] New customers can make the switch today

[00:27:32] and for a limited time,

[00:27:34] get unlimited premium wireless for just $15 per month.

[00:27:37] Switch now at mintmobile.com slash gray area.

[00:27:40] That’s mintmobile.com slash gray area.

[00:27:43] Upfront payment of $45 for three months,

[00:27:46] $90 for six months,

[00:27:47] or $180 for 12-month plan required

[00:27:50] or $15 month equivalent.

[00:27:53] Taxes and fees extra.

[00:27:55] Initial plan term only.

[00:27:57] Over 50 gigabytes may slow when network is busy.

[00:27:59] Capable device required.

[00:28:01] Availability, speed, and coverage varies.

[00:28:03] Additional terms apply.

[00:28:05] See mintmobile.com.

[00:28:07] Support for the show comes from Bombas.

[00:28:15] It’s the new year,

[00:28:16] so you probably have a long list of resolutions

[00:28:18] to make your life happier and more productive.

[00:28:21] Everyone has their own system.

[00:28:23] Here’s mine.

[00:28:24] I take last year’s resolutions and change ‘I resolve to’ to

[00:28:28] ‘I resolve not to.’

[00:28:30] This year, I’ve resolved not to drink less wine,

[00:28:33] not to do more exercise,

[00:28:35] not to be a more patient,

[00:28:37] attentive,

[00:28:37] and gracious husband and father,

[00:28:40] and not to make smoother segues into ads for Bombas.

[00:28:44] If you’re trying to hit the gym or get more active,

[00:28:47] the all-new Bombas sports socks are engineered

[00:28:49] with sport-specific comfort for running,

[00:28:51] golf, hiking, skiing, snowboarding, and all sport.

[00:28:55] For those every day around the house resolutions,

[00:28:58] Bombas also has you covered

[00:28:59] with the super luxurious Sherpa Sunday slippers

[00:29:02] and new squishy Saturday suede slip-ons

[00:29:05] for comfort on the go.

[00:29:07] You may know,

[00:29:07] I’ve tried out Bombas myself.

[00:29:09] I’ve been rocking the sports socks for over a year now.

[00:29:13] They are my favorite.

[00:29:14] I work out in them.

[00:29:15] I run in them.

[00:29:16] I use them basically every day.

[00:29:19] You can head over to bombas.com slash gray area

[00:29:23] and use code gray area for 20% off your first purchase.

[00:29:27] That’s B-O-M-B-A-S.com slash gray area code gray area at checkout.

[00:29:37] Let’s talk about the professor perspective.

[00:29:54] I mean, the professors you spoke to

[00:29:56] all seem to share something pretty close to despair.

[00:30:02] Yes.

[00:30:02] Those are primarily the professors

[00:30:05] in writing-heavy classes

[00:30:07] or computer science classes.

[00:30:09] You know, there were professors

[00:30:11] who I spoke to

[00:30:13] who actually were really, you know,

[00:30:16] bullish on AI.

[00:30:18] I spoke to one professor.

[00:30:20] Doesn’t appear in the piece,

[00:30:21] but she is at UCLA

[00:30:25] and she teaches, I believe,

[00:30:29] comparative literature

[00:30:30] and used AI to create her textbook,

[00:30:34] her entire textbook

[00:30:34] for this class,

[00:30:36] this semester.

[00:30:38] And she says it’s the best class she’s ever had.

[00:30:41] And so I think there are some people who are optimistic.

[00:30:47] She was an outlier in terms of the professors I spoke to.

[00:30:50] For the most part, professors were, yes, in despair.

[00:30:55] They don’t know how to police AI usage.

[00:30:59] And even when they know an essay is AI generated,

[00:31:03] the recourse there

[00:31:06] is really thorny.

[00:31:08] If you’re going to accuse a student

[00:31:10] of using AI,

[00:31:12] there’s no real good way to prove it.

[00:31:15] And students know this.

[00:31:17] So they can always deny, deny, deny.

[00:31:19] And just the sheer volume

[00:31:21] of AI generated essays

[00:31:23] or paragraphs

[00:31:24] is overwhelming.

[00:31:28] So that, just on the surface level,

[00:31:31] is extremely frustrating

[00:31:32] and has a lot of professors down.

[00:31:36] Now, if we kind of zoom out and just think also kind of

[00:31:40] about education in general,

[00:31:44] you know, this just raises a lot of really uncomfortable questions

[00:31:47] for teachers and administrators

[00:31:49] about the value of each assignment

[00:31:52] and extrapolate that out, you know,

[00:31:54] not just the value of the assignment,

[00:31:56] but the value of the degree,

[00:31:57] and of education in general.

[00:32:00] Yeah, I mean, look,

[00:32:00] I was a professor very briefly,

[00:32:04] and it is very,

[00:32:06] very easy to spot

[00:32:09] when someone hasn’t authored their own work.

[00:32:12] I mean, you can kind of tell.

[00:32:14] But knowing and proving are very different things,

[00:32:16] like you were saying.

[00:32:17] And I can imagine a lot of faculty

[00:32:20] just deciding, you know what,

[00:32:23] it’s not worth the hassle

[00:32:24] of making these sorts of allegations.

[00:32:27] It’s not worth it.

[00:32:29] Which, again, I understand.

[00:32:31] But I think the end result of that

[00:32:34] is that everyone involved

[00:32:36] sort of,

[00:32:37] ceases to take any of it seriously.

[00:32:39] And the whole thing just becomes

[00:32:40] completely hollowed out.

[00:32:43] Yeah, I don’t,

[00:32:44] I somehow don’t think we’re going to police our way

[00:32:46] out of this problem.

[00:32:48] You know, one, also just like

[00:32:49] asking

[00:32:50] professors to, like,

[00:32:55] do this CSI sort of thing

[00:32:59] for every AI-generated essay

[00:33:01] is just not sustainable.

[00:33:02] So, you know,

[00:33:04] what professors,

[00:33:05] you know, and administrators

[00:33:07] are kind of talking about

[00:33:08] is one,

[00:33:09] upfront,

[00:33:10] getting students to see,

[00:33:12] and this is the most important thing

[00:33:13] for them to learn at this point,

[00:33:14] like, why they shouldn’t be using

[00:33:16] AI and convincing them

[00:33:19] not to use AI.

[00:33:20] You know, with cheating

[00:33:21] in general,

[00:33:23] the research on it shows

[00:33:25] that cheating comes,

[00:33:27] you know, from a lot of different factors

[00:33:29] that we kind of all understand.

[00:33:30] You cut corners when you’re,

[00:33:31] you know, stressed

[00:33:33] or, you know,

[00:33:36] short on time.

[00:33:37] You know, there are all these external factors.

[00:33:39] And I think

[00:33:42] if we can really get students to understand

[00:33:44] what they’re losing by cheating,

[00:33:47] that’s, like,

[00:33:47] the number one kind of

[00:33:48] method of

[00:33:50] making sure they don’t use AI.

[00:33:51] From there,

[00:33:52] like, there’s the practical stuff,

[00:33:53] like going back to blue books.

[00:33:55] You know,

[00:33:56] I talked to one professor

[00:33:58] who has

[00:33:59] switched entirely from essays

[00:34:01] to oral exams.

[00:34:02] And he really enjoys it.

[00:34:05] He sits down.

[00:34:05] I mean, he’s lucky in that he’s got small enough

[00:34:08] class sizes that he has time to do this.

[00:34:11] But he has really enjoyed

[00:34:13] having this one-on-one time with students

[00:34:14] and getting to ask them questions

[00:34:17] and hearing their responses.

[00:34:18] And it’s a better way for them to show

[00:34:20] their mastery of the topic.

[00:34:23] But he also admits that there’s something lost.

[00:34:25] What does he think is lost?

[00:34:27] Well, just the ability to write, right?

[00:34:29] Like, there are plenty of students

[00:34:32] who are going to do a better job

[00:34:34] writing

[00:34:35] and sitting and thinking

[00:34:36] than, you know,

[00:34:37] sitting down with a professor

[00:34:39] and fumbling through a conversation like that.

[00:34:42] That sucks for those students, and,

[00:34:44] you know, that would have sucked for me in college.

[00:34:47] So there is something that’s lost.

[00:34:49] So I think there’s got to be some sort of

[00:34:54] ad hoc way that can,

[00:34:55] you know, be a combination of blue books,

[00:34:57] orals and getting students to really understand

[00:35:01] the value of doing their own work.

[00:35:03] I kind of go back to something you said,

[00:35:05] a couple of minutes ago.

[00:35:08] The professor who thinks it’s a good thing

[00:35:11] that she was able to use AI to write her textbook.

[00:35:17] You can’t really ask students to not use AI

[00:35:20] to do their assignments if you’re using AI

[00:35:23] to produce your lectures or write your textbooks

[00:35:27] or do your lesson planning, right?

[00:35:28] A lot of professors, and I understand why,

[00:35:31] are also telling their students it’s okay to use AI

[00:35:34] as long as they cite their work or,

[00:35:36] you know, in some cases they’ll ask for a printout

[00:35:41] of the conversation between the student and the AI,

[00:35:45] just as a way to like kind of show the thinking.

[00:35:48] How many professors do you think now are just having AI

[00:35:51] write all their lectures?

[00:35:53] You know, there’s been a little reporting on this.

[00:35:56] I don’t know how many are.

[00:35:58] I know that there are a lot of platforms that

[00:36:03] are advertising themselves or, you know,

[00:36:08] asking professors to use them more,

[00:36:10] not just to write lectures, but to even grade papers,

[00:36:13] which of course, you know, as I say in the piece,

[00:36:15] opens up the very real possibility that right now

[00:36:18] an AI is grading itself and offering comments

[00:36:21] on an essay that it wrote, you know?

[00:36:23] And this is pretty widespread stuff.

[00:36:26] Microsoft has given this platform to like all of the students

[00:36:33] in the São Paulo, you know, high school network,

[00:36:37] and there are plenty of universities across the country

[00:36:41] offering teachers this technology.

[00:36:48] And so, you know, students love to talk about catching their

[00:36:52] professors using AI.

[00:36:54] I mean, it brings a lot of joy.

[00:36:57] I know another professor you spoke to just said,

[00:36:59] look, every time I talk to a colleague about this,

[00:37:01] the same thing comes up.

[00:37:02] Retirement.

[00:37:03] You know, that’s pretty bleak, but was that a pretty common level

[00:37:08] of demoralization in your reporting?

[00:37:12] I spoke to another couple of professors who are like,

[00:37:16] you know, I’m nearing retirement, so it’s not my problem.

[00:37:19] And good luck figuring it out, younger generation.

[00:37:23] Cool.

[00:37:24] Yeah, I mean, it is.

[00:37:26] I just don’t think

[00:37:29] people outside of academia realize what a seismic

[00:37:32] change is coming.

[00:37:38] And it’s in many ways a canary, right?

[00:37:41] This is something that we’re all going to have to deal with professionally.

[00:37:44] And it’s happening much, much faster than anyone anticipated.

[00:37:49] I spoke with somebody who works on education at Anthropic who said,

[00:37:53] you know, we expected students to

[00:37:57] be early adopters and use it a lot.

[00:37:59] We did not realize how many students would be using it.

[00:38:01] And how often they would be using it.

[00:38:04] I want to go back to the administrators.

[00:38:08] Is it your sense that a lot of university administrators are

[00:38:14] incentivized to not look at this too closely?

[00:38:18] That it’s better for business to shove it aside?

[00:38:23] I mean, I, I want to give administrators more credit than that.

[00:38:27] I don’t.

[00:38:28] You don’t.

[00:38:29] I mean, I guess you’ve learned.

[00:38:31] You have a lot more experience in academia than I do.

[00:38:34] So, you know, I’ll take your word for it.

[00:38:38] I mean, you know, I do think there is a vein of AI optimism among a certain type of person of a certain generation who, you know, saw the tech boom and thought, like, I missed out on that wave.

[00:39:03] And now I kind of want to adopt it, you know, I want to be part of this new wave, this inevitable

[00:39:08] future that’s coming.

[00:39:09] So they want to adopt the technology and aren’t really picking up on how dangerous it might be.

[00:39:16] I still know a lot of people who teach at universities, and I talk to them all the time.

[00:39:21] And a lot of them tell me that they feel very much on their own with this, that the administrators are pretty much just, hey, figure it out.

[00:41:31] And I think it’s revealing that university admins were very quickly able to,

[00:41:38] say, during COVID, implement drastic institutional changes to respond to that.

[00:39:47] But they’re much more content to let the whole AI thing play out.

[00:39:52] And just so that it’s clear what I’m saying, and this is me saying this, this is my opinion.

[00:39:58] I think that they were super responsive to COVID because it was a threat to the bottom line.

[00:40:04] They needed to keep the operation running.

[00:40:06] AI, on the other hand, doesn’t.

[00:40:08] It doesn’t threaten the bottom line in that way, or at least it doesn’t yet.

[00:40:11] But AI is a massive, potentially extinction-level threat to the very idea of higher education.

[00:40:20] But they seem more comfortable with a degraded education as long as the tuition checks are still cashing.

[00:40:27] You think I’m being too harsh?

[00:40:29] No, I don’t.

[00:40:29] You can punt.

[00:40:30] No, no, I genuinely don’t think that’s too harsh.

[00:40:33] I think there may be a factor where

[00:40:38] there is not much of an appreciation among administrators for the power of AI and exactly what’s happening in the classroom and how prevalent it is.

[00:40:49] I mean, but you are right.

[00:40:51] I did speak with many professors who go to administrators or even just older teachers.

[00:40:57] You know, TAs going to professors and saying, this is a problem.

[00:41:01] I spoke to one TA at a writing course at Iowa

[00:41:08] who went to his professor, and the professor said, just grade it like it was any other paper.

[00:41:13] You know, it’s sort of like, turn a blind eye to it.

[00:41:16] And that is one of the ways AI is, you know, challenging, kind of like exposing the rot underneath education.

[00:41:25] Like, it’s just this system that hasn’t been updated in forever.

[00:41:28] And in the case of U.S. higher ed, it’s like, yeah, for a long time,

[00:41:38] it’s been this transactional experience.

[00:41:41] You pay X amount of dollars, tens of thousands of dollars, and you get your degree.

[00:41:45] And what happens in between is not as important.

[00:41:49] And look, even if what you said a minute ago is true, right, that maybe a lot of the administrators

[00:41:52] don’t fully appreciate the power of these tools, that’s not really an excuse, right?

[00:42:00] I mean, that’s the result of a decision not to understand.

[00:42:04] Fair.

[00:42:04] And that, to me, is just as obscene.

[00:42:08] Totally.

[00:42:08] And many of these universities do have partnerships with AI companies.

[00:42:13] And what you said about universities can also be said about AI, that, you know, for the

[00:42:22] most part, these are companies or companies within nonprofits that are trying to capture

[00:42:29] customers.

[00:42:29] One of the kind of more dystopian moments, you know, we were finishing this story, getting

[00:42:34] ready to just completely close it, and I got a push alert.

[00:42:38] That was like, Google is letting parents know that they, you know, have created a chatbot

[00:42:42] for children under eight years old or 10 years old or something.

[00:42:47] And it was kind of a disturbing experience.

[00:42:52] But they are trying to capture these younger customers and build this loyalty.

[00:42:57] You know, there’s been reporting from the Wall Street Journal on OpenAI and how they

[00:43:03] have been sitting on an AI detector that would be really, really effective.

[00:43:08] Essentially, watermarking their output.

[00:43:13] And they’ve been sitting on it.

[00:43:14] They have not released it.

[00:43:15] And you have to wonder why.

[00:43:16] Wow.

[00:43:17] I did not know that.

[00:43:18] Yeah.

[00:43:19] You have to wonder.

[00:43:20] And, you know, you have to imagine they know that students are using it.

[00:43:23] And in terms of building loyalty, you know, an AI detector might not be the best thing

[00:43:28] for the brand.


[00:44:08] Support for the show comes from HIMS. They say their options range from personalized products to trusted generics that cost 95% less

[00:44:14] than brand names. HIMS say they bring expert care straight to you with 100% online access

[00:44:19] to personalized treatments that put your goals first. You can think of HIMS as your digital

[00:44:23] front door. You can step through to get back to your old self. It’s not a one size fits all

[00:44:28] approach. They say they put your health and goals first with real medical providers, making sure you

[00:44:33] get what you need to get results. You can get simple online access to personalized affordable

[00:44:38] care for ED, hair loss, weight loss, and more by visiting HIMS.com slash gray area. That’s HIMS.com

[00:44:46] slash gray area for your free online visit. HIMS.com slash gray area.

[00:44:56] Support for the show comes from Shopify. Starting a new business has never been easy,

[00:45:00] but without the right tools, it can feel almost impossible.

[00:45:04] Shopify says they can help set you up for lasting success.

[00:45:08] Shopify is the commerce platform used by millions of businesses around the world.

[00:45:12] They say they can help you tackle all those important tasks in one place from inventory

[00:45:17] to payments to analytics and more. No need to save multiple websites or try to figure out

[00:45:22] what platform is hosting the tool that you need. Everything is all in one place,

[00:45:27] making your life easier and your business operations smoother.

[00:45:31] Let Shopify be your commerce expert.

[00:45:33] With world-class expertise in everything from managing inventory to international shipping to

[00:45:38] processing returns and beyond, you can get started with your own design studio. With hundreds of

[00:45:44] ready-to-use templates, Shopify helps you build a beautiful online store that matches your brand

[00:45:49] style. It’s time to turn those what-ifs into reality with Shopify today. You can sign up for your

[00:45:56] $1 per month trial period and start selling today at Shopify.com slash box. Go to Shopify.com slash

[00:46:03] box. That’s Shopify.com slash box.


[00:47:33] If you don’t mind, I just want to ask you about some of this on a more personal level. I mean,

[00:47:45] as someone who is in the business of thinking publicly or writing publicly, if these tools

[00:47:52] existed when you were in college, do you think you would have used them? And if you did, do you

[00:47:56] think you would have become a writer at all? Could you have become a writer if you didn’t actually,

[00:48:01] you know, write?

[00:48:03] I mean, to this day, you know, I’m 39 and I will do anything other than

[00:48:19] write when I have to write. Like, I’ll do my taxes. I will eat a bag of nails. Like, I hate

[00:48:25] writing. And I am a professional writer. So, this tool,

[00:48:33] in the hands of 18-year-old me, you know, I would have used it. That being said, you know,

[00:48:39] I feel really good when I write now. And the idea of stringing together words and ideas is

[00:48:47] really important to me and makes me feel good about myself. And so, you know, the

[00:48:58] professors I spoke to, and I’m thinking of one TA in particular,

[00:49:03] who said the thing that he was most worried about is students taking the easy way out,

[00:49:11] that they didn’t sit down and struggle. And, you know, the idea of getting from a blank,

[00:49:17] blinking cursor to one page, even if that student’s not going to go on to be a writer,

[00:49:23] is really important for their sense of self and their ability to think in complex, critical ways.

[00:49:33] And that, to lose that is really scary. And no, I certainly, you know, I doubt

[00:49:40] I would be who I am and do what I do.

[00:49:46] Look, I mean, I will not judge any student who’s doing this, because I’m fairly certain I would

[00:49:56] have too, if I had the option. However, I think it’s really important that we not

[00:50:03] separate thinking and writing. Because in so many ways, writing is thinking.

[00:50:10] Right.

[00:50:10] And for me, at least, very often I don’t even know what I think until I write.

[00:50:16] Right.

[00:50:17] And to the extent I think well now as an adult, which is super debatable,

[00:50:23] you know, but it is because I spent years in school sitting with these books, reading these books,

[00:50:33] thinking about these books, they changed me. They inspired me. They set me on the course

[00:50:38] that I’m on. And if ChatGPT was doing the work for me, that would not have happened.

[00:50:45] Mm-hmm. Mm-hmm.

[00:50:46] I don’t think it’s even conceivable that it would have happened. I’d be a different person doing

[00:50:52] something different. I don’t know what that would be, but I’d be a different person.

[00:50:54] Certainly.

[00:50:55] And yeah, that is what’s so dispiriting to me about all of this. Right now I’m sermonizing.

[00:51:01] Well, no, I mean, the thing that—

[00:51:03] A lot of the defense that AI optimists put up is that it’s a

[00:51:09] calculator or, you know, I grew up with spell check and other generations before me didn’t have

[00:51:17] that. And just fundamentally, I say to these people, like, do you not understand the difference

[00:51:22] between a little green squiggly line after a dangling modifier and

[00:51:29] something that’s generating ideas and summarizing the books

[00:51:33] that you were supposed to have spent the past two weeks reading? Like,

[00:51:36] of course there’s a difference between these technologies.

[00:51:38] We’re a long way from spell check.

[00:51:39] Yes.

[00:51:39] You know what I mean? All right. You mentioned the calculator, right? I mean,

[00:51:46] which is a good time for me to ask the obligatory, are we sure we’re not old people yelling at clouds

[00:51:53] here question, right? People did panic about calculators. People panicked about the internet.

[00:51:59] Hell, Socrates panicked about the written word.

[00:52:03] How do we know this isn’t just another moral panic, one that might look silly in retrospect?

[00:52:09] To be clear, this is my opinion. I genuinely don’t know if it’s the case. I think there’s

[00:52:16] a lot of different ways we could respond to that.

[00:52:20] It’s not a generational moral panic. This is a tool that’s available,

[00:52:24] and it’s available to us just as it’s available to students. Society and our culture will decide

[00:52:31] what, you know, the morals

[00:52:33] are, and that is changing, the same way that the definition of cheating is changing. So

[00:52:38] who knows, it might be a moral panic today and it won’t be in a year. However, I think

[00:52:46] somebody like Sam Altman, you know, the CEO of OpenAI, is one of the people who said this is

[00:52:53] a calculator for words. And I just don’t really understand how

[00:52:58] that is compatible with other statements he’s made about AI, you know, potentially

[00:53:03] being lights out for humanity, or, um, statements made by people at

[00:53:09] Anthropic about the power of AI to potentially be a catastrophic event for humans.

[00:53:16] And these are the people who are closest to it and thinking about it the most, of course.

[00:53:22] Um, I have spoken to some people who say, you know, there is a possibility, and I think

[00:53:33] there are people who use AI who would back this up, that, you know, we’ve kind of maxed out AI’s

[00:53:40] potential to supplement essays or writing, that it might, you know, not get much better than it is

[00:53:47] now. You know, and I think that would be, like, a very long shot, and one that I would not want to bank on.

[00:53:54] Um, but if that were the case... The worst it will ever be. Yeah, yeah. And if that were the case, then

[00:53:59] we would look back, I think, on this conversation and be like, yeah, we were just kind of old people

[00:54:03] shouting at the clouds. Just a calculator for words. That is nauseating. That, like, that

[00:54:10] hurts me in my heart, James. And I think it’s more likely... Yeah. I don’t know.

[00:54:18] I’m interested in this idea of, like, the kind of

[00:54:22] post-literate world, and if we’re on that highway now. You mentioned earlier that you

[00:54:28] understood

[00:54:29] their fear. I think we were talking about the students, you understood some of the fears they

[00:54:32] have, and I’m sure you understand the fears of all the parties involved here,

[00:54:38] and that you share it. I mean, is this your biggest fear, that we are just hurtling

[00:54:43] towards the post-literate society? And I would argue, again, for the reasons I said

[00:54:49] earlier, if we are post-literate, then we’re also post-thinking. Uh, I mean, it’s a very scary thought

[00:54:59] that I try not to dwell on, because it’s just also a very depressing thought. And, um, the idea that,

[00:55:05] you know, my profession and what I’m doing is just feeding the machine, that, like,

[00:55:11] my most important reader now is a robot, um, and there’s going to be fewer and fewer readers, is

[00:55:17] really scary. Not just because of, you know, subscriptions, but because,

[00:55:22] as you said, that means fewer and fewer people thinking and engaging with these ideas. Um, I’d

[00:55:29] like to add to that, I think ideas can certainly, you know, be expressed in other

[00:55:35] mediums, and that’s exciting. Um, but I don’t think anybody who’s paid attention to the way

[00:55:41] technology has changed, um, and shaped teen brains over the past decade and a half can

[00:55:49] think, yeah, we need more of that. You know, I think, um, the technologies we’re talking about

[00:55:58] are orders of magnitude more powerful

[00:56:02] than the algorithms on Instagram or whatever?

[00:56:07] Look, I’m just a lowly political theorist,

[00:56:08] so what do I know?

[00:56:09] But I do not believe there’s a model of liberal democracy

[00:56:13] that works in a post-literate society.

[00:56:19] Don’t know one.

[00:56:20] That’s really scary.

[00:56:21] Yeah.

[00:56:23] Maybe someone can invent one that’s adapted to a society

[00:56:27] that can only think and communicate and speak in memes.

[00:56:30] But that’s not the one we have.

[00:56:32] And to get from the one we have to that one

[00:56:34] will probably be messy.

[00:56:37] Yeah, I don’t want to think about that.

[00:56:39] Yeah, yeah.

[00:56:40] I don’t really know how to pivot from all of that heaviness,

[00:56:43] so I’ll just do it.

[00:56:45] But it is a question I wanted to ask

[00:56:46] because I think it’s worth asking

[00:56:48] about every revolutionary technology,

[00:56:51] and this is definitely that.

[00:56:53] Do you think this will ultimately reinforce

[00:56:55] or…

[00:56:57] amplify existing inequalities

[00:57:00] the way a lot of revolutionary technologies do?

[00:57:03] Or do you think this might help in some way

[00:57:06] level the playing field?

[00:57:09] Is this something you thought much about?

[00:57:10] Yeah, I mean, I thought about it a lot

[00:57:11] in the context of education.

[00:57:14] You know, there are certainly really good use cases

[00:57:17] for AI in leveling the playing field, right?

[00:57:21] As a writing tool, it can be extremely helpful

[00:57:25] for people writing in

[00:57:27] their second or third language.

[00:57:29] It can be really helpful to create study guides

[00:57:33] for really dense material

[00:57:34] and, you know, put it in ways that are customized

[00:57:38] to your style of learning.

[00:57:40] That’s really cool.

[00:57:43] You know, in terms of how it could accelerate

[00:57:47] these inequalities, you know,

[00:57:49] the idea that writing and thinking

[00:57:52] at a college level can be even more specialized,

[00:57:57] you know, I kind of put a line

[00:57:59] in about writing

[00:58:00] becoming an anachronistic elective,

[00:58:03] like basket weaving, right?

[00:58:05] It’s like, it’s going to be only for people

[00:58:06] who go to this certain school

[00:58:07] or can afford this certain school

[00:58:09] and have the time and privilege to write

[00:58:12] and therefore to, you know, engage with ideas and think.

[00:58:16] And if that’s the case,

[00:58:18] who’s going to be able to afford

[00:58:21] to have those experiences?

[00:58:24] All right.

[00:58:25] James Walsh.

[00:58:26] The piece is outstanding.

[00:58:27] It is called Everyone is Cheating Their Way Through College.

[00:58:32] I highly recommend it.

[00:58:33] Go read it.

[00:58:35] Thanks for coming in, man.

[00:58:36] Thanks, Sean.

[00:58:37] I enjoyed it.

[00:58:45] All right.

[00:58:46] I hope you enjoyed this episode.

[00:58:48] I think you can tell that I did.

[00:58:50] As always, we do want to know what you think.

[00:58:53] And I know at the end of every one of these episodes, I ask you to send me your thoughts.

[00:58:58] But just know that they really do mean a lot.

[00:59:01] I read every single note that we get.

[00:59:03] My team reads every single note that we get.

[00:59:06] And whether they’re positive or negative, we try to learn from them.

[00:59:10] So just know I’m asking sincerely.

[00:59:13] And it means a lot to us.

[00:59:15] You can leave us a message on our new voicemail line at 1-800-214-5749.

[00:59:21] And if you do that,

[00:59:23] once you’re finished, go ahead and rate and review and subscribe to the podcast.

[00:59:27] It really does help.

[00:59:28] This episode was produced by Beth Morrissey, edited by Jorge Just, engineered by Christian Ayala,

[00:59:34] fact-checked by Melissa Hirsch, and Alex Overington wrote our theme music.

[00:59:38] New episodes of The Gray Area drop on Mondays.

[00:59:41] Listen and subscribe.

[00:59:43] The show is part of Vox.

[00:59:45] Support Vox’s journalism by joining our membership program today.

[00:59:49] Go to vox.com slash members to sign up.

[00:59:51] And if you decide to sign up

[00:59:53] because of this show, let us know.