Exploring the Impact of Generative AI on Software Engineering and Career Paths


Summary

In this episode of the Engineering Culture podcast, host Shane Hastie speaks with Alex Cruikshank, a Director of Software Engineering with over 25 years of experience, about the impact of generative AI on the software engineering profession. Alex draws parallels between the current AI revolution and past technological shifts, emphasizing that while change is inevitable, the fear of immediate job loss is often overstated due to the inherent complexity and human-centric nature of software delivery.

Alex predicts that generative AI will most significantly affect front-end engineering, as it represents a new, more natural way for humans to communicate with computers. This could lead to a shift away from complex, manually-built interfaces toward more conversational and hybrid AI-driven models. However, he stresses that the core skills of software engineering—understanding system architecture, problem-solving, and translating human intent into structured logic—will remain essential, albeit augmented by AI tools.

The conversation explores practical implications, including the importance of developers gaining hands-on experience with AI APIs, the role of AI as a ‘copilot’ that boosts individual coding productivity by roughly 30%, and the open question of how AI-assisted coding impacts the learning trajectory of novice programmers. Alex also highlights the need for designers to develop ‘AI literacy’ to craft effective conversational and hybrid user experiences.

From a cultural and leadership perspective, Alex warns against creating a divide between ‘AI haves and have-nots’ within engineering teams and advocates for rotating team members through AI projects to build widespread competency. He advises new engineering leaders to value team contributors holistically, recognizing that not all value comes from code output, and to focus on fostering healthy team dynamics as the foundation for performance.


Recommendations

People

  • Alex Cruikshank — The guest, a Director of Software Engineering at West Monroe with 25+ years of experience, who has been working extensively with generative AI integration over the past year.

Tools

  • OpenAI API — Recommended as the easiest API for developers to start experimenting with generative AI, to learn how to send messages and structure interactions.
  • GitHub Copilot — Described as an accurate ‘copilot’ that augments a developer’s coding, often by completing thoughts written in comments, leading to a noted increase in programming productivity.

Topic Timeline

  • 00:01:10 Generative AI as a technological revolution — Alex introduces the topic of his talk on organizational resilience and generative AI. He frames AI as another technological revolution that follows a gradual adoption curve, alleviating fears of immediate mass job displacement. He argues that human workers deliver immense, often intangible value through collaboration and problem-solving, which is difficult to automate fully, even with advanced AI.
  • 00:03:30 Future career paths in software engineering — Shane asks where developers should focus their career paths. Alex asserts that software engineering will remain crucial for decades but will evolve. He believes front-end engineering will be most affected because AI provides a more effective, natural language interface between humans and computers, potentially reducing the need for complex, manually-coded UI components. However, engineers will still be needed to architect these AI-driven systems.
  • 00:05:37 Practical advice for developers using AI — Alex strongly recommends that developers start working with AI APIs, like OpenAI’s, to understand how to communicate with and structure processes around AI models. He shares that his company built an internal tool to give developers hands-on experience, which successfully prepared them for client AI projects. The key challenge is achieving consistency and implementing proper guardrails and intent-gating within AI-integrated systems.
  • 00:07:23 AI as a team player and productivity tool — Discussing AI’s role within teams, Alex views AI as part of the ‘machinery’ rather than a team member, though tools like GitHub Copilot act as a valuable copilot. He cites an internal study showing AI improved programming productivity by about 30%, but overall productivity only by 10%, as programming is just one part of a developer’s job. The rest involves communication, requirements gathering, and design, where AI currently offers less assistance.
  • 00:09:52 The impact of AI on novice programmers — Alex addresses the ‘million-dollar question’ of whether AI helps or hinders novice programmers. While AI can make them more productive, there’s a risk it becomes a crutch that prevents deep learning. He notes that code reviews by senior developers should catch AI-generated mistakes, but the long-term effect on skill development is an open question—it could either accelerate or impede their growth as engineers.
  • 00:11:16 Designing products with integrated AI — The discussion turns to designing software products that incorporate AI. Alex states that designers need ‘AI literacy’ more than ever. He envisions a move towards conversational interfaces and ‘hybrid’ models where traditional UI controls and AI chat interact seamlessly via a shared data model. This approach allows users to manipulate the interface through both direct controls and natural language conversation with the AI.
  • 00:14:15 Cultural impacts and AI’s role in metrics — On the cultural front, Alex warns of a potential divide between teams working with and without AI. He advocates for cycling all developers through AI projects to prevent this. He also speculates that AI, particularly its ability to analyze unstructured data like stories and commit messages, could eventually help create more nuanced and accurate metrics for team performance, moving beyond simplistic individual productivity measures.
  • 00:16:49 Advice for new engineering leaders — Offering advice to new team leaders, Alex emphasizes valuing all team members for their broader contributions, not just code output. Some individuals significantly improve team dynamics and performance without being the top coders. A leader’s primary focus should be on fostering healthy team dynamics and removing obstacles that prevent the team from achieving its collective goals.

Episode Info

  • Podcast: Engineering Culture by InfoQ
  • Author: InfoQ
  • Category: Technology
  • Published: 2024-02-23T07:00:12Z
  • Duration: 00:18:29

Transcript

[00:00:00] Scrum.org focuses on professional Scrum and supports people wherever they are on their learning journey.

[00:00:06] We help them to grow over time with ongoing learning opportunities and resources such as forums, blogs, and more.

[00:00:12] Share your knowledge and gain new insights. Visit Scrum.org to learn more.

[00:00:20] Good day, folks. This is Shane Hastie for the InfoQ Engineering Culture Podcast.

[00:00:24] Today, I’m sitting down with Alex Cruikshank. Alex, well, I’ll tell you what, I’ll let Alex introduce himself.

[00:00:31] Alex, welcome. Thanks for taking the time to talk to us.

[00:00:34] Yeah, thanks, Shane. So, I’m Alex Cruikshank. I am a Director of Software Engineering at West Monroe.

[00:00:41] I have been doing software engineering professionally for at least 25 years.

[00:00:46] I don’t know, like it goes back further than that, depending on how you define it.

[00:00:50] I’ve worked most of the time in consulting, so I’ve worked with over 50 companies.

[00:00:54] Doing various types of projects. That’s been software, small startups, to large enterprise companies.

[00:01:00] Doing architecture, everywhere in between.

[00:01:02] So, I’ve been working with a lot of different technologies.

[00:01:06] And over the last year, I’ve been working almost exclusively with generative AI.

[00:01:10] And figuring out how we can integrate that into projects.

[00:01:13] And that’s what’s brought us here today.

[00:01:15] You did a talk at QCon San Francisco last year, which was really well received.

[00:01:21] Can you give us the very short overview?

[00:01:24] In the show notes, we’ll link to the talk itself, so people can watch that.

[00:01:28] But, what were the highlights?

[00:01:29] I talked about organizational resilience and generative AI.

[00:01:33] And so, mostly, my talk was about how generative AI is going to change things.

[00:01:39] I don’t think anyone’s surprised that there’s going to be some changes with generative AI.

[00:01:43] It certainly changed a lot of what we talked about last year.

[00:01:46] But, I wanted to talk about it in terms of a technological revolution and how those typically play out.

[00:01:52] How it typically takes time for…

[00:01:54] New technologies to actually get into the workforce, to change the way we do things.

[00:01:59] But, there’s a lot of fear that comes along with that, that people are going to lose their jobs.

[00:02:03] There’s a lot of, sometimes, desire for that to happen more rapidly.

[00:02:07] Because, you know, it’s like, if we could just use AI to replace all these people today, you know, we’d save a lot of money.

[00:02:14] But, the reality is that there shouldn’t be quite as much fear because those things take a lot of time.

[00:02:20] And the main thing is that the people that are there are actually…

[00:02:24] Almost always delivering a lot more value than it actually appears.

[00:02:28] They’re doing a lot more than it says in their job description.

[00:02:30] It’s just the innate ability of humans to interact with each other and figure out problems and make things happen.

[00:02:38] Makes it really, really hard to automate and replace people with any kind of technology.

[00:02:43] And generative AI is no different.

[00:02:45] Even if it is a little bit more, like, closer to the way humans think.

[00:02:49] That’s the gist of it.

[00:02:50] So, given that summary…

[00:02:53] As a developer…

[00:02:54] Where should I be looking to build my career path?

[00:02:59] Yeah, that’s a really interesting question right now.

[00:03:02] I mean, in the same way that it’s a really interesting question, you know, as an entrepreneur…

[00:03:07] What should I do with a startup?

[00:03:09] Because generative AI is out there and it’s like, every single day there’s like, you know, there’s a new opportunity.

[00:03:15] And then that opportunity gets shut down the next week because, like, some other company is already there or things like that.

[00:03:22] The thing is, I firmly believe…

[00:03:24] Software engineering is going to be extremely important over the next several decades, at least.

[00:03:29] I don’t think that’s going away.

[00:03:30] I do think it’s going to change.

[00:03:32] The interesting thing is, I think it’s going to affect the front end engineering the most.

[00:03:37] And the reason why I think that is because when you get down to it, AI is really a different way for humans to communicate with computers.

[00:03:45] It’s a much more effective way for humans to communicate with computers because it basically speaks our language or, you know, it is speaking our language.

[00:03:53] And that’s a lot of what…

[00:03:54] Programmers have been doing over the years is kind of translating human thought, intent, what the desire is, and somehow into a way that a computer can understand it, you know, very structured data structures can be put into a database or acted on.

[00:04:08] And, you know, now we have a way to go from just like an unstructured, I just want to do this to structure using generative AI.

[00:04:18] And so I think that’s really going to change the way we do interfaces and maybe not even that long.

[00:04:23] I mean, we’ve been playing around

[00:04:24] already with these hybrid interfaces where, you know, you have a big wizard or like just a bunch of filters on a screen or something like that.

[00:04:30] It ends up being a complicated interface.

[00:04:32] And over the last, well, 20 years with web interfaces, we’ve been learning how to do this well and kind of coming up with good patterns for that.

[00:04:39] But the thing is, we put a lot of effort into building those complicated interfaces that can be buggy.

[00:04:44] You know, there’s a lot of things with AI.

[00:04:46] Maybe we don’t need them quite so much.

[00:04:48] And so that’s one of the reasons why I think front end development is going to change.

[00:04:52] But there are still going to need to be

[00:04:54] people that use AI to implement those interfaces.

[00:04:58] And it needs to be like people are going to have to wire the two things together and understand still how to get the AI to talk to the computer and what the computer needs to do and figure out what the solution is actually under the hood.

[00:05:09] So that’s why I think there’s always going to be that need.

[00:05:13] But I do think we’re going to see a transition over the next really 10 to 20 years.

[00:05:17] I don’t think it’s going to happen immediately.

[00:05:19] So good news from the point of view of you’re not going to be out of work tomorrow.

[00:05:24] Good news from the point of view that a lot of the underlying skills of engineering are still going to be there, and that we should learn to talk to the AI.

[00:05:37] This is something I can’t say enough is, yeah, people should be working with AI as developers using the API.

[00:05:44] OpenAI’s is the easiest right now to use and get used to how you send messages to it and get messages back.

[00:05:51] It’s just an API.

[00:05:52] It’s actually really easy to do.

[00:05:54] To get started with.

[00:05:55] But the problem is when you start playing with it more and trying to get more out of it, getting that consistency of the AI is the real trick and figuring out how to prompt it in the right way, how to effectively get communication back and kind of architect those systems where you’re not just getting communication to and from the AI.

[00:06:12] You’re actually effectively structuring a process and putting in the guardrails, doing the intent gating, kind of all of the other steps you need to do and understanding what those are.

[00:06:23] Those are all things.

[00:06:24] That I think people really should be getting experience with.

[00:06:27] So at my company, West Monroe, we have a tool which started out as just kind of like a ChatGPT clone that we could use in our company.

[00:06:37] It was using Azure, so it was a little bit better from a legal standpoint and people were just chatting with it.

[00:06:42] But it gave us a kind of a base where we could start building upon it and building new tools on top of AI.

[00:06:49] And then we started cycling as many people through it.

[00:06:52] People that were not actively

[00:06:54] assigned to a project.

[00:06:55] We’d bring them in and have them start working on this tool and start building out capabilities with it so that they could just get a little familiarity with what’s going on.

[00:07:04] And now those people are all on AI projects.

[00:07:07] It’s like, because, you know, demand for AI work has gone up quite a bit and we need people that are familiar with the technology to go on those projects. And that’s been really rewarding to see.

[00:07:17] Consistently, the team has been the unit of value delivery in software engineering.

[00:07:23] Yeah.

[00:07:23] Software engineering, it’s never the individual.

[00:07:26] How does the AI as a team player play out?

[00:07:31] I guess the thing is, I don’t see the AI as ever being part of the team, though maybe that’s just because I haven’t really thought about it enough.

[00:07:38] And it’s definitely something to think about.

[00:07:40] But to me, the AI is still part of the machinery.

[00:07:44] Really, it’s still the thing that the team is acting on.

[00:07:47] I say that.

[00:07:48] But then with the coding tools, it’s different.

[00:07:50] I mean, I use GitHub Copilot and.

[00:07:53] It’s accurately named.

[00:07:54] It does kind of feel like a copilot.

[00:07:56] It’s like sitting there coding with me.

[00:07:58] Sometimes I will kind of like write out a comment and then just take a ten second break while it completes my thought for me, which is just great.

[00:08:06] And a lot of times it gets it right, though maybe not always all the way.

[00:08:11] Right. But it really helps out a lot.
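The comment-first workflow Alex describes looks something like this: the developer writes the intent as a comment, and the assistant drafts the body. The completion below is a hypothetical illustration of what such a tool might produce, not actual Copilot output.

```python
from datetime import date, datetime

# Developer's prompt-as-comment:
# parse a list of ISO date strings, skipping any that are invalid
def parse_dates(strings: list[str]) -> list[date]:
    # Plausible assistant-drafted body follows.
    results = []
    for s in strings:
        try:
            results.append(datetime.strptime(s, "%Y-%m-%d").date())
        except ValueError:
            continue  # skip invalid entries rather than failing the batch
    return results
```

As Alex notes, the draft is not always right all the way, so reviewing the completion remains the developer’s job.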

[00:08:13] So I think there is that kind of you can have the AI kind of work with you as a developer, and I still feel like it’s an augmentation and not just like.

[00:08:23] Something that sits next to a programmer.

[00:08:25] I think that’s one of the things that I’ve done a little bit of a study within our group of people that are using this and kind of how effective they were when they were using it, what effect it had on their programming performance.

[00:08:36] And everyone said that it improved their productivity.

[00:08:39] But what they said is it was kind of like 30%.

[00:08:47] And then we asked the question again: you know, that’s your programming productivity.

[00:08:47] How much has it improved your productivity overall?

[00:08:50] And they all dropped it down to like 10%.

[00:08:53] Because, you know, we don’t spend 100% of our time programming.

[00:08:56] We spend most of our time talking to people, understanding their requirements, writing stories, like developing what we need to do, having architectural conversations with people.

[00:09:06] And, you know, the copilot is not in any of that.

[00:09:08] So some of that might be able to get a little help with AI, too.

[00:09:12] So we might see it creeping in some other areas.

[00:09:15] Maybe we get a little closer to that kind of 20, 30% performance improvement overall.

[00:09:22] But for right now.

[00:09:22] It’s much lower.

[00:09:24] So it’s part of the team, but it’s kind of a minor part of the team right now, I’d say.
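The gap between the roughly 30% coding gain and the roughly 10% overall gain is straightforward arithmetic once you account for how much of the job is actually coding. A quick sketch, where the 40% coding share is an illustrative assumption rather than a figure from the episode:

```python
# If an AI assistant makes coding 30% more productive, the same coding
# work takes 1/1.3 of the original time. The overall gain depends on
# what fraction of the job is coding at all (Amdahl's-law style).

def overall_time_saved(coding_share: float, coding_gain: float) -> float:
    """Fraction of total working time saved, given the share of time
    spent coding and the productivity gain on that coding time."""
    new_coding_time = coding_share / (1 + coding_gain)
    return coding_share - new_coding_time  # saved time as a fraction of total

# Illustrative assumption: ~40% of a consultant-developer's week is coding.
saved = overall_time_saved(coding_share=0.4, coding_gain=0.3)
print(f"Overall time saved: {saved:.1%}")
```

With those numbers the overall saving comes out just above 9%, in line with the roughly 10% the developers reported.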

[00:09:29] You said it gets it right most of the time.

[00:09:31] How do you know when it gets it wrong?

[00:09:33] I’ve been programming for 25 years.

[00:09:36] It’s like, did it write it the way I wanted it to write it or did it not?

[00:09:39] But the thing is, yeah, it’s easier to like look at wrong code and understand it’s wrong and fix it a lot of times than it is to write it from scratch.

[00:09:47] So it’s still quite helpful, even in those circumstances.

[00:09:51] And what if I was a novice?

[00:09:52] Would I know?

[00:09:53] This is another really good question.

[00:09:55] I mean, for me, this is the million dollar question. Definitely, AI helps novice programmers.

[00:10:02] They can be more productive.

[00:10:04] It is doing a lot of the work for them.

[00:10:07] And the question is, is it too much of a crutch?

[00:10:09] You know, are they learning?

[00:10:11] The thing is, they’re novice programmers.

[00:10:13] They should never be able to submit code without some kind of code review.

[00:10:16] So there’s always like someone more senior that’s going to be monitoring it.

[00:10:20] And, you know, the thing is, like, that means.

[00:10:22] If AI is making mistakes and they can’t catch them, then hopefully someone else is.

[00:10:26] But they need to get to the point where they’re that senior developer eventually.

[00:10:30] And the question is, is that developer giving the feedback to them or is it giving it to the AI at that point?

[00:10:36] Which it’s, of course, going to ignore.

[00:10:38] So, you know, it’s like, are they learning at the same rate?

[00:10:41] I think it’s a completely open question.

[00:10:43] It could be no and it’s bad.

[00:10:45] It’s a bad habit.

[00:10:47] Or it could be actually they’re learning faster.

[00:10:49] You know, it’s a great way, you know, because they’re watching the AI.

[00:10:52] Generate the code.

[00:10:54] They’re seeing how it gets done and usually in a good way that it may be just boosting

[00:10:59] their performance as learners, I guess, hopeful that that’s the case and we’ll see.

[00:11:05] What about designing software products, not well using AI to help design and code and

[00:11:12] build the products, but products with AI built in.

[00:11:16] So now we’re building a product, pick a domain.

[00:11:19] You’ve got a CRM.

[00:11:20] What do I add?

[00:11:21] Yeah.

[00:11:22] That’s another one of my favorite topics.

[00:11:24] I think all designers probably even more than developers need to get AI literacy.

[00:11:30] They need to get used to what does it mean to have AI in the picture?

[00:11:34] Like I said earlier, I think AI is going to have a big impact on the way we think about

[00:11:38] user interfaces.

[00:11:40] But then there’s like the kind of step beyond that.

[00:11:42] I mean, a lot of things can just be a chat bot.

[00:11:45] I’m not totally sure that’s like everything’s going to be a chat bot, but things can be

[00:11:50] a chat bot.

[00:11:51] Yeah.

[00:11:52] You can have a conversation with it.

[00:11:53] And there becomes kind of this notion of conversational design.

[00:11:57] It’s like, you’re not just going to let the AI out of the box, you know, just kind of

[00:12:05] have like, you know, OpenAI’s brand, essentially.

[00:12:05] You want it to have your voice.

[00:12:07] You want it to create the experience for the users that your programming interface has

[00:12:11] created in the past.

[00:12:12] So I think designers need to start thinking a little bit in terms of conversation in terms

[00:12:17] of creating these experiences.

[00:12:20] And I think we all need to be thinking about

[00:12:22] intermediates. Like, you know, going all the way to chat for everything.

[00:12:26] People aren’t going to want to do that and it’s not the right interface for everything.

[00:12:30] But then again, you know, it’s like there’s some simple, you know, if all I have to do

[00:12:34] is click the button, I’d rather just click the button rather than say, click the button

[00:12:37] now.

[00:12:38] But there’s interfaces in between that can be much more complicated.

[00:12:42] And this is where we’ve done some experiments around hybrid interfaces where we have a common

[00:12:48] model of like what the interface is representing.

[00:12:51] And that

[00:12:52] model,

[00:12:53] you know, it both controls and is controlled by the user interface, but it’s also fed into

[00:12:58] the prompts for a text input and then you can have a conversation and the AI will present

[00:13:03] you with a new model, a new version or part of it.

[00:13:05] And then you put that back into the model and the two can play with each other so that

[00:13:10] if you change something over here, the AI knows about it.

[00:13:12] If you change something in the AI, the UI switches.

[00:13:15] And I have a lot of hope for these type of interfaces, at least as a kind of bridge maybe

[00:13:20] between worlds.

[00:13:21] Kind of where we were and where we’re going to be.

[00:13:24] So from that architecture and design perspective, again, the need to really understand, I want

[00:13:32] to say the potential of these new tools, which is not different to any other tool we’ve come

[00:13:38] across over the years.

[00:13:39] No, absolutely not.

[00:13:41] It provides new opportunities.

[00:13:43] And I think there are going to be new expectations.

[00:13:46] And so you just have to understand both of those and, you know, just kind of stay on

[00:13:49] top of it.

[00:13:50] Yeah.

[00:13:51] And you know, I think it’s a great tool to have in your software, so that what you’re delivering is

[00:13:52] state of the art.

[00:13:53] But, you know, I think the thing is, the nice thing about UIs is we all have to use them.

[00:13:59] And so we’re all constantly exposed to new ideas.

[00:14:02] And so it’s not that hard to stay up to date with that.

[00:14:05] This is the Engineering Culture podcast.

[00:14:08] What are the cultural impacts for teams of bringing in generative AI at both sides of

[00:14:15] this?

[00:14:16] Well, initially, one of the things I’m seeing a little bit of, and, you know, worried a little bit

[00:14:21] about, is that it’s going to potentially create a separation between the AI haves

[00:14:26] and have nots, you know, like there’s people working on systems that aren’t using AI.

[00:14:31] And then there are people that are like revamping systems or building new systems that are using

[00:14:36] it.

[00:14:37] And so that’s something to watch out for.

[00:14:39] And that’s kind of what I was saying.

[00:14:41] It’s great to get people cycled through that.

[00:14:43] So everyone has those capabilities and it’s not like you have your old fashioned developers

[00:14:47] and your new fashioned developers.

[00:14:49] But as far as other concerns.

[00:14:51] I don’t know, I don’t think teams, you know, it’s like building teams, making teams run

[00:15:01] well is an art in itself.

[00:15:01] I don’t see how that’s going to change.

[00:15:03] I mean, I think we’re still going to need to have effective engineering cultures, no

[00:15:07] matter what we’re doing.

[00:15:08] So maybe you can imagine ways that AI would be able to help inform that process.

[00:15:15] Or maybe we’d have new tools to help us communicate better.

[00:15:19] But I haven’t seen those come out yet.

[00:15:20] One of the things I am kind of interested in is there’s been a lot of talk and work

[00:15:26] and interest lately in metrics for development.

[00:15:33] And the nice thing that’s come along with that is the understanding that

[00:15:33] it’s just a hard thing to do.

[00:15:35] You know, it’s like DORA metrics work well, but they kind of work in an operations environment,

[00:15:40] like production code. And then SPACE metrics work pretty well, but they’re like, well,

[00:15:45] they work well, but they kind of measure teams. Like, when you’re looking

[00:15:49] at it,

[00:15:50] if you have a developer population, they tend to work well. But, you know, then if you try

[00:15:54] to take it down to an individual level, they don’t.

[00:15:57] And you know, I’m not sure the goal should be like measuring individual performance because

[00:16:01] you know, teams, what you really care about is your team.

[00:16:04] But I do think AI potentially has like the ability to help with metrics and make better,

[00:16:10] more nuanced metrics on team performance.

[00:16:13] That’s probably more traditional AI, not generative AI.

[00:16:16] Although, well, the thing I’ve been interested in is like, there’s a lot of unstructured data

[00:16:20] in the development process. You know, it’s like your stories and your code. I mean, you

[00:16:26] can analyze the code, but generative AI, like all those things you could potentially bring

[00:16:30] into the metrics process and get a better idea of the actual complexity of a story or

[00:16:35] even identify what was blocking this story or this story or this story and some of the

[00:16:41] like, what was going on behind the scenes to get a better sense of what was going on.

[00:16:44] A lot of our audience are relatively new leaders.

[00:16:49] Engineers.

[00:16:50] I think, you know, the number of people that are doing this stuff, the number of people

[00:16:53] that are being promoted into a team leader position, make up a fair number of our target audience.

[00:16:56] You’ve got 25 years of doing this stuff.

[00:16:59] In general, what advice would you give them?

[00:17:01] I think the first thing is you need to look at all your team members as contributors in

[00:17:07] one way or another, but they’re not all necessarily the biggest contributors in terms of code.

[00:17:12] So that doesn’t mean they’re not big contributors to the team.

[00:17:16] And so you need to watch out for that.

[00:17:18] Because some people just make team changes.

[00:17:20] Just make teams better.

[00:17:21] And, you know, sometimes they do that in addition to like cranking out all our code and sometimes

[00:17:26] they don’t.

[00:17:27] But that’s making the team perform better.

[00:17:28] So yeah, you just always look out for the team dynamics and basically how people are

[00:17:34] getting along to make sure that that is optimal.

[00:17:37] And then you can start worrying about like, you know, how is the team performing overall?

[00:17:42] And is there some issue there that’s keeping it from achieving its goals?

[00:17:46] Well, Alex, lots of good points and interesting observations.

[00:17:49] If people want to continue the conversation, where do they find you?

[00:17:54] Best place is on LinkedIn.

[00:17:55] It’s Akershank on LinkedIn.

[00:17:56] I work at West Monroe.

[00:17:59] That’s probably it these days.

[00:18:00] I’m pretty much off all social media.

[00:18:02] Thank you so much for taking the time to talk to us today.

[00:18:04] Yeah, Shane.

[00:18:05] It’s been great talking to you.

[00:18:19] Thank you.

[00:18:24] It was a pleasure.
