Rethinking growing engineers in the age of AI coding

Meri Williams, CTO at Pleo, discusses scaling teams while keeping careers moving in an increasingly AI-first world.

  • Meri Williams, Chief Technology Officer, Pleo
The transcript below has been generated using AI and may not fully match the audio.
Hi everyone. So yeah, I'm Meri. I've done lots of things over the course of my career, but I had the great joy of being CTO at Monzo when what I refer to as the beta of incident.io was built there. Chris has been coding on something along these lines for many years, though, because we worked together at Monzo and he was building our Slack agents to make incident response easier there as well. But yeah, I'm a CTO these days. I started out as a hardware hacker, and my fun fact is that I built part of South Africa's first satellite when I was a teenager. So, who here has got kids? This is audience participation, pay attention. There we go. I have great parenting advice for you, which is: don't let your kids do anything cool when they're young. I soldered something that went into space when I was 16 and it's all been fucking downhill since then. Some of you may know me from the LeadDev conference as well.

Today I wanna talk about how AI coding is changing things, and in particular how it's changing how we develop people. I don't think any of us would say that it's not changing things; it's very clearly making a big impact. And I think we're increasingly seeing people refer to AI coding assistants as like a junior engineer, like the smartest intern you've ever had. I've heard all sorts of versions of this, and they are to some extent capable of the work juniors previously did. But if AI does the junior engineer work, what happens to the junior engineers? Some people think we're not gonna need them anymore. 75% of entry-level roles are gone already this year, which is a horrifying statistic if you're a computer science graduate or you've just done a bootcamp. But I wonder where people think senior engineers come from. Do they think that they spring fully formed, like Minerva from Jupiter's forehead? That's okay, it was a classics joke. I knew it wasn't gonna go well in this audience.

But luckily we're not alone. I went and talked to people in law, where they have exactly the same concern. Lots of lawyers come from being paralegals, and a lot of the paralegal work can now be done by AI. Although, equally, there have been some lawyers disbarred and censured and similar for citing court cases that didn't exist because the LLM hallucinated them, so it's not a done deal in their industry at all. My wife's an architect, a real one, not a technical one, not an information one; she's built hospitals. And it's been really fascinating over the course of her career to just watch, because she was taught by a generation that was still drafting things by hand with pens and ink and paper, and they had no idea how to train this generation that was doing computer-aided design. And then her generation doesn't know how to train the generation that is doing 3D modelling and then just taking pictures of things from different angles to get their plans and elevations and so on.

And so I think we're typically very insular in the tech industry. One of the things that really frustrates me, and the reason we started LeadDev, is that we went back to Taylorism. We said, let's start with 1920s manufacturing management philosophy and we'll iterate our way from there, rather than paying attention to the hundred years of research that's been done since. And so I do think that it's worth us continuing to look at some of these other industries, noting what they're finding easy and what they're finding hard, and trying to steal from them wherever possible.
In prepping for this talk, I interviewed a bunch of associate engineers, which is what we call juniors so that we're not being ageist bastards, and also a bunch of principal engineers, staff engineers, and so on, and found some interesting stuff.

Traditional associate work is well scoped and well defined. They often do similar work repeatedly. They pair with other engineers a lot. They get clear and rapid feedback in the form of tests passing or failing, pull request reviews, and pairing with other engineers. But they also do a lot of just searching for answers on Google and Stack Overflow, and increasingly now AI. And one of the things that every associate also talked about was this need to learn when to struggle through a bit, when to keep trying, and when to get help. I think that's a human judgment thing that's incredibly hard to instill in people without them having some time and experience.

One of the core concepts in the theory of learning is deliberate practice, which is a little academic, so just bear with me for a second. It basically says: you need to be willing to exert effort to improve your performance, so you're willing to put the work in; the design of the task takes into account what you already know, so you can quickly execute it correctly without loads and loads of training; you receive immediate, informative feedback and knowledge of the results of your performance; and you repeatedly perform the same or similar tasks. When people talk about the 10,000-hour rule and getting to world-class performance, this is the type of practice they mean. Everybody who tries to debunk the 10,000-hour rule says that just doing the same thing over and over again doesn't teach you shit, and they're right. But the research is quite specific that it's this kind of practice that makes the difference.

There are different models of deliberate practice too, right? There's the sports model: this is why rugby players lift weights. They're not lifting weights on the field, but they need to be strong in order to do the things that they do. There's the chess model, where you look at what the grandmaster did and figure out what you would do and whether it's different or not. And then there's the music model, which is chunking things up and rehearsing them. I think we can take those three models and apply them to incident response, but aware of the time, I'm not gonna go into that in a load of detail.

The thing that I think we have accidentally lucked into is that associate work has been designed in a way that makes it really effective deliberate practice. It's challenging. They're getting paid, so they're willing to exert effort to get better. They get immediate, useful feedback, whether in the thing not working, in tests passing, or in feedback from other engineers. They tended to do quite a lot of repetitive work, because that was a good way to master what they were doing. And they were actively learning all the time.

And all humans, and actually all primates, go through the same process of learning new skills. We start out unconsciously incompetent: you don't know what you're doing, and you don't even know that you don't know what you're doing. This is somebody who doesn't even know what the gear change is for in a car. You go on to being consciously incompetent.
You know now that you're doing it wrong. This is the learner driver grinding the gears and forgetting to check their mirrors. I appreciate I should make this some sort of tube analogy, but in a week when the relevant tube line isn't running, I figured that was just a disastrous thing to attempt, so we're sticking with driving cars. You become consciously competent eventually, and that is the reason why somebody who has just passed their driving test spends all of their energy paying attention to driving. But for those of you who drive: who's gotten to work and not been able to remember any of the journey? The rest of you either don't drive or are lying. And that's because we get to a point of being unconsciously competent; we're automatically doing the thing.

One of the reasons this is interesting is that this is why sometimes your most expert engineers suck at helping your most junior engineers. They're least able to explain what they actually do, but they don't like saying, "My hindbrain decided for me." So when you ask them why they did what they did, they make shit up. They're not trying to be malicious, but it doesn't help other people to learn from the thing they made up on the spot about why they did what they did, rather than the actual reality, which is: this is so ingrained in me now that I just automatically get it right.

And so, on this journey in the past, I think engineers learned a lot by osmosis. You got a lot for free just by being entry level, doing this repetitive type of work that was accidentally really well designed to be deliberate practice, so it was accidentally a really effective form of learning. And it's important for juniors to learn things like foundations, how things fit together, and smells: how do you predict something's gonna go wrong? I think a lot of entry-level engineers learn that from more senior engineers reviewing their PRs and going, "Whoa, hang on, I've seen the bad thing that will happen if we do it this way." They learn debugging skills. They learn refactoring skills. And then they need to learn how to learn, to some extent. Particularly in our industry, things change so much all the time; just be a front-end developer for three months and you will have seven new frameworks to learn about, right? So learning how to keep current is really important as well.

I don't think people will get as much for free now that we have AI coding. I think we are coordinating code being written much more than we're writing it ourselves, and that's a very different skill set that doesn't necessarily teach you all the things you used to learn by doing associate work yourself. So what do we need to do differently about this? Cool, things are changing, but why do we care? I think that adapting to this new reality requires us to do a few things. We need to teach the foundations a bit more actively, because people aren't gonna get them so much for free anymore. And then we need to add new skills that are now required because of this change in paradigm.

I think it's still important for juniors to learn foundations and smells and debugging skills, and the other two things that I don't have on the slide. But there are some additional skills that are needed. Critical thinking becomes ever more important: the more you are having code written for you rather than writing it yourself, the more you need to be able to think about it in a really sensible way.
Just hacking away until it works or copy-pasting from Stack Overflow was never a great strategy, but it's even worse now. I think people need to learn how to review code much earlier in their career than they used to. You used to get to mid-level before you would be the last reviewer of a lot of code. It would be very unusual for associate or junior engineers to be reviewing much more senior people's code, not least because the power differential means it's very unlikely that they'll say anything even if they see something is wrong. We need to teach systems thinking, because they can work on bigger pieces of work now, but with that come bigger implications if things go wrong. And prompt engineering, or whatever we're gonna call it next, how you get the most out of the AI, I think is interesting.

One of the things I found really fascinating when interviewing all these associates was that they all talked about AI as an endlessly patient mentor, and they were all really grateful that ChatGPT doesn't get annoyed with them when they forget shit. They were like, yeah, I can rely on it not getting angry at me and just repeating, over and over again, the weird syntax I can't remember or whatever else. And that's very different from more senior engineers, who are not yet tending to embrace AI as an educator or mentor or thinking partner. I think the people who are trying to get the absolute most out of AI are doing that, but a huge number of senior and staff engineers mostly see what AI gets wrong, and that makes them much more skeptical about it.

And so I think we need to learn to do a few things differently in this new world that we're in. I think we need to learn how to teach. I know that when I put the slide up earlier saying this is why your staff engineers can't teach your associates, some of you were going, no, I've got somebody who can, I've got somebody who's a brilliant mentor, a brilliant coach, even though they're really good on their own. That's because they've learned to teach as well as being great at the hands-on skills. I think we need to always explain why. Hiding behind best practice is incredibly dangerous in this new world where, if you don't understand the code, your 2:00 AM wake-up is going to be incredibly painful and possibly company-destroying, depending on how much you're allowed to put into production.

I think we need to help the people who are most skeptical about AI coding get more comfortable with it and accept its limitations. For people who think that LLMs will stop hallucinating any day now, I regret to inform you that it's a feature, not a bug, of how they are put together, right? And I think we need to encourage people to check sources, to check documentation, to distrust the top level of what they're told, and to make sure that it's real and make sure that it's true as well.

But fundamentally, I've got two versions of this slide. This is the really positive version: you can't hold back the ocean, but you can learn to surf. The more British version is one I learned on a pain management course, which is: pain is mandatory, but suffering is optional. I think both of those apply. I think we're in a world where we have to figure out how we adapt, but we also really need to change how we help other people in the industry adapt to what's going on now.
We cannot have our current senior engineers be the last senior engineers that exist, because senior engineers are constantly deciding to go run bookshops, or start farms, or find other ways of getting the fuck out of tech. And so I think we do just need to continue to adapt, but we also need to make room for there to be a future for the industry that isn't just a bunch of people awash in vibe-coded mess. I did find that vibe code cleanup specialist thing really funny, because everybody who asks me what I think I'm gonna be doing for the next 10 years, my answer is cleaning up after AI. I think that's the role of the CTO for the next while in many companies. When you have a brilliant and high-functioning engineering team, AI can really accelerate you. But let's be honest, most companies do not have a brilliant and high-functioning engineering team, and adding AI to dysfunction just causes more dysfunction.

So on that note: you can't hold back the ocean, but you can learn to surf. Pain is mandatory, but suffering is optional. Thank you very much, and have a good day.
