Onboarding Interns in the AI Era

Every January, May, and September, a new wave of co-op students and interns joins engineering teams across North America. At Mantle, that rhythm is familiar. Interns have been an important part of how we build, teach, and grow engineering talent and culture at the company.

Our approach has always been hands-on. Before AI coding assistants became part of the workflow, we relied heavily on pair programming, close mentorship, code review, and direct exposure to experienced engineers. That model has worked exceptionally well. Over the years, we have seen co-ops ramp up quickly, make meaningful contributions, and develop the kind of engineering instincts that are hard to learn from documentation alone.

We have found that pair programming, in particular, is one of the most effective ways to teach junior developers how real software gets built. It gives them a window into how senior engineers read unfamiliar code, break down problems, make tradeoffs, debug weird edge cases, and decide when a solution is good enough to ship.

With that in mind, it has been fascinating to see the landscape change with the advent of AI development tools like GitHub Copilot, Claude Code, ChatGPT, and Cursor. Used well, they help co-ops build a better understanding of the codebase, explore unfamiliar patterns, and hit the ground running with better context and sharper questions faster than ever before.

Used poorly, of course, they can create a different problem: developers who can generate code faster than they can understand it.

Over the past few co-op terms, one thing has become clear to us: the best junior developers in the AI era will not be the ones who simply use AI the most. They will be the ones who learn how to use it responsibly, with taste, judgment, and ownership.

This matters most in the first weeks of a co-op term, when a new developer is trying to absorb the product, the stack, the team, the codebase, and the way decisions get made – all at once.

AI compresses the path to better questions

Onboarding at Mantle has always been highly interactive. A junior developer sits with a senior engineer, pair programming on real issues: asking how the repo is structured, tracing through a few flows, and slowly building a mental model of the system. Pair programming is still one of the fastest ways to teach not just what the code does, but how experienced engineers think.

AI does not make that obsolete, but it can make the time around those pairing sessions much more productive by providing solid baseline knowledge of patterns in the codebase or language-specific intricacies of the tech stack.

Here are a few examples of the questions I have had interns ask me while pairing in the past:

  * How is the repo structured, and which folders map to which parts of the product?
  * Where exactly does the frontend talk to the backend for this flow?
  * Why does the codebase use this pattern here, and when is it okay to break from it?
  * What are the conventions for tests, error handling, and logging?

Those are not trivial questions. They are exactly the kinds of questions junior developers need to ask in order to become useful. The difference is that many of them can now be explored independently, at the student’s own pace, before or after a pairing session.

That is a real unlock for startups.

In a small engineering team, a senior engineer’s attention is precious. The goal is not to reduce mentorship. The goal is to make mentorship higher leverage. Instead of spending the first conversation explaining where the frontend talks to the backend, a senior engineer can spend that time on architecture, product context, tradeoffs, debugging intuition, and engineering taste.

AI is very good at helping a new developer get from “I don’t know where to start” to “I think I understand the pattern, but I’m not sure why we made this tradeoff.”

That second question is much more valuable. It is also much closer to how engineers actually learn.

In week one, AI should be a codebase tutor, not a code generator

One of the mistakes companies can make is giving new interns access to AI tools and immediately encouraging them to ship faster. While this feels progressive, it can backfire: junior engineers immediately feel the burden of trying to implement features as quickly as possible.

In the first week or two, we want interns to focus on using AI primarily for understanding, not output. The goal is to build a mental model of the system before they start making meaningful changes to it.

A good first-week AI workflow might look like this:

  1. Ask the tool to explain the folder structure and identify the major product areas.
  2. Trace a customer-facing action from the UI to the backend to the database (a sketch of what that path can look like follows this list).
  3. Find examples of similar features before implementing anything new.
  4. Ask for the conventions used in tests, error handling, logging, and API responses.
  5. Summarize what was learned and validate that understanding with a human engineer.
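
To make the tracing step concrete, here is a minimal sketch of what such a UI-to-backend-to-database path can look like. It is written in TypeScript with entirely hypothetical names; nothing below is Mantle's actual code.

```typescript
// Hypothetical sketch of the three layers behind an imaginary
// "rename project" action. All names are invented for illustration.

// Minimal stub standing in for a real database client.
const db = {
  query: async (sql: string, params: unknown[]) => {
    console.log("executing:", sql, params);
  },
};

// 1. UI layer: a submit handler kicks off the request. In a real app
// this would be a fetch("/api/projects/:id", { method: "PATCH" }) call;
// here it invokes the route handler directly so the sketch runs on its own.
async function onRenameSubmit(projectId: string, newName: string) {
  await handlePatchProject({ id: projectId, body: { name: newName } });
}

// 2. Backend layer: the route handler validates input and delegates.
async function handlePatchProject(req: { id: string; body: { name?: string } }) {
  if (!req.body.name) throw new Error("name is required");
  await renameProject(req.id, req.body.name);
}

// 3. Data layer: the action ends in a single parameterized query.
async function renameProject(id: string, name: string) {
  await db.query("UPDATE projects SET name = $1 WHERE id = $2", [name, id]);
}

// Walking top to bottom answers "what happens when a customer clicks
// rename?", which is exactly the trace the second step asks for.
onRenameSubmit("proj_123", "Q3 launch").catch(console.error);
```

The point of the exercise is not the code itself but the path: once an intern can narrate a flow like this end to end, pairing time can go toward the why instead of the where.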

The last step in that workflow is important. We don’t want interns to treat AI explanations as ground truth. We want them to use AI to form a first draft of their understanding, then test that understanding against the code, the tests, and the senior engineers on the team.

A useful rule of thumb: during onboarding, AI should help junior developers ask better questions, not avoid asking questions altogether.

Then the guardrails can start to come down

Once an intern has a solid understanding of the codebase, AI becomes dramatically more useful. At that point, we want them to embrace using AI to solve hard problems: drafting implementation plans, evaluating different solutions and their tradeoffs, exploring edge cases, refactoring hard-to-follow code, and getting unstuck when they hit unfamiliar patterns.

The important shift is that AI moves from “explain this to me” to “help me do this well” or even “help me learn how to do something I don’t fully understand.”

That means using AI coding tools to implement complex features that touch the entire stack, from front-end pages and components down to the back-end database layer. This can involve everything from proposing a comprehensive implementation plan, to surveying the existing patterns in the repo and selecting an appropriate one, to adapting an existing pattern to better suit a new use case. It may also involve refactoring large chunks of existing code to align with new code, or extending a test suite to cover newly discovered or exposed edge cases.

These tasks build naturally on the core knowledge of the system the intern spent the first couple of weeks acquiring, and with the help of AI tools, a junior engineer can take a healthy stab at all of them.

However, with AI-generated code becoming a larger part of an engineer’s workflow, the expectations have to change too.

If you use AI to generate code, you own the code. Not partially. Not “the model wrote it” or “Copilot suggested it.” You own it.

If it introduces a bug, breaks an edge case, misses a permission check, or creates a confusing abstraction, AI is not going to join the incident review. The developer is. And this is true for every engineer on the team, regardless of their seniority.

This is one of the most important lessons junior developers need to learn early: AI can accelerate your work, but it does not absorb your accountability.

The new skill is not prompting. It is judgment.

A lot of people talk about “prompt engineering” as the core AI skill.

Prompting matters, but for junior developers, the deeper skill is judgment.

Can they tell when an answer sounds plausible but is wrong? Can they identify when generated code does not match the style of the existing system? Can they recognize when the model has introduced a dependency, abstraction, or pattern that the team would not want? Can they explain the code in a PR review without hiding behind the tool?
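
As a hypothetical illustration of the second question, imagine a codebase where the convention is to surface failures as typed results, and the model produces code that compiles and appears to work but quietly swallows errors instead. All names here are invented for the sake of the example.

```typescript
// Hypothetical team convention: failures are returned as typed results,
// so callers are forced to handle them explicitly.
type Result<T> = { ok: true; value: T } | { ok: false; error: string };

async function loadProjectName(id: string): Promise<Result<string>> {
  if (!id) return { ok: false, error: "missing project id" };
  return { ok: true, value: `project:${id}` };
}

// Plausible generated alternative: it compiles and seems to work, but it
// hides failures behind a silent null, a pattern this team would flag
// in review. Spotting that mismatch is the judgment being trained.
async function loadProjectNameGenerated(id: string): Promise<string | null> {
  try {
    if (!id) throw new Error("missing project id");
    return `project:${id}`;
  } catch (err) {
    console.log(err); // the caller can no longer tell what failed, or why
    return null;
  }
}

// The convention makes error handling visible at every call site.
loadProjectName("").then((r) => {
  if (!r.ok) console.error("handled explicitly:", r.error);
});
```

Neither version is wrong in isolation; the judgment lies in noticing which one the surrounding system expects.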

Those are the muscles worth training and fostering for a successful engineering career.

A junior developer who blindly accepts AI output may move quickly for a day and create cleanup work for a week. A junior developer who uses AI to compare options, generate tests, and validate their own thinking can become productive much faster than was possible a few years ago. This is especially advantageous for co-ops and interns, whose work terms are only a few months long.

As AI becomes more ubiquitous in the industry, the students who stand out will not be the ones who say “I know how to use AI.” Everyone will say that. The students who stand out will be the ones who can demonstrate that they use AI well in a professional engineering environment.

In a real codebase, the goal is not to generate the most code. The goal is to make the right change, in the right place, in the right pattern, with the right tests, and with enough understanding that you can support it later.

If you can learn this early, you will have an advantage.

The market for junior developers is competitive. Companies are still hiring smart, curious, high-agency people, but the bar is changing. It is no longer enough to know a framework or finish a tutorial project. Teams want people who can learn quickly, navigate ambiguity, use modern tools responsibly, and take ownership of their work.

AI raises the ceiling, but ownership sets the floor

AI is changing how junior developers ramp up. It gives interns a patient tutor, a codebase guide, a test-writing partner, and an always-available second set of eyes, all of which are especially valuable in their first few weeks on a team.

For startups, that means interns can become productive faster without overwhelming senior engineers, and that over the course of a work term they can contribute more than ever before. For students, it means the learning curve can be steeper, faster, and more self-directed, with a greater chance to express themselves, take ownership, and stand out from the pack.

But the foundation has not changed. Good engineering still requires understanding. Good teams still require mentorship. Good code still requires tests, review, and accountability.

AI raises the ceiling on what a junior developer can do early in their term. Ownership sets the floor for whether that work is good enough to ship.

At Mantle, we are excited about co-op students who want to learn that balance. If you are curious, ambitious, and interested in building software in an AI-native engineering environment, we would love to hear from you.

Rafay Khan is a co-founder and engineer at Mantle, bringing over a dozen years of full-stack development and architecture experience to the team. Rafay holds a degree in Computer Engineering from the University of Toronto.
