If AI Does The Work, How Do People Build Expertise?

Originally published on Forbes, March 19, 2026

AI is removing the work that used to train people.

Entry-level roles were never just about getting work done. They were where people learned how work actually works. How decisions are made, how mistakes happen, how judgment is formed.

That apprenticeship is disappearing.

Organizations are replacing it with tools that expect employees to already know how to think, then asking those employees to supervise work they were never trained to do.

For decades, organizations built expertise the same way. People started at the bottom, learned through repetition, absorbed context through proximity, and gradually earned responsibility. Over time, they developed the ability to see patterns, understand trade-offs and make decisions with confidence.

AI systems are increasingly taking over the work that resided on the bottom rungs of the ladder. Entry-level finance analysts no longer build spreadsheets from scratch. Junior lawyers do not review thousands of pages manually. HR coordinators are not drafting job descriptions line by line. Software engineers are not writing routine code by hand.

AI is changing how people work. But it is also changing how people learn, build skills and develop expertise over time.

We tell junior employees to work with AI, assign tasks to agents, apply judgment to outputs, iterate, and identify issues, assumptions and biases. Essentially, we’re asking them to manage a team of digital workers without having developed the context and understanding that come from direct experience.

How are professionals supposed to learn to apply judgment if they’ve never done the work? How will they develop professional depth and leadership when the traditional rungs that got people there are no longer available?

When Experience Is No Longer Built Through Doing

Judgment has always been built through exposure. You start with narrow tasks, then expand scope. You see how decisions are made. You witness consequences. You compare scenarios. You make small mistakes that do not cost millions. Over time, you internalize patterns that allow you to do it all over again, at a more complex level. You supervise what you once did. Experience and leadership emerge from technical mastery and tenure.

When entry-level roles are compressed or redesigned around supervising tools rather than performing tasks, they no longer provide that exposure. Junior employees are asked to approve forecasts without understanding assumptions, validate analysis without seeing how errors emerge, and communicate conclusions without having built them. When they have never done the underlying work, it becomes difficult to develop the depth required to question what they see.

This is the challenge employee development now needs to solve when people no longer learn by doing the work themselves.

In a recent conversation on The Future of Less Work, Arya Bolurfrushan, Founder and CEO of AppliedAI, offered a perspective that highlights an unexpected upside. When people are no longer trained through repetition, they are also less constrained by how things have always been done. Instead of being shaped by legacy processes, they approach outputs with a fresh lens, often asking better questions and identifying gaps more quickly.

At the same time, AI is forcing organizations to surface what was previously invisible. As Bolurfrushan described it, “it goes from residing in someone’s mind into being institutionalized, which can then be critiqued.” What once lived as tacit knowledge in experienced professionals’ heads now has to be documented so systems can execute it. That shift creates a new kind of learning environment. Expertise is no longer transferred only through proximity or tenure. It becomes encoded, accessible and open to scrutiny.

How Expertise Is Built Without Learning By Doing

If lawyers depend less on mastering case law, engineers on writing code from scratch, and marketers on building campaigns manually, then professional depth can no longer come from step-by-step execution alone. It must come from engaging with how work is done, understanding where it breaks and knowing when to challenge it.

Microsoft’s 2025 Future of Work Report makes a clear distinction between three types of expertise required in AI-mediated work: domain expertise, expertise in working with AI, and expertise in managing AI systems. It argues that real effectiveness comes from the interaction between them, not from any one in isolation.

The research highlights a behavioral pattern among experts. Experienced professionals do not hand over everything to AI. They selectively delegate routine tasks while deliberately staying close to higher-order work such as interpretation, synthesis, and decision-making. This suggests that experienced professionals understand they need to preserve and continue to build expertise, continuously calibrating their reliance on AI. That calibration happens when people can see how the system arrives at outputs, what assumptions it makes, and what trade-offs are embedded in its reasoning.

In other words, expertise is built through interaction, comparison, and adjustment rather than execution alone.

If AI systems are designed as black boxes that deliver clean, final answers, they may improve short-term productivity while weakening long-term expertise. People can use the system without ever developing the ability to question it. These are decisions being made today that will shape how expertise develops tomorrow. Developing domain experts in an AI world requires designing systems and workflows that force continuous judgment. That includes making room for disagreement with AI, requiring human validation at critical points, and creating feedback loops where people see the consequences of their decisions relative to the system’s suggestions.

This is the new apprenticeship.

It no longer comes from doing the work. It comes from learning how to ask, how to interpret, how to challenge, when to trust the machine, and why.

A professional in 2030 will not be valued for producing more drafts thanks to the technology. They will be valued for identifying the problem worth solving and for supplying the uniquely human contribution: ensuring judgment, context and values are not missing. They will recognize when a draft is misleading. They will be responsible for integrating multiple tools, assessing bias, validating outputs and communicating implications clearly to stakeholders.

This requires a new curriculum inside organizations. Data literacy cannot be optional. Systems thinking must become a core capability. Ethical risk assessment needs to be embedded in everyday practice, not delegated to compliance departments.

In other words, depth shifts from manual execution to interpretive authority.

How To Redesign Career Development In The Age Of AI

The pathway from novice to expert must be redesigned to develop humans who know what the tools cannot see.

Instead of learning by repetition alone, employees will need to learn by supervised oversight. A junior analyst might compare AI-generated insights with historical cases to identify blind spots. A new HR partner might audit algorithmic hiring recommendations against diversity goals and cultural fit. A young product manager might test edge cases the system failed to consider.

This supervised oversight builds the judgment that repetition once built. But it does not happen on its own. Organizations need to design roles, workflows and development paths that deliberately expose people to assumptions, trade-offs and consequences. That means creating space to question outputs, requiring justification for decisions, and making the reasoning behind both human and machine choices visible.

Without this, expertise risks becoming shallow, dependent on tools rather than grounded in understanding. With it, organizations can build a new form of apprenticeship, one that develops depth not through doing the work, but through learning how to see, interpret and challenge it.
