The answer is not to abandon entry‑level hiring, but to redesign it for an AI‑embedded workplace, says academic
The first casualties of generative AI may not be entire occupations but the youngest workers within them, with new research out of Europe showing that employers are leaning on experienced staff to do more with the assistance of the technology.
A key reason older workers benefit from generative AI has nothing to do with the technology itself, says Kate Cassidy, assistant professor and genAI researcher at Brock University.
“Older workers have that broader, relational, organizational, contextual piece that they can bring to AI content,” she says.
In practice, a senior employee can take an AI-generated draft or analysis and immediately see how it does or does not fit with the organization’s strategy, client history and culture.
As a result, employers are falling into the trap of thinking, “‘Instead of hiring three workers doing that more base-level stuff, we can have just one do it with the use of AI.’ But I think there's a few problems with that,” says Cassidy.
This can reinforce age bias while simultaneously cutting off the entry-level pipeline that feeds future leaders, she says.
Patterns and entry-level work
The Swedish paper, Same Storm, Different Boats: Generative AI and the Age Gradient in Hiring, shows that since ChatGPT’s launch, employment in the most AI-exposed occupations fell by 5.5 percent for 22- to 25-year-olds but rose slightly, by 1.3 percent, for workers over the age of 50.
Part of that comes from what the tools do well, Cassidy explains.
“Generative AI, it's a pattern maker. It's very good at picking up patterns, which is why people associate it with entry-level work,” she says. When managers see AI rapidly producing first drafts or spotting patterns in data, it is easy to conclude that juniors who used to do that work are now optional.
But that’s only one half of a crucial equation, Cassidy stresses, with the other half being judgment about what to do with that output.
“If you have a large amount of data and you just want to find a certain pattern in it, it's quite quick at that, faster than somebody who might have been doing it manually,” she says.
“But it's then taking that data, finding and contextualizing it to your circumstance, that is the key part. It is all that history that older workers inherently carry, so they can take the raw piece that comes out of AI and they can contextualize it.”
From an HR perspective, this can create a seductive logic: let AI pick up the “base-level stuff” and rebuild teams around a smaller number of seasoned staff. But Cassidy explains the flaw in that logic.
“It's not that you don't need younger workers. I think what that really means is you need maybe more workers who have that tacit knowledge,” she says.
“If you are taking the tactic of not having any younger workers, when you eventually want to have the workers that have the more nuanced knowledge, you're not going to have that. So, you're cutting off your pipeline.”
Redesign entry-level hiring for AI
The Swedish research shows that younger workers are bearing the brunt of adjustment in AI-exposed jobs. But Cassidy stresses that early-career employees remain critical – especially if employers want AI systems used wisely.
The answer, she says, is not to abandon entry‑level hiring, but to redesign it for an AI‑embedded workplace. This means working with external partners such as universities and colleges to create structured apprenticeships that accelerate the transfer of tacit know‑how.
“You have more experiential students, field placement students, bringing in students in roles where they're both in school and in the workplace, [who] will advance their knowledge faster so that they do have that contextual knowledge,” Cassidy says.
“Also, internal mentoring programs where younger workers are matched with older workers, to bring in that contextual knowledge. Then you can ask more of entry level workers.”
Younger staff also bring growing digital fluency at a moment when many leaders are still learning how to use AI.
“We're in a very short blip of transition, but very quickly, younger workers are going to be more digitally native around the use of AI,” Cassidy says.
“We're seeing examples every day where companies choose to use AI and then make a really significant error because it doesn't work in their circumstances … in that way, younger workers are going to be dealing with and learning about that in school, faster than maybe an employer would have time to be training the other workers around it.”
Risk of eliminating entry-level pipelines
The Swedish evidence – echoed by emerging U.S. research the paper reviews – shows generative AI can deepen an age gradient without shrinking total employment. That risk grows if leaders treat the tools as plug‑and‑play.
In that kind of environment, human context is not optional.
“It's not that AI causes a set of people to be unneeded, it's that the history and experience and context becomes more important,” Cassidy says.
“The research I'm doing shows that you almost need more skills around the organizational culture strategy.”
Stripping out junior roles while leaning harder on AI makes it harder to build that human context over time. It also raises reputational and legal questions if younger workers, newcomers or women – who are often over‑represented in administrative and customer‑facing jobs – are disproportionately affected when organizations “optimize” entry-level work.
“The question now becomes, ‘How do we take entry-level workers and make them have that experience?’” Cassidy says.
“Schooling is going to catch up quickly … so entry-level people, soon out of school, will have some knowledge that workers in a company don't. But they're not going to have that contextual organizational culture, organizational strategy, client history.”
Training and expanding roles laterally
The upside of this redesign is the chance to evolve junior roles away from basic tasks and toward higher‑value human work, Cassidy says. For her, designing AI‑era jobs is ultimately a skills and culture project.
“It's that piece that they're going to have to figure out, different ways to either hire for that in new ways, train for that, or change the jobs in ways that reconfigure that,” she says, explaining that a practical question for employers now is how to design jobs so humans and AI complement each other across age and seniority groups.
“We're used to a past where we were training for more basic skill sets. Now it's going to be more reflexive, contextualized development,” Cassidy says.
“I don't think it can be a top down or an imposed thing. All the errors that I've read about come from where there weren't integrated conversations … versus the idea of ‘It's just faster.’ If we just think of output, we miss all of the contextual processual issues in it.”