'When you take the human out of human resources, I think it's a massive issue'

While many organisations are now actively integrating artificial intelligence (AI) into their recruitment processes, they may be at risk of doing more harm than good, according to one expert.
When training large language models (LLMs), "if there's a ton of user input that's being allowed to train the model, then you can end up with outcomes that you didn't intend," says Jordan Goure, CEO of Picsume.
He cites the example of a hiring manager whose selection patterns train an algorithm to consistently favour certain backgrounds.
“If they keep only selecting candidates from certain countries, with certain surnames, from certain educational institutions, all of these things introduce bias into the process.”
Goure stressed the importance of maintaining human oversight in AI-assisted hiring processes.
“When you take the human out of human resources, I think it's a massive issue,” he says.
Canadian employers are showing increased interest in using AI technology to conduct job interviews, according to a previous report.
Equality, fairness in using AI for hiring
Indrajeet Chatterjee, Senior Associate Director for Talent Acquisition at BIG4, says AI “will never replace the human side of hiring.”
“No doubt, AI can screen resumés, sort data, and maybe even suggest good fits. But recruitment is so much more than that,” he says in a LinkedIn post.
“It’s about real conversations. It’s about understanding someone’s story—not just the skills. It’s about reading between the lines, finding a spark, using intuition to check culture fit, building real connections.”
Still, Goure acknowledges that AI can support more equitable hiring outcomes when used responsibly.
“It’s a more fair and equitable way to use an applicant tracking system, to use the recruitment module to find the best people,” he says.
“Equality, fairness, and having these ethics in software when you're using AI—it’s going to be one of the most important use cases of AI. It’s that people are building tools with these in mind.”
Software giant Workday is facing a major lawsuit in the U.S. alleging that its job applicant screening technology discriminates against older candidates.
But the company says the lawsuit is without merit: "Workday’s AI recruiting tools do not make hiring decisions or automatically reject candidates — hiring decisions are always made by our customers, who maintain full control and human oversight. These tools look only at the qualifications listed in a candidate’s job application and compare them with the qualifications the employer has identified as needed for the job. They are not trained to use—or even identify—protected characteristics like race, age, or disability."
The court has already dismissed all claims of intentional discrimination, says Workday, "and there’s no evidence that the technology results in harm to protected groups."
Filling the gaps in AI
Picsume is receiving advisory services and research and development funding of up to $141,980 from the National Research Council of Canada Industrial Research Assistance Program (NRC IRAP), tax incentives of $112,720 through the Scientific Research and Experimental Development (SR&ED) program, and is in the final stages of securing up to an additional $450,000 through the Natural Sciences and Engineering Research Council (NSERC) in collaboration with the University of Windsor and Lambton College.
Goure underlines the importance of involving academic stakeholders in AI development.
“Part of our process is engaging with students, newcomers, people entering the workforce, because it's more of a market that's under-serviced,” he says.
“There’s kind of a communication gap between the generation that's about to graduate, the generation that's in school, and the workforce and industry of today.”