Data science consultant Cathy O’Neil helps companies audit their algorithms for a living. And when it comes to how algorithms and artificial intelligence can enable bias in the job hiring process, she said the biggest issue isn’t even with the employers themselves.
A new Illinois law that aims to help job seekers understand how AI tools are used to evaluate them in video interviews has reignited the debate over AI’s role in recruiting. But O’Neil believes the law tries to tackle bias too late in the process.
“The problem actually lies before the application comes in. The problem lies in the pipeline to match job seekers with jobs,” said O’Neil, founder and CEO of O’Neil Risk Consulting & Algorithmic Auditing.
That pipeline starts with sites like LinkedIn, Monster.com, Facebook, and ZipRecruiter, where algorithms can play a significant role in determining which candidates see which job postings, filtering out those deemed unqualified.
“[Algorithms] are intended to discriminate; they’re trying to discriminate between someone who’s going to be good at this job versus someone who’s not going to be good at this job,” O’Neil said, adding that “the question is whether it’s legal or illegal discrimination.”
O’Neil has written extensively about the role algorithms play in fueling inequality both in her book, Weapons of Math Destruction, and on her blog mathbabe.org. In an interview with Business Insider, she talked about how bias shows up in the hiring process and what employers — as well as platforms like LinkedIn — should do to weed it out.
The article: Here’s why an AI expert says job recruiting sites promote employment discrimination