AI Hiring Tools Are Ignoring Qualified Candidates, Expert Says


Turns out bots are actually bad hiring managers.

Bad Tech

Companies are increasingly employing artificial intelligence tools to screen potential employees, with the aim of streamlining the process and eliminating bias. But as the BBC reports, the tools may well be making the hiring process worse by screening out highly qualified people.

In one instance that exemplifies the problem, a job applicant to a tutoring company called Fullmind resubmitted a resume after getting rejected, but tweaked their birthday to appear younger, the BBC reports. This small change led to an interview, because the AI software the company used automatically rejected older applicants.

As a result, the company had to settle age-discrimination charges with the US Equal Employment Opportunity Commission.

In another incident, an AI tool gave a poor evaluation to a makeup artist based on her body language even though she was scored as highly skilled in her craft, according to the BBC.

These kinds of recruiting faceplants are likely to multiply as more businesses adopt AI tools for the hiring process.

“We haven’t seen a whole lot of evidence that there’s no bias here… or that the tool picks out the most qualified candidates,” Hilke Schellmann, a New York University journalism professor and author of the book “Algorithm: How AI Can Hijack Your Career and Steal Your Future,” told the BBC.

Today’s Dystopia

None of this is theoretical. The BBC cited a 2023 IBM survey that found that 42 percent of companies were already tapping AI for important HR work and recruiting.

It sounds like a dystopian nightmare for anybody who’s looking to get hired.

You have to jump through hoops to make sure your resume hits unknown targets set by an AI screening tool. And even if you make it through multiple rounds of screening, what if you flunk a body-language test that may be flawed from the get-go?

These AI screening tools are also trained on prior data that can itself be flawed and biased, as seen in another real-life incident highlighted by the BBC, in which an AI trained on the resumes of men already employed at a company ruled out women candidates who didn’t play baseball or basketball.

“One biased human hiring manager can harm a lot of people in a year, and that’s not great,” Schellmann told the broadcaster. “But an algorithm that is maybe used in all incoming applications at a large company… that could harm hundreds of thousands of applicants.”

More on AI: McDonald’s Making Job Applicants Take Weird AI Personality Tests

