McDonald’s Making Job Applicants Take Weird AI Personality Tests

AI SaaS

AI has come to the hiring process — and it’s made those mandatory personality tests all the weirder.

As 404 Media reports, companies as disparate as McDonald’s, Olive Garden, and FedEx are now requiring that job applicants take personality evaluations, which are then sorted by an AI system whose operations are cloudy at best.

The aforementioned companies are all contracted with Paradox.ai, a “conversational recruiting software” company whose strange personality assessments include images of blue-skinned humanoid aliens that applicants are, apparently, supposed to identify with.

In one Reddit post surfaced by the fine folks at 404, for instance, the applicant is presented with a photo of two blue aliens standing in a restaurant kitchen. One of the humanoids is evidently tearing up spices by hand as another stands beside them, and below the image, applicants are given the confusing directions to "simply click 'Me' if the image describes how you generally are and 'Not Me' if it does not." Above the image, the word "Traditional" is, for some reason, written as a sort of header.

“Man,” the Reddit user lamented, “I just want a dishwasher job.”

Image via Reddit.

To see how deep this rabbit hole goes, 404's Emanuel Maiberg filled out an application for a bartending position at a New Mexico location of the fast-casual Italian chain Olive Garden. He found that the "traditional" image was one of more than 80 such slides, and not even the most bizarre of the lot.

One, Maiberg wrote, featured one of the aliens sitting next to a bicycle, nursing a bruised knee from an apparent accident. "Thing Happen to Me," the headline reads, and applicants are again instructed to respond "Me" or "Not me," though in the case of that kind of faux-existential AI-generated dilemma, it's unclear what the results of either choice might even be.

As the report indicates, the Olive Garden assessments are part of Paradox’s “Traitify” product, which uses the strange slides to lump applicants into “Big Five” or “OCEAN” personality groups, rating them on how open, conscientious, extraverted, agreeable, and neurotic they are.
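To make the mechanism concrete, here's a minimal sketch of how "Me"/"Not me" slide responses could be tallied into Big Five trait scores. This is purely illustrative: Paradox hasn't published how Traitify actually scores responses, and the trait tags, polarities, and the "Life of the Party" slide title below are invented for the example ("Traditional" and "Thing Happen to Me" are the slides described in the report).

```python
# Hypothetical sketch of slide-based Big Five (OCEAN) scoring.
# Trait tags and polarities are invented; Traitify's real scoring
# model is not public.
from collections import defaultdict

# Assume each slide is tagged with one trait and a polarity:
# clicking "Me" on a (+1)-tagged slide nudges that trait up,
# and "Not me" nudges it down (and vice versa for -1 tags).
SLIDES = [
    ("Traditional",        "conscientiousness", +1),
    ("Thing Happen to Me", "neuroticism",       +1),
    ("Life of the Party",  "extraversion",      +1),  # invented slide
]

def score(responses):
    """responses: list of (slide_title, answered_me: bool) pairs."""
    tags = {title: (trait, sign) for title, trait, sign in SLIDES}
    totals = defaultdict(int)
    for title, answered_me in responses:
        trait, sign = tags[title]
        totals[trait] += sign if answered_me else -sign
    return dict(totals)

print(score([("Traditional", True), ("Thing Happen to Me", False)]))
# {'conscientiousness': 1, 'neuroticism': -1}
```

However the real product weights its 80-plus slides, the output is the same shape: a per-trait score that a hiring manager sees instead of the raw answers.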

As 404 aptly reminds us, the efficacy of these sorts of widely used personality tests, which have gained lots of pop psychology cachet in hiring over the past few decades, has been disputed for a while now — but it's not likely that any corporate software purchaser spending untold tens of thousands of dollars on this kind of HR testing is going to be looking into academic criticism.

Once he finished the quiz, Maiberg said he was presented with a five-page summary of his results that informed him — and the Olive Garden location he applied to — that he was a “producer” who’s “unconcerned with external rewards” and “self-sufficient and adept at monitoring” his own productivity.

Curiously enough, the slides in the lengthy “Traitify” quiz aren’t the AI aspect of Paradox’s offerings. That distinction lies with the company’s Olivia chatbot, which pops up to guide users throughout the application process and is supposed to “help” managers sift through applications.

According to the Paradox website, the chatbot, which features a grainy avatar of a smiling white woman, has a “passion to serve the community and deliver amazing service.” In her “other life,” the chatbot is “the Executive Director of the Arizona Coyotes Foundation” — whatever that means.

Naturally, the company isn't giving much information away about what's included in Olivia's secret sauce, so it's unclear what criteria the chatbot uses to determine its recommendations — but given how strange the rest of Paradox's offerings are, we're not holding out hope for coherence.

More on weird AI: AI Used to Resurrect Dead Dictator to Sway Election

