LinkedIn's job-matching AI was biased. The company's solution? More AI. – MIT Technology Review

Posted: June 28, 2021 at 10:37 pm

More and more companies are using AI to recruit and hire new employees, and AI can factor into almost any stage in the hiring process. Covid-19 fueled new demand for these technologies. Both Curious Thing and HireVue, companies specializing in AI-powered interviews, reported a surge in business during the pandemic.

Most job hunts, though, start with a simple search. Job seekers turn to platforms like LinkedIn, Monster, or ZipRecruiter, where they can upload their résumés, browse job postings, and apply to openings.

The goal of these websites is to match qualified candidates with available positions. To organize all these openings and candidates, many platforms employ AI-powered recommendation algorithms. The algorithms, sometimes referred to as matching engines, process information from both the job seeker and the employer to curate a list of recommendations for each.

"You typically hear the anecdote that a recruiter spends six seconds looking at your résumé, right?" says Derek Kan, vice president of product management at Monster. "When we look at the recommendation engine we've built, you can reduce that time down to milliseconds."

Most matching engines are optimized to generate applications, says John Jersin, the former vice president of product management at LinkedIn. These systems base their recommendations on three categories of data: information the user provides directly to the platform; data assigned to the user based on others with similar skill sets, experiences, and interests; and behavioral data, like how often a user responds to messages or interacts with job postings.
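To make those three categories concrete, here is a minimal sketch of how a matching engine might fold them into a single score for a user–job pair. All field names, weights, and signals are illustrative assumptions for this article, not LinkedIn's or Monster's actual model.

```python
# Hypothetical matching-engine scorer combining the three data categories
# described above. Field names and weights are illustrative assumptions.

def match_score(profile, inferred, behavior, job):
    """Combine directly provided, inferred, and behavioral signals."""
    # 1. Information the user provides directly: skill overlap with the job.
    profile_signal = len(profile["skills"] & job["required_skills"]) / max(
        len(job["required_skills"]), 1
    )
    # 2. Data assigned from similar users: e.g., how often peers with the
    #    same skill set apply to jobs in this category.
    inferred_signal = inferred["peer_apply_rate"].get(job["category"], 0.0)
    # 3. Behavioral data: message responses and posting interactions.
    behavior_signal = (
        0.5 * behavior["message_response_rate"]
        + 0.5 * behavior["posting_click_rate"]
    )
    # A weighted sum; an engine "optimized to generate applications" would
    # tune these weights toward predicted application likelihood.
    return 0.5 * profile_signal + 0.3 * inferred_signal + 0.2 * behavior_signal


user_profile = {"skills": {"python", "sql", "statistics"}}
user_inferred = {"peer_apply_rate": {"data": 0.8, "sales": 0.1}}
user_behavior = {"message_response_rate": 0.6, "posting_click_rate": 0.4}
job = {"category": "data", "required_skills": {"python", "sql"}}

print(round(match_score(user_profile, user_inferred, user_behavior, job), 2))
```

Note that the third, behavioral term is exactly where the gender-correlated patterns described below can leak back in, even with name, age, gender, and race excluded from the inputs.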

In LinkedIn's case, these algorithms exclude a person's name, age, gender, and race, because including these characteristics can contribute to bias in automated processes. But Jersin's team found that even so, the service's algorithms could still detect behavioral patterns exhibited by groups with particular gender identities.

For example, while men are more likely to apply for jobs that require work experience beyond their qualifications, women tend to go only for jobs in which their qualifications match the position's requirements. The algorithm interprets this variation in behavior and adjusts its recommendations in a way that inadvertently disadvantages women.

"You might be recommending, for example, more senior jobs to one group of people than another, even if they're qualified at the same level," Jersin says. "Those people might not get exposed to the same opportunities. And that's really the impact that we're talking about here."

Men also include more skills on their résumés at a lower degree of proficiency than women, and they often engage more aggressively with recruiters on the platform.

To address such issues, Jersin and his team at LinkedIn built a new AI designed to produce more representative results and deployed it in 2018. It was essentially a separate algorithm designed to counteract recommendations skewed toward a particular group. The new AI ensures that before referring the matches curated by the original engine, the recommendation system includes a representative distribution of users across gender.
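The article does not describe LinkedIn's implementation, but the general technique — re-ranking a score-ordered list so that every cut-off point roughly preserves each group's share of the qualified pool — can be sketched as follows. The function, its greedy deficit rule, and all field names are assumptions for illustration.

```python
# Hypothetical post-processing re-ranker in the spirit of the fairness
# layer described above: before the matches from the original engine are
# returned, they are re-ordered so that each prefix of the list tracks a
# target group distribution. Not LinkedIn's actual algorithm.

from collections import defaultdict


def rerank_representative(candidates, target_share):
    """Interleave score-ranked candidates so that, at every cut-off,
    each group's share stays close to its target share."""
    # Bucket candidates by group, each bucket sorted by descending score.
    buckets = defaultdict(list)
    for c in sorted(candidates, key=lambda c: -c["score"]):
        buckets[c["group"]].append(c)

    result, counts = [], defaultdict(int)
    while any(buckets.values()):
        # Pick the non-empty group currently furthest below its target
        # share, measured as of the slot about to be filled.
        def deficit(g):
            return target_share[g] - counts[g] / (len(result) + 1)

        group = max((g for g in buckets if buckets[g]), key=deficit)
        result.append(buckets[group].pop(0))
        counts[group] += 1
    return result


candidates = [
    {"id": 1, "group": "A", "score": 0.9},
    {"id": 2, "group": "A", "score": 0.8},
    {"id": 3, "group": "B", "score": 0.7},
    {"id": 4, "group": "A", "score": 0.6},
]
print([c["group"] for c in rerank_representative(candidates, {"A": 0.5, "B": 0.5})])
```

A pure score ordering would put all three group-A candidates first; the re-ranker promotes the top group-B candidate into the second slot while keeping each group internally ordered by score, which is the "representative distribution" property the passage describes.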

Kan says Monster, which lists 5 to 6 million jobs at any given time, also incorporates behavioral data into its recommendations but doesn't correct for bias in the same way that LinkedIn does. Instead, the marketing team focuses on getting users from diverse backgrounds signed up for the service, and the company then relies on employers to report back and tell Monster whether or not it passed on a representative set of candidates.

Irina Novoselsky, CEO at CareerBuilder, says she's focused on using data the service collects to teach employers how to eliminate bias from their job postings. For example, "When a candidate reads a job description with the word 'rockstar,' there is materially a lower percent of women that apply," she says.
