Amazon drops secret AI recruiting tool that showed bias against women

Anca Alexe 10/10/2018 | 14:16

Amazon has scrapped an experimental AI-based recruiting engine after its machine learning specialists discovered that the system was biased against women, Reuters reports.

The company had been working on computer programs since 2014 to review the resumes of job applicants and automate the search for talent.

The company’s experimental hiring tool used artificial intelligence to give candidates scores from one to five stars. In 2015, however, the team realized the engine was penalizing women, especially for technical jobs: the system had been trained to vet applicants based on patterns in resumes submitted to the company over the previous 10 years, and most of those had come from men.
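The mechanism Reuters describes can be illustrated with a toy model. The data and scoring below are entirely hypothetical (nothing here reflects Amazon's actual system): a scorer that learns per-word hire rates from a male-skewed history ends up penalizing resumes containing a gendered term, even though gender itself is never an explicit input.

```python
from collections import Counter

# Hypothetical historical data: past resumes (as word lists) with
# hiring outcomes. The history is deliberately skewed, mirroring the
# male-dominated 10-year training set described in the article.
history = [
    (["python", "chess_club"], 1),
    (["java", "chess_club"], 1),
    (["python", "golf"], 1),
    (["java", "womens_chess_club"], 0),
    (["sql", "womens_chess_club"], 0),
]

hired = Counter()
seen = Counter()
for words, label in history:
    for w in set(words):
        seen[w] += 1
        hired[w] += label

def score(resume):
    """Average per-word hire rate learned from the biased history."""
    rates = [hired[w] / seen[w] for w in resume if w in seen]
    return sum(rates) / len(rates) if rates else 0.0

# Two otherwise-identical resumes diverge on the gendered token alone.
print(score(["python", "chess_club"]))         # 1.0
print(score(["python", "womens_chess_club"]))  # 0.5
```

The model never sees a "gender" field; it simply learns that a token like "womens_chess_club" correlates with past rejections, which is exactly the kind of proxy pattern that is hard to scrub out by hand.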

Reuters sources said that Amazon edited the software to make it neutral to terms suggesting the candidates were women, but they could not guarantee that the tool would avoid other forms of discrimination.

The company disbanded the team by the beginning of 2017; its recruiters had never relied solely on the tool's recommendations.

Some 55 percent of U.S. human resources managers said artificial intelligence, or AI, would be a regular part of their work within the next five years, according to a 2017 survey by talent software firm CareerBuilder.

Microsoft's racist chatbot

This is not the first time an AI system has been shut down over problems of this kind. Two years ago, Microsoft had to take its AI chatbot offline less than 24 hours after launch, after Twitter users corrupted it into making racist remarks.

Microsoft’s Twitter bot, Tay, was described as an experiment in “conversational understanding”: the more you chatted with it, the smarter it was supposed to get, learning to engage people through “casual and playful conversation.”

Unfortunately, the conversations didn’t stay playful for long. Soon after Tay launched, people started tweeting the bot all sorts of misogynistic, racist, and Donald Trumpist remarks. And Tay — being essentially a robot parrot with an internet connection — started repeating these sentiments back to users, proving correct that old programming adage: flaming garbage pile in, flaming garbage pile out.
