Amazon’s AI recruiting software was biased against women

Amazon’s new AI recruiting engine did not like women.

Amazon had a team working since 2014 on AI software to review job applicants’ resumes, with the aim of mechanizing the search for top talent.

The hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars – much like shoppers rate products on Amazon.

Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.

The AI software penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.

Amazon edited the programs to make them neutral to these particular terms.

Gender bias was not the only issue. Problems with the data that underpinned the models’ judgments meant that unqualified candidates were often recommended for all manner of jobs, the people said. With the technology returning results almost at random, Amazon ultimately abandoned the project.

Machine learning systems learn their decision rules from large datasets of example cases, so biases present in the training data are reproduced in the model’s judgments.
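To make that concrete, here is a minimal sketch (not Amazon’s system; it assumes scikit-learn and a made-up, deliberately skewed set of resume snippets with hire/no-hire labels) of how a simple text classifier trained on biased historical outcomes ends up penalizing a gendered term:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Made-up resume snippets and past hire/no-hire labels. Because the
# hypothetical historical hires skew male, the word "women's" happens to
# co-occur only with rejections in this toy dataset.
resumes = [
    "men's rugby team captain, software engineer",
    "software engineer, men's chess club",
    "backend developer, hackathon winner",
    "women's chess club captain, software engineer",
    "women's coding society lead, backend developer",
    "data analyst, women's debate team",
]
hired = [1, 1, 1, 0, 0, 0]  # biased past outcomes, not a measure of ability

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The default tokenizer reduces "women's" to the token "women"; a negative
# learned weight means the model downgrades resumes containing it.
idx = vectorizer.vocabulary_["women"]
print("learned weight for 'women':", model.coef_[0][idx])
```

Running the sketch prints a negative weight for the token, purely because the term co-occurred with past rejections in the training data, not because it says anything about a candidate’s ability.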

Most current AI software does not explain how it reaches its decisions. DARPA is working on systems that can explain why they made a particular choice. For example, an AI could identify pictures as pictures of cats based upon the pointed shape of the ears, and an explanation would make that reasoning visible. Even then, errors can still occur.
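As a rough illustration of that idea (an assumed toy example, not DARPA’s program or any production tool), a simple decision tree trained on a tiny, made-up table of animal features can at least report which feature its “cat” decision relied on:

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature table: [pointed_ears, barks, whiskers]
animals = [
    [1, 0, 1],  # cat
    [1, 0, 0],  # cat
    [0, 1, 1],  # dog
    [0, 0, 0],  # dog
]
is_cat = [1, 1, 0, 0]

tree = DecisionTreeClassifier(random_state=0).fit(animals, is_cat)

# feature_importances_ reports which inputs the model actually relied on;
# it does not say whether relying on them was reasonable.
for name, importance in zip(["pointed_ears", "barks", "whiskers"],
                            tree.feature_importances_):
    print(f"{name}: {importance:.2f}")
```

Here the model leans entirely on pointed ears, so the explanation is easy to read off, but a pointy-eared dog would still be misclassified, which is why explanations alone do not remove errors.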

The flaws in this AI software show that many AI applications will need human supervision, ongoing monitoring, and improved error checking.

The gender bias error was obvious and extreme, but other errors could be more subtle and could emerge as training data changes.
