Wednesday, October 10, 2018

A Higher AI-er That Hires, Expired

Judging from Amazon's latest machine-learning experiment, an attempt to take human bias out of hiring for technical positions, AIs are only as intelligent as the data you feed them to learn and grow from.

Amazon.com Inc’s machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.

The team had been building computer programs since 2014 to review job applicants’ resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters.

Automation has been key to Amazon’s e-commerce dominance, be it inside warehouses or driving pricing decisions. The company’s experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars - much like shoppers rate products on Amazon, some of the people said.

“Everyone wanted this holy grail,” one of the people said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.

That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.

In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.
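The failure mode described above can be sketched in a few lines. This is a hypothetical toy model, not Amazon's actual system: a naive bag-of-words scorer trained on a deliberately skewed hiring history, in which resumes containing the word "women's" were mostly rejected. The scorer has no concept of gender; it simply learns that the word correlates with rejection.

```python
# Minimal sketch (hypothetical data, NOT Amazon's model): a naive
# bag-of-words resume scorer trained on a skewed hiring history.
from collections import defaultdict

# Toy history: (resume terms, hired?) -- deliberately skewed, as the
# article describes: most past hires did not mention "women's".
history = [
    (["java", "chess", "captain"], 1),
    (["python", "aws", "lead"], 1),
    (["java", "aws"], 1),
    (["python", "women's", "chess", "captain"], 0),
    (["java", "women's", "lead"], 0),
    (["python", "aws", "chess"], 1),
]

# Score each word by the average hiring outcome of resumes containing it.
totals, counts = defaultdict(float), defaultdict(int)
for words, hired in history:
    for w in set(words):
        totals[w] += hired
        counts[w] += 1
word_score = {w: totals[w] / counts[w] for w in totals}

def score_resume(words):
    """Average the learned word scores -- the 'one to five stars' idea."""
    return sum(word_score.get(w, 0.5) for w in words) / len(words)

# "women's" appears only on rejected resumes, so the model learns to
# penalize it -- exactly the pattern reported in the article.
print(word_score["women's"])  # 0.0 -- a learned penalty
print(score_resume(["java", "chess", "captain"]) >
      score_resume(["java", "women's", "chess", "captain"]))  # True
```

Note that deleting "women's" from the vocabulary would not cure the model: any correlated term, such as the name of an all-women's college, carries the same statistical signal, which is the proxy problem the Reuters sources describe.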

Amazon edited the programs to make them neutral to these particular terms. But that was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory, the people said.

The Seattle company ultimately disbanded the team by the start of last year because executives lost hope for the project, according to the people, who spoke on condition of anonymity. Amazon’s recruiters looked at the recommendations generated by the tool when searching for new hires, but never relied solely on those rankings, they said.

Amazon declined to comment on the recruiting engine or its challenges, but the company says it is committed to workplace diversity and equality.

The irony here is that when it comes to technical positions, Amazon, like most American corporations, wants cheap H-1B labor from overseas, and the vast majority of H-1B workers are male. The bias in STEM has been towards men for decades, so when Amazon fed ten years of hiring 75-80% men for technical positions into the hopper, the program "learned" that bias too and spat out the same results.

The problem with AI, as with any computer program, isn't the program. It's the people who program it, and the data they train it on.

You don't have to be a genius super-coder to pick that up.
