Michael Blickman: Beware of using artificial intelligence in hiring

Employers are always looking for more efficient ways to handle basic human resources needs. One solution that has caught on and is growing exponentially is the use of artificial intelligence—or AI—platforms to assist employers in selecting employees from large groups of candidates.

One of the more well-known platforms is Plum, which offers a product that, as stated on its website, allows an employer to know whether “an applicant will thrive in a role (before you even hire them) by measuring their contribution to culture, work ethic, ability to innovate and more.”

Think about that. Can a computer really hold that kind of crystal ball? Through machine learning, the computer can purportedly be taught to make better decisions than your hiring managers and HR professionals.

Plum points out, and I believe rightly so, that in hiring employees, humans often make “gut” decisions that are unreliable predictors and prone to bias. The goal of AI hiring tools is to train computers in industrial and organizational psychology methodologies so they become much more reliable than humans in selecting employees.

I wholeheartedly agree with the fundamental premise that too many employers are not careful in selecting new employees. In fact, employers often spend more on the termination process and the aftermath that sometimes involves litigation than they spend on the front end in hiring employees.

At the same time, I have to say that some of the claims made by AI-hiring platforms sound too good to be true. While that raises a red flag, many of the largest and best-known companies in the world have adopted AI in hiring. Those that haven’t are worried they are falling behind.

However, a significant problem with using AI in the hiring process is the prospect that the data used by the computer is implicitly biased and that unlawful discrimination will result. In fact, in early October, Amazon scrapped its years-long effort to build a hiring tool that would allow its computers to review hundreds, if not thousands, of resumes at a time and identify the best candidates.

The problem Amazon found was that the computers were consistently biased in favor of male job candidates. As a result, the selection algorithms actually reduced the number of women who were considered qualified for Amazon positions. The algorithms reportedly penalized resumes that included the word “women’s.” Does that mean the computer discriminated against “varsity women’s basketball” team members? It also reportedly penalized women who graduated from certain all-female colleges. As the saying goes, “Garbage in, garbage out.”

The result might not be so surprising given the overwhelming male dominance of tech-related positions in Silicon Valley and beyond. And the gender issue is only part of the story, because high-tech companies are also well behind other firms in the hiring of African-Americans and Hispanics in white-collar positions.

I offer these specific recommendations:

As with any new product being marketed, particularly one that makes the kind of claims the AI companies are making, obtain references in your industry and ask detailed and probing questions of current users.

Ask the AI company to show you evidence that its processes have been properly validated and determined by qualified test-assessment experts to be non-discriminatory.

Rather than purchasing the AI product for all hiring decisions, consider beta testing in one or two positions.

Your contract with the AI provider should contain strong indemnification language so that, if you are sued for discrimination in hiring as a result of using the product, the provider will not only pay for your attorney’s fees and costs, but also accept responsibility for any and all damages.

If you use one of these products, engage an expert to perform a disparate impact analysis from time to time, under the direction of and with the assistance of legal counsel, given the sensitivity of the review.
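A disparate impact analysis typically begins with the EEOC’s four-fifths (80%) rule: if one group’s selection rate falls below 80% of the highest group’s rate, the screening process may warrant closer scrutiny. The sketch below, with hypothetical group names and counts, shows the basic arithmetic such an expert would run; it is an illustration, not a substitute for the statistical and legal review described above.

```python
# A minimal sketch of a four-fifths (80%) rule check, the common first
# screen in a disparate impact analysis. All counts are hypothetical.

def adverse_impact_ratios(selected, applied):
    """Each group's selection rate divided by the highest group's rate."""
    rates = {g: selected[g] / applied[g] for g in applied}
    top_rate = max(rates.values())
    return {g: rate / top_rate for g, rate in rates.items()}

applied  = {"men": 200, "women": 150}   # hypothetical applicant counts
selected = {"men": 40,  "women": 15}    # hypothetical hires

for group, ratio in adverse_impact_ratios(selected, applied).items():
    flag = "possible adverse impact" if ratio < 0.8 else "passes 80% rule"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

Here the men’s selection rate is 20% and the women’s is 10%, an impact ratio of 0.50—well under the four-fifths threshold—which is the kind of pattern that should prompt a deeper statistical review with counsel.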

Make sure you understand the data that formed the basis of the AI algorithm you are going to use. If there is a lawsuit, this will help you avoid that uncomfortable moment on the stand if you are asked (and you will be asked) to identify the factors the computer deemed important in selecting your employees.

AI cannot be the be-all and end-all of any company’s hiring process. Train your managers in best hiring practices and rely on excellent HR professionals to support them.•


Blickman is a partner in the Employment and Immigration Group of Ice Miller LLP.
