How to Limit the Risks of AI-Driven Hiring

Artificial intelligence (AI), the science of creating computers that can “think,” is about to change how you conduct recruiting and hiring. Like any technology, there are pros and cons to using AI as part of your hiring workflows. What are the risks of leveraging these sophisticated technologies, and how can you avoid their pitfalls?

What Are the Risks of AI Recruiting?

AI algorithms can automate mundane tasks for recruiters and help them work smarter. They can spot trends and minute details that support better decisions. This software can also take behavior patterns from its interactions with humans and “learn” to improve how it behaves. If you have an Alexa at home, you’ve seen a good example of how these machines evolve their interactions with their human end users.

But AIs are programmed by people, and, like people, it turns out they may have biases that could negatively affect your hiring practices.

Take the case of the Amazon AI that “hated” women. Not that a computer has emotions, mind you, but Reuters reported that Amazon engineers found their new software was biased against women who applied for technology jobs. Namely, it kicked female candidates out of the running for tech roles and kept the men. Because most applicants over the prior 10 years were male (the industry was dominated by men during that period), the AI algorithms in the software “learned” to prefer male candidates.
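To see how that kind of bias creeps in, here is a minimal, purely illustrative sketch in Python. It is not Amazon’s actual system; the resume text, outcomes, and scoring logic are all made up for illustration. It simply shows that a “model” trained only on skewed historical hiring outcomes ends up penalizing a word that acts as a proxy for gender.

```python
# Illustrative sketch only: a toy "resume scorer" trained on skewed historical
# data. This is NOT any real company's system; the data below is invented.
from collections import Counter

# Hypothetical historical outcomes: in this toy data set, no resume containing
# the word "women's" was ever marked as hired.
historical_resumes = [
    ("software engineer, chess club captain", "hired"),
    ("software engineer, women's chess club captain", "rejected"),
    ("backend developer, hackathon winner", "hired"),
    ("backend developer, women's coding bootcamp mentor", "rejected"),
    ("data scientist, open source contributor", "hired"),
]

# "Training": count how often each word appears in hired vs. rejected resumes.
hired_words, rejected_words = Counter(), Counter()
for text, outcome in historical_resumes:
    target = hired_words if outcome == "hired" else rejected_words
    target.update(text.replace(",", "").split())

def score(resume: str) -> int:
    """Naive score: +1 for words seen more in hires, -1 for words seen more in rejections."""
    return sum(
        1 if hired_words[w] > rejected_words[w]
        else -1 if rejected_words[w] > hired_words[w]
        else 0
        for w in resume.replace(",", "").split()
    )

# Two otherwise identical resumes get different scores because of one word
# that acted as a proxy for gender in the skewed training data.
print(score("software engineer, chess club captain"))          # scores higher
print(score("software engineer, women's chess club captain"))  # penalized for "women's"
```

No one programmed the scorer to prefer men; the preference fell out of the historical data it learned from. That is exactly the kind of pattern Amazon’s engineers reportedly uncovered.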

When Amazon engineers figured out the problem, they scrapped the software. But these kinds of programming mistakes leave companies liable for discrimination lawsuits from candidates who feel they were unjustly treated by an algorithm with your corporate name on it. Therein lies the risk of these cool new tools. What is being done to protect the companies using AI platforms and the applicants who go through them?

Using AI for Good

AI, like all of computer science, is evolving. There are going to be mistakes along the way. You just need to be sure those mistakes don’t affect your company. The way to do that is to create transparent hiring practices that help candidates understand the process. There are also state and federal laws pending that will help define how these tools are used during hiring. Human Resource Executive suggests that some state laws could serve as guidelines for companies working with AI platforms. For example, a new statute in Illinois requires companies to:

  • Notify job applicants about how the AI works and how it will be used.
  • Obtain the candidate’s consent before their application is processed.
  • Limit the distribution of video interviews and other data to only the employees evaluating the applicant.
  • Upon the candidate’s request, destroy the video and all backup copies within 30 days.
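If your team builds or configures its own screening tools, these requirements can be enforced in software rather than left to memory. The sketch below is one hypothetical way to do that in Python; the class and function names are invented for illustration and are not part of any specific HR product.

```python
# Illustrative sketch only, with hypothetical names: one way an HR system
# could track Illinois-style requirements (consent before AI processing,
# limited access, deletion within 30 days of a request).
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import Optional, Set

@dataclass
class VideoInterview:
    candidate_id: str
    consent_given: bool                                         # recorded after notifying the candidate
    authorized_viewers: Set[str] = field(default_factory=set)   # evaluators only
    deletion_requested_at: Optional[datetime] = None

def can_process_with_ai(interview: VideoInterview) -> bool:
    """Run AI screening only after the candidate has been notified and has consented."""
    return interview.consent_given

def can_view(interview: VideoInterview, employee_email: str) -> bool:
    """Limit distribution of the video to the employees evaluating the applicant."""
    return employee_email in interview.authorized_viewers

def deletion_deadline(interview: VideoInterview) -> Optional[datetime]:
    """The video and all backups must be destroyed within 30 days of the candidate's request."""
    if interview.deletion_requested_at is None:
        return None
    return interview.deletion_requested_at + timedelta(days=30)

# Example: a consenting candidate who later asks for deletion.
iv = VideoInterview(
    candidate_id="cand-42",
    consent_given=True,
    authorized_viewers={"recruiter@example.com"},
    deletion_requested_at=datetime.now(timezone.utc),
)
print(can_process_with_ai(iv))             # True
print(can_view(iv, "intern@example.com"))  # False: not an evaluator
print(deletion_deadline(iv))               # 30 days from the request
```

However your systems are built, the point is the same: bake notification, consent, limited access, and timely deletion into the workflow instead of relying on ad hoc compliance.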

While AI in recruiting is still in its infancy, employers can make use of these tools if they take steps to protect themselves. Steps like these are a good way for companies to gain the benefits of AI-driven recruiting while lessening the risks.

Blackstone Talent Group is devoted to lessening the risks associated with leaving important technology positions unfilled. Talk with our team today about how we can help you reach your hiring goals.
