Navigating the tech talent hiring landscape with AI

Finding tech talent isn’t always easy. Just last month, there were more than 300,000 open tech jobs in the U.S. alone, and in a survey of 39,000 employers, four out of five said they were having a hard time filling those roles. Increasingly, companies are finding that AI can help them fill those positions and simultaneously help workers find the right jobs.

Still, the use of AI to land the right tech talent is also rife with challenges, often based on a misunderstanding of what AI can and can’t do. Let’s talk about what those challenges are and how employers today can work to solve them by using AI more effectively.

Problems with AI

AI in the hiring process is not new. For at least a couple of decades now, employers have been using automated skills assessments to evaluate candidates’ expertise and Applicant Tracking System (ATS) software to gather and organize job applications, resumes, cover letters, and more. However, the most beneficial uses of AI in vetting and matching candidates to the right roles go beyond those tasks. This data-driven approach leverages AI and machine learning algorithms to generate trustworthy hiring recommendations based on potentially thousands of data points.
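
To make that concrete, here is a minimal sketch of how a data-driven matching score might combine multiple data points about a candidate. The feature names, weights, and scoring scheme are illustrative assumptions, not a description of any particular vendor’s system; production systems typically learn such weights from outcome data rather than hard-coding them.

```python
# A minimal sketch of data-driven candidate matching (illustrative only).
from dataclasses import dataclass

@dataclass
class Candidate:
    skills: set[str]          # e.g. {"python", "sql"}
    years_experience: float
    assessment_score: float   # 0.0-1.0, from an automated skills test

@dataclass
class Role:
    required_skills: set[str]
    min_years: float

def match_score(candidate: Candidate, role: Role) -> float:
    """Combine several data points into a single 0-1 match score."""
    skill_overlap = len(candidate.skills & role.required_skills) / max(len(role.required_skills), 1)
    experience_fit = min(candidate.years_experience / role.min_years, 1.0) if role.min_years else 1.0
    # Weighted blend; real systems would learn these weights from outcome data.
    return 0.5 * skill_overlap + 0.2 * experience_fit + 0.3 * candidate.assessment_score

candidate = Candidate({"python", "sql", "aws"}, 4.0, 0.85)
role = Role({"python", "sql"}, 3.0)
print(f"Match score: {match_score(candidate, role):.2f}")  # 0.96
```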

Public Perception and Regulatory Changes

But when I talk about the use of AI in recruiting, whether it be to a friend over coffee or to a panel of my peers, I hear the same basic concerns. Most of those concerns center around the danger of bias. The fear is that AI systems will reflect existing prejudices, as algorithms inadvertently perpetuate the discriminatory patterns present in historical data, leading to unfair outcomes for certain demographic groups. The news has recently highlighted many instances of bias in AI systems, particularly in industries like banking and health care.

As a result, rules regulating the use of AI in the hiring process are already in the works. For example, starting in July, New York City businesses will have to conduct a bias audit of any Automated Employment Decision Tools (AEDTs) they’re using to help them select candidates, as well as notify candidates that they are using AI in the hiring process.
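
As a rough illustration of what a bias audit can involve, the sketch below computes per-group selection rates and impact ratios from an AEDT’s decisions. The data is made up, and the 0.8 benchmark is the EEOC’s four-fifths rule of thumb; the actual audit requirements under the New York City law are more detailed than this.

```python
# A minimal sketch of an impact-ratio check for an AEDT (illustrative only).
from collections import Counter

def selection_rates(outcomes: list[tuple[str, bool]]) -> dict[str, float]:
    """outcomes: (group_label, was_selected) pairs from the tool's decisions."""
    totals, selected = Counter(), Counter()
    for group, picked in outcomes:
        totals[group] += 1
        selected[group] += picked
    return {g: selected[g] / totals[g] for g in totals}

def impact_ratios(rates: dict[str, float]) -> dict[str, float]:
    """Each group's selection rate divided by the highest group's rate."""
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

outcomes = [("group_a", True)] * 40 + [("group_a", False)] * 60 \
         + [("group_b", True)] * 25 + [("group_b", False)] * 75
for group, ratio in impact_ratios(selection_rates(outcomes)).items():
    flag = "REVIEW" if ratio < 0.8 else "ok"   # 0.8: four-fifths rule of thumb
    print(f"{group}: impact ratio {ratio:.2f} [{flag}]")
```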

Eliminating Bias: A Call for Diversity in Training Data

The ironic thing about hiring bias in AI is that AI systems were once lauded for supposedly eliminating human bias in hiring. It wasn’t a human making the decision, but a seemingly objective machine. The problem is that AI systems are trained on historical data, and that data typically reflects outdated hiring attitudes.

Rigorous Testing for Fair Outcomes

So, if there’s bias in the data, there will be bias in the result as well. Take, for instance, the tech industry, in which only 27 percent of workers are female. If the historical data skews toward successful male candidates, an AI system providing hiring recommendations may favor men over equally or better qualified women, perpetuating and amplifying the gender bias inherent in the original dataset.

To combat this, AI developers should curate diverse training data that accounts for these weaknesses. For example, they need to ensure that successful female and successful male candidates are equally represented in the training set.
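
One simple way to achieve that balance is to oversample underrepresented groups until each group appears equally often in the training set, as in this minimal sketch (the record structure and field names are illustrative assumptions):

```python
# A minimal sketch of rebalancing training data by oversampling (illustrative only).
import random

def rebalance(records: list[dict], group_key: str = "gender") -> list[dict]:
    """Oversample smaller groups until all groups are equally represented."""
    by_group: dict[str, list[dict]] = {}
    for record in records:
        by_group.setdefault(record[group_key], []).append(record)
    target = max(len(group) for group in by_group.values())
    balanced = []
    for group in by_group.values():
        balanced.extend(group)
        # Draw extra samples (with replacement) from underrepresented groups.
        balanced.extend(random.choices(group, k=target - len(group)))
    random.shuffle(balanced)
    return balanced
```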

But even with the best efforts of your developers, AI systems may not turn out exactly as you expect. That means you’ll need to test the output rigorously to ensure that it matches an unbiased, realistic hiring process. Try conducting blind recruitment experiments in which identifiers such as names, gender, or other potentially bias-inducing information are concealed during the initial screening. Reports that indicate potential bias should prompt further investigation or a temporary suspension of the AI system until the issue is resolved.
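A blind-screening step can be as simple as stripping potentially bias-inducing fields from each application before it reaches the scoring model. This sketch assumes a dictionary-based application record; the field names are illustrative:

```python
# A minimal sketch of blind screening: redact identifying fields (illustrative only).
BLIND_FIELDS = {"name", "gender", "ethnicity", "date_of_birth", "photo_url"}

def redact(application: dict) -> dict:
    """Return a copy of the application with identifying fields removed."""
    return {k: v for k, v in application.items() if k not in BLIND_FIELDS}

application = {
    "name": "Jane Doe",
    "gender": "female",
    "skills": ["python", "sql"],
    "years_experience": 4,
}
print(redact(application))  # {'skills': ['python', 'sql'], 'years_experience': 4}
```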

Transparency and Trust Building

Aditya Nair, Head of Fulfillment and Member of the Executive Team, Turing Enterprises Inc.

It is also good practice to notify candidates that you may use AI for screening (a brief note in your outreach emails works well) and to give them insight into exactly how you’ll be using AI and how you’ll ensure accuracy. For example, you can explain that you use AI for language processing to search applications for specific keywords, such as the coding languages you need developers to have experience in or how many years of experience they have. You can explain that the AI will not take gender or ethnicity into account. And if you’ve already evaluated your software to comply with new laws about bias in hiring, you can indicate that as well.
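
A keyword-style screen like the one just described might look like the following minimal sketch. The language list, regular expressions, and output fields are illustrative assumptions, and the function operates only on the skills-related text it is given:

```python
# A minimal sketch of keyword-based screening (illustrative only).
import re

LANGUAGES = {"python", "java", "go", "rust", "typescript"}

def screen(application_text: str, required: set[str]) -> dict:
    """Match coding languages and estimate years of experience from free text."""
    words = set(re.findall(r"[a-z+#]+", application_text.lower()))
    found = LANGUAGES & words
    years = re.search(r"(\d+)\+?\s*years", application_text.lower())
    return {
        "languages_matched": sorted(found & required),
        "languages_missing": sorted(required - found),
        "years_experience": int(years.group(1)) if years else None,
    }

text = "Senior engineer with 6 years of Python and TypeScript experience."
print(screen(text, {"python", "typescript", "go"}))
# {'languages_matched': ['python', 'typescript'], 'languages_missing': ['go'], 'years_experience': 6}
```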

The Human Touch: Interviews as an Integral Component

In addition, interviews continue to be an essential part of hiring, even for remote workers. Data about an applicant’s past successes or positions is only part of the equation for determining if someone will be a good fit for your organization. Letting potential applicants know that an interview will be part of the process reassures them that using AI doesn’t mean you’re removing the human element entirely.

Disclaimer: The views expressed in this article are those of the author and do not necessarily reflect the views of ET Edge Insights, its management, or its members.
