Where does AI belong in recruiting?

Previously on the blog we’ve talked about whether we are all doomed to be replaced by robots and how the prevalence of automation may impact jobs and the recruiting industry.


As these technologies continue to be developed, experts are still debating what the full impact of automation and artificial intelligence will be on jobs and economies around the world. Without a crystal ball in hand, we can at least begin to think about how AI may be used in recruiting.


One area ripe for artificial intelligence-powered automation is the routine work that recruiters perform, and which they are sometimes guilty of neglecting. Writing for HRtechnologist.com, Rhucha Kulkarni suggests AI could help improve the candidate experience:


Responsiveness improves the candidate experience: AI-powered chatbots and virtual assistants are the latest trends in recruitment—they help respond to the candidates who have applied but have been rejected, and even communicate status updates at every stage to those who are selected.


Many Applicant Tracking Systems already use automation to send out auto-response emails and to match or screen candidates based on the information they submit in their applications. While these automations have created efficiencies for recruiters, they don’t always result in better candidate experiences, which in turn can hurt recruiting efforts. Perhaps that is where artificial intelligence can improve these outcomes, adding personalization and the learning that comes with experience into the automation equation.


Kulkarni speaks to AI being used to screen and source candidates:


The latest in automation is the use of AI algorithms to screen resumes, follow up with candidates, and source potential candidates for future skills requirements.


In particular, AI may help remove bias from the recruiting and hiring process. Noel Webb writes on UndercoverRecruiter.com:


Implementing AI in the hiring process can help achieve the goal of diverse teams as it will rank and score candidates based on qualification and leave bias out of the decision of whom to add to the short-list of top candidates.


So how would an AI-powered screening or matching technology work differently than ATSs that already automate some part of the screening process? At HRzone.com, Ed Donner explains how “Deep Learning” can use existing candidate and job data to accomplish those kinds of tasks:


Deep Learning is most effectively applied in areas where it can learn from vast quantities of data. It needs examples of inputs and outcomes; the more examples you provide, the smarter it gets. Over time, the algorithm gains the ability to assign outcomes to situations that it’s never seen before.
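The learning loop Donner describes, examples of inputs and outcomes in, generalization to unseen cases out, can be sketched in miniature. The following toy scorer is purely illustrative (all the "resumes" and labels are made up, and real systems use far richer models than word counts), but it shows the basic idea of learning from labeled examples and then scoring a resume the model has never seen:

```python
from collections import Counter

# Hypothetical labelled examples: (resume text, was the hire successful?).
# All data here is invented for illustration.
training = [
    ("python sql data pipelines", True),
    ("python machine learning models", True),
    ("retail cashier customer service", False),
    ("warehouse forklift inventory", False),
]

def train(examples):
    """'Learn' by counting how often each word appears in successful
    vs. unsuccessful examples -- a stand-in for real model training."""
    counts = {True: Counter(), False: Counter()}
    for text, hired in examples:
        counts[hired].update(text.split())
    return counts

def score(counts, text):
    """Score a new, unseen resume: positive means its vocabulary
    looks more like the successful examples."""
    return sum(counts[True][w] - counts[False][w] for w in text.split())

model = train(training)
print(score(model, "python data engineer"))      # leans toward the "hired" vocabulary
print(score(model, "cashier customer service"))  # leans toward the "not hired" vocabulary
```

The more labeled examples you feed in, the more reliable the counts become, which is the "the more examples you provide, the smarter it gets" dynamic in Donner's description.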


For most recruiters, a shortage of data to input is less of a problem than the quality of that data. If you start by inputting profiles or resumes belonging to successful hires, you are assuming those profiles or resumes are accurate representations of those people in the first place, writes Rob May at HR Examiner:


Like all A.I. systems, the output is only as good as the input. The input here is natural language descriptions of people captured through various HR workflows. If employees don’t take that capture seriously, and use frivolous, incomplete, or buzzword-laden language, then the results of any word vector analysis built on that data will be weak.
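May's point about buzzword-laden language weakening word-vector analysis can be demonstrated with a deliberately simple similarity measure. This sketch uses bag-of-words cosine similarity rather than true word vectors, and the example profiles are invented, but the failure mode is the same: two genuinely different roles described in buzzwords look nearly identical to the algorithm, while concrete descriptions stay distinguishable.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two texts treated as bags of words.
    1.0 means identical vocabulary; 0.0 means no words in common."""
    ca, cb = Counter(a.split()), Counter(b.split())
    dot = sum(ca[w] * cb[w] for w in ca)
    norm_a = math.sqrt(sum(v * v for v in ca.values()))
    norm_b = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (norm_a * norm_b)

# Two different roles, described concretely...
concrete = cosine("maintains postgres databases and etl jobs",
                  "designs brand campaigns and print layouts")

# ...versus the same two roles described in buzzwords.
buzzword = cosine("dynamic synergy-driven results-oriented team player",
                  "results-oriented dynamic team player with synergy")

print(round(concrete, 2), round(buzzword, 2))
```

The concrete pair shares almost no vocabulary and scores low, while the buzzword pair scores high despite describing different jobs: the frivolous input has erased the signal the analysis depends on.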


Recounting Microsoft’s experiment with an AI chatbot named Tay, Jessica Bateman at the Guardian writes about how easily AI and algorithms can adopt human biases:


But they also highlighted a major problem faced by the AI industry: if robots learn from humans, there’s a good chance they’ll also adopt the biases – gender, racial and socio-economic – that exist in society.


Bateman’s article suggests that whatever AI we come to rely on must be built with diverse perspectives from the ground up, so that these new programs don’t inherit and express our biases.


There is both opportunity and risk in developing and using AI for recruiting. We’ve seen that when time-saving automations are applied blindly in recruiting, without compassion or intelligence, they can (and often do) contribute to bad experiences for candidates, which hurts a recruiter’s or employer’s ability to hire over time.


With the help of artificial intelligence, it may be possible to reduce biases in recruiting and hiring – or to compound them, depending on how the tools are built. People often unconsciously favor candidates who remind them of themselves, so an AI recruiting tool designed to counter bias might be tasked both with finding qualified candidates and with surfacing qualified candidates who run counter to those patterned preferences. Essentially, the AI would give you what you ask for, but also question your judgement in certain areas.


The science fiction nerd in me feels a little weird about building robots that are designed to doubt or disobey you, but I think there is great potential for AI to help remove or reduce bias in the recruiting and hiring process. I’m also excited about its capacity to create better candidate experiences and provide insights into how recruiters, employers, and hiring managers can improve recruiting and hiring overall.
