Bias in hiring is still a widespread problem. Even when we have the best intentions, unconscious biases – the automatic mental shortcuts we use to process information and make decisions quickly – can negatively affect our recruiting decisions outside of our awareness and control.
Fortunately, we can now turn to technology to help solve this very human problem. The biggest topic in recruiting these days is artificial intelligence (AI).
As AI techniques become more common in recruiting, they are helping recruiters fight unconscious bias in three promising ways.
How AI is being used for recruiting
AI for recruiting is the application of artificial intelligence techniques such as machine learning, sentiment analysis, and natural language processing to streamline or automate parts of the recruiting process.
The most common recruiting functions that AI is being used to streamline and automate include candidate sourcing, screening, and outreach. These are also the areas where research has found unconscious bias can occur in hiring.
How unconscious bias affects hiring
Unconscious bias includes stereotypes based on group membership and cognitive biases such as the similarity attraction effect (i.e., the tendency for people to seek out others who are similar to them).
Research on hiring practices has documented this similarity attraction effect: hiring managers prefer candidates who are similar to themselves in hobbies, such as the sports they play, and in life experiences, even though these similarities aren't related to job performance.
A famous study found resumes with white-sounding names receive 50 percent more interviews than identical resumes with black-sounding ones. Another study found resumes with male names were twice as likely to be hired as identical resumes with female names.
AI can fight unconscious bias in job postings
A carelessly written posting is bad not only from a recruitment marketing standpoint but from a diversity one as well.
Employers may not realize that their job postings could be discouraging segments of the candidate pool from applying. Studies have found, for example, that using too many masculine-coded words (e.g., ambitious, dominate) in a job description may dissuade female candidates from applying.
Software that uses AI can “de-bias” a job posting by conducting sentiment analysis to identify exclusionary language and suggest alternatives that appeal to a more diverse talent pool.
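To make that concrete, here is a minimal sketch of the keyword-flagging idea in Python. The word lists are illustrative only; commercial de-biasing tools rely on far larger lexicons and more sophisticated language analysis than a handful of hard-coded terms.

```python
import re

# Illustrative (not exhaustive) word lists, loosely inspired by research on
# gendered wording in job ads; real tools use much larger lexicons and
# richer language models rather than simple keyword matching.
MASCULINE_CODED = {"ambitious", "dominate", "competitive", "assertive", "fearless"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "interpersonal", "loyal"}

def flag_gendered_language(posting_text):
    """Return the masculine- and feminine-coded words found in a job posting."""
    words = set(re.findall(r"[a-z]+", posting_text.lower()))
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

posting = "We want an ambitious engineer who can dominate the market."
print(flag_gendered_language(posting))
# {'masculine': ['ambitious', 'dominate'], 'feminine': []}
```

A real tool would go a step further and suggest neutral alternatives for each flagged term, but the flagging step above captures the core idea.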
AI can reduce unconscious bias in resume screening
AI can screen thousands – even millions – of resumes in a fraction of the time it would take a human. Software that uses AI can reduce unconscious bias by using machine learning to learn what the actual qualifications for a job are: instead of relying on (untested) rules of thumb such as the school someone graduated from, it analyzes the resume data of existing employees and then identifies candidates whose resumes fit that profile.
To further prevent unconscious bias at the screening stage, AI recruiting software can be programmed to ignore demographic information such as gender, race, and age or proxies for race and socioeconomic status such as the names of schools attended and addresses.
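As a rough illustration, the Python sketch below shows what this kind of "blind" screening step might look like. The field names are hypothetical; real applicant-tracking systems structure candidate data differently and typically apply more nuanced rules.

```python
# A minimal sketch of blind screening: remove fields that encode demographic
# information, or common proxies for it, before a candidate record is scored.
# Field names here are hypothetical, for illustration only.
DEMOGRAPHIC_FIELDS = {"name", "gender", "age", "date_of_birth", "photo_url"}
PROXY_FIELDS = {"school_name", "home_address", "graduation_year"}

def redact_candidate(record):
    """Return a copy of the candidate record with demographic and proxy fields removed."""
    blocked = DEMOGRAPHIC_FIELDS | PROXY_FIELDS
    return {key: value for key, value in record.items() if key not in blocked}

candidate = {
    "name": "Jordan Smith",
    "gender": "F",
    "school_name": "State University",
    "skills": ["python", "sql", "project management"],
    "years_experience": 6,
}
print(redact_candidate(candidate))
# {'skills': ['python', 'sql', 'project management'], 'years_experience': 6}
```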
While resume screening can be mind-numbing for human recruiters, it's exactly the type of pattern matching AI was made for – immune to the mental fatigue, assumptions, and biases that humans fall victim to.
AI can pinpoint unconscious bias in your overall recruiting process
Because AI is trained to detect prior patterns, any human bias that may already be in your recruiting process can be learned by the AI. That means human oversight is still necessary to ensure AI isn’t replicating existing biases or introducing new ones.
You can test your recruiting process for bias by assessing the demographic breakdown of your new hires and comparing it to the overall candidate pool. The nice thing is that AI can identify where bias is happening more quickly and accurately than a human can.
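As a simple illustration, the Python sketch below compares selection rates across two hypothetical applicant groups. The counts are made up, and the "four-fifths rule" used to flag a gap is a common rule of thumb rather than a definitive test of bias.

```python
# Compare selection rates across demographic groups in the candidate pool.
# The applicant and hire counts below are invented for illustration.
applicants = {"group_a": 200, "group_b": 150}
hires = {"group_a": 30, "group_b": 12}

selection_rates = {g: hires[g] / applicants[g] for g in applicants}
highest = max(selection_rates.values())

# Four-fifths rule of thumb: a group's selection rate below 80% of the
# highest group's rate may signal adverse impact worth reviewing.
for group, rate in selection_rates.items():
    ratio = rate / highest
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")
```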
So while technology can identify the problem, it still requires recruiters’ best judgment to come up with solutions to overcome biases that may be limiting the diversity of the workplace.
About the author: Ji-A Min is the Head Data Scientist at Ideal.