OVERVIEW
In today’s highly competitive business environment, a diverse workforce has become a factor that can give an organization the edge. Winning clients, attracting talent or even securing funding to take a company public can depend on diversity. For example, Goldman Sachs Group recently announced it will no longer support initial public offerings of companies with all-male boards.
The returns on diversity are tangible, especially when it comes to innovation: a survey of 1,700 companies across eight countries found that organizations with above-average diversity had, on average, innovation revenue 19 percentage points higher and earnings before interest and taxes (EBIT) margins 9 percentage points higher than their less diverse peers.
However, humans are prone to bias, whether intentional or unconscious. Organizations are therefore looking for ways to counter those inclinations and make objective decisions when it comes to hiring, firing, promoting and developing workers.
In studies on orchestra auditions, women were more likely to advance to final rounds when they performed behind a screen. For many organizations, artificial intelligence (AI) has begun to serve as that screen throughout the talent life cycle.
“Challenges in today’s business world require diverse thinking,” observes Katherine Conway, head of Diversity & Inclusion and Community Affairs for Aon in Europe, the Middle East and Africa. “This is about driving business value. Clients want to see teams that reflect the global workforce, and they want the unique and creative ideas that come from diversity of thought.”
IN DEPTH
Algorithms, AI and data analysis are helping companies find the right internal and external people to fill jobs and decide how to develop their employees. With the right controls, these tools can evaluate candidates’ and employees’ skills, experience and other specified criteria – and exclude any characteristics that could introduce bias.
Reducing Bias In Recruitment
The traditional hiring process, which involves individuals sorting through résumés and conducting numerous interviews, can be distorted by something as simple as time constraints. For instance, recruiting teams can introduce biases as they narrow the candidate pool to a more manageable size – such as by targeting graduates of a certain university or by using hiring platforms that only accept applicants with photos.
When used properly, AI can help reduce the impact of those human biases in hiring. The technology’s ability to consider a larger pool of candidates improves the chances of greater diversity. Using data generated by detailed questionnaires to evaluate that pool, technology can help companies identify qualified prospects quickly and with less room for bias.
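As a rough illustration of how such screening might work in practice – the field names and weights below are hypothetical, not Aon’s methodology – a questionnaire-based score can be computed from competencies alone, with demographic attributes and common bias proxies stripped out before any evaluation takes place:

```python
# Hypothetical sketch: score applicants on questionnaire competencies only,
# explicitly dropping demographic fields and bias proxies first.
# Field names and weights are invented for this illustration.

EXCLUDED_FIELDS = {"name", "age", "gender", "ethnicity", "photo_url", "university"}

COMPETENCY_WEIGHTS = {  # assumed competencies measured by the questionnaire
    "problem_solving": 0.35,
    "collaboration": 0.25,
    "domain_experience_years": 0.25,
    "learning_agility": 0.15,
}

def screening_score(applicant: dict) -> float:
    """Weighted sum of competency responses; excluded keys are ignored."""
    relevant = {k: v for k, v in applicant.items() if k not in EXCLUDED_FIELDS}
    return sum(COMPETENCY_WEIGHTS.get(k, 0.0) * float(v) for k, v in relevant.items())

applicants = [
    {"name": "A", "gender": "F", "problem_solving": 8, "collaboration": 7,
     "domain_experience_years": 4, "learning_agility": 9},
    {"name": "B", "gender": "M", "problem_solving": 6, "collaboration": 9,
     "domain_experience_years": 6, "learning_agility": 5},
]

shortlist = sorted(applicants, key=screening_score, reverse=True)
print([a["name"] for a in shortlist])
```

The point of the sketch is simply that the scoring function never sees the excluded attributes, so they cannot influence who is shortlisted.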
“From a diversity perspective, AI is not going to be influenced by a candidate’s demographic characteristics,” says Richard Justenhoven, product development director at Aon’s Assessment Solutions. “Bringing AI into the process can help keep the focus on measures like competencies, skills and experience to determine a candidate’s suitability for a role.”
Reducing Bias Throughout The Talent Life Cycle
Beyond recruitment, AI analysis of behavioral questionnaires can help create a more diverse and inclusive workplace. For example, the technology can help promote diversity by increasing existing employees’ awareness of and access to opportunities for advancement in the organization.
“Relying on performance management data and instinct to select who gets promoted is an inherently biased process,” comments John McLaughlin, commercial director at Aon’s Assessment Solutions. “Employee evaluations can be subjective. All employees don’t have equal visibility with leadership and access to sponsors who can bring more awareness to their achievements and skills. Technology can help reduce bias in decision-making around promotions and leadership potential.”
As businesses look to fill roles or evolve in new directions, technology can help leaders identify and develop suitable candidates who might already be on the payroll. “Often organizations aren’t thinking through a growth strategy for innovation based on growing their own internal talent,” McLaughlin adds. “The opportunity to apply this type of data in this way can be critical to an organization that is undergoing this type of change.”
AI models can also predict market-based pay for jobs, even with limited data. “This is important as new jobs are being created, especially technology-related jobs,” says Stefan Gaertner, partner at Aon’s Rewards Solutions Practice and cohead of the People Analytics Practice. “The rise in remote working has meant that there isn’t a lot of compensation data available. Using AI models to predict or infer fair market pay reduces compensation bias in the workplace, creating a fairer pay environment.”
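A minimal sketch of how such an inference could work – assuming a handful of benchmark roles and a simple ordinary least-squares fit, with features and figures invented for illustration – might look like this:

```python
# Hypothetical sketch: infer market pay for a new role from a small set of
# benchmark jobs. Features, figures and model choice are illustrative only.
import numpy as np

# columns: job level (1-5), count of in-demand skills, fully-remote flag
X = np.array([
    [2, 3, 0],
    [2, 4, 1],
    [3, 5, 1],
    [4, 6, 0],
    [4, 7, 1],
    [5, 8, 1],
], dtype=float)
y = np.array([68_000, 72_000, 85_000, 102_000, 110_000, 128_000], dtype=float)  # annual base pay

# ordinary least squares with an intercept term
X1 = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

new_job = np.array([1, 3, 7, 1], dtype=float)  # intercept, level 3, 7 skills, remote
predicted_pay = new_job @ coef
print(f"Inferred market pay: ~${predicted_pay:,.0f}")
```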
In addition, companies can use AI to model future-state scenarios for their business and then design development paths that will allow people to perform in future jobs, even if those can’t yet be clearly defined.
The technology can also be used to provide quality control for an organization’s talent assessment process – that is, to improve the performance of the humans actually making hiring and promotion decisions. Comparing the AI’s ratings of job candidates with those of human recruiters and managers can highlight discrepancies that might be due to bias. These findings can then inform recruiter retraining, increasing awareness of potential or actual favoritism and prejudice.
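A simplified sketch of that kind of quality-control comparison is shown below, with toy scores standing in for real assessment data; the threshold and group labels are assumptions for the example:

```python
# Illustrative sketch (not a production audit): compare AI scores with recruiter
# scores for the same candidates and flag groups where humans rate systematically
# lower than the AI, which may signal bias worth investigating.
from collections import defaultdict
from statistics import mean

# (candidate_id, group, ai_score, human_score) -- toy data
reviews = [
    ("c1", "group_a", 7.8, 7.5),
    ("c2", "group_a", 6.9, 7.1),
    ("c3", "group_b", 8.1, 6.4),
    ("c4", "group_b", 7.4, 5.9),
]

gaps = defaultdict(list)
for _, group, ai, human in reviews:
    gaps[group].append(human - ai)  # negative = human scored lower than the AI

for group, deltas in gaps.items():
    avg_gap = mean(deltas)
    flag = "  <-- review for possible bias" if avg_gap < -1.0 else ""
    print(f"{group}: mean human-vs-AI gap {avg_gap:+.2f}{flag}")
```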
Getting The Process Right
If technology is to reduce or eliminate bias across the employee life cycle, fine-tuned and robust questionnaires and evaluation algorithms are essential. According to McLaughlin, the key is transparency.
“We like to recommend a ‘glass box’ approach: you want to offer visibility into what the AI is evaluating, how it’s scoring results and how it’s used to arrive at a decision,” he notes. “Transparency allows someone to course correct if bias is being introduced at some point along the way.”
It’s critical to ensure the assessment tools are designed to avoid inadvertently favoring a specific gender or ethnicity, so algorithms and questionnaires must be designed and evaluated with care. For example, training an assessment algorithm purely on past hiring decisions will reproduce any bias embedded in that history.
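One widely used check – cited here as a general example rather than a description of any vendor’s process – is the “four-fifths rule,” which compares selection rates across groups and flags an assessment when one group’s rate falls below 80 percent of the highest group’s:

```python
# Illustrative four-fifths rule check on the outcomes of an assessment algorithm.
# A ratio below 0.8 is a conventional signal of possible adverse impact.

def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

def adverse_impact_ratios(rates: dict) -> dict:
    """Ratio of each group's selection rate to the highest-rate group's."""
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}

# toy outcomes for two applicant groups
rates = {
    "group_a": selection_rate(selected=45, applicants=100),
    "group_b": selection_rate(selected=30, applicants=100),
}

for group, ratio in adverse_impact_ratios(rates).items():
    status = "OK" if ratio >= 0.8 else "below four-fifths threshold -- investigate"
    print(f"{group}: impact ratio {ratio:.2f} ({status})")
```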
“We also need to future-proof these models,” remarks McLaughlin. “We need to make sure that the models and the way we evaluate and assess people are adaptable.”
Using Unbiased Technology To Help Overcome Human Bias
Machines will never replace humans in the hiring and talent evaluation process. Yet, when equipped with neutral questionnaires and well-constructed algorithms, AI can screen job candidates without the biases that might otherwise hinder diverse hires and promotions. In addition, it can help employees develop the skills needed to become more effective and open doors to advancement. Taken together, the result could be a more diverse workforce.
“Properly designed and used, AI and assessment technology can help underrepresented workers break through bias-driven professional ceilings and help organizations reap the benefits of diverse workforces,” says Justenhoven.