Can AI eliminate bias in banks' hiring decisions?

BOK Financial has spent the last few years refining its recruitment strategy. One motivation is to diversify its employee base.

The Tulsa, Okla., company has updated job descriptions to remove gender bias and revamped its interview process to be more structured. It also enhanced its training: In 2016, every recruiter completed a Certified Diversity and Inclusion Recruiter training program run by AIRS, a division of payroll company ADP. BOK rolled out its own six-month diversity and inclusion accreditation program last year, which focuses on understanding inclusion and unconscious bias.

At the same time, the $45.6 billion-asset BOK is using technology to smooth out the application process and give it a competitive edge among job seekers.

BOK Financial is one of several financial institutions that have turned to technology-driven recruitment and assessment tools in recent years. Some of these are expressly designed to diversify hiring decisions. Others are designed to eliminate any type of bias by highlighting skills over factors that could suggest age, gender or race, such as name and location. But the technology has its limits and is no replacement for human decision-making, experts say.

In January, BOK deployed a conversational artificial intelligence assistant from the recruiting software company Paradox. The assistant, named Olivia after the Paradox founder’s wife, lives on the BOK careers homepage, ready to recommend open positions and help job-seekers complete applications. Paradox’s interview scheduling module integrates with Microsoft Outlook calendars to automatically arrange interviews. One advantage of initially screening candidates using Olivia, according to Paradox, is that the assistant can prequalify candidates without intentionally or subconsciously passing judgment on names or other background details.

Roxanna Maciel, director of talent acquisition and attraction at BOK Financial, likes that Olivia can answer questions at any time of day, including on weekends. She reports that candidate feedback has been excellent so far. But, “we remain very selective with what we hand off to artificial intelligence,” she said by email. “We currently view our technology partnerships, particularly with Olivia, as assistive.”

Paradox is one of several vendors that use artificial intelligence or other technology to try to strip racial, ethnic, gender and other biases from the recruiting and hiring process, where they can creep in through people’s names, addresses and other background details. Pymetrics builds assessment tools that focus on candidates’ soft skills, while eFinancialCareers screens prospects and matches them with recruiters based on their qualifications rather than their personal information.

These vendors all count financial institutions among their clients. But while technology can speed up certain processes and emphasize skills over less meaningful data, human oversight remains vital in recruiting, attracting and retaining diverse talent.

Aaron Rieke, managing director of Upturn, a nonprofit that advocates for equity and justice in technology, says that companies first need to ask themselves what abilities are required for a position, what their goals are for diversity, how they will hold themselves accountable and where they will look for underrepresented candidates. Once companies have set these goals, technology can help deliver on them more consistently.

“Technology can’t do that hard work for you,” Rieke said. “AI is not a panacea or a solution to bias.”

How technology helps

Hiring managers for highly paid finance, consulting and legal jobs have traditionally weighed pedigree schools and common networks in their decisions, said Rieke. Technology that focuses on skills and job-related factors can help break through those barriers.

EFinancialCareers is a global careers site that connects financial services professionals with recruiters, and counts HSBC, JPMorgan Chase, Morgan Stanley and Wells Fargo among its clients. When recruiters search its marketplace for candidates, they won’t necessarily find names attached to job seekers’ resumes. (Site users can choose whether or not to reveal their names in search results.)

The site uses AI and predictive analytics for a process it calls “candidate match,” where it connects skills and requirements from a job posting to an application and indicates to recruiters how closely a candidate matches their desired qualifications.
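EFinancialCareers has not published the mechanics of candidate match, but the core idea of scoring candidates on skills rather than personal details can be sketched as a simple overlap ratio. Everything below is a hypothetical illustration, not the site’s actual model, which uses AI and predictive analytics:

```python
def match_score(job_skills, candidate_skills):
    """Fraction of a posting's required skills that a candidate lists.

    Hypothetical sketch: the real "candidate match" feature relies on AI
    and predictive analytics, not a bare overlap ratio.
    """
    required = {s.lower() for s in job_skills}
    offered = {s.lower() for s in candidate_skills}
    if not required:
        return 0.0
    return len(required & offered) / len(required)

# Names, schools and addresses never enter the comparison, only skills.
posting = ["Python", "risk modeling", "SQL"]
applicant = ["sql", "python", "Excel"]
print(round(match_score(posting, applicant), 2))  # 2 of 3 requirements met
```

The point of the design is visible in the inputs: the function has no field for a name or location, so those details cannot influence the score.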

Art Zeile, the CEO of eFinancialCareers, which is based near Denver, says this design matters because it eliminates unconscious biases that might otherwise seep in.

“The process has nothing to do with where they are from or the nature of their last name,” said Zeile. “It lets skills dictate who moves forward.”

The site is also working on an automated process to find discriminatory language in job postings and resumes. The goal is to scan postings and resumes for certain phrases, such as “early stage in your career” (which could indicate latent ageism) or “must have U.S. citizenship,” that could open a candidate up to bias, and flag them to the poster as well as to the compliance team. Currently, this process is manual.
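In its simplest form, the kind of automated phrase-flagging the site is working toward could look like the sketch below. The phrase list and function names are hypothetical, and a production system would use a much larger, compliance-reviewed set:

```python
import re

# Hypothetical phrase list; a real compliance team would maintain a much
# larger, reviewed set with more nuanced patterns.
FLAGGED_PHRASES = {
    r"early stage in your career": "possible ageism",
    r"must have u\.s\. citizenship": "possible national-origin bias",
}

def flag_posting(text):
    """Return (pattern, reason) pairs for flagged phrases found in a posting."""
    hits = []
    for pattern, reason in FLAGGED_PHRASES.items():
        if re.search(pattern, text, flags=re.IGNORECASE):
            hits.append((pattern, reason))
    return hits

posting = "Ideal if you are at an early stage in your career. Must have U.S. citizenship."
for pattern, reason in flag_posting(posting):
    print(pattern, "->", reason)
```

A system like this would route the hits to the poster and the compliance team rather than rejecting the posting outright.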

Adam Godson, the chief product officer at Paradox, in Scottsdale, Ariz., says his product places a similar emphasis on skills. The Olivia conversational AI assistant screens candidates by asking questions about their qualifications, such as licenses and certifications or previous experience, and requirements that wouldn’t normally appear on a resume, such as whether an applicant can work certain hours.
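Paradox has not published its screening logic, but knockout-style prequalification on questions like those can be sketched as a simple rules check. The questions and names below are hypothetical:

```python
# Hypothetical knockout questions; Paradox's actual screening logic is not public.
REQUIRED_QUESTIONS = [
    "Do you hold the required banking license?",
    "Can you work Saturday shifts?",
]

def prequalify(answers):
    """Pass a candidate only if every required question is answered yes.

    `answers` maps question text to a boolean; an unanswered question
    counts as no.
    """
    return all(answers.get(q, False) for q in REQUIRED_QUESTIONS)

print(prequalify({q: True for q in REQUIRED_QUESTIONS}))  # qualifies
print(prequalify({REQUIRED_QUESTIONS[0]: True}))          # missing an answer
```

Because the check consumes only answers to job-related questions, a candidate’s name or address never reaches the decision.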

A number of financial institutions, including Citizens Financial Group in Providence, R.I., and Desert Financial Credit Union in Phoenix, use Paradox on their careers sites — though with different nicknames and photos for the assistant.

Beyond the resume

Another crop of vendors has emerged in the last few years, using machine learning to train assessment models and de-bias hiring. Pymetrics is one such platform, with 15 clients in financial services, including nine in the U.S.

Early in the application process, candidates will be assigned 12 to 16 online games that evaluate their cognitive and emotional attributes and sometimes, numerical and logical reasoning. To test memory, one game quickly flashes a series of numbers on the screen and asks users to retype the sequences. Another game rewards users with escalating sums of money to pump balloons as large as they can before they burst, gauging the player’s approach to risk. Other activities look at planning and attentional styles.
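Pymetrics’ games and scoring models are proprietary, but the risk tradeoff in the balloon game can be sketched in a few lines. All values below are hypothetical, denominated in cents:

```python
def balloon_trial(pumps, pop_point, cents_per_pump=5):
    """One balloon: each pump banks cents, but reaching the pop point loses all.

    Hypothetical sketch of the risk game described above; the real game's
    pop points and scoring model are proprietary.
    """
    if pumps >= pop_point:
        return 0  # balloon burst, nothing banked
    return pumps * cents_per_pump

# A cautious player banks a little; a risk-taker busts the same balloon.
print(balloon_trial(5, pop_point=12), balloon_trial(20, pop_point=12))
```

Averaged over many balloons, how far a player pumps before cashing out becomes a behavioral signal of risk appetite.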

“These features are known to be, if not completely unbiased, far less so than resume data,” said Frida Polli, CEO and co-founder of pymetrics, which is based in New York City. The names, schools and even activities listed on resumes can introduce bias.

The platform then matches candidates with the roles it believes they are best suited for.

Pymetrics builds its algorithms on training data gathered from millions of people who have been successful in their fields, or customizes them to a specific company based on top performers in the roles the company is hiring for. To mitigate bias in its algorithms, the company tests them against reference groups to ensure they meet legal standards of fairness, such as the “four-fifths” rule, which requires that the selection rate for women be at least 80% of the rate for men.
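The 80% threshold described above is known in U.S. employment law as the four-fifths guideline: the selection rate for one group should be at least four-fifths of the rate for the comparison group. A minimal check, with hypothetical numbers:

```python
def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of group A's selection rate to group B's.

    Under the four-fifths guideline, a ratio below 0.8 is taken as
    evidence of adverse impact. All numbers here are hypothetical.
    """
    return (selected_a / total_a) / (selected_b / total_b)

# Hypothetical: 40 of 100 women recommended vs. 45 of 90 men.
ratio = adverse_impact_ratio(40, 100, 45, 90)
print(round(ratio, 2), "passes" if ratio >= 0.8 else "fails")
```

In practice a vendor would run this comparison for every protected group against the highest-rated group, and retrain any model that falls below the threshold.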

On its careers site, JPMorgan Chase mentions that it will invite prospects to play the pymetrics games after reviewing their applications.

Rieke is cautious about vendors that model their assessments on an existing pool of candidates, because gender and racial biases may already be embedded in the workplace. For companies that use this approach, he says the starting question should be, “Is the existing population of employees or recruits representative of where we want to be as a company?”

Polli’s response is that the behavioral data pymetrics collects is largely unbiased, because memory, altruism, decision-making and other traits don’t differ across gender or ethnicity. The company also tests every model to make sure different genders and ethnic groups aren’t receiving different scores.

At BOK, Maciel says the bank will test some of its processes with AI and look for areas of improvement.

“AI seems promising for the recruiting field,” she said, “but the verdict on its impact on diversity is still out.”
