Bankers may need to move the pitfalls of facial recognition higher up their priority lists, and the concerns are as much societal as technical.

Last week the American Civil Liberties Union demanded that Amazon stop selling its Rekognition program to government agencies and police departments. The ACLU said the technology is flawed and that it is worried law enforcement agencies will use the system to track protesters and immigrants.

“Face recognition is a biased technology,” the ACLU said. “It doesn’t make communities safer. It just powers even greater discriminatory surveillance and policing.”

Recent studies have shown facial recognition systems tend to have higher error rates for women and minorities than for white men.

In one real-life example, a Chinese woman’s colleague was easily able to unlock her iPhone X with her face. Apple gave the customer a new phone, but the same thing happened again. Both women are Asian. Apple did not respond to a request for comment.

Rizwan Khalfan, chief digital and payments officer at TD Bank, said such stories give him pause but do not totally discourage him.

Apple’s Face ID “is not 100%. But how many authentication capabilities are 100%?” he said. “I can’t think of anything that’s 100%. There have been a few cases like that, and we keep a close eye on it.”

Another example he noted was of a father and son in India with similar facial profiles who were able to spoof Face ID.

“That’s always a concern, but that’s why we don’t depend on one form of authentication,” he said. “When we authenticate a customer it’s always multifactor authentication. Nothing is perfect on its own, but in combination, you get to a much better level of security.”

As banks increasingly adopt Apple’s Face ID and other facial recognition technologies to let people log into mobile banking with a selfie, here are some things they need to think about, including questions of security, social equity and customer service.

The bias problem

Antony Haynes, associate dean for strategic initiatives and information systems at Albany Law School, pointed out that all artificial intelligence systems have the potential for bias.

“One assumption we make as human beings is that putting something in software makes it somehow objective or neutral or unbiased,” he said. “That couldn’t be further from the truth because a human being has to write the software, provide the training data, and tell the system when it succeeds or fails.”

Facial recognition systems get fed sets of images to learn to identify people.

The trouble, Haynes said, is the data sets being used to train these systems, including free image libraries, do not represent the overall population.

“Think about the companies that create software,” he said. “Many are based in Silicon Valley, [and] most are controlled or owned by young white men and young Asian men. So the data sets look like that, mostly men. Because the photographs used to train their systems are mostly of white men, the software is going to do better at recognizing men than women, and better at recognizing white people than black or Asian people.”
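The skew Haynes describes shows up directly in per-group false-rejection rates, which a bank can audit from its own login logs. A minimal sketch of such an audit, using hypothetical group labels and log data (this is an illustration, not any vendor's actual schema):

```python
from collections import defaultdict

def false_rejection_rates(attempts):
    """Compute the false-rejection rate (legitimate users denied)
    per demographic group from a log of genuine login attempts.

    `attempts` is a list of (group, accepted) pairs; the group
    labels here are illustrative placeholders.
    """
    totals = defaultdict(int)
    rejects = defaultdict(int)
    for group, accepted in attempts:
        totals[group] += 1
        if not accepted:
            rejects[group] += 1
    return {g: rejects[g] / totals[g] for g in totals}

# Hypothetical log: a system trained mostly on images of one group
# tends to show an elevated rejection rate for the others.
log = ([("group_a", True)] * 97 + [("group_a", False)] * 3
       + [("group_b", True)] * 85 + [("group_b", False)] * 15)
rates = false_rejection_rates(log)  # {'group_a': 0.03, 'group_b': 0.15}
```

A gap like the one in this toy log (3% rejection for one group versus 15% for another) is the kind of disparity that testing on a representative image set is meant to catch before deployment.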

The biggest danger for banks, in Haynes’ view, is that legitimate customers might be unable to access their accounts.

Haynes shared that when he was in the Air Force, his base in Colorado Springs had several access-control mechanisms, including an iris scanner.

Haynes, who is African-American, passed everything but the eye scanner.

“The security guard said, ‘Sir, look into the eye scanner!’ I said, ‘But I am looking into it!’ ”

Because the eye scanner failed to recognize him, he had to go through pat-downs and other additional security measures.

“I was the only black person, and all my Caucasian comrades made it through the process of getting into the base,” Haynes said.

Later, he found out the eye scanners used at the base were calibrated for Caucasian eyes.

“If the bank’s facial recognition is not properly tested and standardized for a range of features, then you’ll have frustrated customers,” Haynes said. “It’s an inconvenience. I don’t know if it’s a legal issue, but it’s definitely a PR issue.”

Defending the technology

USAA’s facial recognition system had trouble recognizing minorities in its initial deployment, said Rick Swenson, director of enterprise fraud management at TIAA, who previously was responsible for USAA’s facial recognition system. The system would prompt users to improve the ambient light so the company could get a match.

The company no longer has this problem, according to Richard Davey, senior manager of information technology at USAA, which today has 3 million members using its biometric logins — fingerprint, face and voice. It supports Apple’s Face ID and uses Daon’s biometric technology in its mobile app to support other devices.

“I’m not aware of any issues being brought to our attention where members were having difficulties with using the facial recognition,” Davey said. “I have not heard any significant issues related to any facial types or even glasses or facial hair.”

USAA monitors all authentication activity for signs of biometric login errors.

“That’s something we pay great attention to and we see a strong success rate with facial recognition,” Davey said. He declined to share a pass-rate number.

“It may fail a couple of times before being successful, but that happens so quickly the end user doesn’t experience any delay; they just see the application open,” he said. “The end user doesn’t see those failures.”

Some solutions

USAA also offers a PIN fallback to its biometrics: if the lighting is too poor for facial recognition, background noise interferes with voice recognition or wet fingers prevent fingerprint recognition, the member can type in a code.

To keep its app secure even when a biometric such as facial recognition fails and the consumer falls back to a PIN, USAA has embedded a token credential in its mobile app that uniquely ties that instance of the app to the member’s device; that token, coupled with the PIN, provides two factors of authentication.
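The two-factor fallback described here pairs something the member has (a token bound to one app install) with something they know (the PIN). A minimal sketch of how such a check could work; the function names, salts and storage are illustrative assumptions, not USAA’s actual implementation:

```python
import hashlib
import hmac

def hash_secret(secret: str, salt: bytes) -> bytes:
    # Derive a comparison key from a secret; PBKDF2 is a standard
    # choice for stretching low-entropy secrets such as PINs.
    return hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 100_000)

SALT = b"per-user-random-salt"  # in practice: random and unique per user
ENROLLED_TOKEN = hash_secret("device-token-issued-at-enrollment", SALT)
ENROLLED_PIN = hash_secret("4821", SALT)

def authenticate(device_token: str, pin: str) -> bool:
    # Both factors must match; compare_digest resists timing attacks.
    token_ok = hmac.compare_digest(hash_secret(device_token, SALT),
                                   ENROLLED_TOKEN)
    pin_ok = hmac.compare_digest(hash_secret(pin, SALT), ENROLLED_PIN)
    return token_ok and pin_ok
```

The design point is that neither factor alone is sufficient: a stolen PIN fails without the enrolled device’s token, and a cloned app install fails without the PIN.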

“We’re not aware of any fraudulent account takeovers that have occurred through the biometrics on our mobile app,” Davey said. “In order to compromise a phone using the USAA mobile app, you would need physical access to that device in addition to the ability to pass the biometric or knowledge of the PIN.”

TD Bank enabled Apple’s Face ID within days of the launch of iPhone X to customers in the U.S. and Canada, Khalfan said.

The bank’s experience with it has been “seamless,” he said.

“It was a real advancement in my opinion because they use three-dimensional mapping, versus 2D,” he said.

The bank uses other forms of authentication in combination with the facial recognition, he said. For instance, it makes sure the customer’s device is one they have used before; if not, the customer will be prompted to go through other authentication checks.

“We need to make sure the customer experience is balanced with ensuring security for the customer,” Khalfan said. “So far, so good.”

TD Bank has not seen failure rates among Face ID users that are out of line with other types of authentication, such as Touch ID or one-time passwords, he said.

“I’ve used it myself since Nov. 1, and my experience has been pretty good,” Khalfan said. “There have been a few times when it didn’t recognize me, but typically on the second try it recognizes you. Out of curiosity, I’ve tried it wearing a hat, unshaven, and in different light.”

The bank does not use any other facial recognition for authentication, he said. It is waiting for facial recognition to become available on Android devices.

The bank does use facial recognition for research. During customer focus groups, customers’ faces are monitored to gauge their response.

“We ask questions, as you would do in any focus group, to measure your sentiment and your voice, but we also look at your facial expression to see if there are any unarticulated needs we can determine,” Khalfan said. “That’s been quite useful for us.”

A customer looking at a new bank website design, for instance, might say they like it. But their expression might show they are confused.

Need for thorough testing

Swenson pointed out that not all facial recognition systems are the same.

“Systems that rely more on facial characteristics will be more prone to failure,” he said. “Solutions that have ambient light requirements could be impacted. Solutions that use symmetrical analysis and perform measurements from eye to eye, eye to jawline, jawline to forehead, are much less prone to failure.”

Apple, for example, has users turn their head side to side and up and down to gather 3-D data that can also measure depth of nose, mouth and neck line.
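The measurements Swenson mentions reduce to distances between facial landmarks, and ratios of those distances are more stable across camera distance and framing than raw pixel measurements. A toy sketch of the idea, using made-up 3-D landmark coordinates (the landmark names and values are hypothetical, not any vendor’s scheme):

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y, z) landmark points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def face_ratios(landmarks):
    """Scale-invariant ratios of the measurements described above:
    eye to eye, eye to jawline, jawline to forehead."""
    eye_to_eye = dist(landmarks["left_eye"], landmarks["right_eye"])
    eye_to_jaw = dist(landmarks["left_eye"], landmarks["jaw"])
    jaw_to_forehead = dist(landmarks["jaw"], landmarks["forehead"])
    return {
        "eye_jaw_over_eye_eye": eye_to_jaw / eye_to_eye,
        "jaw_forehead_over_eye_eye": jaw_to_forehead / eye_to_eye,
    }

# Hypothetical landmarks; the z coordinate carries the depth
# information that a 3-D sensor like Face ID's captures.
face = {
    "left_eye": (30.0, 60.0, 10.0),
    "right_eye": (70.0, 60.0, 10.0),
    "jaw": (50.0, 10.0, 5.0),
    "forehead": (50.0, 90.0, 8.0),
}
ratios = face_ratios(face)
```

Because such geometry depends on the shape of the face rather than on how brightly it is lit, systems built on it are less sensitive to skin tone and ambient light than ones that lean on surface appearance.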

Haynes conceded that lighting can be a factor in facial recognition.

“If I have two faces with the same general features and skin tone — two Caucasian males — and one is well lit and the other is not, I will get radically different results,” he said.

But in a famous case involving HP laptops, different recognition results occurred with black and white users regardless of the lighting.

“Why should the user have to change their environment to make your product work?” Haynes said. “For white employees room lighting is fine, black employees need a spotlight? That’s silly. It doesn’t make any sense.”

The main thing companies need to do is carefully test this and any other technology before deploying.

“There’s always a requirement to integrate any technology into your organization in a thoughtful way,” Haynes noted. “Sometimes organizations aren’t thoughtful about it and it ends up causing more pain and misery than joy.”

Editor at Large Penny Crosman welcomes feedback at