Bankers would like to take advantage of one of Facebook's prime marketing skills — the ability to target users who exhibit attributes like their existing customers — but there are persistent fears that doing so could make financial institutions run afoul of regulatory restraints.
In theory, Lookalike Audiences, which allow a company to upload a list of customers so that Facebook can identify common demographic information or interests, could help accelerate banks' marketing campaigns.
But fair-lending laws and the doctrine of disparate impact, under which banks can be held liable for unintentional discrimination, are staying banks' hands when it comes to using the tool.
“We have definitely seen interest from our banking clients in running targeted customer acquisition campaigns on Facebook, and for good reason,” said Pierce Hasler, vice president and general manager of industry verticals for Oracle Data Cloud. “The problem they run into is that Lookalike Audience modeling processes in advertising platforms tend to be automated and are typically designed to make maximum use of available data, which also means there’s no ability to exclude data or filter restricted attributes.”
For example, under Regulation B, banks can’t consider race, color, religion, national origin, gender, marital status, age, or whether someone is a recipient of public assistance in determining whether or not to provide service. But there is no way to see whether Facebook's algorithms use that information, or to filter it out if they do. As a result, many banks' compliance teams are steering clear.
“Financial services companies’ No. 1 concern about using more advanced people-based data targeting is the transparency, the understanding of what kind of data is fueling the overall algorithms,” said David Dowhan, CEO of TruSignal, a predictive marketing software company. “That has big implications for the compliance teams. Unless the compliance teams can review in detail exactly what kind of data is being used to fuel the targeting, the marketing and compliance teams are uncomfortable using black box algorithms.”
Still, banks see Facebook’s Lookalike Audiences as potentially valuable, if only they could clear the compliance hurdles.
“Because Facebook cannot or will not reveal the underlying data that’s powering those models, our banking clients feel uncomfortable using them,” Dowhan said. “Compliance will not sign off on them.”
Google Ads poses a similar challenge. It lets advertisers target an age category, for instance, but it doesn’t disclose what data it’s drawing on.
Some bankers are beginning to push back against such restrictions, arguing that compliance departments are being overly cautious. They note that Regulation B was established for credit decisions, not marketing.
“I believe a disconnect exists between the regulations, the interpretation of the regulations by both sides and the advanced use of data, for example Facebook Lookalike Audiences,” said a fintech analyst, who spoke on the condition of anonymity.
A data privacy executive at a large bank, who also declined to speak on the record, said there shouldn't be a problem with using the Facebook feature, "since many companies target based on an ideal demographic that matches their existing best customers.”
But others contend that the perceived compliance risks are real.
"I can understand the compliance pushback," said David Hawkins, principal at Incite CX, a customer experience consultancy; he was previously director of experience innovation at Umpqua Bank. "There would be the potential to exclude a wide swath of folks from messages and offers. It's more suitable for branding and awareness build than product offers."
Some compliance departments are worried about factors that aren’t even mentioned in any of these regulations.
“Presence of children is not called out by Regulation B, but there are some compliance teams we work with that have concerns about using any reference to the presence or absence of children as a factor that determines targeting,” Dowhan said.
The 'black box' problem
This isn’t just a Google/Facebook issue, Dowhan said.
“This applies to the concept of transparency and data targeting in general for digital,” he said. “That’s what we’re hearing: any kind of data targeting that’s not transparent — it’s a black box model and financial services companies are uncomfortable. Whether it’s Facebook, Google or some other algorithm, if you can’t accurately describe exactly the kinds of data and how your targeting works, the compliance department is pushing back and saying we can’t allow it.”
The so-called “black box” problem crops up repeatedly when banks think about using artificial intelligence, whether it’s for credit scoring, lending, or other types of service. If a machine is going out and finding data patterns and correlations for itself, how can they be sure it’s not going to find a pattern that leads to diminished service for a protected category? What if a “source audience” happens to be mostly white, wealthy and under a certain age?
The issue can occur even when banks use AI-based customer analytics internally.
“Any platform-based lookalike modeling capability, where the inherent task is to take a seed of users and leverage other, usually large data sets to match and model for additional scale, involves the same risks of possible inclusion of restricted attributes,” Hasler said. “At a minimum, the assurance that those attributes have not been included is lacking.”
That said, he believes banks will use AI-based marketing first with their own first-party data, then gradually work their way out to audiences beyond existing customers.
When TruSignal works with financial services companies to build lookalike models, it suppresses any factors that might cross a compliance line.
It might have the model focus instead on factors such as past purchase behavior, the types of cars people drive, hobbies and homeownership.
“The objective is to find good predictive power in the model, and make sure we’re avoiding certain factors that are prohibited under Regulation B,” Dowhan said.
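In code, the suppression step Dowhan describes amounts to stripping prohibited fields from customer profiles before any model ever sees them. The sketch below is purely illustrative — the attribute names and the `suppress_restricted` helper are assumptions for this example, not TruSignal's actual system:

```python
# Illustrative sketch: remove attributes restricted under Regulation B
# from a customer record before it is used to build a lookalike model.

# Fields a compliance team might bar from targeting models (hypothetical names).
RESTRICTED_ATTRIBUTES = {
    "race", "color", "religion", "national_origin", "gender",
    "marital_status", "age", "public_assistance_status",
}

def suppress_restricted(profile: dict) -> dict:
    """Return a copy of a customer profile with restricted fields dropped,
    leaving only permissible predictors such as purchase behavior."""
    return {k: v for k, v in profile.items() if k not in RESTRICTED_ATTRIBUTES}

seed_customer = {
    "age": 42,                    # restricted under Reg B -- dropped
    "marital_status": "married",  # restricted -- dropped
    "past_purchases": 17,         # permissible behavioral signal
    "car_type": "SUV",
    "homeowner": True,
}

features = suppress_restricted(seed_customer)
print(sorted(features))  # ['car_type', 'homeowner', 'past_purchases']
```

Only the filtered feature set would then be handed to the lookalike model, so prohibited factors cannot contribute predictive power even inadvertently.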
The jury is out on whether banks can safely use Lookalike Audiences. Banks’ caution could hamstring them, or it could keep them out of trouble. Eventually, Facebook and Google may be pushed to provide transparency around their data and algorithms.
Editor at Large Penny Crosman welcomes feedback at firstname.lastname@example.org.