In their battle to keep fraudsters at bay, large banks are running their own versions of The Dating Game and evaluating strangers based on how they talk.
The banks are using new software or testing beta versions aimed at comparing callers' speech patterns to stored voice prints. In some cases, the vocal profiling is being used as a marketing tool to cut the number and duration of call center interactions, prevent customer defections and improve cross-selling. In others it's being employed to fight fraud by pinpointing callers' places of origin based on their accents in a bid to determine whether they're members of groups known for financial scams.
While the systems are intended to make legitimate customers more profitable and cut fraud losses, experts caution that vocal profiling also puts banks at risk of alienating consumers. Not all customers, after all, are likely to respond favorably to having their banks judge their honesty and value based on how they talk.
"Marketers talk about this in terms of … how they can manage the [customer] relationship better," but "consumers [might] think about this as profiling," says Fatemeh Khatibloo, a senior analyst at Forrester Research Inc., of Cambridge, Mass.
Vendors of voice profiling software acknowledge that it raises privacy issues but say banks are already required to collect a wealth of personal data.
"Calls to financial institutions get stored because of regulatory requirements," says Ben Knieff, director of product marketing for NICE Actimize, a unit of NICE Systems Ltd. "If we can identify fraud on the first call, we can put this in a database and we could identify all future attempts."
NICE Actimize is testing a biometric voice print system that identifies known fraudsters, many of whom are repeat offenders.
"It becomes really challenging to strategically use soft attributes [such as a particular accent] from a legal and 'false positive' perspective," Knieff says. "No financial institution wants to alienate a group of good customers because of soft attributes."
Mattersight Corp., of Chicago, has developed algorithms based on linguistics and speech patterns that sort customers by personality type. Banks can then route customers to the representatives best equipped to deal with them.
Mattersight has also developed linguistic software it says can help flag fraudsters based on their accents. In a potentially controversial move, its technology aims to detect accents from high-crime regions of the world. A person's accent is, however, just one of more than 100 characteristics that would be vetted in real time, according to Kelly Conway, president and chief executive of Mattersight.
One top U.S. retail bank has been using both its fraud and customer assessment services since 2009, Conway says. (Mattersight's parent company eLoyalty Corp. lists one bank, Wells Fargo & Co., as a customer in a 10-Q filing from May with the Securities and Exchange Commission. Wells Fargo declined to comment.)
Mattersight's bank client saved $3 million in fraud costs and cut attrition among call-in customers whom its voice profiling system identified as having made "distressed" calls, according to a case study on its website.
"If I know the customer responds to facts I'm going to give them an offer based on the facts, and I would want to tailor my offers to highlight those attributes," Conway says. "If those people were emotive, they might want a relationship first before they will buy something."
Potential controversy aside, evaluating callers based on their accents is not too different from the vetting banks already do based on computers' Internet Protocol addresses. Those IP addresses are analyzed to determine whether a user's computer is from a region known for high incidences of fraud.
Such technology is particularly important, Conway says, because one in every 4,000 calls to a bank call center is fraudulent, and upwards of 40% of credit card losses involve call centers in some way.
Banks are understandably reluctant to discuss fraud-fighting programs, privacy advocates concede. Still, customers should be informed when data is collected about them, and for what purposes, they add.
"The question becomes to what degree information like this is being retained," says John Verdi, senior counsel at Electronic Privacy Information Center, a public interest research center in Washington, D.C. "[Banks] should not be operating secret data systems with secret inputs and secret outcomes."
Others warn that voice profiling technology could lead to a more insidious form of unconscious profiling.
"I'd worry that call center agents who are armed with this information would be hesitant to offer certain services or even deals if taking a call puts them at risk," says Khatibloo.
Bart Narter, senior vice president of banking research for Celent, downplays such concerns. With banks in an eternal battle to retain customers, and speech analytics unavailable to most customer service representatives, it's unlikely that reps could use the technology to screen customers, he says. Most banks make efforts to retain customers only after they express an intention to leave, and even then use only five standard retention tools, such as teaser rates and fee waivers, Narter adds.
"Would I invest in technology that detects mood in real time and routes calls accordingly, or give my people more 'save' tools?" Narter asks. "I'd give them more save tools, or better analytics based on segments of the customer base, not the data based on the mood of the customer on a particular day."