Chatbots offer advice without judgment. Low-income people are noticing.

Chatbots don’t judge.

They welcome sensitive questions that people might feel embarrassed posing to a human. They don’t see race, age and gender the same way a live customer service agent might.

These are some of the reasons that people with low to moderate incomes have felt comfortable using digital assistants — tools on a mobile app or website that interpret questions customers type or speak, and use AI to generate answers — to interact with their banks. For a September report, Commonwealth, a Boston nonprofit that aims to build financial security for financially vulnerable people, surveyed 1,290 people in the U.S. with annual incomes under $60,000, focusing in particular on women and on Black and Latinx people. It found that use of and trust in chatbots among people in this income bracket have risen substantially since the pandemic began.

The results may resonate with banks that are looking for economical ways to scale up customer service and connect with low- to moderate-income customers.

Respondents were twice as likely to have interacted with a chatbot as those surveyed before the pandemic, and most said the habits they developed during the pandemic would persist even after branches reopened. More than two-thirds said they would rather get certain types of advice from chatbots than from humans. For instance, 26% would prefer a digital assistant for help managing debt and expenses, and 22% said they would be interested in receiving advice on how to save more money.

“Our research suggests that there is a growing interest and openness to using chatbots and virtual assistants,” said Timothy Flacke, executive director of Commonwealth. “More importantly, that interest and openness extends beyond better served and higher income customers.”

Separately, several conversational AI providers have found that queries about postponing mortgage or loan payments, about transactions and fees, or more generally about financial hardship have been common during the pandemic — questions that skew toward a lower-income demographic or suggest customers are worried about their finances.

Among consumers of all incomes, attitudes toward chatbots are mixed. For instance, in a study conducted by Phoenix Synergistics in early 2021, only 26% of consumers using AI-powered chatbots said they were very satisfied.

Michigan State University Federal Credit Union in East Lansing, Michigan, has a chatbot nicknamed Fran that is powered by Boost.ai of Norway. The credit union serves students, alumni and faculty of Michigan State University and their families, as well as employees of some large local companies, including in areas that are economically depressed. (Both MSUFCU and Boost.ai are participating in the next phase of Commonwealth’s research to test the September findings.) When MSUFCU launched Fran in October 2019, with a different provider, its main goals were to extend service to 24 hours a day and to resolve simple questions that don’t require a human — such as the credit union’s routing number, the most popular question customers ask, whether they are searching the website or contacting customer service.

“Fran took 100,000 chats in 2020,” said Ben Maxim, vice president of digital strategy and innovation at the $6.3 billion-asset credit union. “That’s 100,000 chats we didn’t have to have our live agents answer, so that helps with our staffing and slowing down our hiring needs.” Fran is trained on content from the website’s frequently asked questions, on live chat logs and on newly written answers addressing economic stimulus payments, childcare tax credits and other events.

What people want from bots

Anecdotal evidence from conversational AI providers who were not involved with the Commonwealth report supports the finding that lower-income people are increasingly turning to this communication channel.

Kasisto, which has 20 financial institutions around the world as clients, measured a 35% increase in messages exchanged between customers and its intelligent digital assistant, Kai, between February 2020 and April 2020. Although Kasisto does not capture personally identifiable information about bank customers, executives have noticed an increase in certain requests, namely inquiries about transactions, payment deferrals (requests related to payment relief rose 18% in the same time frame) and financial hardship (“I’ve lost my job, how can you help me”).

“If someone asks a question four or five times about recent transactions or spending, you can deduce those people are worried,” said Zor Gorelov, CEO of the New York City-based Kasisto. “People who are well off don’t always look at the last transaction.”

Kore.ai, another provider of digital assistants to banks, would not comment on specific demographic details related to conversational AI. But, “the top requests during the pandemic include disputing transactions, requesting credit line increases, requesting balance transfers and answering inquiries related to fees on personal accounts,” Peter Berbee, vice president of product management in financial services for the Orlando, Florida, company, said by email. “The nature of these tasks indicate a skew towards a lower income demographic.”

Conversational AI also invites questions that people would feel embarrassed to pose to a human, perhaps because the questions are sensitive, awkward or seemingly trivial.

Henry Vaage Iversen, chief commercial officer and co-founder at Boost.ai, found that even before the pandemic, questions about how to postpone a mortgage or loan payment were extremely common. These questions multiplied during the pandemic. Before the pandemic, he also noticed people asking for definitions of basic terms, such as interest rate, or for the differences between products.

“If you are not well versed in financial terms or don’t understand what you should be doing with money, a chatbot is a great way to phrase problems in your own language,” said Anne O’Leary, research analyst at Curinos, a data, analytics and technology company for financial institutions. “It makes help accessible for those who are maybe not as financially literate as others, and it’s less intimidating than talking to a real person.”

This is an angle MSUFCU is exploring with Commonwealth. “Chatbots seem to be a way for people to open up, get the conversation started and become more comfortable seeking help from a human,” Maxim said. He finds that with Fran, the perception of human judgment is removed and people are more comfortable sharing intimate financial data.

The study picked up trends concerning race as well.

When the study controlled for income and other demographic factors, Black and Latinx participants reported feeling more comfortable with conversational AI than white participants did; white participants were also less likely than Black participants to trust advice coming from bots.

“Imagine you didn’t feel welcome in an interaction in person or over the phone. That would be one reason to be more open to these technologies,” said Flacke. “If you felt that a live customer service or in-branch experience was unwelcoming, it stands to reason why you might be more interested in the channel where you don’t have to deal with that possibility.”

When it comes to seeking advice from chatbots, there are also demographic differences. For example, Black respondents were more likely to want advice on how to increase their savings, while those who identified as Latinx or as other nonwhite race categories were about equally interested in advice about saving, managing debt and investing. People who described themselves as “financially comfortable” were more likely to want advice about saving, while those who reported they were struggling were more likely to eschew financial advice of any kind, perhaps because of negative emotions related to their finances. Fintech providers may therefore want to give their suggestions a more encouraging spin.

The drawbacks of conversational AI

The report also found that financially vulnerable people have concerns about using chatbots. They worry about being misunderstood, about the security of a bot and about whether their needs can be met without speaking to a human.

Worries about being misunderstood in particular are well founded.

The pandemic massively accelerated banks’ adoption of conversational AI, said O’Leary. These tools have grown increasingly sophisticated; some have morphed from glorified FAQ engines into assistants that can perform actions, such as locking a debit card.

But they have their imperfections. In a recent test, O’Leary was surprised by how many chatbots couldn’t understand a query containing a typo. They may also stumble over vernacular or slang, or deliver generalized advice that is unlikely to help a low- to moderate-income person with complex needs.

At MSUFCU, Maxim has found that people are more trusting of a chatbot and more easily forgive its mistakes when it’s apparent to customers that they are not interacting with a human. If Fran doesn’t understand a question, it will respond, “I’m still in training.”

Still, these assistants have learned to adapt and become more intelligent over time.

“When we were processing all this COVID data during the summer of last year,” said Gorelov, “the only thing our original systems knew was that ‘virus’ meant computer virus and ‘corona’ meant spending money on beer.”
