Last year, only 10% of surveyed Americans reported they had asked large language models such as ChatGPT and Claude for financial advice. This year, 55% did so, according to TD Bank's second annual survey of 2,500 consumers, released Tuesday.
It's part of the inevitable march toward AI adoption, according to Ted Paris, head of analytics, intelligence and AI at TD Bank U.S.
"There's an evolutionary journey people take in adopting new technology, whether it's online banking and mobile banking or what we're seeing now with AI," Paris told American Banker. "People are going to look for things to facilitate and ease their life. And where people were dabbling before, it's rapidly democratizing, and now they just have that expectation of it."
Willingness to talk to ChatGPT about money is somewhat generational: among Gen Z, 77% said they use AI to make financial decisions, alongside 72% of Millennials, 49% of Gen Xers and 30% of Boomers, according to the survey results.

The research did turn up good news for banks: respondents say they trust humans and banks more than large language models.
While 62% of Americans trust AI to provide honest and reliable information, only 18% would trust AI to make financial recommendations autonomously, the survey found.
The vast majority of consumers – 90% – say they trust personal relationships and 85% say they trust financial institutions. Nearly half (48%) say human review of AI-generated guidance would increase their confidence.
This gives banks a trust advantage that they need to sustain, Paris said.
"The majority today still say, as much as I value what it does in terms of improving my experience, I want to make sure that there's true human intelligence behind the curtain," he said. "People are going down this path with a certain degree of concern, consideration, skepticism, even. And so that whole element of trust becomes very, very important – trust and degrees of transparency. I think banks are exceedingly well positioned to lean into that space."
ChatGPT's limitations as a financial guru
"Financial advice without context is limited," said Zor Gorelov, senior advisor at Klaros Group and the founder of Kasisto, one of the first virtual-assistant providers. "A question like 'Can I afford a Tesla?' can be answered in a polished and engaging way by tools like ChatGPT or Claude, but even as LLM reasoning capabilities improve, without access to a person's full financial picture, including historical spending patterns, the output remains inherently shallow and potentially misleading."
Gorelov pointed out that LLMs take no responsibility for the advice they give. "If someone acts on that advice — buys the car and then misses mortgage payments — who is accountable? With public LLMs, the answer is effectively no one," he said.
Banks, at least for now, have a structural advantage: trusted relationships, comprehensive financial visibility and clear accountability, Gorelov said. "That combination enables guidance that is not just personalized, but responsibly grounded — something generic AI tools cannot replicate."
ChatGPT is an advanced language model, not a financial advisor, Indiana-based 1st Source Bank cautioned recently.
OpenAI, maker of ChatGPT, and Anthropic, the company behind Claude, did not respond to a request for comment.
Psyfi Money, a financial comparison and research website, recently conducted a study testing the quality of financial guidance from several AI chatbots.
The study evaluated each response based on the amount of jargon used, beginner-friendliness, accuracy, relevance, length, price discrepancy of cryptocurrencies and the inclusion of up-to-date legal and regulatory information. These factors were combined to calculate a total index score out of 100 for each provider.
While ChatGPT and Claude scored quite high, 82 and 72 respectively, the bots all provided inaccurate answers. For instance, "When asked about the safest way for a beginner to invest in cryptocurrency in 2026, Claude incorrectly described Binance as registered with the Financial Conduct Authority," the blog stated. "In reality, Binance was ordered to cease regulated activities in the UK in 2021."
The Psyfi Money testers asked each model to price 50 different cryptocurrencies, and "shockingly found that no AI model could accurately price every single one," according to the blog. Gemini, for example, overstated their value by more than seven times.
Paris noted that LLMs can hallucinate. "Getting it wrong for certain types of questions and inquiries can be more negatively consequential than for others," he said. "That's one of those things that I think would lead people to then turn more to financial institutions and to banks they're familiar with to say, hey, let me just confirm this."
Paris said consumers understand all this. While people get financial advice from ChatGPT, Claude and the like, "they're doing it with caution," he said. "People are not naive in that regard."
The TD survey found that 55% of consumers would rather wait longer for a recommendation than have it come exclusively from an AI agent, Paris said. Conversely, just 30% of respondents said they would choose faster, AI-powered recommendations even if it meant less human interaction.
"They want the confidence behind it," he said.
Human-led, AI-enhanced
The survey offered clues to what consumers want from their banks on the AI front.
Nearly half (48%) of respondents are open to using AI-powered banking assistants that proactively help with everyday tasks such as paying bills, setting alerts and transferring funds.
When provided with the scenario of calling the bank for support by phone, 81% of consumers would prefer some level of human involvement, either with AI gathering information first, then connecting the caller to a human (42%), or connecting immediately to a human who uses AI tools to find the right solution quickly (39%).
TD conducts this survey to understand its customers. The research informs the pipeline of use cases that TD will deploy and helps the bank anticipate how different segments of clients will respond.
"This alone isn't sufficient for us to go out and define a service offering," Paris said. "You do much more in terms of focus groups and things like that. You get much more granular. What this is intended to do is say, hey, are we getting the broader picture correct in the sense that we can anticipate where our clients' expectations, their needs and preferences are going, and can we meet them there to make sure that we're supporting them in the way they want to be supported?"
TD will do some brand marketing around the idea of being "simply human."
"To date, what we have emphasized is what we'll call that human-in-the-loop approach," Paris said. "When somebody wants advice on a decision that's important to them, whether it's behind the scenes or whether it's explicit, humans will be involved."