FSOC raises alarm on financial stability risks from AI

Photo: Gary Gensler, chairman of the Securities and Exchange Commission, told a Financial Stability Oversight Council meeting Thursday that artificial intelligence could "heighten financial fragility." Bloomberg News

WASHINGTON — The Financial Stability Oversight Council Thursday highlighted new and emerging risks posed by artificial intelligence in its 2023 annual report, adding urgency to scrutiny of a technology that has already commanded the attention of the administration and lawmakers.

The council voted to approve its 2023 annual report Thursday during an open meeting, and Treasury Secretary Janet Yellen — who serves as chair of the FSOC — highlighted the need for federal financial regulators to better understand and monitor AI in order to stave off potential risks. 

"Supporting responsible innovation in this area can allow the financial system to reap benefits like increased efficiency, but there are also existing principles and rules for risk management that should be applied," she said.

FSOC's approval of the report marks the first time the body — created by the Dodd-Frank Act in the aftermath of the 2008 financial crisis — has identified AI as an emerging risk. The report noted that the swift adoption of such technologies in recent years — particularly the use of AI in financial services — could imperil safety and soundness by amplifying cyber threats and creating the potential for herd behavior. The report recommends that FSOC member agencies monitor rapid developments in the artificial intelligence sphere.

"The Council recommends financial institutions, market participants, and regulatory and supervisory authorities deepen expertise and capacity to monitor AI innovation and usage and identify emerging risks," noted a Treasury release.

FSOC discussed data security, consumer protection, and privacy risks around generative AI models like ChatGPT, saying financial institutions that use them will assume such risks. The council's report also raised concerns about the opaque nature of some AI models, which can call their reliability into question. FSOC further flagged the potential for biased or inaccurate results, which could have implications for firms' fair lending and consumer protection compliance.

Securities and Exchange Commission Chairman Gary Gensler — who has previously raised concerns about AI — went even further at Thursday's meeting, saying that while the FSOC report accurately discussed micro-level challenges such as bias and the lack of explainability in AI models, regulators also need to see big-picture risks like herd behavior.

"We live in a world where we really have one search engine in the U.S., we have three large cloud companies — I mean, why would we not anticipate that we will end up with a base or foundation model that everybody else is building on top of?" he said. "AI may then heighten financial fragility, as it could come to promote herding among individual actors making similar decisions as they get the same signal from the base model or data aggregator, and they may not even know it."

AI is one of 14 potential risks raised in the FSOC report. The report also delved into the evolving role of nonbank financial institutions, addressing vulnerabilities in activities such as nonbank mortgage servicing and private credit. The council noted in a release that it supports ongoing efforts to assess and address risks associated with these activities and endorses SEC initiatives to manage risks in investment funds.

FSOC had already voted last month to finalize a new framework for identifying and designating risky nonbanks, reversing Trump-era guidance that had made nonbank SIFI designations far more difficult.

While the rules will empower the council to preemptively designate as systemically risky those firms whose collapse would impact the broader financial system, subjecting them to heightened prudential standards, Yellen Thursday reiterated her view that any such designation will follow a rigorous, transparent process.

Some bank regulators commented on the revamped designation process and the ongoing risks posed by nonbanks as well.

Consumer Financial Protection Bureau Director Rohit Chopra noted that while FSOC has had the power to designate nonbanks as systemically risky and subject them to enhanced prudential standards, he wants to see that power more readily wielded when appropriate.

"Despite the bailouts of the past in 2008, and again, at the beginning of the pandemic, there are still a total of zero firms designated for heightened scrutiny," he noted. "So it's important we make clear that this authority is not dead letter and we'll use it when warranted."

Federal Deposit Insurance Corp. Chairman Martin Gruenberg also indicated he was supportive of the newly empowered council.

"The annual report reflects the very robust analysis from staff on hedge funds, and nonbank mortgage service providers," he said. "The new FSOC analytic framework and revised nonbank designation guidance have set the stage for thoughtful and important risk analysis and policy proposals for addressing these risks."
