How new IBM partnership could speed AI

IBM is partnering with a firm specializing in machine learning in an effort to speed up artificial intelligence programs.

AI models can be complex and analyze extremely large data sets. Running one model can often take several days. That's a problem for financial services firms and other companies that are using AI for risk modeling and other purposes.

As a result, IBM turned to H2O.ai, whose open-source software is already used by 10,000 companies, mostly in financial services, insurance and health care, to build AI programs.

The new partnership allows H2O.ai’s software to run on IBM Power computers equipped with high-speed graphics processing units from Nvidia. The companies say the combination can deliver nearly fourfold performance improvements, along with 30% higher memory bandwidth.

Faster speed is important because “when you do machine learning, you’re teaching an AI brain how to recognize a pattern,” said Sumit Gupta, vice president of AI, machine learning and HPC at IBM. “You give it a lot of data, you tune the brain, you teach it, then you look at the results, you modify some aspects of it, you turn a knob, you may massage your data, and you teach the brain again. You do this hundreds of times. And these AI models eventually become very accurate at picking that pattern out. Each one of those runs can take hours and in some cases days to run.”
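In concrete terms, that loop means training a model, checking its accuracy on held-out data, adjusting a few settings and training again. The sketch below illustrates one such run using H2O's open-source Python client; the file path and column names are placeholders, not details from the article.

# One pass of the train/evaluate/adjust loop Gupta describes,
# using H2O's open-source Python client. "credit_risk.csv" and the
# "default" column are hypothetical placeholders.
import h2o
from h2o.estimators import H2OGradientBoostingEstimator

h2o.init()  # start or connect to a local H2O cluster

data = h2o.import_file("credit_risk.csv")
data["default"] = data["default"].asfactor()   # treat the target as categorical
train, valid = data.split_frame(ratios=[0.8], seed=42)

target = "default"
predictors = [c for c in data.columns if c != target]

# Each call to train() is one "run"; on large data sets it can take
# hours or days, and the knobs below (ntrees, max_depth, learn_rate)
# are what get adjusted between runs.
model = H2OGradientBoostingEstimator(ntrees=200, max_depth=5,
                                     learn_rate=0.05, seed=42)
model.train(x=predictors, y=target,
            training_frame=train, validation_frame=valid)

print(model.model_performance(valid))  # inspect results, tweak, repeat

Each iteration of that cycle is what the faster hardware is meant to shorten.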

H2O.ai is no stranger to banks. Wells Fargo, Barclays and other institutional investors led its latest funding round, with banks seeing potential in its open-source machine learning systems.

“We’re building a grass-roots movement in open source that brings the best compilers, technologists, and grand masters in data science together to build a software stack that rivals the likes of Google and giving it to enterprises and businesses that cannot attract that kind of talent,” said Sri Ambati, H2O.ai's chief executive.

Speed can be a problem with AI. For example, one deep learning model took 12 days to run on a desktop computer. When it was run on an IBM Power System, the time was reduced to seven hours.

“I tell people I’ve never taken a vacation for 12 days, so what am I supposed to do for 12 days while I wait for this to run?” Gupta said. “As a data scientist, you’re twiddling your thumbs.”

Several large financial companies are testing the combined systems, which have been in the works for about 18 months. One large bank is experimenting with the technology to benchmark its internal models and make sure they are solid and safe.

Part of what IBM and H2O.ai hope to achieve, Ambati said, is “driverless AI.”

“The goal of driverless AI is to make novice data scientists who would otherwise take a decade to become really good at building good models and preventing common pitfalls, able to perform at par or even better than grand masters and also take advantage of the latest hardware and software architectures,” he said.
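The open-source counterpart to that idea is H2O's AutoML interface, which automates the search over models and settings that an expert would otherwise do by hand. The sketch below is an illustration of that automation idea only, not a description of the capability Ambati is referring to, and again uses placeholder data and column names.

# Automated model search with H2O's open-source AutoML, as a stand-in
# illustration of the "automate the expert" idea. The file path and
# columns are hypothetical.
import h2o
from h2o.automl import H2OAutoML

h2o.init()
data = h2o.import_file("credit_risk.csv")
data["default"] = data["default"].asfactor()
train, test = data.split_frame(ratios=[0.8], seed=1)

target = "default"
predictors = [c for c in data.columns if c != target]

aml = H2OAutoML(max_runtime_secs=3600, seed=1)   # one-hour search budget
aml.train(x=predictors, y=target, training_frame=train)

print(aml.leaderboard.head())                    # ranked candidate models
print(aml.leader.model_performance(test))        # best model on held-out data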

A typical installation of the hardware and software will cost around $250,000 a year, the companies said, roughly the cost of a data scientist.

“The fact is that every industry will be disrupted by AI,” Gupta said. “The people who adopt the right AI platform with the right software are going to be advantaged. What we’re providing here is a combination of the right hardware infrastructure with the right software infrastructure and that platform to the enterprise.”
