During a recent visit to IBM’s digs in the chic Astor Place section of Manhattan, I got a peek at how Watson — the famous (and increasingly useful) artificial intelligence machine — is being taught to look for signs of improper trading, fraudulent account openings and other employee misdeeds.
“We take all of traders’ emails and chats and run them through our personality insights and tone analyzer and identify whether there’s anger, are they happy, are they sad?” said Marc Andrews, vice president of Watson Financial Services Solutions. “We’re analyzing the behavioral patterns that are associated with misconduct: How do people start behaving right before they get involved in misconduct?”
One thing Watson has discovered, according to Andrews, is that U.S. traders stop using profanity and angry language just before doing something they shouldn’t.
“It might have been because they were trying to hide things,” Andrews said.
But in the U.K., traders’ use of profanity rises when they go rogue.
“They were being proper beforehand, but then they let go of their emotions,” he said.
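In rough terms, a tone-shift check like the one Andrews describes could be sketched as follows. The word list and threshold below are invented for illustration; Watson's actual tone analyzer is a trained machine-learning service, not a lexicon lookup.

```python
# Illustrative sketch: flag traders whose recent message tone diverges
# from their own historical baseline. The anger lexicon and threshold
# are hypothetical, chosen only to make the idea concrete.

ANGER_WORDS = {"damn", "hell", "furious", "idiot", "stupid"}

def anger_score(message: str) -> float:
    """Fraction of words in the message found in the anger lexicon."""
    words = message.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in ANGER_WORDS for w in words) / len(words)

def tone_shift(history: list, recent: list, threshold: float = 0.05) -> bool:
    """True if recent tone differs from the trader's own baseline by more
    than the threshold -- in either direction, since Andrews notes U.S.
    traders went quiet before misconduct while U.K. traders got louder."""
    baseline = sum(map(anger_score, history)) / len(history)
    current = sum(map(anger_score, recent)) / len(recent)
    return abs(current - baseline) > threshold
```

The key design point, reflected in the absolute value, is that the signal is the change from an individual's baseline, not the raw tone itself.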
Along with the communications, Watson analyzes trading behaviors, volumes and frequencies, looking for suspicious trading sequences, abnormal order sizes or significant price changes. It then weighs these signals against other recent events and communication patterns for signs something might be off.
“Seeing a suspicious trade sequence or price alert alone might not indicate a problem, because good traders are good at timing the market,” Andrews said. What would be telling would be a communication with a company insider just beforehand.
One trader received a note that said, “Hey man, I think it’s going to rain here in Seattle, you’d better cover up before you get drenched.” Watson recognized that it was from a company insider and that it had a warning tone and therefore flagged it.
Watson will also look to see if traders had other compliance violations in their history or if they had made angry remarks.
By observing such patterns, it is hoped, Watson can start to alert banks to possible insider trading, pump-and-dump schemes, collusion and other forms of misconduct.
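The combination logic described above — no single signal is damning, but a suspicious trade plus recent insider contact plus a history of violations adds up — can be sketched as a simple additive score. The weights, threshold and field names here are all hypothetical, not IBM's actual scoring model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical additive risk score: each weak signal contributes, and
# insider contact shortly before the trade carries the most weight,
# per Andrews. All weights and the threshold are illustrative.

@dataclass
class TradeEvent:
    trader: str
    timestamp: datetime
    abnormal_size: bool
    prior_violations: int = 0
    insider_contacts: list = field(default_factory=list)  # contact timestamps

def risk_score(event: TradeEvent, window: timedelta = timedelta(hours=24)) -> float:
    score = 0.0
    if event.abnormal_size:
        # Weak on its own: "good traders are good at timing the market."
        score += 1.0
    if any(event.timestamp - window <= c <= event.timestamp
           for c in event.insider_contacts):
        # The telling signal: a communication with an insider just beforehand.
        score += 2.0
    score += 0.5 * min(event.prior_violations, 4)
    return score

def should_flag(event: TradeEvent, threshold: float = 2.5) -> bool:
    return risk_score(event) >= threshold
```

An abnormal order alone scores 1.0 and passes unflagged; the same order an hour after an insider message scores 3.0 and gets flagged.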
To catch phony accounts, Watson starts off looking for an unusually large number of complaints, which might indicate something is awry. It also looks for dormant accounts, accounts where notifications have been suppressed, mismatched contact information, suspicious logins, enrollment reversals and odd login times.
Watson will look to see if an employee suddenly had a spike in sales or unusual customer distribution, such as targeting elderly customers. It looks for management emails that express undue sales pressure.
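The phony-account signals listed above amount to a checklist that could be run over account records. A minimal sketch, with every field name invented for illustration:

```python
# Hypothetical checklist of the phony-account warning signs named in
# the article. Field names are invented; a real system would map them
# to the bank's actual account schema.

def account_red_flags(account: dict) -> list:
    """Return the warning signs present on a single account record."""
    checks = {
        "dormant": "dormant account",
        "notifications_suppressed": "notifications suppressed",
        "contact_info_mismatch": "mismatched contact information",
        "suspicious_login": "suspicious login",
        "enrollment_reversed": "enrollment reversal",
        "odd_login_time": "odd login time",
    }
    return [label for key, label in checks.items() if account.get(key)]
```

Accounts with several flags at once would be the ones worth a human look.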
IBM is beginning to apply this to voice communications, too, to identify ethics violations and changes in tone and speed of speech, as well as language, Andrews said.
The staff of Promontory Financial Group, the compliance consulting firm IBM bought last fall, provides color around motives, culture and conduct risk.
One of the largest global banks — IBM would not say which one — is using a cloud-based version of Watson for employee surveillance.
The gift of hindsight
Andrews acknowledges that when it comes to rogue trading and fake accounts, IBM is training Watson with histories of known prior misconduct — and hindsight is 20-20. In fact, if you know exactly what you are looking for and someone violates a policy or law, a rules-based system could catch it; you do not even need artificial intelligence. It is when you do not know what to look for that trade surveillance gets tough.
“Human beings are never static; they’re never doing the same thing today that they did previously,” said Marten Den Haring, chief product officer at Digital Reasoning, whose artificial intelligence technology has been analyzing trader activity and communications on exchanges that use Nasdaq technology for a year.
Den Haring takes Watson’s conclusions about British and American traders’ use of profanity with a grain of salt.
“There are cultural differences to any type of communication patterns,” he pointed out. “I would be cautious to think you could identify the types of patterns you just described.”
Better signals of wrongdoing come through tracking behaviors over time across multiple channels and seeing people try to conceal their behavior, he said.
“In trying to cover up, people make more mistakes and leave a lot more clues,” Den Haring said. A good example is boasting. “You completed something nefarious, you’re happy, you’re done, you don’t realize that high-fiving each other digitally is leaving just as many clues behind as planning to do something together,” he said.
Digital Reasoning also pays close attention to networks among people and sudden changes in behavior. “Those are far more interesting and less based on emotion and cultural differences,” he said.
Someone has to care
In addition to the technological difficulties of identifying patterns of bad behavior, there is the question of the culture and will of a company and its management.
In most banking scandals, the underlying bad behavior was visible to the human eye for some time. Seven hundred whistleblower complaints had been lodged about fake accounts at Wells Fargo by 2010, along with hundreds of employee and customer complaints. In the JPMorgan Chase "London Whale" case, the trader Bruno Iksil has said his dangerously large credit swap positions were part of “a trading strategy that had been initiated, approved, mandated and monitored by the CIO’s senior management.”
In such cases, the problem is not that no one knows what is going on and technology is needed to bring it to light. Management knows, and may even be directing the bad behavior — through emails and calls pressuring employees to cross-sell more aggressively, or by ordering traders to execute a high-risk strategy. No amount of software, no matter how intelligent, can force leaders to make ethical decisions.
What technology can do is help speed a compliance investigation when foul play is suspected.
“Once you have put your finger on an individual you’re putting on a watchlist, we’re making the investigation capability far richer, more interesting for the financial institutions,” Den Haring said. “That quick 360-degree look-back gives you more clues into what seems out of the norm for a trader.”
Andrews describes the value of IBM’s Watson in similar terms.
“We’re providing augmented intelligence to banks to help them identify things more quickly, earlier on, and with less resources,” he said. “We’re not making the decision, [but] we’re providing evidence to support a decision.”
Valerie Bannert-Thurner, senior vice president and head of risk and surveillance at Nasdaq, says some of Nasdaq’s bank clients have started integrating voice and electronic communications with its SMARTS trade surveillance software and the Digital Reasoning AI engine, in order to watch everything traders say and do in all channels at once.
Customers want to know if traders are changing language, location or communication channels, or suddenly starting to communicate more rapidly or often, she said.
“All that metadata around communications, overlaid with trading” will flag unusual trade activity and any intent to manipulate markets, Bannert-Thurner said.
Artificial intelligence software could also uncover collusion. In late April, the Federal Reserve fined Deutsche Bank $156.6 million for, among other things, "using electronic chatrooms to communicate with competitors about their trading positions."
Technology from IBM, Digital Reasoning and Sybenetix could easily catch such known violations. A rules-based system probably could as well.
AI can help compliance officers do their jobs better and make traders more aware they are being watched.
Sybenetix teaches its AI engine the specifics of each job, so it can build a model of normal behavior. That model drives intelligent alerts for compliance officers, which lets them ask smarter questions. In some cases, it’s replacing an Excel sheet and trade sampling.
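The "model of normal behavior" idea can be made concrete with a simple per-trader baseline: compare today's value of some metric against that trader's own history and alert on large deviations. The metric choice and the three-sigma cutoff below are illustrative, not Sybenetix's actual model, which is tuned to the specifics of each job.

```python
import statistics

# Hypothetical per-trader baseline check: alert when today's value of a
# daily metric (e.g. trade count) sits more than k standard deviations
# from the trader's own history. Metric and k are illustrative.

def is_abnormal(history: list, today: float, k: float = 3.0) -> bool:
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        # No historical variation: any change at all is worth a look.
        return today != mean
    return abs(today - mean) > k * stdev
```

A trader who normally makes about ten trades a day would sail through at ten but trip the alert at forty.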
“We’ve seen in a number of cases that traders are now coming to compliance officers before they trade and checking with them,” said Wendy Jephson, co-founder and chief behavior scientist at Sybenetix. “One of our clients said this is unheard-of behavioral change, for the front office to come in and talk to compliance. They’re basically saying, Look, I know you’re going to tap me on the shoulder, you’re going to ask me questions, let me just tell you about it upfront and make sure you’re fine.”
And it helps with the classic problem of compliance: false positives.
“If you go to large banks, their systems that are processing trades produce tens of thousands of alerts, and 99% will be false positives,” said Richard Maton, chief marketing and strategy officer at Sybenetix.
At the same time, regulators are requiring surveillance on more asset classes and instruments. Software can help sift through the large amounts of data faster than humans, and layer in related communications and behavior, to isolate activity that is truly suspicious.
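One common way such software cuts a 99%-false-positive queue down to size is triage: score each raw alert by its corroborating signals and surface only the top of the list for human review. A minimal sketch, with invented signal names and weights — not any vendor's actual ranking:

```python
# Hypothetical alert triage: rank raw alerts by corroborating signals
# so compliance reviews the most suspicious first, rather than wading
# through tens of thousands of mostly-false positives.

def triage(alerts: list, top_n: int = 100) -> list:
    def score(a: dict) -> float:
        return (2.0 * a.get("insider_contact", 0)
                + 1.0 * a.get("tone_shift", 0)
                + 1.0 * a.get("abnormal_volume", 0)
                + 0.5 * a.get("prior_violations", 0))
    return sorted(alerts, key=score, reverse=True)[:top_n]
```

The raw alerts still exist; the software just reorders the queue so the truly suspicious activity comes first.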
Editor at Large Penny Crosman welcomes feedback at email@example.com