BankThink

Boosting Your In-House Big Brother May Stop Employee Crime

I recently opened a work email with the subject line: "time to upgrade your computer." I knew that I was due for a replacement laptop and was relieved to see that someone was taking care of it at last. But it took only a moment to realize that I had been caught, not by cyber-criminals but by our own IT security team, which was looking to identify employees susceptible to such scams. This was not a real phishing attack but a simulated one, designed to prepare me for the real thing. I will not be so quick to open the next suspicious-looking email.

A compliance or risk management team within a bank or other company plays the valuable role, among others, of trying to mitigate the gullibility or ethical lapses of the firm's own employees. But actual law enforcement officers take this to the next level. Rather than merely using deceptive measures to strengthen the defenses of potential crime targets, they look for ways to infiltrate criminal groups, pretending to help them in order to stop and arrest them before a crime is carried out.

What if risk and compliance departments were to engage in activities more akin to those of real-world detectives? Instead of just filling holes in a bank's security safeguards, perhaps they could use some form of espionage to catch employees committing financial crimes or policy violations. This could mean identifying potential or even actual criminals inside the company, or simply curbing employees' impulses toward a legal or ethical lapse. Let's consider a few possible examples.

Take a trading-related case. An email directed to an employee promotes a certain stock, promising it is about to surge to previously unseen levels in the coming days. The email not only suggests that the employee invest in the stock but also asks him or her to promote it to colleagues, promising a commission fee for every person recruited to the scheme.


The employee, without thinking too much about it, talks up the stock in an email sent to a colleague. Perhaps the trader isn't a fraudster at heart, and doesn't even realize that the sent email — meant to promote a share price — effectively violates securities laws. But the exercise has still addressed the trader's susceptibility to a violation. Instead of the email actually going to the colleague, an email comes back from Compliance explaining to the trader that it was a simulation, that this would be at best a questionable activity, and at worst, a criminal act. The employee would get a warning and have to attend re-training.

Another example would be testing how prone employees are to ethical lapses involving gifts from vendors or clients. An email goes to client-facing employees from a vendor or client offering a gift of some type, say, two tickets to the musical "Hamilton." All that is required to secure the tickets is for the employee to arrange a meeting between the vendor and more senior client-facing managers.

Without thinking, the employee responds, "Yes, please," (it is "Hamilton" after all) and at that point, instead of two tickets as promised, an email comes back from Compliance explaining that this was a potentially serious code of conduct violation. The employee similarly receives a warning and is required to attend a re-training program.

These are fairly basic cases. What about something a little more involved? It is widely known that banks conduct employee surveillance programs. In the wake of the FX, LIBOR and other scandals involving use of chat rooms and the like, these surveillance programs are becoming more rigorous as part of consent orders issued by regulators. More data is being collected from employees, and with more sophisticated tools available, compliance officers are better able to identify potential bad actors.

This potentially gives compliance officers the means to infiltrate networks of collaborators and conspirators planning to engage in questionable activity — something akin to a corporate FBI. Using technology, surveillance may even be expanded to analyze certain keywords or patterns of behavior in emails and chats, which could reveal, for example, extreme stress from financial pressures or other changes in personal circumstances. Maybe an employee is planning to undertake, or is vulnerable to the suggestion of, illegal activities to pad the numbers or simply to make their numbers.

Identifying such individuals would enable compliance to direct more targeted re-education and warning programs specifically at "at risk" employees. Ensnaring employees only so far as is needed to educate them and warn them against the pitfalls of committing real financial crimes could be a valuable tool for banks looking to avoid revisiting some of the sins of the past.

Andrew Waxman is an associate partner in IBM Global Business Services' financial markets risk and compliance practice and can be reached at abwaxman@us.ibm.com or on Twitter @abwaxman. The views expressed here are his own.
