Editor at Large

Forget Brexit and all the rest. The European threat that U.S. bankers should start taking more seriously is the potential spread of tough new privacy standards across the Atlantic.

Some say the United States won't be far behind Europe in adopting tougher privacy and security laws. At present there is no cross-industry privacy law in this country; rules tend to come from industry regulators and state attorneys general.

"There's a growing demand from customers, who are largely global citizens now no matter where they live, that they should have knowledge and possibly some participation in the use of their personal data," said Stuart Lacey, CEO of Trunomi, maker of software that helps companies obtain consumer consent. "I don't see the U.S. being that far behind in having that level of surety."

Privacy has plagued financial services for at least two decades, but many observers believe the issue is about to come to a head: banks have more data about their customers than anyone else, do the least with it, and yet will have to rely on it as they overhaul their business models to stay competitive with fintechs and other rivals.

Banks want the ability to offer someone a mortgage at the moment they are thinking about buying a house, or to offer term life insurance with an annuity when they're about to retire. Segmentation and anonymized data do not get them there; only scrutiny of individuals' personal behavior and history does.

To be sure, consumers seem willing to take some chances. Their demand for personalized services runs deep today, said Jim Marous, owner of the Digital Banking Report and co-publisher of The Financial Brand.

"Consumers are willing to share family information, they're willing to share transaction information, and they believe the bank should know their short-term and long-term financial goals," including warning them when they are about to get into financial trouble, Marous said. "Their desire for personalization, for customization, and for contextual engagement — where I am and what I'm doing — is getting greater and greater."

Striking a balance among all these conflicting demands is hard. Calls for strict controls around such data-sharing programs are on the rise, in part because there is a real mismatch between the rules that exist today and the capability of emerging technologies.

"The rules that guide what consumer data a company collects, how it stores, transfers, and manages that data, and how it uses and disposes of that data — are nebulous at best and dangerously misguided at worst," Forrester analyst Fatemeh Khatibloo wrote in a recent report.

"As the data landscape evolves, from technological advances that enable de-anonymization of digital data to the proliferation of data that is personally identifiable in one context but not another, this question inevitably arises: How should organizations treat customers' personal data, and what constitutes personal data, anyway?"

Europe's New Stance

In April, the European Parliament passed a new set of rules governing how consumer data is collected, shared and managed: the General Data Protection Regulation.

"It affects every company in the world that has an operation in Europe or has European customers," Trunomi's Lacey said.

The fine for an infraction is huge — 4% of annual global revenue — and the rules are tough. Not only will companies need to obtain consent to use customers' data, they'll need to get subject-matter consent. In other words, they will have to explain to the customer precisely how they plan to use her data just before they use it. They will also have to let the customer withdraw consent, and erase her data, on demand.

"The [European Union] takes a position that privacy is a fundamental right," said Andy Roth, partner at Cooley LLP, a Palo Alto, Calif., law firm. "There are aspects of the law that are going to raise the bar on compliance and the mechanisms you have to have in place. They raised the level of consent that's required [in the GDPR]; it has to be express affirmative consent. You're going to need the ability to keep track of what the consent is for."

The rules give consumers the right to be forgotten — to have all traces of them erased from a company's records.

"The right to erasure is difficult because [the rules] don't prescribe any governance, so now you have to set up your own entity to handle these rights; you're setting up almost a mini governmental body," Roth said.

Data portability is another challenging requirement of Europe's new rules — the need to give customers a copy of their data they can take with them. Companies will also be required to have a data protection officer.

The regulation does not spell out how these requirements should be handled; the rules will be phased in over the next two years.

"This is [the] first time we're seeing regulation that's in the best interest of [the] customer, not regulation for regulation's sake," Lacey said. "It's not just trying to create more efficient markets or protect a jurisdiction, but protect the end customer and to give them some sense of transparency and help as to who is using their data and how."

The GDPR only directly affects companies that do business in Europe or have European clients — about 800,000 companies globally.

However, U.S. companies may come in contact with this rule more than they think, through third parties such as cloud providers, Roth said.

U.S. Catching Up

Americans historically have been less concerned about privacy than Europeans.

A Pew Research Center survey of more than 1,000 U.S. adults, conducted during the standoff between the Justice Department and Apple over an iPhone used by one of the suspects in the San Bernardino terrorist attacks, found that 51% believed Apple should unlock the iPhone to assist the FBI investigation. Only 38% said Apple should not unlock the phone to ensure the security of its other users' information.

And while the Obama administration crafted a White House Consumer Privacy Bill of Rights last year, Congress is unlikely to pass it.

"In the U.S., privacy is not a fundamental right," Roth said. "We're more a no-harm-no-foul-type society."

The most recent piece of U.S. legislation affecting data privacy, the Cybersecurity Information Sharing Act of 2015, is a privacy advocate's nightmare.

Yet when Americans wake up to the fact that their data is being used in ways they didn't authorize, they do push back.

In a poll conducted in April by ACT/The App Association, 93% of U.S. adults said it is important that the photos, health data or financial information they store on their phones and apps, or share online, stay secure and private. And 92% said they need encryption technology to make sure their information is secure.

"I believe there's a misperception that Americans don't care about privacy and that young people don't care about privacy," Roth said, pointing out that older Americans are gravitating to sharing personal information on Facebook, while millennials are shifting toward more private apps like Snapchat and WhatsApp.

There is a cultural shift underway: from sharing personal information in exchange for free services (the Google model) toward paying for services in exchange for deeper engagement and stronger protection of information (the Apple and Uber model).

"When I drive, I happily exchange my location for a better road to avoid a traffic jam, to get there quicker," Lacey noted. "That's a value exchange I willingly participate in. What I don't want is a company turning on my camera or my voice recorder because the app allows it to without my knowledge while I'm sitting at my desk. I don't want a Barbie doll uploading 24/7 to the cloud every word spoken in my daughter's room. Right now we have no idea how much stuff is being pulled out of us because it's unregulated."

Lacey expects a new privacy phase in the U.S., "where our comfortability with the sharing of our data will not be oblivious and blind but will be measured and will be for some level of compensation, whether that compensation be financial or utilitarian."

U.S. companies today typically get consumers' consent to use their data by relying on a general "terms and conditions" disclosure with an "I Agree" button at the bottom.

The European rules wouldn't allow this blanket approach to consent, Lacey noted. "I always refer to this as the iTunes agreement, which is actually 27,000 words. No one reads it; we just want to get on with our lives. Regulators know that many of these are written so ambiguously as to almost make them undecipherable."

Banks would still need to obtain general consent when they sign up new customers, then ask again each time they have a new use for the customer's data, such as passing it to the mortgage department to see if she qualifies for a special offer. "This is the whole concept of unambiguous consent," Lacey said.

Mapping Out Answers

Companies that have to comply with strict privacy rules like Europe's will need software to capture and track customers' express consent.

For all companies, policies and procedures around data privacy are important. Roth recommends banks do global data mapping — create a map of all the data assets they have and how they move across the enterprise around the world.

"That's the No. 1 step in establishing a control state and understanding what you're dealing with," he said. "In that process, you find out you're using way more third parties than you thought and they're getting much more information than you would have thought."
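A data map of the kind Roth describes can be as simple as an inventory of each asset, where it lives, and which outside parties receive it; inverting that map is what surfaces the third-party exposure he mentions. A minimal sketch, with entirely hypothetical asset and vendor names:

```python
# Each entry: one data asset, its region, and who it flows to.
data_map = [
    {"asset": "customer_transactions", "region": "US",
     "systems": ["core-banking"],
     "third_parties": ["cloud-analytics-vendor"]},
    {"asset": "kyc_documents", "region": "EU",
     "systems": ["onboarding"],
     "third_parties": ["id-verification-vendor", "cloud-storage-provider"]},
    {"asset": "web_clickstream", "region": "EU",
     "systems": ["marketing"],
     "third_parties": ["ad-network", "cloud-analytics-vendor"]},
]

def third_party_exposure(data_map):
    """Invert the map: which assets does each outside party see?"""
    exposure = {}
    for entry in data_map:
        for party in entry["third_parties"]:
            exposure.setdefault(party, []).append(entry["asset"])
    return exposure
```

Running the inversion over even a toy inventory like this one shows how a single cloud vendor can end up touching data assets from several business lines at once.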

In the same vein, Forrester recommends sorting all customer data into three buckets: radioactive, toxic and unclassified. Radioactive data includes personally identifiable information and PCI-protected data whose loss would violate a business agreement.

"Protect this data aggressively," Khatibloo recommended. Toxic data is information that, if lost, will do harm to customers or incur costs or brand damage for the company; it should be controlled with opt-out best practices. Unclassified data can be treated as public information without harm to the organization or customers.
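Forrester's three buckets could be applied mechanically by tagging each field against known identifier and payment-card types and letting the most dangerous tag win. A sketch under that assumption (the tag taxonomy is illustrative, not Forrester's):

```python
# Illustrative tag sets; a real taxonomy would come from a data inventory.
RADIOACTIVE_TAGS = {"pii", "pci"}          # e.g. SSNs, card numbers
TOXIC_TAGS = {"behavioral", "financial"}   # harmful or costly if lost

def classify(field_tags: set[str]) -> str:
    """Bucket a data field per the radioactive/toxic/unclassified scheme."""
    if field_tags & RADIOACTIVE_TAGS:
        return "radioactive"   # protect aggressively
    if field_tags & TOXIC_TAGS:
        return "toxic"         # control with opt-out best practices
    return "unclassified"      # treatable as public information
```

Note the precedence: a field carrying both a payment-card tag and a behavioral tag is radioactive, not toxic, because the stricter obligation governs.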

There are also the dual principles of respect and courtesy: how would any reasonable person want their data protected?

"If you put a Fitbit on, and you choose to engage in Fitbit services, you're sharing the data you create with Fitbit," Lacey said. "If Fitbit sells that information to a pharmaceutical company for a biomedical study on you and they make money off it, should you have known about that, and should you have participated in that? That's the question that's coming now."

Editor at Large Penny Crosman welcomes feedback at penny.crosman@sourcemedia.com.