Banks are trying to spin consumer privacy as a matter of customer relations, and though consumers are buying in to the banks' rationale, lawmakers are not, according to Charles Nesson, a Harvard Law School professor and the director of its Berkman Center for Internet and Society.
In addition to addressing Internet ethical and policy issues such as public access and jurisdiction in cyberspace, Prof. Nesson has recently taken up the issue of privacy. He raises the possibility of the construction of a surveillance network that could easily capture the minutiae of people's lives.
Prof. Nesson, known as "the dean of cyberspace," has written a textbook and numerous articles on evidence and is co-editor of "Borders in Cyberspace," which was published by MIT Press in 1997.
He served on the plaintiffs' legal team in the Woburn, Mass., toxic-waste trial that was the subject of Jonathan Harr's book "A Civil Action," and he has appeared as a legal commentator in several PBS and CBS television documentaries.
In a recent interview, Prof. Nesson assessed the privacy debate.
What are the major threats to privacy, and why has it suddenly become an issue for financial institutions?
It has to do with the meaning of "banker." It used to be that a banker was conceived of as someone, some entity, that had an extremely confidential relationship with the people it served. That's the image of the Swiss banker. That identity has been radically changed by the merger of banks with other kinds of entities that provide different kinds of services, such as brokerage and insurance, all pushed by an underlying profit motive.
We're moving into a time when the fundamental identity or at least the traditional identity of "banker" is very much changing.
How does this become an ethical or practical dilemma for banks?
Well, it creates a kind of dynamic in which the motive of the banker is to have a mass of customers comfortable enough to tolerate the level of information-trading that the bank engages in. It's shifting sand. The expectation of privacy turns out to be something that isn't fixed at any given point in time.
We're in an environment now where the banking industry - and it's not just the banking industry; there are others in a similar situation - is, I'd say, very anxious not to have levels of privacy become pegged by legal rules, but would rather keep them flexible and capable of evolving in a market. What it comes down to is that privacy becomes something that's treated as a problem of customer relations.
Many surveys sponsored by financial institutions report that privacy concerns, while growing, are still manageable. Do you think those are legitimate findings, or is it a form of damage control?
I think they are describing something that's real. The public concerns about privacy seem to be very broad and growing but not very deep. They're very easily bought off, very easily assured. But the fact is I think banks are up against a problem that has got a strong political dimension to it.
Broad-but-not-deep issues are perfectly good as far as politicians are concerned. Legislators want to be in the position of responding to broad concerns. So I suspect there's going to be legislative movement that the banks are not going to be so happy with.
Why the degree of political involvement on this issue?
It's something that comes into focus for people. To me, what's been spurring it has been the Internet.
The Internet is marvelous in many, many respects, but it's also frightening to people because it's got a new quality to it and people get surprised about the degree to which they are exposed in situations where they didn't really expect to be. And so there's an appetite for privacy stories. Each new horror story gets publicized and plays to public fears. And that then becomes a very rich field for political movement.
What are the specific areas that raise privacy concerns among consumers but aren't addressed in the Gramm-Leach-Bliley Act?
The ability of institutions to trade information around amongst their component parts.
Like their affiliates?
Not just affiliates, but within their own corporate environments. If you have an integrated insurance broker/bank, there's plenty of grounds for consumer worry that medical information will affect banking decisions, banking decisions will be affected by brokering decisions, all this kind of stuff. It's the cross-marketing, using the pool of information that's gathered from the various sources.
How well do you think financial institutions have been managing the situation?
The industry is saying, "Let us self-regulate. Let us become responsible and evolve a self-governance that we will show you ultimately satisfies the mass of consumers." And the privacy advocates are coming at that and saying, "No, look. This is the wolf pack organizing not for the benefit of the sheep and the deer, but for itself. And at the end of the day, it isn't working. And at the end of the day, it won't work. And there will continue to be egregious examples that we'll be able to tout around."
So the tension is whether privacy advocates can keep up the momentum of growing privacy concerns and crystallize it in legislation.
How does this relate to the ways in which personal information is now stored?
If you look at our privacy as having been a function of immature technology, then you can see what the technologies are that have matured.
First, we've got processing power. A lot of our privacy came about because there weren't enough people to listen and collate and do all the stuff that you need to do. We're just moving, I think, into the era of surveillance powers.
The sensors are now getting miniaturized and cheap enough that they can proliferate like crazy. You can see a kind of flow toward an environment that is full of sensors. And they're networked, and they're connected to processors, so it's a real environmental change, probably not one that the law is ultimately going to hold back.
During the recent elections, you said in an interview that though privacy would be talked about, "when it actually comes down to doing things that would constrain commerce in ways that would cost people money, I suspect things will be a lot more vague." Do you still believe that even if it comes up in the legislature, you can't stop commercial uses of information?
It's very tough to do it. For one thing, unless you make an absolute bar and say it can't be done and the consumer can't give permission to do it, then you're in a trading situation. And if you're in a trading situation, then a market develops. You know, "What is it worth to me to check the box or not to check the box?" So far the evidence is that it doesn't take a whole lot to persuade people to let their information be used.
Do you think that consumers don't really know what they're getting into, that they don't understand the implications of privacy threats?
Privacy is hard for people to value, because each little bit of information seems like just that: some little bit of information. You don't conserve it in a rigorous way as you go about daily life, and so what difference does it make if you let it out to this person? You've let it out to other people in the past.
The point of view of the consumer is very different from the point of view of the aggregator, at the other end. The aggregator sees the value of the information all put together, whereas the consumer can't see it. And so the aggregator sees value where the consumer doesn't, and that puts the aggregator in a position where they can buy it cheap.
Do you think it's up to the law to look out for consumer interests, or do you think it's corporations' job to set up appropriate standards?
That's the contest. The contest is whether you think of privacy as a human right, in which case you want to look for collective protection of it. Or whether you look at privacy as a customer relations problem, in which case you leave it to self-regulation. Now my own feeling is that we should conceive of privacy as a human right.