To hear some people inside and outside government tell it, the banking system has huge security problems because so many banking platforms come from the same few vendors. If an attacker knows the holes in these systems, the theory goes, they can attack any bank that uses them. So, the critics ask, wouldn't it be better to revert to the old days, when every bank had its own computer systems and an attacker faced a new set of problems every time they decided to hack somebody?
Forget about it, says Ted Kochanski, president of Lexington, MA-based Sensors Signals Systems. "What standardization allows institutions to do is constantly adjust to the marketplace, because the vendors of these systems are constantly upgrading them, and, as this happens, the old holes are plugged up," he says. New holes appear, of course, he concedes; but computer security is a dynamic process, characterized by a constant give and take between the hackers on the inside and the hackers on the outside.
Standardization thus becomes a double-edged sword: it creates system-wide vulnerabilities that must be constantly tested for and defended against. Yet the economic edge that constantly upgraded standard systems give their users, the flexibility to adjust to changing market conditions, outweighs the potential problems: problems that may not even arise before they're fixed, and that can be fixed on a very broad basis when they do.
In fact, says Kochanski, the perceived weaknesses of these widely used systems are also their strengths. "The key is that the standardization leads to an openness of development which allows the marketplace to be active in solving these security problems," he says. "The openness may be what invites attacks, but, on the other hand, if you have a customized platform, and you make a little change in the program to adjust to the market, you may be creating a vulnerability that only you can fix."
The key to computer security, says Kochanski, is accepting that it is a game: making prudent choices and not becoming too anxious about what could possibly happen, because, among other things, the systems available today are pretty good. "When you hear these stories about somebody penetrating these things, it's usually because of stupidity, not a failure of the system: the password was never reset from the factory, for instance.
"The thing is that you're dealing with a game that's constantly being played. There are people who are interested in system penetration as an intellectual activity, who test these systems all the time and who feel the best defense is a constant offense.
"The downside is that there are a lot of holes out there, and there are other people out there doing the same thing, but not for quite such innocent purposes. Some are doing it for relatively harmless malicious purposes: they want to break in and find out what's going on, but they don't know what to do with it. But there are others who are trying to conduct criminal activity.
"In any event, the idea that somewhere out there is a perfectly secure system is nonsense. It's a total fantasy," Kochanski says. "The modern concept of war is a war of maneuver, where you have more information than the other side and you're able to attack the enemy where he doesn't expect it. That's how we won the Gulf War. The other way, the entrenched customized system, is like the Maginot Line the French built: a fantastic, well-developed defense structure that the Germans just went around."
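Kochanski's factory-password example is concrete enough to sketch. Here is a minimal, hypothetical audit in Python (the usernames and default credentials are invented for illustration, not drawn from any real vendor) that flags accounts never reset after installation:

```python
# Hypothetical audit: flag accounts whose password still matches a
# vendor's shipped factory default -- the "stupidity, not a failure
# of the system" scenario Kochanski describes.

# Illustrative factory defaults (invented, not real vendor data).
FACTORY_DEFAULTS = {
    "admin": "admin",
    "operator": "changeme",
    "service": "1234",
}

def audit_defaults(accounts):
    """Return the usernames whose password is still the factory default."""
    return [
        user for user, password in accounts.items()
        if FACTORY_DEFAULTS.get(user) == password
    ]

# Example: one account was never reset after installation.
accounts = {"admin": "admin", "operator": "s3cure!pass"}
print(audit_defaults(accounts))  # -> ['admin']
```

A real audit would compare salted hashes rather than plaintext passwords; the point of the sketch is only that this class of breach is detectable with a trivial check, which is why Kochanski files it under carelessness rather than system failure.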