BankThink

Poor Vulnerability Management Is Increasing Data Breach Risk

Organizations may be misled about their security risk exposure, leaving them in serious danger of suffering a security breach, a problem that has become especially acute for merchants and payment companies.

Many of the automated vulnerability management (VM) systems in use today, which are essential for gauging network information security risk, contain a serious “hidden” flaw: companies are investing in technology that cannot accurately reconcile results from one scan to the next.

The impacts include inefficient use of IT resources; inaccurate remediation of security vulnerabilities, leaving information assets exposed to exploitation by cybercriminals; overwhelmed IT security staff, resulting in lower job satisfaction and higher turnover; and erroneous security risk reporting that can destroy the credibility of an information security program.

Any VM system should be required to provide several crucial capabilities for assessing and managing information risk: gauging risk at a given point in time, gauging how that risk changes over time, and, ideally, re-evaluating past views of risk when newly discovered information about the past comes to light.

The past is a critical part of the overall information risk picture. Many VM solutions on the market today are adequate at assessing network risk at a given point in time, but are very limited in their ability to track that risk over time.

In financial accounting, transactions that occur at various points in time are ultimately reconciled to ensure the books balance. Similarly, in vulnerability lifecycle management, point-in-time assessments must be reconciled with one another. Yet VM systems struggle with this reconciliation, because the primary technology used to assess network hosts, remote network scanning, employs a scanner that has no presence on the hosts being assessed; it resides on an independent machine.

In essence, the scanner struggles to match what it “saw” at one point in time with what it “saw” at another. There are two primary reasons for this recognition challenge. First, there is no “magic bullet” characteristic for the system to latch onto that guarantees a host can always be found again. Second, the characteristics that are always present and discoverable are subject to change over time through normal network IT administration.
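To make the matching problem concrete, the sketch below shows one simplified way a scanner might try to reconcile host records between two point-in-time scans by scoring shared attributes. The data model, attribute weights and threshold are hypothetical illustrations, not any vendor's actual algorithm; the point is that when identifiers such as an IP address or hostname change between scans, the same machine can fall below the matching threshold and be counted as a brand-new asset.

```python
# Minimal sketch of scan-to-scan host reconciliation.
# Hypothetical data model and scoring weights, for illustration only.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Host:
    ip: str
    hostname: Optional[str]
    mac: Optional[str]
    os_fingerprint: Optional[str]

def match_score(a: Host, b: Host) -> int:
    """Score how likely two scan records describe the same host."""
    score = 0
    if a.mac and a.mac == b.mac:
        score += 3   # fairly stable, but not always visible to a remote scanner
    if a.hostname and a.hostname == b.hostname:
        score += 2   # can be renamed by administrators
    if a.ip == b.ip:
        score += 1   # often reassigned (e.g., DHCP) between scans
    if a.os_fingerprint and a.os_fingerprint == b.os_fingerprint:
        score += 1   # changes after upgrades or re-imaging
    return score

def reconcile(scan_then: list[Host], scan_now: list[Host], threshold: int = 3):
    """Greedily match old scan records to new ones.
    Hosts whose identifying attributes changed may fall below the
    threshold and be wrongly treated as brand-new assets."""
    matches, unmatched = [], []
    remaining = list(scan_now)
    for old in scan_then:
        best = max(remaining, key=lambda new: match_score(old, new), default=None)
        if best and match_score(old, best) >= threshold:
            matches.append((old, best))
            remaining.remove(best)
        else:
            unmatched.append(old)
    return matches, unmatched

if __name__ == "__main__":
    then = [Host("10.0.0.5", "web01", "aa:bb:cc:00:11:22", "Linux 5.x")]
    # The same machine after a DHCP lease change and a rename: only the MAC survives.
    now = [Host("10.0.0.42", "web01-new", "aa:bb:cc:00:11:22", "Linux 5.x")]
    matched, lost = reconcile(then, now)
    print("matched:", len(matched), "treated as new or lost:", len(lost))
```

When too few stable attributes survive between scans, a heuristic like this can silently split one host's history in two, which is how remediation tracking and risk reporting drift away from reality.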

The accuracy of a single vulnerability assessment is important, and it remains the benchmark traditionally used to compare VM solutions and products. Far less attention has been paid, unfortunately, to how accurately those results can be compared across scans over time. An organization using such a limited solution must question the matching technology inside it and take action to avoid the pitfalls.

Organizations that find their current solution inadequate should either procure a replacement system that employs a more robust reconciliation methodology or integrate a third-party solution that can detect and eliminate the errors in the information their current VM solution provides.

Gordon MacKay is executive vice president and chief technology officer of Digital Defense.
