Established to protect the security and integrity of card payment systems, the Payment Card Industry Data Security Standard is designed to evolve in response to new threats and a changing business environment.
On Oct. 28, the PCI Security Standards Council is set to release an updated version of these standards.
Tokenization versus encryption has been a topic of debate over the past year, but confusion remains around which technology will best enable compliance with the PCI DSS. Additionally, financial organizations must consider which option offers the broadest security and return on investment in the long term.
Tokenization is one of the leading technologies for protecting data and reducing the scope and cost of compliance under the PCI DSS. It is typically deployed as a centralized service and lends itself to delivery as a hosted service. The tokenization service receives card data from clients, stores that data in its vault and returns a token, which the client then uses as a substitute for the real card data.
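This vault-and-token flow can be sketched in a few lines of Python. This is a toy illustration only; the class name and random-token scheme are assumptions, and a real vault would encrypt its store, authenticate clients and run as a hardened central service:

```python
import secrets

class TokenVault:
    """Toy tokenization service: maps random tokens to stored card data."""

    def __init__(self):
        self._vault = {}  # token -> real card number (the central "vault")

    def tokenize(self, pan: str) -> str:
        # The token is random, with no mathematical relationship to the PAN,
        # so it is useless to anyone without access to the vault.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the central service can map a token back to the real data.
        return self._vault[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
assert t != "4111111111111111"                    # clients only ever see the token
assert vault.detokenize(t) == "4111111111111111"  # the vault can reverse it
```

Because every detokenize call must reach the vault, this design implies the constant online operation discussed below.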
Encryption is the other principal choice for protecting sensitive data in a reversible way (the other methods defined by PCI DSS are hashing, truncation and masking, but these are not reversible). The crucial difference relative to tokenization is that with encryption there is no central vault for the sensitive data.
Instead, the data itself is rendered unreadable by an encryption algorithm and encryption key rather than being substituted by an unrelated token. So how do the two technologies compare when it comes to complying with the PCI DSS and beyond?
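The non-reversible methods the PCI DSS names, truncation and masking, are simple to illustrate. The helper functions below are hypothetical; the "first six, last four" truncation shown is a commonly cited PCI DSS convention:

```python
def truncate(pan: str) -> str:
    # Keep at most the first 6 and last 4 digits; the middle is discarded
    # permanently, so the original PAN cannot be recovered.
    return pan[:6] + pan[-4:]

def mask(pan: str) -> str:
    # Hide all but the last 4 digits for display purposes.
    return "*" * (len(pan) - 4) + pan[-4:]

pan = "4111111111111111"
print(truncate(pan))  # 4111111111
print(mask(pan))      # ************1111
```

Unlike encryption or tokenization, there is no key or vault that can bring the discarded digits back, which is precisely why these methods fall outside the reversibility debate.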
An important issue to consider is deployment. In this regard, encryption is the more flexible of the two.
Tokenization forces a centralized or service-based model and requires constant online access to the data vault. Encryption can be deployed centrally, with a shared encryption service, but it can also be distributed so that encryption takes place at the point where data is captured or processed.
By pre-sharing keys with these locations, remote sites can operate autonomously offline for as long as is desirable. This flexibility is a definite plus for encryption.
Related to the flexibility of deployment is the issue of sharing sensitive data between sites or between different organizations. Because of its centralized nature, tokenized data cannot easily be shared: providing selective access for third parties into the tokenization system presents a huge identity management problem, and adding external connections all but eliminates the scoping benefits.
Scalability is a challenge for all data protection systems — the more data there is to protect and the more places it can be found, the greater the size of the problem. With encryption, the need to protect and manage large numbers of keys can certainly be an issue, but tokenization faces at least a couple of additional challenges.
The practical Achilles' heel of encryption is key management.
Companies must ensure that their key management systems are watertight if they are to remain compliant with the PCI DSS requirements. Once encrypted, information is readable only if the decryption key is available to unlock it; consequently, the key becomes as valuable as the data it protects. The situation can be likened to securing a home: locking the house protects its contents, but if the key is then left under the mat, that security is compromised.
In the same way, encryption keys need to be stored and managed effectively to keep data secure. If a company's key management operations are not effective, it runs the risk of losing keys, and therefore data, permanently: the dreaded "data shredder."
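The point that the key becomes exactly as valuable as the data can be made concrete with a toy one-time pad. This is illustrative only and not a PCI-approved cipher; real deployments would use a managed symmetric cipher such as AES under a key management system:

```python
import secrets

# Encrypt a card number by XOR-ing it with a random key of equal length.
data = b"4111111111111111"
key = secrets.token_bytes(len(data))  # this key must now be stored and managed
ciphertext = bytes(d ^ k for d, k in zip(data, key))

# With the key, decryption is trivial...
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == data

# ...but without `key`, `ciphertext` is unrecoverable noise. Losing the key
# destroys the data permanently -- the "data shredder" in miniature.
```

Protecting the ciphertext while leaving the key exposed, or losing the key altogether, defeats the entire exercise, which is why PCI DSS scrutinizes key management so closely.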
Determining which technology to deploy requires careful consideration not only of how to achieve PCI DSS compliance but also of how to meet continuing security requirements. Bob Russo, general manager of the PCI Security Standards Council, has suggested that "there needs to be a mind shift from just compliance to security since compliance is a byproduct of good security." With this in mind, companies should look to enforce best-practice security for customer data; for larger companies in particular, this usually involves encryption or a combination of encryption and tokenization.
For those companies that only want to protect cardholder data for a single purpose, tokenization is likely to be the preferred choice. However, as the type of data that privacy regulators care about extends beyond card data alone, it is likely that companies will need to look beyond just tokenization in order to comply.
Indeed, if encryption is deployed centrally, the average user would have a hard time telling the two apart: in each case you authenticate to a service, send in the sensitive data and get back something unreadable. For many companies, therefore, enterprisewide data protection may well involve both tokenization and encryption.
Organizations handling diverse types of sensitive data across dispersed business applications will probably be forced to opt for encryption with centralized key management. Similarly, multinationals that face a wide and dynamic compliance landscape will probably come to the same conclusion.
Nevertheless, tokenization still presents a useful point solution in certain situations and, by sharing key management, can be integrated into a wider enterprise data protection strategy.
Whichever is best for your situation, with numerous factors to consider and opinion on the merits of both technologies still divided, it is good to see that the proposed changes in PCI DSS 2.0, previewed by the PCI Council in August, won't make matters worse by changing the rules on tokenization or encryption.
However, expectations across the industry are high that additional guidance on both topics is in the pipeline. This will help qualified security assessors and financial companies alike to ensure continued compliance, whether companies opt for tokenization, encryption or both to protect cardholder data.