Tokenization vs. end-to-end encryption

14.08.2009

The other key point of E2E is that some companies are focused on an enterprise view of end-to-end, rather than defining one of the endpoints as the acquirer. In addition, the policies for, and the processing of, chargebacks in some companies tend to mess up the end-to-end scenario.

The main thing to remember regarding encryption is that it is but one of 12 major PCI-mandated controls, even when it is E2E. It seems unlikely that the next release of the PCI DSS requirements, due in the fall of 2010, will eliminate the need for the other 11 primary controls just because one control is in place. But we'll see.

Speaking of things that are supposedly no longer needed, there is a lot of discussion about tokenization solving all problems. Tokenization replaces credit card numbers (or other confidential data) with a surrogate number, or "token," and then centralizes (or outsources) storage of the actual card data to reduce (some say eliminate) the insider threat.
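
To make the idea concrete, here is a minimal sketch of how a token vault might work, assuming an in-memory store and random tokens that keep only the last four digits for display. The class and method names are illustrative; real tokenization services add key management, audited storage, collision handling and a network API.

    import secrets

    class TokenVault:
        """Illustrative vault mapping surrogate tokens back to card numbers."""

        def __init__(self):
            self._vault = {}  # token -> original PAN (primary account number)

        def tokenize(self, pan: str) -> str:
            # Keep the last four digits so downstream systems can still display
            # them; replace the rest with random digits that carry no card data.
            token = "".join(secrets.choice("0123456789")
                            for _ in range(len(pan) - 4)) + pan[-4:]
            self._vault[token] = pan
            return token

        def detokenize(self, token: str) -> str:
            # Only the system holding the vault can map a token back to the PAN.
            return self._vault[token]

    if __name__ == "__main__":
        vault = TokenVault()
        token = vault.tokenize("4111111111111111")
        print(token)                    # surrogate value, safe for application use
        print(vault.detokenize(token))  # original PAN, available only via the vault

The point of the design is that applications downstream of the vault handle only the surrogate value, so the card data itself is concentrated in one place (or at one outsourced provider).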

In an ideal world, that may well be possible. In our research, however, we haven't found any large companies that have been able to completely eliminate or outsource card data, even after implementing tokenization. The reasons include business requirements, the cost of changing production applications, and the difficulty of actually finding and purging all their card data.

On the other hand, some smaller organizations, which are single-channel and have a highly centralized data architecture, have been the most successful at handing off the data and compliance headaches to tokenization companies. Even so, these organizations still need to encrypt whatever credit card data remains, both to stay compliant and to minimize the insider threat.
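
As a rough illustration of protecting that residual data, the sketch below encrypts a hypothetical leftover card number using the third-party "cryptography" package (pip install cryptography). The variable names and the choice of Fernet are assumptions for the example; a real deployment would also need PCI-grade key management and access controls.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # in practice, held in a key-management system
    cipher = Fernet(key)

    residual_pan = b"4111111111111111"   # hypothetical card number still stored locally
    stored_value = cipher.encrypt(residual_pan)  # what actually sits in the database

    print(stored_value)                  # ciphertext, useless without the key
    print(cipher.decrypt(stored_value))  # recoverable only by systems holding the key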