If you ever watched Star Trek, you soon learned Dr. McCoy’s signature line: “Dammit, Jim, I’m a doctor, not a [insert a more useful occupation for the crisis at hand].” In the Payment Card Industry (PCI), it appears companies are doing a riff on Bones’s signature line: “I’m a merchant, Jim, not a security expert!” So why are we surprised when we hear about the latest data breach?
Not that there aren’t penalties for losing data: a company can be fined by the credit card companies for a violation and can even lose its card-accepting privileges. The stakes are high, but companies also face the cost of storing, managing, and monitoring encrypted data and of being audited by PCI-certified auditors, all of which adds complexity and eats into profit.
Tokenization, a solution that’s relatively new to the market, offers potential advantages over the de facto standard, encryption. But even the PCI’s own standards committee can’t decide which defense best keeps credit card data safe.
“There are too many changes in IT happening too quickly for an organization to wait for a standards committee to issue a clear pronouncement on each of them,” says David Taylor, a former e-commerce analyst with Gartner and research director of the PCI Alliance, in “Data Security Slugfest: Tokenization Vs End-to-End Encryption.”
"Rather, I would suggest that retailers begin now to investigate the value of these technologies, especially tokenization and end-to-end encryption, to determine where one or the other, or both of them, can be used...." His explanation of why encryption alone doesn't work is useful.
At The Falcon’s View blog, Ben Tomhave shares his frustrations about his search for data security solutions in “Does Tokenization Solve Anything?”: “To me, the solution here is to get the data out of the hands of the merchants. If the merchants don’t have the cardholder data, then you don’t need to worry (as much) about them getting compromised.” Tokenization, he admits, can do just that, but he still sees problems with it.
To sort through the confusion, I'd like to point to an interview several Penton editors did with Gartner analyst John Pescatore. He explained how tokenization came about: “A lot of pretty big companies don't have credit card payment as a big part of their business, but they have the PCI security requirement even for the small amount of payment processing they do. And they thought encrypting and other PCI security requirements were too complicated, so they outsourced the payment processing so they'd never store the card data, just a token.
"These companies could get full access to the transaction data, but the outsourced payment processor sends it to them without the card data. This idea of tokenization and masking started with these outsourcers.
“Now enterprises who either can't or don't want to outsource payment processing can do it themselves with tokenization. However, outsourced payment processors do have to get certified as PCI compliant.
“Taking this approach, companies can keep their sensitive data in one database and use tokenization for other applications that need to look up credit card related data, thereby reducing the odds of a data breach. What's more important to most enterprises, however, is that now all those servers on which they used to store the sensitive data are no longer part of the PCI audit, because the only systems in the scope of the PCI audit are the systems that store and process the sensitive data.
"So what tokenization really does is limit the scope of the PCI audit, which reduces the cost of the audit and the cost of dealing with the audit.”
Pescatore had some other interesting things to say about tokenization, as well as about whether it could secure other types of data. To read the interview with him, check out my colleague Linda Harty’s write-up at the Systemi Network blog.