How Tokenization Can Help Reduce PCI Compliance Costs

The clock is ticking on the July 1, 2010 deadline for complying with the Payment Card Industry Data Security Standard (PCI DSS). Introduced in 2004, the standard was developed by the major credit-card companies as a guideline to help organizations that process card payments prevent credit-card fraud, hacking, and other security threats.

It’s no surprise that merchants are finding the expense of complying with PCI DSS significant, a burden that has chief security officers investigating practical ways to reduce both risks and costs. One option is a new data-security model, tokenization, which substitutes “tokens” for credit card numbers.

To minimize risk, the PCI Security Standards Council recommends that merchants first get rid of any stored payment card data that isn’t truly required for the business. This practice reduces the number of places where payment card numbers are stored and also narrows the scope of compliance audits.

Tokenization takes that footprint reduction concept one step further while adding another level of security.

Unlike traditional encryption methods, in which the encrypted data is stored in databases and applications throughout the enterprise, tokenization substitutes a token—or surrogate value—in place of the original data. Under the PCI DSS, encrypted payment-card data is considered to be in scope for audit purposes.

By limiting occurrences of encrypted data to a central vault, organizations can reduce the number of systems, applications and processes that must be audited for compliance with PCI DSS. This can dramatically reduce the time and cost required to pass annual compliance audits.

With the newest form of tokenization, called format-preserving tokenization, the token uses the same amount of storage as the original clear text data (data transferred or stored without cryptographic protection) instead of the larger amount of storage required by encrypted data.

Because a token is not mathematically derived from the original data, it is arguably even safer than cipher text (encrypted data that’s unreadable until it’s been decrypted with a key). Authorized applications that need the original data can retrieve it only by presenting a token issued by the token server, which provides an extra layer of protection for sensitive information.
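To make the model concrete, here is a minimal sketch of a token vault in Python. The class name, the use of the open-source cryptography library’s Fernet cipher, and the random hexadecimal tokens are illustrative assumptions for this article, not a description of any vendor’s product; a production token server would add access controls, auditing and durable storage.

```python
# Minimal token-vault sketch (illustrative only, not a vendor implementation).
# The vault holds the only ciphertext; spoke systems see nothing but tokens.
import secrets
from cryptography.fernet import Fernet  # third-party package: cryptography

class TokenVault:
    """Central vault mapping random tokens to encrypted card numbers (PANs)."""

    def __init__(self):
        self._cipher = Fernet(Fernet.generate_key())  # key lives only in the vault
        self._store = {}                              # token -> ciphertext

    def tokenize(self, pan: str) -> str:
        # The token is random, not mathematically derived from the card number.
        token = secrets.token_hex(8)
        self._store[token] = self._cipher.encrypt(pan.encode())
        return token

    def detokenize(self, token: str) -> str:
        # Only authorized callers with access to the vault can recover the PAN.
        return self._cipher.decrypt(self._store[token]).decode()

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # surrogate value handed to spoke applications
print(vault.detokenize(token))  # original number, retrievable only via the vault
```

Because spoke systems hold only the surrogate value, compromising one of them yields nothing that can be reversed into a card number without access to the vault.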

Compared to traditional encryption, a tokenization architecture can also reduce data-storage requirements and preserve storage space on data-collection computers.

Tokenization can be used to reduce the number of points where sensitive data is stored within an enterprise, making it easier to manage and secure. Centralizing encrypted-data storage in a single location within the enterprise eliminates points of risk and simplifies security management.

What’s more, for companies that are also collecting and storing other types of customer information, such as consumer loyalty data, tokenization is just as effective in protecting personally identifiable information.

In the traditional data encryption model, data is either encrypted at the stores and stored there, or encrypted at headquarters and distributed back out to the stores.

With the tokenization model, encrypted data is stored in a central data vault and tokens replace the corresponding cipher text in applications available to the stores.

This reduces the number of places where cipher text resides throughout the enterprise. It also lowers risk, because encrypted data resides only in the central data vault until it is needed by authorized applications and employees.

In the second scenario, the focus is on using only tokens in spoke applications, thereby reducing the scope of a PCI DSS audit. In this case, employees need only a format-preserving token, which provides enough insight for them to perform their jobs.

For instance, the token can preserve the last four digits of the credit card number. In the traditional encryption model, cipher text resides on machines throughout the organization, and all of these machines are candidates for a PCI DSS audit. In the centralized tokenization model, many of the spokes can use tokens in place of cipher text, which takes those systems out of scope for the audit.
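The following sketch shows one way such a format-preserving token could be generated: it keeps the length, the digits-only character set and the last four digits of the original number, while the rest is random. This is an illustrative assumption about how a format-preserving token might look, not a published algorithm.

```python
# Illustrative format-preserving token: same length and character set as the
# original card number, with the last four digits retained for business use.
import secrets
import string

def format_preserving_token(pan: str) -> str:
    """Return a random, digits-only surrogate that keeps the last four digits."""
    random_body = "".join(secrets.choice(string.digits) for _ in range(len(pan) - 4))
    return random_body + pan[-4:]

print(format_preserving_token("4111111111111111"))  # e.g. 8203457712341111
```

Because the surrogate fits the same field lengths and validation rules as a real card number, spoke applications can store and display it without schema changes, which is what allows them to drop out of audit scope.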

Format-preserving tokenization is ideal for some merchants, while a hybrid approach using strong local encryption and tokenization is better for others. Localized encryption is the default when stores are not always connected to a central data vault. In instances where stores are electronically connected to the data vault, tokenization is often the preferred method.

For many merchants, using a combination of localized encryption and tokenization is a practical approach for improving data security.
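As a rough illustration of that hybrid approach, the hypothetical helper below tokenizes through the central vault when the store is connected and falls back to strong local encryption when it is not. The function name and the simple connectivity flag are assumptions made for this sketch, reusing the vault and Fernet cipher objects from the earlier examples.

```python
# Hypothetical hybrid flow: tokenize when the vault is reachable, otherwise
# encrypt locally until the record can be tokenized centrally.
from cryptography.fernet import Fernet

def protect_card_number(pan: str, vault, local_cipher: Fernet, vault_reachable: bool) -> str:
    if vault_reachable:
        return vault.tokenize(pan)                      # only a token stays in the store
    return local_cipher.encrypt(pan.encode()).decode()  # offline fallback: local ciphertext
```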

Gary Palgon is vice president-product management for data security software and services provider nuBridges.