What Is Tokenization and How Can It Help Simplify PCI Compliance?

April 26, 2023
Author

Anna Fitzgerald

Senior Content Marketing Manager at Secureframe

Reviewer

Marc Rubbinaccio

Manager of Compliance at Secureframe

PCI compliance involves a dozen requirements and more than 300 security controls, ranging from network security to data encryption. For most companies, achieving and maintaining compliance is a difficult and expensive process — but it doesn’t have to be.

In this post, we’ll explain how tokenization can help streamline your PCI DSS compliance efforts.

We’ll be using insights from the Secureframe Expert Insights webinar held on March 9 featuring Secureframe compliance expert Marc Rubbinaccio, CISSP, CISA and Basis Theory co-founder and COO Brian Billingsley. For all their tips on how to simplify PCI compliance with tokenization and automation, watch the video replay on demand.

What is PCI tokenization?

Tokenization refers to the process of exchanging sensitive data for a non-sensitive, non-exploitable identifier (a “token”). 

Some merchants that must comply with PCI DSS use tokenization to reduce or eliminate the need to retain cardholder data (CHD) in their environment once the initial transactions have been processed. Replacing CHD with tokens reduces the amount of cardholder data in the environment and the number of system components for which PCI DSS requirements apply, which can help simplify compliance.
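To make this concrete, here’s a minimal sketch of what swapping a stored PAN for a token might look like. The field names and token format below are purely illustrative, not any particular provider’s:

```python
# Illustrative only: how a stored order record changes when the PAN is
# replaced by a provider-issued token. Field names and the token format
# are hypothetical, not any specific provider's.

order_before_tokenization = {
    "order_id": "ORD-1001",
    "amount_usd": "49.99",
    "pan": "4111111111111111",          # real cardholder data -> keeps this system in PCI scope
}

order_after_tokenization = {
    "order_id": "ORD-1001",
    "amount_usd": "49.99",
    "card_token": "tok_8f3c2e9a5b7d",   # non-sensitive reference -> shrinks the scope
}
```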

By working with a tokenization service provider, organizations pursuing PCI compliance can keep supporting the same business processes that rely on storing, processing, and transmitting cardholder data without being responsible for securing that data themselves. They transfer this risk to the service provider instead.

It’s important to note that PCI DSS requirement 3.2 does not permit storage of sensitive authentication data after authorization, including magnetic stripe data (or the equivalent on a chip), CAV2/CVC2/CVV2/CID values, and PINs/PIN blocks, unless you also perform issuing services.

Does PCI require tokenization?

No, PCI does not require tokenization for CHD storage. However, the use of tokenization can reduce the overall PCI scope and, in turn, the amount of effort required to comply with PCI DSS requirements. It is therefore an increasingly common data security strategy for organizations pursuing PCI DSS compliance.

How is tokenization different from encryption?

Tokenization and encryption are both methods of securing sensitive data, but they have key differences. 

Encryption converts readable sensitive data (plaintext) into unreadable text (ciphertext). The purpose of encryption is to make sensitive data unreadable to unauthorized individuals. Only authorized users with the decryption key should be able to convert ciphertext back into its readable form. Since encryption requires the use and management of keys, organizations need to put strong key management processes in place. Without them, decryption keys could fall into the hands of individuals who should not be accessing sensitive data.
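For illustration, here’s a minimal sketch of symmetric encryption using Python’s cryptography library. It isn’t a PCI-approved implementation; the point is simply that the same key that protects the data can also reveal it:

```python
# A minimal sketch of symmetric encryption with the "cryptography" package
# (pip install cryptography). Anyone holding this key can reverse the
# ciphertext, which is why strong key management is essential.
from cryptography.fernet import Fernet

key = Fernet.generate_key()       # must be generated, stored, and rotated securely
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"4111111111111111")  # plaintext PAN -> unreadable ciphertext
plaintext = cipher.decrypt(ciphertext)            # reversible by anyone who obtains the key
```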

Key management is one of the biggest challenges of keeping encrypted data secure. Tokenization-as-a-service, by contrast, does not require the organization to manage encryption keys at all.

Tokenization replaces sensitive data with unique, non-sensitive tokens, which reduces the risk of an attacker accessing the actual sensitive data. Tokens can be stored, processed, and transmitted without exposing any sensitive information. In the event of a data breach, for example, an attacker would find only non-reversible tokens being processed, stored, or transmitted — not the sensitive data itself.
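Conceptually, a token is just a random reference that only the provider’s vault can map back to the original value. Here’s a bare-bones sketch of that idea (not any provider’s actual implementation):

```python
# Conceptual sketch of a token vault: the token has no mathematical
# relationship to the PAN, so it cannot be "decrypted" -- it can only be
# looked up inside the provider's secured environment.
import secrets

vault = {}  # in a real service, a hardened, PCI DSS-compliant data store

def tokenize(pan: str) -> str:
    token = "tok_" + secrets.token_hex(8)  # random, non-exploitable identifier
    vault[token] = pan                     # the original PAN never leaves the vault
    return token

def detokenize(token: str) -> str:
    return vault[token]                    # only possible with access to the vault

token = tokenize("4111111111111111")
print(token)  # e.g. tok_1a2b3c4d5e6f7a8b -- safe to store, log, and transmit
```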

What is the format of a PCI token?

Tokens can vary in format and can be generated using different methods based on the tokenization provider.

A token replacing a primary account number (PAN) will typically follow a structure similar to the PAN itself. It may be made up of numeric characters only, a mix of alphabetic and numeric characters, or the first and last digits of the original PAN with alphanumeric characters replacing the middle digits, among other formats.

[Image: Examples of token formats for a PAN]

Source: PCI SSC’s PCI DSS Tokenization Guidelines
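As a rough illustration, here’s how a token that keeps the first six and last four digits of a PAN might be generated, with random alphanumeric characters filling the middle. Real tokenization systems also guarantee uniqueness and protect the token-to-PAN mapping; this sketch only shows the format:

```python
# Illustrative sketch of a format-preserving token: keep the first six and
# last four digits of the PAN and replace the middle digits with random
# alphanumeric characters.
import secrets
import string

def format_preserving_token(pan: str) -> str:
    alphabet = string.ascii_uppercase + string.digits
    middle = "".join(secrets.choice(alphabet) for _ in range(len(pan) - 10))
    return pan[:6] + middle + pan[-4:]

print(format_preserving_token("4111111111111111"))  # e.g. 411111K3WP9A1111
```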

How does PCI tokenization work?

The tokenization process varies by platform but can typically be broken down into the following key steps.

To start, an application collects CHD, including the PAN, and passes it to the platform provided by the tokenization service provider. The platform then generates a unique token corresponding to the original payment data. The tokenization provider stores the sensitive data in a protected database and must follow PCI requirements regarding the storage of CHD.

The tokenization provider then sends the token back to the organization, which can now use it in the same way as actual payment data. When the organization needs to act on the underlying card data, it sends the token to the tokenization provider, which substitutes the original payment data for the token when processing the request.
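Putting those steps together, the round trip might look something like the sketch below. The base URL, endpoints, payload fields, and credentials are placeholders for a hypothetical provider API, not Basis Theory’s actual interface:

```python
# Hypothetical tokenize-then-charge round trip. The base URL, endpoints,
# payload fields, and credentials are placeholders, not a real provider's API.
import requests

PROVIDER_URL = "https://tokenization.example.com"  # placeholder provider
API_KEY = "sk_example_123"                         # placeholder credential
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

# Step 1: the application passes the PAN to the tokenization platform and
# receives a token in exchange; the PAN never lands in the app's own database.
resp = requests.post(
    f"{PROVIDER_URL}/tokens",
    json={"pan": "4111111111111111", "expiry": "12/27"},
    headers=HEADERS,
    timeout=10,
)
token = resp.json()["token"]

# Step 2: later, the application sends the token back to the provider, which
# swaps in the real payment data before forwarding the request to the
# payment processor.
charge = requests.post(
    f"{PROVIDER_URL}/proxy/charges",
    json={"card_token": token, "amount_cents": 4999, "currency": "usd"},
    headers=HEADERS,
    timeout=10,
)
print(charge.json())
```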

PCI DSS Tokenization Requirements

Because the tokenization system stores, processes, and/or transmits cardholder data, it must be PCI DSS compliant. The good news is that you can then use the tokenization provider’s Attestation of Compliance to meet many of the requirements related to cardholder data security.

Some key requirements the tokenization system must meet are:

  • Do not provide PAN in any response to any application, system, network, or user outside of the merchant’s defined cardholder data environment
  • Establish all tokenization components on secure internal networks that are isolated from any untrusted and out-of-scope networks
  • Design all tokenization components to strict configuration standards and protect them from vulnerabilities
  • Only permit trusted communications in and out of the tokenization system environment
  • Enforce strong cryptography and security protocols to safeguard cardholder data when stored and during transmission over open, public networks
  • Implement strong access controls and authentication measures in accordance with PCI DSS Requirements 7 and 8
  • Support a mechanism for secure deletion of cardholder data as required by a data-retention policy
  • Implement logging, monitoring, and alerting as appropriate to identify any suspicious activity and initiate response procedures

Is tokenization right for you and your PCI compliance journey?

Tokenization can be a good solution for you if you fall into one of the categories below:

  • Merchants that want to use card data to track transactions, run fraud screenings, and perform similar functions, and to share that data with multiple payment processors, all without actually touching the data or having it hit their systems or logs.
  • Service providers that need to display card data to customers (to provide them with a one-time-use card, for example) and therefore remain in scope for PCI DSS.

How Secureframe and Basis Theory can help you get and stay PCI compliant

Tokenization and automation can eliminate 90% of the time and effort involved with PCI compliance.  

A tokenization service provider like Basis Theory offers a complete PCI Level 1 infrastructure with everything you need to collect, secure, and use sensitive card data faster and with less effort.

Pair it with Secureframe’s security and privacy compliance automation platform and you can further streamline the PCI compliance process by automating PCI evidence collection, continuously monitoring your PCI controls, and getting help from on-staff experts to scope your engagement and perform a readiness assessment. Request a demo today.