Payment card tokenization has undergone a fundamental architectural shift over the past decade. What began as a PCI DSS scope reduction mechanism — replacing the Primary Account Number with a merchant-specific surrogate value — has evolved into a network-level infrastructure operated by Visa, Mastercard, and American Express that changes the mechanics of card-not-present payment authentication entirely. Network tokens, provisioned by the card networks through their token service platforms (Visa Token Service, Mastercard Digital Enablement Service), are now the default payment credential for digital commerce at scale. EMV 3DS provides the authentication layer. Together, they represent a significant re-architecture of how card payments work that affects every participant in the payment ecosystem: issuers, acquirers, payment service providers, and merchants.
Network Tokens vs Gateway Tokens
There are two distinct tokenization architectures in common use, and conflating them creates both technical and compliance misunderstandings. Gateway tokenization — the original model, adopted by payment gateways from roughly 2010 onwards — replaces the PAN with a gateway-specific token that the merchant stores in its place. The gateway maintains the mapping between its token and the real PAN. This reduces PCI DSS scope for the merchant (the stored token is not itself cardholder data) but does not remove the PAN from the payment ecosystem: the gateway still handles PANs, and the card network transaction still uses the PAN.
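The gateway model can be sketched in a few lines. This is illustrative only: the class name, token prefix, and in-memory dict are assumptions, and a real vault encrypts PANs at rest inside the gateway's cardholder data environment.

```python
import secrets

# Minimal sketch of a gateway-side token vault (illustrative only).
class GatewayTokenVault:
    def __init__(self):
        self._token_to_pan = {}  # token -> PAN mapping held by the gateway

    def tokenize(self, pan: str) -> str:
        # Opaque, gateway-specific token; carries no card data itself.
        token = "tok_" + secrets.token_hex(12)
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Called by the gateway when forwarding the transaction to the
        # network -- the network still sees the real PAN in this model.
        return self._token_to_pan[token]

vault = GatewayTokenVault()
t = vault.tokenize("4111111111111111")
assert vault.detokenize(t) == "4111111111111111"
```

The key property is that de-tokenisation happens at the gateway, so the PAN re-enters the transaction flow before it ever reaches the card network.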
Network tokenization is fundamentally different. The card network provisions a token — a Luhn-valid number formatted like a PAN, typically 16 digits, bound through domain restrictions to a specific device or merchant — directly to the merchant or PSP. When a network token is used in a transaction, it travels through the payment network and is de-tokenised by the network before reaching the issuer. The issuer processes the transaction against the real PAN. The merchant, PSP, and acquirer never handle the real PAN. For participating merchants, this can substantially reduce or eliminate PCI DSS scope for card-not-present transactions.
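Because network tokens are formatted like PANs, they pass the same Luhn check that card numbers do. A minimal implementation of that check:

```python
def luhn_valid(number: str) -> bool:
    """Check whether a candidate PAN or network token passes the Luhn check."""
    digits = [int(d) for d in number]
    # Double every second digit from the right; subtract 9 if it exceeds 9.
    for i in range(len(digits) - 2, -1, -2):
        digits[i] *= 2
        if digits[i] > 9:
            digits[i] -= 9
    return sum(digits) % 10 == 0

print(luhn_valid("4111111111111111"))  # True: Luhn-valid test PAN
print(luhn_valid("4111111111111112"))  # False
```

This is why token-aware systems cannot distinguish tokens from PANs by format alone; they rely on the network's designated token BIN ranges instead.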
The scope reduction benefits of network tokenization are real but not automatic. A merchant that stores network tokens must still ensure that its token storage environment does not receive PANs during the provisioning or authorization flows. If the merchant's payment integration receives a PAN at any point — even temporarily, in memory, before substituting the network token — that system is in PCI DSS scope. The scope reduction only materialises when the integration is designed from the outset to receive network tokens and never PANs.
EMV 3DS Authentication Architecture
EMV 3DS (3-D Secure 2.x) provides the cardholder authentication protocol that makes network-tokenized card-not-present transactions both secure and frictionless. The protocol operates through three components: the 3DS Server (operated by the merchant or PSP), the Directory Server (operated by the card network), and the Access Control Server (operated by the issuer). For each transaction, the 3DS Server collects a rich set of device and transaction data — device fingerprint, browser characteristics, transaction amount and merchant category, prior transaction history if available — and sends it to the Directory Server, which routes it to the issuer's ACS.
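The message the 3DS Server sends is the AReq. A sketch of an illustrative subset of its fields is below; the field names follow the EMV 3DS message conventions, but this is not a complete or schema-validated message, and the values are invented.

```python
# Illustrative subset of an EMV 3DS 2.x AReq (authentication request)
# payload assembled by the 3DS Server. A sketch, not a complete message.
areq = {
    "messageType": "AReq",
    "messageVersion": "2.2.0",
    "deviceChannel": "02",             # 02 = browser-based transaction
    "messageCategory": "01",           # 01 = payment authentication
    "acctNumber": "4111111111111111",  # PAN or network token
    "purchaseAmount": "14999",         # minor units, i.e. 149.99
    "purchaseCurrency": "978",         # ISO 4217 numeric code (EUR)
    "mcc": "5812",                     # merchant category code
    "browserUserAgent": "Mozilla/5.0 ...",
    "browserJavaEnabled": False,
}
```

The breadth of these fields, compared with the single static password of the original 3DS, is what gives the issuer's ACS enough signal to risk-score without challenging.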
The issuer's ACS applies risk-based authentication to decide whether to challenge the cardholder (requiring biometric, OTP, or other verification) or to approve the transaction frictionlessly. The data richness of 3DS 2.x compared with the original 3DS protocol is what makes frictionless authentication viable: the ACS has enough context about the transaction to distinguish low-risk legitimate transactions from potentially fraudulent ones without presenting a challenge that degrades the customer experience. In well-implemented 3DS 2.x deployments, challenge rates below 10% are achievable while maintaining fraud rates below industry benchmarks.
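The shape of that risk-based decision can be sketched as a scoring function. The weights, threshold, and inputs here are invented for illustration; real ACS deployments use issuer-specific models over far richer data than three features.

```python
# Hypothetical risk-based authentication decision at the issuer's ACS.
# Weights and threshold are invented for illustration only.
def acs_decision(amount_minor: int, device_known: bool,
                 prior_txn_count: int) -> str:
    score = 0
    score += 2 if amount_minor > 50_000 else 0  # high-value transaction
    score += 0 if device_known else 2           # unrecognised device
    score += 0 if prior_txn_count > 3 else 1    # thin history at merchant
    return "challenge" if score >= 3 else "frictionless"

print(acs_decision(1_999, device_known=True, prior_txn_count=10))   # frictionless
print(acs_decision(99_900, device_known=False, prior_txn_count=0))  # challenge
```

The design goal is asymmetric: reserve the challenge (OTP, biometric) for the minority of transactions where the accumulated signals genuinely warrant friction.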
Token Lifecycle Management
Network tokens require lifecycle management that adds engineering complexity beyond the initial provisioning flow. Tokens have an associated cryptogram (TAVV — Token Authentication Verification Value) that must be generated for each transaction and validated by the issuer, ensuring that the token cannot be replayed across transactions. Token domain restrictions bind the token to a specific merchant, device type, or channel, so a token provisioned for a merchant's app cannot be used in a browser context. Tokens must be updated when the underlying card account changes — reissue, expiry, or account number change — through a token update notification that the TSP sends to the merchant's token requestor.
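The merchant-side (token requestor) handling of a TSP update notification can be sketched as follows. The notification shape, event names, and field names are assumptions for illustration; actual payloads are defined by each token service platform (VTS, MDES).

```python
from dataclasses import dataclass

@dataclass
class StoredToken:
    token_value: str
    expiry: str   # MMYY
    status: str   # e.g. ACTIVE, DEACTIVATED

# Hypothetical handler for a TSP token update notification.
def apply_token_update(store: dict, notification: dict) -> None:
    tok = store.get(notification["tokenReferenceId"])
    if tok is None:
        return
    event = notification["event"]
    if event == "TOKEN_UPDATED":       # e.g. underlying card reissued
        tok.expiry = notification["newExpiry"]
    elif event == "TOKEN_DEACTIVATED": # stop using for new charges
        tok.status = "DEACTIVATED"

store = {"ref-1": StoredToken("4000001234567899", "1224", "ACTIVE")}
apply_token_update(store, {"tokenReferenceId": "ref-1",
                           "event": "TOKEN_UPDATED", "newExpiry": "1228"})
assert store["ref-1"].expiry == "1228"
```

This lifecycle channel is one of the headline benefits for merchants with cards on file: the token survives card reissue without the customer re-entering details.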
Issuers must build or configure their authorisation processing to handle network tokens natively, recognising incoming token PANs, requesting de-tokenisation from the network, and validating the TAVV cryptogram before approving the transaction. Issuers that have not implemented native token support rely on the network's fall-back behaviour of presenting the real PAN to the issuer — which works but loses the enhanced authentication data that the token transaction carries, reducing the issuer's ability to apply accurate risk scoring.
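The issuer-side decision path described above can be sketched as a function. The helper names (`is_token_bin`, `detokenize`, `validate_tavv`, `approve`) are assumptions standing in for the issuer's real auth-host integrations with the network's token service.

```python
from typing import Callable, Optional

# Sketch of the issuer-side path for an incoming authorization that
# may carry a network token. Helper callables are hypothetical stubs.
def authorize(incoming_pan: str,
              cryptogram: Optional[str],
              is_token_bin: Callable[[str], bool],
              detokenize: Callable[[str], str],
              validate_tavv: Callable[[str, str], bool],
              approve: Callable[[str], bool]) -> bool:
    if is_token_bin(incoming_pan):
        # Token transaction: require a valid per-transaction TAVV
        # cryptogram, then de-tokenise via the network before approval.
        if cryptogram is None or not validate_tavv(incoming_pan, cryptogram):
            return False
        return approve(detokenize(incoming_pan))
    # Plain PAN transaction: no token cryptogram to validate.
    return approve(incoming_pan)

# Stubbed usage: tokens in the 400000 range map back to a real PAN.
ok = authorize("4000001234567899", "AAA=",
               is_token_bin=lambda p: p.startswith("400000"),
               detokenize=lambda p: "4111111111111111",
               validate_tavv=lambda p, c: c == "AAA=",
               approve=lambda pan: True)
assert ok
```

The TAVV check is what prevents replay: even a captured token is useless without a fresh, issuer-validated cryptogram for each transaction.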
PCI DSS Implications for Each Ecosystem Participant
The PCI DSS implications of network tokenization differ for each participant. Merchants that implement network tokenization through a compliant TSP integration and never receive PANs can seek confirmation from their QSA that the relevant systems are out of scope. PSPs and payment gateways that handle network tokens on behalf of merchants must be assessed against PCI DSS but may qualify for reduced scope if their integration architecture ensures that PANs are not present in the tokenized transaction flow. Acquirers and issuers remain in full PCI DSS scope because they participate in the de-tokenisation process, but their scope is defined by their existing card processing infrastructure rather than expanded by the token architecture. Understanding the scope implications for each participant's specific integration pattern requires a detailed review with a qualified security assessor — generalised statements about scope reduction should be treated with caution until the specific data flows have been validated.
The engineering behind this article is available as a service.
We have done this work — not advised on it, not reviewed documentation about it. If the problem in this article is your problem, the first call is with a senior engineer who has solved it.