On July 14, 2010, Visa released version 1.0 of its Tokenization best practices document. This follows up on its previous publications on data field encryption. The release is brief, but it provides several interesting data points that merchants, providers, and practitioners should consider.
Visa breaks token systems down into four unique components:
- Token Generation
- Token Mapping
- Card Data Vault
- Cryptographic Key Management
A broad set of best practices is highlighted throughout the rest of the document and is well worth a review (Here is the direct link to the PDF). A few specific items stood out:
- The tokenization system must be segmented and “be subject to a full PCI DSS assessment” – note that this does not say QSA audit, but rather implies a confirmatory action by the owner of the system.
- The monitoring guidance calls for the ability to “detect malfunctions or anomalies and suspicious activities”, which is far more fraud-focused. In addition, there is mention of rate-limiting functions, which would be beneficial beyond the token-PAN mapping environment.
- The token must be designed so that it is “not computationally feasible” to recover the original PAN. A token may be created using EITHER a “known strong cryptographic algorithm” or a “one-way irreversible function”.
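As a rough illustration of the “one-way irreversible function” approach, the sketch below derives a token from a PAN using HMAC-SHA256 under a secret key. This is my own assumption of how such a function might look, not anything Visa specifies; the key handling, token format, and the retained last four digits are all illustrative choices.

```python
import hmac
import hashlib

def generate_token(pan: str, secret_key: bytes) -> str:
    """Derive a token from a PAN via a keyed one-way function (HMAC-SHA256).

    Without the secret key, recovering the PAN from the token is not
    computationally feasible. Hypothetical sketch only: the token format
    and the exposed last four digits are assumptions, not Visa guidance.
    """
    digest = hmac.new(secret_key, pan.encode("ascii"), hashlib.sha256).hexdigest()
    # Use part of the digest as the token body; keep the last four digits
    # of the PAN visible, a common (but here assumed) convention.
    return f"tok_{digest[:24]}_{pan[-4:]}"

# Illustrative only -- in practice the key would live in the card data
# vault's key-management system, per the fourth component above.
key = b"example-secret-key-from-vault"
token = generate_token("4111111111111111", key)
print(token)
```

Because the function is deterministic, the same PAN always maps to the same token, which is what makes the token-to-PAN mapping in the card data vault workable; rotating the key would change every token, which is why key management is called out as its own component.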
This publication is meant to provide high-level guidance, and Visa is seeking comments by August 31, 2010 (send messages to email@example.com with “Best Practices for Tokenization” in the subject). Visa has historically followed these high-level documents with more granular publications, so feedback and comment are important.
James DeLuccia IV