Tokenization

A relatively new approach to removing sensitive data from business processes, applications, and user interactions is tokenization. This method is commonly presented as a way to reduce PCI DSS scope and lower the business risk associated with storing credit card numbers. Tokenization is the process of generating a representation of data, called a token, and inserting that token into the processes where the original data would otherwise be used. A database maps the original data to the token value, so the original can be retrieved when needed and every token corresponds to a real value.
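The sketch below illustrates the core idea with an in-memory mapping standing in for the vault database a real deployment would use; the names (TokenVault, tokenize, detokenize) are illustrative only, not taken from any particular product.

```python
import secrets


class TokenVault:
    """Minimal token vault sketch: random tokens mapped to original values."""

    def __init__(self):
        self._token_to_value = {}   # token -> original sensitive value
        self._value_to_token = {}   # original value -> token (reuse existing tokens)

    def tokenize(self, value: str) -> str:
        """Return the existing token for a value, or mint a new random one."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(16)      # random; carries no information about the value
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Look up the original value; only the vault can perform this mapping."""
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")     # test card number
print(token)                                   # opaque value passed to downstream systems
print(vault.detokenize(token))                 # original recovered only through the vault
```

Downstream systems handle only the token, so a compromise of those systems exposes nothing useful; the sensitive mapping lives solely in the vault.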

An example is a credit card number entered at the point of sale and then sent on for authorization. Once authorization occurs, there are only a few reasons the card number would need to be retained beyond the transaction. Since those reasons don't actually require the card number itself, a unique value such as a token can support business intelligence, fraud investigations, and card tracking while removing the sensitive data from the systems involved in transaction processing.

There is no formal standard for tokens, but one method to consider is format preservation: the token looks like the original data, in this example a credit card number, to all processes, applications, and users. This reduces the complexity of rewriting applications to handle a new format and avoids confusing the people who have to read the output. Tokenization, as with all data protection methods, has to be evaluated to determine whether it is the correct fit for the enterprise's use cases.
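A simple sketch of what format preservation can mean in this example appears below: the token keeps the length, digit-only format, and last four digits of the card number while the remaining digits are random. The function name is hypothetical, and a production scheme would typically use a vetted format-preserving encryption algorithm and check for token collisions rather than this simplified approach.

```python
import secrets


def format_preserving_token(pan: str) -> str:
    """Return a token with the same length and digit-only format as the card
    number, preserving the last four digits so reports and support staff can
    still reference the card; the remaining digits are random and unrelated
    to the original number."""
    last_four = pan[-4:]
    random_body = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    return random_body + last_four


pan = "4111111111111111"                      # test card number
token = format_preserving_token(pan)
print(len(token) == len(pan), token[-4:])     # same length, same last four digits
```

Because the token fits in the same fields and validation rules as a real card number, existing applications and databases can consume it without schema or interface changes.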
