What is Tokenization?

Tokenization protects sensitive data by substituting a non-sensitive equivalent, called a token. The token is unrecognizable as the original value but preserves its format. For example, a credit card number (1234-5678-1234-5678), when tokenized (2754-7529-6654-1987), looks similar to the original and can be used in many operations that expect data in that format, without the risk of linking it to the cardholder’s personal information. Because the token has the same size and format as the original value, it can be stored in its place with no changes to database schemas or processes.
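The vault-based approach described above can be sketched in a few lines of Python. This is a minimal illustration, not a production design: the function names (`tokenize_card`, `detokenize`) and the in-memory dictionary standing in for the token vault are assumptions for the example. A real system would use a hardened, access-controlled vault or a certified tokenization service.

```python
import secrets

_vault = {}  # token -> original value; stands in for a secure token vault

def tokenize_card(pan: str) -> str:
    """Replace each digit with a random digit, preserving separators."""
    token_digits = iter(str(secrets.randbelow(10)) for c in pan if c.isdigit())
    token = "".join(next(token_digits) if c.isdigit() else c for c in pan)
    _vault[token] = pan  # only the vault can map the token back
    return token

def detokenize(token: str) -> str:
    """Look the original value back up from the vault."""
    return _vault[token]

token = tokenize_card("1234-5678-1234-5678")
# The token has the same length and layout as the original,
# so existing schemas and format checks continue to work.
assert len(token) == len("1234-5678-1234-5678")
assert detokenize(token) == "1234-5678-1234-5678"
```

Note that the token is generated randomly rather than derived from the card number, which is what prevents anyone without vault access from recovering the original value.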

Data tokenization allows you to maintain control and compliance when moving data to cloud, big data, and outsourced environments.

If the data being stored does not have this kind of structure – for example, text files, PDFs, or MP3s – tokenization is not an appropriate form of pseudonymisation. Instead, file-system-level encryption would be appropriate: it transforms the original block of data into an encrypted version of that data.
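The contrast with tokenization can be shown with a toy example. The stream-cipher-style scheme below is illustrative only (a real deployment would use a vetted library such as `cryptography`, or OS-level file encryption); it demonstrates that encryption, unlike tokenization, does not preserve the size or format of the input.

```python
import hashlib
import os

# Toy encryption: XOR the plaintext with a SHA-256-derived keystream.
# FOR ILLUSTRATION ONLY - not a vetted cipher.

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)  # fresh nonce per message
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    nonce, body = ciphertext[:16], ciphertext[16:]
    ks = _keystream(key, nonce, len(body))
    return bytes(a ^ b for a, b in zip(body, ks))

key = os.urandom(32)
ct = encrypt(key, b"1234-5678-1234-5678")
# The ciphertext is longer than the input and is arbitrary bytes,
# so it could not be stored in a column expecting a card number.
assert len(ct) > len(b"1234-5678-1234-5678")
assert decrypt(key, ct) == b"1234-5678-1234-5678"
```

This is why encryption suits unstructured files, where nothing downstream depends on the data's format, while tokenization suits structured fields that must keep their shape.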
