
How to share sensitive data efficiently? Tokenize it!

Tokenization is gaining traction as a method to protect sensitive data and lower cybersecurity risks.

Tietoevry at your service / November 08, 2018

Interest is rising due to GDPR requirements, for instance. What are the benefits of tokenization, and how does it compare to encryption?

In the payment industry, tokenization is a common and well-known way to protect transactions and payment data of point-of-sales systems and applications. Recently, the method has started to spread to other industries and new use cases, especially in the cloud.

Interestingly, tokenization is an ancient concept. It is now being applied to new problems, either to replace or to complement other means of protecting data.

Tokenization is a way to obfuscate sensitive data such as a social security number by converting it into a random string of characters. That string is called a token. 

Tokens are stored in a database called a token vault, which maintains the relationship between the original piece of data and the token. Whenever someone tries to access tokenized data, the token handler verifies that they have the right to read the detokenized data.
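
As a minimal sketch of the idea, a token vault can be little more than a lookup table keyed by random strings. The class and method names below are illustrative, not a standard API:

```python
# Minimal, illustrative token vault. All names (TokenVault, tokenize,
# detokenize) are hypothetical sketches, not a real product's API.
import secrets


class TokenVault:
    """Maps random tokens to the original sensitive values."""

    def __init__(self):
        self._vault = {}      # token -> original value
        self._reverse = {}    # original value -> token (reuse existing tokens)

    def tokenize(self, value: str) -> str:
        if value in self._reverse:
            return self._reverse[value]
        token = secrets.token_urlsafe(16)  # random string, unrelated to value
        self._vault[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str, caller_is_authorized: bool) -> str:
        # The token handler checks access rights before revealing real data.
        if not caller_is_authorized:
            raise PermissionError("caller may not read detokenized data")
        return self._vault[token]


vault = TokenVault()
token = vault.tokenize("311280-888Y")  # e.g. a social security number
print(token)                           # a random string, safe to pass around
print(vault.detokenize(token, caller_is_authorized=True))
```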

Tokenization ≠ encryption

Some might ask if tokenization is just a version of encryption. It isn’t.

Encryption obfuscates data with a complex mathematical algorithm. In symmetric-key encryption, the same key can also be used to return the ciphered data to its plain-text value.
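
For comparison, here is a minimal symmetric-key example in Python, assuming the third-party cryptography package is available:

```python
# Symmetric encryption: the same key both encrypts and decrypts.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # anyone holding this key can decrypt
f = Fernet(key)

ciphertext = f.encrypt(b"311280-888Y")
print(ciphertext)               # unreadable, but mathematically tied to the key
print(f.decrypt(ciphertext))    # b'311280-888Y': fully reversible with the key
```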

Benefits of tokenization:

  • No way to reverse engineer tokens back to the original data.
  • Stolen tokens pose no risk to data security.
  • Low demand for computing power.
  • Suitable even for IoT security.

Tokenization involves no keys, only random characters. If an intruder steals tokens, they are useless and harmless, because there is no way to reverse engineer the real data behind a token. This limits the damage of a data breach.

Additionally, encryption is always a computationally intensive process, whereas tokenization requires very little processing power and is very fast. That makes tokenization useful even in IoT security, because typical IoT devices have weak processors and must keep power consumption low.
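
To make the contrast concrete: generating a token is nothing more than drawing secure random characters, as in this minimal sketch:

```python
# A token is just securely generated randomness - no cipher rounds or key
# schedules - which is why it suits constrained IoT hardware.
import secrets

token = secrets.token_hex(8)  # 16 random hexadecimal characters
print(token)                  # bears no mathematical relation to any data
```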

Analyze data assets with little risk

For organizations, tokenization offers a number of benefits, especially for protecting personal data in the cloud.

It may be used either as an alternative to encryption or in combination with it, depending on the requirements of the use case or regulation. In many cases, neither encryption nor tokenization alone is enough or even a viable solution.
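
As a rough sketch of such a hybrid approach, the vault contents can themselves be encrypted at rest while applications only ever see tokens. All names here are illustrative, and the example again assumes the third-party cryptography package:

```python
# Hybrid sketch: applications handle tokens; the vault that maps tokens
# back to real values is itself encrypted at rest.
import secrets
from cryptography.fernet import Fernet

vault_key = Fernet(Fernet.generate_key())  # key protecting the vault at rest
encrypted_vault = {}                       # token -> encrypted original value

def tokenize(value: str) -> str:
    token = secrets.token_urlsafe(16)
    encrypted_vault[token] = vault_key.encrypt(value.encode())
    return token

def detokenize(token: str) -> str:
    return vault_key.decrypt(encrypted_vault[token]).decode()

record_token = tokenize("311280-888Y")  # e.g. a social security number
print(record_token)                     # safe for applications and logs
print(detokenize(record_token))         # needs both the vault and the key
```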

Four use cases drive tokenization at the moment:

  • GDPR: tokenization helps ensure compliance and minimize data breach risks.
  • Cloud: tokenization helps protect identities in cloud services.
  • Analytics: tokenization provides a method to pseudonymize big data assets containing personal data and thus enable processing.
  • Application development outsourcing: tokenized data is easier to hand over to outsourced developers.

GDPR is the main factor that has brought tokenization into the discussion during the last couple of years. Tokenization provides a relatively simple method to obfuscate personal data and thus fulfil regulatory requirements.

Cloud usage may be the single biggest driver of tokenization. Many organizations hold data about their staff and/or customers that is partly sensitive and identifies individuals, yet they also have to use that data in cloud-based applications and transactions. A secure way to use such data is to convert the sensitive fields into tokens before transmitting it to the cloud. Tokenization can be performed by in-house applications or by tokenization-as-a-service platforms offered by some cloud security vendors.
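
A minimal sketch of that pattern, with hypothetical field names and an in-memory dictionary standing in for the on-premises vault:

```python
# Illustrative sketch: replace sensitive fields with random tokens before
# a record leaves the organization. Field names are made up for the example.
import secrets

SENSITIVE_FIELDS = {"name", "ssn", "email"}
vault = {}  # token -> original value, kept on-premises

def tokenize_record(record: dict) -> dict:
    safe = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS:
            token = secrets.token_urlsafe(12)
            vault[token] = value
            safe[field] = token
        else:
            safe[field] = value
    return safe

employee = {"name": "Maija M", "ssn": "311280-888Y", "department": "Sales"}
print(tokenize_record(employee))  # now safe to transmit to a cloud service
```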

A third big driver is related to both GDPR and the cloud. Organizations know that their massive assets of personal and enterprise data would provide valuable insight for reaching their business targets, but analyzing such data requires careful risk management. If the sensitive data is converted into tokens, it no longer identifies individuals on its own, and it can be analyzed en masse with minimal risk.
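
As an illustration with made-up data, giving each person a stable token keeps aggregation possible while the mapping back to identities stays in the on-premises vault:

```python
# Pseudonymization sketch: a consistent token per person preserves the
# ability to group and aggregate without exposing who is who.
import secrets

purchases = [
    {"customer": "Anna", "amount": 120},
    {"customer": "Ben", "amount": 80},
    {"customer": "Anna", "amount": 40},
]

tokens = {}  # customer -> stable token (this mapping never leaves the vault)

def pseudonymize(row: dict) -> dict:
    name = row["customer"]
    if name not in tokens:
        tokens[name] = secrets.token_hex(6)
    return {**row, "customer": tokens[name]}

pseudo = [pseudonymize(r) for r in purchases]

# Per-customer aggregation still works on the tokenized data:
totals = {}
for row in pseudo:
    totals[row["customer"]] = totals.get(row["customer"], 0) + row["amount"]
print(totals)
```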

Finally, application development often requires data assets, even sensitive ones. But trust becomes a big headache when development is outsourced, as it usually is. Once the sensitive data is tokenized, handing it over is much safer.

It’s easy to understand why the use of tokenization is increasing. This efficient method has certain drawbacks, but organizations can benefit from a hybrid strategy of tokenization and encryption to simplify cybersecurity – and to gain more value from the data assets.

Are you interested in learning more about protecting your assets with tokenization? Contact TietoEVRY Cybersecurity experts for more information.


This blog post was originally written by Mikko Peltonen, Tieto alumni and Head of Digital Risks & Cyber, CISSP at If Insurance.

Maria Nordgren
Tietoevry alumni
Tietoevry at your service
