Even the widely used and trusted SHA-1 hashing algorithm, a one-way function long considered safe, was surprisingly broken a few years ago by researchers at a university in China.
We are seeing more and more organizations moving from
encryption to tokenization, as it is a more secure and cost-effective approach.
It is often the best way to minimize data security risk while
dealing with PCI compliance.
How Does Traditional Encryption Work?
The term “encryption” refers to the use of cryptographic
algorithms to render data unreadable unless the user possesses the appropriate
cryptographic ‘keys’ to decrypt the data.
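As a concrete illustration, here is a minimal sketch of symmetric encryption in Python using the third-party cryptography package (my choice for illustration; any comparable library would do): without the key the ciphertext is unreadable, and with it the original data is fully recoverable.

```python
from cryptography.fernet import Fernet

# The key, not the data, now becomes the thing that must be protected.
key = Fernet.generate_key()
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"4111111111111111")  # unreadable without the key
plaintext = cipher.decrypt(ciphertext)            # anyone holding the key can reverse it
print(plaintext)  # b'4111111111111111'
```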
Cryptographic keys must be treated with the
same care as the data itself, because a compromise of the keys results in a
compromise of the encrypted data. Encryption simply switches the object of
protection: from the data, where it is unencrypted, to the cryptographic keys,
where the data is encrypted. While this approach is practical
on the surface, encryption keys are still vulnerable to exposure, which can be
very dangerous, particularly in large enterprise environments.
Encryption also lacks versatility. Applications and
databases expect data of a specific type and length, but ciphertext rarely
matches the format of the original value; if the database field and the
ciphertext are incompatible, the data cannot be stored or processed without
remediation.
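A quick sketch (again using Fernet, purely as an illustration) shows the problem: a 16-character card number encrypts to a far longer base64 string, which will not fit a fixed-width numeric column without schema changes.

```python
from cryptography.fernet import Fernet

cipher = Fernet(Fernet.generate_key())
pan = b"4111111111111111"            # 16 numeric characters
ciphertext = cipher.encrypt(pan)

print(len(pan))         # 16
print(len(ciphertext))  # ~120 characters of base64 text - wrong type and length
# A CHAR(16) or NUMERIC column holding the card number cannot store this value,
# so every schema and application touching the field must be remediated.
```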
In addition, companies must ensure that the encryption
method they select is of sufficient strength. Increasing computing power and new
cryptanalytic research will require additional encryption strength over time.
Is Encryption Really Different from Tokenization?
Tokenization is sometimes described as a form of
cryptography; in practice, however, the two are different animals. Encryption
brings with it vulnerabilities caused by the key and the unchanging nature of
the algorithm. Tokenization, by contrast, works like a classic code system: it
relies on a codebook, a lookup table, to transform plaintext into code text.
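In code-system terms, a codebook is nothing more than a secret mapping; a toy illustration (the values here are invented):

```python
# A codebook maps each plaintext value to an unrelated code value.
# There is no algorithm connecting the two columns - only the table itself.
codebook = {"4111111111111111": "8245019637521111"}
decode_book = {code, plain} = {code: plain for plain, code in codebook.items()}

print(codebook["4111111111111111"])     # code text
print(decode_book["8245019637521111"])  # back to plaintext, via lookup only
```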
Format-preserving encryption may look like tokenization on
the surface: the ciphertext can match the original value in data
type and length. But if a malicious attack results in the capture of the key used
for the format-preserving encryption, the clear text can be derived with the
associated algorithm. A token, by contrast, cannot be reversed by the systems
interacting with the tokenized data, which is why those systems can remain out
of audit scope for PCI compliance.
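To make the distinction concrete, here is a deliberately toy "format-preserving" cipher (this is not real FPE such as FF1/FF3, and it is not secure; the scheme is invented purely for illustration): the ciphertext looks like a card number, yet anyone who captures the key and knows the algorithm can walk straight back to the plaintext.

```python
import hashlib
import hmac

def toy_fpe_encrypt(key: bytes, digits: str) -> str:
    """Toy 'format-preserving' cipher: shifts each digit by a keyed
    keystream value. Illustration only - do not use for real data."""
    stream = hmac.new(key, b"fpe-demo", hashlib.sha256).digest()
    return "".join(str((int(d) + stream[i % len(stream)]) % 10)
                   for i, d in enumerate(digits))

def toy_fpe_decrypt(key: bytes, digits: str) -> str:
    stream = hmac.new(key, b"fpe-demo", hashlib.sha256).digest()
    return "".join(str((int(d) - stream[i % len(stream)]) % 10)
                   for i, d in enumerate(digits))

key = b"captured secret key"
ct = toy_fpe_encrypt(key, "4111111111111111")
print(ct)                        # same length, digits only - looks like a PAN
print(toy_fpe_decrypt(key, ct))  # key + algorithm = plaintext recovered
```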
Tokenization approaches the problem from a completely
different angle. At the basic level, tokenization differs from encryption
in that it is based on randomness, not on a mathematical formula. It
eliminates keys entirely by replacing sensitive data with random tokens, so
thieves can do nothing with the data if they get it. A token
cannot be discerned or exploited, since the only way back to the original
value is the lookup table that connects the token with the
original (typically encrypted) value. There is no formula, only a lookup. A token by
definition will look like the original value in data type and length. These
properties enable it to travel through most applications, databases, and
other components without modification, resulting in greatly increased
transparency. This reduces remediation costs for the applications,
databases, and other components where sensitive data lives, because the
tokenized data matches the data type, length, and format of the original, and
it better protects data as it flows across systems thanks to that higher level
of transparency to business rules and formatting rules.
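A minimal in-memory sketch of a vault-based tokenizer shows these properties (all names here are hypothetical, and the vault stores plaintext for brevity; a production vault would be a hardened, access-controlled service that encrypts the stored originals): the token is random, matches the original's type and length, and can only be reversed through the lookup table.

```python
import secrets

class TokenVault:
    """Illustrative in-memory token vault - not a production design."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, pan: str) -> str:
        # Reuse the existing token if this value was already tokenized.
        if pan in self._value_to_token:
            return self._value_to_token[pan]
        # Generate a random token with the same length and data type
        # (digits only), keeping the last four digits as many schemes
        # do for display purposes.
        while True:
            body = "".join(secrets.choice("0123456789")
                           for _ in range(len(pan) - 4))
            token = body + pan[-4:]
            if token != pan and token not in self._token_to_value:
                break
        self._token_to_value[token] = pan
        self._value_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # The only way back to the original value is this lookup;
        # there is no key or formula to attack.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # e.g. 7302945816251111 - same type and length
print(vault.detokenize(token))  # 4111111111111111
```

Because the token is drawn at random, a system that holds only tokens has nothing to reverse-engineer, which is the basis for keeping such systems out of PCI audit scope.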
Is My Tokenization Technique Valid?
External validation of the token generation method is
needed, since homegrown and vendor tokenization solutions use a
range of different methods to generate tokens. Leading analysts advise
organizations not to develop homegrown tokenization solutions.
Organizations also need to know what questions to ask their vendors in
order to understand what is being offered.
Encryption algorithms require years of field
use to become accepted and validated solutions. A tokenization
implementation only needs to be validated by experts in the field.
Let’s opt for Tokenization over Encryption!