

Tokenization, Misinformation, and the Benefits of Host Card Emulation

Tokenization is not a new concept. Yet, while the process has improved card security across a variety of industries, a great deal of misinformation still circulates about why it's important and how best to implement it.

Tokens, in their simplest form, are a low-value representation of something with a higher value: the actual card number is replaced with a substitute set of numbers or characters used to make payments. Tokenization reduces the security risks associated with transferring highly sensitive data, stored on mobile devices, over networks during the payment process.
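The core idea can be sketched in a few lines. The sketch below assumes a hypothetical in-memory vault that maps random tokens back to real card numbers; a production system would use a hardened, PCI-compliant vault service, but the low-value-stands-in-for-high-value principle is the same.

```python
import secrets

class TokenVault:
    """Hypothetical token vault: maps random tokens to real PANs."""

    def __init__(self):
        self._store = {}  # token -> PAN, held only inside the vault

    def tokenize(self, pan: str) -> str:
        # Issue a random 16-digit token that preserves the PAN's format,
        # so it can flow through systems that expect a card number.
        token = "".join(str(secrets.randbelow(10)) for _ in range(16))
        self._store[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map the token back to the real card number.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
```

A merchant or processor that is breached holds only the token, which is worthless without access to the vault.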

Traditional notions of tokenization argue for a one-size-fits-all approach, when in reality different transaction types and different industries have very specific needs that tokenization has evolved to address.

Static tokenization, the traditional form, replaces 16-digit card numbers with another 16-digit number. In essence, the consumer’s card is replaced with another card. Throughout the authorization process, that 16-digit token does not change. In some instances, these static tokens may be one of the least secure forms of tokenization.

Dynamic tokenization, on the other hand, continuously changes and recycles a token after each transaction. A truly dynamic PAN (primary account number) is one in which the processor must receive a newly tokenized number with each transaction. While this is potentially the more secure route, it is also more difficult to implement with legacy infrastructure.
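The contrast between the two models can be sketched directly. In this illustrative example, a static token is issued once and reused across transactions, while the dynamic model generates a fresh token for every transaction; the `new_token` helper is an assumption, standing in for whatever the issuer's token service provides.

```python
import secrets

def new_token() -> str:
    # Stand-in for an issuer's token service: a random 16-digit token.
    return "".join(str(secrets.randbelow(10)) for _ in range(16))

# Static: one token stands in for the card across its whole lifetime.
static_token = new_token()
static_tokens = [static_token for _ in range(3)]   # same value every time

# Dynamic: the processor must receive a freshly tokenized PAN each time.
dynamic_tokens = [new_token() for _ in range(3)]   # new value every time
```

If a static token is captured in transit, it can be replayed like a card number; a captured dynamic token has already been retired by the time an attacker tries to reuse it.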

In between static and dynamic lies semi-dynamic tokenization, which is not as well protected as dynamic tokenization but also not as exposed as static tokenization. To compensate for the missing dynamic factor, a cryptographic calculation, called a cryptogram, is included as part of each transaction. Cryptograms can include transaction-specific information such as the value and terminal input, boosting the level of protection and mitigating some of the deficiencies of a static-only token.

Apple, true to form, created a method of tokenization unique to its products: Apple Pay. Rather than receiving a card number, expiration date, and billing address from the customer, Apple Pay merchants receive only device-specific tokens and a dynamic security code. Apple works directly with card issuers, which translate the tokens into credit card numbers only once they reach the payment network. While this means only the consumer's bank and payment processor can access the data, the burden shifts to card networks like Visa and American Express to adopt a tokenization model.
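Returning to the semi-dynamic model described above: a cryptogram can be sketched as a keyed hash over the token plus transaction details. This is an illustrative simplification, not the EMV cryptogram algorithm; the field layout, key name, and truncation length are all assumptions.

```python
import hashlib
import hmac

def make_cryptogram(key: bytes, token: str, amount_cents: int,
                    terminal_id: str, counter: int) -> str:
    # Bind the static token to this specific transaction: amount,
    # terminal, and a per-transaction counter all feed the MAC.
    msg = f"{token}|{amount_cents}|{terminal_id}|{counter}".encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()[:16]

key = b"issuer-shared-secret"  # hypothetical issuer-provisioned key
c1 = make_cryptogram(key, "9876543210987654", 2500, "TERM-01", counter=1)
c2 = make_cryptogram(key, "9876543210987654", 2500, "TERM-01", counter=2)
```

Even though the token itself never changes, a cryptogram captured from one transaction is useless on the next, because the counter (and usually the amount and terminal) will differ.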

Android, on the other hand, requires more steps in the approval process: Android Pay has to coordinate with more parties, more often. In addition, Android's issuers are reliant on the network and provide only one form of tokenization across the board. This interdependence of the networks, combined with an off-the-shelf approach, gives retailers and banks no control over the product or its lifecycle.

It’s clear retailers and financial institutions leveraging certain third-party services can quickly back themselves into a difficult corner. The alternative, host card emulation (or HCE), eliminates the hardware dependencies and device-specific requirements of other tokenization models and puts the banks back in control.

Host card emulation allows banks to use their own network tokenization and host those tokens in their own cloud, encompassing a broader spectrum of approaches and facilitating a more flexible system that works for a wider variety of card issuers and financial institutions. Not only can HCE combine one tokenization approach with another, it can also support each issuer agnostically. Security improves, along with user experience, ease of implementation, and the brand identity unique to each bank and retailer.
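One way to picture the cloud-hosting piece: the issuer's cloud replenishes the phone app with a small batch of limited-use tokens, so the handset never stores the real PAN or a long-lived key. The sketch below is a hypothetical simplification of that replenish-and-redeem cycle; the class and method names are assumptions, not any vendor's API.

```python
import secrets

class IssuerCloud:
    """Hypothetical issuer cloud that provisions limited-use tokens."""

    def __init__(self):
        self._issued = {}  # token -> PAN, tracked only server-side

    def replenish(self, pan: str, count: int) -> list:
        # Provision a batch of single-use tokens for the phone app.
        tokens = [secrets.token_hex(8) for _ in range(count)]
        for t in tokens:
            self._issued[t] = pan
        return tokens

    def redeem(self, token: str) -> str:
        # Single-use: the token is consumed at authorization time,
        # so a replayed token raises KeyError.
        return self._issued.pop(token)

cloud = IssuerCloud()
phone_tokens = cloud.replenish("4111111111111111", 3)
pan = cloud.redeem(phone_tokens[0])
```

Because the bank runs the cloud, it controls the token lifecycle end to end, which is precisely the control the off-the-shelf wallet models take away.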

In the transaction cycle, the most important element flowing through the system is the card number. Banks and retailers that can cut through today's misinformation and adopt a more sophisticated tokenization solution will be better positioned to protect it.

Source: Digital Transactions