Papers tagged as tokenization
  1. Updatable Tokenization: Formal Definitions and Provably Secure Constructions 2017 FinancialCryptography tokenization fc17.ifca.ai
    Christian Cachin, Jan Camenisch, Eduarda Freire-Stoegbuchner, Anja Lehmann

    Tokenization is the process of consistently replacing sensitive elements, such as credit card numbers, with non-sensitive surrogate values. As tokenization is mandated for any organization storing credit card data, many practical solutions have been introduced and are in commercial operation today. However, all existing solutions are static, i.e., they do not allow for efficient updates of the cryptographic keys while maintaining the consistency of the tokens. This lack of updatability is a burden for most practical deployments, as cryptographic keys must also be re-keyed periodically to ensure continued security. This paper introduces a model for updatable tokenization with key evolution, in which a key exposure does not disclose relations among previously tokenized data, and where updates to the tokenized data set can be made by an untrusted entity while preserving the consistency of the data. We formally define the desired security properties, guaranteeing unlinkability of tokens across different time epochs and one-wayness of the tokenization process. Moreover, we construct two highly efficient updatable tokenization schemes and prove that they achieve our security notions.
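
    To make the abstract's idea concrete, here is a minimal, illustrative sketch of updatable tokenization, not the paper's actual constructions: tokens are computed as H(value)^k in a toy multiplicative group, and an update token delta = k_new / k_old (mod the group order) lets an untrusted storage host re-key stored tokens without seeing the underlying values. All names (keygen, tokenize, update) and the tiny group parameters are assumptions for illustration only.

    ```python
    import hashlib
    import math
    import secrets

    # Toy group: Z_p^* for the Mersenne prime p = 2^127 - 1.
    # A real deployment would use a standardized >=2048-bit group or an elliptic curve.
    P = 2**127 - 1
    ORDER = P - 1  # exponents are taken modulo the group order


    def keygen() -> int:
        """Sample an epoch key that is invertible modulo the group order."""
        while True:
            k = secrets.randbelow(ORDER - 2) + 2
            if math.gcd(k, ORDER) == 1:
                return k


    def hash_to_group(value: str) -> int:
        """Map a sensitive value (e.g. a card number) into the group."""
        digest = hashlib.sha256(value.encode()).digest()
        return int.from_bytes(digest, "big") % P


    def tokenize(value: str, epoch_key: int) -> int:
        """Deterministic token for the current epoch: H(value)^k mod p."""
        return pow(hash_to_group(value), epoch_key, P)


    def update_token_key(old_key: int, new_key: int) -> int:
        """Update value delta = k_new * k_old^{-1} mod (p-1)."""
        return (new_key * pow(old_key, -1, ORDER)) % ORDER


    def update(token: int, delta: int) -> int:
        """Run by the (untrusted) storage host: re-keys a stored token
        without ever learning the underlying value."""
        return pow(token, delta, P)


    # Usage: consistency within an epoch, re-keying across epochs.
    k1, k2 = keygen(), keygen()
    t1 = tokenize("4111111111111111", k1)
    assert t1 == tokenize("4111111111111111", k1)                 # consistent surrogate
    delta = update_token_key(k1, k2)
    assert update(t1, delta) == tokenize("4111111111111111", k2)  # updated token matches
    ```

    The sketch shows why the host can stay untrusted: it only ever sees tokens and the update value delta, which relates the two epoch keys without revealing either one. The formal one-wayness and cross-epoch unlinkability guarantees are what the paper defines and proves for its two constructions.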