When and how to migrate to Post-Quantum Cryptography (PQC)?


In recent years, cryptography has been widely used as a foundation of IT systems. For example, encryption protects information by making it appear as meaningless data even if the data flowing over a communication path or stored on a system is eavesdropped on or stolen by an attacker. Digital signatures, in turn, guarantee that data really originates from its claimed creator and has not been tampered with. Both encryption and digital signatures are typical cryptographic techniques that support today's society.
Meanwhile, significant progress has been made in quantum computer implementation technology. Quantum computers have the potential to complete, in a short time, calculations that could not previously be finished in any realistic amount of time, and they are expected to be applied in research areas such as artificial intelligence and drug discovery. Generally speaking, however, when a new technology emerges, so does the threat of attacks that exploit it. In other words, there is a fear that today's encryption will be broken by quantum computers in the future.
Attackers take a long-term view in their attempts to break encryption. Even if encryption cannot be broken today because quantum computers still lack the necessary performance, the "store first, decrypt later" attack, also referred to as "capture now, decrypt later" or "harvest now, decrypt later", is beginning to be seen as a threat.

As a "traditional countermeasure", extending the length of the key will suffice.
Before explaining the threat posed by quantum computers, let us review the existing threat. It is important to note that this traditional threat has not been replaced by the quantum computer threat; it still exists. In cryptography, supercomputers have been the traditional threat: recommended key lengths for encryption and digital signatures have been estimated from projections of supercomputer performance.
For example, the RSA cryptosystem, which is used for both encryption and digital signatures, is based on the difficulty of prime factorization. In other words, if the prime factorization of large numbers becomes easy to solve, the RSA cryptosystem will be broken.
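As a toy illustration, the minimal Python sketch below shows this dependence directly; the primes are far too small to be secure and the values are illustrative only, but the logic is the same: anyone who can factor the public modulus n can reconstruct the private key.

```python
# Toy RSA with tiny primes; real keys use primes of 1024 bits or more.
p, q = 61, 53
n = p * q                # public modulus (3233)
phi = (p - 1) * (q - 1)  # Euler's totient of n
e = 17                   # public exponent, coprime to phi
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

message = 65
ciphertext = pow(message, e, n)          # encryption: m^e mod n
assert pow(ciphertext, d, n) == message  # decryption with the private key

# An attacker who factors n into p and q can recompute d directly:
recovered_d = pow(e, -1, (p - 1) * (q - 1))
assert pow(ciphertext, recovered_d, n) == message
```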

As a traditional countermeasure, extending the key length is sufficient
The traditional countermeasure to the supercomputer threat has been to increase the key length (number of bits) used for encryption and digital signatures. In fact, RSA key lengths have been extended over time from 512 bits to 1024 bits, 2048 bits, and so on. You might think it would be better to use a long key from the start, but this would make encryption and decryption time-consuming for users holding legitimate keys: security would be over-provisioned while convenience is greatly reduced. Therefore, an appropriate key length is derived by considering computer performance at the time and projections of computer performance roughly 30 years ahead.
The National Institute of Standards and Technology (NIST) publishes the security strength of each key length for symmetric and public-key cryptography using a common measure called "bit security". Currently, 80-bit security (equivalent to a 1024-bit key in RSA, one of the public-key cryptosystems) is no longer approved.
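For reference, the approximate equivalences published in NIST SP 800-57 can be tabulated as in the sketch below; treat these figures as approximate and consult the current revision of SP 800-57 for authoritative values.

```python
# Approximate bit-security equivalences from NIST SP 800-57 Part 1.
# RSA sizes are modulus lengths in bits.
BIT_SECURITY = {
    80:  {"RSA": 1024,  "symmetric": "2TDEA"},   # no longer approved
    112: {"RSA": 2048,  "symmetric": "3TDEA"},
    128: {"RSA": 3072,  "symmetric": "AES-128"},
    192: {"RSA": 7680,  "symmetric": "AES-192"},
    256: {"RSA": 15360, "symmetric": "AES-256"},
}

for bits, equiv in BIT_SECURITY.items():
    print(f"{bits:3d}-bit security ~ RSA-{equiv['RSA']} / {equiv['symmetric']}")
```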

Quantum computers are the new threat

A different technique from PQC is quantum key distribution (QKD), a key distribution technique based on quantum mechanics that uses specialized hardware and optical fiber. Encrypted communication using this technique is referred to as quantum cryptography. QKD is sometimes referred to simply as "quantum cryptography" and vice versa, so the two terms are often used with ambiguous meanings.
Moreover, historically, the concepts of QKD and quantum cryptography predate PQC, having been published as early as 1984.
Whereas PQC is implemented as algorithms running on conventional ("classical") computers, QKD consists of specialized hardware (transmitters and receivers) based on quantum mechanics, connected by optical fiber. The hardware looks like a network switch or router and can be installed in a server rack.
Not all cryptography can be broken by a quantum computer. To understand which schemes are at risk, we need to look at quantum algorithms.
A procedure that exploits the unique properties of quanta to perform a computation more efficiently than a classical computer is called a "quantum algorithm". In fact, Shor's algorithm has been proposed as a quantum algorithm for efficiently solving the prime factorization and discrete logarithm problems. The existence of such an algorithm poses a direct threat to public-key cryptography such as the RSA cryptosystem and elliptic curve cryptography.
In recent years, this threat can be seen to be growing little by little as quantum computer implementation technology continues to advance.
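To make the threat concrete: Shor's algorithm does not factor a number directly. It efficiently finds the multiplicative order r of a random base a modulo N, after which the factors follow by classical arithmetic. The sketch below runs that classical reduction on a toy modulus; only the brute-force order() step is what a quantum computer would replace with an exponentially faster subroutine.

```python
from math import gcd

def order(a, n):
    # Brute-force the multiplicative order of a mod n. This is the step
    # Shor's algorithm performs efficiently on a quantum computer.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

N, a = 15, 7                 # toy example; real moduli have 2048+ bits
assert gcd(a, N) == 1
r = order(a, N)              # order of 7 mod 15 is 4
assert r % 2 == 0            # the reduction needs an even order
half = pow(a, r // 2, N)     # 7^2 mod 15 = 4 (must not be N - 1)
p, q = gcd(half - 1, N), gcd(half + 1, N)
print(f"{N} = {p} * {q}")    # prints: 15 = 3 * 5
```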

U.S. is leading PQC standardization


NIST launched its PQC standardization effort in 2016, and it is still in progress. The effort began with a public call for proposals in three categories: public-key encryption, key encapsulation mechanisms (KEMs), and digital signatures. The candidates were then gradually narrowed down through three to four rounds of selection and evaluation, with the final policy being to select not one but multiple schemes.
In December 2017, 69 submissions were accepted. In January 2019, 26 schemes advanced to the second round of evaluation based on the first-round results, and in July 2020, 15 schemes moved on to the third round based on the second-round results. On July 5, 2022, the third-round results were announced: CRYSTALS-KYBER was selected as the standard scheme in the public-key encryption and key establishment (KEM) category, and CRYSTALS-Dilithium, FALCON, and SPHINCS+ were selected as the three standard schemes in the digital signature category.
Meanwhile, BIKE, Classic McEliece, and HQC in the KEM category were put on hold in the third round. NIST indicated that an additional fourth round of evaluation would be conducted, in which at least one of them may be added to the standard.


Lattice-based schemes made up the majority of the submitted proposals; in fact, of the four schemes selected for standardization, CRYSTALS-KYBER, CRYSTALS-Dilithium, and FALCON are lattice-based, with their security resting on lattice problems. This predominance of lattice-based schemes also held among the methods that advanced to the second and third rounds of evaluation.
From this we can see that there are many lattice-based cryptography researchers in the world. A large number of researchers in a particular field means that a variety of cryptographic techniques will be produced in that field; at the same time, it can be expected that many researchers will also be evaluating their security, which gives us some confidence in it. In fact, the security of RSA cryptography, elliptic curve cryptography, and so on is trusted precisely because the prime factorization and discrete logarithm problems have been studied by many researchers for many years.
Therefore, when deciding on algorithms for a future migration to PQC, consider including lattice-based algorithms among the candidates from a security standpoint.
What should I look for when migrating to PQC?
Next, we summarize seven points to keep in mind when migrating the public-key cryptographic algorithms embedded in an IT system's information infrastructure to PQC:
1) Data size may increase
With each PQC algorithm, key data, encrypted data, and signature data are larger than with traditional cryptosystems. If these sizes are not taken into account in program design, the data may not be stored correctly in memory, on IC cards, and so on, and the system may terminate abnormally. The sketch below gives a sense of the size differences.
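As a rough illustration, this sketch compares representative sizes against a hypothetical fixed-size storage field. The byte counts are taken from the NIST round-3 parameter sets and should be re-verified against the final standards; the 512-byte field is invented for the example.

```python
# Illustrative sizes in bytes, from the NIST round-3 submission documents;
# verify against the final FIPS parameter sets before relying on them.
SIZES = {
    "RSA-2048 signature":    256,
    "ECDSA P-256 signature":  64,
    "Kyber-768 public key": 1184,
    "Kyber-768 ciphertext": 1088,
    "Dilithium2 signature": 2420,
}

FIELD_BYTES = 512  # hypothetical fixed-size database column or IC-card field

for name, size in SIZES.items():
    status = "fits" if size <= FIELD_BYTES else "TOO LARGE"
    print(f"{name:22s} {size:5d} B  -> {status}")
```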

2) Processing speed may be slower
With each PQC algorithm, key generation and encryption processing may be faster or slower than before. If processing slows down, waiting times for system users increase and convenience decreases. Particular caution is needed where TLS sessions are established repeatedly and in resource-constrained environments such as IoT devices. A simple timing harness such as the sketch below can help quantify the difference in your own environment.
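This is a minimal sketch only; `keygen` and `encapsulate` below are placeholders for whatever PQC binding your environment provides (for example, a liboqs wrapper) and are not a real API.

```python
import time

def benchmark(op, label, iterations=1000):
    # Average wall-clock time per operation. Production measurements should
    # also account for warm-up, CPU frequency scaling, and variance.
    start = time.perf_counter()
    for _ in range(iterations):
        op()
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed / iterations * 1000:.3f} ms/op")

# Substitute the calls of your actual library here, e.g.:
# benchmark(lambda: kem.keygen(), "keygen")
# benchmark(lambda: kem.encapsulate(public_key), "encapsulate")
```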

3) The importance of cryptographic agility increases
While each PQC algorithm's security has been evaluated through NIST's selection process, these algorithms have a shorter history than traditional ones such as RSA, so there is always the possibility that new attacks will be discovered in the future. Systems should therefore be designed with cryptographic agility, that is, so that algorithms can be replaced without reworking the applications that use them; one way to achieve this is sketched below.
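The registry below is a minimal sketch of such indirection, assuming a simple KEM-style interface; the names and types are illustrative, not a real library. Application code resolves algorithms by identifier, so replacing a broken scheme becomes a configuration change rather than a code rewrite.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Kem:
    # Minimal KEM interface: every scheme is used through these three
    # operations, so callers never depend on a concrete algorithm.
    keygen: Callable[[], tuple]
    encapsulate: Callable[[bytes], tuple]
    decapsulate: Callable[[bytes, bytes], bytes]

REGISTRY: Dict[str, Kem] = {}

def register(name: str, kem: Kem) -> None:
    REGISTRY[name] = kem

def resolve(name: str) -> Kem:
    # Look algorithms up by identifier; swapping "kyber768" for a
    # successor is then a one-line configuration change.
    return REGISTRY[name]
```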
4) If encrypted data is stored in the system, re-encryption should be considered
If encrypted confidential information is stored in the system (including cases where a symmetric key is itself encrypted under public-key cryptography), it is necessary to consider re-encrypting it with PQC or with symmetric-key cryptography using an extended key. This serves as a countermeasure against "store first, decrypt later" attacks; a re-encryption flow is sketched below.
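The sketch outlines such a re-encryption for envelope-encrypted records: the bulk data stays encrypted under its symmetric data encryption key (DEK), and only the wrapped DEK is migrated from a legacy RSA wrap to a PQC KEM wrap. The `rsa_unwrap` and `kem_wrap` callables are hypothetical stand-ins for your crypto library's primitives.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Record:
    ciphertext: bytes     # bulk data encrypted under the DEK (left as-is)
    wrapped_dek: bytes    # the DEK, wrapped under some public-key scheme
    wrap_algorithm: str   # identifier of the scheme wrapping the DEK

def rewrap(record: Record,
           rsa_unwrap: Callable[[bytes], bytes],
           kem_wrap: Callable[[bytes], bytes]) -> Record:
    # Only the key material is re-encrypted; the (possibly huge) bulk
    # ciphertext does not need to be touched.
    dek = rsa_unwrap(record.wrapped_dek)  # unwrap with the legacy RSA key
    record.wrapped_dek = kem_wrap(dek)    # re-wrap under the PQC KEM
    record.wrap_algorithm = "Kyber-768"   # illustrative identifier
    return record
```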

5) If TLS hardware is used, make sure there is enough time to procure it
If TLS communications are terminated at a load balancer, the load balancer itself must be updated to support PQC, because its cryptographic library resides on the load balancer hardware. Procurement must be timed so that PQC-capable hardware can be obtained in time for the system update.

6) Continuous collection of information released by NIST, SOG-IS, and other organizations
Even if a migration plan to PQC has been drawn up, it is necessary to keep an eye on information released by NIST, SOG-IS, and other organizations as PQC security assessments are updated. NIST currently plans to release its PQC standardization documents around 2024 to 2025, so the information needs to be updated continuously.
7) Understanding PQC capabilities offered by cloud service providers
If the plan is to migrate all or part of the system to the cloud at the next system update, one option is to use the PQC functionality provided by a cloud service provider (CSP). Keep track of each CSP's PQC development plans and current status, and decide whether to utilize the CSP's PQC-related services.
So, when and how to migrate?

To visualize the urgency of the migration to PQC, NIST introduced Mosca's theorem, proposed by Prof. Michele Mosca of the University of Waterloo, Canada. Mosca's theorem defines x, y, and z as the following numbers of years:
- x: the number of years the data must remain confidential under the current encryption technology
- y: the number of years it takes to build an encryption infrastructure secure against quantum computer attacks
- z: the number of years it will take a quantum computer to break the current encryption technology
If x + y > z, there is a problem, and migration to PQC should be considered. This is because data encrypted with current technology just before the migration completes in year y, which must remain confidential for x years (until year x + y), could be decrypted once quantum computers become capable in year z, before the x-year confidentiality period has elapsed.
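As a worked example with purely illustrative numbers: if data must stay confidential for x = 10 years, the migration takes y = 5 years, and a cryptographically relevant quantum computer arrives in z = 12 years, then x + y = 15 > 12 = z, and data encrypted toward the end of the migration would lose its protection three years early. The check is trivial to automate:

```python
x = 10  # years the data must remain confidential (illustrative)
y = 5   # years the migration to PQC will take (illustrative)
z = 12  # years until quantum computers break current schemes (illustrative)

if x + y > z:
    print(f"At risk: protection falls short by {x + y - z} year(s); "
          "start planning the migration now.")
else:
    print("Within budget: the migration completes with margin to spare.")
```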
About NTT DATA
NTT DATA, part of the NTT Group and headquartered in Tokyo, is a trusted innovator of global IT and business services. NTT DATA helps clients transform their businesses through consulting, industry solutions, business process services, IT modernization, and managed services, and serves clients in more than 50 countries.