
Preparing for the Quantum Leap: Navigating the Path to Post-Quantum Cryptography Readiness

While the demands of modern technology over the last two decades, ranging from social media to AI, are thought to push Moore’s law and semiconductor fabrication beyond their limits, the advances, with the exception of the invention of GPUs, have been largely linear: increase chip density and clock frequency to achieve more performance. Quantum computing, on the other hand, is bucking this trend, and although expensive and somewhat impractical at the moment, the significant investment in the area points to a question of when, not if. Three decades ago, Shor described an algorithm that could break the mathematical problems, commonly perceived as hard, that form the backbone of traditional asymmetric cryptography algorithms such as RSA and Elliptic Curve Cryptography (ECC) and thereby underpin modern protection mechanisms such as transport layer security (TLS) and data encryption. At the time, this was not considered a serious threat, given that it required the use of a quantum computer. In recent years, however, the idea of a capable quantum computer is no longer far-fetched.

Security professionals started warning about a “harvest now, decrypt later” scenario, where an attacker simply intercepts all encrypted traffic and stores it until some future time when quantum computing is feasible, then deploys that computing power to break the algorithms and uncover the data harvested years earlier. This may not be a concern when the data is a short message between two friends setting up a dinner plan, or a temperature sensor in a manufacturing plant sending readings to a remote environmental control monitor. On the other hand, it is a concern when the data is classified government information or financial information that requires safekeeping for years to come.
For a manufacturer, “verify now, forge later” can also be a concern: the manufacturer delivers a software update signed with a traditional algorithm, and an attacker later breaks the signature algorithm and delivers their own software update while forging the manufacturer’s signature.

The US National Institute of Standards and Technology (NIST) announced a competition for quantum-resistant cryptography algorithms that can support digital signature and encryption/key management needs and replace RSA and ECC. The selection process started in 2016 and, as of 2024, has gone through four rounds, each round eliminating a few of the contenders and narrowing the list of nominees, with the last round publishing standards for five algorithms as Post-Quantum Cryptography (PQC) standards.

In the name of simplification, the National Security Agency (NSA) has issued its own guidance, the so-called CNSA 2.0 (Commercial National Security Algorithm Suite 2.0), with the intention of easing the planning for any vendor supplying solutions to national security systems. While this remedy was meant to guide a non-mathematician through the maze of NIST specifications, in reality it caused multiple side effects. The first was that NSA and NIST came at odds with each other over practical workarounds for some of the algorithms’ practical shortcomings. The second stemmed from the guidance authors, most likely cryptographers, lacking experience with the practical aspects of system-wide implementation of these algorithms.

To dig deeper into these side effects, let’s take a brief look at the algorithms that have so far been standardized and recommended. In a broad sense, two categories of algorithms were standardized: hash-based signature (HBS) algorithms and module-lattice (ML) algorithms.

The hash-based algorithms essentially rely on decades-old one-way hash functions that are well understood and widely enjoy hardware acceleration support. The difficulty in breaking these algorithms comes from the sheer amount of hash computation an attacker would have to perform, which cannot be trivialized by a quantum computer. To give the reader an idea, this is very similar to the computation involved in bitcoin mining. Another advantage of these algorithms is that creating signing keys is not complicated. The big downside is that they essentially use random numbers as one-time signing keys, and the way the scheme is designed, no signing key can ever be used again. This creates the tedious problem of having to track every key that has already been used, which turns into a difficult state-management problem for any hardware appliance a manufacturer utilizes to sign the software that goes into its products. There are several implications here, on both the device side and the infrastructure side. On the infrastructure side, the first challenge is scalability, when multiple appliances need to be clustered to handle production volume; the second is high availability and disaster recovery, when one appliance needs to quickly jump in and replace a failed one. On the device side, the state-management requirements simply disqualify the HBS algorithms for signing within the device. This is why CNSA 2.0 recommends the HBS algorithms only for code and firmware signing. The two HBS algorithms are LMS and XMSS.
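To make the state-management burden concrete, here is a minimal sketch in Python of the bookkeeping a signing appliance has to perform for a stateful HBS key. The class name, file layout, and fields are illustrative assumptions, not part of LMS, XMSS, or any vendor API; the point is simply that a key index must be durably advanced before each signature is released.

```python
# Minimal sketch of the state management required by stateful hash-based
# signatures (LMS/XMSS). All names and the file layout are illustrative.
import json
import os


class HbsSignerState:
    """Tracks the next unused one-time key index and persists it before signing."""

    def __init__(self, state_path: str, total_leaves: int):
        self.state_path = state_path
        self.total_leaves = total_leaves
        self.next_index = 0
        if os.path.exists(state_path):
            with open(state_path) as f:
                self.next_index = json.load(f)["next_index"]

    def reserve_index(self) -> int:
        """Reserve a one-time key index; the index is consumed even if signing later fails."""
        if self.next_index >= self.total_leaves:
            raise RuntimeError("HBS key exhausted: no one-time keys left")
        index = self.next_index
        self.next_index += 1
        # Persist the new state *before* the signature is released; otherwise a
        # crash or a failover to a replica could reuse the same one-time key.
        with open(self.state_path, "w") as f:
            json.dump({"next_index": self.next_index}, f)
            f.flush()
            os.fsync(f.fileno())
        return index
```

Replicating this counter consistently across a cluster of appliances, or onto a standby appliance during failover, is exactly the scalability and disaster-recovery challenge described above.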

The second set of algorithms, the module-lattice (ML) algorithms, are quite the opposite of HBS. They are mathematically very complicated, with many primitives ranging from matrix and vector multiplications in a 256-dimensional space, to statistical sampling of random numbers in a multi-dimensional space to ensure they fit specific size ranges, to rounding up, rounding down, and even rejection of candidate signatures if they do not meet certain size and mathematical properties. The matrices and vectors can reach sizes of 50+ kilobytes, keys can approach 5 kilobytes, and the calculations can take tens of milliseconds without specialized accelerators, which brings the device delay performance into ranges that are simply unacceptable. All this points to the need to develop or purchase silicon-level accelerators to make these algorithms practical.
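For a feel of the sizes and software-only timing involved, the following sketch signs and verifies a message with an ML signature scheme through the open-source liboqs-python wrapper. The wrapper and the algorithm name string "ML-DSA-65" are assumptions here; older liboqs builds expose the same scheme under the name "Dilithium3".

```python
# Sketch: sign and verify with an ML (lattice) signature via liboqs-python.
# Assumes liboqs-python is installed and built with ML-DSA support.
import time
import oqs

message = b"example firmware image"
alg = "ML-DSA-65"  # assumed name; some builds call this "Dilithium3"

with oqs.Signature(alg) as signer:
    public_key = signer.generate_keypair()
    start = time.perf_counter()
    signature = signer.sign(message)
    sign_ms = (time.perf_counter() - start) * 1000.0

with oqs.Signature(alg) as verifier:
    assert verifier.verify(message, signature, public_key)

# Keys and signatures run to a few kilobytes, versus tens of bytes for ECC.
print(f"public key: {len(public_key)} bytes")
print(f"signature:  {len(signature)} bytes")
print(f"sign time:  {sign_ms:.1f} ms (pure software, no accelerator)")
```

The measured time on a server-class CPU will be small, but on an embedded controller without acceleration the same computation, together with the larger keys and signatures, is what drives the silicon-level accelerator requirement discussed above.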

NIST and NSA essentially require the use of ML algorithms in all non-code-signing use cases, ranging from network appliances and web servers to key management infrastructure, but have set deadlines that gradually cover more and more use cases until the 2030-2033 time frame, when they expect a complete turnover.

Given that the hardware lifecycle, from algorithm and ASIC design, to tape-out, to development of firmware matching the ASIC, to product release, can be 2-3 years, the migration to PQC algorithms will have to follow a path: from the current phase, where only traditional algorithms are used, through a hybrid phase with designs and environments that combine PQC and traditional algorithms, and finally to a phase where only PQC algorithms are used.

While the first and third phases are more straightforward, the hybrid phase involves a large number of tradeoffs based on factors whose impact is hard to pin down. One important factor, already mentioned, is delay performance. There are very few accurate estimates of how long an algorithm will take to execute on the designated hardware within a device, whether that hardware is already available or yet to be manufactured or even simulated. Another important factor is infrastructure appliance readiness for these algorithms. One example is the hardware security modules (HSMs) used to sign certificates or software. These appliances typically need to operate in a so-called FIPS-compliant mode, and operation in this mode is not possible until the government sets up FIPS validation programs for the PQC algorithms. In the absence of such validation programs, many HSM vendors tend to provide highly experimental software for their appliances to support the PQC algorithms. Another important factor is the longevity of the implemented hybrid schemes, which must sustain both a transitional period, when traditional algorithms such as RSA can be used to help the rollout of PQC algorithms, and a final period, when the traditional algorithms are no longer acceptable. Not doing this carefully can render a product obsolete when the CNSA 2.0 clock turns over in the 2030-2033 period. Thus proper measures to support renewability and algorithm agility need to be built into the hardware and infrastructure to cover the entire lifecycle.
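One common way to express the hybrid phase in code is to require both a traditional and a PQC signature on the same artifact, with a policy switch that retires the traditional check later. The sketch below assumes the pyca/cryptography package for ECDSA and liboqs-python for ML-DSA; the function name, the policy flag, and the algorithm name string are illustrative assumptions, not taken from any standard.

```python
# Sketch of hybrid signature verification during the transition phase: an update
# is accepted only if the PQC signature verifies, and, while the policy requires
# it, the traditional ECDSA signature as well.
import oqs
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec


def verify_hybrid(image: bytes,
                  ecdsa_pub: ec.EllipticCurvePublicKey, ecdsa_sig: bytes,
                  mldsa_pub: bytes, mldsa_sig: bytes,
                  require_classical: bool = True) -> bool:
    """Return True only if every signature required by the current policy verifies."""
    # The PQC check is required in both the hybrid and the PQC-only phases.
    with oqs.Signature("ML-DSA-65") as verifier:  # assumed algorithm name
        if not verifier.verify(image, mldsa_sig, mldsa_pub):
            return False
    # The classical check can be switched off once CNSA 2.0 disallows it.
    if require_classical:
        try:
            ecdsa_pub.verify(ecdsa_sig, image, ec.ECDSA(hashes.SHA256()))
        except InvalidSignature:
            return False
    return True
```

In practice the signed artifact would also carry algorithm identifiers alongside the signatures, so that new algorithms can be added and old ones retired without changing the artifact format; that is the renewability and algorithm agility referred to above.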

To conclude, it is important not only for manufacturers of durable goods (such as storage or automotive products) but also for any technology provider that uses cryptography to protect its own and its customers’ assets to start putting a PQC readiness plan in place, and to ask their own technology and appliance providers, be it VPN, storage, or cloud solutions, about their PQC migration plans.

Madjid Nakhjiri
Lead Security Architect at Western Digital
