What is post-quantum cryptography?
Post-quantum cryptography (PQC) refers to cryptographic algorithms designed to be secure against attacks by quantum computers. Quantum computers, leveraging principles like superposition and entanglement, can solve certain mathematical problems exponentially faster than classical computers. This threatens the security of classical cryptographic systems like Rivest-Shamir-Adleman (RSA) and elliptic curve cryptography (ECC).
To better understand the quantum threat, let's take a deep dive into the basics of classical cryptography.
Basics of classical cryptography
Classical cryptography forms the foundation of modern data security. Cryptographic protocols such as TLS and SSH use classical cryptographic standards for secure communication, data transfer, and more. These standards are divided into two main categories:
- Symmetric encryption: This approach uses a single key for both encryption and decryption. Examples include Advanced Encryption Standard (AES) and Data Encryption Standard (DES). It is efficient for bulk data encryption but requires secure key distribution.
- Asymmetric encryption: This approach uses a pair of keys—a public key for encryption and a private key for decryption. Examples include RSA and ECC. These methods rely on the computational difficulty of problems such as prime factorization (for RSA) and discrete logarithms over elliptic curves (for ECC). Digital signatures and hash functions complement these methods, providing authentication and integrity rather than confidentiality.
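To make the asymmetric idea concrete, here is a toy RSA round trip in Python using the well-known textbook parameters p = 61 and q = 53. These numbers are far too small to be secure; they only illustrate the public-encrypt/private-decrypt flow.

```python
# Toy RSA with tiny primes to illustrate asymmetric encryption.
# Real RSA uses primes hundreds of digits long; these values offer
# no security and are for illustration only.

p, q = 61, 53
n = p * q                # modulus: 3233
phi = (p - 1) * (q - 1)  # Euler's totient: 3120
e = 17                   # public exponent (coprime with phi)
d = pow(e, -1, phi)      # private exponent: modular inverse of e (2753)

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert recovered == message
```

Anyone may encrypt with (e, n), but only the holder of d can decrypt; RSA's security rests on the difficulty of recovering p and q from n.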
Due to their ability to efficiently solve the underlying complex mathematical problems, future quantum computers threaten to undermine the security of the very encryption methods that build the foundation of the online world as we know it.
Threats posed by quantum computers
A classical register of n bits can represent 2 raised to the power of n possible values, but it holds only one of those values at any given moment. A 10-bit register, for example, has 2 to the power of 10, or 1,024, possible states.
Quantum computers, however, utilize quantum bits, or qubits. Unlike classical bits, qubits can exist in a state of superposition, meaning they can store data as a 0, a 1, or a weighted combination of both at the same time. A 10-qubit register can therefore exist in a superposition of all 1,024 of its possible states at once, and every additional qubit doubles that number. With quantum computers, the real advantage isn't just speed; it's their ability to examine exponentially more possibilities at the same time.
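To see how quickly these state counts grow, a quick sketch (plain Python, nothing quantum about it) tabulates 2^n for a few register sizes:

```python
# Number of distinct states an n-bit register can represent.
# A classical register holds exactly one of these states at a time;
# an n-qubit register can be in a superposition of all of them.
for n in (10, 20, 50):
    print(f"{n} bits -> {2**n:,} possible states")
# 10 bits -> 1,024 possible states
# 20 bits -> 1,048,576 possible states
# 50 bits -> 1,125,899,906,842,624 possible states
```

The exponential growth is the point: each added qubit doubles the size of the state space a quantum computer can hold in superposition.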
Our current, widely used encryption methods—often referred to as classical encryption—rely on the difficulty of discrete logarithm problems and the prime factorization of numbers hundreds of digits long. But in the future, a powerful quantum computer with a sufficient number of stable, fault-tolerant qubits will have enough processing power to crack today's encryption with ease.
The potential impact of quantum algorithms
Quantum algorithms exploit the unique properties of quantum computing to solve problems that are intractable for classical computers, directly undermining the security of many of our current cryptographic systems. Two of the most significant quantum algorithms for cryptography are:
- 01.
Shor’s Algorithm
This algorithm can efficiently factor large integers and solve the discrete logarithm problem. This is a direct threat to public-key cryptography, breaking RSA, ECC, and the Diffie-Hellman key exchange. These methods are fundamental to securing several crucial online activities, including HTTPS websites, email encryption, secure logins, digital signatures, and much more.
A quantum computer running Shor's Algorithm could break these systems, allowing attackers to decrypt vast amounts of currently and historically intercepted data and compromise future secure communications. This potential for catastrophic failure of the public-key infrastructure is a primary reason why developing and adopting PQC is so crucial.
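The quantum speedup in Shor's Algorithm comes from finding the order r of a number a modulo N; turning that order into a factor is purely classical. The sketch below is a toy: it finds the order by brute force, which is exactly the step that is infeasible at cryptographic sizes without a quantum computer.

```python
from math import gcd

def factor_via_order(N, a):
    """Factor N using the order of a mod N (the step Shor's Algorithm
    performs efficiently on a quantum computer). Here the order is found
    by classical brute force, which does not scale to real key sizes."""
    if gcd(a, N) != 1:
        return gcd(a, N)          # lucky guess: a shares a factor with N
    r = 1
    while pow(a, r, N) != 1:      # find the order r of a mod N
        r += 1
    if r % 2 == 1:
        return None               # odd order: retry with another a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None               # trivial square root: retry with another a
    return gcd(x - 1, N)          # non-trivial factor of N

print(factor_via_order(15, 7))   # order of 7 mod 15 is 4 -> prints 3
```

The same reduction applied to a 2048-bit RSA modulus is what makes Shor's Algorithm so dangerous: the quantum computer supplies the order, and everything else is easy.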
- 02.
Grover’s Algorithm
This algorithm speeds up unstructured search problems, providing a quadratic speedup. Although this algorithm doesn't completely break symmetric encryption algorithms like AES and DES, it effectively halves the security level of a given key length. This means that organizations will have to adopt larger keys to maintain the same level of security against a quantum attacker. This further underscores the need for quantum-resistant alternatives and stronger symmetric key lengths.
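The practical consequence is simple arithmetic: Grover reduces a brute-force search over 2^n keys to roughly 2^(n/2) steps, so each key offers about half its nominal bits of security against a quantum attacker.

```python
# Grover's Algorithm searches an unstructured space of 2^n keys in
# roughly 2^(n/2) steps, so an n-bit symmetric key offers only about
# n/2 bits of security against a quantum attacker.
for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~2^{key_bits} keys classically, "
          f"~2^{key_bits // 2} Grover iterations, "
          f"~{key_bits // 2}-bit effective quantum security")
# Consequence: to retain 128-bit security against Grover, use AES-256.
```

This is why guidance on quantum readiness typically recommends AES-256 rather than AES-128 for long-lived data.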
Timeline for quantum computing advancements
While large-scale, fault-tolerant quantum computers capable of running Shor’s Algorithm are not yet available, significant progress is being made. IBM, Google, and Rigetti have built quantum computers with dozens to hundreds of qubits, but these machines are noisy and error-prone (indeed, we're currently in the noisy intermediate-scale quantum era). Experts estimate that quantum computers capable of breaking RSA-2048 could emerge within 10–30 years.
The purpose of PQC: Why the urgency now?
Governments and organizations are already preparing for the eventuality of quantum threats by standardizing and adopting PQC. While there's still time before the actual threat materializes, there's an urgent need to stay ahead and make the transition to post-quantum cryptography as soon as possible.
- 01.
Longevity of encrypted data
Data encrypted today with classical cryptography could be harvested and stored by adversaries for future decryption once quantum computers become available. This is a threat known as "harvest now, decrypt later" or "store now, decrypt later." Sensitive data with long-term value is particularly at risk.
- 02.
Transition challenges
Transitioning to PQC is a complex process that involves updates—and sometimes even complete overhauls—to existing protocols, hardware, and software based on the extent of the divide between current and future threats that quantum computers pose. This process could take 10–20 years, making it essential to start now. Organizations must plan for a gradual migration to quantum-resistant systems, yet this plan may need to be expedited in the case of sudden breakthroughs in quantum computing.
- 03.
Government mandates
To get ahead of the curve, the United States government has released a robust policy on implementing post-quantum cryptography, based on the security recommendations of the NSA. With the National Institute of Standards and Technology (NIST) finalizing three standards—FIPS 203, 204, and 205—that call for the adoption of quantum-resistant algorithms, organizations need to begin the journey of becoming quantum-ready.
NIST PQC standardization
Organizations like NIST are leading the effort to standardize PQC algorithms. NIST selected four algorithms for standardization and has since finalized its first three PQC standards. Since early adoption will help ensure compatibility and security as quantum computing advances, governments and industries are beginning to mandate the use of quantum-resistant cryptography for critical systems.
- 01.
NIST’s PQC standardization process
NIST initiated the PQC standardization process in 2016 to address the growing threat posed by quantum computers to classical cryptographic systems. To ensure long-term security, NIST called for proposals from researchers, cryptographers, and organizations for quantum-resistant cryptographic algorithms.
The standardization process involved multiple rounds of evaluation, where submissions were assessed for security, performance, and practicality and scrutinized by the cryptographic community for vulnerabilities and efficiency.
- 02.
Overview of selected algorithms
NIST selected several algorithms for standardization, categorized into two main types: key encapsulation mechanisms (KEMs) and digital signature schemes. Here are the four selected algorithms:
- CRYSTALS-Kyber (standardized as ML-KEM in FIPS 203) is a KEM built on the module learning with errors problem in lattice-based cryptography, enabling secure key establishment for encryption. CRYSTALS-Kyber can be used in protocols needing key agreement and encryption, such as TLS for secure web browsing, VPNs, and secure email.
- CRYSTALS-Dilithium (standardized as ML-DSA in FIPS 204) is a digital signature scheme leveraging the Fiat-Shamir with Aborts architecture in lattice-based cryptography, producing smaller signatures than many other post-quantum schemes. CRYSTALS-Dilithium can be used in applications requiring digital signatures for authentication and integrity, such as software signing, document signing, and secure boot processes.
- SPHINCS+ (standardized as SLH-DSA in FIPS 205) is a digital signature scheme that leverages hash-based cryptography. This algorithm doesn't depend on structured mathematical problems such as lattices; its security rests solely on the underlying hash function. It produces larger signatures and is preferred as a fallback option due to its distinct underlying security assumptions. SPHINCS+ is suitable for scenarios where the larger signature size is acceptable in exchange for the strong security guarantees of a hash-based approach, such as long-term archival of digitally signed documents.
- Falcon is a digital signature scheme that leverages NTRU lattice-based cryptography and produces extremely small signatures. Its signing process is computationally intensive, however, making it more demanding to implement than the other schemes. Falcon's small signature size makes it particularly well suited for resource-constrained devices like IoT devices and embedded systems where bandwidth and storage are limited.
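To give a feel for the size trade-offs among these schemes, the sketch below tabulates approximate public-key and signature (or ciphertext) sizes for common parameter sets. The figures are rounded and vary by parameter set; consult the FIPS documents for authoritative numbers.

```python
# Approximate public-key and signature/ciphertext sizes in bytes for
# common parameter sets (rounded; exact values depend on the parameter
# set). Classical schemes are included for comparison.
sizes = {
    # scheme: (public key, signature or ciphertext)
    "RSA-2048 (signature)":     (256,  256),
    "Ed25519 (signature)":      (32,   64),
    "ML-DSA / Dilithium2":      (1312, 2420),
    "Falcon-512":               (897,  666),
    "SPHINCS+-128s":            (32,   7856),
    "ML-KEM / Kyber-768 (KEM)": (1184, 1088),
}
for scheme, (pk, out) in sizes.items():
    print(f"{scheme:27s} pk={pk:5d} B  output={out:5d} B")
```

The pattern matches the text above: Falcon has the smallest post-quantum signatures, Dilithium sits in the middle, and SPHINCS+ trades the largest signatures for its conservative hash-based assumptions.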
Challenges in PQC implementation
The transition to post-quantum cryptography, however, isn't straightforward. It presents several challenges, ranging from performance issues to integration complexities, that organizations must be mindful of.
- 01.
Performance considerations
PQC algorithms generally have larger key sizes compared to classical cryptographic algorithms. For example, CRYSTALS-Kyber has public keys that are 4–6 times larger than typical RSA public keys, and SPHINCS+ signatures are significantly larger than ECDSA signatures. This can impact storage, bandwidth, and transmission efficiency, especially in constrained environments like IoT devices.
Many PQC algorithms also require more computational resources for key generation, encryption, and decryption. For example, lattice-based schemes involve complex polynomial operations, which can be slower than classical algorithms, and hash-based schemes may require significant computational effort for signing operations.
The increased computational complexity can lead to higher latency, which may be problematic for real-time applications like TLS handshakes or VoIP.
- 02.
Hardware and software compatibility
Many existing hardware systems, like embedded devices and smart cards, are optimized for classical cryptographic algorithms and may lack the computational power or memory to efficiently run PQC algorithms.
PQC algorithms often require new software implementations, which may not yet be optimized for performance. For example, lattice-based algorithms require efficient polynomial multiplication and sampling, which may not be well supported in current cryptographic libraries.
Older systems may not support the new mathematical constructs or larger key sizes required by PQC algorithms, creating the need for costly upgrades or replacements.
- 03.
Integration with existing protocols and systems
Many widely used protocols, such as TLS, IPsec, and SSH, are designed around classical cryptographic algorithms. PQC will require significant changes, including extending handshake protocols to support PQC key exchange and authentication, and managing backward compatibility with systems that still use classical cryptography.
To ensure a smooth transition, hybrid schemes combining classical and PQC algorithms are often recommended. However, this increases complexity and may introduce new vulnerabilities.
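A minimal sketch of the hybrid idea: derive the session key from both a classical shared secret and a post-quantum shared secret, so the session stays safe unless both schemes are broken. The two secrets below are random placeholders standing in for the outputs of real ECDH and ML-KEM exchanges; real protocols would use a proper KDF such as HKDF with context labels rather than a bare hash.

```python
import hashlib
import secrets

# Hybrid key establishment sketch: the session key depends on BOTH a
# classical shared secret (e.g., from ECDH) and a post-quantum shared
# secret (e.g., from ML-KEM). An attacker must break both schemes to
# recover the key. The secrets here are placeholders, not real
# key-exchange outputs.
classical_secret = secrets.token_bytes(32)  # stand-in for an ECDH output
pq_secret = secrets.token_bytes(32)         # stand-in for an ML-KEM output

# Concatenate-then-hash is the simplest combiner; deployed protocols
# use a KDF such as HKDF with proper labels and transcript binding.
session_key = hashlib.sha256(classical_secret + pq_secret).digest()
print(session_key.hex())
```

This combiner pattern is what hybrid TLS key-exchange experiments use in spirit: neither secret alone determines the session key.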
- 04.
The transition to PQC
The transition from classical cryptography to PQC is, and will be, a complex process for present-day systems. When transitioning, organizations will have to ensure both compatibility and security while integrating or switching over to PQC.
The phased approach involves incrementally shifting to PQC, prioritizing the systems most exposed to quantum attacks. Hybrid cryptography systems combine classical and post-quantum algorithms, keeping organizations protected from both conventional and quantum threats.
Considerations while transitioning to PQC
PQC should be implemented in existing systems in a way that doesn't hinder the performance or interoperability of existing protocols, despite its larger keys and greater complexity. PQC also needs to continuously evolve to remain secure in a domain of ever-changing threats, so adopting the Kaizen philosophy (i.e., change for the better) is wise. And since the transition may take a decade or more to complete, long-term planning and preparations for regulations and compliance requirements need to be made in advance.
Where to begin
The need for PQC is clear, and the time to prepare is now. Your initial steps should involve understanding the quantum risk, familiarizing yourself with the NIST-standardized algorithms, and assessing the potential impact on your organization's systems. Start the conversation internally, begin planning your migration strategy, and stay informed about the evolving landscape of quantum security.
If you're looking for a starting point, here's a good checklist:
- Raise awareness: Start educating stakeholders about the quantum threat and the importance of PQC. When doing so, remember to highlight why it's an urgent priority.
- Risk assessment: Ask relevant stakeholders to evaluate your organization's current cryptographic infrastructure and identify systems most vulnerable to quantum attacks. This should help you formulate effective action plans.
- Pilot test projects: Explore and test the performance and integration of standardized PQC algorithms in non-critical environments. Once the key concerns are addressed, you can move forward quickly.
- Leverage automation, especially for cryptographic infrastructure: Explore and implement automation tools for managing your existing public key infrastructure and other cryptographic systems. This will be crucial for efficiently handling the complexities of a future hybrid environment where both classical and post-quantum cryptography may coexist. Automation can streamline key management, certificate updates, and algorithm transitions, making the eventual shift to PQC more manageable.
Conclusion
With the rise of personal computers in the 70s, smartphones and tablets in the early 2000s, and high-performance GPUs in the 2010s, all industries and job markets have experienced revolutions that force us to adapt to the evolving influence of technology.
With the growth of quantum computers, quantum-resistant encryption protocols will be in heavy demand. As usual, the early movers will be more secure from cyberattacks and vulnerabilities, allowing them to maintain uninterrupted operations and giving them a massive advantage over the competition. PQC will be used to secure IoT devices, cloud computing, and blockchain systems across the government, finance, healthcare, and defense sectors for years to come.
While the applications of PQC might vary, the change is imminent. Start now. Embracing this transition proactively will be key to safeguarding your digital future.
Glossary of terms
RSA: A widely used public-key cryptosystem that enables secure data transmission. RSA relies on the difficulty of factoring the product of two large prime numbers, called a semiprime. In RSA, there are two keys: a public key, which is used to encrypt data, and a private key, which is used to decrypt it.
Shor’s Algorithm can factor the large semiprime numbers used in RSA key pairs, allowing an attacker to derive the private key from the public key.
ECC: A form of public-key cryptography that uses the algebraic structure of elliptic curves over finite fields. ECC offers a higher level of security with smaller key sizes compared to traditional methods like RSA. It is commonly used for securing communications in mobile devices, websites, and other systems.
Shor’s Algorithm can solve the elliptic curve discrete logarithm problem, breaking the security of ECC-based systems.
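For intuition, here is the elliptic curve group law over a tiny prime field, using the textbook curve y^2 = x^3 + 2x + 2 over F_17 with generator (5, 1); real deployments use curves over primes of roughly 256 bits, where recovering k from k*G is infeasible classically.

```python
# Toy elliptic curve arithmetic over a small prime field, illustrating
# the group structure ECC relies on. Curve: y^2 = x^3 + 2x + 2 over F_17
# (a standard textbook example; real curves use ~256-bit primes).
P_FIELD, A = 17, 2

def ec_add(P, Q):
    """Add two points on the curve (None is the point at infinity)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_FIELD == 0:
        return None  # P + (-P) = point at infinity
    if P == Q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_FIELD)  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_FIELD)         # chord slope
    lam %= P_FIELD
    x3 = (lam * lam - x1 - x2) % P_FIELD
    return (x3, (lam * (x1 - x3) - y1) % P_FIELD)

def scalar_mult(k, P):
    """Compute k*P by repeated addition (real code uses double-and-add)."""
    R = None
    for _ in range(k):
        R = ec_add(R, P)
    return R

G = (5, 1)                # generator point
print(scalar_mult(2, G))  # -> (6, 3)
```

ECC's one-way function is scalar multiplication: computing k*G is easy, but recovering k from the result (the elliptic curve discrete logarithm) is hard classically, and it is exactly this problem that Shor's Algorithm solves.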
Diffie-Hellman: This key exchange protocol allows two parties to securely exchange cryptographic keys over an insecure channel. It relies on the difficulty of computing discrete logarithms in a finite field. Through this method, both parties can arrive at a shared secret key that can be used for encrypted communication, without ever directly transmitting the key.
Shor’s Algorithm can compute discrete logarithms, compromising the security of key exchange protocols.
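The exchange can be sketched with toy numbers (real deployments use 2048-bit moduli or elliptic curves): each party combines its own private exponent with the other's public value and both arrive at the same secret, which is never transmitted.

```python
# Toy Diffie-Hellman key exchange. The numbers are far too small to be
# secure; they only illustrate how both parties reach the same secret.
p, g = 23, 5       # public modulus and generator
a, b = 6, 15       # Alice's and Bob's private exponents

A = pow(g, a, p)   # Alice sends g^a mod p over the insecure channel
B = pow(g, b, p)   # Bob sends g^b mod p over the insecure channel

shared_alice = pow(B, a, p)  # (g^b)^a mod p
shared_bob = pow(A, b, p)    # (g^a)^b mod p
assert shared_alice == shared_bob
print(shared_alice)
```

An eavesdropper sees p, g, A, and B but must solve the discrete logarithm to recover a or b; that is the problem Shor's Algorithm makes easy.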
- Superposition: A quantum bit (i.e., qubit) can exist in a superposition of states, meaning it can represent 0, 1, or both 0 and 1 simultaneously. This allows quantum computers to process multiple possibilities at once.
- Entanglement: This is when qubits become entangled, meaning the state of one qubit is dependent on the state of another, even if they are physically separated. This enables highly correlated computations.
- Quantum interference: This is when quantum states interfere with each other, amplifying correct solutions and canceling out incorrect ones during computation.
Lattice-based PQC
Cryptographic schemes that are based on the hardness of mathematical problems related to lattice theory. These schemes are designed to be secure against attacks from quantum computers.
- Key generation: In lattice-based cryptographic systems, keys are typically generated based on lattice problems. For example, in a public-key encryption scheme, the public key might be a matrix that defines a lattice, and the private key might be a short vector in that lattice.
- Encryption and decryption: These processes are designed around the hardness of lattice problems. For example, in an LWE-based encryption scheme:
  - Encryption: The message is encoded as a point in the lattice, and a small error vector is added to it. The result is the ciphertext.
  - Decryption: The private key (a short vector) is used to remove the error and recover the original message.
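The LWE scheme described above can be sketched as a toy single-bit cipher in Python. The parameters are far too small to be secure; the sketch only shows the mechanics of hiding a bit among small errors and removing them with the secret key.

```python
import random

# Toy single-bit LWE encryption. The public key is (A, b = A*s + e mod q);
# encryption sums a random subset of samples and hides the bit in the
# high-order half of the modulus. Parameters are illustrative only.
q, n, m = 97, 4, 20  # modulus, secret length, number of samples

s = [random.randrange(q) for _ in range(n)]        # private key
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]  # small error terms
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
# public key: (A, b)

def encrypt(bit):
    subset = [i for i in range(m) if random.random() < 0.5]
    c1 = [sum(A[i][j] for i in subset) % q for j in range(n)]
    c2 = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return c1, c2

def decrypt(c1, c2):
    # c2 - c1*s = bit*(q//2) + small accumulated error (mod q);
    # values near q//2 decode to 1, values near 0 decode to 0.
    v = (c2 - sum(c1[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < v < 3 * q // 4 else 0

for bit in (0, 1):
    assert decrypt(*encrypt(bit)) == bit
print("toy LWE round trip OK")
```

Correctness holds because the accumulated error (at most m = 20 here) stays below q/4, so rounding always recovers the bit; security in real schemes comes from choosing parameters where distinguishing (A, b) from random is believed hard even for quantum computers.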
Hash function security properties
- Preimage resistance: Given a hash value y, it is computationally infeasible to find an input x such that H(x)=y.
- Second preimage resistance: Given an input x, it is computationally infeasible to find another input x′≠x such that H(x)=H(x′).
- Collision resistance: It is computationally infeasible to find two distinct inputs x and x′ such that H(x)=H(x′).
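A small experiment shows why output length underpins these properties: by the birthday bound, a t-bit hash admits a collision after roughly 2^(t/2) attempts, so truncating SHA-256 to 16 bits makes collisions instant, while the full 256-bit output would require about 2^128 attempts.

```python
import hashlib

# Brute-force a collision on SHA-256 truncated to 16 bits. The birthday
# bound says ~2^8 = 256 attempts suffice on average; the full 256-bit
# output would need ~2^128 attempts, which is infeasible.
TRUNC_BYTES = 2  # keep only the first 16 bits of each digest

seen = {}
i = 0
while True:
    h = hashlib.sha256(str(i).encode()).digest()[:TRUNC_BYTES]
    if h in seen:
        print(f"collision: inputs {seen[h]} and {i} share digest {h.hex()}")
        break
    seen[h] = i
    i += 1
```

Grover's Algorithm attacks preimage resistance quadratically (2^n down to 2^(n/2)), which is why hash-based PQC schemes such as SPHINCS+ pick hash output lengths with that speedup already priced in.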
Isogeny: A morphism between two elliptic curves that is also a group homomorphism. In simpler terms, it is a mapping that takes points on one elliptic curve to points on another elliptic curve while preserving the group structure.
PQC typically relies on mathematical problems that are believed to be hard even for quantum computers, such as lattice and hash problems.
- Lattice-based cryptography: This is based on lattice theory, using the shortest vector problem and the learning with errors problem, which are hard to compute.
- Code-based cryptography: This is based on the hardness of decoding error-correcting codes, most notably in the McEliece cryptosystem.
- Multivariate polynomial cryptography: This is based on the hardness of solving systems of multivariate quadratic equations over finite fields, which is NP-hard, making it a strong candidate for cryptographic applications that are resistant to quantum attacks.
- Hash-based cryptography: This is a type of post-quantum cryptography that relies on the security of cryptographic hash functions using properties such as collision resistance and preimage resistance.
- Isogeny-based cryptography: This is a type of post-quantum cryptography that relies on the mathematical properties of elliptic curves and the concept of isogenies.
