
Post-Quantum Cryptography Transition: A Practical Guide for Engineering Teams

NIST has finalized its post-quantum standards. Here's a hands-on guide for engineering teams beginning the migration from classical to quantum-resistant cryptography.

Yukti Singhal
Security Researcher
5 min read

NIST's post-quantum cryptography standardization is no longer a future event. The standards were finalized in August 2024: FIPS 203 defines ML-KEM (formerly CRYSTALS-Kyber) for key encapsulation, and FIPS 204 defines ML-DSA (formerly CRYSTALS-Dilithium) for digital signatures. The question is no longer "what algorithms?" but "how and when do we migrate?"

For most engineering teams, this migration will be more complex than any previous cryptographic transition. Y2K had a single deadline and a clear technical scope. The PQC transition involves replacing fundamental security primitives across every system that communicates, authenticates, or signs data.

Understanding What Needs to Change

Before diving into how to migrate, teams need a clear picture of what's affected.

TLS connections are the most visible target. Every HTTPS connection, every API call, every microservice communication relies on key exchange and authentication that a sufficiently large quantum computer could break. The good news: TLS library updates will handle most of this. The bad news: you need to verify that every client and server in your ecosystem supports the new algorithms.

Code signing and artifact verification affects your entire software supply chain. Build artifacts, container images, packages, and SBOMs are all signed with algorithms that will be vulnerable. Migration here is complicated by backward compatibility: artifacts already signed with classical algorithms must continue to verify even after new artifacts are signed with PQC.

Certificates and PKI are the deepest infrastructure change. Your certificate authority hierarchy, from root CAs to leaf certificates, needs to transition to PQC algorithms. Certificate sizes increase significantly with PQC algorithms, which affects protocol performance and storage.

Application-level cryptography includes anything your application does directly with cryptographic primitives: encrypting data at rest, generating JWTs, creating HMAC signatures, or performing key derivation.

Key management systems that store and distribute cryptographic keys need to support new key types. HSMs may need firmware updates or replacement to support PQC algorithms.

The Migration Strategy

Phase 1: Inventory and Assessment (Start Now)

You cannot migrate what you don't know about. Create a comprehensive inventory of every cryptographic algorithm used across your systems.

Automated tools can scan codebases for cryptographic API calls. Look for imports of crypto libraries, direct algorithm references, and configuration files that specify cipher suites. Don't forget infrastructure: load balancer configurations, database TLS settings, and VPN configurations all specify algorithms.
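A minimal scanner along these lines can be built with nothing but the standard library. The patterns and file suffixes below are illustrative starting points, not an exhaustive ruleset; real inventories should also parse cipher-suite configuration and dependency manifests.

```python
import re
from pathlib import Path

# Illustrative patterns mapping a regex to the algorithm family it indicates,
# so findings can be triaged later. Extend these for your own stack.
CRYPTO_PATTERNS = {
    r"\b(RSA|rsa\.generate_private_key)\b": "asymmetric (must migrate)",
    r"\b(ECDSA|ECDH|SECP256R1|X25519|Ed25519)\b": "asymmetric (must migrate)",
    r"\b(AES|ChaCha20)\b": "symmetric (check key size)",
    r"\b(SHA-?256|SHA-?384|SHA-?3)\b": "hash (generally fine)",
}

def scan_file(path: Path) -> list[tuple[int, str, str]]:
    """Return (line_number, matched_text, category) hits for one file."""
    hits = []
    for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
        for pattern, category in CRYPTO_PATTERNS.items():
            m = re.search(pattern, line)
            if m:
                hits.append((lineno, m.group(0), category))
    return hits

def scan_tree(root: Path, suffixes=(".py", ".go", ".java", ".yaml", ".conf")) -> dict:
    """Walk a source tree and collect crypto references per file."""
    report = {}
    for path in root.rglob("*"):
        if path.is_file() and path.suffix in suffixes:
            hits = scan_file(path)
            if hits:
                report[str(path)] = hits
    return report
```

Even a crude scanner like this surfaces the long tail of crypto usage that nobody remembers writing, which is the point of Phase 1.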

Categorize findings by:

  • Algorithm type: Asymmetric (must migrate), symmetric (may need larger keys), hash (generally fine)
  • Risk level: Internet-facing services first, then internal services, then data at rest
  • Migration complexity: Library update vs. application code change vs. protocol redesign
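Those three axes translate directly into a migration queue. The sketch below uses hypothetical category names; the idea is simply that quantum-vulnerable asymmetric findings are ordered by exposure first, then by cheapest fix first.

```python
from dataclasses import dataclass

# Hypothetical triage model mirroring the three axes above.
RISK_ORDER = {"internet-facing": 0, "internal": 1, "data-at-rest": 2}
EFFORT_ORDER = {"library-update": 0, "code-change": 1, "protocol-redesign": 2}

@dataclass
class Finding:
    component: str
    algorithm: str
    algorithm_type: str   # "asymmetric" | "symmetric" | "hash"
    risk: str             # a key of RISK_ORDER
    effort: str           # a key of EFFORT_ORDER

def migration_queue(findings: list[Finding]) -> list[Finding]:
    """Asymmetric findings only, highest exposure first, cheapest fix first."""
    must_migrate = [f for f in findings if f.algorithm_type == "asymmetric"]
    return sorted(must_migrate,
                  key=lambda f: (RISK_ORDER[f.risk], EFFORT_ORDER[f.effort]))
```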

Phase 2: Crypto Agility (Start Now)

If your code hardcodes algorithm names, fix this before attempting PQC migration. Crypto agility means abstracting algorithm selection behind configuration or negotiation so that changing algorithms doesn't require code changes.

In practice:

  • Use library APIs that accept algorithm identifiers as parameters, not functions named after specific algorithms
  • Store algorithm metadata alongside encrypted data so decryption can select the right algorithm
  • Design protocols with algorithm negotiation so peers can upgrade independently
  • Test with multiple algorithm configurations to ensure your abstraction actually works
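The first two points above can be sketched in a few lines: a registry keyed by algorithm identifier, plus an envelope that stores that identifier alongside the protected data. This is a toy illustration, not a production design; HMAC variants stand in for signature schemes because the Python standard library ships them, and `seal`/`open_sealed` are hypothetical names.

```python
import base64
import hashlib
import hmac
import json

# Registry keyed by algorithm identifier. Adding a PQC scheme later means
# registering a new entry, not changing calling code.
MAC_REGISTRY = {
    "HS256": hashlib.sha256,
    "HS384": hashlib.sha384,
}

def seal(payload: bytes, key: bytes, alg: str = "HS256") -> bytes:
    """Attach algorithm metadata so the verifier can pick the right primitive."""
    digestmod = MAC_REGISTRY[alg]
    tag = hmac.new(key, payload, digestmod).hexdigest()
    envelope = {"alg": alg, "payload": base64.b64encode(payload).decode(), "tag": tag}
    return json.dumps(envelope).encode()

def open_sealed(blob: bytes, key: bytes) -> bytes:
    """Select the algorithm from the stored metadata, then verify."""
    env = json.loads(blob)
    digestmod = MAC_REGISTRY[env["alg"]]
    payload = base64.b64decode(env["payload"])
    expected = hmac.new(key, payload, digestmod).hexdigest()
    if not hmac.compare_digest(expected, env["tag"]):
        raise ValueError("integrity check failed")
    return payload
```

Because the algorithm identifier travels with the data, old envelopes keep verifying under their original algorithm while new envelopes use whatever the registry's current default is.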

Phase 3: Hybrid Deployment (2024-2025)

The recommended transition approach is hybrid: use both classical and post-quantum algorithms simultaneously, so that an attacker must break both. The post-quantum algorithm protects against future quantum attacks, while the battle-tested classical algorithm provides a fallback in case a weakness is found in the newer PQC scheme.

For TLS, this means hybrid key exchange combining X25519 (classical ECDH) with ML-KEM-768 (post-quantum). Chrome and Firefox already support this. Cloudflare and AWS have deployed it. Your servers should start supporting it.
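The core idea is simple: both shared secrets are concatenated and fed through a key derivation function, so recovering the session key requires breaking both exchanges. This is a simplified sketch using a SHA-256 HKDF; the real TLS construction runs the combined secret through the TLS 1.3 key schedule, and `hybrid_secret` is a hypothetical name.

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """RFC 5869 extract step with HMAC-SHA256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """RFC 5869 expand step with HMAC-SHA256."""
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def hybrid_secret(classical_ss: bytes, pq_ss: bytes,
                  context: bytes = b"hybrid-kex-sketch") -> bytes:
    """Combine an X25519-style secret and an ML-KEM-style secret.

    An attacker must recover BOTH inputs to derive the output key.
    """
    prk = hkdf_extract(b"\x00" * 32, classical_ss + pq_ss)
    return hkdf_expand(prk, context)
```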

For signatures, the approach is more complex. Dual-signing with both ECDSA and ML-DSA means larger signatures, which affects performance. Start with high-value signatures (code signing, root certificates) and expand as tooling matures.
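Structurally, dual-signing is an AND composition: the envelope carries one signature per scheme, and verification requires all of them to pass, so breaking either scheme alone does not let an attacker forge artifacts. In the sketch below, HMAC stands in for both ECDSA and ML-DSA (the stdlib has no asymmetric signing); the composition logic is what matters, and `dual_sign`/`dual_verify` are hypothetical names.

```python
import hashlib
import hmac

def dual_sign(artifact: bytes, keys: dict) -> dict:
    """Produce one signature per registered scheme over the same artifact."""
    return {
        "artifact_sha256": hashlib.sha256(artifact).hexdigest(),
        "signatures": {
            scheme: hmac.new(key, artifact, hashlib.sha256).hexdigest()
            for scheme, key in keys.items()
        },
    }

def dual_verify(artifact: bytes, envelope: dict, keys: dict) -> bool:
    """AND composition: every scheme must verify, or the artifact is rejected."""
    for scheme, key in keys.items():
        expected = hmac.new(key, artifact, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, envelope["signatures"].get(scheme, "")):
            return False
    return True
```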

Phase 4: Full PQC Deployment (2025-2030)

As confidence in PQC algorithms grows and the industry ecosystem matures, transition from hybrid to PQC-only for new deployments. Maintain hybrid support for backward compatibility with systems that haven't migrated.

The timeline depends on your risk profile. Organizations handling data that must remain confidential for decades (government, healthcare, financial) should push toward earlier full PQC deployment. Organizations with shorter data sensitivity windows have more time.

Practical Challenges

Performance impact. PQC algorithms have different performance profiles than classical algorithms. ML-KEM key generation is fast, but keys and ciphertexts are larger: roughly 1 KB each for ML-KEM-768, versus 32 bytes for an X25519 key share. ML-DSA signatures are significantly larger than ECDSA signatures. Test the performance impact in your specific workloads before deploying broadly.

Certificate size. An ML-DSA certificate is roughly 10x larger than an ECDSA certificate. In TLS, this means larger handshakes. For systems that transmit many certificates (mTLS with large certificate chains), this can be a meaningful performance regression.
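To make the gap concrete, the figures below use the parameter sizes published in FIPS 204 for ML-DSA-65 and typical encodings for ECDSA P-256. These are lower bounds: real certificates add names, extensions, and encoding overhead on top of the raw key material.

```python
# Approximate public key and signature sizes in bytes.
# ML-DSA-65 figures are from FIPS 204; ECDSA figures reflect a P-256
# uncompressed point and a DER-encoded signature.
SIZES = {
    "ECDSA-P256": {"public_key": 65, "signature": 72},
    "ML-DSA-65": {"public_key": 1952, "signature": 3309},
}

def cert_crypto_bytes(alg: str) -> int:
    """Raw crypto material a certificate carries: its public key plus the issuer's signature."""
    s = SIZES[alg]
    return s["public_key"] + s["signature"]

def chain_overhead(alg: str, chain_length: int = 3) -> int:
    """Crypto bytes transmitted for a full chain, e.g. leaf + intermediate + root."""
    return cert_crypto_bytes(alg) * chain_length
```

For a typical three-certificate chain, the raw crypto material alone grows from well under a kilobyte to over 15 KB, which is why mTLS deployments feel this change first.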

HSM and hardware support. Many HSMs don't yet support PQC algorithms. If your cryptographic operations are bound to hardware, the migration requires hardware upgrades, which means procurement, qualification testing, and deployment cycles measured in months or years.

Compliance and certification. Some industries require certified cryptographic implementations (FIPS 140-3, Common Criteria). PQC implementations are still going through certification processes. Regulated organizations may need to wait for certified libraries before deploying.

Ecosystem readiness. Your migration is only as fast as your slowest dependency. If a critical library doesn't support PQC, you're blocked until it does. Track PQC support across your dependency tree.

How Safeguard.sh Helps

Safeguard.sh supports the PQC transition by providing the cryptographic inventory visibility that teams need for Phase 1. Our SBOM management tracks the cryptographic libraries and algorithms used across your entire software portfolio, showing you exactly where classical algorithms need to be replaced.

As you migrate, Safeguard.sh's policy gates can enforce cryptographic requirements: ensuring new components use PQC or hybrid algorithms, flagging dependencies that still rely on quantum-vulnerable cryptography, and maintaining audit trails of your migration progress. The transition to post-quantum cryptography is a multi-year effort that requires systematic tracking. Safeguard.sh provides the management layer that keeps the migration on track across teams, services, and dependencies.
