NCSC Releases Post-Quantum Cryptography Timeline

Source: www.schneier.com – Author: Bruce Schneier

Comments

Bernie March 21, 2025 9:08 AM

I see the potential for this to backfire as many folks roll their own cryptography.

Clive Robinson March 21, 2025 10:48 AM

@ Bernie, ALL,

With regards,

“I see the potential for this to backfire as many folks roll their own cryptography.”

Yup, and some people are waving red flags about as well as ringing the bell…

Have a read of,

Post-Quantum Cryptography Is About The Keys You Don’t Play

https://soatok.blog/2025/03/17/post-quantum-cryptography-is-about-the-keys-you-dont-play/

“Post-Quantum Cryptography is coming. But in their haste to make headway on algorithm adoption, standards organizations (NIST, IETF) are making a dumb mistake that will almost certainly bite implementations in the future.

Sophie Schmieg wrote about this topic at length and Filippo Valsorda suggested we should all agree to only use Seeds for post-quantum KEMs.”

Who? March 21, 2025 11:10 AM

@ Bernie, Clive Robinson, ALL

OpenSSH has for years been using a combination of classical and post-quantum cryptography for the KEX stage; this way, if our chosen post-quantum algorithm is shown to be weak, we at least have the old classical algorithms backing our communications.

I would not “just move” to post-quantum cryptography; I would add a second encryption layer while keeping classical algorithms that are well tested in the field.
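A minimal sketch of that combining step in Python (the function name, the context label, and the choice of SHA-256 are illustrative only, not from any particular standard): the session key is derived from both shared secrets, so an attacker has to break both algorithms to recover it.

    # Hybrid secret combiner sketch: the final key depends on BOTH the
    # classical and the post-quantum shared secrets, so breaking one
    # algorithm alone gains an attacker nothing.
    import hashlib

    def combine_secrets(classical_ss: bytes, pq_ss: bytes,
                        context: bytes = b"example-hybrid-v1") -> bytes:
        # Domain-separated hash over both secrets. Deployed designs such as
        # X-Wing (mentioned later in the thread) also bind the public keys
        # and ciphertexts into this hash.
        return hashlib.sha256(context + classical_ss + pq_ss).digest()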

Hendrik Visage March 21, 2025 11:17 AM

May I ask the obvious question that nobody seems to ANSWER: what algorithms have been broken/cracked/etc. by “brute force”/etc. using quantum technology, and how long did it take? Has it been done repeatedly, in reasonable times?

Just asking, before the scare.

Clive Robinson March 21, 2025 11:42 AM

@ Bruce, ALL,

Is “Quantum Computing” (QC) overrated?

The hype around “Quantum Computing” is fairly well known, for various reasons. Not least because of two issues, the first being:

“Classical and quantum researchers compete using different strategies, with a healthy rivalry between the two. Quantum researchers report a fast way to solve a problem —often by scaling a peak that no one thought worth climbing— then classical teams race to see if they can find a better way.

This contest almost always ends as a virtual tie: When researchers think they’ve devised a quantum algorithm that works faster or better than anything else, classical researchers usually come up with one that equals it.”

From: https://www.quantamagazine.org/quantum-speedup-found-for-huge-class-of-hard-problems-20250317/

It raises the question,

“What practical mass market benefit will QC have?”

Which, if the “little to none” view holds, would in effect kill it off as a commercial activity.

But then there is the second issue: the “metric shit-ton” of hardware problems that has kept us down to just a handful of qubits, barely enough to factor a two-digit number. Whereas best-guess estimates say we will probably need, as a minimum, hundreds of thousands to millions of qubits.

But a more abstract issue also arises,

“Do we know enough?”

There is something called “Quantum Cryptography” that has its roots in an idea from the 1960s for “quantum money” that would be unforgeable. That idea eventually gave rise to the protocol many call BB84, which proved that Quantum Cryptography was practical.

Since then the Chinese, amongst others, have pushed it to the point where it can provide secure key distribution without a second channel. Known as “Quantum Key Distribution” (QKD), it can provide what is considered absolute security spanning around one third of the way around the globe…

But we need more than “key distribution”, and that is where other questions arise,

https://www.quantamagazine.org/cryptographers-discover-a-new-foundation-for-quantum-secrecy-20240603/

The thing about QKD is that not only is it inexpensive in comparison to “Quantum Computing” (QC); at the rate it is progressing technically, it will be globe-spanning in a very practical way before QC is anything more than a very, very expensive lab curiosity. And QC’s exorbitantly expensive “consumables” cost will probably keep it from ever being commercially viable.

Clive Robinson March 21, 2025 11:55 AM

@ Who?,

I’m all in favour of “chained cryptography” as a sensible security measure.

But that apparently makes me an “odd duck”.

NIST, amongst others, apparently dislikes the idea of “hybrid cryptography” using both pre- and post-QC algorithms.

The obvious question of,

“Why?”

is one I’ve not really been able to get to the bottom of, due to an excess of hand-waving blocking the view…
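For concreteness, a toy sketch of what “chained cryptography” can look like, using the pyca/cryptography package (the function names and the choice of layers are illustrative only):

    # Two independent layers with independent random keys: an attacker must
    # break BOTH AES-GCM and ChaCha20-Poly1305 to read the message.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

    def chain_encrypt(msg: bytes):
        k1, n1 = os.urandom(32), os.urandom(12)   # inner layer key and nonce
        k2, n2 = os.urandom(32), os.urandom(12)   # outer layer key and nonce
        inner = AESGCM(k1).encrypt(n1, msg, None)
        outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)
        return (k1, n1, k2, n2), outer

    def chain_decrypt(keys, outer: bytes) -> bytes:
        k1, n1, k2, n2 = keys
        inner = ChaCha20Poly1305(k2).decrypt(n2, outer, None)
        return AESGCM(k1).decrypt(n1, inner, None)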

Bernie March 21, 2025 8:14 PM

@ Clive,

Welcome to Ducks Anonymous, Odd. Folks around here call me Old. I remember looking forward to when computer systems were finally powerful enough that UIs were always fully responsive. Since I am still waiting, I think that folks should start calling me Naive instead of Old.[1]

While chained cryptography is sensible, computers are never going to be powerful enough for the basics like strong encryption. At least my (unwanted) AI assistant will be able to tell me why my UI is frozen, probably blaming Grace Hopper’s bug.

Also, thanks for that link. Nice read.

[1] If you say anything about it being 5-10 years away, your sentence better mention cold fusion and room temperature superconductors too.

Soatok March 21, 2025 11:36 PM

@ Who?

I would not “just move” to post-quantum cryptography; I would add a second encryption layer while keeping classical algorithms that are well tested in the field.

This is often called “hybrid” in the PQC discussions, and it’s a sensible thing to do.

Unfortunately, there are a lot of details with which we can hide devils. https://durumcrustulum.com/2024/02/24/how-to-hold-kems/

(Deirdre Connolly is the author of X-Wing, a hybrid KEM that uses ML-KEM-768 and X25519, which is what I plan to use in my projects.)

Clive Robinson March 22, 2025 12:29 AM

@ Bernie,

With regards,

“Folks around here call me Old.”

Funny thing… Most every morning I look in the mirror… My brain looks out with the thoughts of a twenty-something-year-old, and is shocked to see a grumpy old git looking back… And it’s all downhill from there for the rest of the day 🙁

Why I make the same mistake nearly every morning I don’t know; I guess it’s an “old habit that’s hard to break”.

With regards,

“I remember looking forward to when computer systems were finally powerful enough that UIs were always fully responsive.”

To which I say

“Young man the 8bit CLI is where you should look!”

I have a genuine Apple ][ that I bought back in the late 1970s, and the editor I use on it still has a faster key-press-to-screen-display time than any current commercial Windows-based OS on hardware less than three years old…

There is a lesson in there… which is why I run a stripped-down Linux on a laptop that shipped with MS Win 7 (according to the sticker underneath). Sadly, getting 32-bit Linux is no longer a case of,

“Go to the newsagent and buy a magazine with a CD/DVD on the front.”

Which was a “more secure” way than an Internet download.

I do run multiple X terminals in a windowing environment and take the speed hit for the convenience of having half a dozen files open to see and cut-n-paste from (yup, a bad habit that is almost as old as my back teeth).

But consider your statement,

“… computers are never going to be powerful enough for the basics like strong encryption.”

Back in the 1970s my Apple ][ was, and still is, more than powerful enough for “strong” encryption, in that it could happily use an XOR or add-mod-256 “mix algorithm” to combine “plaintext” with OTP key material read from a floppy-disk file.
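The “mix” step really is that small. A sketch (the function name is ours):

    # One-time-pad mixing: XOR each data byte with one pad byte. The pad
    # must be truly random, at least as long as the data, and never reused;
    # the same function also decrypts.
    def xor_mix(data: bytes, pad: bytes) -> bytes:
        assert len(pad) >= len(data), "pad must not be shorter than the data"
        return bytes(d ^ p for d, p in zip(data, pad))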

More modern encryption like AES uses a “highly complex” “mix function” in lots of rounds…

However, run AES in “counter mode” to generate a pseudo-OTP that you write to a file, and you in effect,

“Take the complexity out of synchronous On-Line encryption and run it asynchronously off-line.”
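A sketch of that off-line approach, again using the pyca/cryptography package (names are ours; encrypting zero bytes in CTR mode yields the raw keystream):

    # Generate a pseudo-OTP keystream ahead of time with AES-CTR; later, a
    # much weaker machine can mix it with plaintext using the xor_mix()
    # sketch above.
    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def make_keystream(key: bytes, nonce: bytes, length: int) -> bytes:
        enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
        return enc.update(b"\x00" * length) + enc.finalize()

    key, nonce = os.urandom(32), os.urandom(16)
    pad = make_keystream(key, nonce, 1 << 20)  # 1 MiB of keystream, made off-line
    # later, off-line: ciphertext = xor_mix(plaintext, pad)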

Or you can spread the complexity across multiple parallel compute engines (which is what we did back when 8-bit CPUs ran below a 4 MHz clock rate).

Thus it can be seen that the issue is not actually “strong” encryption but “managing the complexity” of the encryption algorithm when it is being used synchronously “On-Line”.

Thus arises the question of splitting the “complexity” into a “chain” of “less complex” algorithmic parts, which in effect is what 3DES was all about (only they kept the complexity “Synchronous and On-Line”).

Some years ago now (last century 😉) Prof. Ross Anderson wrote a paper on doing something similar,

https://link.springer.com/content/pdf/10.1007/3-540-60865-6_48.pdf

Which led to another paper by Pat Morin analysing those ciphers,

https://cglab.ca/~morin/publications/crypto/aardvark-sac.pdf

(Pat also has work on “random algorithms” that is interesting in a similar but different respect. Cryptanalysis often relies on “a fixed structure” to do things like “messages in depth” attacks. Randomising the order of the sub-algorithms can be shown to keep the desired computational complexity but in effect give a new final algorithm each time, thus limiting such attacks; a toy sketch of the idea follows.)
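A toy sketch of that randomised-order idea (the sub-algorithms here are deliberately trivial placeholders with no cryptographic strength of their own):

    # Derive the round order from the key, so every key yields a different
    # composite function while the set of sub-algorithms stays fixed.
    import hashlib, random

    def add13(b: bytes) -> bytes:          # placeholder sub-algorithm 1
        return bytes((x + 13) % 256 for x in b)

    def invert(b: bytes) -> bytes:         # placeholder sub-algorithm 2
        return bytes(x ^ 0xFF for x in b)

    def swap_nibbles(b: bytes) -> bytes:   # placeholder sub-algorithm 3
        return bytes(((x << 4) | (x >> 4)) & 0xFF for x in b)

    def keyed_chain(key: bytes, data: bytes) -> bytes:
        rounds = [add13, invert, swap_nibbles]
        random.Random(hashlib.sha256(key).digest()).shuffle(rounds)
        for f in rounds:
            data = f(data)
        return data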

I hope you find the papers interesting.

Original Post URL: https://www.schneier.com/blog/archives/2025/03/ncsc-releases-post-quantum-cryptography-timeline.html

Category & Tags: Uncategorized, cryptography, quantum cryptography, UK
