https://blog.computationalcomplexity.org/2026/03/bennett-and...
https://www.quantamagazine.org/quantum-cryptography-pioneers...
https://www.nytimes.com/2026/03/18/technology/turing-award-w...
https://www.wired.com/story/a-quantum-leap-for-the-turing-aw...
matthewdgreen2 days ago
I don't want to take anything away from Bennett and Brassard, but I'd like someone to spare a word for poor Stephen Wiesner, who invented the earliest quantum information-distribution protocols as far back as the 1960s and published them before Bennett and Brassard. He also invented Oblivious Transfer (OT), which is required for multi-party computation -- although his was a quantum protocol that demonstrated some of the ideas behind QKD, not the classical protocol we call OT today [1].* Wiesner was an inspiration for Bennett and Brassard, who then realized more useful systems.
While obviously this takes nothing away from BB's many later contributions (and they have extensively credited him), it's just a reminder of the randomness that goes with scientific credit. Since my PhD thesis was on OT, I like to remind people of Wiesner. He deserves a lot more credit than he gets!
* I suppose if you're a real theoretician, since OT implies MPC and MPC implies all cryptography, then perhaps Wiesner's OT implies everything that BB did subsequently. I'm not sure any of that is true (I've since checked with an LLM, and there are some no-go theorems from the 1990s that block it, which is super interesting).
Q_is_4_Quantum18 hours ago
Actually, you can't compose quantum crypto protocols like you can classical ones - the composed protocol needs a new security analysis. Entanglement across protocols often kills the composition!
Interestingly (to me!) it took a while in the 90’s/early 00’s for the community to realise that there are distinct questions:
Question A: Does there exist a set of target states and measurements that implement the task?
Question B: Can mistrustful parties find a communication protocol that securely (from their perspectives) creates/implements those states/measurements?
An example where the answer to A is “no” is fully secure oblivious transfer. There were a bunch of misguided papers trying to find communication protocols for OT, but they were doomed from the start!
An example where the answer to A is “yes” but to B is “no” is strong coin flipping. And an example where the answer to both is “yes” is weak coin flipping. (See Carlos Mochon's magnum opus, arXiv 0711.4114, for the coin-flipping examples.)
I first articulated the distinction between A and B in quant-ph/0202143, but left the proof about OT and Question A as an exercise to the reader! Roger Colbeck, in arXiv 0708.2843, provided a simple proof and, I think, elucidated the whole situation a lot.
abdullahkhalids2 days ago
Don't forget about William Wootters, who also did significant work in the 1980s on quantum information. Most notably, with Zurek, he proved the quantum no-cloning theorem in 1982. This result is at the same foundational level as energy conservation or the constancy of the speed of light.
He was also part of the quantum teleportation discovery in 1993.
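The no-cloning theorem is, at heart, a short linearity argument, and it can be checked numerically. A minimal sketch (plain numpy; illustrative only, not from the 1982 paper): a CNOT "copies" basis states into a blank qubit, but by linearity it turns a superposition into an entangled Bell state rather than two copies.

```python
import numpy as np

# Computational basis states and the |+> superposition
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
plus = (zero + one) / np.sqrt(2)

# CNOT on two qubits, control first, basis order |00>,|01>,|10>,|11>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# CNOT|x>|0> = |x>|x> for basis states x in {0, 1}: cloning works here
for basis in (zero, one):
    copied = CNOT @ np.kron(basis, zero)
    assert np.allclose(copied, np.kron(basis, basis))

# But linearity forbids cloning superpositions: CNOT|+>|0> is the
# entangled Bell state (|00>+|11>)/sqrt(2), not the product |+>|+>.
entangled = CNOT @ np.kron(plus, zero)
product = np.kron(plus, plus)
print(np.allclose(entangled, product))  # False
```

Any candidate cloner is linear, so the same mismatch appears: agreeing on basis states forces disagreement on their superpositions.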
Q_is_4_Quantuma day ago
Asher Peres told me that Bill Wootters should be given 99% of the credit for the teleportation discovery (and this is in the context that most of us around at the time presumed the majority of the credit should go to Peres and Wootters who had already been discussing publicly very similar stuff).
abdullahkhalids4 hours ago
Interesting to hear, but also not surprising. His thinking was so far ahead of its time.
NooneAtAll32 days ago
> and I've since checked with an LLM
but did you recheck it yourself, or are you trusting an unreliable narrator?
matthewdgreena day ago
I only checked the abstracts, and they seem consistent. Good LLMs (Claude Opus, ChatGPT Pro) still get things wrong regularly, but lately I've noticed these are mainly the deep details, not easy things like "there is a result that claims X."
lkm02 days ago
If you enjoy reading about undervalued scientists, check out the life of Ernst Stückelberg, who missed out on 4 to 5 Nobel prizes because he mostly published in unknown journals. https://blogg.perostborn.com/2023/03/22/hes-not-so-easily-st...
throwaway815232 days ago
Wiesner was quite a character but he died a few years ago, so wouldn't have been eligible for this year's award.
spot50102 days ago
As a young grad student, I remember going to a talk by Bennett where he explained how a quantum computer allows manipulation in a 2^N-dimensional Hilbert space, while the output measurements give you only N bits of information. The trick is to somehow encode the result in those final N bits.
I felt this was a much better layman explanation of what a quantum computer does than simply saying a quantum computer runs all possible paths in parallel.
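That picture (2^N amplitudes in, N bits out) can be sketched as a toy statevector simulation. Everything below is an illustrative assumption, not Bennett's actual presentation: a random N-qubit state carries 2^N complex amplitudes, yet a full measurement collapses it to a single N-bit string.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 4              # qubits
dim = 2 ** N       # 16 complex amplitudes to track

# A random normalized state in the 2^N-dimensional Hilbert space
amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
amps /= np.linalg.norm(amps)

# Measuring all N qubits samples one basis index with probability
# |amplitude|^2, collapsing 2^N complex numbers to N classical bits.
probs = np.abs(amps) ** 2
outcome = int(rng.choice(dim, p=probs))
bits = format(outcome, f"0{N}b")

print(f"{dim} amplitudes in, {len(bits)} bits out: {bits}")
```

The art of quantum algorithm design is arranging interference so that the N bits you do get out answer the question you asked.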
aleph_minus_one2 days ago
> I felt this was a much better layman explanation of what a quantum computer does than simply saying a quantum computer runs all possible paths in parallel.
Relevant concerning your point:
> "The Talk"
hammock2 days ago
That comic is great; I understand qubits a bit better now. A qubit state has 4 degrees of freedom but can be mapped onto the 2D surface of a sphere, because normalization (the circle rule) and global phase symmetry each take away one of the four DOF.
I need a longer think on the interference/computation connection though
dogtimeimmortal2 days ago
Thanks for this! I guess I need to read up on Hilbert space.
...and Shor's Algorithm
amemi2 days ago
Don't let the terminology intimidate you. The interesting ideas in quantum computing depend far more on a foundation in linear algebra than on one in mathematical analysis.
When I started out, I was under the assumption that I had to understand at least the undergraduate real analysis curriculum before I could grasp quantum algorithms. In reality, for the main QC algorithms you see discussed, you don't need to understand completeness; you can just treat a Hilbert space as a finite-dimensional vector space with a complex inner product.
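That finite-dimensional view is tiny in code. A minimal numpy illustration (`inner` is a made-up helper name):

```python
import numpy as np

def inner(u, v):
    """Complex inner product <u|v>: conjugate the first argument."""
    return np.vdot(u, v)  # np.vdot conjugates its first argument

zero = np.array([1, 0], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Norms come from the inner product, and |<u|v>|^2 is a measurement
# probability: reading |0> out of |+> happens with probability ~0.5.
print(abs(inner(zero, plus)) ** 2)
```

That one conjugation is essentially the only change from the dot product you already know.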
For those unfamiliar with said concepts from linear algebra, there is a playlist [1] often recommended here which discusses them thoroughly.
[1] https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2x...
hammock2 days ago
Yeah, all the names and terminology really do make it seem harder than it is. Took me a long time, and I'm still learning. A 2D Hilbert space is the same as a 2D Euclidean space except that each dimension has 2 degrees of freedom (real + imaginary). You might even think of it as a 4D real space, for purposes of imagining the vector, but that would probably be wrong and someone would call you out.
aleph_minus_onea day ago
> ...and Shor's Algorithm
Better start with Simon's algorithm (solving Simon's problem) [0]; it already contains a lot of ideas that you need to understand Shor's algorithm, while not having a lot of technicalities. Then progress to Shor's algorithm, and then to Kitaev's algorithm [1] (link from [2]). The latter solves the Abelian stabilizer problem - this problem contains the more abstract mathematical essence of a lot of quantum algorithms.
[0] https://en.wikipedia.org/wiki/Simon%27s_problem
[1] https://arxiv.org/abs/quant-ph/9511026
[2] https://en.wikipedia.org/wiki/Hidden_subgroup_problem#Instan...
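Simon's problem is small enough to work out the measurement statistics directly on a classical machine. A hedged sketch (a direct computation of the final-state amplitudes, not a circuit simulator; the hidden string `s` is chosen arbitrarily): every observable outcome y satisfies y·s = 0 mod 2, which is the idea the algorithm exploits.

```python
import numpy as np

n = 3
s = 0b101  # hidden string: f(x) = f(x XOR s); chosen arbitrarily

# Build a 2-to-1 oracle with f(x) = f(x ^ s) by pairing x with x ^ s.
f = {}
next_label = 0
for x in range(2 ** n):
    if x not in f:
        f[x] = f[x ^ s] = next_label
        next_label += 1

def dot(a, b):
    """Bitwise inner product mod 2."""
    return bin(a & b).count("1") % 2

# After H^n, the oracle, and H^n again, the amplitude of |y>|c> is
# (1/2^n) * sum over {x : f(x) = c} of (-1)^(x.y).
probs = np.zeros(2 ** n)
for y in range(2 ** n):
    for label in set(f.values()):
        amp = sum((-1) ** dot(x, y) for x in range(2 ** n) if f[x] == label) / 2 ** n
        probs[y] += amp ** 2

# Outcomes y with y.s = 1 interfere destructively and never appear:
print({format(y, "03b"): round(p, 3) for y, p in enumerate(probs) if p > 1e-12})
```

Each run of the real algorithm samples one such y; collecting n-1 independent ones pins down s by linear algebra over GF(2). Shor's algorithm replaces this XOR structure with periodicity under modular multiplication.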
srvmshrop2 days ago
* From the announcement [0]:
ACM has named Charles H. Bennett and Gilles Brassard as the recipients of the 2025 ACM A.M. Turing Award for their essential role in establishing the foundations of quantum information science and transforming secure communication and computing.
* An accessible news excerpt via CNN science [1]
Years before emails, internet banking, cloud servers and cryptocurrency wallets, two scientists devised a way to keep secrets perfectly safe and indecipherable to eavesdropping outsiders.
Their 1984 work depended on the hidden, counterintuitive world of quantum physics, which governs the way the world works at the smallest, subatomic scale, rather than complex but theoretically breakable mathematical codes to secure data.
The insights of Charles Bennett, an American physicist who is a fellow at IBM Research, and Gilles Brassard, a Canadian computer scientist and professor at the University of Montreal, have since transformed cryptography and computing. The pair received the A.M. Turing Award on Wednesday for their groundbreaking work on quantum key cryptography.
[0] https://www.acm.org/media-center/2026/march/turing-award-202...
[1] https://edition.cnn.com/2026/03/18/science/quantum-key-crypt...
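The 1984 protocol (BB84) is simple enough to sketch classically. A minimal toy simulation (no eavesdropper, no error correction or privacy amplification; parameters are arbitrary): Alice sends random bits in random bases, Bob measures in random bases, and sifting on matching bases yields a shared key.

```python
import random

random.seed(42)
n = 64

# Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal)
alice_bits = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]

# Bob measures each photon in a randomly chosen basis. When his basis
# matches Alice's he reads her bit; otherwise his result is random.
bob_bases = [random.randint(0, 1) for _ in range(n)]
bob_bits = [a_bit if a_basis == b_basis else random.randint(0, 1)
            for a_bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: over a public channel they compare bases (not bits) and keep
# only positions where the bases agreed -- about half of them.
sift = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
alice_key = [alice_bits[i] for i in sift]
bob_key = [bob_bits[i] for i in sift]

# With no eavesdropper the sifted keys agree; an intercept-resend
# attacker would introduce ~25% disagreement, revealing the intrusion.
assert alice_key == bob_key
print(f"shared {len(alice_key)}-bit key from {n} photons")
```

The quantum part this sketch elides is exactly why it works: measuring in the wrong basis irreversibly disturbs the photon, so an eavesdropper cannot read and faithfully resend.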
bawolff2 days ago
> Bennett and Brassard, with Ethan Bernstein and Umesh Vazirani, showed that in black-box setting, quantum computers would require big-omega(sqrt(n)) queries to search n entries, matching Grover's algorithm. For some reason, the popular press rarely covers these results that limit the power of quantum computing.
This is mentioned almost as a footnote, but to (layman) me seems much more important than QKD, especially from a comp sci perspective instead of a physics perspective.
shemnon422 days ago
QKD can be sold today.
The quantum computers are not quite large enough to search at an `n` such that `O(n)` is not viable but `O(sqrt(n))` is; that's where there's money to be made, especially if viability is defined by short time horizons. So it's a footnote for the future.
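The sqrt(n) scaling is easy to see in a toy statevector simulation of Grover's algorithm. A minimal sketch (plain numpy; parameters arbitrary) that finds a marked entry among N = 1024 after ~25 iterations, versus ~512 expected classical queries:

```python
import numpy as np

def grover_success(n_qubits, marked, iterations):
    """Probability of measuring `marked` after Grover iterations on N = 2^n items."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))   # uniform superposition
    for _ in range(iterations):
        state[marked] *= -1               # oracle: flip the marked amplitude
        state = 2 * state.mean() - state  # diffusion: inversion about the mean
    return state[marked] ** 2

n_qubits = 10                                       # N = 1024 entries
optimal = int(np.pi / 4 * np.sqrt(2 ** n_qubits))   # ~25 iterations, not ~512
print(optimal, grover_success(n_qubits, marked=3, iterations=optimal))
```

The BBBV result mentioned upthread says this is it: no black-box quantum search can beat that square-root speedup.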
bawolff2 days ago
> QKD can be sold today.
It can, but it isn't, largely because the classical solutions solve the problem better, and you usually have to resort to classical solutions to handle MITM anyway, AFAIK. However, my point is less about practicality and more that QKD seems like a physics or engineering thing rather than a computer science thing.
After all, this is supposed to be a computer science prize, not a make-money prize, so which is more sellable should be beside the point.
shorden2 days ago
Worth noting that this is a bound on unstructured search; there exist some problems with structure (e.g. integer factorization) for which quantum algorithms are superpolynomially faster than the best known classical algorithms (factoring is believed to be in NP and BQP but not in P).
RRRA2 days ago
That's two Turing Awards for the Université de Montréal in six years. Sadly, I never had either of those two as teachers during my years there!
I did see Gilles' lunch talks, though; they were really insightful!
DrNosferatu2 days ago
The math might be beautiful, but I'm very skeptical that practical quantum computers will ever deliver on their promise.
Joel_Mckay2 days ago
Forever is a long time, but I agree that people who assert reality is the model are almost always eventually incorrect.
There is some interesting work being done, but it will never match the excessive hype. =3
"The Genius of Computing with Light"
DrNosferatua day ago
I would say optical computing will be practical well before QC is.
Time will tell.
rramadass2 days ago
Not sure what is going on in QC world; With this ACM prize it has become even more murky.
As Sabine Hossenfelder (a theoretical physicist) points out, companies involved with QC are seeing a surge in investments and marketing. It is as if somebody knows something that the "common public" doesn't - https://www.youtube.com/watch?v=gBTS7JZTyZY
I don't know enough about the science/technology to form an opinion but have recently started down the path of trying to understand it - https://news.ycombinator.com/item?id=46599807
NooneAtAll32 days ago
> It is as if somebody knows something that the "common public" doesn't
oooorrr - and hear me out - investments are inherently hype-based and irrational, and there is too much money flying around to make actually smart decisions
rramadassa day ago
Nope.
Quantum Computing (QC) is unlike previous technologies, which were all mostly "logical structures" (i.e. the underlying physics/technologies were well understood). For QC, the viability of both the core physics itself and its realization through technology is questioned by some physicists/technologists. Yet in 2024/2025 many governments and companies started investing heavily in QC. Moreover, the advanced countries have implemented export controls on QC technology, prohibiting the export of quantum computers above 34 qubits.
And now the ACM prize for something done long ago in quantum information.
Finally note that QC algorithms can be simulated (for small size qubits) on conventional computers and the current AI technologies may also play a part here i.e. implement QC algorithms on the "Cloud supercomputer" and using AI technologies.
The logical inference is that there has been one or more technological breakthroughs in the realization of QC qubit technologies, QC algorithms running efficiently on the cloud, AI usage for QC, etc. Nothing else explains all of the above facts.
See also: The Case Against Quantum Computing by Mikhail Dyakonov (Professor of Physics) - https://spectrum.ieee.org/the-case-against-quantum-computing
NooneAtAll320 hours ago
> Quantum Computing (QC) is unlike previous technologies
aaand you entered the "hype and irrational" territory. I dare you to reread your own comment, it is funny
right now QC is 5 orders of magnitude away from practical systems - there's NO profit to invest for. It's all research that is being hyped and overpromised because there's not enough money in that sector and because established players (like google) don't want to lose their face
viability of core physics does not imply immediate creation of product. I'd point to fusion, but that's also currently getting over-hyped 15-20 years too early
governments are only investing the same way as into particle accelerators - in form of research grants
simulation of QC is both extremely trivial (in "exponentially-slower" way) and existentially impossible (the whole sector would not exist if it was actually possible to use good old normal CPUs fast enough). Bringing in "AI technologies" only shows you as a gullible idiot that still parrots ai bubble without understanding exact details
If there is a breakthrough - it is secret government information, and it would not be available to non-government companies, especially those you can invest into. The moment such breakthroughs reach the market, knowledge of the very existence spreads - and yet all current known progress is dull.
The only evidence worth anything out of what you brought up is the export controls - and those have been extremely pre-emptive, in preparation for geopolitics and far-future tech. Error correction barely started to be useful at 100 qubits, so 34 makes no sense other than to minimize brain drain with base tech
rramadass14 hours ago
> aaand you entered the "hype and irrational" territory. I dare you to reread your own comment, it is funny
You have not understood the first thing about what I pointed out.
> right now QC is 5 orders of magnitude away from practical systems - there's NO profit to invest for. It's all research that is being hyped and overpromised because there's not enough money in that sector and because established players (like google) don't want to lose their face
While there has been hype, in the last couple of years things seem to have changed, now culminating in the awarding of the ACM Turing Award. Do you know anything about the physics/mathematics behind qubits (e.g. probabilities/superposition/phase/noise etc.) and/or how they have been realized via technologies (e.g. superconducting/photonics/trapped ions etc.)? People are looking at "hybrid" quantum computers, i.e. conventional+quantum (e.g. IBM, Fujitsu), and at shuttling qubits on silicon (e.g. Hitachi), which allows existing foundry technology to be used for QC. This is huge.
> viability of core physics does not imply immediate creation of product. I'd point to fusion, but that's also currently getting over-hyped 15-20 years too early
Non sequitur.
> governments are only investing the same way as into particle accelerators - in form of research grants
No, Govts. are actively funding startups in this area and including technology research/transfers in their Free Trade Agreements with other govts.
> simulation of QC is both extremely trivial (in "exponentially-slower" way) and existentially impossible (the whole sector would not exist if it was actually possible to use good old normal CPUs fast enough).
Simulation of QC is not "extremely trivial" but requires HPC technology. Datacenter/cloud technologies are also utilized here. Generally, only around 30-50 qubits have been simulated, with 50+ qubits being exponentially prohibitive in terms of compute power and memory.
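The exponential wall around 50 qubits is easy to estimate. A back-of-the-envelope sketch (assuming the usual convention of 16 bytes per double-precision complex amplitude): a statevector simulator must store 2^n amplitudes, so memory doubles with every added qubit.

```python
# Statevector simulation stores 2^n complex amplitudes (16 bytes each
# at double precision), so memory doubles with every added qubit.
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

30 qubits fit on a workstation (16 GiB), 40 need a large cluster (16 TiB), and 50 are already in the tens of petabytes, which is why full statevector simulation stalls in that range.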
>Bringing in "AI technologies" only shows you as a gullible idiot that still parrots ai bubble without understanding exact details
To use your own language; this right here shows that you are just a clueless idiot about this domain. AI is a tool applied to various domains eg. AlphaFold for protein structures in Biology which solved an almost intractable problem. People are doing the same with QC+AI. There are a bunch of papers on this; for your edification start with Quantum Computing and Artificial Intelligence: Status and Perspectives - https://arxiv.org/abs/2505.23860
> If there is a breakthrough - it is secret government information, and it would not be available to non-government companies, especially those you can invest into. The moment such breakthroughs reach the market, knowledge of the very existence spreads - and yet all current known progress is dull.
This demonstrates your gullibility. Since one of the best-studied use cases for QC is cryptography, if there has been a breakthrough in some lab (govt/academia/company, all of whom have secrets), the powers-that-be would not want it to be widespread, mainly for security reasons. But hints might have been given and investments encouraged. Almost all QC companies have a govt. tie-up, and cryptographic technologies have been subject to export controls from the very beginning. Another scenario is defense applications. There are plenty more, but these two are the main ones.
> The only evidence worth anything out of what you brought up is the export controls - and those have been extremely pre-emptive in preparation for geopolitics and far future tech. Error-correction barely started to be useful at 100 cubits, so 34 makes no sense other than to minimize brain drain with base tech
That is the obvious, superficial take. Given what I have written above: what if semiconductor technology, i.e. the "hybrid" QC+conventional approach, allows one to simulate 100+ qubits easily now? What if there have been breakthroughs from using AI on QC algorithms, both existing and new ones? Have any formerly intractable problems in physics/chemistry/biology/mathematics been made tractable now due to AI usage? How many of these can be implemented on a QC? Etc., etc.
To summarize: you have to look at the whole complex picture before drawing conclusions. Merely parroting trivialities like "hype" is meaningless.
hammock2 days ago
Check out Eric Weinstein's latest theory about how frontier physics has moved "dark" (take it with a grain of salt; some of the other things he says might tempt you to discount him completely)
rvz2 days ago
Well-deserved and much-needed recognition for quantum key distribution; for once, not a single mention of "AI" anywhere.
Congratulations to Charles Bennett and Gilles Brassard.
MeteorMarc2 days ago
Really curious, not a critique: apart from the idea of the possibility of intrusion detection due to the quantum nature of the communication link, what is special about the protocol that is mentioned?
kleiba2 days ago
Coincidentally, the same people also recently received the Science Fiction Award 2025!
aleph_minus_one2 days ago
Source?
layer82 days ago
I suspect the parent considers quantum computing science fiction.
kleibaa day ago
A good opportunity to repost this: https://eprint.iacr.org/2025/1237.pdf
aleph_minus_onea day ago
An opposite standpoint: Why it's so much harder to factor 21 on a quantum computer than 15: https://algassert.com/post/2500