As billions of dollars pour into quantum computers and countries build communications networks secured by quantum encryption, the importance of quantum information science has become harder to ignore.
This year’s Breakthrough Prize in Fundamental Physics honors four pioneers who have combined mathematics, computer science and physics to do “fundamental work in the field of quantum information”. IBM’s Charles Bennett, University of Montreal’s Gilles Brassard, University of Oxford’s David Deutsch and Massachusetts Institute of Technology’s Peter Shor share the award.
“These four people have really contributed greatly to the emergence of quantum information theory,” says Nicolas Gisin, experimental quantum physicist at the University of Geneva. “It’s nice to see these awards getting closer to me.”
The Breakthrough Prizes were co-founded in 2012 by Israeli-Russian billionaire Yuri Milner, who trained as a physicist, and are generously supported by other moguls, including co-founders Mark Zuckerberg and Sergey Brin. Much like Alfred Nobel, who funded the Nobel Prizes with the fortune from his invention of dynamite, Milner has faced questions about the source of his wealth: his past financial ties to the Kremlin have come under scrutiny, particularly in light of Russia's ongoing invasion of Ukraine. In previous interviews, Milner has emphasized his independence and his donations to Ukrainian refugees. A spokesperson told Scientific American that Milner moved to the U.S. in 2014 and has not returned to Russia since.
But recognition, let alone such financial backing, has not always come easily for quantum information science. Broadly speaking, the field combines two theories: quantum mechanics, which describes the counterintuitive behavior of the atomic and subatomic world, and information theory, which describes the mathematical and physical limits of computation and communication. Its history is a more chaotic tale, with sporadic advances often overlooked by mainstream scholarly journals.
In 1968 Stephen Wiesner, then a graduate student at Columbia University, developed a new method of encoding information using polarized photons. Among other things, Wiesner suggested that the inherently fragile nature of quantum states could be used to create counterfeit-proof quantum money. Unable to publish many of his heady theoretical ideas and increasingly drawn to religion, Wiesner, who died last year, largely left academia, eventually working in construction in Israel.
Before leaving Columbia, Wiesner passed some of his ideas on to another young researcher. "One of my roommate's friends was Stephen Wiesner, who started telling me about his 'quantum money,'" Bennett recalls. "I found it interesting, but it didn't seem like the start of an entirely new field." In the late 1970s Bennett met Brassard, and the two began discussing Wiesner's money, imagining the unlikely task of trapping photons with mirrors to create a quantum banknote.
"Photons aren't supposed to stay – they're supposed to travel," Brassard says, explaining their thinking. "And when they travel, what is more natural than to communicate?" The protocol Bennett and Brassard proposed, dubbed BB84, would open up the field of quantum cryptography. Later detailed and popularized in Scientific American, BB84 let two parties exchange messages with the utmost secrecy. If a third party snooped, they would leave indelible evidence of their interference – like breaking a quantum wax seal.
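For readers who want a feel for the idea, here is a minimal classical simulation of BB84's basis-sifting step. The function name and the absence of an eavesdropper are simplifying assumptions for illustration, not part of the published protocol:

```python
import random

def bb84_sift(n_bits, seed=None):
    """Toy sketch of BB84 sifting: Alice encodes random bits in random
    polarization bases ('+' rectilinear or 'x' diagonal); Bob measures
    in random bases. Comparing bases publicly (never the bits), they
    keep only the positions where their bases happened to match."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bases   = [rng.choice("+x") for _ in range(n_bits)]
    # Matching basis: Bob reads Alice's bit exactly. Mismatched basis:
    # quantum mechanics gives him a 50/50 coin flip, so that position
    # carries no shared information and is later discarded.
    bob_bits = [a if ab == bb else rng.randint(0, 1)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    alice_key = [a for a, ab, bb in
                 zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    bob_key = [b for b, ab, bb in
               zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return alice_key, bob_key

alice_key, bob_key = bb84_sift(32, seed=1)
```

An eavesdropper measuring in-flight photons would guess the basis wrong half the time, introducing detectable errors into the sifted keys – the "quantum wax seal" of the protocol.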
While Bennett and Brassard were developing quantum cryptography, another radical idea was beginning to emerge: quantum computing. At a now famous meeting at MIT's Endicott House in Dedham, Massachusetts, in May 1981, physicist Richard Feynman proposed that a computer harnessing quantum principles could solve problems that would be impossible for a computer bound by the laws of classical physics. Although Deutsch did not attend the conference, he heard about the idea and was enthusiastic. "I gradually became more and more convinced of the connection between computation and physics," he says.
Speaking with Bennett later that year, Deutsch had a crucial realization: the then dominant theory of computation was based on the wrong physics – Isaac Newton's "classical" mechanics and Albert Einstein's relativity – rather than the deeper quantum reality. "So I thought I would rewrite the theory of computation and base it on quantum theory instead of classical theory," Deutsch says matter-of-factly. "I didn't expect anything fundamentally new to come of it. I just expected it to be more rigorous." But he soon realized he was describing an entirely different kind of computer – one that achieved the same results but got there using the principles of quantum mechanics.
Deutsch's new theory made a crucial connection between quantum mechanics and information theory. "It made quantum mechanics accessible to me in my language – that of computer science," says Umesh Vazirani, a computer scientist at the University of California, Berkeley. Later Deutsch, together with the Australian mathematician Richard Jozsa, proposed a first proof-of-principle algorithm that would run exponentially faster than any classical algorithm – although it did nothing practical.
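That proof-of-principle algorithm, now known as Deutsch–Jozsa, can be sketched with an ordinary statevector simulation. The toy version below is an illustration, not Deutsch and Jozsa's own formulation: it decides with a single oracle query whether an n-bit function is constant or balanced, a task that classically can require exponentially many queries in the worst case:

```python
import numpy as np

def deutsch_jozsa(f, n):
    """Statevector sketch of Deutsch-Jozsa for an n-bit oracle f,
    promised to be constant or balanced (half 0s, half 1s)."""
    N = 2 ** n
    # Hadamards on |0...0> produce the uniform superposition.
    state = np.full(N, 1 / np.sqrt(N))
    # Phase oracle: |x> -> (-1)^f(x) |x>, one query total.
    phases = np.array([(-1) ** f(x) for x in range(N)])
    state = phases * state
    # After a second round of Hadamards, the amplitude of |0...0>
    # is the mean of the phases; it is +-1 iff f is constant.
    amp0 = state.sum() / np.sqrt(N)
    return "constant" if abs(amp0) ** 2 > 0.5 else "balanced"

verdict_const = deutsch_jozsa(lambda x: 1, 3)
verdict_bal = deutsch_jozsa(lambda x: x & 1, 3)
```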
But soon more useful applications appeared. In 1991, Artur Ekert, then a PhD student at Oxford, proposed a new quantum cryptography protocol, E91. The technique caught the attention of many physicists for its elegance and practicality – and because it was published in a leading physics journal. "That's a nice idea. It's a bit surprising that Ekert isn't on the list of winners of this year's Breakthrough Prize in Fundamental Physics," Gisin says.
Two years later, when Bennett, Brassard, Jozsa, computer science researcher Claude Crépeau, and physicists Asher Peres and William Wootters proposed quantum teleportation, physicists took notice. The new technique would allow one party to transmit information, such as the outcome of a coin toss, to another via entanglement, the quantum correlation that can connect objects such as electrons. Despite popular sci-fi claims, the technique does not enable faster-than-light messaging, but it has dramatically expanded the possibilities of real-world quantum communications. "It's the craziest idea," says Chao-Yang Lu, a quantum physicist at the University of Science and Technology of China who helped implement the technique in space.
Words like "revolution" are used too often in science, where progress is usually sluggish and incremental. But in 1994 Shor quietly started one. Working at AT&T Bell Laboratories, he had attended lectures by Vazirani and Bennett. "I started thinking about what useful things you could do with a quantum computer," he says. "I thought it was a long shot. But it was a very interesting area. So I started working on it. I didn't really tell anyone."
Inspired by the success of other quantum algorithms on periodic, or repetitive, tasks, Shor developed an algorithm that could decompose numbers into their prime factors (e.g., 21 = 7 x 3) exponentially faster than any known classical algorithm. The implications were immediately apparent: the presumed difficulty of prime factorization was the backbone of modern encryption. At last, quantum computers had a truly groundbreaking practical application. Shor's algorithm "just made it clear that you have to drop everything" to work on quantum computing, says Vazirani.
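The structure of Shor's insight can be hinted at classically. In the sketch below – an illustration, not Shor's quantum circuit – every step is ordinary number theory except the period-finding function, which is exactly the step a quantum computer performs with an exponential speedup:

```python
import random
from math import gcd

def order(a, n):
    """Find the period r of the sequence a, a^2, a^3, ... mod n, i.e.
    the smallest r with a^r = 1 (mod n). Classically this takes
    exponential time in the size of n; Shor's quantum algorithm finds
    it efficiently, which is the entire source of the speedup."""
    r, val = 1, a % n
    while val != 1:
        val = (val * a) % n
        r += 1
    return r

def shor_factor(n, seed=0):
    """Classical sketch of Shor's reduction from factoring to
    period finding, for odd composite n that is not a prime power."""
    rng = random.Random(seed)
    while True:
        a = rng.randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g          # lucky guess already shares a factor
        r = order(a, n)
        if r % 2:
            continue          # need an even period; try another a
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue          # trivial square root of 1; try again
        return gcd(y - 1, n)  # y^2 = 1 (mod n) yields a real factor

factor = shor_factor(21)
```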
Although Shor had found a powerful use case for a quantum computer, he had not solved the harder problem of how to build one – even in theory. The fragile quantum states that such devices exploit to outperform classical computing also make them extremely error-prone. Worse, error-correction strategies for classical computers could not simply be carried over to quantum computers. Undeterred, Shor bet fellow researchers at a 1995 quantum computing conference in Turin, Italy, that a quantum computer would factor a 500-digit number before a classical computer did. (Even on today's classical supercomputers, factoring a 500-digit number would likely take billions of years.) No one took Shor's bet, and some asked for a third option: that the sun would burn out first.
Two types of errors plague quantum computers: bit errors and phase errors. They are like flipping a compass needle from north to south or from east to west, respectively. Unfortunately, correcting bit errors exacerbates phase errors, and vice versa – a more accurate bearing to the north comes at the cost of a less accurate bearing to the east or west. But later in 1995 Shor figured out how to combine bit correction and phase correction – a chain of operations not unlike solving a Rubik's Cube without disturbing a finished face. Shor's factoring algorithm will remain impractical until quantum computers become far more powerful (the largest number factored with it so far is 21, so classical factorization stays on top – for now). But his error-correction scheme showed that quantum computing was possible, if not yet practical. "That's when it all became real," says Brassard.
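The simplest ancestor of Shor's scheme is the classical three-bit repetition code, which handles only bit flips; roughly speaking, Shor's nine-qubit code nests a repetition-style code for bit errors inside one for phase errors. The snippet below is a purely classical toy for intuition, not Shor's actual quantum code:

```python
import random

def encode(bit):
    # Three-bit repetition code: 0 -> [0, 0, 0], 1 -> [1, 1, 1].
    return [bit] * 3

def correct(codeword):
    # Majority vote recovers the logical bit after any single flip.
    return int(sum(codeword) >= 2)

rng = random.Random(4)
codeword = encode(1)
codeword[rng.randrange(3)] ^= 1  # inject one random bit-flip error
recovered = correct(codeword)
```

The quantum difficulty Shor overcame is that a qubit cannot simply be copied three times (the no-cloning theorem) and that measuring it to take a "majority vote" would destroy its state – his code extracts the error information without reading the data itself.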
All of this work has led to new views of quantum mechanics and computing. For Deutsch, it inspired an even more fundamental theory of "constructors," which he says describes "the set of all physical transformations." Others remain agnostic about the likelihood of further deep insights from the quantum realm. "Quantum mechanics is really weird, and I don't think there's ever going to be an easy way to understand it," says Shor. Asked whether his work on quantum computing makes the nature of reality easier or harder to understand, he says impishly, "It certainly makes it more mysterious."
What began as a pastime, a multifaceted intellectual pursuit, has grown far beyond the wildest imaginations of many of the field's pioneers. "We never thought it would ever become practical. It was just fun to think about these crazy ideas," Brassard says. "At some point we decided we were serious, but people didn't follow us. It was frustrating. It is very gratifying that it is now receiving so much recognition."