Two Up-and-Comers in Quantum Computing Share Their Insight

If there’s one thing I love, it’s seeing talented young people get involved and take charge of the future, especially when the field they’re diving into is quantum computing. Bryce Fuller and Patrick Rall are doing exactly that, teaching future developers at the University of Texas about quantum algorithms.

How did you first become interested in quantum computing?

PR: During my undergraduate studies at Caltech, I took many physics and computer science classes. I was interested in both, so quantum computing seemed like a perfect fit. I passionately devoured Nielsen and Chuang’s textbook, and immediately joined Dr. John Preskill’s research group to work on a project on quantum cellular automata. I was so thrilled to learn that Dr. Scott Aaronson was coming to UT Austin, which made UT a perfect choice for my graduate studies.

BF: It was a process of closing in on what captured my interest. I wanted to pursue quantum physics from the time I was in middle school. Around 2013, I started building computers to mine Litecoin and discovered a passion for computing. I learned about quantum computing during that time, and it was the intersection of everything that fascinated me. I haven’t looked back since.

You’ve been teaching a freshman quantum computing course together. What’s the main focus of the material, and what has the experience been like so far?

BF: Quantum computing is usually taught at the graduate level and assumes a background in quantum physics and higher mathematics. Our challenge is primarily in teaching this material while assuming as little background as possible.

PR: Luckily, it turns out that there is a way to learn quantum computing using only matrices and complex numbers. Dr. Aaronson used this teaching strategy in Spring 2017, and we’ve implemented it every semester since then. I’m pleasantly surprised by how quickly students can pick up a basic understanding of quantum algorithms. We are excited to continue refining the curriculum and to figure out a way to teach quantum computing as efficiently as possible.

What do your students have the most difficulty with during the course?

PR: Quantum superposition is mathematically similar to probability, but very different in terms of physics. So students often get confused when we mix the two together: random quantum states! The math of these “mixed states” is a little trickier, but they are critical for understanding entanglement and noisy quantum computing.
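
To see why mixed states need extra bookkeeping, here is a small NumPy sketch (a toy illustration, not material from the course). A pure superposition and a fair classical coin flip give identical 50/50 statistics when measured in the computational basis, but their density matrices differ in the off-diagonal entries:

```python
import numpy as np

# A pure superposition |+> = (|0> + |1>)/sqrt(2) ...
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus)          # density matrix |+><+|

# ... versus a 50/50 classical mixture of |0> and |1>.
zero, one = np.array([1.0, 0.0]), np.array([0.0, 1.0])
rho_mixed = 0.5 * np.outer(zero, zero) + 0.5 * np.outer(one, one)

# Both give 50/50 outcomes in a computational-basis measurement...
print(np.diag(rho_pure))    # [0.5 0.5]
print(np.diag(rho_mixed))   # [0.5 0.5]

# ...but the off-diagonal "coherence" entries tell them apart,
# and tracking those is the extra bookkeeping mixed states require.
print(rho_pure)             # off-diagonals are 0.5
print(rho_mixed)            # off-diagonals are 0
```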

BF: Many abstract concepts in quantum information can be grounded in one’s intuitions about physics or higher mathematics. Because most of these students do not have this background, we have to use clever analogies as a means of bootstrapping intuition.

For example: Think of a bit as a coin which can be heads or tails. You can think of a qubit as a coin which is suspended in zero gravity, but it falls as soon as you look at it. This gets across several key ideas: before you look at the quantum coin it can be in infinitely many possible orientations, and looking at your quantum coin turns it into a regular coin (heads or tails). I find that formalizing this analogy further and explaining its caveats can help students understand quantum states.
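
The analogy maps directly onto the math. In the toy sketch below (an illustration, not code from the course), a qubit is just two complex amplitudes, and looking at the coin means sampling an outcome with probability equal to the squared amplitude and collapsing the state to whatever you saw:

```python
import numpy as np

rng = np.random.default_rng()

# A qubit is a 2-vector of complex amplitudes (the floating coin).
# This one leans toward heads: |psi> = sqrt(0.8)|0> + sqrt(0.2)|1>.
psi = np.array([np.sqrt(0.8), np.sqrt(0.2)], dtype=complex)

def measure(state):
    """Look at the coin: pick 0 or 1 with probability |amplitude|^2,
    then collapse the state to the outcome you saw."""
    probs = np.abs(state) ** 2
    outcome = rng.choice([0, 1], p=probs)
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0
    return outcome, collapsed

outcome, psi = measure(psi)
print("You saw:", "heads (0)" if outcome == 0 else "tails (1)")
print("The coin is now an ordinary coin:", psi)
```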

Many of the quantum algorithms out there have yet to be tested on real hardware. What potential problems arise when these algorithms move from theory to practice? What is possible today with quantum computers and what do you think the near-term applications will be?

PR: Today’s quantum computers are very noisy and cannot hold onto the quantum information for very long before it dissipates into the environment. This gives a short time window for quantum computation. Many quantum algorithms, like Shor’s factoring algorithm, do not fit into this window. We need to use small algorithms that are robust to errors.

Potentially noise-robust algorithms exist: The quantum approximate optimization algorithm and the variational quantum eigensolver could quickly churn out decent solutions to some optimization problems. Understanding which problems can be solved on a noisy quantum computer is one of the most important research goals in quantum computing today.
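
To give a flavor of what “variational” means here, the sketch below is a deliberately tiny, classically simulated stand-in for an algorithm like the variational quantum eigensolver: a short parameterized circuit (a single rotation) prepares a trial state, the energy of that state is estimated, and a classical optimizer adjusts the parameter. The 2×2 Hamiltonian is made up for illustration; real applications involve many qubits and, for example, molecular Hamiltonians.

```python
import numpy as np

# A made-up 2x2 Hamiltonian standing in for a real problem instance.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def trial_state(theta):
    # The "quantum" part: a short parameterized circuit.
    # Here it is a single rotation, Ry(theta)|0>.
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    # The quantity the quantum computer would estimate: <psi|H|psi>.
    psi = trial_state(theta)
    return psi @ H @ psi

# The classical part: a crude optimizer that sweeps the parameter.
thetas = np.linspace(0, 2 * np.pi, 400)
best = min(thetas, key=energy)
print("best theta:", best, "-> energy:", energy(best))
print("exact ground-state energy:", np.linalg.eigvalsh(H)[0])
```

The point of this division of labor is that the quantum circuit stays short enough to fit inside the noise window, while the heavy iteration happens on a classical machine.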

BF: Another frontier is the development of quantum data structures and qRAM: hardware that can hold onto quantum states for very long amounts of time or access big data in a quantum way. Clever algorithms that implicitly require these resources have already been devised, such as quantum recommendation systems (think Netflix) and database search (think Google).

Scott Aaronson has said that simulating quantum mechanics might be one of the most important applications of quantum computing. What will those simulations allow us to do?

BF: Quantum mechanics has all the answers: it tells us how chemicals interact, so we can design better pharmaceuticals. It tells us how semiconductors behave, so we can design more efficient solar cells. It is instrumental to the design of so much technology we use every day, like the processor in your phone. It can even help us understand and synthesize exotic states of matter such as superconductors. But, even though we know all the underlying rules, quantum mechanics is so hard to simulate on a regular computer that many of these technologies remain just out of our reach!

PR: But a quantum computer has no trouble simulating quantum mechanics. With a quantum computer, engineers and physicists will not need to rely on fancy approximation strategies and can just calculate the things they want to know directly. Initially, small quantum computers will have trouble keeping up with our existing algorithms for regular supercomputers. But I think eventually quantum computers will win and become the primary way we design quantum technologies.

How does the way we currently think about solving problems with a computer have to change? Do developers need to know quantum mechanics?

BF: Understanding quantum algorithms is hard, and designing them is even harder. Similar to GPU programming, just throwing your problem onto the chip and expecting it to be faster will not work; you need to understand how the hardware works, play to its strengths, and have some tricks up your sleeve. To innovate on a quantum computer, you not only need to learn the new architecture, but also quantum mechanics! Patrick and I are working to streamline the learning process.

PR: Of course, premade libraries can always bridge the gap. If you want to program neural networks, you don’t need to understand GPU programming; you only need to understand TensorFlow. In the future I expect libraries for algorithms like quantum approximate optimization and quantum matrix inversion to play the same role. Then you would only need to tinker to explore the strengths and weaknesses of these algorithms, just as you would with machine learning.

What problems do you run into when simulating a quantum computer on a classical machine?

BF: Your laptop is designed to effectively manipulate classical information. But quantum information is fundamentally different in that it can contain a new kind of correlation known as entanglement. If you try to write down quantum information using classical information, you have to write down the entanglement information as well. A few hundred qubits would require more classical bits to write down than there are atoms in the observable universe—every qubit you add doubles the data! State-of-the-art simulations can simulate around 50 qubits for about 25 cycles, using terabytes or even petabytes of memory and several days.
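
The doubling is easy to see in numbers. The short calculation below (standard back-of-the-envelope figures, assuming 16 bytes per double-precision complex amplitude) shows how quickly a full state vector outgrows any classical machine:

```python
# A full state vector of n qubits has 2**n complex amplitudes.
# At 16 bytes per complex128 amplitude, memory doubles with every
# qubit added.
for n in [10, 30, 50, 100, 300]:
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n:3d} qubits: 2^{n} amplitudes, about {gib:.3e} GiB")

# 30 qubits already needs ~16 GiB, 50 qubits ~16 PiB, and a few hundred
# qubits needs more numbers than there are atoms in the observable
# universe (~10^80).
```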

PR: My research is partly about alternative strategies to simulating quantum computers using normal supercomputers. These simulations are useful for understanding noise and verifying if quantum computers are working correctly. Quantum quality control! I try to develop families of quantum computations that are so symmetrical that I can actually simulate them quickly. Then I randomly shuffle these symmetrical circuits (called stabilizer circuits) so they average out to the circuit I want to understand. I can simulate very many qubits this way, without using terabytes of memory.

What are some of the most prominent myths or misconceptions about quantum computing?

PR: The misconception I find myself debunking most often is the notion that a quantum computer will try all solutions to a problem at once and return the correct result. Nature isn’t so kind as to just give us the answer to any NP problem for free.

BF: I’d like to clear up a less common misconception about quantum cryptography. Quantum key distribution allows two people to establish a secret key for secure classical communication. People sometimes think of quantum cryptography as an unbreakable protocol which will allow perfect security. The truth is a bit more nuanced.

If an eavesdropper tries to secretly copy down this key, both parties will be alerted and they can abort. This means that if the protocol succeeds, both parties can be sure their key is secret. It serves as a perfect alarm system, not a perfect shield. The obvious attack would be for a malicious third party to always be listening and indefinitely prevent a key exchange.
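
A toy model makes the alarm-system point concrete. The sketch below is a heavily simplified, noise-free BB84-style simulation (an illustration only, not a faithful protocol implementation): without an eavesdropper the sifted keys agree, while an intercept-and-resend attacker introduces errors in roughly a quarter of the sifted bits, which the two parties can detect by comparing a sample before using the key.

```python
import numpy as np

rng = np.random.default_rng(0)

def bb84_error_rate(n_bits, eavesdrop):
    """Toy BB84 run: returns the error rate Alice and Bob would observe
    when comparing part of their sifted key."""
    alice_bits  = rng.integers(0, 2, n_bits)
    alice_bases = rng.integers(0, 2, n_bits)   # 0 = rectilinear, 1 = diagonal

    bits_in_flight = alice_bits.copy()
    prep_bases = alice_bases
    if eavesdrop:
        # Eve measures each qubit in a random basis and resends it.
        # When she guesses the wrong basis, the bit she reads is random.
        eve_bases = rng.integers(0, 2, n_bits)
        wrong = eve_bases != alice_bases
        bits_in_flight[wrong] = rng.integers(0, 2, wrong.sum())
        prep_bases = eve_bases                 # qubits now carry Eve's basis

    bob_bases = rng.integers(0, 2, n_bits)
    bob_bits = bits_in_flight.copy()
    # If Bob measures in a different basis than the qubit was prepared in,
    # his outcome is random.
    mismatch = bob_bases != prep_bases
    bob_bits[mismatch] = rng.integers(0, 2, mismatch.sum())

    # Sifting: keep only the positions where Alice's and Bob's bases agree.
    keep = alice_bases == bob_bases
    return float(np.mean(alice_bits[keep] != bob_bits[keep]))

print("error rate without Eve:", bb84_error_rate(20000, eavesdrop=False))  # ~0.0
print("error rate with Eve:   ", bb84_error_rate(20000, eavesdrop=True))   # ~0.25
```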

What do you enjoy most about the work you’re doing now?

BF: I think the most rewarding part of my involvement in QC is creating resources that help undergraduates gain a foothold in this subject. Not long ago, I was a student who desperately wanted to study QC, and I struggled to find the resources to get started. I’m incredibly lucky not only to have found what I’m passionate about, but also to have been given an opportunity to pursue it. Dr. Aaronson and his research group took a chance on me, and this is my way of paying it forward.

Patrick Rall and Bryce Fuller are brave enough to teach quantum information and algorithms to undergraduates. Born in Munich, Patrick earned his undergraduate degree at Caltech. Bryce earned his undergraduate degree at UT Austin, where he is the founding president of the Quantum Computing Initiative, an organization for students who share a desire to understand quantum algorithms, dedicated to refining methods for teaching quantum information science to undergraduates.