Texas A&M’s Helmut Katzgraber Sits Down for 15(-ish) Questions with Whurley
Aside from being an avid underwater photographer, Helmut Katzgraber is a professor in Texas A&M University’s Department of Physics and Astronomy, a research-lead consultant for 1QBit Information Technologies, and a consultant for Microsoft Research. I had the pleasure of interviewing him last week and was excited to ask him about his research. But first, I had to satisfy my own curiosity.
Your personal website is filled with wonderful underwater pictures. Why scuba diving?
I love the ocean, always have. I was lucky enough to grow up in Lima, Peru, and spend endless summers at the beach. When I was 13, my parents took me to the Galapagos, where I fell in love with the underwater world. Since then, I have become a certified dive master, and whenever time permits, I travel to a remote corner of the planet with camera and gear to photograph underwater critters.
Scuba is also a zen-like experience. You are weightless, and all you hear is your breathing. You are mostly undisturbed and can spend about an hour in peace, either searching for critters or photographing them. Many ideas for projects tend to materialize either when I am scuba diving or during my morning runs.
What is your main research focus at the moment?
My research lies at the crossroads of complex systems, computational physics, quantum computing, and novel computing paradigms. At the moment, my main research focus is developing algorithms (at times paired with novel hardware) to solve hard, and hence often intractable, optimization problems. These can be physics applications—such as disordered magnetic systems—or problems from other fields of research. Most recently, driven by my team’s involvement in the IARPA-funded Quantum Enhanced Optimization Program, my focus has been on raising the bar for quantum computing technologies and debunking claims of “quantum speedup.”
Can quantum computing help solve the Moore’s law problem?
Yes, and no. In the near-to-medium-term future, quantum computers will be devices used to solve certain types of problems. For example, quantum annealers focus on optimization problems, whereas gate-model devices will initially focus on quantum chemistry. This might “solve” (note the quotes) the Moore’s law problem for these fields of research, but it will not solve the problem universally. For this reason, we need to seek other computing paradigms, be they, for example, custom silicon or optical. Silicon architectures will not be replaced any time soon. They will still drive most of the world’s computing. However, in the back end, novel technologies based on quantum devices might do the heavy lifting for some applications.
The underlying mechanics of quantum computers and how that style of computation is applied to problems can be difficult to wrap one’s head around. What’s one of your favorite analogies for explaining a particular piece of that puzzle?
Last year I was on a panel at a global derivatives conference where quants wanted to learn more about quantum computing. It soon became clear to me that explaining the concepts was futile given the 30 minutes I had. So I summarized a quantum computer as follows: input goes in, magic happens, output comes out. If you are lucky, the output is correct—at least for current devices.
Joking aside, I feel that quantum mechanics should be more prominent in today’s education. I was lucky enough to have an inspiring high school teacher who thought it would be a good idea to introduce differential equations, Schrödinger’s equation, as well as the Heisenberg uncertainty principle. It made little sense to us at the time, but because that seed had been planted a few years earlier, it was much easier for me to grasp these subjects when I encountered them again in college.
Similarly, programming a gate-model quantum computer will be quite different from programming a current silicon-based machine. Microsoft has embraced this problem and recently presented its roadmap for delivering the whole stack needed to program quantum devices. I feel universities should start thinking about these problems. After all, the biggest shortage in the quantum computing world is in good quantum algorithms that exploit quantum mechanics to scale much better than their classical counterparts.
Can you explain some of the current problems with benchmarking for quantum computers?
It all starts with poor definitions of quantum speedup. If you make vague statements as to what constitutes “speedup,” you open the door to misleading claims. This has led to many press releases that could be debunked within weeks of being published. Furthermore, it is a continuous arms race: while quantum devices become better, we are working on faster algorithms on classical hardware that scale better. As such, today’s claims might not be valid tomorrow.
Finally, to date there is only one study showing that a current quantum annealing machine—the D-Wave Systems Inc. 2000Q—is faster than the best classical algorithms (shameless self-promotion for a paper led by Salvatore Mandrà at NASA). However, the speedup is just a constant offset, not an improved scaling with the number of variables in the problem. Even worse, the benchmark problem is synthetic. To date, there are no signs of quantum speedup for applications of interest, such as circuit fault diagnosis, verification and validation, or similar.
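To make that distinction concrete, here is a toy numerical illustration (the scaling exponents are made up for illustration, not numbers from the paper mentioned above): a constant-factor speedup shifts the runtime curve but does not change how it grows with the problem size, whereas a genuine scaling advantage keeps paying off more as the problem grows.

```python
# Toy illustration (made-up scaling exponents) of a constant speedup versus a
# genuine scaling advantage. A constant factor makes every run 1000x faster but
# does not change how the runtime grows with n; a smaller exponent does.
def classical(n):
    return 2 ** (0.5 * n)            # baseline exponential scaling

def constant_offset(n):
    return 2 ** (0.5 * n) / 1000.0   # 1000x faster, but the same scaling

def better_scaling(n):
    return 2 ** (0.3 * n)            # genuinely improved scaling

for n in (50, 100, 200):
    print(n,
          classical(n) / constant_offset(n),   # stays at 1000 for every n
          classical(n) / better_scaling(n))    # keeps growing with n
```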
What problems will computers using quantum annealing be particularly well suited for? Will this change as universal quantum computers become more advanced?
If you can cast your problem as a binary optimization problem, then special-purpose quantum annealing machines designed to minimize these will hopefully eventually surpass classical technology. However, while binary optimization problems appear all over science and industry, they are still a niche in some sense.
One could take prime factoring and cast it as a binary optimization problem. However, the overhead and the current performance of annealers do not allow for factoring meaningful numbers. A universal device, once operational, would be better suited for this.
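As a much simpler example of such a cast, consider number partitioning: split a list of numbers into two sets with equal sums. The sketch below is purely illustrative (my own toy code, not anything from 1QBit or D-Wave) and uses exhaustive classical search where an annealer would minimize the same cost function in hardware.

```python
# Toy example: number partitioning cast as a binary optimization problem.
# Assign each number a sign s_i in {-1, +1}; the two signs define the two sets.
# The cost (sum_i a_i * s_i)^2 is zero exactly for a perfect split, and
# expanding the square gives pairwise couplers J_ij = a_i * a_j, i.e., the kind
# of problem a quantum annealer is built to minimize.
from itertools import product

def imbalance_squared(signs, numbers):
    """Cost of one assignment: squared difference between the two set sums."""
    return sum(a * s for a, s in zip(numbers, signs)) ** 2

def brute_force_partition(numbers):
    """Exhaustive classical search over all 2^N sign assignments (tiny N only)."""
    best_signs, best_cost = None, float("inf")
    for signs in product((-1, +1), repeat=len(numbers)):
        cost = imbalance_squared(signs, numbers)
        if cost < best_cost:
            best_signs, best_cost = signs, cost
    return best_signs, best_cost

if __name__ == "__main__":
    numbers = [4, 5, 6, 7, 8]                # 4 + 5 + 6 = 7 + 8 = 15
    signs, cost = brute_force_partition(numbers)
    print("signs:", signs, "squared imbalance:", cost)
```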
What are the biggest outstanding scientific challenges in quantum computing? What other breakthroughs still need to occur before we’ll see commercially viable machines?
I would say an intrinsically fault-tolerant scheme, like the one Microsoft is betting on. There are multiple scientific papers that estimate the number of gates needed for applications of interest (e.g., quantum chemistry), and those numbers are frightening. Given the fragility of quantum states, “built-in” error correction is key.
What will quantum computing mean for A.I. and machine learning?
I’d say ask the folks at 1QBit!
What trends do you see in high performance computing that will interact with quantum computing?
There is a lot of interest at the moment in adding accelerators (co-processors, GPUs, FPGAs, etc.) to high-performance computing systems. This is a great development, because it forces coders to think beyond standard parallelization schemes. Quantum processors will likely act in a similar manner for certain steps in a complex computation. Therefore, gaining experience with hardware accelerators now will be key to efficiently integrating quantum devices into current HPC systems.
Can you explain spin glasses to a layman?
Spin glasses have been around for a while and are a rather boring subject for most physicists. Basically, a spin glass is a model for a disordered magnetic material. However, the interesting aspect is that the mathematical formulation used to model the material is a close cousin of so-called constraint-satisfaction problems. These play a fundamental role in computer science and find applications across industry.
Therefore, if you develop ways to study spin glasses (be it theoretical or numerical), you are effectively developing methods to study many problems across disciplines. Recently, Andrew Lucas wrote a nice paper mapping many archetypical optimization problems onto spin glasses. Among them were graph and number partitioning, minimum vertex covers, knapsack problems, graph coloring, Hamiltonian cycles, as well as the well-known traveling salesperson problem.
In a spin glass, one makes the approximation that the magnet is built from little mini-magnets. These can either point up or down (think zero or one, thus binary). If this were a magnet that sticks to the fridge, then the mini-magnets would want to all point in the same direction so that their effect adds up. This is accomplished via a positive coupler between the mini-magnets. In a spin glass, you randomly replace these positive couplers by negative ones. When a coupler is negative, two interacting mini-magnets want to point in opposite directions.
If you now ask what the “stable” configuration is for a system of 100 mini-magnets with random couplers, you have to search 2^100 (about 10^30) possible solutions. Once you satisfy one pair, you will likely break the constraint for the next. The problem thus quickly becomes hard as the number of mini-magnets (variables) grows. These are therefore among the simplest computationally hard problems, and thus ideal for benchmarking new computing paradigms and optimization machines.
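A minimal sketch of this picture, assuming the standard Ising form of the energy (this is my own illustrative code, not a benchmark from the research described here): the mini-magnets become ±1 variables, the couplers become random ±1 numbers, and finding the stable configuration means minimizing the energy over all 2^N assignments.

```python
# Illustrative sketch of the mini-magnet picture: an Ising spin glass on N
# spins with random +1/-1 couplers. Each spin s_i is +1 (up) or -1 (down).
# A bond J_ij is satisfied when J_ij * s_i * s_j > 0, and the most "stable"
# configuration is the one with the lowest total energy.
import random
from itertools import combinations, product

def energy(spins, couplers):
    """E = -sum over bonds of J_ij * s_i * s_j; lower means more bonds satisfied."""
    return -sum(J * spins[i] * spins[j] for (i, j), J in couplers.items())

def ground_state_brute_force(n, couplers):
    """Check every one of the 2^n configurations -- feasible only for small n."""
    best, best_energy = None, float("inf")
    for spins in product((-1, +1), repeat=n):
        e = energy(spins, couplers)
        if e < best_energy:
            best, best_energy = spins, e
    return best, best_energy

if __name__ == "__main__":
    random.seed(0)
    n = 10   # 2^10 = 1024 configurations; at n = 100 there would be ~10^30
    couplers = {(i, j): random.choice((-1, +1))
                for i, j in combinations(range(n), 2)}
    spins, e = ground_state_brute_force(n, couplers)
    print("ground-state energy:", e)
```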
Will the way we currently think about solving problems with a computer have to change with quantum machines?
Yes. First, we will have to determine whether the problem is well suited for a quantum computer. Then, we will have to come up with ways to use the quantum computer so that its quantum advantages over classical computers can actually be exploited. Both are hard problems, and there is a lot of research in this field.
Are there areas where software can fill in the gaps or help overcome the limits of current quantum computing hardware?
Definitely. For example, current quantum optimization hardware can treat only a limited number of variables. If the problem you are interested in exceeds this number of available variables (i.e., qubits), then you need to break up your problem into cleverly selected chunks and run these individually on the limited-size hardware at hand. Once the sub-problems have been optimized, their solutions can be stitched back together into a solution of the original problem of interest.
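Here is a hedged sketch of that chunking idea, with a brute-force classical sub-solver standing in for the size-limited quantum hardware. The fix-and-reoptimize loop shown is a generic illustration of the approach, not any vendor’s actual decomposition algorithm.

```python
# Hypothetical sketch of solving a problem larger than the hardware: hold most
# variables fixed, re-optimize a small "chunk" that would fit on the device,
# and repeat. The exhaustive optimize_block() below stands in for the
# size-limited quantum annealer.
import random
from itertools import combinations, product

def energy(spins, couplers):
    return -sum(J * spins[i] * spins[j] for (i, j), J in couplers.items())

def optimize_block(spins, couplers, block):
    """Exactly optimize the spins in `block` while all other spins stay fixed."""
    best, best_e = list(spins), energy(spins, couplers)
    for assignment in product((-1, +1), repeat=len(block)):
        trial = list(spins)
        for idx, val in zip(block, assignment):
            trial[idx] = val
        e = energy(trial, couplers)
        if e < best_e:
            best, best_e = trial, e
    return best

def chunked_solve(n, couplers, block_size=8, sweeps=30, seed=1):
    """Re-optimize random chunks; "stitching" here simply means all chunks
    share one global assignment that improves from sweep to sweep."""
    random.seed(seed)
    spins = [random.choice((-1, +1)) for _ in range(n)]
    for _ in range(sweeps):
        block = random.sample(range(n), block_size)
        spins = optimize_block(spins, couplers, block)
    return spins, energy(spins, couplers)

if __name__ == "__main__":
    random.seed(0)
    n = 30   # too large for brute force (2^30 states), fine for chunking
    couplers = {(i, j): random.choice((-1, +1))
                for i, j in combinations(range(n), 2)}
    spins, e = chunked_solve(n, couplers)
    print("best energy found:", e)
```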
However, note that, as with Amdahl’s law in HPC, the software/classical step might become a bottleneck here. Therefore, at some point, researchers will have to start thinking about a quantum version of Amdahl’s law.
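To see why, here is a back-of-the-envelope illustration with made-up numbers: if the classical decomposition and stitching take a fixed fraction of the runtime, Amdahl’s law caps the overall speedup no matter how fast the quantum step becomes.

```python
# Back-of-the-envelope illustration (made-up numbers): if the classical
# decomposition/stitching step takes a fraction f of the original runtime,
# Amdahl's law caps the overall speedup at 1/f, no matter how fast the
# quantum part of the computation becomes.
def amdahl_speedup(classical_fraction, quantum_speedup):
    """Overall speedup when only the non-classical part is accelerated."""
    return 1.0 / (classical_fraction + (1.0 - classical_fraction) / quantum_speedup)

for s in (10, 100, 1_000_000):
    print(s, round(amdahl_speedup(0.2, s), 2))   # approaches 1 / 0.2 = 5x
```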
How can developers get involved in quantum computing? What advice would you give to someone just starting out?
Learn quantum mechanics!
What are some of the most prominent myths or misconceptions about quantum computing?
Myth: I think Time’s cover from a few years back sums it up pretty well: “The Infinity Machine.” Yes, quantum computers will eventually be able to break current encryption schemes. But they will not be disruptive, because already today we have quantum encryption schemes that are immune to traditional quantum attacks. Furthermore, they will not replace traditional computers.
Misconception: calling the D-Wave quantum annealer a quantum computer. The Oxford Dictionary defines a “computer” in the following way: “An electronic device for storing and processing data, typically in binary form, according to instructions given to it in a variable program.” Key in the definition is the “variable program” part that allows one to create any imaginable piece of software to do whatever one pleases.
The D-Wave device is a special-purpose optimization device that uses quantum fluctuations to optimize. It is designed for one particular task only. While D-Wave might argue that it is “programmable,” there is no way to play Tetris on the quantum processor, and as such, it should not be referred to as a computer. This is where D-Wave’s chief evangelist, Murray Thom, will complain about this article.
There’s talk that the time is now. What are the challenges that could potentially derail the promise of near-term quantum computing?
Other than Trump, probably building high-quality fault-tolerant qubits.
Is there anyone working in quantum today that inspires you?
Without a doubt, Prof. Hidetoshi Nishimori. Not only did he play an integral role in the creation of quantum optimization, he is also one of the few theorists I know who does not dismiss numerical work. Moreover, if it can be calculated analytically, then Nishimori Sensei has done it. Finally, every time I meet him at an event, he has come up with something new and groundbreaking: for example, showing that noise might help quantum annealing machines, or that a sequential sweep of the qubits might massively improve the performance of quantum annealing.
What are some of the most interesting projects you’ve seen others working on in quantum computing?
I really like IBM’s programmable quantum computer. Even in its most rudimentary form of 5 qubits (now many more!), it has produced at least 50 scientific publications. It clearly illustrates that the scientific community is ready for programmable devices. The fact that IBM opened up the system to researchers was genius.
Another project, presented at the last Adiabatic Quantum Computing conference in Tokyo, was the work of Richard Harris at D-Wave. Basically, he figured out how to use the machine as a quantum simulator, Richard Feynman’s original dream. The work is groundbreaking in that it simulates a physical system that is very hard to treat on classical devices, pushing the envelope of what classical computers can match.
You’ve taken an interdisciplinary approach to research. Do you have a long-term goal where all of these disciplines converge?
Yes, that is my ultimate goal. I am tired of doing boring physics. Ideally, I want to integrate all these disciplines to help create new computing methods and optimization techniques that will help solve problems across disciplines. This is also why I have been working closely with industry via 1QBit and Microsoft. I want to know what current real-world problems are, and then use my expertise to attempt to solve them.