By Andrew Guo
Day Two of the 2017 IBM ThinkQ Conference featured talks by an impressive array of experts in both the fields of quantum Hamiltonian simulation (Andrew Childs, Robin Kothari, Ryan Babbush, Nathan Wiebe) and quantum chemistry (Ryan Babbush, Garnet Chan). The following post aims to summarize the intersection of those fields and explain how they provide a “killer application” for a quantum computer.
Introduction
Much ink has been spilled regarding the quest to find flashy applications for the 50-qubit quantum computers (QCs) that will be available over the next few years. The goal in researching these problems has always been to find a task on which classical computers struggle to make headway, but that a quantum computer could do in its sleep^{1}. Such a vaunted demonstration of a “quantum computational advantage” would cement the public image of quantum computing as “Star Wars technology,” and certainly merits further investigation.
But while some researchers are searching for problems for which near-term QCs would have a computational advantage, others are focusing their attention on a longer-term goal—one that’s been around since the very beginning of the field—the simulation of quantum chemistry and quantum materials. In fact, the idea of quantum simulation can be traced all the way back to the late (and great) physicist, Richard P. Feynman.
Simulation of quantum mechanics
Feynman was one of the #GreatMinds of 20th-century physics. He shared the Nobel prize in 1965 for co-discovering quantum electrodynamics, gave his name to a half-dozen fundamental concepts in particle physics, and authored a trilogy of famous physics textbooks (as well as a pair of rambunctious, semi-autobiographical works). More relevantly to this blog post, he also anticipated the power of quantum computing in a 1982 talk titled “Simulating Physics with Computers.”^{2}
In classical physics, one can simulate the dynamics of systems by solving the equations of motion. These systems of differential equations can be solved numerically for all but the most complicated of systems—such as turbulent fluids. While quantum computers won’t necessarily be able to help with intractable problems in fluid dynamics, they will be useful in situations where the laws of classical mechanics themselves break down. In Feynman’s own words:
Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly it's a wonderful problem, because it doesn't look so easy.
Feynman foresaw that the challenge of simulating quantum mechanics would stump the world’s most powerful computers—including modernday supercomputers. Thirty years after his death^{3}, Feynman’s words still feature in the motivating slides for many talks on quantum simulation^{4}.
So why is simulating quantum chemistry so hard for classical computers? Well, it goes back to the curse of dimensionality: the more particles you have in a system, the more parameters you need to describe its quantum state. In the worst case, the number of parameters can grow exponentially in the number of electrons! This makes it infeasible in practice to simulate molecular systems with more than a few atoms. And while such exacting quantum mechanical descriptions aren’t typically required to model most reactions^{5}, there are some important exceptions^{6}.
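To get a feel for this scaling, here is a small back-of-the-envelope sketch of my own (not taken from any of the talks): a general quantum state of $n$ two-level degrees of freedom requires $2^n$ complex amplitudes, so even storing the state of a modest molecule exactly blows past any classical memory.

```python
# Illustrative sketch: the number of complex amplitudes needed to store a
# general quantum state of n two-level systems (qubits, or spin-orbitals
# in a crude accounting) grows as 2**n.
def state_vector_parameters(n: int) -> int:
    """Complex amplitudes in a general n-qubit state vector."""
    return 2 ** n

for n in [10, 30, 50]:
    params = state_vector_parameters(n)
    # 16 bytes per complex double-precision amplitude
    gigabytes = params * 16 / 1e9
    print(f"n = {n:2d}: {params:.3e} amplitudes, ~{gigabytes:.3e} GB")
```

At $n = 50$ the state vector alone would occupy tens of millions of gigabytes, which is why classical methods must resort to approximations rather than storing the full wavefunction.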
One example of a chemical phenomenon that skews more quantum than classical is the mechanism of catalysts—molecular agents used to speed up chemical reactions without themselves being consumed. The heart of a catalyst’s reactive power is in its active site, which in many heterogeneous catalysts—think enzymes—consists of a geometric arrangement of molecules centered around one (or more) transition metal atoms. The electrons of these metal atoms interact strongly with other electrons, and play an important role in stabilizing the transition states of a reaction. It is exactly these strongly interacting systems that cannot be simulated with classical methods!
One such catalyst that researchers have focused on recently is nitrogenase, the enzyme responsible for nitrogen fixation in bacteria^{7}. While nitrogen is ubiquitous in nature—comprising 78% by volume of the air we breathe and featuring in each of the twenty amino acids that make up our proteins—most of the nitrogen on earth takes the form of inert dinitrogen gas. The nitrogenase enzyme can chemically activate nitrogen at standard temperature and pressure, thereby fixing it.
In industry, an analogous reaction known as the Haber-Bosch process is used to fix nitrogen, specifically in the form of ammonia. The ammonia can then be used to make fertilizer (or nitro-based explosives—a fact which likely helped extend the duration of WWI by a few years). The reaction consumes one nitrogen molecule and three hydrogen molecules to form two ammonia molecules: \begin{align} \text{N}_2 + 3 \text{H}_2 \rightarrow 2 \text{NH}_3 \quad (\Delta H^\circ = -45.8 \text{ kJ/mol}) \end{align} Although this reaction is exothermic, the process requires temperatures of 400 $^\circ$C and pressures of 200 atm to proceed. So even though nitrogen fixation is thermodynamically favorable at room temperature, it proceeds slowly due to the immense activation energy required to break the triple bond. The extreme temperatures and pressures required to activate the metallic catalyst use up 1–2% of the world’s annual energy budget—an order of magnitude more than is used to mine Bitcoin! The development of a better nitrogen-fixation catalyst would make a sizable impact on reducing the world’s energy consumption.
In order to find a better catalyst, we need to first understand the catalytic reaction mechanism, including all of the reaction intermediates and transition structures that may occur between the reactant and product stages. Phrased in physical terms, we must map out the potential energy surface that the system explores throughout the reaction. In this model, the transition structures and reaction intermediates are located at points of local maxima and minima, respectively. Here, quantum simulation can come in handy. By determining the ground-state energies of the chemical structures, quantum computers can help determine the optimal pathway through the reaction space.
Hybrid quantum-classical algorithms
How do we go about mapping the potential energy surface of a chemical reaction? According to Ryan Babbush, we would need to map the potential energy surfaces to the chemical accuracy of 1 kcal/mol in order to get reaction rates on the correct order of magnitude^{8}. For comparison, the activation energy for the uncatalyzed nitrogen-fixation reaction is a little less than 100 kcal/mol^{9}. So if we could find the ground state energies for all of the transition states and reaction intermediates with an error of at most one part in a hundred, we could make nontrivial headway in finding new reaction pathways!
Unfortunately, finding ground state energies is a hard task in general—even for a quantum computer^{10}! And it could be a decade or more before QCs will be powerful and robust enough to improve over current chemical simulation methods. But there is some hope for the near term: researchers have developed some heuristic algorithms that can perform approximate quantum simulation using the first generation of quantum devices^{11}. While these algorithms do not guarantee any asymptotic quantum speedup, they might perform suitably well in practice.
A simple way to find the ground state energy of a Hamiltonian $H$ is by preparing a system in its ground state $\ket{\psi}$ and measuring the expectation value of its energy: $\bra{\psi}H\ket{\psi}$. But this exact state preparation is still hard^{12}. So the next best thing would be to prepare a trial state that is close in energy to the ground state. Let’s say we have a quantum circuit that takes in a set of input parameters and applies a sequence of gates to prepare a specific quantum state. By measuring the expectation value of the energy of that state, we obtain an upper bound to the system’s true ground state energy. Then, we use classical optimization techniques to minimize the energy as a function of the input parameters. After feeding the optimized parameters back into the quantum circuit, we repeat the process all over again. If we’re smart about this, then we’ll end up getting close to the ground-state energy in a reasonable number of iterations.
This proposal for finding ground state energies goes by the name of a variational quantum eigensolver (VQE), and is an example of a so-called hybrid quantum-classical algorithm. Here, the classical part consists of a souped-up, descent-based optimizer, whereas the quantum part is responsible for the state preparation and measurement. Another way to think about VQE is that the measurement of the state’s energy produces data on which we can train the state preparation protocol. This may sound familiar to those of you with a machine learning background; indeed, Ryan Babbush has said that the idea of VQE and other hybrid algorithms is to “train shallow quantum circuits like a neural network.” Since these low-depth quantum circuits do not require error correction, these algorithms are prime candidates for harnessing the power of near-term QCs.
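The loop described above can be condensed into a few lines of code. The following is a toy sketch of my own—the Hamiltonian, ansatz, and optimizer choice are all illustrative assumptions—with the quantum state preparation and measurement replaced by plain linear algebra on a single qubit:

```python
import numpy as np
from scipy.optimize import minimize

# Toy VQE sketch: find the ground state energy of a single-qubit
# Hamiltonian H = Z + 0.5*X, simulating the "quantum" steps classically.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = Z + 0.5 * X

def prepare_state(theta: float) -> np.ndarray:
    """Ansatz circuit: a single Ry(theta) rotation applied to |0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(params: np.ndarray) -> float:
    """'Measurement': expectation value <psi|H|psi> of the trial state."""
    psi = prepare_state(params[0])
    return float(psi @ H @ psi)

# Classical outer loop: feed the measured energies to a classical optimizer.
result = minimize(energy, x0=np.array([0.1]), method="Nelder-Mead")
exact = np.linalg.eigvalsh(H)[0]   # exact ground-state energy, for comparison
print(f"VQE estimate: {result.fun:.4f}, exact: {exact:.4f}")
```

In a real VQE experiment, `prepare_state` and `energy` would run on quantum hardware, with the expectation value estimated from many repeated measurements; only the outer optimization loop would execute on a classical computer.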
Conclusion
This wraps up our brief tour of quantum simulation. I hope that this post has elucidated why some people consider simulating chemistry to be one of the premier applications for a quantum computer. And there are numerous other practical implications; in addition to catalysts, QCs could be used in the future to study the reactivity of chemical structures like proteins and crystals, or even probe exotic materials like high-temperature superconductors^{13}.
It’s important, however, not to get ahead of ourselves: quantum computing isn’t the only revolutionary technology capable of transforming quantum chemistry. Artificial intelligence and machine learning in particular are poised to make a significant impact on the field — long before universal QCs are expected to take the stage^{14}. One challenge in the meantime is to find expert collaborators who can help identify materials or molecular systems to which quantum simulation is well suited, but who also have the patience to play the long game. As with other forms of “Star Wars technology,” quantum computers will have their day—in a spacetime far, far away.
Thanks to Leigh Martin and Stephen Ting for helpful discussions. Special thanks to Andrew Childs for his course on quantum algorithms at the University of Maryland, which planted the original inspiration for this post.

Quote attributed to Scott Aaronson (download warning: 1.3MB PPT file). ↩

Richard P. Feynman, Simulating physics with computers, International Journal of Theoretical Physics 21 (1982), no. 6–7, 467–488. doi:10.1007/BF02650179 ↩

Feynman passed away on February 15, 1988. Fun fact: the 100th anniversary of his birthday will take place on May 11, 2018. ↩

See, for example, the talks by Andrew Childs and Garnet Chan. ↩

To simulate the dynamics of biochemical systems like proteins, it usually suffices to model them using classical molecular mechanics—already a computationally intensive task in and of itself. To compute molecular orbitals in solid-state systems and quantum chemistry, the mean-field method of density functional theory (DFT) has had many successes. Unsurprisingly, DFT still struggles to model systems of strongly correlated electrons efficiently. ↩

As is the case for most “rules” in chemistry. ↩

Reiher, M., Wiebe, N., Svore, K. M., Wecker, D., & Troyer, M., Elucidating Reaction Mechanisms on Quantum Computers. PNAS 2017 July, 114 (29) 7555–7560. doi:10.1073/pnas.1619152114 ↩

Using the Arrhenius equation: $k = Ae^{-\Delta E/RT}$, where $\Delta E$ is the activation energy of the reaction or one of its intermediate steps. ↩

Jayant M. Modak, Haber Process for Ammonia Synthesis, General Article, Volume 7, Issue 9, September 2002, pp. 69–77. Full text ↩

More specifically, the $k$-local Hamiltonian problem is QMA-complete for $k \ge 2$. See the following paper for details: Kempe, J., Kitaev, A., & Regev, O., The complexity of the local Hamiltonian problem, SIAM Journal on Computing, 35 (5): 1070–1097. (2006) doi:10.1137/S0097539704445226 ↩

Jarrod R. McClean, et al., The theory of variational hybrid quantum-classical algorithms, New J. Phys. 18 023023, (2016). doi:10.1088/1367-2630/18/2/023023 ↩

QMAhard, in fact. ↩

For a comprehensive outlook on technological forces poised to “disrupt” the field of chemical simulation, check out this recent paper: Aspuru-Guzik, A., Lindh, R., & Reiher, M., The Matter Simulation (R)evolution, ACS Cent. Sci., Article ASAP, (2018). doi:10.1021/acscentsci.7b00550 ↩