As part of our work on the software behind quantum computing, we are investigating a number of different areas. A team at Oxford is looking specifically at:
Architectures - understanding the theory of how our qubits should be arranged, the levels of connectivity, the relevance of clock speed versus noise, etc.
Control - how best to perform individual qubit manipulations.
Compilation - turning an algorithm from a high-level description into the low-level processes that run on the device.
Emulation - using powerful supercomputers to simulate small quantum computers.
Know your system!
In order to understand how best to use emerging quantum hardware, it is vital to optimise the tasks that we aim to perform with respect to the specifics of the hardware. This can include the hardware connectivity (which qubits can ‘talk’ to which other qubits), the relative timescales of control and measurement processes, and most importantly the noise processes in the hardware. In the NISQ era, the question of whether or not a given task can be performed will often come down to the ingenuity with which the noise is handled.
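To make the connectivity point concrete, here is a toy sketch (our own illustrative code, not taken from any of the papers mentioned on this page): on a device whose qubits form a simple 1D chain, a two-qubit gate can only act directly on neighbouring qubits, so a gate between more distant qubits must first be preceded by SWAP operations that bring them together.

    /* Toy illustration: on a device whose qubits form a 1D chain, a two-qubit
     * gate can only act on neighbours, so distant pairs must first be brought
     * together with SWAP gates. */
    #include <stdio.h>
    #include <stdlib.h>

    /* Number of SWAPs needed before qubits a and b can interact on a 1D chain. */
    int swapsNeeded(int a, int b) {
        int distance = abs(a - b);
        return (distance > 1) ? distance - 1 : 0;
    }

    int main(void) {
        printf("Gate on qubits 0 and 1: %d extra SWAPs\n", swapsNeeded(0, 1));
        printf("Gate on qubits 0 and 4: %d extra SWAPs\n", swapsNeeded(0, 4));
        return 0;
    }

Real compilers solve a much richer version of this routing problem, trading off the cost of extra SWAPs against the noise they introduce.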
Quantum Analytic Descent
Quantum Variational Algorithms (QVAs) are widely seen as a leading route to exploiting early quantum computers.
A new QVA called Quantum Analytic Descent shifts most of the optimisation work onto the supervising classical computer, greatly reducing the number of calls to the quantum device and so speeding up problem solving.
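To give a flavour of why an ‘analytic’ descent is possible (a simplified picture rather than the full construction in the paper): when a circuit gate is generated by a Pauli operator, the measured energy as a function of that gate’s single parameter θ is an exact sinusoid,

    E(θ) = a + b·cos(θ) + c·sin(θ)

so just three circuit evaluations pin down the constants a, b and c. Quantum Analytic Descent builds an analytic model of this kind for the landscape around the current parameter values, and the classical computer can then take many cheap descent steps on that model before the quantum device needs to be consulted again.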
Learning-based Error Mitigation
Quantum error mitigation is the route to controlling errors in early forms of quantum computer. A challenge is that we don’t know what the correct output from the computer should be, so it’s tough to fix errors!
Learning-based Error Mitigation uses machine learning techniques to ‘teach’ our system to correct errors on a set of training circuits whose outputs CAN be checked, and thus to fix errors in the real task, which can’t be checked.
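As a heavily simplified sketch of the flavour of such a scheme (our own illustration, assuming the correction takes the form of a simple linear map; the methods in the literature are considerably richer): we run training circuits whose ideal outputs are known, for example circuits a conventional computer can check, fit a map from noisy to ideal expectation values, and then apply that map to the output of the real task.

    /* Heavily simplified illustration of learning-based error mitigation:
     * learn a linear map from noisy to ideal expectation values on training
     * circuits whose correct answers CAN be checked, then apply it to the
     * target circuit whose correct answer cannot be checked.
     * (Illustrative only; real schemes are more sophisticated.) */
    #include <stdio.h>

    /* Least-squares fit of ideal ~= m * noisy + q over n training pairs. */
    void fitLinearMap(const double *noisy, const double *ideal, int n,
                      double *m, double *q) {
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += noisy[i]; sy += ideal[i];
            sxx += noisy[i] * noisy[i]; sxy += noisy[i] * ideal[i];
        }
        *m = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        *q = (sy - *m * sx) / n;
    }

    int main(void) {
        /* Hypothetical training data from checkable circuits. */
        double noisy[] = {0.45, 0.10, -0.30, 0.72};
        double ideal[] = {0.50, 0.11, -0.33, 0.80};
        double m, q;
        fitLinearMap(noisy, ideal, 4, &m, &q);

        double target = 0.38;             /* noisy output of the real task */
        printf("Mitigated estimate: %f\n", m * target + q);
        return 0;
    }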
Optimising for chemistry
One of the most promising tasks for quantum computers, especially those in the longer term with fault tolerance, is predicting the properties of molecules.
In our paper we show how to shorten the depth of an algorithm for predicting the energy of molecules, by identifying the key terms and ‘truncating’ away the rest, boosting speed and reducing error. This simple idea gives a boost of 10x to 100x!
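A minimal sketch of the truncation idea (our own illustration, not the exact procedure of the paper): the molecular problem is described by a long weighted sum of terms, and terms with very small weights can be dropped, shortening the computation at the cost of a small, controllable error.

    /* Illustrative sketch: drop the terms of a molecular Hamiltonian with the
     * smallest coefficients, keeping the total weight of discarded terms below
     * a chosen tolerance. (Not the exact criterion used in the paper.) */
    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    static int byMagnitudeDesc(const void *a, const void *b) {
        double x = fabs(*(const double *)a), y = fabs(*(const double *)b);
        return (x < y) - (x > y);         /* descending order of |coefficient| */
    }

    int main(void) {
        /* Hypothetical coefficients of the terms in some Hamiltonian. */
        double coeffs[] = {0.81, -0.02, 0.45, 0.005, -0.19, 0.001};
        int n = 6;
        double tolerance = 0.01;          /* allowed total discarded weight */

        qsort(coeffs, n, sizeof(double), byMagnitudeDesc);

        /* Drop the smallest terms while the discarded weight stays within tolerance. */
        double dropped = 0.0;
        int keep = n;
        while (keep > 0 && dropped + fabs(coeffs[keep - 1]) <= tolerance) {
            dropped += fabs(coeffs[keep - 1]);
            keep--;
        }
        printf("Keeping %d of %d terms (discarded weight %f)\n", keep, n, dropped);
        return 0;
    }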
Emulation - Making conventional computers pretend to be quantum computers!
The ideal way to test out an idea for a quantum algorithm, or an error mitigation technique, is of course to try it on a real quantum system! As more and more prototype quantum devices are put online, this becomes a real possibility. But in the current environment, the devices available are limited in scale, oversubscribed, and may suffer very severe levels of noise. Moreover, a given quantum device will have specific noise and connectivity properties that are ‘baked in’, whereas an algorithm designer may wonder how their idea will perform on systems that don’t yet even exist. For these reasons it is vital to have the power of emulation: using conventional computer hardware to accurately simulate a quantum machine. So we made QuEST and QuESTlink.
QuEST is a C and C++ simulation framework which supports a rich set of operations such as Pauli gadgets, multi-qubit general unitaries, density matrices, and general Kraus maps. QuESTlink integrates these high-performance facilities into Mathematica, for an intuitive and usable interface.
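As a taste of what using QuEST looks like, here is a minimal example based on the QuEST v3 C API (consult the QuEST documentation for the definitive interface, as names may change between releases); it prepares a two-qubit entangled state and reads out a measurement probability:

    #include "QuEST.h"
    #include <stdio.h>

    int main(void) {
        /* Set up the execution environment (CPU, GPU or distributed, chosen at build time). */
        QuESTEnv env = createQuESTEnv();

        /* A 2-qubit register, initialised to |00>. */
        Qureg qureg = createQureg(2, env);
        initZeroState(qureg);

        /* Prepare the Bell state (|00> + |11>)/sqrt(2). */
        hadamard(qureg, 0);
        controlledNot(qureg, 0, 1);

        /* Probability of measuring qubit 1 in outcome 1 - should be 0.5. */
        qreal prob = calcProbOfOutcome(qureg, 1, 1);
        printf("Probability of outcome 1 on qubit 1: %f\n", (double) prob);

        destroyQureg(qureg, env);
        destroyQuESTEnv(env);
        return 0;
    }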
QuEST and QuESTlink can run on local, multi-core, GPU and distributed systems seamlessly. QuESTlink can even use remote hardware to perform simulations, with the results accessible within Mathematica.
QuESTlink supports a rich variety of facilities, like analytic calculations and fantastic visualisations of circuits and states. More information can be found at: https://questlink.qtechtheory.org
To find out more about Professor Simon Benjamin's group at the University of Oxford you can visit http://qtechtheory.org/.