Physical Review Research (Mar 2024)

Large-scale quantum approximate optimization on nonplanar graphs with machine learning noise mitigation

  • Stefan H. Sack,
  • Daniel J. Egger

DOI
https://doi.org/10.1103/PhysRevResearch.6.013223
Journal volume & issue
Vol. 6, no. 1
p. 013223

Abstract


Quantum computers are increasing in size and quality but are still very noisy. Error mitigation extends the size of the quantum circuits that noisy devices can meaningfully execute. However, state-of-the-art error mitigation methods are hard to implement, and the limited qubit connectivity in superconducting qubit devices restricts most applications to the hardware's native topology. Here we demonstrate the quantum approximate optimization algorithm (QAOA) on nonplanar random regular graphs with up to 40 nodes, enabled by machine-learning-based error mitigation. We use a swap network with careful decision-variable-to-qubit mapping and a feed-forward neural network to optimize a depth-two QAOA on up to 40 qubits. We observe meaningful parameter optimization for the largest graph, which requires running quantum circuits with 958 two-qubit gates. Our paper emphasizes the need to mitigate samples, and not only expectation values, in quantum approximate optimization. These results are a step towards executing quantum approximate optimization at a scale that is not classically simulable. Reaching such system sizes is key to properly understanding the true potential of heuristic algorithms like QAOA.
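To make the optimization target concrete, the sketch below shows the quantity a depth-two QAOA tunes: the expected cut value of a MaxCut instance on a random 3-regular graph, evaluated here by brute-force classical statevector simulation on a toy 6-node graph. This is not the authors' code; the graph size, seed, and random angles are illustrative assumptions, and the paper's swap network routing and neural-network error mitigation are not reproduced.

```python
# Minimal sketch (not the paper's implementation): depth-two QAOA for MaxCut
# on a small random 3-regular graph, simulated classically for illustration.
import networkx as nx
import numpy as np

rng = np.random.default_rng(7)

n = 6                                          # toy size; the paper reaches 40 nodes
graph = nx.random_regular_graph(3, n, seed=7)  # random regular graph, as in the paper
edges = list(graph.edges())

# Cut value of every computational-basis bitstring (the diagonal cost operator C).
bits = (np.arange(2 ** n)[:, None] >> np.arange(n - 1, -1, -1)) & 1
cut = np.array([sum(b[i] != b[j] for i, j in edges) for b in bits], dtype=float)

def qaoa_state(gammas, betas):
    """Depth-p QAOA state via brute-force statevector simulation (exponential in n)."""
    psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)   # |+>^n
    for gamma, beta in zip(gammas, betas):
        psi = np.exp(-1j * gamma * cut) * psi                   # cost layer e^{-i*gamma*C}
        for q in range(n):                                      # mixer layer e^{-i*beta*X_q}
            psi = psi.reshape(2 ** q, 2, 2 ** (n - q - 1))
            a, b = psi[:, 0, :].copy(), psi[:, 1, :].copy()
            psi[:, 0, :] = np.cos(beta) * a - 1j * np.sin(beta) * b
            psi[:, 1, :] = np.cos(beta) * b - 1j * np.sin(beta) * a
            psi = psi.reshape(-1)
    return psi

def expected_cut(params):
    """<C> for depth p = len(params)//2; this is what a classical optimizer tunes."""
    p = len(params) // 2
    probs = np.abs(qaoa_state(params[:p], params[p:])) ** 2
    return float(probs @ cut)

# Depth-two QAOA: parameters (gamma_1, gamma_2, beta_1, beta_2), random start.
params = rng.uniform(0, np.pi, size=4)
print("random-angle <C> =", round(expected_cut(params), 3), "of max cut", int(cut.max()))

# The abstract stresses mitigating *samples*, not only expectation values: on hardware,
# these sampled bitstrings (and hence their cut values) are what noise corrupts.
probs = np.abs(qaoa_state(params[:2], params[2:])) ** 2
samples = rng.choice(2 ** n, size=5, p=probs / probs.sum())
print("sampled cut values:", cut[samples].astype(int))
```

On real hardware the same expectation values and samples are estimated from noisy shots, which is where the paper's swap network (to handle nonplanar connectivity) and feed-forward neural network mitigation come in.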