PRX Quantum (Jan 2025)

Sign Problem in Tensor-Network Contraction

  • Jielun Chen
  • Jiaqing Jiang
  • Dominik Hangleiter
  • Norbert Schuch

DOI
https://doi.org/10.1103/PRXQuantum.6.010312
Journal volume & issue
Vol. 6, no. 1
p. 010312

Abstract

We investigate how the computational difficulty of contracting tensor networks depends on the sign structure of the tensor entries. Using results from computational complexity, we observe that the approximate contraction of tensor networks with only positive entries has lower computational complexity than that of tensor networks with general real or complex entries. This raises the question of how this transition in computational complexity manifests itself in the hardness of different tensor-network-contraction schemes. We pursue this question by studying random tensor networks with varying bias toward positive entries. First, we consider contraction via Monte Carlo sampling and find that the transition from hard to easy occurs when the tensor entries become predominantly positive; this can be understood as a tensor-network manifestation of the well-known negative-sign problem in quantum Monte Carlo. Second, we analyze the commonly used contraction based on boundary tensor networks. The performance of this scheme is governed by the amount of correlation in contiguous parts of the tensor network (which, by analogy, can be thought of as entanglement). Remarkably, we find that the transition from hard to easy (i.e., from a volume-law to a boundary-law scaling of entanglement) already occurs for a slight bias of the tensor entries toward a positive mean, scaling inversely with the bond dimension D; thus, the larger D is, the earlier the problem becomes easy. This is in contrast both to expectations and to the behavior found in Monte Carlo contraction, where the hardness at fixed bias increases with the bond dimension. To provide insight into this early breakdown of computational hardness and the accompanying entanglement transition, we construct an effective classical statistical-mechanical model that predicts a transition at a bias of the tensor entries of 1/D, confirming our observations.
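The Monte Carlo scheme can be illustrated with a minimal toy sketch, for a 1D chain of matrices rather than a 2D network and with plain uniform configuration sampling (all names and the setup are illustrative, not the paper's construction): the contraction value is a sum over all bond-index configurations of products of tensor entries, and a Monte Carlo average over sampled configurations estimates it.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_chain(n, D, bias):
    """n matrices of shape (D, D) with i.i.d. Gaussian entries of mean `bias`."""
    return [bias + rng.standard_normal((D, D)) for _ in range(n)]

def exact_contraction(mats):
    """Contract the chain with boundary vectors of ones: the sum over all index paths."""
    v = np.ones(mats[0].shape[0])
    for M in mats:
        v = v @ M
    return v.sum()

def monte_carlo_contraction(mats, n_samples):
    """Estimate the same sum by sampling bond-index configurations uniformly."""
    n, D = len(mats), mats[0].shape[0]
    total = D ** (n + 1)                       # number of configurations in the sum
    acc = 0.0
    for _ in range(n_samples):
        idx = rng.integers(0, D, size=n + 1)   # one configuration of all bond indices
        term = 1.0
        for k, M in enumerate(mats):
            term *= M[idx[k], idx[k + 1]]
        acc += term
    return acc / n_samples * total             # unbiased estimator of the contraction
```

In this toy setting, a strong positive bias makes all terms reinforce and the estimate converges quickly, whereas near zero bias the signed terms cancel and the same sample budget yields a much larger relative error, which is the sign problem the abstract refers to.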
We conclude by investigating the computational difficulty of computing expectation values of tensor-network wave functions (projected entangled-pair states, PEPSs) and find that in this setting, the complexity of entanglement-based contraction always remains low. We explain this by providing a local transformation that maps PEPS expectation values to a positive-valued tensor network. This not only provides insight into the origin of the observed boundary-law entanglement scaling but also suggests new approaches toward PEPS contraction based on positive decompositions.
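The positivity mechanism behind this result has a simple 1D analogue that can be sketched (an MPS norm instead of a PEPS expectation value; this only demonstrates the weaker, basis-independent fact that the doubled bra-ket network contracts to a nonnegative number, not the paper's local transformation to an entrywise-positive network; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
d, D, n = 2, 3, 5  # physical dimension, bond dimension, chain length

# random complex MPS tensor A[s, a, b] on a periodic chain
A = rng.standard_normal((d, D, D)) + 1j * rng.standard_normal((d, D, D))

# double-layer (bra-ket) transfer matrix:
# E[(a, a'), (b, b')] = sum_s A[s, a, b] * conj(A[s, a', b'])
E = np.einsum('sab,scd->acbd', A, A.conj()).reshape(D * D, D * D)

# norm <psi|psi> of the periodic MPS = Tr(E^n): real and nonnegative,
# even though individual entries of E are complex
norm2 = np.trace(np.linalg.matrix_power(E, n))
```

Note that the doubled tensor `E` itself still has complex entries here; the paper's stronger statement is that a suitable local transformation renders the entries of the doubled network positive, which is what underlies the observed boundary-law entanglement scaling.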