Contracting Tensor Networks with Generalized Belief Propagation

This paper is a preprint and has not been certified by peer review.

Authors

Joseph Tindall, Grace M. Sommers, Hilbert Kappen

Abstract

Recent years have seen a growing interest in the use of belief propagation - an algorithm originally introduced for performing statistical inference on graphical models - for approximate, but highly efficient, tensor network contraction. Here, we detail how to apply generalized belief propagation (GBP) - where messages are passed within a hierarchy of overlapping regions of the tensor network - to approximately contract tensor networks and obtain accurate results. The original belief propagation algorithm is a corner case of this approach, corresponding to a particularly simple choice of regions of the tensor network. We implement the GBP algorithm for a number of different region choices on a range of two- and three-dimensional, infinite and finite tensor networks, solving the corresponding fixed point equations both numerically and, in certain tractable cases, analytically. Our examples include calculating the partition function of the fully frustrated Ising model, computing the ground state degeneracy of three-dimensional ice models, measuring observables on the deformed AKLT quantum state and evaluating the norm of randomly generated tensor network states.
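As a concrete illustration of the "corner case" mentioned in the abstract, the sketch below runs ordinary sum-product belief propagation to compute the partition function of a small Ising model on a tree, where BP is exact. This is a hypothetical minimal example written for this page, not the authors' code; the couplings `J`, fields `h`, and the star-shaped graph are arbitrary choices for demonstration.

```python
import itertools
import numpy as np

# Minimal sketch (assumed setup, not the paper's implementation):
# Ising model on a tree with node weights psi_i(s) = exp(h*s) and
# edge weights psi_ij(s_i, s_j) = exp(J*s_i*s_j), spins s in {-1, +1}.
spins = np.array([-1.0, 1.0])
J, h = 0.4, 0.2
nodes = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (1, 3)]          # a star centered on node 1 (a tree)
neighbors = {i: [] for i in nodes}
for a, b in edges:
    neighbors[a].append(b)
    neighbors[b].append(a)

def psi_node(s):
    return np.exp(h * s)

def psi_edge(si, sj):
    return np.exp(J * si * sj)

def message(i, j):
    """BP message m_{i->j}(s_j): sum out s_i, folding in node and edge
    weights and all messages flowing into i from neighbors other than j."""
    m = np.zeros(2)
    for bj, sj in enumerate(spins):
        for bi, si in enumerate(spins):
            prod = psi_node(si) * psi_edge(si, sj)
            for k in neighbors[i]:
                if k != j:
                    prod *= message(k, i)[bi]
            m[bj] += prod
    return m

# Root the tree at node 1 and combine incoming messages to obtain Z.
root = 1
Z_bp = sum(
    psi_node(s) * np.prod([message(k, root)[b] for k in neighbors[root]])
    for b, s in enumerate(spins)
)

# Brute-force check: enumerate all 2^4 spin configurations.
Z_exact = 0.0
for config in itertools.product(spins, repeat=len(nodes)):
    w = np.prod([psi_node(s) for s in config])
    for a, b in edges:
        w *= psi_edge(config[a], config[b])
    Z_exact += w

print(Z_bp, Z_exact)  # on a tree the two values agree
```

On graphs with loops (such as the 2D and 3D lattices treated in the paper) this message passing is only approximate, which is what motivates the generalized, region-based scheme the abstract describes.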
