Tensor Network States with Low-Rank Tensors

Publication, Journal Article
Chen, H; Barthel, T
Published in: arXiv:2205.15296
May 30, 2022

Tensor networks are used to efficiently approximate states of strongly correlated quantum many-body systems. More generally, tensor network approximations can reduce the cost of operating on an order-N tensor from exponential to polynomial in N, which has made them a popular approach in machine learning. We introduce the idea of imposing low-rank constraints on the tensors that compose the tensor network. With this modification, the time and space complexities of the network optimization can be substantially reduced while maintaining high accuracy. We detail this idea for tree tensor network states (TTNS) and projected entangled-pair states (PEPS). Simulations of spin models on Cayley trees with low-rank TTNS exemplify the effect of the rank constraints on the expressive power. We find that choosing the tensor rank r to be on the order of the bond dimension m is sufficient to obtain high-accuracy ground-state approximations and to substantially outperform standard TTNS computations. Thus, low-rank tensor networks are a promising route for the simulation of quantum matter and for machine learning on large data sets.
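
As a rough illustration of how such a rank constraint reduces storage, the sketch below factorizes an order-4 TTNS tensor across a bipartition of its legs. This is only a minimal NumPy example under assumed shapes and variable names (U, V, the chosen leg bipartition), not the authors' implementation.

```python
import numpy as np

# Illustrative sketch (assumed shapes, not the paper's code): an order-4 TTNS
# tensor T[i, j, k, l] with bond dimension m on every leg is constrained so
# that its matricization T_{(ij),(kl)} has rank at most r. It can then be
# stored as two factors U (m, m, r) and V (r, m, m) instead of the full
# m**4-element array.

m, r = 20, 20                      # bond dimension and tensor rank (here r ~ m)
rng = np.random.default_rng(0)
U = rng.normal(size=(m, m, r))     # factor carrying legs (i, j)
V = rng.normal(size=(r, m, m))     # factor carrying legs (k, l)

print("full tensor parameters:", m**4)             # 160000
print("low-rank parameters:   ", U.size + V.size)  # 2*r*m**2 = 16000

# The full tensor is never needed explicitly; e.g. contracting a vector x
# into leg l goes through the factors: first V, then U.
x = rng.normal(size=m)
Vx = np.einsum('akl,l->ak', V, x)      # shape (r, m),    cost O(r m^2)
Tx = np.einsum('ija,ak->ijk', U, Vx)   # shape (m, m, m), cost O(r m^3)

# Consistency check against the explicitly reconstructed full tensor.
T = np.einsum('ija,akl->ijkl', U, V)
assert np.allclose(Tx, np.einsum('ijkl,l->ijk', T, x))
```

With r fixed on the order of m, the storage per tensor drops from m^4 to 2rm^2 numbers; the abstract's finding is that this restriction costs little accuracy for the ground states studied.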

Published In

arXiv:2205.15296

DOI

10.48550/arXiv.2205.15296

Publication Date

May 30, 2022

Citation

APA: Chen, H., & Barthel, T. (2022). Tensor Network States with Low-Rank Tensors. arXiv:2205.15296. https://doi.org/10.48550/arXiv.2205.15296
Chicago: Chen, Hao, and Thomas Barthel. “Tensor Network States with Low-Rank Tensors.” arXiv:2205.15296, May 30, 2022. https://doi.org/10.48550/arXiv.2205.15296.
ICMJE: Chen H, Barthel T. Tensor Network States with Low-Rank Tensors. arXiv:2205.15296. 2022 May 30.
MLA: Chen, Hao, and Thomas Barthel. “Tensor Network States with Low-Rank Tensors.” arXiv:2205.15296, May 2022, doi:10.48550/arXiv.2205.15296.
NLM: Chen H, Barthel T. Tensor Network States with Low-Rank Tensors. arXiv:2205.15296. 2022 May 30.
