A Unified Approach to Learning Ising Models: Beyond Independence and Bounded Width

Conference Publication
Gaitonde, J; Mossel, E
Published in: Proceedings of the Annual ACM Symposium on Theory of Computing
June 10, 2024

We revisit the well-studied problem of efficiently learning the underlying structure and parameters of an Ising model from data. Current algorithmic approaches achieve essentially optimal sample complexity when samples are generated i.i.d. from the stationary measure and the underlying model satisfies "width" constraints that bound the total ℓ1 interaction involving each node. However, these assumptions are not satisfied in some important settings of interest, such as temporally correlated data or more complicated models (like spin glasses) that do not satisfy width bounds. We analyze a simple existing approach based on node-wise logistic regression and show that it provably succeeds at efficiently recovering the underlying Ising model in several new settings:

- Given dynamically generated data from a wide variety of Markov chains, including Glauber, block, and round-robin dynamics, logistic regression recovers the parameters with sample complexity that is optimal up to log log n factors. This generalizes the specialized algorithm of Bresler, Gamarnik, and Shah (IEEE Trans. Inf. Theory '18) for structure recovery in bounded-degree graphs from Glauber dynamics.

- For the Sherrington-Kirkpatrick model of spin glasses, given poly(n) independent samples, logistic regression recovers the parameters in most of the proven high-temperature regime via a simple reduction to weaker structural properties of the measure. This improves on recent work of Anari, Jain, Koehler, Pham, and Vuong (SODA '24), which gives distribution learning at higher temperature.

- As a simple byproduct of our techniques, logistic regression achieves an exponential improvement in learning from samples in the M-regime of data considered by Dutt, Lokhov, Vuffray, and Misra (ICML '21), as well as novel guarantees for learning from the adversarial Glauber dynamics of Chin, Moitra, Mossel, and Sandon.

Our approach thus provides a significant generalization of the elegant analysis of logistic regression by Wu, Sanghavi, and Dimakis (NeurIPS '19), without any algorithmic modification in each setting.
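To make the node-wise logistic regression approach concrete, here is a minimal sketch (not the paper's implementation; the function names, constants, and optimizer are illustrative). It draws samples from a small Ising model via single-site Glauber dynamics, then regresses one spin on the others. Since Glauber dynamics updates spin i with P(x_i = +1 | x_{-i}) = σ(2(Σ_j J_ij x_j + h_i)), the fitted logistic weights divided by 2 estimate row i of the interaction matrix J.

```python
import numpy as np

def glauber_samples(J, h, n_steps, rng):
    """Run single-site Glauber dynamics on {-1,+1}^n, recording the state after each update."""
    n = J.shape[0]
    x = rng.choice([-1, 1], size=n)
    samples = []
    for _ in range(n_steps):
        i = rng.integers(n)
        field = J[i] @ x + h[i]          # local field at site i (J has zero diagonal)
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
        x[i] = 1 if rng.random() < p_plus else -1
        samples.append(x.copy())
    return np.array(samples)

def fit_node(X, i, lr=0.1, iters=2000, lam=1e-3):
    """ℓ2-regularized logistic regression of spin i on the remaining spins,
    fitted by plain gradient descent on the average logistic loss."""
    y = X[:, i].astype(float)            # labels in {-1,+1}
    Z = np.delete(X, i, axis=1).astype(float)
    w = np.zeros(Z.shape[1])
    b = 0.0
    for _ in range(iters):
        m = Z @ w + b
        s = 1.0 / (1.0 + np.exp(y * m))  # sigmoid(-y * m)
        gw = -(Z.T @ (y * s)) / len(y) + lam * w
        gb = -np.mean(y * s)
        w -= lr * gw
        b -= lr * gb
    return w, b
```

A toy usage: on a 3-node chain with couplings J[0,1] = J[1,2] = 0.3 and no external field, fitting node 1 on a long Glauber trajectory yields weights close to 2 × 0.3 on both neighbors, even though consecutive samples are temporally correlated rather than i.i.d. (the regime the paper's analysis addresses).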


Published In

Proceedings of the Annual ACM Symposium on Theory of Computing

DOI

10.1145/3618260.3649674

ISSN

0737-8017

Publication Date

June 10, 2024

Start / End Page

503 / 514
 

Citation

Gaitonde, J., & Mossel, E. (2024). A Unified Approach to Learning Ising Models: Beyond Independence and Bounded Width. In Proceedings of the Annual ACM Symposium on Theory of Computing (pp. 503–514). https://doi.org/10.1145/3618260.3649674
