
Improving textual network learning with variational homophilic embeddings

Publication, Journal Article
Wang, W; Tao, C; Gan, Z; Wang, G; Chen, L; Zhang, X; Zhang, R; Yang, Q; Henao, R; Carin, L
Published in: Advances in Neural Information Processing Systems
January 1, 2019

The performance of many network learning applications crucially hinges on the success of network embedding algorithms, which aim to encode rich network information into low-dimensional vertex-based vector representations. This paper considers a novel variational formulation of network embeddings, with special focus on textual networks. Unlike most existing methods, which optimize a discriminative objective, we introduce Variational Homophilic Embedding (VHE), a fully generative model that learns network embeddings by modeling the semantic (textual) information with a variational autoencoder, while accounting for the structural (topology) information through a novel homophilic prior design. Homophilic vertex embeddings encourage similar embedding vectors for related (connected) vertices. The proposed VHE promises better generalization for downstream tasks, robustness to incomplete observations, and the ability to generalize to unseen vertices. Extensive experiments on real-world networks, for multiple tasks, demonstrate that the proposed method consistently achieves superior performance relative to competing state-of-the-art approaches.
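The abstract describes the model only at a high level. As a rough illustration of the general idea, and not the authors' implementation, the minimal PyTorch sketch below trains a toy variational autoencoder over bag-of-words text features and adds a simple penalty that pulls the latent means of connected vertices together. The class and function names, the architecture, and the use of a pairwise latent-mean penalty as a stand-in for the paper's homophilic prior are all assumptions made for illustration.

    # Hypothetical sketch (not the authors' code): a toy VAE over bag-of-words
    # text features, plus a surrogate "homophily" penalty that encourages
    # connected vertices to have nearby latent means.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ToyTextVAE(nn.Module):
        def __init__(self, vocab_size, latent_dim=32, hidden_dim=128):
            super().__init__()
            self.enc = nn.Sequential(nn.Linear(vocab_size, hidden_dim), nn.ReLU())
            self.mu = nn.Linear(hidden_dim, latent_dim)
            self.logvar = nn.Linear(hidden_dim, latent_dim)
            self.dec = nn.Linear(latent_dim, vocab_size)

        def forward(self, x):
            h = self.enc(x)
            mu, logvar = self.mu(h), self.logvar(h)
            # Reparameterization trick: sample z = mu + sigma * eps.
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            return self.dec(z), mu, logvar

    def vhe_like_loss(model, x, edges, lam=1.0):
        """x: [num_vertices, vocab_size] bag-of-words; edges: [num_edges, 2] vertex index pairs."""
        logits, mu, logvar = model(x)
        # Reconstruction term over the text features.
        recon = F.binary_cross_entropy_with_logits(logits, x, reduction="sum")
        # Standard-normal KL term of a vanilla VAE.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        # Illustrative homophily surrogate: connected vertices get close latent means.
        homo = ((mu[edges[:, 0]] - mu[edges[:, 1]]) ** 2).sum()
        return recon + kl + lam * homo

    # Usage on random toy data:
    model = ToyTextVAE(vocab_size=1000)
    x = torch.bernoulli(torch.full((50, 1000), 0.05))   # 50 vertices, binary word indicators
    edges = torch.randint(0, 50, (200, 2))              # 200 random edges
    vhe_like_loss(model, x, edges).backward()

In the paper, the structural information enters through a homophilic prior inside a fully generative formulation rather than through an ad hoc penalty; the sketch above only conveys the intuition that textual reconstruction and vertex connectivity are modeled jointly.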


Published In

Advances in Neural Information Processing Systems

ISSN

1049-5258

Publication Date

January 1, 2019

Volume

32

Related Subject Headings

  • 4611 Machine learning
  • 1702 Cognitive Sciences
  • 1701 Psychology
 

Citation

APA
Wang, W., Tao, C., Gan, Z., Wang, G., Chen, L., Zhang, X., … Carin, L. (2019). Improving textual network learning with variational homophilic embeddings. Advances in Neural Information Processing Systems, 32.

Chicago
Wang, W., C. Tao, Z. Gan, G. Wang, L. Chen, X. Zhang, R. Zhang, Q. Yang, R. Henao, and L. Carin. “Improving textual network learning with variational homophilic embeddings.” Advances in Neural Information Processing Systems 32 (January 1, 2019).

ICMJE
Wang W, Tao C, Gan Z, Wang G, Chen L, Zhang X, et al. Improving textual network learning with variational homophilic embeddings. Advances in Neural Information Processing Systems. 2019 Jan 1;32.

MLA
Wang, W., et al. “Improving textual network learning with variational homophilic embeddings.” Advances in Neural Information Processing Systems, vol. 32, Jan. 2019.

NLM
Wang W, Tao C, Gan Z, Wang G, Chen L, Zhang X, Zhang R, Yang Q, Henao R, Carin L. Improving textual network learning with variational homophilic embeddings. Advances in Neural Information Processing Systems. 2019 Jan 1;32.
