DOI: 10.1145/3219819.3220054 · KDD Conference Proceedings · Research Article

Embedding Temporal Network via Neighborhood Formation

Published: 19 July 2018

ABSTRACT

Given the rich real-life applications of network mining and the recent surge of representation learning, network embedding has become a focal point of research in both academia and industry. Nevertheless, the complete temporal formation process of a network, characterized by sequences of interactive events between nodes, has seldom been modeled in existing studies, which calls for further research on the temporal network embedding problem. In light of this, we introduce the concept of a neighborhood formation sequence to describe the evolution of a node, where temporal excitation effects exist between neighbors in the sequence, and we propose a Hawkes-process-based Temporal Network Embedding (HTNE) method. HTNE integrates the Hawkes process into network embedding to capture the influence of historical neighbors on current neighbors: interactions of the low-dimensional vectors are fed into the Hawkes process as the base rate and the temporal influence, respectively. In addition, an attention mechanism is integrated into HTNE to better determine the influence of a node's historical neighbors on its current neighbors. Experiments on three large-scale real-life networks demonstrate that the embeddings learned by the proposed HTNE model outperform state-of-the-art methods on various tasks, including node classification, link prediction, and embedding visualization. In particular, temporal recommendation based on arrival rates inferred from node embeddings shows the strong predictive power of the proposed model.
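The modeling idea described in the abstract — a Hawkes process whose base rate comes from the interaction of two node embeddings and whose excitation terms come from attention-weighted historical neighbors with a decaying kernel — can be sketched roughly as follows. This is a minimal illustration under assumed choices, not the paper's implementation: negative squared Euclidean distance as the embedding interaction, an exponential decay kernel, softmax attention over historical neighbors, and an exponential transfer function to keep the rate positive. All function and variable names are hypothetical.

```python
import numpy as np

def base_rate(u, v):
    # Embedding interaction: negative squared Euclidean distance,
    # so closer embeddings yield a higher (less negative) rate.
    return -np.sum((u - v) ** 2)

def softmax(z):
    z = z - np.max(z)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def conditional_intensity(emb, x, y, history, t, delta=1.0):
    """Sketch of a Hawkes conditional intensity for node y arriving in
    the neighborhood formation sequence of node x at time t.

    emb     : dict mapping node id -> embedding vector (assumed learned)
    history : list of (historical_neighbor_id, event_time) pairs for x
    delta   : decay rate of the exponential kernel (assumed)
    """
    mu = base_rate(emb[x], emb[y])  # base arrival rate from x-y proximity
    if not history:
        total = mu
    else:
        # Attention weights: how relevant each historical neighbor is to x.
        w = softmax(np.array([base_rate(emb[x], emb[h]) for h, _ in history]))
        # Excitation: historical neighbors influence the arrival of y,
        # discounted by an exponentially decaying kernel in elapsed time.
        excite = sum(
            w_i * base_rate(emb[h], emb[y]) * np.exp(-delta * (t - t_h))
            for w_i, (h, t_h) in zip(w, history)
        )
        total = mu + excite
    # Exponential transfer function keeps the intensity positive.
    return np.exp(total)
```

Learning would then maximize the likelihood of the observed neighborhood-formation events under such an intensity; the sketch above only shows the forward rate computation used, e.g., for temporal recommendation by ranking candidate neighbors by arrival rate.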


Supplemental Material

zuo_neighborhood_formation.mp4 (mp4, 444.5 MB)


Published in

KDD '18: Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, July 2018, 2925 pages. ISBN: 9781450355520. DOI: 10.1145/3219819

          Copyright © 2018 ACM

          Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

          Publisher

          Association for Computing Machinery

          New York, NY, United States



          Acceptance Rates

KDD '18 paper acceptance rate: 107 of 983 submissions, 11%. Overall acceptance rate: 1,133 of 8,635 submissions, 13%.
