Article title

Exploring convolutional auto-encoders for representation learning on networks

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
A multitude of important real-world and synthetic systems possess network structures. Extending learning techniques such as neural networks to process such non-Euclidean data is therefore an important direction for machine learning research. However, this domain received comparatively little attention until very recently. There is no straightforward application of machine learning to network data, as machine learning tools are designed for i.i.d. data, simple Euclidean data, or grids. To address this challenge, the technical focus of this dissertation is on the use of graph neural networks for network representation learning (NRL), i.e., learning vector representations of the nodes in a network. Learning vector embeddings of graph-structured data amounts to embedding complex data into low-dimensional geometries; once the embedding is computed, the drawbacks associated with graph-structured data are overcome. The present inquiry proposes two deep-learning auto-encoder-based approaches for generating node embeddings. The drawbacks of existing auto-encoder approaches, such as shallow architectures and excessive parameter counts, are tackled in the proposed architectures by using fully convolutional layers. Extensive experiments on publicly available benchmark network datasets demonstrate the validity of this approach.
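The abstract describes auto-encoders that map nodes to low-dimensional vectors and reconstruct the graph from them. The paper's exact architecture is not given here, so the following is only a minimal NumPy sketch in the style of the graph auto-encoder of ref. [14]: a one-layer graph-convolutional encoder followed by an inner-product decoder. The toy graph, feature matrix, and weight initialization are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def normalize_adj(A):
    # Symmetrically normalized adjacency with self-loops:
    # A_hat = D^{-1/2} (A + I) D^{-1/2}
    A_loop = A + np.eye(A.shape[0])
    d = A_loop.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_loop @ D_inv_sqrt

def gae_embed(A, X, W):
    # Encoder: one graph-convolutional layer, Z = ReLU(A_hat X W)
    Z = np.maximum(normalize_adj(A) @ X @ W, 0.0)
    # Decoder: inner-product reconstruction of edge probabilities,
    # A_rec[i, j] = sigmoid(z_i . z_j)
    A_rec = 1.0 / (1.0 + np.exp(-(Z @ Z.T)))
    return Z, A_rec

# Toy 4-node path graph with identity features and random weights
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.eye(4)
rng = np.random.default_rng(0)
Z, A_rec = gae_embed(A, X, rng.standard_normal((4, 2)))
```

In a full model the weights `W` would be trained to minimize the reconstruction loss between `A_rec` and `A`; the rows of `Z` are then the node embeddings used for downstream tasks such as classification or link prediction.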
Publisher
Journal
Volume
Pages
273--288
Physical description
Bibliography: 33 items; figures, tables
Contributors
  • VJTI, Computer Engineering and IT Department, Mumbai, India
author
Bibliography
  • [1] Bandyopadhyay S., Kara H., Kannan A., Murty M.: FSCNMF: Fusing Structure and Content via Non-negative Matrix Factorization for Embedding Information Networks. In: arXiv preprint arXiv:1804.05313, 2018.
  • [2] Barabási A.L., et al.: Network Science. Cambridge University Press, 2016.
  • [3] Cui P., Wang X., Pei J., Zhu W.: A survey on network embedding. In: IEEE T Knowl Data En, 2018.
  • [4] Defferrard M., Bresson X., Vandergheynst P.: Convolutional neural networks on graphs with fast localized spectral filtering. In: Advances in Neural Information Processing Systems, pp. 3844-3852. 2016.
  • [5] Deng J., Zhang Z., Marchi E., Schuller B.: Sparse autoencoder-based feature transfer learning for speech emotion recognition. In: 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, pp. 511-516. IEEE, 2013.
  • [6] Deng L., Seltzer M.L., Yu D., Acero A., Mohamed A.r., Hinton G.: Binary coding of speech spectrograms using a deep auto-encoder. In: Eleventh Annual Conference of the International Speech Communication Association. 2010.
  • [7] Denny M.: Social network analysis. In: Institute for Social Science Research, University of Massachusetts, Amherst, 2014.
  • [8] Denny M.: Intermediate Social Network Theory. In: Institute for Social Science Research, University of Massachusetts, Amherst, 2015.
  • [9] Goyal P., Ferrara E.: Graph embedding techniques, applications, and performance: A survey. In: Knowl-Based Syst, vol. 151, pp. 78-94, 2018.
  • [10] Hamilton W., Ying Z., Leskovec J.: Inductive representation learning on large graphs. In: Advances in Neural Information Processing Systems, pp. 1024-1034. 2017.
  • [11] Hamilton W.L., Ying R., Leskovec J.: Representation learning on graphs: Methods and applications. In: arXiv preprint arXiv:1709.05584, 2017.
  • [12] Jackson M.: Social and economic networks. Princeton University Press, 2010.
  • [13] Kipf T., Welling M.: Semi-Supervised Classification with Graph Convolutional Networks. In: arXiv preprint arXiv:1609.02907, 2016.
  • [14] Kipf T., Welling M.: Variational graph auto-encoders. In: arXiv preprint arXiv:1611.07308, 2016.
  • [15] Leskovec J., Krevl A.: SNAP Datasets: Stanford Large Network Dataset Collection. http://snap.stanford.edu/data, 2014.
  • [16] Li Q., Han Z., Wu X.M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Thirty-Second AAAI Conference on Artificial Intelligence. 2018.
  • [17] Mikolov T., Chen K., Corrado G., Dean J.: Efficient estimation of word representations in vector space. In: arXiv preprint arXiv:1301.3781, 2013.
  • [18] Nerurkar P., Chandane M., Bhirud S.: A Comparative Analysis of Community Detection Algorithms on Social Networks. In: Computational Intelligence: Theories, Applications and Future Directions-Volume I, pp. 287-298. Springer, 2019.
  • [19] Nerurkar P., Shirke A., Chandane M., Bhirud S.: A Novel Heuristic for Evolutionary Clustering. In: Procedia Computer Science, vol. 125, pp. 780-789, 2018.
  • [20] Nikolentzos G., Meladianos P., Tixier A.J.P., Skianis K., Vazirgiannis M.: Kernel graph convolutional neural networks. In: International Conference on Artificial Neural Networks, pp. 22-32. Springer, 2018.
  • [21] Ou M., Cui P., Pei J., Zhang Z., Zhu W.: Asymmetric transitivity preserving graph embedding. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 1105-1114. ACM, 2016.
  • [22] Pandhre S., Mittal H., Gupta M., Balasubramanian V.: STwalk: learning trajectory representations in temporal graphs. In: Proceedings of the ACM India Joint International Conference on Data Science and Management of Data, pp. 210-219. ACM, 2018.
  • [23] Rozemberczki B., Davies R., Sarkar R., Sutton C.: GEMSEC: Graph Embedding with Self Clustering. In: arXiv preprint arXiv:1802.03997, 2018.
  • [24] Rozemberczki B., Sarkar R.: Fast Sequence Based Embedding with Diffusion Graphs. In: International Conference on Complex Networks. 2018.
  • [25] Tran P.: Learning to Make Predictions on Graphs with Autoencoders. In: arXiv preprint arXiv:1802.08352, 2018.
  • [26] Tsitsulin A., Mottin D., Karras P., Müller E.: VERSE: Versatile Graph Embeddings from Similarity Measures. In: Proceedings of the 2018 World Wide Web Conference on World Wide Web, pp. 539-548. International World Wide Web Conferences Steering Committee, 2018.
  • [27] Velickovic P., Cucurull G., Casanova A., Romero A., Lio P., Bengio Y.: Graph attention networks. In: arXiv preprint arXiv:1710.10903, 2017.
  • [28] Wang Z., Ye X., Wang C., Wu Y., Wang C., Liang K.: RSDNE: Exploring Relaxed Similarity and Dissimilarity from Completely-imbalanced Labels for Network Embedding. In: Network, vol. 11, p. 14, 2018.
  • [29] Wu Z., Pan S., Chen F., Long G., Zhang C., Yu P.S.: A comprehensive survey on graph neural networks. In: arXiv preprint arXiv:1901.00596, 2019.
  • [30] Yang Z., Cohen W., Salakhutdinov R.: Revisiting semi-supervised learning with graph embeddings. In: arXiv preprint arXiv:1603.08861, 2016.
  • [31] Yu B., Yin H., Zhu Z.: Spatio-Temporal Graph Convolutional Networks: A Deep Learning Framework for Traffic Forecasting. In: Proceedings of the International Joint Conference on Artificial Intelligence, pp. 3634-3640. 2018.
  • [32] Zhang M., Cui Z., Neumann M., Chen Y.: An end-to-end deep learning architecture for graph classification. In: Proceedings of the AAAI Conference on Artificial Intelligence. 2018.
  • [33] Zhou J., Cui G., Zhang Z., Yang C., Liu Z., Sun M.: Graph neural networks: A review of methods and applications. In: arXiv preprint arXiv:1812.08434, 2018.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-863e221e-f42b-40e3-a100-ac23fd84f020