A framework of transferring structures across large-scale information networks

Shan Xue, Jie Lu, Guangquan Zhang, Li Xiong

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

1 Citation (Scopus)

Abstract

Existing domain-specific methods for mining information networks in machine learning aim to represent the nodes of an information network in a vector format. However, representations learned from a single network generalize poorly to real-world large-scale information networks: when structural information is transferred from one network to another, the performance of the network representation may decrease sharply. To address this problem, we propose a novel framework for transferring useful information across relational large-scale information networks (FTLSIN). The framework employs a two-layer random walk to measure the relations between two networks and to predict links across them. Experiments on real-world datasets demonstrate the effectiveness of the proposed model.
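The abstract gives no implementation details of the two-layer random walk. As an illustration only, a cross-network walk over two toy graphs might look like the following sketch; the function names, the anchor-link dictionary, and the 0.5 crossing probability are assumptions for demonstration, not taken from the paper:

```python
import random

def random_walk(adj, start, length, rng):
    """Generate one truncated random walk over an adjacency dict."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = adj.get(walk[-1], [])
        if not neighbors:
            break
        walk.append(rng.choice(neighbors))
    return walk

def two_layer_walks(adj_src, adj_tgt, cross_links, length, rng):
    """Walks that may hop from a source network to a target network via
    known cross-network (anchor) links, loosely mirroring the idea of
    relating two networks through a layered walk."""
    walks = []
    for start in adj_src:
        walk = [("src", start)]
        for _ in range(length - 1):
            layer, node = walk[-1]
            # With probability 0.5 (an assumed value), cross to the other
            # network when an anchor link exists for the current node.
            if layer == "src" and node in cross_links and rng.random() < 0.5:
                walk.append(("tgt", cross_links[node]))
                continue
            adj = adj_src if layer == "src" else adj_tgt
            neighbors = adj.get(node, [])
            if not neighbors:
                break
            walk.append((layer, rng.choice(neighbors)))
        walks.append(walk)
    return walks
```

The resulting walks could then feed a skip-gram-style embedding model, so that nodes from both networks share one representation space, which is the usual way walk-based network representations are trained.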

Original language: English
Title of host publication: 2018 International Joint Conference on Neural Networks, IJCNN 2018
Subtitle of host publication: Proceedings
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 1-6
Number of pages: 6
Volume: 2018-July
ISBN (Electronic): 9781509060146
DOIs
Publication status: Published - 10 Oct 2018
Externally published: Yes
Event: 2018 International Joint Conference on Neural Networks, IJCNN 2018 - Rio de Janeiro, Brazil
Duration: 8 Jul 2018 – 13 Jul 2018

Conference

Conference: 2018 International Joint Conference on Neural Networks, IJCNN 2018
Country/Territory: Brazil
City: Rio de Janeiro
Period: 8/07/18 – 13/07/18
