TY - JOUR
T1 - Concept representation by learning explicit and implicit concept couplings
AU - Lu, Wenpeng
AU - Zhang, Yuteng
AU - Wang, Shoujin
AU - Huang, Heyan
AU - Liu, Qian
AU - Luo, Sheng
PY - 2021/1
Y1 - 2021/1
N2 - Generating a precise semantic representation of a word or concept is a fundamental task in natural language processing. Recent studies that incorporate semantic knowledge into word embeddings have shown potential for improving the semantic representation of a concept. However, existing approaches achieve only limited performance improvement because they typically 1) model a word's semantics from some explicit aspects while ignoring the intrinsic aspects of the word, 2) treat semantic knowledge as a mere supplement to word embeddings, and 3) consider only partial relations between concepts while ignoring the rich coupling relations between them, such as explicit concept co-occurrences in descriptive texts in a corpus, concept hyperlink relations in a knowledge network, and implicit couplings between concept co-occurrences and hyperlinks. In human consciousness, a concept is always associated with various couplings that exist within and between descriptive texts and knowledge networks, which inspires us to capture as many concept couplings as possible to build a more informative concept representation. We thus propose a neural coupled concept representation (CoupledCR) framework and its instantiation, a coupled concept embedding (CCE) model. CCE first learns two types of explicit couplings, based on concept co-occurrences and hyperlink relations, respectively, and then learns a type of high-level implicit coupling between these two types of explicit couplings for better concept representation. Extensive experimental results on six real-world datasets show that CCE significantly outperforms eight state-of-the-art word embedding and semantic representation methods.
AB - Generating a precise semantic representation of a word or concept is a fundamental task in natural language processing. Recent studies that incorporate semantic knowledge into word embeddings have shown potential for improving the semantic representation of a concept. However, existing approaches achieve only limited performance improvement because they typically 1) model a word's semantics from some explicit aspects while ignoring the intrinsic aspects of the word, 2) treat semantic knowledge as a mere supplement to word embeddings, and 3) consider only partial relations between concepts while ignoring the rich coupling relations between them, such as explicit concept co-occurrences in descriptive texts in a corpus, concept hyperlink relations in a knowledge network, and implicit couplings between concept co-occurrences and hyperlinks. In human consciousness, a concept is always associated with various couplings that exist within and between descriptive texts and knowledge networks, which inspires us to capture as many concept couplings as possible to build a more informative concept representation. We thus propose a neural coupled concept representation (CoupledCR) framework and its instantiation, a coupled concept embedding (CCE) model. CCE first learns two types of explicit couplings, based on concept co-occurrences and hyperlink relations, respectively, and then learns a type of high-level implicit coupling between these two types of explicit couplings for better concept representation. Extensive experimental results on six real-world datasets show that CCE significantly outperforms eight state-of-the-art word embedding and semantic representation methods.
KW - Concept representation
KW - Coupling learning
KW - Non-IID learning
KW - Representation learning
KW - Word similarity
UR - http://www.scopus.com/inward/record.url?scp=85090473684&partnerID=8YFLogxK
U2 - 10.1109/MIS.2020.3021188
DO - 10.1109/MIS.2020.3021188
M3 - Article
SN - 1541-1672
VL - 36
SP - 6
EP - 15
JO - IEEE Intelligent Systems
JF - IEEE Intelligent Systems
IS - 1
ER -