TY - CONF
T1 - BioALBERT: A Simple and Effective Pre-trained Language Model for Biomedical Named Entity Recognition
T2 - 2021 International Joint Conference on Neural Networks, IJCNN 2021
AU - Naseem, Usman
AU - Khushi, Matloob
AU - Reddy, Vinay
AU - Rajendran, Sakthivel
AU - Razzak, Imran
AU - Kim, Jinman
PY - 2021
AB - In recent years, with the growing number of biomedical documents and advances in natural language processing algorithms, research on biomedical named entity recognition (BioNER) has increased exponentially. However, BioNER remains challenging because: (i) training data are often limited; (ii) an entity can refer to multiple types and concepts depending on its context; and (iii) the domain relies heavily on sub-domain-specific acronyms. Existing BioNER approaches often neglect these issues and directly adopt state-of-the-art (SOTA) models trained on general corpora, which often yields unsatisfactory results. We propose biomedical ALBERT (A Lite Bidirectional Encoder Representations from Transformers for Biomedical Text Mining) - BioALBERT - an effective domain-specific pre-trained language model trained on a large biomedical corpus and designed to capture context-dependent biomedical entities. We adopted the self-supervised loss function used in ALBERT, which targets modelling inter-sentence coherence, to better learn context-dependent representations, and incorporated parameter-reduction strategies to minimise memory usage and reduce training time. In our experiments, BioALBERT outperformed comparative SOTA BioNER models on 8 biomedical NER benchmark datasets covering 4 entity types. Performance improved on: (i) disease corpora by 7.47% (NCBI-disease) and 10.63% (BC5CDR-disease); (ii) drug/chemical corpora by 4.61% (BC5CDR-Chem) and 3.89% (BC4CHEMD); (iii) gene/protein corpora by 12.25% (BC2GM) and 6.42% (JNLPBA); and (iv) species corpora by 6.19% (LINNAEUS) and 23.71% (Species-800), yielding state-of-the-art results. The performance of the proposed model across four biomedical entity types shows that it is robust and generalisable in recognising biomedical entities in text.
UR - http://www.scopus.com/inward/record.url?scp=85116423697&partnerID=8YFLogxK
DO - 10.1109/IJCNN52387.2021.9533884
M3 - Conference proceeding contribution
SN - 9781665445979
BT - 2021 International Joint Conference on Neural Networks (IJCNN) proceedings
PB - Institute of Electrical and Electronics Engineers (IEEE)
CY - Piscataway, NJ
Y2 - 18 July 2021 through 22 July 2021
ER -