BioALBERT: a simple and effective pre-trained language model for biomedical named entity recognition

Usman Naseem, Matloob Khushi, Vinay Reddy, Sakthivel Rajendran, Imran Razzak, Jinman Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

42 Citations (Scopus)

Abstract

In recent years, with the growing volume of biomedical documents and advances in natural language processing algorithms, research on biomedical named entity recognition (BioNER) has increased exponentially. However, BioNER research is challenging because NER in the biomedical domain is: (i) often restricted by the limited amount of training data, (ii) complicated by entities that can refer to multiple types and concepts depending on context, and (iii) heavily reliant on acronyms that are sub-domain specific. Existing BioNER approaches often neglect these issues and directly adopt state-of-the-art (SOTA) models trained on general corpora, which often yields unsatisfactory results. We propose biomedical ALBERT (A Lite Bidirectional Encoder Representations from Transformers for Biomedical Text Mining) - BioALBERT - an effective domain-specific pre-trained language model trained on a large biomedical corpus and designed to capture biomedical context-dependent NER. We adopted the self-supervised loss function used in ALBERT, which targets modelling inter-sentence coherence, to better learn context-dependent representations, and incorporated parameter reduction strategies to minimise memory usage and reduce training time in BioNER. In our experiments, BioALBERT outperformed comparative SOTA BioNER models on 8 biomedical NER benchmark datasets spanning 4 entity types. Performance increased for: (i) disease-type corpora by 7.47% (NCBI-disease) and 10.63% (BC5CDR-disease); (ii) drug/chemical-type corpora by 4.61% (BC5CDR-Chem) and 3.89% (BC4CHEMD); (iii) gene/protein-type corpora by 12.25% (BC2GM) and 6.42% (JNLPBA); and (iv) species-type corpora by 6.19% (LINNAEUS) and 23.71% (Species-800), yielding state-of-the-art results. The performance of the proposed model across four biomedical entity types shows that it is robust and generalisable in recognising biomedical entities in text.
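The parameter reduction the abstract refers to comes from ALBERT's architecture, notably its factorized embedding parameterization: instead of a single vocabulary-by-hidden embedding matrix (V × H), ALBERT uses a smaller intermediate embedding dimension E (V × E followed by E × H, with E ≪ H). The sketch below is illustrative only - it is not code from the paper, and the example sizes are assumptions roughly matching ALBERT-base defaults:

```python
# Illustrative sketch (not from the paper): parameter count for the
# token-embedding block, with and without ALBERT-style factorization.
def embedding_params(vocab_size, hidden_size, embed_size=None):
    """Without factorization (BERT-style): V x H parameters.
    With factorization (ALBERT-style):  V x E + E x H, where E << H."""
    if embed_size is None:
        return vocab_size * hidden_size
    return vocab_size * embed_size + embed_size * hidden_size

# Assumed example sizes, roughly ALBERT-base defaults (V=30k, H=768, E=128).
V, H, E = 30_000, 768, 128
bert_style = embedding_params(V, H)        # 23,040,000 parameters
albert_style = embedding_params(V, H, E)   # 3,938,304 parameters
print(f"embedding-block saving: {1 - albert_style / bert_style:.1%}")
```

Cross-layer parameter sharing (reusing one set of transformer-layer weights across all layers) reduces the encoder's parameter count further; together these are what let BioALBERT pre-train on a large biomedical corpus with lower memory usage.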

Original language: English
Title of host publication: 2021 International Joint Conference on Neural Networks (IJCNN) proceedings
Place of publication: Piscataway, NJ
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Number of pages: 7
ISBN (Electronic): 9781665439008, 9780738133669
ISBN (Print): 9781665445979
DOIs
Publication status: Published - 2021
Externally published: Yes
Event: 2021 International Joint Conference on Neural Networks, IJCNN 2021 - Virtual, Shenzhen, China
Duration: 18 Jul 2021 - 22 Jul 2021

Publication series

Name
ISSN (Print): 2161-4393
ISSN (Electronic): 2161-4407

Conference

Conference: 2021 International Joint Conference on Neural Networks, IJCNN 2021
Country/Territory: China
City: Virtual, Shenzhen
Period: 18/07/21 - 22/07/21

