
Transfer Learning for Biomedical Named Entity Recognition with Neural Networks

Overview
Journal Bioinformatics
Specialty Biology
Date 2018 Jun 6
PMID 29868832
Citations 39
Abstract

Motivation: The explosive increase of biomedical literature has made information extraction an increasingly important tool for biomedical research. A fundamental task is the recognition of biomedical named entities in text (BNER), such as genes/proteins, diseases and species. Recently, a domain-independent method based on deep learning and statistical word embeddings, called long short-term memory network-conditional random field (LSTM-CRF), has been shown to outperform state-of-the-art entity-specific BNER tools. However, this method is dependent on gold-standard corpora (GSCs) consisting of hand-labeled entities, which tend to be small but highly reliable. An alternative to GSCs is silver-standard corpora (SSCs), which are generated by harmonizing the annotations made by several automatic annotation systems. SSCs typically contain more noise than GSCs but have the advantage of containing many more training examples. Ideally, these corpora could be combined to achieve the benefits of both, which is an opportunity for transfer learning. In this work, we analyze to what extent transfer learning improves upon state-of-the-art results for BNER.

Results: We demonstrate that transferring a deep neural network (DNN) trained on a large, noisy SSC to a smaller, but more reliable GSC significantly improves upon state-of-the-art results for BNER. Compared to a state-of-the-art baseline evaluated on 23 GSCs covering four different entity classes, transfer learning results in an average reduction in error of approximately 11%. We found transfer learning to be especially beneficial for target datasets with a small number of labels (approximately 6000 or fewer).
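The two-phase scheme described above (pretrain on a large, noisy SSC, then continue training on a small, clean GSC) can be illustrated with a minimal toy sketch. This is not the paper's BiLSTM-CRF: it substitutes a plain linear softmax classifier trained by gradient descent on synthetic data, and every name and dataset below is hypothetical, chosen only to show that the fine-tuning phase starts from the pretrained weights rather than from scratch.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(W, X, y, lr=0.1, epochs=50):
    """Train a one-layer softmax classifier by full-batch gradient descent,
    starting from the weights W (this is what makes transfer possible)."""
    for _ in range(epochs):
        logits = X @ W
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        p[np.arange(len(y)), y] -= 1.0                # dL/dlogits for cross-entropy
        W = W - lr * (X.T @ p) / len(y)
    return W

# Hypothetical stand-ins for the corpora in the paper: a large, noisy "SSC"
# and a small, clean "GSC" drawn from the same underlying labeling function.
n_feat, n_cls = 20, 4
true_W = rng.normal(size=(n_feat, n_cls))

def make_corpus(n, noise):
    X = rng.normal(size=(n, n_feat))
    y = (X @ true_W).argmax(axis=1)
    flip = rng.random(n) < noise                      # simulate label noise
    y[flip] = rng.integers(0, n_cls, flip.sum())
    return X, y

X_ssc, y_ssc = make_corpus(5000, noise=0.30)          # silver standard: big, noisy
X_gsc, y_gsc = make_corpus(100,  noise=0.02)          # gold standard: small, clean
X_test, y_test = make_corpus(1000, noise=0.0)

def accuracy(W):
    return ((X_test @ W).argmax(axis=1) == y_test).mean()

# Baseline: train on the small GSC only, from random (here zero) initialization.
W_base = train(np.zeros((n_feat, n_cls)), X_gsc, y_gsc)

# Transfer: pretrain on the SSC, then fine-tune the SAME weights on the GSC.
W_pre = train(np.zeros((n_feat, n_cls)), X_ssc, y_ssc)
W_tl  = train(W_pre, X_gsc, y_gsc)

print(f"GSC-only accuracy: {accuracy(W_base):.3f}")
print(f"Transfer accuracy: {accuracy(W_tl):.3f}")
```

The design choice mirrored from the paper is only the training schedule: the pretrained weights serve as the initialization for fine-tuning, which is what lets the small gold corpus correct, rather than relearn, what the noisy silver corpus taught.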

Availability And Implementation: Source code for the LSTM-CRF is available at https://github.com/Franck-Dernoncourt/NeuroNER/ and links to the corpora are available at https://github.com/BaderLab/Transfer-Learning-BNER-Bioinformatics-2018/.

Supplementary Information: Supplementary data are available at Bioinformatics online.

Citing Articles

Natural language processing (NLP) to facilitate abstract review in medical research: the application of BioBERT to exploring the 20-year use of NLP in medical research.

Masoumi S, Amirkhani H, Sadeghian N, Shahraz S. Syst Rev. 2024; 13(1):107.

PMID: 38622611 PMC: 11020656. DOI: 10.1186/s13643-024-02470-y.


Transfer learning-based English translation text classification in a multimedia network environment.

Zheng D. PeerJ Comput Sci. 2024; 10:e1842.

PMID: 38435557 PMC: 10909173. DOI: 10.7717/peerj-cs.1842.


Synchronous Mutual Learning Network and Asynchronous Multi-Scale Embedding Network for miRNA-Disease Association Prediction.

Sun W, Zhang P, Zhang W, Xu J, Huang Y, Li L. Interdiscip Sci. 2024; 16(3):532-553.

PMID: 38310628 DOI: 10.1007/s12539-023-00602-x.


S1000: a better taxonomic name corpus for biomedical information extraction.

Luoma J, Nastou K, Ohta T, Toivonen H, Pafilis E, Jensen L. Bioinformatics. 2023; 39(6).

PMID: 37289518 PMC: 10281857. DOI: 10.1093/bioinformatics/btad369.


Review: A Roadmap to Use Nonstructured Data to Discover Multitarget Cancer Therapies.

Scoarta S, Kucukosmanoglu A, Bindt F, Pouwer M, Westerman B. JCO Clin Cancer Inform. 2023; 7:e2200096.

PMID: 37116097 PMC: 10281332. DOI: 10.1200/CCI.22.00096.

