
NeRD: a Multichannel Neural Network to Predict Cellular Response of Drugs by Integrating Multidimensional Data

Overview
Journal: BMC Med
Publisher: BioMed Central
Specialty: General Medicine
Date: 2022 Oct 16
PMID: 36244991
Abstract

Background: Given the heterogeneity of tumors, predicting each individual's drug response is a key issue in precision medicine. The accumulation of diverse drug informatics and multi-omics data facilitates the development of efficient models for drug response prediction. However, selecting high-quality data sources and designing suitable methods remain challenging.

Methods: In this paper, we design NeRD, a multidimensional data integration model based on the PRISM drug response database, to predict the cellular response of drugs. Four feature extractors are designed for different types and dimensions of data: a drug structure extractor (DSE), a molecular fingerprint extractor (MFE), a miRNA expression extractor (mEE), and a copy number extractor (CNE). A fully connected network fuses all extracted features and makes the final prediction.
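For illustration, the sketch below shows how such a multichannel architecture can be wired together: one branch per feature extractor, with the branch outputs concatenated and passed to a fully connected fusion network that outputs a single response value. This is a minimal PyTorch sketch, not the authors' implementation; all layer sizes, input dimensions, and names (NeRDSketch, smiles_dim, fp_dim, mirna_dim, cnv_dim) are illustrative assumptions rather than values taken from the paper.

```python
# Minimal sketch of a multichannel fusion network in the spirit of NeRD.
# All layer sizes, input dimensions, and module choices are illustrative
# assumptions, not the architecture reported in the paper.
import torch
import torch.nn as nn


class NeRDSketch(nn.Module):
    def __init__(self, smiles_dim=100, fp_dim=881, mirna_dim=734, cnv_dim=23316):
        super().__init__()
        # Drug structure extractor (DSE): here a 1-D CNN over a numerically encoded SMILES sequence.
        self.dse = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7), nn.ReLU(),
            nn.AdaptiveMaxPool1d(8), nn.Flatten(),   # -> 16 * 8 = 128 features
        )
        # Molecular fingerprint extractor (MFE): dense layer over a binary fingerprint vector.
        self.mfe = nn.Sequential(nn.Linear(fp_dim, 128), nn.ReLU())
        # miRNA expression extractor (mEE).
        self.mee = nn.Sequential(nn.Linear(mirna_dim, 128), nn.ReLU())
        # Copy number extractor (CNE).
        self.cne = nn.Sequential(nn.Linear(cnv_dim, 128), nn.ReLU())
        # Fully connected fusion network predicting one response value per drug-cell-line pair.
        self.fusion = nn.Sequential(
            nn.Linear(128 * 4, 256), nn.ReLU(),
            nn.Dropout(0.2),
            nn.Linear(256, 1),
        )

    def forward(self, smiles_enc, fingerprint, mirna, cnv):
        h = torch.cat([
            self.dse(smiles_enc.unsqueeze(1)),   # (batch, smiles_dim) -> (batch, 128)
            self.mfe(fingerprint),
            self.mee(mirna),
            self.cne(cnv),
        ], dim=1)
        return self.fusion(h).squeeze(-1)        # (batch,) predicted cellular responses


if __name__ == "__main__":
    model = NeRDSketch()
    batch = 4
    pred = model(
        torch.randn(batch, 100),                      # encoded SMILES string
        torch.randint(0, 2, (batch, 881)).float(),    # binary molecular fingerprint bits
        torch.randn(batch, 734),                      # miRNA expression profile
        torch.randn(batch, 23316),                    # gene-level copy number profile
    )
    print(pred.shape)                                 # torch.Size([4])
```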

Results: Experimental results demonstrate the effective integration of the global and local structural features of drugs, as well as cell-line features from different omics data. NeRD surpassed previous approaches on all metrics tested on the PRISM database. We also verified that NeRD produces reliable predictions for new samples. Moreover, unlike the other algorithms, NeRD maintained stable performance when the amount of training data was reduced.

Conclusions: NeRD's feature fusion strategy offers a new approach to drug response prediction, which is of great significance for precision cancer treatment.

Citing Articles

DD-PRiSM: a deep learning framework for decomposition and prediction of synergistic drug combinations.

Jin I, Lee S, Schmuhalek M, Nam H Brief Bioinform. 2025; 26(1).

PMID: 39800875 PMC: 11725392. DOI: 10.1093/bib/bbae717.


TransCDR: a deep learning model for enhancing the generalizability of drug activity prediction through transfer learning and multimodal data fusion.

Xia X, Zhu C, Zhong F, Liu L BMC Biol. 2024; 22(1):227.

PMID: 39385185 PMC: 11462810. DOI: 10.1186/s12915-024-02023-8.


Multi-output prediction of dose-response curves enables drug repositioning and biomarker discovery.

Gutierrez J, Lau E, Dharmapalan S, Parker M, Chen Y, Alvarez M NPJ Precis Oncol. 2024; 8(1):209.

PMID: 39304771 PMC: 11415488. DOI: 10.1038/s41698-024-00691-x.


A comprehensive benchmarking of machine learning algorithms and dimensionality reduction methods for drug sensitivity prediction.

Eckhart L, Lenhof K, Rolli L, Lenhof H Brief Bioinform. 2024; 25(4).

PMID: 38797968 PMC: 11128483. DOI: 10.1093/bib/bbae242.


CPADS: a web tool for comprehensive pancancer analysis of drug sensitivity.

Li K, Yang H, Lin A, Xie J, Wang H, Zhou J Brief Bioinform. 2024; 25(3).

PMID: 38770717 PMC: 11106634. DOI: 10.1093/bib/bbae237.

