Language Models for the Prediction of SARS-CoV-2 Inhibitors
Overview
The COVID-19 pandemic highlights the need for computational tools to automate and accelerate drug design for novel protein targets. We leverage deep learning language models to generate and score drug candidates based on predicted protein binding affinity. We pre-trained a deep learning language model (BERT) on ∼9.6 billion molecules and achieved a peak performance of 603 petaflops in mixed precision. Our work reduces pre-training time from days to hours compared to previous efforts with this architecture, while also increasing the dataset size by nearly an order of magnitude. For scoring, we fine-tuned the language model on an assembled set of thousands of protein targets with binding-affinity data and searched for inhibitors of two specific protein targets, SARS-CoV-2 Mpro and PLpro. To find optimal candidates, we used a genetic algorithm that couples the generation and scoring capabilities of the language model. Our generalizable models accelerate the identification of inhibitors for emerging therapeutic targets.