
Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks

Overview
Date 2021 Sep 8
PMID 34495862
Citations 3
Abstract

This article deals with nonconvex stochastic optimization problems in deep learning. Theoretically grounded learning rates are provided for adaptive-learning-rate optimization algorithms (e.g., Adam and AMSGrad) to approximate the stationary points of such problems, and these rates are shown to allow faster convergence than previously reported for these algorithms. Specifically, the algorithms are examined in numerical experiments on text and image classification, where they perform better with constant learning rates than with diminishing learning rates.
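
To make the constant-versus-diminishing comparison concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' code): it trains a toy model with Adam in its AMSGrad variant, once with a fixed learning rate and once with an illustrative 1/sqrt(t+1) per-iteration decay. The model, data, base learning rate, and schedule are placeholder assumptions, not values from the paper.

    # Minimal sketch (assumed setup, not the paper's code): Adam/AMSGrad with
    # a constant learning rate vs. a diminishing lr_t = lr0 / sqrt(t + 1).
    import torch
    import torch.nn as nn

    def train(use_diminishing: bool, steps: int = 200) -> float:
        torch.manual_seed(0)
        model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
        loss_fn = nn.CrossEntropyLoss()
        # amsgrad=True switches Adam to its AMSGrad variant.
        opt = torch.optim.Adam(model.parameters(), lr=1e-3, amsgrad=True)
        # LambdaLR scales the base lr by the returned factor at each step().
        sched = torch.optim.lr_scheduler.LambdaLR(opt, lambda t: 1.0 / (t + 1) ** 0.5)

        for _ in range(steps):
            x = torch.randn(64, 10)              # synthetic batch (placeholder data)
            y = torch.randint(0, 2, (64,))
            loss = loss_fn(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
            if use_diminishing:
                sched.step()                     # decay the learning rate each iteration
        return loss.item()

    print("constant lr   :", train(use_diminishing=False))
    print("diminishing lr:", train(use_diminishing=True))

Stepping the scheduler every iteration yields the diminishing schedule lr_t = lr0 / sqrt(t + 1); omitting it keeps the learning rate constant, the setting the abstract reports as performing better in the experiments.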

Citing Articles

A rate of penetration (ROP) prediction method based on improved dung beetle optimization algorithm and BiLSTM-SA.

Xiong M, Zheng S, Liu W, Cheng R, Wang L, Zhang H. Sci Rep. 2024;14(1):25856.

PMID: 39468121; PMC: 11519518; DOI: 10.1038/s41598-024-75703-8.


CrnnCrispr: An Interpretable Deep Learning Method for CRISPR/Cas9 sgRNA On-Target Activity Prediction.

Zhu W, Xie H, Chen Y, Zhang G. Int J Mol Sci. 2024;25(8).

PMID: 38674012; PMC: 11050447; DOI: 10.3390/ijms25084429.


Recognition of Wheat Leaf Diseases Using Lightweight Convolutional Neural Networks against Complex Backgrounds.

Wen X, Zeng M, Chen J, Maimaiti M, Liu Q. Life (Basel). 2023;13(11).

PMID: 38004265; PMC: 10672231; DOI: 10.3390/life13112125.