
Gradient Boosting Machines, a Tutorial

Overview
Date 2014 Jan 11
PMID 24409142
Citations 400
Authors
Natekin A, Knoll A
Abstract

Gradient boosting machines are a family of powerful machine-learning techniques that have shown considerable success in a wide range of practical applications. They are highly customizable to the particular needs of the application, for example by being learned with respect to different loss functions. This article gives a tutorial introduction to the methodology of gradient boosting with a strong focus on the machine-learning aspects of modeling. The theoretical background is complemented with descriptive examples and illustrations that cover all stages of gradient boosting model design. Considerations for handling model complexity are discussed. Three practical examples of gradient boosting applications are presented and comprehensively analyzed.
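
The abstract's technical themes (learning with respect to a chosen loss function, the stages of model design, and control of model complexity) can be made concrete with a short sketch. The Python example below is illustrative rather than taken from the paper: it implements gradient boosting for regression with a squared-error loss, using shallow scikit-learn decision trees as base learners; the data, hyperparameter values, and the predict() helper are assumptions made for the example.

```python
# Minimal gradient boosting sketch for regression with squared-error loss.
# Data, hyperparameters, and the predict() helper are illustrative assumptions,
# not material from the tutorial itself.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)  # noisy 1-D regression target

n_rounds = 100        # number of boosting iterations
learning_rate = 0.1   # shrinkage: a key lever for controlling model complexity
max_depth = 2         # shallow trees keep each base learner weak

# Stage 1: start from a constant model (the mean minimizes squared-error loss).
prediction = np.full_like(y, y.mean())
ensemble = []

# Stage 2: repeatedly fit a base learner to the negative gradient of the loss.
for _ in range(n_rounds):
    residuals = y - prediction            # negative gradient of 0.5*(y - f)^2
    tree = DecisionTreeRegressor(max_depth=max_depth)
    tree.fit(X, residuals)
    prediction += learning_rate * tree.predict(X)  # add the shrunken update
    ensemble.append(tree)

# Stage 3: the final model is the sum of all shrunken base-learner contributions.
def predict(X_new, base=float(y.mean())):
    out = np.full(X_new.shape[0], base)
    for tree in ensemble:
        out += learning_rate * tree.predict(X_new)
    return out

print("training MSE:", np.mean((y - predict(X)) ** 2))
```

Using a different loss (for example, absolute error or a classification loss) would replace only the residual computation with the corresponding negative gradient; the rest of the boosting loop stays the same, and the learning rate, tree depth, and number of rounds remain the usual levers for managing complexity.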

Citing Articles

The Application of Machine Learning in Predicting the Permeability of Drugs Across the Blood Brain Barrier.

Jafarpour S, Asefzadeh M, Aboutaleb E. Iran J Pharm Res. 2025; 23(1):e149367.

PMID: 40066117; PMC: 11892787; DOI: 10.5812/ijpr-149367.


Predicting Neoplastic Polyp in Patients With Gallbladder Polyps Using Interpretable Machine Learning Models: Retrospective Cohort Study.

He Z, Yang S, Cao J, Gao H, Peng C. Cancer Med. 2025; 14(5):e70739.

PMID: 40052528; PMC: 11886608; DOI: 10.1002/cam4.70739.


Artificial intelligence in stroke risk assessment and management via retinal imaging.

Khalafi P, Morsali S, Hamidi S, Ashayeri H, Sobhi N, Pedrammehr S. Front Comput Neurosci. 2025; 19:1490603.

PMID: 40034651; PMC: 11872910; DOI: 10.3389/fncom.2025.1490603.


Predicting MBTI personality of YouTube users.

Stracqualursi L, Agati P. Sci Rep. 2025; 15(1):7221.

PMID: 40021696; PMC: 11871013; DOI: 10.1038/s41598-025-85183-z.


Functional Disability and Psychological Impact in Headache Patients: A Comparative Study Using Conventional Statistics and Machine Learning Analysis.

Kim J, Kim H, Sohn J, Hwang S, Lee J, Kwon Y. Medicina (Kaunas). 2025; 61(2).

PMID: 40005305; PMC: 11857184; DOI: 10.3390/medicina61020188.

