Hierarchical Harris Hawks Optimizer for Feature Selection

Overview
Journal J Adv Res
Date 2023 Jan 23
PMID 36690206
Abstract

Introduction: The main feature selection methods are filter, wrapper-based, and embedded methods. A wrapper method relies on a search algorithm, typically a swarm intelligence algorithm, to explore candidate feature subsets, so its performance is closely tied to the quality of that algorithm. It is therefore essential to choose and design a suitable algorithm to improve wrapper-based feature selection. Harris hawks optimization (HHO) is a recently introduced optimization approach with a high convergence rate and a powerful global search capability, but its performance on high-dimensional or complex problems is unsatisfactory. We therefore introduce a hierarchy into HHO to improve its ability to handle complex problems and feature selection.
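As a rough illustration of how a wrapper method couples the optimizer with a classifier, the sketch below scores a binary feature mask by a weighted sum of classification error and subset size. It is a generic template rather than this paper's exact setup: the KNN classifier, 5-fold cross-validation, and the weight alpha = 0.99 are assumptions commonly seen in wrapper-based feature selection.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def wrapper_fitness(mask, X, y, alpha=0.99):
    """Fitness of a binary feature mask: a weighted sum of the
    cross-validated error rate and the fraction of selected features
    (both terms are minimized). Classifier, CV folds, and alpha are
    illustrative assumptions, not the paper's reported configuration."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:                 # an empty subset cannot be evaluated
        return 1.0
    clf = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(clf, X[:, selected], y, cv=5).mean()
    feat_ratio = selected.size / X.shape[1]
    return alpha * (1.0 - acc) + (1.0 - alpha) * feat_ratio
```

In this formulation the optimizer only proposes masks; every fitness evaluation retrains the classifier, which is why the quality of the search algorithm dominates both accuracy and running time.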

Objectives: To obtain good accuracy with fewer features and a shorter running time in feature selection, we improved HHO and named the result EHHO. On 30 UCI datasets, EHHO achieves very high classification accuracy with less running time and fewer selected features.

Methods: We first conducted extensive experiments on 23 classical benchmark functions and compared EHHO with many state-of-the-art metaheuristic algorithms. We then transformed EHHO into binary EHHO (bEHHO) through a conversion (transfer) function and verified its feature selection ability on 30 UCI datasets.
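The abstract does not specify which conversion function bEHHO uses; the sketch below shows the S-shaped (sigmoid) transfer function that is a common choice for binarizing continuous metaheuristics, purely as an assumed stand-in.

```python
import numpy as np

def binarize_position(position, rng=None):
    """Map a continuous hawk position vector to a binary feature mask via
    a sigmoid transfer function. The sigmoid is an assumption; the paper's
    actual conversion function is not stated in the abstract."""
    rng = np.random.default_rng() if rng is None else rng
    prob = 1.0 / (1.0 + np.exp(-position))      # per-dimension selection probability
    return (rng.random(position.shape) < prob).astype(int)
```

Each 1 in the returned mask marks a feature that is kept when the wrapper fitness of that position is evaluated.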

Results: Experiments on the 23 benchmark functions show that EHHO converges faster and to better minima than its peers. Compared with HHO, EHHO also significantly remedies HHO's weakness in dealing with complex functions. Moreover, on 30 datasets from the UCI repository, bEHHO outperforms the other comparative optimization algorithms.

Conclusion: Compared with the original bHHO, bEHHO achieves excellent classification accuracy with fewer features and also runs faster.

Citing Articles

An intelligent attention based deep convoluted learning (IADCL) model for smart healthcare security.

Maruthupandi J, Sivakumar S, Dhevi B, Prasanna S, Priya R, Selvarajan S Sci Rep. 2025; 15(1):1363.

PMID: 39779774 PMC: 11711617. DOI: 10.1038/s41598-024-84691-8.


Multipopulation Whale Optimization-Based Feature Selection Algorithm and Its Application in Human Fall Detection Using Inertial Measurement Unit Sensors.

Cao H, Yan B, Dong L, Yuan X Sensors (Basel). 2025; 24(24).

PMID: 39771617 PMC: 11678948. DOI: 10.3390/s24247879.


A Novel Adaptive Sand Cat Swarm Optimization Algorithm for Feature Selection and Global Optimization.

Liu R, Fang R, Zeng T, Fei H, Qi Q, Zuo P Biomimetics (Basel). 2024; 9(11).

PMID: 39590273 PMC: 11591711. DOI: 10.3390/biomimetics9110701.


Enhancing Oral Squamous Cell Carcinoma Detection Using Histopathological Images: A Deep Feature Fusion and Improved Haris Hawks Optimization-Based Framework.

Zafar A, Khalid M, Farrash M, Qadah T, Lahza H, Kim S Bioengineering (Basel). 2024; 11(9).

PMID: 39329655 PMC: 11429398. DOI: 10.3390/bioengineering11090913.


A Multi-Omics, Machine Learning-Aware, Genome-Wide Metabolic Model of Bacillus Subtilis Refines the Gene Expression and Cell Growth Prediction.

Bi X, Cheng Y, Lv X, Liu Y, Li J, Du G Adv Sci (Weinh). 2024; 11(42):e2408705.

PMID: 39287062 PMC: 11558093. DOI: 10.1002/advs.202408705.

