
Environment Classification for Robotic Leg Prostheses and Exoskeletons Using Deep Convolutional Neural Networks

Overview
Date: 2022 Feb 21
PMID: 35185507
Abstract

Robotic leg prostheses and exoskeletons can provide powered locomotor assistance to older adults and/or persons with physical disabilities. However, the current locomotion mode recognition systems being developed for automated high-level control and decision-making rely on mechanical, inertial, and/or neuromuscular sensors, which inherently have limited prediction horizons (i.e., analogous to walking blindfolded). Inspired by the human vision-locomotor control system, we developed an environment classification system powered by computer vision and deep learning to predict the oncoming walking environments prior to physical interaction, therein allowing for more accurate and robust high-level control decisions. In this study, we first reviewed the development of our "ExoNet" database-the largest and most diverse open-source dataset of wearable camera images of indoor and outdoor real-world walking environments, which were annotated using a hierarchical labeling architecture. We then trained and tested over a dozen state-of-the-art deep convolutional neural networks (CNNs) on the ExoNet database for image classification and automatic feature engineering, including: EfficientNetB0, InceptionV3, MobileNet, MobileNetV2, VGG16, VGG19, Xception, ResNet50, ResNet101, ResNet152, DenseNet121, DenseNet169, and DenseNet201. Finally, we quantitatively compared the benchmarked CNN architectures and their environment classification predictions using an operational metric called "NetScore," which balances the image classification accuracy with the computational and memory storage requirements (i.e., important for onboard real-time inference with mobile computing devices). Our comparative analyses showed that the EfficientNetB0 network achieves the highest test accuracy; VGG16 the fastest inference time; and MobileNetV2 the best NetScore, which can inform the optimal architecture design or selection depending on the desired performance. Overall, this study provides a large-scale benchmark and reference for next-generation environment classification systems for robotic leg prostheses and exoskeletons.
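As a rough illustration of the kind of benchmarking described in the abstract, the sketch below defines a NetScore-style metric and builds a transfer-learning classifier on a pretrained MobileNetV2 backbone with tf.keras. It is a minimal sketch, not the authors' implementation: it assumes the 20*log10(a^kappa / (p^beta * m^gamma)) form of NetScore with the commonly cited default exponents (kappa=2, beta=0.5, gamma=0.5) and parameters/MACs expressed in millions, and the class count and accuracy/parameter/MAC figures are illustrative assumptions rather than values from the ExoNet benchmark.

    import math
    import tensorflow as tf

    def netscore(accuracy_pct, params_millions, macs_millions,
                 kappa=2.0, beta=0.5, gamma=0.5):
        """NetScore-style metric: rewards classification accuracy while
        penalizing parameter count and multiply-accumulate (MAC) operations.
        Exponents and unit scaling are assumed defaults, not ExoNet values."""
        return 20.0 * math.log10(
            accuracy_pct ** kappa /
            (params_millions ** beta * macs_millions ** gamma)
        )

    # Transfer-learning head on a pretrained backbone (MobileNetV2 shown here;
    # any tf.keras.applications model could be swapped in the same way).
    NUM_CLASSES = 12  # hypothetical number of walking-environment classes
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights="imagenet")
    base.trainable = False  # freeze the backbone; train only the new head
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Illustrative NetScore comparison using made-up accuracy/size figures.
    print(netscore(accuracy_pct=73.2, params_millions=3.5, macs_millions=300.0))

In this setup, trading a few points of accuracy for a much smaller and cheaper backbone can still raise the NetScore, which is why a compact network such as MobileNetV2 can outrank a more accurate but heavier one when onboard real-time inference is the constraint.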

Citing Articles

Review on Portable-Powered Lower Limb Exoskeletons.

Jiang C, Xiao J, Wei H, Wang M, Chen C. Sensors (Basel). 2025; 24(24).

PMID: 39771825 PMC: 11679449. DOI: 10.3390/s24248090.


L-AVATeD: The lidar and visual walking terrain dataset.

Whipps D, Ippersiel P, Dixon P. Front Robot AI. 2024; 11:1384575.

PMID: 39697766 PMC: 11653013. DOI: 10.3389/frobt.2024.1384575.


Review of Vision-Based Environmental Perception for Lower-Limb Exoskeleton Robots.

Wang C, Pei Z, Fan Y, Qiu S, Tang Z. Biomimetics (Basel). 2024; 9(4).

PMID: 38667265 PMC: 11048416. DOI: 10.3390/biomimetics9040254.


StairNet: visual recognition of stairs for human-robot locomotion.

Kurbis A, Kuzmenko D, Ivanyuk-Skulskiy B, Mihailidis A, Laschowski B. Biomed Eng Online. 2024; 23(1):20.

PMID: 38360664 PMC: 10870468. DOI: 10.1186/s12938-024-01216-0.


Insight into the Digital Health System of Ukraine (eHealth): Trends, Definitions, Standards, and Legislative Revisions.

Malakhov K. Int J Telerehabil. 2024; 15(2):e6599.

PMID: 38162941 PMC: 10754247. DOI: 10.5195/ijt.2023.6599.

