
VES: A Mixed-Reality Development Platform of Navigation Systems for Blind and Visually Impaired

Overview
Journal Sensors (Basel)
Publisher MDPI
Specialty Biotechnology
Date 2021 Sep 28
PMID 34577482
Citations 5
Abstract

Herein, we describe the Virtually Enhanced Senses (VES) system, a novel and highly configurable wireless sensor-actuator network conceived as a development and test-bench platform for navigation systems adapted to blind and visually impaired people. It immerses users in "walkable", purely virtual or mixed environments with simulated sensors, allowing navigation system designs to be validated prior to prototype development. Its haptic, acoustic, and proprioceptive feedback supports state-of-the-art sensory substitution devices (SSDs). As examples, three SSDs were integrated into VES, including the well-known "The vOICe". Additionally, the data throughput, latency, and packet loss of the wireless communication can be controlled to observe their impact on the spatial knowledge conveyed and on the resulting mobility and orientation performance. Finally, the system was validated by testing a combination of two previous visual-acoustic and visual-haptic sensory substitution schemes with 23 normally sighted subjects. The recorded data include the output of a "gaze-tracking" utility adapted for SSDs.
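The abstract mentions that VES can degrade the wireless link (throughput, latency, packet loss) to study the effect on spatial perception. As a purely illustrative sketch of that idea, and not the authors' actual implementation, the following hypothetical Python function applies a configurable loss rate and latency jitter to a stream of simulated sensor readings; the function name and parameters are assumptions for illustration only.

```python
import random

def degrade_stream(samples, loss_rate=0.1, base_latency_ms=20.0,
                   jitter_ms=5.0, seed=0):
    """Simulate network QoS degradation on a sequence of sensor readings.

    Each reading is independently dropped with probability `loss_rate`;
    surviving readings receive an arrival latency drawn uniformly from
    base_latency_ms +/- jitter_ms. Returns (value, latency_ms) pairs for
    delivered packets only.
    """
    rng = random.Random(seed)  # seeded for reproducible experiments
    delivered = []
    for value in samples:
        if rng.random() < loss_rate:
            continue  # packet lost in transit
        latency = base_latency_ms + rng.uniform(-jitter_ms, jitter_ms)
        delivered.append((value, latency))
    return delivered
```

A test bench built this way can sweep `loss_rate` and `jitter_ms` while recording a subject's orientation performance, which is the kind of experiment the abstract describes.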

Citing Articles

Navigation Training for Persons With Visual Disability Through Multisensory Assistive Technology: Mixed Methods Experimental Study.

Ricci F, Liguori L, Palermo E, Rizzo J, Porfiri M. JMIR Rehabil Assist Technol. 2024; 11:e55776.

PMID: 39556804 PMC: 11612587. DOI: 10.2196/55776.


A brief reference to AI-driven audible reality (AuRa) in open world: potential, applications, and evaluation.

Ates O, Pandey G, Gousiopoulos A, Soldatos T. Front Artif Intell. 2024; 7:1424371.

PMID: 39525498 PMC: 11543578. DOI: 10.3389/frai.2024.1424371.


EchoSee: An Assistive Mobile Application for Real-Time 3D Environment Reconstruction and Sonification Supporting Enhanced Navigation for People with Vision Impairments.

Schwartz B, King S, Bell T. Bioengineering (Basel). 2024; 11(8).

PMID: 39199789 PMC: 11351581. DOI: 10.3390/bioengineering11080831.


Network QoS Impact on Spatial Perception through Sensory Substitution in Navigation Systems for Blind and Visually Impaired People.

Real S, Araujo A. Sensors (Basel). 2023; 23(6).

PMID: 36991929 PMC: 10051106. DOI: 10.3390/s23063219.


Cross-modal correspondence enhances elevation localization in visual-to-auditory sensory substitution.

Bordeau C, Scalvini F, Migniot C, DuBois J, Ambard M. Front Psychol. 2023; 14:1079998.

PMID: 36777233 PMC: 9909421. DOI: 10.3389/fpsyg.2023.1079998.
