
A-SEE: Active-Sensing End-effector Enabled Probe Self-Normal-Positioning for Robotic Ultrasound Imaging Applications

Overview
Date 2023 Jun 16
PMID 37325198
Abstract

Conventional manual ultrasound (US) imaging is a physically demanding procedure for sonographers. A robotic US system (RUSS) has the potential to overcome this limitation by automating and standardizing the imaging procedure. It also extends ultrasound accessibility to resource-limited environments with a shortage of human operators by enabling remote diagnosis. During imaging, keeping the US probe normal to the skin surface substantially improves US image quality. However, RUSS currently lacks an autonomous, real-time, low-cost method to align the probe with the direction orthogonal to the skin surface without pre-operative information. We propose a novel end-effector design to achieve self-normal-positioning of the US probe. The end-effector embeds four laser distance sensors to estimate the desired rotation towards the normal direction. We then integrate the proposed end-effector with a RUSS, which allows the probe to be automatically and dynamically kept normal to the skin surface during US imaging. We evaluated the normal positioning accuracy and the US image quality using a flat surface phantom, an upper torso mannequin, and a lung ultrasound phantom. Results show that the normal positioning accuracy is 4.17 ± 2.24 degrees on the flat surface and 14.67 ± 8.46 degrees on the mannequin. The quality of the RUSS-collected US images from the lung ultrasound phantom was equivalent to that of manually collected images.
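To illustrate the underlying geometry, the following is a minimal sketch of how four laser distance readings around a probe could be converted into tilt angles relative to the surface normal. The sensor layout (one sensor on each side of the probe along +x, -x, +y, -y, separated by a known baseline, all measuring along the probe axis) and the function names are assumptions for illustration, not the authors' implementation.

    import numpy as np

    def estimate_tilt_angles(d_xp, d_xn, d_yp, d_yn, baseline):
        """Estimate probe tilt from four laser distance readings.

        d_xp, d_xn, d_yp, d_yn: distances (same units) measured by the
        sensors on the +x, -x, +y, -y sides of the probe.
        baseline: separation between opposing sensors.

        Returns (tilt_about_y, tilt_about_x) in radians; driving both
        angles to zero aligns the probe axis with the local surface normal.
        """
        # If opposing sensors read different distances, the surface is
        # inclined along that axis; the tilt is atan(rise / run).
        tilt_about_y = np.arctan2(d_xp - d_xn, baseline)  # slope along x
        tilt_about_x = np.arctan2(d_yn - d_yp, baseline)  # slope along y
        return tilt_about_y, tilt_about_x

    # Example: opposing sensors 60 mm apart, readings in mm
    ty, tx = estimate_tilt_angles(52.0, 48.0, 50.5, 49.5, 60.0)
    print(np.degrees(ty), np.degrees(tx))

In a closed-loop setting, these angles would be fed to the robot controller as incremental rotation commands so the probe continuously tracks the surface normal; the actual control scheme used in the paper is not detailed in this abstract.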

Citing Articles

Intraoperative laparoscopic photoacoustic image guidance system in the da Vinci surgical system.

Gao S, Wang Y, Ma X, Zhou H, Jiang Y, Yang K. Biomed Opt Express. 2023; 14(9):4914-4928.

PMID: 37791285; PMC: 10545189. DOI: 10.1364/BOE.498052.
