
Identification of Cecum Time-location in a Colonoscopy Video by Deep Learning Analysis of Colonoscope Movement

Overview
Journal PeerJ
Date 2019 Aug 9
PMID 31392088
Citations 3
Abstract

Background: Cecal intubation time is an important component of quality colonoscopy. The cecum is the turning point that separates the insertion phase of the colonoscope from the withdrawal phase, so information about the cecum's location during the endoscopic procedure is very useful. It is therefore necessary to detect both the direction of the colonoscope's movement and the time-location of the cecum.

Methods: To analyze the direction of the scope's movement, the Horn-Schunck algorithm was used to compute pixel-level motion between consecutive frames. The Horn-Schunck-processed images were trained and tested with a convolutional neural network and classified into insertion, withdrawal, and stop movements. Based on the scope's movement, a graph was drawn with a value of +1 for insertion, -1 for withdrawal, and 0 for stop. A turning point was regarded as a cecum candidate point when the total area under the graph over a given section was lowest.
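The optical-flow step named above is a classic, well-documented algorithm. As an illustration only (not the authors' code, and with hyperparameters `alpha` and `n_iter` chosen arbitrarily), a minimal NumPy/SciPy sketch of the Horn-Schunck iteration might look like:

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    """Dense optical flow (u, v) between two grayscale frames (Horn-Schunck)."""
    im1 = np.asarray(im1, dtype=float)
    im2 = np.asarray(im2, dtype=float)
    # Spatial derivatives averaged over both frames; forward temporal difference.
    Ix = 0.5 * (np.gradient(im1, axis=1) + np.gradient(im2, axis=1))
    Iy = 0.5 * (np.gradient(im1, axis=0) + np.gradient(im2, axis=0))
    It = im2 - im1
    # Neighbourhood-averaging kernel from the original 1981 paper.
    avg = np.array([[1/12, 1/6, 1/12],
                    [1/6,  0.0, 1/6],
                    [1/12, 1/6, 1/12]])
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    denom = alpha ** 2 + Ix ** 2 + Iy ** 2
    for _ in range(n_iter):
        # Smooth the current flow estimate, then correct it toward the
        # brightness-constancy constraint Ix*u + Iy*v + It = 0.
        u_bar = convolve(u, avg)
        v_bar = convolve(v, avg)
        shared = (Ix * u_bar + Iy * v_bar + It) / denom
        u = u_bar - Ix * shared
        v = v_bar - Iy * shared
    return u, v
```

For a blob that shifts one pixel to the right between frames, the recovered horizontal flow `u` comes out positive in the textured region, which is the kind of directional signal the classifier stage consumes.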

Results: A total of 328,927 frame images were obtained from 112 patients. The overall accuracy, estimated by 5-fold cross-validation, was 95.6%. With a section length "t" of 30 s, the accuracy of cecum discovery was 96.7%. To improve visibility, the scope's movement was added to the colonoscopy video summary report: insertion, withdrawal, and stop movements were each mapped to a color and displayed at various scales. As the scale increased, the distinction between the insertion and withdrawal phases became clearer.
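The multi-scale color bar described above can be approximated by majority-vote downsampling of the per-frame movement labels; this is a sketch of the idea, and the color assignments below are hypothetical (the abstract does not specify them):

```python
import numpy as np

# Hypothetical color mapping; the paper does not specify which colors were used.
COLORS = {1: "red", -1: "blue", 0: "gray"}

def movement_bar(labels, scale):
    """Collapse per-frame movement labels (+1/-1/0) into colored blocks.

    Each block of `scale` consecutive frames is reduced by majority vote,
    so larger scales yield a coarser, clearer insertion/withdrawal bar.
    """
    labels = np.asarray(labels)
    n = len(labels) // scale * scale          # drop the ragged tail
    blocks = labels[:n].reshape(-1, scale)
    out = []
    for block in blocks:
        values, counts = np.unique(block, return_counts=True)
        out.append(int(values[np.argmax(counts)]))  # majority label per block
    return [COLORS[v] for v in out]
```

For example, `movement_bar([1, 1, 1, 0, -1, -1, -1, -1], 4)` collapses eight frames into two blocks, an insertion-dominated one and a withdrawal-dominated one; coarser scales suppress brief stop frames, consistent with the reported effect of scale on phase distinction.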

Conclusion: The information obtained in this study can be used as metadata for proficiency assessment. Since insertion and withdrawal are technically different movements, data on the scope's movement and phase can be quantified to express patterns unique to each colonoscopist and to assess proficiency. We also hope that these findings contribute to the informatics of medical records, so that findings can be conveyed graphically and effectively in the field of colonoscopy.

Citing Articles

Density clustering-based automatic anatomical section recognition in colonoscopy video using deep learning.

Kim B, Cho M, Chung G, Lee J, Kang H, Yoon D. Sci Rep. 2024; 14(1):872.

PMID: 38195632 PMC: 10776865. DOI: 10.1038/s41598-023-51056-6.


Public Imaging Datasets of Gastrointestinal Endoscopy for Artificial Intelligence: a Review.

Zhu S, Gao J, Liu L, Yin M, Lin J, Xu C. J Digit Imaging. 2023; 36(6):2578-2601.

PMID: 37735308 PMC: 10584770. DOI: 10.1007/s10278-023-00844-7.


Deep learning model for classifying endometrial lesions.

Zhang Y, Wang Z, Zhang J, Wang C, Wang Y, Chen H. J Transl Med. 2021; 19(1):10.

PMID: 33407588 PMC: 7788977. DOI: 10.1186/s12967-020-02660-x.
