
Validation of Wearable Camera Still Images to Assess Posture in Free-Living Conditions

Overview
Date 2021 Aug 2
PMID 34337345
Citations 2
Abstract

Purpose: To assess the convergent validity of body-worn wearable camera (WC) still images (IMGs) for determining posture compared with activPAL (AP) classifications.

Methods: Participants (n = 16, mean age 46.7 ± 23.8 yrs, 9 female) wore an Autographer WC above the xiphoid process and an AP during three 2-hr free-living visits. IMGs were captured on average 8.47 seconds apart and were annotated into events, transitory states, unknowns, and gaps. Events were annotations of at least 3 consecutive IMGs matching an AP classification (sit, stand, or move), transitory states were posture annotations of fewer than 3 IMGs, unknowns were IMGs that could not be accurately classified, and gaps were the time between annotations. For analysis, annotations and AP output were converted to one-second epochs and matched second-by-second. Total and average lengths of visits and events are reported in minutes. Bias and 95% CIs comparing IMG event posture times with AP posture times were calculated to determine accuracy and precision. Confusion matrices using total AP posture times were computed to quantify misclassification.
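The epoch-matching and misclassification analysis described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the per-second posture labels below are randomly generated stand-ins for the annotated WC images and AP output, and the 80% agreement rate is an arbitrary assumption chosen for the demonstration.

```python
import numpy as np

POSTURES = ["sit", "stand", "move"]

# Hypothetical one-second epoch labels for a single 2-hr visit. In the study,
# WC annotations and activPAL output were each expanded to one-second epochs
# and matched second-by-second; here both streams are simulated.
rng = np.random.default_rng(0)
n_seconds = 2 * 60 * 60
ap = rng.choice(POSTURES, size=n_seconds)           # activPAL (criterion)
img = np.where(rng.random(n_seconds) < 0.8, ap,     # WC annotation, assumed
               rng.choice(POSTURES, size=n_seconds))  # ~80% raw agreement

# Confusion matrix in seconds: rows = AP posture (truth), cols = IMG posture.
idx = {p: i for i, p in enumerate(POSTURES)}
cm = np.zeros((3, 3), dtype=int)
for a, b in zip(ap, img):
    cm[idx[a], idx[b]] += 1

# Percent of total AP time correctly classified per posture
# (the study reports, e.g., 85.69% of AP sitting time).
pct_correct = 100 * np.diag(cm) / cm.sum(axis=1)

# Bias: difference in total posture minutes (IMG - AP); the study averages
# this across visits, shown here for one visit only.
for p in POSTURES:
    bias_min = ((img == p).sum() - (ap == p).sum()) / 60
    print(f"{p}: bias = {bias_min:+.2f} min, "
          f"correct = {pct_correct[idx[p]]:.1f}% of AP time")
```

With the study's real data, the per-visit biases would be averaged and a 95% CI computed across the 43 visits to judge accuracy and precision.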

Results: 43 visits were analyzed, with total visit and event times of 5027.73 and 4237.23 minutes and average visit and event lengths of 116.92 and 98.54 minutes, respectively. Bias was not statistically significant for sitting but was significant for standing and movement (0.84, -6.87, and 6.04 minutes, respectively). From the confusion matrices, IMGs correctly classified sitting, standing, and movement for 85.69%, 54.87%, and 69.41% of total AP time, respectively.

Conclusion: WC IMGs provide a good estimate of overall sitting time but underestimate standing time and overestimate movement time. Future work is warranted to improve posture classifications and to examine IMG accuracy and precision in assessing activity-type behaviors.

Citing Articles

CAPTURE-24: A large dataset of wrist-worn activity tracker data collected in the wild for human activity recognition.

Chan S, Hang Y, Tong C, Acquah A, Schonfeldt A, Gershuny J Sci Data. 2024; 11(1):1135.

PMID: 39414802 PMC: 11484779. DOI: 10.1038/s41597-024-03960-3.


Update and Novel Validation of a Pregnancy Physical Activity Questionnaire.

Chasan-Taber L, Park S, Marcotte R, Staudenmayer J, Strath S, Freedson P Am J Epidemiol. 2023; 192(10):1743-1753.

PMID: 37289205 PMC: 11484608. DOI: 10.1093/aje/kwad130.
