
Assessing Competencies Needed to Engage With Digital Health Services: Development of the EHealth Literacy Assessment Toolkit

Overview
Publisher JMIR Publications
Date 2018 May 12
PMID 29748163
Citations 55
Abstract

Background: To achieve the full potential of user-oriented eHealth projects, the eHealth technology must match the user's eHealth literacy, that is, their knowledge and skills. However, multifaceted eHealth literacy assessment tools suitable for screening purposes are lacking.

Objective: The objective of our study was to develop and validate an eHealth literacy assessment toolkit (eHLA) that assesses individuals' health literacy and digital literacy using a mix of existing and newly developed scales.

Methods: From 2011 to 2015, the scales were iteratively tested and refined, resulting in 7 tools being included in the validation study. The eHLA validation version consisted of 4 health-related tools (tool 1: "functional health literacy," tool 2: "health literacy self-assessment," tool 3: "familiarity with health and health care," and tool 4: "knowledge of health and disease") and 3 digitally related tools (tool 5: "technology familiarity," tool 6: "technology confidence," and tool 7: "incentives for engaging with technology"), which were tested in 475 respondents from a general population sample and an outpatient clinic. Statistical analyses examined floor and ceiling effects, interitem correlations, item-total correlations, and Cronbach coefficient alpha (CCA). Rasch models (RM) were used to examine the fit of the data. Items were removed to produce short, robust tools suited to screening purposes; reductions were based on psychometric properties, face validity, and content validity.
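The abstract does not include the authors' analysis code. Purely as an illustration of the classical item-analysis steps named above (floor and ceiling effects, interitem correlations, item-total correlations, and Cronbach coefficient alpha), a minimal NumPy sketch on simulated data might look as follows; the 475-by-20 score matrix, the .30 cutoff, and all names are hypothetical and not taken from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach coefficient alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the remaining items."""
    totals = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], totals - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

# Hypothetical data: 475 respondents answering a 20-item tool scored 0-4.
rng = np.random.default_rng(0)
trait = rng.normal(size=(475, 1))
scores = np.clip(np.round(2 + trait + rng.normal(size=(475, 20))), 0, 4)

print(f"Cronbach alpha: {cronbach_alpha(scores):.2f}")
print("Weak items (corrected item-total r < .30):",
      np.where(corrected_item_total(scores) < 0.30)[0].tolist())

# Mean interitem correlation (off-diagonal of the item correlation matrix).
r = np.corrcoef(scores, rowvar=False)
print(f"Mean interitem r: {r[np.triu_indices_from(r, k=1)].mean():.2f}")

# Floor/ceiling effects: share of respondents at the minimum or maximum total score.
totals = scores.sum(axis=1)
print(f"Floor: {(totals == 0).mean():.1%}  Ceiling: {(totals == 20 * 4).mean():.1%}")
```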

Results: Tool 1 was not reduced and consists of 10 items. The overall fit to the RM was acceptable (Andersen conditional likelihood ratio, CLR=10.8; df=9; P=.29), and CCA was .67. Tool 2 was reduced from 20 to 9 items. The overall fit to a log-linear RM was acceptable (Andersen CLR=78.4, df=45, P=.002), and CCA was .85. Tool 3 was reduced from 23 to 5 items. The final version showed excellent fit to a log-linear RM (Andersen CLR=47.7, df=40, P=.19), and CCA was .90. Tool 4 was reduced from 12 to 6 items. The fit to a log-linear RM was acceptable (Andersen CLR=42.1, df=18, P=.001), and CCA was .59. Tool 5 was reduced from 20 to 6 items. The fit to the RM was acceptable (Andersen CLR=30.3, df=17, P=.02), and CCA was .94. Tool 6 was reduced from 5 to 4 items. The fit to a log-linear RM taking local dependency (LD) into account was acceptable (Andersen CLR=26.1, df=21, P=.20), and CCA was .91. Tool 7 was reduced from 6 to 4 items. The fit to a log-linear RM taking LD and differential item functioning into account was acceptable (Andersen CLR=23.0, df=29, P=.78), and CCA was .90.
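For context on the fit statistics above: in the dichotomous Rasch model, the probability of endorsing an item depends only on the difference between person ability and item difficulty, and the Andersen CLR test checks whether conditionally estimated item parameters stay stable across raw-score groups. The Python sketch below simulates such data for a 10-item tool; it is an illustration of the basic model only, not a reproduction of the authors' log-linear Rasch analyses, and the abilities and difficulties are invented for the example.

```python
import numpy as np

def rasch_prob(theta: np.ndarray, beta: np.ndarray) -> np.ndarray:
    """Dichotomous Rasch model: P(X=1 | theta, beta) = 1 / (1 + exp(-(theta - beta)))."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))

# Hypothetical example: 475 persons, 10 items (the size of tool 1).
rng = np.random.default_rng(1)
theta = rng.normal(size=475)          # person abilities
beta = np.linspace(-2.0, 2.0, 10)     # item difficulties
responses = (rng.random((475, 10)) < rasch_prob(theta, beta)).astype(int)

# The Andersen conditional likelihood ratio (CLR) test compares conditional
# maximum likelihood item estimates across score groups (e.g., low vs high raw
# scores). With g groups and k items it is asymptotically chi-squared with
# (g - 1) * (k - 1) degrees of freedom, consistent with the reported df = 9
# for a two-group split of the 10-item tool.
print("Simulated item endorsement rates:", responses.mean(axis=0).round(2))
```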

Conclusions: The eHLA consists of 7 short, robust scales that assess individuals' knowledge and skills related to digital literacy and health literacy.

Citing Articles

Measuring Digital Health Literacy in Older Adults: Development and Validation Study.

Kim S, Park C, Park S, Kim D, Bae Y, Kang J. J Med Internet Res. 2025; 27:e65492.

PMID: 39908081; PMC: 11840366; DOI: 10.2196/65492.


The Effects of the COVID-19 Pandemic on Age-Based Disparities in Digital Health Technology Use: Secondary Analysis of the 2017-2022 Health Information National Trends Survey.

Qiu Y, Huang H, Gai J, De Leo G. J Med Internet Res. 2024; 26:e65541.

PMID: 39631070; PMC: 11656112; DOI: 10.2196/65541.


Design, Implementation, and Analysis of an Assessment and Accreditation Model to Evaluate a Digital Competence Framework for Health Professionals: Mixed Methods Study.

Saigi-Rubio F, Romeu T, Hernandez Encuentra E, Guitert M, Andres E, Reixach E. JMIR Med Educ. 2024; 10:e53462.

PMID: 39418092; PMC: 11528169; DOI: 10.2196/53462.


Digital Health Readiness: Making Digital Health Care More Inclusive.

Bober T, Rollman B, Handler S, Watson A, Nelson L, Faieta J. JMIR Mhealth Uhealth. 2024; 12:e58035.

PMID: 39383524; PMC: 11499716; DOI: 10.2196/58035.


Patients with axial spondyloarthritis reported willingness to use remote care and showed high adherence to electronic patient-reported outcome measures: an 18-month observational study.

Thomassen E, Berg I, Kristianslund E, Tveter A, Bakland G, Gossec L. Rheumatol Int. 2024; 44(10):2089-2098.

PMID: 39164589; PMC: 11393250; DOI: 10.1007/s00296-024-05673-7.

