
The Quality of Diagnostic Accuracy Studies Since the STARD Statement: Has It Improved?

Overview
Journal: Neurology
Specialty: Neurology
Date: 2006 Sep 13
PMID: 16966539
Citations: 77
Abstract

Objective: To assess whether the quality of reporting of diagnostic accuracy studies has improved since the publication of the Standards for the Reporting of Diagnostic Accuracy studies (STARD statement).

Methods: The quality of reporting of diagnostic accuracy studies published in 12 medical journals in 2000 (pre-STARD) and 2004 (post-STARD) was evaluated by two reviewers independently. For each article, the number of reported STARD items was counted (range 0 to 25). Differences in completeness of reporting between articles published in 2000 and 2004 were analyzed, using multilevel analyses.

Results: We included 124 articles published in 2000 and 141 articles published in 2004. Mean number of reported STARD items was 11.9 (range 3.5 to 19.5) in 2000 and 13.6 (range 4.0 to 21.0) in 2004, an increase of 1.81 items (95% CI: 0.61 to 3.01). Articles published in 2004 reported the following significantly more often: methods for calculating test reproducibility of the index test (16% vs 35%); distribution of the severity of disease and other diagnoses (23% vs 53%); estimates of variability of diagnostic accuracy between subgroups (39% vs 60%); and a flow diagram (2% vs 12%).

Conclusions: The quality of reporting of diagnostic accuracy studies has improved slightly over time, without a more pronounced effect in journals that adopted the STARD statement. As there is still room for improvement, editors should mention the use of the STARD statement as a requirement in their guidelines for authors, and instruct reviewers to check the STARD items. Authors should include a flow diagram in their manuscript.

Citing Articles

A methodological quality review of citations of randomized controlled trials of diabetes type 2 in leading clinical practice guidelines and systematic reviews.

AleTaha A, Malekpour M, Keshtkar A, Baradaran H, Sedghi S, Mansoori Y J Diabetes Metab Disord. 2024; 23(1):101-114.

PMID: 38932844 PMC: 11196434. DOI: 10.1007/s40200-023-01328-9.


Endorsements of five reporting guidelines for biomedical research by journals of prominent publishers.

Wang P, Wolfram D, Gilbert E PLoS One. 2024; 19(2):e0299806.

PMID: 38421981 PMC: 10903802. DOI: 10.1371/journal.pone.0299806.


Meta-research on reporting guidelines for artificial intelligence: are authors and reviewers encouraged enough in radiology, nuclear medicine, and medical imaging journals?

Kocak B, Keles A, Kose F Diagn Interv Radiol. 2024; 30(5):291-298.

PMID: 38375627 PMC: 11590734. DOI: 10.4274/dir.2024.232604.


Self-reported checklists and quality scoring tools in radiomics: a meta-research.

Kocak B, Akinci D'Antonoli T, Ates Kus E, Keles A, Kala A, Kose F Eur Radiol. 2024; 34(8):5028-5040.

PMID: 38180530 DOI: 10.1007/s00330-023-10487-5.


Consolidated Reporting Guidelines for Prognostic and Diagnostic Machine Learning Modeling Studies: Development and Validation.

Klement W, El Emam K J Med Internet Res. 2023; 25:e48763.

PMID: 37651179 PMC: 10502599. DOI: 10.2196/48763.