
Doctor Scores on National Qualifying Examinations Predict Quality of Care in Future Practice

Overview
Journal: Med Educ
Specialty: Medical Education
Date: 2009 Nov 26
PMID: 19930507
Citations: 38
Abstract

Objectives: This study aimed to determine if national licensing examinations that measure medical knowledge (QE1) and clinical skills (QE2) predict the quality of care delivered by doctors in future practice.

Methods: Cohorts of doctors who took the Medical Council of Canada Qualifying Examinations Part I (QE1) and Part II (QE2) between 1993 and 1996 and subsequently entered practice in Ontario, Canada (n = 2420) were followed for their first 7-10 years in practice. Of these, the 208 doctors who were randomly selected for peer assessment of quality of care were studied. The main outcome measure was quality of care (acceptable/unacceptable) as assessed by doctor peer-examiners using a structured chart review and interview. Multivariate logistic regression was used to determine whether qualifying examination scores predicted the outcome of the peer assessments while controlling for age, sex, training and specialty, and whether the addition of QE2 scores provided further prediction of quality of care.
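The analytic approach (nested logistic regression models compared with a likelihood ratio test) can be illustrated with a minimal sketch. This is not the study's code: the data below are synthetic and all variable names (qe1_bottom_quartile, family_medicine, etc.) are illustrative assumptions.

```python
# Hypothetical sketch of the analysis described above, on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n = 208
df = pd.DataFrame({
    "unacceptable": rng.binomial(1, 0.07, n),          # peer-assessment outcome
    "qe1_bottom_quartile": rng.binomial(1, 0.25, n),   # QE1 score in bottom quartile
    "qe2_bottom_quartile": rng.binomial(1, 0.25, n),   # QE2 score in bottom quartile
    "age": rng.normal(40, 8, n),
    "sex": rng.binomial(1, 0.5, n),
    "family_medicine": rng.binomial(1, 0.6, n),        # stand-in for training/specialty
})

# Base model: QE1 plus covariates (age, sex, specialty).
base = smf.logit(
    "unacceptable ~ qe1_bottom_quartile + age + sex + family_medicine", df
).fit(disp=0)

# Extended model: add QE2 to test for incremental prediction.
full = smf.logit(
    "unacceptable ~ qe1_bottom_quartile + qe2_bottom_quartile"
    " + age + sex + family_medicine", df
).fit(disp=0)

# Odds ratio and 95% CI for the QE1 term in the base model.
or_qe1 = np.exp(base.params["qe1_bottom_quartile"])
ci_qe1 = np.exp(base.conf_int().loc["qe1_bottom_quartile"])

# Likelihood ratio test (1 d.f.) for the added QE2 term.
lr_stat = 2 * (full.llf - base.llf)
p_value = stats.chi2.sf(lr_stat, df=1)

print(f"OR(QE1 bottom quartile) = {or_qe1:.2f}, "
      f"95% CI {ci_qe1.iloc[0]:.2f}-{ci_qe1.iloc[1]:.2f}")
print(f"LR chi2(1 d.f.) = {lr_stat:.2f}, P = {p_value:.3f}")
```

With the study's real data, the same comparison of nested models yields the likelihood ratio test reported in the Results.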

Results: Fifteen (7.2%) of the 208 doctors assessed were considered to provide unacceptable quality of care. Doctors in the bottom quartile of QE1 scores had a greater than three-fold increase in the risk of an unacceptable quality-of-care assessment outcome (odds ratio [OR] 3.41, 95% confidence interval [CI] 1.14-10.22). Doctors in the bottom quartile of QE2 scores were also at higher risk of being assessed as providing unacceptable quality of care (OR 4.24, 95% CI 1.32-13.61). However, QE2 results provided no significant improvement in predicting peer assessment results over QE1 results (likelihood ratio test: χ² = 3.21, 1 d.f., P = 0.07).

Conclusions: Doctor scores on qualifying examinations are significant predictors of quality-of-care problems based on regulatory, practice-based peer assessment.

Citing Articles

Does higher performance in a national licensing examination predict better quality of care? A longitudinal observational study of Ethiopian anesthetists.

Asemu Y, Yigzaw T, Desta F, Scheele F, van den Akker T BMC Anesthesiol. 2024; 24(1):188.

PMID: 38802780 PMC: 11129401. DOI: 10.1186/s12871-024-02575-w.


Competence by Design: The Role of High-Stakes Examinations in a Competence Based Medical Education System.

Bhanji F, Naik V, Skoll A, Pittini R, Daniels V, Bacchus C Perspect Med Educ. 2024; 13(1):68-74.

PMID: 38343558 PMC: 10854425. DOI: 10.5334/pme.965.


Do clinical and communication skills scores on credentialing exams predict potentially inappropriate antibiotic prescribing?

Tamblyn R, Moraga T, Girard N, Boulet J, Chan F, Habib B BMC Med Educ. 2023; 23(1):821.

PMID: 37915014 PMC: 10621187. DOI: 10.1186/s12909-023-04817-w.


Patient views of the good doctor in primary care: a qualitative study in six provinces in China.

Wang W, Zhang J, Lu J, Wei X Glob Health Res Policy. 2023; 8(1):24.

PMID: 37434267 PMC: 10334597. DOI: 10.1186/s41256-023-00309-y.


Validity evidence and psychometric evaluation of a socially accountable health index for health professions schools.

Barber C, van der Vleuten C, Chahine S Adv Health Sci Educ Theory Pract. 2023; 29(1):147-172.

PMID: 37347458 PMC: 10927857. DOI: 10.1007/s10459-023-10248-5.