
The Use of Qualitative Research Criteria for Portfolio Assessment As an Alternative to Reliability Evaluation: a Case Study

Overview
Journal Med Educ
Specialty Medical Education
Date 2005 Feb 1
PMID 15679689
Citations 17
Abstract

Aim: Because it deals with qualitative information, portfolio assessment inevitably involves some degree of subjectivity. The use of stricter assessment criteria or more structured and prescribed content would improve interrater reliability, but would obliterate the essence of portfolio assessment in terms of flexibility, personal orientation and authenticity. We resolved this dilemma by using qualitative research criteria as opposed to reliability in the evaluation of portfolio assessment.

Methodology/Research Design: Five qualitative research strategies were used to achieve credibility and dependability of assessment: triangulation, prolonged engagement, member checking, audit trail and dependability audit. Mentors read portfolios at least twice during the year, providing feedback and guidance (prolonged engagement). Their recommendation for the end-of-year grade was discussed with the student (member checking) and submitted to a member of the portfolio committee. Information from different sources was combined (triangulation). Portfolios causing persistent disagreement were submitted to the full portfolio assessment committee. Quality assurance procedures with external auditors were used (dependability audit) and the assessment process was thoroughly documented (audit trail).

Results: A total of 233 portfolios were assessed. Students and mentors disagreed on 7 (3%) portfolios and 9 portfolios were submitted to the full committee. The final decision on 29 (12%) portfolios differed from the mentor's recommendation.

Conclusion: We think we have devised an assessment procedure that safeguards the characteristics of portfolio assessment, with credibility and dependability of assessment built into the judgement procedure. Further support for credibility and dependability might be sought by means of a study involving different assessment committees.

Citing Articles

Combining Support and Assessment in Health Professions Education: Mentors' and Mentees' Experiences in a Programmatic Assessment Context.

Loosveld L, Driessen E, Theys M, Van Gerven P, Vanassche E Perspect Med Educ. 2023; 12(1):271-281.

PMID: 37426357 PMC: 10327863. DOI: 10.5334/pme.1004.


Student perspectives on programmatic assessment in a large medical programme: A critical realist analysis.

Roberts C, Khanna P, Bleasel J, Lane S, Burgess A, Charles K Med Educ. 2022; 56(9):901-914.

PMID: 35393668 PMC: 9542097. DOI: 10.1111/medu.14807.


Australian chiropractic and osteopathic graduates' perceptions of readiness for transition to practice.

Haworth N, Horstmanshof L, Moore K J Chiropr Educ. 2022; 36(2):153-164.

PMID: 35041740 PMC: 9536224. DOI: 10.7899/JCE-20-4.


Assessment in the context of problem-based learning.

van der Vleuten C, Schuwirth L Adv Health Sci Educ Theory Pract. 2019; 24(5):903-914.

PMID: 31578642 PMC: 6908559. DOI: 10.1007/s10459-019-09909-1.


Assessing an assessment: The review and redesign of a competency-based mid-degree evaluation.

Mayowski C, Norman M, Kapoor W J Clin Transl Sci. 2019; 2(4):223-227.

PMID: 30820359 PMC: 6382334. DOI: 10.1017/cts.2018.321.