Journal article
An audit of assessment tools in a medical school in eastern Saudi Arabia


Publication Details
Author list: Abdullah M Al-Rubaish, Khalid U Al-Umran, Lade Wosornu
Publisher: Medknow Publications
Publication year: 2005
Journal: Journal of Family and Community Medicine
Journal name in source: Journal of family & community medicine
Journal acronym: JFCM
Volume number: 12
Issue number: 2
Start page: 101
End page: 105
Number of pages: 5
ISSN: 2230-8229
PubMed ID: 23012084
eISSN: 2229-340X


Background: Assessment has a powerful influence on curriculum delivery. Medical instructors
must use tools which conform to educational principles, and audit them as part of curriculum
review.
Aim: To generate information to support recommendations for improving curriculum delivery.
Setting: Pre-clinical and clinical departments in a College of Medicine, Saudi Arabia.
Method: A self-administered questionnaire was used in a cross-sectional survey to determine whether
the assessment tools in use met basic standards of validity, reliability and currency, and whether
feedback to students was adequate. Cost, feasibility and tool combinations were excluded.
Results: Thirty-one (out of 34) courses were evaluated. All 31 respondents used MCQs,
especially one-best (28/31) and true/false (13/31). Test questions were mostly selected by groups
of teachers. Pre-clinical departments sourced equally from "new" (10/14) and "used" (10/14) MCQs;
clinical departments relied on "banked" MCQs (16/17). Departments decided pass marks (28/31)
and chose the College-set 60%; the timing was pre-examination in 13/17 clinical but post-examination
in 5/14 pre-clinical departments. Of six essay users, five used model answers but
only one did double marking. OSCE was used by 7/17 clinical departments; five provided
checklists. Only 3/31 used an optical reader. Post-marking review was done by 13/14 pre-clinical but
10/17 clinical departments. Difficulty and discriminating indices were determined by only 4/31
departments. Feedback was provided by 12/14 pre-clinical and 7/17 clinical departments. Only
10/31 course coordinators had copies of examination regulations.
Recommendations: MCQ with single-best answer, if properly constructed and adequately
critiqued, is the preferred tool for assessing the theory domain. However, there should be fresh
questions, item analyses, comparisons with previous results, optical reader systems and double
marking. Departments should use OSCE or OSPE more often. Long essays, true/false, fill-in-the-blank-spaces
and more-than-one-correct-answer formats can be safely abolished. Departments or
teams should set test papers and take decisions collectively. Feedback rates should be improved.
A Center of Medical Education, including an Examination Center, is required. Fruitful future
studies could include a repeat audit, the use of "negative questions", and the number of MCQs per test paper.
Comparative audit involving other regional medical schools may be of general interest.
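
For reference, the difficulty and discrimination indices mentioned in the Results are the standard item-analysis statistics. The definitions below are the conventional ones and are not quoted from the article itself:

\[
  p_i = \frac{C_i}{N}, \qquad D_i = p_i^{\mathrm{upper}} - p_i^{\mathrm{lower}}
\]

where C_i is the number of examinees who answered item i correctly, N is the total number of examinees, and p_i^upper and p_i^lower are the proportions answering item i correctly in the upper- and lower-scoring groups (commonly the top and bottom 27% of examinees ranked by total score).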

