Audience Response System Facilitates Prediction of Scores on In-Training Examination

Abstract

Objectives: To determine whether scores on review quizzes delivered by an audience response system (ARS) correlate with in-training exam (ITE) scores.

Methods: Prospective observational study of emergency medicine (EM) residents at six accredited EM residency programs. Subjects were residents who had taken previous in-training examinations. Subjects participated in bimonthly review sessions using an audience response system. Twelve review quizzes were administered, each consisting of 10 multiple-choice questions. After the in-training exam, subjects completed an attitudinal survey consisting of six Likert-scale items and one “yes/no” item. A mixed linear model was used to analyze the data, accounting for prior 2012 in-training exam scores and for nesting within institutions.

Results: Among 192 participants, data from 135 (70.3%) were analyzed. The mixed linear model indicated that the total mean score on the review quizzes was a significant predictor [t(127) = 6.68; p < 0.001] of the 2013 in-training exam score after controlling for the 2012 in-training exam score. A total of 146 participants completed the attitudinal survey; 96% of respondents stated that they would like ARS to be used more often in resident education. Respondents felt the sessions aided learning (mean 7.7/10), assisted in preparation for the in-training exam (mean 6.7/10), and helped identify content areas of weakness (mean 7.6/10).

Conclusion: Our results suggest that scores on review quizzes delivered by an audience response system correlate with in-training exam scores and that the ARS review sessions are viewed positively by residents.
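The analysis described in the Methods can be outlined with a mixed linear model that treats institution as a grouping (random-intercept) factor. The sketch below is illustrative only: the abstract does not state which software was used, and the file name and column names (quiz_mean, ite_2012, ite_2013, institution) are hypothetical placeholders.

```python
# Minimal sketch of the analysis described above: a mixed linear model predicting
# 2013 ITE scores from mean ARS quiz scores, controlling for 2012 ITE scores,
# with a random intercept per institution to account for nesting.
# Column and file names are hypothetical; the study does not specify them.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data layout: one row per resident.
df = pd.read_csv("resident_scores.csv")  # columns: institution, quiz_mean, ite_2012, ite_2013

model = smf.mixedlm(
    "ite_2013 ~ quiz_mean + ite_2012",  # fixed effects: quiz performance and prior ITE score
    data=df,
    groups=df["institution"],           # residents nested within institutions
)
result = model.fit()
print(result.summary())  # inspect the coefficient and t-statistic for quiz_mean
```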
