Author: Brian S McGowan, PhD

ABSTRACT: Applying multimedia design principles enhances learning in medical education

Abstract
CONTEXT:
The Association of American Medical Colleges’ Institute for Improving Medical Education’s report entitled ‘Effective Use of Educational Technology’ called on researchers to study the effectiveness of multimedia design principles. These principles were empirically shown to result in superior learning when used with college students in laboratory studies, but have not been studied with undergraduate medical students as participants.
METHODS:
A pre-test/post-test control group design was used, in which the traditional-design group received a lecture on shock using traditionally designed slides and the modified-design group received the same lecture using slides modified in accord with Mayer’s principles of multimedia design. Participants included Year 3 medical students at a private, midwestern medical school progressing through their surgery clerkship during the academic year 2009-2010. The medical school divides students into four groups; each group attends the surgery clerkship during one of the four quarters of the academic year. Students in the second and third quarters served as the modified-design group (n=91) and students in the fourth-quarter clerkship served as the traditional-design group (n=39).
RESULTS:
Both student cohorts had similar levels of pre-lecture knowledge. Both groups showed significant improvements in retention (p<0.0001), transfer (p<0.05) and total scores (p<0.0001) between the pre- and post-tests. Repeated-measures ANOVA showed significantly greater improvements in retention (F=10.2, p=0.0016) and total scores (F=7.13, p=0.0081) for students instructed using principles of multimedia design than for those instructed using the traditional design.
CONCLUSIONS:
Multimedia design principles are easy to implement and result in improved short-term retention among medical students, but empirical research is still needed to determine how these principles affect transfer of learning. Further research on applying the principles of multimedia design to medical education is needed to verify the impact it has on the long-term learning of medical students, as well as its impact on other forms of multimedia instructional programmes used in the education of medical students.
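For readers who want to see what the group-by-time comparison above looks like in practice, here is a minimal Python sketch. It uses simulated scores (not the study’s data) and approximates the repeated-measures ANOVA with paired t-tests within each cohort plus an independent-samples t-test on the pre-to-post change scores; all numbers are illustrative assumptions, with only the group sizes taken from the abstract.

```python
# Minimal sketch (hypothetical scores, not the study's data) of a pre/post,
# two-group comparison. The group x time effect captured by the repeated-measures
# ANOVA is approximated here by comparing change scores between groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated test scores for a modified-design (n=91) and a traditional-design (n=39) cohort
pre_modified = rng.normal(50, 10, 91)
post_modified = pre_modified + rng.normal(15, 8, 91)   # larger assumed gain
pre_traditional = rng.normal(50, 10, 39)
post_traditional = pre_traditional + rng.normal(8, 8, 39)

# Pre-to-post improvement within each group (paired t-tests)
t_mod, p_mod = stats.ttest_rel(post_modified, pre_modified)
t_trad, p_trad = stats.ttest_rel(post_traditional, pre_traditional)

# Between-group comparison of change scores (a simple stand-in for the interaction)
t_int, p_int = stats.ttest_ind(post_modified - pre_modified,
                               post_traditional - pre_traditional)

print(f"modified group pre->post:    t={t_mod:.2f}, p={p_mod:.4g}")
print(f"traditional group pre->post: t={t_trad:.2f}, p={p_trad:.4g}")
print(f"difference in improvement:   t={t_int:.2f}, p={p_int:.4g}")
```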

via Applying multimedia design principles enhances lear… [Med Educ. 2011] – PubMed – NCBI.

ABSTRACT: Novel educational approach for medical students: improved retention rates using interactive medical software compared with traditional lecture-based format.

Abstract
BACKGROUND:
Mannequin and computer-based simulators are useful for the practice of patient management, physical procedures, and competency. However, they are ineffective in teaching clinical medicine. StepStone Interactive Medical Software (SS) is a web-based medical learning modality that provides the user with a highly focused set of evaluative and interventional tasks to treat memorable virtual patients in a visual case-based format.
OBJECTIVE:
To determine whether the SS learning modality is superior to traditional lecture format in medical student learning and retention.
METHODS:
After Institutional Review Board (IRB) approval was obtained and the consents were signed, 30 third-year medical students were assigned randomly to 2 groups of 15 students each: The control group received two 30-minute PowerPoint lectures (Microsoft Corporation, Redmond, Washington) about torsades de pointes (TdP) and pulseless electrical activity (PEA), and the SS group was given 1 hour to review 2 SS cases teaching TdP and PEA. A preintervention test was given to assess their baseline knowledge. An immediate postintervention test was given to both groups. Twenty-two days later, a long-term retention test was administered. The results were analyzed using a Student t test for continuous variables.
RESULTS:
The mean scores for the preintervention test in the control and SS groups were 44.9 ± 3% and 44.1 ± 2%, respectively (p = 0.41). The mean scores for the postintervention test in the control and SS groups were 61.7 ± 2% and 86.7 ± 2%, respectively (p < 0.001). Improvement from baseline knowledge was calculated, and the mean improvement was 16.8 ± 3% in the control group and 42.5 ± 2% in the SS group (p < 0.001). The long-term retention test revealed mean scores of 55.8 ± 3% in the control group and 70.1 ± 3% in the SS group (p < 0.001). Long-term improvement from baseline knowledge was calculated, and the control group improved by 10.9 ± 4%, whereas the SS group improved by 26 ± 3% (p = 0.002).
CONCLUSIONS:
The SS learning modality demonstrated a significant improvement in student learning retention compared with the traditional didactic lecture format. SS is an effective web-based medical education tool.
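The comparisons above are reported as group means with ± values and Student t-test p-values. As a rough sketch of how such a comparison can be checked from summary statistics alone, the snippet below applies scipy’s ttest_ind_from_stats to the post-test figures, assuming the ± values are standard errors of the mean (if they are standard deviations, skip the conversion step).

```python
# Two-sample t test from summary statistics only, mirroring the post-test
# comparison above (61.7 +/- 2 vs 86.7 +/- 2, n = 15 per group).
# ASSUMPTION: the +/- values are standard errors of the mean.
import math
from scipy import stats

n_control = n_ss = 15
mean_control, sem_control = 61.7, 2.0   # post-test, control group
mean_ss, sem_ss = 86.7, 2.0             # post-test, StepStone group

# Convert SEM to SD (sd = sem * sqrt(n)); skip if the +/- is already an SD
sd_control = sem_control * math.sqrt(n_control)
sd_ss = sem_ss * math.sqrt(n_ss)

t, p = stats.ttest_ind_from_stats(mean_control, sd_control, n_control,
                                  mean_ss, sd_ss, n_ss)
print(f"t = {t:.2f}, p = {p:.5f}")
```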

via Novel educational approach for medical students:… [J Surg Educ. 2012] – PubMed – NCBI.

ABSTRACT: Teaching for understanding in medical classrooms using multimedia design principles.

Abstract
OBJECTIVES:
In line with a recent report entitled Effective Use of Educational Technology in Medical Education from the Association of American Medical Colleges Institute for Improving Medical Education (AAMC-IME), this study examined whether revising a medical lecture based on evidence-based principles of multimedia design would lead to improved long-term transfer and retention in Year 3 medical students. A previous study yielded positive effects on an immediate retention test, but did not investigate long-term effects.
METHODS:
In a pre-test/post-test control design, a cohort of 37 Year 3 medical students at a private, midwestern medical school received a bullet point-based PowerPoint™ lecture on shock developed by the instructor as part of their core curriculum (the traditional condition group). Another cohort of 43 similar medical students received a lecture covering identical content using slides redesigned according to Mayer’s evidence-based principles of multimedia design (the modified condition group).
RESULTS:
Findings showed that the modified condition group significantly outscored the traditional condition group on delayed tests of transfer given 1 week (d = 0.83) and 4 weeks (d = 1.17) after instruction, and on delayed tests of retention given 1 week (d = 0.83) and 4 weeks (d = 0.79) after instruction. The modified condition group also significantly outperformed the traditional condition group on immediate tests of retention (d = 1.49) and transfer (d = 0.76).
CONCLUSIONS:
This study provides the first evidence that applying multimedia design principles to an actual medical lecture has significant effects on measures of learner understanding (i.e. long-term transfer and long-term retention). This work reinforces the need to apply the science of learning and instruction in medical education.
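The d values above are Cohen’s d effect sizes. As a small illustration of how d is computed for two independent groups (difference in means divided by the pooled standard deviation), here is a sketch using simulated scores; the group sizes match the study, but the scores themselves are made up.

```python
# Cohen's d for two independent groups, on simulated (not real) test scores.
import numpy as np

rng = np.random.default_rng(1)
modified = rng.normal(75, 12, 43)      # hypothetical transfer-test scores, modified group
traditional = rng.normal(65, 12, 37)   # hypothetical transfer-test scores, traditional group

def cohens_d(a, b):
    """Difference in means divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

print(f"d = {cohens_d(modified, traditional):.2f}")
```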

via Teaching for understanding in medical classrooms us… [Med Educ. 2013] – PubMed – NCBI.

ABSTRACT: Improving participant feedback to continuing medical education presenters in internal medicine: a mixed-methods study.

Abstract
BACKGROUND:
Feedback is essential for improving the skills of continuing medical education (CME) presenters. However, there has been little research on improving the quality of feedback to CME presenters.
OBJECTIVES:
To validate an instrument for generating balanced and behavior-specific feedback from a national cross-section of participants to presenters at a large internal medicine CME course.
DESIGN, SETTING, AND PARTICIPANTS:
A prospective, randomized validation study with qualitative data analysis that included all 317 participants at a Mayo Clinic internal medicine CME course in 2009.
MEASUREMENTS:
An 8-item (5-point Likert scales) CME faculty assessment enhanced study form (ESF) was designed based on literature and expert review. Course participants were randomized to a standard form, a generic study form (GSF), or the ESF. The dimensionality of instrument scores was determined using factor analysis to account for clustered data. Internal consistency and interrater reliabilities were calculated. Associations between overall feedback scores and presenter and presentation variables were identified using generalized estimating equations to account for multiple observations within talk and speaker combinations. Two raters reached consensus on qualitative themes and independently analyzed narrative entries for evidence of balanced and behavior-specific comments.
RESULTS:
Factor analysis of 5,241 evaluations revealed a unidimensional model for measuring CME presenter feedback. Overall internal consistency (Cronbach alpha = 0.94) and interrater reliability (ICC range 0.88-0.95) were excellent. Feedback scores were associated with presenters’ academic ranks (mean score): Instructor (4.12), Assistant Professor (4.38), Associate Professor (4.56), Professor (4.70) (p = 0.046). Qualitative analysis revealed that the ESF generated the highest numbers of balanced comments (GSF = 11, ESF = 26; p = 0.01) and behavior-specific comments (GSF = 64, ESF = 104; p = 0.001).
CONCLUSIONS:
We describe a practical and validated method for generating balanced and behavior-specific feedback for CME presenters in internal medicine. Our simple method for prompting course participants to give balanced and behavior-specific comments may ultimately provide CME presenters with feedback for improving their presentations.
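The internal consistency figure above is Cronbach’s alpha. The short sketch below shows the standard alpha calculation for an 8-item, 5-point instrument like the ESF, run on simulated ratings rather than the study’s data.

```python
# Cronbach's alpha for an 8-item, 5-point instrument, on simulated ratings.
import numpy as np

rng = np.random.default_rng(2)
n_respondents, n_items = 200, 8
# Correlated 1-5 ratings: a shared underlying rating plus small item-level noise
true_rating = rng.integers(1, 6, n_respondents)[:, None]
ratings = np.clip(true_rating + rng.integers(-1, 2, (n_respondents, n_items)), 1, 5)

def cronbach_alpha(items):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

print(f"alpha = {cronbach_alpha(ratings):.2f}")
```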

via Improving participant feedback to continuin… [J Gen Intern Med. 2012] – PubMed – NCBI.

ABSTRACT: Measuring Faculty Reflection on Medical Grand Rounds at Mayo Clinic: Associations With Teaching Experience, Clinical Exposure, and Presenter Effectiveness.

Abstract
OBJECTIVES:
To develop and validate a new instrument for measuring participant reflection on continuing medical education (CME) and determine associations between the reflection instrument scores and CME presenter, participant, and presentation characteristics.
PARTICIPANTS AND METHODS:
This was a prospective validation study of presenters and faculty at the weekly medical grand rounds at Mayo Clinic in Rochester, Minnesota, from January 1, 2011, through June 30, 2011. Eight items (5-point Likert scales) were developed on the basis of 4 reflection levels: habitual action, understanding, reflection, and critical reflection. Factor analysis was performed to account for clustered data. Interrater and internal consistency reliabilities were calculated. Associations between reflection scores and characteristics of presenters, participants, and presentations were determined.
RESULTS:
Participants completed a total of 1134 reflection forms. Factor analysis revealed a 2-dimensional model (eigenvalue; Cronbach α): minimal reflection (1.19; 0.77) and high reflection (2.51; 0.81). Item mean (SD) scores ranged from 2.97 (1.17) to 4.01 (0.83) on a 5-point scale. Interrater reliability (intraclass correlation coefficient) for individual items ranged from 0.58 (95% CI, 0.31-0.78) to 0.88 (95% CI, 0.80-0.94). Reflection scores were associated with presenters’ speaking effectiveness (P<.001) and prior CME teaching experience (P=.02), participants’ prior clinical experiences (P<.001), and presentations that were case based (P<.001) and used the audience response system (P<.001).
CONCLUSION:
We report the first validated measure of reflection on CME at medical grand rounds. Reflection scores were associated with presenters’ effectiveness and prior teaching experience, participants’ clinical exposures, and presentations that were interactive and clinically relevant. Future research should determine whether reflection on CME leads to better patient outcomes.
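The two-dimensional model above comes from a factor analysis whose eigenvalues are reported alongside each factor. As a simplified illustration of that first step, the sketch below computes the eigenvalues of the correlation matrix of eight simulated reflection items and applies the common eigenvalue-greater-than-one retention rule; it is not the study’s analysis or data.

```python
# Eigenvalue inspection for factor retention, on simulated reflection items.
import numpy as np

rng = np.random.default_rng(3)
n_forms, n_items = 1134, 8
# Two loosely related item clusters, mimicking "minimal" and "high" reflection items
base_a = rng.normal(size=(n_forms, 1))
base_b = rng.normal(size=(n_forms, 1))
items = np.hstack([base_a + 0.7 * rng.normal(size=(n_forms, 4)),
                   base_b + 0.7 * rng.normal(size=(n_forms, 4))])

corr = np.corrcoef(items, rowvar=False)                 # 8 x 8 item correlation matrix
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]   # largest first
print("eigenvalues:", np.round(eigenvalues, 2))
print("factors with eigenvalue > 1:", int((eigenvalues > 1).sum()))
```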

via Measuring Faculty Reflection on Medical Grand… [Mayo Clin Proc. 2013] – PubMed – NCBI.

ABSTRACT: Continuing medical education: How to write multiple choice questions. [Spanish]

Abstract
Evaluating professional competence in medicine is a difficult but indispensable task because it makes it possible to assess, at different times and from different perspectives, the extent to which the knowledge, skills, and values required for exercising the profession have been acquired. Tests based on multiple choice questions have been and continue to be among the most useful tools for objectively evaluating learning in medicine. When these tests are well designed and correctly used, they can stimulate learning and even measure higher cognitive skills. Designing a multiple choice test is a difficult task that requires knowledge of the material to be tested and of the methodology of test preparation as well as time to prepare the test. The aim of this article is to review what can be evaluated through multiple choice tests, the rules and guidelines that should be taken into account when writing multiple choice questions, the different formats that can be used, the most common errors in constructing multiple choice tests, and how to analyze the results of the test to verify its quality.
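The abstract’s final point, analyzing test results to verify quality, is usually done through item analysis. The sketch below computes two standard indices on a simulated 0/1 response matrix: item difficulty (proportion correct) and a corrected point-biserial discrimination (each item’s correlation with the total score excluding that item). The data and the 0.2 threshold are illustrative only.

```python
# Basic item analysis (difficulty and discrimination) on simulated responses.
import numpy as np

rng = np.random.default_rng(4)
n_examinees, n_items = 120, 30
ability = rng.normal(size=(n_examinees, 1))
# Higher-ability examinees are more likely to answer each item correctly
prob_correct = 1 / (1 + np.exp(-(ability + rng.normal(0, 0.5, (1, n_items)))))
responses = (rng.random((n_examinees, n_items)) < prob_correct).astype(int)

total = responses.sum(axis=1)
difficulty = responses.mean(axis=0)   # proportion answering each item correctly
discrimination = np.array([
    np.corrcoef(responses[:, i], total - responses[:, i])[0, 1]   # corrected point-biserial
    for i in range(n_items)
])

print("mean item difficulty:", round(float(difficulty.mean()), 2))
print("items with discrimination below 0.2:", int((discrimination < 0.2).sum()))
```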

via Continuing medical education: How to write multip… [Radiologia. 2013] – PubMed – NCBI.

Continuing education meetings and workshops: effects on professional practice and health care outcomes (Review)

In this update, we examined the effects of continuing education meetings on professional practice and patient outcomes. We also investigated factors that might influence the effectiveness of educational meetings. We used methods that have been developed by the Cochrane Effective Practice and Organisation of Care (EPOC) Group (Grimshaw 2003) since the previous review (O’Brien 2001). These methods were used in other recent EPOC reviews (Doumit 2007; Jamtvedt 2006; O’Brien 2007). The provision of printed educational materials has been reported in two reviews to have little or no effect (Freemantle 1997; Grimshaw 2001), but this finding has been questioned in a more recent review (Grimshaw 2004). Because printed materials are usually an integral part of educational meetings, we chose to consider printed educational materials as a component of educational meetings and not as an additional independent intervention. Few studies have tested educational meetings without any printed educational materials (Grimshaw 2004).

http://apps.who.int/rhl/reviews/CD003030.pdf

MANUSCRIPT: Intended Practice Changes and Barriers among Primary Care Providers

Background. The purpose of accredited CME has recently been expanded to include changing competence, performance, or patient outcomes. In addition, CME providers seeking accreditation with commendation are required to implement educational strategies to remove, overcome or address barriers to physician change. However, current methods to measure intended changes in practice and barriers to these changes are limited.
Method. At a free-standing annual Family Medicine review, we administered a targeted instrument asking learners to list intended practice changes related to 3 specific high-impact content areas (reducing error, emerging infections, and contraception for women with medical co-morbidities), score their likelihood of implementing each of these changes (1 = very unlikely to 10 = very likely), identify perceived barriers to each change, and identify their strategies to overcome these barriers. We analyzed the results and discussed them with learners on the last day of the course.
Results. Our response rate was 30.8%. For the 3 content areas, the mean number of changes per respondent ranged from 1.8 to 2.2, and for 72% of the intended practice changes, the likelihood of implementation was rated ≥ 8. For all 3 content areas, physicians’ difficulty in remembering and breaking old habits was a commonly cited barrier, but for reducing error numerous other barriers were also perceived. To overcome these barriers, the most commonly cited strategies were decision support techniques. In addition, for reducing error, other commonly cited approaches were increased team communication and training, and systems changes.
Conclusion. Using a targeted evaluation, we were able to go beyond knowledge and satisfaction and analyze intended practice changes and perceived barriers to change. For some content areas (such as emerging infections or contraception), the most commonly cited barriers were directly physician-related, whereas for a more complex content area (such as reducing error), additional barriers were perceived. These findings emphasize the importance of CME providers building bridges with other stakeholders who can influence changes in practice.
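For CME providers who collect similar commitment-to-change data, the short sketch below shows one way to tabulate it: per content area, the mean number of intended changes per respondent and the share of changes rated at least 8 on the 1-10 likelihood scale. The rows are invented; only the structure mirrors the instrument described above.

```python
# Tabulating intended-change ratings per content area (invented rows).
import pandas as pd

# One row per intended change listed by a respondent
changes = pd.DataFrame({
    "respondent": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "content_area": ["error", "error", "infections", "error", "contraception",
                     "infections", "infections", "contraception", "error"],
    "likelihood": [9, 7, 10, 8, 6, 9, 10, 8, 5],   # 1-10 scale
})

summary = changes.groupby("content_area").agg(
    changes_per_respondent=("respondent", lambda s: len(s) / s.nunique()),
    share_rated_8_or_more=("likelihood", lambda s: (s >= 8).mean()),
)
print(summary)
```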

via Intended Practice Changes and Barriers among Primary Care Providers | Gibbs | CE Measure.

ABSTRACT: Commitment to change instrument enhances program planning, implementation, and evaluation.

Abstract
INTRODUCTION:
This study investigates the use of a commitment to change (CTC) instrument as an integral approach to continuing medical education (CME) planning, implementation, and evaluation and as a means of facilitating physician behavior change.
METHODS:
Descriptive statistics and grounded theory methods were employed. Data were collected from 20 consecutive CME programs. Physicians were asked to list up to three things they intended to change in their clinical practice as a result of the program. A copy of their responses was sent to them 3 weeks later as a reminder. Six months later, a summary of peer-intended changes was sent to reinforce intended behavior change.
RESULTS:
Of 602 participants, 291 (48%) completed CTC forms, resulting in 803 citations. Responses were congruent with the educational objectives and intentions of the program planners. Using the constant comparative method of analysis, a framework was identified for interpreting physician learning strategies. It included change strategies and motivation, learning issues, better doctoring, changes to clinic practice, and diffusion.
DISCUSSION:
CTC was useful as a multipurpose tool providing planners with meaningful feedback to (1) assess congruence of intended changes in physician behavior with program objectives, (2) document unanticipated learning outcomes, and (3) enable and reinforce intended behavior change.

via Commitment to change instrument en… [J Contin Educ Health Prof. 2004] – PubMed – NCBI.