
Author: Brian S McGowan, PhD

MANUSCRIPT: Standardized Patient’s Views About their Role in the Teaching-Learning Process of Undergraduate Basic Science Medical Students

INTRODUCTION:
Standardized Patients (SPs) are widely used in medical education. SPs have a number of advantages but also have certain limitations. At the institution, SPs have been used since January 2013 for both teaching-learning and assessment during the basic science years of the undergraduate medical program.
AIM:
The present study was conducted to investigate the perception of SPs about various aspects of the program and obtain suggestions for further improvement.
MATERIALS AND METHODS:
A Focus Group Discussion (FGD) was conducted with a group of five SPs during the second week of November 2015. The aims and objectives of the study were explained to respondents, who were then invited to participate. Written informed consent was obtained. The FGD was conducted using a discussion guide and was audio recorded. Various aspects of the SP program at the institution were discussed. Motivation/s for joining the program and suggestions for further improvement were obtained. Transcripts were created after listening to the recordings and were read through multiple times. Similar responses were coded. Items with similar codes were grouped together into themes.
RESULTS:
Three respondents were female while two were male. The major advantage of SPs was their flexibility and ability to present a standardized response to the student. Students become familiar and comfortable with SPs. However, as an SP is simulating an illness, s/he may not always be able to do complete justice to the role. The process used by SPs to prepare themselves to portray various diseases was highlighted. The use of SPs both during teaching-learning and assessment was also discussed. Some SPs are trained to provide feedback to students. Most SPs joined the program based on invitations from their friends who were already SPs. Challenges in recruiting SPs in a small island were discussed. Suggestions for further improvement were obtained.
CONCLUSION:
The present study obtained the perception of SPs regarding various aspects of the SP program at the institution. The overall opinion of SPs was positive.

via Standardized Patient’s Views About their Role in the Teaching-Learning Process of Undergraduate Basic Science Medical Students. – PubMed – NCBI.

ABSTRACT: Engaging medical undergraduates in question making: a novel way to reinforcing learning in physiology

The monotony of conventional didactic lectures makes students less attentive toward learning, and they tend to memorize isolated facts without understanding, just for the sake of passing exams. Therefore, to promote a habit of gaining in-depth knowledge of basic sciences in medical undergraduates along with honing of their communication and analytical skills, we introduced this more interactive way of learning. The present study was performed on 99 first-semester medical students. After conventional didactic lectures, students were asked to prepare small conceptual questions on the topic. They were divided into two teams, which were made to ask questions of each other. If a team failed to answer, the student who questioned was supposed to answer to the satisfaction of the other team’s student. Data were then obtained by getting feedback from the students on a 10-item questionnaire, and statistical evaluation was done using MS Excel and SPSS. To draft questions, students went through the whole system comprehensively and made questions from every possible aspect of the topic. Some of the questions (30%) were of recall type, but most judged higher cognitive domains. Student feedback revealed that they were satisfied, motivated to read more, and were confident of applying these learning and communication skills in future clinical practice. Students also expressed their desire to implement this activity as a regular feature of the curriculum. The activity resulted in an increase in student perceptions of their knowledge on the topic as well as communicative and analytical skills. This may eventually lead to better learning.

via Engaging medical undergraduates in question making: a novel way to reinforcing learning in physiology. – PubMed – NCBI.

ABSTRACT: Quality improvement utilizing in-situ simulation for a dual-hospital pediatric code response team

OBJECTIVE:
Given the rarity of in-hospital pediatric emergency events, identification of gaps and inefficiencies in the code response can be difficult. In-situ, simulation-based medical education programs can identify unrecognized systems-based challenges. We hypothesized that developing an in-situ, simulation-based pediatric emergency response program would identify latent inefficiencies in a complex, dual-hospital pediatric code response system and allow rapid intervention testing to improve performance before implementation at an institutional level.
METHODS:
Pediatric leadership from two hospitals with a shared pediatric code response team employed the Institute for Healthcare Improvement’s (IHI) Breakthrough Model for Collaborative Improvement to design a program consisting of Plan-Do-Study-Act cycles occurring in a simulated environment. The objectives of the program were to 1) identify inefficiencies in our pediatric code response; 2) correlate to current workflow; 3) employ an iterative process to test quality improvement interventions in a safe environment; and 4) measure performance before actual implementation at the institutional level.
RESULTS:
Twelve dual-hospital, in-situ, simulated, pediatric emergencies occurred over one year. The initial simulated event allowed identification of inefficiencies including delayed provider response, delayed initiation of cardiopulmonary resuscitation (CPR), and delayed vascular access. These gaps were linked to process issues including unreliable code pager activation, slow elevator response, and lack of responder familiarity with layout and contents of code cart. From first to last simulation with multiple simulated process improvements, code response time for secondary providers coming from the second hospital decreased from 29 to 7 min, time to CPR initiation decreased from 90 to 15 s, and vascular access obtainment decreased from 15 to 3 min. Some of these simulated process improvements were adopted into the institutional response while others continue to be trended over time for evidence that observed changes represent a true new state of control.
CONCLUSIONS:
Utilizing the IHI’s Breakthrough Model, we developed a simulation-based program to 1) successfully identify gaps and inefficiencies in a complex, dual-hospital, pediatric code response system and 2) provide an environment in which to safely test quality improvement interventions before institutional dissemination.

via Quality improvement utilizing in-situ simulation for a dual-hospital pediatric code response team. – PubMed – NCBI.

MANUSCRIPT: Effect of warning symbols in combination with education on the frequency of erroneously crushing medication in nursing homes: an uncontrolled before and after study

OBJECTIVES:
Residents of nursing homes often have difficulty swallowing (dysphagia), which complicates the administration of solid oral dosage formulations. Erroneously crushing medication is common, but few interventions have been tested to improve medication safety. Therefore, we evaluated the effect of warning symbols in combination with education on the frequency of erroneously crushing medication in nursing homes.
SETTING:
This was a prospective uncontrolled intervention study with a preintervention and postintervention measurement. The study was conducted on 18 wards (total of 200 beds) in 3 nursing homes in the North of the Netherlands.
PARTICIPANTS:
We observed 36 nurses/nursing assistants (92% female; 92% nursing assistants) administering medication to 197 patients (62.9% female; mean age 81.6 years).
INTERVENTION:
The intervention consisted of a set of warning symbols printed on each patient’s unit dose packaging indicating whether or not a medication could be crushed as well as education of ward staff (lectures, newsletter and poster).
PRIMARY OUTCOME MEASURE:
The relative risk (RR) of a crushing error occurring in the postintervention period compared to the preintervention period. A crushing error was defined as the crushing of a medication considered unsuitable to be crushed based on standard reference sources. Data were collected using direct (disguised) observation of nurses during drug administration.
RESULTS:
The crushing error rate decreased from 3.1% (21 wrongly crushed medicines out of 681 administrations) to 0.5% (3/636), RR=0.15 (95% CI 0.05 to 0.51). Likewise, there was a significant reduction using data from patients with swallowing difficulties only, 87.5% (21 errors/24 medications) to 30.0% (3/10) (RR 0.34, 95% CI 0.13 to 0.89). Medications which were erroneously crushed included enteric-coated formulations (eg, omeprazole), medication with regulated release systems (eg, Persantin; dipyridamol) and toxic substances (eg, finasteride).
CONCLUSIONS:
Warning symbols combined with education reduced erroneous crushing of medication, a well-known and common problem in nursing homes.

via Effect of warning symbols in combination with education on the frequency of erroneously crushing medication in nursing homes: an uncontrolled befor… – PubMed – NCBI.

ABSTRACT: A Novel Specialty-Specific, Collaborative Faculty Development Opportunity in Education Research

PURPOSE:
For the busy clinician-educator, accessing opportunities that develop the skills and knowledge necessary to perform education research can be problematic. The Medical Education Research Certification at Council of Emergency Medicine Residency Directors (MERC at CORD) Scholars’ Program is a potential alternative. The current study evaluates the program’s outcomes after five years.
METHOD:
The authors employed a quasi-experimental design in this study. The study population consisted of the initial five MERC at CORD cohorts (2009-2013). Development of a logic model informed Kirkpatrick-level outcomes. Data from annual pre/post surveys, an alumni survey (2014), and tracking of national presentations/peer-reviewed publications resulting from program projects served as outcome measurements.
RESULTS:
Over the first five years, 149 physicians participated in the program; 97 have completed six MERC workshops, and 63 have authored a national presentation and 30 a peer-reviewed publication based on program projects. Of the 79 participants responding to the pre- and postsurveys from the 2011-2013 cohorts, 65 (82%) reported significant improvement in skills and knowledge related to education research and would recommend the program. Of the 61 graduates completing the alumni survey, 58 (95%) indicated their new knowledge was instrumental beyond educational research, including promotion to new leadership positions, and 28 (47% of the 60 responding) reported initiating a subsequent multi-institutional education study. Of these, 57% (16/28) collaborated with one or more peers/mentors from their original program project.
CONCLUSIONS:
Kirkpatrick-level outcomes 1, 2, 3, and perhaps 4 demonstrate that the MERC at CORD program is successful in its intended purpose.

via A Novel Specialty-Specific, Collaborative Faculty Development Opportunity in Education Research: Program Evaluation at Five Years. – PubMed – NCBI.

ABSTRACT: Towards a pedagogy for patient and public involvement in medical education

CONTEXT:
This paper presents a critique of current knowledge on the engagement of patients and the public, referred to here as patient and public involvement (PPI), and calls for the development of robust and theoretically informed strategies across the continuum of medical education.
METHODS:
The study draws on a range of relevant literatures and presents PPI as a response process in relation to patient-centred learning agendas. Through reference to original research it discusses three key priorities for medical educators developing early PPI pedagogies, including: (i) the integration of evidence on PPI relevant to medical education, via a unifying corpus of literature; (ii) conceptual clarity through shared definitions of PPI in medical education, and (iii) an academically rigorous approach to managing complexity in the evaluation of PPI initiatives.
RESULTS:
As a response to these challenges, the authors demonstrate how activity modelling may be used as an analytical heuristic to provide an understanding of a number of PPI systems that may interact within complex and dynamic educational contexts.
CONCLUSION:
The authors highlight the need for a range of patient voices to be evident within such work, from its generation through to dissemination, in order that patients and the public are partners and not merely objects of this endeavour. To this end, this paper has been discussed with and reviewed by our own patient and public research partners throughout the writing process.

via Towards a pedagogy for patient and public involvement in medical education. – PubMed – NCBI.

ABSTRACT: Needles and Haystacks: Finding Funding for Medical Education Research

Medical education research suffers from a significant and persistent lack of funding. Although adequate funding has been shown to improve the quality of research, there are a number of factors that continue to limit it. The competitive environment for medical education research funding makes it essential to understand strategies for improving the search for funding sources and the preparation of proposals. This article offers a number of resources, strategies, and suggestions for finding funding. Investigators must be able to frame their research in the context of significant issues and principles in education. They must set their proposed work in the context of prior work and demonstrate its potential for significant new contributions. Because there are few funding sources earmarked for medical education research, researchers must also be creative, flexible, and adaptive as they seek to present their ideas in ways that are appealing and relevant to the goals of funders. Above all, the search for funding requires persistence and perseverance.

via Needles and Haystacks: Finding Funding for Medical Education Research. – PubMed – NCBI.

ABSTRACT: A systematic review examining the effectiveness of blending technology with team-based learning

BACKGROUND:
Technological advancements are rapidly changing nursing education in higher education settings. Nursing academics are enthusiastically blending technology with active learning approaches such as Team Based Learning (TBL). While the educational outcomes of TBL are well documented, the value of blending technology with TBL (blended-TBL) remains unclear. This paper presents a systematic review examining the effectiveness of blended-TBL in higher education health disciplines.
OBJECTIVES:
This paper aimed to identify how technology has been incorporated into TBL in higher education health disciplines. It also sought to evaluate the educational outcomes of blended-TBL in terms of student learning and preference.
METHOD:
A review of TBL research in Medline, CINAHL, ERIC and Embase databases was undertaken including the search terms team based learning, nursing, health science, medical, pharmaceutical, and allied health education. Papers were appraised using the Critical Appraisal Skills Program (CASP).
RESULTS:
The final review included 9 papers involving 2094 student participants. A variety of technologies were blended with TBL including interactive eLearning and social media.
CONCLUSION:
There is limited evidence that blended-TBL improved student learning outcomes or student preference. Enthusiasm to blend technology with TBL may not be as well founded as initially thought. However, few studies explicitly examined the value of incorporating technology into TBL. There is a clear need for research that can discern the impact of incorporating technology into TBL on student preference and learning outcomes, with a particular focus on barriers to student participation with online learning components.

via A systematic review examining the effectiveness of blending technology with team-based learning. – PubMed – NCBI.

ABSTRACT: Elearning approaches to prevent weight gain in young adults: A randomized controlled study

OBJECTIVE:
Preventing obesity among young adults should be a preferred public health approach given the limited efficacy of treatment interventions. This study examined whether weight gain can be prevented by online approaches using two different behavioral models, one overtly directed at obesity and the other covertly.
METHODS:
A three-group parallel randomized controlled intervention was conducted in 2012-2013; 20,975 young adults were allocated a priori to one control and two “treatment” groups. The two treatment groups were offered online courses over 19 weeks on (1) personal weight control (“Not the Ice Cream Van,” NTICV) and (2) political, environmental, and social issues around food (“Goddess Demetra,” GD). The control group received no contact. The primary outcome was weight change over 40 weeks.
RESULTS:
Within-group 40-week weight changes differed between groups (P < 0.001): control (n = 2,134): +2.0 kg (95% CI = 1.5 to 2.3 kg); NTICV (n = 1,810): -1.0 kg (95% CI = -1.3 to -0.5); and GD (n = 2,057): -1.35 kg (95% CI = -1.4 to -0.7). Relative risks for weight gain versus control: NTICV = 0.13 (95% CI = 0.10 to 0.15), P < 0.0001; GD = 0.07 (95% CI = 0.05 to 0.10), P < 0.0001.
CONCLUSIONS:
Both interventions were associated with prevention of the weight gain observed among control subjects. This low-cost intervention could be widely transferable as one tool against the obesity epidemic. Outside the randomized controlled trial setting, it could be enhanced using supporting advertising and social media.

via Elearning approaches to prevent weight gain in young adults: A randomized controlled study. – PubMed – NCBI.

The “Golden Rules” of Truly Purposeful Educational Design (and how to embrace them!)

Looking back at nearly a decade of training as an academic research scientist, I have come to hold one truth above all else: the most important lesson a scientist learns is how to approach one’s curiosity in a structured way. Asking questions and challenging common understanding is essential to growing as a scientist, but you must be methodical in how you go about your exploration. This is how careers are made, how science evolves, and how real breakthroughs emerge.

Having spent the past fifteen years studying medical education, I’d argue that being an educator is no different. The “Golden Rule,” and the most important lesson an educator learns, is how to approach one’s design and planning in a structured way: asking questions and challenging common approaches is essential to growing as an educator, but you must be organized and methodical in how you go about your work.

Ask questions, be methodical

There are two critical elements baked into the “Golden Rule”: 1) asking questions and challenging common approaches means that an educator should never accept that the status quo is the optimal approach (or even acceptable), and 2) being methodical about what should or could change, and about what results from the planned changes, is the best path towards success.

Understanding how to look beyond the status quo is no simple matter. Experience suggests that not everyone is comfortable asking tough questions, challenging conventional wisdom, or changing established processes; and not every workplace or culture empowers such questions or curiosity. This is a challenge that I will try to tackle in a separate post. For now, let’s assume curiosity is alive and well and that your question has already been identified…now what do you do?

Stepping back a bit…

Educators are tasked with creating impact (i.e., changes in knowledge, performance, etc.) through education. In this vein, volumes have been written on how gap analyses, needs assessments, learning objectives, adult learning theory, and outcomes models are essential tools of this work, but little has been written about how these tools are most effectively and most methodically applied in practice.

To prove this point, my guess is that most educational professionals can point to books on needs assessment (Example 1), or guidelines on writing learning objectives (Example 2), or even meta-reviews on adult learning theory or outcomes models (Example 3), but few can point to examples of how these tools are applied over time.

How do you systematically approach each tool and ‘test’ whether it produces the impact you intended?

How do you demonstrate in your educational practice/setting that you are making the optimal design choices and generating the greatest impact given all available resources?

It seems that, if we accept the “Golden Rule,” what is missing from the decades of literature on educational planning, design, implementation, and assessment is the ‘methodical’ piece: how can we execute on our plans AND ensure that we are intelligently advancing the profession of medical education?

Here is my simple, yet critical suggestion: Much like an academic research scientist documents each step in the scientific method – question, intervention, conditions, data, analysis – an educational designer should meticulously document each step in their educational planning process. In short, an educational designer should commit to keeping a lab notebook!

Your Lab Notebook

As a bit of background, stored away in my basement I have more than a dozen lab notebooks, many dating back to the 1990s. These notebooks document every experiment I ever conducted as a research scientist. They allowed me to document each step in the exploration, from planning, to alignment with prior research, to step-by-step details of the interventions, to data, to analysis and conclusions. As a research scientist, my lab notebooks were the scaffolding for my thoughts, my efforts, and my findings.

As an educator, I have moved on to a digital notebook – I use Evernote – that similarly documents my questions, the alignment with prior research, my interventional opportunities, any known limitations, and my data, analyses, and conclusions.
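To make the habit concrete, here is a minimal sketch of the structure such a digital notebook entry might take. The field names simply mirror the items listed above (question, prior research, intervention, limitations, data, analysis, conclusions); everything else is an assumption for illustration, not a description of any particular tool.

```python
from dataclasses import dataclass, field
from datetime import date

# One "lab notebook" entry per educational design experiment.
# Field names follow the items described in the text; the class and
# method names are hypothetical -- adapt them to your own workflow.
@dataclass
class NotebookEntry:
    question: str                                       # what am I testing?
    prior_research: list = field(default_factory=list)  # related work, notes
    intervention: str = ""                              # the design change tried
    limitations: list = field(default_factory=list)     # known constraints
    data: dict = field(default_factory=dict)            # raw observations
    analysis: str = ""                                  # how the data were read
    conclusions: str = ""                               # what was learned
    logged_on: date = field(default_factory=date.today)

    def summary(self) -> str:
        """Render a short plain-text summary for review or export."""
        return (f"[{self.logged_on}] Q: {self.question}\n"
                f"Intervention: {self.intervention}\n"
                f"Conclusions: {self.conclusions or '(open)'}")

entry = NotebookEntry(
    question="Do designed learning moments raise completion rates?",
    intervention="Added three reflection prompts to the module",
)
print(entry.summary())
```

The point is not the tooling but the discipline: whatever form the notebook takes, each experiment gets the same set of fields, which is what makes entries comparable over time.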

Leveraging Your Lab Notebook

By thinking about each of your educational design decisions as an opportunity to learn something new – as an experiment – and by methodically documenting these decisions within your lab notebook, you will quickly recognize how often subtle variables or decisions lead to, or undermine, the success of your interventions!

For example, at ArcheMedX we partner with dozens of academic medical centers, national medical societies, and medical education companies that use our software, learning models, and novel data to improve the educational interventions that they plan, design, and implement. Each Partner and each project may therefore be seen as a separate experiment. Connecting these experiments ensures that the educational experiences provided to Learners from around the world are constantly improving, and my lab notebook helps me connect these dots.

  • How can I demonstrate that the ArcheViewer drives greater completion rates = experiment!
  • How can I prove the effectiveness of the ArcheViewer in different audiences = experiment!
  • Can the Learning Actions Model allow for agile activity enhancements = experiment!
  • What is the optimal number/form of designed learning moments = experiment!

To be clear, it can be a challenge to identify a specific element in each project that may be ‘studied’ or documented. But when the circumstances present themselves – and when I convince our Partners that they have a unique opportunity to collect data that might address a research question – I will commit to documenting these ‘experiments’ within my notebook.

In the end, I see my lab notebook as a critical mirror that ensures I don’t lose touch with the impact I am having, or could be having, on the science of medical education. And my belief is that if you are willing to accept the Golden Rule, then you’ll quickly come to see the value of your lab notebook too!
