
Defining (real) learner engagement in online educational interventions

Over the weekend I came across an interesting article in the latest Alliance Almanac, “Defining Participants and Learners in CME: Standardizing Language for Online Activity Reporting.” Access to this article is limited to Alliance members, but in a nutshell, here is the take-away:

Showing the overall number of people who engaged in our online CME activities, whether in whole or in part, and evaluated and/or requested credit for them is important, but it is only part of the story. Reporting the numbers of actual target audience members who engaged with the content and the clinical implications of their engagement is the remainder of the story. In the outcomes measurement reports, CME providers should include an explanation of the participation or engagement metrics and/or process used to gather them to make this apparent to grantors.

This is a critical and necessary lesson: the number of ‘learners’ provides only a minimal understanding of impact. To offer some practical support for educational planners, the author presents a table of data elements that can begin to provide additional understanding. Here is what is shared:

[Table: Lichti, Defining Participants and Learners in CME]

While this is a good start, it falls short of addressing what real learner engagement and impact look like and what novel engagement analytics can tell us. Drawing on our Learning Actions Research and the implementation of the Learning Actions Model, we can offer an example of a much deeper and, in my opinion, more valuable exploration of engagement.

When learners are encouraged to reflect and take action in increasingly efficient ways, we now know empirically that they benefit from taking notes, setting reminders, searching related resources, viewing resources, downloading resources, sharing resources, asking faculty questions, responding to poll questions, reacting to assessment feedback, and starting and replying to associated discussions. The data show repeatedly that these actions are critical to improving learning and competency.

As one brief example, engagement in an online, flipped curriculum that utilizes the Learning Actions Model might be represented in the following way:

[Figure: What does engagement look like? (July 2015)]

Here is where a data-driven approach provides a much richer understanding of learning AND engagement. From the figure above we learn that, while participating in the 11-part curriculum, 121 learners took nearly 5,000 actions and averaged nearly 8 actions within each activity they began. Roughly 70% of these actions were what we refer to as the core learning actions: 1,747 resource views, 1,043 resource downloads, nearly 200 notes taken, and nearly 500 reminders set. These are the purest measures of engagement because they are unprompted and self-directed. Behind these numbers sit the qualitative data (real learner notes, searches, discussions, etc.) that describe the actual cognitive actions driving learning and knowledge change, and that give educators unique insight into the actual engagement and impact of their education.
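To make these aggregate metrics concrete, here is a minimal sketch of how such engagement summaries might be computed from an action log. The schema (learner, activity, action type tuples) and the action-type names are my own illustrative assumptions, not the actual data model behind the figure:

```python
from collections import Counter

# Hypothetical action-type names; the real taxonomy is assumed here.
CORE_ACTIONS = {"view_resource", "download_resource", "take_note", "set_reminder"}

def engagement_summary(actions):
    """Summarize engagement from a list of (learner, activity, action_type) tuples."""
    total = len(actions)
    learners = {learner for learner, _, _ in actions}
    # Each distinct (learner, activity) pair counts as one activity begun.
    activity_starts = {(learner, activity) for learner, activity, _ in actions}
    core = [kind for _, _, kind in actions if kind in CORE_ACTIONS]
    return {
        "learners": len(learners),
        "total_actions": total,
        "avg_actions_per_activity": total / len(activity_starts),
        "core_action_share": len(core) / total,
        "core_breakdown": Counter(core),
    }

# Tiny illustrative log: two learners, two activities.
log = [
    ("u1", "a1", "view_resource"),
    ("u1", "a1", "take_note"),
    ("u2", "a1", "view_resource"),
    ("u2", "a2", "ask_question"),
]
summary = engagement_summary(log)
```

Run against a full export, the same summary would reproduce the figure's headline numbers (learner count, total actions, average actions per activity begun, and the core-action share).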

Keep these lessons in mind: as the author of the recent Almanac article suggests, assessing the impact of educational interventions based solely on audience reach and traditional assessments (pre/post tests) barely scratches the surface of what could and should be measured.

My take-away: With continuing advancements in online learning, we have more opportunities than ever to leverage new forms of learning and engagement data. Yet we must open our minds to exploring the self-directed actions learners take to support their own learning and retention; only then will we be able to peer into the cognitive underpinnings of learning and behavior change. In short, our community could learn much more about the impact of our education if we strive to give learners better tools for learning, and then watch and study how they actually engage.

Written by

Dr. McGowan has served in leadership positions in numerous medical education organizations and commercial supporters and is a Fellow of the Alliance (FACEhp). He founded the Outcomes Standardization Project, launched and hosted the Alliance Podcast, and most recently launched and hosts the JCEHP Emerging Best Practices in CPD podcast. In 2012 he co-founded ArcheMedX, Inc., a healthcare informatics and e-learning company, to apply his research in practice.
