Sunday, October 31, 2010

Redesigning Continuing Education in the Health Professions

A recent report from the Institute of Medicine (IOM) suggested several ways to improve continuing education in the health professions.  The recommendations in the report would greatly improve continuing education, especially programs designed for teams of providers from different disciplines. 

Some of the problems with the current system identified in the report:
  • a lack of scientific understanding of what kinds of training are effective
  • fragmented oversight of continuing education

Lack of Scientific Understanding of Effective Training
One of the things I'm interested in, and the reason I'm writing this blog, is to explore the science of learning and to look at how it has been, and can be, applied to continuing education in the health professions.  We know a lot about how professionals learn, but there is much more that we need to understand. 

Fragmented Oversight of Continuing Education
Each discipline (doctors, nurses, social workers, psychologists, etc.) has its own accreditation body and its own set of rules to follow.  This isn't a problem if you're planning learning for only one group of professionals.  More and more, though, it's becoming apparent that teams of professionals who work together with specific patient populations should be trained together.  It can be a struggle to create an accredited learning experience that meets the requirements for each of the separate groups, especially if you're trying to do something slightly unconventional. 

Friday, October 22, 2010

Internet-Based Learning is Just as Effective as Face-to-Face

Studies comparing live training to internet-based training find no differences in learning, as long as the distance learning is well conceived and carried out. 

This is the major finding of a review published in JAMA in September 2008 titled Internet-Based Learning in the Health Professions: A Meta-analysis.  The paper was written by the research group of Cook, Levinson, Garside, Dupras, Erwin, and Montori, who have been writing a number of very interesting papers on learning from technology in the health sciences. 

They looked at 201 studies, some comparing internet-based learning with no intervention and others comparing internet-based learning with a face-to-face alternative.  They looked at three different outcomes: knowledge, skills, and behavior/effects on patients. 

What did they find?  That internet-based training was no different from live training.  Their findings are similar to publications from other fields showing that, at worst, there is no difference between learning online and face-to-face learning. 

The conclusion:
It's a waste of time to continue doing studies comparing internet learning with face-to-face learning; it's time to start looking at exactly what kinds of activities and instructional methods lead to better learning. 

In future posts, I'll look at some of the things that you can do to improve learning and outcomes for your online learners.  

Thursday, October 21, 2010

Expertise

Over the last 25 years, there has been a lot of research on how people develop expertise.  Our understanding of what expertise is, and how people develop it, has increased greatly.  One of the leading thinkers on expertise has been K. Anders Ericsson.  This literature is important because it can help us understand how to set up the conditions that can support expert performance within an organization, as well as helping us figure out ways to develop our own expertise. 

Ericsson and colleagues wrote a paper for the Harvard Business Review that is a very good introduction to expertise: The Making of an Expert

Among the highlights:

Consistently and overwhelmingly, the evidence shows that experts are always made, not born.  We often think that people have inherent skills and abilities, but this just doesn't turn out to be true.  Expertise is developed.  Developing expertise in a subject takes hard work.  In fact, the most important factor in studies of expertise is the quality of practice time, not inherent factors like IQ, learning styles, or anything else.  This has been shown to be true for every field that's been studied.  The only exception is in sports, where body size and height are important.

It takes time to become an expert – most people need a minimum of ten years of intense training.  Ten years of simply repeating the same things over and over again will give you experience, but it won't make you an expert.  Expertise takes a constant drive to improve your own performance.  This means focusing on the things you need to improve, not on the things you can already do well. 

Practice must be deliberate.  Real experts seek out constructive (and sometimes even painful) feedback.  The best way to improve is to constantly get, and act on, feedback about your performance.

Understanding expertise is important for anyone designing professional education experiences, because we want to make sure that we're supporting the development of expertise.

For a more in-depth look at expertise (including a chapter on expertise in medicine), check out The Cambridge Handbook of Expertise and Expert Performance

Monday, October 11, 2010

How to use video

Providers in a trauma center aren't using sterile technique, and they need training.  What kind of training are they likely to get?  Pamphlets?  Presentations by highly regarded or experienced providers?  Maybe, if there's enough money in the budget, they'll get training that incorporates video.  One thing we know is that the training will likely be more successful if it involves some kind of interaction, because training that uses passive methods is not likely to lead to any changes in behavior. 

A group from the University of Maryland used video in their training, and they did it in a very interesting way.  Instructional videos often show an expert doing the procedure perfectly.  The idea is that people will see this and be able to copy the perfect performance.  The problem, though, is that it's possible to watch videos like this and not process any of the information being shown. 

The Maryland group did a very clever thing: they used videos of providers doing the procedure with common non-compliant behaviors.  Then, instead of having trainees watch passively, they had them watch for mistakes during the filmed procedures.  This kept them thinking about, and processing, the information.  The result was a significant increase in sterile procedures in the group that watched the videos. 

The bottom line?  Training that asked learners to evaluate real situations had a significant effect on sterile procedures.  If you're going to use video in your training, try to use it in a way that helps people think about what they're doing. 

Saturday, October 2, 2010

Lectures don't work

Lectures are usually a pretty bad way to help professionals learn something.  In fact, most of the evidence points to the fact that in professional continuing education, they don't really have much of an effect on either learning or the subsequent changes in practice that we often expect from CME programs. 

If you want to find out more about what has, and hasn't, worked in CME programs, the best single reference I've found is this JAMA review paper from about 10 years ago; I consider it a must-read for anyone involved in CME.  The authors looked at published studies of CME activities and their measured outcomes.  It's a very good read, but the bottom line is simply this: didactic instruction does not change physician behavior.  What does change physician practice behavior?  Giving people the chance to practice their skills and get feedback is one of the methods that seemed to work best. 

Having professionals sit and listen to a presentation isn't always a negative.  If you want to give people information about a topic, or give them a broad overview, didactic instruction isn't necessarily a bad way to do it.  Where lectures don't work, though, is when you want people to develop, or use, complex knowledge or skills.  This is pretty well accepted by most people in education, but it doesn't seem to be widely known by others.