Measuring the time that clinicians spend on teaching activities
A favorite adage of productivity experts is that what gets measured gets done. It is also a truism that what gets rewarded gets done. Does it therefore follow that what gets measured gets rewarded? That's what Dr. Blair Brooks, an associate professor of medicine, and Jennifer Friend, business manager in the Department of Medicine, hope will result from a project they're engaged in.
[Photo caption: Impromptu teaching like this can now be quantified, thanks to a new project.]
Three years ago, the Department of Medicine's education committee held a retreat to discuss ways to better support the department's teaching mission. "The clinician-educators get endless data on what they're doing clinically, but no counterbalancing information stream on what they're doing educationally," Friend explains. Yet education is one of DHMC's missions. "Every physician is expected to contribute some portion of their effort to the academic mission," says Brooks. "But there has been no accounting for that time."
Effort: By the end of the retreat, Brooks and Friend had been charged with devising a system to quantify teaching effort, both to manage that effort and to let faculty know that teaching matters.
The first step in the process was to define the data they needed to collect. Educational activities are as diverse as leading a small-group discussion among medical students, overseeing residents caring for hospitalized patients, mentoring students in the lab, or giving a continuing medical education lecture. Once the group had identified the various kinds of teaching, they began the difficult task of determining how much credit should be given for the different activities.
About 40 other institutions nationwide have launched similar projects, according to Academic Medicine. There are two basic methods of tracking teaching effort: an hourly method and a relative-value method. DHMC is using the hourly method.
To do this, the working group needed to determine how much time someone spends in actual contact with a learner, as well as how much time preparation and evaluation require. Preparing a brand new lecture takes more advance work than delivering one again, for example. The initial inventory contained 106 different activities, so the group condensed the list to a more manageable 35 categories.
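The hourly method described above, crediting contact time plus time for preparation and evaluation across a fixed list of activity categories, could be sketched roughly as follows. The category names, prep-time multipliers, and credit formula here are invented for illustration; the article does not describe DHMC's actual weights or data model.

```python
# Hypothetical sketch of an hourly teaching-effort tally.
# Categories and multipliers are illustrative assumptions only;
# the real DHMC inventory condensed 106 activities into 35 categories.

# Assumed preparation/evaluation hours credited per contact hour.
PREP_MULTIPLIER = {
    "new_lecture": 3.0,      # a brand-new lecture takes more advance work
    "repeat_lecture": 0.5,   # delivering it again takes less
    "small_group": 1.0,      # e.g., leading a medical-student discussion
    "ward_attending": 0.25,  # overseeing residents on the wards
}

def credited_hours(activity: str, contact_hours: float) -> float:
    """Contact time plus credited preparation/evaluation time."""
    return contact_hours * (1 + PREP_MULTIPLIER[activity])

# A faculty member's hypothetical activity log: (category, contact hours).
log = [("new_lecture", 1.0), ("repeat_lecture", 1.0), ("small_group", 2.0)]
total = sum(credited_hours(activity, hours) for activity, hours in log)
print(total)  # 4.0 + 1.5 + 4.0 = 9.5 credited hours
```

The point of such a scheme is that one contact hour is not one credited hour: the multiplier table is where the working group's judgment about preparation effort lives.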
"It is not our purpose to say what's a good use of someone's time, but merely to record the amount of time spent on teaching," says Friend. "Eventually we will be able to state how much time overall our faculty devote to teaching, and how we can maximize that to meet our programmatic needs."
Built it: The information will be gathered on a Web-based data system designed by Dr. Harley Friedman, director of the department's residency program. The project could not have gotten to its current point without him, asserts Friend. "In the time it took other, outside vendors to tell us how much they thought it might cost us to build this," she says, "he built it."
The group ran a pilot in three sections of the department last fall and had 75% participation. It's going department-wide on July 1. The goal, says Friend, is to manage the institution's investment in the academic mission. An objective measurement of educational productivity can be used in compensation and promotion decisions. "What we often hear," Friend says, "is 'I teach a lot.' Now we'll know what 'a lot' means." This will help the faculty member looking for a promotion, the department looking to allocate funds, and the institution's leaders looking to understand how much teaching is being done so they can better manage faculty resources.
"The metrics for clinical research are crowding out other things," says Brooks. "This is a way to help individuals get credit for the teaching that is getting done and not currently measured." Or, perhaps, rewarded.
Katharine Fisher Britton