Bringing rigor to education, too
It's baffling, really. Medicine, a field rooted firmly in research, has never applied the same degree of rigor to evaluating the way physicians are trained. "If medicine has a high threshold for evidence of clinical care, why is there no corresponding threshold for educational effectiveness?" ask the DMS authors of a September article in the Journal of the American Medical Association (JAMA). "For example, what is the basis for the Liaison Committee on Medical Education (LCME) and Accreditation Council for Graduate Medical Education (ACGME) accreditation requirements?"
"There isn't any!" insists Patricia Carney, Ph.D., assistant dean for educational research at DMS and was the lead author of the JAMA article. "What happens," she says, "is that the accrediting bodies all get together and they decide what medical schools ought to be doing, mostly based on what they, themselves, do." In the article, she and her coauthors call for medical schools around the country to channel resources into what they've dubbed "educational epidemiology"—the science of studying
the training of doctors. While many papers about medical education are published, the number in which "an actual design is applied to the evaluation and a hypothesis is identified and tested . . . is actually pretty low," says Carney. What is published is largely qualitative rather than quantitative, the authors contend.
Data: But a handful of institutions, including Dartmouth, hope to change that. DMS was also one of eight medical schools invited to write about such research for the October issue of Academic Medicine. Medical education research began at DMS in 1995 with data collected on index cards. Carney and family physician Allen Dietrich, M.D., used the cards to track students' experiences in their primary-care rotations. What diseases were they seeing? What procedures were they performing? Then, in 1998, Carney and several colleagues launched ClinEdDoc, a computer-based documentation system.
"To me, ClinEdDoc was a Phase I trial that showed that collecting such data was feasible," says Carney. The system
yielded eight published papers, including one comparing the educational experiences students get at academic medical centers, affiliated residency sites, and community-based practices. ClinEdDoc also allowed students to track their own learning and identify any gaps.
Today, DMS students use a different computer-based tool, the Dartmouth Medical Encounter Documentation System (DMEDS). They can record a wide variety of data on DMEDS, from diagnoses they encounter to communication skills they employ.
Better ways of assessing medical education are essential "unless we plan to go to five-year programs," says Carney. "We have got to figure out what older parts of the curriculum don't need to stay, because medicine is evolving. The science of health care, not just medicine but the whole shooting match, is so complicated that we have got to figure out what belongs and what doesn't."