Photo by Jon Gilbert Fox
DMS faculty member Kendall Hoyt has written a book about the connections between vaccine development and national defense.
Defense mechanism
Our national defense is in jeopardy, argues DMS faculty member Kendall Hoyt, due to a decline in the success of the U.S. vaccine industry. During the 20th century, many diseases—from polio to smallpox—were felled by vaccines. But in the 21st, we've been unable to develop effective agents against either natural scourges like SARS or bioweapons like anthrax.
By Amos Esty
Despite more than 20 years of research and a $1-billion investment, the United States has been unable to develop a new anthrax vaccine. That failure, says Kendall Hoyt, Ph.D., a historian of science and technology and an assistant professor of medicine at DMS, reflects profound problems with the nation's strategy for developing vaccines—problems that leave the country vulnerable to attack with biological weapons.
In her forthcoming book Long Shot: Vaccines for National Defense (to be published in February 2012 by Harvard University Press), Hoyt examines those failings and contrasts them to U.S. vaccine successes of the mid-20th century. Despite the increased attention to bioterrorism since 9/11 and the billions of dollars spent over the past decade, the U.S. remains unprepared for a biological attack, Hoyt says. Part of the problem, she argues, is a misdirected approach to vaccine development. She talks here with Dartmouth Medicine about her research and about what the U.S. can do to be better prepared for the possibility of biological attack.
Amos Esty is the managing editor of Dartmouth Medicine magazine. The cover image of Long Shot, by Kendall Hoyt, appears courtesy of Harvard University Press. Copyright © 2012 by the President and Fellows of Harvard College.
How did you become interested in the subject of vaccine development and national defense?
In 1999, I was sitting in the back of a bus in Japan reading about Aum Shinrikyo [a cult that in 1995 released a nerve gas called sarin in the Tokyo subway system, killing 12 people and injuring about 5,000]. What I didn't realize was that they had also attempted several biological attacks. While those were unsuccessful, I was horrified by how easily things could have turned out the other way.
Today, there are fewer obstacles to the spread of biological weapons. The technological and economic barriers are lower, and the technical skills are proliferating. You don't have to be a state to develop and use biological weapons. You just need to be a person with a lab, some training, and a grudge, and there are plenty of those.
Did the terrorist attacks of 9/11 cause you to change your thoughts on the subject?
The anthrax attacks in September and October of 2001 strengthened my conviction that biological weapons were a profound threat. By then, I'd done enough research to realize that vaccine innovation rates had been falling at the same time bioterrorism threats were rising. Naturally occurring infectious disease threats were also rising, due to climate change, shifting population patterns, expanding travel and trade, and misuse of antibiotics.
As a nation, we have misplaced confidence that the medicines and vaccines that we need will be there when we need them. Our capacity for vaccine innovation and development is critical to national security and global health. How do we get the medicines we need when we need them? This is the subject of my book.
You mentioned climate change. So in terms of national defense, it's not just about responding to biowarfare, but also about pandemics and infectious diseases?
Absolutely. SARS is a good example. Epidemiological forecasting and stockpiling will not prepare a population for something like SARS. Threats of this nature could be an emerging or reemerging infectious disease, or a genetically engineered, "weaponized" one. We need to think hard about how we can respond to all of these threats.
Image courtesy of the National Library of Medicine
Thanks to military scientists, who isolated the causative agent of rubella in the 1960s, these two NIH researchers—Harry Meyer, left, and Paul Parkman—were able to develop the first vaccine against the disease, also known as German measles.
What were some of the big successes in vaccine development during the middle of the 20th century?
There were some huge ones. The ability to cultivate viruses in eggs was one important advance. That led to the first flu vaccines in the 1940s and to more effective yellow fever vaccines. The ability to cultivate viruses in kidney cells permitted polio vaccines. We also saw the rapid-fire development of vaccines for measles, mumps, and rubella in the 1950s and 1960s. There have been significant successes in recent decades, but not at the same rate.
What made vaccine development so successful back then?
It had a lot to do with the way research was conducted. It used to be that the Walter Reed Army Institute of Research was a, if not the, center of excellence for infectious diseases and vaccine development. After the Vietnam War, the focus shifted to the National Institutes of Health (NIH). This was a bigger transition than might meet the eye. At places like Walter Reed, the scientists had highly practical, interdisciplinary training that focused on product development. Their training took them out to the field and then brought them back to the lab. They had fellowships that took them onto the production lines of private-sector vaccine manufacturers. They understood upstream and downstream production requirements, which created systemic efficiencies. They understood what industry needed, so the potential vaccines they delivered to industry could be developed quickly.
When talent and funding shifted to the NIH, researchers received a different kind of training and worked under a different set of incentives. Their focus was on publications, not products. And they were highly specialized. NIH researchers excelled at discovery, but they were weaker at late-stage development and less well adapted to the translational challenges of vaccine development. So handoffs from the lab to industry became more difficult, and the licensing of new vaccines slowed down.
Another factor you discuss in your book is close cooperation between government leaders and industry leaders in the mid-20th century. How did that happen?
That was a legacy of World War II. During the war, government and industry leaders came together to deal with the national emergency. This effort was very successful, and many of the collaborative practices were sustained well into the postwar years.
There was also a strong sense of public duty. When Walter Reed officials approached Maurice Hilleman at Merck and asked him to develop a vaccine against meningitis, a bacterial infection, it was the last thing he wanted to do, because he'd always worked on virus vaccines and knew little about bacteria. In addition, its commercial prospects were poor. But he thought, "If I don't do this, it will be a tremendous disloyalty to the military." At no point did he think about what the market would be, which seems crazy.
In addition, he had the unilateral authority to say, "Okay, I'll do it," which is also crazy. These are things that don't happen anymore. Large biopharmaceutical companies rarely make research and development investment decisions in that way. As a result, they develop fewer vaccines of global health and national security importance.
You write that traditional economic models of innovation don't seem to apply to vaccine development. Can you explain what you mean by that?
Industrial historians tend to view innovation as a function of technological opportunity, economic incentive, and the capabilities of companies in the industry. You would expect innovation rates to track with those three factors. But they do not explain innovation patterns in the vaccine industry. All three were relatively weak when innovation was peaking at mid-century, and all three had improved by the end of the century, when innovation was falling. So there was almost an inverse relationship.
When I present these findings, the first thing people usually say is: "Well, maybe they picked all the low-hanging fruit in the '40s and the '50s, and only the 'hard' vaccines remain." Usually they have in mind the elusive HIV vaccine. But the vaccines developed in the '40s and '50s were not "low-hanging fruit" at the time they were developed. The pneumococcal vaccine, which was licensed in 1948, was the first use of capsular polysaccharides to generate immunity. That was a radical innovation by any standard. And no one would accuse Thomas Francis, who was instrumental in developing the first flu vaccine, of picking low-hanging fruit. Or Jonas Salk or Albert Sabin—nobody would accuse them of picking low-hanging fruit when they developed the polio vaccine. So it's a question of perspective.
By contrast, you can argue that a next-generation anthrax vaccine is the very definition of low-hanging fruit. It is technologically feasible, it has been a top priority with political backing, and it has been well funded. But 20 years and a billion dollars later, we still don't have this vaccine.
When I saw these trends, I decided to delve into the history of individual vaccine development to understand what drove innovation. I found that falling innovation rates have less to do with dwindling scientific and economic opportunities and more to do with research practices—specifically, the way we organize, fund, and train researchers.
You note that in the middle of the 20th century, a lot of the scientific knowledge needed to create new vaccines was available—it was just a matter of organizing research in a way that would lead to vaccine development. Is it still true that organization rather than knowledge is the problem?
Often it is. There is the so-called "valley of death" concept—the idea that there are vaccines languishing in the pipeline not for scientific or technological reasons but for financial and organizational reasons. That's because the cost of development rises in the later stages of the process. Often, smaller biotechnology companies do not have the resources or capabilities to push through to full-scale production. There are feasible vaccines that are not being developed today for reasons that have nothing to do with science or technology.
So if part of the problem is the way vaccine research is organized today, what can be done about it?
The single most important thing we can do is to reintroduce integrated research practices. "Integrated" is a catch-all term for research that is directed from the top down, coordinated across disciplines and developmental phases, and situated in a community that facilitates information transfer.
But this is easier said than done. We need the right type of governance, training, and leadership. Rather than attempting wholesale reform, I think it makes sense to start with small-scale programs to incubate new approaches. It is increasingly possible to support integrated R&D through a growing array of public-private product development partnerships [PDPs]. For example, Aeras—a PDP for TB vaccine development—incorporates many of these integrated research practices.
What are some of the disincentives that are making pharmaceutical companies reluctant to work on vaccine development?
Some of the disincentives are inherent in the concept of vaccines. If humans are the only reservoir for a disease and you develop an effective vaccine, you will ideally eliminate your own market.
The structure of the market is also a factor. We used to have dozens of stand-alone vaccine development companies, but in the '80s and '90s there was large-scale consolidation of the industry. Now vaccine developers have merged with large pharmaceutical companies, and when those companies make collective decisions about where to invest R&D dollars, new vaccine projects are in competition with more profitable alternatives, like Lipitor.
Today, only small and inexperienced firms are willing to accept government contracts to develop vaccines that may have high social but low commercial value. To fulfill the contracts, these firms often must outsource to other companies in order to amass the full range of skills and resources required. As a result, we see more contract renegotiations, longer development times, higher costs, and higher failure rates.
After 9/11, Merck was in discussion with the federal government about developing a next-generation anthrax vaccine. If the government had granted them the right contract price, one that took into account the opportunity costs of postponing the development of more commercially viable drugs, we would have that vaccine by now. But they didn't. So here we are. That's the difference between working with a large, integrated firm that can push something across the finish line and working with smaller firms.
Image © Bettmann / Corbis
Close cooperation between government and industry was key to mid-century vaccine successes, according to Hoyt. Here, in 1956, a drug company employee labels vials of polio vaccine slated for shipment to areas designated by the government.
So despite all the money spent since 9/11 on bioterror and national defense generally, the government has sort of nickel-and-dimed the vaccine effort?
Yes and no. The federal government has thrown a lot of money at the vaccine effort. But they haven't stopped to think about how to organize the development process. Rather than paying up front to engage a large, experienced manufacturer, they've opted for lower-cost contracts with smaller companies that have lower overhead and less experience. But there are a lot of hidden costs to this approach.
A recent article in the New York Times Magazine highlighted some of the problems with the United States' attempt to stockpile large reserves of vaccines that might be needed in the event of a bioterror attack. But you argue that, for the most part, it doesn't make sense to stockpile vaccines. Why is that?
First, stockpiling makes sense only in a limited number of cases. And remember that we're talking about vaccines, not therapeutics, which would be a different strategy. Smallpox and anthrax vaccines can be used in post-exposure scenarios, so there is a reason to stockpile them.
But beyond that, it doesn't make sense. The number of potential threats far exceeds our drug-development resources. The low-end estimate to develop a single drug is $800 million, and the process can take 8 to 10 years. We have neither the resources nor the time to develop a vaccine to respond to every threat.
Second, we're probably going to be wrong. If you look at the history of our attempts to predict biological threats, it's lousy. We prepared for botulinum toxin during World War II because we thought the Germans were going to load V-1 bombs with the toxin. They didn't. We prepared for the swine flu in 1976. That didn't happen. We vaccinated for anthrax during the first Gulf War. That didn't happen. We prepared for smallpox during the second Gulf War. That didn't happen.
So we're often wrong. And we're often unprepared for the things that do happen, like SARS. We just don't have the best track record with this approach.
I think the wiser approach is to assume that there are going to be threats that we didn't see coming and then figure out how to catch up to them after the fact. That does not mean that we will have real-time development capabilities. That's unrealistic. But what we can do is take much of this money that would have been invested in stockpiling and invest it instead in building research tools that have broad application for all kinds of drug development.
We are focused today on product innovation—often to the exclusion of process innovation.
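Hoyt's point about scale can be made concrete with a little arithmetic. The sketch below is purely illustrative: the count of 20 threat agents is a hypothetical number chosen for the example, and the cost and timeline are simply the low-end figures she cites above ($800 million and 8 to 10 years per product).

```python
# Back-of-envelope sketch: why "one vaccine per predicted threat" does not scale.
# Illustrative only. The threat count is hypothetical; the per-product cost and
# timeline are the low-end figures Hoyt cites ($800 million, 8 to 10 years).

threat_agents = 20                # hypothetical number of distinct threats to cover
cost_per_vaccine = 800_000_000    # low-end development cost per product, in dollars
years_per_vaccine = (8, 10)       # typical development timeline per product, in years

total_cost = threat_agents * cost_per_vaccine
print(f"Developing one vaccine per threat: about ${total_cost / 1e9:.0f} billion,")
print(f"with roughly {years_per_vaccine[0]}-{years_per_vaccine[1]} years of work per product,")
print("before counting the threats no one predicted.")
```

Even under these deliberately generous assumptions, a vaccine-per-threat strategy runs into the tens of billions of dollars and decades of development time, which is the logic behind Hoyt's call to invest instead in research tools with broad application.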
So in most cases, the best the government can do is prepare to react to a threat after it develops? You're saying that we can't realistically prepare for everything?
Right. These threats are very, very specific. It's hard to have a biomedical silver bullet.
Is the reason a strategy hasn't been developed thus far that it's hard to accept you can't prepare for everything?
I think there's a little bit of that. We started to stockpile because it made sense for smallpox and anthrax. And then stockpiling became the broader strategy because that was what we knew how to do. It's easy to get trapped in that mindset. But before you know it, you've thrown a lot of money at something you're probably never going to use, when you could have been investing in a system that has a broad application.
Are changes in the public's attitude toward vaccination a factor in the development of vaccines for national defense?
The shift in attitudes presents real challenges to the effective administration of vaccines. We live in a time when infectious disease is a less obvious threat to the man on the street. Soldiers going into World War II grew up without the benefit of penicillin and most childhood vaccines. They also grew up in the middle of the 1918 flu pandemic, so the idea of a flu vaccine was downright miraculous. Nowadays, the side effects of vaccines are often more highly publicized than the collective benefits of the vaccine itself.
So in other words, as you note in your book, the problem is that when a vaccine works, nothing happens?
Right. How do you get money for such programs? There's not a strong political constituency for vaccines the way there is for weapons systems. Vaccines are totally unremarkable when they're doing what they're supposed to be doing. The only time they get press is when they mess up.
There have been a lot of advances in biology and scientific technology since the middle of the 20th century. Has vaccine development failed to take advantage of those changes?
I wouldn't say that. We started developing recombinant vaccines in the '80s. I think hepatitis B was the first one. That represented a significant advance, because now we could engineer the antigen apart from the pathogen, which makes it much safer.
So perhaps scientists just haven't figured out yet how to harness all the developments in molecular biology for other vaccines.
Right, that's the key point. The science and technology base has exploded since the 1970s.
But it has become much more difficult to consolidate and apply the knowledge that's out there. That is the real issue. So not only is the challenge greater than it ever has been, but our ability to integrate and translate research has diminished.
A prepublication excerpt from Long Shot
Image courtesy of National Institutes of Health
These soldiers in a U.S. Army hospital in Aix-les-Bains, France, were fighting influenza rather than enemy troops.
War and disease have gone hand in hand for centuries. As one historian has observed: "More than one great war has been won or lost not by military genius or ineptitude, but simply because the pestilence of war—from smallpox and typhoid to cholera, syphilis, diphtheria, and other scourges—reached the losers before they infected the winners."
Training camps and battlegrounds magnify the spread and severity of disease. They bring men from different geographical regions into close contact with one another. These men are often physically stressed, or wounded, and disease spreads easily. Prior to World War II, soldiers died more often of disease than battle injuries. The ratio of disease to battle casualties was two to one in the Civil War and approximately five to one during the Spanish-American War. Severe losses, from typhoid fever in particular, inspired the U.S. Army to sponsor the research of Major Frederick Russell, who succeeded in developing an effective typhoid fever vaccine for the military in 1909.
This passage is excerpted from Long Shot: Vaccines for National Defense, by Kendall Hoyt, to be published in February 2012 by Harvard University Press. Copyright © 2012 by the President and Fellows of Harvard College. Used by permission. All rights reserved.
Improved sanitation measures lessened disease casualties in World War I, but failed to protect troops from the 1918 influenza pandemic. Military populations were particularly hard hit. According to some estimates, influenza accounted for nearly 80% of the war casualties suffered by the U.S. Army during World War I.

The 1918 influenza pandemic first appeared in the United States at an army training camp at Fort Riley, Kansas, in March, sickening hundreds. By September, the flu spread to Camp Devens, Massachusetts. By October, it crossed the country, infecting recruits at the University of Washington Naval Training Station. Troop movements facilitated global transmission, contributing to three near simultaneous outbreaks in the port cities of Boston; Brest, France; and Freetown, Sierra Leone, in that same year. The flu spread rapidly from these port cities, claiming approximately 50 million lives worldwide.
Thomas Francis, Jr., professor at the New York University College of Medicine and chairman of a commission that coordinated research on the influenza vaccine during World War II, feared that another war would generate the epidemiologic conditions for another pandemic. He remarked: "The appalling pandemic of 1918 in the last months of the exhausting conflict of World War I, with massive mobilization of armies and upheaval of civilian populations, has irrevocably linked those two catastrophes. It demonstrated that virulent influenza may be more devastating to human life than war itself."
If you'd like to offer feedback about this article, we'd welcome getting your comments at DartMed@Dartmouth.edu.
This article may not be reproduced or reposted without permission. To inquire about permission, contact DartMed@Dartmouth.edu.