As the race for a COVID-19 vaccine continues, the stakes have never been higher; the pace of innovation, never faster.
The FDA recently announced new guidance for emergency use authorization of vaccines, a process that is faster than traditional approval but still requires data demonstrating safety and effectiveness. If a vaccine meets these criteria before the end of the year, it will be completely unprecedented. Yet there's every reason to believe it's possible.
Vaccines are among the most important weapons in the public health arsenal. Like the history of clinical research, their development follows a complicated trajectory that reminds us of just how far we’ve advanced in safety and informed consent.
6 Milestones in Vaccine Research
Early Attempts At A Smallpox Vaccine
In some ways we "owe" vaccination to one particular viral scourge: smallpox. Caused by the variola virus, smallpox was highly contagious and had a mortality rate of about 30%. Many who survived the disease were left with permanent scars, or pockmarks, especially on their faces. Others were left blind. The heavy pale makeup you often see in Medieval and Renaissance paintings was designed to hide smallpox scars.
As horrible as smallpox was, people typically only suffered from it once in a lifetime.
The idea of giving somebody a mild case of smallpox so they would develop immunity appears to have first been put into practical use in China in the 1500s. Smallpox scabs were dried up and blown into the nostrils of children.
This was not vaccination as we know it today, but an older technique called variolation — a controlled exposure to the disease-causing agent. It's also called inoculation, a term we still use to describe vaccines.
Jenner and the Milkmaids
In the 18th century, the idea of variolation spread to Europe, although Europeans preferred to make a puncture in the skin. Sometimes it worked. Sometimes it killed the patient. Sometimes it even caused an outbreak. But the death rate was under 2%, and many considered this a reasonable risk to take with themselves and their children.
Enter Edward Jenner. Jenner was not the only person to notice that milkmaids seldom caught smallpox, but he was the one who took action. It turned out that milkmaids tended to catch a zoonotic disease called cowpox from their charges, and cowpox produced much milder symptoms. Cavalry officers enjoyed similar protection after catching a related disease known as horsepox from their mounts. It seemed that the diseases were similar enough that catching one gave you immunity to the other.
Informed consent was not a widely practiced concept in the 18th century, and Jenner intentionally exposed a child to cowpox and then tried to infect him with smallpox. It wasn't until 1974 that proper protection for human subjects was enshrined into law in the United States.
Jenner's approach was clearly unethical, but it worked, and intentional cowpox infection became the standard for protecting people from smallpox. By 1801, 100,000 people had been vaccinated. Initially, vaccination was performed by having a cowpox victim donate material from their pustules, but later the vaccine was grown in cows.
This is also the origin of the word vaccine, which comes from the Latin “vacca” for cow.
Further development and global distribution of smallpox vaccines continued for another century and a half. The World Health Organization led the final eradication campaign, and the disease was declared eradicated in 1980.
The Arrival of Live Attenuated Vaccines
With cowpox and smallpox, we got oddly lucky. The majority of diseases don't have a convenient animal analog we can infect ourselves with. For a long time, smallpox remained the only common disease against which people were vaccinated.
Then came Louis Pasteur, the scientist who developed pasteurization.
Pasteur started with chicken cholera and anthrax, and the idea was to weaken, but not kill, the pathogen by exposing it to oxygen and heat. The weakened pathogen would trigger an immune response while causing mild or no symptoms.
Live attenuated vaccines are still used today, although they can cause illness in people with weaker immune systems. For this reason, the research community moved on to killing the pathogen entirely. However, for some diseases, the greater effectiveness of live attenuated vaccines is worth the risk. Clinical researchers need to take all of this into account.
The Live vs. Inactivated Polio Vaccine
One quiet triumph of 2020, buried under all the bad news about COVID-19, was the final elimination of wild polio from Africa. Polio is a debilitating disease that often leaves survivors disabled for life. It's caused by one of three serotypes of poliovirus, and while most sufferers experience only mild symptoms, 1 to 2% develop paralysis. US President Franklin Delano Roosevelt is perhaps the most famous victim of paralytic polio. A vaccine against polio was considered a major goal in the 1940s, and the first to achieve it was Jonas Salk, who used a killed virus.
In 1954, however, something unprecedented happened. One million children were enrolled in a clinical trial. This was the first vaccine trial to implement what we now consider the gold standard: the double-blind, placebo-controlled trial. The vaccine was approved the following year; however, more than 250 new polio cases were traced back to faulty batches of the Salk vaccine. The Cutter incident, as it became known, resulted in tightened safety precautions, and harkened back to the days when the FDA's primary role was to prevent adulteration of drugs. Since then, the Salk vaccine has not caused any cases of polio.
In 1962, Albert Sabin produced another polio vaccine, this one using a live attenuated virus that, in rare cases, could cause the disease itself. But it had key advantages: it was an oral vaccine that could be stored at room temperature, and it was cheap and easy to produce. Sabin donated it to developing countries and, despite the small risk of paralysis, it was instrumental in the gradual eradication of polio. In the US, however, the Sabin vaccine was discontinued in 2000, and now only the Salk vaccine is used.
Swine Flu and Guillain-Barré Syndrome
In 1976, a new strain of swine flu began to spread, raising fears of a pandemic. Health care officials sprang into action, and over 40 million people ultimately received a vaccine.
However, researchers later discovered an increased risk for Guillain-Barré Syndrome (GBS), an immunological condition that damages nerve cells and causes muscle weakness and potential paralysis, among some people who had received this influenza vaccine. Although the risk was low (about 1 in 100,000 recipients), it was enough to stop the program.
Since then, vaccines have been carefully monitored for side effects. Several studies have demonstrated that while the flu shot carries a small risk of GBS, it amounts to about 1 to 2 additional cases per million vaccine doses. GBS is more likely to follow the flu itself than the vaccine, but the Centers for Disease Control and Prevention continue to monitor flu vaccines for any potential side effects. This kind of surveillance is a vital part of ensuring the safety of any new vaccine as well. Vaccines have to be held to particularly high standards because they are given to so many healthy people.
Development of a Vaccine Schedule
Physicians began recommending some childhood vaccines, including diphtheria, tetanus and pertussis (DTaP), as early as the 1940s. However, it wasn't until 1995 that the medical community officially endorsed a vaccine schedule for children beginning at infancy. The Advisory Committee on Immunization Practices, the American Academy of Pediatrics and the American Academy of Family Physicians approved these guidelines and continue to review them annually.
The current schedule, updated in 2014, recommends a dozen different vaccinations, including the DTaP and another combined vaccine for measles, mumps and rubella (MMR).
COVID-19 and the Way Forward
As of October, six companies are frontrunners in the race for a COVID-19 vaccine.
All are part of Operation Warp Speed, a federal initiative to deliver a vaccine in record time.
Prior to 2020, the record timeline for vaccine development was four years for the mumps vaccine.
It's possible that one of the vaccines for COVID-19 will be a messenger RNA (mRNA) vaccine, which would be a first. Rather than containing viral proteins themselves, mRNA vaccines deliver genetic instructions that teach the body's own cells to produce a protein from the virus, triggering an immune response. Moderna and Pfizer both have mRNA vaccines in Phase 3 clinical trials. Data on the effectiveness of these vaccines likely won't be available until the end of the year.
Other major players in the development of a COVID-19 vaccine use live attenuated viruses, inactivated vaccines and another technique called viral vector vaccines.
In viral vector vaccines, a different, harmless virus is reengineered to carry genetic material for a SARS-CoV-2 protein, prompting the immune system to mount a defense against the real thing. The Oxford vaccine, for example, uses a chimpanzee adenovirus (a chimp cold virus) as its vector. Viral vector vaccines echo all the way back to Jenner's use of a different virus to trigger immunity.
It could be several more months before the FDA grants emergency use authorization to a COVID-19 vaccine. Assuming that authorization is granted soon, widespread manufacturing and distribution will take time. Moncef Slaoui, chief adviser to the Trump administration's Operation Warp Speed program, said in a recent NPR interview that the companies are "in the process of stockpiling vaccine doses in the single-digit million doses," and he expects there will be 30 million doses of each of the two mRNA vaccines by January.
What We’ve Learned From The History of Vaccines
Researchers have learned important lessons from the history of vaccines. Regulators have developed higher standards for protecting patients while proving vaccine safety and effectiveness.
Any vaccine approved for COVID-19 has to go through three phases of clinical trials and pass several milestones, including inspection of the manufacturing facility (to avoid another Cutter incident) and review of product labeling. Once approved, it will be monitored through the Vaccine Adverse Event Reporting System (VAERS), which collects and analyzes reports of side effects to make sure nothing was missed during trials.
The clinical research community is under enormous pressure to achieve the next great milestone in the history of vaccines. Yet we also have a responsibility to ensure it happens responsibly, with the utmost concern for informed consent, safety and any potential side effects. History will, in the end, judge the results.
Want a closer look at the history of clinical research? Check out our timeline and download a printable poster.