In 1734, reflecting on his earlier visit to England, Voltaire described the widespread use there of smallpox inoculation – infection with a low dose of smallpox virus, intended to reduce the future risk of smallpox disease – a practice that was much less popular in France at the time. Of this relatively risky precursor of much safer modern vaccines, Voltaire wrote:
“It is inadvertently affirmed in the Christian countries of Europe that the English are fools and madmen. Fools, because they give their children the small-pox to prevent their catching it; and madmen, because they wantonly communicate a certain and dreadful distemper to their children, merely to prevent an uncertain evil. The English, on the other side, call the rest of the Europeans cowardly and unnatural. Cowardly, because they are afraid of putting their children to a little pain; unnatural, because they expose them to die one time or other of the small-pox.”
In the intervening three centuries, enthusiasm for intentional infection has waxed and waned. Faced with the current coronavirus pandemic, some scholars and activists have advocated intentional infection as a contribution to ending this global crisis. One response to this seemingly paradoxical idea, perhaps rooted in a deep disgust towards infection itself or towards the idea of infecting others, is to decry such proposals as foolishness or madness – just as, according to Voltaire, continental Europeans responded to smallpox inoculation. In reply, advocates point out that if there is a reasonable expectation that intentional infection could curb the epidemic, it would be cowardly and unnatural not to try it.
At the French Royal Academy of Sciences in 1760, the mathematician Daniel Bernoulli demonstrated that smallpox inoculation of the population would lead to public health benefits beyond the summation of individual benefits – one of the first descriptions of what is now referred to as herd immunity. As Bernoulli’s analysis was not translated into public policy until much later, smallpox epidemics continued – claiming the life of King Louis XV in 1774, among many others. The intellectual work of Voltaire, Bernoulli, and other advocates of inoculation might be classed within several branches of science or, indeed, within natural philosophy, since their work reflected not only empirical projects but also efforts to refine concepts of infection and to convince others to change their views. Such philosophical and scientific efforts continue today, and several controversies persist.
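Bernoulli’s original memoir used life-table arithmetic, but the core insight – that immunising some people indirectly protects others – can be illustrated with a deliberately minimal epidemic model. The sketch below (in Python, with every parameter chosen purely for illustration rather than taken from Bernoulli or from any real outbreak) compares the final size of an epidemic in a fully susceptible population with one in which a fraction has been successfully “inoculated”: the number of infections averted exceeds the number of people immunised, which is the herd effect Bernoulli anticipated.

```python
# A minimal SIR sketch of herd immunity, for illustration only; this is not
# Bernoulli's 1760 life-table analysis, and every parameter below is assumed.

def final_epidemic_size(r0, immune_fraction, n=1_000_000, dt=0.01, t_max=1_000):
    """Integrate a simple SIR model and return the total number ever infected."""
    gamma = 0.1                              # assumed recovery rate (per day)
    beta = r0 * gamma                        # transmission rate implied by R0
    s = n * (1 - immune_fraction) - 1        # susceptible
    i = 1.0                                  # infectious (a single seed case)
    r = n * immune_fraction                  # already immune ("inoculated")
    for _ in range(int(t_max / dt)):
        new_inf = beta * s * i / n * dt
        new_rec = gamma * i * dt
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return n * (1 - immune_fraction) - s     # everyone who was ever infected

n, r0, covered = 1_000_000, 2.5, 0.3         # assumed population, R0, coverage
baseline = final_epidemic_size(r0, 0.0, n)
with_immunity = final_epidemic_size(r0, covered, n)
averted = baseline - with_immunity
print(f"No immunity:     {baseline:,.0f} infections")
print(f"{covered:.0%} inoculated:  {with_immunity:,.0f} infections")
print(f"Averted:         {averted:,.0f} (vs {covered * n:,.0f} people inoculated)")
```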
Infecting people with the virus that causes Covid-19 could help end the pandemic in two ways. First, as a public health practice: if it were possible to infect or “inoculate” consenting healthy individuals with a low dose or a low-virulence (i.e., less risky) strain of the virus – a crude form of vaccination producing at least some degree of immunity – those people would be less likely to become infected and to transmit the virus to others. Second, as a research practice: human infection challenge studies, involving the intentional infection of research participants, could be used to learn about this coronavirus (or related viruses) and ultimately to help test interventions, especially vaccines.
Inoculation: let a thousand flowers bloom?
Let us consider inoculation before moving on to challenge studies below. The word inoculation came to public health from botany, where it refers to the process of grafting a bud or shoot from one type of plant onto another. In botany, it is hoped that the new graft will thrive, producing a chimeric plant in which the roots of one plant feed the leaves and flowers of the graft. In public health, the intention is the opposite – it is hoped that the inoculated microbe (analogous to the bud or shoot) will be extinguished by the immune system, which will then develop an immunological memory that helps to prevent similar microbes from causing disease in the inoculated individual in the future. It is also hoped that the inoculation itself will not cause disease. The promotion of practices akin to coronavirus inoculation via “pox parties” has been banned from social media, perhaps because of fears of fuelling the anti-vaccine movement – yet some experts have argued that the public health benefits of inoculating healthy young people should be taken seriously, and at least studied in the first instance, especially while no vaccine is available.
Smallpox inoculation, also known as variolation, had been practised for millennia before it became fashionable in England in the early eighteenth century, thanks largely to the efforts of Lady Mary Wortley Montagu. A brilliant and spirited individual, Lady Wortley Montagu had lost a brother to smallpox and been badly affected by the disease herself early in life. On moving to Turkey, she was therefore excited to discover the well-established practice of inoculation, which she described in a letter of 1717 as occurring at smallpox “parties” where a wise older woman would come “with a nut-shell full of the matter of the best sort of small-pox, and asks what vein you please to have opened. She immediately rips open that you offer to her, with a large needle (which gives you no more pain than a common scratch) and puts into the vein as much matter as can lie upon the head of her needle”. Such a practice was not without risk, and often resulted in several days of fever – in rare cases it led to a serious case of smallpox, which could be fatal. If all went well, there were few or no visible signs of the infection, such as the pocks or scars caused by the disease. Wortley Montagu had her son inoculated successfully, and subsequently petitioned the British royal family, via her friend Caroline, Princess of Wales, to promote inoculation in England.
In a mirror of some controversial examples of twentieth-century biomedical research, inoculation was first tested in England on prisoners – inmates on death row at Newgate Prison. In exchange for serving as test subjects for smallpox inoculation, six prisoners were offered their freedom; all survived the procedure and were duly freed. Again mirroring controversial research in the twentieth century, inoculation was next tested on orphans, and in these children too it appeared safe and successful. Finally, inoculation was to be tried on children of the Royal Family. Sir Hans Sloane, president of the Royal College of Physicians (also frequently credited as the inventor of chocolate milk), opined that the expected benefits outweighed the risks, and the operations were completed on two young princesses without harm.
Soon, even in England, controversy struck. Some members of the clergy objected that inoculation opposed the will of God. Some physicians were uncertain that inoculation protected against smallpox and worried that inoculated individuals might actually spread infection to others (all concerns that might be raised in the current pandemic, caused by a very different virus). Lady Wortley Montagu also implied that certain physicians feared losing the considerable income they gained from treating patients with smallpox, although we will never know which motivations predominated in the minds of particular critics. In rare cases individuals did die from inoculation, but the rarity of this outcome made it far less deadly than naturally acquired smallpox, which – as a result of herd immunity – became less common in the population as a whole as inoculation became more widespread. Voltaire praised Lady Wortley Montagu and attempted to promote inoculation in France, citing benefits including the prevention of death and the preservation of beauty (by avoiding the disfiguring scars of smallpox).
From inoculation to challenge studies
All this predated vaccination for smallpox by the best part of a century. In a now familiar narrative, the discovery that vaccination with the milder cowpox virus (the vache in vaccination) prevented smallpox is generally credited to Edward Jenner. Although Jenner was by no means the first to realise that cowpox infection protected against smallpox, he subjected this phenomenon to more systematic testing, building on proto-experiments conducted by local farmers and physicians. Jenner deliberately infected individuals with cowpox and then “challenged” them with smallpox – that is, exposed them to infection to see whether the prior infection with cowpox would protect against smallpox. He published the findings of what might now be called a “human challenge study” in 1798. His attempts to promote vaccination as a safer alternative to smallpox inoculation were again initially met with disbelief, distrust, and perhaps vested interests, but eventually he prevailed. Smallpox was ultimately eradicated by global vaccination campaigns – but the smallpox vaccine carried much higher risks than modern vaccines, including a relatively significant risk of causing disease; these risks (as well as compulsory vaccination laws) arguably led to the birth of the anti-vaccination movement in the late nineteenth century. Vaccination and intentional infection as public health practices, and controversies related to them, have never quite gone away. Many modern vaccines, such as the live oral polio vaccine, are somewhere in between, because they involve infecting people with weakened or attenuated live viruses that occasionally cause the disease they are designed to prevent.
Meanwhile, human infection challenge studies – research involving the intentional infection of participants in order to learn about infectious diseases and test vaccines – have been associated with great successes but also infamous cases. Successful challenge studies of malaria, yellow fever, and leishmaniasis around the turn of the twentieth century, for example, changed the course of public health by providing evidence that these diseases were transmitted not, as alternative views at the time held, by bad air, dirty clothing, contact with sick patients, or immoral behaviour, but by the bite of bloodsucking insect vectors. Patrick Manson, a famous physician who infected his own medical student son with malaria to provide further proof of the mosquito-malaria theory, emphasised that this was a Copernican moment in microbiology at which the general public baulked, perhaps, among other reasons, because of “a disinclination to admit that a pathological puzzle of so many centuries standing could be given so simple an explanation”. This revolution saved many lives through preventive efforts focused on insect-control methods such as mosquito nets.
The early yellow fever studies in Cuba, generally attributed to the American Major Walter Reed, not only helped to show how yellow fever could be prevented but also laid the foundations for modern informed consent documents. These were regarded as extraordinarily high-risk studies because severe yellow fever carries a significant probability of death. The studies are nonetheless widely considered ethically acceptable, in part because of careful consent, but also because participants were likely to be infected with yellow fever in daily life and – on some views, including those of the authors of the Nuremberg Code – because researchers also volunteered to be infected. However, later yellow fever studies at the same centre were eventually disbanded after fatal cases occurred among study participants (including healthcare workers) despite great care being taken to reduce the likelihood of such outcomes.
The rise of modern research ethics has been partly in response to scientific scandals, some of which have involved infectious diseases. Among the most infamous were studies conducted by wartime military research units of Germany and Japan between 1939 and 1945. These experiments involved sadistic or murderous treatment of prisoners and frequently lacked scientific rigour. Yet it does not follow that all research involving intentional infection is necessarily scandalous – and tens of thousands of consenting volunteers have been exposed to a wide range of infections over the last five decades under the supervision of careful clinician researchers and the oversight of ethics committees. So, would it be ethical to conduct coronavirus challenge studies?
Coronavirus challenges
Some of the debates about using challenge studies to test Covid-19 vaccines seem curiously similar to Voltaire’s description of the debates over smallpox inoculation in the eighteenth century. Proponents of challenge studies argue that it would be a mistake to let thousands die while standard trials of multiple vaccines take many months or years, when challenge studies would be a more expedient way to sift the most effective vaccines from among the available candidates. In contrast, opponents of challenge studies argue that exposing volunteers to infection would be too risky or uncertain.
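To see why proponents describe challenge designs as expedient, consider a back-of-the-envelope comparison of how quickly the two kinds of trial accumulate informative infections. Every number in the sketch below – trial sizes, attack rates, vaccine efficacy – is an assumption chosen for illustration, not a figure from any actual Covid-19 trial.

```python
# Illustrative comparison of expected infections ("events") in a field efficacy
# trial versus a challenge trial. All parameter values below are assumptions.

def expected_events(participants, control_attack_rate, vaccine_efficacy):
    """Expected infections across both arms of a 1:1 randomised trial."""
    per_arm = participants / 2
    control_events = per_arm * control_attack_rate
    vaccine_events = per_arm * control_attack_rate * (1 - vaccine_efficacy)
    return control_events + vaccine_events

# Field trial: tens of thousands of volunteers wait for natural exposure,
# with an assumed ~1% attack rate over months of follow-up.
field = expected_events(participants=30_000, control_attack_rate=0.01,
                        vaccine_efficacy=0.6)

# Challenge trial: a hundred volunteers, but exposure is deliberate and
# standardised, so nearly all unvaccinated participants are infected (assumed).
challenge = expected_events(participants=100, control_attack_rate=0.95,
                            vaccine_efficacy=0.6)

print(f"Field trial, 30,000 volunteers:  ~{field:.0f} expected infections (over months)")
print(f"Challenge trial, 100 volunteers: ~{challenge:.0f} expected infections (within weeks)")
```

On these assumed numbers, a challenge trial generates dozens of informative infections from roughly 300 times fewer participants, and without waiting for community transmission – which is the crux of the expediency argument; whether those gains justify the risks is precisely what opponents dispute.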
Uncertainties about this novel virus are sometimes juxtaposed with supposedly secure scientific knowledge regarding more familiar diseases. Few people realise that challenge studies with apparently familiar diseases have revealed that clinicians and scientists did not understand many aspects of those diseases as well as they believed – in part because good science necessarily involves reducing uncertainty by making new discoveries. Moreover, much of what we know about coronaviruses in general was learned in challenge studies at the UK Common Cold Unit – including by researchers involved in naming this family of viruses for their resemblance, when viewed under the electron microscope, to the corona of the sun.
Is risk a property of viruses?
A key issue remains the perceived risk to volunteers. Many people fear exposing even consenting individuals to a “deadly virus”, but few ask what deadliness means here. Although it is common to speak of risk as a property of the microbe itself (“virulence” is sometimes used in this way), this is a mistaken view that scholars in the philosophy of biology should aim to clarify. The risk of an infection is ultimately a property of the interaction between the microbe and the host. Even some of the bacteria in yoghurt and probiotics can kill people with weak immune systems – yet it would be absurd to label such bacteria deadly merely because they can cause death in some cases. Nature abhors a dichotomy, and simplistic classifications of microbes as deadly (or not) ignore not only the importance of host-microbe interactions but also the way risks, like many properties in biology, lie on a wide and continuous scale from low to high.
The virus causing the current pandemic is more than 1000 times riskier in older adults than in young adults, and – although many people will find this hard to believe – it may not be much riskier than “common cold” coronaviruses, such as the one that might have caused the 1889–90 pandemic attributed at the time to “influenza” (a pandemic that also produced unusual lasting effects in some of those infected, effects that have raised renewed concerns this year). A difference now is that many people catch common cold coronaviruses earlier in life (when risks are lower), after which they have some degree of immunity – even if this immunity is imperfect and wanes over time, meaning that the elderly also face high risks from “common cold” coronaviruses.
At the population level, the number of susceptible hosts also matters – when an entire global population has little or no immunity to a virus, this triggers a “virgin soil” pandemic like the measles and smallpox epidemics that wiped out much of the indigenous populations of the Americas and may even have contributed to a period of global cooling. Since risk is not a property of the virus, but a property of the interaction between the virus and its host (or hosts, at the population level), it appears inaccurate to say that Covid-19 challenge studies with young healthy adults would involve infection with a “deadly virus” – for these individuals it does not appear to be very deadly at all. Nevertheless, small risks to research participants should be taken seriously and further minimised where possible, since maintaining trust in research (and vaccines) may be crucial to ending the pandemic and to continuing our future lives together on this planet. But we should be careful to think clearly about the risks and be willing to consider the potential benefits of infection in public health and research, especially if it means developing better vaccines sooner.
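The population-level point can be made concrete with the standard herd-immunity threshold, under the simplifying assumption of homogeneous mixing: transmission stalls once the immune fraction exceeds 1 − 1/R0. The R0 values in the sketch below are assumed, round numbers chosen for illustration, not estimates for any particular virus.

```python
# Why susceptibility matters at the population level: the effective reproduction
# number is R_eff = R0 * s (s = susceptible fraction), so transmission declines
# once immunity exceeds 1 - 1/R0. The R0 values below are illustrative assumptions.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune for R_eff to fall below 1."""
    return 1 - 1 / r0

for label, r0 in [("assumed common-cold-like coronavirus", 2.0),
                  ("assumed pandemic coronavirus", 2.5),
                  ("measles-like virus", 15.0)]:
    print(f"{label}: R0 = {r0:>4} -> ~{herd_immunity_threshold(r0):.0%} immunity needed")
```

In a “virgin soil” population the immune fraction starts near zero, so even a modest R0 produces explosive spread; immunity acquired earlier in life – whether by infection, inoculation, or vaccination – is what keeps familiar viruses from behaving this way.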
In 1760, Bernoulli wrote of smallpox inoculation: “I simply wish that, in a matter which so closely concerns the wellbeing of the human race, no decision shall be made without all the knowledge which a little analysis and calculation can provide”. In 2020, questions of intentional infection still need both philosophical analysis and mathematical modelling. History shows that infection sometimes has benefits, but also that scientists should be especially careful when using infection in public health and research – debates about intentional infection began at least three centuries ago, and there is no cure for these controversies in sight.