At a moment in history when smallpox repeatedly devastated Europe and North America, a small number of physicians and clergymen became aware of the Middle Eastern and African folk practices of inoculation, which created immunity to the disease. In a reverse of fates, initial attempts to introduce this to ‘more advanced’ cultures created a strong backlash.
Smallpox killed an estimated 300 million people or more in the 20th century alone; only tuberculosis and malaria have been deadlier, and its victims were often children and infants. Yet smallpox also marks humanity’s greatest triumph over disease: it is the only human disease we have completely eradicated, with zero naturally occurring cases reported worldwide, even in the most remote locations and the poorest countries, for over forty years.
Along the way, it led to fierce debates, as well as some of the earliest clinical trials, quantitative epidemiology, and the first vaccines. On both sides of the Atlantic, the resistance of conservative thinkers to alternatives outside their preferred models and beliefs is as contemporary as ever. Defying Providence is a book that offers food for thought about such conflicts, rooted in an unavoidable entanglement of science, politics and business.
No Small Feat
Once the leading cause of blindness in Europe, smallpox was highly contagious, and so it struck in epidemics. The suffering of the sick was thus compounded by the anxiety it caused in the healthy, and by the anguish felt by those who passed it on to their loved ones. There was no cure.
Humanity has suffered from smallpox for a long time. It was identified as early as AD 340 in China, and it may have killed Ramses V, the Pharaoh of Egypt in the 12th century BC, whose mummy has a pustular rash on the face. The Plague of Athens in 430 BC, based on Thucydides’s description, may also have been smallpox.
With globalisation beginning in the 1400s, it spread to southern Africa and the Americas – usually inadvertently, but sometimes as a biological weapon. Many of the native peoples of the Americas lost half their population to the sudden onslaught, and its devastation helped assure the Spanish conquest of the Aztec and Inca empires.
Smallpox had one blessing, which people noticed at the time: if you survived it, you wouldn’t get it again. This led to theories that the cause of the disease included some innate seed, present in everyone; some poison in the blood that could be activated by the wrong trigger, but then expelled from the body for good.
We now know that the disease is a war fought among particles of organic matter, too small for the naked eye; that the body itself is the battleground; and that the soldiers enlisted to defend their homeland can be trained to recognise the enemy, and to fight off future invasions – an arguably stranger truth.
In crowded cities, the disease grew so common that it became accepted as a fact of life: everyone got it sooner or later. The 1746 charter of the London Smallpox Hospital likened it to “a thorny hedge through which all must pass, and some die, to reach a field of safety.” Which led to a simple and seemingly crazy idea – why not give yourself smallpox on purpose and just get it over with?
Inoculation began as a folk practice. The inoculator, in one version, took contagious matter from the pocks of an infected person, put the liquid on a needle, and pricked the skin of the patient, who developed a fever in seven to nine days and passed through all the symptoms within a few weeks.
Contracted in this way, the disease seemed to be milder and less deadly. The best modern theory is that the body mounts a more effective immune response when the virus enters through the skin rather than the respiratory system. In any case, inoculation let people choose when they would face smallpox: it was dangerous when you were too young or too old, and especially when pregnant. It also let people face the disease at home, under isolated care and supervision, rather than being struck with it while travelling.
The practice was not widely known in Europe before the 1700s, but it was well established in at least China, India, parts of the Middle East including the Ottoman Empire, some parts of Africa, and parts of Wales, where it was known as ‘buying the pox’. It protected those who received it, but it wasn’t used widely enough to be an effective public health measure, so epidemics still raged.
Inoculation had spread haphazardly, through travellers who learned of it in distant lands. But it was about to merge with two far larger phenomena – science and capitalism – which would catapult it from folk practice into a worldwide campaign to wipe smallpox off the face of the earth.
Inoculation was formally introduced to the West in the early 1700s, through letters to the Royal Society, the premier scientific organisation of the era. In 1721, it gained strong proponents in both Britain and the American colonies. In London, Lady Mary Montagu, wife of the British ambassador to the Ottoman Empire, advocated for the practice after encountering it in what is now Turkey; she had suffered the disease herself and had lost her brother to it, and she eagerly inoculated her own children.
Unsurprisingly, the novel practice faced a lot of opposition. And there were real concerns: inoculated patients still suffered symptoms, and some still died. Advocates of inoculation believed that the symptoms were milder, and death rarer, but opponents disagreed. Worse, inoculated patients were still contagious, and anyone who caught the disease from them suffered the full, dangerous version, not a mild form. Thus inoculation, if improperly handled, could start an epidemic.
In 1723, the Reverend Edmund Massey gave a sermon against it, asking: “for what causes are disease sent among mankind?… Who is it that has the power of inflicting them?… Disease is sent either for the trial of our faith, or for the punishment of our sins.” In either case, preventing disease interrupted God’s plan. Without the miseries of smallpox, faith could not be tested. Without the consequences of infection, sins went unpunished; without the threat of punishment, what vile, lascivious practices might we all indulge in? “God sends disease”, and therefore it was a sin for man to claim God’s prerogative by artificially transmitting smallpox.
In Boston, the cause was taken up by Cotton Mather, a Puritan minister, who learned of the practice from a slave of his who had been inoculated in Africa. Mather fought back against the opposition, calling inoculation a “gift from God” and arguing that it was a moral requirement, following from one’s duty to God to protect one’s health. (Mather had also been a prominent supporter of the infamous Salem witch trials, so his moral authority was hardly unblemished.)
If you were against inoculation, it was God’s will that we get smallpox; if you were for it, it was God’s will that we avoid the disease. Whether inoculation was a gift from God in accordance with His will, or a human intervention that defied it, was a matter of perspective. Might ‘lies, damned lies, and statistics’ fare any better?
The Proof In The Pox Pudding
In 1722, physician Thomas Nettleton began collecting data on smallpox cases: the number of natural smallpox cases, the number of inoculations, and how many died of each. He sent these figures to James Jurin, secretary of the Royal Society. Jurin expanded the effort, putting out a general call for as many such studies as he could collect.
Today such quantitative studies are common sense, but in a time and place where medicine was still dominated by tradition and authority, they were cutting-edge. Newton had recently demonstrated the power of mathematics in physics, and Jurin was inspired to apply the same rigour to medicine.
By 1725, Jurin had collected 14,559 cases of natural smallpox and 474 inoculations, from both England and America. The numbers were clear: natural smallpox killed about one in six who contracted it; inoculation killed less than one in fifty. Inoculation was dangerous by the standards of today’s medical procedures, at a ~2% death rate, but it was about ten times safer than organic free-range smallpox.
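Jurin’s comparison can be checked with back-of-the-envelope arithmetic. A minimal sketch using the rounded rates quoted above (not Jurin’s exact death counts, which are not given here):

```python
# Jurin's 1725 totals, with the rounded mortality rates quoted above.
natural_cases = 14_559         # collected natural smallpox cases
inoculated_cases = 474         # collected inoculations
natural_mortality = 1 / 6      # about one in six died of natural smallpox
inoculated_mortality = 1 / 50  # fewer than one in fifty died after inoculation

# Relative risk: how many times deadlier was natural smallpox?
relative_risk = natural_mortality / inoculated_mortality
print(f"Natural smallpox was roughly {relative_risk:.0f}x as deadly as inoculation")
```

On these rounded figures the ratio comes out around eightfold; since inoculation mortality was ‘less than’ one in fifty, the true ratio approached the tenfold figure above.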
With the virtues of inoculation demonstrated, it spread throughout England, and eventually to much of Europe. But it never became universal, because it remained time-consuming, expensive, painful, and risky. And these problems were largely self-inflicted, the product of the groundless practices of many Western physicians.
In the original Middle Eastern practice, a small scratch was made in the skin with a needle. English physicians transformed this into a butcher’s incision with a lancet, which was unnecessarily painful and often became infected. Further influenced by the ‘advanced’ scientific models of the time (including Hippocratic humoral theory), physicians made their patients undergo a lengthy ‘preparation’ of bland diets, enemas, and vomiting, and treated them with poisons including mercury. Altogether, this method kept patients away from their jobs or farms for many weeks, and carried an unnecessary risk of complications – a cascade of intervention.
“If bleeding were efficacious there would be a lot of healthy people on a battlefield.” They bled Byron for two days and were pleased when his veins ran clear. One of his last lucid remarks, to his valet, was “my doctors have assassinated me”. – Lord Byron, Life and Limnings
The practice of inoculation was eventually improved by country surgeon Robert Sutton and his son Daniel, after Daniel’s older brother almost died from a botched inoculation. Through experimentation, Robert discovered that the incision could be replaced with a small jab from the lancet. Daniel, going further, found he could reduce the ‘preparation’ from a month to 8–10 days; he also had patients walk around outdoors, or even work, when they were not contagious, instead of staying bedridden.
In the 1760s, the Suttons ran a booming business in inoculation houses. They advertised how convenient and mild their treatment was, boasting that most patients had no more than twenty pustules and could return to their lives within three weeks. They opened multiple locations and eventually an international franchise. Catherine the Great invited Daniel Sutton to Russia to inoculate her in 1768 (he declined; another inoculator, Thomas Dimsdale, accepted – and earned a barony for it).
In 1767, William Watson, a physician at the Foundling Hospital in London, tested different ‘preparation’ regimens in one of the earliest clinical trials (n=31). In three groups, he tested a combination of laxative and mercury, laxative only, and no preparation as a control. To make the study more precise, he counted the pocks on each patient as a quantitative measure of disease severity. He concluded that mercury was not helping the patients (and with modern statistical methods, he could have seen that the laxative wasn’t either).
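The kind of modern analysis hinted at here can be sketched with a simple permutation test on pock counts. The numbers below are invented for illustration – they are not Watson’s actual data – and the test itself is a modern stand-in, not anything Watson ran:

```python
import random
import statistics

# Hypothetical pock counts (invented for illustration -- NOT Watson's data)
# for two of his three groups: laxative-only preparation vs. no preparation.
laxative = [12, 30, 25, 18, 40, 22, 28, 15, 33, 20]
control = [14, 28, 26, 19, 38, 24, 27, 16, 31, 21]

def permutation_test(a, b, trials=10_000, seed=0):
    """Two-sided permutation test on the difference of group means."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = a + b
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        # Re-split the shuffled pool and see how often chance alone
        # produces a gap at least as large as the one observed.
        diff = abs(statistics.mean(pooled[:len(a)]) - statistics.mean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return hits / trials

p = permutation_test(laxative, control)
print(f"p = {p:.2f}")  # a large p-value: no evidence the laxative changed severity
```

With a p-value nowhere near significance, the honest conclusion is the one the text draws: nothing in the ‘preparation’ was helping.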
In the 1780s, physician John Haygarth recorded every case of smallpox in his town of Chester, and every contact those cases had, to uncover how the disease spread – what is now called ‘contact tracing’. Through this he disproved the myths that smallpox could spread over long distances, and that it was dangerous even to walk past the house of a patient. Instead, he showed that it could only be transmitted through the air within about 18 inches, or through contact with infected material such as clothing. Based on these discoveries, Haygarth proposed a list of ‘Rules of Prevention’ that amounted to the isolation of patients and the washing of potentially infected material. But these practices could never be enforced consistently enough to prevent outbreaks. That would require a new technology.
Vouching For Variolae Vaccinae
Through these improvements, the mortality rate of inoculation was eventually reduced to less than one in five hundred. As it became safer, more convenient and more affordable, it spread throughout England. As it did so, it became evident that some people had already gained immunity, without any signs of previous infection or inoculation.
No one knew what to make of it until a farmer came in for inoculation by country surgeon John Fewster in 1768. The farmer, who had no response to multiple inoculation attempts, said that he had never had smallpox, but that he had recently suffered from cowpox.
With this clue, Fewster began questioning his patients about cowpox, and found that it explained the cases of pre-existing immunity. He reported the finding to a local medical society, and eventually it became known to a nearby surgeon’s apprentice named Edward Jenner. This is in contrast to the origin story usually told, in which Jenner learns of cowpox’s protective properties from a local dairy worker – a fabrication attributed to Jenner’s first biographer.
Jenner saw an opportunity: even in the best case of inoculation, the patient still experienced mild symptoms; in the worst case, serious disease or death. And during the treatment, the patient was still contagious, which meant quarantine – and if the quarantine failed, the risk of an epidemic. Cowpox wasn’t a disease anyone wanted to have, but it also wasn’t deadly. He wondered if it could be a substitute sickness.
Fewster was sceptical, because some patients who had reported cowpox were not immune to smallpox. Through careful research, Jenner untangled the confusion: there were multiple diseases that looked like cowpox. Jenner learned to differentiate between true cowpox, staphylococcal infections, and another disease called ‘milker’s nodes’. When properly identified, cowpox clearly did confer immunity to smallpox.
Jenner tested his technique in 1796 and published it soon after. Since the Latin name for smallpox was ‘variola’, Jenner called cowpox ‘variolae vaccinae’, or ‘smallpox of the cow’. Later, to distinguish the two methods, inoculation with smallpox was called variolation, and with cowpox, vaccination. Decades later, when Louis Pasteur extended the technique to other diseases, he deliberately extended the meaning of ‘vaccination’ in honour of Jenner, giving the term its modern definition.
Vaccination turned out to have one main drawback: unlike inoculation, it wore off after a matter of years. This was solved with periodic re-vaccinations. Like inoculation before it, vaccination was controversial and Jenner had to fight for its acceptance. Then as now, the technique fuelled fears – one cartoon showed vaccine recipients growing cow parts out of their bodies. But the method worked, it was safer than variolation, and eventually it gained cultural and scientific acceptance.
Vaccinate is the most specific of the three terms: it means to give someone a vaccine, which usually consists of a small amount of a killed, weakened, or otherwise modified version of a pathogen (such as a virus or bacterium). In modern healthcare, inoculate is typically used interchangeably (though less commonly) with vaccinate; more generally, inoculate means to implant a microorganism (such as a bacterium, virus, or amoeba) into an environment, for example for research. Immunize is the most general of the three words and can mean to grant immunity against a wide variety of things, not just diseases.
Safety Sells Slowly
The next 150 years saw a series of incremental improvements that helped vaccination spread to most of the population in much of the world. Large-scale vaccination posed many challenges; the initial method, like variolation, was ‘arm-to-arm’: the virus was taken from the pustule of a previous patient and transferred straight into a new one.
In an era before disease screening and before the germ theory, the risks were greater. On multiple occasions, patients being vaccinated accidentally contracted syphilis, which had been undiagnosed or misdiagnosed in the source patient – substituting one sickness for another in the worst way.
For this reason, by the late 1800s, the arm-to-arm method was abandoned in favour of growing the virus on baby calves and harvesting it from them directly. Transferring from an animal directly to a human carried a lower risk of transmitting disease, since not all human diseases can survive in cows. But some can, especially general bacterial infections. To combat these, anti-bacterials were added: first glycerin, and later, when they became available, antibiotics.
Then came the problems of transportation and storage. Before these were solved, the curative cow was literally carted around town for vaccinations, or brought to the town hall where patients could line up. Small amounts of vaccine could be stored for a short time on ivory points, between glass plates, on dried threads, or in small vials. But the virus lost its effectiveness quickly, especially when subjected to heat.
Before refrigeration, material was usually heated and dried for preservation, which was not suitable in this case. The solution, developed in the early 1900s, was ‘freeze drying’: the material is rapidly frozen, then placed under a vacuum so that the ice sublimates – water vapour escapes directly off the ice without it ever melting into liquid. A secondary drying process, involving mild heat or a chemical desiccant, removes the remaining moisture, and the result is dry material whose structure has not been damaged. This allowed the smallpox vaccine to last several months even at 37 °C.
Tech Turning The Tides
With a safe, effective, portable vaccine, the world had the tools to dramatically reduce smallpox – and ultimately, to completely eradicate it. Culturally, the controversy has never completely disappeared, and politically, attempts to make practices compulsory have always been met with strong opposition. The real risks of infection mentioned above gave people good cause for concern, at least until many of those initial problems were tackled through trial and error over time.
Smallpox ceased to be endemic in much of Europe by the 1930s, the US and Canada by the 1940s, and the rest of the developed world by the 1950s. Australia, which did not see the disease until European explorers arrived in the late 1700s, managed to keep it from ever becoming endemic by quarantining arriving ships, and saw its last case in 1917. By 1966, it was endemic only in Africa, the Middle East, Southeast Asia, and Brazil.
When overseas travel was slower than the progression of the disease itself, smallpox could be controlled via quarantine. But with an incubation period of over a week, an oblivious infected patient could step onto a plane and be halfway around the world long before showing any symptoms. The jet age posed a new set of challenges.
Smallpox was a good candidate for eradication for several reasons: it could only infect humans, so there was no animal reservoir of the disease; it was easy to diagnose and to distinguish from other diseases; there was a cheap and effective vaccine; and unlike diseases such as measles, it was only contagious when the obvious symptom – the rash of pustules – was apparent, not during the fever that preceded it.
The World Health Organisation resolved to eradicate smallpox in 1959. The man chosen to lead the effort was D. A. Henderson, chief of the surveillance section at the United States CDC (then the Communicable Disease Center, now the Centers for Disease Control and Prevention). There was a tremendous lack of internal alignment on goals and strategy, and regional directors often refused to cooperate with the programme. Large, non-profit, inter-governmental organisations are, unsurprisingly, bureaucratic.
At the launch of the effort, the WHO director-general, Marcelino Candau, was against it, expecting it to fail, and wanted to give it as little publicity as possible. In Henderson’s telling of the story, some of the individuals assigned to lead national efforts were simply incompetent, and there wasn’t much he could do except wait for them to be replaced. Henderson and his team worked apart from, and often around, the bureaucracy, and the effort succeeded not because of any grand unity of purpose at the top, but because of the industriousness of many middle managers, and the devotion of over a hundred thousand workers and volunteers on the ground.
The defeat of smallpox was credited, as in war, to superior artillery, surveillance, and surrounding the enemy. But tools aside, one of the most fascinating innovations was in strategy. At the beginning, it was assumed that the path to eradication was mass vaccination – which was what made some people sceptical of the entire goal. It wasn’t just the sheer numbers that made mass vaccination a barrier, but the managerial difficulty of reaching the last remaining percent of the population.
This assumption turned out to be false. The winning strategy, called ‘ring vaccination’ or ‘surveillance and containment’, was to establish a network of health professionals who could quickly alert the nearest office to new smallpox cases; when alerted, teams would immediately head to the scene and vaccinate everyone who had come into contact with the known cases, forming a ‘ring’ around them to stop the disease from spreading. The focus became more targeted, flexible and responsive.
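The ring logic can be sketched as a shallow traversal of a contact graph. A minimal sketch with an invented contact network – the names and the `depth` parameter are illustrative assumptions, not the WHO’s actual procedure:

```python
from collections import deque

# Invented contact network (person -> known contacts); names are illustrative.
contacts = {
    "case0": ["a", "b", "c"],
    "a": ["case0", "d"],
    "b": ["case0", "e"],
    "c": ["case0"],
    "d": ["a", "f"],
    "e": ["b"],
    "f": ["d"],
}

def ring_vaccinate(index_case, contacts, depth=2):
    """Return everyone within `depth` contact links of the index case --
    the 'ring' to vaccinate around a newly reported smallpox case."""
    ring = set()
    seen = {index_case}
    frontier = deque([(index_case, 0)])
    while frontier:
        person, d = frontier.popleft()
        if d == depth:
            continue  # the ring stops here; no need to look further out
        for neighbour in contacts.get(person, []):
            if neighbour not in seen:
                seen.add(neighbour)
                ring.add(neighbour)
                frontier.append((neighbour, d + 1))
    return ring

print(sorted(ring_vaccinate("case0", contacts)))  # ['a', 'b', 'c', 'd', 'e']
```

Note that `f`, three links removed from the case, is left outside the ring: the whole point of the strategy was that only the contacts of contacts needed vaccinating, not the entire population.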
In 1720, inoculation had been a folk practice in many parts of the world for hundreds of years, but smallpox was still endemic almost everywhere. The disease had existed for at least 1,400 years, and probably more than 3,000. Just over 250 years later, it was completely gone.
Why did it take so long, and then suddenly change so fast? Why wasn’t inoculation practiced more widely in China, India, or the Middle East, when it had been known there for centuries? Why, when it reached the West, did it spread faster and wider than ever before – enough for responses to significantly reduce and ultimately eliminate the disease? Similar questions apply to many modern technologies.
Non-linear Progression: as a legacy of Bacon, by 1700 there was a widespread belief in Europe that useful knowledge could be discovered that would lead to improvements in life. People were on the lookout for such knowledge and improvements, and were eager to discover and communicate them. Those who advocated for inoculation in early 18th-century England did so in part on the grounds of a general idea of progress in medicine, pointing to recent advances, such as the use of cinchona bark (quinine) to treat malaria, as evidence that such progress was possible. The idea of medical progress drove the Suttons to make incremental improvements to inoculation, Watson to run his clinical trial, and Jenner to perfect his vaccine.
Secular Sciences: advocating for such progress requires a belief in human agency and in the sanctity of human life. But in his sermon, Massey concluded, “let them Inoculate, and be Inoculated, whose Hope is only in, and for this Life!” A primary concern with the salvation of the immortal soul can preclude concerns of the flesh. Christianity had by then absorbed enough of the Enlightenment paradigm that other moral leaders, such as Cotton Mather, could offer alternative, humanistic opinions on the morality of inoculation.
Cultural Communication: in China, variolation may have been introduced as early as the 10th century AD, but it was a secret rite until the 16th century, when it became more publicly documented. In contrast, in 18th-century Europe, part of the Baconian program was the dissemination of knowledge, and there were networks and institutions expressly for that purpose. The Royal Society acted as an information hub, taking in interesting reports and broadcasting the most important ones. Prestige and acclaim came to those who announced useful discoveries, so the mechanism of social credit broke secrets open, rather than buying and burying them. Similar communication networks spread the knowledge of cowpox from Fewster to Jenner, and gave Jenner a channel to broadcast his vaccination experiments.
Dogma and Data: to quell controversy, the West had the scientific method, and the case for inoculation was ultimately propelled by the numbers. If people didn’t believe them at first, they did a century later, when the effects of vaccination showed up in national mortality statistics. The method of meticulous, systematic observation and record-keeping also helped the Suttons improve their inoculation methods, Haygarth discover his Rules of Prevention, and Fewster and Jenner learn the effects of cowpox. The germ theory, developed several decades after Jenner, put miasma theories to rest – giving the old observations a new context, and dispelling the idea that you could contain contagious diseases through diet and fresh air alone.
Capitalist Care: inoculation was big business, which motivated inoculators to make their services widely available. The practice required little skill, and it was not licensed, so there was plenty of competition, which drove down prices and sent inoculators searching for new markets. The Suttons applied sound business sense to inoculation, opening multiple houses and then an international franchise. They provided their services to both the rich and the poor, charging higher prices for better room and board during the weeks of quarantine: everyone got the same medical procedure, but the rich paid more for greater comfort and convenience.
Momentum Mori: the Industrial Revolution was a massive feedback loop in which progress begets progress; science, technology, infrastructure, and surplus all reinforce one another. By the 20th century, it was clear how much progress against smallpox depended on previous progress – both specific technologies and the general environment – with compounding advances in responsiveness, transportation, convenience, and effectiveness of scope, scale and focus.
Mass inoculation or vaccination happened in cities and towns when epidemics struck, and these would probably have been organised in part by local government. But how did vaccination spread more broadly? Compulsion by law seems to have helped, although some areas achieved results almost as good without compulsory vaccination, while in others attempts at compulsion weren’t enforced – culture and context potentially being more important than law here. Other factors included schools requiring vaccination of students, the police and military requiring it of their recruits, and insurance companies requiring it for life insurance – or demanding an extra premium for the unvaccinated.
The WHO was important in this, even though the director-general lacked faith in the initial roll-out. The effort was also surprisingly cheap: it cost $23 million per year between 1967 and 1969; adjusted for inflation, that is under $200 million per year in 2019 dollars. Total US private giving to overseas programs alone is over $40 billion; less than 1% of that would pay for eradication were it needed today. The Gates Foundation alone gave away $5 billion in 2018. The WHO provided a mission and a forum in which to make eradication a target, alongside the authority and clout to make international cooperation possible.
Small Changes For Big Returns
The disease disappeared from humanity, and the virus could not last long outside a human host. There was no animal reservoir. There were stocks of the virus kept for research, but these were destroyed in the 1980s – except at two highly secured laboratories that were allowed to keep them. Owing to Cold War politics, one of those laboratories was at the CDC in Atlanta, Georgia, and the other in Russia, at the State Research Center of Virology and Biotechnology, known as VECTOR. Perhaps a kind of ‘mutually assured destruction’ for biological weapons.
The smallpox vaccine is still manufactured for limited use, kept in the Strategic National Stockpile. It’s not made with live cows anymore, but in cell cultures. In a global epidemic, humanity would swing into action, uniting however briefly against a common, microbiological foe. Vaccine production could be ramped up, and perhaps improved vaccines with few to no side effects could be created (some were still in the research phase around the time of eradication).
Antiviral drugs, which were not available in the smallpox era, could be developed (one, tecovirimat, has already been FDA-approved for smallpox). The disease is only contagious when the telltale rash is apparent, making it easier to establish isolation and quarantine guidelines. And most importantly, the knowledge of how to fight smallpox has not been lost – preserved in a 1,400-page book published by the WHO.
Smallpox was the first disease to be eradicated, but it doesn’t have to be the last. Other diseases will be even more challenging. Yellow fever can infect animals, meaning that even if it were eliminated from all humans, it could return from the animal reservoir. Cholera can last for a long time in water. Tuberculosis can be latent for a long time in humans. HIV does not yet have an effective vaccine, nor does malaria (although tropical diseases may be eliminated in the future with genetic engineering, such as modifying mosquitoes with a ‘gene drive’). Measles is a better candidate, but it is more contagious than smallpox, and it is contagious before its rash is apparent, making isolation harder.
In spite of smallpox’s comparative tractability, a novel outbreak would still be a nightmare to contemplate. Routine vaccination has not been performed in forty years, and most of the public is susceptible. There are vaccine stocks, but not enough to vaccinate everyone. Most doctors have never seen a smallpox case in their lives, and most health workers have not been trained to handle one. Modern transportation systems would scatter the disease around the globe faster than we could track it, and modern social networks could make a cohesive response more difficult. With or without short- and long-term data and evidence (and, just as importantly, trust and transparency), there would inevitably be accusations that it had been released deliberately, as a biological attack by a nation-state or terrorist organisation.
To quote David Deutsch, anything not forbidden by the laws of nature is achievable, given the right knowledge and approach – but what is the right knowledge and approach? It is surely a sign of human knowledge, technology, wealth, and infrastructure that we aim to reduce suffering and improve standards of living for all, across the board. Thomas Jefferson, a strong advocate of Jenner’s new vaccine, wrote to him: “future nations will know only by history that the loathsome smallpox has existed.” Someday, perhaps, might that be said of disease itself?