It took years, but eventually scientists figured out that water contaminated with sewage carries the pathogen that causes the disease. Kelly Marken/Shutterstock.com

The History of Cholera Treatment Is a Masterclass on Turning Failure Into Success

Before we figured out how to treat cholera, there were a lot of missteps along the way.

In the mid-1800s, physicians gathered around a man who was being treated for cholera in the city of Puducherry on the southeastern coast of India. At the time, no one understood it was a bacterial disease, so doctors were essentially guessing when they treated patients. In this case, they tried a paste made of lemon juice, rust, and potassium aluminum sulfate (today known as alum and used in pickling vegetables) rubbed all over the patient’s eyes.

It didn’t go particularly well. According to one of the witnessing physicians, “the pain it produced vexed and enraged the sick man, and he attempted to strike those around him; the vomiting became more frequent, his attendants fled to avoid his blows.” Next, the patient ran into a nearby stream, drank loads of water, and went to bed. His cholera eventually disappeared, likely thanks to his immune system, but the treatment left him blind.

The idea of rubbing a blinding, rusty mash on the eyes to cure a diarrheal disease plaguing the bowels is incredibly stupid by today’s standards. Yet it was also an incredibly earnest attempt to improve human lives.

This pattern of eager yet misguided problem-solving pervades medical history. “People who [were] well-meaning or really smart [went] in a totally wrong direction,” says Sydnee McElroy, a physician practicing in Huntington, West Virginia. She and her spouse, Justin, cohost a weekly medical history podcast called “Sawbones: A Marital Tour of Misguided Medicine,” and have a book based on the show forthcoming in October.

But it’s also true of any workplace. No matter how sincere and thoughtful your first ideas are, they’re probably not great. And that’s okay. Loads of hugely successful companies have encountered serious flops. Remember the 1993 Apple Newton, a personal digital assistant? No one else does, either. It was a total failure, but the iPad, the Newton’s well-conceived follow-up released 17 years later, is now ubiquitous. It just took a little more time to learn from the Newton’s mistakes and develop the idea into a success. A look through medical history shows that failure, when treated properly with perseverance, humility, and careful experimentation, can be transformed into literally lifesaving technologies.

When it came to cholera, blindness was obviously not an acceptable consequence of treatment, so physicians kept looking. Over the years they tried everything from mercury (which is poisonous) and opium to bleeding and even burning patients’ heels, for reasons that aren’t clear. In the early 1830s, British physicians homed in on the gut, hypothesizing that perhaps keeping the belly warm could ward off the disease, the McElroys explain in the book. So they gave soldiers, who at the time were regularly falling ill with cholera, “cholera belts” made of flannel to wear under their uniforms.

This was a step in the right direction; targeting the afflicted area made a lot more sense than trying to rub a cure into the eyes. It was still pretty wrong, though. The belts likely weren’t harmful, but they weren’t particularly helpful either.

Yet no one wanted to admit the idea wasn’t working: The British military stubbornly equipped at least some soldiers with flannel cholera belts through the first years of World War II. The collective thinking was that heat worked as a prophylactic against diarrheal diseases, and no one wanted to question the received wisdom. That thinking was so ingrained that thousands of soldiers were forced to wear the belts, and sometimes pay for them out of pocket, despite the fact that they didn’t work and were too hot. If there had been enough humility to abandon the idea as soon as it became clear the belts were useless, everyone serving would have been a lot more comfortable.

Fortunately, others were still working hard to solve cholera in the meantime. Scientists in Europe in the middle of the 1800s were developing an idea called “germ theory,” which—correctly—stated that microbes are the cause of certain diseases.

In 1854, John Snow, now known as the father of epidemiology, decided to test this idea using existing data. Working in a London neighborhood particularly prone to cholera outbreaks, he mapped all known cases against the water sources those individuals were using. It was a revelation: Many of the cases could be traced to one sewage-contaminated pump. When he investigated it, he found it filled with “white, flocculent particles,” which we now know were probably clusters of the bacterium Vibrio cholerae.

But others were skeptical. Even decades later, after the notion that bacteria caused cholera had gained wide acceptance, a Bavarian scientist named Max Joseph von Pettenkofer still believed germ theory told only half the story. He agreed that cleanliness and sanitation mattered, but he believed the substance that caused cholera was transmitted only in certain populations: those who were poor, behaved in certain ways, or were generally “unclean.”

Von Pettenkofer went above and beyond the scientific call of duty to prove his point. He drank a slurry of watery stool from someone who actually had cholera, to show that he wouldn’t get sick.

This was a mistake, of course. He did get sick, but thankfully contracted only a mild case, so he lived to tell the tale. He took this as a win anyway, arguing that he hadn’t gotten seriously ill because he was a “cleaner” person than those who lived in the poor parts of town, where outbreaks were typically worse.

He did end up being right about this, too, but for the wrong reasons, the McElroys explain: People who lived where the water was more likely to be contaminated with sewage were much more likely to develop the disease, but that was due to the lack of sanitation in those neighborhoods, not because their residents were unclean or immoral.

The idea of hygiene, though, was a success. Keeping away from sewage is a great way to prevent cholera and many other diseases. We also now have antibiotics and rehydration fluids to treat cholera, and even a vaccine to prevent it, thanks to this combination of experimentation, perseverance, and willingness to abandon bad ideas for the greater good.

“In history and today, people never settle for feeling okay,” says Sydnee McElroy. This refusal to settle is what makes medicine one of the greatest ongoing success stories.

It’s still not over, though. It isn’t even over with cholera: Globally, there are still hundreds of thousands of cases of the illness each year, particularly in areas where sanitation is poor or where natural disasters have struck. Doctors and policy makers are still working to make sure everyone has cheap, efficient means of treating the disease, and of preventing outbreaks in the first place. Undoubtedly, there will be missteps along the way, but each of those failures will show what doesn’t work, bringing us one step closer to something that does.

At some point in the future, we’re almost certainly going to look back on today’s standard medical interventions and realize just how wrong they were. We still don’t fully understand, and aren’t very good at managing, chronic pain, neurodegenerative disease, and obesity, for example. But we can also be confident that whatever dumb mistakes we’re making today are being made in good faith. “It’s the most human reaction to preserve one’s species to improve one’s existence,” says Justin McElroy, whether we have the knowledge to do that right or not. Problem-solving anything is almost never straightforward, and it almost never happens as quickly as we’d like. Medicine puts that into perspective.