The Murky Past and Precision Future of Anesthesia


After 175 years, the medical art with a strange and dark history is getting a lot more precise.

General anesthesia, defined today by four components—amnesia, unconsciousness (sedation), analgesia (lack of pain), and akinesia (muscle paralysis)—was a game-changer, even in the primitive state in which it made its first official appearance on October 16, 1846, in what is now called the “Ether Dome” of Massachusetts General Hospital. On that day, rather than the multitude of drugs we use now—all administered in carefully measured doses, timed based on a variety of factors, like a person’s age, sex, and weight, and monitored with a wall full of electronic equipment—anesthesia consisted of a single agent given in a very imprecise manner, in the hope that it would be enough to put the patient out and prevent pain, and ideally not kill him. And on that day in 1846, it was administered by a con artist turned dentist.

It changed everything.

“Without anesthetics, modern surgery would not be possible,” notes Divya Chander, an anesthesiologist who serves as the chair of neuroscience at Singularity University. And by modern surgery, she means not just mind-boggling procedures, such as heart transplantation, but really any procedure that involves cutting into the body in a careful, deliberate way. It was during the 1880s, just one generation into the anesthesia era, that the American surgeon William Stewart Halsted wrote the rules for operating slowly and carefully, while also taking Joseph Lister seriously by incorporating antiseptic technique. Having a patient who is completely or partly unconscious, not moving, and not in pain from every jab of a needle or cut of a knife means that you can cut through the body layer by layer, clamping, ligating, and cauterizing blood vessels as you go. That gives surgeons a fairly bloodless view of what they are repairing, while the absence of pain and movement gives them time to work systematically. It seems obvious to us today, but operating in such a precise way was a new concept in Halsted’s time, despite some advances occurring during the decades just before he entered the scene.

One of the largest groups of people to benefit early on from anesthesia was laboring mothers. By inhaling the vapors of chloroform in 1853 to usher in Prince Leopold, Queen Victoria pioneered the use of anesthesia in childbirth, thus paving the way for its use at home and abroad. Of the 60,000 or so limb amputations during the U.S. Civil War, most were performed under anesthesia of some kind (at least in the Union Army, where it was rare for surgeons to run out of anesthetic agents). Usually this meant the relatively volatile yet effective chloroform, and sometimes diethyl ether; chloroform was preferred on account of ether’s extreme flammability. A third agent, nitrous oxide, had fallen out of favor by that time: while it was safer, it was also less effective at controlling pain.

“We’re [also] approaching the 125th anniversary of spinal anesthesia in 2023,” notes another anesthesiologist, Jeff Swisher, referring to one of the mainstays of regional anesthesia, which made an initial appearance in the mid-1880s. That’s when Carl Koller, a Viennese ophthalmologist, first used cocaine as anesthesia for eye surgery. Then in 1898 came August Bier’s experimentation with the injection of cocaine into the fluid that surrounds the spinal cord. This produced spinal anesthesia in Bier’s patient, an effect not unlike the lower-body anesthesia in the form of epidural injections that many women use today to give birth pain-free but awake. With an epidural, the anesthetic is injected into the back but not as deep as with its spinal anesthesia cousin; it stays in the layers of tissue outside the fluid that surrounds the cord. And, of course, no cocaine is used.

The chemist, the dentist, and the con artist

But to get back to the Ether Dome at Massachusetts General, two years before that fateful day, a dentist had tried nitrous oxide as an anesthetic at the same hospital, only to have jeers and insults hurled at him when his patient loudly moaned. First synthesized by Joseph Priestley in 1772, nitrous oxide was a mere curiosity at first, earning the nickname “laughing gas” when it was inhaled by volunteers at sideshow booths, entertaining onlookers who gathered to watch them go crazy.

William Thomas Green Morton. Wellcome Images

It was at one such exhibition in Hartford, Connecticut, in 1844, that a local dentist named Horace Wells noticed something odd. His friend Samuel Cooley cut and bruised himself rather badly while high on the gas, yet Cooley felt no pain. Nitrous oxide exposure did more than just make people laugh, Wells realized—it also produced analgesia. This was something that the English scientist, Humphry Davy, had figured out decades earlier, but Davy’s suggestion to use it for surgery hadn’t caught on, because surgeons had focused on the stimulation, the going crazy aspect of the nitrous oxide high. After trying out the gas for tooth extractions, Wells hoped to demonstrate the application of nitrous oxide for a more complex surgery in nearby Boston, the epicenter of New England medical practice. He first turned to another dentist, William Thomas Green Morton, whose thriving practice in Boston brought him into the same social circles as surgeon John Collins Warren, a scion of Boston’s medical community who had long sought a method for keeping patients pain-free during his surgeries.

But Morton was as much a con artist as he was a dentist. After years of running from place to place, obtaining notes of credit for a new business venture and then fleeing with the money, Morton had settled into his longest con of all: convincing Wells to teach him dentistry and then cajoling him into lending him money for a new dental device. And despite the financial success of his practice and the new device, Morton never returned Wells’ money. But his connections were still valuable, so Wells resolved to let him in on the nitrous oxide idea and Morton obliged by setting up a demonstration with Warren.

The experiment almost ended before it began after the patient presumably got cold feet and ran, but then a last-minute volunteer came forward as a substitute. He was a medical student with a rotten tooth eager to get it pulled and willing to go under the gas to do it. After administering nitrous oxide, Wells extracted the tooth, but then the medical student moaned audibly, prompting somebody in the audience to cry foul, shouting “This is humbug!” Chants of humbug then echoed through the audience.

For Wells, the experience was a devastating blow from which he never recovered. Over the next three years, he would abandon laughing gas as well as his dental practice and turn to experimenting with chloroform. “However, chloroform proved to be Wells’ undoing, as it was trickier to administer than ether and it was also, in his case, addictive,” says Swisher. In 1848, under chloroform’s influence, Wells would tragically take his own life by cutting his femoral artery with a scalpel.

Meanwhile, months after Wells’ nitrous oxide implosion, Morton learned of diethyl ether, which people were also huffing to get high, so he started huffing it too and found it promising but for the fact that it was a common solvent, already familiar to chemists and apothecaries of the day. Always the con man, Morton tried to disguise its smell with citrus oil to make it appear like an original invention, and he fraudulently tried (but failed) to patent it. 

“This is no humbug.”

However scurrilous his actions, Morton did succeed in securing priority for himself in its administration and claimed, albeit falsely, to be the first practitioner using diethyl ether as an anesthetic when he administered it that day at Massachusetts General in October 1846, to a surgical patient with a difficult neck tumor. Warren easily resected the tumor, and as the patient lay unconscious and perfectly still, Warren declared of diethyl ether, “This is no humbug.”

And he was right. Though discontinued around the 1960s, diethyl ether enjoyed a long career in anesthesia, outlasting chloroform by decades, and the most common inhalational anesthetics in use today—sevoflurane, isoflurane, and desflurane—each have an ether chemical group. 

An historical photo of one of the first anesthetic procedures.
Ether anesthetic being applied in the Ether Dome at Massachusetts General, circa April 1847. Southworth and Hawes

When a moan is just a moan

The plethora of anesthetic drugs available in operating rooms today includes various intravenous agents like propofol and ketamine—and even our old friend nitrous oxide. By mixing and matching drugs of different types, anesthesiologists induce and maintain general anesthesia, balancing optimal levels of the four components (amnesia, unconsciousness, analgesia, and akinesia) while minimizing adverse effects. They can do this with a level of precision and monitoring of physiological responses that would make one wince at Horace Wells’ administration of nitrous oxide by squeezing it from a leather bag—the anesthesia equivalent of rolling the dice. But precision in monitoring the effects of agents on the brain during general anesthesia is an area where there is still plenty of room for the field to advance, notes Chander, who is among those pushing the boundaries of her art. She uses electroencephalography (EEG)-derived patterns to monitor the brain’s response to anesthetic agents throughout surgery.

She does this in order to improve the precision of anesthesia. “No one person’s brain responds to the mix of anesthetic agents the way another person’s brain does, and a person may respond differently on different days, depending on a host of factors,” Chander explains. “So, using brain monitoring enables the anesthesiologist to deliver exactly the amount of anesthetic that a particular brain needs.”

Thinking back to the imprecision of the very first attempts at general anesthesia might help us to appreciate the gravity of Chander’s point. Although humbugged out of the theater when his patient moaned, Wells may actually have achieved adequate analgesia, akin to the 21st century nitrous oxide analgesia that dentists sometimes offer to needle-phobic patients, or that anesthesiologists sometimes offer to laboring women who refuse epidurals.

Nitrous oxide induces a dissociative state and interferes with the ability to care about things. “It also works as an incomplete analgesic, by diminishing pain levels,” Chander notes. “Therefore, the guy who moaned did so because nitrous, by antagonizing the NMDA receptor, leaves the person awake and conscious and able to vocalize.” 

Dovetailing with this explanation, the volunteer himself was reported not to have remembered any pain. As Freud might have said, sometimes a moan is just a moan. Nevertheless, one cannot know a person’s mind. Did the volunteer remember no pain because he’d experienced strong analgesia? Or did he simply forget because of the drug-induced amnesia? Getting a view of what’s happening in the mind, at least from the outside, is the very essence of what’s happening with the EEG-derived monitoring that is emerging today.

Beyond indirect measurements

As early as the 1930s, soon after EEG was first invented, anesthesiologists discussed the idea of somehow utilizing it to provide feedback during surgery on how deep an anesthesia the drugs were producing, so that the drugs could be titrated up and down as needed. But what EEG really measures is voltage change over time between two locations on the head. Raw data like those are too complex to be useful for quickly adjusting things like drug doses throughout the course of a surgical procedure, so the idea did not take hold a century ago.

But while technology now offers systems that process and present data for anesthesiologists in real time throughout surgery, there are substantial differences between the EEG-derived patterns that Chander utilizes, and is working to improve, and a more widely used type of device that summarizes the data to such a degree as to display only a number, an index ranging from zero to 100.
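The idea of collapsing a stream of raw voltages into a single number can be pictured with a toy calculation. The sketch below is purely illustrative and is not the proprietary bispectral-index algorithm or Chander’s method; the function name `toy_depth_index` and the choice of a slow-wave power ratio are assumptions for demonstration. It scores a trace by how much of its power sits in the slow delta band (0.5–4 Hz), since slow waves tend to dominate the EEG under many anesthetics:

```python
import numpy as np

def toy_depth_index(eeg, fs=256):
    """Collapse a raw EEG voltage trace into a single 0-100 number.

    Illustrative toy only: it compares slow-wave (delta, 0.5-4 Hz)
    power, which rises under many anesthetics, against total power
    in the 0.5-30 Hz range.
    """
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    delta = (freqs >= 0.5) & (freqs < 4.0)
    total = (freqs >= 0.5) & (freqs < 30.0)
    delta_fraction = power[delta].sum() / power[total].sum()
    # More delta means "deeper": 100 ~ fully awake, 0 ~ flat-line
    return round(100 * (1.0 - delta_fraction))

# Synthetic traces: an "awake"-like 10 Hz alpha rhythm versus
# a "deep"-like 2 Hz slow oscillation, 4 seconds at 256 Hz
t = np.arange(0, 4, 1 / 256)
awake = np.sin(2 * np.pi * 10 * t)
deep = np.sin(2 * np.pi * 2 * t)
print(toy_depth_index(awake))  # high value (lighter)
print(toy_depth_index(deep))   # low value (deeper)
```

Real processed-EEG monitors are far more elaborate than this ratio, but the shape of the problem is the same: many numbers in, one number out, with the inevitable loss of detail that Chander and Swisher describe below.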

“The current quantitative EEG monitors take a long time to calculate a processed number indicating brain state and may give the wrong answer if anesthetics of different drug classes are used,” Chander says. “Furthermore, brain monitoring with the EEG only monitors one axis of anesthesia—the hypnotic or consciousness axis.”

And Chander is not the only one to criticize monitors that present EEG as only an index.

“Given the wide and complex variety of surgeries I provide anesthesia for, processed EEG monitoring, such as the bispectral index, has not proved to be particularly useful in my practice,” says Swisher. “This type of monitor looks at a limited superficial part of the cerebral cortex and may not be able to measure anesthetic depth to any meaningful degree in the majority of cases that I care for.”

“Currently, anesthesiologists use indirect measures to determine whether a patient is ‘too light’ or ‘too deep.’”

But more sophisticated real-time data analysis can reveal more detail and monitor deeper levels of the brain, and that’s where clinical researchers like Chander and her former mentor Emery Brown of Harvard Medical School, another combined anesthesiologist-neuroscientist, come into play. Chander and Brown are among the small fraction of anesthesiologists who use this kind of data presentation, in which the anesthesiologist sees colored patterns representing what the anesthetic drugs are doing second by second in a person’s brain.

“People like myself are trying to write better algorithms to process these data coming from the brain in real time,” Chander explains, noting that she easily taught her residents at Stanford to do this.

But apart from avoiding a freaky experience like the moan of Wells’ poor medical student volunteer, why would such precision brain monitoring really be so helpful and revolutionary? Are anesthesiologists worried about rare cases of patients waking up during surgery? Chander provides some perspective.

“Currently, anesthesiologists use indirect measures to determine whether a patient is ‘too light’ or ‘too deep,’” she explains, noting that waking up during surgery, due to anesthesia being too light, is very rare, but that anesthesia that’s too deep is also a concern. Yet striking the just-right Goldilocks zone between too light and too deep gets increasingly hard when surgery is very long and when patients have a variety of health issues that affect how the body processes drugs and stores them in fat cells. So even in 2022, there is still a certain amount of guesswork. That’s where the new algorithms come in.

No two people respond to the same mix of anesthetic agents in the same way. But perhaps therein lies the opportunity for the future. “We’ll move to a system of precision anesthesia, rather than population-averaged anesthesia delivery,” she says.
