As 2013 comes to a close, and because this will probably be my last post of 2013 (unless, of course, something comes up that I can’t resist blogging about before my next turn a week from now), I had thought of doing one of those cheesy end-of-year lists related to the topic of science-based medicine. Unfortunately, I couldn’t come up with anything I haven’t already done. I even thought of coming up with a list of New Year’s resolutions for 2014. In fact, I thought of making the first one—in a self-deprecating manner, of course—a resolution to stop being so mean, nasty, and dogmatic, the better to satisfy my detractors. But then I remembered that nothing is likely to satisfy my detractors and that, besides, my ever-lovin’ cuddliness is what makes me so popular. In any event, I have to be me and be true to myself, and all that rot, so that idea went out the window. Worse than my inability to come up with something was that I couldn’t think of a way to make it funny. When you’re trying to be funny following the inimitable Mark Crislip, you’d damned well better be funny. So, until my humorous instincts come back, serious it has to be.
But serious doesn’t necessarily mean heavy. The end of a year is a time both to look back on the year that was and to look forward to the year to come. This year was in many ways a good one for us here at SBM. We launched a Facebook page, reinvigorated our Twitter feed, and experienced significant growth in our traffic. Those who know me and/or follow me on various social media know that I’m a big Doctor Who fan and have been since the 1980s. So the last two big events of the year, the 50th anniversary special in November and the Christmas special on, well, Christmas, got me to thinking about time travel, and thinking about time travel revived memories of a topic I covered on my not-so-super-secret other blog four years ago and had been meaning to treat here sometime. It’s a fun topic to finish out the year, not to mention a way for me to blatantly sneak Doctor Who references into an SBM post.
Being a Doctor Who fan and all, I’ve naturally often wondered what it would be like to travel through time and visit the times and places in history that interest me most. For instance, being a World War II buff, I’d certainly want to check out what everyday life was like here in the U.S. during World War II. Given my affinity for psychedelic music and the fact that I was only four years old during most of the Summer of Love, I’d think it cool to check out Haight-Ashbury, although I suspect my reaction to the reality of it would be similar to that of George Harrison when he visited it for the first time. I guess, if pushed, I’d have to admit that if I had been old enough to be a high school or college student in 1967, I probably would have been one of those strait-laced, short-haired types destined either to go to college to become a doctor or engineer, or to go to Vietnam to fight. Despite loving the music, I never had any interest in experimenting with the drugs. Beer, wine, and—occasionally—a martini or two are my drugs of choice, and then only for medicinal purposes, as they say. Heck, I never even tried smoking tobacco. Even as a child I couldn’t stand the smell of cigarette smoke, to the point where it was never really a temptation.
In any case, what provoked my original bit of musing was a post from a few years ago by Martin Rundkvist, Fear of Time Travel, in which he imagines what it would be like for a modern person to be transported back in time:
First, imagine that you’re dropped into a foreign city with only the clothes you wear. No wallet, no hand bag, no money, no cell phone, no identification. Pretty scary, huh? But still, most of us would get out of the situation fairly easily. We would find the embassy of our country of origin, or if it were in another city, contact the local police and ask to use their phone. A few days later we would be home.
That’s not the scary scenario I rehearse. Imagine that you’re dropped into the city you live in with only the clothes you wear. No wallet, no hand bag, no money, no cell phone, no identification. And it’s 500 years ago. (Or for you colonial types, 300 years ago in one of your country’s first cities.)
It’s a fun thought experiment. As Martin points out, you would speak the language with what, to the natives of that time, would seem a very strange and nearly incomprehensible accent. Think of how hard it is to understand the English of Shakespeare’s plays, full of idioms, turns of phrase, and vocabulary peculiar to the time, and then think of how many words we use today that would be just as incomprehensible going the other way. As for where to land, the Midwest where I live would, 300 years ago, have been nominally ruled by the French as part of New France but populated mostly by indigenous tribes, so for purposes of the thought experiment I’ll pick New York or Boston. I could also make like The Doctor and imagine that the TARDIS (or something else) had given me the ability to speak the native language and appear to be dressed like everyone else. Or I could imagine that I didn’t understand the language and had to learn it. Over time, it wouldn’t really matter. Here’s the part of Martin’s thought experiment that caught my eye:
Some might think that a well educated modern Westerner would soon become one of the sages of the age thanks to their superior technological and scientific knowledge. For one thing, it wouldn’t be hard for most of us to become the best doctor in the world of AD 1509 if knowledge was all it took. But I have a feeling that such knowledge would not be easily applied in a society that is completely unprepared for it, and not easily implemented in an environment where none of today’s infrastructure exists. And say that you’re actually a doctor or an engineer – how much could you achieve without access to any materials or tools invented in the past 500 years? I mean, I know the principles of nuclear fusion, aviation, antibiotics, vaccination and basic biochemistry, but don’t ask me to put them into practice starting from scratch!
Well, I am a physician and surgeon, and I don’t know if I could elevate myself to a sage of the age with my knowledge. The reason is that so much of what I do and have done in medicine relies on the technology and science of the time—this time, as in the decades between 1984 (when I entered medical school) and now. Let’s start with something very, very basic. I’m a surgeon. I try to cure or treat diseases by operating. Operating on a patient, however, is very difficult without reliable anesthesia, and inhalational anesthesia using ethyl ether wasn’t discovered and widely applied until the 1840s. Before that, there were various herbal anesthetics and hypnotics, natural drugs like opium extracts and later morphine, and even alcohol. While these may have sufficed for minor operations (barely), they were not at all sufficient for doing anything major, such as entering a major body cavity like the abdomen or chest.
That’s why, before anesthesia, surgeons had to be fast, and surgery was very bloody. Think of Abigail “Nabby” Adams Smith, the firstborn of our second President, John Adams. She was diagnosed with a malignant tumor of the breast and underwent a mastectomy without anesthesia, the gruesome details of which were described both in David McCullough’s biography John Adams and in Jim Olson’s essay on Nabby Adams:
Nabby entered into the room as if dressed for a Sunday service. She was a proper woman and acted the part. The doctors were professionally attired in frock coats, with shirts and ties. Modesty demanded that Nabby unbutton only the top of her dress and slip it off her left shoulder, exposing the diseased breast but little else. She remained fully clothed. Since they knew nothing of bacteria in the early 1800s, there were no gloves or surgical masks, no need for Warren to scrub his hands or disinfect Nabby’s chest before the operation or cover his own hair. Warren had her sit down and lean back in a reclining chair. He belted her waist, legs, feet, and right arm to the chair and had her raise her left arm above her head so that the pectoralis major muscle would push the breast up. A physician took Nabby’s raised arm by the elbow and held it, while another stood behind her, pressing her shoulders and neck to the chair.
Warren then straddled Nabby’s knees, leaned over her semi-reclined body, and went to work. He took the two-pronged fork and thrust it deep into the breast. With his left hand, he held onto the fork and raised up on it, lifting the breast from the chest wall. He reached over for the large razor and started slicing into the base of the breast, moving from the middle of her chest toward her left side. When the breast was completely severed, Warren lifted it away from Nabby’s chest with the fork. But the tumor was larger and more widespread then [sic] he had anticipated. Hard knots of tumor could be felt in the lymph nodes under her left arm. He razored in there as well and pulled out nodes and tumor. Nabby grimaced and groaned, flinching and twisting in the chair, with blood staining her dress and Warren’s shirt and pants. Her hair matted in sweat. Abigail, William, and Caroline turned away from the gruesome struggle. To stop the bleeding, Warren pulled a red-hot spatula from the oven and applied it several times to the wound, cauterizing the worst bleeding points. With each touch, steamy wisps of smoke hissed into the air and filled the room with the distinct smell of burning flesh. Warren then sutured the wounds, bandaged them, stepped back from Nabby, and mercifully told her that it was over. The whole procedure had taken less than twenty-five minutes, but it took more than an hour to dress the wounds. Abigail and Caroline then went to the surgical chair and helped Nabby pull her dress back over her left shoulder as modesty demanded. The four surgeons remained astonished that she had endured pain so stoically.
Without effective anesthesia, I could do no better than these surgeons from 200 years ago or the surgeons from 300 years ago. In fact, I would probably do much worse, because I’m used to operating in a deliberate fashion, cauterizing individual blood vessels as I go. The reason surgery before the advent of inhalational anesthesia was “cut and slash” is that it had to be. To do otherwise was to prolong what was quite literally the torture of the patient. I wasn’t trained to operate that way, I’m not used to operating that way, and there was no such thing as a Bovie electrocautery machine back then. It would all be scalpels, hot irons, and scissors, or, as in Nabby Adams’ case, razor blades and, in essence, a set of tongs to elevate the breast. It’s one reason why surgeons were frequently so fast in doing amputations that their assistants had to be careful not to let their fingers get in the way. As for more extensive operations, even with the anesthesia available in the latter half of the 1800s, a full 150 years after the period Martin’s thought experiment envisions, there was no way to control respiration. Anesthesia was a delicate balance between not putting the patient so deep that he stopped breathing and putting him deep enough that he wasn’t reacting overmuch to the surgical stimuli. Mechanical ventilators were an invention of the 20th century.
That’s just one example. There are numerous other tools, disciplines, and bodies of knowledge that a modern science-based surgeon depends upon in order to do his or her job: antibiotics and germ theory, pathology to identify what a patient has based on tissue samples, transfusions, and a wide variety of medications, to name a few. Then there’s diagnostic radiology. There would be no CTs or MRIs; there wouldn’t even be X-rays. Indeed, even something as simple as a suture would be a problem. The needles used 300 years ago were huge by today’s standards because of the difficulty of making small needles; the technology to do it was not widely available. Throughout history, needles were made of bone or of metals such as silver, copper, or bronze, while sutures were made of cotton, flax, hemp, silk, or even tendons and nerves. There was a reason they called “catgut” suture catgut. Although catgut suture was not made from the actual gut of cats, it was made from actual gut: connective tissue from intestines. (Actually, catgut was pretty good suture and was still occasionally used 25 years ago when I first started my residency, mainly by the older surgeons.)
In any case, I think Martin’s right in that, without the infrastructure and scientific background being there, it would be very, very difficult for a surgeon of 2013 like me to recreate in the year 1713 much of anything that I do now, even if I were dropped into Boston among the most learned physicians of the age. No one there would have any idea of germ theory (and thus sterile technique, both of which were at least 150 years away), anesthesia (which was 130 years away), or much of basic physiology. Indeed, at that time, diseases were thought to be caused by imbalances in the four humors or miasmas, for the most part. If I were to try to explain the concepts that underlie the science-based medicine of today to the learned men of the time, assuming I could master the dialect of 300 years ago and find a way to describe the concepts to them, they’d assume I was either mad or a witch. It would be a good thing for me that the wave of witch hunts that swept through New England was pretty much over by the early 18th century.
There is one area I can think of in which a surgeon of 2013 might be able to translate some of his knowledge into 1713 and hope to have some influence: sterile technique. It would not be that huge an undertaking to sterilize instruments (although sterilizing sutures would be very problematic), either in flame or in alcohol. Nor would it be that huge a challenge to use alcohol, carbolic acid, or some other compound to clean the operative field, the patient’s skin, and one’s hands. (Given the lack of latex or rubber gloves, I doubt I could avoid operating with my bare hands, as surgeons of the time did, and in fact continued to do until the late 1800s and beyond.) In other words, I could be Joseph Lister 150 years before Lister demonstrated the benefits of antisepsis, publishing his seminal paper on the subject in 1867.
Come to think of it, I could potentially be Louis Pasteur, again roughly 150 years before Pasteur did much of his work. At the very least, I could figure out how to replicate his experiments disproving spontaneous generation and to develop pasteurization. Of course, convincing the world of 1713 of the validity of these ideas would be even harder than it was for Pasteur to convince his contemporaries of germ theory, because far less of the groundwork would have been laid. SBM builds on the science of its time incrementally, and if the requisite prior discoveries haven’t yet been made it’s hard for new science-based modalities to take hold. Lister’s discovery, for instance, didn’t take hold in the US until over a decade after it took hold in Germany, because of resistance to the germ theory of disease.
In fact, this brings up an issue that should demonstrate just how hard it would be to convince the physicians and surgeons of 1713 of our knowledge. Consider something as simple as blood pressure. Although the ancient Egyptians knew enough to palpate pulses, the very first measurement of blood pressure (more accurately, pulse pressure) was not made until 1733 (twenty years after I’d be dropped into New England), when Stephen Hales measured the blood pressure of a horse. It was not until 1855 that the first sphygmomanometer was devised by Vierordt of Tübingen, an instrument called at the time the sphygmograph, which was considerably improved upon by Étienne-Jules Marey in 1860. Before this, it was not possible to measure arterial blood pressure other than under surgical conditions, using a cannula inserted directly into an artery. The modern version of the sphygmomanometer was invented by Samuel Siegfried Karl Ritter von Basch in 1881, and Italian physician Scipione Riva-Rocci improved upon it by producing a more easily used version in 1896. After that, Harvey Cushing encountered the device during a visit to Italy in 1901 and popularized its use in the U.S. after he returned home. What all this means is that measuring blood pressure as a “vital sign” in virtually all patients did not become routine until about 100 years ago. Even more amazing, it was roughly another 20 years before the method described by Russian physician Nikolai Korotkov for measuring diastolic as well as systolic pressure became widespread, meaning that the familiar systolic/diastolic blood pressure reading didn’t become routine practice until the 1920s.
Hmmm. Maybe I could be Harvey Cushing 200 years before Harvey Cushing was in his prime.
In any case, if there’s one thing this little thought experiment has done for me, it’s to make me realize how much of what I do depends on hundreds of years of history and science, and that any achievements I may have in my career rest squarely on the shoulders of giants. I’ll take another example from my very own specialty: the “god” of modern surgery, William Stewart Halsted, who practiced surgery from the late 1870s until his death in 1922. During that time, he pioneered so many surgical advances that it’s amazing one person could have come up with so many ideas and pushed the envelope so far.
In imagining myself 300 years ago trying to replicate such advances more than 150 years before Halsted, I have to think about the sorts of resistance he encountered. For example, when he was working at Bellevue Hospital in New York, Halsted saw Joseph Lister present his discoveries. Inspired by Lister, Halsted tried to convince hospital administrators to construct a sterile operating room. The hospital refused, because it would have been very expensive and was considered at the time to be “wildly innovative.” (The administrators, I suspect, didn’t mean that in a good way.) So Halsted used $10,000 of his own funds to erect a tent on hospital grounds. He equipped it with maple floors, gas lights, and sterilization facilities. It was, in essence, the first planned sterile operating environment in the U.S., possibly in medical history. Ironically, Halsted also pioneered the use of sterile rubber operating gloves not for the sake of asepsis but because his fiancée (his operating room nurse at the time) complained that the antiseptic mercuric bichloride had caused dermatitis on her hands. Given how much difficulty physicians and scientists like Halsted and, before him, Ignaz Semmelweis and Louis Pasteur had convincing a scientific world that was far better primed for their discoveries, imagine how much difficulty I would have convincing the physicians and scientists of 1713 of mine. I can’t help but think that if I were dropped in Europe in 1513 (Martin’s original idea of traveling 500 years back in time), I might be introduced to the Tribunal of the Holy Office of the Inquisition, depending upon where in Europe I ended up.
Of course, Halsted was also often rather adventurous (some might say overly so) in his practice. For example:
In 1881, while visiting the family home in Albany, New York, he performed the first emergency blood transfusion, drawing his own blood and injecting it into his sister’s bloodstream when he found her passed out from a postpartum hemorrhage. The next year he performed emergency gall bladder surgery on his mother, performing the operation on her kitchen table — administering ether to knock her out, sanitizing his hands and equipment by dipping them in carbolic acid, then slicing into her abdomen and gall bladder, draining vast amounts of pus and removing seven stones — the first recorded surgery to remove gallstones.
Obviously, both of these emergencies could have ended disastrously. Halsted’s sister could easily have died from a transfusion reaction, given that virtually nothing was known at the time about immunology or blood typing, and his mother could easily have died due to her home surgery, although given the lack of antibiotics she would almost certainly have died without treatment. No doubt Halsted knew that.
How I treat breast cancer also depends upon hundreds of years of painfully gained advances in scientific knowledge and is (usually) the best that can be achieved at the time. This inevitably brings me back to the example of Halsted, who performed the first radical mastectomy in 1882. It was a long, extensive, and bloody operation for the time. In talks I have given on the history of breast cancer surgery, I like to point out that, early on, Halsted’s operation was viewed as butchery. However, as his survival data matured and it became clear that his survival rates were better than those of other surgeons operating at the time, his method became the standard of care for the next 80 years. Yes, it was brutal surgery, but it made sense at the time. When Halsted operated, there was no chemotherapy. Even though Emil Grubbe had demonstrated the first use of radiation to treat cancer in 1896, radiation therapy was crude, and the concept of adjuvant radiation therapy was decades away. If breast cancer was going to be cured, surgery alone would have to do it, and, given the scientific understanding of cancer at the time, it made sense to perform a radical resection of the breast and all the lymph nodes to which the cancer might drain. Later scientific discoveries and developments rendered such radical surgery unnecessary, and it can be argued that it persisted as the standard of care too long after less disfiguring operations became possible. Even so, it makes no sense to criticize the surgery as barbaric when it was the best available at the time and was consistent with the then-current science.
Finally, this fun little thought experiment brings me back to the present through the lens of history. Critics of SBM, in particular those who advocate what we like to call unscientific medicine or pseudomedicine, often claim that today’s treatments for cancer will, 50 or 100 years from now, be considered barbaric and unscientific. This sort of idea is sometimes expressed in pop culture as well. Indeed, one is reminded of a scene in Star Trek IV: The Voyage Home (whose plot, not coincidentally, involves the crew of the Enterprise traveling back in time to San Francisco in 1986). In that scene, some of the crew find themselves in a hospital of 1986, and Dr. Leonard “Bones” McCoy is horrified at dialysis, which he likens to the Dark Ages.
Later in the scene, he likens a debate between two residents about radical chemotherapy to the “goddamned Spanish Inquisition.”
Did I ever mention, by the way, that Dr. McCoy was one of my inspirations for becoming a doctor?
Of course, this movie scene is a perfect way to end this post. In ST:IV, the time difference is 300 years, the crew of the Enterprise having traveled from Stardate 8390, which corresponds to the year 2286, back to the year 1986. Perfect! That’s pretty much the same as my traveling 300 years back in time to 1713, at least for my purposes. I’ve tried to give you a flavor of how different that world was in terms of medicine and how out of place my skills would be there. All that I know about medicine and all of my skills as a surgeon and scientist rest on advances in anatomy, chemistry, physiology, and the rest of science that have been painstakingly made, mostly over the last 300 years—actually, mostly over the last 150 years. So it’s not at all unreasonable to guess that, to a fictional doctor from nearly 300 years in the future, our medicine would seem as primitive as the medicine of 1713 seems to a physician like me today. However, disparaging such medicine as primitive and brutal is misguided. The medicine of 1713 was brutal and primitive because the science didn’t yet exist to make it otherwise. A physician traveling back to that time would discover just how true that is, and how difficult it would be to convince his peers of that era how much better his science and treatments are.
In other words, even if it is true that the science-based medicine of 2313 will be as far ahead of the science-based medicine of 2013 as the SBM of 2013 is ahead of the best medicine of 1713 (or as the fictional SBM of 2286 is ahead of that of 1986), that’s beside the point. We don’t know the science that will inform the medical advances of the next 300 years any more than the physicians of 1713 had a glimmer of the advances and discoveries that would be made over the 300 years that followed. SBM is always changing and adapting in response to new scientific discoveries. The process is often slower and messier than we would like, but it does continue. The SBM of today will give way as new discoveries are made, and it is not at all shameful if medicine even 50 years from now is very different from what it is today. After all, medicine now is much different in so many ways from what it was in 1963, and that doesn’t make the physicians of 1963 incompetent. Like us, they were products of their time.
Finally, it occurs to me that the way medicine was practiced 300 years ago here in the colonies bears a strong resemblance to many “alternative” therapies. Medicines back then were derived almost entirely from herbs or animal products, often administered as the unpurified herbs or animal products themselves. Germ theory was at least 150 years in the future, and disease was thought to be due to imbalances that very much resemble alt-med’s concept of disturbances in the flow of qi. While I might be very much a fish out of water in 1713 as far as trying to practice medicine and surgery goes, I suspect many “alternative” medicine practitioners would not be, and that’s the difference. SBM changes substantively as science advances. In “complementary and alternative medicine” (CAM), the ideas remain the same; only the names change.