Showing posts with label medicine. Show all posts

Wednesday, December 6, 2023

The Courage to Change a Broken World

1001 Books To Read Before Your Kindle Battery Dies, Part 115
Tracy Kidder, Mountains Beyond Mountains: The Quest of Dr. Paul Farmer, a Man Who Would Cure the World

Paul Farmer, Harvard-trained MD, had Haitian friends in his youth, so traveling to Haiti seemed the natural choice. When he arrived, he discovered a kind and magnitude of poverty that he couldn’t believe still existed within the United States’ sphere of influence. Moved by the plight of the suffering, he made Haiti his life’s work, teaching and treating patients part of the year at Harvard, so he could dedicate most of the year to Haiti.

Journalist Tracy Kidder discovered Farmer’s medical mission while reporting on the American invasion of Haiti in 1994, an invasion undertaken for supposedly benevolent purposes, to restore Haiti’s elected government. Kidder encountered Farmer because Farmer didn’t hesitate to name the discrepancies between the American mission, and what Americans actually did. Farmer’s confrontational style forced American diplomats to examine their choices. It also forced Kidder to question his own motivations as a journalist.

The Paul Farmer whom Kidder describes grew up relatively poor and rootless in the American Southeast. Trailer park denizens were “his people.” But he was also a hard worker, a speedy student, and an amiable, gregarious personality. He made friends and connections with an ease that might make others jealous. He maintained friendships with wealthy patrons and university peers, but also learned to speak Haitian Kreyol fluently, and won trust among the country’s chronically exploited peasantry.

This isn’t so much a biography, as a Boswell-like immersion in the subject’s life. Kidder follows Farmer to his free clinic in Haiti’s Central Plateau, among the poorest places in the Western Hemisphere. Farmer and Kidder walk along unpaved switchback roads through steep valleys, delivering medications to agrarian peasants who consider Farmer a literal wizard. Kidder, more than a decade older than Farmer, often struggles to match Farmer’s breakneck pace and athleticism.

Although Haiti was the Western Hemisphere’s second nation to overthrow European colonialism, it never achieved the prosperity or stability of its older cousin, the United States. American slaveholders couldn’t abide a nearby neighbor populated by rebellious slaves, and the United States withheld recognition of Haiti until after its own Civil War. Throughout the Twentieth Century, Haiti served as a proxy battlefield between America and France for hegemony over the nominally democratic world, a battle that propped up anti-democratic strongmen like the Duvaliers.

Dr. Paul Farmer

Farmer, moved by the peasants’ plight and the social mores of his Roman Catholic upbringing, saw bringing medicine to Haiti as his personal mission. But by “personal,” he didn’t mean “individual.” Kidder herein profiles several prominent allies, including Ophelia Dahl (daughter of Roald) and Jim Kim, who participated in his mission. Farmer’s organization, Partners in Health, managed to parlay several bouts of good luck into a longstanding project. These elements included wealthy benefactors, and Farmer’s own medical celebrity.

Two widespread diseases in Haiti, which dominated Farmer’s early career, were drug-resistant tuberculosis and AIDS. Though neither disease originated in Haiti, mean-spirited Northern PR spun those diseases together with Haitian identity in the world’s imagination. This meant that Partners in Health, backed with a comfortable but not large endowment, became a global leader in TB and AIDS treatment. Farmer’s mission suddenly went from regional and personal, to global. Farmer got dragged along with it.

Kidder was present to watch much of this sudden growth. He set out to write about Farmer’s work in Haiti, but during the writing, he accompanies Farmer on trips to Peru, France, Russia, and elsewhere. Kidder chronicles Farmer’s transition from a simple Haitian country doctor—something Kidder quotes Farmer saying he only wanted to be—to a transnational medical diplomat, visiting medical conferences in Europe and North America, and treating TB patients in Latin American slums and Russian prisons.

Throughout, Farmer is driven by his personal Christianity. Though Farmer eschews doctrine, and shows impatience with formal theology, he heeds the Gospel’s message of feeding and comforting “the least of these.” Kidder repeatedly quotes Farmer citing a “preferential option for the poor,” a term from liberation theologian Gustavo Gutiérrez. Among Farmer’s allies is a former Salesian priest, Jean-Bertrand Aristide. Farmer embodies the self-effacing service described in the Gospels, and too often ignored by many First-World Christians.

This book shipped in 2004, as Farmer’s celebrity reached new heights, and describes a man with benevolent goals still before him. Nearly two decades later, in 2022, Farmer died unexpectedly on an outreach visit to Rwanda, his desire to retire to his inland clinic unfulfilled. This biography testifies to the good which First-World citizens can achieve, if we channel our privilege toward those in need. When we use our advantages to serve others, the world changes around us.

Wednesday, November 18, 2020

Vaccines, Economics, and Individualism


“Imagine living in a world so spoiled by lack of serious illness,” the message board statement read, “that we forget the benefits of vaccines and start to reject the same vaccines that brought us to this point.” If anyone asks, I tell them I avoid anonymous Chan-style discussion boards, because they encourage ignorance, self-righteousness, and boorish behavior. But I keep coming back, because sometimes, like here, somebody says something so right, it deserves further amplification.

As I approach two weeks without face-to-face contact with another human being, COVID-induced cabin fever reducing me to thoroughly unproductive jelly, this message seemed especially pointed. We Americans, alongside the British, the Indians, and the citizens of a few other highly populous countries, continue struggling with the balance between liberty and health. What form of unfairness do we, collectively, consider acceptable? Which moral precept do we value higher? And exactly how much do we even exist, as separate individuals?

These questions don’t matter merely in the abstract. They also matter because, as the original comment observed, we didn’t achieve the level of widespread health which humanity now enjoys without many sacrifices along the way. Vaccines stopped the spread of once-virulent diseases like polio and smallpox, and all but eliminated non-contagious killers like tetanus. Penicillin became the first of several wonder drugs. Even handwashing, once deeply controversial, became a lifesaver. Living well past age thirty became commonplace.

Now a noisy minority, deaf to history’s lessons, is determinedly kicking down the ladder its ancestors painstakingly ascended. Vaccine denial is just one component of this. Science denial has become a hallmark of wealthy modernity. Widespread anecdotal reports of patients denying COVID-19 exists, even as they’re dying of it, are just the latest manifestation. Faced with concrete, painful evidence that the way we’ve always done things doesn’t work anymore, some people just deny.

What worries me most, though, is that this problem isn’t unique to science. Reading the pro-vax comment which got me thinking in this direction, I realized I’d seen it elsewhere, almost verbatim. Gar Alperovitz and Chuck Collins made the same argument regarding economics. Post-WWII, Americans enjoyed a period of unmatched prosperity and security, made possible by government subsidies and public-private cooperation. Then we destroyed the network, convinced it provided others with an “unfair advantage.”

Admittedly, this prosperity and security wasn’t universal. The New Deal and its Eisenhower-era successors had racism baked into their structure, and the benefits accrued mainly to White Americans. This applied to health equally: Black and Latinx Americans have always had more limited access to preventative medicine, like vaccines, and are more likely to live downwind from smoke-belching factories than White Americans. But in the aggregate, Americans enjoyed government-backed prosperity, right up until we destroyed it.

COVID-19 provides a microcosm of why we seemingly can’t solve persistent problems in America. Just as a lead smelter cannot discharge waste into the air and water and assert that “it’s my property,” I cannot exhale into the shared air and claim “it’s my lungs.” It clearly isn’t mine alone. Yet the belief that permeates American politics and economics, that we’re all disconnected individuals drifting along, separated from others, is disproved by a contagious disease.

Certainly there are areas where we exist as separate beings. If I enjoy science fiction, and you enjoy romance novels, let’s just read our respective books privately and not bother one another. But that doesn’t apply to circumstances that clearly belong to the community. Our society’s wealthy and powerful, who can afford to purchase legislators at fire-sale prices, have looked at centuries of collective stewardship of shared resources, and decided: nah. Kick the ladder down.

Humanity has achieved widespread levels of health and prosperity when we’ve acknowledged our shared responsibilities. When we admit I have a responsibility to get vaccinated against smallpox, not only so I don’t contract it, but also so I don’t transmit it to others. When I have a requirement to steward my land, not just because it’s mine, but because the invasive species which flourish here spread seeds elsewhere. When ownership derives from trust, not arrogance.

Some people look at their health, prosperity, and comfort, and think: I did this. Certainly, these individuals contributed to it, since heedless people tend to squander money and get sick from needless risks. But to believe you, individually, created your comfort, without any contribution from those around you, or before you, shows profound ignorance of history. If we inherit the dividends of others’ efforts, we inherit the responsibility to keep fighting their good fight onward.

Monday, May 4, 2020

On Miracles and the Miraculous

Christ Healing the Mother of Simon Peter's Wife, by John Bridges, 1839

Donald Trump loves miracles, apparently. He keeps invoking the name of the miraculous in combating the COVID-19 crisis. “One day it’s like a miracle, it will disappear,” he said in February. “It was like a miracle,” he said in April, speaking of hydroxychloroquine treatment. More ambitious media creators than me have made extensive B-roll clips of President Trump citing the word “miracle” to deflect the coronavirus threat.

Clearly, to Trump (who, though nominally Presbyterian, has apparently seldom darkened any church door as an adult), miracles mean something which suddenly, abruptly, solves a problem, with minimal explanation. Miracles are, essentially, an on-demand quick fix for difficult, dangerous problems with grim political implications. Trump’s willingness to expect miracles got me thinking: as a left-leaning Christian myself, what do miracles actually mean?

To answer this question, let’s first set politics aside. President Trump’s appeals to miracles seem, superficially, to resemble a political panacea. But Jesus didn’t heal lepers or raise Lazarus from the tomb to excuse them from social consequence. When he healed the bleeding woman who touched his garments, that woman still probably needed to undergo temple rituals to return to Jewish life. Jesus didn’t just make problems go away.

Miracles happen when some important problem changes direction. Jesus’ most commonly listed miracles involve healing sickness, which to his Jewish audience meant driving away social stigma, since Temple-era Jews perceived physical disfigurement as an outward sign of God’s judgement. But Jesus also drove out demons, raised the dead, controlled nature and the weather, and ultimately was resurrected himself. These were all considered miraculous.

Returning to the bleeding woman, I find something important. This miracle ranks so high, it appears in all three synoptic gospels: Matthew 9, Mark 5, and Luke 8. The healing has a two-part formula: the woman first believes that God through Christ will heal her. That is, the woman places trust in authority outside herself. But then Christ affirms: “Your faith has made you well.” So she trusts God absolutely to heal her, then God’s Son affirms that trust did the healing.

Many mass-market healers, like Joel Osteen and Benny Hinn, place great emphasis on the “Your faith has made you well” aspect. To them, human belief is the precursor to miracles. Believe hard enough, the televangelists proclaim, and healing is yours. Like money or power, televangelist theology makes wellness part of the “name it and claim it” philosophy. To them, human faith is the first mandatory component of miraculousness.

Jesus as a Young Jew, by Goya
This isn’t completely unfair. Jesus, in the Gospels, tells the faithful to expect miracles, and they’ll appear. When four men lower a paralytic through the roof for a healing, Jesus says their faith makes the healing possible. You cannot have miraculous healings without first believing miracles are possible. So the televangelists, much as I dislike their self-serving tendencies, have something biblically solid there.

Yet something happens even before their belief. The bleeding woman, according to Mark, spends years and fortunes on doctors, hoping for treatment. Jesus meets the ten lepers in the wilderness, because they’ve been exiled from society. The hunched woman and the paralytic at Bethesda both suffered for years, pushed to society’s margins, because nothing they’d done could improve their situation.

Nothing they had done.

Jesus performs miracles upon these suffering people only when they’ve exhausted all human remedies. When everything possible within Iron Age medicine had failed, and the Jews believed the only remaining explanation was God’s judgement, these patients surrendered hope in human action. They turned their hope outside themselves, trusted that the universe’s order (which we call God) could provide healing, and sent their prayers toward that order’s human incarnation.

Let me restate that: miracles happen when we turn outside ourselves. When we stop seeking inside our egos, our sense of ourselves as individuals mastering life’s unpredictable circumstances, only then do miracles become possible. Even to Jesus Himself, miracles are the action which happens when doctors, priests, and kings have failed to produce results. The universe restores its internal order only after humans have exhausted every reasonable attempt.

By making miracles his first recourse, Trump front-loads the entire process. He makes miracles the first option, rather than the last, and therefore makes miracles, and the God who performs them, beholden to human ego. This happens whenever humans expect, even demand, miracles. It removes God’s action entirely from the two-part formula, and makes reality subordinate to human arrogance.

Miracles cannot be bought or sold. Kings who simply expect them must, like Herod, eventually fall.

Friday, April 3, 2020

Coronavirus, Christianity, and the Great Stupid Rebellion

Reliquary of St. Cyprian
I remember learning, in Confirmation Class, that Christianity passed from an outcast religion to the spiritual mainstream in the middle Third Century AD. Once oppressed by the Empire, because it preached that power came from somewhere other than Rome’s blessing, Christianity became first acceptable, then widespread; then, by the Fourth Century, it became the Empire’s official state religion. What, we awestruck middle schoolers all wondered, had happened?

The conventional Christian explanation is: Cyprian’s plague. Named for the Carthaginian bishop who left the most detailed accounts, this poorly understood disease wracked the Empire for thirteen years, leaving so many dead that the Empire’s two economic drivers, the military and agriculture, both nearly collapsed. Yet somehow the Empire survived, and Christianity had multitudes of new converts. Supposedly, outsiders were impressed by Christians’ handling of the plague.

When imperial bureaucrats, the aristocracy, and priests of the state religion fled, Christians brought blankets, food, and clean water into the infection zones. They believed (according to the conventional accounts) that, if Jesus touched lepers and menstruating women, two major classes of unclean people in His era, they shouldn’t fear touching plague patients. Some Christians, according to Cyprian, contracted the disease. But survivors credited Christians with their survival.

This lesson struck me during recent news of how American congregations have handled the COVID-19 outbreak. We’ve heard news of megachurches holding packed services, risking hundreds, even thousands of lives. Though it’s too early to have consistent statistics, regional reports suggest church may be one of America’s largest vectors of infection. People are dying needlessly in the name of God.

In fairness, I probably understand these motivations. These Christians believe their faith will protect them from suffering, and if they become sick they will, as described in Cyprian’s sermons, become ennobled and saintly. Avoiding church, for these Christians, suggests lack of faith in God’s healing principles and the life made new by Christ. They refuse to live in fear; some, I suspect, yearn to become martyrs to the cause.

Yet I think this misses the lessons St. Cyprian taught. Those early Christians didn’t become martyrs because they didn’t fear death; they weren’t holy because they kept attending church. They became holy because they braved the outside world, the exact opposite of what the rich and powerful did. Today, news trickles in that the rich are fleeing us peons, fearing for their lives. Christians have an opportunity to flow in the opposite direction.

Instead, Christians, especially White Christians, are congregating to publicly display their fearlessness, mainly to each other. Like drag racers and amateur daredevils, these would-be modern martyrs mostly seek one another’s approbation and glory. They aren’t rushing into short-staffed hospitals with blankets and clean water, as Cyprian’s Christians did; if anything, they’re denying the reality of the world to receive a stained-glass version of this world’s glory.


I’m reminded of the German Peasants’ Revolt of 1524. Inspired by Luther’s Reformation, German citizens rebelled against the Holy Roman Empire, and also against the Roman church. As in Rome, Germany had a state religion, and church and state conspired to maintain social order, with themselves on top. Citizens demanding government reform, plain-language church services, and “communion in both kinds” rose up in arms against these twin tyrants of power.

Most important for our purposes, these rebels believed that, having God’s implicit blessing, they were impervious to violence. They believed that swords could not cut them, that bullets (guns were a new and terrifying technology) couldn’t pierce them. To nobody’s surprise, they were wrong. So many peasants died brutally that the Reformation nearly ended, and Martin Luther, himself still under imperial ban, condemned the uprising in the harshest terms.

Today’s megachurch Christians apparently think themselves heirs to St. Cyprian’s noble martyrs, refusing to fear the disease wracking the land. But if their faith doesn’t motivate them to comfort the suffering, while fighting the unjust powers that allowed this disease to fester, they belong to a different class of Christians. They’re closer to Thomas Müntzer’s doomed Radical Reformation. And if they don’t change the path they’re on, they’ll die just as uselessly.

St. Cyprian gives Christians a model to follow. Multiple sources, both Christian and secular, have reminded us recently that Cyprian’s Plague was a massive turning point, for Christianity specifically and Western civilization generally. But Cyprian’s Christians were brave and holy, not foolhardy. When I see megachurch congregations admiring themselves for fearlessness, I don’t see sacred bravery, I see stupidity. Sometimes, God lets the stupid just die.

Thursday, December 19, 2019

Doctors vs. Accountants©: Part III

Alexander Fleming, discoverer of penicillin
I began writing this series with the best intentions. Affected by a single, emotionally raw tweet, I simply wanted to express why the most common proposal to remedy insurance bureaucracy, nationalizing American healthcare, will only put new bureaucrats in charge of the same system. I’m no fan of private insurance, which privileges corporate interests over the patients it’s supposedly organized to serve. But even less do I like bureaucracy.

As I’ve attempted to clarify my positions, however, I’ve apparently made things even murkier. A dear friend, recently diagnosed with a rare and painful genetic disability, praised my thoughtful responses, but replied, “it feels like you’re saying that people like me are going to be screwed no matter what.” This criticism pierces my heart, because I struggled with this very thought while writing, fearing I was condemning my friend, and people like her, to a lifetime of suffering without remedy.

The more I thought about things, however, the more I’ve realized this isn’t necessarily true. Yes, advances in medical technology have meant people once doomed to early, painful deaths can now live longer, do more, and thrive abundantly in ways once unthinkable. And as I wrote earlier this week, that technology is cripplingly expensive, meaning some bean-counter somewhere needs to make life-or-death decisions about who gets priority access.

But, conversely, the most important advances in medical history have been neither high-tech nor particularly expensive. Scottish microbiologist Alexander Fleming discovered penicillin accidentally, after leaving a petri dish unattended. Louis Pasteur, famous for developing a rabies treatment, and Edward Jenner, who pioneered vaccination, worked in similarly low-tech environments, establishing practices that, today, aren’t very expensive or difficult to acquire.

Louis Pasteur, discoverer of multiple medical procedures
Even my friend’s diagnosis of Ehlers-Danlos Syndrome, a rare connective tissue disorder, didn’t come through expensive, specialized innovations. Her doctor simply listened to her symptoms, compared them to known research, and referred her to a geneticist. It took years for her to reach the point where a physician would trust her enough to simply listen, but I suggest that’s a whole other problem.

So let me suggest a remedy.

Let’s return to a source I cited at the beginning of this series, Princeton economic historian Jerry Z. Muller. He writes that doctors and accountants have always sparred over treatments: doctors often want to pursue heroic lifesaving measures, even when these measures are expensive, and divert resources from a larger pool of deserving patients. Accountants, by contrast, strive to keep costs down, sometimes despite a patient’s treatments being both affordable and cost-efficient.

This ordinary conflict becomes exacerbated by both private insurance, and nationalized health-care, because the accountants responsible for the economic side no longer work within the medical treatment environment. They occupy external positions in corporations and governments, making decisions based on Xeroxed checklists, without consulting the doctors, and without the doctors having any avenue of appeal. In other words, the problem isn’t accountants, it’s their bureaucratic framework.

Professor Muller suggests a straightforward response to this problem: trust those intimately familiar with the problem, to also have intimate familiarity with the solution. Muller doesn’t make this suggestion only for health care. Teachers know better than corporate standardized-test writers whether their students are prepared to graduate. Farmers know better than ConAgra or Monsanto how to husband the land. Doctors and on-site accountants know how to treat patients.

Edward Jenner, discoverer of vaccines
My friend’s Ehlers-Danlos diagnosis took twenty years, but I propose the problem isn’t the doctors. I witnessed several, though not all, of her interactions with medical professionals, and I feel confident in saying they rushed through the diagnosis process, asking checklist questions and assigning her a pharmaceutical diagnosis, because that’s all that insurance would pay for. To cover operating expenses, doctors need to rush their patients, lest they go broke.

Return, please, to the image I first conjured: Doctor House struggling for days, weeks even, over one patient. Staying up all night consulting medical journals and concordances. Working to parse the diagnosis. That’s not realistic for most patients, certainly, or their bills would run into the millions of dollars. But imagine if doctors, and their on-site accountants, had time to converse with their patients before settling on whatever diagnosis the insurance bureaucrats will cover.

I suggest my friend might’ve gotten her diagnosis years ago, if the bureaucracy hadn’t held the process hostage for profit. Government bureaucrats, cognizant of the next election and pot-luck outrage over “taxpayer dollars,” will do no better. We need to trust skilled professionals, intimately familiar with their field, to make these decisions. The solution to bureaucracy isn’t reshuffling the bureaucrats; it’s removing them entirely.

Monday, December 16, 2019

Doctors vs. Accountants©: Part II


Last week, I wrote that current attempts to expunge for-profit influences from American medicine are doomed because that simply involves reshuffling the medical bureaucracy. I came in for some mild criticism, suggesting I missed the point. “If we funneled a lot of the money that we put into welfare for corporations into helping people get the medical treatment they need,” a good friend wrote, “we could reduce a LOT of the bureaucratic harmful determinations.”

I understand the appeal of this argument, but I don’t believe it’s true. Yes, I’d love to see some of America’s massive federal subsidies to, say, hydrocarbon mining and tech giants, redirected to serving ordinary citizens’ interests. Certainly we need to stop using public funds to protect the wealthy while ordinary people lack access to common-pool resources. But I’m not sure that’s what’s really happening in medicine. It’s something more subtle and insidious.

All economies are structurally organized around some resource that forms a bottleneck. Something that’s scarce and desirable forms the foundation of every economy. Whether it’s gold, or land, or human labor, every economy needs something that exists in finite amounts; the medium we use to value and exchange that resource, we call “money.” The key to unlocking and reforming an economy lies in identifying that bottleneck resource.

What, in medicine, is the bottleneck? Two or three years ago, I might’ve said “human labor.” Because medicine is highly specialized, and requires years of technical training to begin a career, doctors and nurses will always be in finite supply. And while for-profit hospitals can outbid for the services of experienced, highly-skilled professionals, access to quality, affordable medical care for numpties like me will always founder on money.

But recent news makes me question this assumption. To give just one example that’s gained recent news-cycle traction, we’re also facing a potential shortage of Earth’s helium supply. When I say “helium,” you possibly think of party balloons and silly voices. But much highly specialized technology, including, say, MRI machines, relies upon helium, which is so scarce on Earth that we risk running out in about 35 years.

Therefore we must make hard, value-laden decisions about allocating our finite helium and other nonrenewable resources. Simply turning free markets loose sounds desirable to strict libertarians: if doctors need helium more than party planners, let them outbid. But forcing doctors to compete in resource auctions will increase already-staggering prices, ensuring that ordinary people can afford medical care even less than we do now. Some authority needs to intervene.


One cannot overstate the importance of scarcity. Those of us old enough to remember the Clinton Administration’s attempts to reform health care, derisively dubbed “Hillarycare,” will recall conservative anger at the attempt to nationalize one-seventh of the American economy. But by the time Obamacare rolled around, that number had risen to one-sixth. The cost of medical care is growing faster than the overall economy, apparently.

Somebody has to make life-or-death decisions about allocating limited resources. Conservatives want private corporations to make those decisions, while progressives trust the altruism of the state. In both cases, some poorly-paid underling with looming deadlines will have to make decisions, about which they might be under-informed or even ignorant, based on one-size-fits-all metrics handed down from authority. That’s the living definition of bureaucracy.

Not only is medical technology limited and expensive, so is skilled labor. So is physical space for providing patient treatment. So is decision-making to determine who receives treatment: a Dutch study found doctors widely agree that older patients are under-treated compared to younger patients, while the American Medical Association prizes “duration of benefit,” which privileges the young. So, despite our best efforts, we’re still asking bureaucrats to make important decisions.

Though I recognize the limits of a layperson’s thought experiments, I cannot separate bureaucracy from medical treatment in a capitalist structure. Western hospitals grew out of Christian religious orders, notably the monastic hospitals of the Crusades, and hospital treatment was initially a form of religious devotion. Perhaps we could separate money from medicine if we recaptured this sense of spirituality, but how? Money changes people’s values, and the change usually can’t be reversed.

Please don’t misunderstand: I dislike the status quo greatly, and believe it abuses poor and minority people in its lopsided distributions. But, unless we remove all resource bottlenecks, which cannot happen if we want high-tech treatments available, I can imagine no remedy that isn’t just another reorganization of the bureaucracy. Whether you prefer state or corporate bureaucrats more, they’re still bureaucrats. And we’re still running out of helium.

Tuesday, December 10, 2019

Doctors vs. Accountants©: the Role-Playing Game


I just encountered another of those audience-grabbing stories about insurance companies failing to provide medical coverage. The bean counters apparently denied somebody’s necessary life-saving medical care. I understand the outrage this story causes, because we all imagine ourselves, with years ahead of us, suddenly facing mortality because an actuary somewhere said “no.” Putting ourselves in those shoes, the prospect seems horrific.

Is it, though? Princeton economic historian Jerry Z. Muller writes that, while medical metrics are frequently overused in ways that undermine doctors’ autonomy, that isn’t always bad. It certainly can be, when insurance executives who don’t understand medicine overrule a doctor’s opinions based on shoddy math. But throughout medical history, Muller says, hospitals have hosted tension between doctors who want to take heroic life-saving actions, and accountants who tally the costs.

Generations of TV medical dramas have convinced laypeople that medicine consists of earnest, energetic professionals making split-second decisions while lives hang in the balance. This might make sense in Emergency Room conditions, where people come in broken and bleeding, staving off burst appendixes and suppurating aneurysms. But most medical care is slow, deliberative, and costly. Asking whether continued costly treatment will have meaningful outcomes isn’t always unreasonable.

The model we’ve all seen in stories like House M.D. looks exciting, dangerous, and fun. Somebody enters the hospital with a twitching thigh muscle, and the glamorous doctors, who have only one case, piece together clues proving how an aortic dissection threw a clot to the brain, resulting in testicular obstructions: medicine as logic puzzle. I might’ve paid better attention in middle-grade biology had I thought I’d get jobs like that.

But an NPR human-interest story broadcast at the peak of that show’s popularity traced the costs of just one episode, landing on a $300,000 price tag—and that’s a conservative number, because I added up their annotated costs and got something far higher. And that was nearly ten years ago; advancing technology and added administrative bureaucracy probably mean it’s far higher now. Even under the best circumstances, medicine is expensive.

The early-seasons core cast of House M.D.

Let me interrupt myself here and note: I don’t mean accountants should be more active in scaling back costs and denying medical care. A good friend recently received a medical diagnosis that mercifully ties all her disparate symptoms together. She should’ve received this diagnosis twenty years ago, but overworked, underfunded doctors made hasty short-term determinations, almost certainly rushed along by bean-counters. A little more time and money could’ve saved years.

So yes, I acknowledge that there’s no simple, arithmetic formula to strike a balance between the accountants and the doctors, whose desires often conflict. And the arena of that conflict is a patient’s body. The outcome, as in the story above, can be tragic for individuals—but downright mandatory for the medical economy overall. Anybody hoping to solve this problem concisely should also wish for a pony, because it’ll do as much good.

Because, let’s be honest, when people complain about private insurance’s interference in medical decisions, their solution is often to nationalize medical care, to a greater or lesser degree. Yet as America’s political Left pushes to institute Medicare For All, Britain’s political Right is actively considering privatizing the NHS, the very system American progressives often brandish as a model of more efficient medicine. The British Right wants privatization because the NHS isn’t much better.

British medical journalist Dennis Campbell, writing in the Guardian, a newspaper with undisguised Leftist allegiance, notes that NHS bureaucrats regularly overrule doctors’ opinions. Campbell cites a panoply of reasons bureaucrats do this, but buried in this list, he names “resources”—a weasel word meaning “money.” So the NHS, like America’s private insurers, overrules doctors’ wishes to keep costs down and funnel money where it’ll do the most good.

If we expect a constant stream of on-demand, high-tech medical treatment, we’ll inevitably run into the impediment that all resources, including money, are finite. Somebody needs to make decisions about who gets costly, invasive treatment. That “somebody” will inescapably be a bureaucratic scapegoat whose official functions, whether funded by the state or private capital, will conflict with life-saving desires. Somebody’s life will take priority over another’s.

The system is heartless to individuals. I’m sure that woman whose life-saving cancer treatment got denied is suffering greatly. But trading corporate bureaucrats for state bureaucrats won’t solve anything. Unless we abjure high-tech medicine, which will never happen, we’ll always have to make finite resources cover infinite needs. That’s what bureaucracy does. Sometimes, that means making painful decisions and letting human lives go.

See also: Doctors vs. Accountants©: Part II

Monday, March 11, 2019

Two Modern Dogmas

Back when I worked at the factory, I remember a co-worker approaching me at breaktime, holding a bottle of 7-Up he’d just bought from a vending machine. My co-workers often treated me as smart and authoritative because I have a good head for memorizing facts, and I’d grown accustomed to answering whatever questions might arise during the day. This guy pointed to some words printed on the label. “It says here ‘All Natural,’” he said, smiling smugly. “That means it’s good for you, right?”

Part of me hated to let the poor guy down, but not enough to shut me up. “You gotta be careful,” I said, “that’s one of the ways they fool you. They depend on you to think that way. But there’s no legally or scientifically binding definition of the word ‘Natural.’ You have to do more research than that to keep healthy.”

I recalled that exchange, and my co-worker’s crestfallen face, during a more recent disagreement. My friend, whom I love and respect, nevertheless said something I completely disagree with: that America can never abandon high-tech farming, despite its lousy environmental impact and its grotesque overproduction of food that often gets landfilled, because if we do, we’ll run out of affordable food, and the poor will suffer.

My friend is a big believer in GMOs and their potential to produce healthier, more abundant food. This despite the fact that they haven’t done so, and most GMOs have proven to be more expensive, more bland-tasting, and generally more disappointing versions of existing food crops. In the unlikely event my friend reads this essay, I’m confident he’ll feel motivated to defend his existing positions by asserting I’m just an enemy of “science.”

So we have two conflicting attitudes, which I’ve encapsulated in two people I know personally, though I’ve seen both repeated by other people and in mass media. On the one hand, we have the belief that “natural” means accord with human needs, and a general tinge of moral goodness. On the other, the belief that “science” is a humane progress through layers of understanding to the light of secular salvation.

Both attitudes are wrong.

The belief in nature’s goodness, as a sort of countercultural push against the heedless embrace of technology, is so completely mistaken that it has its own name: the Appeal to Nature Fallacy, sometimes called the 100% Natural Fallacy after a popular brand of breakfast cereal sold in the 1980s. From this fallacy, we get quack medicine, homeopathy, herbal “medicines,” and gullible people drinking their own pee.

The opposing belief, that what’s created in a laboratory is superior to dirty old nature, isn’t widespread enough to have its own name. Yet it demonstrates remarkably similar willingness to trust a dogmatic interpretation of evidence. Human ingenuity gets presented as innately morally good, and scientific advance becomes an end in its own right. But it requires an equal willingness to trust an abstract conviction without question.

Charles Darwin
When my friend argues, and he has, that we shouldn’t worry about mechanized farming damaging soil fertility, because we can replace lost fertility with synthetic chemicals, he makes the exact same appeal as homeopathy: that whatever made us sick will also make us well. This maybe made sense sixty years ago, when we had less empirical evidence. We formerly had to venture into new territory without a map.

But we don’t anymore. We have abundant evidence that peach pits don’t cure cancer, trace amounts of arsenic don’t reverse poisoning, and petroleum-derived fertilizers burn the soil, making future harvests less abundant. Blind trust in either nature or science has produced serious consequences, even cost lives. Experience tells us that calling something “natural” doesn’t make it healthful. And calling something “scientific” doesn’t make it good.

I don’t write this to accuse either my co-worker or my friend personally. Both men simply want a concise, intellectually coherent explanation for today’s difficult and often inconsistent circumstances. I frequently catch myself doing likewise. Unfortunately, modern complexity doesn’t permit such prefab consistency. Failing to frequently test our dogmas against evidence serves the same impulse formerly served by religion.

Tragically, both “nature” and “science” make pretty poor gods. From tapeworms to opioids, both have a track record of turning viciously on their worshipers. It isn’t comforting to say we have to review the evidence constantly, especially in today’s environment, saturated as it is with information. Yet we have to. The era of comforting dogmas, which both men seem to be seeking, is long over.

Wednesday, August 22, 2018

How Flint, Michigan, Got Its Water Crisis

Mona Hanna-Attisha, What the Eyes Don't See: A Story of Crisis, Resistance, and Hope in an American City

Flint, Michigan’s water crisis began as a rumor—and Dr. Mona Hanna-Attisha, head of pediatric residents at Hurley Medical Center, a teaching hospital in Flint, pooh-poohed those rumors. The water’s fine, she reassured her patients. Don’t hesitate to mix it with baby formula, and certainly drink more water than soda. Until an old friend brought evidence, she refused to believe city water had more lead than a contaminated smelting plant.

Dr. Hanna-Attisha’s memoir of the Flint water crisis hit shelves just as the city made the decision to stop distributing potable water to residents whose tap water still flows brown. She makes a persuasive case that the water crisis resulted from human actions, and has human solutions. What’s more, she demonstrates the official intransigence that made this national disgrace possible. Too many powerful people keep turning blind eyes.

There is no scientifically safe level for lead in water or food. None. Even the slightest amount has lifelong health consequences once it gets inside the human body. But a leaked EPA report, which political appointees strove to bury, revealed Flint’s municipal tap water had lead contamination running sixty times the EPA’s official “action level” where regulations consider panic acceptable. Rather than fixing the lead, appointees tried to discredit the source.

When Dr. Hanna-Attisha brought these findings to Flint’s public health administrators, they offered a stunning response: water isn’t a public health concern. Take it to municipal utility people. Thus begins a bureaucratic runaround in which, even when appointed leaders acknowledge the problem exists, actual actions are somebody else’s responsibility. Apparatchiks would rather defend their shrinking administrative patches than serve public good.

Dr. Hanna-Attisha mixes personal memoir, political exposé, and history of public health concern. She didn’t come into her advocacy position accidentally. Her personal history positioned her perfectly to speak when public adversity came her way. Daughter of Iraqi Christian refugees who fled Saddam’s arbitrary purges, she inherited a passion for solving looming social problems. Her parents taught her the importance of education, and commitment to causes bigger than herself.

Dr. Mona Hanna-Attisha
Public health, as a discipline, arises because human communities have become too large, interconnected, and complex for individuals to take responsibility for their private health. Hanna-Attisha describes John Snow, the Victorian Englishman who first connected London’s frequent cholera outbreaks with improper sewage disposal. There, as with the Flint water crisis, officials refused to believe their continuing policies created disastrous health consequences downstream.

For Dr. Hanna-Attisha, public policy, private health, the managed health system, and information distribution are linked issues. She describes how efforts to control access to information kept people, including herself, ignorant of the water crisis long after accumulations of lead had measurable health impact. And the lack of coordination within the health system prevented alarms from sounding, even after science began gathering evidence. No problem happens in isolation.

The Flint water crisis didn’t just happen. Flint elected an idealistic young mayor and an activist council to offset the continuing economic drains caused when General Motors abandoned the city. Yet the state, utilizing emergency management law, stepped in, overruled the city council, and began a program of cutting financial costs, without regard for human consequences. One of the first changes was shifting Flint onto a dirt-cheap, but untested, municipal water source.

This isn’t coincidental. Flint, a majority Black city in a majority White state, had its elected city government overruled unilaterally. Dr. Hanna-Attisha points out that nearly every emergency manager in Michigan oversees a majority Black community. She compares the outcome to the Tuskegee experiments. Black Michiganders don’t know where their water, civil defense, sanitation, and other basic services come from, which Michigan is okay with, because they’re poor.

Once Hanna-Attisha becomes aware of the evidence for measurable health problems from contaminated water, she pushes public officials to do something. She initially maintains her faith in the system—after all, she’s employed by the Michigan State University system, she’s a public servant too. Only when they prove deaf to public entreaty, immune to scientific evidence, and more beholden to bureaucracy than common good, does she shift focus to strategic media appeals.

Dr. Hanna-Attisha isn’t what you’d normally consider a public revolutionary. She’s a teacher and doctor, someone who does her job for the love. But in her telling, her love of children and medicine made public resistance necessary. Reality backed her into a corner, and she responded with action. Like the best movie heroes, Hanna-Attisha was prepared to do the right thing, and she acted. That makes her a hero.

Wednesday, July 26, 2017

There Is No Opioid Epidemic

Rachel Maddow has done solemn, suit-clad journalism
on the awfulness of opioid addiction
I'm sick of hearing about the "opioid epidemic." Repeating this claim has become a sure-fire way for media professionals to prove that they're serious about fixing the problems in American society today. Mainstream journalists like Dan Rather and Rachel Maddow have done somber think pieces about opioids, while humorists like John Oliver and Adam Conover approach the problem from a satirical, but still po-faced, angle.

There's just one problem. It's a bunch of crap.

Pause a moment and think about this: what do we call somebody who can't stop taking pain medication? Somebody who continues popping pills long after it's become measurably clear that their appetite for the medication drives a wedge between themselves and their family? Somebody who would rather pop pain medications than hold down a job, have a place in the community, or pursue constructive hobbies?

Modern medicine calls such people addicts. But I suggest another name altogether: patients. A person consuming pain medications with such dogged desperation probably needs the pain to go away. Calling this person a moralistic name like "addict" implies the person has suffered a failure of ethical character, and we need to punish that person—an attitude which uses medicine and law to enforce moral values. Calling this person a "patient" lets us seek the root underlying pain.

We do need to distinguish here between addicts and recreational users. Some people take pain medication, mind-altering drugs, or prescription stimulants and steroids to enjoy the altered state of mind. Such people creep me out, but they aren't addicts; they just enjoy seeing the world from another perspective for a few hours, usually in the company of friends. As long as they harm nobody but themselves, stopping them feels pretty high-handed.

John Oliver plays the story for laughs, but
clearly has the same self-serious intent
But addicts have a completely different situation. They cannot stop taking whatever substance they're addicted to, because without it, they feel incomplete, pained... inhuman. Addicts don't want to feel good; they want to feel normal.

Consider what people become addicted to. Besides opioids, the most common addictive substances include alcohol, heroin, cocaine, and codeine. All of these are, or were at one time, prescription painkillers. Other addictive substances include nicotine, marijuana, and Valium, which all have anti-anxiety properties. Even lowly methamphetamine, the one drug I'd like to see permanently forbidden, began life as a prescription anti-anxiety medication, Benzedrine.

Besides substances, people can become addicted to behaviors. Psychologists have observed and treated people demonstrating addictions to gambling, sex, eating, work, and more. Each of these behaviors creates an internal reward system: the giddy rush of waiting for your pony to win, the sense of accomplishment from a job well done. They also give addicts something to think about other than the mundane, and possibly painful, circumstances of their day-to-day lives.

Opioid addiction, when considered in news or late-night comedy, seems inextricably entwined with two circumstances: medical trauma, or chronic unemployment. I've noticed a persistent trend of connecting opioids with West Virginia coal country, where declining revenues have forced many former miners, many suffering black lung and other work-related injuries, out of the workforce. The physical pain of illness, and the psychological pain of uselessness.

Journalist Nick Reding, author of Methland, correlates the most widespread consumption of methamphetamine with America's rural communities, where dwindling economic opportunities and declining wages make grasping at straws a necessity to get by. Addiction specialist Gabor Maté, based in Vancouver, British Columbia, notes how many of his destitute patients come from physically or sexually abusive backgrounds. Pain and addiction go hand-in-hand.

So God forbid we actually address the real problem.

The phrases "opioid addiction" and "coal country" keep coming up in the same news stories,
usually without any comment on the connection between addiction and despair


America's policy for dealing with addiction, largely unchanged since the days of Harry J. Anslinger, remains to jail offenders, regardless of their suffering. The isolation of prison, where prisoners lack social networks, meaningful work, or even sunlight, will only compound whatever pain got them addicted to begin with. Rather than helping addicts deal with their problems, our criminal justice outlook only doubles down on the underlying causes of addiction and other dysfunction.

Solemn, tedious think pieces about opioid addiction allow journalists to look engaged with America's diffuse suffering. But they exonerate a socioeconomic system that devalues humans and profits off their pain. Prescription pain meds cannot advertise on broadcast media, so blaming them doesn't hurt revenue. Media professionals thus excuse their own complicity in an economy that encourages resource hoarding, devalues labor, and treats humans as interchangeable parts.

And buying that crap lets us, the audience, off the hook for profiting from that system. Stripped of glitz, the problem is us.

Wednesday, June 28, 2017

Getting Enough Sleep Isn't Enough

Dr. Mark Burhenne, DDS, The 8-Hour Sleep Paradox: How We Are Sleeping Our Way to Fatigue, Disease, & Unhappiness

Snoring isn’t cute. Despite adorable viral videos of snoring babies, puppies, and grannies, snoring is a serious health issue. By now, with the prevalence of CPAP machines and mandibular advancement devices, we’re probably all somewhat familiar with sleep apnea. But Mark Burhenne insists this is only one form of “sleep-disordered breathing,” a category of breathing illnesses that can have cascading effects on your health, happiness, and quality of life.

Burhenne is one of a developing class of dentists specializing in sleep disorders. Many warning signs of sleep-disordered breathing, he writes, manifest in the mouth. These may include crowded teeth, a recessed jaw, damage from chronic teeth grinding, and others. However, the signs he identifies still need to be quantified by MD sleep specialists before treatment can begin, or be compensated by insurance. That, he says, is where things become tricky.

The 8-Hour Sleep Paradox, Burhenne writes, is counting the hours you spend asleep and assuming, as Mother probably taught you, that eight hours equals a good night. But simply being unconscious isn’t the same as getting a good night’s sleep. Citing multiple sources, Burhenne suggests that anywhere from half to four-fifths of Americans aren’t getting enough deep, restorative sleep, and lack of air is the most common reason.

Worse, we have a tendency to minimize or dismiss real problems. I say “we,” meaning the general population, but Burhenne writes, that includes medical professionals too. Patients diagnosed with “mild” sleep apnea often get sent home with best wishes and little more, but even mild apnea means a person’s airways close. And that means the person comes partway awake to breathe, and therefore isn’t getting necessary Stage-4 and REM sleep.

This book’s first few chapters explain the warning signs themselves. These include now-familiar markers of sleep problems, like obesity and chronic fatigue. They also include, but aren’t limited to, signs of poor sleep, like needing caffeine and frequent naps (guilty); or waking up with dry mouth or headaches, signs you’ve spent the night gasping for air. But if you’re browsing this book, you already know you need to change.

Mark Burhenne, DDS
But Burhenne avoids the most common shortcoming of self-help books, encouraging readers to diagnose themselves. Rather, after just two chapters on recognizing signs of sleep-disordered breathing, the largest portion of the book focuses on working with a sleep specialist to get an actual diagnosis that leads to treatment. This is difficult, Burhenne admits, because sleep studies are expensive, and insurance companies incentivize doctors to avoid costly tests.

Getting an appointment with a sleep specialist usually requires a recommendation from your GP. And if you aren’t middle-aged and overweight, Burhenne writes, many GPs won’t make that recommendation. So he provides tools to increase your doctor’s cooperation in the fifteen minutes you usually have, including the Epworth Sleepiness Scale and the STOP-BANG questionnaire. He also includes important questions to ask, and important information to provide.

After going through all that, patients have no guarantee their insurance will actually cover the procedures. Burhenne copiously expounds how to navigate the paperwork necessary to get treatments covered. This includes how to convince your insurance provider that you need some treatment other than CPAP, which is the most commonly used anti-apnea technology, but doesn’t work for everybody. Getting the right treatment takes effort, apparently, but it’s worth it.

Burhenne does provide life hacks that patients can apply individually. To give just two examples, this includes mouth taping, which is exactly what the name implies. If you have only a mildly obstructed airway, closing your mouth overnight with surgical tape can ensure you breathe through your nose for maximum efficiency. It also includes certain non-steroidal nasal sprays to keep airways open, again, for nose breathing, like your body prefers.

Nevertheless, Burhenne doesn’t mainly emphasize these internet-friendly hacks. He primarily keeps focus on medical science, including the most recent discoveries (as of when this book was written), and the information you need to get best results from your physician. A certain distrust for the medical establishment lingers beneath Burhenne’s prose. Though admittedly, this makes sense, considering the ideas he describes are still controversial in certain circles.

Medical pundits tell patients to watch our weight, manage our stress, and get our eight hours nightly. But we can’t do these if we’re tired from lack of deep, restorative sleep. It’s surprising to read this advice from a dentist, and I admit, I needed some convincing, but Burhenne certainly provides that. If, like me, you need not just more but better sleep, start here.

Monday, February 6, 2017

The Mind Unlocked, In Two Evenings Or Less

Lloyd I. Sederer, M.D., Improving Mental Health: Four Secrets in Plain Sight

Medically grounded mental health treatment has a history of being very fashion-driven. The lengthy inpatient committals at spa hospitals, made famous during the 1980s, were curtailed in favor of heavy medications when HMOs demanded quick, low-cost fixes in the 1990s. Dr. Lloyd Sederer, who has contributed to mental health treatment as both a researcher and a clinician, attempts to eschew trends and focus on what actually works. The product is readable and frequently eye-opening.

According to his introduction, Dr. Sederer writes for two distinct audiences: psychiatric clinicians dealing with patients suffering significant disorders, as well as students; and the families and friends of such patients, who must monitor their loved ones and provide constant palliative care. As such, Sederer’s prose is frequently dense with scientific concepts, but he never introduces terminology without providing definitions. His mix of official, medical language, with case histories, makes this a very humane exposition.

As the title unambiguously declares, Dr. Sederer distills mental health treatment into four broad “secrets,” or functional approaches. The first is, Behavior Serves a Purpose. All human behavior, even counterproductive, harmful, and seemingly “insane” behavior, means something. Substantial treatment begins, not when we get patients to stop hitting themselves, but when we identify what actual meaning their actions serve. This isn’t always easy, much less straightforward. But it’s more productive than just condemning actions we don’t understand.

Second, Sederer emphasizes The Power of Attachment. Humans are linked creatures, and loneliness can transform our mental functions, especially at early ages. People will remain in dangerous relationships rather than confront loneliness (which Sederer clearly distinguishes from solitude). And our need for relationship influences our ability to heal from illness. Sederer describes the “therapeutic alliance,” the relationship by which therapy actually makes any progress. It isn’t just that therapists help, but how they help, too.

Throughout this book, but especially here, Sederer overlaps significantly with the reading on addiction theory I pursued a few years ago. He talks about Bruce Alexander’s Rat Park experiments: laboratory rats in environments designed to resemble their natural habitats wax prosperous, avoid harmful behaviors, and live long, happy lives. Rats raised in cages will gorge themselves on drugs until they overdose and die. Here and elsewhere, Sederer demonstrates that all psychology is linked.

Lloyd I. Sederer, M.D.
Third, Sederer writes, As a Rule, Less Is More. Remember the spa hospitals and heavy medications I mentioned earlier? Though tilted toward opposite extremes, both options represent a do-too-much attitude of massive interventions designed to overwhelm whatever preëxisting conditions produce undesirable behaviors. Rather, Sederer writes, the therapeutic goal should be to reëstablish optimum natural balances, and often, the least intrusive approach works best. Care providers, including families, should avoid the temptation to overtreat routine conditions.

Finally, Sederer hits the one I find most familiar: Chronic Stress Is the Enemy. This takes different forms in different patients, at different stages of life. Children exposed to chronic abuse or neglect develop defense systems that, as adults, turn maladaptive. Adults subjected to these same conditions develop inflammatory diseases that shatter our defenses and literally shorten our lives. These can manifest in myriad ways. What matters isn’t the particulars, but that stress undermines our bodies and brains.

In describing these operant conditions, Sederer also gives constant indications how to counter them. Some responses are within the patients’ control, while others require physicians, families, and other caregivers to take first initiative. In a few cases, Sederer makes recommendations of medications known to have beneficial effects, but in keeping with his Less Is More philosophy, he dispenses these suggestions only sparingly. It isn’t what goes into our bodies, but how we treat them, that transforms us.

Readers familiar with developments in recent psychology, even as filtered for a generalist audience, will recognize much here they’ve read before. From the effects of isolation and company on our short-term mental health, to how epigenetic influences reshape our brains over the long haul, much of this material I recognize from other writers. Johann Hari, Stephen Ilardi, and Gabor Maté cast long shadows over Dr. Sederer’s writing. For well-read audiences, Sederer brings these disparate influences together under one tent.

Not counting Sederer’s works cited list and liberal illustrations, this book runs barely eighty pages, basically a long article. Ambitious readers undeterred by technical prose will savvy this book in one or two evenings. Yet it never feels underwritten or like it’s forgotten anything. It just stays concise, clearly focused on its topic. If anybody you love is undergoing mental health treatment, consider reading this book. It may open your eyes.

Monday, June 6, 2016

When Bread Could Kill You

Paul Graham, In Memory of Bread: a Memoir

Paul Graham, an upstate New York English professor and gastronome, established elaborate rules for himself: Cook your own food. Use local ingredients. Keep fat, sugar, and glycemic index low. Cooking for his wife, eating the bread she baked, and home-brewing beer with his best friend were staples of building a sustainable, locavore lifestyle. Everything food hipsters say will keep us, and the land, healthy. So he couldn’t understand the sudden, shooting bursts of abdominal pain.

Diagnosed with celiac disease at age 36, Graham found himself in an increasingly common situation. Diagnosis rates worldwide have skyrocketed. But are celiac, and other gluten intolerance disorders, really more common today? Or are people previously misdiagnosed now being recognized? (This isn’t academic. I have two gluten-intolerant friends, one who was tested for everything from cancer to lupus for over a decade.) Graham resolved to do what scholars everywhere do: research the situation, and report.

This volume starts with Graham’s own situation. It’s primarily a memoir of Graham’s struggle as he goes wholly gluten-free. Fortunately, his wife joins him on the journey. I wish I’d been that brave; when my then-girlfriend was diagnosed gluten-intolerant, I selfishly hoarded coffee cakes and cinnamon rolls. But Graham and his wife, Bec, find they’re not just giving up one ingredient. They’re walking away from buffet spreads, pub nights, and food’s historic social implications.

Wheat agriculture, it appears, helped form Western civilization. As Graham’s investigation expands into the history and science of gluten, he finds wheat so basic to Western history that to abjure eating bread (Graham loves the phrase “wheaten loaf”) means to not participate in our culture. Food-sharing rituals, from pot-luck brunches to Catholic communion, underpin Euro-American culture, and eating bread looms large. Maybe that’s why humorists and hipsters treat gluten-free dieters as mere figures of ridicule.

Since Graham, an award-winning food writer alongside his professorship, cooked for himself, and his wife baked, food wasn’t just bodily sustenance; it bolstered the intimacy of his marriage. Thus, for him, the macro-scale and the micro intertwined. Many recipes, and many prepared ingredients, involve wheat where you’d never look for it, especially as a stabilizer. As he abandoned the cultural history of eating wheat, he also lost the personal history of preparing his own dinner.

Our isolated, private society often loses the communal aspect of food, yet the simple act of sharing conversation around the table has historically underpinned social life. When Graham walked away from that history, he had to relearn everything he knew: not just about food, but about himself and his place in society.

For one, he has to rediscover how to be Paul Graham in a world where hobbies like baking and brewing are now off-limits. He needs to relearn cooking. Many store-bought gluten-free (GF) foods simply substitute rice, tapioca, or sorghum flours for wheat, assuming the process otherwise remains unchanged. Not so, as Graham discovers in actually preparing edible GF bread. His mentors, though well-meaning, taught him concepts that no longer apply. Cooking is an adventure again.

Is bread even really necessary? Graham suggests many deeply ingrained expectations regarding food are learned, not innate, though difficult to discard, the centrality of bread among them. With time, he internalizes the systems necessary for understanding the new world he’s been thrust into. Though by the end he returns to home-baking his own GF bread, he acknowledges that even this means unlearning habits he’d previously mastered, and embracing everything his teachers told him to avoid.

By his own admission, Graham set himself many food-related rules well before the onset of celiac disease. His “locavore” proselytizing sometimes gets intrusive, and his quest for celiac-friendly foods at farmers’ markets seems quixotic. But everything he says sounds familiar to anyone forced, by health or circumstance, to abandon wheat. The discomfort at public food gatherings (can I eat off this buffet? How do I know what’s safe for me?). The mockery one faces simply for how one eats.

If it’s true that only the intimately personal is truly universal, Graham achieves that here. No two gluten-sensitivity sufferers have identical symptoms; that’s what makes diagnosis so difficult. However, everyone who abandons gluten endures the same isolation: the same withdrawal from easy carbohydrates, the same alienation from bread-eating friends, the same journey through dietary blandness. His memoir of struggle can inform all readers, and offer hope that leaving gluten doesn’t mean leaving good food forever.

Wednesday, March 11, 2015

Doobie Newbie Blues

The image at right appeared on a friend’s Facebook feed this weekend. Having recently become interested in the sociology of drug use, and the forces that make illegal drugs a desirable choice for so many people, I’ve developed a reflexive distrust of blanket statements about how drugs work. So I decided to don my Snopes cap and investigate this claim’s truth value.

Though floated by various law enforcement, civil service, and public interest groups, this image originates with the American Lung Association. The ALA’s original source material verifies the picture’s essential claim: that marijuana (Cannabis sativa) certainly does deposit more tar on human lung tissue than commercially manufactured, legally sold tobacco cigarettes. As it stands, this claim appears substantially true.

However, reading the source requires understanding basics of argument analysis. If I had my way, Darrell Huff’s 1954 classic How to Lie with Statistics would be required reading in every American middle-grades classroom. Anybody familiar with Huff’s principles will immediately recognize, reading the ALA’s statistics, that they’re guilty of several mistakes. For our purposes, the most important is “Comparing the Incomparable.”

By their own admission, the ALA compares machine-manufactured cigarettes, tipped with cellulose fiber filters, with hand-rolled marijuana doobies. Nearly everyone agrees that cellulose filters reduce the quantity of tar, fine particulate matter, and other contaminants in cigarette smoke. However, there’s no agreement on whether that actually has significant health benefits. Provided they remain moist, meaning alive, tarry lungs continue to function.

Promo art for Reefer Madness, possibly the most moralistic, wrong-headed anti-drug propaganda ever
There’s also wide agreement that filters don’t obstruct the inhalation of nicotine. The neurologically active component in marijuana smoke, tetrahydrocannabinol (THC), has no known lethal dose in humans. We cannot say the same of nicotine, which is lethal enough that only by diffusing it into microscopic particles, inhaled amid hot smoke, can smokers get its narcotic effect without poisoning themselves.

Also, the ALA compares cigarettes, manufactured under highly regulated standards, to marijuana, which is grown, distributed, and consumed with no purity standards whatsoever. Cigarette manufacture and sale are strictly regulated by the Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF). This gives the ATF remarkable authority to control tobacco purity, and gives consumers confidence regarding what their smokes are made of.

Marijuana, by contrast, remains exclusively under the control of criminal enterprises, which can enforce standards only via assassinations. Once marijuana supplies reach street customers, we have no confidence that our wacky tobaccy hasn’t been thoroughly adulterated with pine needles and lawn clippings. Drug dealers are known to stretch supplies by cutting cocaine with baking soda, or heroin with chlorine bleach.

Therefore, if the marijuana consumed by regular users has damaging health consequences beyond ordinary THC effects, well, why is that? It’s hard to differentiate actual marijuana effects from those created by drug prohibition. Say whatever you will regarding the moral implications of corporate regulation. But anyone concerned about public health will agree that having government oversight of food and tobacco production beats not having it.

The Camberwell Carrot, the iconic oversized doobie in the 1987 British classic Withnail and I
America has attempted limited-scale regulations in this manner. Besides the four states and the District of Columbia, which have all legalized marijuana possession for personal consumption, several states (the number changes so quickly that I cannot find reliable sources) have legalized medically prescribed marijuana. But the patchwork of state regulations, and the ease of interstate transit, makes even this regulation slapdash and unreliable.

This being the case, any reasonable person must consider the ALA’s conclusions unreliable at best. On multiple levels, they compare diverse products that have little overlap. Imagine if the FDA published a white paper comparing patent-pending pharmaceuticals with homeopathic peach-pit cures. If the FDA proclaimed the natural superiority of lawful, government-approved products, tested in their own labs, we’d have legitimate reason to pause.

The ALA, therefore, demonstrates less about marijuana itself, and more about the destabilizing consequences of drug prohibition. By concentrating control of drug commerce into criminal hands, we provide economic incentives to organizations like the Sinaloa and Zeta cartels. Anti-drug advocates might insist that drugs are bad, and I agree. But simply banning them doesn’t make demand go away; it simply reorganizes its routes of transit.

Users embrace marijuana for many diverse reasons, just as people embrace legal drugs like nicotine, alcohol, and caffeine. The distinction between which drugs are prohibited, and which remain lawful, has mainly moral rather than medical foundations. If merely being dangerous were sufficient, we’d have banned liquor and tobacco generations ago. How about, rather than clouding the debate with tar, we get to know users as humans?

Saturday, November 16, 2013

Getting Old Beats the Alternative

Alex Zhavoronkov, The Ageless Generation: How Advances in Biomedicine Will Transform the Global Economy

I’ve received some criticism for this week’s review of Alex Zhavoronkov’s The Ageless Generation, from others who’ve read the book. Some criticism has been public, some private. Most of it, however, contends that I’ve misrepresented Zhavoronkov’s science, particularly his rebuttal of claims that human overpopulation has catastrophic environmental consequences. I usually don’t answer such criticisms, but this topic matters enough that I’ll make an exception.

Zhavoronkov attempts to forestall environmentalist arguments at around the one-quarter mark by citing Paul Ehrlich’s The Population Bomb and Fairfield Osborn’s Our Plundered Planet, famous tracts written before most living readers were born, purporting humanity’s imminent demise due to overpopulation. Ehrlich in particular remains a popular whipping boy for technological cheerleaders, because his dire scaremongering failed to materialize. Naysayers face only one setback: Ehrlich’s apocalyptic prophecy didn’t entirely fail.

Okay, we don’t live on a planet desertified by overpopulation, although human population has doubled since Ehrlich’s book debuted in 1968. The Pope hasn’t sanctioned birth control, our seas aren’t dead, and massive die-backs haven’t commenced. Ehrlich’s alarmism proved founded on poorly sublimated racism and a sketchy understanding of human demography. This makes Ehrlich, and his fellow neo-Malthusians, look like screaming Chicken Littles at best.

But many of Ehrlich’s forecasts have transpired, though in less horrifying terms. In India, ancient forests have been clear-cut for firewood. China, for the first time in history, became a net food importer in 2008. Famine has been averted by broadening democracy and improved agricultural technology, but that technology relies on petroleum-based fertilizers, herbicides, and insecticides, and oil-burning heavy equipment. And petroleum is not infinite.

The Best American Science and Nature Writing 2013 seethes with articles about damage humans have inflicted on nature. Not potential damages projected by mathematical models or computer simulations, but real damage that has already happened, and continues right now. Species extinction, habitat blight, resource depletion, and environmental degradation which will take millennia to repair: it’s all happening now. And we’re causing it, you and I.

It doesn’t have to happen this way. Amending how we utilize resources, including human labor, could not only change the consequences we inflict upon our environment, it could improve human quality of life. According to the UN World Health Organization, ensuring girls have adequate education between ages 10 and 14 leads to smaller family sizes, reduced resource consumption, and greater prosperity. This is a matter of will, not technology.

I reiterate, Zhavoronkov’s prognostications on increased human life expectancy are both solid and exciting. I look forward eagerly to seeing what possibilities arise when humans reliably exceed the century mark. But such changes will have ripples outside standard economics. Zhavoronkov never says anything outright wrong in his entire book. But he suffers from excessively narrow focus, excluding important questions that exceed his domain.

Even within his domain, Zhavoronkov’s reasoning reveals limitations. An NBC report this week indicates that chronic disease, not old age, has driven up American medical costs, the exact opposite of Zhavoronkov’s claim. This sounds like an argument for universal health care, not redefining old age. Considering that children as young as ten now receive treatment for what we once called Adult-Onset Diabetes, public health at all ages should receive greater priority.

And while Zhavoronkov’s right that we must revamp our public pension system, what happened to private pensions? In 1975, half of all workers had some form of employer-funded old-age pension; today, that number stands at about one-sixth, mostly public sector workers. Private workers mainly have employee-funded 401(k) accounts. After the Enron collapse of 2001 and housing bubble collapse in 2008 hollowed out many 401(k) accounts, that’s hardly an adequate exchange.

I’d hoped to avoid getting personal, but I’ll continue. Extending workers’ productive lives past eighty sounds fun for lab researchers, professors, entrepreneurs, or lawmakers. But in two years at the factory, I’ve visibly aged. My beard has gone patchy grey, I have crow’s feet, and I’ve gone bald. My voice has grown hoarse, I have tendonitis in my right wrist and arch, and after an eight-hour shift, I limp. Some kinds of work age you terribly. The idea of doing this work as a great-grandfather is horrifying.

Alex Zhavoronkov presents a prospect for revitalizing American and international society, by keeping people productive longer. This sounds great. But such changes have consequences beyond themselves. We cannot change something so fundamental as productive lifespan, and expect the future to essentially resemble the past. I’d rather plan for the most likely consequences now, than get blindsided after it’s too late to change our minds.