
Friday, June 6, 2025

Don’t Pretend To Be Stupid, Dr. Oz

Americans used to like Dr. Mehmet Oz

The same day I posted about Senator Joni Ernst’s faulty rhetoric surrounding Medicaid cuts, Dr. Mehmet Oz claimed that uninsured people should “prove that you matter.” The cardiac surgeon, Oprah darling, and failed Senate candidate is now the Centers for Medicare and Medicaid Services Administrator, meaning he oversees decisions about who receives assistance paying medical bills. His criterion for proving one matters? “Get a job or at least volunteer or … go back to school.”

Last time, I got Aristotelian and dissected Senator Ernst’s rhetoric, noting that she changed the “stasis of argument” mid-sentence. That is, she pretended to misunderstand the core dispute, sanding off nuance while condescending to her constituents. When someone said people would die unnecessarily, Ernst pretended they meant people would die at all. She thought it appropriate to remind constituents that humans are mortal—and, in her tone-deaf follow-up, to sound an altar call for Jesus Christ.

While Ernst’s constituent wanted to argue the morality of preventable death, and Ernst veered dishonestly onto the fact of mortality, a friend reminded me this argument skirted an important issue. Who will die first? When the government makes decisions about paying medical bills, the outcomes aren’t morally neutral: chronically ill, disabled, and elderly Americans stand to lose the most. The same bloc of Americans whom, you’ll recall, certain politicians permitted to die during the pandemic.

Dr. Oz said what Senator Ernst only implied: that hastening human mortality is okay for certain undesirables. This administration, and indeed conventional American conservatism throughout my lifetime, has tied human worth to economic productivity, and especially to productivity for other people. If you need assistance, America’s authorities won’t help you create a business, learn a skill, or otherwise evolve to benefit your community. Their imagination can’t expand beyond getting a job working for someone else.

Nor was this subtext. Oz said aloud: “do entry-level jobs, get into the workforce, prove that you matter.” This correlation between “you matter” and “you work for others” has lingered beneath much of America’s work ethic throughout my lifetime—and, as an ex-Republican, I once believed it, or anyway accepted it. But as anybody who’s faced the workforce recently knows, today’s working economy isn’t a source of meaning or dignity; it often actively denies both.

Even laying aside demi-Marxist arguments like “owning the means of production” or “the surplus value of labor,” employment spits in the human face. Minimum wage hasn’t increased in America since 2009, and as anybody who’s worked a fast food dinner shift knows, employers who pay minimum wage definitely would pay less if the law permitted. Even if the workers receive enough hours to qualify for employer-provided health insurance, they mostly can’t afford the employee co-pay.

Lest anybody accuse me of misrepresenting Dr. Oz, let’s acknowledge something else: he lays this onus on “able-bodied” Americans. We might reasonably assume that he expects healthy, young, robust workers to enter the workforce instead of lollygagging on the public dime. But even if we assume they aren’t doing that already (and I doubt that), the pandemic taught many workers important lessons about how America values labor. Specifically, that it doesn’t, except through empty platitudes.

In 2020, executives, attorneys, bureaucrats, and others went into lockdown. Americans laughed at highly skilled professionals trying to do business through Zoom while avoiding the virus. Meanwhile, manual trades, retail jobs, construction, and other poorly paid positions were deemed “essential” and required to continue working. These jobs are not only underpaid and disdained, but frequently done by workers who are notably young or notably old, disabled, chronically ill, required to work to qualify for assistance, or otherwise vulnerable.

As a result, the workers most vulnerable to the virus faced the most persistent risk. Sure, we praised them with moralistic language of heroism and valor, but we let them get sick and die. Americans’ widespread refusal to wear masks in restaurants and grocery stores put the worst-paid, most underinsured workers at highest risk. Many recovered only slowly; I only recently stopped wheezing after my second infection. Many others, especially those with pre-existing conditions, simply died.

Dr. Oz has recapitulated the longstanding belief that work is a moral good, irrespective of whether it accomplishes anything. He repeats the myth, prevalent since Nixon, that assistance causes laziness, citation needed. And despite hastily appending the “able-bodied” tag, he essentially declares that he’s okay with letting the most vulnerable die. Because that’s the underlying presumption of Dr. Oz, Senator Ernst, and this administration. To them, you’re just a replaceable part in their economic machine.

Sunday, November 24, 2024

RFK Jr. is Maybe, Slightly, Right

Robert F. Kennedy, Jr.

It’s a matter of time before Robert F. Kennedy, Jr., and Elon Musk create an insuperable rift within the upcoming Trump administration. Musk believes that engineers and their technology will solve society’s problems with efficiency beyond anything government can accomplish, a philosophy called cyberlibertarianism. RFK, by contrast, distrusts science and technology, and wants to rescind 200 years of progress in physiology and medicine.

Left-leaning quarters of the internet have begun mocking RFK as part of our routine anti-Trump rhetoric. We ridicule his disdain for vaccines in the immediate wake of a pandemic made absurdly worse by an administration that hampered any efforts to curtail the spread. We disparage his fear of pharmaceuticals as indicative of his now-infamous brain worm, which maybe could’ve been prevented by medicine. Everything RFK says is automatically tainted.

Yet I suggest there’s something to his persistent appeal. People are drawn to RFK, and to others who share his reflexive distrust of science, because he isn’t entirely wrong. RFK taps a fertile vein of public sentiment: the recognition that we’ve heard lies from people and institutions for years, clothed in the vestments of science, technology, and mathematics. Americans have become distrustful of authority, and not without reason. RFK simply names that distrust.

We’ve witnessed how industrialized pharmaceutical companies, the storied “Big Pharma,” have yanked patients’ chains for years. Nearly a decade ago, so-called “Pharma Bro” Martin Shkreli jacked up the price of Daraprim, a life-saving antiparasitic drug that was neither rare nor proprietary, simply to boost his own shoddy past investments. Malcolm Gladwell writes how Purdue Pharma created the “opioid epidemic” by targeting, not patients in pain, but doctors desperately lonely for human validation.

These highly visible forms of public manipulation only remain in public memory because they’re uncomplicated. More nuanced issues, like the sale of over-the-counter amphetamines as diet aids, or the whole fen-phen debacle, are harder to understand, and therefore harder to remember. We’ve wondered at shifting standards for diet and exercise: a few years ago, doctors told us small quantities of red wine were healthful. Now we’re told no quantity of alcohol is safe.

Elon Musk

What’s more, many of RFK’s concerns about food additives and processing mirror our own. Many packaged and convenience foods rely on synthetic preservatives and a wheat-flour base to remain shelf-stable. Many are also dosed with synthetic flavor compounds, because the processing leaches out natural flavor. As Rampton and Stauber write, these additives are presumed safe only because they haven’t been proven unsafe, and are only lightly regulated.

And our government kowtows to the companies that produce these foods, pharmaceuticals, and other substances we put into our bodies. Across many years and both major parties, administrations have let industrial conglomerates like Pfizer, Unilever, ConAgra, and Bayer run roughshod, only occasionally constrained when public outrage grows vocal enough to jeopardize politicians’ reelection chances. Almost like representatives care more about their donors’ demands than their constituents’ health.

This isn’t new, of course. Anybody who even fleetingly reads American history knows that, before the rise of antitrust regulation, the companies that manufactured drugs fortified their patent medicines with arsenic, cocaine, and opium. Before Congress created the FDA, food companies stretched their packaged foods with sawdust and urine. Government stepped in to enforce baseline safety standards, but now hides behind regulations as opaque as the companies it purportedly regulates.

So yeah, RFK actually understands an important thread in American political discourse: the terror Americans feel at how much power these opaque corporations and government institutions have in our lives. We fear these corporate conglomerates, many of which own significant shareholder stakes in one another. We likewise fear the regulatory institutions which often appear as likely to protect as prosecute the corporations hurting us.

That doesn’t mean RFK is right, of course. His belief in the superiority of “natural immunity” over vaccines, to cite just one example, suggests he thinks people once shrugged off polio, smallpox, and the plague. This is goofy. Like belief in homeopathy or crystal resonances, belief in “natural immunity” only makes sense with a complete unawareness of the history of science and medicine. Thus RFK Jr. is worse than ignorant.

However, progressives ignore RFK’s underlying message at great political cost. Just as we disparaged the Wuhan lab-leak hypothesis simply because Trump said it, we’re now ignoring a history of institutional abuse of science, simply because RFK says it. If progressives want to reclaim these voters from the kooks who have hijacked their distrust, we need to start by acknowledging their concerns. Because they aren’t wrong; powerful people have lied to us.

Tuesday, August 15, 2023

The Fable of the King Who Would Not Die

Bryan Johnson: Meet the multi-millionaire trying to reverse ageing
—Headline on the BBC News website, 13 August 2023
Bryan Johnson

Once there was a certain king—a stupid ruler of a stupid kingdom, in a nation stuffed chock-a-block with stupid kingdoms and their useless kings. Every king in the nation, and many of the queens, thought themselves very important, because the nation had many town criers willing to ballyhoo the supposed importance of their particular monarch. These kings, and many subjects too, heard the ballyhooed fables so often, they came to believe their own mythology.

Like the myth of the king who ruled the kingdoms of lightning chariots and bluebirds. This king believed himself so important that, one day, he unilaterally declared he had renamed the bluebird kingdom, and henceforth, everyone had to honor his kingdom’s name. But every subject knew it was the kingdom of bluebirds, and called it such, ignoring what their king commanded except when his vast, and easily bruised, ego needed appeased. Which was fairly often.

Likewise, our certain king believed himself terribly important, and when he began hearing creaks from his vertebrae, and snaps from his knees, this king boldly proclaimed: “I shall not die!” The king gathered thirty physicians from throughout his kingdom and began dispensing gold generously, demanding research into diet and exercise, and into whatever alchemical potions the king could consume which would prevent his body from aging, and would keep Death, that eternal unwanted visitor, away.

Meanwhile this king’s subjects—we no longer call them “peasants,” though “peasants” is surely what they were—continued their labors. Some subjects hoed rows so they could plant and harvest wheat. Others smelted iron and brought the metal to the kingdom’s foundries, where blacksmiths forged implements so the field workers could hoe rows. The subjects needed little regular direction from their king, and besides, the king’s goldsmiths signed their pay slips, not the king himself.

Within the castle, the king continued demanding miracles from his physicians. “Make the potions stronger!” he commanded, for surely he saw grey beard hairs in his shaving mirror. His physicians bit their knuckles and wondered what more they could do. Saltpeter in his morning alchemical broth? Magnets on his free weights? Artists painted the king’s portrait with square jaw and bulging chest, but the king was not deceived, and knew he had not stopped age.

Elon Musk

Outside, town criers recounted breathlessly the accomplishments which the king and his physicians made in repelling death. Some subjects believed the stories, and repeated them widely, even unto the kingdom of bluebirds. But other subjects held aloft their iron implements and grumbled: “I care not. If he lives or dies, I must still gather the harvest. And look at this hoe! Barely had it, and already it’s rusty. No quality control in this kingdom anymore.”

Several of the king’s guards, watching from the castle, saw subjects raising their iron implements and grumbling, and this made them worry. The king waved away his guards’ concerns, however; what cared he for discontented peasants, when death still encroached? The captain of the guard wasn’t dissuaded, though, and prudently hired more guards, granting them more armor and more swords, because you cannot be too careful. And the stonemasons made the castle walls slightly higher.

All throughout, the king’s subjects continued improving their skills. The blacksmiths made ever-sharper hoes, which the field workers used to make ever-straighter rows, and thus the kingdom saw ever-increasing yields of bread. Admittedly, nobody had more gold to buy bread, so it rotted uneaten, but the bread existed, and surely that merited some celebration. The king permitted the goldsmiths to sign pay slips worth an extra ducat, or whatever, couldn’t everyone see he was busy?

Eventually the day came, which surely everyone must have expected, when the town criers announced that the king had died. Though they carried portraits of the king as square-jawed and muscular, everyone knew the magnets had made him addled, and the saltpeter had made his pecker drop off. Throughout the kingdom, sad-faced subjects agreed this was a most tragic day, then they turned back to their forges and their fields, because work still needed done.

Throughout our nation stuffed chock-a-block with stupid kingdoms, the death of one king mattered little. Workers still worked, and goldsmiths still banked, and everything carried on much as it had before. And somewhere, in a distant castle, the king of the bluebird kingdom struggled to invent another reason to postpone his cage match against that other weird king, Mark Zuckerberg, who, even in the world of allegory and fable, still looks like a pasty-faced android.

Saturday, March 18, 2023

Your Childhood Nostalgia Is Stupid, and Dangerous

I’ve been thinking about Bobby Harrow*, who held Mrs. Martin’s class at Rancho Elementary School in Spring Valley, California, hostage for two years. He was boisterous and energetic, his mind not circumscribed by such effluvia as coursework or the fact that someone was talking. His nigh-pathological refusal to sit down, shut up, or stay on topic had the ability to derail the entire classroom. As often happens, Mrs. Martin ran her class to mollify Bobby.

Nowadays, of course, we’d diagnose Bobby with ADHD and prescribe pharmaceutical stimulants. But I knew Bobby eight years before I ever encountered the term Attention Deficit Disorder. In the early 1980s, he was simply high-spirited and disruptive, something his teacher needed to handle, and his classmates needed to live with. He was also intensely creative and a natural problem-solver, so Mrs. Martin needed to teach him channeling techniques, not just silence him.

I remembered Bobby this week, upon reading this spectacularly slovenly piece of GenX nostalgia:

[Screenshot of the tweet in question; click to enlarge]

Judging by the author’s profile, he’s probably about my age. Like me, he’s White. Nothing clearly indicates his personal background or childhood, but I’d venture that, like me, he comes from a conventionally suburban upbringing, where parents left the neighborhood daily for work, and school and perhaps the local strip mall were the only factors holding the community together. Neighborhoods like mine, and probably his, weren’t communities; they were mailing addresses.

It wasn’t one neighborhood, either. My family moved around, pursuing my father’s Coast Guard career, so I experienced schools and neighborhoods in California, New York, Louisiana, and Hawaii. My parents always chose suburban neighborhoods, which they deemed “safe”; they’d never have admitted it aloud, probably not even to themselves, but that meant “White.” Therefore my upbringing was reasonably whitewashed, free of contact with unconventional demographic groups.

Because of my suburban raising, I didn’t know gay people existed until high school. Of course, they definitely existed; groups like Lambda Legal, the Gay Liberation Front, and PFLAG predate my sheltered Caucasian childhood. Similarly, I knew racism existed, but because I lived in well-scrubbed suburban neighborhoods, I never saw it; therefore, I never thought about it, and grew up believing it belonged to another historical epoch. I never had to know, so I didn’t.

Yet I definitely knew other things. Though I never encountered the term ADHD until 1991, Bobby Harrow definitely proved it existed; we just didn’t force-feed students like him Ritalin or Adderall to make them compliant. Similarly, I know, because Mrs. Martin told us, that she maintained a classroom first-aid kit that included epinephrine, because at least two classmates had food allergies. And autism definitely existed; the school just quietly dumped those students into Special Ed.

Therefore I find myself both willing and unwilling to forgive the tweeter above. For my first eighteen years, I didn’t have to know “different” people existed; I thought my suburban upbringing was normal. When I discovered it wasn’t, that some people had very different childhood experiences, I chose to learn and grow. Other people, mostly White adults, keep discovering different people have different experiences, and simply shout “nuh-uh!” It’s a thoughtless reaction, but I understand.

But I also can’t forgive people who don’t remember things that definitely existed. The simple claim that other people didn’t have neurological conditions, allergies, or even obesity, thirty years ago, only makes sense if this person and others like him willfully edit their memories. And to suggest that grade-school students sat quietly back then? Preposterous! Then as now, kids hated being confined indoors every day, and got stroppy and rebellious, needing repeated discipline.

When people younger than me complain about how lawless, crazy, and self-indulgent children are “these days,” I know they’ve made a choice. I was alive then; I know these problems existed, and often caused great disruptions. Yet somehow, I repeatedly encounter people who believe that today (whenever today is) has become monumentally worse than their own beloved childhood. And I wonder what kind of velvet-coated angel hatchery they claim they emerged from, free of friction.

Why do ADHD and autism diagnoses seem so common nowadays? Because we bother to look for them, Bradley! Why did nobody have nut allergies or celiac disease back in your school daze? Probably because back then, people with food allergies just died, Susanne! Things aren’t getting worse; we’re just aware that bad conditions exist. If you can’t keep up, that’s a “you problem,” and you need to stop cluttering the discourse with your fake nostalgia.

*not his real name

Wednesday, February 8, 2023

Death of the American Expert

Dr. Steven Hayne giving testimony in an undated photo (source)
This essay is a follow-up to Southern Comfort and Down-Home Injustice

For over a quarter century, Dr. Steven Hayne and Michael West, DDS, made healthy paychecks as “expert witnesses” for Mississippi prosecutors. As Balko and Carrington describe, the two doctors maintained a robust schedule of medical examinations, written and oral testimony, and full-time medical practices. Radley Balko himself has been key in bringing national attention to Hayne and West’s quackery, and the injustice they precipitated.

However, reading Balko and Carrington, I felt distinct discomfort. The authors wrote before the COVID-19 pandemic, which brought dueling definitions of expertise, and high-profile media debates about who to trust. Anti-mask and anti-vaccine advocates used almost exactly the same language to disparage Anthony Fauci as Balko and Carrington use to disparage Steven Hayne. This backs me into a difficult corner regarding how we identify and use experts.

Our authors describe the Daubert standard, the current rule of evidence in American courtrooms, which authorizes judges to determine expertise. Essentially, judges make rulings on whether expert witnesses have sufficient standing in an accredited field to voice opinions that mean anything. This requires opposing counsel to be sufficiently well-informed (and sufficiently well-funded) to actually challenge expertise, and judges to know enough to make such rulings.

However, as we’ve seen since COVID, public officials frequently aren’t sufficiently well-informed to make such decisions. The perception of “scientific consensus” may swing on the Availability Heuristic: that is, if a field like Michael West’s bite-mark analysis gets media attention, judges will know of it, and therefore deem it plausible. Bite-mark analysis helped bring Ted Bundy to justice, and still holds water in courts, but is widely regarded as pseudoscience today.

In Balko and Carrington’s telling, Hayne’s biggest sin was probably maintaining an autopsy and testimony schedule that made no sense. One autopsy can take four hours, plus weeks of preparation for testimony; but Hayne, at his peak, claimed to complete five or six autopsies per day. At four hours apiece, that alone would fill twenty to twenty-four hours of every single day, before any testimony preparation. The numbers simply strain plausibility.

West, by contrast, apparently invented scientific disciplines from whole cloth. He pioneered an ultraviolet bite-mark identification system that supposedly provided ironclad evidence. However, nobody but West ever successfully replicated his process. Worse, his process involved further damaging the supposedly already-damaged tissue, ensuring no third party could either verify or dispute West’s findings. Because only West could do it, and nobody could possibly contradict him, his claims were conveniently self-sealing.

Michael West, DDS (Netflix photo)

Without meaningful peer review of Hayne and West’s techniques, they could claim anything, no matter how extravagant. Which, of course, is exactly what Balko and Carrington assert. Hayne and West’s lucrative “expertise” mainly involved courtroom acclamation, not studied give-and-take. In practice, these doctors’ scientific expertise involved oodles of laboratory jargon that mainly served to confuse jurors.

No wonder, when COVID hit America, that so many heel-draggers thought they could answer the Centers for Disease Control with (and I paraphrase) “Nuh-uh!” In this environment, science isn’t a process of inquiry; science is what scientists do. When Anthony Fauci presented steps for preventing disease spread—and importantly, when those prescribed steps changed with new information—his opponents, including the President, challenged his person, not his evidence.

A recurrent social media meme asserts that “if you can’t question it, it isn’t science.” Which is true as far as it goes, but this often reduces to the most ridiculous possible minimum. Popular anti-mask spokespeople, like Judy Mikovits or Stella Immanuel, engaged in fault-finding missions, conflated dissimilar facts to bolster a flimsy narrative, or simply made stuff up. These “questions” were treated seriously by credulous politicians and media giants.

One knock against Fauci was a March 2020 60 Minutes clip where he specifically said mask-wearing wasn’t necessary, at a time when there weren’t enough quality masks to go around. Months later, new information, and more widespread mask availability, led Fauci to endorse mask-wearing. But Facebook and Twitter exhumed that original claim, asserting that Fauci was inconsistent. If a scientist’s claims are inconsistent, his science is disproved, Q.E.D.

Hayne and West emerged from a system wherein expertise was accorded by authority figures, not refined through testing and verification. Courts deemed their statements scientific because they themselves were deemed scientists—and, not coincidentally, because they told state prosecutors whatever they wanted to hear. This wasn’t just bad for individual cases; it redefined “expertise” in ways that privilege individuals over the scientific method.

In that environment, I can’t fault Americans for failing to recognize legitimate science, even when ignorance caused human death. COVID, global warming, and industrial pollution are serious concerns supported by serious evidence. But Americans need responsible individuals to translate that evidence for us, and too many of our credentialed experts have acted in bad faith. We’re left paying the price.

Friday, May 20, 2022

Governor Pete Ricketts is a Toxic, Gaslighting Liar—and So Is His Party

Pete Ricketts

Nebraska appears to be on a COVID-19 upswing. Official figures indicate five consecutive weeks of rising numbers, including a near-doubling last week. Given this opportunity to demonstrate energetic engagement in Nebraska’s public health, Governor Pete Ricketts, a Republican, chose instead to appear on CNN, announcing his intent to unilaterally ban abortion at the first chance. He’s so focused on what he calls “preborn babies” that he’s forgotten the literal living.

I’m conflicted in my response. Ricketts, who was born rich and never held office before being elected governor, has a history of saying complete bilge on national media to gain cheap attention. This isn’t even the first time Ricketts has used the “Nebraska is a pro-life state” line, because he apparently believes he’s entitled to pronounce for all Nebraskans. Ricketts believes himself, not an elected representative, but Nebraska’s embodied avatar.

Yet for all his “pro-life” flexing, Pete Ricketts has demonstrated dismal performance among the actual living. His economic policies have favored the rich and urban in an overwhelmingly rural, working-class state. He continued flooding agricultural conglomerate ConAgra with cash subsidies and tax breaks, only to watch ConAgra move its headquarters to Chicago. Ricketts later tried the same trick to hold TD Ameritrade, which his father founded, with marginally better results.

From the beginning of the COVID-19 pandemic, Ricketts has resisted even the most nominal efforts to curb the spread. As recently as this January, at the second-highest peak of the pandemic, the Ricketts Administration did worse than nothing; it actively opposed local mask mandates, attempting to declare face hankies “illegal.” Hey, remember April 2020, when Grand Island, Nebraska, briefly had North America’s highest COVID rate? I sure do.

Throughout his administration, Ricketts has defined “freedom” as whatever comforts well-off White men, like himself. Massive interest-free cash transfusions to multinational corporations? Don’t mind if I do. Extended unemployment benefits to protect unskilled workers during a pandemic? Piss off, working-class people have a moral obligation to work. Ricketts also opposes even the most fiddling gun control measures, claiming “Nebraska is a pro-Second Amendment state.”

But letting women make reproductive choices unhindered by government bureaucracy? No dice, we’re “pro-life.” Never mind the things we do that cause death, like obstructing any effort to stem a highly transmissible disease. You have a lawful right to breathe your potentially lethal vapors on somebody’s grandma without impediment, that isn’t a life issue. The mere fact that, under current conditions, the wrong person’s exhale is literally deadly, means nothing.

This contradiction baffles me. Even if we accept Ricketts’ premise that a zygote is already human (which I don’t), then you know what else is already human? An actual walking, talking human being. Calling yourself “pro-life” because you value a glob of undifferentiated cells, while actively opposing attempts to protect living humans, or ensure they earn enough to buy groceries during a pandemic, makes you worse than contradictory.

It makes you a liar.

Don’t mistake the importance here. I’m targeting Governor Ricketts because, as a Nebraska resident, I’m familiar with his policies, and their lived consequences. But these effects don’t stop at the Nebraska state line. Republicans nationwide are floating bills that would classify abortion as homicide, bills which have already had intrusive consequences for women who miscarry. Many such laws are actively ignorant of biology.

Simultaneously, the same party actively opposes any attempt to make life easier for employed people. A newly floated bill would preemptively forbid any attempt at student loan forgiveness. In the midst of a much-publicized baby formula shortage, House Republicans voted 192-12 against fixing the conditions that caused the shortage. Easier to blame the Brandon Administration than fix the problem, I guess; doesn’t matter who gets hurt.

(Edit: in the House vote, Nebraska's Congressional delegation split. Don Bacon, a Republican representing the 2nd District, which includes Omaha, was one of 12 to break from the party line. Adrian Smith, a Republican representing the large and mainly rural 3rd District, voted against the bill. The 1st District seat, which includes Lincoln, is currently vacant pending a special election.)

Governor Ricketts isn’t important enough to say that today’s Republican Party follows his lead. Indeed, Ricketts probably follows the marching orders of the cable-news androids whose telegenic outrage drives the base right now. But Ricketts is on-brand for today’s Republican mess: claiming “pro-life” prerogative for actions that will create massive burdens for poor people, immigrants, BIPOC, the disabled, and women, while banning even the slightest inconvenience for themselves.

Today’s Republican Party is opposed to anything that would make life better for those currently disadvantaged. Whether it’s systemic policy, like fixing a student debt machine with interest rates worse than a Dickensian usurer, or pure bad luck, like minimizing the consequences of an infectious disease, they’ve demonstrated they don’t care. Fixing it isn’t their responsibility. Anyone who now votes Republican demonstrates they’re okay with this approach.

Monday, October 14, 2019

Capitalism and the Common Cold

My friend Sarah caught an upper respiratory infection off a coworker recently. Like millions of Americans, this coworker, “Rachel,” felt compelled to ignore her illness, go to work, and potentially expose everyone else. To other workers, it’s probably a common cold—a nuisance, admittedly, but nothing catastrophic. But owing to asthma and a systemic hypermobility-related condition, Sarah has limited ability to fight routine infections. Colds, for her, often turn into bronchitis, and she’s out for weeks.

This got me thinking about the times I’ve bucked medical advice, chugged DayQuil, and gone in sick anyway. Like millions of hourly workers, I don’t have compensated sick days; if I don’t clock in, I don’t get paid. And believe me, I’ve tried forgoing rent and groceries, with little success. Unless I’m too impaired to move, I have no choice but to ignore my illness and work. Same holds, sadly, for most nurses, fry cooks, and other low-paid workers in fields where infections spread easily.

During my factory days, one of only two times I got a stern talking-to about my work ethic involved attendance. I breathed a lungful of dust off some chemically treated paper, and spent a week flat on my back. My supervisor called me into a conference room and informed me that, notwithstanding my doctor’s note from the company clinic, I had missed what they considered a substantial amount of time, and was now officially on warning.

(My other stern talking-to involved getting angry at my supervisor, throwing down my safety gloves, and walking out. That’s a discussion for another time.)

My supervisor warned me that, even beyond the pinch I’d put on my company, I had imposed upon my fellow line workers, who needed to offset my absence. Clearly, this warning conveyed, I had a moral obligation to ignore the signals my body was sending, and come to work. This was only one among many times when the messages I got from family, school, employment, and others told me that work was more urgent than protecting my bodily health.

Clearly Rachel got the same message, because she even lied to Sarah about how contagious she was. Even while continuing to sneeze on Sarah and other coworkers, Rachel insisted she was past the contagious stage. At this writing, Sarah has been housebound for a week, hooked to her nebulizer and struggling to feed herself. All because Rachel felt the social cue to not spread her cold mattered less than the moral imperative to keep working.

I cannot separate this morality from the capitalist ethic. Like me, you’ve probably spent your life bombarded by messages that work makes us happy, productive, and well-socialized members of society. Conversely, staying home, even when wracked with wet phlegmy coughs, makes us weak, lazy, and otherwise morally diminished. Our bodies aren’t something to respect and listen to; they’re impediments that need silenced so we can become happy contributors to the economy.

(As an aside, Sarah has already written about this experience. She and I discussed it, and tested ideas on one another; while we don’t say exactly the same thing, there are significant overlaps. My take is slightly less first-person.)

But who benefits when we power through and work sick? I certainly don’t; I feel miserable and sluggish, and also feel guilty for my inability to produce at accustomed levels. My employer doesn’t benefit, because he must pay standard wages for diminished outcomes—indeed, as I can’t rest and recuperate, he must pay more for my illness than if he offered paid sick time. And considering I must pay higher deductibles for off-hours doctor visits, my illness imposes on everyone.

In short, by making my continued attendance morally mandatory, I diminish everyone’s outcomes. Plus I infect everyone around me, including people who, like Sarah, can’t shrug off a cold. But I keep working, so hey, I benefit the capitalist class, right? So I shoulder the requirement to work, the risk gets socialized onto everyone around me, and my employer privatizes whatever benefit remains. This is a distorted morality that literally prioritizes money over individual and public health.

Perhaps you think I’m overstating things, that we don’t really value economic outcomes over health. If so, try telling your employer that hourly workers deserve compensation so they can avoid infecting one another without missing rent. See how your boss reacts with moral outrage. More importantly, see how you feel the gut-clench of wracking guilt before you even speak. That’s the capitalist ethic trying to silence you. Because we’ve made common colds literally immoral.


Also on capitalist morality:
Capitalism, Religion, and the Spoken Word

Wednesday, May 22, 2019

Death Can Only Be Understood By the Survivors


This beautiful long-haired house panther is my boy Pele. I know remarkably little about his backstory. I don’t know where he came from, how many homes he’s had before mine, or even how old he is. The only clear, inarguable fact I know about him is that I adopted him from my co-worker Jeff during the upheaval surrounding Jeff’s expensive, acrimonious divorce. And that Jeff died last week.

Pele loves to cuddle in my lap and bury himself under my arm. He loves laps so much, in fact, that I’m typing this essay with great difficulty, because he’s draped himself across my forearms, with his paws wrapped around my left arm in a big bear hug. He desperately craves human attention, and when he discovered that I sleep lying on my side, curled into a semi-fetal position, he decided the center of that curl is a pretty cool place to be.

Jeff loved alcohol. Though he was a generally diligent employee who didn’t hesitate to accept additional task assignments and overtime hours when needed, he also repeatedly showed up to work at 7 a.m. with beer already on his breath. He told giddy stories of various exploits he’d accomplished, stories which almost invariably began with him already being wasted. He had one of the worst cases of alcoholic rosacea I’d ever seen.

I hesitate to say too much, because Jeff was actively in my life less than one year (I’ve known his cat nearly three times as long as I knew him), and because he leaves behind a ten-year-old son who doesn’t need a dark cloud over his adolescence, or anyway a darker cloud than he’ll have growing up without a father. But I’m a writer, and like most writers, I can’t comprehend difficult situations without writing about them. I hope Jeff will forgive me.

Problem drinking is widespread in my workplace. At least three co-workers are capable of drinking a twelve-pack on a weeknight and still showing up for work the next morning. One co-worker won’t be eligible to have his driver’s license reinstated until 2021. I’ve witnessed colleagues arriving for work so thoroughly hung over, they needed to find secluded spots away from bosses’ gaze to grab quick naps before beginning the productive day.

Alcohol is, of course, a painkiller. Before scientists invented anaesthetics, doctors used brandy and bourbon to numb patients before surgery and dentistry. Nowadays, people use alcohol to numb their brains from the maladaptive effects of lifelong trauma. Scratch below an addict’s surface, I have learned, and you’ll find somebody who survived something horrific, usually at a very early age. The issue isn’t whether, it’s what.



Sadly, I never knew Jeff well enough to understand his full story. He fleetingly mentioned an adversarial relationship with his own father, but always changed the topic quickly. He was determined to not repeat his father’s mistakes with his own son; but he also hadn’t yet grappled with his own history, and therefore needed to numb the pain artificially. So he surrounded himself with living things he could love.

Besides Pele, he had two dogs, an energetic little lapdog and the chillest retriever mix I ever met. As his marriage crumbled, and he saw less of his son amid a difficult custody battle, he doted religiously on his animals. But as his divorce dragged on interminably, he couldn’t make house payments, and eventually needed to move back in with his mother—a humiliating concession for a 46-year-old man. So he needed to re-home his animals.

He was red-eyed, and even drunker than usual, the day I arrived to take Pele home with me.

Not much later, Jeff got into a heated argument with a manager and walked off the job forever. I only saw him twice after that. Both times, he had his son with him, as well as his wits. But I heard stories from other colleagues who ran into him without his son. His drinking had apparently intensified; one reported he’d begun suffering minor hemorrhages because his capillaries were shot. I also heard he’d begun shuffling when he walked, like a much older man.

I wonder whether Pele is capable of understanding Jeff’s absence. Like me, he knew Jeff less than one year. I’ve cuddled Pele and talked to him about what happened, but he just blinks his pretty golden eyes, so I don’t know. I’m typing this through tears, while Pele purrs contentedly in my lap. Maybe he knows he’s loved right now. Maybe that’s enough.

Wednesday, February 14, 2018

Unbalanced Nutrition for Unbalanced Times

When Rosa Foods introduced its meal replacement shake, Soylent, my fellow science fiction nerds couldn’t resist the obvious jokes. This product apparently originated in a world free of hammy Charlton Heston impersonations, where nobody would brandish their canister of pre-made powder and shout “Soylent Green is people! It's people!” I was surprised later to discover that the inventor intended this connection deliberately. Irony lives, I guess.

Soylent isn’t the first meal replacement shake I’ve encountered. However, it’s the first I recall that wasn’t designed as part of a health-conscious dietary regimen, such as a high-protein diet combined with workouts and timed fasts. Instead, Soylent is marketed to well-off professionals who can’t spare fifteen minutes per afternoon to make themselves a sandwich. Whip this stuff together, marketing says, and keep going without the tedium of lunch break.

Anybody attempting a meal replacement will probably have two questions: is it nutritious? And, will it satisfy me? Sadly, the answers are no, and no. Though rich in “micronutrients,” it doesn’t provide enough to fuel typical human activity throughout the day, and those nutrients are more than offset by the sugar content. And while the shake is temporarily filling, it isn’t really satiating. I compare it to plugging your hunger with a Snickers bar.

Consulting the nutrition label, one listed serving of this product contains twenty percent of your Recommended Daily Allowance (RDA) of several important nutrients, including Vitamin D, Vitamin C, various B-complex vitamins, and iron. It also includes twenty percent of various other substances we don’t normally consider nutrients, including copper, choline, biotin, and molybdenum. Twenty percent of all of them. Always, consistently, twenty percent.

But when we get off the “nutrients” train, the numbers get wonkier. One serving contains twenty-six percent of your RDA of fats, including thirteen percent of your saturated fats, and thirty percent of your RDA of added sugar. Replace one meal daily with this stuff, and you’ll have to skip dessert. It also provides about twenty-one percent of your daily fiber, two-thirds in the form of soluble fiber, which mostly just expands in your gut, making you feel full.

Charlton Heston (left) and Edward G. Robinson in Soylent Green

It’s that thirty percent daily added sugar that disturbs me. Current scientific thinking contends that obesity is caused, not by eating fatty foods, but by consuming more sugar than our livers can process; our bodies respond by storing the added sugar, and added liver enzymes, in fatty tissue to process later. Except that later never comes. If you ate a diet balanced like Soylent, by the time you consumed your full RDA of other nutrients, you’d have eaten 150% of your daily sugars.

Leaving aside the specious nature of RDA computations, the fact is, your RDA of sugar and sodium is a dietary maximum, while your RDA of magnesium, niacin, and other nutrients, is a minimum. Humans evolved in environments where certain substances (salt, sugar) were scarce, but others were abundant; we retain some, and piss away whatever we don’t need of others daily. Modern processed foods reverse this balance. The effect shows on our waistlines.

Even Rosa Foods wouldn’t recommend living on Soylent for every meal. But if you ate this balance at every meal, you’d get 150% of your daily sugar, and seventy percent of your sodium, before you reached your daily necessity of other nutrients. If you just had Soylent for lunch, you’d still need to eat a nutritionally rich dinner, with no dessert, and skip your afternoon Pepsi, to balance your diet. That maketh me not happy.
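For anyone who wants to check that arithmetic, here’s a quick back-of-the-envelope sketch in Python, purely for illustration. It assumes the per-serving figures quoted above (twenty percent of most nutrients, thirty percent of added sugar); the per-serving sodium figure is inferred from the seventy-percent total rather than read off the label.

    # Rough check of the Soylent arithmetic above.
    # Per-serving percentages of the RDA, as quoted in this post; the sodium
    # figure (14%) is inferred from the 70% total across five servings.
    NUTRIENTS_PER_SERVING = 20  # % of RDA for most vitamins and minerals (a minimum to reach)
    SUGAR_PER_SERVING = 30      # % of RDA of added sugar (a maximum to stay under)
    SODIUM_PER_SERVING = 14     # % of RDA of sodium (a maximum), inferred

    servings = 100 / NUTRIENTS_PER_SERVING        # 5 servings to hit 100% of most nutrients
    total_sugar = servings * SUGAR_PER_SERVING    # 150% of the sugar maximum
    total_sodium = servings * SODIUM_PER_SERVING  # 70% of the sodium maximum

    print(f"Servings to reach 100% of most nutrients: {servings:.0f}")
    print(f"Added sugar consumed by then: {total_sugar:.0f}% of the daily maximum")
    print(f"Sodium consumed by then: {total_sodium:.0f}% of the daily maximum")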

All that, for a “meal replacement” that basically only takes the edge off your hunger. I’ve used this to replace my breakfast, and felt myself getting hungry again only ninety minutes later; by lunchtime, I could murder a cheeseburger, and I confess, that impairs my judgement, making me crave carbs and fat. I compared this to eating a Snickers above. Yes, both make you feel not hungry. But they’re filling without being satiating. You pay for that later.

I’d be remiss in ignoring Soylent’s convenience. We’ve all had days when even preparing and eating breakfast cereal would be an imposition. But like fast food, wise eaters should indulge this convenience as infrequently as possible. Each individual use might be okay; only when it becomes a pattern does it become a problem. Users who monitor consumption, as through a Weight Watchers journal, can probably add this to their diet, if they remain mindful. Just don't make it a daily thing.

And no, Soylent isn't people. It isn't people.

Monday, October 30, 2017

How Often Should You (Yes You) Shower?


The tease on The Atlantic’s Facebook page was simultaneously tempting and disgusting: “What Happens When You Quit Showering?” It’s just one more in a stream of such pieces that have trickled out through the years, from Chip Bergh, CEO of Levi Strauss & Company, telling people never to wash their jeans, to Buzzfeed citing that beloved go-to, “science,” to tell Americans we shower too often. A real cottage industry has developed recently, selling the idea that we just don’t need to bathe frequently.

I understand the mindset behind this thinking. Where Americans once bathed about weekly, unless they carried an absolute reek, major corporate products like Ivory soap and Listerine mouthwash have convinced us our bodies are petri dishes of disgusting microorganisms, and we need to confront BO and “halitosis” daily, if not more. Nor is it just Americans. Per The Atlantic again, the global average includes a shower or bath daily, and a shampoo every other day.

Besides the health concerns of ordinary people washing necessary, symbiotic organisms off their bodies, this anti-bathing trend also reflects a pushback against advertisers who profit from our insecurities. I certainly don’t mind seeing corporate profiteers getting a firm public comeuppance for peddling self-loathing to the public. But I spot an unexamined class-based assumption in these articles. They implicitly believe typical Americans don’t get dirty and sweaty enough to even require that much bathing and laundry.

Since I reluctantly accepted my move down society’s economic ladder, and got a blue-collar job, I’ve come to understand that sweat and dirt delineate American social class. When I worked in the factory, indoor temperature often approached eighty degrees Fahrenheit even in the dead of winter; in summer, temperatures nearing a hundred degrees weren’t uncommon. And that’s nothing beside conditions working construction, where we’re expected to work in blazing summer sun and Arctic winter chill.


If I don’t shower daily, I smell like somebody left bologna on the kitchen counter for about a week. Sweat pours off me in quantities you’d not believe without seeing it. Many joggers, cyclists, and other high-impact aerobic workout fanatics think they sweat; I certainly did when I worked white-collar and biked regularly. But I didn’t discover what sweat meant until it bucketed off me for eight hours or longer, like water wrung from a kitchen sponge.

Showering isn’t optional for people who work like that. Since humans expel nearly a third of our bodily waste through our skin, the exceptionally moist environment of a sweaty construction worker creates a thriving breeding ground for microorganisms that, yes, produce an odor. Many of these microbes are necessary for human health… in limited quantities. But my sweat-soaked body creates conditions where my otherwise healthful, necessary microbiome makes me stinky, and could unbalance my health.

And that’s saying nothing about other class markers. I remember watching Star Trek years ago; Dr. Crusher complained to some hirsute crewmates that, since the invention of the razor, having facial hair was an affectation. I wondered then why the hair growing naturally from men’s faces was affected, while barbering it daily was not. Now I realize something deeper: facial hair keeps hostile conditions, like wind and precipitation, off my skin. Shaving jeopardizes my safety.

Considering that Levi Strauss invented his heavyweight dungarees specifically to withstand the punishment gold miners put their britches through, Chip Bergh’s whine that you needn’t wash your jeans seems remarkably obtuse. My jeans get battered, torn, grease-stained, and worse, daily. If, atop that, I had to slip them on without thoroughly washing my microbiome off (remember what we said about bodily waste above), that would be like wearing the same tattered, shit-stained BVDs, every day, for years.


So when The Atlantic and Buzzfeed insist I should stop showering, they clearly think people like me don’t read their content. They occupy a rarefied world where pundits in suits scold one another about internalizing commercial hype and selling their self-esteem to our corporate overlords. If they realize people like me exist, they certainly don’t give us further thought. They accept the hierarchy, where building structures and making stuff is too menial for their audience.

Maybe bankers and attorneys could afford showering every two or three days. Many probably should. But audiences are diverse, and many haven’t achieved the lofty standards high-gloss magazines promised us in college. These one-size-fits-all hygiene tips innately assume wealth, and indoor work, are normal. And they exclude the people who build their offices, maintain their server farms, and cook their cafeteria dinners. People like, well, me. Do I need to explain why that’s a problem?

Monday, June 6, 2016

When Bread Could Kill You

Paul Graham, In Memory of Bread: a Memoir

Paul Graham, an upstate New York English professor and gastronome, established elaborate rules for himself: Cook your own food. Use local ingredients. Keep fat, sugar, and glycemic index low. Cooking for his wife, eating the bread she baked, and home-brewing beer with his best friend were staples of building a sustainable, locavore lifestyle. Everything food hipsters say will keep us, and the land, healthy. So he couldn’t understand the sudden, shooting bursts of abdominal pain.

Diagnosed with celiac disease at age 36, Graham found himself in an increasingly common situation. Diagnosis rates worldwide have skyrocketed. But are celiac, and other gluten intolerance disorders, really more common today? Or are people who were previously misdiagnosed now being recognized? (This isn’t academic. I have two gluten-intolerant friends, one of whom was tested for everything from cancer to lupus for over a decade.) Graham resolved to do what scholars everywhere do: research the situation, and report.

This volume starts with Graham’s own situation. It’s primarily a memoir of Graham’s struggle as he goes wholly gluten-free. Fortunately, his wife joins him on the journey. I wish I’d been that brave; when my then-girlfriend was diagnosed gluten-intolerant, I selfishly hoarded coffee cakes and cinnamon rolls. But Graham and his wife, Bec, find they’re not just giving up one ingredient. They’re walking away from buffet spreads, pub nights, and food’s historic social implications.

Wheat agriculture, it appears, helped form Western civilization. As Graham’s investigation expands into the history and science of gluten, he finds wheat so basic to Western history that to abjure eating bread (Graham loves the phrase “wheaten loaf”) means to not participate in our culture. Food-sharing rituals, from pot-luck brunches to Catholic communion, underpin Euro-American culture, and eating bread looms large. Maybe that’s why humorists and hipsters treat gluten-free dieters as mere figures of ridicule.

Since Graham, an award-winning food writer besides his professorship, cooked for himself, and his wife baked, food wasn’t just bodily sustenance; it bolstered the intimacy of his marriage. Thus, for him, the macro-scale and the micro intertwined. Many recipes, and many prepared ingredients, involve wheat where you’d never look for it, especially as a stabilizer. As he abandoned the cultural history of eating wheat, he also lost the personal history of preparing his own dinner.


Our isolated, private society today often loses the community aspect of food. But the simple act of sharing conversation around the table has historically underpinned our society. When he had to walk away from that history—not just the cultural history of shared food, but the personal history of knowing how to prepare his own dinner—Graham had to relearn everything he knew. Not just about food, but about himself, and his place in society.

For one, he has to rediscover how to be Paul Graham in a world where hobbies like baking and brewing are suddenly off-limits. He needs to relearn cooking. Many store-bought gluten-free (GF) foods simply substitute rice, tapioca, or sorghum flours for wheat, assuming the process remains unchanged. Not so, as Graham discovers in actually preparing edible GF bread. His mentors, though meaning well, taught him concepts that no longer apply. Cooking is an adventure again.

Is bread even really necessary? Graham suggests many deeply ingrained expectations regarding food, the centrality of bread among them, are learned rather than innate, though nearly impossible to discard. With time, he internalizes the systems necessary for understanding the new world he finds himself thrust into. Though by the end he returns to home-baking his own GF bread, he acknowledges that even this means unlearning habits he’d previously mastered, and embracing everything his teachers told him to avoid.

By his own admission, Graham set himself many food-related rules well before the onset of celiac disease. His “locavore” proselytizing sometimes gets intrusive, and his quest for celiac-friendly foods at farmers’ markets seems quixotic. But everything he says sounds familiar to anyone forced, by health or circumstance, to abandon wheat. The discomfort at public food gatherings (can I eat off this buffet? How do I know what’s safe for me?). The mockery one faces simply for how one eats.

If it’s true that only the intimately personal is truly universal, Graham achieves that here. No two gluten-sensitivity sufferers have identical symptoms; that’s what makes diagnosis so difficult. However, everyone who abandons gluten endures the same isolation: the same withdrawal from easy carbohydrates, the same alienation from bread-eating friends, the same journey through dietary blandness. His memoir of struggle can inform all readers, and offer hope that leaving gluten doesn’t mean leaving good food forever.

Friday, June 3, 2016

Are Vitamin Supplements Bad For You?


Let me begin by answering my title question bluntly: yes. The overwhelming preponderance of scientific evidence indicates that consuming high-dose vitamin supplements has negative long-term health effects for most people. If, like me, you find perusing scientific literature sleep-inducing, the findings have been condensed by everyone from The Atlantic to The Daily Mail to comedian Adam Conover (see above). Only the vitamin industry’s well-funded trade association disagrees anymore.

That said, I only recently discontinued my daily vitamin regimen. Despite having read the relevant reports; despite understanding the health risks associated with throwing my natural bodily harmonies out of balance; despite the fact that they’re so damn expensive, I continued taking daily multivitamins for years. If even an educated individual like me, someone who takes pride in staying abreast of facts, continues doing something harmful, we should ask: why?

Please don’t misunderstand me. I don’t believe I’m representative of humanity in general, even considering the well-documented tendency all people have to consider themselves normative. Rather, I only wonder, when the preponderance of evidence holds so heavily with one position, and the known history of the debate is, at best, weird, why anybody who reads would continue consuming something known to cause harm. I suggest the alternative simply feels worse.

In my case, having been pressured by people I considered trustworthy to try a daily multivitamin regimen, I purchased an inexpensive supplement, gave it a try—and immediately felt better. Not in some abstract psychological sense, either. (Warning: grossness follows.) The second day of the regimen, I passed a massive bowel movement. Also the third day, fourth day, fifth day… massive, soul-shakingly cleansing bowel movements, every day for over two weeks.


Whatever revolting toxins my body had been stockpiling, the vitamin apparently helped purge. The improvement was immediate: with that waste gone, I had more energy, better moods, and greater mental acuity. I began a moderate exercise program, spent more time on creative activities and less watching television, and produced better work on the job. Simply put, I took vitamins and felt well.

Reasonably speaking, I know that consequence accrued from the supplement’s probiotic content. Having eaten a diet rich in starches, sugar, and fat, and light on fiber, green vegetables, and roughage, my intestinal flora was unbalanced and weak. The probiotics, which I probably needed only in small doses for a short time, pushed long-held waste from my body. From there, I should’ve adjusted my diet, embraced healthier living, and moved on.

But we’re all human, subject to the same anchoring biases and post hoc reasoning as anybody. I took multivitamins and felt well; therefore, I reasoned, the multivitamins caused my discernible improvement. Intellectually speaking, I know they didn’t. Having restored my health by simple, brief interventions, I had an opportunity to adjust my lifestyle for improved health. Instead, I latched onto the one visible change that preceded my palpable bodily improvement.

In today’s fast-paced world, most Americans eat badly. “Nine out of every ten Americans don’t get their complete nutritional needs from food alone,” a popular advertising campaign warns, before urging us to purchase supplements. The campaign never questions the common American diet of restaurants and processed foods, overloaded with red meat, sugar, and heavy starch. If Americans can’t get their nutrition from food, maybe the problem isn’t us; it’s our food.



Admittedly, we face complex restrictions. Our work lives are increasingly performed indoors, seated, without needed exercise, sun, or human companionship. Sweet, fatty foods fill the resulting psychological hole… deeply, but briefly. Busy two- and three-income families have little time for home cooking. And the US Recommended Daily Allowance (USRDA), little changed in two generations, measures the nutrients needed to prevent deficiency, not to thrive.

Nevertheless, multivitamins, like drugs, alcohol, and television, offer the promise that we’ll feel better, healthier, restored. People embrace fad diets, like gluten-free or macrobiotic, without considering their individual health needs, because they hope to feel better. Some do: many people going gluten-free experience rapid weight loss. This happens mainly because they stop eating processed foods, but, as with my multivitamins, they credit whatever immediately preceded their improvement.

Maybe, faced with massive dietary shortfalls and unsupportable lifestyles, Americans should consider the forces making them feel bleak. Reaching for superficial solutions feels good but accomplishes nothing, and the many magazine articles scolding vitamin buyers have produced little effect. The problems underlying lopsided lifestyles, like basic poverty and lack of autonomy, loom so large they’ve become invisible. If vitamin buyers hope to feel good, let’s investigate why they feel bad in the first place.

Wednesday, March 18, 2015

The Well-Engineered Self

Gretchen Rubin, Better Than Before: Mastering the Habits of Our Everyday Lives

Attorney turned author Gretchen Rubin has previously written about how people become happy, and why they don’t. A trained legal researcher, she brings a scholarly eye to her projects that isn’t exactly journalistic. She writes with a mix of acumen, anecdote, and humor that reaches certain audiences where they live. Now she turns her attention to the question: how can we make happiness self-supporting?

Humans, Rubin writes, are necessarily creatures of habit—we have to be. Easily forty percent of our daily activities function habitually, on neurological automatic, because we cannot spare the mental focus to make every decision consciously. But we often fall haphazardly into habits that bring no satisfaction and obstruct productivity. Rubin encourages greater mindfulness about our habits, and engineering them deliberately.

Well-built habits begin with self-knowledge. Rubin acknowledges that many self-help books falter because authors assume their personalities are somehow normative. Rather than one-size-fits-all prescriptivism, Rubin begins by coaching readers through steps to recognize their personality types. She broadly outlines what she calls the Four Tendencies, which superficially resemble the MBTI types, though the parallel only goes so far.

I concede doubts about these Tendencies immediately. Rubin’s descriptions of Upholder, Questioner, Obliger, and Rebel run as broad as newspaper horoscopes, so inclusive that most people could recognize themselves in these categories. I recognize Questioner and Rebel in myself by natural tendency, and Obliger and Upholder encultured by my upbringing. Actually utilizing these Tendencies will require substantial winnowing.

Leave that aside momentarily, though, because subsequent chapters on the “Pillars of Habits” have robust roots in hard science and personal observation. Too often, we reprimand people for not fixing their own problems through undifferentiated willpower; pop visions of habit formation rely upon common sense, not replicable science. But as Duncan J. Watts writes, common sense is often neither common nor sensical.

Gretchen Rubin
Some of Rubin’s pillars are internal, particularly her emphasis on diligent self-monitoring, while others are external, like the need for human accountability. Some straddle this divide, like writing firm schedules, which starts internally but creates an external document that demands users’ respect. Either way, they underscore that humans need a combination of self-awareness and public mutuality that has become all too rare.

Once habit-forming behaviors commence, Rubin unveils the steps necessary to encourage productive habits and discourage counterproductive ones. For instance, we’re more likely to continue patterns we perceive as convenient, and to halt those we consider inconvenient, so it matters to engineer our lives so that desirable behaviors are also handy. Similarly, she suggests “pairing,” or linking something we ought to do with something we want to do.

Readers may find Rubin’s abstention chapter most controversial. We’ve heard the claims: I can occasionally indulge this behavior without becoming habitual, or once an addict, always an addict. Debates over addictive habits, from drugs to porn to workaholism, rehash this point, with people insisting that whatever works for me works for everyone, QED. Rubin dares suggest that humans are individuals, and you’ll know whether you can safely chip.

Perhaps Rubin’s most revealing chapters deal with rewards versus treats. Though we may consider these interchangeable concepts, Rubin demonstrates they’re definitively not. Treats uplift our spirits, giving us motivation to continue onward. Rewards, and their close cousin, finish lines, permit us to consider the process “done.” As Rubin notes, and you’ve probably noticed yourself, once we halt desirable behaviors, getting started again is nearly impossible.

This book essentially isn’t for me. Rubin’s message often overlaps with Charles Duhigg and Kelly McGonigal, without their scientific grounding. In my experience, I (and many others) need the science to remind us why certain processes work and aren’t just mindless ritual. With her instructional bromides coupled with upbeat anecdotes, Rubin more resembles a motivational speaker, writing for audiences who need motivation over data.

Also, my opinion is somewhat colored: I read Rubin directly after Johann Hari’s Chasing the Scream, about the drug war. Quoting multiple researchers, Hari demonstrates that addictive behaviors—the worst of all bad habits—result markedly from childhood abuse or social isolation. Since many of Rubin’s precepts involve social connections and unearthing buried causes, she’s perhaps stumbled onto principles with yet-unexplored mass social implications.

With those caveats, Rubin writes an engaging book with many actionable principles. Though she doesn’t get into technical details, her points are specific enough that most people could actually apply them in ordinary circumstances. And though early precepts sometimes run vague, Rubin’s overall approach gives readers tools enough to improve their regular choices and create better circumstances and better lives.

Wednesday, March 11, 2015

Doobie Newbie Blues

Click to enlarge
The image at right appeared on a friend’s Facebook feed this weekend. Having recently become interested in the sociology of drug use, and the forces that make illegal drugs a desirable choice for so many people, I’ve developed a reflexive distrust of blanket statements about how drugs work. So I decided to don my Snopes cap and investigate this claim’s truth value.

Though floated by various law enforcement, civil service, and public interest groups, this image originates with the American Lung Association. The ALA’s original source material verifies the picture’s essential claim, that marijuana (hemp, cannabis sativa) certainly does deposit more tar on human lung tissue than commercially manufactured, legally sold tobacco cigarettes. As it stands, this claim appears substantially true.

However, reading the source requires understanding basics of argument analysis. If I had my way, Darrell Huff’s 1954 classic How to Lie with Statistics would be required reading in every American middle-grades classroom. Anybody familiar with Huff’s principles will immediately recognize, reading the ALA’s statistics, that they’re guilty of several mistakes. For our purposes, the most important is “Comparing the Incomparable.”

By its own admission, the ALA compares machine-manufactured cigarettes, tipped with cellulose fiber filters, with hand-rolled marijuana doobies. Nearly everyone agrees that cellulose filters reduce the quantity of tar, fine particulates, and other contaminants in cigarette smoke. However, there’s no agreement on whether that actually yields significant health benefits. Provided they remain moist, meaning alive, tarry lungs continue to function.

Promo art for Reefer Madness, possibly the most
moralistic, wrong-headed anti-drug propaganda ever
There’s also wide agreement that filters don’t obstruct the inhalation of nicotine. The psychoactive component in marijuana smoke, tetrahydrocannabinol (THC), has no known lethal dose in humans. We cannot say the same of nicotine, which is toxic enough that only by diffusing it into microscopic particles, inhaled amid hot smoke into the lungs, can it deliver its psychoactive effect without poisoning the consumer outright.

Also, the ALA compares cigarettes, manufactured under highly regulated standards, to marijuana, which is grown, distributed, and consumed with no purity standards whatsoever. Cigarette manufacture and sale are strictly regulated at the federal level, by the Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF) and, since 2009, by the FDA. This oversight gives the government remarkable authority over tobacco purity, and gives consumers confidence regarding what their smokes are made of.

Marijuana, by contrast, remains largely in the control of criminal enterprises, which can enforce standards only via assassinations. Once marijuana supplies reach street customers, we have no confidence that our wacky tobaccy hasn’t been thoroughly adulterated with pine needles and lawn clippings. Drug dealers are known to stretch supplies by cutting cocaine with baking soda, or heroin with chlorine bleach.

Therefore, if the marijuana consumed by regular users has damaging health consequences beyond ordinary THC effects, well, why is that? It’s hard to differentiate actual marijuana effects from those created by drug prohibition. Say whatever you will regarding the moral implications of corporate regulation. But anyone concerned about public health will agree that having government oversight of food and tobacco production beats not having it.

The Camberwell Carrot, the iconic oversized doobie in
the 1987 British classic Withnail and I
America has attempted limited-scale regulation in this manner. Besides the four states and the District of Columbia that have legalized marijuana possession for personal consumption, several states (the number changes so quickly that I cannot find reliable sources) have legalized medically prescribed marijuana. But the patchwork of state regulations, and the ease of interstate transit, makes even this regulation slapdash and unreliable.

This being the case, any reasonable person must consider the ALA’s conclusions unreliable at best. On multiple levels, they compare dissimilar products that have little overlap. Imagine if the FDA published a white paper comparing patent-pending pharmaceuticals with homeopathic peach-pit cures. If the FDA proclaimed the natural superiority of lawful, government-approved products, tested in its own labs, we’d have legitimate reason to pause.

The ALA, therefore, demonstrates less about marijuana itself, and more about the destabilizing consequences of drug prohibition. By concentrating control of drug commerce into criminal hands, we provide economic incentives to organizations like the Sinaloa and Zeta cartels. Anti-drug advocates might insist that drugs are bad, and I agree. But banning them doesn’t make demand go away; it merely reorganizes the routes of transit.

Users embrace marijuana for many and diverse reasons, just as people embrace legal drugs like nicotine, alcohol, and caffeine. The distinction between which drugs are prohibited and which remain lawful has mainly moral, rather than medical, foundations. If merely being dangerous were sufficient, we’d have banned liquor and tobacco generations ago. How about, rather than clouding the debate with tar, we get to know users as humans?

Friday, May 30, 2014

Gluten Sensitivity Insensitivity


A recent YouTube video has garnered social media applause. Late-night host Jimmy Kimmel shoves a microphone into the faces of people who eat gluten free, asking if they know what gluten is. They don’t. Kimmel is a comedian, and edited his ambush interviews to maximize laughs, but many friends I previously trusted called this montage proof that “gluten sensitivity,” one of today’s fastest-growing food issues, doesn’t exist. Gluten avoiders are ignorant, therefore the problem doesn’t exist, QED.

Many people have hopped on the “gluten sensitivity” bandwagon, avoiding gluten because it appears modish. Like fat substitutes in the 1990s, or the Atkins Diet in the 2000s, gluten avoidance gives uninformed hippies the glimmer of worldly wisdom. But their trendy behavior tars people suffering legitimate gluten intolerance. Having helped a dear friend through years of pain, embarrassment, and helplessness due to gluten problems, I find stunts like Kimmel’s personally offensive. So here’s a legitimate, fact-based introduction.

Gluten is a protein composite present in certain grains, especially wheat. It creates bonds between fibers, giving wheat bread the firm, pliable texture cornbread lacks. Gluten bonds so powerfully that processed food manufacturers add wheat flour to many foods you wouldn’t expect, like cheese sauces, to grant a passably food-like texture. Advertisers historically used wheat emulsion to paste up handbills and posters, because air-dried gluten sets harder than Portland cement. “Gluten,” in Latin, means “glue.”

Importantly, certain people lack the ability to tolerate wheat, rye, or barley gluten. Science doesn’t yet fully know why. In the best-known gluten intolerance, celiac disease, the immune reaction to gluten damages sufferers’ intestinal walls, flattening the villi so that vital nutrients pass through undigested and unabsorbed. Left untreated, the damaged gut can grow permeable enough to let bacteria and toxins into the bloodstream, risking sepsis. Untreated sufferers face the twin dangers of malnutrition and lethal septic shock.

To most people who don’t suffer gluten intolerance, actual gluten digestion problems can appear childishly comic. The first symptom many sufferers endure after inadvertently consuming gluten is uncontrollable farting. But when sufferers flee public scrutiny, they and their loved ones see more significant symptoms, including painful diarrhea, bloody stool, fatigue, hormone and electrolyte imbalance, migraine headaches, rheumatoid arthritis, misdiagnosed fibromyalgia, violent PMS, lupus, unexplained infertility, and (no kidding) bipolar disorder. Gluten sufferers often live functionally housebound.

Legitimate gluten intolerance tests do exist, but they have limits. The most reliable test involves a complete upper and lower endoscopy, meaning running a camera tube down a patient’s throat and another up the rectum. Though false positives are vanishingly rare with this test, false negatives run as high as twenty percent, meaning if you endure the most reliable test, and you have gluten intolerance, there’s still a one-in-five chance your results will be wrong.

(Edit: after writing the above paragraph, my source informed me I had a misunderstanding. The test with the high false-negative rate isn’t the most reliable test; it’s the test you have to take to qualify for the most reliable test. The whole issue is wrapped up in layers, and possible gluten intolerance sufferers have to educate themselves and be their own most passionate advocates.)

Therefore, even the fact that you’ve tested negative for celiac and other gluten intolerances doesn’t mean you have no problem. Celiac diagnosis rates have increased geometrically, doubling roughly every fifteen years. That’s slow growth, admittedly, but celiac and other diagnosable intolerance rates currently run about two percent in America. If two percent of Americans had lymphoma, we’d feel outraged if comedians treated lymphoma as ridiculous, and lymphoma patients as uninformed rubes. And with good reason, too.

The friend I helped with gluten intolerance flunked her celiac test because her endoscopy turned up no celiac perforations. Her doctors shrugged. She’d suffered years of severe central-body weight issues, a sign of cortisol imbalance. Her primary care physician suspected pheochromocytoma, but couldn’t locate a tumor. She was embarrassed to admit her chronic farting problem, which might’ve hastened a diagnosis, but maybe not; non-specialists aren’t well trained at spotting digestive disorders.

My friend finally visited a nurse practitioner, who observed her body structure, particularly how disproportionate her weight distribution was to her frame, and identified her as a likely celiac-type sufferer. This nurse wasn’t trained at spotting digestive disorders, either; she simply recognized in my friend the same symptoms her own daughter, a celiac patient, had endured before her diagnosis. Since my friend’s test returned what we now consider a false negative, her timely diagnosis was pure coincidence.


I’ve seen what happens to my friend if gluten quantities smaller than a bread crumb get into her food. Celiac sufferers react to gluten concentrations below twenty parts per million. I’ve seen her doubled over in pain, clutching her abdomen, sweating through the painful farts while waiting for the diarrhea to hit. I’ve seen her lose emotional control, seen her attention span dwindle to mere seconds, seen her sleep north of twelve hours because muscle fatigue leaves her depleted.

So by damn, don’t tell me gluten sensitivity doesn’t exist because hippies are ill-informed. The only wholly foolproof diagnostic test for celiac spectrum gluten intolerance is to go off gluten altogether for two months or more. If symptom constellations abate, winner winner chicken dinner. Sufferers can exclude gluten, one of the more common food intolerances, from their diets with moderate effort. But false friends mocking their health issues undermine confidence, making legitimate sufferers avoid treatment.

Like prior food fads, gluten avoidance will lose its hip cachet soon enough. But legitimate sufferers will remain, needing to scrupulously police their food intake. For people with clinical gluten intolerance, the medical necessity is as real as it is for people with peanut or bee-sting allergies. You wouldn’t mock a religiously devout friend for keeping kosher or halal, would you? Then how dare you mock your friend for avoiding a simple, preventable food reaction.