Friday, March 31, 2023

The New Public Face of Evil

I usually don’t like using moralistic words like “evil” to describe politics, because black-and-white thinking obscures the human souls involved. But watching the mass right-wing dogpile that happened within minutes this week after the Covenant School shooter was revealed as transgender, I think that word is appropriate. The conservative desire to turn this into a referendum on a class of people is, in a strictly Christian sense, evil.

Marjorie Taylor Greene

The key to understanding this evil lies in the second half of Marjorie Taylor Greene’s comment, above. She correlates the shooter’s gendered hormone replacement therapy with mental health drugs. Et voilà, we reach the core of conservative judgment: people struggling with mental health are considered morally weak. And gender “dysphoria” is, to the conservative mindset, a mental illness, not a medical condition to receive treatment and respect.

Despite recent trends toward racism and bigotry in American conservatism, elected conservatives still recognize the importance of coded language. They can’t explicitly say racial groups are devolved or dangerous; that’s why Scott Adams can’t buy lunch anymore. The same rules, though, don’t apply to mental health. It remains perfectly acceptable to say people with PTSD, substance abuse issues, or plain old depression are morally weak and individually responsible.

Elected Republicans have voted against providing assistance for mental health treatment, on a strict party-line basis, for years. They only mention mental health after mass shootings, to deflect from weapons hoarding as a cause of violence. Mental health remains toxic in America; PTSD is still not considered a “legitimate” war wound, for instance, and is not eligible for the Purple Heart. Mental health is considered a personal failing.

This duality becomes pointed after public displays, like Monday’s shooting. By making the shooter responsible by dint of personal moral weakness, conservatives exculpate the two military-grade rifles the shooter carried. If transgenderism exists entirely inside the sufferer’s head, then the sufferer is individually responsible for handling their thoughts. Any person who can’t handle their thoughts is beyond the pale and irredeemable.

But simultaneously, by making the individual morally responsible for their actions, conservatives make the group responsible. Because the group, in this case transgendered people, is organized around something that appears to be a moral choice, transgenderism itself becomes responsible. The mere fact that violence by transgendered people is vanishingly rare matters little. The group is responsible for this individual’s moral failure, and therefore all group members deserve punishment equally.

Right-wing moral judgment depends wholly on a narrow definition of what makes a good or bad person. Conservatives believe moral qualities originate within the individual, and each person is completely unique and isolated. Therefore nothing causes an individual’s actions but the individual’s moral core. Good people do good things, and anybody doing bad is always a bad person. Extenuating circumstances and cultural conditioning don’t exist; it’s always, and only, the individual.

Florida Governor Ron DeSantis, who signed
the “Don’t Say Gay” bill

That’s why, even as Republicans rush to blame this individual’s mental health for their violence, they continue voting to discontinue public support for mental health treatment, or even food support. Conservatives scramble to deny mental health treatment, poverty protection, and affordable health care to those who need it most. Thus they deny two groups Jesus specifically cited in the Parable of the Sheep and Goats: the hungry and the sick.

Food and health, American conservatives assert, are luxuries people earn. Those who can afford them, therefore, are already morally good people who don’t need outside support, while those who can’t afford them don’t deserve them. If they were moral people, the poor wouldn’t be in this predicament. Though American Republicans tout Christianity as their unifying moral force, they actively reject Jesus Christ’s exhortation to defend those who can’t defend themselves.

Thus America persists in doing everything possible to avoid protecting those most needful of protection. Because the defenseless are considered bad people, doing anything to defend them only empowers their need, which only reinforces their immorality. This directly rejects Gospel morality, as expressed by Jesus Christ himself, which states that the poor and the sick deserve Christians’ support, not because they’re uniquely holy or good people, but because they’re poor.

Don’t misunderstand me; this position doesn’t require endorsing changing gender roles or sexual values. This only requires acknowledging that all people are human, and deserve to be protected. But people like Marjorie Taylor Greene can’t do that. They believe protection only extends to “good” people, usually defined as people most resembling themselves. And I consider that evil. As Jesus Christ himself said, even the pagans and sinners can do that.

Wednesday, March 29, 2023

A Brief History of Germany Before “Germany”

Katja Hoyer, Blood and Iron: The Rise and Fall of the German Empire 1871–1918

When the Holy Roman Empire collapsed before Napoleon in 1806, it left a crazy quilt of German-speaking microstates across central Europe. These little Germanies were vulnerable to French, Russian, and Austrian dominion, but for decades struggled to unify. Their separate traditions, laws, and dialects made working together too difficult. They waited in vain for someone to unify them, until a militant nationalist stepped into the role: Otto von Bismarck.

If your high school World History course resembled mine, Germany largely disappeared from discussion between 1806 and World War I. Maybe you saw an Anton von Werner painting depicting the Empire’s proclamation, or an orphaned portrait of Bismarck, but certainly no context. German-born British historian Katja Hoyer steps into the vacuum. Her introductory history is broad and sweeping, and provides a good bird’s-eye view, assuming an introduction is what you need.

Hoyer organizes her history into five long, thematically linked chapters: the years leading to unification, Kaiser Wilhelm I’s reign, the tragedy of Friedrich III and Bismarck’s downfall, Kaiser Wilhelm II’s reign, and finally World War I. She attempts to present Germany’s arc in the most sympathetic terms possible. After all, under constant pressure from outside forces, Germany certainly needed a unified state to protect its people and traditions from trampling.

That sympathy isn’t rose-tinted, though. In Hoyer’s telling, Germany provided a necessary defensive service, but at a great price. Bismarck, the consummate national organizer, found the common needs of Germany’s tiny states, and knit them together under Prussia’s banner. Those common needs, though, were mostly for defense against Europe’s other empires. Therefore Germany was “unified” mainly by its ability to spot, and defend against, perceived enemies, foreign and domestic.

Note the “domestic” in that formulation. While Bismarck, and his puppet emperor Wilhelm I, definitely protected Germany against French and Russian territorial ambitions, Bismarck also despised change from within. He brought the same fervor to suppressing liberal democracy and nascent socialism that he did to expelling tsarists and Bonapartists. Karl Marx himself was never welcome back in his native Prussia; Bismarck saw to that.

Katja Hoyer

Basically, Germany looked at a Europe dominated by various empires in their death throes, violently lashing out at one another like wounded wolves, and thought: I’ll have that. Germans wanted a unitary monarch to rally around, and Bismarck gave them one, in Prussian King Wilhelm I. Admittedly, Wilhelm never wanted that power, and happily delegated actual authority back to Bismarck, which suited both men, and Germany overall, just fine.

Unfortunately, some people believed the nationalist mythology of a unitary Kaiser, and young prince Wilhelm was one. Though the old Kaiser’s son Friedrich III was progressive-minded, and might’ve extended democracy to Germany, he inherited the throne already terminally ill, and reigned only 99 days. Then power passed to Wilhelm II, who honestly believed the claptrap Bismarck had sold Germany, and set course to rule single-handedly.

History shows how that ended.

Hoyer describes this history in sweeping, synoptic terms. She spends little time unpacking individual events, and almost none on individual personalities, except for Bismarck, the two Wilhelms, and a little about Friedrich III. Hoyer cares less about the events and personalities which comprise the narrative, and more about the overall social forces driving them. Thus she cites names and places, without always explaining why they matter in any particular situation.

She also avoids topics that don’t play into her core interest in political history. The book includes two orphaned references to composer Richard Wagner, and a brief passage about how Germany’s urban proletariat turned culture into a consumer commodity, but nothing much about cultural forces overall. Similarly, though Hoyer admits religion and secularism played into national identity, she doesn’t unpack them beyond how political parties exploited them.

This book therefore courts an audience not necessarily familiar with German Imperial history. And the history Hoyer provides is the history of the German state, not the German people. Readers already familiar with pre-WWI German history will probably find this book excessively synoptic, while readers who don’t know Germany but enjoy deep dives into history (like me) will wish she’d paused to unpack why exactly this story matters.

Notwithstanding Hoyer’s brevity, this monograph concisely introduces an aspect of history often treated hastily. Bismarck’s intricate political horse-trading, and his heirs’ inability to preserve what he started, have plenty of steam-age drama. If Hoyer doesn’t unpack much herself, she at least introduces enough to let us decide what’s worth a deeper dive. I suspect many English speakers don’t know this corner of history, and we should.

Tuesday, March 28, 2023

The Making of a “Classic”

Agatha Christie

“Is there a reason a video game can’t be considered a classic?” I remember a student asking in English 102. “Why can’t a video game be ‘literature’?” Our class that semester was dedicated to researching and writing about the meaning of education, especially higher education, and what it means to be an educated person. The student was an English major, but uncomfortable with the canon that the discipline considered “classic.”

I recalled this student this week, when publisher HarperCollins announced that Agatha Christie would be the latest popular dead writer to have their work rewritten to exclude inflammatory language. Christie, whose work hasn’t gone out of print in a century and whose novels have been more widely translated than Shakespeare’s, reflected a number of late-Empire attitudes in her writing. Her protagonists often disliked “foreigners” and used vulgar racist language.

My student got a discussion rolling over what works were considered “classics” in high school English classes. Why, for instance, is The Great Gatsby considered classic literary art, but The Maltese Falcon still lumped with “genre” fiction? Why Shakespeare, but not Neil Simon? We never reached a satisfactory conclusion in class, but I repeated the question to other sections, and eventually, I think we reached a provisional definition of literature.

A work becomes a “classic,” we tentatively decided, when it says something about the time when it was written, and also about the time we read it. In other words, classic literature isn’t transcendentally timeless; it belongs to the time when it was written. This sometimes means outdated attitudes, such as Atticus Finch’s White Savior mentality. But even outdated content says something about us, now, reading and interpreting the text.

(Incidentally, by this definition, a video game narrative could potentially become a classic. The game itself couldn’t, however, because the technology necessary to read it becomes outmoded and illegible to future generations. Thus popular video game adaptations, like the current TV series The Last of Us, might potentially outlive the games that spawned them.)

Roald Dahl

That returns us to the current mania to rewrite classic literature. Sure, sometimes that’s necessary. The widespread outrage over rewriting began with Roald Dahl’s novels, which often contained jarring racism. In Dahl’s 1964 novel Charlie and the Chocolate Factory, the Oompa-Loompas were explicitly described as African pygmies working without pay. First-generation readers considered this so offensive that Dahl himself rewrote that aspect for the 1973 rerelease.

But there’s a difference between an author acknowledging public taste (and his own egregious cock-up) and correcting it, and uncredited outsiders imposing another era’s views on that author. One author being heavily rewritten is Ian Fleming, who created James Bond. In Fleming’s telling, Bond was racist, frequently anti-American, and had little patience with anybody not British. Far from the cinematic hero depicted by MGM, Bond was a lousy human being.

Yet I’d contend that’s the point. Like John le Carré or Frederick Forsyth, Fleming wrote about the kind of bottom-feeder who generally worked in international intelligence. Anybody who reads the history of international affairs knows that the Cold Warriors working for spy agencies like MI6 or the CIA were often little better than the miscreants they hunted. These novels usually featured what one critic called “the pretty bad standing against the truly awful.”

When ghostwriters shave Bond’s chauvinism, nationalism, and bigotry away, they lose something of the Cold War milieu that produced him. James Bond emerged from an atmosphere of realpolitik that found his skills useful. But Britain also sent him overseas because it wanted to put men like him as far from the homeland as possible. James Bond is, simultaneously, a British national asset, and a liability to a global superpower’s self-image.

Ian Fleming

Ian Fleming, Agatha Christie, and Roald Dahl wrote during the British Empire’s final crumbling years. Britain’s economy, culture, and self-image were in a state of flux, one that arguably paid off in the ignominy of Thatcherism. Their racist ideas reflected at least a portion of Britain at the time, watching burnished imperial splendor wither, exposing the moral rot that had always lain underneath. Arguably, they handled the change badly.

Except I believe that’s the discussion readers need to have when reading these works. Poirot and Miss Marple upheld the law, which included parroting top-level White supremacy and late colonialism. James Bond defended a homeland that was embarrassed by him. Charlie Bucket inherited an economy based on enslaved labor. These works are “classics,” not in spite of, but because of these traits, and the questions they ask us, the readers.

Friday, March 24, 2023

The Myth of Laziness in Modern Society

Devon Price, Ph.D., Laziness Does Not Exist

Dr. Devon Price believes we’ve all willingly swallowed “The Laziness Lie,” a narrative that ties human worth to productivity, self-sacrifice, and beauty. If we aren’t constantly economically productive, giving of ourselves for others, and pretty, we’re disparaged as lazy. This matters because “laziness” is an entirely personal fault, and if we aren’t following somebody else’s arbitrary yardstick, that exonerates the system and makes us individually bad people.

In brief, the behaviors we disdain as laziness in others, and in ourselves, serve solid purposes. When we’re fatigued, bored, and alienated from the fruits of our labors, we mentally check out from work. When loved ones impinge upon our limited mental energy with their demands, we become resistant to, and resentful of, our relationships. When we’re disabled and literally can’t do any more, despite society’s demands to “push through,” we ultimately collapse.

Price therefore asserts that what we malign as laziness is often our bodies or brains indicating that we’ve pushed too far. Fatigue or mental resistance are how our physical selves broadcast that they’ve been overextended. But our culture and economic structure demand we ignore our feelings as “irrational” and continue overworking ourselves. Our unelected leaders present fatigue, sleep deprivation, and pain as indicators of great moral virtue.

“Laziness” has a relatively recent origin. The word isn’t even attested in English, Dr. Price writes, until the middle fifteenth century—not coincidentally, about the time Western Europeans began keeping chattel slaves. Though global civilizations have always loathed idleness (in various ways), slave-owning economics necessitated inventing the “work ethic,” a codified way of blaming workers for needing rest, even when somebody else owns the fruits of their labors.

Price traces the development of “laziness,” as a moral pejorative, through industrialization, into modern technological society. Under capitalism, basic human needs like housing, food, and sleep, become rewards we earn by sacrificing ourselves to a forty-hour week. Sickness, weariness, and disability become something weak-minded people must endure stoically, not signs that we need to stop. Vacations and free time are luxuries, not human needs.

Devon Price, Ph.D.

Not surprisingly, Dr. Price dedicates around half his book to workplace ethics and “laziness,” and postulates multiple causes and solutions. What capitalist morality calls laziness, Price demonstrates, is a wholly reasonable response to bad pay and lack of autonomy. The mental barriers that demand we occasionally stop working are only “bad” because they deprive owners of wealth. And workers can resist this exploitation by presenting a united front.

More surprisingly, the other half of Price’s book goes beyond economics. Our relationships with others are often colored by fears that, if we don’t sacrifice enough of ourselves, we don’t love our friends and family enough. We often feel “lazy” and unworthy if we can’t support enough political causes. Even health and beauty frequently reek of “The Laziness Lie,” as disabled people and anybody who’s endured fat-shaming can tell you.

All these situations share a moral component. Whether it’s work, weight loss, or giving ourselves over to demanding kinfolk, we’re expected to perform with the constant, unrelenting efficiency of industrial machinery. But that’s just it, Price writes: we’re not machines. We’re not built from interchangeable parts. We’re complex, unique individuals with personal needs, and when we, or the people around us, ignore those needs, our bodies break down.

(Besides that, machines break down if they don’t rest occasionally.)

As a research psychologist and Loyola University professor, Price structures his arguments eloquently. He shifts seamlessly from individual experiences and case studies that demonstrate his point, out to statistical analyses showing what those experiences mean to us. This is classic textbook organization: what’s true for the individual proves portable to the population overall. Only the personal is truly universal.

Admittedly, many of Price’s case studies show evidence of selection bias. Everyone given the opportunity to describe their experience with burnout is college-educated and relatively stable. One person describes homelessness in the past tense; nobody’s pulling doubles at the factory or overnights at McDonald’s to afford childcare and keep a Dodge Challenger with expired tags running. If Price asked me, I’d love more economic diversity in his case studies.

Briefly, Price says, actions have purposes, even if we’re not conscious of them. When people start resisting externally imposed standards of productivity, attacking them (or attacking ourselves) as “lazy” uses moral imperatives to wave away the underlying problem. Until we find and address the purpose behind supposedly lazy behavior, nothing will change. When we accept our resistance as a sign of necessary repairs, we’ll finally fix our economy, and ourselves.

Monday, March 20, 2023

The Poor Intellectual in America’s Bread Basket

Sarah Smarsh

Late in journalist Sarah Smarsh’s autobiography, Heartland, she undertakes that most time-honored of adulthood rites: leaving for college. In Smarsh’s case, this passage carries special significance. As the first member of her family to attend college, she arrives without the prior knowledge of how to “do school” that many of her peers already possess. And coming from dirt poverty, she carries a need simply to survive that most college students lack.

Smarsh doesn’t dwell on this; it’s only one long-ish scene in her final chapter. She received a full-ride scholarship to cover her tuition, but needed three jobs to afford room and board. The system we have, Smarsh writes, tacitly assumes students’ parents will shoulder the financial burden of education, because America’s higher education system bears the toolmarks of its makers, who were themselves well-off and expected the same of their students.

I remember, during my brief academic career, repeating a time-honored bromide to my students: “Nobody’s ever too tired to read.” I’d heard that from my teachers in public school, and internalized it, despite not having gone straight from high school to college. An inveterate reader from my childhood, I saw reading as innately pleasurable, a source of energy rather than a consumer of it. And I couldn’t comprehend why anyone, like my students, saw it otherwise.

Not until my adjunct position ended without fanfare did I realize how false that claim was. Turned loose onto the demands of a regional economy that had little need for my skills, and desperately in need of grocery money, I accepted a job beneath my capabilities, simply because it was there. (And simply because I loathe the job-hunting process.) And within a matter of weeks, I discovered, for the first time, that it was possible to be too tired to read.

Sometime later, I would learn the mechanics behind this. Though the brain remains deeply enigmatic to scientists, our best research minds have definitely uncovered some facts. One is that the brain draws energy from the same well as the body. What’s more, it draws energy completely disproportionate to its mass: your brain is less than two percent of your body’s mass, yet consumes more than twenty percent of your body’s energy.

And when the well is dry, the well is dry.

Not until leaving academia and entering the factory (and later construction) did I discover what weariness meant. Sure, I’d been tired in high school, as many people were, but not the soul-sucking weariness of pulling an eight-hour shift, then coming home to housekeeping, cooking, and generally taking care of myself. Left with the same two or three free hours everyone else has, for the first time, I found myself too tired to read.

Reading Smarsh’s description of working three jobs to subsidize taking classes, I felt that weariness again. It’s taken me ten years to regain sufficient energy to read after work, and even that is inconsistent; most days I can read some, but some days, I’m fortunate if I can stare mindlessly at my phone for a few hours. Some days, I’m lucky to wolf microwave food before lapsing into coma-like sleep.

Yet despite that, Smarsh not only had wherewithal enough to complete her degree, she had enough to complete graduate school and move on to a career. Reading her story, it’s easy to understand why: she had a personal vision, one she wanted to pursue without regard for economic limitations. She was fortunate to have that. Too many of my students from poor backgrounds had few aspirations beyond a vague desire for middle-class comfort.

Many of my students, who heard me state that bullshit about “nobody’s too tired to read,” had outside jobs. At least two told me they were working nearly full-time during the week, then driving back to their hometowns to pull shifts at their parents’ farms or machine shops. Conventional academic theorists would say these students were cruising for failure, that working so many hours outside class guaranteed defeat. Work or school: pick one, you can’t do both.

I suspect these students would reply: “Rent is due.”

Education remains, at least nominally, America’s guarantee for a middle-class lifestyle. My poor students chased a degree, not to improve themselves, but to improve their economic prospects. Couple that with crushing student debt, and a job market that doesn’t offer self-sustaining jobs anymore, and school can be as much a recipe for failure as success. I can only imagine how insulting it was to hear me say “nobody’s too tired to read.”

Saturday, March 18, 2023

Your Childhood Nostalgia Is Stupid, and Dangerous

I’ve been thinking about Bobby Harrow*, who held Mrs. Martin’s class at Rancho Elementary School in Spring Valley, California, hostage for two years. He was boisterous and energetic, his mind not circumscribed by such effluvia as coursework or the fact that someone was talking. His nigh-pathological refusal to sit down, shut up, or stay on topic had the ability to derail the entire classroom. As often happens, Mrs. Martin ran her class to mollify Bobby.

Nowadays, of course, we’d diagnose Bobby with ADHD and prescribe pharmaceutical stimulants. But I knew Bobby eight years before I ever encountered the term Attention Deficit Disorder. In the early 1980s, he was simply high-spirited and disruptive, something his teacher needed to handle, and his classmates needed to live with. He was also intensely creative and a natural problem-solver, so Mrs. Martin needed to teach him channeling techniques, not just silence him.

I remembered Bobby this week, upon reading this spectacularly slovenly piece of GenX nostalgia:

[Screenshot of the tweet]

Judging by the author’s profile, he’s probably about my age. Like me, he’s White. Nothing clearly indicates his personal background or childhood, but I’d venture that, like me, he comes from a conventionally suburban upbringing, where parents left the neighborhood daily for work, and school and perhaps the local strip mall were the only factors holding the community together. Neighborhoods like mine, and probably his, weren’t communities; they were mailing addresses.

It wasn’t one neighborhood, either. My family moved around, pursuing my father’s Coast Guard career, so I experienced schools and neighborhoods in California, New York, Louisiana, and Hawaii. My parents always chose suburban neighborhoods, which they deemed “safe”; they’d never have admitted it aloud, probably not even to themselves, but that meant “White.” Therefore my upbringing was reasonably whitewashed, free of contact with unconventional demographic groups.

Because of my suburban raising, I didn’t know gay people existed until high school. Of course, they definitely existed; groups like Lambda Legal, the Gay Liberation Front, and PFLAG predate my sheltered Caucasian childhood. Similarly, I knew racism existed, but because I lived in well-scrubbed suburban neighborhoods, I never saw it; therefore, I never thought about it, and grew up believing it belonged to another historical epoch. I never had to know, so I didn’t.

Yet I definitely knew other things. Though I never encountered the term ADHD until 1991, Bobby Harrow definitely proved it existed; we just didn’t force-feed students like him Ritalin or Adderall to make them compliant. Similarly, I know, because Mrs. Martin told us, that she maintained a classroom first-aid kit that included epinephrine, because at least two classmates had food allergies. And autism definitely existed; the school just quietly dumped those students into Special Ed.

Therefore I find myself both willing and unwilling to forgive the tweeter above. For my first eighteen years, I didn’t have to know “different” people existed; I thought my suburban upbringing was normal. When I discovered it wasn’t, that some people had very different childhood experiences, I chose to learn and grow. Other people, mostly White adults, keep discovering different people have different experiences, and simply shout “nuh-uh!” It’s a thoughtless reaction, but I understand.

But I also can’t forgive people who don’t remember things that definitely existed. The simple claim that other people didn’t have neurological conditions, allergies, or even obesity, thirty years ago, only makes sense if this person and others like him willfully edit their memories. And to suggest that grade-school students sat quietly back then? Preposterous! Then as now, kids hated being confined indoors every day, and got stroppy and rebellious, needing repeated discipline.

When people younger than me complain about how lawless, crazy, and self-indulgent children are “these days,” I know they’ve made a choice. I was alive then; I know these problems existed, and often caused great disruptions. Yet somehow, I repeatedly encounter people who believe that today (whenever today is) has become monumentally worse than their own beloved childhood. And I wonder what kind of velvet-coated angel hatchery they claim they emerged from, free of friction.

Why do ADHD and autism diagnoses seem so common nowadays? Because we bother to look for them, Bradley! Why did nobody have nut allergies or Celiac disease back in your school daze? Probably because back then, people with food allergies just died, Susanne! Things aren’t getting worse; we’re just aware that bad conditions exist. If you can’t keep up, that’s a “you problem,” and you need to stop cluttering the discourse with your fake nostalgia.

*not his real name

Wednesday, March 15, 2023

Law & Justice in the Other New York

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 49
Joe Berlinger and Bruce Sinofsky, Brother’s Keeper

Sometime in the small hours of June 7th, 1990, poor dirt farmer William Ward died in his bed. He was 64 years old and had been in failing health for some time. Police initially accepted this as just something that happens. But a hotshot medical examiner soon found slight irregularities in William’s remains and proclaimed foul play. Police quickly arrested William’s youngest surviving brother, Delbert, charging him with “mercy killing.”

Documentarians Joe Berlinger and Bruce Sinofsky built their career around following people that the media world gawks at without bothering to understand. They created a trilogy of documentaries surrounding the West Memphis Three, the poster children for the Satanic Panic that stretched from the 1980s into the early 1990s. They also worked extensively with the rock band Metallica. After Sinofsky’s passing, Berlinger directed Netflix’s highly controversial Jeffrey Dahmer documentary series, continuing his love of spectacle.

Berlinger and Sinofsky were attracted to William Ward’s murder, and Delbert’s trial, not because of events themselves, but because of the media circus surrounding them. The four Ward brothers farmed their ancestral patch outside Munnsville, a central New York village that (to judge by this film) has few residents younger than forty. The Wards themselves had lived their entire lives on the farm, with electricity but no running water.

Our filmmakers struggle to let the Ward brothers tell their own stories. Problem is, the Ward brothers aren’t very helpful. While Berlinger and Sinofsky’s interview subjects mostly interact well with the camera and explain themselves in measured tones, Delbert Ward and his brothers, Lyman and Roscoe, are visibly uncomfortable. The documentarians have to leave their questions in the edit, because the Ward brothers consistently give uncomfortable one-word answers.

Much media speculation around the Ward murder, as recounted in this documentary, centers on the Wards’ simple lifestyles. Slick-suited downstate journalists loved to interview the brothers, and their neighbors, keeping them centered on camera so the world could hear their regional accents and see their paucity of teeth. None of the brothers ever married; though they seem amiable with Munnsville women, there’s little evidence any of them ever had a relationship.

By contrast, Berlinger and Sinofsky aim their cameras at the journalists and their polished crews. While urbane news crews in fashionable late-eighties businesswear get multiple takes to perfect their location shoots, they let Delbert Ward ramble incoherently, and broadcast the first take. Berlinger and Sinofsky show the contrast between supercilious journalists, and the way Munnsville’s people close ranks to protect Delbert Ward, whom they consider a neighbor.

Delbert Ward (right) and his attorney, as they hear the verdict

Unfortunately, Munnsville’s attitude toward the Wards proves as patronizing as the city slickers’. Several Munnsville residents give on-camera interviews, but fumble through clichéd, condescending narratives of “neighbors” they clearly don’t know well. Some spin fictional justifications of why Delbert couldn’t possibly be guilty, or why he is, but it’s secretly okay. Many describe the Ward brothers as simple-minded, rustic, and possibly mentally disabled.

That last characterization proves prescient when Delbert’s defense attorney deploys it in his opening argument. Ralph Cognetti literally claims Delbert couldn’t have murdered William because he’s too simple-minded—the same argument the nameless defense attorney uses in Ernest J. Gaines’s novel A Lesson Before Dying. Delbert’s attorney, his neighbors, and distant supporters defend him using the same accusation the state uses: these are just hill people, with all the stereotypes.

New York state police based their entire argument on two facts: William’s body had petechial hemorrhaging, and Delbert signed a confession. (They also claim they found semen on William’s corpse, a lurid detail presumably used to bait the media, since it’s never pursued further.) The problem with Delbert’s signed confession is, by his own admission, Delbert can barely read. His entire understanding of justice comes from watching Matlock.

Berlinger and Sinofsky follow the Ward brothers and their Munnsville neighbors through the months preceding the trial, and the trial itself. Their depiction of a murder trial is chilling. Stripped of Dick Wolf’s beloved melodrama, the process appears degrading and spiteful. Lyman Ward handles cross-examination so poorly, I briefly thought he’d died on the stand. It’s enough to make one wonder whether trials are about justice at all.

It spoils nothing to admit: Delbert is acquitted, but not exonerated. This movie isn’t about the outcome anyway. It’s about the conflict between outsiders and the community, the way downstate police and prosecutors (and their media allies) hunted for a murderer before proving a murder actually happened, while the working-class community closed ranks to defend their own. The product is chilling, an indictment of the justice system itself.

Monday, March 13, 2023

The Truth About America’s Rural Poor

Sarah Smarsh, Heartland: A Memoir of Working Hard and Being Broke in the Richest Country on Earth

Sarah Smarsh has no patience with bromides about how honest and simple America’s poor rural core might be. Born into a fifth-generation farming family outside Wichita, Kansas, she faced the rootlessness and despair which poverty generates among America’s working classes. Far from Norman Rockwell simplicity, she endured a childhood buffeted by falling crop prices, the inability to own anything, and transience. She attended over a dozen schools before twelfth grade.

The product, Smarsh writes, isn’t the picturesque rural ideal beloved by political campaigns and Frank Capra movies. She describes a childhood dominated by banalities, driven by despair. She watched her very young parents build their own house, then lose everything to natural disasters and exploitation. Her construction worker father lost his livelihood when rising interest rates wrecked his industry. Meanwhile, her mother, a farmer, barely survived the 1980s Farm Crisis.

Smarsh structures her book as an open letter to the daughter she never had. She came from several generations of women who got pregnant as teenagers, and who went through several spouses because society deemed marriage necessary, even when the marriage was palpably harmful. Smarsh grew up conscious that such could happen to her, too, and she planned for the eventuality of winding up with a mouth she was too poor and young to feed.

She describes a childhood surrounded by powerful women who wouldn’t break under life’s pressures. Women like Grandma Betty, who survived several violent husbands and her own mother’s schizophrenia diagnosis to become Wichita’s most competent and respected parole officer. Or her mother, Jeannie, who persevered under intense pressure and lost everything anyway, yet somehow increasingly believed Reagan-era bootstrap mythology.

Smarsh’s narrative moves from the personal to the global, and back again. Her storytelling heart stays closest to her Kansas upbringing, and her family’s voluble storytelling traditions. She reconstructs not only her own life, but the lives of the women around her, lives characterized by big dreams and small, persistent disappointments. The women around Smarsh scarcely move forward before somebody, usually a man, often a husband, kicks down the ladder.

Sarah Smarsh

Poverty, Smarsh asserts, doesn’t just inexplicably happen; it comes from powerful people making deliberate choices. After the American government subsidized rapid expansion through the Homestead Act and land-grant colleges (freely distributing land the federal government had stolen honestly), it abandoned the rural population by shifting to urban industrialization, then to suburban single-family home ownership. At each stage, the government abandoned populations it once actively supported, letting farmers and urban cores rot.

Throughout, Smarsh finds ways to retain her dreams. Schools deem her academically “gifted,” though that means little when she transfers frequently while her parents hunt for employment. She’s particularly talented at public speaking. Most important, she alone among the women around her doesn’t have an abusive father or husband. Her father, Nick, and step-grandfather, Arnie, give her stability that her grandmothers, cousins, and other kinfolk lack.

She also has August, the name she selected for her future daughter. Smarsh describes her increasingly direct relationship with the daughter she never had, asking herself when faced with difficult choices or powerful temptations, what would I recommend to August? This gives her a long-term perspective which her relatives, beset by hunger and abuse, couldn’t afford. Smarsh occasionally addresses August directly, in the second person, grateful for the steadiness this imagined daughter provides.

I recognize some of myself in Smarsh, but less than I’d expected. We’re close in age, and for much of her Kansas childhood, I lived nearby, in Nebraska. But some aspects don’t jibe altogether. She describes adults encouraging her to leave the homestead and pursue a life of urbanized upward mobility. As the Farm Crisis decimates American small towns, Smarsh writes, nobody dreamed of bequeathing the farm to their children.

Really? Hitting early adulthood in Nebraska, I remember being explicitly told that desiring to leave the state or aspire to employment beyond hourly wage-earning was dishonest. State legislators perform massive economic contortions to keep homesteads in their respective families. Maybe it’s because I was geographically stationary in ways Smarsh wasn’t, but my experience with place and rootedness appears to have been very different from hers.

Despite this, Smarsh tells her story well. Rural simplicity, often extolled by political candidates and big-city nostalgia merchants, Smarsh identifies as simply the choices necessary to get by. Hard work isn’t ennobling, and is often degrading, especially as billionaires cannibalize rural communities for parts. Smarsh asserts that poor, rural “virtues” aren’t heroic qualities. They’re the tricks and techniques poor people use to survive, often despite the world around them.

Friday, March 10, 2023

Another New Front in the Disability Wars

Elon Musk

Elon Musk’s bizarre handling of Twitter, formerly a reasonably respected outlet for minority voices, took a weird turn this week. Musk’s argument with an Icelandic product designer revealed not only internal business practices whereby contracted employees have no idea whether they’re still employed, but a top-level willingness to mock the disabled. It also further exposes the moral rot inherent in America’s—and by extension the world’s—economy.

When Elmo bought Twitter last autumn, I speculated aloud that he’d moderate his increasingly weird opinions. Not because he’d change his mind (his formerly center-left views have drifted into fascist-adjacent territory over the last three years), but because he’d learn quickly that advertisers wouldn’t pay him if their product appeared beside commercially unpalatable content. Outright bigotry hurts your bottom line anymore. Just ask “Papa” John Schnatter.

Rather than learning, however, Elmo’s most recent Twitter spat reveals how money insulates executives from consequences. Since the buyout, Twitter’s workforce has fallen by over two-thirds, including the entire staff responsible for finding and purging bigoted content. As a result, “hate speech,” defined as language targeting blanket groups for aggregated disparagement or threat, surged five-fold in just the first twelve hours. Anecdotally, the incidence of bigoted language remains worse than that.

I realize some readers may consider this a non-issue. If the bird site collapses, why not let it? Nobody particularly mourned when MySpace collapsed under similar mismanagement. But I’d contend it seriously matters, because Twitter provides an important venue nobody else is offering. As journalist Sarah Kendzior writes, Twitter provides a platform where minorities, LGBT+, and the disabled have the same chance as anyone else to be heard.

Product designer Halli Thorleifsson represents probably the least-heard among those categories. While antiracist voices and pro-LGBT+ voices remain active, and have entire months dedicated to resisting their diminishment, the disabled have remarkably few allies. Even those who, like me, have personal stakes in disability rights advocacy sometimes forget the groups with whom we’re nominally allied, because their voices get lost in the brouhaha.

Halli Thorleifsson

Though not disabled myself, life circumstances and personal relationships have taught me recently how widespread ableism is in America’s economy. Our society openly disparages anybody who cannot shape their lives around a forty-hour work week. The fact that Elmo initially answered Halli Thorleifsson’s inquiries with a laugh emoji, and refused to believe that anybody with diminished physical capacity could work for him, is only a highly visible example.

According to government statistics, disabled Americans are twice as likely to be unemployed as the general public. I can’t find statistics on underemployed disabled Americans, that is, Americans working below their training or mental capacity simply because they can’t perform heavy labor. As I learned during my teaching days, people who struggle with diminished physical abilities, but aren’t paraplegic or otherwise stereotypically disabled, often aren’t counted as disabled at all.

That’s why it matters that Thorleifsson got a fair hearing through Twitter. The platform gave him an opportunity to draw public attention to his maltreatment—something few other disabled workers often receive. Halli Thorleifsson proves the lie behind the argument that “nobody wants to work anymore.” He’s living proof that, given the opportunity, most people prefer productivity and usefulness. But a dollar-value economy won’t grant them that.

The implications go beyond one social media platform. Historically, small-f fascists targeted disabled people before racial or ethnic minorities. Not just in Germany or Italy either, but here in America: the SCOTUS decision in Buck v. Bell, which permits the involuntary sterilization of the disabled, has never been formally overturned. Even the Former President (mostly) knew to use coded language in disparaging minorities, but evidently considered the disabled fair game.

Elmo’s wariness of public mockery isn’t entirely unjustified. He probably remembers, more acutely than we civilians do, the public furor surrounding the Ligma/Johnson hoax and other similar fake callouts last October. I can’t entirely blame him for meeting an open-sourced accusation with initial distrust. But the fact that nobody was present to remind him not to answer a developing disability-rights argument with a laugh emoji is disheartening.

If Elmo succeeds in his ongoing campaign to constrict the bird site and silence dissenting voices, disabled workers like Thorleifsson will have even fewer channels for distributing their word. And, let’s be honest, that constriction seems likely. Elmo is exceedingly thin-skinned when criticized, and is unlikely to permit such public criticisms to happen again. It seems a question of when, not if, he’ll close another door for disabled workers’ rights.

Wednesday, March 8, 2023

Free Will and Determinism in Two Recent Horror Novels

This essay addresses the novels Brother by Ania Ahlborn (reviewed here) and The Last House on Needless Street by Catriona Ward (reviewed here). Be warned, this essay contains spoilers.
Ania Ahlborn, author of Brother

“For you, nothin’ matters,” Rebel Morrow bellows at his brother Michael, as we approach the culmination of Ania Ahlborn’s sixth novel, Brother. “You gotta have free will or some guts for shit to matter, and you don’t got neither.”

This is the closest Ahlborn comes to an out-and-out thesis statement for her novel. Michael tacitly acknowledges that, though he possesses free will, he’s never had gumption enough to use it. He’s always kept his head down to avoid Rebel’s unpredictable violent outbursts. This tactic isn’t unreasonable: around the book’s midpoint, Rebel drags Michael into a trackless forest and makes Michael dig his own grave, before suddenly, causelessly relenting.

Ahlborn’s novel shares thematic overlaps with Catriona Ward’s The Last House on Needless Street. Both novels address themes of inherited culpability, and the ways adults process the traumas of their childhood families. Both novels feature a character identified as “Momma” or “Mommy” who is, at once, terribly violent and remarkably absent from most of the narrative. Both show you can’t muscle through such traumas alone.

Michael Morrow survives tremendous abuse at his family’s hands. They not only whip him mercilessly for any transgression, real or imagined, they also force him to watch similar punishments doled out to his sister, Misty Dawn. The latter is arguably more traumatic. After all, brave persons can stoically endure abuses poured onto their own bodies, but watching abuse poured onto others leaves people feeling helpless and despairing. Ask any child abuse survivor.

Constant abuse has left Michael conditioned to appease his abusers to survive. Again, Ahlborn addresses this, but doesn’t unpack it. For our purposes it matters that, as the novel commences, Michael is now nineteen years old: old enough, that is, to be legally culpable for his actions, and his inactions. This requires us to wonder what magical calendar date should’ve granted Michael maturity enough to resist his mistreatment.

Similarly, Ward’s protagonist, Ted Bannerman, survived something at his mother’s hands, though the novel mostly tiptoes around what. Unlike Ahlborn, who depicts Michael’s ongoing trauma in direct, brutal terms, Ward files Ted’s abuses in the sheltered past, where Ted can carefully avoid addressing them. Ted compartmentalizes his entire life to protect himself from understanding what Mommy did, and continues doing so, although Mommy’s been absent for years.

Catriona Ward, author of
The Last House on Needless Street

Ted creates an elaborate alternative narrative whereby he retreats into a world uncluttered by violence, pressure, or work. This alternative becomes so elaborate that it adopts its own personality, and ultimately displaces Ted altogether, at least periodically. We learn in the final chapters that Ted has invented a repertoire of alternate personalities, each equipped to handle different aspects of trauma. Having created them, though, he can no longer control them.

These two strategies, Michael’s constant mindfulness of every possible transgression and Ted’s retreat into an alternative reality, represent opposite responses to family abuse. Michael must remain permanently conscious of the traumas he endures, watchful to ensure he doesn’t do something that invites punishment. In the novel’s climax, however, we discover that Michael’s consciousness is finite, and Rebel has been waving Michael’s (and our) attention away from what really matters.

Therefore, Ted has arguably accepted something Michael cannot: that there’s nothing either can do to avoid this violence. Trauma is inevitable; our only reasonable response is how we handle it. Okay, so Ted’s approach isn’t necessarily productive, and possibly results in others getting hurt or killed. It certainly results in Ted living a diminished life. But it also results in Ted surviving, which is the important part for the child receiving the abuse.

The debate between free will and determinism, once a theological argument about humankind’s relationship with God, has become a secular psychological issue. Prominent public atheists like Sam Harris and John Gray contend that humans are driven entirely by deterministic systems which we can understand as essentially mechanical. They contrast this with a poorly defined “free will,” understood in terms of religious beliefs in the human soul.

Ania Ahlborn and Catriona Ward, neither of whom show any particular religious inclinations in their novels, temper these two extremes. In different ways, their novels indicate humans have sufficient liberty to make choices, but cannot un-make them. Equally important, Michael and Ted make their choices as children, too young to be informed or responsible, then live with them as adults, when the choices are maladaptive but ingrained.

Free will, these novelists imply, definitely exists. But we give it up, usually because we have to. Then we live with the mechanical consequences of that surrender.

Monday, March 6, 2023

Darkness’s Little Brother

Ania Ahlborn, Brother

Michael Morrow tries to keep his morals intact, despite the violence around him. It’s been no easy task: he grew up in a family of serial killers. But a chance encounter introduces him to Alice, a beautiful, free-spirited artist who reawakens the spark in Michael’s soul. For the first time in his short adult life, Alice gives Michael permission to dream. But are his dreams ultimately doomed, given the number of deaths in which Michael has been complicit?

Ania Ahlborn pinches a storyline familiar from slasher movies like The Texas Chainsaw Massacre or The Hills Have Eyes: the rusticated hillbilly clan that torments pretty people who wander into their territory. But Ahlborn subverts the genre by telling the story from the killers’ perspective. She unpacks the morals driving some of literature’s most seemingly amoral characters. And she suggests that the monsters may not torment outsiders half as much as they torture one another.

Michael’s brother's real name is Ray, but he wants everyone to call him Rebel. Only Michael is cowed enough to do so. Rebel, along with Momma and daddy Wade, does the actual killing, but they make Michael clean the messes left behind. Even worse, they make Michael’s sister Misty Dawn watch. Through the slow torture of complicity, the Morrows leave Michael with one driving motivation: don’t let the family sickness rub off on Misty Dawn.

Rebel finds constant ways to torture Michael. For one, he ensures Michael never learns to drive or holds an adult job, keeping him dependent on Rebel to experience the outside world. He forces Michael to shoplift the liquor Rebel uses to quiet his inner torment. Michael has adapted to Rebel’s constant petty torments to survive. Considering that the alternative is collapse, keeping his head down works pretty well.

In flashbacks, we catch glimpses of the Morrows’ life before Michael was old enough to remember. They reveal Ray’s sensitive, emotional childhood, and his devotion to another sister, Lauralynn. What happened, we’re left to wonder, that Lauralynn isn’t in the present? These flashbacks demonstrate the moral complexity beneath the family’s violence. Rebel is sadistic, but deeply lonely. Michael is sweet-hearted, but complicit.

Ania Ahlborn

Alice, the pretty bohemian record-store clerk, gives Michael his first glimpse of a world not circumscribed by murder. She’s beautiful and artistic, but bored by small-town life. Music liberates her soul, and when she loans Michael a treasured 45, he experiences the same liberty. But it comes with a price. Leaving the Morrow farm means leaving Misty Dawn, another musical spirit, at the mercy of a family that murders pure-hearted young women.

Ahlborn plays with genre convention throughout this book. Her inspiration is clearly slasher movies, so reviewing this book in movie terms is justified. Ebert’s Law of Economy of Characters tells us that every character introduced by name and/or dialog serves a narrative purpose. Our role as audience is only to ascertain what that purpose is. Briefly, every character here serves more purpose than I, an admittedly jaded reader and cinephile, anticipated.

The characters make occasional reference to God and religion; Rebel in particular compares himself to a vengeful deity. But this book is completely secular. Notwithstanding that, Ahlborn teases out themes of free will and determinism that even the characters themselves glimpse fleetingly. At what point, Ahlborn asks, does Michael stop being a victim of the Morrows’ evil, and become part of it?

This novel is packed with nuance and shadows. No moment of character or dialog is wasted. If Ahlborn's characters glimpse a daisy in passing, we’ll ultimately see someone important pushing it up. The cascade of moments accelerates until we reach a climax so fraught that, while reading it, my hands literally shook. I don’t rattle easily at mere prose, folks. The fact that this book exhausted my emotions testifies to its dense, well-constructed impact.

It’s hard not to feel for Michael. Presumably, few of us came from hillbilly murderer clans, but we all have ways we feel our families are weird. We’ve all experienced that moment we believed the right person could deliver us from that weirdness… and that moment when we absolutely believed we squandered our chance.

That’s what makes the tortures Michael faces so gut-wrenching. Because he is us, with the choices we all whiffed, and the trail of loved ones we’ve left hurting behind us. Michael’s pain is our pain. And, like Michael, we’re left with the consequences of the choices we could’ve made, but we never knew we had the choice until just too late.

Thursday, March 2, 2023

Economics, Moral Rot, and You

Until February 3rd, East Palestine, Ohio, was notable for its tragic ordinariness

Freshman U.S. Senator J.D. Vance (R-OH) has essentially backed himself into a corner. Having successfully run for office as a novice candidate by supporting Trumpism and its appeals to supposed White grievances, he has tied himself to an agenda of stark deregulation. Yet the train wreck in East Palestine, Ohio, was made possible by the Trump Administration rolling back two key regulations: freight train safety brakes, and prohibitions against shipping volatile chemicals through inhabited areas.

This disaster’s highly photogenic nature, with its biblical pillar of cloud looming over a chronically neglected town, exposes inherent contradictions in Ohio politics. Though considered a swing state, Ohio has consistently leaned Republican for twenty years, and repeatedly supported candidates, like Trump and Vance, who promise to slash regulations. Although Ohio’s economy was notoriously hollowed out by the collapse of American manufacturing in the 1970s and 1980s, the state trusts its rich to act right without legal compulsion.

Republicans overall, both elected officials like Vance and the voters who support them, must thread the needle between libertarian economics and an obvious need for government intervention. However, this problem isn’t strictly partisan. The East Palestine catastrophe has made visible a pattern unfolding across America, supported by both parties, reflecting the moral vacuity beneath America’s current economy. While one camera-friendly disaster dominates our media cycle, the same pattern unfolds throughout America on a smaller scale.

Philadelphia journalist Will Bunch describes a plastics factory in Monaca, Pennsylvania, that has received two high-level EPA violations since it opened last November. Dawson County, Nebraska, is currently battling its second railroad disaster in February alone. These aren’t incidental disasters or routine entropy. For a factory to be cited twice in its first three months of operation suggests fundamental design flaws. Dawson County’s railroad is one of America’s busiest; it should be sufficiently maintained.

Train accidents and derailments, the yardstick for current political outrage, occur approximately three times per day. Simultaneously, ownership of America’s substantially deregulated railroads has reached unprecedented levels of concentration: only seven companies currently control the Class I rail network, and most stations have only one company serving them. That’s a greater concentration than ever enjoyed (if that’s the word) by the 19th-century railroad barons whose names were splashed luridly across my high school history textbooks.

Public art currently on display in East Palestine, Ohio

Classic libertarian economics, espoused by Vance and Trump, contends that if we rescind regulations, everyone will generally do right. I reply: ordinary people like you and me might. The rich, immunized from consequences by their ability to buy lawmakers and regulators, won’t. Again, despite my name-dropping Republicans, this problem is substantially bipartisan; President Obama’s economic policies bolstered the banks that imploded the economy in 2007, while President Clinton signed the bill repealing key provisions of the Glass-Steagall Act.

This bipartisan deference to the ownership class has simmered for years, but now it’s exploding. The East Palestine catastrophe follows close upon the Texas deep freeze, the worst wildfires in California history, and COVID-19 itself. Taken together, these repeated disasters demonstrate, in blaring neon letters written across the sky, that America’s capitalist class cannot be trusted to handle their resources responsibly. Both parties tiptoe around the seemingly inevitable conclusion that capitalist economics is morally bereft.

Don’t mistake me. Economics isn’t a binary; when I disparage capitalism, I don’t reflexively advocate communism. The retreating Soviet bloc left widespread poverty and environmental devastation in the 1990s, showing that old-line Communists didn’t care who they hurt in pursuing their economic goals, and the notoriously filthy Beijing air reveals things haven’t improved since. Capitalism and communism both reduce humans to economic agents, and slap price tags on the devastation they equally leave behind.

The two economic theories which have dominated public discourse during my lifetime both produce filth, poverty, and abjection. And they do so because, for all their differences, they fundamentally agree that anything which cannot be priced, cannot be valued. Much like how America’s two dominant political parties frequently agree on more than they differ, humanity’s dominant economic theories share a moral destitution in which money is the only meaningful measure of human life and productivity.

Our world needs an economic theory with a moral foundation. Market socialism or Keynesian capitalism might offset the disasters we’re currently seeing, eventually, but they won’t address the moral rot that makes disasters like the East Palestine cloud possible. Economic models like Distributism center humans as moral agents, not economic agents. Without such a moral core, the problems currently unfolding around us will only continue accelerating, and the poor will suffer the most, and first.