
Saturday, February 14, 2026

Lee Brice in Country Music’s Nostalgia Pits

Lee Brice (promo photo)

Lee Brice debuted his new song, “Country Nowadays,” at the TPUSA Super Bowl halftime show on February 8, and it was… disappointing. Brice visibly struggled to fingerpick and sing at the same time, and gargled into the microphone with a diminished rage that, presumably, was meant to evoke J.J. Cale. The result sounded like an apprentice singer-songwriter struggling through an open-mike night in a less reputable Nashville nightclub.

More attention, though, has stuck to Brice’s lyrics. The entire show ran over half an hour, but pundits have replayed the same fifteen seconds of Brice moaning the opening lines:

I just want to cut my grass, feed my dogs, wear my boots
Not turn the TV on, sit and watch the evening news
Be told if I tell my own daughter that little boys ain’t little girls
I’d be up the creek in hot water in this cancel-your-ass world.

Jon Stewart, that paragon of nonpartisan fairness, crowed that nobody’s stopping Brice from cutting his grass, feeding his dogs, or wearing his boots. Like that’s a winning stroke. Focusing on Brice’s banal laundry list misses the point: Brice actively aspires to be middle-class and nondescript. But he believes that knowing and caring about other people’s issues makes him oppressed in a diverse, complex world.

One recalls the ubiquitous 2012 cartoon which circulated on social media with its attribution and copyright information cropped off. A man with a military haircut and Marine Corps sleeve stripes repeatedly orders “Just coffee, black.” A spike-haired barista with a nose ring tries to upsell him several specialty coffees he doesn’t want. Of course, nobody’s ever really had this interaction, but many people think they have.

Both Lee Brice and the coffee cartoonist aspire to live in a consistent, low-friction world. If your understanding of the recent-ish past comes from mass media, you might imagine the world lacked conflict, besides the acceptable conflict of the Cold War. John Wayne movies, Leave It to Beaver, and mid-century paperback novels presented a morally concise and economically stable world, in which White protagonists could restore balance by swinging a fist.

The coffee cartoon, with its unreadable signature

By contrast, Brice and the coffee cartoonist face the same existential terror: the world doesn’t center me anymore. Yes, I said “existential terror.” What Brice sings with maudlin angst, and the cartoon plays for yuks, is a fear-based response, struggling to understand one’s place in the world. We all face that terror when becoming adults, of course. But once upon a time, we Honkies had social roles written for us.

I’ve said this before, but it bears repeating: “bein’ country,” as Brice sang, today means being assiduously anti-modern. Country music’s founders, particularly the Carter Family and Jimmie Rodgers, were deeply engaged with current events in the Great Depression. This especially includes A.P. Carter, who couldn’t have written his greatest music without Esley Riddle, a disabled Black guitarist. Country’s origins were manifestly progressive.

But around 1964, when the Beatles overtook the pop charts, several former rockers with Southern roots found themselves artistically homeless. Johnny Cash, Conway Twitty, Jerry Lee Lewis, and others managed to reinvent themselves as country musicians by simply emphasizing their native twang. But their music shifted themes distinctly. Their lyrics looked backward to beatified sharecropper pasts, peacefully sanded of economic inequality and political friction.

In 2004, Tim McGraw released “Back When,” a similar (though less partisan) love song to the beatified past. McGraw longs for a time “back when a Coke was a Coke, and a screw was a screw.” I don’t know whether McGraw deliberately channeled Merle Haggard’s 1982 song “Are the Good Times Really Over,” in which he sang “I wish Coke was still cola, and a joint was a bad place to go.”

Haggard notably did something Brice and McGraw don’t: he slapped a date on the “good times.” He sang: “Back before Elvis, or the Beatles.” That is, before 1954, when Haggard turned 17 and saw Lefty Frizzell in concert. Yet Haggard, like McGraw or Brice, isn’t really yearning for a specific era. He misses a stage of personal development when he didn’t have to make active choices or take responsibility for his actions.

Country musicians, especially men, love to cosplay adulthood, wearing tattered work shirts and pitching their singing voices down. Yet we see this theme developed across decades: virtue exists in the past, when life lacked diversity or conflict, and half-formed youths could nestle in social roles like a hammock. Lee Brice’s political statement, like generations before him, is to refuse to face grown-up reality.

Thursday, April 18, 2024

Hold Onto Sixteen As Long As You Can

John Mellencamp

Classic rock radio stalwart John Mellencamp got an unwanted attention boost this week when a month-old video of him abandoning the stage went viral. Apparently Mellencamp paused to speak directly to his audience, something musicians frequently do, but an audience member heckled him to shut up and resume playing. An outraged Mellencamp quit playing partway through “Jack and Diane,” leaving an arena audience in the lurch.

Several sources, including Fox News, spun this event as Mellencamp feuding with the audience over politics. Like those of most “heartland rockers,” including Bruce Springsteen, Bob Seger, and John Fogerty, Mellencamp’s politics skew left. This should surprise nobody who’s listened to Mellencamp’s lyrics—but apparently, many audience stalwarts haven’t done so. Listeners are often gobsmacked to discover that their favorite heartland rockers are progressives, not merely celebrants of rural life.

This spotlights a growing rift between artists like Mellencamp and much of their fan base. We saw something similar when former New Jersey governor Chris Christie mentioned his love of Springsteen, and Springsteen responded by duetting with comedian Jimmy Fallon to mock Christie’s performance as governor. These rockers maintain the leftist, anti-establishment passions of their youth, while their audiences have become more conservative and revanchist.

Pop history tells us that “heartland rock” emerged in the middle 1970s: Springsteen’s first hit, “Born to Run,” reached the Billboard Top 40 in 1975, while Tom Petty’s first hit, “Breakdown,” barely creased the Top 40 in 1977. However, this ignores that neither artist developed legitimate star power until the 1980s. It also disregards Bob Seger and John Fogerty’s original band, Creedence Clearwater Revival, both of which had their first hits in the 1960s.

Bruce Springsteen

From its origins, heartland rock bore a contradiction. Though its chief songwriters have pressed progressive politics and a disdain for capitalism into their lyrics, their musical stylings were persistently conservative. Fogerty deliberately channeled Delta blues and Memphis soul, while Petty’s sound grew, like Spanish moss, from the swampy slumgullion of influences in his inland northern Florida upbringing.

Thus, conservative audiences who don’t listen deeply have always thought their favorite heartland rockers spoke directly to them. The most famous example, of course, must be Ronald Reagan’s attempt to conscript Springsteen into his 1984 reelection campaign. But my personal favorite comes from TikTok. A whyte-boy in a backward ballcap and a pick-em-up truck shouts “Thank God my mom didn’t raise no f**king liberal!” before tearing off scream-singing with CCR’s “Fortunate Son.”

The complete failure to understand the left-leaning message in these lyrics might seem baffling, except that I once shared it. I’ve written about this before: listening to classic rock radio during my rebellious teenage years allowed me to consider myself forward-thinking because I engaged with the injustices of the Vietnam era. By pretending to care about injustice back then, I allowed myself to passively participate in injustices occurring right now.

There’s nothing innately conservative about consuming media shallowly, but in my experience, people who don’t parse for greater depth usually have conservative politics. Conservatives love surface-level readings. My lifelong Republican parents encouraged me to reject deeper textual analysis of literature, even when high-school English teachers marked me down for doing so. Listening to classic rock at the surface level often rewards conservative readings of its era.

Heartland rockers were classic rock before the “classic rock radio” category was invented.

John Fogerty

Surviving heartland rockers like Mellencamp, Springsteen, and Melissa Etheridge continue recording, but they haven’t had Top 40 hits since the middle 1990s. Fogerty, who’s always had a contentious relationship with the recording industry, hasn’t meaningfully charted a single since 1985. Though they all continue touring, they’ve become oldies circuit staples, their concerts consisting primarily of songs first heard forty, fifty, or more years ago.

Like the artists themselves, their audiences have continued aging. The greasers and slicks who got energized for Springsteen’s fight against small-town malaise in 1975 now have mortgages, student debt, and children. Such material investments in the status quo encourage, if not principled conservatism, at least a desire to ensure they didn’t invest themselves in hot air. The audiences have grown away from the artists they admire.

Perhaps the most telling fact is whom these artists now influence. Jake Owen’s “I Was Jack (You Were Diane)” and Eric Church’s “Springsteen” were massive country hits, channeling the artists they name-dropped. But both songs reduce their tribute subjects to mere nostalgia for whyte audiences. These artists, now in their seventies, have become the thing their teenage selves rebelled against. There’s no coming back from that.

Tuesday, October 31, 2023

The New Dark Age of the Fake Self

Buffy Sainte-Marie

News broke last week that multiple award-winning folksinger Buffy Sainte-Marie isn’t Native American, as she’s claimed for sixty years. Since breaking into the Greenwich Village folk scene alongside Bob Dylan and Pete Seeger, Sainte-Marie’s public persona has included a putative Canadian birth, into the Piapot band of the Cree nation, followed, she said, by adoption by a Massachusetts couple. New documents claim she was born White, to an Anglo-Italian family outside Boston.

Sainte-Marie’s public career has focused heavily on breaking boundaries for generations. She was supposedly the first Native American to win an Oscar, and has won Grammy and Juno awards for Indigenous music categories. If the current accusations stand, that means she received accolades intended for legitimately marginalized persons. It means she’s participated in expropriating other groups’ stories, reselling them at a profit, and claiming the honors accruing thereunto. It makes her, morally, a common thief.

This is especially vexing in the music industry, where we expect professionals to speak from experience. “Authenticity” is the watchword among pop singers, poets, and dramatists. Nobody expects this of, say, actors; when Gillian Anderson played Margaret Thatcher on The Crown, nobody bellyached because she wasn’t really a Prime Minister. But whenever Kid Rock rehashes his “plain folks” redneck persona, somebody hastily reminds us that he grew up rich in Michigan, just cosplaying working-class roots.

Recently, the internet’s population of full-time professional offense-takers has become particularly militant around policing racial boundaries. Rachel Dolezal set the pattern eight years ago, but she’s gotten lost amidst the tide. More recently, netizens have worked themselves into a tizzy about whether BLM firebrand Shaun King might be secretly White, or whether MAGA glamour queen Kari Lake might be concealing Black parentage. Whatever the truth, we clearly expect politicians, like pop stars, to be authentic.

Yours truly, right, as an angel (there’s a stretch) with Jeff Ensz, left, as George Bailey,
in the 2017 Kearney Community Theatre production of It's a Wonderful Life

The demand for authenticity sometimes produces strange effects. Actor Jussie Smollett’s case is almost comically extreme, but representative. Being both Black and gay, Smollett evidently believed he needed some categorical oppression in his backstory; but he comes from a middle-class background and a relatively low-friction life. So he paid two brothers to fabricate a hate crime. While most participants in the “oppression Olympics” don’t go nearly so far, lies and street theatre are depressingly common.

But simultaneously, I fear that “authenticity” creates siloes that restrict our ability to participate in culture creation. As a trained actor myself, I know I can’t manufacture my stage identity without limit. I can’t, for instance, use blackface. Robert Downey Jr. in Tropic Thunder will probably be the last mainstream actor in my lifetime to use blackface on camera. Even then, he played a satire, a character too lacking in self-awareness to realize the harm in his actions.

I have, however, played characters dissimilar to myself. Onstage, I’ve been Jewish, gay, a veteran, and dead—none of which I’ve ever been offstage. I’ve affected Southern, Bostonian, Yiddish, and British accents. Okay, I’ve never discolored my skin to affect another racial identity, but I’ve adopted other external signs of groups I don’t belong to. Exactly how far from my “authentic” identity can I stray before I stop being an actor, and become something harmful?

A friend suggested Buffy Sainte-Marie could take a 23andMe test to determine her genetic quotient. Laying aside that she’s already done so, and found nothing, this testifies to the prevalence of pseudoscience in racial debates. As Richard J. Perry notes, these tests promise results they can’t possibly deliver. They only identify genetic markers existing in statistically significant concentrations in certain geographic areas. Race isn’t genetic, it’s social; did anyone treat Sainte-Marie as Indigenous growing up?

Please don’t misunderstand. I recognize why it’s important for public figures to not misrepresent themselves when speaking for their (putative) heritage. To cite one egregious example, political operative and notorious bigot Asa Earl Carter fled Alabama, moved west, and rechristened himself Forrest Carter, a self-professed Cherokee. His fictional memoir The Education of Little Tree has actively preserved harmful stereotypes about Native Americans. Jamake Highwater and Hyemeyohsts Storm have similarly falsified their heritage and kept stereotypes alive.

However, ours is a time of shifting standards; formerly acceptable modes of imitation have become verboten, and the new boundaries haven’t solidified. While Buffy Sainte-Marie has her defenders, most reasonable people will agree she crossed a line, pretending to be something she isn’t. But most artists pretend, at least occasionally, and some forms of pretend are more acceptable than others. Some of us will certainly choose wrong; others will choose correctly, but be judged later.

Saturday, August 26, 2023

Some Thoughts on Oliver Anthony

Oliver Anthony playing live (source)

It’s been barely two weeks since singer-songwriter Oliver Anthony burst onto America’s national scene, and already we’re fighting over his legacy. Watching the political left (broadly defined) embrace him, then drop him when they listened to his lyrics, was both amusing and terrifying. He’s since been embraced by professional right-wing bomb-throwers like Jesse Watters and Marjorie Taylor Greene, who see Anthony, like countless country singers before him, as a conservative emblem.

Yesterday, August 25th, Anthony released a YouTube video specifically rejecting his song’s partisan embrace. “It’s aggravating seeing certain musicians and politicians act like we’re buddies,” Anthony says, from that perch that’s become the White YouTuber political pulpit: behind the steering wheel of a parked vehicle. Anthony disparages both the conservative embrace of his song, and the progressive backlash. He sees himself as essentially centrist, and his song as a moderate anthem.

I strive to avoid partisan allegiances, because they frequently result in people starting with their preferred answer and seeking out the question. Nevertheless, American polarization has made “caring about others” and “protecting the weak” political litmus tests, so apparently I’m broadly leftist. I’m certainly leftist enough to balk at Oliver Anthony’s characterization of “the obese milkin’ welfare,” a stereotype Ronald Reagan simply invented to attack America’s most defenseless.

However, I also appreciate Anthony’s attempt to disaffiliate himself from conservatism’s ugliest proponents. American conservatives have repeatedly attempted to appropriate protest songs like “Fortunate Son,” “Born in the USA,” and “We’re Not Gonna Take It” and turn them into right-wing battle cries. This lack of critical listening has forced numerous artists like Tom Petty and the Dropkick Murphys to aggressively distance themselves from certain politically-minded fans.

What, then, to make of Oliver Anthony? He excoriates conservatives in Verse One, then resurrects Reaganite stereotypes from 1976 in Verse Two. His song names and shames those he believes are harming America, rich and poor alike, but without suggesting a community response, or indeed any stabilizing moral core. There’s no response, in Anthony’s world, except standing on a dock, shouting his grievances in a “high lonesome” accent.

Journalist and pastor Justin Cox finds a possible solution on Anthony’s YouTube page. Anthony has a curated playlist entitled “Videos that make your noggin get bigger.” The list includes Joe Rogan, Andrew Huberman, Billy Graham, and an awful lot of Jordan B. Peterson. Rogan and Peterson both disavow being conservative themselves, but their mostly White, mostly male audiences are frequently dominated by outspoken conservatives, making both men right-wing icons.

Rather than strictly conservative, both Rogan and Peterson are primarily individualists. Both want their audiences to think deeply, feel widely, and have profound experiences, but they want their audiences to go through this as individuals. Peterson frequently disparages collective action, and pooh-poohs any application of his own principles to concerns of class, race, and other collective identity. They see humanity as atomized individuals, not groups with shared interests.

Similarly, historian Kevin M. Kruse identifies Billy Graham as a primary proponent of a politically libertarian, aggressively pro-capitalist Christianity that originated in the 1940s, and achieved political ascendancy by the 1970s. Of the thinkers to receive multiple citations in Anthony’s playlist, only Andrew Huberman isn’t tied to an individualistic ideology—and even that is only because he prefers an exceptionally credulous attitude toward science.

Briefly, Oliver Anthony represents an individualistic worldview. To him, all circumstances arise from bad choices, and though that doesn’t necessarily make others bad people, it also kind of does. Whether it’s politicians choosing to behave corruptly, or “obese” people choosing cheap, starchy foods on EBT, everything is a matter of individual action. No matter how many individuals make the same bad choice, it never adds up to a damaged system.

This manifestation of radical individualism corresponds with something I’ve avoided saying until now: race. Specifically, Whiteness. Black Americans have a history of communitarianism, and collective response to injustice, which White Americans lost somewhere around the time Ronald Reagan functionally legalized union busting. Radical individualism is an essentially White phenomenon, as many of us discovered in the BLM protests of 2020, for which White Americans arrived supremely underprepared.

I’d argue, therefore, that Anthony’s lament is neither progressive nor conservative. Rather, it’s the cri de coeur of atomized White loneliness, the awareness that, without communities or unions or church ties, we’re truly alone against a massively brutal world. Within my and Anthony’s lifetimes, White Americans have become so lonely that we can’t imagine not being isolated. So, despite my qualms, Oliver Anthony is the voice of his generation.

Thursday, June 8, 2023

Corporal Punishment, the Church, and Me

My defining moment in the Amazon documentary miniseries Shiny Happy People happens about midway through the second episode. An invited speaker at an Institute in Basic Life Principles (IBLP) seminar brings a child volunteer onstage to demonstrate the speaker’s precepts of Biblically appropriate spanking. The child was volunteered by his parents, not of his own volition, and never speaks or is even identified by name onstage.

The speaker (who shall remain nameless here) takes the child volunteer over his knee and pantomimes the spanking incident, backed with a monologue about how the misbehaving child simply needs discipline to grow with God. Because the speaker mimes the spanking so gently, the effect appears downright predatory. This appearance isn’t helped when, upon letting the child rise, the speaker demands a hug from the kid he just finished disciplining.

Back in the 1990s, I attended a United Methodist congregation in a small Nebraska town. For those unfamiliar with Protestant denominationalism, the Methodist tradition doesn’t have even a shirt-tail theological relationship with most American Evangelical or Fundamentalist churches. Most such churches are theologically Five-Point Calvinist, while Methodism descends from Arminianism, a deliberate rejection of Calvinist absolutism. Methodism shouldn’t be compatible with Evangelicalism.

Yet much of White American Christianity in the 1980s and 1990s trended toward Calvinist conservatism. Pushed by the ideological bloc of Ronald Reagan and Jerry Falwell, many White Christians yearned for the doctrinal certainty which Evangelicals seemingly enjoyed. Congregations which had no theological truck with Five-Point Calvinism snapped up books by Tim LaHaye, Charles Swindoll, and Francis Schaeffer. Their theology soon bled into regular worship and teaching.

As the pro-spanking speaker finishes his ganked, almost fetishistic mock spanking, he demands a hug from his volunteer. But he immediately rejects the hug he receives, declaring it insufficiently enthusiastic. He replaces the kid across his knee and resumes the spanking. This repeats a pattern, perhaps unknowingly, visible in Christian thinkers since at least Augustine: that if you’re sufficiently righteous, you can threaten children into loving God, and into loving you, too.

My small Nebraska congregation brought a local pastor aboard who, as part of his ministry, demanded the congregational council hire his son as youth and young adult minister. The son was highly charismatic, and quickly gained acclaim among his intended young parishioners. He introduced a rock concert-influenced evening worship service, and accordingly, local Christians treated him like a rock star. Eventually, he seemed to start believing it himself.

Michelle and Jim Bob Duggar became the celebrity face of Bill Gothard's IBLP

I wanted to believe it, too. In my early twenties, I was considerably more conservative and doctrinaire than I am now, both politically and theologically. This father-and-son team validated my primarily emotional spirituality. But before long, I realized they didn’t treat everyone equally. They wanted congregants who were extroverted, but submissive. Those who conformed received preferential treatment; everyone else watched from outside, confused and scared.

Don’t misunderstand, my desire to separate wasn’t a Daniel-like stand on morality. I was simply lonely. The ministry focused on highly demonstrative episodes, “mountaintop moments,” and gregariousness; it left no opportunity for thoughtful contemplation, much less deep discussion performed in our “indoor voices.” I attempted to peel myself off simply because I needed time to catch my breath, while their ministry was breathless, breakneck, and quick.

My only mistake came in trying to announce my separation. Instead of just quietly not showing up—as an increasing number of the congregation’s introverted members started doing—I attempted to make my polite apologies before going. The youth minister responded by angrily deploying a laundry list of “sacrifices” he’d made to support his “ministry.” The list rambled on, voluble and extensive, until I finally relented just to escape the situation.

I’ve seldom faced literal violence in my life. I realize how privileged I am to even say that, but I haven’t faced state repression, violent crime, or relationship abuse. Even given my frequently adversarial relationship with my father, he seldom spanked me; he reserved corporal punishment for extreme circumstances, and discontinued it early. Therefore, until I saw a self-righteous spanking enacted onscreen, I didn’t make the connection to what happened that day.

But on a key level, when leaders believe themselves appointed by God, they start demanding love. They demand obedience and adherence from those beneath them. Some enforce those demands through violence, while others enforce them through guilt and shame. But in both cases, they believe they have God-given authority to make demands. Listening, learning, and adapting are for lesser people. Leaders make demands, and the first demand is for love.

Monday, March 20, 2023

The Poor Intellectual in America’s Bread Basket

Sarah Smarsh

Late in journalist Sarah Smarsh’s autobiography, Heartland, she undertakes that most time-honored of adulthood rites: leaving for college. In Smarsh’s case, this passage carries special significance. As the first member of her family to attend college, she arrives without the prior knowledge of how to “do school” that many of her peers already possess. And coming from dirt poverty, she carries a burden of sheer survival that most college students never face.

Smarsh doesn’t dwell on this; it’s only one long-ish scene in her final chapter. She received a full-ride scholarship to cover her tuition, but needed three jobs to afford room and board. The system we have, Smarsh writes, tacitly assumes students’ parents will shoulder the financial burden of education. America’s higher education system bears the toolmarks of its makers, who were themselves well-off and expected the same of their students.

I remember, during my brief academic career, repeating a time-honored bromide to my students: “Nobody’s ever too tired to read.” I’d heard that from my teachers in public school, and internalized it, despite not having gone straight from high school to college. An inveterate reader from my childhood, I saw reading as innately pleasurable, a source of energy rather than a consumer of it. And I couldn’t comprehend why anyone, like my students, saw it otherwise.

Not until my adjunct position ended without fanfare did I realize how false that claim was. Turned loose onto the demands of a regional economy that had little need for my skills, and desperately in need of grocery money, I accepted a job beneath my capabilities, simply because it was there. (And simply because I loathe the job-hunting process.) And within a matter of weeks, I discovered, for the first time, that it was possible to be too tired to read.

Sometime later, I would learn the mechanics behind this. Though the brain remains deeply enigmatic to scientists, our best research minds have definitively uncovered some facts. One is that the brain draws energy from the same well as the body. What’s more, it draws energy completely disproportionate to its mass: your brain is less than two percent of your body’s mass, yet consumes more than 20% of your body’s energy.

And when the well is dry, the well is dry.

Not until leaving academia and entering the factory (and later construction) did I discover what weariness meant. Sure, I’d been tired in high school, as many people were, but not the soul-sucking weariness of pulling an eight-hour shift, then coming home to housekeeping, cooking, and generally taking care of myself. Left with the same two or three free hours everyone else has, for the first time, I found myself too tired to read.

Reading Smarsh’s description of working three jobs to subsidize taking classes, I felt that weariness again. It’s taken me ten years to regain sufficient energy to read after work, and even that is inconsistent; most days I can read some, but some days, I’m fortunate if I can stare mindlessly at my phone for a few hours. Some days, I’m lucky to wolf microwave food before lapsing into coma-like sleep.

Yet despite that, Smarsh not only had wherewithal enough to complete her degree, she had enough to complete graduate school and move on to a career. Reading her story, it’s easy to understand why: she had a personal vision, one she wanted to pursue without regard for economic limitations. She was fortunate to have that. Too many of my students from poor backgrounds had few aspirations beyond a vague desire for middle-class comfort.

Many of my students, who heard me state that bullshit about “nobody’s too tired to read,” had outside jobs. At least two told me they were working nearly full-time during the week, then driving back to their hometowns to pull shifts at their parents’ farms or machine shops. Conventional academic theorists would say these students were cruising for failure, that working so many hours outside class guaranteed defeat. Work or school: pick one, you can’t do both.

I suspect these students would reply: “Rent is due.”

Education remains, at least nominally, America’s guarantee for a middle-class lifestyle. My poor students chased a degree, not to improve themselves, but to improve their economic prospects. Couple that with crushing student debt, and a job market that doesn’t offer self-sustaining jobs anymore, and school can be as much a recipe for failure as success. I can only imagine how insulting it was to hear me say “nobody’s too tired to read.”

Saturday, February 25, 2023

The Vanishing Experience of Winter

The thaw happening as I write

There’s steam rolling off the parking lot as I write. Though the air temperature is only 14 degrees Fahrenheit, well below freezing, a mostly sunny sky has warmed the pavement sufficiently to cause last night’s thin, fluffy snow to melt faster than it can run off. Because the snowmelt is warmer than the surrounding air, it’s evaporating rapidly, creating the kind of steam sometimes photographed rolling off sultry Louisiana bayous.

The last few Nebraska winters have been largely mild. I’ve only needed to shovel my walk once this year and, this late in February, the chance of another big ass-kicking snowfall is pretty minimal. Though some American communities have gotten socked pretty hard, we haven’t witnessed the prolonged snows of my childhood, where air stayed so cold so long that accumulations hung around for months, only disappearing in spring.

Winters swing wildly on the Great Plains. It’s too early to calculate this year’s snowfall, but according to the University of Nebraska, last winter was the least snowy on record; the year before, the sixth snowiest, with mounds that took months to melt away. As someone who didn’t live here until the cusp of adulthood, I find winter’s unique fingerprint exciting. And I dread the possibility that winters might stop.

Plenty of Nebraskans hate heavy snows, and have appreciated the change to milder winters. I’m not one of them. I’ve never known most forms of transcendent euphoria which people report with physical exercise, such as the semi-mythical “runner’s high.” But my first winter in Nebraska, in 1992-93, I discovered I can totally access the “snow shoveler’s high,” and eagerly shoveled public sidewalks up and down my block, unwilling to stop.

Yet Earth continues warming. The last eight years have been the warmest on record. Though record-setting cold events like the Texas deep-freeze do happen, they’re brief, acute, and too fleeting to define the season. Even the Arctic air masses that drive the now-notorious Polar Vortex aren’t as cold anymore. Where I live, Christmas Day 2022 was warm enough to rain, once an unthinkable event in Nebraska in December.

Certainly, some regions have been hit differently than others. Large parts of Wyoming were apparently immobilized yesterday, while my part of Nebraska wasn’t even seriously inconvenienced. But even this, we were warned, is part of global warming’s trend. The Arctic air masses that formerly drove winter weather in middle latitudes like mine, simply can’t muster the oomph necessary to move south like they once did. And they probably never will.

Experts and paid climate pundits have worried about the effects this dwindling of winter will have on Earth’s macrobiome, and its ability to support humanity. I have another question, though: what happens to the culture we leave behind? The touchstones we formerly shared of distinct seasons, of the responsibilities and pleasures that each brings? What will step up to replace them when winter no longer exists?

Despite my age, I haven’t entirely abandoned the hope of marrying and having kids someday. Consider how many Western courtship rituals center around winter. Though we’ve lost traditions like “bundling bags” and winterizing the family log cabin, winter remains a prime opportunity to make memories. Consider how many Euro-American marriage rites center on Christmas—or its successor, the Hallmark Christmas Movie.

And my children: what experiences will we lose forever when the world becomes intolerably hot? Will I have the opportunity to teach them the simple sensory pleasure of the “snow shoveler’s high” that I didn’t discover until my own adulthood? What about sledding, hot cocoa, and lazy Sunday evenings around a hearth fire with people you love? Without winter, these memories will recede into the dark corners of mythology.

Sadly, I discovered partway through writing this essay that the steam off a thawing parking lot doesn’t photograph well. I can see the steam rising with my own eyes, but I can’t preserve it for you; I can only recount it using words and language. This picturesque steam becomes a story, a narrative that you experience through my depth of feeling, through the urgency of my words. It becomes, in short, a myth.

That’s what winter will become. And as happens with myth, the story will quickly overshadow the event. Just as Santa’s Laplander assistants became poorly defined North Pole elves, the experience of “snow shoveler’s high” or downhill sledding will become something distorted. When we no longer share these experiences, when we no longer shelter for winter together, the reality will become romanticized, warped, and ultimately lost.

Friday, February 17, 2023

The “After” Part of Revival

A recent photo of the Asbury University revival

As I write, an event being termed a “revival” continues in the Asbury University campus chapel in Wilmore, Kentucky. Since Wednesday, February 8th, hundreds of the faithful have continuously occupied the chapel building: singing, praying, preaching, and lifting hands unto God. The 24-7 religious outpouring is giving some Christians hope, in a time of seemingly unbridled selfish behavior and the continued numerical decline of American Christianity.

I’ve sat through two previous “revivals,” and therefore have definite opinions. Wrapping oneself in the moment of transcendent unity with fellow believers can definitely feel like communion with God. But, like Jesus tempted in the desert or Buddha planting himself resolutely beneath the Bodhi Tree, that moment only matters in light of what we bring back into the world. The historical track record of that “after” moment leaves me skeptical.

My first “revival” took place at a Christian rock concert in high school, the only Christian rock concert I attended. People were dancing on chairs, singing along, becoming one with the crowd, and then the lead vocalist finished with an altar call. Yes, I responded. But when the crowd dispersed, and we returned to our normal lives, the moment of exultation passed. Without the “worship high,” motivation to repent quickly dwindled.

Years later, a charming young associate pastor at the local United Methodist Church began holding Sunday evening services with a full band. Once again, the experience of crowds, music, and emotional exaltation created a perfect storm of transcendental giddiness. Unlike the rock concert, this service happened regularly, and also involved group Bible study, prayer circles, and other sustained community. This “revival” showed signs of lasting.

This pastor successfully packed a mainline Protestant sanctuary wall-to-wall every Sunday, something most conventional services only accomplish on Christmas and Easter. Donations rolled in, and money was channeled toward the common good, like scholarships, community improvements, and overseas disaster relief efforts. Weekly altar calls were warmly received; even my dad, Christian but ordinarily allergic to displays of overt religiosity, walked up to “receive Jesus.”

But as the program continued, something happened: numbers began falling off. Members of the worship band, which peaked around forty-five players, began begging off. The congregation, which briefly held over 750 worshipers—remarkable for a small-ish town—dwindled to a hundred, then seventy-five. While worship song instrumental breaks ran longer than the Allman Brothers at the Fillmore East, worship service electric bills started exceeding what the collection took in.

A recent photo of the Asbury University revival

Because the service happened weekly, the falling-off wasn’t abrupt, as it had been after the concert. Rather, things gradually tailed off. People experienced the transcendent worship high, but then returned glumly to regular lives of jobs, school, and cooking dinner. Without discipleship efforts to offer anyone a genuine new life, a genuine straight-and-narrow to walk, the worship high began feeling hollow. Interest waned, and soon, so did the service.

Don’t misunderstand me, what happened in those moments wasn’t hay. The dissolution of self that happens in concert environments distinctly resembles the ego death which Christian and Buddhist mystics describe at moments of salvation or enlightenment. However, concert transcendence depends on the crowd, and ends when everyone goes home. Likewise, when religious people leave the sanctum, if there’s no continuation of community, the emotional response dissipates.

Events like what we’re seeing happen at Asbury University make True Believers feel connected to God and one another. But eventually, everyone has to leave the sanctum and return to daily life. If revival offers nothing beyond that moment of emotional bliss, the pull of ordinary tedium quickly overwhelms grandiose feelings. Like cocaine, a worship high requires greater and greater quantities to overcome the flesh. Mere mortal pastors just can’t provide that.

However, churches can provide community. When “church” is a temporary respite from a world of exploitation, and we return to lives where others profit from our efforts, religion (or anyway religiosity) seems frivolous. Christians need forms of continuing discipleship, opportunities to participate in something larger than themselves. Living the Beatitudes is tiring when you do it alone except for an hour on Sunday. But it’s easy when Christians work together.

I don’t want to diminish the Asbury revival, or the feelings its participants share in that time and space. The defining question, though, is: will they carry those feelings, that experience, into the world? Historically, White Protestant churches are pretty bad at the “after” part of revival. I hope I’m wrong, though, because this world really needs weekday Christians to get busy living by the words we claim to believe.

Monday, February 13, 2023

Are We Living In an X-Files World?

David Duchovny and Gillian Anderson in the 2016 X-Files relaunch

Yesterday, the United States military shot down a third unidentified object in American airspace, this time over Lake Huron. (Technically it was in Canadian airspace, just barely, but stick with me.) These unknown objects invading American skies have become so common, so ubiquitous, that even I have joined in jokes comparing them to UFO paranoia and kaiju movies. Yet I find myself worried about our increasingly militant response.

After the Biden Administration waited several days and several thousand miles to shoot down the first Chinese spy balloon, members of Congress, mostly the Republican Party’s hard-right flank, went berserk. The Administration has met each subsequent object with swift finality, and they're still not happy. Unidentified objects (I’m reluctant to call them UFOs) traversing American airspace have become another rallying point for a fear-based, xenophobic American worldview.

When The X-Files first aired in the 1990s, I wasn’t savvy enough to notice the overlap between TV stories and current events. The narrative of extraterrestrials conducting a slow, covert invasion of Earth corresponded with fears of invasion at home. Pete Wilson ran two successful campaigns for Governor of California based on overt, undisguised appeals to racism. Demagogues stoked similar fears of refugees entering America from Haiti, Somalia, and elsewhere.

However, it’s worth contrasting the literal “invasion” of refugees and undocumented immigrants, against Chris Carter’s science fiction invasion. The X-Files depicted a categorical invasion based on strategy and fear. Chris Carter’s invading aliens cultivated allies among the powerful, controlled the media message, and used psy-ops to silence anyone calling them out. Refugees simply showed up, hopeless, scared, and desperate for food and a shower.

Watching events unfold around us, I can’t help noticing who’s using psy-ops to actually divide us. While Marjorie Three-Names and Lolo Bobo scream bloody murder about invasion, about undocumented immigrants bringing disease as biological warfare, and about the enemies at our gates, America’s rich and powerful continue finding ways to narrow citizens’ access to information. We’re busy looking for external enemies, and failing to see them at home.

Elon Musk

Elon Musk’s invasion is probably the most glaring, if only because it’s so ham-handed and clumsy. While he continues running one of America’s best peer-to-peer information sources into the ground, he’s had ten children by three mothers. Overwhelming human genetic codes and planting alien offspring was a literal X-Files strategy for invasion. Despite still having numerous admirers, Musk has devolved into a 1990s sci-fi villain.

Other internal enemies are less visible, but more numerous. Corporate media conglomerates like Sinclair Broadcast Group and iHeartRadio (formerly Clear Channel) have a stranglehold on American media, and share a notoriously right-wing bent. Though it’s been a while since they did anything as obvious as submarining a platinum-selling country group, their ability to silence dissent is unmatched since the heyday of William Randolph Hearst.

It congeals into a massive goulash of influences where we can, as Fox Mulder famously warned, “trust no one.” While our political leaders hold power by lavishly identifying enemies inside and beyond our borders, the moneyed interests that bankroll their reelection campaigns continue finding ways to foment disunity and paranoia. Law and society persist in undermining the tools of community organization, and we’re left atomized, unable to defend ourselves.

Don’t misunderstand me. The X-Files showed the alien invaders’ human allies as a tight-knit cabal, so friendly that they could gather in one room and make binding decisions quickly. There’s no material evidence of such literal collusion outside a TV writers’ room. In reality, it’s more likely a sloppy agglomeration of mutual back-slappers whose needs happen to coincide. But the patterns are, if not instructive, at least illustrative.

I don’t believe, based on evidence, that it’s going too far to say that America’s rich are currently using the playbook from 1990s science fiction to overthrow the existing society and engineer one that suits their needs. Though the January 6th insurrection failed to coalesce into a coup, it didn’t need to. Subsequent years have seen us descend into exactly the bureaucratic intransigence and partisan backbiting that Mulder and Scully fought against for eight seasons.

Spotting UFOs in American airspace is, arguably, the endgame of this paranoia. It has Americans literally looking into the clouds or across the ocean to spot the enemies that have already seized the levers of American power. Our country’s wealthy and powerful have taught us to live in constant fear and distrust, which undermines our most rudimentary tools to protect ourselves against them. Flying saucers have become just another decoy for power.

Wednesday, December 14, 2022

Hello Darkness, My Old Friend

When I was a kid, I was afraid of the dark, as kids frequently are. My father sneered at this fear and refused to give me a nightlight. “There’s nothing there in the dark that isn’t there in the light,” he declared, insisting this ended the discussion. I asked him to prove this; he did so by turning on the light. The disjunction there apparently never occurred to him.

Back in 2019, short-story writer Amber Sparks wrote an engaging essay about the recent rise in credence given to magical thinking, particularly among women. If astrology, divination, and Wicca aren’t more popular lately, they’re certainly more talked about. To Sparks, scientistic thinking is an outreach arm of the patriarchy. Magical thinking may not necessarily be “true,” but to Sparks, it empowers the chronically disenfranchised, and that’s what matters.

Thinking about fear of darkness, I see a more literal manifestation of this exact premise. Fear of darkness is common among those who lack power in our society. This may mean metaphorical power: children and the poor, for whom being out of sight of others makes them vulnerable to abuse. I’ve also seen paralyzing fear of darkness among well-off adults who’ve been subject to violence, especially relationship violence.

But absence of power can be more literal, too. Without electric light, the night swarms with forces eager to kill you, or anyway cause you harm. Ordinary able-bodied humans receive the largest fraction of information about the world through our eyes, which don’t work in the dark. Without light, the night could be riddled with wolves, bats, bears, and more. Campfires aren’t only for warmth; they keep predators away, too.

Amber Sparks (see also)

Think about it: fear of darkness is most aggressively sneered at by people with access to light. Adults subject ourselves to degrading work conditions to keep the electric bill paid. City dwellers flood every corner of their communities with light. We frown-ups punish children for being afraid of the dark, but permit darkness into our lives only under controlled circumstances; I can’t rise to pee at midnight without turning numerous lights on.

Amber Sparks notes accurately that, for most people, most of the time, scientific truths don’t matter much. “Knowing about how a hand moves doesn’t stop it from covering your mouth.” The same applies to darkness. I know, intellectually, that darkness doesn’t invite monsters into my closet. But only shining light into dark corners proves that definitively. My father turned on the light to prove darkness wasn’t scary, apparently without irony.

Because, seriously, maybe there’s nothing present in the darkness that isn’t there when you turn the light on. But without light, how would you know? Even inside my own house, with modern central climate control and light, I’ve ventured out without turning the light on and stepped on prickly burs, cat barf, and a garter snake that got inside somehow. Darkness didn’t put them there, but it caused me to find them with my feet.

Modernist thinking tells us that science, reason, and limitless electric light have banished fears of darkness, fears it calls superstition. Neither the wolves of the primordial forest, nor the demons of medieval folklore, can withstand modernity and its intellect. If we simply swallow our base animal fears, and live in the glorious light of modernity, we have nothing to fear. Wisdom, reason, and manmade light abound everywhere.

Except we know that modernity’s benefits haven’t been distributed equally. Modernism has traveled hand-in-glove with patriarchy, racism, and war. The German pogroms in Poland, and Soviet pogroms in Ukraine, demonstrate how those who think of themselves as thoroughly hip and modern are willing to inflict massive devastation on poor and powerless peoples in order to continue propagating their diseased vision of modernity.

Nearly three years ago, I spent one night trapped in darkness and cold beside a rural Nebraska highway. Afterward, I waxed rhapsodic about my various insights, because I had nothing but my brain for company. But then, I returned to modernity, to electric light and central heat, and within days, forgot nearly everything I’d learned. I resumed abasing myself before my boss, to avoid spending another night in the dark.

My message being: fear of darkness is natural, even good. Just as Amber Sparks claims magical thinking empowers women amid patriarchy, fear of darkness empowers the poor. Those who would chastise your fear are the same people who’d demand your subservience to capitalist hierarchies, and that isn’t coincidental. Fear is your source of power; don’t let “them” steal it.

Saturday, November 19, 2022

The Role of Art in a Divided Society

A still from Robert Wise and Jerome Robbins’ 1961 film of West Side Story

Sometime in the 1990s, I’ve forgotten exactly when, my sister’s high school theater program staged the classic musical West Side Story. Because of course they did, it’s standard theatrical repertoire. The only problem was, her school (she and I attended different high schools) was overwhelmingly White. The drama of urban tension between Hispanic and Irish communities was played by farmers’ kids of mainly German and Czech heritage.

This meant, as you’d expect, brownface. Students playing the Puerto Rican Sharks gang dyed their hair, darkened their skin, and affected Latino accents. The White Jets, meanwhile, learned a stereotyped “New Yawk” accent and got ducktail haircuts. These students, who were entirely White and had lived in Nebraska for most or all of their lives, immersed themselves in playing ethnically mixed East Coast characters, not always in the most sensitive ways.

Around twenty-five years later, my sister recalls that performance with a visible cringe. Troweling on makeup to play ethnically clichéd characters, which seemed broadly acceptable then, is patently unacceptable today. Nobody, except a few high-profile heel-draggers like Megyn Kelly, would pretend otherwise. But without the willingness to play characters who didn’t resemble themselves, I contend, these students would’ve deprived themselves, and their community, of something important.

West Side Story remains important theater, sixty-five years after its debut, because it addresses an important American cultural problem. The Jets and Sharks, defined by their race, attend the same high school and walk the same streets. But they never communicate, because they believe long-held bigoted myths about one another. When Tony and Maria dare fall in love, it transgresses one of America’s most cherished internal borders, the color line.

I’ve written before that teaching youth the humanities matters, because through art and literature, students see other people as fully dimensional human beings, with thoughts, feelings and dreams equal to their own. West Side Story reminds us that anybody, raised on such myths, could wind up believing them, and embracing the violence such division brings. Racism, this play reminds us, isn’t inevitable; it’s a choice we make, and keep making.

Arguably, that’s why White actors playing Brown characters is usually suspect. If my sister’s high school had had sufficient Hispanic actors to play the Sharks, they should’ve cast accordingly. No matter how sympathetically those student actors attempted to portray characters who were culturally or racially different from themselves, they’d inevitably resort to stereotypes, sometimes hurtful ones, of people groups they’d never actually met.

A still from Steven Spielberg’s 2021 film of West Side Story

But simultaneously, if the school refused to perform this play, nobody would’ve had the opportunity to receive its message. Not the student actors, who needed to stretch beyond their limited small-town experience, nor the audience who, in Western Nebraska, seldom get to witness world-class art. Beyond the high school, getting to see top-tier theater means traveling to Omaha or Denver, and most people can’t spare that much money or time.

This elicits the question: is the message important enough to accept less-than-optimum messengers? I don’t want to be mistaken for advocating brownface; the specific event I’m remembering belongs to its own time and place, and should remain there. But the event gave students and the community an opportunity to see people whose lives and experiences were wildly different from anything experienced locally. Even if those “people” were actors.

Questions like this will become more important in coming years. In 1957, when West Side Story debuted, Manhattan’s Upper West Side was predominantly working-class, racially mixed, and volatile. Within five years, the combined forces of gentrification and White Flight changed local demographics. By the 1980s, the Upper West Side was heavily populated with yuppies, while the ethnic communities celebrated onstage had been forced into dwindling enclaves.

The White small town where my sister attended high school has experienced something similar: there are now considerably more Hispanic residents, and even a few Black residents. Because the Hispanic residents are mostly agricultural workers, though, they seldom mix substantially with the community. Interactions with what locals call “Mexicans” happen in public places, like grocery stores; the actual community members seldom get to know one another beyond nodding hello.

Artistic expressions like West Side Story will matter more soon, as American society becomes more segregated, more hostile, more like the Sharks and Jets. Opportunities to see “the Other” as equally human to ourselves might make the difference between peace and violence. And sadly, not everybody will have access to racially representative casting choices. Cross-racial casting isn’t ideal, but it’s better than denying audiences the art they need to see.

Saturday, June 5, 2021

Hard At Work in Post-Labor America

It’s that time in the crisis cycle again: time for the self-righteous and wealthy to remind everyone how disconnected they really are. As America re-opens, and we have to make painful choices about how to rebuild the wreckage of our former economy, some people broadcast their privilege by whining about the injustice of it all. I just got accustomed to lockdown, whimper, why start back up again now?

First, cartoonist and essayist Tim Kreider imposed upon Atlantic readers with “I’m Not Scared to Reenter Society. I’m Just Not Sure I Want To.” Despite its promising title, it doesn’t address the implied theme of whether our prior corporatist-hellscape lives are worth returning to; Kreider instead mewls about how a lifestyle of “solitude, idleness, and nihilism” has become more appealing than work. Kreider needs less Pfizer, more Prozac.

Then a survey report emerged, saying that “64% of workers would pass up a $30k raise to work from home.” As big-tech and financial-services companies, which let millions of workers telecommute during the pandemic, desire a return to normality, many workers aren’t interested. They prefer work-from-home conditions, which allow them the liberty to task-shift. Bored writing code or handling customer-service emails? Take fifteen to do laundry or heat a frozen burrito!

Both these voices sound superficially familiar. Who hasn’t yearned, periodically, to shirk work and malinger indoors, watching Netflix? And anybody who’s done white-collar work knows the eight-hour jive is moral rather than practical; we can’t focus that long on tasks for somebody else’s reward. Both the desire to jettison employment altogether, and the desire to work under home conditions, suggest a desire to refocus employment on workers, not bosses.

But.

Both stories bespeak mostly unexamined levels of privilege. Tim Kreider admits early that he doesn’t particularly require an income; he crashes with friends. And the survey locates that supposed $30K refusal mainly among “Zillow, Twitter, and Microsoft employees.” We aren’t talking about people making sandwiches or scrubbing toilets, work that can’t be done remotely. Millions of Americans kept working through the pandemic, or didn’t get paid.

While Kreider didn’t need to work, and Microsoft restructured its work requirements, I and millions of others fell into a third category. Our jobs, declared “essential” for America’s thriving economy, kept going, and we needed to show up. These jobs weren’t without price, either; many low-paying service workplaces became superspreader sites. The media called grocery clerks “heroes” for doing their jobs, like they had a choice.

My construction labor was considered “essential.” But the things I worked to achieve, participation in community arts or amateur sports or just hanging out with friends, became suddenly lethal. Without a family, I moved from my apartment to my job and back like a convict on work-release. I had neither the luxury of chrysalis-like oblivion, like Kreider, nor the liberty to schedule my own day, like the statistical Microsoft workers.

This leaves me with a seemingly inevitable conclusion. I don’t need work tweaked, whether through a raise (though one would be nice) or through telecommuting options. Rather, I question the structure of employment altogether. These sixteen months have made literal what Marx and Lenin described metaphorically: that the ownership economy steals all meaning from work, while the rewards go to those who merely own things.

Fast-food franchisees complain that workers refuse employment because the government offers a pitiful stipend. ($300 per week amortizes to $15,600 per year—or approximately what Elon Musk makes every three seconds.) Others claim workers will return when they have other perquisites, like affordable child care, universal health insurance, and paid family leave. All these arguments assume people want rewards which have a detectable price tag.

But I’d contend workers want some meaningful connection between work and reward. Not a specific reward; they want control. The pandemic gave Tim Kreider the justification to indulge the moody self-abnegation inherent in his cartoons. It gave the abstract tech-industry workers freedom from being chained to the desk. It gave hourly workers freedom from… having anything to do or anywhere to go after hours. Not exactly the best trade-off for some.

These stories, and the traction they’ve received in the last several days, reflect the relative privilege held by the American punditocracy. The fact that cube farmers don’t want to return to the cube isn’t news; did they ever want to be there? Yet the existential rootlessness which the last sixteen months amplified for the working class remains ignored. Unless, of course, it stops editors from getting their three-martini lunch.

Sunday, May 30, 2021

Freelance Restaurant Workers: a Modest Proposal

Promo still from the award-winning 2007 indie film Waitress

As the “labor shortage” drags on, with both sides blaming each other for America’s struggling service industry, maybe it’s time to reevaluate the market. Our service industry remains beholden to a 19th-Century model of employment, where workers are obligated to management, and management in turn dispenses pay, benefits, and task assignments. But maybe Americans’ growing unwillingness to accept that model at today’s pay scale suggests that model has outlived its usefulness.

Sure, I know, millions of Americans will insist it’s the pay scale that’s shuffling on, zombie-like, beyond its productive life expectancy. Our minimum wage hasn’t changed since 2009, while rent has increased by over half. And for tipped workers—meaning mostly food-service workers—the minimum wage hasn’t shifted since 1991, during which time rents have nearly tripled. Okay, by that narrow, prescriptivist model, our wage structure is egregiously out-of-date.

But clearly, with today’s governmental structure, proposals to update the wage base are a dead letter. Three Administrations, representing both major parties, have essentially shrugged and admitted their options are few: nobody will support higher wages for America’s underserved, they say, so let’s not even try. The Biden Administration campaigned explicitly on improving pay for working Americans, then surrendered three months into its term. Stop wishing on a star.

It’s time to admit: tipped staff don’t need management anymore. Businesses which hire tipped staff, which again means mostly restaurants, should stop hiring staff altogether, and staff should stop applying for these jobs, like Oliver Twist with his bowl, begging: “Please, sir.” If waitstaff’s absolute wage floor hasn’t increased since my high school days, they clearly don’t need wages at all. They’re already living on tips; why stop there?

Since the federal minimum wage is already absurdly low, and over two-thirds of tipped workers’ pay has to come from tips just to equal it (the tipped minimum is $2.13 an hour against a $7.25 floor, so tips must cover the remaining $5.12), why not the rest? Most food-service work is wholly standardized: the numbered arrangement of tables, the digital systems for entering orders, the dress and behavior codes. Unless your waitstaff wears company-branded clothing, there’s little to distinguish the crew at one restaurant from nearly any other in America.

Restaurateurs should simply maintain a bulletin board with available tables. Aspiring waitstaff arrive during peak hours, claim as many or as few tables as they feel comfortable serving, and voilà! The staffing problem resolves itself, because waitstaff no longer work for the restaurant. They work for their respective customers, get paid in tips, and keep everything they make. Restaurants are off the hook altogether.

Edmonton, Alberta-based chef Serge Belair at work

Owners should embrace this change, because it means they needn’t hire, train, or schedule staff anymore. Workers simply arrive, and work until they believe they’ve earned enough. Workers should favor this because they needn’t feign any particular loyalty to restaurants that provide lousy pay and benefits. Letting waitstaff go completely freelance frees owners and workers alike from the burdens which employment (as opposed to work) brings.

Moreover, freelance status will give waitstaff more authority over the “I don’t tip” clientele. Under current conditions, some people excuse their refusal to tip by saying “it’s the restaurant’s responsibility to pay workers.” If servers go completely freelance, then customers who refuse to pay their servers have literally stolen services. Customers who don’t pay waitstaff under the freelance model would be thieves, exactly like customers who skip out on their tab now.

I can anticipate the likely counterarguments. What if not enough waitstaff want to work during lucrative meal rushes, and restaurants find themselves pleading for help? What if workers only want to freelance at posh restaurants where customers are subdued and respectful? For every objection, I have the same response: that sounds like a “you problem,” and you should work to cultivate a more polite, better-paying customer base.

Indeed, changing the service industry’s worker-employer model would, arguably, expose the roots of problems that make such work undesirable now. If some minority of customers behaves boorishly, making work unbearably nasty, maybe ask yourself what you’re doing that rewards such behavior. And if workers are so reluctant to arrive during peak hours that you can’t plan ahead effectively, we return to the same solution everyone offers: pay better.

I’ve had the pleasure of knowing many waiters, bartenders, and coffee baristas; all tell warm tales of regular customers with whom they became friends. Restaurants, coffee shops, and bars often become the beating hearts of local communities. Yet people leave the industry with disheartening frequency, usually for one reason: bad relationships with management. We can solve this problem by removing management completely.

Friday, April 23, 2021

Some Thoughts on J.K. Rowling and H.P. Lovecraft

J.K. Rowling

Recently I started talking with friends about legendary fantasy novelist J.K. Rowling, an activity that never ends well. Rowling’s influential novels, and the fandom unified by her essentially progressive, antiracist stories, have a massive audience that has become a cultural powerhouse. Rowling herself, unfortunately, has expressed some opinions her fan base finds reprehensible. When challenged, her response is to double down and punch back.

I cannot help comparing Rowling’s personal disintegration to another pathbreaking author whose works have outsized influence: H.P. Lovecraft. Like Rowling, Lovecraft created works that dovetail with an existing literary genre, yet push that genre’s potential into new domains. Unlike Rowling, whose expressed views are broadly progressive on all but one key issue, Lovecraft had offensive views about nearly everything, and documented his lousy views extensively.

Rowling’s audience rallied around her political views initially. Her opposition to classism, economic dominion, and nationalism appealed to a global audience which saw actual governing institutions in retreat. Her work was essentially optimistic, pitting Harry Potter in particular, and Hogwarts generally, against a mundane society still glum from the hangover of Thatcherism and its close cousin, Reaganomics. Her work insisted the faithful could resist the cultural tide of pessimism.

Don’t mistake me. Rowling’s works often have unexamined suppositions buried inside them, such as relying on anti-Semitic stereotypes when describing bankers. However, simply having illiberal beliefs doesn’t make someone bad; as Dr. Ibram X. Kendi writes, antiracism means acknowledging racism anywhere you see it, even in yourself. And, as I’ve written, it’s foolish to ask one high-schooler to overthrow Capitalism, even when Capitalism shows its ugliest face.

Lovecraft, by contrast, was deeply pessimistic. He detested immigrants, African Americans, country people, and the poor. (Women he could take or leave.) He feared any form of innovation, which came across in his works: his villains were frequently technicians, “foreigners,” and the unlettered masses. His writings had a keen ear for catching audiences’ unspoken anxieties and stretching them out with beautiful anguish, but he was, by any definition, a bigot.

H.P. Lovecraft

Yet I’ve noticed a contrast. Lovecraft remains something fandom can discuss, though frequently in pained terms. Even mentioning Rowling, in certain circles, virtually guarantees feelings will run high, and opposing camps will fling harsh words. Discussion boards I frequent online have had to forbid Harry Potter fan discussions, because even mentioning Rowling brings ugly behavior out. I can’t share this essay in certain places, lest violent arguments commence.

It isn’t because of their respective views. Lovecraft’s opinions were far more repellent than Rowling’s, and unlike Rowling, Lovecraft put his opinions directly into his works. Yet not only do Lovecraft’s works remain widely read, but recent writers I’ve admired, like Kij Johnson and Victor LaValle, continue engaging with Lovecraft’s work, attempting to tease his nightmare-like storytelling from his explicit bigotry. I’m unaware of anybody getting violent over Lovecraft fandom.

Meanwhile, Rowling’s fans have attempted to write her completely out of her own fandom. From half-joking jibes that henceforth, we agree the novels were spontaneously generated, or that Daniel Radcliffe secretly wrote the books, to outright insisting that enjoying the books and movies you’ve already purchased makes you a bad person, the call rings to blacklist Rowling as a person. Some anti-Rowling rhetoric is mean-spirited, hateful, and even violent.

Concisely put, organized fandom seems willing to work around Lovecraft’s bigotry. They don’t ignore or whitewash it; Johnson and LaValle clearly foreground that they’re explicitly challenging Lovecraft’s deeply rooted suppositions. Yet Rowling disagrees with her broadly left-leaning fanbase on one topic, and they’re willing to ostracize her from the community she created. They demand agreement in all things, or complete rejection. There can be no middle ground.

EDIT: Since I wrote this essay, Rowling has also made statements that put her in a broad camp with Holocaust deniers. Everyone has a red line, and this is mine. For all her artistic accomplishments, Rowling maybe needs to just shut up and go into personal seclusion.

Maybe it matters that Lovecraft’s work is in the public domain. When somebody buys a Lovecraft collection, or remixes his works to create new art, he doesn’t draw residuals on the effort. Meanwhile Rowling gets paid every time somebody buys her books and DVDs, or whenever her movies air on cable or streaming services. But even that seems unsatisfying, since people still buy Orson Scott Card’s books—and look pained while doing so.

I return to something I’ve said previously: it’s foolish to expect great artists to be good people. Great art generally emerges from some internal anguish. From Stephen King’s substance abuse and Norman Mailer’s violent lack of impulse control, to John Lennon’s use of drugs and sex to plug emptiness deep within his soul, artists are usually troubled people. Rowling and Lovecraft are no different, and we should treat them accordingly.

Monday, April 5, 2021

The Pagans Are Coming! Bar the Gates!

What Lil Nas X Means To Me, Part One

A still from Lil Nas X's “Montero (Call Me By Your Name)” video

If I gather one important lesson from the controversy surrounding Lil Nas X’s Satan Shoes, it’s how utterly predictable the faux controversy is. LNX says something photogenically blasphemous, flogs a product, and waits. Then hordes of televangelists and pundits, claiming to represent the public face of Christianity, express their crinkum-crankum outrage. It’s wholly inevitable to those of us who remember Madonna’s dullsville 1989 blasphemy video, “Like a Prayer.”

It’s so wholly predictable, in fact, that one suspects LNX probably engineered that outcome. His track “Montero,” with its accompanying video of him engaging in sexual exploits with the Devil, has the same earworm quality as “Like a Prayer,” which had similar video imagery. LNX performed the same basic actions, producing the same basic outcome, with such precision that he probably followed an Excel spreadsheet of how Madonna propelled a pleasant but undistinguished album to quadruple-platinum status.

LNX’s “Montero” and its marketing tie-in demonstrate the moral bankruptcy among Christianity’s public leadership. Christians, as an aggregate, have embraced recent trends toward militarism, sexual paranoia, and worship of mammon. The degree to which Christians embraced our two most recent Republican presidents, both of whom cut taxes on the rich and entangled America in overseas wars, demonstrates how American Christianity has strayed from its biblical roots.

Mainline Christian denominations have aggressively opposed recent trends toward militant nationalism in American politics. The Catholic, ELCA Lutheran, and United Methodist churches, among others, officially opposed Operation Iraqi Freedom, our decade-long entanglement in Middle Eastern politics. The Roman Catholic Church’s recent declaration that it can’t bless same-sex unions shocked many people, despite being Catholic policy for literally centuries, because so many other churches have reversed position recently.

Yet you wouldn’t know it from most church actions. Many pulpit ministers flinch from expounding their churches’ policy positions from the congregational lectern, because they know their parishioners would exeunt en masse. When the ELCA Lutheran church reversed its longstanding position and agreed to ordain openly gay clergy, nearly 150,000 Lutherans walked out, organizing the North American Lutheran Church specifically around excluding gay Christians.

Meanwhile, as clergy representing the Christian center-left shy away from controversy, right-wing clergy have no such reluctance. Paula White’s famous, highly public prayer ceremony to reverse the 2020 election outcome for the former President became a disgraceful spectacle of global proportions. Right-wing clergy have closed ranks around a form of nationalist conservatism that privileges the wealthy, like the former President, often at the expense of “the least of these.”

A still from Madonna's “Like a Prayer” video

This communal prayer, not for victory over the forces of this world, but for dominion, results in Christianity expressing a fortress mentality. Religious leaders willing to express their political views assert that Christianity is under attack, despite over seventy percent of Americans identifying as Christian. Public Christians, despite reams of evidence, have rhetorically backed themselves into that fortress, and preach faith as constantly under siege.

American history, of course, demonstrates that this isn’t new. The New England Puritans fled England because they thought they were under siege, then quickly provoked King Philip’s War because they thought they were still under siege. American Christians have historically been unified, not by a mission to serve the disadvantaged like Jesus commanded, but by a fear of a monolithic world of malign barbarity. Lock the gates, Pastor, the savages are coming.

I wouldn’t even mind this martyrdom mentality if Christians were martyred for following the Gospel. Many American municipalities have laws against feeding the homeless or welcoming immigrants, things Jesus actually talked about. Yet many of the public Christians now whimpering about a hip-hop video on YouTube are the ones enforcing laws against protecting the poor. Jesus separates the sheep from the goats, not by their doctrines, but by their actions.

LNX, like Madonna before him, has exposed public Christianity’s moral vacuity. Given the opportunity to repent of its sectarian, exclusionary history, the Church, or anyway its public face, has chosen to double down. Banshee-like screaming about American moral decay, which televangelists see everywhere, nets ratings and donations. And Christians in the pews, inundated with this messaging, believe that actually represents the preponderant Christian opinion.

Meanwhile, American church membership is falling precipitously. Especially following a Presidential administration that used Christian language to justify literal violence against minorities, this probably isn’t surprising. People aren’t walking away from Christ; they’re abandoning a Church that has forgotten its sacred roots. If Christians don’t take this opportunity to scrutinize ourselves (and so far, too few leaders have), we will see this is only the beginning.

 

See also Part Two