Showing posts with label work. Show all posts

Friday, April 10, 2026

In the Hidden Corners of My Hometown

This West Coast modernist design just sprouts in the middle of a post-WWII development.

Flailing my way through protracted unemployment, I recently started driving DoorDash to get some cash coming in. My community is too small to produce enough business for me to live off my gig, but it brings in enough to keep groceries on the table. The gig has provided another important education I didn’t realize I needed: despite living in one small city for over twenty years, I’ve discovered how much of town I just don’t know.

My central Nebraska city has a population slightly above 30,000 people. By current American standards, that’s dinky, but on a historical basis, actually quite large. Legendary ancient cities like Chichen Itza or Babylon topped out around 20,000 people, the practical maximum for societies where the majority needed to farm, and urban infrastructure had to primarily support pedestrians and mule carts. Modernity can support much larger populations, though, mostly because of cars, electricity, and Portland cement.

Modernity has also produced something that ancient cities could never have supported: single-use zoning. When cars put much larger distances within easy reach, it makes less sense for citizens to build a business in front of the house, a stable in back, and extra rooms for an inn on the side. American communities are now built in sprawling, monolithic ways that discourage visitors. There’s little reason to visit huge swaths of one’s own city without a prior invitation.

This results in acres upon acres, streets upon streets, that I’ve never visited—until now. DoorDash invites me into single-use residential neighborhoods I’ve never previously had purpose or permission to enter. Visiting these quarters for the first time, I witness eclectic architecture, some of it deliberately minimalist or rococo, and differing ideas about how large the surrounding yard should be. I’ve also noticed that the newer the development, the less likely it is to have sidewalks.

Very large lawns, without sidewalks or parks, encourage children to play close to home. Current urban design (which, often, means no design, just vibes) discourages children from one of childhood’s primal impulses, the desire to explore. Wandering away from home may be impractical in new developments and, depending on traffic patterns, unsafe. This means children only have opportunities to meet friends and make connections in officially approved spaces, mainly school and, for some, religious congregations.

Just one of a development of identical crackerbox duplexes with postage stamp lawns,
no sidewalks, and no curbside parking—completely hostile to visitors or teenagers.

The extreme opposite, I’ve observed, is small houses, mainly duplexes, on small lots. These are single-story houses with attached garages, which eat up most of their modest lots. However, these developments also lack sidewalks, which means not only no pedestrians, but no curbside parking for guests. These houses seemingly go mainly to young families as starter homes, so maybe they don’t entertain much. But the design dampens their ability to perform time-honored neighborhood rituals of group bonding through hospitality.

Small starter homes have no parking and no place to set up picnic tables. Larger homes for established families have space and parking, but are so far away that neighbors can scarcely see one another. Either way, these designs discourage traditional neighborhood activities, like block parties or tenants’ unions, and functionally prevent neighbors from getting to know one another. The McMansions, in particular, look awkward, flexing their design flourishes to impress neighbors they’ll never meet.

Traveling to shared spaces, like work or school, requires either an overland hike without sidewalks, or car rides that create traffic jams. My city is small enough that “jams” are fleeting annoyances. But larger unplanned cities like Houston, which is over forty percent paved, can be dangerous during the morning commute. Ambulances trapped in rush-hour traffic have become a notable part of the Houston experience, as did the city’s inability to drain after Hurricane Harvey.

Current urban design standards divide routine activities. This isn’t entirely awful, as most people wouldn’t want to live beside a lead smelter, kimchi cannery, or hog abattoir. But most people also can’t walk to restaurants, shops, or even their neighbors’ houses. All daily business happens enclosed in hermetically sealed, climate-controlled metal capsules. Ordinary people have diminished opportunities to make friends, discover quirky experimental businesses, or, as I’ve learned recently, see most of their own town.

Old cities like central London, Paris, or New York south of Houston Street are designed around human needs: useful sidewalks, homes designed to double as business sites, and multi-story structures that utilize vertical space as assertively as horizontal. We can’t simply regress, because history proceeds, even when we wish it wouldn’t. But we can look to older spaces for inspiration, for ways to build newer spaces that aren’t hostile to visitors.

Friday, January 30, 2026

Toxic Work Ethic in America, Part Three

This essay follows from Toxic Work Ethic in America, Part One and Toxic Work Ethic in America, Part Two.

We humans intensely comprehend our own limitations, fears, and psychological twinges, because we’re all passengers inside our own heads. We can never truly understand other people’s mental states from outside. In my last two essays, I made sweeping generalizations about working-class and upper-class mindsets, but I’m no scientist. I simply constructed a tentative hypothesis from personal experience, conversations, and observing public figures.

To recap, I suggested that most people have conditioned inner narratives driving their workplace habits, and “work ethic” is the benign manifestation of malignant inner trauma. I attribute this trauma to fathers, perhaps because my sources, both personal and public, are men. Maybe women learn more workplace habits from mothers; leave an informed comment. Either way, our “work ethic” is an external tool to paper over inner damage.

But this carries deeper implications: if I’m right, then work ethic, and workplace habits generally, orient toward the past. We appease the voices which exposed our inadequacies as children, constantly trying to silence condemnations that, as adults, only exist in our own heads. Addiction treatment specialist Gabor Maté says something similar about substance users: they mostly want to assuage pain their own brains keep inflicting on them.

(As an aside, many friends have warm, supportive relationships with their fathers. I largely did, too, before his memory started failing. I don’t disparage fathers, but observe how they sustain patterns which they themselves don’t realize have caused harm.)

Contra this past orientation, most spiritual traditions favor a mindful orientation toward the present. Buddhist meditation, Christian centering prayer, and Taoist wandering all encourage supplicants to exist in the present, attuned to each moment, listening for the universe’s subtle call. The workplace of capitalist accrual reminds us of past voices and future rewards. But spiritual practice calls us to exist here, now, as we are.

Bringing spirituality into a workplace ethics discussion is, I realize, risky. Many True Believers insist their spiritual tradition is uniquely true, which could split my audience. Yet bear with me. For all their manifold differences, the religions I’ve studied share a core proposition that the person before us, the community around us, or the conflict buffeting us, holds primacy in our spirituality. Here. Now.

Overcoming the inherent “work ethic” trauma means attuning ourselves to the present. It means listening to instructions, not in fear of punishment or longing for reward, but as they are. It means recognizing our bosses as humans, with the foibles and needs that entails, and not as manifestations of engrained father images. It requires being attuned enough to our own bodies and limitations to say, without malice or fear, “No.”

Humans find ourselves torn between our carnal condition, driven substantially by past traumas and future needs, and our spiritual nature, which faces the present. What’s worse, our spiritual leaders, themselves facing the same tension, encourage this divide. When a millennia-old textbook becomes more important than the immediate person, conflict, or community, then spiritual leaders sink to the level of employers and politicians.

Moreover, the worldly forces which profit from our “work ethic” trauma already know this. That’s why they barge into our spiritual domains. Billionaires and politicians have transformed Christianity into a nationalist front, reduced “self-care” to retail therapy, and taught us to see mindfulness as a professional strategy. Developing a spiritual discipline will entail purging the anti-spiritual influences from your tradition.

The spiritual equanimity I describe has no single path. Despite me mentioning prayer and meditation, I’ve found these disciplines of limited personal value. But I’ve achieved something comparable by writing poetry: listening to each moment, and selecting the most appropriate word which exists, has helped me attune myself to the present. Whatever removes you from past traumas and future mirages may be your path toward spiritual balance.

This conclusion probably feels abstruse, distant from my starting premise. Yet I believe it holds together. Whether it’s my father chastising me for slowing down, or Errol Musk chastising Elon for not collecting enough accomplishment tokens, that condemning voice comes from the past. The past thus can’t save us, nor the future, which doesn’t yet exist. Only in the present, the spiritual center, can we escape that conditioning.

Elon Musk and I learned incompatible messages from our fathers, which produce wholly divergent outcomes. Yet the harm those messages continue to produce has made us smaller, spiritually less developed beings. And we could both escape by reorienting ourselves away from those messages. But that means we stop seeing ourselves as economic actors and redefine ourselves as human.

Thursday, January 29, 2026

Toxic Work Ethic in America, Part Two

This essay follows from Toxic Work Ethic in America, Part One.
Elon Musk

Elon Musk, currently likely to become America’s first trillionaire, has a conflicted history with his father, South African entrepreneur Errol Musk. Elon tries to deny Errol’s part-ownership of an emerald mine, for instance, but Errol calls that pure mythology. Even if Errol didn’t bankroll Elon’s earliest ventures, his wealth allowed Elon freedom to pursue an education, experiment with technology, and start several businesses in his early twenties.

If, as I said previously, people arrogant enough to become billionaires and presidents aren’t conditioned in childhood to be self-effacing, that doesn’t mean they’re unconditioned. And like me, their conditioning comes heavily from fathers. My father conditioned me, mainly by yelling, to maintain a self-destructive work ethic, pushing myself to the brink of collapse, then returning home too depleted to do housework. Elon’s father conditioned him to… well.

Like Elon, Errol was a serial entrepreneur, who also used his wealth to buy out enterprises that piqued his interest. Like Elon, Errol married a glamorous, accomplished wife, but seemingly paid her little attention, letting Maye Musk pursue her interests without support or awareness. Like Elon, Errol is sexually voracious: Elon has fourteen children by four women that we know about, while Errol had a child with his own stepdaughter.

Where my father taught me to deny myself and disappear entirely into my role as an employee, student, battalion member, or whatever, Errol Musk taught Elon to elevate himself, and his desires, over other people. Errol conditioned his son to be constantly self-seeking, always aware of ways he falls short or looks small. My father conditioned me to be self-abnegating, while Errol conditioned Elon to be self-centered.

I don’t know Elon’s full story, partly because Elon often contradicts himself regarding his biography. So I’ll draw an analogy. Joe Plumeri, former CEO of the Willis Group, opens his memoir by describing his father showing him the luxurious houses around his New Jersey hometown. When Plumeri later describes himself as a “workaholic” who loves showing his father his accomplishments, it becomes clear: he has spent his life appeasing his father.

Sigmund Freud

One could extend the comparison. Consider the American Presidents and presidential candidates who considered the Presidency their birthright: John Quincy Adams, Benjamin Harrison, George W. Bush. George H.W. Bush and Al Gore were both sons of senators. The Kennedy family exists. John McCain, whose father and grandfather were both four-star admirals, had his military career stall because of his POW status; he ran for President partly to outrank his ancestors.

Developmental psychologists describe human behavior as “highly conditioned.” In plain English, this means our past circumstances shape our present options. We cannot make a completely original decision, but rather see our opportunities defined by our life experiences. Many of the conditioning agents that shape our ability to see fall into two broad categories: standards we want to live up to, and mistakes we want to live down.

Again, for many of us, fathers (or father figures) shape our perceptions. My father taught me to see myself as part of a unit: whether a workplace, a classroom, or a military battalion, I needed to diminish myself. If I took an unscheduled break, yawned loudly, or even slowed down notably, my father volubly reminded me that I wasn’t just shirking my individual duties. I was letting the entire group down.

Meanwhile, billionaire fathers teach their sons to seek themselves. Sometimes this self-seeking is a doom spiral, as Cornelius Vanderbilt failed to teach his sons business acumen, and the Vanderbilt fortune eventually disappeared. Other times, this self-seeking accrues wealth and power. We can see this in how billionaires treat others: Elon’s multiple divorces and President Taco’s Epstein Island adventures show they see women as consumable resources, not people.

My military analogy recurs. Rank-and-file soldiers internalize an ethos of self-sacrifice, and learn to see heroic death as the ultimate virtue. And I do mean “learn”: cult expert Daniella Mestyanek Young writes that basic training doesn’t teach military skills, it teaches self-abnegation and the primacy of the unit. As a collective, the military survives by teaching its members that their individual lives aren’t worth saving.

Elon Musk claims to work 100 hours per week. This feels specious, since he also claims to be a world-class competitive videogame player, while recently tweeting nearly 100 times per day. But even if it’s true, Musk doesn’t work those soul-breaking hours because he’s disappeared into his job. Instead, he’s made his companies an instrument of his ego, something to inflate himself, though it will never leave him full.

Thursday, January 22, 2026

Toxic Work Ethic in America, Part One

I sometimes literally hear my father’s voice when he isn’t there. Not in a jolly metaphorical way, either, but in a terrifying, often humiliating way. I first noticed this when I worked at a medical components factory during my hiatus between college and graduate school. If I slowed down, slacked off, or simply paused to chat with my co-workers, I heard my father shouting angrily, demanding to know why I wasn’t working until I bled.

Because of this terrifying voice which chased me throughout the workday, I moved faster, took shorter breaks, and got more done than people who had worked there far longer than me. Supervisors took notice, too. They often praised my work ethic, telling me that they wished they had an entire shift full of laborers as “dedicated” as me. Because they weren’t passengers in my head, as I was, they took my terror as committed professionalism.

Management often mistakes being “busy” for productivity. I noticed this often while working in construction: management would schedule marathon hours, especially in the final crunch. But management only deluded themselves. Fatigue, boredom, and resentment created new problems, while workers spent most of every morning ripping out the mistakes they had made the previous evening, when they were tired. Team supervisors micromanaged workers’ every decision, because site superintendents micromanaged the supervisors. Everyone was tired all the time.

Every blue-collar job I’ve worked has had some version of this. If food service workers get caught up on their tables, they’re assigned cleaning tasks or sent to refill table caddies. I’ve worked in two car-parts factories, where we were ordered to sweep and clean if the machines went down even briefly. Every moment is policed, every action judged, and companies demand constant maximum productivity; unscheduled pauses are justifications for reprimands, often stern.

Meanwhile, I’ve worked only two white-collar jobs, as a freshman composition teacher and a marketing copywriter. In both positions, I’ve been astounded by how much scheduled work time gets consumed by non-work activities. Chatting, dithering, side projects, day drinking, and even napping are anecdotally common. While hourly wage earners have their hours aggressively monitored for unsanctioned yawns, resulting in paranoid, often manic work, managers have so much discretion that they want for things to do.

My father spent most of his military career as a rank-and-file enlisted man. If you’ve ever spent time on a military installation, you know how aggressively the enlisted men’s time is regulated. Every barracks, parade ground, warship, and hangar is the epitome of cleanliness, with every plank sanded, hinge oiled, bolt painted, and floor scrubbed. Especially for unmarried recruits living on-post, twelve-hour workdays of constant, regulated motion are common, and labor outputs are closely quantified.

Simultaneously, a peer whose father was a career officer told me that officers cultivate the attitude of men of leisure. (We met in school off-post, because even officers’ and enlisted men’s children are discouraged from mingling.) Not that officers don’t work, because they too have pervasive regulations and readiness standards. Rather, they achieve their dictated goals at measured, deliberate speeds. Humans with autonomy, not checklists and rubrics, measure officers’ outputs. Rules are discretionary, not absolute.

This pattern applies broadly. Matt Taibbi wrote (before becoming a culture war spokesmodel) that every SNAP benefit applicant gets treated like incipient fraud, while almost nobody was held responsible for the 2008 financial collapse. Since then, we’ve seen how the only people indicted for the January 6th, 2021, Capitol insurrection were the foot soldiers at the door, and they got pardoned. Those who incited the crime not only got ignored, they got reelected.

America cultivates a socioeconomic narrative in which the poor, the laborer, the voter—the enlisted men of civilian society—hear their inadequacies repeated endlessly. Nor is this accidental. The wealthy and powerful—our officers—want us to suffer the constant loop of condemnation for even momentary weakness, like I heard, and sometimes still hear, my father. The psychological harm which this repetition causes individuals doesn’t matter, because to our “officers,” the outcome is “work ethic.”

But it also enables unbridgeable gaps in American social structure. This is why laborers seldom become management, classroom teachers rarely become administrators, and most citizens have little chance of getting elected to higher office. We numpties cannot lead because we’ve been conditioned to rehearse our inadequacies, real or imagined, constantly. Only those without that conditioning have the arrogance necessary to become presidents, billionaires, and other captains of society. “Work ethic” is the opposite of advancement.

This essay continues in Toxic Work Ethic in America, Part Two

Friday, June 20, 2025

Food, Economic Injustice, and You

Much modern farming less resembles gardening than strip-mining

Amid all the ICE raids which crisscrossed America last week, tipping into street protests in Los Angeles, the Omaha meatpacking raids got forgotten by the national media. This perhaps isn’t surprising. A substantially industrial city with limited glamour, Omaha often gets overlooked unless something catastrophic happens, like blizzards closing Interstate 80, or local darling Bright Eyes releasing an album.

Yet this raid speaks to an undercurrent in American policy. Specifically, since the Civil War, when Abraham Lincoln signed the legislation establishing the Department of Agriculture, American ag policy has focused on abundant yields and low prices. This has involved persistent overproduction of commodity crops, coupled with price supports, ever-improving technology, and efforts to create markets internationally.

As George Pyle writes, efforts to bolster production probably made sense in the middle 19th century, during a civil war, when threats to food supply were common war tactics. But conditions have changed markedly, and our central approach hasn’t kept pace. Agricultural technologies based on diesel-burning equipment and ammonia-based synthetic fertilizers have resulted in bloated yields, as Vaclav Smil writes.

Nick Reding describes how consolidation in the ag processing industry has cut wages so low, workers can only make rent by taking double shifts. Such marathon hours are often only possible when workers supplement themselves with illegal amphetamines. Though I broadly support drug legalization, amphetamines are so destructive that even I prefer they remain illegal. However, workers use them for one basic reason: to keep working, and get paid.

Nor are these outcomes unexpected. As Greg Grandin writes, President Clinton knew that subsidized American crops were artificially abundant and cheap. Before NAFTA went into force, he authorized what was, until then, the largest increase in Border Patrol manpower ever. Clinton knew that lifting trade barriers on subsidized American agriculture would cause food to hit Mexican markets below the cost of growing.

And he was right. Rural poverty in Mexico’s agrarian south quickly exceeded 70%, forcing workers, mostly men, to abandon ancestral farms and go anywhere that work existed. Something similar happened when Clinton forced Haitian President Jean-Bertrand Aristide to sign a free-trade agreement as a condition of American involvement in deposing Haiti’s illegal coup. Now, Mexican and Haitian workers make up the largest shares of America’s undocumented population.

Pigs don't live in pens anymore; this is where your pork comes from

Numerous White Americans remain invested in farming and agriculture, but primarily as owners or live-in bosses. Because much industrialized agriculture uses machine labor, full-time farmhands usually aren’t necessary. Workers remain necessary during planting and harvest, but these aren’t full-time positions, and the seasonal work comes on terms few White workers would accept. It mostly falls to migrants, many of them undocumented.

That brings us full-circle to the meat processing plants which began this essay. Before 1990, meat processing was considered semi-skilled labor. The meatpackers in Upton Sinclair’s propaganda novel The Jungle were mostly White, first- or second-generation European immigrants. But as Nick Reding describes, meatpacking industry consolidation after 1990 drove wages so low that workers with kids and mortgages can’t afford those jobs anymore.

Currently, America enjoys the cheapest food in world history; per George Pyle, most Americans pay more for packaging than for food at the supermarket. But food is historically cheap because it requires undocumented workers pulling abusive hours in Spartan conditions to plant, harvest, and process it. Workers with legal rights would complain to the NLRB under such conditions; undocumented workers have nowhere to complain.

Eyal Press claims that killing floor workers are among America’s most despised, doing work which consumers demand, but which offends our morals. We expect faceless strangers to kill, dress, and package our meat. Similar problems abound in related fields. Tom Russell notes that the Trump Administration wants a border wall built in regions where only Mexican migrants have the skills necessary for such epic construction.

Anecdotes of supervisors demanding long hours and dangerous work from meatpackers are legion. These demands come with the threat, either explicit or implicit, that management will call La Migra if workers don’t perform. But like a nuclear warhead, this threat works only when unrealized. Once you drop your atomic bomb, literal or metaphorical, it’s expended, gone forever. And management is left with a vacant killing floor.

Donald Trump heard the threats of calling Immigration, and instead of recognizing them for the rhetorical device they were, he believed them. He authorized his administration to perform massive round-ups that look good on right-wing cable TV, but undercut employers’ labor pool. If this doesn’t stop, agriculture employers will have to start paying workers what they’re worth—and you’ll see it in your grocery bill.

Thursday, January 30, 2025

Living in the Wallace & Gromit Economy

Wallace unleashes his newest invention, NORBOT, on his hapless pooch Gromit
in the new film Vengeance Most Fowl, now on Netflix

I’ve been a fan of Nick Park’s Wallace & Gromit films since I first discovered the original short films on grainy, probably bootleg VHS in the 1990s. The humor operates on the same principle as Mr. Bean or Red Dwarf: a well-meaning but incompetent protagonist bumbles into situations far over his head. Wallace, Bean, or Rimmer are momentarily embarrassed, but consistently come out ahead, without really learning anything.

The films present Wallace as a garage inventor and shade-tree mechanic. Though the first short film has him successfully build a moon rocket in a weekend, subsequent films consistently harp on the same theme, that Wallace’s inventions create more problems than they solve. They require added steps, break down frequently, get sabotaged by rascally varmints, and otherwise create needless kerfuffles. All just to less efficiently butter his breakfast toast.

Though that theme runs through nearly every film, short or long, I don’t recall it looming as large as in the latest entry, Vengeance Most Fowl. Throughout Act One, Gromit, the wordless dog character who’s secretly the brains behind the operation, keeps indulging Wallace’s invention mania. However, he longs to complete his necessary tasks and switch over to the activities which give his life meaning: gardening and knitting.

Wallace, however, persistently misunderstands Gromit’s need for meaningful work. He sees both gardening and knitting as repetitive work, which automation can eliminate. Therefore he introduces his newest invention, NORBOT, a self-actuating garden gnome that literally takes jobs right out of Gromit’s hands. Though wordless, Gromit’s Claymation facial expressions make clear the disgust he feels without tasks to occupy his hands and brain.

Thing is, I understand, somewhat, Wallace’s motivation. For years, advocates of Fully Automated Luxury Space Communism have claimed that technology will render work obsolete a week from next Tuesday, and we’ll have limitless free time to… well, to do whatever. More recently, TechBro types have extolled what they falsely call “Artificial Intelligence” to take writing, music, and art away from the nerds by strictly automating it.

Such advocates see work as burdensome, something to outsource. Socialists have historically considered work as something imposed by the economic order, something we can abandon because our high-tech do-funnies will absorb the tedium. TechBros, by contrast, see workers and their jobs as an undesirable sunk cost that they’d rather abandon. Either way, work becomes something to abolish, replacing ordinary humans with machines, computers, and heuristics.

NORBOT represents only the comical reductio ad absurdum of this mentality. It snatches the pruning shears from Gromit’s paws and, in mere seconds, transforms his lush English garden into a topiary extravaganza completely devoid of character. It subsequently steals Gromit’s yarn and knits Wallace another outfit exactly like the one he always wears. NORBOT works fast, cheap, and efficiently, but without personality or meaning.

Socialist writer Barbara Garson admits she thought the capitalist class forced workers to work. Only after visiting workplaces and watching the ways employees extract meaning from standardized work, did she realize that work said something about workers’ souls. People don’t work because overseers and debt collectors force it. They work because what we do with our hands, what we create with our brains, defines who we are.

Economist John C. Médaille similarly observes that, if you watch how people spend their free time, it frequently resembles work. Left to their own devices, people might grow vegetables, build Shaker furniture, write novels, perform home improvement, rebuild classic cars, or paint. Although some people certainly drink beer and watch television, complete forfeitures of experience, most people, given the opportunity, seek work to define themselves.

To a limited extent, advancing technology has made such meaning easier to create. Inventions like the steel plow and combine harvester meant that growing crops required fewer workers. In former days, most peasants farmed from sheer necessity. Now, most people can choose whether they want to cultivate the earth, or whether they’d rather make meaning elsewhere. Therefore I’m no absolute Luddite, and embrace technology to a point.

However, I’d contend we’ve surpassed that point. Early Twentieth Century inventions made work more productive, and homemaking more efficient. However, as Research and Development has superseded invention, most “new” technologies simply complexify existing machines. I struggle to imagine any technology that’s improved our lives in the last thirty years. Made us more productive? Sure. But happier, healthier, better developed? I got nothing.

Watching Gromit get his hobbies stolen, I felt the pang of familiarity. We’re all watching capitalists extract meaning from our lives, sometimes without malice. We’re all Gromit now.

Friday, July 12, 2024

The Perils of Making Work Go Away

As an avid cyclist, I’ve heard the semi-comical stereotypes. We’re self-righteous, have no personality, and wear ridiculous spandex clothes. But perhaps the most persistent stereotype is cyclists’ casual disdain for stop signs. Which, in fairness, is real. I’ve been bawled out several times for pulling an “Idaho stop” through a clearly posted stop sign—usually by a motorist who just rolled through the same intersection.

Consider, though, why cyclists might treat stops flippantly. First, we aren’t going particularly fast. Only elite cyclists, riding the most high-tech bicycles, can achieve the 35 MPH speed limit common on American residential roads, much less 45 on main roads or 65 on highways. We’re hardly caroming at breakneck speeds, as motorists often do. But the biggest reason is even simpler: once we’ve fully stopped, getting rolling again is serious work.

Once a bicycle achieves a standing stop, nothing makes it go again except the cyclist’s bodily effort. While motorists heckle me from inside their hermetically sealed capsules, letting the engine do the work, I have only my limbs to make the bike move again. Only when I drove a stick-shift pickup, enduring the nuisance of constant upshifting, did I discover something comparable behind the wheel.

Of course, driving a car doesn’t make the effort of starting from a standing stop go away. Getting that mass of metal rolling actually requires far more work than getting my bicycle moving, even for the entitled motorist shouting from the comfort of his climate-controlled, automatic-transmission family sedan. However, where I invest the personal effort to move my bicycle, the motorist delegates the effort to another party: the car’s engine. The work still happens; somebody else just does it.

The more I look, the more I see this pattern recurring throughout society and history. The desire to offload tedious labor has dominated human development. I’ve read speculations that humans invented cooking, in part, to escape the relentless tedium of chewing raw food. Later, humans tamed horses because riding was easier than walking; our ancestors likewise invented sails because sailing was easier than rowing (and because water travel is easier than overland travel).

Early human technologies involved harnessing animal power. Ox-drawn carts and horse-drawn carriages eliminated the burden of walking large distances, an important advance in ages when twenty miles was a relentless slog. After hand sowing gave way to better methods in early agriculture, the major breakthroughs involved animal-drawn technology: Jethro Tull’s seed drill, and, over a century later, John Deere’s steel plow, both pulled by draft animals.

Then there are attempts to harness the elements. Chinese and Greek inventors separately perfected the water wheel to grind grain; the technology, with incremental changes, remains common in Amish and poor African communities. This technology wasn’t insignificant, either: Jack Kelly describes the DuPont family harnessing it to manufacture America’s first commercial-grade gunpowder, inaugurating a chemical manufacturing legacy that persists today.

And, of course, slavery existed: the ultimate effort to offload boring labor.

The technologies of the Industrial Age and after merely continue this process. Internal combustion engines overtook horses, and nuclear power may soon overtake coal, but the motivation remains unchanged. Then there are the more pessimistic manifestations. Generative AI basically promises to enable everybody to write books, paint portraits, and compose music, without the boredom of learning and perfecting the skills.

Never let anyone say I don’t understand the appeal of offloading labor. I love having central heat without having to bank the fire overnight; I enjoy bunking off on vacation without having to walk over hills and across rivers. But increased ease remains pricey. It’s impossible to have central heat without global warming, just as it’s impossible for me to vacation in Missouri without the soul-sucking ennui of the morning commute.

Also consider what losses we suffer. When I commute to work, or toddle off to Missouri along the Interstate, every drive becomes identical, dominated by repetitive buildings and dun-colored landscapes. On my bicycle, I see things drivers ignore, like subtle outbuildings, wide-eyed pedestrians, and—sadly—overlooked roadkill. Likewise, generative AI “artists” can mass-produce content, but they never develop their unique, inimitable style.

I certainly enjoy modern conveniences, like nutritionally diverse food, pharmaceuticals, and the internet. But when I bike to work, the experience is more complex and nuanced than when I drive. Does anybody really enjoy the feeling of road hypnosis? Is anybody particularly moved by an AI painting? Offloading tedious labor onto others makes life easier, certainly. But it also robs life of meaning.

Monday, February 21, 2022

But Someone’s Gotta Do It

Eyal Press, Dirty Work: Essential Jobs and the Hidden Toll of Inequality In America

Few Americans have seen an untrimmed chicken carcass with their own eyes, but we know such things exist. Somebody, somewhere, turns living livestock into the styrofoam-backed meats Americans consume in such quantities. Similarly, somebody has to push the “kill” button on unmanned military drones, someone has to house the prisoners created by America’s “tough on crime” policies, someone has to extract hydrocarbons from the mine. We sure won’t do it.

Journalist and NYU sociologist Eyal Press borrows the term “Dirty Work” from Everett Hughes, whose post-World War II research examined how citizens of totalitarian nations dealt with shared culpability for government atrocities. Ordinary citizens, Hughes determined, created a mental barrier. Some work was necessary, but innately dirty. Therefore a designated underclass handled the distasteful responsibility of, say, conducting pogroms and prosecuting wars of conquest.

Dirty Work, Press emphasizes early, is different from dirty jobs. Some jobs, like construction, agriculture, or infrastructure maintenance, are widely regarded as somewhat crummy. But Dirty Work is morally tainted, and that moral condemnation transfers onto those who do it. Yet the work remains necessary, because the population wants it done. Workers performing Dirty Work become agents of society’s collective id, and we reward them with our disgust.

Press focuses mainly on three categories of Dirty Work in America: prison staffers, military drone pilots, and meatpackers. Combining a broad statistical survey of each field with close, intimate stories of people doing the work, he guides readers through these fields’ intricacies. He unpacks the sorts of people who accept, even seek, these jobs. And he demonstrates how they express our collective will, even as we publicly disown them.

Prisons suffer a paradoxical push-pull influence. Like schools, Americans see prisons as necessary means of enforcing laws and creating order. Candidates campaign on pledges to build, and fill, new prisons. But also like schools, prisons are first on the chopping block when budgets get cut. This becomes especially dangerous because, with the mass closure of state-funded asylums in the 1970s, prisons are now America’s leading warehouse for the mentally ill.

Eyal Press

America’s military became reliant on drones because we still expect to maintain global military dominion, but distrust committing troops. Drone pilots, working from bases in places like west Texas or northern California, save the Administration the PR headache of deployments, while keeping America active in stopping militants and terrorists. But because of their technology, individual drone pilots kill far more targets than soldiers ever could. And many suffer catastrophic PTSD.

Industrial meatpacking plants satisfy America’s appetite for cheap, plentiful meat, but at great cost. Animal rights activists love showing footage of how livestock suffer on the killing floors. But plant workers are inordinately likely to be maimed, even killed, while those who emerge physically unhurt suffer PTSD symptoms, including nightmares and hair loss. Like prison guards and drone pilots, they’re despised in their communities, underpaid, and punished for speaking out.

These workers have significant commonalities. Ruralism, for one: prisons, drone bases, and meatpacking plants are built well away from population centers. Prison workers and today’s military are also likely to come from rural areas, where often, these secure government jobs are the only reliable source of income. The workers, and their work, are easily ignored because they remain isolated on the periphery of “polite” American society.

They’re also subject to bipartisan contempt. Whenever whistleblowers in these industries come forward, conservatives ridicule them as crybabies who don’t appreciate the opportunities these jobs offer. But progressives, who historically attribute poverty and crime to systemic, rather than individual, causes, make exceptions for workers doing Dirty Work. Progressives hold these workers individually, not their employers or the laws which regulate them, culpable for their industries’ worst excesses.

Press delves into other Dirty Work, like immigration enforcement and hydrocarbon mining, though in less depth. For comparison, Press includes a chapter on “dirty tech,” high-tech innovations that make society less free, like Google making alliances with Communist China, or Facebook selling personal data to Cambridge Analytica. But, Press notes, dirty tech workers aren’t personally tainted. They’re free to quit with minimal penalties, and seldom held individually responsible.

Overall, Press describes a stratified economy where despised workers get thrust into despised work. This exposé channels historically respected journalists, from Upton Sinclair to Eric Schlosser, who have revealed the two-tiered rot in American society. The product is chilling. And maybe, this time, we’ll learn enough from the lessons Press offers that we won’t repeat these mistakes in the future. Though admittedly, the precedent isn’t good.

Saturday, June 5, 2021

Hard At Work in Post-Labor America

It’s that time in the crisis cycle again: time for the self-righteous and wealthy to remind everyone how disconnected they really are. As America re-opens, and we have to make painful choices about how to rebuild the wreckage of our former economy, some people start flaunting their privilege by whining about the injustice of it all. I just got accustomed to lockdown, they whimper; why start back up again now?

First, cartoonist and essayist Tim Kreider imposed upon Atlantic readers with “I’m Not Scared to Reenter Society. I’m Just Not Sure I Want To.” Despite its promising title, it never addresses its implied theme, whether our prior corporatist-hellscape lives are worth returning to; Kreider instead mewls about how a lifestyle of “solitude, idleness, and nihilism” has become more appealing than work. Kreider needs less Pfizer, more Prozac.

Then a survey report emerged, saying that “64% of workers would pass up a $30k raise to work from home.” As big-tech and financial-services companies, which let millions of workers telecommute during the pandemic, desire a return to normality, many workers aren’t interested. They prefer work-from-home conditions, which allow them the liberty to task-shift. Bored writing code or handling customer-service emails? Take fifteen to do laundry or heat a frozen burrito!

Both these voices sound superficially familiar. Who hasn’t yearned, periodically, to shirk work and malinger indoors, watching Netflix? And anybody who’s done white-collar work knows the eight-hour jive is moral rather than practical; we can’t focus that long on tasks for somebody else’s reward. Both the desire to jettison employment altogether, and the desire to work under home conditions, suggest a desire to refocus employment on workers, not bosses.

But.

Both stories bespeak mostly unexamined levels of privilege. Tim Kreider admits early that he doesn’t require an income, particularly; he crashes with friends. And the survey locates that supposed $30K refusal mainly among “Zillow, Twitter, and Microsoft employees.” We aren’t talking about people making sandwiches or scrubbing toilets, work that can’t be done remotely. Millions of Americans kept working through the pandemic, or didn’t get paid.

While Kreider didn’t need to work, and Microsoft restructured its work requirements, I and millions of others fell into a third category. Our jobs, declared “essential” for America’s thriving economy, kept going, and we needed to show up. These jobs weren’t without price, either; many low-paying service workplaces became superspreader sites. The media called grocery clerks “heroes” for doing their jobs, like they had a choice.

My construction labor was considered “essential.” But the things I worked to achieve, participation in community arts or amateur sports or just hanging out with friends, became suddenly lethal. Without a family, I moved from my apartment to my job and back like a convict on work-release. I had neither the luxury for chrysalis-like oblivion, like Kreider, nor liberty to schedule my own day, like the statistical Microsoft workers.

This leaves me with a seemingly inevitable conclusion. I don’t need work tweaked, neither through a raise (though one would be nice), nor through telecommuting options. Rather, I question the structure of employment altogether. These sixteen months have made literal what Marx and Lenin meant metaphorically: that the ownership economy steals all meaning from work, while the rewards go to those who merely own things.

Fast-food franchisees complain that workers refuse employment because the government offers a pitiful stipend. ($300 per week amortizes to $15,600 per year—or approximately what Elon Musk makes every three seconds.) Others claim workers will return when they have other perquisites, like affordable child care, universal health insurance, and paid family leave. All these arguments assume people want rewards which have a detectable price tag.

But I’d contend workers want some meaningful connection between work and reward. Not a specific reward; they want control. The pandemic gave Tim Kreider the justification to indulge the moody self-abnegation inherent in his cartoons. It gave the abstract tech-industry workers freedom from being chained to the desk. It gave hourly workers freedom from… having anything to do or anywhere to go after hours. Not exactly the best trade-off for some.

These stories, and the traction they’ve received in the last several days, reflect the relative privilege held by the American punditocracy. The fact that cube farmers don’t want to return to the cube isn’t news; did they ever want to be there? Yet the existential rootlessness the working class has endured, amplified over the last sixteen months, remains ignored. Unless, of course, it stops editors from getting their three-martini lunch.

Sunday, May 30, 2021

Freelance Restaurant Workers: a Modest Proposal

Promo still from the award-winning 2007 indie film Waitress

As the “labor shortage” drags on, both sides blaming each other for America’s struggling service industry, maybe it’s time to reevaluate the market. Our service industry remains beholden to a 19th-Century model of employment, where workers are obligated to management, and management in turn dispenses pay, benefits, and task assignments. But maybe Americans’ growing unwillingness to accept that model at today’s pay scale suggests that model has outlived its usefulness.

Sure, I know, millions of Americans will insist it’s the pay scale that’s shuffling on, zombie-like, beyond its productive life expectancy. Our minimum wage hasn’t changed since 2009, while rent has increased by over half. And for tipped workers—meaning mostly food-service workers—the minimum wage hasn’t shifted since 1991, during which time rents have nearly tripled. Okay, by that narrow, prescriptivist model, our wage structure is egregiously out-of-date.

But clearly, with today’s governmental structure, proposals to update the wage base are a dead letter. Three Administrations, representing both major parties, have essentially shrugged and admitted their options are few: nobody will support higher wages for America’s underserved, they say, so let’s not even try. The Biden Administration campaigned explicitly on improving pay for working Americans, then surrendered three months into their term. Stop wishing on a star.

It’s time to admit: tipped staff don’t need management anymore. Businesses which hire tipped staff, which again means mostly restaurants, should stop hiring staff altogether, and staff should stop applying for these jobs, like Oliver Twist with his bowl, begging: “Please, sir.” If waitstaff’s absolute wage floor hasn’t increased since my high school days, they clearly don’t need wages at all. They’re already living on tips; why stop there?

Since over two-thirds of tipped workers’ pay has to come from tips just to equal the federal minimum wage, a number that’s already absurdly low, why not the rest? Most food-service work is wholly standardized: the numbering arrangement of tables, the digital requirements for entering orders, the dress and behavior codes. Unless your waitstaff wears company-branded clothing, there’s little to distinguish the crew at one restaurant from nearly any other in America.

Restaurateurs should simply maintain a bulletin board with available tables. Aspiring waitstaff simply arrive during peak hours, claim as many or as few tables as they feel comfortable serving, and voila! The staffing problem resolves itself, because waitstaff no longer work for the restaurant. They work for their individual respective customers, get paid in tips, and keep everything they make. Restaurants are off the hook altogether.

Edmonton, Alberta-based chef Serge Belair at work

Owners should embrace this change, because it means they needn’t hire, train, or schedule staff anymore. Workers simply arrive, and work until they believe they’ve earned enough. Workers should favor this because they needn’t feign any particular loyalty to restaurants that provide lousy pay and benefits. Letting waitstaff go completely freelance frees owners and workers alike from the burdens which employment (as opposed to work) brings.

Moreover, freelance status will give waitstaff more authority over the “I don’t tip” clientele. Under current conditions, some people excuse their refusal to tip by saying “it’s the restaurant’s responsibility to pay workers.” If servers go completely freelance, then customers who refuse to pay their servers have literally stolen services. Customers who don’t pay waitstaff under the freelance model would be thieves, exactly like customers who skip out on their tab now.

I can anticipate the likely counterarguments arising. What if not enough waitstaff want to work during lucrative meal rushes, and restaurants find themselves pleading for help? What if workers only want to freelance at posh restaurants where customers are subdued and respectful? For every objection, I have the same response: that sounds like a “you problem,” and you should work to cultivate a more polite, better-paying customer base.

Indeed, changing the service industry’s worker-employer model would, arguably, expose the roots of problems that make such work undesirable now. If some minority of customers behaves boorishly, making work unbearably nasty, maybe ask yourself what you’re doing that rewards such behavior. And if workers are so reluctant to arrive during peak hours that you can’t plan ahead effectively, we return to the same solution everyone offers: pay better.

I’ve had the pleasure of knowing many waiters, bartenders, and coffee baristas; all tell warm tales of regular customers with whom they became friends. Restaurants, coffee shops, and bars often become the beating hearts of local communities. Yet people leave the industry with disheartening frequency, usually for one reason: bad relationships with management. We can solve this problem by removing management completely.

Friday, December 11, 2020

Don't Follow Your Bliss

“Ball Lightning in Space” (oil on canvas, 2007)
by Kevin L Nenstiel

“What would you do with your life, if you didn’t have to worry about making a living?” I’ve long forgotten who first asked me this question. Apparently, it’s a common question that career counselors ask youth and young adults trying to decide what they want their lives to be about. At first blush, it seems reasonable: If you’d get paid for doing what you’d rather spend time doing anyway, then please, make it your career.

Work is central to human identity. We organize our lives according to the labors we undertake, either voluntarily or for pay. As economist John C. Médaille writes, neoliberal economics has historically assumed humans must be coerced to work and be productive; yet when you consider what most people do for recreational activities, like gardening or woodworking or art, these pastimes sure resemble work. If we could turn passion into paychecks, isn’t this a desirable goal?

Yet I’ve come to question its validity. Certainly, I’d love to get paid for writing. I’ve spent years cultivating the skills of telling a good story, creating characters whom my audience will feel something for, and putting them on paths toward accomplishing goals. Yet these skills have never translated into steady income. My writings keep bouncing back, unpublished and unpaid. Some writers’ blogs provide a decent side income, but mine’s barely bought one dinner out.

Indeed, sideline activities and hobbies arguably speak volumes about a person’s core values. The love of earth and harmony found in gardening, for example, reflects a commitment to cultivating closeness to one’s organic roots. My writing arises from a passion for understanding other people, and a desire to communicate. Sometimes these values are counterintuitive. The repetition of gardening, or the isolation of writing, conceals the spontaneity and intimacy which both hobbies create, down the road.

These values run counter to paychecks, which have separate ethics. Work isn’t something people do to satisfy inner hunger or give their lives definition; we work because work is a moral imperative. The demand that poor people work harder, smarter, or better—a demand embodied by “work requirements” on EBT and other safety nets—demonstrates that work is fundamentally moral, not useful. We’d rather let the poor starve, than protect them without “earning it” first.

The late Anglo-American anthropologist David Graeber wrote, shortly before his passing, that we cannot separate Capitalism’s moral imperative for work from the current devastation facing the Earth. As a construction worker, I’ve witnessed it. We build buildings nobody particularly wants, including new retail spaces while existing ones go unused. Our sites are malodorous pits of diesel exhaust and blowing dust. Yet we keep doing it, because if we ever stop, we won’t get paid.

A bench I made from old packing pallets, and gifted to a friend

I’ve been accused of being an old-school Communist because I disdain Capitalism’s implicit moral imperatives, like work. But Communism is no better. The ugly cities, scarred land, and blackened skies left by the retreating Soviet Bloc revealed a work morality that, like Capitalism, cared little for the damage left behind. Capitalism and Communism both view human life purely instrumentally: that is, if you aren’t employed, you aren’t contributing, and therefore your life has little meaning.

I repeat, humans are naturally drawn to work. If they can’t spend free time working, they’ll replace work with forms of self-abnegation, like passively watching TV or getting drunk. Both these activities serve the same purpose, to numb the human soul and abolish the dissatisfaction we feel if we can’t work. Try sitting alone some evening, sober, and do nothing. Don’t even watch TV or noodle around on your smartphone. Bet you can’t do it.

That’s why I can’t conscientiously tell people to turn their passions into paychecks. Because paychecks aren’t, fundamentally, about work; they’re about an economic morality of self-destruction. Work, meaning real work and not “gainful employment,” should ennoble people and fulfill the drive to improve our world. Employment, however, is a sinkhole of value, blighting the Earth and exhausting the workers who do it. Don’t let your passion turn into that. Don’t drain it of all meaning.

Please don’t misunderstand. If you’re a paid professional novelist, congratulations. If your hobby farm pays for itself, feeds your family, and lets you save for your kids’ college fund, keep at it. If you make a living doing what you love, or will soon, well done. But don’t let the economy’s moral imperatives turn what you love into a toxic black hole. Because once your paycheck starts ruling what you love, it’ll rule you, too.

Monday, October 26, 2020

Calvinism, America, and Work

John Calvin (etching by Konrad Myer)

I didn’t do one goddamn thing all weekend.

I had sincere intentions this weekend. I planned to get so much writing done, and perform some household repairs, and clean the living room, and maybe get my unused second bedroom into a condition where I could comfortably have guests over. Then I disregarded my alarm both days, lay in bed, read books, watched a YouTube church service on Sunday, and ordered pizza. By contemporary standards, it was a wasted weekend.

As a Christian, I struggle with conflicting impulses. Remembering the Sabbath and keeping it holy is so important that God wrote it among his Top Ten, alongside not killing and not committing adultery. Refraining from work for one day out of seven seems pretty important. Even non-theistic regimes, like Revolutionary France and Soviet Russia, underscored this, both rushing to reinstitute lite-beer Sabbaths when overwork made workers widely sick.

However, Christianity also strictly condemns idleness, considering it a form of impiety. Though we’re enjoined to reserve one day in seven for doing nothing, the other six are ordered to remain cluttered with effort, an unending struggle to demonstrate God’s creative impulse made manifest in ourselves. Christian scripture brims with exhortations to keep busy, work hard, and remain ever productive. Sitting down and catching your breath is considered wasteful.

This contradiction becomes most manifest in Calvinism. In distinction from the preceding medieval Roman tradition, John Calvin insisted Christians had an imperative from God to constantly improve ourselves and improve our worlds. This improvement meant preaching the Gospel, certainly (though Calvin was squishy on actually feeding the hungry), but also human industriousness. Calvin thought God demanded we constantly work, build, and manufacture.

Although a dominant branch of conservative thinking preaches America as a Christian nation, we don’t commonly think of America as structurally religious. Yet the Calvinist tradition comes into America through our secular worship of the New England Puritans. This combination of religious lip-service and secular myth-making creates a distinct goulash of politics, religion, and economy which we, in America, call “capitalism.”

I’m not unique in drawing this conclusion. German sociologist Max Weber, writing in the early Twentieth Century, saw an almost straight-line connection between Calvinism and capitalism. America’s economic structure couldn’t survive if a critical mass of Americans didn’t believe work makes people morally good. Weber asserted that Calvinist Christianity was absolutely necessary for a thriving capitalist economy.

Max Weber
Thus, by blaming myself for not working harder this weekend, I’m doing the bidding of my capitalist clerics, apparently.

I must note, however, that this Calvinist ethic has come unmoored from its religious roots. I’m Lutheran myself, a tradition that doesn’t share Calvin’s insistence on work as a moral imperative. I have friends from many religious traditions, and no religious tradition, who also insist on this idea. One agnostic friend says aloud that seven-day work weeks are mandatory if you mean to do a good job; days off only inculcate laziness and slovenly work ethic.

Thus, because of secular myth-making, Calvinism has escaped the religious confines which cultivated it, and has become simply “American.” Peons, like myself, absolutely demand a “work ethic” from ourselves which involves complete self-abnegation, and deriving our identity from our employment. We don’t become whole persons through relationships, family, or community anymore; we become human when some institution grants us a paycheck.

Since joining the blue-collar workforce, I’ve noticed this widely. Though not all my co-workers are religiously observant, the language of Christianity nevertheless permeates workplace discussions. The more a person’s workplace language contains references to God, the more resistant they are to organizations like trade unions. Going it alone makes people morally good, and therefore sufferings, like poor pay and scanty benefits, are a necessary sacrifice.

Many Americans have internalized this ethic. We believe doing something with our time, something upon which we can slap a price tag, is mandatory and constant. Spending a well-earned weekend putting our feet up, chatting with friends, and reading books, is time wasted, time which we’ll never recover and successfully monetize. And I, a lowly worker, sit around punishing myself for catching my breath.

And I’m still working. I can only imagine the internal strife plaguing Americans sidelined by COVID-19, what struggles they must endure because the pestilence continues to devalue work. After a weekend spent doing what I love, rather than what makes money, I feel lost, alienated, and inconsiderate. If I punish myself thus for sitting around for two days, how do people handle being sidelined for six months?

Friday, January 31, 2020

The False Promise of “Just Quit”


I recently made the Technological Age’s most common mistake: I let myself get dragged into an argument in the comments section. Specifically, I let some asshat, holding forth about the famously dismal wages and working conditions for American schoolteachers, try and prove that things aren’t that bad, and where they are, it’s because of common economic factors. In short, this person insisted, stop complaining about teaching conditions; either work or quit.

That latter option dominated this person’s argument. I’ve heard it frequently on my co-workers’ right-wing talk radio, too: if you dislike your job’s wages or working conditions, quit and get another one. They use one phrase repeatedly: “just quit.” Like they believe finding work consists of leaving one job and walking to another. Especially for skilled employment like teaching, the application process can require months of negotiation, besides the expense of relocating.

The average starting schoolteacher’s salary in Nebraska, where I live, is $34,465 per year, according to the National Education Association. That’s about $5000 below the national average—but, considering Nebraska’s lower cost of living, that’s not bad. I wouldn’t want to raise children on that salary, certainly, but for a beginner, it’s a reasonable middle-class wage. But if you think that’s sustainable, try getting a home loan. Please.

My counter-arguer insists that this wage breaks down to about $24 per hour, significantly above the state average. This seemed weird to me, until I worked his math backwards. To achieve this absurdly high number, this person took the average starting salary, figured 180 instructional days per school year (the standard in most jurisdictions), and broke each day into eight hours. This figures to about $23.93 per hour.

If you’re following this, my counter-arguer assumes schoolteachers only work on instructional days, and only work eight hours. If you’ve ever taught, or been teacher-adjacent, you know this makes no sense whatsoever. Teachers spend countless long nights, weekends, and non-instructional days on grading and lesson prep, to say nothing of requirements for continuing education, departmental functions, and bureaucratic “in-service” days. Teachers don’t stop when students go home.
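If you want to check that reverse-engineered figure yourself, the arithmetic is trivial to reproduce. A quick sketch: the commenter’s inputs come straight from his argument, while the “realistic” workload below (50-hour weeks across a 44-week working year) is purely my own hypothetical assumption, not NEA data.

```python
# The commenter's math: salary spread over only instructional
# days, at only eight hours each.
salary = 34465            # NEA average starting salary, Nebraska
instructional_days = 180  # standard school year in most jurisdictions

naive_hourly = salary / (instructional_days * 8)
print(f"${naive_hourly:.2f}/hr")  # prints $23.93/hr

# A hypothetical but more realistic workload: 50-hour weeks,
# 44 weeks, counting grading, lesson prep, and in-service days.
assumed_hours = 50 * 44
realistic_hourly = salary / assumed_hours
print(f"${realistic_hourly:.2f}/hr")  # prints $15.67/hr
```

Even under those made-up-but-charitable hours, the hourly figure lands well below the commenter’s $24 claim.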



Faced with this failure of reasoning, my counter-arguer retreated into high-school economics, insisting that “supply and demand” devalues teacher wages. Basic economics teaches that scarcity makes something more valuable, while oversupply drives value down. Teachers’ wages suck, in other words, because there are too many teachers. This might make sense if teachers weren’t in such short supply that school districts are actively headhunting qualified teachers from other states.

One could extend this logic to other industries. Construction, the field where I’m currently employed, reported a nationwide worker shortage of 300,000 personnel in June 2019. This especially applies to skilled laborers, such as carpenters and ironworkers. Yet wages remain low: one co-worker, an experienced finish carpenter, can’t find work above $18 per hour without moving across state lines. He and his wife together are barely making house payments.

Two factors drive this reality. First, while there’s arguably an undersupply of skilled workers, probably caused by American education policy favoring universities over trade schools, there’s also an undersupply of employers. My co-worker, despite being more skilled than me, cannot find employers willing to pay better, at least locally, because our area has too few contractors employing finish carpenters. While I won’t accuse those contractors of price-fixing, they certainly have little incentive to pay better with so little competition.

Second, there’s an economic truth long acknowledged in certain circles: the more useful a job is for general society, the worse it generally pays. Currently, even after the 2008 economic meltdown, financiers and investors generally have the highest income, while laborers, educators, and service providers—the people who create genuine public welfare and social stability—have seen their wages stagnate for two generations.



This reality comes through most clearly in anything related to food. “Tipped workers,” a category dominated by food service workers, haven’t seen their minimum wage increased since 1991. And that in an industry with America’s highest rates of wage theft. Meanwhile, America’s farm debt has reached levels unseen since the 1980s. We nominally laud American farmers, and encourage underemployed youth to get jobs waiting tables, then pay both fields starvation wages.

So yeah, “just quit.” Move to another industry… if you can. Most people can’t; most teachers, carpenters, and waiters will remain teachers, carpenters, and waiters. Career changes, presented so flippantly, are exceedingly difficult. Lobbying for better conditions isn’t whining; working people have a right to demand control over their work.

Monday, October 14, 2019

Capitalism and the Common Cold

My friend Sarah caught an upper respiratory infection off a coworker recently. Like millions of Americans, this coworker, “Rachel,” felt compelled to ignore her illness, go to work, and potentially expose everyone else. To other workers, it’s probably a common cold—a nuisance, admittedly, but nothing catastrophic. But owing to asthma and a systemic hypermobility-related condition, Sarah has limited ability to fight routine infections. Colds, for her, often turn into bronchitis, and she’s out for weeks.

This got me thinking about the times I’ve bucked medical advice, chugged Day-Quil, and gone in sick anyway. Like millions of hourly workers, I don’t have compensated sick days; if I don’t clock in, I don’t get paid. And believe me, I’ve tried forgoing rent and groceries, with little success. Unless I’m too impaired to move, I have no choice but to ignore my illness and work. The same holds, sadly, for most nurses, fry cooks, and other low-paid workers in highly transmissible fields.

During my factory days, one of only two times I got a stern talking-to about my work ethic involved attendance. I breathed a lungful of dust off some chemically treated paper, and spent a week flat on my back. My supervisor called me into a conference room and informed me that, notwithstanding my doctor’s note from the company clinic, I had missed what they considered a substantial amount of time, and was now officially on warning.

(My other stern talking-to involved getting angry at my supervisor, throwing down my safety gloves, and walking out. That’s a discussion for another time.)

My supervisor warned me that, beyond the pinch I’d inflicted on my company, I had imposed upon my fellow line workers, who needed to offset my absence. Clearly, this warning conveyed, I had a moral obligation to ignore the signals my body sent me and come to work. This was only one of many times when messages from family, school, employers, and others told me that work was more urgent than protecting my bodily health.

Clearly Rachel got the same message, because she even lied to Sarah about how contagious she was. Even while continuing to sneeze on Sarah and other coworkers, Rachel insisted she was past the contagious stage. At this writing, Sarah has been housebound for a week, hooked to her nebulizer and struggling to feed herself. All because Rachel felt the social cue to not spread her cold mattered less than the moral imperative to keep working.

I cannot separate this morality from the capitalist ethic. Like me, you’ve probably spent your life bombarded by messages that work makes us happy, productive, and well-socialized members of society. Conversely, staying home, even when wracked with wet phlegmy coughs, makes us weak, lazy, and otherwise morally diminished. Our bodies aren’t something to respect and listen to; they’re impediments that need to be silenced so we can become happy contributors to the economy.

(As an aside, Sarah has already written about this experience. We discussed it and tested ideas on one another; while we don’t say exactly the same thing, there are significant overlaps. My take is slightly less first-person.)

But who benefits when we power through and work sick? I certainly don’t; I feel miserable and sluggish, and also feel guilty for my inability to produce at accustomed levels. My employer doesn’t benefit, because he must pay standard wages for diminished outcomes—indeed, as I can’t rest and recuperate, he must pay more for my illness than if he offered paid sick time. And considering I must pay higher deductibles for off-hours doctor visits, my illness imposes on everyone.

In short, making my continued attendance morally mandatory diminishes everyone’s outcomes. Plus I infect everyone around me, including people who, like Sarah, can’t shrug off a cold. But I keep working, so hey, I benefit the capitalist class, right? I accept the requirement to work and socialize the risk, while my employer privatizes the gains. It’s a distorted morality that literally prioritizes money over individual and public health.

Perhaps you think I’m overstating things, that we don’t really value economic outcomes over health. If so, try telling your employer that hourly workers deserve compensation so they can avoid infecting one another without missing rent. See how your boss reacts with moral outrage. More importantly, see how you feel the gut-clench of wracking guilt before you even speak. That’s the capitalist ethic trying to silence you. Because we’ve made common colds literally immoral.


Also on capitalist morality:
Capitalism, Religion, and the Spoken Word

Wednesday, October 9, 2019

Capitalism, Religion, and the Spoken Word


Every shift at the factory began with our line supervisor reading a sheet of exhortations. She’d begin by belting out, in a voice to beat the machinery: “What are our top three goals?” And we’d respond in unison: “Safety! Quality! Productivity!” The sheet then transitioned into a list of instructions, things like “Check your machines are in good working order before using them,” and “Keep your workspace clean.” Basic stuff, common to most industrial workplaces.

I worked at this factory for months before I realized this sheet wasn’t busywork. By making everybody participate simultaneously, and requiring us to chant parts of the script in unison, management was steering everybody’s thoughts toward the requirements of work at the beginning of the shift. The unified participation forced us to leave outside obligations outside, unify our thoughts, and shift our brain rhythms toward work. We have a word for this. It’s called “liturgy.”

Liturgy is the verbal assertion of what religious people believe. Perhaps it seems silly comparing industrial labor to church, but bear with me. The order of worship in Christian churches; Islam’s five daily prayers; the tightly scripted mealtime recitations on Jewish High Holy Days—all these are liturgy, and they serve to unify everyone involved in one goal. By reciting liturgy together, believers stop being individuals, and become one coalescent body. Many souls become Soul.

Religions encourage this unity because individuals are necessarily arrogant. The untethered mortal frequently becomes an instrument of appetite, consuming and consuming without ever becoming full. Our culture likes the archetype of the nonconformist bohemian, but it’s an ideal very seldom realized; most people, including myself, can’t be trusted as individuals. We need community and the shared experience of others to restrain our animal desires and become completely human; liturgy is one way to achieve that.

Émile Durkheim wrote, clear back in 1912, that liturgy makes participants speak their values aloud, together. It isn’t enough to privately affirm our beliefs, and treasure their truths in our hearts; anybody can do that, but life’s constant strains force us to compromise our values. We’re all occasionally hypocrites. Speaking our values aloud, together, reminds us not only what we believe, but that we don’t struggle alone. Religions with sturdy liturgy see very little apostasy.


Many non-religious groups recognize the unifying power of reasserting what we believe in public, in unison. That’s why public schools require students to recite the Pledge of Allegiance, and why it caused such controversy when football players elected not to stand for the National Anthem. In the eyes of nation and government, refusing to participate in these acts resembled refusing to speak the Apostles’ Creed or sing the Kyrie Eleison in church. Such refusals risk generating widespread national apostasy.

For capitalists to embrace liturgical practice serves two purposes. First, it gets everybody in a work mindset immediately. At the factory, we needed to reset our mental rhythms to the pace of the assembly line, without hesitation. Our shared chant, with its almost Shakespearean cadence, accomplished that. At my current job, we have no such liturgy, and getting started on any meaningful work thus requires thirty minutes of grumbling and fumbling as our brains realign.

Second, capitalist liturgy forces us to accept, on some level, capitalism itself. Sure, not everybody who speaks the Kaddish or the Nicene Creed believes the words, but they at least give some level of assent to the principles, making themselves bearers of the words’ value. Likewise, workplace chants, company songs, and the tradition (most common in Japan) of calisthenics at the top of the shift, make workers leave their identities outside and become, temporarily, Employees.

In other words, workplace liturgies, like religious liturgies, make us subjugate our identities to The Other. Whether that Other is God or Capitalism matters only sub-structurally; both get us to stop being individuals. Structurally, Church and Capitalism bear remarkable similarity. The sub-structural difference still matters: Church calls us to stop being individuals to serve humankind, while Capitalism wants us to serve capitalists. But both structure the demand the same way.

Perhaps that’s what makes Capitalism so difficult to unseat, even as we workers look outside and see our labors making someone else rich. Our conscious minds know we aren’t achieving the promise of Capitalism, but we’ve liturgically committed ourselves to the capitalist ideal. Changing our minds now wouldn’t make us merely non-capitalists; it would make us apostates. Just as leaving religion can be terminally painful, abandoning Capitalism forces us to abandon the words we’ve spoken.

Wednesday, October 2, 2019

Not Gonna Take It Anymore

Eric Blanc, Red State Revolt: The Teachers’ Strike Wave and Working-Class Politics

2018 saw a sudden upsurge in teacher strikes and other labor actions in several American states, mostly states that went heavily for Donald Trump. This strike wave defied multiple theories accepted among the punditocracy: that strikes are outmoded, that the “white working class” is an ideological monolith, and that labor action does no good. What made 2018 special? Can American labor do it again?

NYU sociologist and former high-school teacher Eric Blanc was commissioned by Jacobin magazine to cover the 2018 teachers’ strikes. He focused on three: West Virginia, Oklahoma, and Arizona, the states where multi-day walkouts resulted in significant concessions from conservative governments. What Blanc finds has eye-opening implications for organized labor. But I question how portable these insights are.

Schoolteachers, like other skilled professionals throughout the American economy, accepted austerity as the necessary condition following the 2008 financial crash. But ten years later, despite the putative recovery, their wages, benefits, and working environment remained locked in post-crash conditions. School districts granted waivers to put non-credentialed teachers in front of classrooms. Insurance benefits were being adjusted downward amid a supposedly robust economy.

However, as Blanc observes early, “Economic demands are rarely only economic.” Schools in many states, especially states with historic Republican governments, have been long neglected, with class sizes exceeding what qualified teachers can handle, physical plants in disrepair, and an adversarial relationship between legislatures and teachers. Educators didn’t only strike for improved pay and insurance; they felt the state had denied them the authority to teach.

The action began in West Virginia. Donald Trump won this state with a two-thirds share, and Republicans with an anti-organized-labor stance controlled the statehouse. Admittedly, West Virginia has a longstanding union tradition, dating back to the Coal Wars of the 1890s. It was once a Democratic Party stronghold. But like many formerly Democratic states, West Virginia grew disgusted with Democrats running on center-left promises and governing on right-wing principles.

Blanc provides generous evidence that, since at least Jimmy Carter, Democrats have consistently fielded a lite-beer version of the Republican economic agenda. Both parties have repeatedly cut public education funding, mandated standardized tests written by private contractors, and shifted financial responsibilities onto local communities unprepared for the burden. Teachers’ unions have complied with this trend, apparently on a devil-you-know basis.

Striking teachers in the West Virginia statehouse, 2018 (CNN photo)

West Virginia’s strike combined old-school organization with innovative grassroots action. The state’s teachers were divided among three competing unions (not uncommon in right-to-work states), so coordination had to begin with the membership. While union leaders feared upsetting the apple cart, educators and, importantly, support staff organized online, including on much-despised social media, to create pressure from below. It ultimately worked.

America hadn’t seen a statewide teachers’ strike since 1990. Accepted wisdom for an entire generation held that strikes created bad blood and undermined communications between labor and management— and sometimes, they do. But compliance with authority hadn’t produced any better results, either. When union membership pushed a strike authorization vote, almost ninety percent of West Virginia teachers supported a walkout. The die was cast.

Inspired by West Virginia, teachers in Oklahoma and Arizona elected to strike. But these states learned largely opposite lessons from the experience. Oklahoma gave remarkable power to non-union firebrands who had great energy, but no organizing experience. Importantly, Oklahoma forgot to include support staff in their organizing efforts. Arizona fared better, taking time to lay groundwork for an unprecedented strike action in possibly America’s most Republican state.

Blanc essentially provides an oral history of the three movements. Why did West Virginia and Arizona succeed, while Oklahoma ended in, at best, a split decision? And why did other states with similar grievances, like Kentucky and Colorado, manage only one-day walkouts with merely symbolic effects? The answers largely exist in participants’ testimonies, and in the Facebook groups where grassroots members created momentum.

Blanc also acknowledges significant limitations to the model he describes. Teachers could engage community support because their local connections cross class boundaries in ways that, say, auto workers’ cannot. Blanc admits the strikes avoided addressing race issues, which isn’t insignificant; as Ibram Kendi points out, labor unions have long been bastions of White protectionism. Ignoring race might work for one strike, but isn’t sustainable.

Still, even if Blanc’s account doesn’t provide a blueprint for revitalized American unionism, it points to ways workers can build countervailing power against capitalist might. Teachers broke the taboo against labor stoppages and proved that sheer numbers can reverse intransigence. They didn’t solve everything, certainly. But they proved change is possible.