Tuesday, March 31, 2020

The Poor Side of Armageddon


My jaw hit the floor Sunday night when I learned President Trump had accused medical professionals of stealing resources. Speaking specifically of N-95 respirator masks, he asked rhetorically, “Are they going out the back door?” He went on to postulate: “I think maybe it’s worse than hoarding.” Professional journalists stepped in to say explicitly what Trump only implied, that in-house theft was depleting America’s medical resources.

Because I work ahead, I’d already written Monday’s blog entry, which stated that I’d have to re-evaluate all my assumptions about secular apocalypse. I lavished praise on humanity’s demonstrated generosity during this challenge. “Humans, the broader society currently agrees, are natural collaborators,” I wrote, “who persist against unequal odds because we have one another.” Humanity, I’ve realized, isn’t fundamentally adversarial and neoliberal.

Except, as Trump’s statements demonstrate, the wealthy and powerful among us persist in believing society is a Hobbesian bellum omnium contra omnes. Before the COVID-19 pandemic, we had the national spectacle of legislators claiming women would hoard menstrual supplies if they weren’t prohibitively expensive. Apparently the fear of hoarding, or “worse than hoarding,” motivates certain powerful people’s economic and political ideology.

This fear isn’t entirely unreasonable. Considering Americans panic-buying toilet paper and eggs, clearly we’re vulnerable to the same swarming behavior that has periodically driven up the price of tech stocks or housing futures, right before a crash. There’s something to the idea that ordinary people, possessed by fear, will engage in hoarding, or “worse than hoarding.” Fear is a powerful motivator, and people will clutch any shield against perceived threats.

Hoarding, though, is a self-fulfilling prophecy. People rushed to buy toilet paper because they saw other people rushing to buy toilet paper. Fueled by constant repetition of panic behavior on cable news and social media, people internalized this motivation, because humans share one another’s emotions. This fear response is the mirror image of the collaboration I praised yesterday: we trust one another, and therefore trust each other’s fears.

When the President, and other powerful people in highly visible places, accuse ordinary citizens of theft and hoarding, some audiences take these accusations seriously. This isn’t faulty reasoning or illogic. Humans are naturally inclined to trust, and we’re especially inclined to trust people we regard as wealthy or powerful. That’s why reasonable leaders seek to calm fears in moments of distress, because humans naturally entrust our emotions to other people.

I suggest President Trump assumes nurses are hoarding resources, because he knows he and his fellow mega-rich also hoard resources. Just as all thieves assume everyone steals, all resource hoarders assume their behaviors are commonplace. Like most people, the wealthy assume everyone shares their motivations, and derives reward from the same outcome. In this case, he assumes everyone feels gratified by having and controlling stuff.

Capitalism itself is grounded in authoritarian control of resources. Smarter historians than me have described how Europe’s transition from a feudal economy to capitalism happened because the aristocracy brought back chattel slavery and privatized previously public resources, particularly pasture land. Though capitalist societies have abandoned the plantation system, we still hear the drumbeat of privatization: of infrastructure, of the military, even of water.

Donald Trump became one of America’s (putatively) wealthiest citizens by controlling swaths of real estate in America’s most expensive city. His treasury secretary, Steve Mnuchin, is a former investment banker, somebody paid to maximize return on money. Even some of Trump’s most aggressive critics, like Warren Buffett and Bill Gates, became prominent by controlling overwhelming shares of their particular industries. Gates has been accused of buying impunity for market manipulation.

Just because some rich people oppose other rich people, doesn’t make them your allies.

Concisely put, despite limited-range changes in our economy, the wealthy remain wealthy because our laws of ownership protect resource hoarding. Americans don’t sell people anymore; they just sell the produce of people’s labor. Then they become angry when they suspect, even without evidence, that ordinary people might hoard resources. Toilet paper, eggs, and N-95 masks become sinful to hoard, because the rich already hoard money, land, and capital.

In lobbing accusations against nurses, President Trump reveals his own vulnerability. He projects his motivations onto strangers. He fears other people will hoard resources for profit, because that’s what he’s always done. And in creating this fear, he makes it a reality, as some people, fearing others’ motivations, rush to buy TP first. That’s the opposite face of this apocalypse: while ordinary people share and persevere, the rich reveal their true motivations.

Monday, March 30, 2020

This Bright and Shining Apocalypse

Abandoned commercial boats on the dry floor of the Aral Sea, in Central Asia, which has
nearly vanished from human overconsumption. Photos like these have inspired Hollywood
designs for what Earth might look like as a post-apocalyptic hellscape.

About seventeen years ago, I wrote a playscript and accompanying novel set in a post-apocalyptic wasteland. Writing in 2003, my work was one among a flood of pop-culture apocalypses in the aftermath of 9/11 and, discouraged by the overwhelming force of the cultural moment, I gave up before writing the final draft. What, I wondered, would make mine matter when professional writers were addressing the topic with such vigor?

I postulated a world ravaged by contagion. The receding tide of humanity left the surviving rump population scared, defenseless, and desperate for answers. I divided humanity into two camps, those who opted to create autonomous settled communities and rebuild society, and nomadic hordes living out Nietzschean power fantasies. Not surprisingly, for those who know me, I intended to write a character study, and instead drifted into religion and faith.

This season’s slide into uncertainty and plague has reawakened my interest in this work. I’m reluctant to return to something written that long ago, as I’m in a different place, as both a human being and a writer, than I was in 2003. Yet looking around, at a situation where reality has impinged upon my fictional creation, perhaps it’s important to return to old haunts, if only to weigh what I got right against what I got wrong.

A world brought low by disease seems timely. When I first took stumbling steps into writing my own stories, we assumed civilization would collapse from nuclear war. Since then we’ve stumbled through robot rebellions, contagions, environmental devastation, and the Mayan calendar as reasons why humanity might end. Somehow, we persistently survive. Yet we remain aware that humanity can only destroy itself once, and it could happen soon.

Yet watching the world actually struggling with pandemic, I realize my most important mistake: I distilled human impulses into a Manichean dualism. On one side, I placed my protagonists, a sympathetic band who wanted to rebuild civilization from the moldering remains. They didn’t know how to accomplish this goal with the available tools, however. One plot point turned on whether a guitar was worth more as music, or as firewood.

The Four Horsemen of the Apocalypse, engraving
by Albrecht Dürer
Opposite my protagonists, I placed a mysterious outside force, which within the playscript was only implied. Our heroes viewed the outside world with distrust, because anybody not geographically rooted must, perforce, need something. And some people who need something would take it forcefully. Having faced this challenge once, our heroes feared and disbelieved outsiders—even while acknowledging they needed help, which strangers might provide.

Watching an actual pandemic unfold, this dualism seems remarkably short-sighted. Yes, a certain segment of our body politic seems willing to forfeit the elderly, the chronically ill, and the disabled to appease the gods of economics. But the majority of humanity has banded together. Large arts institutions, which aren’t exactly wealthy, are streaming their content online. Stores are reserving valuable shopping hours for seniors.

Far be it from me to ever praise corporations hastily, but even Disney, the commercial juggernaut that currently owns your dreams, has shown remarkable humanity. During times of isolation, they could have withheld their lucrative content until desperate parents paid handsomely to quiet their stir-crazy children. Yet they’ve willingly released their most valuable current property, Frozen II, to housebound families who need something to make time go away.

Seventeen years ago, I envisioned a split between ordinary citizens who wanted civilization to continue, and ordinary citizens giving rein to their gut-level appetites and aggression. That isn’t what we’re seeing. Instead, I look around and witness a government controlled by plutocrats, who assign monetary prices to human values, who regard survival as negotiation leverage, and want to maximize their self-centered outcomes and calcify their power hierarchy.

Arrayed against this mercenary social stratum, we have ordinary people who, to greater or lesser degrees, acknowledge that humanity survives because we don’t assign prices to everything. Humans, the broader society currently agrees, are natural collaborators, who persist against unequal odds because we have one another. Sure, some people break quarantine or demand Spring Break. But looking around, I see only a small and venal minority behaving that way.

Apparently I have to re-evaluate what the end of civilization will mean. Like countless armchair futurists before me, I’ve always assumed that halting the status quo will unleash some level of human barbarism. Yet outside a few arrogant, self-centered corners, that isn’t what I see. I’ve implicitly internalized the neoliberal argument that, when the chips are down, we’re all on our own. Yet it looks like we’re braving this together.

Friday, March 27, 2020

The Female Monster In Your Closet

Sady Doyle, Dead Blondes and Bad Mothers: Monstrosity, Patriarchy, and the Fear of Female Power

The monster under your bed exists specifically because you expect it to. Let’s start with that strange, paradoxical belief. But what fuels your expectations? Are your fears uniquely your own, or are they conditioned by social forces you haven’t even consciously processed yet? The answer to that last question is painfully obvious, especially if you’re already reading literary criticism. But what forces condition our private monsters?

Journalist and pop-culture critic Sady Doyle has written extensively about the ways society treats women who step outside their social roles. Her second full-length book deals specifically with how women shape, and are shaped by, images of horror in pop culture. This begins with literal horror, like the feminine monstrosity in The Exorcist, but extends beyond, into how we discuss crime and violence, the popularity of “true crime,” and how we talk to women.

To speak of feminine monstrosities, Doyle divides femininity into three overarching roles: the Daughter, Wife, and Mother. Each role represents a female archetype that women (or girls) should honor, and when they don’t, we perceive them as monstrous. The little girl transitioning into womanhood, the rebellious and willful wife, or the negligent mother, all permeate our culture’s nightmares. But why, and what does that say about us?

Common criticisms of sexualized violence in slasher films, to give one example, notice that teenage girls are doomed the moment they acknowledge their sexual nature. To kiss or, worse, to actually have sex, becomes a death sentence. But Doyle also acknowledges a contradiction: before the middle 1990s, the major slasher audience was teenage boys. Now, it’s mostly teenage girls. What changed, in our culture’s fears, to popularize slashers this way?

Doyle uses the word “patriarchy” generously. Veteran lit-crit readers know this word has loaded implications, depending on who uses it and how. Doyle means a society organized according to male, hetero-normative standards, which slots people into social roles according to their assigned-at-birth genders. Our society reproduces roles which people have to either accept or resist, and for women, that means literally reproducing roles.

That titular “Dead Blonde” can, depending on how stories handle her, represent male fears of women exceeding their roles, as when young girls transition into women. Or it may represent women’s fears of patriarchal violence: consider Drew Barrymore in the first Scream movie, literally trapped inside her house, knowing that stepping outside the kitchen means death. Thus the same image can reinforce, or ironically resist, patriarchal culture. What matters is violence.

Sady Doyle
But the “Bad Mother” has proven more obstinately resistant to ironic subversion. From Norma Bates and Pamela Voorhees, to Jordan Peterson and the language of online discourse, patriarchal culture persists in blaming women, particularly mothers, for men’s violent behaviors. Doyle keeps circling back to Augusta Gein, whose bad mothering, always somehow considered in isolation, supposedly motivated her murderous son Ed.

In fairness, though Doyle’s arguments create an eye-opening context in which to read patriarchy’s fear of women, she sometimes seems unaware of her own lens. Reading slasher films as a totalized phenomenon, for instance, overlooks particular examples. Freddy Krueger is often as hostile to men as to women, especially closeted men. The Exorcist subliminally condemned puberty, but The Exorcist III (the only good sequel) condemned male religiosity.

Careful readers, going through Doyle’s exegesis, can find examples where she overlooks the obverse of her position. If patriarchy forces women into artificial, constraining roles, then aren’t the roles reserved for men equally constrictive? What about intersectional roles? White stay-at-home mothers are lionized, while paid childcare workers, Doyle admits, are despised, and often given sub-poverty wages. How, then, does class influence gender roles?

I say this, knowing that all writers make choices about what to include. At nearly 250 pages plus extensive back matter, this book is about as long as most literary criticism written for general audiences these days. Anything longer would’ve made imposing reading, especially on a topic as difficult and morally taxing as horror and violence. Doyle must have known how much she necessarily omitted to create a book audiences would actually read.

Therefore, accepting that it looks at one side of a multifaceted puzzle, this book makes a worthwhile prolegomenon to further examination of how our society uses tacit violence to reproduce social roles. When the monster forces us back into the kitchen, or back into the closet, it doesn’t do so in morally neutral ways. Public art can display our public aspirations, but it also showcases our public fears. Far too often, our most visible monsters are female.

Wednesday, March 25, 2020

“Honor Thy Father and Mother” Means YOU




Dan Patrick on Tucker Carlson's show, Monday night

When I was short, I remember sitting through a church sermon on the Ten Commandments. The pastor, who looked unimaginably old to my grade-school eyes, but was probably only in his middle forties, told the congregation that “Honor thy father and mother,” either the fourth or fifth commandment (depending on your tradition), wasn’t intended for children. “It was written,” he intoned with great Wesleyan solemnity, “to remind us adults.”

Like much about faith, I didn’t understand what this meant until adulthood. I couldn’t imagine how easy it was for grown-ups to become so wrapped up in career, household responsibilities, and tedium, that we forget to show proper respect to our parents. We never achieve standing so self-sufficient that our parents have nothing to teach us; even after they’re gone, they teach us the important responsibility of carrying on boldly.

This probably seems self-evident to my peers, all of us adults, many with children, whose own parents provide a calming and stabilizing influence. I never would have questioned it until this week. But when Texas Lieutenant Governor Dan Patrick became notorious on Monday for saying that Americans over seventy should be willing to risk death to preserve the economy, I couldn’t handle it. This smacks of Commandment-level dishonor for our elders.

What I didn’t understand from my booming, elderly pastor was that we adults don’t honor our parents because they birthed us. That’s a mechanical process, and one which anybody with biology can perform, as many children of abusive parents can attest. We honor our parents, and the elders among our community, because their existence ties us to a continuum. They remind us that past, and by extension future, exist.

Working adults get tangled in requirements of the present. In agrarian societies, like early Israel, this means the continual present of planting and harvesting, of birthing livestock and dressing meat. But even that requires at least medium-term thinking, as spring planters must anticipate autumn harvest and its needs. The industrial economy shortens workers’ time horizons to the current pay cycle. We think in weeks, even days, not months or years.

Parents, with their storehouse of wisdom, keep us connected to a larger time horizon. Who among us, after moving into our first apartment, didn’t phone home, asking for help sautéing mushrooms, or negotiating with used-car dealers, or making a dentist’s appointment? That’s just a scattering of what parents offer. We, as children, didn’t understand the scope of time, or how our actions have echo effects. Our parents did.

Religious tradition requires responsible adults to honor our elders, in the aggregate at least, because they provide living, material connection to our traditions. Any society without elders is a society without a past, without long-term philosophy, without depth of understanding. We exist entirely in the present tense. Then we feel hungry for meaning and guidance, which hucksters and tyrants will gladly sell us, in exchange for our souls.

This isn’t hypothetical. As John Taylor Gatto writes, our industrial society pushes elders into long-term care homes, and children into schools which resemble warehouses. This leaves adults spending most of their waking hours outside the home, away from their parents, working for other people. We exist with neither past (parents) nor future (children), eternally adrift, looking for definition. Which, in capitalist societies, we find by spending money.

I acknowledge, as some will insist, that abusive or neglectful parents exist. My relationship with my own parents flourishes with time spent apart, and by not talking about certain topics. In saying this, I don’t propose that we should return to some beatified agrarian past where everything was good. Life is too complex and nuanced for that. But how do we handle complexity and nuance without some philosophical tradition?

What Dan Patrick said on Monday had, by Tuesday, become standard talking points among defenders of the status quo. Fox News potentate Brit Hume called it “an entirely reasonable viewpoint.” And though President Trump hasn’t explicitly encouraged aged Americans to risk death, he’s clearly prioritized dollar transactions over public health. A healthy fraction of American leadership is willing to jettison part of our heritage if money keeps flowing.

Don’t be deceived. They advocate a society with no past, and thus by implication no future, because they stand to profit. Either they’ll get rich, or they’ll get powerful, or (like Trump or Bloomberg) both. They know a society without tradition is rudderless and starving; they’re banking on it. This isn’t about money versus human lives. It’s about whether we even have a past.

Monday, March 23, 2020

COVID-19 and the Failed State Era


Buried deep within the Biblical books of Samuel, we find a particular gem, at the beginning of 2 Samuel 11. This chapter details David’s whirlwind assignation with Bathsheba, wife of Uriah the Hittite, one of David’s elite soldiers. The story begins, in verse 1, with: “In the spring, at the time when kings go off to war…” (New International Version)

Encountering this passage years ago, I halted abruptly. War, in this figuration, is something kings do, for no other reason than that they do it. Farmers plant, blacksmiths forge, and kings fight, because (pardon my mixed metaphor) it’s their dharma. It’s easy to forget, in an era when Queen Elizabeth II’s most important responsibilities apparently involve opening shopping centres, that “King” was originally a military rank.

Kings prosecute wars, not to enforce visions of justice or to bring peace to distant realms, but because they must. King David was probably a hill-country chieftain, not a medieval potentate as often depicted in art, but still, his responsibility was to expand Israelite territory, fight other kingdoms, and collect booty, because… somebody had to. States aren’t domains of law and structure; they’re essentially vessels to contain the military.

This plagued me for years. Post-Enlightenment philosophy assumes something called a “social contract,” that individuals and communities in some prehistoric time banded together and created nations to defend against brigands and enforce justice. But here, the Bible itself apparently says otherwise. States, according to Samuel, exist to fight other states, to separate the polity into winners and losers. Justice, described in Joshua or Amos, is purely private.

Though I found this puzzle unsolvable, it probably would’ve remained personal and minor, if not for the COVID-19 crisis. Watching the American government, one of history’s most powerful and thoroughgoing instruments of enforcement, fail to prevent this massive spread (at this writing, my small town has two confirmed cases) has amply demonstrated that states can’t really confront challenges they cannot physically kill.

President Trump has invoked the war metaphor, which Presidents always use whenever their authority gets challenged. He’s hardly the first to use war language during peacetime. President Bush declared his Global War on Terror, which President Obama tacitly continued. President Nixon declared, and President Reagan most seriously prosecuted, the War on Drugs. President Carter declared “the moral equivalent of war” on waste and energy inefficiency.

Artistic representation of King David
However, the war metaphor collapses when you consider the outcome. Terrorism, drugs, and energy waste remain real problems, sometimes nearly half a century later. Despite having possibly the most sophisticated technology and most advanced regulatory network in history, America, like most states, has proven consistently bad at confronting enemies without borders, capitals, or armed forces we can shoot. The wars inevitably drag on.

What, then, are states for, besides war? We could argue states exist to protect their citizens and promote justice. But protect them from what? Promote justice toward whom? Marginalized groups like Uighurs in China, Dalits in India, and immigrants in America would assert that their respective states have done remarkably poorly in promoting justice. Anecdotally, the larger the state gets, the worse it becomes at protecting marginalized groups.

The Trump Administration has demanded increased enforcement powers, some contravening the Constitution, for the COVID-19 crisis. But if history is indicative of future outcomes, giving the Executive Branch more power will end with crackdowns on designated “outsider” groups, both within and beyond our borders. This administration already cages children whose parents flee chaos and civil war; does anyone doubt similar consequences await citizens at home?

Concisely put, states exist to identify enemies, expand territory, and bring glory upon kings—or, in modern technocracies, upon Presidents and Prime Ministers. To achieve this, they enforce laws upon citizens, laws which, as anyone who’s followed Omnibus Farm Bills and health insurance “reform” knows, actively ignore local knowledge. And they create in-groups and demographic divisions that streamline the process of conscripting citizens.

Dedicated status quo defenders could probably find examples of the government doing good. But how many government solutions only matter because they resolve something awful governments did? The Clean Air Act reduced pollution, yes, but the government did and does subsidize cheap hydrocarbons, creating the pollution. Much social, environmental, and political degradation only exists because massive governments prop up massive corporations.

Governments cannot solve problems governments create. Governments cannot redress injustices when they require in-groups to fortify their institutions. They cannot solve violence when they exist mainly to make war. And, as Uriah the Hittite discovered, they’ll screw your wife while you’re off fighting their battles.

Wednesday, March 18, 2020

The Great American Doom Comedy

1001 Books To Read Before Your Kindle Dies, Part 104
Kurt Vonnegut, Slaughterhouse-Five

“Listen: Billy Pilgrim has come unstuck in time.” One of literature’s most consequential opening lines.

Like many Americans, I encountered this novel in high school American Literature class, one of the rare books by a living or recently-living author that the standard curriculum will allow. But I didn’t understand it then. Reading it concurrently with the nationwide sweep into Operation Desert Storm, the novel’s most important themes went right past me. As often happens with literature, the messages consumed in youth only resonated with age.

Kurt Vonnegut inserts himself into the story through the thin camouflage of Billy Pilgrim. (Vonnegut’s use of charactonyms is often heavy-handed.) Like Vonnegut, Billy Pilgrim is a draftee fighting in Europe against the Nazis. Like Vonnegut, Billy Pilgrim gets captured by Germans already functionally in retreat, and held in a converted hog abattoir, the titular slaughterhouse. Like Vonnegut, Billy Pilgrim survives one of World War II’s most horrific episodes, the firebombing of Dresden.

Unlike Vonnegut, Billy Pilgrim suffers a syndrome where he experiences time out of sequence. This means he has become the first human ever to know what effects his choices will have before making them. Where most officers in World War II gave orders, and soldiers followed them, completely blind to their context in history, Billy Pilgrim understands exactly what knock-on effects every choice he makes will have.

But he’s still helpless to stop them.

Vonnegut bookends this novel with two metafiction chapters in which he, as himself, discusses writing this novel. He explains his desires to de-romanticize World War II, and particularly the destruction of Dresden. But he cautiously avoids discussing the times in which he wrote: this book dropped in 1969, as the government publicity machine misused heroic myths of World War II to keep public approval for the Vietnam War on life support.

Alone and unique among humanity, Billy Pilgrim has the ability to understand what will eventually happen with every choice he makes. But in practice, that doesn’t mean much. He increasingly realizes that history unfolds because it has to unfold, that things happen because the moment is shaped that way. The longer he goes, traveling between discrete moments of “the present,” the more fatalistic he becomes, a mere passenger in life’s great train wreck.

Kurt Vonnegut
Perhaps this explains why this novel remains a staple of high school AmLit courses: Vonnegut deliberately keeps his themes close to the surface. And his first theme is that life isn’t like a novel. Billy Pilgrim isn’t a protagonist, a hero carrying the action and changing his world; he’s carried along by life’s deterministic forces, changing himself to fit the moment. We are all doomed to accept the great social machine, Vonnegut insists; only the fortunate few ever understand how.

Facing life without sequence, Billy Pilgrim (it’s difficult to identify him without his full name) experiences everything simultaneously. War and captivity; later life and adult career; the outer malaise and inner torments of old age. Everything happens to him at once. From this he learns life’s most important lesson, that free will is an illusion. Though he has the illusion of choice, he remains tied to every previous choice that he made, or that life thrust upon him.

To Vonnegut, the universe is a machine, a clockwork mechanism playing to an inevitable end. Nothing could’ve ever been different, because the machine simply runs; it doesn’t vary from its system. Right and wrong, moral and immoral, are categories humans create because we cannot see the mechanical actions unfolding around us. And few among us will have even the slightest ability to change the machine—which, he implies, is winding down.

Like his protagonist, Vonnegut is deeply pessimistic. He watches fellow soldiers tramp into conflict, thinking themselves doing God’s work. But all of them, John Wayne heroes and shabby mavericks, officers and soldiers, they all die. Billy Pilgrim’s doomed army unit, his doomed marriage, and his doomed outlook for American history, all fold together into the realization that, like Sartre’s characters in hell, we’re all marching toward disappointment and death together.

That’s heavy to lay on high school juniors desperate to complete AmLit. But, as with William Faulkner or Harper Lee, we don’t expect kids to read Vonnegut because they’ll understand him. We expect them to read Vonnegut because, as adults, we’ll recognize the parallels between real life’s moral ambiguities, and what we read at seventeen. Because, like Billy Pilgrim, we’re all doomed to face our respective Dresden. And if we’re lucky, maybe we’ll get out alive.

Monday, March 16, 2020

The COVID-19 Economy


A disaster like COVID-19 was always going to happen, eventually, under capitalism. Like me, you’ve probably seen the widely circulated memes about how food service workers cannot afford to miss work when sick, a reality that has knock-on effects when other workers lack time enough to cook at home. But I suggest the problem extends far beyond food. This crisis has the potential to unmake capitalism like John Ball’s Peasants’ Rebellion eventually unmade feudalism.

Our entire economic system is based upon the presumption of scarcity, the belief that there isn’t enough wealth to cover everyone’s needs. This has been our justification for denying healthcare to the indigent, sick leave to hourly workers, and better wages to anyone, even amid record-high productivity. The assumption is that we will run out of whatever we particularly treasure. And we will run out because ordinary people will use it up.

Admittedly, the scarcity argument is sometimes true. There isn’t enough land for everyone who wants a house in San Francisco. But applying this reasoning to larger issues becomes absurd quickly. The idea that our economy can’t afford to offer sick days to hourly wage workers resembles the argument that women will hoard hygiene products if they’re subsidized. Sick days and tampons are not essentially similar to land in Frisco.

Would a waitress throwing a sickie really implode the economy? Individually, certainly not. The fear is that every waitress could develop brown-bottle flu simultaneously, which would have disastrous economic effects. But I can only see two reasons that could happen: either work conditions are so systemically horrible that workers have called a general strike, or we consider working-class people inherently untrustworthy and prone to stealing common resources.

Which seems more likely?

If we fear working-class people will hoard resources, what does that say about the hoarding tendencies which already exist? In feudal times, nobility officially owned land but exercised only nominal control. Ordinary workers operated that land communally, in trust. The transition to capitalism following the Peasants’ Rebellion involved privatizing formerly public goods, particularly land (by enclosing previously common pastures) and labor (by introducing slavery).

The hoarding accusation blames selfish individuals for removing common-good resources from general use. Importantly, you can’t accuse people of hoarding common-good resources without admitting the common good exists. Saying people will privatize anything made public admits privatization harms the general welfare. And it means admitting the economy is essentially collective, except where laws make it otherwise.


This past week has seen numerous national chains reverse their previous sick-leave policies. Apparently McDonald’s and Chipotle actually can afford to let sick workers recuperate without losing pay. In other words, nothing has stopped them from introducing a paid leave policy, except the necessity of keeping labor abundant and money scarce. They have admitted, effectively, that the scarcity principle underlying capitalism is moral, not factual.

Except in certain circumstances, like land in San Francisco, scarcity is an illusion. Our system has enough food that nobody should go hungry, enough houses that nobody should remain homeless, and enough money that nobody should live in penury. Admittedly, we can’t make everyone rich. Nor should we; wealth only matters if you can hire others to work, which requires somebody to be poor. We can’t make everyone rich, but we can make nobody poor.

Before anybody says anything, I know COVID-19 hasn’t struck only capitalist countries. It originated in Wuhan, in China’s Hubei province. And the Chinese response, which started with denial and evasion, before moving onto authoritarian crackdowns that proved about as airtight as a kitchen colander, demonstrates that a command economy and a one-party state aren’t the solution. Reorganizing the bureaucracy doesn’t fix the structural problems.

Capitalism isn’t the market. Quoting Ibram Kendi, “Markets and market rules… existed long before the rise of capitalism in the modern world.” Rather, capitalism is the accumulation of laws, rules, and traditions that govern ownership. If people don’t own access to land, they don’t own unmediated access to food and water, the foundations of life. By keeping access to survival scarce, the system keeps labor abundant, and therefore cheap.

Communism largely does likewise.

We create scarcity by creating the illusion of constant crisis. But the only scarcity most citizens face is scarcity of access, caused by a bottleneck of money. This fake crisis has been exposed by collision with a real crisis, when the actual abundance of goods like food and water comes crosswise against the scarcity of money. The next few weeks will expose how fake our economy is. The next few years will reveal how we handle that.

Monday, March 9, 2020

Bernie or Biden or Bust or Both

I’ve been a registered Democrat since April, 2002, shortly before I turned 28. A longtime Republican descended from a long line of Republicans, I suddenly, at that age, couldn’t reconcile Republican policies with basic fairness. The Democratic Party, though hardly perfect, better represented ideals of protecting the vulnerable, welcoming the oppressed, and holding the rich accountable. Ideals which, as a Christian, I grew up with every day.

So that’s who I am when you hear me say: this party needs to get its shit together and stop living in fear.

Watching the diverse Democratic primary field collapse into two aged White men with spotty histories, I’ve kept hearing one word: “Electable.” Is Bernie electable? What about Warren? America probably isn’t ready, some gadflies say, for its second Black President, or its first woman. I might like Candidate X’s principles, and they make a good speech, but is America ready for someone like this? Is my favorite candidate, or yours, really electable?

The concept of “Electable” indicates, not whether we agree with a candidate’s platform, but whether we think a hypothetical stranger will agree. The claims against serious candidates like Elizabeth Warren or Cory Booker have focused on whether their concrete, actionable plans—or their very persons, as a woman and a Black man—might alienate our hypothetical stranger. We literally fear that a woman with a plan is too much for America now.

Thus, campaigning in a spirit of fear, we’re left with two candidates, the cowards’ choices. Bernie Sanders electrifies large crowds, but he does it by combining grandiose plans with vague strategies. He’s like if a late-night college bull session became a human being. Meanwhile, Democrats widely perceive Joe Biden as the inoffensive candidate. Assuming you can wallpaper over his touching women, his segregationist voting, and his tone-deaf “Epiphany” comment.

“Electability,” as a voting criterion, basically boils down to one of two options: either attracting the largest number of uncommitted voters, or driving the smallest number of them away. However, as Maggie Koerth at FiveThirtyEight writes, electability is a nebulous target. The reasoning is always circular: I’ll vote for whomever everyone else will vote for, and they’ll vote for that person because everyone else will vote, because everyone else—

Yikes.

Put another way, “Electability” involves sacrificing principles to conform with a public consensus. But what is that consensus? As I’ve written before, polls repeatedly show a majority of Americans prefer proactive government leadership in protecting the poor, ensuring health care access, and cleaning the environment. They might reflexively side with Republican candidates, for abstruse reasons, but they favor Democratic planks.

Yet the punditocracy treats political discussions like striking a balance between two sides. When the overwhelming preponderance of scientists say human activity is heating the planet, for instance, and paid endorsers say it isn’t, the pundits give both sides “equal time.” This creates the illusion that there’s something to debate, when the discussion should be functionally closed. Then a committed centrist like Biden promises the “middle ground” on Global Warming.

I appreciate the willingness to find compromise on actual controversies. For instance, it’s impossible to definitively say whether capitalism or communism is better, because both systems are responses to incomplete information. You have to make value judgments, which often means splitting the difference. But you can’t split the difference with human extinction, with letting the poor die, or with Nazis. Yet that’s what Democrats keep trying to do.

Some critics, including personal friends, will accuse me of applying “purity tests” to Democrats here. I can’t disagree. But Republicans have applied purity tests for years: I watched, initially as an insider, as Republicanism became increasingly intolerant, warlike, implicitly racist, and now explicitly racist. Admittedly, the Republican Presidential candidate has won the popular vote only once since 1988. But sometimes it’s the procedure that matters.

In the nearly twenty years since I left the Republicans, they’ve gotten good at motivating True Believers of a certain stripe. Democrats, contrarily, have chosen to muddle their message, accepting everyone alienated by Republicanism’s drift. Yes, it makes for a big-tent party, which core Democratic leaders find desirable. But it also means one party is far more energized to actually participate and vote, leading to lopsided outcomes.

Democrats have essentially decided their abstract American is timid, averse to controversy, and centrist. And maybe they are, I don’t know—and neither do you. But the outcome is one side of the political spectrum more energized and motivated than the other. We’ve seen what that produces.

Friday, March 6, 2020

Is “Finna” a Real Word?

A pseudonymous Twitter user earned her fifteen minutes of fame this month, which she probably didn’t want, when she shared the specious claim that “finna” isn’t a real word.

Look, I get it. People who spend years mastering the inner nuances of English feel squeamish when neologisms crop up. Respected eminences of written English, like Oscar Wilde and Ambrose Bierce, wrote entire books attempting to stop English from changing. Authors like Henry Watson Fowler, Bryan Garner, and Strunk & White even invented new rules that didn’t previously exist, to stop what they considered a rapid slide into pop mumbling.

But something about the claim that “finna” isn’t a real word cheeses me off to an unusual degree. Certainly, the word comes from Black English, which is always policed more harshly than my Honky academese. Also, the word’s root comes from “fixing to,” a term that almost exclusively exists in Southern American English, which always gets treated as inferior by Northerners, Whites, and people with credentials.

I probably shouldn’t bring race into the discussion, since the specious tweet’s author is Black herself. But it bears repeating that many words which originated in Black English, including “bae,” “YOLO,” and “on fleek,” were actively discouraged by the word police for years, until White people adopted them. The imprimatur of whiteness often marks the boundary between Proper English and disparaged slang, which seems clearly unfair.

Yet I feel something more visceral than a programmed response about racial fairness and English credentials. I remember Fourth Grade, when a classmate told me, with the smugness only preteens can truly muster: “‘Ain’t’ ain’t a word, because ‘ain’t’ ain’t in the dictionary.” I had multiple problems with this: first, a quick check proved the word “ain’t” actually was in the classroom dictionary. Don’t use proverbs anyone can quickly disprove.

Second, it makes the dictionary the arbiter of “real” words. The reality of words, whether casual 19th-Century Britishisms like “ain’t” or recent AAVE inventions like “finna,” isn’t dependent on some lexicographer somewhere. Dictionary writers respond to regular English, they don’t create it.

Besides the regional origins of “fixing to,” a phrase I still occasionally use despite not having lived in the South since 1987, the mixed racial aspect is pointed. If you’re White and reading this, please, try saying “fixing to,” making sure to pronounce every letter as written. You’ll notice it’s quite difficult. The vowels slide all over your tongue, and the consonants shift from the back to the front to the sides of your mouth.

That’s why White Southerners long ago reduced it to “fix’-ǝn tǝ.” They simplified the pronunciation, to make it slide out of the mouth more easily. Which, if you’ve ever studied linguistics (or learned Sindarin, like a nerd), you already know humans everywhere do. Words which involve tongue gymnastics always get slimmed down. The Black version of this term, “finna,” does the same by removing the harsh consonants, leaving a more liquid form.

This process of converting existing phrases into neologisms matters. It’s the difference between describing how people already talk, and prescribing how people should talk. And this is always about power. In both Plato’s Athens and Vergil’s Rome, powerful people tried to institute rules about how “real” Greek or Latin, respectively, could be spoken. These rules weren’t idle. In both cases, they were invented to weed out foreigners.

In exactly the same way, rules about “correct” English are designed to exclude Blacks, immigrants, and teenagers. Until you learn the code of powerful people, these rules insist, you have no right to participate, so shut up. This includes my tweeter, a Black woman enforcing a rule against Black English. My classmate who hated “ain’t” didn’t love grammar for grammar’s sake. Her main goal was to pass as an adult.

I’ve written about this before, and recently, too. Real English is always positional. Anybody pitching hard-and-fast rules doesn’t really want to police English, they want to police people, and protect a hierarchy, where they see themselves at or near the top. Anybody who thinks “finna” is fake English, but is okay with tech neologisms like “spam” and “google,” is enforcing a stratification of which groups get to invent words.

Somehow, “permitted” groups always look like me.

Claims about what makes “real words” are never, fundamentally, about words. They’re about what constitutes a real person, a person permitted to participate in public discussion and upwardly mobile economics. Attempts to silence fake words are attempts to disqualify people, and people groups, from community life. And they protect the arguer’s sanctimonious self-image, at somebody else’s expense.

Wednesday, March 4, 2020

Sean Connery On Age and Dignity

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 38
John Huston, The Man Who Would Be King
Richard Lester, Robin and Marian


The 1970s saw Scottish actor Sean Connery taking an unusual chance in mainstream movies: he got old. Though only in his forties, he retired from playing James Bond, allowed himself to go bald onscreen, and took roles playing men facing the reality of age. Two of those movies got shoved into the niche of boyish period pieces, which is unfair, because they’re two of the best films he ever made.

1975’s The Man Who Would Be King starred Connery, Michael Caine, and Christopher Plummer, directed by John Huston. That alone should’ve been enough to secure classic status. But it also derived from a Rudyard Kipling novella, originally written in praise of English colonialism, which revisited Kipling’s themes from a perspective of realizing the empire was already doomed. The resulting themes are massive.

Connery and Caine play former British NCOs, veterans of the Anglo-Afghan wars. Retired and bored, they adopt that classic British hobby: exploration. They wander into an Afghan province so remote, no outsider has conquered it since Alexander the Great. Warring clans have spent two millennia battling over Alexander’s legacy, a battle into which our heroes inadvertently stumble. When an arrow fails to kill Connery, the locals take him for a god.

Former enemy clans band together, believing Connery to be Alexander’s heir, a king heralded by prophecy, and Caine his emissary. The two morally dissipated Britons establish their petty empire on false promises, misuse of religion, and greed. Fat on conquest, with the province’s treasury at their disposal, Caine suggests absconding to England and living off their proceeds. Connery, however, has begun believing his own snake-oil pitch.

Class matters in this story. The Scottish Connery and the Cockney Caine, poor outsiders in Britain, find themselves monarchs in Afghanistan. Connery dreams of meeting Queen Victoria as an equal. Caine, meanwhile, finds himself torn between conflicting moralities: he’s a common adventurer, who subsidizes his thrill-seeking with crime. But he’s also a Freemason, which binds him to specific loyalties. Being viceroy jeopardizes both.

Sean Connery and Michael Caine in The Man Who Would Be King

In 1976, Connery revisited similar themes in Robin and Marian. Directed by Richard Lester (A Hard Day’s Night), and featuring an all-star cast, including Audrey Hepburn, Richard Harris, Ian Holm, and Robert Shaw, this film again reconsiders childhood myth through adult eyes. This time, the myth is Robin Hood, grown old and disillusioned after his outlaw days are over. He’s too old for glory, too young to die.

Robin has discovered King Richard is as venal and corrupt as the Prince he once fought against. After King Richard dies ignominiously, Robin returns to Sherwood, unsure of his virtue. There he finds his Merry Men have become common horse thieves, and Maid Marian has joined a convent. With Prince John elevated to king, old grudges are liberated to fight again. Except for one impediment: the Sheriff of Nottingham won’t have it.

King John attempts to restore his greedy iron hand over England’s North, while Robin attempts to rebuild his Merry Men. Robin wants to turn the clock back ten years: violence, romance, and justice. He wants Marian to rejoin him in the forest. Marian, however, is sincere in her monastic vows, and attempts to broker peace between the parties. Robin literally punches her and drags her back to Sherwood Forest.

In contrast, the Sheriff of Nottingham appears downright genial. He refuses the king’s men access to his shire, preferring to enforce law locally—and is strategic in which laws he enforces. Robin and Nottingham have different visions, based on whether they live in the present or the past. They also have different experiences with their battle, because they’re getting old. Both find themselves tuckered out after relatively short clashes.

These two historical dramas reflect different points in British history, but share important themes. Both take periods famous for myth-making and national glory, and view them through a post-imperial eye. They both, in essence, admit that Britain will keep fighting wars it’s already won, until it exhausts itself and, by winning the war, loses the peace. The end result of great national glory, these movies imply, is national disappointment.

But despite their ponderous themes, these movies are also great fun to watch. They display Connery, a man clearly relishing the transitions of time, just being an old man enjoying the push forward. Both movies mix their pontifical messages with dry humor, splendorous landscapes, and beautifully choreographed fight scenes. Yes, they admit, the empire was always doomed to fail. But didn’t we live a full life on the way there?

Monday, March 2, 2020

How To Be a Digital Friend

An old college friend appeared suddenly in my Facebook timeline last week. She popped up, apologized for not posting much recently, shared some mildly exciting career news, and vanished again. A quick check of her timeline reveals that this was her first post in nearly eighteen months.

I’m sorry to say, I hadn’t noticed she’d gone.

Nearly fifteen years ago, this young woman and I shared some classes, in writing and literature. We didn’t know each other particularly well. I thought she did pretty well in Advanced Fiction Writing, but suffered from a tendency to censor herself unfairly. To my surprise, she’d graduated to a career unrelated to her English major: as a public affairs advocate, she probably writes a great deal, but her life has clearly gone in different directions than mine.

Therefore I feel slightly bad admitting I’d forgotten about her. She didn’t do anything wrong; quite the contrary, she’s accomplished a great deal for people suffering from chronic disabilities, lobbying for greater inclusion and research funding in Washington, D.C. It’s just that, one day, she went quiet, and I never noticed. She deserves better than that.

How many other digital “friends” have vanished down this same memory hole in the intervening years? I’d need to skim my Facebook and Twitter lists, checking who’s posted anything recently, to know. But it forces me to wonder what makes friends in the massively connected digital landscape. Because clearly, our ability to instantly call up words from people we haven’t seen in years, changes what the definition means.

Ellen Hendriksen, Ph.D.
Dr. Ellen Hendriksen is merely the latest author I’ve read to reiterate something important: our friends aren’t necessarily the people with whom we have the most in common. Our friends are the people with whom we spend the most time. I’ve recently encountered many people who ask, in tones shading into desperation, how we make friends as adults. The answer, concisely, is: spend more time around people. Volunteer. Take night classes. Go to church.

Unfortunately, while science knows what makes meaningful adult relationships, the exigencies of late capitalism don’t permit physical time spent together. With an increasing fraction of adults required to work nights and weekends, getting involved in activities that permit making friends is exceedingly difficult. Many of us can’t spare time to maintain the friendships we already have.

Social media permits us to casually subvert this limitation. Jack Dorsey and Mark Zuckerberg have politely created venues where well-meaning adults can sustain contact with old friends, even those we haven’t seen in years. No more poky snail mail: I know what friends I haven’t actually seen in over twenty years are doing tonight. In exchange, I simply surrender personal data to Jack and Mark. Cambridge Analytica is a small price to pay.

But what happens when these “friendships” grow overwhelming? Perhaps you’ve heard of the Dunbar Number, which estimates how many social relationships humans can maintain before getting overburdened. That number is about 150, roughly the size of a medieval village. Facebook, by contrast, permits us to maintain 5,000 “friends.” I’ve chosen to cap at around 250. But I don’t remember where I met all of them.

This means I’ve maintained parasocial relationships with former co-workers with whom I never had much in common besides the job, college classmates and former students whose lives and interests scarcely resemble mine, and fellow church parishioners whom I never see except on Sunday morning. My 250 friends include maybe twelve people I’ll see in a given week. What makes this a “friendship”?

Plato and Aristotle, painted by Raphael
Aristotle describes his complete ideal of friendship: two people, both alike in virtue, wish nothing but good for one another, and, where possible, strive to bring that wish to fruition. Returning to my friend who became a public affairs advocate, I indeed wish her nothing but success and happiness in her goals. When she boasts of her accomplishments, I feel good for her. Perhaps that makes me her friend.

Yet I don’t think we’ve been in a room together since 2005. We haven’t shared activities or time in fifteen years. If I never saw her again, my life would be poorer for not learning of her accomplishments, but I doubt I’d even notice. Fifteen years along, my life trajectory probably wouldn’t change one whit if, with one mouse click, I quietly ended this friendship. That feels both sad and scary.

We keep digital friendships alive for decades. Then we digitally kill them in an instant. Strange world we live in.