Wednesday, December 30, 2020

Ben Sasse Shouldn't Be Taken Seriously By Anyone, About Anything

Benjamin Eric Sasse, junior Senator from Nebraska, Republican, poked his head above ground last week, and American journalists apparently get six more weeks of sunshine. One of America’s most prominent “Never Trump” Republicans, Sasse garnered attention for criticizing the outgoing President’s spate of lame-duck pardons. In a terse, internet-friendly one-liner, Sasse said: “This is rotten to the core.”

Beginning in 2016, Sasse took point in the “Never Trump” movement when he preemptively announced his refusal to back Trump for President. This, and a handful of speeches from the Senate rostrum, made him an early darling of left-leaning social media, which dutifully retweets anything he says critical of the President. Despite being a junior senator from a sparsely populated state, he’s accumulated significant media airtime for his stated opinions.

Except his actions don’t follow his words. He votes the party line nearly 95% of the time (which, in fairness, is a lower rate than most Republican Senators). Given the opportunity to vote for conviction in the 2020 impeachment trial and remove the supposedly corrupt President, he again voted the party line, protecting the President he supposedly disdains. Facing a serious primary challenge, he ran radio ads boasting of protecting Trump’s judicial nominees.

This gap between words and actions prompts an important question: What does Ben Sasse want?

Sasse enjoys issuing media reports and press releases. He loves giving national interviews, and publishing books and articles in venues with a national reach. In short, he loves jockeying for attention, and getting the world's eyes upon him. He wants to be seen, because being seen is necessary for national office. It’s an open secret that he plans to eventually run for President.

Maybe not everything Sasse does is intended for a presidential run. Maybe he only wants a hand in swaying the national conversation and setting the agenda. But it sure seems at least possible that he has national ambitions. His determination to stay in office, embodied in his primary-campaign double-talk, means he needs to keep his seat and stay visible. He isn’t interested in truth; he’s interested in attention.

That answers a lot. Sasse has little interest in truth, as evidenced by his repetition of internet memes in his books. He amplifies notional ideas carried in national and global media, because it’s easier than building a platform based on evidence. Truth, for him, is moral, not factual. Therefore he feels free to insert whatever pseudo-facts he desires, as long as they serve his capital-T Truth.

Sasse’s books, particularly The Vanishing American Adult, are littered with internet memes, to the exclusion of facts. He laments a storied Christian past of stay-at-home mothers and prayer in schools, a past which, as someone holding a doctorate in history, he should realize never actually existed. Many of Sasse’s anecdotes of supposed Millennial and Zoomer entitlement have been widely debunked; even the conservative National Review considered his book specious.

Unburdened by facts, Sasse builds a moral code which he thinks Americans will honor, paradoxically, by putting Americans down. He seeks to sway our behavior, not by appealing to our better angels, but by shaming and humiliating us for wanting better. His morality revolves around themes of work, service, and self-abnegation. Good people, in Sasse’s morality, subordinate themselves to leaders, who subordinate themselves to God.

His accusation that President Trump’s pardons are “rotten to the core” derives, not from devotion to facts, but from a moral code in which certain people must serve. Trump’s pardons went, in Sasse’s opinion, not to bad people, but to people who strayed from their place: crooked businessmen, political insiders, and insubordinate soldiers. He objects to these pardons, not because the recipients are criminals, but because they aren’t correctly subordinate.

He’d never say so to the media, but Sasse has an Ayn Randian worldview: life rewards certain people remuneratively (including himself) because they’re simply better than the mass. Therefore whatever keeps the mass subordinate is appropriate. Law, in Sasse’s mind, exists for lesser people. Trump is lesser because he churns up emotions in the plebeian class, making rowdy White people think they own the country rather than keeping to their subordinate place.

That, therefore, is what he means by “rotten to the core.” Sasse wants a world where people are appropriately subordinate. His opposition to Trump comes from Trump getting above himself and refusing to subordinate himself to the hierarchy. It isn’t that Sasse has strong convictions that keep him from supporting a criminal president; it’s that Sasse wants Trump to recognize his place in the hierarchy and stay there.

Friday, December 25, 2020

The Sacred Pause

The year 2020 has changed many of our relationships with time, possibly permanently. The hours of solitude, and for many the weeks of mandatory isolation, have made us aware of passing time. People have been forcibly separated from the activities that make time meaningful, like productive, autonomous work or time with family; others have been separated from the activities that make time go away, like drinking with friends.

If I’ve learned anything about time in 2020, it’s how flimsy time really is. Only two objective units of time have any meaning to our lives, the day and the year, and those only matter because we happen to live on a certain planet orbiting a certain star. Technology has bequeathed us the “hour” and the “second,” measurable only by man-made machines. Scientists can measure time by vibrations of a cesium atom, accurate but, for most people, wildly impractical.

“Time,” in any meaningful sense, only exists inside our minds. Like anything else that apparently exists in the outside world, we can only perceive it through our senses, and impose meaning on the sensory data through our minds. While evidence suggests an objective world exists outside ourselves, our brains and bodies impose limits on our perceptions of that world. Time, space, and matter probably exist; but only our minds give anything meaning.

Therefore one of antiquity’s greatest activities was defining time. Ancient civilizations sought ways to define time that allowed humans to synchronize their activities. Beginning with the basics of survival, they organized time around planting and harvesting. These times weren’t only practical; they were sacred, because the collaborative effort of creating food guaranteed the people’s survival. Planting wasn’t ordinary time; it was transcendent.

In the strictest sense, calling something “holy” doesn’t mean calling it “morally pure”; it means calling it “separate from everything else.” This applies to places, but it also applies to time. Pre-Christian festivals like Beltane and Samhain weren’t merely pretty traditions; they marked changing seasons, associated with growing and eating. Celebrations of cyclical seasons meant celebrating the people’s continued survival. Time meant keeping people together and alive.

We see something similar in the Abrahamic tradition. Ecclesiastes, with its famous statements about “a time to plant, and a time to pluck up that which is planted,” as well as “a time to kill, and a time to heal,” wasn’t simply dividing human lives into hours; it imposed shared meaning on time, keeping the people together. Christianity, in its biblical roots, doesn’t have specific seasons that way, but preserves (sometimes wrongly) previous seasonal festivals, especially Christmas.

Remember, however, that these festivals weren’t private celebrations; everyone participated. To be Mesopotamian, for instance, meant participating in Mesopotamia’s seasonal festivals. Ancients had no separation between one’s national identity and one’s religion; language, ritual, and ethnicity traveled together. Even Judaic tradition preserves this, as the wandering Israelites paid homage to other people’s gods during their sojourns, according to the books of Moses.

Modernity doesn’t have this continuity. One can be American, or British, or Japanese, without observing that nation’s religion. Some countries, like America, have no official religion. You can believe and practice anything, or nothing, and still consider yourself American. There’s definite value to this, as it means one needn’t conform oneself to narrow and oppressive standards to be a citizen. But it also means Americans have no shared definition of time.

Except one. Everybody celebrates Christmas. One needn’t be Christian to celebrate Christmas, as demonstrated by the ascendance of that strictly modern “religious” tradition, the Hanukkah Bush. For one day every year, America’s economy pauses. Most people don’t work. We don’t have pressures to perform “useful” or “profitable” activities; we simply sit with our families and appreciate being here, now. Christmas is Capitalist America’s only shared festival of time.

We have exceptions, certainly. First responders remain vigilant, and Chinese restaurants remain so widely open that it’s become a running joke. Generally speaking, though, Christmas is the one time nearly all Americans renounce worldly demands upon our time, and instead celebrate the present. Employers, governments, communities, and other outside pressures don’t own our time; for one day, time belongs uniquely to us.

Other holidays increase pressures: the July Fourth barbecue, with its public warm-weather spectacle, comes to mind. Certainly, advertisers and PR agencies cajole us to spend lavishly before Christmas on huge trees and garish presents. But on Christmas itself, for one day annually, we silence their demands. Frequently it isn’t easy; we’ve internalized demands to work, act, and produce. But Christmas, uniquely, allows us to pause time and just exist.

See also:
Fear of Darkness

Tuesday, December 22, 2020

Truth, and the Metaphors That Make It

If, like me, you care about politics and the relationship between people and power, then chances are your recent social media feed has looked like this:

Seven weeks after the incumbent President lost the election, after multiple court cases seeking to overturn the vote have been dismissed out of hand for lacking evidence, after the Electoral College has cast its votes, and the Supreme Court (one-third of which the incumbent hand-selected) has unanimously refused even to consider a case it deemed specious, loyalists continue demanding their definition of truth.

This demand for “truth” strikes me. Thousands of all-caps tweets continue pouring in, asserting the truth, dammit, that Joe Biden could only have won the presidential election through dishonesty. The only material evidence of electoral malfeasance has come from inside the incumbent administration; time and again, the incumbent’s claims of cheating die for lack of evidence. Yet loyalists continue demanding “the truth.”

President-Elect Joe Biden

Late in the George W. Bush administration, Berkeley linguist George Lakoff published his book Whose Freedom? Lakoff, whose career has focused on how linguistic metaphors shape how humans perceive the outside world, observed the late-Bush-era arguments over how to define freedom. He realized that, though conservatives and progressives both used the word “freedom” generously, they imbued it with very different meanings.

Electoral politics generally stands or falls according to bromides, not principles. If you tell voters you’ll tax anyone holding remunerative jobs, or that you’ll let the poor starve, they’ll vote against you in numbers sufficient to torpedo your career. So working politicians learn to traffic in generalities: letting the rich hoard resources is “freedom,” and so is lifting the poor from penury. “Freedom” is an eternally elastic metaphor.

So, I’m coming to realize, is “truth.” When the tweeter above, and the thousands of her cohorts demanding the “truth,” shout that Biden stole the election, we progressives respond, almost reflexively, “Where’s your evidence? Show me the proof!” Because, for us, “truth” means accordance with reality. For claims to have truth value, they must have some real-world correspondence, something measurable enough to stand up in court.

But capital-T Truth, for defenders of the status quo, doesn’t require evidence. Truth is ultimately moral, not evidentiary; truth derives from accordance, not with the world, but with purity. If the world contradicts their Truth, then the world must amend itself, for the world is immoral. Just as morality requires us to examine ourselves and change our ways, it also requires us to change our world, by force if necessary.

Notice who, in public life, continues most assiduously defending the lame-duck administration. It’s mainly public moralists, people who think in black-and-white terms. From religious leaders like Franklin Graham and Kenneth Copeland, to secular moralists like Edwin Meese and Rick Santorum, the figures most inclined to defend this administration have a history of dividing the world into camps of good and evil, and acting accordingly.

Dr. Jill Biden

In fairness, I sympathize with this position. When I witness how our legal system continues to exclude certain populations, even as we’ve excised naked bigotry from the ledgers, I see a world plagued by moral compromise. When I swallow my objections to my coworkers using racist and homophobic slurs, because that’s the culture of industry, and purging it would leave us without skilled workers, I realize my world is unjust.

However, this doesn’t mean I can ignore reality. Elections frequently break in ways I find morally objectionable, and I’d love to overturn the outcomes; in my world, Elizabeth Warren should be preparing her incoming administration. But that’s not how the election happened. Electoral processes are devised by humans, and Arrow’s Impossibility Theorem shows that no voting system can satisfy every reasonable criterion of fairness at once, so we must accept a certain level of arbitrary injustice.

To people who think in mainly moral terms, this acceptance of injustice is intolerable. Whether it’s progressives railing against the fact that we haven’t purged racism’s stain from economics, or conservatives wailing that their champion of continuity lost, any level of perceived injustice is unacceptable. Boring old evidence isn’t a meaningful counter-argument to morally founded truth; only a superior morality can reverse their purified thinking.

We must relinquish the idea that we can out-argue defenders of the status quo based upon evidence. They don’t want arguments based upon facts (though some, like Ben Shapiro, claim they do), because to them, this world is less true than the Truth. Even quoting their holy scripture generally doesn’t dissuade them. Truth, to them, is an eternal verity, not a worldly fact; we can only respond with better verities.

Friday, December 18, 2020

Guilt: the New White Man's Burden

Robert Taylor as Walt Longmire

Sheriff Walt Longmire is the sort of person who blames himself when bad things happen to others. That’s what makes him an effective sheriff, in his TV series: he takes crimes in semi-rural Durant, Wyoming, personally. So in the season 2 finale, when a hit-and-run accident leaves his daughter maimed and comatose the same day he gets re-elected, he considers it karmic retribution for his willingness to seek office.

Infuriated, Longmire storms into the Red Pony, the largest local tavern, owned by his best (and only) friend, Henry Standing Bear. “I need your permission,” Longmire growls, “to do what no White man should ever do.” Henry, who is Cheyenne Indian, leads Longmire into the wilderness, where Longmire strips to the waist. Henry gives Longmire his blessing, and Longmire engages in what appears to be a Lakota sun dance.

That’s where my suspension of disbelief hit a wall. The sun dance, for those unfamiliar with Native American traditions, is a ritual of self-mortification in which men, and only men, let mentors pierce their skin with large bone needles, and bind those needles to a tree. The sun dancers then pull against the binding, letting the needles dig into their flesh, until they either have an ecstatic vision, can’t stand the pain any longer, or lose consciousness.

I watched this enactment of Native tradition with mingled disbelief and horror. Those who know me know I frequently balk at accusations of “cultural appropriation,” which I fear force otherwise well-meaning people into racialized silos. Once upon a time, fascists didn’t want the dominant population sullying themselves with traditions of “lesser” peoples; today, colonized peoples fear their traditions getting polluted by grabby settlers.

Yet even I acknowledge that cultural mis-appropriation happens: blackface minstrel shows come to mind. It’s possible to seize others’ cultural markers in ways that demean and insult the original culture. That, I fear, happens in scenes like Longmire’s sun dance. First, though many Great Plains nations practiced rituals of self-mortification, the piercing sun dance depicted here is most closely associated with the Lakota, not with Henry’s Cheyenne.

Worse, though, is the image of the pious, self-sacrificing White man engaging in this specific ritual. At various points throughout the series, Longmire engages in Native spiritual practices; in a very early episode, we witness him having to leave a Cheyenne sweat lodge ritual because a violent crime report came in. As someone who strives to learn from Buddhist, Jewish, and Native religions, I support Longmire’s spiritual eclecticism.

Lou Diamond Phillips as Henry Standing Bear

But why the sun dance?

Students of history already know, but many White Americans don’t, that the U.S. government specifically tried to eradicate the sun dance. Because it could often be bloody, and some Lakota died practicing it, 19th Century Christian missionaries wanted it abolished, at gunpoint if necessary. Because of this history, the Lakota seldom allow White people to witness the sun dance, much less participate; they never allow it to be photographed.

Yet sun dance reenactments have become a staple of Westerns since at least the 1970 movie A Man Called Horse. The myth has arisen, among White people, that outsiders can purchase tribe membership through this and other rituals. Despite being set in present-day America, Longmire is a conventional Western, even down to Longmire’s Stetson and duster. And like many cowboy characters, Longmire yearns for an older, more primal spirituality.

Much frontier mythology dates not to cowboy days, but to 1893, and to Frederick Jackson Turner’s “frontier thesis.” Turner asserted the purity and goodness of frontier life, which let White people shed civilization’s neutering influence and “live like Indians.” But as Yale historian Greg Grandin has written, Turner’s hypothesis assumes, falsely, that White people crossed the frontier peacefully. Nothing could be further from the truth.

In practice, the American frontier didn’t exist until the U.S. cavalry drove Natives off their lands. Friendships between White settlers and wise Natives, like between Longmire and Henry, have often populated Western stories. But this only happened after the military penned Natives in, squelched their language, and often destroyed their religious gatherings at gunpoint. The Lakota massacred at Wounded Knee were performing the ghost dance, a religious service.

Watching Longmire perform a TV-friendly version of Native religion, therefore, jolted me to my shoes. No matter how well-intentioned or how sanctioned by Native nations, such depictions cannot exist separate from American history. When the descendants of those who killed Indians attempt to participate in Indian religion, I realize I’m witnessing settler colonialism. How much worse, then, when networks do it to sell ads?

See also:
Innocence, Experience, and Justice on Netflix
Vampire Cowboy Cyborg Racist Firestorm

Wednesday, December 16, 2020

The Difference Between “Kid” and “Kiddo”

Dr. Jill Biden

Like millions of left-leaning Americans, I felt queasy when shitty journalist Joseph Epstein called presumptive First Lady Dr. Jill Biden “kiddo” this past weekend. Epstein delivered the unearned nickname amid a string of sobriquets lumped on Dr. Biden in jumbled succession, like the repartee in a 1930s screwball comedy. Maybe Epstein thought he sounded like Clark Gable. Instead, he wound up sounding like a patronizing dickhead.

My opinion about this incident is extensively documented. Yet I’ve realized my initial response overlooked something. I said the word kiddo “demonstrates bad faith.” Yet afterward, I remembered the summer of 2017, when Donald Trump, Jr., son of the President, admitted meeting with Russian state agents to purchase dirt on his father’s campaign rival, something he’d previously denied. Several politicians and pundits rushed to Junior’s defense, calling him “a good kid.”

Junior Trump was 39 years old, married, and a father of five. Dr. Biden is sixty-nine years old. Both are, in the eyes of the power establishment, still kids.

At first blush, these two uses of “kid” seem contradictory. Epstein used the word kiddo to undermine Dr. Biden, and with her the entire educational establishment that tacitly rewards women. Multiple sources, including President Trump, called Don Junior a good kid to shield him from culpability for meeting with America’s state enemies during a bitter campaign. The word which tears Dr. Biden down, somehow protects Junior Trump from consequences.

Yet, thinking about it, I realize these two uses overlap remarkably. To call someone “kid” or “kiddo” makes them small, defenseless, and needful of adult guidance. The word kid conjures images of a newborn baby goat, still wobbling unsteadily, requiring its mother for food and protection against a world swarming with predators. Kids, whether human or goat, need adults to prepare them for an innately hostile, even violent, world.

Sure, the attitudes with which one claims adult superiority over a purported minor differ. Epstein’s vulgar diminishment of a grandmother with a terminal degree is noxious, compared to the Trump Administration attempting to keep Junior a permanent adolescent. Besides, Epstein isn’t Dr. Biden’s father; parents can, arguably, claim parental protection over adult children. These shouldn’t be mistaken for equivalent events.

Donald Trump, Jr.

Yet they share one important presumption: some people never, somehow, become adults. They have protected adolescent status well into adulthood, willingly or not. This week’s political shakeouts have demonstrated how that happens in national politics, but it isn’t unique to that domain. Criminal law has revealed how some people become adults at absurdly young ages, while others remain children for literally decades.

Back in 2014, Cleveland, Ohio, police shot Tamir Rice in a public park because he was playing with a toy gun. The officer who pulled the trigger claimed he couldn't distinguish Rice’s toy from a real gun. Importantly for our purposes, he also claimed he thought 12-year-old Rice looked twenty. We’ve heard similar claims in the shootings of Laquan McDonald, Michael Brown, and other Black children. Somehow, he always “looked big for his age.”

By contrast, convicted Stanford rapist Brock Turner was nineteen years old, and therefore a legal adult in all fifty states. Yet even after tearful witness testimony and international outcry, the judge nevertheless sentenced him to a fiddling punishment, treating Turner as essentially a child and therefore less culpable. Like Junior Trump, Turner is shielded from legal consequences (but thankfully not public censure) by his permanent adolescence.

White people are protected by childhood’s cloak, even if, like Dr. Biden, they don’t want it. Black people, by contrast, have adulthood thrust upon them early. Our society’s definition of childhood innocence is White, frequently fair-haired, and innately female, as any boy whose father thought he needed to “teach you to be a man” can attest. If you’re White, and especially if you’re a woman, you’re a child forever.

Meanwhile, we watch the ideological retreat of a President whose notorious midnight Twitter tantrums, complete with schoolyard name-calling, have perhaps permanently cheapened the Executive Branch. At age 74, his arrested adolescence doesn’t just embarrass him, it jeopardizes global security. This product of permanent White childhood has made humanity less safe, perhaps forever.

Admittedly, secular Western society lacks adulthood rites. Traditional cultures had programmed events marking when children became adults; we lack those now, except in religious enclaves. This means extended White childhood is likely to get longer and more destructive. Because if nothing happens to turn the child into an adult, nothing will stop the entitled child demanding the teat, even if they don’t want it.

Monday, December 14, 2020

Women, Academia, and Lousy, Lousy Men

Dr. Jill Biden

Joseph Epstein is a twat-waffle who shouldn’t be taken seriously by anyone. My regular readers can surely agree on this thesis. I can add nothing to the controversy surrounding Epstein’s contemptible Wall Street Journal op-ed which hasn’t already been said better by women, professional academics, and scholars of journalism. And yet, even as I consider him a total asswipe, I can’t help understanding where he’s coming from.

Admittedly, I haven’t read Epstein’s attempted take-down of presumptive First Lady Dr. Jill Biden, and her use of her academic title in non-academic situations. I didn’t bother going beyond the paywall; the first paragraph, in which Epstein calls Dr. Biden “kiddo,” a term almost exclusively used on small children and women, demonstrated Epstein’s attitude promptly. I wouldn’t read a student assignment with such an opening; it demonstrates bad faith.

A former student from Epstein’s years adjuncting at Northwestern University recently posted a personal memoir of Epstein’s thoroughgoing disdain for women. His refusal to call on them in class, to acknowledge their contributions, or to believe they wrote the works they actually wrote, is probably familiar to generations of women. Yet, as a former adjunct myself, I can’t help wondering how my students perceived my treatment of each gender.

Nobody ever complained, to my face, that I favored one gender over another. Indeed, as my entire career focused on teaching Freshman Composition, I found women generally better prepared for college-level writing than men. I seldom gave 100% on any student assignment, but the two perfect scores I clearly remember giving both went to women. If I favored one gender, it was women, but I favored them because they—generally—earned it.

My final teaching semester, I had a student, a young man on a football scholarship, approach me after class. I found this youth, let’s call him Michael, a willing student, eager to learn, but unprepared for higher-level writing. Rarely did I actively dislike any student, but I felt warmly for Michael, because he earnestly tried to overcome his unreadiness; he genuinely wanted to succeed. He just didn’t know how.

“I don’t know, Mr. Nenstiel,” Michael said, studying his shoes with a distinct lack of confidence I don’t recall seeing in many football players, “this just feels more difficult than anything I’ve done before. I just feel like the girls are kicking my ass. I don’t know if I can compete with them, they just do so much better than me.”

Joseph Epstein, former
academic and crap journalist

This was the closest anybody ever came to accusing me of gender favoritism. The girls, Michael felt, were kicking his ass. (I distinctly remember that phrase, and have written about it before.) Yet even then, I realized, Michael saw things incorrectly. In a classroom roughly divided equally by gender, only one woman regularly participated in discussions without being called on; I had five men who eagerly participated.

Yet Michael felt outclassed, not because of classroom participation, but because of tangible, portable outputs. Later in my teaching career, I abandoned lecturing at the 101 level and began running my classes as writing workshops, which better suited my disposition. Therefore Michael had seen every student’s assigned writing, even the women who didn’t speak up, and saw they wrote with more confidence and experience. He didn’t know how to compensate.

Michael responded to this lack of preparation by turning his feelings inward and blaming himself. Personally, I’d blame a public education system dominated by “skillz drillz” and Scantron tests, administered by career overseers with little classroom experience. Women, whose brains mature earlier, need less guidance than men in a guidance-free school system. But someone like Joseph Epstein sees the same lopsided outcomes and blames the women for succeeding.

Nearly sixty percent of college students today are women. Women are not only more likely to enter college, they’re more likely to finish what they’ve started, and more likely to achieve graduate degrees. Academia, like business, remains dominated today by male executives and managers, but as the paucity of qualified men becomes more prominent, we’re likely to witness the female domination of post-secondary school and business, possibly within our lifetimes.

Where men like Michael consider themselves responsible for this outcome, and struggle to compete individually, men like Joseph Epstein respond by attempting to tear women down. His attack on Dr. Biden’s qualifications doesn’t merely attempt to diminish Dr. Biden, or even women generally; Epstein attacks academia itself, a system that often rewards prior preparation and early maturity. A system that, in blind outcomes, rewards women. That, to him, cannot stand.

Friday, December 11, 2020

Don't Follow Your Bliss

“Ball Lightning in Space” (oil on canvas, 2007)
by Kevin L Nenstiel

“What would you do with your life, if you didn’t have to worry about making a living?” I’ve long forgotten who first asked me this question. Apparently, it’s a common question that career counselors ask youth and young adults trying to decide what they want their lives to be about. At first blush, it seems reasonable: If you’d get paid for doing what you’d rather spend time doing anyway, then please, make it your career.

Work is central to human identity. We organize our lives according to the labors we undertake, either voluntarily or for pay. As economist John C. Médaille writes, neoliberal economics has historically assumed humans must be coerced to work and be productive; yet when you consider what most people do for recreational activities, like gardening or woodworking or art, these pastimes sure resemble work. If we could turn passion into paychecks, isn’t this a desirable goal?

Yet I’ve come to question its validity. Certainly, I’d love to get paid for writing. I’ve spent years cultivating the skills of telling a good story, creating characters whom my audience will feel something for, and putting them on paths toward accomplishing goals. Yet these skills have never translated into steady income. My writings keep bouncing back, unpublished and unpaid. Some writers’ blogs provide a decent side income, but mine’s barely bought one dinner out.

Indeed, sideline activities and hobbies arguably speak volumes about a person’s core values. The love of earth and harmony found in gardening, for example, reflects a commitment to cultivating closeness to one’s organic roots. My writing arises from a passion for understanding other people, and a desire to communicate. Sometimes these values are counterintuitive. The repetition of gardening, or the isolation of writing, conceals the spontaneity and intimacy which both hobbies create, down the road.

These values run counter to paychecks, which have separate ethics. Work isn’t something people do to satisfy inner hunger or give their lives definition; we work because work is a moral imperative. The demand that poor people work harder, smarter, or better—a demand embodied by “work requirements” on EBT and other safety nets—demonstrates that work is fundamentally moral, not useful. We’d rather let the poor starve than protect them before they’ve “earned it.”

The late Anglo-American anthropologist David Graeber wrote, shortly before his recent passing, that we cannot separate Capitalism’s moral imperative for work from the current devastation facing the Earth. As a construction worker, I’ve witnessed it. We build buildings nobody particularly wants, including new retail spaces while existing ones go unused. Our sites are malodorous pits of diesel exhaust and blowing dust. Yet we keep doing it, because if we ever stop, we won’t get paid.

A bench I made from old packing pallets, and gifted to a friend

I’ve been accused of being an old-school Communist because I disdain Capitalism’s implicit moral imperatives, like work. But Communism is no better. The ugly cities, scarred land, and blackened skies left by the retreating Soviet Bloc revealed a work morality that, like Capitalism, cared little for the damage left behind. Capitalism and Communism both view human life purely instrumentally: that is, if you aren’t employed, you aren’t contributing, and therefore your life has little meaning.

I repeat, humans are naturally drawn to work. If they can’t spend free time working, they’ll replace work with forms of self-abnegation, like passively watching TV or getting drunk. Both these activities serve the same purpose, to numb the human soul and abolish the dissatisfaction we feel if we can’t work. Try sitting alone some evening, sober, and do nothing. Don’t even watch TV or noodle around on your smartphone. Bet you can’t do it.

That’s why I can’t conscientiously tell people to turn their passions into paychecks. Because paychecks aren’t, fundamentally, about work; they’re about an economic morality of self-destruction. Work, meaning real work and not “gainful employment,” should ennoble people and fulfill the drive to improve our world. Employment, however, is a sinkhole of value, blighting the Earth and exhausting the workers who do it. Don’t let your passion turn into that. Don’t drain it of all meaning.

Please don’t misunderstand. If you’re a paid professional novelist, congratulations. If your hobby farm pays for itself, feeds your family, and lets you save for your kids’ college fund, keep at it. If you make a living doing what you love, or will soon, well done. But don’t let the economy’s moral imperatives turn what you love into a toxic black hole. Because once your paycheck starts ruling what you love, it’ll rule you, too.

Wednesday, December 9, 2020

Don't Call Me “Sir”

I acquired my distrust for honorifics like “sir” and “ma’am” early, from my father. He didn’t mean it that way. Rather, in fourth grade (so I would’ve been nine years old), I had a teacher who insisted that children use polite honorifics when speaking with adults. As a “go along to get along” kid, I complied. Then one day, at home, my father, who’d recently been commissioned a warrant officer, gave me a direction. I replied “yes, sir.”

“Did you catch that?” my mother asked. “He called you ‘sir.’ He’s trying to show you respect.”

“Oh,” my father said. “I just assumed it was because he knows I’m an officer now.”

His tone suggested he was half-joking, and knowing him as I do now, I suspect he was using jocularity to conceal the fact that he’d completely missed my attempt to show him respect. What struck me then, though, as a child, was that my father didn’t regard me as a son. He regarded me as a subordinate. I swallowed my desire for a sarcastic rejoinder, like the coward I was then, but I also learned an important lesson, and stopped using the word “sir.”

Recently a trusted friend, a schoolteacher specializing in middle-grade history, re-posted this brief harangue to social media: “Unpopular opinion: Children should respond with ‘yes, sir’ or ‘yes, ma'am’ and then do as they have been told.” He followed this comment, which he didn’t write, with his own words: “I’m less hung up on the ‘sir/ma'am’ than I am on a pervasive attitude in the last couple of cohorts I’ve taught that overtly disregarding a teacher or responding to correction or redirection with sarcasm is an acceptable behavior.”

As a sometime teacher myself, I understand this frustration. Classroom learning requires a certain level of discipline, which begins by acknowledging that your teachers have paid their dues, earned their credentials, and want you to have the same opportunities they’ve had. (Toxic exceptions exist, I realize. Bear with me.) Adolescent resistance scores quick points with peers, certainly, but it poisons the long-term experience for everyone involved.

However, I can’t help seeing a straight line from a blanket demand that children obey adult authority to the problems Americans see unfolding right now. Whenever police, or other authority figures, shoot Black men for insignificant infractions, defenders of the status quo inevitably emerge from hibernation to insist that the dead men should’ve obeyed. Obedience, unmoored from other ethics, becomes the ultimate defense of authoritarian injustice.

Okay, I'd call him “sir”

As a longtime admirer of French anarchist and theologian Jacques Ellul, I explain myself thus: all authority derives from God. Anyone who claims authority over another person, thus claims to represent God, or if they don’t believe in God, at least they claim to represent the higher power. Even if power isn’t literally God-given, it’s nevertheless God-like. Therefore, all human authority is idolatrous and illegitimate, unless it’s yoked to humility and restraint.

But I also recognize this creates certain contradictions. Even the most doctrinaire anarchist will admit that sometimes it’s appropriate to acknowledge another person’s authority. Teachers couldn’t manage their classrooms without decision-making power. Complex multi-person activities, from simple barn-raisings to paving highways from coast to coast, require coordination, which means somebody necessarily has to take charge.

The question, therefore, isn’t whether authority exists; it’s whether (and when) authority is legitimate. I asked myself this several times in my teaching days. The state university system invested me with responsibility to teach youth how to write on a collegiate level, and authority to execute this responsibility. My students ostensibly acknowledged my authority by enrolling in my class. Does that mean whatever I do is legitimate?

Certainly not. My authority is circumscribed by time, space, and jurisdiction. If I assign students a paper on a given topic, that assignment is legitimate, because as a writing teacher, during a classroom semester, on campus, I have that authority. If I assign students math homework, that assignment is illegitimate, because it’s outside my jurisdiction. If I assign students to wash my car on Saturday, that’s also illegitimate, for hopefully obvious reasons.

My father’s expectation that I behave like his subordinate, that I salute him and follow his orders, continues to burn. It took years to understand why, though. As my father, he had certain authority over me, which corresponds with responsibilities to raise me well. His authority as an officer doesn’t carry over into his authority as my father. His blurring of that distinction is the source of my lingering dislike for the word “sir.”

Monday, December 7, 2020

How Joe Biden Is Like Toilet Paper

Panic-buying behavior, witnessed in March of 2020

In 2020, Americans got an education in what “panic buying” means. Previously, we’d witnessed panic buying at a regional level, usually before a natural disaster like a winter storm or hurricane. People rush suppliers to stockpile whatever resources will become necessary in a worst-case scenario. This year, we watched the entire country become persuaded it needed to stockpile scarce resources (which weren’t actually scarce), most famously toilet paper and eggs.

The nature of panic buying means you don’t actually believe we’ll run out of important resources. Rather, you believe other people believe we’ll run out. You don’t intend to beat the depletion; you intend to beat the crowds. Much like fad-driven Christmas shopping, when people hoard Cabbage Patch Kids or Furbies because they appear scarce, panic buying creates artificial demand for ordinary, banal resources based on expectations of crowd behavior.

We’ve witnessed, this year, a core distrust of fellow Americans on an historic scale. We apparently believe others will hoard, and implicitly squander, cheap and plentiful toilet paper, while we’re reduced to wondering how soft a puppy feels. I remember friends on social media sneering, during the Toilet Paper Hysteria of 2020, that Americans must be stupid. But that’s projecting; Americans are actually fundamentally distrustful.

Watching the nascent Joe Biden presidential administration unfold, I feel the dull ache of familiarity. Just as Biden’s left-wing critics anticipated, he’s come under fire for nominating future department heads and Cabinet secretaries with close ties to the businesses they’re meant to regulate. This basically repeats the criticism many leftists made of the outgoing administration, that it’s laden with patronage plums for defenders of the status quo.

President-Elect Joe Biden

This probably shouldn’t surprise anybody. Even before the election, the former Senator and Vice President accrued criticism for a policy record entangled with structural racism, to say nothing of his frequently hands-on approach to women. Biden has signaled a progressive approach to race and sex issues, like an all-female senior communications staff and the first Black Secretary of Defense. But on economics, where street-level policy happens, Biden is decidedly conservative.

America, broadly speaking, isn’t nearly as conservative as frequently reported. Recent opinion polls reveal frequent left-leaning tendencies in American belief, and by wide margins. 63% of Americans favor universal health coverage. 72% and growing favor increasing the minimum wage. 60% favor keeping abortion legal. Under ordinary circumstances, fifty-five percent is considered a mandate. So clearly the American mandate leans heavily leftist, by conventional numbers.

In the 2020 Democratic primaries, famously flooded with highly qualified candidates, we had two candidates who solidly sided with these majority opinions: Bernie Sanders and Elizabeth Warren. Admittedly, Sanders has accrued a cult of personality which makes my skin crawl; I caucused for him in 2016, but by 2020, he’d become a caricature. But what about Senator Warren? Popular, telegenic, and consonant with American opinion...why didn’t she win?

If one word dominated the last two Democratic primaries, it was “electable.” Democrats believe that popular support lies somewhere between their voting base of civil rights marchers and union members, and whoever occupies the organized right. Republicans have no such belief. Therefore Democrats keep nominating “centrists” who land further right than the electoral mean, while Republicans become increasingly rock-ribbed and doctrinaire, nominating zealots like the last two Republican presidents.

Democrats’ tendency to skew conservative, hoping they’ll tempt the occasional Republican across the aisle, precisely mimics the mentality of panic-buying toilet paper. We don’t think Americans overall want economic policies generally friendly to the corporations who created our current mess, but we believe we’ll alienate Republicans if we change anything substantive. The very fact that Republicans got almost the same percentage of the vote again proves this is just untrue.

Republicans respond to America’s changing ideological and demographic trends, by becoming more extreme and uncompromising. Sure, they look out-of-touch on individual issues. But taken together, they appear to have principles, and to trust America overall. They don’t begin the negotiation process by promising to fritter their core beliefs away. Maybe they don’t reflect American beliefs broadly, but at least they have beliefs. They aren’t panicking over the goddamn toilet paper.

Joe Biden promises to improve circumstances for Americans in notional ways. He’ll increase rights protections for minorities, for instance. But he’s basically pledged to do as little as possible to arrest climate change, stop economic resource hoarding, or fix the roots of poverty. All for fear of alienating conservatives who’ll never support him anyway. He basically doesn’t trust Americans to support policies they believe in. He’s basically panicking over toilet paper.

Friday, December 4, 2020

The Sadness of Reading Hamlet as an Adult

Kenneth Branagh as Hamlet

Almost any erstwhile English major will confess, I suspect, to having read Shakespeare’s Hamlet before being formally assigned it for classroom reading. The kind of person who elects to study literature is likely the sort of person eager to discover new experiences, and to embark on journeys into mysterious worlds, and no “world” is more ballyhooed than reading Hamlet. It’s the Mount Everest of literature: supposedly impregnable, though the trail is well-marked and extensively traveled.

I personally read Hamlet as a senior in high school. This may surprise several classmates, since—open secret—I nearly flunked that year. Not because I was stupid, but because I was impatient with the carefully curated, low-risk “skillz drillz” approach to learning favored in American high schools. I wanted to make independent discoveries and learn what excited me. So I purchased a paperback Hamlet at B.Dalton and undertook it myself, blind and rudderless.

The book I discovered felt dangerous, scary, and frustrating. This giddy kid, angry at his discovery that life didn’t unfold with the elegant symmetry of a medieval morality play, challenged the social order which dominated him, embodied in his stepfather. Young Hamlet realized Denmark, once bold and vibrant, had rusticated and fallen asleep. King Claudius loved wine and sex, not the manful virtues of conquest and justice. Between his books and swords, Hamlet promised revitalization.

Hamlet probably electrified Elizabethan audiences for the same reasons it jolted one suburban White kid in 1991. Just as Elizabethan theatre emerged from the stultification of plays as religious instruction (and opposed the Puritans who threatened to overrun England), this paperback Shakespeare ratified my belief that the institutions dominating my life were overgrown and decrepit. Sure, like Hamlet, resisting this decay might kill me. But it remained a worthy fight, just because it was right.

Mel Gibson as Hamlet

I still own that paperback Hamlet. On a dare, I recently blew the dust off the sadly creaky binding, and reread it. What a massive disappointment. Imagine reacquainting yourself with your oldest friend, only to discover that, while you’re now approaching fifty and facing life as an adult, your buddy remains saddled with rebellious teenage angst. Your friend’s life has fallen into a rut; he keeps repeating the same melodramatic but meaningless shows of defiance.

Shakespeare’s Hamlet begins his play declaring how fat, ingrown, and dissolute Denmark has become. He promises to uproot this corruption, which he sees embodied in King Claudius, and restore Denmark’s glory, which was the person of Old Hamlet. And then...everything conspires to prove him right! Every single belief Hamlet has in Act One, is vindicated in Act Five. Nothing happens to make Hamlet change his outlook or reevaluate his principles. Hamlet just never grows.

This disappointment with Hamlet probably reflects my own life trajectory. Nearly thirty years after first reading Hamlet, I’ve realized my teenage disappointment with middle-class mediocrity was, if anything, too small. But I’ve also realized that throwing myself bodily against the system, hoping my simple mass will change anything meaningful, is foolish. Yes, like many people my age, I resent the concessions I’ve made to systems which, in principle, I hate. But adolescent tantrums change nothing.

Young Hamlet prances around onstage, delivering long monologues about how intemperate, foolish, and shameful modernity is. I felt that, at seventeen. Then, in Act Two, Scene Two, where Hamlet remains onstage for 450 straight lines (one of Shakespeare’s longest scenes), he successfully outsmarts and embarrasses every exemplar of Old Order gerontocracy: Claudius, Polonius, Rosencrantz and Guildenstern. One suspects Richard Burbage, co-owner and leading actor of Shakespeare’s troupe, demanded something that allowed him unlimited virtuoso star time.

Laurence Olivier as Hamlet

Fearful that I’d become irretrievably cynical, I reread Macbeth, King Lear, and Sophocles’ Oedipus Rex. Nope, these works remain complex, profound, and meaningful. Oedipus realizes his transgressions, and accepts his fate. Macbeth realizes his transgressions, and resists his fate. Lear realizes his transgressions, and gives up. Like my adult self, they realize the way life acts upon us, despite ourselves. These characters, in different ways, learn from their journeys, and emerge from the experience transformed.

Not Hamlet. He starts the play resentful and rebellious, sure his convictions matter more than everybody else’s, and he finishes vindicated in that belief. No wonder high school Kevin enjoyed this play. Hamlet reflects every black-clad teenager storming out of the house, screaming “You’re not my real dad!” And somehow, he still gets the hero’s death, so he never needs to realize his mistakes. He gets to be seventeen forever. Real life isn’t so merciful.

Thursday, December 3, 2020

New Delhi’s Romantic Rain Opera

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 43
Mira Nair (director), Monsoon Wedding

Beautiful, fresh-faced Aditi Verma returns to her family’s lush New Delhi manor, to participate in an arranged marriage. The Verma family, wealthy and urbane, see this wedding as an opportunity to display their affluence to the extended family, returning home from living scattered in several nations. Only the family patriarch, Lalit Verma, knows he’s actually broke, financing everything on credit. Aditi, meanwhile, hasn’t broken up with her previous boyfriend yet.

By reputation, screenwriter Sabrina Dhawan wrote this movie hastily, to have something she could workshop for her MFA program. One of her professors, expatriate Indian director Mira Nair, saw something promising in it. Nair set out to realize Dhawan’s story as a combination of an American low-budget indie film and a Bollywood spectacular. The result straddles both worlds deftly, capturing the hybrid world of India’s moneyed gentry.

Lalit Verma (Naseeruddin Shah, Gandhi) is a control freak, desperate for a traditional Punjabi wedding. What he really wants, though, is a sleek tourist destination. After all, his family only gets together about once every ten years, and the groom’s family is rich, with connections to American money. Only when Lalit’s checks start bouncing does he realize he’s tied his personal money into his business, which is critically overextended.

The wedding planner, Dubey, catches the bulk of Lalit’s copious wrath. To his credit, Dubey, a happy-go-lucky kid with seemingly boundless energy and elbows like hatchets, remains unfazed. Until, that is, he glimpses Alice, the Vermas’ patient, doe-eyed housemaid. Alice’s hard work and infinite grace keep the Verma household together, and Dubey realizes he’s become dependent on her to organize this wedding. Maybe he’s starting to feel something more, too.

Aditi, in her middle twenties, agrees to a traditional arranged marriage, to a man she’s only known a few weeks, largely because she realizes it’s advantageous. Her boyfriend, after all, is married. But she has aspirations of being a modern, Westernized woman, like the glamorous Indians living abroad she sees on television. How can she explain to her fiance that she isn’t going to be a traditional Punjabi wife?

Meanwhile Ria, Aditi’s cousin, has thrown herself whole-heartedly into helping Aditi’s wedding preparations. She seems excited for everything happening, until Lalit’s brother-in-law, Tej, arrives from America. Everyone thinks Tej is perfectly avuncular and welcomes him, especially when he offers to cover Ria’s university tuition in America. So why has Ria become suddenly sullen and withdrawn, lashing out at family members with little provocation?

If this seems like a remarkable number of plot threads, I won’t disagree. Like many American indie filmmakers, Dhawan and Nair create an ensemble whose various individual needs are often in conflict; we know somebody is bound for disappointment. The characters get what they need only by wheedling and compromising. We wait with anticipation to see how the movie will land all these divergent threads satisfyingly.

Alongside the ordinary, human conflicts, the movie also includes India’s stark economic contrasts. Most of the movie happens on the Verma family’s large gated compound, a spectacle of post-colonial opulence. But to accomplish anything, the characters must venture into streets crowded with cars and beggars. Alice, the maid, lives in a polite but easily ignored cottage on the periphery. Dubey, the wedding planner, lives in a loud, cruddy walk-up flat.

Culture clash dominates. Aditi has lived in New Delhi all her life, but everyone expects she’ll move to Texas with her new husband, which she anticipates with dread. Dubey, clearly Hindu and proud, falls in love with Alice, who sleeps with a crucifix above her bed. Most of the movie’s dialog is in colonial English, and Lalit Verma desperately tries to appear British, but bursts of Hindi appear so often, the movie requires subtitles.

Overall, the movie follows a standard Bollywood beat sheet. It translates these beats, however, for audiences more accustomed to Western cinematic traditions. The song-and-dance breaks for which Bollywood is famous are replaced by introspective long shots where the sounds of New Delhi come together in almost operatic unity. The love stories resolve themselves concisely, without ever showing anything the state censorship board would consider naughty.

Personally, I came to this movie on the recommendation of a clerk at an Indian grocery store. Fascinated by his store’s rack of Bollywood DVDs, I asked for suggestions to get started. He offered this movie as a good introduction for audiences raised on Western cinema. Because it has its feet firmly planted in two worlds, and explains itself clearly, it proved a perfect introduction for one inquisitive Westerner.

Tuesday, December 1, 2020

Christians and the Basic Ability to Care

The two tweets (now deleted, and preserved only in screenshots) are dated just two days apart. On Thursday, November 26th, 2020, a woman identified as Alice Willow declared she thought COVID-19 a mere nuisance, and demanded nobody bar her from attending church. “If I get covid attending mass then I’ll deal with it,” she writes. On Saturday she writes that she’s tested positive and adds: “my husband… has a pre existing condition.”

My goal isn’t to name and shame Mrs. Willow. Anybody following American news realizes she’s frustratingly unremarkable, her desire to avoid change at any cost unmarred until the moment the catastrophe strikes her personally. As I write, we have over 13 million confirmed COVID-19 cases in the United States—I’m one of them—and over a quarter-million dead. And some people still don’t care.

For me, the absolute marzipan topper on Mrs. Willow’s bad-attitude cake comes at the conclusion of her first tweet: “Be mad. Don’t care. [red heart emoji].” That encapsulates everything I’ve witnessed about people protesting against mask mandates, school and church closures, the shuttering of bars, and other attempts to contain the spread of a highly virulent disease. They’re not malicious; they just don’t care.

But Mrs. Willow isn’t demanding autonomy to eat out or breathe on strangers. It strikes me that she specifically wants to attend church. She writes in the immediate wake of a Supreme Court ruling that regulators cannot specifically target houses of worship for closure. This means Mrs. Willow, like millions of Americans, wants to be demonstratively Christian. I can understand that. I, too, miss my church friends and social network.

Christianity, however, sort of requires that you do care about others. It’s right there in the foundational text: you care that somebody’s naked, and you clothe them. You care that somebody’s sick, and you nurse them. You care that somebody’s sleeping rough, and you house them. Caring is foundationally bound to Christianity. Yet caring, or what psychologists call empathy, has become ancillary and optional in American Christianity today.

We keep hearing horror stories about specifically Christian people who die painfully because they deny the reality of COVID-19. From the beginning of the outbreak, several prominent faith leaders, especially from White megachurches, have denied the disease’s existence, right until the moment they contract it. Some have died with denial on their lips. They’re apparently incapable of caring until the outbreak strikes them personally.

That’s exactly what Jesus warned his first-generation followers not to do. Challenged by temple authorities, he said their religious rules only mattered if they promoted justice and defended the powerless. In his Parable of the Samaritan, Jesus said the Priest and the Levite—that is, his society’s publicly religious leaders—walked by, not just heedless, but actively avoiding the wounded man on the roadside.

Unfortunately, that’s what we Christians look like to outsiders today. We’ve become active defenders of a social structure that exploits the poor, oppresses the foreigner, and makes war out of nothing. We’ve become the Priest and the Levite; we’ve become Caiaphas, the high priest whose faction paid thirty silver pieces to have Jesus betrayed. No wonder “no religion” has become America’s second-biggest religious affiliation, and Britain’s biggest.

That’s saying nothing of our largest congregations’ failure to provide food, water, and comfort during a period of widespread suffering. Christianity is failing in its basic mission.

Don’t misunderstand me; gathering for worship is important. Acts of worship are, for many people, the place where we publicly recommit ourselves to living the values we proclaim verbally. But when this commitment crosses the line from building a community, to self-righteously praying on street corners, we become the very pious frauds whom Jesus excoriated angrily. Publicly demanding my rewards isn’t Christian; CEOs and politicians can do that, and do.

May God protect Mrs. Willow and her husband from the consequences of their actions. I don’t believe their actions arose from malice; they’re probably just products of a conservative, mostly White culture which has enjoyed social protections for so long, it doesn’t realize those protections aren’t simply the natural order. Mrs. Willow probably isn’t a bad person, and I wish no harm upon her family.

Instead, I hope that Mrs. Willow’s experience overcomes the empathy barrier that apparently plagues American Christians. We’ve become self-obsessed, turned inward, and uncaring. We value the experience inside the church building above connecting with other humans, especially humans we don’t already know. Because of this, we risk becoming irrelevant, or worse, outright harmful. We’re the religion of “Be mad. Don’t care.”