Friday, February 26, 2021

Am I My Own Worst Entropy?

l-r: Tucker Carlson, President Joe Biden, and Rep. Marjorie Taylor Greene
“I was in danger of verbalizing my moral impulses out of existence.”
—Father Daniel Berrigan, SJ

I never wanted to write a political blog. I intended to write book reviews and make everyone that little bit happier by ensuring they had access to good-quality reading material. Then weeks like this one happen, when powerful people expose their ugly sides to America. And I have to ask myself: what does it mean to have an opinion, even a non-political opinion, in today’s mindscape?

Midweek, two stories broke that guaranteed left-wing outrage on social media. First Tucker Carlson, that beloved sock puppet of dog-whistle racism, claimed he “couldn’t find QAnon” searching the internet. This was followed by Representative Marjorie Taylor Greene hanging an anti-transgender rights poster outside her office, specifically to tweak a neighboring colleague with a transgender daughter. Both stories cultivated predictable Twitter outrage.

Both stunts were designed to generate online outrage, pitched at a generation whose attention span has been dramatically shortened by internet exposure. They lacked substance and made no real policy argument. Both allowed conservatives to congratulate themselves for understanding the inside joke. They also drove leftists to salivating outrage, repeatedly linking both stories, which, as a counterargument, is about as effective as water on a grease fire.

Anger makes people feel good. The sense of righteous indignation—which, in practice, almost always means “my indignation”—activates neural receptors associated with achieving justice and vindication. Then people post their outrage on Facebook and Twitter, allowing them, us, to posture our moral outrage for public applause. Believe me, I drafted an essay about my moral wrath at Carlson and Taylor Greene, the great, creeping quislings.

Then, partway through, news broke. President Biden, the guy Marjorie Taylor Greene and Tucker Carlson excoriate as a reincarnated Trotsky, re-opened prison camps for migrant children. He dropped bombs on Syrian soil. Met with a procedural setback, he shrugged and abandoned his promised minimum wage hike. As many feared, Biden has basically continued existing morally execrable policies, because changing things is hard.

Social media, the “public square” of 21st-Century life, rewards demonstrative outrage. When basic-cable TV pundits do stand-up comedy routines about “Antifa,” working pundits, many with lucrative Patreon deals, display their anger, and the proletariat rewards them. Dealing with real-world problems is difficult. There might be legitimate reasons why the Administration needs to warehouse unaccompanied minors or bomb sovereign nations… but it looks bad.

Tucker Carlson famously lacks nuance and subtlety, which suits audiences who enjoy getting angry at reality. This has become a hallmark of modern conservatism. President 45 and Marjorie Taylor Greene make bank mimicking Carlson’s style from official pulpits; not for nothing did 45 tout his friendship with Carlson’s stablemate, Sean Hannity. America’s official right wing prospers by making difficult situations look easy.

Our organized left, by contrast, prospers by making everything look difficult. Elaborate procedural hurdles prevent us from raising base wages, or forgiving student loans we wished into existence. Obamacare, virtually a free gift to insurance executives, was sold as “the best we can do right now,” the defense we’ve heard since at least Jimmy Carter. To Tucker Carlson, nothing has nuance; Joe Biden and other Democrats have nothing but nuance.

Either way, we expend our energy shouting at one another on Twitter. As Sarah Kendzior writes, Twitter specifically, and social media generally, give frequently overlooked populations an outlet to disseminate their important news. They let people organize when geography, law, or disability prevents offline organizing. But they also let people squander their energy shouting at nincompoops. Psychologically, we feel we’ve already participated, when in fact we haven’t.

I struggle to remind myself that writing about issues which move me strongly doesn’t mean I’ve actually accomplished anything. There’s a difference between participating in democracy and shouting down a well, but sometimes that difference only becomes visible later. The current pattern, practiced under both 45 and Obama, involves linking arms on Twitter, reassuring ourselves we’re right, then… doing nothing.

Meanwhile entropy sets in, as happens in closed systems. The last President tacitly empowered literal neo-Nazis, and the opposition’s rebuttal was “punch Nazis.” That isn’t policy, though. Stopgap measures might temporarily inconvenience bigots and bomb-throwers, but the underlying conditions that make people consider Nazism a serious alternative, like widespread regional poverty, remain. And the New Guy is appeasing them.

We’re busy reminding ourselves that we’re Serious About The Problem, but that hasn’t translated into solutions. And it won’t, while we’re only congratulating one another. So when does the talking stop?

Monday, February 22, 2021

The Black Hole of Time

Given the amount I’ve written recently about the fallacies of nostalgia, and how the past is a curated museum rather than an objective fact, you’d think I’d know better than to Google the highlights of my past. Apparently, though, I’m as vulnerable to glittering illusions as anybody. This weekend, I searched a restaurant I once loved, a Fifties-themed diner I haven’t seen in over thirty years.

Only to discover that it no longer exists.

Having grown up peripatetic and rootless, I spent my youth walking away from people and places I once loved. On occasion, I remained in contact with certain important people for a while. However, it seldom lasted. Formerly momentous relationships retreated into the past, where they became frozen, like flies in amber. My perceptions of people and places I once would have lived and died for sometimes haven’t changed in decades.

Over a year ago, following a motor vehicle breakdown, I spent almost 24 hours in relative isolation. I wrote about this at the time: the struggle of realizing that the world outside continues to exist, but my perception cannot keep pace. But, as I recall a Buddhist writer describing once, enlightenment isn’t permanent. Real life, with its constant barrage of supposedly urgent pressures, can force you to forget even life-changing realizations.

I like to think the world stays how it was when I walked away from it. That Honolulu’s Rose City Diner might still exist, with eager youths dropping quarters into elaborately reconstructed Wurlitzer jukeboxes while shooting toothpicks into the ceiling. That a young woman, with whom I constructed a detailed relationship in my head, remains as fresh-faced and optimistic as when I last laid eyes on her in 1992.

Like the Buddhist forgetting his enlightenment, I forget my insight that my mind and my world aren’t coextensive. When I insist the world remains constant after I walk away, I insist, on some level, that my mind controls things outside. Apparently it’s easy to forget that I don’t control reality, only my relationship with reality. I remain at the mercy of my mind, and my mind isn’t necessarily part of this world.

What, I’m forced to wonder, is my mind? Some neuroscientists insist the human mind is only the natural function of the brain; that is, the mind is what the brain does. But what, exactly, does the brain do? The more we discover about its capacities, the less we understand. Common analogies to computers fall flat, because the brain doesn’t simply process raw data; if that’s all it did, we couldn’t make leaps of inductive reasoning, or be irrational.

Our world exists, regardless of my perceptions. There are no “alternative facts,” neither for myself nor anyone; reality exists. Rose City Diner closed twenty years ago, and Sharon Maloney isn’t seventeen anymore. I have mental constructs of my world, which allow me to make sense of raw data. But these constructs exist entirely within my mind, which isn’t necessarily real. My belief that something still exists doesn’t give it physical reality.

My constructs only mean anything if I test them constantly against reality, and discard them when they fall short. This isn’t easy. Important philosophical notions and transcendent beliefs, learned young, often resist mere evidence, as anyone who’s revised or abandoned their parents’ religion knows. The belief that my mind preserves old places, or prevents old friends from aging, is pure magical thinking.

Surely I’m not the only person who struggles with this? On one hand, I believe my mind isn’t circumscribed by my brain, that I amount to more than mere biomechanical processes; on the other, I’m trapped within my brain, dependent on my physical senses for the raw data I make meaning from. I am simultaneously greater and lesser than my body, limited and limitless. What does that make me?

Addiction, we now know, arises from an inability to forget the past, even when past influences are clearly maladaptive. What, then, am I addicted to? The beliefs I established thirty years ago, about people and places and how the world works, don’t apply anymore, if they ever did. Yet I cling to these false idols with the jealous vigor of an opioid addict desperate to kill the pain, and for largely the same reasons.

Unfortunately, I have more questions than answers. Perhaps that’s best; if I’d solved the important conundrums, what would I have left to live for? I just wish, despite the evidence, that I didn’t have to relearn the same sad lessons constantly.

Friday, February 19, 2021

Rush Limbaugh and the Texas Freeze

Rush Limbaugh, before the consequences set in.

Two images should persist in America’s imagination during this, the harshest winter many can remember: Ted Cruz leaving the country, and Rush Limbaugh leaving the Earth. Two prominent Americans refusing to face the consequences of a situation they, as much as anyone, created. Their bizarre, inflexible moral codes have left America unprepared to face unexpected circumstances, and Texas, right now, is paying for it.

It seems almost gratuitous to say Rush left America uglier and more divided. I’d rather emphasize how he turned politics into a moral imperative. No longer was politics about finding the best interpretation of facts, and basing policy on evidence. Rush’s onanistic obsession with cost-cutting and low taxes became a moral absolute, pursued with Inquisition-like zeal. He wasn’t a politician, he was a radio preacher, expounding belief and punishing sinners.

Which leads directly to events in Texas now. An entire state refused to plan for unlikely disasters, including the current extreme freeze. The power grid wasn’t winterized, and there weren’t enough snowplows. People are literally dying because the state made no preparations for a circumstance they considered unlikely. In a state that’s repeatedly cut costs and privatized its grid, spending on unlikely circumstances seemed like an unnecessary luxury.

This is particularly ironic in the state where the Galveston Hurricane killed thousands, simply because nobody planned for it. Because a massive, devastating hurricane had never struck Galveston, city fathers believed none ever could, and refused to build seawalls and other disaster preparations. Galveston has never fully recovered from the hurricane. And Texas, apparently, has never fully learned from it, either.

Rush Limbaugh didn’t cause this, of course. Having never held elective office, he had no decision-making authority in Texas, or anywhere. He did, however, create a moral landscape based on the belief that economics is absolute and ineffable. Other right-wing commentators came before and after Limbaugh, certainly. But he created a cultural landscape where we define politics in exclusively moral terms, pugnaciously deny any common good, and kick the weak.

This, which we might call “Limbaughism,” is a reversal of Marxism. Where Marx believed economics drives morality, Limbaugh believed morality drives economics. His chosen economic principles don’t necessarily produce the best outcomes; they’re simply morally right, even Godly. We can argue whether those who propounded this view really believed that moral hokum; the fact is, that’s the half-religious liturgy Limbaugh sold the nation.

Texas Senator Ted Cruz, caught on camera trying to leave the country

An entire state disconnected itself from the national energy grid, because its leaders, empowered by a slim majority of voters, believed doing so was moral. People are freezing, without water or power, because the moral imperative said cutting costs, no matter the consequences, would pay for itself. The mere fact that it didn’t, and the program’s chief engineer, Enron, went tits-up, didn’t impede the moral argument.

Because that’s how moral absolutes work.

Our entire system is obviously worse for lack of preparation. The Texas deep-freeze is a metaphor for America’s COVID-19 response, which itself is a metaphor for global warming. We could’ve prepared for this disaster, or any other, but we didn’t. Unlikely events don’t seem worth the expense until they strike. Even when they do strike, some people refuse to adapt, rejecting the mask, or taking their previously scheduled holiday in Mexico.

Tim Boyd, the mayor of Colorado City, Texas, was hounded from office this week for suggesting the Texas deep freeze was an opportunity to cull the weak. Those willing to burn their possessions, he implied, deserved to live. This is Limbaughism invested with public power. Elected to serve the public trust, Boyd instead insisted the public trust doesn’t exist, and told Texans they were on their own. His constituents, mercifully, hooted him out of office.

Limbaugh, like Ted Cruz and Tim Boyd, actually recognized a moral need in America. On the Right, White Christianity has failed to adapt to social changes since around 1955, becoming morally slippery and inchoate. The Left, meanwhile, has become increasingly unwilling to call anything wrong, except calling things wrong. Americans flailed for moral guidance. Limbaugh, visibly angered by the same things that angered his listeners, stepped into the void.

The morality he sold, however, was worthless. It left America unprepared for Black Swan events because money became the ultimate moral indicator. Now Limbaugh leaves Earth, just as Cruz tried leaving America, during a moment of payback. As millions of Texans pay for this morality, some with their lives, Limbaugh had better hope that transcendent justice doesn’t exist. Because he needs to face the consequences for the world he leaves behind.

Wednesday, February 17, 2021

Britney Spears, Joss Whedon, and Me

Britney Spears

Britney Spears’ first album dropped the year I turned 25, so I was already outside of her target demographic. That didn’t stop me from expounding my opinion, though. I told anyone who would listen that I considered her a malign influence on impressionable teenagers, that her artistry was negligible, and that I hated her debut single, “…Baby One More Time.”

That was a lie, though. Then and now, that song slaps.

The constant, on-demand immediacy of online life often deluges us with so much information, everything becomes meaningless. But occasionally, we get the kind of synchronicity that turns gibberish into insight. This happened this week, when battalions of Hollywood stars emerged, condemning Joss Whedon for running an autocratic media empire. Almost simultaneously, the documentary Framing Britney Spears aired, exposing how the media machine built her up, just to tear her down.

Though Whedon’s epoch-making Buffy the Vampire Slayer debuted two years before Britney’s first album, they were virtual contemporaries. The 1990s weren’t the 1960s, when pop culture reinvented itself every eighteen months. Britney and Buffy shared an ethos that a young woman could be blow-dried and manicured, and still kick ass. Britney’s funky, danceable pop had a kick-drum urgency that could’ve been Buffy's stake driving home.

Yet we, the buying public, couldn’t have treated them more differently. Buffy in particular, and Whedon’s corpus generally, were already subject to serious critical scrutiny while the show was still on the air. He drew praise for his artistry, his writing, and his Male Feminist ideals. Even when his “BDSM fantasy gone wrong” series Dollhouse went sideways, it was still considered empowering to watch sexy women punch back.

With Britney, however, we constantly brayed for her to screw up. We mocked her for wearing provocative clothes but claiming to be a virgin, as though teenagers never tried to shock through prurience before. We pooh-poohed a seventeen-year-old for not recording songs forty-somethings wanted to hear. We speculated aloud about her sex life when she was still, technically, underage, turning ourselves into de facto child pornographers.

And when I say “we,” I mean me. As I’ve written recently, I spent half my teens and most of my twenties trying, with mixed success, to convince older adults that I was already their peer. This included ostentatiously disliking anything too new, popular, or fun. Britney made an easy target. She was so thoroughly, explicitly young, that disparaging her instantly burnished my “grumpy old man” credentials.

Joss Whedon

I wouldn’t dare admit, even to myself, that she looked like fun. She was attractive, high-spirited, and impetuous, all things I didn’t let myself be. In public, I joined the jeering masses, disparaging her as unoriginal, lightweight, and fake. In private, I wondered how she felt about somewhat older men.

Certainly, we now know, her public persona was tightly controlled. In many ways, she was like Princess Diana, who weaponized the paparazzi in her battle with Prince Charles, then couldn’t stop them. The media loved their pretty, glamorous Princess, whether Diana or Britney, and fawned over her every move. But they also made bank every time she flipped her shit on camera.

Soon, they started trying to provoke the Princess into a public breakdown.

From very shortly after they became celebrities, Joss Whedon could do nothing wrong, while Britney Spears could do nothing right. They attracted largely the same audience, and shared largely overlapping philosophies. But one, a middle-aged man born to Hollywood royalty, was considered sacrosanct, too pure to touch. The other, a teenage girl from Mississippi, beat the odds and became a star. And for that, she had to be punished.

Both artists’ best work is probably behind them. Whedon recently got fired from his first TV development deal since Marvel’s Agents of SHIELD in 2012, the last year Spears had a top-10 hit. A decade on, maybe it makes sense for mass culture to reevaluate their paired legacy. But it’s too late. We can’t rescind the abuse we, the buying public, subsidized whenever we gawked at Britney’s latest meltdown. Nor the abusive environment we subsidized in Whedon's studio.

Most important, though someone can hold Whedon accountable for how he treated subordinates, nobody will hold us, me, accountable for how we treated Britney. We have to live with the images of the ways she disfigured herself, to vicariously punish us. We did that to her, I did that to her, to punish her for the sin of being fun. Like all Grand Inquisitors, I must admit, I’m the sinner.

Monday, February 15, 2021

Witchcraft and Economics: a Very Brief Summary

A well-known illustration of the Salem Witch Trials of 1692

The Roman church didn’t officially condone witch trials before 1486. Locally authorized witch trials certainly happened before then, and some are documented, but the Roman canon officially held witches didn’t exist, and belief in magic was a form of heresy. Only in 1486 did Rome authorize witch trials. The last sanctioned witch trial happened in Dornoch, Scotland, in 1736. Witch trials, as an authorized Christian practice, lasted only 250 years.

Why, though, did this happen? If European Christendom didn’t authorize witch trials until 1486, it certainly wasn’t to expunge prior religions, as some pop sociologists claim; by 1486, Roman Christianity had unquestioned dominion in Europe. Like heresy trials, witch trials only commenced when Christianity had no serious fear of challenge. Yet they certainly feared something; otherwise, why use lethal force against women entrapped in pre-literate superstition?

I started thinking about this concern recently, when a friend posted a question: who taught us to fear witches, rather than the witch hunters? Coincidentally, I’d spent time with this issue several months earlier, but my friend nevertheless made an interesting point: where critics like Jules Michelet or Silvia Federici focus on women as victims of witch hunts, men were also targeted. She specifically cited Giles Corey, who died at Salem, Massachusetts, crushed under a pile of rocks.

Corey, a farmer, wasn’t crushed arbitrarily, however. He died that way, where literally every other executed Salem witch was hanged, because, though accused, he never entered a plea. (Only those who protested their innocence were hanged.) Authorities loaded hundreds of pounds of rocks onto 81-year-old Corey’s chest to extract a plea through torture. Because Corey never pled either way, the courts had no authority to seize his property, and his children inherited his farm.

While the girls who hurled wild accusations in Salem in 1692 possibly believed their own fairy tales, the adults around them benefited directly through the redistribution of land and other resources. Many people probably sincerely believed a manifest evil force moved among the True Believers, causing mayhem and destruction for Godless ends. But cynical humans definitely fanned their reactions, because it allowed them to hoover up and hoard cheap farmland.

As stated above, Rome endorsed witchcraft trials in 1486. The cultural context matters: in 1454, Rome also authorized Portugal and Spain to “discover” previously unknown lands, seize territory for Christendom, force conversions among the unbelievers, and take slaves. This culminated, of course, in Columbus’ “discovery” of the Americas. Likewise, in the 1490s, Rome began selling plenary indulgences. These were initially quite popular, though eventually they hastened the Protestant Reformation.

The North Berwick witch trial of 1590

Both the “discovery” and slave-taking, and the selling of indulgences, served important fiscal purposes: the Vatican was cash-strapped, and needed new revenues for maintenance and repairs. Meanwhile, European aristocrats began seizing formerly common pastureland and other shared resources, declaring them private property. Though the Church approved this development, individual priests didn’t, and outspoken clerics like England’s John Ball and Germany’s Thomas Müntzer led violent populist rebellions.

Witch hunting didn’t just consolidate male hegemony over women, as Michelet and Federici assert, and as I’ve long believed. It also consolidated Vatican control over local religion, and aristocratic control over local land. My friend’s reference to Giles Corey makes that clear. Powerful people, in religion and government, sowed commoners’ fears of disorder and change, then promised to restore stability and the illusion of comfortable blandness.

Only I can fix things.

Consider the important movements we’ve witnessed in recent years. While BLM asks authority to please not kill civilians, especially Black civilians, so flippantly, observers like Robbie Tolan and Ibram Kendi note that the police who actually pull the trigger repeatedly claim they shot so-and-so because he (usually “he”) was supernaturally big, strong, fast-moving, or whatever. Restive African Americans are depicted as possessing witch-like superpowers.

Authority figures demand increased prerogative to fight these Black supervillains: Stop-and-Frisk. Qualified Immunity. No-Knock Raids. Despite the modern, technocratic language, are these powers qualitatively different from witch-hunting techniques? I’d contend not. And like witch hunts, these extrajudicial powers effectively remove law-abiding citizens from the economy, and polite society. We no longer break witches on Catherine wheels, but we no longer need to.

Witch hunts, like crackdowns on Black youths, have multiple causes. It’s impossible to say definitively: “This belief causes witch hunts.” However, when we consider what sustains these repressions, we should ask ourselves who benefits. Both in 1486 and 2021, the rich and powerful pretend to champion common decency and religion. Just entrust your work, capital, and safety to us. We’ll kill the monsters. We’ll make Europe great again.

Wednesday, February 10, 2021

Science Fiction, Fantasy, and the Dream of Moral Clarity

The original Star Trek bridge crew, sitting comfortably with their moral certainty

Before becoming a celebrated Christian fantasist, British writer C.S. Lewis was a celebrated literary critic and medievalist. In his monograph The Allegory of Love, he traces the origins of late-medieval verse romances, and their arc across European thought, culminating in decadent excess and collapse before the Renaissance. These myths of “Courtly Love” remain influential across multiple popular genres, especially Period Dramas and Fantasy.

Early in his monograph Lewis describes Chrétien de Troyes, whose French romances famously created Sir Lancelot, and introduced his adulterous love for Queen Guenevere. Despite being written in French, for French audiences, Chrétien set his stories in King Arthur’s court, hundreds of miles and hundreds of years away. “For him already ‘the age of chivalry is dead,’” Lewis writes. “It always was: let no one think the worse of it on that account.”

In my childhood, Star Trek, a commercial failure on network TV in the 1960s, had become a staple of syndication. Its images of a future characterized by intrepidity, valor, and moral confidence, often shone on TV during the hours after I’d finished homework, but before my family was ready to engage in other activities. So I watched it fairly passively and uncritically, as children do; I internalized its values without realizing they were values.

Meanwhile, thanks to the Scholastic Book Club (does that even exist anymore?), I discovered fantasy. Specifically, I discovered Lloyd Alexander, whose Chronicles of Prydain used the same mythic journey model favored by Gilgamesh and Luke Skywalker. Alexander presented a world loosely based on Wales, a mix of lush forests and cultivated farms, where a valiant youth could leave the homestead and discover himself among the wilds.

Children lack the context to critically analyze their media choices. A few years later, I’d have the capacity to understand that both Star Trek and Lloyd Alexander set their stories “out there,” whether that meant beyond the stars or across the sea, because that gave them clear, unsullied backdrops. The Federation, and young Taran, are clearly good, while the Klingons and Arawn are evil. Only distance allows such clarity.

The revived Star Trek crew, with their usual expression of moral befuddlement

Homer and Virgil had similar standards. Though Homer didn’t particularly believe in Good and Evil, only winners and losers, he nevertheless situated his story centuries earlier, during the Late Bronze Age. Virgil did likewise, though he imposed Roman moral dualism on Homer’s Greek dick-swinging masculinity. In ancient times, when everyone assumed the future would resemble the present, these poets, like Chrétien, could only find moral clarity in the past.

With the Renaissance, naïve longing for the distant past fell into disfavor (even as early Humanists yearned for the greatness of lost Rome). Writers of moralistic literature abandoned yesteryear. Works like Thomas More’s Utopia and Francis Bacon’s New Atlantis were set across the ocean, in lands yet undiscovered, appropriate for the so-called Age of Exploration. But this only swapped the mythic past for geographic distance.

With the Industrial Revolution, and its attendant social upheaval, audiences could see society evolving within their lifetimes. They realized the future didn’t necessarily resemble the present. Thus new utopian visions, from Edward Bellamy’s novel Looking Backward to Karl Marx’s prescriptive economics, suddenly shifted into the future. But these authors shared Homer’s fundamental belief that the present was irretrievably murky and morally corrupt.

My father accused me, when I read SF and fantasy, of fleeing the present into mere escapism. “I like books,” I remember him saying, “that reflect this world, the one we live in, now.” But he read Tom Clancy, whose tales of technological wonder emerged from right-wing moral politics. He contended, basically, that better engines, better computers, or better satellites would hasten the triumph of America, which was good, over evil Sovietism.

Goodness, therefore, always exists somewhere else. C.S. Lewis himself struggled with this. After his fumbling attempt at science fiction, the Space Trilogy, went essentially nowhere, he tried fantasy. As British critic Farah Mendlesohn writes, Lewis’ Narnia, reached through unique portals, strikingly resembles Christian allegories of getting to Heaven. He created a world, like Chrétien’s Britain, purged of today’s sloppy moral ambiguity and compromise.

Genre fiction is moving slightly away from that. While urban fantasy moves magic into the morally tangled present, J.J. Abrams’ recreated Star Trek largely abandons Gene Roddenberry’s belief in eventual human perfectibility. Creators keep the genre’s trappings of absolute morality while abandoning the illusion that good and evil ever were, or ever would be, clear. Time will tell whether these works remain as durable as the genre origins from which they first sprang.

Monday, February 8, 2021

We Need To Let the Hippies Go

A pro-war counter-protest in 1966. Note the presence, far right, of 19-year-old
Mitt Romney, future presidential candidate and Senator from Utah. (source)

I grew up wanting to be a Baby Boomer. Surrounded by mass-media images of Boomer ingenuity, relative luxury, and countercultural vigor, I wanted to join that cohort. My parents inadvertently fostered this desire by the profusion of 1960s TV in syndication; I believed that shows like My Three Sons, Family Affair, and Bewitched were still current. Thus, in my teens, I embraced that icon of Boomer defiance, the Hippie.

For twenty years, from age fifteen until thirty-five, I refused (with fleeting exceptions) to cut my hair; at its peak, my ponytail nearly reached my belt. I never embraced tie-dye, but my faded jeans and solid-colored shirts radiated a John Fogerty vibe. My one cannabis experience proved disappointing, and I decided my life didn’t need that influence, but I chose other mind-altering experiences, like late-night psychedelic rock and the Firesign Theatre.

These years, mostly in the 1990s, came surging back recently as I read Jennie Rothenberg Gritz’s essay “The Death of the Hippies.” Rothenberg Gritz, herself descended from hippies, describes how the late-sixties Counterculture surrounded itself with mythology that didn’t match its behavior. The mostly White, relatively well-off hippies clashed with similarly White squares, bankrupted the Great Society, and lived off work they themselves refused to do.

This hippie mythology stands contrary to reality. As Bruce Cannon Gibney writes, hippies were outliers; Boomers overall trended more conservative than the American mainstream. They gave lip service to insurrection, but in practice were generally passive. They resented work, the draft, and other impositions of responsibility as oppressive paternalism, at least verbally. But they didn’t mind imposing upon others: the hippie movement was often covertly racist, and overtly misogynistic.

Hippies’ rejection of purported bourgeois conformity mainly came across in their attitudes toward overseas war. Overall, Boomers supported the Vietnam War, though they opposed anyone requiring them to fight in it. What I grew up calling “anti-war protests” were generally just anti-draft protests. Knowing what I know now, most hippies probably didn’t oppose war philosophically; they just opposed the requirement to participate.

The blogger as post-adolescent
hippie wannabe. Probably age 25,
somewhere around the year 2000.

Ironically, by identifying myself with hippie-dip era radicalism, I managed to reinforce for myself a strange hybrid conservatism. Yes, Dr. King accomplished greatness in fighting the lawful protections of bigotry, I told myself. But he won those battles, so the fight should stop now. Similarly with the battles of second-wave feminism and the Stonewall generation: it’s unlawful now to behave with visible bigotry, so there’s nothing left to strive after.

To preserve this mental illusion, I surrounded myself with cultural influences which reflected the era I wished I’d flourished in. I only listened to music by bands which broke up before I was born, like the Beatles, the Byrds, and Creedence Clearwater Revival; or which still existed, but stopped mattering decades ago, like the Stones, Pink Floyd, and the Eagles. This, I told myself, was as cutting-edge as society ever needed.

Thus, I managed a weird balancing act. I embraced the image of adolescent rebellion still projected by Boomer pop, but displaced that ambition by three decades. I rebelled against… my grandparents, I guess? In the present, I supported maintaining, even expanding, the power structure that defined American modernity. A power structure from which I, a White, middle-class child of privilege (like the first-generation hippies), stood to benefit.

Nearly seventy years ago, William F. Buckley, Jr., defined conservatism in the inaugural issue of his National Review. A conservative, he wrote, “stands athwart history, yelling Stop.” To Buckley, this wasn’t a condemnation; he stated this with pride. He wanted human development to freeze. We’d gone far enough, he averred, in 1955. Indeed, he admitted, we could profitably rewind some years, probably to around his parents’ putative youth.

I never would’ve admitted it during the 1990s, when I tried to burrow back to the Woodstock Generation, but I did likewise. Buckley seemed stodgy to me; I found his books sludgy and unpleasant. But the perseverance of hippie ethics, for me, was a refusal to accept complexity and moral ambiguity. I wanted to stop history at a moment when I considered it comprehensible, which coincidentally happened before I joined it.

Today, I witness this same duality in many people I consider friends. They celebrate the counterculture of 1960s America, while wearing MAGA hats and decrying anything new and sharp-edged. I dare not condemn anyone for this position; I still dig the British Invasion and Summer of Love, sometimes. I respect the hippies’ accomplishments. But when I used that respect to avoid occupying the present, I, like them, began to die.

Thursday, February 4, 2021

Change My Mind—Please!

Last week I shared a political message on Facebook. (“No, Kevin, not you! Surely!”) “Opposing student loan forgiveness because you paid off yours,” the text read, “is like not feeding the hungry because you already ate.” As someone who believes Jesus said to feed the hungry, and also to forgive us our debts as we forgive our debtors, this one felt personal. So yeah, though I didn’t write it, I shared it for everyone to see.

A conservative-leaning friend replied: “Not true! Big difference!” In today’s social media environment, where people curate the information they receive and often only hear claims they already believe, I’m eager to hear what those who oppose me believe, beyond image macros that rally True Believers. So I answered my friend: “Do go on” [no punctuation]. Because I meant it, I wanted to hear the continuing argument.

I’m still waiting to hear back.

When I began teaching college-level writing over a decade ago, I needed to choose my dominant philosophy. I selected Aristotelean rhetoric, for a few simple reasons. First, it has over two millennia of practical testing, meaning there are few surprises. Second, it’s based closely on an awareness not only of what the writer wants to say, but also of what the audience will receive. It requires a level of empathy often missing from modern, and especially digital, rhetorical theories.

Aristotelean rhetoric was designed for a primarily oral culture, where written texts were mostly memory aides or classroom tools. It presumed the communicator spoke aloud, to a live audience, and could gauge their responses immediately. Applying these terms to writing often requires imagining in advance how one’s audience will respond. And that means one must know one’s audience well enough to accurately anticipate their responses.

So when someone disagrees with me, and I reply “Do go on,” I don’t mean this flippantly. It means I’ve failed to anticipate their response, and need more information to accurately map their beliefs and answer their likely objections. It also means I might have overlooked important information. Therefore, in a rhetorical approach, asking someone to explain their position means necessarily asking them to make a good-faith effort to change my mind.

That willingness seems increasingly uncommon in today's information ecosystem. Social media, which creates a custom knowledge bubble, mostly reinforces what audiences already believe. When we only communicate with people who share our core beliefs, we emerge with a more extreme, intolerant version of our original opinions. Psychologists call this tendency “group polarization,” but the military has what I consider an altogether more accurate term: “incestuous amplification.”

It’s become fashionable to complain about polarization in today’s culture. I’ve done so, too. But this trend isn’t new. Though the digital environment increases this tendency, as Jill Lepore writes, this has actually been the trend since at least the middle 1960s, and arguably since World War II. America’s increasingly technocratic, specialist-driven society increases the natural human bent to seek people most like ourselves, and never communicate outside our sphere.

In such a willfully deaf environment, I cannot simply demand others listen to me. That’s the behavior of pre-teens still struggling to learn rudimentary empathy. Rather, Aristotle teaches that to persuade others, I must first listen to them, learn their beliefs and thought maps, and tailor my message accordingly. This means treating others’ opinions as seriously as my own, giving them serious consideration in sober tones.

I cannot treat others’ opinions seriously, and simultaneously pass preemptive judgement upon them. I can only take other people seriously, by first opening myself to persuasion. That is, I must be willing to change my mind. Not that I must be credulous, believe everything somebody says with authority, and get led around by the nose. Rather, I acknowledge I haven’t seen everything, my viewpoint isn’t universal, and others might have something to teach me.

Easier said than done, right? Though I stressed this every semester for my students, it’s taken years to internalize. I still sometimes get it wrong. Like all moral principles, it’s more about striving than achieving: I seriously, solemnly try to remain open to persuasion. When I ask someone to change my mind, unlike certain internet personalities, I mean it. I yearn to remain open to others’ viewpoints, changing my mind for better information.

Therefore, if you say something I disagree with, and I ask you to expand, this isn’t flippant. I mean it. You might not change my mind right away, but I’m willing to let you try.

Monday, February 1, 2021

Some (Incomplete) Thoughts on Men and Guns

Men practicing at a North Dakota gun range

There’s one guy everyone hates to see arrive at my job. Let’s call him “Jack.” Jack installs HVAC components, a job requiring both significant upper-body strength and an eye for fine detail work. He’s extremely good at his job, and everyone knows it. But he’s also constantly irritable, combative, and temperamental. He thinks it’s very manly to ignore basic safety precautions; management must constantly remind him to wear both a COVID mask and a hard hat.

Jack also, for over a year, angrily demanded his god-given right to open-carry a loaded firearm at work. He strapped an autoloader pistol into a snap-flap holster on his belt, above his right ass cheek. His bosses insisted he not carry his gun. The general contractor insisted he not carry his gun. He was repeatedly ejected from the jobsite for refusing to leave his firearm in his car. Still he demanded his unrestricted 2nd-Amendment rights.

This weekend, an unnamed Phoenix, Arizona, man shot and wounded a bystander while attempting to stop a shoplifter. My initial eye-rolling response reflected a long history of botched gun stories. Notice that, rather than attempting to apprehend the accused offender, the gun owner opted to escalate the situation to potentially lethal violence. Firing a gun isn’t a proportional response to low-level property crime. Yet this rapid escalation is exactly what gun advocates would probably celebrate.

Yet thinking about this man, and HVAC Jack, I realized they probably had something in common. Both men desire to fight injustice where they see it, injustice they believe is so insuperably evil that it requires swift, fatal intervention. The Manichaean morality of “Good Guy With a Gun” rhetoric divides humanity into heroes and villains, whose morality is innate and unchangeable. Only bringing the hammer down, even at the cost of human life, restores balance.

When I describe this principle as a “male power fantasy,” it’s tempting to think I’ve simply dismissed men’s feelings flippantly. Admittedly, some do. Yet, watching Jack’s daily working routines, I’ve realized how thoroughly powerless he feels. Despite his demonstrated high skills, he has no workaday autonomy. Management thoroughly owns his daily routine; more than half his waking hours belong to somebody else. Jack is the walking embodiment of powerlessness in the face of capitalist hegemony.

Back in 2018, Spike’s Tactical, a Florida manufacturer of decent, but not particularly distinguished, assault rifles, ran a controversial ad. “Not Today Antifa,” read the banner, over a painting of four White men in store-bought tactical gear and rifles. The subjects formed a cordon between a rampaging mob of violent protesters, and us the viewers. Though the image offers much to unpack, mostly unsavory, it highlights the myth of civilian violence defending a brittle civilization.

Men like Jack, or the Phoenix shooter, see a world defined by powerlessness. Crime seems endemic, amplified by prime-time media reports of continued urban awfulness which make violence seem more imminent and widespread than it actually is. Simultaneously, work, the leading way many blue-collar men once defended their families, is increasingly mechanized, outsourced, or done by undocumented immigrants. Jack has good reason to feel angry and afraid. Capitalism has made him powerless, and arguably useless.

Spike’s Tactical, the NRA, and other for-profit institutions latch onto this feeling of helplessness. Jack comes home every day tired, physically and mentally, from a job defined by hard labor and mental acuity. Asking him to read scholarly reports, or even investigative journalism, regarding gun safety, is ridiculous. He wants easily digestible information, often in visual form. Spike’s Tactical gives him that, reaffirming his belief that somebody without a face is overrunning his dying world.

The evidence strongly suggests that guns don’t help much. In combat situations, untrained gunfighters are more likely to shoot their own fingers off than stop an attacker. In ordinary situations, gun owners are more likely to commit suicide with their guns than defend their property. Suicide is, of course, the ultimate expression of powerlessness. Rendered unnecessary by capitalism, and backward by social evolution, these men face a future of continued uncertainty, or no future at all.

Therefore, when I say “male power fantasy,” I’m not disparaging men like Jack, or the Phoenix shooter. These men feel powerless to their bones. A Marxist revolution might fix that powerlessness, but that’s trading one form of uncertainty for another. For all their volatility, guns are at least knowable and immediate. They provide the comfort of at least a little control. Yes, that control is pretty awful. But at least they know what it is.