Friday, January 31, 2020

The False Promise of “Just Quit”


I recently made the Technological Age’s most common mistake: I let myself get dragged into an argument in the comments section. Specifically, I let some asshat, holding forth about the famously dismal wages and working conditions of American schoolteachers, try to prove that things aren’t that bad, and that where they are, ordinary economic factors explain it. In short, this person insisted, stop complaining about teaching conditions; either work or quit.

That latter option dominated this person’s argument. I’ve heard it frequently on my co-workers’ right-wing talk radio, too: if you dislike your job’s wages or working conditions, quit and get another one. They use one phrase repeatedly: “just quit.” Like they believe finding work consists of leaving one job and walking into another. Especially for skilled employment like teaching, the application process can require months of negotiation, to say nothing of the expense of relocating.

The average starting schoolteacher’s salary in Nebraska, where I live, is $34,465 per year, according to the National Education Association. That’s about $5,000 below the national average—but, considering Nebraska’s lower cost of living, that’s not bad. I wouldn’t want to raise children on that salary, certainly, but for a beginner, it’s a reasonable middle-class wage. If you think it’s sustainable, though, try getting a home loan. Please.

My counter-arguer insists that this wage breaks down to about $24 per hour, significantly above the state average. This seemed weird to me, until I worked his math backwards. To achieve this absurdly high number, this person took the average starting salary, figured 180 instructional days per school year (the standard in most jurisdictions), and broke each day into eight hours. This figures to about $23.93 per hour.
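For the curious, the commenter’s math works out like this. (The “more honest” figures at the end are my own illustrative assumptions, not real data.)

```python
# Reproducing the commenter's (flawed) math: treat the average
# starting salary as if it covered only instructional days.
salary = 34465           # NEA average starting salary, Nebraska
days = 180               # instructional days per school year
hours_per_day = 8

hourly = salary / (days * hours_per_day)
print(round(hourly, 2))  # → 23.93

# A more honest estimate assumes teachers also work evenings,
# weekends, and non-instructional days -- say 50 hours a week
# over a 44-week working year (hypothetical numbers, for scale):
honest_hourly = salary / (44 * 50)
print(round(honest_hourly, 2))  # → 15.67
```

Under even those modest assumptions, the “$24 an hour” figure drops by a third.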

If you’re following this, my counter-arguer assumes schoolteachers only work on instructional days, and only work eight hours. If you’ve ever taught, or been teacher-adjacent, you know this makes no sense whatsoever. Teachers spend countless long nights, weekends, and non-instructional days on grading and lesson prep, to say nothing of requirements for continuing education, departmental functions, and bureaucratic “in-service” days. Teachers don’t stop when students go home.



Faced with this failure of reasoning, my counter-arguer retreated into high-school economics, insisting that “supply and demand” devalues teacher wages. Basic economics teaches that scarcity makes something more valuable, while oversupply drives value down. Teachers’ wages suck, in other words, because there are too many teachers. This might make sense if teachers weren’t in such short supply that school districts are actively headhunting qualified teachers from other states.

One could extend this logic to other industries. Construction, the field where I’m currently employed, reported a nationwide worker shortage of 300,000 personnel in June 2019. This especially applies to skilled laborers, such as carpenters and ironworkers. Yet wages remain low: one co-worker, an experienced finish carpenter, can’t find work above $18 per hour without moving across state lines. He and his wife together are barely making house payments.

Two factors drive this reality. First, while there’s arguably an undersupply of skilled workers, probably caused by American education policy favoring universities over trade schools, there’s also an undersupply of employers. My co-worker, despite being more skilled than I am, cannot find employers willing to pay better, at least locally, because our area has too few contractors employing finish carpenters. While I won’t say those contractors engage in price-fixing, they certainly have little incentive to pay better, given the minimal competition they currently face.

Second, there’s an economic truth long acknowledged in certain circles: the more useful a job is for general society, the worse it generally pays. Currently, even after the 2008 economic meltdown, financiers and investors generally have the highest income, while laborers, educators, and service providers—the people who create genuine public welfare and social stability—have seen their wages stagnate for two generations.



This reality comes through most clearly in anything related to food. “Tipped workers,” a category dominated by food service workers, haven’t seen their minimum wage increased since 1991. That’s in an industry with America’s highest wage-theft rates. Meanwhile, America’s farm debt has reached levels unseen since the 1980s. We nominally laud American farmers, and encourage underemployed youth to get jobs waiting tables, then pay both fields starvation wages.

So yeah, “just quit.” Move to another industry… if you can. Which most people can’t; most teachers, carpenters, and waiters will remain teachers, carpenters, and waiters. Career changes, so flippantly prescribed, are exceedingly difficult. So lobbying for better conditions isn’t whining; working people have a right to demand a say in their working conditions.

Wednesday, January 29, 2020

The Great Political Sand Trap of “Both-Sides-Ism”

The original, offending quote.
Recently I shared a Noam Chomsky quote on social media, accusing America’s two major parties of having essentially similar views and being largely interchangeable. I caught some criticism for this, from readers saying I’d claimed there was no difference between the two parties, a position clearly divergent from current politics, where the Republican party has yoked itself to Donald Trump’s sudden whims. Naturally, this got me thinking.

This quote is classic Chomsky. For those unfamiliar with his politics, he’s long contended that participation in the American electoral system requires candidates to submit themselves to certain shared principles; anyone without these values cannot get elected. Consider Elizabeth Warren’s claim that she’s “capitalist to the bone,” or John Hickenlooper claiming “socialism is not the answer,” to cite just two examples from the current primary campaign.

Sure, Hickenlooper got shown the electoral door early. But Warren remains one of the Democratic Party’s few serious contenders. Besides Joe Biden, who desperately attempts to find “middle ground” where none exists, the Dems’ great hopes currently are Warren and Bernie Sanders, who both promise, to varying degrees, to dismantle the institutional changes President Trump has made, and to return to, or expand upon, Barack Obama’s legacy.

What, exactly, is that legacy, though? President Obama campaigned in 2008 on “Change We Can Believe In,” but once elected, largely promised to continue the status quo indefinitely. His continuation of TARP money transfusions attempted to restore the American financial system to its pre-collapse architecture. And his Affordable Care Act, far from reforming American health care, gave private insurance companies legal standing for the first time ever.

Joe Biden
This continues the Bill Clinton presidential practice. President Clinton, for those unaware, rose to national prominence as a member of the Democratic Leadership Council, which consciously and deliberately steered the party away from its leftward roots in equity politics and labor unionism. As presidential historian Gil Troy writes, one of Clinton’s legendary political promises, “welfare reform,” originated during his long-shot primary campaign, not the general election.

Looking at American political history, it’s clear the two major parties aren’t identical. Democrats have a history of being more libertarian on personal issues, particularly sex and drugs, and (since 1965) more welcoming to immigrants and people of color. Republicans have a history of libertarianism on economic issues, believing money reflects its users’ moral values, and therefore that free-flowing money encourages a moral society. The parties do differ.

However, on issues of sweeping economic scope, like trade and taxes, both parties, since at least the 1960s, have shared important values. Both have largely agreed that taxes should diminish, though they dispute questions like “for whom” and “how much.” Both have, to different degrees, looked askance at unions, and trusted management. Remember, President Kennedy appointed the CEO of Ford Motor, Robert McNamara, as Secretary of Defense.

In case anybody had forgotten that Vietnam happened.

Yes, the economy is only one aspect of political life. But it’s the aspect which everyone, regardless of race or class or party or other dividing line, shares together. We all have economic need for shoes and houses and work. The parties disagree on handling undocumented immigration, but anybody who follows news knows that immigrants enter America mostly looking for work. The economy isn’t everything, Karl Marx notwithstanding. But it’s part of everybody’s shared experience.

Political writer Eric Blanc writes that, before the 2018 teachers’ strikes, which happened primarily in states that supported President Trump, many union leaders expressed equal disappointment in both parties. Democrats have run a lite-beer version of the same economic policy for which Republicans have repeatedly claimed credit: cut taxes, cut spending, starve the public sector. Then promise that wealthy Americans will cover the shortfall in public goods.

President Trump claimed that many voters who supported Bernie Sanders in the 2016 primaries supported Trump in the general election. I, and other left-leaning voters, pooh-poohed this notion. But Blanc found one unionized teacher who said exactly that: she thought Sanders and Trump would both shake up a fossilized system; her greatest disappointment was that Trump clearly bought into the system he’d previously promised to break.

That’s one anecdote, certainly; I can’t imagine any cost-effective way to divine meaningful statistics on who supported both Sanders and Trump in hopes of meaningful change. But this clearly did happen, and it happened because working-class Americans thought the political system served the rich and the well-connected. Sanders’ and Warren’s continued insurgency suggests they’re not wrong. Will either party learn from this while there’s still time?

Friday, January 24, 2020

Midnight's Stepchildren

Charlie Jane Anders, The City in the Middle of the Night

Young Sophie attends college revolutionary meetings, not because she believes the slogans, but because she loves her roommate Bianca. One day, she impulsively protects Bianca from the city’s demi-fascist police. Without even a trial, the police eject Sophie from the city, onto the planet January’s monster-haunted tundra, where, sure enough, she meets a monster. Imagine her shock when the monster tries to befriend her.

Charlie Jane Anders’ second novel uses familiar science fiction elements like Lego bricks, building a complex edifice simply so she can tear it down again. Concepts which genre readers have grown comfy with, like interplanetary colonization and narrowly balanced ecosystems (the symbolism is deliberate and aggressive), become staging grounds for Anders’ statements about what makes humans human. Because, you know, Anders plans to change that definition, as SF writers do.

Planet January is tidally locked, meaning one side always faces the sun, blanketed in permanent, blazing-hot sunlight, while the opposite side remains shrouded in darkness. The zone of human habitability lies in “the twilight,” a narrow equatorial band. This eternal equipoise embodies the most important theme of this book, the polarity between opposites. Everybody and everything exists in complete extremes; humans on planet January have forgotten how to find balance.

This lack of balance comes across repeatedly throughout the novel. Sophie’s story begins in Xiosphant, a city which handles its permanent half-lit state by regimenting everything. The entire city works, eats, and sleeps at exactly the same time… and they also marry and breed on a schedule which Sophie rejects. Exiled from Xiosphant, Sophie and Bianca relocate to January’s only other city, Argelo, a sybaritic haven of constant parties and lawless self-indulgence.

Caught between Xiosphant’s absolute order, and Argelo’s laissez-faire chaos, Sophie discovers the Gelet, the planet’s native species, living peacefully in the midnight ice. Humans hunt the Gelet for meat, and eat the crops Gelet plant in volcanic fissures. Sophie becomes the first human to realize the Gelet are intelligent, and their crops aren’t just plants, they’re the only system keeping planet January from collapsing into environmental entropy.

Charlie Jane Anders
Anders’ narrative voice switches between Sophie and “Mouth,” which preserves the polar opposites theme. Sophie is an educated, idealistic young adult; Mouth is rough-hewn and nomadic. Mouth also never underwent her tribe’s adulthood rites, trapping her in permanent childhood even as she grows cynical and nihilistic; “Mouth” was a placeholder name until she got her grown-up name. But Sophie and Mouth’s worlds become increasingly intertwined, at both their costs.

In discovering the Gelet, Sophie and Mouth encounter an intelligence so alien to human comprehension, twenty generations of human colonists have never communicated with them. Anders’ physical description of the Gelet resembles Lovecraft, but once you internalize their appearance, that comparison ends. The struggle to communicate across species more resembles writers like Stanisław Lem or China Miéville, authors specializing in the true meaning of “alien.”

Readers expecting a “hero’s journey” for Sophie and Mouth will be bitterly disappointed. Anders doesn’t thumb her nose at traditional story forms, but she does recognize life doesn’t work that way. This story instead consists of connected vignettes from the heroines’ lives in what becomes permanent, rootless exile. Themes develop in fits and starts, and lessons learned consist of what the characters extract by force.

And force it is. Planet January is constantly violent. The interplay between the searing daylight and ice-choked night creates an environment of perpetual scarcity and conflict. Xiosphant prevents barbarity from overtaking the population by granting the state a monopoly on violence, a monopoly it uses with brutal efficiency. Argelo has no such monopoly, so life is constant bellum omnium contra omnes, the city little more than an ongoing turf war.

The Gelet city provides an antidote to this polarization, if Sophie and Mouth can only convince humanity. Forced by the planet’s evolutionary stimuli, and empowered by an ability to communicate with complete coherence, the Gelet have overcome their physical limitations and built something durable. Assuming the humans don’t break it. And assuming humans have an expansive enough idea of what “humanity” will ultimately mean.

Anders writes about the common struggle: humans must adapt to our circumstances, natural or technological, but we’re also resistant to change, preferring the comfort of familiarity, even when familiarity is killing us. The novel ends in motion, as one conflict between change and familiarity ends, and another gets rolling. Because, Anders implies, this duality is normal. Humans just learn to live with it, or die resisting. To Anders, it seems, there is no easy middle ground.

Thursday, January 23, 2020

Is It Still Okay to Watch Firefly?

Nathan Fillion as Captain Malcolm Reynolds
I don’t ask the title question flippantly. A trusted friend, a credentialed historian who also first got me interested in this series, recently admitted he has difficulty stomaching the show now. He specifically has problems with its historical parallels: in the DVD commentary, series creator Joss Whedon has admitted patterning the series after the kind of Old West outlaws, like Jesse James and the Younger brothers, who never reintegrated into post-Civil War life.

My friend’s concerns come from a firm foundation. The valorization of outlaws like Frank and Jesse James, who are sometimes misrepresented as folk heroes, serves a definite political purpose. As part of the Lost Cause narrative, people who rehabilitate violent Confederates want to legitimize the Confederate government. This means pretending the Confederate rebellion was about anything other than slavery, the one goal all Confederate leaders agreed the war was about.

Yet I have two reasons why I’m willing to extend Firefly more leeway: one reason from literary criticism, one from history. The lit-crit reason comes in two words: Unreliable Narrator. Like Holden Caulfield or Humbert Humbert, our principal source for understanding the Unification War, Captain Malcolm Reynolds, proves himself untrustworthy, spinning events to preserve his fragile self-image. This probably happens because he knows he’s a common criminal.

Repeatedly throughout the series, events encourage us to question whether Captain Mal’s explanation of the Unification War makes sense. We witness flashbacks to battle in the first and twelfth episodes, but we only get the narrative of causes and motivations delivered verbally. Captain Mal spins an elaborate yarn of independence, self-reliance, and honor, and admittedly, it sounds good. But other times, he lies so flagrantly, it’s hard to say if he even realizes he’s lying.

This reflects Whedon’s love for antiheroes. In his stories, he frequently foregrounds characters who perform good actions for awful reasons, and monstrous opponents whose motivations are nevertheless worthy. Just consider how, in his version of the Avengers, “Earth’s Mightiest Heroes” band together basically on the grounds that “the enemy of my enemy is my ally.” Whedon butters his bread with tales of doubt and moral ambiguity.

But, even taking Whedon’s claims of historical parallelism seriously, that moral ambiguity continues to apply, because historically, the Union victory wasn’t an unalloyed triumph. Though the Union victory resulted in slavery’s abolition, any history reader knows that abolition was strictly pro forma. As Dr. Ibram Kendi writes, both major political parties were equally racist between about 1870 and 1965, and even today, such bigotry has merely gone underground.

Promotional cast photo of Firefly

Please don’t misunderstand me: I believe America has vast moral potential to do tremendous good in this world. But we frequently haven’t lived up to that potential. As Yale historian Greg Grandin writes, American forces flew General Lee’s flag beside the Stars and Stripes while invading Cuba and the Philippines in 1898. Those invasions transitioned into the Banana Wars, where America invaded its southern neighbors almost continuously for nearly two generations.

So the Union victory gave America the moral grandeur of boasting it abolished slavery. But it also gave America the military might and physical capital to conduct major invasions of non-hostile countries, a pattern that continues to this day. And, let’s be honest, the Civil War provided a brief respite from America’s nearly-continuous fighting against indigenous peoples along its western frontiers, a fight begun under President Washington and ended under… well…

We’ll come back to that.

If Firefly really reflects American post-Civil War history, then the struggle between a lying idealist and a tyrannical state-capitalist government really holds up well. We could draw parallels between the “Hands of Blue” and COINTELPRO, or between the Reavers (as revealed in the sequel film Serenity) and… um, I dunno, MK-Ultra? The bombing of MOVE house? As any lit-crit undergraduate eventually realizes, symbolism is always approximate.

Yes, world history has indubitably benefitted from the Confederacy’s defeat and the abolition of chattel slavery. I’m glad the Union won the Civil War. But reducing that conflict and its aftermath to a Manichean division between a clearly heroic Union and obviously villainous Confederacy, diminishes history’s vibrant complexity. History doesn’t unfold like a medieval morality play, despite the politically circumscribed narrative I learned in high school.

My friend’s squeamishness about enjoying Firefly makes perfect sense. I feel likewise while singing along with “The Night They Drove Old Dixie Down.” But I still sing along, because although I believe the right side won the Civil War, I also believe history steadfastly refuses to play into anybody’s neat, unequivocal moral categories.

Tuesday, January 21, 2020

Living With the Ghosts of H.P. Lovecraft

Kij Johnson, The Dream-Quest of Vellitt Boe

Scandal rocks the ivied streets of ancient Ulthar: a young woman has fled with her beau to another world. The great university city of the dreamlands, Ulthar is prestigious, but precarious. In order to maintain the balance with human benefactors and capricious gods, Professor Vellitt Boe volunteers to track the young woman down and bring her home. But what should’ve been a brief mission becomes a massive quest when it appears the woman has entered that great forbidden realm: the waking world.

Hugo and Nebula award-winning novelist Kij Johnson grew up reading H.P. Lovecraft, according to her annotations. Like many readers, she loved his ability to create feelings of creeping dread and nightmare-like atmosphere, without resorting to easy jump scares. But she also wondered at his deep-seated racism and casual misogyny. So, like many good readers, Johnson dived back into Lovecraft’s published corpus and crafted this, a companion to his legendary Dream Cycle stories.

Professor Boe, an erstwhile nomad, rediscovers the joys of crossing the dreamlands, a continent where distances are arbitrary and monsters lurk behind every corner. Death, to citizens of the dreamlands, is a comfortable neighbor. Accompanied by one of Ulthar’s legendary cats, Boe traverses cities with nameless streets, and forests overgrown with vines, until she finds the temple leading to the waking world. But the temple priests don’t have the key to cross; that lies with Randolph Carter.

Yes, Randolph Carter. Just as Lovecraft frequently referenced his own works, Johnson liberally inserts dense references to Lovecraft. Professor Boe knows she must locate her missing student, before Ulthar is destroyed like ancient Sarnath. She names, and fears, Lovecraft’s fickle Elder Gods. In her allusions to Lovecraft’s works, Johnson channels his nightmare-like tone, without collapsing into mindless pastiche. This is both a Lovecraft work, and something more.

Readers familiar with Lovecraft’s works recognize Randolph Carter as Lovecraft’s Mary-Sue character. Which leads me to wonder: is Vellitt Boe Kij Johnson’s Randolph Carter? (Let’s not grammatically parse that sentence.) Boe and Johnson are similar in age, profession, and even appearance. In chasing Randolph Carter across the dreamlands, and finding him a shadow of his former self, is Johnson confronting Lovecraft and his influence on her writing? That’s what I would do.

Kij Johnson
I’m unsure that’s what Johnson does, though. Rather than confront Randolph Carter, Professor Boe utilizes him to pursue her own goals. If Vellitt Boe serves the same role for Kij Johnson that Randolph Carter serves for Lovecraft, then Carter/Lovecraft is neither a monster to defeat, nor a lover to embrace, though Boe implies he might’ve been both once. Instead, Carter becomes part of the youthful influences that made adult Boe. Johnson handles this better than I might.

H.P. Lovecraft was decades ahead of other writers in the weird horror genre, and his influence echoes across today’s publishing world. He was also so unbelievably racist, even by his day’s standards, that his friends and colleagues felt compelled to comment upon it. This duality, between the progressive writer and the regressive man, influences his dreamlands, a landscape that stretches across multiple stories, populated by bigoted stereotypes and, notably, very few women.

Johnson comments upon Lovecraft’s unexamined prejudices by doing the opposite of anything he would’ve done. His world was overwhelmingly male, so she creates a female protagonist. Lovecraft considered the waking world real, and dream people subordinate, so Johnson gives us a dreamlands native. Randolph Carter refused to age, even as Lovecraft handled the years poorly, so Johnson’s Vellitt Boe wears her iron-haired years with resolute pride. Everything Lovecraft would’ve hated, Johnson puts front and center.

The result feels like both a love letter to Lovecraft’s influence, and possibly a Dear John letter. It’s also a self-contained narrative, a story that doesn’t require any previous familiarity with Lovecraft, because Johnson creates character and atmosphere entirely her own. Vellitt Boe walks through a world that resembles our own dreams: we can traverse the land, but can never return, because the path we’ve taken returns whence it began, our own brains. I create as I speak.

If Vellitt Boe is Johnson’s alter ego, she occupies Lovecraft’s world with grace and dignity seldom seen since the original. And if Vellitt Boe is a woman crossing dangerous territory in her own right, she’s a dauntless heroine, a weird fiction icon for the present generation. This book is short, barely 160 pages, yet finishing the story, you feel like you’ve undertaken a momentous journey. Because maybe, in Lovecraftian terms, you have.

Monday, January 20, 2020

Imaginary Vietnams

The Punisher, art (1994) by Gabriele Dell'Otto
Comic book superhero The Punisher debuts in The Amazing Spider-Man #129, cover dated February 1974. The Jackal hires The Punisher, then a mercenary, to assassinate Spider-Man, convincing him that Spider-Man is a common criminal. When the heroes realize they’re on the same side, and The Punisher abandons his contract, Spider-Man asks why The Punisher, a U.S. Marine, isn’t in Asia fighting the war. The Punisher looks aggrieved, saying he has his reasons, and walks away.

Nearing the end of her book Bring the War Home, historian Kathleen Belew describes how the rise of paramilitary White Power organizations justified the accumulation of paramilitary technology in civilian police forces. Facing White Power revolutionaries armed with AK-47s and rocket launchers, the police needed stopping power sufficient to face such excesses. So they began collecting surplus military gear, including tanks, small artillery, and more. The militarization of American police was, initially, a justified circumstance.

But Belew stops counting around 1995. An academic historian, Belew limits herself to documentation, avoids speculation, and stays away from events too recent for dispassionate context. Thus she avoids commenting upon how the militarized police force, mustered to combat militarized domestic terrorist cells, has more recently channeled its aggression onto poor communities, especially communities of color. Tactical forces created to fight White Power groups have begun tacitly doing White Power work on the state’s behalf.

Unlike Professor Belew, I have no such academic restraint. I can observe that since the post-Vietnam White Power movement has dwindled to a shadow of its Clinton-era might, the police have used techniques which the movement first perfected, against the movement’s preferred targets. Just a few years ago, news-hawks like me watched decommissioned Army tanks, repainted in municipal PD colors, patrolling the streets of Ferguson, Missouri, and other cities, keeping the terrified Black populace controlled.

Vietnam War mythology plays heavily into this unfolding history. Professor Belew describes how Vietnam veterans, pumped on anti-Communist propaganda, trained to kill a vaguely defined enemy, returned to America discouraged by the brass’ unwillingness to fight with unrestrained brutality. The Viet-Cong, according to future White Power leaders like Louis Beam and Tom Posey, were so inhuman and pervasive, that killing anybody who looked vaguely Vietnamese was perforce justified to preserve American values and national identity.

This myth, of an overwhelming enemy and a timid government, motivated White Power revolutionaries to declare war on America’s government. But after Timothy McVeigh’s Oklahoma City bombing turned even die-hard conservatives against the militia movement, the mythology underwent a massive transformation. The federal government armed its civilian police forces to fight a guerilla war without front lines on American soil, appropriating the White Power movement’s driving narrative for state use.

Then, the police began internalizing that myth.

The Oath Keepers, an anti-state militia, recruiting on the streets of Ferguson, Missouri

New York Mayor Bill de Blasio got elected, partly, by promising to rescind heavy-handed policing tactics used against poor and minority communities by Mayors Giuliani and Bloomberg. However, in 2014, a police officer killed nickel-and-dime criminal Eric Garner in an illegal choke hold, and de Blasio’s response was… tepid. It took five years for the NYPD to fire offending officer Daniel Pantaleo, and the police union turned Pantaleo into a hero for killing a civilian.

Though this represents one anecdote, it demonstrates a police culture that sees civilians, especially those of a particular complexion, as incipient enemies of the state, policing as a war to preserve powerful but poorly defined values, and government as an impediment to police forces doing their jobs. The Vietnam myth that metastasized into White Power has become, mutatis mutandis, the mythology of American police. And some law enforcement have chosen, as their mascot, The Punisher.

In a recent interview, Marvel Comics writer Gerry Conway expressed dismay about police appropriating The Punisher. “To me,” he says, “it’s disturbing whenever I see authority figures embracing Punisher iconography.” Except I’d suggest, these police fundamentally aren’t embracing The Punisher; they’re embracing a culture-wide myth of former soldiers returning home, facing a nation that’s changed without them, and retreating into the one skill they’ve learned: punishing anyone who deviates from their script with unstoppable violence.

Throughout her book, Professor Belew notes that the real Vietnam doesn’t matter to White Power organizers. It’s the story that drives them, not the facts. Unfortunately, that story, that mythology, has drifted from violent anti-state terrorists, into the mainstream of the state’s instrument of order. The Punisher was never about justice; he was about quieting dissidents. That’s why both rebellious punk teenagers and a militarized, war-myth police have claimed him. Because Vietnam was never about reality.

Thursday, January 16, 2020

White Power and the Other “Lost Cause”

Kathleen Belew, Bring the War Home: the White Power Movement and Paramilitary America

The Vietnam War left nobody happy, least of all those who actually risked their lives fighting a protracted war with shifting goals and no front line. Hundreds of thousands of young American men internalized messages of anti-communism, moralized violence, and implicit (and sometimes explicit) racism. Then the war ended without victory, and those men returned to an America as unprepared for them as they were for it.

University of Chicago history professor Kathleen Belew’s first book seeks to contextualize America’s White Power movement in the last quarter of the Twentieth Century. From the beginning, that context is Vietnam. The military’s hard-right rhetoric, and the constant paramilitary fight, trained young men, and some women, to live in constant fear of incursion and instability. Former soldiers mustered out, and too many transitioned into something new: White Power.

America’s cultural milieu in the 1970s provided constant fodder for White Power activists. The political left continued shattering, and Communist, socialist, and reformist factions squabbled internally, while right-wing groups which formerly hated one another, particularly the Klan and neo-Nazis, began finding common ground. Vietnamese refugee resettlement gave White Power a ready-made enemy. The criminal justice system wasn’t prepared for war on the home front.

Belew repeatedly makes reference to what she calls “the Vietnam War narrative.” This concept holds that history isn’t simply what happened; history is the narrative we use to organize past events into comprehensible nuggets. Some veterans, like the Brown Berets, returned home and organized for left-wing causes, Belew writes. They used Vietnam’s narrative to advocate greater equality. But the Vietnam narrative gave bigots a story of top-down betrayal, racial animus, and anger.

Bringing that narrative home, some White veterans saw a government which hadn’t permitted them to win; a judicial system which appeared, through their lenses, to favor minorities and immigrants; and a political system uninterested in hearing their stories. They thought they’d lost at home, too. So they repurposed the war’s anti-communist rhetoric, gathered among their fellow true believers, and did what America trained them to do: organize for war.

Before 1975, White supremacist organizations, like the original Klan or the John Birch Society, presented themselves as allies of America, preserving the state against degradation and decline. White Power, however, considered the state an enemy. They openly presented themselves as revolutionaries prepared to combat an occupying tyrannical authority. The answer, they believed, lay in overthrowing the state… then killing everyone whose skin wasn’t completely white.



Until that revolution arrived, White Power groups like Christian Identity and the nascent Militia Movement practiced holding maneuvers. They organized against Communists and the Civil Rights left. They shot organized Communists in North Carolina in 1979, killing five, and juries refused to hold them accountable; similarly, in 1988, juries ignored reams of overwhelming evidence and refused to convict one of the most robustly documented conspiracies ever exposed.

Narrative looms large in Belew’s telling. Not only the war, but the narrative of American history; the narrative of gender roles; the narrative of apocalypse. Works of fiction, like The Turner Diaries, drove White Power as much as military procedure guides. White Power leaders, Belew writes, knew they couldn’t win revolution outright; but they attempted to control America’s storytelling environment, and thus win countless eventual converts to the cause.

This storytelling became increasingly important as the revolution kept not happening. The White Power “movement” descended into nickel-and-dime Mafia behavior, but required constant fresh blood. So the narrative evolved. “As younger activists joined the white power movement in this period, the Vietnam War narrative became increasingly unmoored from a lived experience of combat,” Belew writes. Control of the story, official or otherwise, became as important as control of guns.

Reading Belew’s account, I recalled two other narratives originating among losing armies: the Lost Cause narrative after the American Civil War, and Germany’s Stab-in-the-Back narrative after World War I. Both posited that their side could have won, under idealized circumstances, and both felt morally justified in preparing for another war. Those narratives gave us the first Klan and the Nazi Party. Which brings us full circle to the groups which allied to form White Power.

Belew’s account stretches from 1975 to 1995, ending with the Oklahoma City bombing (which Belew contends, with evidence, was a White Power action, not a lone-wolf attack). Anything after 1995, she says, is too recent for context. But by organizing seemingly far-flung events into a digestible story, she provides current anti-racists with a persuasive counter-narrative of how we reached the present. Because, deep down, she’s right: control of the story matters.

Tuesday, January 14, 2020

Living the New Post-Holiday Blues


I still remember the year I needed to pop into Walgreen’s on December 26th to collect a prescription. Near the checkstands, some poor minimum-wage worker was dismantling the storewide Christmas decorations, while mere steps behind her, another worker was unpacking and assembling the Valentine’s Day bunting. I had to laugh out loud, because the alternative, giving in to despair, seemed a little too close to home.

In the moment, I had the usual Middle-Aged White Guy responses: should I tell this corporate behemoth that Christmas lasts until January 6th? Or should I offer a snide comment about how American consumers feel compelled to seek another opportunity to spend money? Aging honkies like me love criticizing commercialism, especially when holiday decorations appear too early. It makes us feel righteous and vindicated.

The longer I marinate with that experience, though, the more I doubt my initial reaction. If one store caromed quickly from one holiday to another, that’d be a sketchy commercial decision. When every retailer, advertiser, mall, and website is already pumping us for another pan-cultural buying fest, that indicates something’s gone systemically cockeyed. It forces me to wonder who holidays are actually for.

It doesn’t take a degree in etymology to recognize that “holiday” is a compound of “holy day,” and that’s the role holidays originally played. German theologian Rudolf Otto writes that calling something holy, rather than conveying moral purity and goodness, originally meant being set apart and different. Being holy meant being separate from this world and its workaday pressures, whether that meant God’s separateness, land unmeet for walking on, or a day set aside.

That’s what holidays were, too: days set aside from tedium, obligation, and this world’s well-compensated march toward death. In a pre-technological society, the things that make life, like work, food, and children, are also reminders of impending mortality; being alive entails preparing for death. So pre-technological people scheduled separate time intended to celebrate being alive, acknowledging our existence matters while we’re here.

Sometimes, these celebrations were communal. Many prominent holidays on the Western calendar, like Christmas and Easter, were celebrated in unison throughout Christendom: every observant Christian knew every other observant Christian was also saying Easter prayers at the same moment as themselves. Other holidays, like July 4th or Bastille Day, represented only one nation, region, or population—but, importantly, that entire nation, region, or population together.


Other holidays were personal. To this day, Roman Catholic tradition preserves the practice of Saint’s Days, when people celebrate particularly holy persons whose influence the individual hopes to emulate. Baptism and naming days also once loomed large in Christian tradition. Other private holidays have applied in other religions or cultures, including periods of personal spiritual quests, or anniversaries of important accomplishments. Everyone had days which were entirely their own.

Not anymore. For most people, the only personal holidays remaining are your birthday and your wedding anniversary, and these have no cultural standing: if hourly wage earners schedule their birthday off, they’re vulnerable to mockery. You’re only entitled, anymore, to whatever holidays everyone else celebrates communally. Whether religious holidays like Christmas or Eid, or national holidays like May Day or the monarch’s birthday, all celebrations are communal, and scheduled.

Put another way, you only have programmed calendar opportunities to exist for yourself and your spiritual beliefs. Every other day, you’re required to exist for your country and your employer: work at required times, return home to the unpaid labor of raising a family, and sleep when every other obligation allows. The current system—call it capitalism, statism, or whatever—coördinates time to work, time to sleep, and time when you’re allowed to be happy.

And that happiness better be communal, or it’s deemed illegitimate.

Walgreen’s wasn’t demeaning Christmas or Valentine’s Day by rushing from one set of precut bunting to another. They merely acknowledged the collective schedule by which we, and they, are permitted to be happy. Our society reserves specific days, at scheduled intervals, to exist as “holy,” as set apart from public economic expectations. We live regimented lives, and only when authority gives permission may we do something reckless, like be happy.

Perhaps, in today’s socioeconomic system, one of our remaining acts of rebellion is to feel happy off-schedule. I don’t know how to accomplish that, how to simply enjoy living with the sun on my face like I did in elementary school. But I’ll be looking into that immediately. Because if happiness has become countercultural, baby, I want to be the happiest man alive.

Friday, January 10, 2020

What Do I Mean By “Bureaucracy”?


My favorite recent word, apparently, is “bureaucracy.” In writing, I’ve used it lately to describe everything from how corporations and governments ration health care, to how banks strangle locally controlled communities. In speech, I’ve used it to explain why capitalism, communism, and socialism are all equally degrading to human spirits. But this week, when I used the word “bureaucratic” to describe how adults make friends, I realized I need to define the word better.

The Oxford English Dictionary defines bureaucracy as “a system of government in which most of the important decisions are taken by state officials rather than by elected representatives.” This definition is useful but, like most dictionary definitions, it sands away controversy and nuance. For example, the “rather than by elected representatives” tag implies bureaucracy is a degenerate form of democracy. And the “system of government” qualification suggests bureaucracy only describes states, not other complex institutions.

When I taught at the university, I saw bureaucracy applied to aspects of teaching: course goals written by people outside the discipline, mandatory clauses added to the syllabus by lawyers, and top-level leadership appointed by government without classroom experience. I can only imagine what public school teachers endure under No Child Left Behind and Common Core. I’ve heard horror stories of mandatory curricula and standardized tests written by God alone knows who.

In the labor pool, I’ve witnessed site “managers” with no autonomy, lugging binders full of rules written by somebody up the hierarchy, with goals they’re required to accomplish, and regulations they’re required to enforce. Schedules get written by office workers, schedules with no consideration for contingencies like time lost to weather. (In a world wracked by global warming.) Production quotas get written by actuaries who have studied assembly line schematics, but never worked the line.

That’s just my personal experience. I’ve heard similar stories from other fields: a former police cadet who quit the academy when she realized the brass cared more about enforcing regulations than guarding justice. A theatre director who got a day job because he discovered the board cared more about appeasing donors than creating art. A pastor, told to adjust his preaching of the gospel to smooth relations with the episcopate. “Bureaucracy” means rules win over honesty.


For me, “bureaucracy” means a system of control where a layer of operatives is appointed to enforce rules, but given no discretion over them. Obeying rules becomes paramount, regardless of whether obedience serves the purpose for which the rules were written. Anybody who’s ever been stuck at an unreasonably long red light at two in the morning knows that sometimes, ignoring the rules is the reasonable choice. But bureaucracy enforces rules because they are rules.

Admittedly, the complexity of living in a technological society with a large population makes some level of bureaucracy necessary. The long supply chains needed to produce computers or diabetic syringes mean somebody has to govern the transport and assembly, and the corporate nabobs responsible for top-level decision-making lack the time and skills to do this. So professionals get appointed because they know how to make the necessary decisions to accomplish these goals with minimum delay.

Problems arise when corporations, governments, and other institutions offload responsibilities so massive onto their professionals that those professionals come to dominate the organization’s time and budget. A recent Reuters headline read: “More than a third of U.S. healthcare costs go to bureaucracy.” Economic historian Jerry Z. Muller writes that government and the economy generate so many accountability rules that institutions must hire entire staffs to enforce them. Dollar-value economic gains vanish into the system.

So what do I mean when I describe friendships as bureaucratic? Social media has become awash in supposed rules like “Dump toxic friends immediately” and “Only keep friends who lift you up,” rules written by people who don’t know you and your circumstances. Just like CEOs and Congress Critters, these rules writers craft regulations somebody else has to enforce. And that someone is you: you’re both the subject of strict behavior policing, and the police.

Bureaucracy, by its nature, is always blind to real-world circumstances. It always values strict adherence over experience and discretion. Bureaucracy doesn’t permit you, no matter how wise you are, to make informed judgments; compliance, ultimately, is its only principle. And again, in small doses, it’s often necessary. But when bureaucracy becomes its own moral justification, and I believe in our society it has, then bureaucracy becomes anti-human. Then humans develop a moral obligation to refuse.

Wednesday, January 8, 2020

What Are Friends Even For?

“Drop toxic friends before they drag you down.” We’ve seen the message repeated hundreds of times recently, especially if we follow social media: some people have personalities so poisonous, we must jettison their influence immediately. If you’re anything like me, your mind immediately flashed on that “friend” we all have, the friend who foments quarrels for kicks, or encourages you to drink past your limits, or spreads conspiracy theories, or is just an insufferable jerk.

In an article that isn’t recent, but which recently crossed my desk, political journalist Ephrat Livni complains that this reasonable goal, of removing influences which dirty our spirits, has itself become toxic. Livni cites multiple articles, from top-drawer newspapers and journals, encouraging readers to disaffiliate from friends with glum personalities, negative traits, or poor personal habits. She protests, not unreasonably, that this reduces friendships to capitalist exchanges, and relationships to long-term personal interest.

I find myself torn. Like many suburban White kids, I grew up encouraged to be endlessly accommodating, to keep making friends, and never surrender on existing friendships. “Make new friends, but keep the old,” the campfire song we learned in Cub Scouts taught us: “one is silver and the other’s gold.” Even when dealing with schoolyard bullies, my adult influences encouraged me to befriend them: maybe they’re angry, parents and teachers suggested, because they’re lonely.

This encouraged a cycle of enabling dysfunctional personalities and catering to unreasonable demands, a cycle I must consciously resist in adulthood. Not coincidentally, as we witness the violent predations of dysfunctional, unreasonable people like Jeff Bezos and Elon Musk, I’ve realized this message of appeasing destructive people isn’t merely misguided or naïve; it serves a sociopolitical order where hoi polloi like me cater to our overlords’ pig-headed demands. It’s political control disguised as common ethics.

Plato and Aristotle argued over what makes a good friend, too. They never found an answer.
However, I’ve also known people whose personality traits deserve ostracism. People who, for instance, seek justification for their self-destructive habits by encouraging me to mimic them, and finding ways to make me feel belittled if I don’t. People who see disagreement as personal disloyalty, and respond with shock-and-awe force. People whose demands for comfort and nurturance become constant, making me into a surrogate parent for a damaged adult. Sometimes you have to draw the line.

One of modern adulthood’s most vexing questions is: how do we make friends once we’re past traditional school age? As we’ve become unrooted from geographic place and inherited community, many grown-ups struggle to make meaningful friendships. Which, I suspect, explains why adults need reminding to shun toxic people: we fear returning to that unmoored state. Bad friends feel better than loneliness, right? In the short term, anyway. And, oops, capitalism is all short term.

Social and economic orders seek rules which bureaucrats can enforce without recourse, because if everybody follows the same rules equally, nobody can claim they’re treated unfairly, right? We know, of course, that this isn’t so. In third grade, Brian Morse clotheslined me from behind for “being a nerd,” and we both got three days’ detention for “fighting.” That’s how I discovered that treating everyone fairly doesn’t mean treating everyone equally. Bureaucratic rules are frequently unfair.

This applies to friendships, too. Both the heedless adherence to rules like “drop every toxic friend now,” and the magical thinking of believing that I can befriend, and thereby detoxify, every schoolyard bully, mean the application of unthinking bureaucratic rules where they often don’t apply. We seek rules because they provide one-size-fits-all coverage, except they don’t. We’re uniquely human, and so are our friends. Put complex individuals together in friendship, and our uniqueness becomes cumulative.

Psychologist Ellen Hendriksen answers the question of making friends in adulthood concisely. Our friends, she writes, aren’t the people we share the most values and interests with, but those we share the most time with. That’s why children make friends at school, and adults make friends at work. We can govern our time to ensure we meet, and keep company with, people who suit our values. Perhaps we should. But genuine friendships happen over time.

Ephrat Livni bewails using metrics to determine who constitutes a “positive friend.” Me too, because friendships sustain us through bad times and give us opportunities to practice empathy. But sometimes, healthy people need to discontinue chronically unhealthy influences. Mindlessly applying rules, whether to keep or snub friends, leads us to unhealthy conclusions. Friendships aren’t passive. We must remain actively committed to making wise, adaptive choices. Stop looking for rules, and get to know your friends.

Friday, January 3, 2020

The War To End All—

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 35
Barry Levinson (director), Wag the Dog


An unpopular and pathetically incompetent President needs a war, to distract America from his latest debacle. So his personal fixer does what any political operative naturally does: he phones Hollywood. There, a well-respected producer with political ambitions helps organize the illusion of a humanitarian crisis in Albania, which only America’s military might can solve. Suddenly, patriotism is popular, and the President’s ratings soar.

This 1997 movie has become associated with President Bill Clinton, who engaged the American military in 1998 and 1999 as sexual scandals threatened his administration. But it was written to satirize George H.W. Bush, who almost salvaged a flagging presidency by sending troops into Operation Desert Storm. Since its release, every single American President has, at least once, used military accomplishments to build popularity during sudden, violent controversy.

Spin doctor Conrad Brean (Robert De Niro) takes personal pride in his relationship with Hollywood. He boasts that Desert Storm’s most iconic image, a satellite-guided “smart bomb” flying down an exhaust chimney and striking a precise target, was faked using miniatures in a Falls Church, Virginia, studio. So when the President gets caught on tape soliciting an underage girl, presidential aide Winifred Ames (Anne Heche) calls Brean for an encore.

But Brean cannot accomplish this alone. He pulls strings to bring producer Stanley Motss (Dustin Hoffman) into the operation. Motss’ connections secure a photogenic young “war orphan,” played by ingenue Tracy Lime (Kirsten Dunst), whose apparent flight from her bombed-out village, kitten in hand, becomes nationwide news fodder. They masterfully hide that this “Albanian village” was manufactured on a greenscreen stage.

Brean warns Motss, Lime, and everyone involved, that their participation must remain completely secret. The slightest leak, he implies, will result in midnight assassinations by highly trained CIA operatives. Brean doesn’t realize, however, that CIA leakers have already sussed his ruse, and rushed oppo research to a highly popular Senator from the other party (Craig T. Nelson). Suddenly the fake war becomes a proxy for a bitterly controversial reëlection campaign.

Everything so far might conceal one important fact: this movie is a comedy. The humor is subtle and incisive, sometimes specifically targeted at events unfolding in the 1990s. Its understated, dry style, which amuses without necessarily causing laughs, reflects the influence of screenwriter David Mamet. (Mamet shares billing with first-draft author Hilary Henkin, but Mamet wrote Levinson’s final shooting script.) Maybe the wit is dry. Or maybe it’s too precise.

Robert De Niro, Anne Heche, and Dustin Hoffman in Wag the Dog


Brean, Ames, and Motss struggle to maintain the illusion of overseas involvement because they believe their President’s statements of principle. But believing the President’s words often means overlooking his actions. Their President is clearly a sexual predator more intent on maintaining power than governing responsibly. Yet his operatives believe, by keeping him aloft, they can accomplish their goals—which they never discuss, and so maybe don’t agree on.

This story builds upon the principle that Presidents love war. Given the Constitutional balance of powers, one could persuasively argue that, during peacetime, Presidents matter less to ordinary Americans than do Chairs of the Federal Reserve. But wars give everyone shared goals to build towards, something FDR discovered during World War II. Perhaps that’s why America has been engaged in an undeclared shooting conflict with someone, somewhere, continuously since 1947.

Unable to control the war narrative, our trio of weary antiheroes shifts to tubthumping about a POW supposedly left behind enemy lines. Wow, shades of Bowe Bergdahl. They organize an astroturf campaign to rescue Sgt. Schumann (Woody Harrelson), played by a soldier selected from headshots. But their fake POW proves impossible to control, and threatens to become a bigger PR nightmare than the war. Faking the news may be harder than making it.

Levinson directed this movie before social media, “fake news,” and “deepfake” videos became headline-grabbing concepts. He couldn’t possibly know the horrific stories about attempts to control the public narrative which have dominated American journalism since around 2015. Yet he accurately describes the combat between exciting public narrative, and boring old truth, which has become the ascendant conflict in modern politics. He just made it twenty years early.

Late in this movie, our unholy trinity contracts a down-at-heels folksinger (Willie Nelson) to fake an antique 78-RPM Smithsonian Folkways record. One couldn’t find a better analogy for this film. If we wanted to counterfeit a “classic” movie commenting on today’s fraught political scene, it would almost certainly look like this. If I hadn’t personally watched this movie in 2005, I’d think it was too on-the-nose to possibly be real.

Wednesday, January 1, 2020

Some Thoughts on the N-Word

“Content warning.” Isn’t that what you’re supposed to say anymore, before addressing a potentially inflammatory topic? We’ve taken a perfectly reasonable premise, that you shouldn’t spring possibly offensive topics on people without warning, and interpreted it to mean everyone is a time-bomb of hurt feelings waiting to explode. Which means we can’t name certain topics directly; we can only broach them surreptitiously, like talking around a kid’s surprise birthday party, only more destructive.

I first remember hearing the phrase “the N-word” during the O.J. Simpson trial, in 1995. Simpson’s defense team contended that LAPD detective Mark Fuhrman had used a certain piece of racial invective in the past, sometimes on the job, sometimes directly into a tape recorder. They argued Fuhrman’s racism disqualified his testimony about the crimes Simpson was accused of. But though everyone knew what word Simpson’s lawyers meant, they never said it.

Thus, “the N-word” entered the pantheon of words I learned, in childhood, you mustn’t ever say, because simply arranging those sounds with your mouth caused irreparable harm. “The N-word” landed alongside “the S-word” and “the F-word” as terms we acknowledged existed, and everyone knew what they were, but nobody ever said them, knowing that some authority figure waited to punish us for speaking them. (Let’s ignore The L-Word for now.)

As I’ve gotten older, my understanding has bifurcated. The S-word and the F-word are Anglo-Saxon terms which describe bodily functions everyone performs, yet everyone is embarrassed by. Combining that embarrassment with the common language of a despised conquered people made the words themselves embarrassing, a reminder that having a shit, or fucking your spouse, is a common thing that peasants do. The horror!

Get over yourself.

Actually speaking the N-word, however, conjures up memories of historical injustices perpetrated against a population designated subordinate and inferior for completely arbitrary reasons. I understand why people who look like me shouldn’t speak that word, shouldn’t revive those memories, because us White people can’t go there without getting some “there” on us. Unlike “shit” and “fuck,” the N-word really does belong in the ash heap of history.

However, you know I wouldn’t be writing this if the story stopped there. Because in thinking about what happens if us honkies say that word, I’ve come to distrust the power with which we invest that word. We ofays perform elaborate verbal tap-dances to avoid speaking that word, because the very sound sullies the conversation. Like Lord Voldemort, or saying “Macbeth” in a theatre, the word, in isolation, is so powerful that its very sound causes damage.

Philosopher J.L. Austin, in his 1962 book How To Do Things With Words, describes the ways words change reality. Words like “I christen this ship the USS Valiant,” or “I do,” change the world around them. Sure, they don’t make physical changes. The piece of tooled iron doesn’t come alive because we give it a name, and the married couple still consists of two people whose physical presences remain the same.

Yet despite this physical continuity, reality changes, because—follow me here—reality doesn’t exist without humans to organize it. Sure, objective physical reality exists, but that isn’t what we mean when we say “reality.” We mean our capacity to understand, name, and use those things which exist before us. Humans breathed oxygen before we discovered its chemical nature, but by naming and cataloging oxygen, we changed our relationship with matter, and thus changed our reality.

So yes, saying the N-word changes reality, letting the history of racialized violence into the room. But must it, necessarily? Like Lord Voldemort’s name, the word lets evil in because we expect it.

Consider it this way: if I say “the N-word,” you know what word I’m avoiding saying directly. But I’m not avoiding the word’s meaning, I’m just avoiding saying it. I expect you to, silently and internally, insert the content I’ve deliberately omitted. So I’m not saying the word, I’m making you say it. It’s the same as bleeped words on network television: the word doesn’t vanish. They’ve just offloaded the responsibility for saying it onto you.

All this means, in short, that after thinking about this topic, I’m more confused than ever. The N-word isn’t Lord Voldemort, but it isn’t harmless either. Saying it directly invites harm, but avoiding saying it just moves responsibility around. And ignoring it, as we learned in Charlottesville, doesn’t make it go away. We White English-speakers have created a linguistic tapeworm. Now it’s up to us to decide just how much we’re going to feed it.