Friday, July 30, 2021

The Return of Cartoon Morality

A promotional still from Netflix’s Masters of the Universe relaunch

Masters of the Universe wasn’t really my thing in the 1980s. I watched it occasionally when it aired, but it mostly ran when my parents expected me to do homework, so I never followed story arcs (such as they existed in Reagan-era animation) or savvied the characters. So when Netflix proudly announced their intent to resume the show where it ended, my first reaction was to shrug.

I realize I’m at the age when the popular culture of my childhood has become mythologized. I grew up watching the Eisenhower Era and its hangover treated as somehow sacred. Nick at Nite reran Mister Ed, The Donna Reed Show, and Rowan & Martin’s Laugh-In with reverence once reserved for Bach chorales and High Church Mass. The Wonder Years was one of network TV’s biggest hits, and Classic Rock Radio first became a thing.

So maybe I should’ve been better prepared, thirty years later, when the mass media icons of my childhood became similarly fetishized. From Michael Bay’s Transformers movies to Masters of the Universe, I’ve watched the chintzy animation of my childhood, little better than commercials when they first aired, turn into half-serious mass-market art. The result has left me curious and dumbfounded.

He-Man and Optimus Prime began life, of course, as toys. With media largely deregulated during the Reagan Administration, the strict boundary between TV content creators and the advertisers who subsidized them melted away. The syndicated weekday cartoons, few of which lasted beyond a single season, mostly existed to sell toys. And I, desperate to fit in, purchased some of them, particularly Transformers.

Looking back, these shows’ storylines had messages I didn’t necessarily catch as a child. He-Man and Optimus Prime both had American accents, with greater or lesser degrees of slow Southern drawl; Prime practically sounded like a stand-up comedian doing a John Wayne impression. By contrast, Skeletor, Megatron, and Cobra Commander all spoke in British Received Pronunciation, offset by their raspy voices. Villains were easy to spot: they spoke like aristocracy with laryngitis.

The morality of these shows had the complexity of comic strips in Boy Scouts magazines. Heroes and villains were unambiguous, and heroism or villainy was simply the characters’ respective natures. Villains lived sumptuously, ate caviar, and shat upon their sidekicks, while heroes ate red meat, slept rough, and had tight friend networks. It’s tough to avoid correlating these stories’ simple morality with Late Cold War propaganda.

Michael Bay with Bumblebee, perennially the most popular Transformer

Therefore, though it’s tempting to treat recent reboots of She-Ra and Voltron as mere nostalgia, the moral complexity these characters introduced strikes me. The revamped Voltron Force’s attempts to maintain enthusiasm for their mission, facing an empire driven by strange and often poorly defined motivations, seem intended more for adults than children. Are we as Americans, the story seemingly asks, the plucky adolescent heroes, or are we the empire?

I have avoided watching Michael Bay’s Transformers movies, not out of love for cinematic grandeur and distaste for Bay’s notoriously excessive spectacle, but because the Transformers represent a piece of my childhood I’m not proud of. I purchased Transformers toys, not because I wanted to participate in their story, but because I thought I might fit in with other boys. I thought I could purchase my way into normalcy, a late-capitalist attitude that’s hard to shake in adulthood.

Therefore, when I see these artifacts of my childhood revamped for adults, with the complex morality that grown-ups know and fear, I’m left puzzled. We adults know that reality doesn’t break down neatly into good and evil, that villainy is contextual rather than innate, and that carefully placed violence doesn’t really make social problems go away. Yet exactly that attitude is being marketed back to us.

Netflix’s decision, with their Masters of the Universe relaunch, to resume the existing story rather than reboot it as they did with She-Ra, makes me wonder who these products are for. Despite attempts to popularize binary morality with stunts like the War on Terror, we just don’t live in the Cold War anymore; pitting bare-chested American virility against the skeletal Red Menace surrogate makes little sense. Who asked for this?

Then I realize: maybe we did. As the oldest Millennials have already turned 40, and Generation X has rollover IRAs, maybe we wish we really had the moral clarity we imagined in childhood. In a world where our greatest enemy is a virus, and where our fellow citizens are the greatest threat to democracy, maybe we want a bad guy we can punch. Maybe we just miss morality.

Tuesday, July 27, 2021

What Would Father Damien Do?

God save me from ever being this kind of Christian.
via Twitter: Tim Meshginpoosh/@OldReepicheep

As Christian as I am, I deeply distrust public religiosity, like snake handling or praying in tongues. Rather than demonstrating a commitment to living the Gospel, depicted in Jesus’ life and teachings, people who participate in these displays prefer showing fellow Christians how profound their faith is. Like Jesus’ proverbial pharisee, praying on street corners, such Christians choose a vaudeville religion. Now let’s add vaccine denialists to that list.

Antivax Christianity, like snake-handling, is a heresy, because it privileges displays of human glory over quiet lives of faithfulness. Claims, like the one above, that Christians don’t need medicine because we have “the antibody of Jesus Christ,” aren’t meant for healing the sick or comforting the grieving. This form of showboating religion treats humans like God, and mistakes foolhardiness for faith. As I've written before, historically, these displays end tragically.

Watching grotesqueries like this unfold, my mind drifts to Father Damien of Moloka’i. Born Jozef De Veuster in Flemish Belgium, he felt his religious calling at the unusually senior age (by 19th Century Catholic standards) of eighteen. He took the religious name Damien during his novitiate, traveled to Hawai’i as a missionary, was formally ordained in Honolulu, and was briefly shuffled around several parishes throughout the Kingdom of Hawai’i.

In 1866, King Kamehameha V authorized creation of a leprosy (Hansen’s disease) colony on Moloka’i. Like syphilis, influenza, and whooping cough, leprosy was introduced to Hawai’i by outside traders; unlike those other diseases, leprosy could be contained. The Kingdom established its colony at Kalawao, a remote peninsula accessible only by sea. The patients confined to Kalawao immediately endured spiritual dislocation, acedia, and great suffering.

Father Damien volunteered to plant a parish at Kalawao, which eventually became Saint Philomena’s. With 600 patients, the colony was fairly large, and Damien was the only pastor. The bishop intended to rotate several pastors through Saint Philomena’s every six months, to protect the priests’ health, but a paucity of volunteers meant Damien didn’t have any relief. That, my friends, is where Damien becomes relevant today.

The official state statue of Father Damien
at the Hawai’i Capitol in Honolulu

Given the opportunity to leave Saint Philomena’s empty, Damien elected to remain. The medical mission to Kalawao was financially well-supplied, but short-handed. Thus, besides his apostolic mission, Damien also became volunteer nurse, house carpenter, schoolteacher, and eventually, coffin builder and gravedigger. Can you imagine today’s megachurch pastors changing their parishioners’ suppurating lesions? Can you imagine Rick Warren or Joel Osteen digging their parishioners’ graves?

Though Father Damien worked with superintendents who were Hawai’ian or part-Hawai’ian, they rotated out for safety. Damien alone worked without relief. Under his watch, the lepers’ shacks were upgraded to painted houses, and many operated successful farms. Damien was only part of the lepers’ support network, but he was the only part which remained constant for several uninterrupted years. Reportedly, he even began preaching in the Hawai’ian language.

Most important, Father Damien had no magical illusions that Jesus provided him immunity from disease. When he’d ministered in Kalawao for eleven years, he realized he’d stopped feeling pain in his extremities, a diagnostic sign of Hansen’s disease. He’d become a leper. The disease became so advanced that he needed to wear a bamboo frame beneath his liturgical vestments to keep his clothing from blistering his skin.

And still he served his parish. He eventually served sixteen years among perhaps the most feared population in Hawai’i, then died among them, and was buried in their midst. Having the protections of a European in an essentially colonized nation, he could’ve received much better treatment, if he’d only left his parishioners behind. But he didn’t perceive Whiteness as gain; he served lepers, until he became a leper.

Witnessing today’s spoiled, White megachurch Christians claiming God will prevent them getting infected with a highly infectious disease, I can’t help comparing them to Father Damien. For him, Christianity meant getting his hands dirty, getting the castoffs from parishioners’ saturated bandages on his clothes. It meant loving them during their lives, and ensuring they had honorable burials when they died. Christianity meant walking into Kalawao, and never walking out.

I’m not always the best representative of Christianity. I get short-sighted, focused on my own comfort and ease. It’s easy to say “I’m only human,” but so were the Apostles. And so was Father Damien. In an era defined by colonial poverty, a rootless state, and widespread disease, I want to be more like Father Damien. I want to stop thinking about my glory, and instead serve the suffering, those Christ calls us to serve.

Friday, July 23, 2021

Yankees and Ayn Rand and Bears, Oh My!

Matthew Hongoltz-Hetling, A Libertarian Walks Into a Bear: The Utopian Plot to Liberate an American Town (And Some Bears)

Tiny Grafton, New Hampshire, was carved from New England’s forested wilderness before the Revolution, by a cadre of committed tax-dodgers. Its character has changed little in the intervening 260 years. When a committee of Libertarians, jazzed on Internet forums and ideological fervor, went looking for an existing small town to colonize for their values, Grafton looked amenable. But the Libertarians failed to plan for Grafton’s encroaching wilderness.

Advance PR for Matthew Hongoltz-Hetling’s first book is somewhat misleading. It implies that Libertarians, unified by an ethic of complete hands-off government, remade Grafton in their image, then bungled it when New Hampshire’s swelling bear population invaded. This isn’t what happened in Hongoltz-Hetling’s actual telling. If you’re seeking ammunition to deploy against Libertarianism in online arguments, this book isn’t it. No, Hongoltz-Hetling has written something different altogether.

To start, the bears, mostly American black bears, didn’t follow the Libertarians into town. Hongoltz-Hetling dates the first attack, when a bear swooped into Jessica Soule’s lawn and gobbled two of her kittens, to five years before the Libertarians arrived. New Hampshire’s dark, moody wilderness, once nearly obliterated, has been gradually returning for about seventy years, bringing bears, coyotes, and bobcats with it.

Second, the Libertarians less remade Grafton, which already was chronically tax-averse and had few public services, than exaggerated its already lawless tendencies. The Libertarians took a community, which had never been particularly large or prosperous, and colonized it with anti-government values, though they frequently didn’t agree on what that meant. The result was chaotic, and for many, ended in tears.

Despite this, Hongoltz-Hetling, a Polk Award-winning freelance journalist, tells a remarkably complex and humane story of people whose ethics worked well on paper, but faced unanticipated obstacles in real life. He anchors most of his story on two transplants who experience Grafton through unique lenses. John Babiarz basically invited the Libertarians into Grafton, then couldn’t control them. Jessica Soule and her cats lived in fortress-like solitude as bear encroachments increased.

Libertarianism is a big-tent philosophy. Grafton attracted survivalists, revivalists, conservative anarcho-capitalists, centrist Christian utopians, and progressive hippies. They agreed that government should generally leave individuals alone, but beyond that, they agreed upon little. While some Libertarians attempted to create businesses, churches, and other community amenities for Grafton, others clung to the town’s margins, prepping for the Apocalypse, and foraging like… well, like bears.

Matthew Hongoltz-Hetling

Stoked with ethical fervor, Grafton’s Libertarians disrupted town business, did dangerous things intended to provoke confrontations with bureaucrats, and drained town coffers. As John Babiarz describes it, they wanted the freedom of Libertarianism, without the concomitant responsibility. Though Libertarians believe the Free Market will fix everything, without repaired roads and basic public services, Grafton’s economy collapsed. The general store, Grafton’s last business, shuttered in 2018.

Meanwhile, the bears became increasingly fearless. Once edging toward extinction, New Hampshire’s forests, and the bears that occupy them, have resurged. But these aren’t like the bears Grafton’s colonists terrorized. Once afraid of humans, these bears have become brazen, wandering into town, preying on domesticated animals. Bears have adapted to human environments. They’ve apparently developed a taste for cats, not normally part of bears’ diets.

Though the Libertarians and bears apparently colonized Grafton separately, they formed a mutually reinforcing loop. Libertarians’ disdain for regulations caused them to sloppily handle trash, which bears loved to raid for cast-off food. Bears’ fearlessness around humans provided Libertarians justification for one of their favorite causes, open-carrying firearms for personal defense. The situation was compounded because some Libertarians shot bears, while others fed them.

Hongoltz-Hetling describes a community fraught with bitter competition: bears versus humans, old Graftonites versus colonists, and sectarian feuds among the Libertarians. Without a guiding philosophy, civic life in Grafton begins drifting. The town becomes too cash-strapped to fight fires, while New Hampshire becomes too broke to manage its spreading wilderness. Soon Grafton becomes the site of something nobody expected: New England’s first unprovoked bear attack in a century.

Though a journalist, Hongoltz-Hetling tells the story with a novelist’s aplomb. His descriptions of bear depredations have an almost Stephen King-like atmosphere, while his descriptions of political wrangling resemble John Grisham dramas. He cites statistics and creates a historical context where necessary, but throughout, the human narrative of Grafton’s unwinding takes center stage. We feel like we’re watching our hometown’s final days.

It would be easy to waste a story like this assigning blame. Though Hongoltz-Hetling clearly has his sympathies, he’d rather emphasize the human aspects. His story is terrifying, complex, and frequently heartbreaking, much like life.

Wednesday, July 21, 2021

Justice or the Peace?

Carl Hulse, Confirmation Bias: Inside Washington's War Over the Supreme Court, from Scalia's Death to Justice Kavanaugh

To those of us who followed Justice Brett Kavanaugh’s confirmation hearings, it looked like nothing we’d ever seen before. But this was an illusion, created by partisan blinders. Though it was more extreme, more fraught, and more acrimonious than any prior Supreme Court confirmation, it was consistent with court nominations since the turn of the millennium. It helps to have seasoned observers to contextualize these events for us.

Carl Hulse, chief Washington correspondent for the New York Times, has spent his career documenting partisan horse-trades, including federal court appointments. Once a rubber-stamp process with minimal impediments unless a nominee had serious problems (Hulse cites Robert Bork’s nomination), federal judgeships have become more contentious since the early 2000s. What happened, then, and why the abrupt change in practices? You may not like the answer.

Senate Democrats took deep exception to George W. Bush’s federal appointees. When Democrats controlled the Senate, they took unprecedented steps to obstruct appointments; when they lost the Senate, Democrats performed the first-ever filibuster against an Appeals Court appointee, Miguel Estrada. The fight over Estrada’s appointment became so bitter that it literally cost human lives: Estrada’s wife miscarried, then died, during the Senate fight.

Therefore, when Justice Antonin Scalia unexpectedly died in 2016, creating a vacancy, Republican Senator Mitch McConnell, vacationing in the Caribbean, made the hasty declaration that nobody would receive a hearing for Scalia’s seat. In McConnell’s mind, he was only doing what Democrats would’ve done, were they in power. This read wasn’t entirely unfair, since McConnell’s decision was a more extreme iteration of what Democrats had actually done.

In Hulse’s telling, this reflects the pattern in judicial nominations for over twenty years now. One party, formerly in the minority, gains the majority and changes the rules to limit minority power. Then that party returns to the minority, carried by ever-shifting political winds, and finds themselves constrained by the rules they previously wrote. The new majority, meanwhile, makes new, even more restrictive rules. And the circle of life continues.

Carl Hulse

Many of these rules changes apply to filibusters. Minority Democrats used filibuster rules to hinder Republican court appointees, sometimes literally for years. (The ghost of Miguel Estrada looms over court appointments in Hulse’s narrative.) Gaining the majority, Democrats changed Senate rules to stop filibusters of most federal court appointees, then quickly lost the majority. Now, holding a narrow majority, some want to kill the filibuster altogether. Some people never learn.

And it isn’t just Supreme Court nominees who received this treatment, though only those nominees tend to command the 24-hour news cycle. Rules governing federal judge appointments throughout the system have been altered, granting judges lifetime appointments over the objections of home-state Senators and other traditional procedural hurdles. Some of these rules changes have probably remade American courts for the next full generation.

Hulse centers his narrative on three Supreme Court nominees: Merrick Garland, Neil Gorsuch, and Brett Kavanaugh. All three nominees were flawed, in some ways, from Garland’s centrist record and unexciting personal style, to serious criminal accusations against Kavanaugh. Yet what matters for Hulse is that, in all three cases, Senators changed the rules. All three nominations granted more power to the majority, and the effects cascaded throughout the federal courts.

Because of publishing’s long lead times, this book dropped before Amy Coney Barrett became a national phenomenon. But Hulse name-checks her twice, as someone considered an up-and-comer in the federal judiciary. Hulse also acknowledges that everyone knew the Merrick Garland trainwreck was explicitly partisan: he quotes several Senators admitting that, should another vacancy occur in President Trump’s final year, Republicans intended to fill it.

We who watched these nominations unfold on television and online often got a heated, partisan, contradictory narrative. Monday’s journalism would often be superseded by Wednesday’s. Hulse retells the events in ways that exclude fleeting scuttlebutt and mass-media clutter, putting the story of these three tumultuous years into a context that makes sense. He also excludes the partisan lens-grinding that distorted unfolding events for many news junkies, including me.

Perhaps most importantly, though he denies both parties are equal, Hulse demonstrates with evidence that both parties are complicit in the current catastrophe. Partisan brawling in Congress has turned the federal judiciary into a political football, making justice harder to dispense. And when Democrats currently (as I write) suggest changing the rules to overcome inertia, they’re basically strategizing how to sneak up behind themselves and pick their own pockets.

Partisan wrangling has essentially made America less free, because the procedural rules that once restrained Senate majorities keep falling away.

Friday, July 16, 2021

Whose Alamo Should We Remember?

Bryan Burrough, Chris Tomlinson, and Jason Stanford, Forget the Alamo: The Rise and Fall of an American Myth

The Alamo is a physical fact: a 300-year-old Spanish mission church near the heart of San Antonio, Texas. But what does that mean? The Alamo’s mythology has shaped how Texas has seen itself since the famous siege, nearly two centuries ago, and mythmakers have coached Americans to see ourselves through the Alamo lens. But new knowledge and controversial facts have forced a recent reevaluation. And that makes some people uncomfortable.

Texas history, including the Alamo, hasn’t always been a respected field; our authors note that historians often go to Texas to bury their careers. Folklore and tall tales often exceed available facts. And Texas, more than probably any other state, actively legislates how history gets taught. This means the events at places like the Alamo and San Jacinto have become creation myths, Just-So Stories that serve a sociopolitical purpose.

Our authors dedicate approximately one-third of their book to uncovering the Alamo’s elusive facts. Yes, they agree, Jim Bowie, William Travis, and Davy Crockett were definitely there. But what were they fighting for? Subsequent folk history has turned the Alamo into a defense of liberty, justice, and democracy. But their contemporary rhetoric shows Texians (White settlers in east Texas) feared Mexico would come for their “property.”

“Property” could only mean two things. Texians treasured land first. The Liverpool textile boom of the 1830s made fertile Texas bottomland valuable, because it nurtured abundant cotton harvests. But Texas land wasn’t scarce, so Mexico City had little motivation to expropriate it. Mexico’s progressive mixed-race government, however, did want to seize the other resource that made cotton farming profitable: White Texians had imported thousands of Black slaves.

Our authors don’t spill copious ink explaining White Texians as slave traders. Once they’ve demonstrated it using Texians’ own words, they’d rather focus on what actually happened: the escalating rhetorical conflict between Texians and the distant Mexican government, as economic need increasingly clashed with social values. Each side projected their fears onto the other. Texians were drifting from Mexico long before troops began massing outside the Alamo.

I was surprised, though, by how quickly our authors moved on from the actual battle. They admit that, owing to longtime historical neglect, verifiable information about the Texas Revolt is pretty scarce. Our authors care more about what happened afterward. Texas began carefully managing Alamo mythology before the gunsmoke cleared; Bowie, Travis, and Crockett had already become martyrs for a democratic parable. The stories quickly outpaced the facts.

Alamo myths have been hotly contested from the beginning. Though Texas associated the Alamo with enlightened democratic values, the stories glossed over non-White allies, including the Tejano and Native American volunteers. (Even our authors briefly mention, then apparently forget, the Cherokee, Comanche, and Wichita nations.) Texas also revered the Alamo abstractly, while permitting the building to decay. Much of the original compound has been lost.

Our authors depict ongoing battles over how Texas should remember what happened in 1836, and to what degree the building deserves preservation. Ardently anti-tax Texans have frequently balked at maintaining or administering the building themselves, delegating responsibilities to clubby networks of San Antonio’s White elite. This has become especially pointed as San Antonio has become a majority non-White city, and Texas overall promises to follow suit.

Governing the Alamo mythology isn’t airy-fairy. The exclusion of Tejanos from Texas’ creation myth, and broad, sweeping stereotypes of Santa Anna’s army, have historically served to fuel anti-Mexican sentiment throughout Texas. Official state history, including the darkly legendary 7th-Grade Texas History Class, showed signs of liberalizing in the 1990s, but in the early 2000s, conservatives with good PR skills began seizing control of the official narrative.

This battle over official historiography isn’t incidental. It describes in miniature the current battles over how Americans want to memorialize themselves for future generations. The past doesn’t objectively exist. Instead, Texans, and Americans generally, reinforce narratives that reflect values; some of those narratives include monuments, like the Alamo, which have been manipulated for centuries. Whoever controls the past controls the future, as someone once wrote.

Our authors bring the narrative current through summer of 2020, when nationwide protests over monuments and police violence descended into standoffs in Alamo Plaza. Political futures may fall according to how official memory recalls this moldering fortress. Powerful people have proven themselves willing to kill over how the state manages the mythology. But what matters is, the Alamo myth has come unmoored from the old Spanish mission church.

Maybe, our authors assert, it’s time to forget about the Alamo.

Wednesday, July 14, 2021

But What If the Billionaire Spaceships are a GOOD Idea?

The three faces of the Billionaire Space Race: Richard Branson, Jeff Bezos, and Elon Musk

Watching the current Billionaire Space Race, I have multiple reactions. I grew up reading science fiction, so privatized spaceflight isn’t new to me. But depending on whose interpretation you favor, we could be facing a massive dystopian folderol—the position favored by much of Blue Twitter—or the dawning of a new Golden Age. Unfortunately, there’s too much at stake to gamble now.

I remember reading, clear back in the 1980s, thrilling narratives of privatized spaceflight and the potential risks and benefits. Ben Bova springs immediately to mind, though countless other authors whose names I’ve forgotten also addressed the topic. Many saw privatized space travel as a libertarian extravaganza, the opportunity for flourishing human ingenuity. Bova’s Welcome To Moonbase fired my imagination with its ambitious public-private partnership.

Yet I cannot remember this vision, without remembering the other great libertarian space spitball: Star Wars. Though George Lucas depicts an autocratic one-party state, replete with Nazi-fetish costumes and ominously clicking heels, he depicts a remarkably hands-off economy. Darth Vader’s relationships with Lando Calrissian, Boba Fett, and Jabba the Hutt bespeak his galaxy’s defining economic ethos: don’t cross the state, and we’ll let you do whatever you want.

To emphasize the Star Wars economy, consider The Last Jedi, when Finn and Rose visit Canto Bight, the Monte Carlo-like carnival that dominates the movie’s middle third. Finn is initially besotted with public displays of wealth, and almost forgets his mission. Rose sees through the grotesquerie and demonstrates the planet’s moral bankruptcy. The movie not-so-subtly encourages us to see the parallels between that scene, and today’s widespread inequality.

Conventional leftist commentators have carped that today’s Billionaire Space Race proves American and EU capitalists have too damn much money. The three billionaires competing for bragging rights, Elon Musk, Jeff Bezos, and Sir Richard Branson, have so much combined wealth that they could literally defeat world hunger without even noticing the outlay. Why, the commentariat asks, aren’t they competing to accomplish that?

Compelling as this argument seems, I can’t wholly support it. Because of the amortized cost of research and development, many large-scale technological innovations start out prohibitively expensive. Cars, airplanes, television, even the computer or smartphone you’re using to read this essay, were once exclusive playthings of the obscenely wealthy, condemned by populists as dangerous disruptors of good, honest values. The complaints have a redundant quality.

The three faces of the Star Wars economy: Darth Vader, Lando Calrissian, and Boba Fett

Equally importantly, the pressing terrestrial concerns which our Space Racers are purportedly ignoring—global warming, mass poverty, resource depletion, and authoritarian violence—will also be punishingly expensive. Currently, the American and British governments (where our Space Racers live) are bogged in partisan arguments about whether global warming even exists, while huge tracts are literally on fire. Conventional centralized solutions aren’t exactly forthcoming.

During the Cold War, America took pride in its research universities, scientific think tanks, and NASA. We considered spending taxpayer dollars and private grant money on these pursuits worthwhile. Then the Cold War ended, and we changed our minds about science. Capitalists decided they’d rather keep their money, and their pet legislators agreed. Now our economy resembles Bespin, rather than Moonbase.

Changing our economic presumptions would be time-consuming. We didn’t arrive at today’s economy with one tax bill or regulatory scheme, and we won’t fix today that way either. Our Star Wars protagonists didn’t bother fixing Bespin’s economy, and made only symbolic gestures on Canto Bight. Sure, the underlying system was a problem, but it wasn’t what was attempting to kill civilians at that exact moment.

Instead, as President Kennedy did when rallying America around the proposed moonshot, we have the potential to unify world citizens around flashy, high-profile technological wonders, like putting a billionaire on Mars. Perhaps the spectacle will subsidize the necessary but unsexy quest for other solutions, too. Think of how DARPA invented the Internet; my becoming a blogger was merely a side effect.

Don’t misunderstand me. Trusting and utilizing these billionaires and their private empires might be expedient, but we can’t expect them to deliver us salvation. We have to address our problems, collectively, through economics, philosophy, and the state. Piggybacking our hopes for technological solutions on the Amazon moonshot doesn’t preclude us demanding reforms to our badly broken system.

But, like our parents and grandparents, why not tie this moonshot to other attempted remedies? Things could turn out awful, but things are awful now, and wishing for better background noise is wasteful when we have challenges right here. Maybe everything will turn into Star Wars. But maybe, just maybe, it’ll turn into Moonbase.

Saturday, July 10, 2021

Some Thoughts on the “Cat Person” Controversy


In September of 1989, an episode of the sitcom Family Matters featured Rachel, a young author, selling a manuscript based on her brother’s family. Her brother’s family recognize themselves in the story, and are outraged by the negative depictions. They perceive Rachel as highlighting their worst characteristics for a hasty sale. Rachel spends the episode explaining how fictioneering works, and gradually making peace with her brother’s extended family.

For context, this was Season 1, Episode 3, so something the writers’ room presented early. Presumably they wanted to remind their own families that, though they used personal experiences as story foundations, they didn’t mean anything literally. Something relevant must’ve happened in Hollywood around that time, because six months later, a Golden Girls episode featured Blanche spitting outrage when she saw herself depicted in her sister’s Jacqueline Susann-like sexploitation novel.

Kristen Roupenian’s “Cat Person,” a 2017 New Yorker story about a dysfunctional May-late August relationship, regained currency this week. Alexis Nowicki, a Brooklyn book publicist, published a Slate article claiming the story appropriated significant portions of her life, including her college relationship with a much older graduate student. This despite, Nowicki admits, never meeting Roupenian. The Twitter commentariat started circling like buzzards, as it does.

The occasional bomb-thrower claims Roupenian transgressed somehow by using somebody else’s story. I even saw the word “plagiarism” misapplied. But these were outliers, the people Mick Hume calls “full-time professional offense takers.” The largest fraction of audience responses involved wisecracks about the appropriateness of Nowicki’s response. Since when, these respondents (many of them writers) asked, is it wrong or inflammatory for writers to fictionalize other people’s stories?

My favorite responses to Nowicki’s article ask: what responsibility do writers have in fictionalizing stories that aren’t theirs? We all, as writers and readers alike, understand humanity through other people, and therefore process difficult or morally ambiguous situations through stories. When our personal experiences don’t provide the necessary narratives, we turn to others’ stories. We all do it, but what moral onus does it place upon us?

Ernest Hemingway

In high school, my teachers frequently treated fiction and poetry like crossword puzzle clues. American Literature classes were especially guilty of this. Authors, especially Hemingway and Fitzgerald, were treated like ciphers from the past. Like Ralphie in A Christmas Story, we needed to apply our decoder rings to unlock the concealed meaning, which was always one-to-one. Literature wasn’t presented as art to mull over and live with, but a message to understand.

Perhaps you’ve heard that wheezy advice frequently given to young authors: “write what you know.” We assume that authors only create work by fictionalizing their personal experience, then we reverse-engineer such experience onto others’ work. I’ve always distrusted this advice, since it implies you can’t understand anybody’s experiences except your own. After all, Stephen Crane didn’t need to fight in the Civil War to write The Red Badge of Courage.

According to Nowicki, Roupenian admitted writing her story in an MFA workshop, and sending it into the New Yorker slush pile, where literary fiction usually goes to die. For whatever reason, it struck a chord, and the editors bought it, propelling a young graduate student into the ranks of luminaries like David Sedaris and Jamaica Kincaid. Then, because Roupenian addressed emotionally loaded topics, her story quickly became a lightning rod for cheap online outrage.

I’m reminded of 2016, when online critics, including friends of mine, willfully misread a Calvin Trillin ditty. Or January, 2020, when firebrands responded to an Isabel Fall story, without apparently reading past the title, with such ferocity that the author needed psychiatric treatment. In each case, audiences “decoded” literal, singular messages from these works, like reading Hemingway in 11th-grade AmLit, giving the works one, singular, invariable meaning.

Worse, like Rachel’s family or Blanche, these audiences seek the most negative, adversarial meaning. They project their insecurities onto, not the work, but the author, attributing malicious intent rather than the desire to write compelling fiction. The Family Matters and Golden Girls writers’ rooms shared this premise, presumably because their own families thought they’d recycled family drama. Which perhaps they did, but not malevolently.

Alexis Nowicki, like those TV families, wanted everyone to know her story wasn’t as toxic as that depicted in fiction. She recalls her relationship with “Charles” fondly, but distantly, like many college relationships. But like those families, Nowicki’s self-defense imputes authorial intent onto Kristen Roupenian. She sees Roupenian’s writing not as art, but as journalism. In doing so, she diminishes art’s psychological impact, and makes other minds less real.

Wednesday, July 7, 2021

Steampunk and the Next-Wave Revolution

Aleksandr Boguslavsky, director, Abigail

The walled city of Fensington has managed to protect itself against the pestilence racking the outside world, but at great cost: Inspectors demand constant random infection checks. Anyone showing signs of illness gets bundled into a dark sedan and whisked away. Young Abigail watched the Security Division abduct her scientist father, calling him infected. Now a young adult, Abigail feels the first stirrings of infection inside herself.

Steampunk, as a genre, is frequently antimodernist, highlighting a premature collision between technological modernity and traditional culture. I sometimes pooh-pooh American steampunk, which often reeks of naïve pastoralism. But this Russian confection has a different ethos. Coming from a nation whose administration has been accused of assassinating dissidents, this movie’s antiauthoritarian heart feels brave, and openly tries to kick the Putin Administration in the balls.

One chance encounter shows Abigail that the Inspectors’ reach isn’t as universal as she’d thought. The state’s power, she discovers, depends on citizens’ compliance. The centralized state rewards conformity and mediocrity. Citizens willing to obey get rewarded with stable, undistinguished, middle-class jobs. But Abby is dissatisfied going along to get along; she demands to know where the Security Division took her father.

In some ways, this movie demands comparison to the original Star Wars. Abigail, blessed with the first glimmerings of supernatural power, seeks guidance, and discovers an underground resistance to the empire’s well-paid fatalism. Even the corps of masked Inspectors attack and die with the persistence of Imperial Stormtroopers. The visual design also recalls Harry Potter’s Diagon Alley, with its flamboyant spellcasters and carnival swamis.

Like those franchises, director Aleksandr Boguslavsky uses atmospherics, turning physical space into a character. He transforms the narrow, medieval streets of St. Petersburg, Russia, and Tallinn, Estonia, into spaces packed with terror and wonder. Fensington’s draconian mayor has greywashed the city, but the magical underground has built their own community, brightly lit, lush in color, and brimming with life. Now they have to defend it.

Unlike Luke Skywalker or Harry Potter, though, Abigail doesn’t find a resistance prepared to uphold brave humanistic values or support the transcendent. Fensington’s magical underground has become paranoid and insular, and fallen under command of Bale, a sultry-eyed, Robert Pattinson-like demagogue. Bale is charismatic and intense, but also fixed and dogmatic. Before long, Abigail realizes the resistance is as authoritarian as the government it resists.

Abigail (Tinatin Dalakishvili) makes her stand against the evil Inspectors, in Abigail

This movie’s CGI grandeur and retrofuturistic design help expound a very real conundrum in which many youth find themselves today. Abigail wants to recapture the sunlit optimism she experienced with her father, a world of nature and allegory and possibility. But she finds herself caught between two conflicting powers which have more in common than either acknowledges. Both will force her to forego her dreams, or else get her killed.

On one level, this story is very Russian. Historically, Russia and its CIS satellites have found themselves torn between conflicting absolute theories: tsarism versus Bolshevism, mafia capitalists versus secret police. In choosing between official or illicit authoritarianism, Russians have, for over a century, been forced to choose which denial of human individuality they prefer. Never free, they’re only allowed to choose which boots they have on their necks.

Yet this experience is, I suggest, not specific to Russia. As conventional religions, economies, and governments have proven themselves inadequate in recent years, the alternatives which arise to replace them have served to concentrate power, not spread it. The re-emergence of small-f fascism in multiple nations reflects the ways “free” citizens have to select which illiberal, antidemocratic force they prefer ruling their lives.

Abigail prefers another path. Guided by her absent father and his clockwork talismans, she proposes that salvation lies outside Fensington’s walls. She offers to show Fensington’s magical underground a truth hitherto unimagined: wide, sweeping vistas under blue skies. The symbolism isn’t subtle. But she can’t escape the city’s habituated limits alone; she needs other wizards’ solidarity. She needs them to step out in faith.

The fact that Boguslavsky made this movie in English bespeaks his global ambitions. (His mostly Russian and CIS actors have their lines dubbed by American voices.) Fighting battles like we’ve always fought them, Boguslavsky says, ties us to outcomes we’ve already seen. We need new approaches, and equally importantly, we need new leaders, untethered to their own glory. We have to step outside our own walls.

Boguslavsky’s grown-up fairy tale has the potential to inspire future generations to change. It also challenges us to accept the uncertainty and wonder that come with taking back our own destiny.

Monday, July 5, 2021

The Best Horror a Dollar Store Can Provide

Nigel Bach, writer/director, Bad Ben

Fat, happy Tom Riley thinks he’s scored a major coup: an enormous suburban New Jersey house, snapped up at fire-sale prices. He expects to flip the property for a profit, though his plans appear vague. The house has other goals, though, including isolating him from the basement, and punishing him if he wanders past the fenceline, which borders on the Pine Barrens. When he disobeys the house, he finds something he didn’t expect: a small, shallow grave.

First-time auteur Nigel Bach apparently shot this movie entirely by the seat of his pants. He had a beat sheet, but no script, and his all-volunteer crew dropped out right before shooting. He uses a found-footage style that excuses the low-res image that comes from shooting over half the movie on his iPhone. This movie could’ve descended into irredeemable silliness, but Bach avoids that outcome by embracing his shaky, amateurish side and having fun.

Tom has a distinctly ambivalent relationship with his house. He lives in it during the renovation, even sleeping in the previous owners’ armchair. But he mocks and disparages the former owners, especially their displays of overt religiosity, and he throws their family Bible in the trash, thinking religious displays will hurt resale value. He’s especially baffled that the previous owners vacated the premises so quickly that they left all their furniture. Seriously, who does that?

Another thing Tom can’t understand: what’s with the dozens of security cameras? Every room but the bathroom has motion-sensor cameras, providing constant, digital surveillance of everything that happens. It’s almost like the previous owners distrusted their own house. But as furniture starts moving itself, blocking his access to the basement, he starts wondering whether having thorough documentation might help. Because something has a message for him, if he can figure it out.

If this sounds very Amityville Horror, you’re right. Like that movie’s George Lutz, Tom buys a house, but quickly realizes it owns him. The implications are distinctly Freudian, and Ben, the spirit plaguing Tom’s house, represents a manifestation of suburban home-owning id. This representation isn’t even concealed, as much of Tom’s investigation turns on his basement and attic. Secrets lie concealed in the parts of the house that everyone has, but avoids ever looking at.

This film mimics the beat sheet of The Blair Witch Project, a film I disliked. Yet I feel much more warmly toward Nigel Bach and his production. Maybe it’s because he didn’t court the same puffed-up publicity that Artisan Entertainment sought; Bach just released this shoestring baby to streaming services and trusted the audience. Or maybe it’s because Bach, and his character Tom Riley, aren’t so damned self-serious; they understand their situation is campy, and go with it.

Tom Riley (Nigel Bach) checks his security cameras in Bad Ben

Let me emphasize that campy aspect. It’s impossible to escape this movie’s goofy, low-budget quality. Besides Tom Riley, the only onscreen character is a poorly glimpsed shadow in a few shots. To provide exposition, Tom occasionally talks into his cell-phone, and keeps extensive iPhone video logs. He lampshades how artificial this is, admitting his actions make no sense. But we, the audience, understand the purpose: he doesn’t realize he’s a character in a shoestring movie.

Playing Tom, extemporizing his terse, impatient monologues, Nigel Bach is clearly having so much fun that he visibly has difficulty keeping a straight face. Unlike Heather Donahue, who needed in-story justification to explain weeping into the camera, Bach plays Tom with eye-rolling extravagance. To Bach, this movie is a middle-age passion project, a fun activity to occupy spare time, like retooling a Volkswagen. He has no pretense of art or commerciality; he’s just having fun.

Importantly, Bach’s attitude has infected his audience. Though the movie is only lightly scary, his infectious joy has turned it into a veritable indie enterprise. He’s crowdfunded six sequels, none of which has had an official release, instead shuffling straight to streaming with minimal fanfare. Horror movie fans have embraced his low-fi, mostly bloodless camp, and paid for more. Maybe we like Bach’s themes of suburban ennui. Maybe we’re having as much fun as he is.

Don’t enter this movie expecting jolting scares or blood-curdling terror. Though Bach cultivates nagging dread and lands two or three well-placed jump scares, this movie is mostly not an unrelenting chill-fest. Rather, it’s an experiment in narrative, an attempt to recreate a legitimate moviegoing experience with a cash-on-hand budget. It’s flamboyant, artificial, and sometimes stilted. It’s also just plain fun. Nigel Bach invites us on his passion project, and at the end, we’re glad we came along.

Thursday, July 1, 2021

Gwen Berry, the Flag, and the Nature of Ceremony

The photo that made Gwen Berry (left) infamous.
With DeAnna Price (center) and Brooke Andersen

This weekend, during national trials for the Tokyo Olympics, the Women’s Hammer Throw event mattered for the first time, probably, ever. Gwen Berry, two-time gold medalist at the Pan-American Games, finished third, qualifying for the Olympics. But during the medal ceremony, Berry, who is Black, didn’t participate in the National Anthem, turning instead to face the crowd.

The Usual Suspects flipped their wigs, of course. Matt Walsh, Meghan McCain, and Ben Shapiro used their mass-media platforms to shriek, sometimes incoherently, about Berry’s horrendous travesty. Representative Dan Crenshaw went on the Former President’s favorite news network to demand Berry be ejected from the team. Wypipo Twitter, hardly a bastion of nuanced deliberation, turned downright ugly in its condemnations, and frequently racist.

(No, I won’t link their Tweets. They don’t deserve the oxygen.)

In my ongoing quest to better understand American conservatives, in terms they’d use to describe themselves, I spent time performing what rhetorician Peter Elbow calls “the believing game,” taking another person’s position seriously and respecting it, without trying to debunk it. Conservative allegiance to ceremonies of Americanism has long baffled me. Yet after some consideration, I’ve had some possibly helpful insights I’d like to share.

The pundits outraged over Berry’s demonstrative refusal are, unsurprisingly, the same pundits outraged over Colin Kaepernick kneeling. These same pundits also complain whenever news re-emerges that grade school children aren’t universally required to perform the Pledge of Allegiance daily. Fox & Friends, where Representative Crenshaw aired his demands, also complained, during the Obama Administration, that President Obama was inconsistent and frequently sloppy in returning military salutes.

(Executive protocol doesn’t require Presidents to salute, but many do so out of good taste.)

To outsiders, these forms of demonstrative Americanism appear entirely ceremonial. That is, people perform these actions simply because we perform these actions. Yet to their loyalists, the ceremonial quality isn’t empty; the ceremony gives it mass. Like a priest speaking the words of institution, and thereby transforming inert bread into the Body of Christ, ceremonial Americanism transforms the person in the ceremony.

Remember when the Former President hugged an American flag at CPAC, to thunderous applause? Progressives and liberals derided the action. Yet for the Former President’s intended audience, this action had significant weight. Ceremonies like saluting the flag, singing the National Anthem, and speaking the Pledge of Allegiance, make the ceremony enactors more American, a state that needs constant renewal.

Would critics have been happier with this older photo of Gwen Berry? Who can say?

If we’re honest, liberals and progressives understand this impulse. Even unbelievers want someone with official standing to officiate their weddings, because the ceremony, not the sentiment, makes the marriage. Likewise, progressives trust the Courts, a bastion of ceremony, to enforce laws justly. Consider how important it is, inside a courtroom, to dress appropriately, use people’s official titles, and rise or sit without hesitation.

Observance of ceremonial rules, in other words, matters. The only difference is which ceremonies different groups honor. The progressives who claim they don’t understand the outrage over Berry’s or Kaepernick’s non-participation in flag ceremonies would understand altogether if, say, one of Former Guy’s indicted advisors addressed an enrobed judge by first name inside a courtroom.

Ceremonies like the National Anthem, or “all rise,” gain their authority because, like religious liturgy, everyone performs them together. A stadium facing the flag, hands on hearts, temporarily stops being a mass of individuals; it becomes united in purpose. Individuals stop existing, provisionally. For the moment we’re singing “Oh Say Can You See,” we briefly no longer exist.

Therefore, protests like Berry’s or Kaepernick’s also gain substance from ceremony. By disrupting the unified performance, they drag us back into ourselves during our moment of ecstatic transport, making us again, unwillingly, human. They ask us whether the mass to which we’ve surrendered our individuality actually deserves such ceremonial acclaim. They force us to be conscious during a moment of unconsciousness.

This defiance of rules matters in sports, an activity defined entirely by rules. Obscure rules like the Fair Catch Kick don’t inhibit football; the rules make the game. Like the standards surrounding the National Anthem, the individual disappears into a regulated world and is fleetingly transformed. Therefore, maybe it shouldn’t surprise us when lovers of the rules hate seeing others honor the rules selectively.

Because some people desire to disappear, briefly, into ceremonial unity, reminders of human finitude disturb them. Again, for them, the ceremony isn’t mere emotion; it literally transforms and renews their American natures. Like caterpillars broken loose from the chrysalis prematurely, they’re stopped from becoming whole. No wonder they’re angered by displays like Berry’s. They’ve been denied their renewal.