Wednesday, December 29, 2021

The New Gothamite Existential Drama

Raven Leilani, Luster: a Novel

Young Edie lacks direction in her life. She sleepwalks through a desirable but unfulfilling Manhattan publishing job, and despises her roommate. One day, desperate for connection with another human being, she answers a personal ad. That’s how she finds herself having an affair with a White man, twice her age, who feels stifled in his open marriage.

Front-cover copy calls Raven Leilani’s first book “a novel,” but that's somewhat misleading. It’s episodic, has only a vaguely defined through-line, and has more loose ends than a shoelace factory. Rather, it flows loosely, acrobatically, like a novel-length prose poem. Perhaps that explains the starkly divided opinions it has drawn from readers; your response depends on your willingness to immerse yourself in Leilani’s subtle undertow.

By her own admission, our first-person narrator, who is Black, has made some bad decisions. (She introduces herself as “Edie” very early, then never repeats it; she’s mostly a nameless cipher, even to herself.) She shits where she eats, has only a halfhearted commitment to her career, and frequently lets life happen to her. Accepting a willful sexual relationship with a White man literally twice her age is painfully on-brand.

Appropriately for somebody who spends copious ink regretting her past, Edie finds her choices returning to plague her. She loses her poorly paid job, then her roommate, then her apartment. Trapped in a downward spiral caused by Manhattan’s “move up or move out” pressures, she caroms quickly through meaningless jobs, desperate for validation. To her horror, she finds temporary deliverance from the unlikeliest source: her boyfriend’s wife.

Leilani’s story moves forward, not by character or plot, but by theme. This perhaps reflects her narrator’s position: young, Black, female, and poor by city standards. Edie engages in ill-chosen sexual encounters, as much as anything, because they’re something to do. The novel’s title is deliberately ambiguous, reflecting the glamor Edie sees in her well-off boyfriend’s suburban life, but also her tendency to measure her life by sex.

Raven Leilani

Edie used to be a promising artist. But, like millions of art-school graduates, she shelved her talent when making a living became a higher priority. Finding herself suddenly sheltering with her boyfriend and her boyfriend’s wife, the phantom of imminent starvation temporarily abated, she rediscovers the motivation to create. However, she struggles to reconnect that motivation with the part of her that sees like an artist.

Anybody who’s taken grad-school creative writing workshops recognizes this: writers frequently create visual artists as narrative surrogates for themselves. Given the limited amount of biographical information available to reviewers, it’s tempting to wonder exactly how autobiographical Leilani’s story really is. If this isn’t a roman à clef for Leilani’s life, Edie at least represents the author’s attempt to organize and comprehend herself.

Eric, Edie’s boyfriend, is a library archivist; his wife Rebecca performs autopsies for the VA. They traffic in history and mortality, which warps their ability to communicate. Eric fails to remain hip and relevant (Edie repeatedly stresses his GenX credentials), while Rebecca sneaks off to mosh at all-night metal concerts. Their household communications happen in weirdly coded language.

Meanwhile, Edie meets Akila, Eric and Rebecca’s Black daughter. The meeting isn’t exactly amicable. However, they quickly become a force in one another’s lives: Akila has a creative spirit undimmed by years of adult cynicism, while Edie has the life skills necessary to teach Akila how to be Black in a world still organized for White convenience. The pair need, but don’t understand, each other.

Readers weaned on conventions of mass-market fiction, of unified story and of action rising to a climax, may find Leilani’s style confounding. Edie is timid and adrift, more acted upon than acting. Her story is segmented and episodic, and while something happens that arguably serves the climactic role, it’s disconnected from most of what came before. Again, this is less a novel than a prose poem.

Compounding this, we never quite know how seriously to take Edie’s narration. Edie frames events to exempt herself from blame, even though the story indicates she’s partly responsible for her frequent setbacks. Though she’s prone to lengthy monologues explaining her backstory, she presents them with the dispassion of an archivist, which, in context, is ironic.

However, audiences willing to immerse themselves in Leilani’s storytelling, to let the narrative current carry them along, will find moments of intense clarity. This novel isn’t for everyone. But readers able to pause their egos, and follow Edie along her strange, often opaque journey, may find more than a little of themselves in her story.

Tuesday, December 28, 2021

Modern Violence and Modern Repentance

My mother and me. We're not great at taking photos, so awkward selfies are what we have.

My mother said something this holiday weekend that hasn’t always come completely easily to her lips: “I’m sorry.” But that could sound wrong. She’s never had difficulty apologizing if she did something wrong in the moment, like stepping on someone’s toes. In those situations, she could easily acknowledge the mistake, and try to make amends. That part has never been in question.

This time, she apologized not for something she did momentarily, but for something she believed over the long haul. It occurred in the context of describing an adventure I had when, a week before Christmas, I drove into Kansas City to attend a concert. She expressed admiration for my ability to remain calm and even-headed driving on some of America’s worst urban highways, something that’s always terrified her.

“I’m sorry for holding you back,” she said. “I’m sorry for projecting my fears onto you. The conflicts of big-city living were always so intimidating for me, I know that when you were younger, you wanted the kinds of experiences that only happen in big cities. That scared me so badly, and I projected that onto you, and I know that sometimes held you back. And for that, I’m sorry.”

My parents grew up in small, close-knit Nebraska towns like Ogallala and Hay Springs, amiable communities where people knew and trusted one another. Mostly. For those willing to conform themselves to community standards, Nebraska can be an inviting, friendly place. But Nebraska has also sometimes been intensely violent. When we returned here in 1992, Nebraska was also an overwhelmingly White state. And that, my friends, has never been a coincidence.

For all her willingness to embrace new experiences, my mother never particularly needed to interact with diverse communities until well into adulthood. Until my father commenced his Coast Guard career when they were approaching thirty, they’d lived in a succession of small- to medium-sized towns with White Protestant heteronormative populations. That limited experience with diversity continues to influence her thinking to this day.

Though registered Republicans, my parents hold distinct egalitarian views, at least on a person-to-person level. They believe racism, sexism, and economic injustice are wrong, and need redressed. But until recently, they saw these problems as essentially individual. They saw racism, for instance, as an individual White person dropping an N-bomb in public; when it comes to populations, they rationalized away practices like redlining and racialized mass incarceration.

Riot police in Baltimore, Maryland, April 27th, 2015

When confronted with entire neighborhoods, many more populous than the towns they grew up in, that were functionally segregated along racial, social, or economic lines, my parents faced a society that remained deeply unjust. And, as with most White people, this injustice reflected on them. This created a deep moral disquiet within their souls, as you can imagine. But unlike today, they lacked a framework to understand this disquiet.

Today, we often describe practices like redlining, unequal educational access, and other forms of half-legal racism as “violence.” Same for sexism, genderism, and other forms of exclusion. I think my mother unconsciously recognized this violence, but lacked the terminology to understand it. Therefore, faced with violence but no clear bloody noses, she perceived that violence as directed against her.

It’s easy to pooh-pooh the defense of “it was a different time.” Some White Protestant heteronormative people in past eras understood injustice on the ground, so anybody, the reasoning goes, could. But that’s like saying all Black people could overcome racism because Oprah did. Not everyone has the intellectual framework, moral backbone, or community support to escape their social context. My mother deplored unfairness; she just couldn’t always see it.

Therefore, to restate: my mother saw and understood living in diverse American cities, driving the crowded city highways, and taking the economic risks to pursue an art career as violent. Because, at least on a psychological level, it was. But because she couldn’t clearly understand who caused the violence, and who was on the receiving end, she internalized the violence around her.

A lot of White people probably did this, and believed, not unreasonably (in that context), that the violence was directed at them. As parents do, they wanted their children to avoid the crushing psychological pain of violence. Therefore they defended what limited privilege came from a society that defines Whiteness, Christianity, and heterosexuality as “normal.” They arguably weren’t malicious; they just didn’t see their part in the violence.

My mother only wanted to keep me safe. Unfortunately we can only remedy this violence by acknowledging it, walking into its source, and challenging it where it lives. At her age, my mother can only repent her part in perpetuating this violence. I’m still young enough to take responsibility and walk into the fire. In a very real way, her apology gives me the freedom to do so.

Thursday, December 23, 2021

J.K. Rowling and Modern Inflexible Morals

A college acquaintance shared a milestone on Facebook this week: her grade-school daughter had read the entire Harry Potter series, in hardback, cover to cover, in under four months. Books this glimmering-eyed girl finished, not because some authority had assigned them, or because they provided some material advantage, but simply because the experience of reading brought her joy. This girl’s broad smile with her books was frankly an inspiration.

She shared this accomplishment the same week that Harry Potter creator J.K. Rowling continued her years-long pattern of punching herself in the face on Twitter. Rowling, a longtime supporter of the Labour and Scottish National Parties, with their broad center-left positions, has cultivated an audience somewhat more progressive than the general population. Which makes this week’s tweets, frankly too distressing to reproduce here, even more upsetting to her core fans.

How to reconcile these two halves? On one hand, Rowling’s works encourage young people to sit patiently with a book, practicing skills of patience and self-discipline that aren’t always rewarded in today’s media-saturated culture of instant gratification. On the other hand, Rowling, once a darling of the economic left, has expressed opinions so reactionary, devoted fans have described her as “transphobic, hateful, harmful and factually inaccurate.”

Her works have attracted audiences of diverse ages for the same reason my friend’s daughter enjoyed them: they’re a pleasure to read. And I mean that on multiple levels. One can appreciate them as childlike diversions, the belief that a magical wonderland exists in parallel with the everyday banality around us. Or as modern parables of ethics, responsibility, and justice. Or as an attempt to reckon with Britain’s wartime history.

This complexity partly explains their appeal. My friend’s daughter probably enjoys these novels for reasons entirely her own, but by having read them, she opens herself to the nuance that comes from rereading them in her future. Like life itself, the interlocking layers of Rowling’s story permit events to reflect the reader back at herself. We, like Harry Potter, are constantly changing, constantly learning, and the story reflects that back to us.

I’ve written before that it's foolish to expect great artists to be good people. The churning mass of undigested trauma that usually turns humans into artists also generally leaves lifelong scars on souls; the personal stench that follows time-honored authors, from Christopher Marlowe to Ernest Hemingway to [fill-in-the-blank], is inseparable from their creative process. Healthy, well-adjusted minds often aren’t prepared to wallow in the psychological refuse where art is conceived.

It’s tempting to spew time-tested bromides about separating the artist from the art. But in Rowling’s case, that’s virtually impossible. The combination of unprecedentedly popular novels and modern social media has provided Rowling a public platform almost unmatched in world history. The same Twitter feed she used to dribble out story revelations, she now uses to grandstand opinions which her audience finds execrable.

Thus I’m no closer to resolving the underlying conundrum. Like Orson Scott Card before her, Rowling herself lacks the underlying nuance and self-reflection embodied in her work. Both authors reassured youthful readers that their ability to navigate an innately abusive social order didn’t make them as bad as the world around them. Both authors encouraged readers to perceive themselves as something beyond the aggregate weight of their actions.

Yet both authors reduced themselves to slovenly caricatures. Both jettisoned their accrued cultural goodwill and chose to focus with laser-like acuity on forms of queer sexuality that, frankly, didn’t affect them that much. Their works remain in print, testimony to aspirational virtues that, sadly, the authors don’t seem to share. Youth, and former youth, who read and learned from these books, are left to wonder: what now?

By encouraging my friend’s daughter to perceive reading as a meaningful good in its own right, Rowling has accomplished something worthwhile. Today’s culture doesn’t reward slow, self-contained tasks like reading. Our economy insists anything worth doing should be monetized, while our technology and entertainment industries encourage instant gratification. In teaching a rosy-cheeked grade schooler to value reading, Rowling has encouraged her to be a deeper thinker and better person.

That accomplishment, however, cannot be read separately from Rowling’s odious opinions. She preaches a doctrine that insists some people are inherently untrustworthy and must be punished in advance, a position definitely more Voldemort than Dumbledore. Why, it’s almost like Rowling is neither all good nor all bad, neither saint nor sinner. This ambiguity is unacceptable in today’s stark moral climate.

Maybe that’s the lesson to take from this situation.

Wednesday, December 22, 2021

The Door That Opened Into Somewhere

Alix E. Harrow, The Ten Thousand Doors of January

Young January Scaller lives a life straight out of a post-Victorian pulp romance: while her archeologist father globetrots for exotic artifacts, she lives with her father’s sponsor, in unparalleled luxury. Sure, she misses her father. But Mr. Locke’s wealth and connections have provided her an education unavailable to most mixed-race children. Then one day January stumbles through an impossible door into a world that shouldn’t exist, and wonders: what other worlds exist behind this one?

It’s possible to find the political metaphors in Alix E. Harrow’s first novel. The teenager’s discovery that her privileged childhood doesn’t reflect how others live; the ways powerful people preserve their power by feeding on others; the influence race has on how Americans interact with one another. But I prefer another reading. Harrow has written an American fairy tale, channeling our better instincts and higher ideals. We become ourselves, Harrow suggests, by rediscovering childhood wonder.

Mr. Locke holds a poorly defined role within The Society, a group of gentleman archeologists who pay actual credentialed academics to crisscross the world looking for trinkets. The Society doesn’t publish scholarship or compile information, however; it supports itself with a thriving black market in antiquities, turning other worlds’ legacies into cheap cash. January, whom Mr. Locke is training as his junior accountant, gets fleeting glimpses of this corrupt world, never enough to understand it.

Meanwhile, January holds onto fleeting memories of her father’s extravagant tales of distant lands and mysterious peoples. She also half-recalls an incident when she was seven, in 1901, when she wrote words on paper, and those words opened a door into an exotic, spice-scented world. Did she really make something magical happen by simply writing it down? Mr. Locke discourages such speculations. He’s a rationalist, and insists that only this world matters enough to study.

January actually occupies a world riddled with Doors. Mr. Locke and his society have another, ruder word for Doors. But by whatever name, these Doors open onto strange and mysterious worlds of wonder and possibility, many of them magical. When a puzzling book hidden inside an impossible chest reveals to January that the door of her childhood was very real, she sees new opportunities opening immediately. Mr. Locke, however, sees a threat which must be stopped.

Alix E. Harrow

Harrow’s writing straddles the line between fantasy thriller and social parable. The aptly named Mr. Locke has no patience with doors; he uses money and connections to preserve the Earth he loves, making our world smaller, safer, and more immune to change. January, a half-caste child whose father trades in mystery and exoticism, misses the thrill of wonder she experienced in childhood, the power of believing that, somewhere, magic still happens. The conflict is generational.

This metaphor doesn’t limit Harrow’s writing, however. Her first priority is creating engaging characters in difficult situations. In January’s first-person main narrative, she first challenges powerful institutions from inside; when this proves fruitless, she crosses the boundary to rediscover the outside world she was born in, but doesn’t remember. Meanwhile, between the covers of her puzzling book, she discovers Yule Ian Scholar, whose strange memoir might hold the key to January finding her way home.

Themes emerge quickly: what does going home mean? Does it mean returning to the comfortable untruths we learned in childhood? Or does it require passing through painful, harrowing (pun intended) uncertainty in search of truth? Like L. Frank Baum’s Dorothy, or Thomas Wolfe’s George Webber, January’s life is plagued by homesickness; but she has only a vague, half-formed notion of home. She only knows that truth exists, and it doesn’t necessarily correspond with mere reality.

Harrow also emphasizes the ways human words create other realities. The Society uses arcane rituals, influenced by Scottish Rite Masonic traditions, to create a nexus of power which the rest of reality can’t see. (Late in the book, Harrow’s distrust of the Scottish Enlightenment becomes glaring.) Meanwhile, January uses words to reveal hidden truths and actually increase uncertainty. The Society sees uncertainty as chaos, but January sees uncertainty as opportunity. Which set of words prevails?

This novel presents a world seeking resolution. Is trust always better than paranoia, is certitude always better than doubt? Harrow, by day a scholar of American race history, has definite opinions on these questions, though she doesn’t lay them out prescriptively. Instead, she walks readers, youth and adult alike, through the turmoil of finding our own resolution. By the end, maybe we don’t have all the answers. But, like January, we now have better questions.

Tuesday, December 21, 2021

Rudolph the Dog-Earred Stereotype (Part Two)

Santa and Rudolph in Videocraft International's 1964 stop-motion Christmas special

Three years ago today, I wrote an essay entitled Rudolph the Dog-Earred Stereotype, claiming that the Christmas classic “Rudolph the Red-Nosed Reindeer” was a call to action against Hitler. It went largely unnoticed at the time. Last week, I linked that essay on a popular meme-sharing group on Facebook, and it exploded. In under eighteen hours, it received three times as many hits as it received in the previous three years.

It also received several comments, many of them hostile. Nothing energizes Netizens more than having their opinions challenged, and apparently that’s what I did. The objections fall into three basic categories, which I will address in no particular order. Here goes.

• “Santa isn't enabling ablism, he's enabling antisemitism isn't the defense I think it was meant to be”

I never said Santa was enabling antisemitism; I said America in 1939 was enabling antisemitism, and Robert L. May, Rudolph’s creator, was calling on America to take a stand. Moreover, America definitely enabled antisemitism; the same year May wrote the first Rudolph coloring book on contract for Montgomery Ward, America turned away a shipload of German Jewish refugees, many of whom eventually perished in the Holocaust.

It bears emphasizing that bigotry seldom begins at ground level. Scholars of race, history, and law, like Ibram X. Kendi and Michelle Alexander, have demonstrated that bigotry doesn’t cause discriminatory policies, but rather, discriminatory policies cause bigotry. If “all of the other reindeer” were mocking and excluding Rudolph, it was because Santa, atop the North Pole power pyramid, created that environment. Thus it was on Santa to change it.

If we as audiences limit ourselves to reading art on the surface, we always miss the meaning. Yes, superficially, Santa encouraged bigotry against Rudolph. But Santa isn’t just Santa; he stands for powerful people everywhere: heads of state, religious leaders, media figures deciding which stories deserve reporting. Storyteller Robert L. May wanted these powerful people to change their harmful policies, hopefully before that catastrophic “foggy Christmas Eve.”

• “This is fuckin [sic] dumb. Why not write a book for adults who might actually get the point (if that was actually the case)”

I dunno, man, why do any authors ever write about important topics for children? Why did Madeleine L’Engle’s A Wrinkle In Time address spiritual malaise in a technological age? Why did Lloyd Alexander’s The Chronicles of Prydain deal with World War II issues in a medieval Welsh setting? Why did Suzanne Collins deal with economic inequality, climate change, and resistance to unjust authority in The Hunger Games?

Perhaps because children, from an early age, demonstrate a strong sense of morality and fairness. If we’ve learned nothing from the last two years, we surely agree that adults perform elaborate moral contortions in order to defend selfish interests while still thinking of themselves as good people. We’ve watched grown-ups use religion, pseudoscience, and a truncated form of history to justify racism, closed-door nationalism, and spreading a plague unchecked.

Children, historically, have used their innate sense of justice to goad adults into right actions. As Elizabeth Hinton writes, many acts of resistance to institutional racism during the peak Civil Rights Movement began in high schools and colleges. Media pundits and defenders of the status quo still use the stereotypical college liberal as their catch-all demon, because youth have strong moral codes, undistorted by economic pressures and adult cynicism.

This doesn’t mean children always know, or do, the right thing. On the small scale, grade-school students already show hostility to people groups whom adults around them shun or belittle. On the large scale, historian Jill Lepore records how some late hippie-era college activism crossed the line into light-beer Stalinism. That’s why children should read literature with a strong moral backbone, to help steer and modulate their instinct toward fairness.

• “tl;dr”

That’s on you, dude. Most of my blog essays run 750 words, which anyone with a healthy attention span should be able to read in about three minutes. I keep things brief because screen reading attenuates people’s willingness to stick with long or detailed content. As the above arguments indicate, I didn’t explain important points in enough detail; believe me, I could go much, much longer than I usually do.

In conclusion, I don’t expect my essays to change people’s deeply entrenched beliefs. Readers will often respond to challenged beliefs much like they’d respond to someone attempting to chop off their arm. But I do expect mature audiences to read what I said, not what they wish I said.

Friday, December 17, 2021

Mark Lowry vs. the Hot-Take Bandit

Mary Enthroned, painted by Giotto

Apparently there’s something about Christmas that makes ordinary Netizens swarm like jellyfish against popular songs from the past. Remember a few years ago when all of social media became irrationally enraged at Frank Loesser’s vaudeville song “Baby It's Cold Outside”? Or the annual light-beer Marxist rage against Johnny Marks’s “Rudolph the Red-Nosed Reindeer”? This year’s designated target appears to be Mark Lowry and Buddy Greene’s “Mary, Did You Know?”

For those unfamiliar, Lowry’s poem, set to music by Greene, poses eleven rhetorical questions to Mary, mother of Jesus. Each asks whether Mary anticipated the miracles, wonders, and promises of redemption for which Jesus’ earthly ministry remains influential. Lowry wrote the poem for a Christmas pageant, but struggled to compose satisfactory music. Years later, Greene finally wrote an appropriate tune while Lowry and Greene were members of the Gaither Vocal Band.

This year’s seasonal pile-on happened because the answer to whether Mary knew what Jesus would accomplish, from a biblical standpoint, is clearly yes. At the beginning of Luke’s Gospel, Mary sings the Magnificat, anticipating the themes of Jesus’ ministry, before Jesus is even born. Not only did Mary give birth to Jesus, she preached His saving message of liberation to the powerless and deliverance from the empire long before any man.

Lowry’s detractors read his rhetorical questions as denigrating Mary’s involvement in the Christmas narrative. Because the song addresses Mary, but never records Mary’s responses, this interpretation makes some sense. Throughout church history, Christians have pooh-poohed women’s contributions to Jesus’ ministry. Male clergy have turned Mary Magdalene into a prostitute, insisted Priscilla and Aquila weren’t ordained though they clearly were, and turned St. Junia into a man.

I always believe that close readings of any literature must answer two questions: who wrote it? And who did they write it for? (Serious critics will ask even more questions than that, but these two matter most right now.) Lowry, whose stage performances usually include equal mixes of singer-songwriter music and stand-up comedy, wrote these lyrics for inclusion in a Christmas pageant. Let’s consider this song in that context.

Religious theatre, such as Christmas and Easter pageants, exists for a liturgical purpose. We don’t just watch what’s happening; the shows intend to transport viewers into the moment. We stop observing from a dispassionate 2000-year remove, and become part of events. This practice of imagining ourselves into biblical stories has a long history, and was a spiritual exercise recommended by several important Christians, including Margery Kempe and St. Ignatius.

Mark Lowry

Just as important, we don’t merely watch religious pageants; we watch them together. Scholars like Émile Durkheim and Max Weber agree that the community aspect of religion matters, that coming together to share lessons and speak creeds and pray as a people, helps us commit publicly to living our principles. That, arguably, is the difference between religion and faith: we practice religion publicly, and by so doing, become unified.

Mark Lowry wrote “Mary, Did You Know?” for Christians, those already committed to faith, to come together and project themselves back to the Nativity. The questions in Lowry’s lyrics weren’t intended for Mary, they were intended for us, to remind Christians that our practices aren’t empty observances, but mean something in our lives. He wants us to imagine ourselves in Mary’s position, and ask these questions of ourselves.

Tragically, we’ve recently witnessed what happens when Christians don’t recommit themselves to Christian principles. Self-identified White Evangelical Christians have taken the lead in ostracizing immigrants, shaming the poor, and persecuting sexual and ethnic minorities. The practice of calling ourselves Christians, without constantly recommitting ourselves to the precepts Christ preached, has taken a catastrophic toll on our nation, our environment, and our people.

When Mark Lowry asks his eleven rhetorical questions, he means them for us. And he means us to recommit ourselves to the knowledge that every answer is “Yes.” Mary, a poor, unmarried citizen of a marginalized nation conquered by a foreign empire, knew. She knew these truths, and sang the Magnificat surrounded by the knowledge. She never forgot; but we, who claim to believe her child’s message, forget a lot.

If Lowry really intended these questions for Mary, they’d be presumptuous, rude, and biblically illiterate. But that’s a misreading of Lowry’s intent. These questions aren’t for Mary, they’re for us. If we can stop reflexively rushing to defend Mary from an attack that isn’t happening, we can maybe ask ourselves whether we know. Because too often, we need only look around to see that the answer is, tragically, no.

Wednesday, December 15, 2021

Freshman Comp and the Failure of Modern Journalism

Cathy Birkenstein and Gerald Graff (promotional photo)

Back in 2005, University of Illinois literature professors Gerald Graff and Cathy Birkenstein published the first edition of their composition textbook They Say/I Say: the Moves That Matter in Academic Writing. The textbook’s core premise is simple: in college-level writing, the most important rhetorical maneuver is to summarize what others are saying about a topic, then respond with your own position. This may mean disagreeing, clarifying, expanding, or refuting altogether.

In my teaching days, I adopted Graff and Birkenstein’s model early: I photocopied their published templates, even before their textbook shipped. I made the first edition required reading for my freshman comp students. But soon I noticed problems. Though their model remains solid, Graff and Birkenstein omit one important fact: your words must be supported with evidence. My students pitched straw-man opponents, then baldly asserted their unsubstantiated opinions.

This week, conservative Catholic journalist Matthew Walther published an op-ed on The Atlantic’s website: Where I Live, No One Cares About COVID. The article is paywalled, so I’ll synopsize: we good Christians in rural Michigan have never cared particularly about COVID-19 or the precautions recommended to prevent transmission. Every time journalists and media pundits expound on taking steps against COVID, Walther says, they prove themselves out-of-touch with America’s heartland.

Even beyond the undeniable fact that Walther’s opinion is counterfactual and doesn’t deserve a national platform, this op-ed is sloppily written. Walther provides only anecdotal evidence, sneers at anyone who cites science, and claims his ways are simply authentic because yours aren’t. Matthew Walther formerly wrote for prestigious publications like The Week and the New York Times, but this editorial reads like something a freshman tore off quickly on deadline.

Matthew Walther, who publishes so many photos of himself smoking that he clearly thinks cigarettes are very grown-up.

Walther states his thesis early: “I don’t know how to put this in a way that will not make me sound flippant: No one cares.” He admits that’s a hasty generalization, but turns around and says that nobody in rural America particularly cares. He supports this position with copious “I” statements:

  • “In 2020, I took part in two weddings, traveled extensively, took family vacations with my children, spent hundreds of hours in bars and restaurants, all without wearing a mask.”
  • “The CDC recommends that all adults get a booster shot; I do not know a single person who has received one.”
  • “Until I found myself in Washington, D.C., on a work trip in March, I had never seen anyone wearing a mask outside.”

I live in one of America's most conservative congressional districts, and okay, Walther isn’t entirely wrong. I’ve seen exactly what he’s talking about in every grocery store. Even I, the first person on my jobsite to start masking up, eventually wearied of fighting the tide and stopped wearing one. And then I got COVID. In my company, entire jobsites were shuttered in November 2020 because actual numerical majorities of workers caught the plague.

Essentially, Walther claims his rural neighbors have never cared or taken routine precautions, and their indifference should be treated seriously. He never considers, even momentarily, that such widespread indifference might be why we need continued precautions against a mostly avoidable contagion. Walther presents unconcerned and disengaged Americans as a base that deserves catered to. He never acknowledges they’re maybe plague rats who keep everyone else from getting better.

The fact that Matthew Walther, individually, doesn’t know anybody who wears masks or gets their shots, means nothing. I know nobody who listens to The Weeknd or eats home-cooked insects, but my unfamiliarity doesn’t matter. Globally, these activities are common and influential. If my freshmen provided such anecdotal evidence to support their vague assertions, I’d have returned their papers marked “Cite your sources.” Indeed I did, so often that I changed my textbooks.

Equally important, I wouldn’t give such baseless assertions a global platform. The Atlantic, sadly, has a history of publishing foolish, unsupported opinions by cartoonish people. We shouldn’t amplify antivaxxers, global warming denialists, or Nazis, because they’re wrong. Perhaps someday, if they provide substantiated evidence to support their positions, we might reconsider this position. But currently, baseless straw-man editorials, like Walther’s, don’t deserve space in respected periodicals.

Graff and Birkenstein aren’t wrong; voicing positions and counterpositions is a tool of serious writing. But I discovered that this triangulation means nothing until student writers know how to back their positions with evidence. If writers and audiences don’t have sufficient tools to evaluate evidence, they wind up thinking their adolescent opinions are equal to scientific arguments. And gullible magazines can be tricked into sending these bullshit opinions out worldwide.

Monday, December 13, 2021

Nostalgia Night at the Old-School Science Fiction Buffet

Cassandra Khaw, The All-Consuming World

Someone is scouring the galaxy, executing old forgotten criminals, the dregs of the post-humanist economic underbelly. Technology specialist Rita, and her hired gun Maya, realize they’re in this unidentified vigilante’s crosshairs. So they do what any reasonably well-informed cyborg criminal masterminds would do: get the old gang back together. This isn’t easy, since some of their fellows haven’t forgotten the old wounds and betrayals of their last, disastrous job.

Cassandra Khaw’s first novel reads like a literary catalog of allusions, callbacks, and nostalgia. How one responds will likely depend on how one feels about Khaw’s source material, and also how one feels about those sources being so transparently displayed. One could make a drinking game of identifying the donor parts that make up this Frankenstein’s Monster of speculative literature. Nor does Khaw conceal these allusions; they’re spray-painted big across the surface.

Maya, who has died and been reborn in a succession of augmented clone bodies for centuries, absolutely loves Rita, her “scientist.” We know because she tells us repeatedly. When Rita sends Maya into no-win situations to recover the Dirty Dozen’s lost children, Maya acquiesces, despite her misgivings. After all, last time the Dirty Dozen fought together, two of their number suffered deaths so catastrophic, even their advanced technology couldn’t help.

I remember reading these basic premises decades ago. Khaw’s jacked, technologically augmented protagonists come directly from cyberpunk, especially Pat Cadigan and Bruce Sterling. Not even tangentially, either: Khaw uses terminology pinched wholesale from Sterling and, with occasional words changed, William Gibson. Khaw’s repurposed Sprawl Cowboys might make speculative fiction readers very comfy, because they challenge seasoned readers minimally.

Moreover, Maya’s picaresque adventures collecting her team’s survivors feel very familiar. The first is now running a Fight Club so specific, its leader Ayane clearly thinks herself Tyler Durden. (An orphaned misquote reveals Khaw pinched David Fincher’s movie, not Chuck Palahniuk’s novel.) Another has squelched her past and remade herself as the galaxy’s most celebrated pop star, a trope found in countless Japanese anime through the years.

One by one, the Dirty Dozen’s barely half-dozen survivors gather on their old ship, nerving themselves, however reluctantly, to fight their unidentified enemy. But as they begin recounting narratives of their former mercenary careers, we start seeing truths these characters have willfully forgotten. Rita has deliberately pitted them against each other, sent them on suicide missions, and otherwise poisoned the well which this supposed sisterhood drinks from.

Cassandra Khaw

It takes until around the novel’s halfway mark before one character voices something I’d suspected for a while: these women (some of whom stopped identifying as women after their retirement) aren’t unified, they’re trauma-bonded. And, as they demonstrate their bizarre admixture of reverence and fear of Rita, it’s clear they have Stockholm Syndrome. As you’d expect, everyone realizes this but themselves, because the sick seldom realize they need healing.

More important, I realized something these characters didn’t: Rita regularly withholds information, and lies as easily as breathing. Their mission’s entire premise turns on the expectation that someone’s killing old mercenaries. Except we have no evidence supporting this premise, besides Rita’s word. Since we, and they, know Rita lies, the question becomes: when will everyone realize their entire mission is fabricated bullshit?

Khaw, a professional RPG designer whose résumé includes Dungeons & Dragons, performs the old Dungeon Master’s trick of framing their narrative through careful omission. We’re assured the Dirty Dozen are outlaws but, in the best cyberpunk tradition, that law is distant and poorly defined, apparently for sale through mercenaries like, say, the Dirty Dozen. We know certain characters are dead because an inveterate liar told us so. Pieces don’t add up.

Meanwhile, Khaw just keeps setting the scene, and setting the scene, and setting the scene. Rather than a singular story, Khaw regales their audience with a succession of vignettes on a theme. Again, like the separate nights of a game campaign. Because the story isn’t unified, our experience isn’t the action, but the characters. We read because we want the characters to learn and grow over the succession of events in their experience. Which, sadly, they mostly don’t do.

I don’t hate this novel. I particularly love Khaw’s narrative voice, a beatnik prose-poem banter that doesn’t so much inform us of events as immerse us in circumstances. But somewhere around page 200, I realized Khaw was still clearing their throat, still introducing characters and scenes for a story that’s always just about to happen. And, I realized, I no longer cared. Roll me a Natural One, I’m done.

Saturday, December 11, 2021

The Power of Guns and Christmas


Remember the fun, energetic folderol last week when Representative Thomas Massie (R-KY) and family posed in front of the Christmas tree flexing their guns? Wasn’t that fun? Watching online leftists lose their collective marbles because the Congressman pulled a clickbait stunt on Twitter. Then for an encore, Representative Lauren Boebert (R-CO), probably disappointed she didn’t plan the extravaganza first, recreated the image with her four minor sons.

Initial criticism targeted the Representatives for posing with guns just days after the Oxford, Michigan, school shooting. More interesting for me than the rifles, though, is the other two props these photos share: children and Christmas trees. Both Representatives are professing Christians; Massie is Methodist, and Boebert describes herself as born-again. Why would they conflate military-grade firearms with the birth of the Prince of Peace?

I’m certainly not the first Christian to complain that Christmas has come unmoored from its liturgical roots. Long before Republicans embraced the “War on Christmas,” the expression “Keep Christ in Christmas” meant resisting the holiday’s creeping commercialization. It’s too late for that, though. Like Halloween before it, Christmas has become a secular marketing phenomenon, unrelated to a religion to which America overall has a fairly lukewarm relationship.

Once associated with childlike wonder as kids discover exactly how “naughty or nice” Santa deems their year, Christmas has become a holiday of adult self-indulgence. From relatively innocent traditions like spiked eggnog and ugly sweater parties, to competitive light displays and Elf on a Shelf, Christmas has become a commercial spectacular. Again like Halloween, Christmas is no longer for children; it’s become a massive, culture-wide bender, verging on suicidal.

This transition of childhood holidays into adult extravaganzas hasn’t happened coincidentally. It’s part of a pattern where citizens of the industrialized world see the season of childlike dependence extended well into adulthood. My parents’ generation could get married, buy a house, and start a family on a job secured with a high school diploma. Today, people with multiple college credentials are deferring home ownership and parenthood into their forties because they’re broke.

Children love displays of supposed adulthood. Feeling their social, economic, and sexual maturity thwarted by the constraints of industrial modernism, they find ways to display the grown-up identity they’ve been denied. Remembering my high school years, I recall minor girls finding inventive ways to display their breasts and butts, while boys displayed images of military-adjacent machismo. Carrying gun aficionado magazines to school sure seemed grown-up to us teenagers.


Representative Boebert in particular matters here. A high-school dropout, Boebert never underwent commencement, one of the few adulthood rites remaining in American society. She later redeemed herself through entrepreneurship and a knack for PR, certainly. But in many ways, Boebert has become a public face of an American generation denied any passage to adulthood. Even her allies frequently comment on her youthful good looks, innately infantilizing her.

Consider the difference between the two photographs. Representative Massie’s family is well-groomed and posed, and look comfortable holding their firearms. (I’m unclear whether the small girl on the couch is Massie’s youngest child, or a grandchild.) By contrast, Representative Boebert’s kids look disorganized and are pointing their rifles randomly. Their facial expressions run the gamut from numb to hysterically terrified, less a family photo than a hostage situation.

Yet it’s Massie whose caption specifically invokes Santa. (The NRA posted a similar Santa-themed tweet days earlier, to similar umbrage.) This entreaty to childhood innocence, while desperately trying to look grown-up with military-grade weapons, reflects the conundrum almost every American born after 1970 has felt: we’re expected to act adult and responsible long before our society and economy trust us with more than the most rudimentary adult responsibility.

These demented family photographs make perfect sense in context. And that context is that of parents, even grandparents, trapped in a permanent cycle of adolescent dependence. We’re expected to demonstrate unquestioning loyalty to our economy, our government, our church, our nation—the wide-eyed loyalty of a child for a mother. A loyalty which anyone who’s ever worked for an hourly wage knows we’ll never see reciprocated in this lifetime.

I admit to having an uneasy relationship with guns. When American lawmakers display their weapons as literally children’s Christmas toys, my immediate reaction is hostility. But when I distance myself from that history, and consider the situation from the Representatives’ perspective, I realize: they’re a lot like me. They feel powerless and estranged in a culture they don’t understand. They handle it differently, but fundamentally, they’re not that different from me.

Wednesday, December 8, 2021

Society Is a Machine To Be Broken (Part Three)

This essay follows Society Is a Machine To Be Broken (Part One) and Society Is a Machine To Be Broken (Part Two)

The genetic similarities between the Terminator and Matrix franchises are so obvious, they barely deserve further description. Both depict humanity overthrown from within, by the machines we built to serve us, by the social systems we created and couldn’t control. But the differences between these franchises speak volumes to the options available to humanity going forward. And those differences originate in the contexts that created them.

Written and filmed in the bleakest years of the Cold War, The Terminator assumed the defense networks humans created would identify humanity overall as the enemy. This isn’t a stretch, after all, as the greatest threat to world peace is certainly the humans living on it. Those of us old enough to remember Ronald Reagan and the SALT Treaty negotiations will remember believing that little fundamentally mattered, because the bombs would drop any day.

By contrast, The Matrix dropped in the late 1990s. As fear of imminent global firebombing receded, America retreated into a hangover of Furbies, Spice Girls, and NAFTA. Where The Terminator anticipated a war machine classifying humanity as the enemy, The Matrix forecast humanity’s synthetic workforce rising against us, the bloated consuming class. One franchise considered humanity overprepared for war; the other saw humanity fattened on peace.

What each franchise anticipated arose from its circumstances. The Terminator foresaw a militarized future, where constant war against the machines has become humanity’s default; to be human, in Skynet’s world, is to be permanently part of the French Resistance. Kyle Reese sleeps with his helmet on, cuddling his M-16 like a teddy bear, because gun-toting infiltration units could overrun his bunker at any time. Humans, in The Terminator, are constantly awake.

Compare that to The Matrix, where humans are constantly asleep. Rather than occupation and slaughter, these machines offer humanity comfortable dreams and superficial meaning, in return for our complete subservience. The illusion lets humans continue living, working, and even rebelling, in ways controlled and permitted by the system. Of course, none of it is real, and the rewards generated go entirely to the machines, who own everything.

Note that before his liberation, Neo alternates between the conformity of employment, and the counterconformity of acid rave culture. He maintains the postures of rebellion, the actions of freedom. But this freedom, this rebellion, doesn’t really exist; no choices exist, except those for which the machines have written a template. He yearns for autonomy, but can’t see past the illusions created by the socioeconomic system.

Please note, though: reality didn’t unfold that way. We GenX’ers, whose childhood was dominated by fear of annihilation, haven’t been constantly awake; we’ve largely ceded authority to a gerontocracy that has retained control for literally decades, often without improving anything. Millennials and Zoomers, however, those raised on the post-Reagan economic surfeit satirized in The Matrix, are among history’s most politically engaged.

Either way, these franchises promise humanity a singular messiah, a designated deliverer whose unique skills and connection to the system will redeem humanity from its disaster. Whether a military conqueror like John Connor, or a spiritual guru like Neo, both assume that, when circumstances permit, the chosen individual will appear and restore the balance. Our enemies will be routed, our freedom will overcome, and humanity will reclaim its independence.

And yet…

In both cases, this triumph appears transitory. John Connor’s victory in Terminator 2 is repeatedly overwritten in sequels, though these revisions never hold—perhaps, I’d suggest, because the Cold War circumstances that made The Terminator so terrifying no longer exist. Meanwhile, trailers for the anticipated fourth Matrix movie depict Neo having to relearn his messianic nature. It seems, in Hollywood, our sci-fi messiahs never quite stick.

We can attribute this, partly, to the requirements of the franchise. If the characters’ problems are ever permanently resolved, the story ends. (Witness how the stories flailed when Disney tried to keep the galactic civil war alive in Star Wars.) But it also reflects our inability to imagine what happens on the other side of this life. We can’t imagine salvation, or even post-capitalism, except through analogies to this life, which are generally unsatisfying.

No matter whether we grew up amid the nihilism of the Cold War, or the orgiastic backwash that came afterward, we’re conscious, on some level, of the machinery that dragged us here. We know this condition isn’t natural, or inevitable, and we seek someone to deliver us from this system. But that deliverance is permanently in the future, because deep down, we can’t imagine any lasting condition but servitude.

Tuesday, December 7, 2021

The “Spiteful Style” in American Politics

You probably saw it circulating on Twitter this week: a comically ridiculous image claiming to be an official Christmas card, signed “President Donald J. Trump” and datelined “The Winter White House.” The phallic imagery is glaring, and the design looks homemade. It looked extravagant and gauche, which made it more plausible: we’re discussing a man who owns a solid gold toilet in a Manhattan penthouse.

Only one problem, of course: it’s fake. The “card” doesn’t appear on the former president’s website, and I’d question the legality of him using the presidential title (though he’s entitled to be addressed as “Mr. President” for the rest of his life). The tuxedo image comes from a particularly embarrassing photo taken during a state meeting with Queen Elizabeth II in June of 2019. The aggregate image looks handmade because it probably is.

Despite being a transparent forgery, the image burned hard across Twitter yesterday for one simple reason: it told True Believers what they already believed. By depicting the former president as a literal dickhead, it permitted his opponents to engage in worldwide mutual congratulations, an exercise that blurred the lines between applauding each other, and a circle jerk. It made us feel good about ourselves without actually doing anything.

The contemptuous attitude underlying this meme definitely precedes the Internet. Ridiculing your ideological opponents’ looks and lack of perceived grace has been a staple of talk radio, late-night comedy, and basic cable discourse at least throughout my lifetime. Its tendencies, however, have been accentuated by use of algorithms on search engines and social media. We’re guaranteed to see whatever image verifies our existing views.

Watching this demonstrably fake image circulate this week, I’m reminded of something I told Freshman Comp students back in my teaching days: being mean is fun. Taking potshots at your opposition, highlighting their awkward quirks and personal shortcomings, maximizing the other side’s insignificant image problems: it’s fun. Anybody who’s ever participated in online dogpiling knows this. Like schoolyard bullies, you quickly get addicted to the endorphin rush of being cruel.

The source of the tuxedo image, June 2019

Furthermore, rhetorical cruelty can serve a valid purpose: it builds group bonds. Anyone who has listened to Rush Limbaugh dump personal invective on Democrats, or to Keith Olbermann unload personal animus on Republicans, has felt the rush of insidership. We’ve felt vindicated, watching these professionals state the opinions we already have, except with more eloquence. Watching this kind of partisan pettiness reminds us we’re on the “right” side of the argument.

Therefore, if you want to energize a base to action, casual cruelty is an effective tool. The former president utilized this tool effectively, and while it never netted him a popular majority, it delivered an electrified base sufficient to win on a technicality. Democrats, preserving the Obama-era philosophy of “When They Go Low, We Go High,” have meanwhile struggled to hold onto a fiddling plurality by their shrinking fingertips.

However, whatever short-term gains rhetorical spitefulness delivers, it always undermines the long-term agenda. Coalitions held together by malice and wrath attract few converts, and inevitably lose followers when someone crosses some invisible line. Consider your MAGA friends who supported the January 6th insurrection, and ask yourself: would you trust them to babysit your dog? Do you think you’ll ever trust them again?

Whenever I see people I generally consider trustworthy sharing the above meme, with its implicit message of “Haha look, the man’s a literal dickhead,” I find myself questioning how trustworthy I’d consider them in power. If someone’s entire political mindset is based on being as mean as possible to the other side, I question how genuine their motivations are. What’s to stop them, when their candidate gets elected, from seeking vengeance?

Don’t misunderstand me: I’ve fallen prey to this before. I too-credulously accepted accusations like the “Defend Billionaires” billboard or fabricated politician quotes. It’s easy to assume that the worst possible behavior from the other side is not only plausible, but commonplace. And it feels good to push that conclusion to its ultimate end, pat myself on the back for being right all along, and brag. I definitely understand the temptation.

But that doesn’t make the action any less wrong. Just as important, it doesn’t make the behavior any less detrimental to our long-term goals. The algorithm and the 24-hour news cycle mislead us into believing scoring points against the enemy is of paramount importance (and, in fairness, it helps). But petty spitefulness only works to deepen divisions, make the other side unwilling to listen, and make our national situation worse.

Friday, December 3, 2021

Do We Even Believe In “Justice” Anymore?

The 2021 U.S. Supreme Court (official photo)

The disgraceful spectacle of this week’s Supreme Court arguments should shock any Americans who believe in our founding principles. In Dobbs v. Jackson Women's Health Organization, the Court’s conservative majority signaled the likelihood that it will give constitutional imprimatur to America’s most restrictive abortion regulations in fifty years. The ruling’s date remains unclear, but its contents are a foregone conclusion.

I was struck by something Justice Sonia Sotomayor said, challenging not the litigants, but fellow judges: “Will this institution survive the stench that this creates in the public perception that the Constitution and its reading are just political acts? I don’t see how it is possible.” Sotomayor is exactly right: anti-abortion activists pushed this case now specifically because recent partisan tilts have made a formerly unthinkable stare decisis change possible.

This got me thinking: what, exactly, constitutes “justice”? How do we know a just and honorable outcome when we see it? America’s entire precept of representative democracy begins with the assumption that, given the opportunity to debate publicly, sufficiently large groups will reach a consensus that everyone agrees is fair. But two-and-a-half centuries of American law provide enough evidence to call this precept into question.

NYU psychology professor Jonathan Haidt writes that, while Western humans like to believe ourselves rational beings who reach moral conclusions through exhaustive reasoning, this is an illusion. We reach conclusions more-or-less instantaneously, then construct reasoning retrospectively, to justify what we already believe. Frequently, our definition of “fairness” and “justice” reflects not some external agreed-upon precept, but whatever justifies our already-held beliefs.

This runs counter to America’s founding principles. “We hold these truths to be self-evident” assumes that not only does truth exist externally, but it’s visible to anyone watching with dispassionate eyes. This, sadly, just isn’t true. The longer I watch politics and current events, the more glaring it becomes that most legislators and jurists enter arguments with a foreordained notion of truth, a notion entirely consistent with their partisan alignments.

Superficially, it seems obvious to me that “justice” must involve defending the poor, minorities, and the disfranchised against the whimsy of the majority. Any system where the majority runs unchecked will inevitably result in the injustices embodied in tyrants like George Wallace and Bull Connor. Obviously. I enter any debate about justice with this supposition already prepared, ready to deploy against any challengers.

The U.S. Supreme Court building

Yet I increasingly realize this definition is vulnerable. If the minority needs defended, then minority status becomes treasured: consider those White evangelical Christians who believe themselves oppressed because they’re not permitted to exclude Black people and LGBTQIA+ people with impunity. But it goes further. What constitutes “oppressed enough”? Consider Jussie Smollett, who, though Black and gay, felt he needed a violent backstory to justify his financial success.

Therefore my definition of “justice” is as incomplete as anybody else’s. Though I may reflexively rush to defend minorities, I lack a sufficient idea of what constitutes a “minority,” and also what constitutes “defense.” I expect the nuances to work themselves out in practice, leading inevitably to the question: do they? Recent American history suggests they do not. My definition of “justice” is as insufficient as anybody else’s, apparently.

Watching this week’s arguments, I felt queasy about the almost inexorable outcome. Though I dislike the idea of abortion as birth control, I recognize that it's sometimes the best alternative. The women most likely to need abortion are disproportionately poor, young, and not White: a trifecta of disfranchised groups. These, to me, are the groups most immediately in need of defense by those who call themselves “Justices.”

The very precept of the Supreme Court holds that justice derives, not from individual principles, but public deliberation. We can argue and dispute our way to knowing the truth. We White Americans like to believe in the Supreme Court’s oracular wisdom, justified by decisions like Brown v. Board of Education. But those decisions are statistical outliers; historically, the Court more often defends existing power structures than challenges them.

Our Constitution was written by people who believed that argument and informed dispute could uncover the truth. But they didn’t represent America generally; the records of the 1787 Constitutional Convention reveal the words and opinions of an entirely White, well-off, male country club, whose members disagreed mainly on regional grounds. That’s partly why the Constitution is so short, at only 7,591 words (including amendments): it’s a compromise document.

Justice Sotomayor is right, this stink will never wash off. But not because anything changed this week; only because the truth is now visible.