Monday, September 23, 2024

The Persistence of Epic Novels

Plato and Aristotle, depicted by Raphael

Aristotle, in the Poetics, records a sentiment probably shared by 3,000 years of classical studies students: the epics of Homer are too damn long. The Iliad and the Odyssey each comprise twenty-four “books,” each book running hundreds of lines. In the ancient Athenian recitation competitions, one speaker would memorize one book, and one or two would speak per day, meaning recitations lasted anywhere from nearly two weeks, to nearly a month.

To Aristotle, this was excessive, and he asked for “epics” which a single performer could recite in one or two nights. But Aristotle, writing half a millennium after Homer, might’ve missed something important. Classicist Barry B. Powell suggests, in the introduction to his translation of the Odyssey, that Homer’s performances probably did last only one night. Today, when Homer is “required reading,” we forget he was pop entertainment in his day.

Powell postulates that Homer was probably illiterate, and didn’t have his massive epics memorized. Instead, Powell suggests that Homer improvised his poetry on well-known themes, and each performance was unique. These probably lasted for one- or two-night engagements. But when somebody finally pledged to transcribe Homer’s poetry, Powell believes, Homer composed in long-form, unpacking details usually elided, to create a sweeping saga intended for the page.

This difference matters. Even three millennia ago, Homer realized that people reading a book committed themselves to something vaster, with a willingness to persevere across the span of days. Works intended for public recitation generally run much shorter—it takes approximately two hours to read the entire Gospel of Mark aloud. But books are a different order of beast, and Homer, transcribing early in the medium’s history, already knew that.

Now, as then, we have conflicting desires for short-form and longer entertainments. There’s nothing new under the sun. We want stories we can devour in one sitting, often already prepared for the next one, and we want stories that take days to consume, which we must pursue doggedly. But the changing media market has split our interests. First television, and more recently the internet, swallowed up the market for short-form, one-evening entertainments.

An undated traditional depiction of Homer

Authors like F. Scott Fitzgerald once wrote short stories for commercial purposes. The magazines in which they published, like The Atlantic, the Saturday Evening Post, and The New Yorker, had large audiences and paid handsomely. At his peak, Fitzgerald notoriously tore off short stories for magazine markets to get paid, and wrote the novels for which he’s now more famous to indulge a literary inclination that wouldn’t have paid particularly well.

Since Fitzgerald’s time, the magazine market has become considerably less lucrative. The Atlantic no longer publishes fiction, and The New Yorker mostly publishes well-known names or controversial content that drives engagement. A handful of genre magazines pay moderate rates, and art purists still regularly launch small-circulation “zines,” but the short-form entertainment market has moved mainly onto electronic media.

Conversely, contra Aristotle, the book continues to harbor the epic format he thought Homer overstretched. From Dante and Milton to J.R.R. Tolkien and Tom Clancy, authors preserve Homer’s physical length and dramatic sweep, because people really want that scale and that depth of immersion. People want an “entertainment” that will take them out of themselves and their own lives for literally days, even weeks, at a stretch.

And electronic media largely can’t provide that. Compare the prose and streaming versions of one of our time’s greatest epic authors. Robert Jordan’s Wheel of Time series certainly has its detractors, but Jordan’s audience remained loyal, even when many felt his middle novels lagged. Not only were the books physically and thematically massive, but his story unfolded across multiple books, and readers remained immersed in his milieu for years.

But the streaming media adaptation of Jordan’s books largely fell flat. Not only did it take two years to produce a second season, even with the storyline already written, but when the product finally emerged, it suffered from cost overruns and content bloat. The audience from Jordan’s novels disliked the compromises which the transition between media forced, and the series has struggled to find a non-book audience.

Simply put, Aristotle was wrong. Audiences don’t find the length, density, and complexity of Homer’s epics off-putting. Indeed, as TV and Netflix continue consuming our short-form entertainments, audiences not only still buy books, but they support longer books and convoluted book series. Aristotle thought the hoi polloi weren’t lettered enough to follow stories that took days to develop, but given the choice, that’s apparently what audiences want.

Monday, September 16, 2024

Okay, But Why Haitians Specifically?

Hispaniola, with Haiti in green, in a map from the Encyclopedia Britannica

Springfield, Ohio, mayor Rob Rue has spent the last week on a multimedia blitz campaign to shut up the former President of the United States. After Donald Trump lied outright in last week’s Presidential debate about Haitian immigrants eating pets, his community has been plagued with harassment, bomb threats, and constant incipient violence. Springfield’s Haitians, mostly factory workers and their families, are reputedly afraid to walk outdoors.

Those who know me best, know that Haiti and Haitian issues weigh heavy on my heart. I learned of the country in 1991, when a military junta under General Raoul Cedras overthrew the legitimately elected government of Jean-Bertrand Aristide. I became curious about why Haiti, the second-oldest independent nation in the Western Hemisphere, hasn’t become as prosperous, democratic, and free as the United States. So I set out to learn.

Unfortunately, few sources publish much about Haiti outside close-knit foreign policy circles. Most public libraries have only two authors: Wade Davis and Paul Farmer, who perceived Haiti very differently. Despite being within spitting distance of Florida and Puerto Rico, Haiti remains terra incognita to most Americans. Therefore I don’t claim to be a Haiti expert or Haitian studies scholar; I’m some guy who cares deeply and seeks information wherever it dwells.

In that capacity, a friend asked me this weekend: why do politicians single out Haitians, specifically, to embody their paranoia about immigration? After all, Trump previously identified Haiti by name in his notorious “shithole countries” comments, and conservatives from Ted Cruz to Ted Turner have name-checked Haitians as diseased, criminal, and otherwise generally undesirable. In today’s fraught world, what makes Haitians so noteworthy?

I cannot encapsulate Haitian history into a 750-word blog without performing a gratuitous disservice. Though rebellious slaves drove French colonial powers from the colony formerly known as Saint-Domingue in 1804, France withheld diplomatic recognition until 1824. The United States didn’t recognize Haiti until 1862, because before then, Southern slaveholders couldn’t stomach a republic born of slave rebellion on America’s back porch.

The United States came into existence with a landed aristocracy, a veteran diplomatic corps, and allies in France and Spain. Haiti had nothing like that, and developed in isolation for its first generation. Without experienced governors, the country’s founders bungled the launch, and the nation never fully recovered. Though Saint-Domingue’s slave-driven sugar plantations made it the richest colony in the Caribbean, the liberated slaves couldn’t sustain the economy.

Springfield, Ohio, mayor Rob Rue has become a national celebrity this week. One suspects he probably didn't want the notoriety.

This dispossession opened Haiti to manipulation. Especially during the Cold War, America propped up military strongmen throughout the Western Hemisphere as safeguards against communism. This includes Haiti’s Duvalier dynasty, which governed through terror of its secret police, the Tonton Macoute. When Baby Doc Duvalier, manifestly incompetent, finally fled Haiti, American intelligence forces collaborated to shelter him in France. Later, future dictator Raoul Cedras attended the School of the Americas.

The United States and France have feuded for economic dominion over Haiti, as the most populous Francophone nation in its region. America currently holds sway in that regard, as President Clinton made President Aristide sign a free-trade agreement as a precondition for American intervention. As usually happens when agrarian nations sign free-trade agreements with America, cheap American produce drove rural poverty above seventy percent.

Throughout the 1970s, tourists from France, America, and Quebec flocked to picturesque Haitian beaches, mostly around Cap-Haïtien and Tortuga. As often happens in chronically impoverished countries, tourists began paying for sex with cash-strapped locals. The timing was unfortunate. A strange, little-understood sexually transmitted virus was just gaining ground in North America at the time. By the 1980s, AIDS was rampant among Haiti’s poor.

Notice the pattern. European (and later American) powers imported slavery, political instability, poverty, and disease into a once-thriving nation. Literally from the very beginning, as some historians believe Christopher Columbus made his first Western Hemisphere landfall in Haiti. “Ayiti” is the native Taino name for the island of Hispaniola. Wealthier, stronger powers spent over five centuries exporting our worst sociopolitical outcomes to Haiti, then blaming Haiti for receiving them.

Whenever strongmen like Donald Trump look at Haitians, they don’t see people. They see the outcome of Euro-American policies that target the poor, the non-White, and the distant. They see the living, walking embodiment of Western imperialism’s consequences for the most disadvantaged. Haiti, by its existence, convicts Western imperialists of their sins.

Given an opportunity to make even the most nominal repairs to the damage Empire causes, they instead turn blame outward. Imperialists, apparently, will always punish the poor among us for being poor.

Wednesday, September 11, 2024

“Simple Facts,” and Other Political Lies

Steve Benen, Ministry of Truth: Democracy, Reality, and the Republicans' War on the Recent Past

Donald Trump notoriously occupies a parallel world, where he twice won the popular vote, engineered a thriving economy, and wasn’t helped into office by Russia. That’s fine, by itself; many people suffer from what netizens call “Main Character Syndrome.” Trump, however, has a unique capability to draw others into his fantasies. Political writer Steve Benen offers to analyze how exactly this happens. Yet somehow, the analysis never quite arrives.

Specialists within a field will sometimes promise to explain something fundamental to their discipline, but ultimately just describe it. I encountered this phenomenon in the writings of sociologist Duncan J. Watts, though it predates him. Watts writes that art historians, purporting to explain why Leonardo’s Mona Lisa is the best artwork ever, will then describe its traits. The Mona Lisa is best, Watts claims, because it resembles the Mona Lisa.

That, I believe, happens with this book. Benen breaks the Republican machine’s structure of lies into seven broad categories, including election denialism, the unfinished Wall, his merely okay economy that couldn’t withstand COVID-19, and the January 6th insurrection. Benen lists the lies extensively, though largely in ways that assume you remember what really happened. And, unfortunately, that’s about all he does, reprinting a 200-page laundry list.

First, notwithstanding the title, Benen reveals a problem less with the Republican Party than with the Trump political apparatus. Benen cites several Republicans who directly criticize Trump’s specious narrative, including Dan Coats and John Bolton, both former Trump insiders. But Coats and Bolton aren’t running for reelection. Sitting Republicans must share Trump’s parallel universe because, if they don’t, they know they’ll get turfed out in the primaries.

Then, having listed the former administration’s fantasies, which he debunks with rudimentary Google searches and appeals to our own memories, Benen… stops. Having identified a pattern of obfuscation, Benen believes he’s completed his responsibilities. But his intended audience, which shares his overall dim opinion of the Trump years, will likely respond: “No shit.” Because, as Benen writes, we remember what really happened. We watched it happen on live TV.

Don’t mistake Benen’s motives, or mine. Political operatives spin events to suit a partisan narrative, and wise voters anticipate it. Skillful politicians make defeat look temporary, embarrassment seem minor, and the other side’s foibles appear as moral catastrophes. Benen acknowledges this early, and clarifies that he means something altogether different. Trump’s organization repeatedly insists our eyes lie, and reality is Trump’s narrative, not the evidence of our senses.

Steve Benen

Trump and his political hangers-on began spinning false narratives with such alacrity, it boggled the imagination. From Day One, he dispatched Sean Spicer to propagate false reports of massive Inauguration Day crowds, despite the paltry attendance displayed on global TV the day before. Trump’s machine began spinning false yarns about the Lafayette Park clearance on June 1st, 2020, or the January 6th insurrection, literally within hours.

More interesting, Trump’s lies aren’t even internally consistent. Anybody who fibbed to their parents about grades or curfews knows that sticking to the fabricated narrative is key. But, as Benen’s accounting shows, Trump’s falsehoods shift with political winds and Trump’s personal mood. Partisan accounts of, say, the Russian election interference case, change repeatedly. The only constant is that any account Trump dislikes is dismissed.

Benen’s audience probably shares my curiosity. We remember what happened because, as Benen writes, we watched it happen live. I care more about two follow-up questions: how can Trump and his supporters be so brazen in their untruths? And why do rank-and-file Republicans believe this baloney-sauce when they, like everyone else, saw what really happened? What political or sociological mechanism lets Trump spin obvious fantasies, while others eat it up?

Those answers aren’t forthcoming. Benen’s mostly left-leaning audience knows already that Trump can lie flagrantly, about events we all saw happen, and face few electoral consequences. We also know that nobody apparently shares Trump’s mojo. We all watched Ron DeSantis and Kari Lake using Trump’s playbook, and we watched them descend into national laughingstocks. Trump, alone, has this ability to invent another reality, and sell it to a loyal base.

Again, no kidding. These facts (ironically enough, under the circumstances) aren’t in dispute. But without further understanding, we’ll find ourselves trapped inside the event horizon of Trump’s black hole, debunking similar falsehoods forever. Benen admits, in his epilogue, that he lacks any solution. But he also lacks any deeper analysis, any willingness to read between the lines. He tells us what we already know—and what Trump already denies.

Monday, September 9, 2024

I Don’t Like How Pessimistic I’ve Become

Scarcely had the dust settled following last week’s shooting at Apalachee High School, in Barrow County, Georgia, before the speculations started—most of them predictable. It’s the ubiquity of guns in America! It’s the lack of access to mental health care! Is the shooter trans, queer, or otherwise marginalized? Batten down the hatches, kids, a massive political shitstorm is a-brewin’!

Little information is forthcoming about the shooter’s motivations. We know he’s fourteen years old,* that his parents purchased the firearm he used, and that, in an unusual maneuver, authorities have charged his parents as equally culpable for the catastrophe. The swirling accusations about overwhelming cultural trends might speak to whatever motivated the shooter, but for now, verifiable facts remain scarce.

This violence comes amid perhaps the most acrimonious election season America has seen, outdoing the previous most acrimonious election cycle, which was the last one… and the one before that. Apocalyptic rhetoric on both sides has become de rigueur in Presidential campaigns. America’s economy is supposedly thriving, but doesn’t feel productive for most Americans, since only the rich reap rewards.

As usual amid such apocalyptic circumstances, ordinary people try to reclaim the supposed lost grandeur of the past, while neglecting to plan for the future. One Presidential candidate promises to “make America great again,” while the other yearns for the New Deal unitary order. Exactly when this greatness occurred remains vague, since the New Deal was as organizationally racist as the lily-white simplicity of Leave It To Beaver.

Meanwhile, hucksters remind every demographic that they can reclaim lost meaning… for cash on the barrelhead. “Men used to go to war,” laments the recurrent meme, “and now they [insert ordinary thing people do for fun].” Alpha male influencers like Andrew Tate and Alex Jones offer classes, nutritional supplements, and private guidance to become a dominant he-man. Dave Ramsey promises that Jesus wants to make you rich.

Since Andrew Tate is currently awaiting trial for sexual assault and human trafficking, we know what, in his mind, constitutes male strength. Anybody who’s worked the Sunday lunch rush at most restaurants knows that outspoken Christians can’t be trusted with money. Then, while men teach other men to abuse women, and Christians teach other Christians to hate workers, the rich think they can outspend the Grim Reaper.

Karl Marx believed that industrial capitalism would empower the working poor to develop class consciousness and overthrow their economic overlords. That maybe seemed reasonable amid the Dark Satanic Mills of pre-Victorian England, when labor actions frequently escalated into armed confrontations with literal liveried royal soldiers. To Marx, it probably seemed inevitable that what he called “alienation” would soon be universal.

It is, yes, but not as he envisioned. As the agrarian ideal recedes in memory, and factories, coal mines, and shopping malls seem inevitable, labor actions have stopped resisting the employer; workers instead defend their way of life, the system that keeps them permanently impoverished, and the bosses and billionaires who loot the masses for money. Because industrialism seems inevitable, the poor stan shamelessly for the rich.

This maybe explains trends we’ve all witnessed. It seems straightforward to me, that billionaires and resource hoarders keep the White working class impoverished, but White workers turn their rage on Black and Brown people and immigrants. Women certainly aren’t sending men to war or to the salt mines—most hyper-rich are men too—yet men pummel women, literally or figuratively, to reclaim their masculinity.

Barrow County, Georgia, is mostly White and relatively middle class. Like many similar regions, however, it’s seen its racial demographics become more diverse, and its average income stagnate, for the last quarter century. As usually happens, the economic powers use this changing population to chisel huge concessions, which means all economic gains, insofar as there are any, will trickle up.

The Apalachee High School shooter, like millions in his generation, watched the future he and his family were promised shrivel to almost nothing. Like millions of others, he looked directly at the entrenched powers making his old world look inevitable, and who now stand bathed in the economic carnage they wreaked upon the community. And like always, he blamed the poor for his plight.

Until Americans demonstrate enough imagination to realize that another world is possible, this violence will repeat itself in America’s schools, centers of commerce, and public spaces. These are the places where the world inside our minds has become small, circumscribed by our economic conditions. Mass gun removals, besides being impractical, won’t change the underlying mindset.

*Per my standard practice, I will not say the shooter's name, lest I contribute to his unearned notoriety.

Friday, September 6, 2024

An Oasis in the Desert of Reality, Part Two

The original Oasis lineup from 1994, as per NME
This essay is a follow-up to An Oasis in the Desert of Reality

The recently announced Oasis reunion was followed almost immediately by something that’s become widespread in today’s music industry: price gouging. Euphemistically termed “dynamic pricing,” the online structure lets ticket outlets jack up prices commensurate with surging demand. This means that, as tickets became available for the paltry seventeen dates around the UK and Ireland, computers hiked prices to £350 ($460) per ticket, outside their working-class audience’s budgets.

American audiences could summarize the exorbitant prices with two words: Taylor Swift. Tickets for Swift’s Eras Tour originally sold for $49. Quickly, however, a combination of forces, including the Ticketmaster/LiveNation merger, unauthorized resale, and limited access, bloated prices to over $4000 in some markets. Swift’s mostly young, mostly female audience base obviously can’t afford such numbers. In both cases, predatory marketing ensured that fans best positioned to appreciate the respective artists, could least afford access.

But both artists share another characteristic. Why do these acts hold monolithic grips on their markets? This perhaps made sense when the Beatles upended the music industry in 1963 and 1964, when fewer media options ensured a more homogenous market. But advances in technology have granted listeners access to more radio stations, innumerable streaming services with nigh-infinite channels, and more opportunities to share music with formerly niche fandoms.

Yet increased diversity produces a paradoxical outcome, embodied in the crushing demand for limited tickets. As listeners have access to more artists, more styles, and the entire history of recorded music, demand has concentrated on a smaller number of artists. I’ve noted this trend before: as the buy-in to create and distribute music has become more affordable, the actual number-one position on the Billboard Hot 100 has become less diverse.

Some of this reflects the way conglomerate corporations throttle access. Sure, entrepreneurial artists can record music in their bedrooms at higher quality than the Beatles dreamed possible, and distribute it worldwide almost for free. But approximately half the music market today travels through one outlet, Spotify. And, as Giblin and Doctorow write, Spotify’s algorithm actively steers listeners away from indie, entrepreneurial, and otherwise non-corporate options.

Taylor Swift in 2023

Three corporations—Sony BMG, Universal, and Warner—currently control over eighty percent of the recorded music market. That’s down from four corporations controlling two-thirds of the market in 2012, the year Universal absorbed EMI. (EMI, in turn, owned Parlophone, the label which published the Beatles.) Without support from the Big Three, artists have limited access to publicity, distribution, or the payola necessary to ascend Spotify’s ladder.

Aspiring musicians might, through good business management, make a middle-class living through constant touring and indie distribution. But without a major-label contract, musicians can’t hope for basic amenities like, say, a full weekend off, much less enough pay to buy a house. And with only three corporate overlords controlling the supposedly large number of major labels, musicians will always have a severe disadvantage.

Sure, the occasional musician might beat the odds and challenge the Big Three oligarchy. Taylor Swift notoriously forced both Spotify and Apple Music to pay overdue royalties to backlist artists a decade ago by withholding her lucrative catalog. But most musicians, even successful artists with chart hits, can’t expect to ever have such influence. Citing Giblin and Doctorow again, many artists with chart hits still bleed money in today’s near-monopoly market.

This structure encourages blandness and conformity, from both artists and audiences. Oasis sounded new and different thirty years ago, but they’re now comfort listening for a middle-aged, middle-income British audience. Taylor Swift samples styles and influences so quickly that she’s become a one-stop buffet with something to please (or displease) everybody. Both artists give audiences what they expect to hear—assuming they can hear anything over the stadium crowds.

Collective problems require collective solutions. Start by supporting politicians who enforce existing antitrust and anti-monopoly laws—and not supporting the rich presidential candidate who brags about not paying contractors. But we also have private solutions, starting with attending concerts and buying merch from local and indie artists who reinvest their revenue into their communities and stagehands. Tricky for some, yes, as most music venues are bars.

The problem isn’t hypothetical; we have changed in response to a diminished market. As Spotify, LiveNation, and the Big Three have throttled the music industry, we listeners have responded by accepting diminished standards, and consuming blander art. Though we have the illusion of choice, we allow the corporations to channel us into a less diverse market, and buy their pre-screened art. Independence is neither cheap nor easy, but it’s urgently necessary.

Wednesday, September 4, 2024

An Oasis in the Desert of Reality

This photo of Liam and Noel Gallagher has accompanied the announced Oasis reunion tour

Perhaps my opinion on the announced Oasis reunion tour is distorted by my being an American who streams British media online. This makes Oasis seem both larger and smaller than they actually were: they planted twenty-four singles in the UK Top Ten, but only one in America. “Wonderwall,” obviously. Two other songs, “Don’t Look Back in Anger” and “Champagne Supernova,” have also had lingering afterlives on alternative radio.

Thus, although I appreciate the panicked rush to purchase reunion tickets, I can’t participate. Oasis represents a place and time, one in which I couldn’t participate because worldwide streaming scarcely existed before the Gallagher brothers stopped working together in 2009. I realize that lost era means something important to those who lived through it. However, the timing seems exceptionally pointed, knowing what I do now.

The Gallagher brothers announced their reunion tour as Oasis almost exactly thirty years after the release of their record-setting debut album, Definitely Maybe. Released on 29 August 1994*, this album outsold every previous British debut, and all four singles went Gold or Platinum. Though Oasis wouldn’t have an American breakout until their second album, they didn’t need one; they already had Beatles-like acclaim on their first try.

Nostalgia vendors always seem to think things were Edenic approximately thirty years ago. Growing up in America in the 1980s, popular media regaled me with giddy stories of how wonderful things were during the Eisenhower administration. TV series like Happy Days and MASH, or movies like Stand By Me and Back to the Future, though they commented on then-current events, nevertheless pitched a mythology of prelapsarian sock-hop ideals.

Importantly, these shows all spotlighted not how the 1950s were, but how the aging generation remembered them. MASH endeavored to humanize the horrors of war, during a time when Vietnam was still too current for commentary, but it did so in ways that only occasionally included any indigenous population. Happy Days was set in Milwaukee, Wisconsin, one of America’s most racially segregated cities, but featured almost no Black characters.

This whitewashed nostalgia reaches its apotheosis of silliness with Back to the Future, in the prom scene, where White Marty McFly shows a Black backup band how to really rock out. The maneuvers he tries might’ve looked shocking to the White kids on the dance floor, because they lived pre-Jimi Hendrix. But they were hardly new; Marty didn’t do anything Charley Patton and Son House hadn’t invented in the 1920s.

For Oasis to reunite almost exactly thirty years after they deluged British music puts them in the same position. The announced reunion lineup currently includes only the Gallagher brothers, omitting the revolving door of sidemen and rhythm sections that supported them for fifteen years. The band was contentious even before the brothers stopped working together; they never produced a dark horse, like George Harrison, to emerge from the wreckage.

I’ve written before that the nostalgia impulse produces the illusion of inevitability. Because events happened a certain way—because the Beatles hit number one, because Dr. King bested Bull Connor, because America beat the Soviets to the Moon—we believe things had to happen a certain way. This removes human agency from history, giving us false permission to go with the flow, like a dead salmon floating downstream.

Oasis hit the mainstage when John Major’s Tory government was walking wounded. Though Major had received a Parliamentary mandate in 1992, he was already deeply unpopular by 1994, helping Thatcherism limp timidly across history’s finish line. Like their beloved Beatles, who broke through just as Alec Douglas-Home’s doomed premiership ushered the Tories out, Oasis arrived to supervise a Conservative collapse.

And now they’re reuniting as Rishi Sunak has nailed shut another Conservative coffin.

But history isn’t inevitable. Tony Blair’s nominally progressive government, like Bill Clinton’s, openly embraced moralism and militarism, before ending in the massive moral sellout of Operation Iraqi Freedom. Likewise, Keir Starmer is possibly the blandest person ever elected PM, winning only because the Tories squandered every advantage. Starmer, like the Oasis reunion, bespeaks a rare British optimism, without much of a plan.

I’d argue that most people don’t really want an Oasis reunion. I strongly doubt anyone wants to watch two White men pushing sixty warble about the troubles of 1994 onstage. Their largely British audience simply wants to time-travel to a moment when they didn’t have to know what Brexit was, who Boris Johnson is, or why a down payment on a London house is triple the average annual income.

*Out of respect, I use British dating conventions in this essay, and only this essay.

Monday, September 2, 2024

Low-Budget Monsters and High-Price Consequences

Paul Tremblay, Horror Movie: a Novel

Thirty years ago, when longshot indie movies became a realistic media presence, four New England kids decided to make a horror film. Decades later, the unfinished movie’s moldering remains have become a viral internet sensation, and the last surviving cast member is involved in helping the production “go Hollywood.” Between the two stories of cinematic hubris, one survivor recounts his tale of deep immersion, and the stains that don’t wash off.

Stoker Award-winning novelist Paul Tremblay’s books are often deemed “postmodern” because they comment on storytelling and the creative process. In this one, the nameless narrator recounts the two productions of his horror movie, entitled Horror Movie. Tremblay divides the novel into three braided strands: “Then,” “Now,” and the screenplay from the unfinished movie. Each, in various ways, criticizes the Hollywood process, while praising the human relationships which make Hollywood possible.

In the “Then” section, set mostly in 1993, a trio of starry-eyed artistes dragoons our narrator into their movie, mostly because a key prop fits his face. The movie, featuring adults playing teenagers in the 1990s style, addresses adolescent themes that seem simultaneously dated and completely timeless. We know, because the narrator warns us, that this production ends in tragedy; we wait apprehensively to discover what happens.

The “Now” section, set in 2023, describes how our narrator collaborates with a major studio to remake the unfinished movie. The original director used the internet to create buzz around a movie nobody saw, and around the broken lives left in its wake, before she died—a death revealed only slowly. Our narrator gives the “E! True Hollywood Story” version, but only through his own, distinctly traumatized viewpoint.

Finally, the screenplay is… well… unfilmable. It represents the grandiosity that infected would-be auteurs after Kevin Smith and Richard Linklater made guerrilla filmmaking look easy. Three teenagers turn the pain and ennui of suburban life into tortures they inflict on their anonymous friend, naively yearning to live inside their favorite horror films. They want, but don’t want, to create a monster. The script includes long, discursive passages which voice the screenwriter’s private misery.

Paul Tremblay

The word “metafiction” has been tossed around so heedlessly in recent years that it’s become a parody of itself. We know the boilerplate: characters leaning on the fourth wall, aware they’re fictional constructs, commenting directly on the art-making process or the relationship with the audience. Sometimes it’s cute. But, like any movement that becomes sufficiently popular, it’s become cheapened by imitators who prefer the trappings to the substance.

Therefore, how readers receive Tremblay’s story will depend on what they bring into the experience. Fans of horror films from the peak slasher era will certainly recognize themselves. Tremblay’s characters, like their movie, and presumably like his intended audience, have been well-off and comfortable for long enough to become numb. His characters want to feel something, anything. However, their bid to create feeling only spreads misery around.

Our narrator has confabs with Hollywood producers, attends fan conventions, collaborates with the countless hard-working technicians who make movies possible—all without telling us his name. He’s a complete cypher, an anonymous everyman moving through life propelled by others’ demands. Even before the unfinished movie made him legendary, it’s clear the aspiring auteurs cast him because he was someone they could control, and he never completely escapes them.

The overall result is thus less frightening than knowing. Tremblay doesn’t just signpost the looming moments, the tropes where horror cinema spills into its characters’ lives; he actively warns us what’s coming. When something gory finally happens, we already know the broad strokes; we only await Tremblay describing the finer details. Reading, we feel anxiety and anticipation, but never really fear.

This isn’t entirely a criticism. Tremblay presents a literary novel about horror, without necessarily being horror. Throughout the novel, Tremblay brings out themes of characters desperate to feel something, anything. The feelings they achieve, however, prove transitory and meaningless. Even the denouement, where the narrator finally shows some gumption, ends with him admitting: I don’t know what happens next. You, the fans, need to tell me.

Perhaps this reflects Tremblay’s own disappointment. Hollywood optioned his 2018 novel The Cabin at the End of the World, handed it to M. Night Shyamalan, and it did okay without making any waves. One can read this novel as a statement of both disillusionment and powerlessness. It teems with knowing winks to Tremblay’s intended audience, who watch these movies regularly. And it reminds them that, sometimes, it’s okay to feel nothing.