Wednesday, February 28, 2024

Burnt Offerings in America, Part Two

This essay is a follow-up to Burnt Offerings in Modern America
Émile Durkheim

Unfortunately, when men (and it’s mostly men) like Thích Quảng Đức, Mohamed Bouazizi, or Aaron Bushnell offer themselves as burnt offerings, we don’t know where those offerings go. With burnt offerings of animal flesh in the Epic of Gilgamesh or the Hebrew Tanakh, offerings go directly to God or the gods, who take delight, pleasure, and nourishment from humans’ sacrifices. Nowadays, we lack such confidence.

Nearly all early civilizations practiced some form of blood sacrifice. Some were dramatic, like Abraham’s averted sacrifice of his son Isaac, or Agamemnon’s unaverted sacrifice of his daughter Iphigenia to launch the Trojan War. Others were merely grotesque, like the human sacrifice supposedly practiced among the Mexica (often misnamed the Aztecs), a narrative mainly remembered in lurid Spanish retellings. But early religions agreed: the gods require blood.

However, religions generally move away from blood sacrifices. They gradually replace spilled blood with the first fruits of the people’s harvest, or gold, or ultimately the cheerful work of devoted hearts. We might imagine, optimistically, that True Believers gradually realize their gods require human hands to perform divine missions. More realistically, they probably realize that propitiating sky spirits with gifts doesn’t do much by itself.

Émile Durkheim believed that pre-literate Earth Spirit religions started without gods. Early peoples, in Durkheim’s telling, sought the people’s well-being, and selected a totemic image, usually an animal, to represent the people’s collective spirit. Across succeeding generations, though, worshippers forgot the image’s original symbolic meaning. They took metaphorical stories literally, and started worshipping spirits which their priestly ancestors never intended anyone to factually believe.

Durkheim, and his rough contemporary Sigmund Freud, wrote extensively about what they termed “primitive” totemic religions in Africa and Australia. Unfortunately, they wrote without visiting those places. Both thinkers wrote mainly about their own places and times. Watching religion fade from French public life, Durkheim saw “Liberté, Égalité, Fraternité” and images of Marianne, the personification of France’s national spirit, march into the spaces God recently vacated.

No society, Durkheim believed, could survive long without having something it considers sacred. Societies create mythologies, whether of sky spirits or of national heroes like Robespierre and George Washington, to embody the nation’s spirit and bolster shared identity. Whether the object of worship is Jehovah or Paul Revere, what we worship isn’t really the figure who might have existed somewhere, once. It’s the moral principle that figure represents.

Aaron Bushnell

Which necessarily elicits the question: what principles do Americans consider sacred?

American patriots seek sacred principles in the Declaration of Independence or the Federalist Papers—while conveniently ignoring impolitic passages, like the “Merciless Indian Savages” clause. Without either a king or a state church, America has recourse only to Enlightenment philosophy and humanist precepts. Christian Nationalists might think America has a state church, but only in vague terms; pressed for details, they, like most Christians, fall quickly to infighting.

Americans demand that schoolchildren learn the mythology of Thanksgiving, and recite the Pledge of Allegiance daily. These myths and rituals serve social needs left vacant by religion’s retreat from public life. They give Americans a unifying narrative and shared identity, while we recite public moral statements in unison, exactly like the Apostles’ Creed. As in church, these secular values are vague, but they’re shared, which is what really matters.

Those American principles, however, have not withstood scrutiny. Tales of American atrocities, which trickled in slowly from the Philippine-American War and the Mexican Border War, accelerated in the Twentieth Century. War crimes in Vietnam or Operation Desert Storm hit the nightly news, and the hideous violence and mission drift of the Global War on Terror happened instantaneously online. Now America’s proxy wars in Ukraine and Gaza are streaming live.

When Aaron Bushnell immolated himself this weekend, he wore his military uniform, then live-streamed his suicide on Twitch. Therefore, he didn’t just destroy himself. American secular religion, embodied in his uniform, burned first. And he distributed the image to goggle-eyed Americans instantaneously, circumventing a commercial media apparatus that’s often seen its independence undermined by state intervention, especially during wartime. This wasn’t just a statement, it was a religious declaration.

Therefore, only one question remains: will True Believers accept this declaration? Bushnell’s suicide was only secondarily about his stated beliefs; like the Pledge of Allegiance or the Apostles’ Creed, his final manifesto was necessarily vague. Religion isn’t about information, it’s about the True Believers themselves, and it doesn’t intend to educate them, but to transform them. Are we, who take Bushnell’s principles seriously, willing to let ourselves be transformed?

Monday, February 26, 2024

Burnt Offerings in Modern America

Aaron Bushnell

Yesterday afternoon, a man identified as an active-duty Air Force intelligence analyst lit himself on fire outside the Israeli embassy in Washington. Firefighters smothered the flames and rushed the individual, tentatively identified as Aaron Bushnell, to a local hospital, but Bushnell died of his injuries. In his manifesto, Bushnell described Israel’s ongoing barrage of Gaza as a “genocide,” and his own military participation as “complicity.”

Two years ago today, I published an essay entitled “War Is Not the Answer, Except When It Is.” I compared Vladimir Putin’s invasion of Ukraine, a war that remains ongoing, with Operation Desert Storm, America’s first military intervention in Iraq. An American response in Ukraine sure looks justified, I wrote, but it looked justified in Iraq, too. We now know the justification for war in Iraq was falsified by PR professionals.

PR surrounding the current conflagrations in Ukraine, Gaza, and Yemen has been spotty. After initial international furor, the Ukraine war has retreated from headlines, except when Republicans withhold funding for military support. America’s decision to jump into Yemen attracted initial outrage, but failed to sustain public attention. Only the Gaza conflict remains a reliable headline-grabber, and not necessarily for the right reasons.

The Gaza death toll threatens to exceed 30,000 this week. As the Netanyahu government forbids Palestinians to leave Gaza, but continues strafing civilian neighborhoods, the conflict increasingly resembles the liquidation of the Warsaw ghetto. Yet English-speaking journalists find themselves shackled to a pro-Israeli narrative. Public-facing writers for MSNBC and the BBC have found themselves benched, their stories spiked, for criticizing Israel.

Aaron Bushnell’s self-immolation makes sense in historical context. From Vietnam to Tunisia, protestors have lit themselves on fire to force change, seize public awareness, and draw attention to widespread government corruption. Thích Quảng Đức’s suicide in Vietnam closely preceded the coup which overthrew President Diệm’s illegal regime. Mohamed Bouazizi helped kick-start the Arab Spring, leading to pro-democracy revolutions.

Mehdi Hasan

Yet one cannot help questioning whether either death did any good. American involvement in Vietnam dragged on another decade after Thích Quảng Đức’s death, while the Syrian civil war—which, like the Ukraine conflict, has lost Western front-page headlines—is currently well into its thirteenth year. If Aaron Bushnell’s death moves the needle for American public awareness, I applaud his sacrifice; history, though, gives me little reason for optimism.

Taken together, these facts force me to question who benefits from the current trajectory in American and world affairs. American silence on the Gaza atrocities has damaged the Biden Administration, but it hasn’t exactly won favor for opposition Republicans, who are aggressively pro-Netanyahu and pro-Putin. Networks losing their star journalists aren’t exactly seeing ratings boosts. Nobody but defense contractors profits from blood and destruction.

American presidents love overseas war. Because presidents also serve as commander-in-chief of the military, American military successes accrue to the President’s reputation, while defeats tarnish his name forever. Flag-waving, naming enemies, and ginning up nationalist slogans help unify American voters around the state, and the President as head of state. The opposition party knows this, certainly, and will withhold money to deny the other side a win.

Except that hasn’t happened this time. Unlike Operation Iraqi Freedom, which certain candidates famously voted for before they voted against, American commitments in Ukraine and Israel have not produced massive national unity. Nobody’s flying flags and chanting “United We Stand” while facing down dictatorial right-wing regimes in Moscow or Jerusalem. George W. Bush parlayed Iraq into a second term, but Joe Biden is currently watching his coalition shatter.

As with Lyndon Johnson before him, we’re watching the Biden Administration snatch defeat from the jaws of victory. A fairly popular president with a relatively successful economic agenda (more on that to come) managed to alienate his own backers by supporting an unpopular war in an anti-democratic state. Just as Johnson’s personal collapse ushered in the manifestly criminal Nixon, Biden is currently holding the door for Donald Trump.

It’s tempting to describe Aaron Bushnell’s suicide as a sacrifice. But we often forget that, in origin, the word “sacrifice” doesn’t mean to give something up; it means to make something holy. Just as many early civilizations relinquished burnt offerings to petty, tyrannical gods as bribes to protect the people, Bushnell’s death testifies to a cosmic order that doesn’t protect ordinary people from overwhelming whimsy on high.

For Bushnell’s death to actually sanctify America, we must start by asking ourselves: what in our country requires burnt offerings? What do we hold sacred, and why isn’t it helping?


Continued in Burnt Offerings in America, Part Two

Saturday, February 24, 2024

Twilight of the Global Bullies

In my childhood, I had a deeply conflicted relationship with bullying, as children do. Whenever confronted by bullies, the adults around me—parents, teachers, concerned outsiders—encouraged me to cultivate internal strength and resilience to remain unperturbed. But if my internal strength manifested as pushing back against bullies and asserting my own dominance, those same adults punished me. I was supposed to be strong, but not strong like that.

As an adult, I understand the difference. Child bullies appear strong in the moment, and children, lacking perspective, think the current moment will exist forever. Children haven’t seen swaggering, overstuffed bullies cross that invisible line and get smacked down. Adults realize bullying bluster always contains the seeds of its own destruction (though we frequently forget in contentious moments). Children only know that big Jimmy punched me and adults did nothing.

Children are remarkably alike in this respect, notwithstanding their unique and diverse personalities. They perceive reality as eternally present, assuming that past and future essentially resemble now, with different set dressing. Not until early adolescence do children develop the ability to perceive change in historical context, to understand that the domineering forces in their lives right now, including both adults and bullies, cannot possibly hold sway forever.

Such development isn’t inevitable, however. We all know adults who continue behaving like childhood bullies, and seemingly get rewarded for it. Workplace jerks whose infantile bluster ensures nobody likes them, but who get promoted anyway, because management knows who they are. Financiers who gambled with the stored value of customers’ homes, and imploded the economy in 2008. The IDF, currently bombing hospitals and neighborhoods in Gaza.

We now know, as children cannot possibly know, that empathy for other people’s suffering has a neural basis. As a bullied kid, I thought some people just learned empathy later in life, but no: empathy is a stage of brain development. People who see others’ emotions, good or bad, and remain unmoved, aren’t just unskilled or unlearned; they’re suffering a form of brain damage in their mirror neuron system.

Perhaps we see this most evidently in wealthy people. Readers of a certain age will recall the stories surrounding the Enron collapse, when we discovered that corporate executives literally celebrated their customers’ suffering. More recently, Elon Musk has aggressively acquired corporations, then demolished them, to settle personal grudges. Then there’s the watchword of modern far-right politics: “the cruelty is the point.”

These people, either wealthy themselves or desperate to ally themselves with wealthy idols, demonstrate an incapacity to feel others’ pain. Like schoolyard bullies, they take pleasure in seeing poor people or smaller kids crying. This forces a necessary question: did they never learn to see other people, and their feelings, as equally real to their own? Or did they maim and scar their own brains to make such knowledge go away?

I’m guessing a little of both.

Few people achieve positions of power without some demonstrated will to ignore others’ feelings. No matter which party holds the White House, Number Ten, or other halls of power, the winners probably stepped on others’ necks to get there. George Dubya’s Global War on Terror, or Barack Obama’s targeted drone killing campaigns, are only the most globally visible manifestations. Winning power necessarily entails a lack of empathy.

However, the present offers a rare opportunity to change this dynamic. Only the most ridiculous political sophists can deny that the Netanyahu government’s campaign of terror in Gaza, or Vladimir Putin’s interminable war in Ukraine, demonstrate a failure of baseline empathy for others’ suffering. But the ripple effects of both conflicts have demonstrated the weakness of countervailing forces, like NATO, which pick and choose whom to defend from atrocities.

As governments immolate, as police forces prove themselves deaf to justice, and as capitalism flips like a pancake, I believe we’re witnessing an important moment. Not the collapse of the economy or the social structure, but the collapse of the man-children who have profited from the structure’s weaknesses. Centuries of domination by people demonstrating what we now know is brain damage may end within our own lifetimes.

What if, rather than choosing our leaders by their ability to dominate debates, we chose them by their demonstrated ability to care? Seems far-fetched, admittedly, in a society that favors glib charisma and photogenic glamor. Yet if we organize ourselves, if we take time to determine what standards of empathy and accomplishment we consider worthy of reward, then why not? Our current self-seeking leaders have international egg on their faces.

Wednesday, February 14, 2024

In Dispraise of “Originality”

Jimmy Page (with guitar) and Robert Plant of Led Zeppelin

“You mean other people are allowed to use and repurpose music that already exists?” Sarah exclaimed, eyes wide and jaw dropped. “When I took the composition class in college, they insisted I had to invent my music out of whole cloth!”

I’ve forgotten how we reached the topic—casual conversation is frequently winding and byzantine. But I’d mentioned the multiple lawsuits surrounding Led Zeppelin, who have costly judgments against them for appropriating works by Black American songwriters, making fiddling changes, and slapping their own bylines on them. I’d offered the likely explanation: there’s a long blues tradition of songwriting by jamming around existing songs until a new song emerges.

This left Sarah flabbergasted. “My professor so thoroughly insisted on complete originality that she demanded we start composing with random notes, and building the piece around that.” I could completely believe that, too. Having attended our college’s new music showcase a few times, I remembered the preponderance of discordant, atonal music. I thought every undergraduate considered themselves another John Cage. Turns out the professor liked that effect.

Sarah felt faux outrage at the injustice of having been told that every composition had to be original. (Okay, “outrage” is overselling it. But definitely astonishment.) Yet classical and orchestral composers, trained in frequently grueling college and conservatory programs, have an ethos of complete originality drilled into them persistently, while working songwriters, crafting genres people actually pay money to hear, pinch and repurpose existing themes regularly.

Bob Dylan described his early songwriting style as “love and theft.” His earliest recordings show how frequently he lightly reworked existing Woody Guthrie or Dave Van Ronk tunes. Only with years of experience did he develop his own songwriting voice. Lennon and McCartney are among the bestselling songwriters ever, yet three of the Beatles’ first four albums are half cover songs, because the Beatles hadn’t found their voices yet.

I have no songwriting experience; I’m about as musical as a steel anchor. Yet in college creative writing and playwriting classes, my textbooks espoused a similar ethos of complete originality. I remember one textbook pooh-poohing genre fiction as a “guided tour” of existing repurposed themes, while “literary” fiction always strives to be completely original. Don’t be like those popular paperback writers, the textbook urged; always create something new.

Our professor smiled ruefully and reminded us that textbook authors have their blind spots, too.

The Beatles, photographed at the peak of their star power

Literary authors and playwrights mimic one another relentlessly, and their genres are intensely fad-driven. As a playwright, it took me years to shed David Mamet’s influence, just as songwriters struggle to differentiate themselves from Dylan. Most college-educated American writers pass through their John Steinbeck, Elmore Leonard, and Toni Morrison phases before achieving distinct voices. The lucky few see those exercises published.

Originality emerges in art, where it does, only gradually. Both Salvador Dalí and Pablo Picasso, famous for nonconforming paintings, began their careers with Renaissance-style portraits and church scenes. Jackson Pollock tried several techniques before uncovering his drip-based, wholly non-objective Abstract Expressionist style. Importantly, all these artists were disparaged when their approaches first appeared; they achieved acclaim only later, sometimes posthumously.

Yet even incidental mimicry draws ire. Returning to music: former Beatle George Harrison’s signature hit, “My Sweet Lord,” made his solo career. Yet within months of release, lawyers fired off a lawsuit because it resembled the Chiffons’ forgettable 1963 hit “He’s So Fine.” That lawsuit commenced in 1971, and wasn’t wholly resolved until 1998, dominating his solo career, and rendering him timid as a songwriter forever after.

This trend achieved its culmination with the “Blurred Lines” lawsuit. Heirs of Marvin Gaye claimed the songwriters behind Robin Thicke’s icky 2013 hit stole Gaye’s “groove.” That is, they claimed the song resembled, not something Marvin Gaye wrote, but something Marvin Gaye could have written, and therefore was plagiarized. And they won. That set a courtroom precedent that simply imitating venerable artists, even while creating wholly new art, is plagiarism.

No wonder Sarah’s college composition professor (now retired) favored originality over tonality. Instructors, textbook authors, and now courts demand that artists constantly reinvent the wheel. Blues icons jamming in some underlit cellar are now plagiarists, not artists. Don’t build your next track around a riff from “Crossroads” or “John the Revelator,” boys; the boundaries of ownership are set!

Artists aren’t isolated individuals; they’re a community of give-and-take, constantly improving one another’s raw material. Yet the ownership ethos demands nobody pinch from anybody, even incidentally. The mere fact that working artists have never actually worked that way doesn’t change the story.

Wednesday, February 7, 2024

Some Parting Thoughts on Toby Keith

Toby Keith as he appeared at his debut,
with that signature icky 1990s mullet

When Toby Keith’s debut single, “Should’ve Been a Cowboy,” raced to #1 on the Billboard country charts in 1993, I still listened to country radio. I hadn’t yet grown jaded by the peppy country-pop hybrid that would overtake mainstream country music in the 1990s, a takeover that Keith helped facilitate. Therefore, I heard it go into regular rotation, as country disc jockeys praised Keith for tapping into the country music zeitgeist.

“Should’ve Been a Cowboy” dropped when Keith was 31 years old. That’s older than most aspiring musicians get before they quit, disgusted with dead-end opportunities and industry gatekeepers. It’s also remarkably old to debut in country music. Despite its middle-aged conservatism, country music since the 1990s has notably disdained artists past forty. America teems with musicians every bit as competent and inventive as Johnny Cash who quit because they needed groceries.

Despite being only nineteen myself, I recognized the sentiment dominating “Should’ve Been a Cowboy.” Keith’s surface-level themes aren’t exactly concealed: his life lacks the spark it would’ve had if he had lived in another time and place. I appreciated the sentiment because, in 1993, I struggled with meaningless jobs while living in a Western Nebraska town that celebrated its cattle drive-era heyday. It’s impossible to ignore the past’s allure.

However, I also recognized something below Keith’s surface-level themes. Despite longing to be a “cowboy,” his song never mentions the workaday tedium of cowpunching. Instead, he cites Gunsmoke, The Lone Ranger, and the classic “Singing Cowboys,” Gene Autry and Roy Rogers. He wants to romance women, chase outlaws, and sing around the campfire. He presents a cowboy mythology completely devoid of actual cattle work.

In 1993, I lacked the vocabulary to explain something that I innately understood, but would only verbalize years later: the legendary Wild West didn’t exist. American culture celebrated cowboys only after they were dead, inventing fine-sounding fables about heroism, hard work, and gun-barrel justice. Owen Wister’s The Virginian, the novel which defined the Western genre, begins with a prelude lamenting that cowboys, like chivalric knights, now remain only in memory.

So in extolling cowboy goodness, Toby Keith yearned to travel back to a time that existed only in paperback novels and Hollywood fantasies. He wanted a life with the dreary bits removed, with moral ambiguity excised, with heroes and villains clearly demarcated by the color of their Stetson hats. I don’t say this unsympathetically: Keith, a former oil derrick worker, understood intimately how modern labor strips life of meaning.

Toby Keith as he appeared in 2023, thinned
by the cancer that eventually killed him

Yet the yearning for moral clarity and escapism in “Should’ve Been a Cowboy” eventually overtook Keith’s work. Through the 1990s, Keith released middle-of-the-road Nashville fare like “How Do You Like Me Now?” and “I Wanna Talk About Me,” serviceable songs that charted well. But he struggled to find his unique voice. This matters especially since, unlike other controversial Nashville artists, Keith wrote his own material.

He finally found his métier after September 11, 2001. That’s when he released the songs likely to define his legacy: “Courtesy of the Red, White, and Blue (The Angry American)” and “Beer For My Horses.” The former, known colloquially as “the Boot in the Ass song,” became an anthem of pro-war Americans during Bush’s War on Terror. The latter is a cartoonish pro-lynching song extolling cowboy myths of vigilante justice.

That’s when the cosplay cowboy ethos behind Keith’s breakout single finally consumed him. No longer satisfied yearning for a past that never existed, Keith dropped himself narratively into America’s moral conflicts. Backed by Nashville’s multi-million-dollar publicity machine, he pretended to be a sandbox soldier and vigilante justice-bringer. He traveled the lucrative arena circuit whipping audiences into thinking likewise.

Keith’s musical persona embraced absolutes. He favored “this country that I love” and inveighed against “evil forces.” He never explained exactly how he identified evil forces, except that they didn’t love America like him. In Keith’s world, apparently, evil is as evil does. Morality equated to conformity, pro-Americanism, and buying into the official state narrative. His cosplay righteousness no longer lived in the past; it was merely being silenced in the present.

But the shine eventually wore off the War on Terror. As America abandoned absolutism and relearned that difficult situations deserved more nuanced treatment, Keith stopped making hits. His songs remain popular with the flag-waving crowd, but he last cracked country’s Billboard top twenty in 2012. His final years were characterized by silliness like “Red Solo Cup.” Country music moved on, leaving his absolutism behind. So, I hope, did the country.

Saturday, February 3, 2024

The Devil Walks on Rural Roads

Ania Ahlborn, The Devil Crept In: A Novel

Young Stevie Clark has watched enough cop dramas to know a lackluster investigation when he sees one. Police in Deer Valley, Oregon, aren’t taking Jude Brighton’s disappearance seriously. As Jude’s cousin and best friend, Stevie decides to pursue the case himself. Retracing Jude’s steps, he finds a monster lurking on Deer Valley’s periphery. But poor, tongue-tied, neurodivergent Stevie can’t make anyone take his warnings seriously.

This is my second Ania Ahlborn novel, and I wonder whether it’s too early to identify a pattern. Ahlborn takes familiar horror boilerplate and revisits it from another angle. This time, Ahlborn spotlights a joyless small town, a dysfunctional family, and a community that doesn’t need to bury its secrets, because it hasn’t accepted that it even has any. If this sounds familiar, these are the Lego blocks Stephen King regularly builds with.

Initially, Stevie’s investigation more resembles mystery than horror. Disgusted with Deer Valley’s shrugging, nonchalant investigation, Stevie seeks answers himself. Notwithstanding his sighting of a monster, Stevie’s story has more personal drama than out-and-out terror. As we follow Stevie’s parallel investigation, though, we discover facts about his relationship with Jude. The two pre-adolescents seem less like friends than like trauma-bonded survivors.

Stevie resembles King’s frequent child protagonists: preternaturally bright, but surrounded by authority figures too entrenched to heed his warnings. He’s also, despite his intelligence, an unreliable narrator, plagued with echolalia and visual delusions. That enables Stevie’s abusive stepfather and willfully blinkered mother (two other King standards) to discount Stevie’s warnings, even when the mystery starts penetrating their house and family.

Behind this front-story, another narrative unfolds. Rosie Alexander believes herself unloved and unlovable, especially when she miscarries immediately before her husband’s fatal accident. She retreats inside her rural cottage, just her and the secret she cannot let anyone else discover. Rosie shares her house with a slavering creature of appetite, a carnivorous hungry ghost she cannot kill, because in her twisted way, she loves it.

Deer Valley binds Stevie’s and Rosie’s stories together (though they unfold asynchronously). It’s a melancholy community, a graveyard of hope where nothing happens and everyone is doomed to disappointment, despite apparently being fairly populous and having a picturesque downtown full of people. Nobody in Deer Valley keeps pets, despite the numerous feral cats. Nobody talks about the future, because apparently there isn’t one.

Ania Ahlborn

Years earlier, something happened to Max Larsen, a Deer Valley child who didn’t heed his parents. Max’s story is every parent’s nightmare, but it’s also the phantom adults use to scare children into compliance. Stevie has heard Max’s story, and considering that his mind frequently manifests his fears, he walks a tightrope between giving in to Deer Valley’s mindless blandness, and pursuing the truth he knows exists, out there, somewhere.

Ahlborn’s two protagonists, Stevie and Rosie, who never meet, and the sepia-toned community they share, seem almost comforting in their bleakness. Their story seems remarkably familiar. That’s because Ahlborn pinches them wholesale from King’s Castle Rock novels, in theme if not actual words. Both communities teem with people merely going through the motions because they’ve forgotten that anything else exists.

In comparing Ahlborn to King, I don’t mean this disparagingly. Ahlborn writes an homage to King’s style and themes, but places her spin on them. King’s child protagonists, like Danny Torrance or the Losers’ Club, regularly struggle with surrounding adults, yet we know, with the clear-eyed conviction of youth, that they’re telling the truth. With Stevie, whose senses regularly deceive him, we have no such assurance.

Stephen King is, essentially, the Beatles of mass-market horror: a commercial force so monolithic, other artists only achieve success by passing through him. But he’s also an industrial product. As I’ve written recently, King’s oeuvre is consistent enough that other writers can effectively create new Stephen King material without his actual involvement. As with the Beatles, other artists can mimic King and collect the paychecks themselves.

By contrast, Ahlborn both embraces and resists this propensity. By writing her own Stephen King novel, revisited from her unique viewpoint, Ahlborn demonstrates that King’s technique is still artistry, provided authors have their own voice. There’s nothing wrong with giving audiences what they want; bakers make the bread people will eat, not what expresses their inner turmoil. But every baker perfects a technique totally their own.

Ultimately, Ahlborn squeezes terror from a subject readers will find excruciatingly familiar: a large soul in a small, constricting world. Stevie can’t accept Deer Valley’s pervasive lies, so Deer Valley must make him fit. We only wonder what tortures Deer Valley will inflict to preserve its myths of comforting blandness.