Friday, September 30, 2022

My Vacation is Trying to Kill Me! (Part 2)

This essay is a follow-up to Star Trek: My Vacation is Trying to Kill Me!
Data and Worf in the Star Trek episode “A Fistful of Datas”

The Star Trek holodeck episode “A Fistful of Datas” has major overlapping themes with Michael Crichton’s 1973 schlock adventure flick Westworld. Both feature high-tech modernists wandering into a simulation of the American West, expecting to swagger around in buckskins while having gunfights. But in both stories, the simulation’s safeties fail, leaving protagonists at the mercy of Colt-wielding synthetic outlaws who can’t be killed.

Smarter critics than me have observed the recurrent theme of killer entertainment in Michael Crichton’s bibliography. Both Westworld and Jurassic Park feature amusement parks where people go to gawk at merciless killers, only to find themselves on the receiving end of that violence. But Crichton also addressed this theme less overtly: White explorers in “darkest Africa” in his novel Congo, for instance, or thrill-seeking stormchasers in his screenplay Twister.

Putting Westworld and “A Fistful of Datas” together, considering what I wrote previously about the moralistic tone implicit in holodeck episodes, I’m struck by the judgment inherent in both. Each story shares the image of high-tech moderns wanting to participate in what they consider a more savage time. Both present the “American West” in overlit desert colors, contrasted with the adventurers’ dusty earth-toned clothes.

Equally important, both stories feature moderns expecting to witness savagery, but not get hurt. Both stories assume the presence of safeties ensuring that living humans won’t get killed, or even hurt that badly. But then the safeties fail, because of course they do. There’s no story if our protagonists can’t get hurt. Our modern heroes must fight for their lives against undying synthetic humans which they, or their technological overlords, created.

For Westworld to make any sense, we must accept two assumptions that don’t withstand much scrutiny. First, we must accept the premise, common in Marxism and other modern philosophies, that humans are naturally violent and destructive, and that social evolution has been a constant movement away from that violence. Current anthropology doesn’t believe this, of course; evidence suggests we’re getting more violent, not less. But the story remains persistent.

Second, we must believe the Freudian supposition of thanatos, the innate psychological drive toward death. We must believe that youth seek adrenaline-producing behaviors, like driving recklessly and picking fights, while the elderly embrace religion and other transcendental ideas, because deep down, we secretly want to die. This premise doesn’t withstand scrutiny either: we need only look around to realize these behaviors are neither universal nor inexplicable.

Teddy (James Marsden) and Dolores (Evan Rachel Wood) in Westworld

Westworld, Jurassic Park, and the more action-oriented holodeck episodes share a theme: tourists go to watch others die. Consider the innate frisson of watching that goat lowered into the T-Rex enclosure. Whether we accept primal human savagery or “nature red in tooth and claw,” we want someone, or something, to die for our amusement. Things become less amusing when death redirects its attention onto us.

Crichton’s body of work broadly distrusts science, technology, and modernity overall. He repeatedly assumes scientists will create (or, in stories like The Andromeda Strain, invite) catastrophes that threaten human survival. But Crichton, like Star Trek’s Gene Roddenberry, ultimately shares the humanist principle that humanity deserves to survive. Whether faced by Yul Brynner’s Man in Black or a rampaging velociraptor, humans are finally the good guys.

The 2016 television remake of Westworld doesn’t share that presumption. The first-season arc, which (Spoilers) reveals that the evil, marauding Man in Black is actually gentle, lovestruck William, depicts what happens when the safeties remain active too long. Given godlike power over synthetic humans, William becomes cruel. He indulges the violence of the setting until it permeates him, confident in his faith that it can’t hurt him.

Until, of course, it does.

Meanwhile, like Star Trek’s Moriarty, the synthetic humans become aware of their artificial environment and the digitally replicated nature of their experience. They realize the ways they’ve been abused, raped, murdered for decades. Star Trek, with its humanist presumptions, handled this by appeasing Moriarty with a second-level simulation. Westworld, stripped of such presumptions, decides that somebody has to pay.

Viewed another way, Crichton and Roddenberry both assumed that humans are innately good, and getting better. But both demonstrated something very different: that tweaking the rules slightly reveals humanity’s poorly-buried violent streak. Worf and William enter their respective simulations believing death is aberrant and truth will win if spoken confidently. But when the safeties fail, and death becomes normative, it’s a very different game.

And in some games (the stories imply), losers deserve to be punished. Good people don’t play games; they remain in the real world.

Thursday, September 29, 2022

Star Trek: My Vacation is Trying to Kill Me!

Lieutenant Barclay departs the holodeck in the TNG episode “Hollow Pursuits”

I’ve long forgotten who first pointed out that much 1990s science fiction begins with the premise that we can’t tell what’s real. Movies like Alex Proyas’ Dark City or the Wachowskis’ The Matrix address different aspects of what happens when a gormless Everyman (who is, naturally, White and male) discovers that his entire life is a simulation. The original list included the ubiquitous holodeck episodes in Star Trek: TNG.

Having ruminated upon that thought for a while, I was recently struck when a friend described watching a TNG episode and noting that the holodeck wanted to kill the crew, again. I realized that the holodeck isn’t really the same as those all-encompassing simulation narratives. First, the characters usually know they’re in simulations. Second, the characters’ own malevolent pleasures want to kill them.

Unlike those other movies, most holodeck episodes begin with the protagonists generating their own simulation. The holodeck’s purpose, established in the TNG pilot “Encounter at Farpoint,” is to offer the crew the deep-space equivalent of shore leave, the opportunity to put aside their Starfleet identities and be someone else, somewhere else, for a few hours. Options, the show demonstrates, are endless; they’re circumscribed only by characters’ imaginations.

That imagination, however, repeatedly proves the characters’ undoing. The most common outcomes are a mechanical failure trapping characters inside their own simulation, a recurrent theme in Picard’s Dixon Hill episodes; characters getting so entranced by their fantasy that they lose track of reality, as in “11001001”; or the simulation seizing control of the ship, as in “Elementary, Dear Data.” Characters never stop being aware that the simulation is a simulation.

Therefore, rather than repeat the 1990s theme that we’re trapped in a program we can’t even see—which, in practice, starts looking like a reheated critique of capitalism—the holodeck takes on a moralistic tone. The characters are punished for indulging their imaginations too often. Holodeck episodes don’t encourage audiences to resist the unseen program dictating their lives. Rather, they scold them to police their own imaginations more closely.

In movies like Dark City and The Matrix, the simulation generally doesn’t want to kill humanity. The Strangers or the Agents only actively oppress those who dare stray from the official narrative; those who comply get ignored, or occasionally even rewarded. Sure, the characters have no control over the simulation they’re trapped in, which the movies depict as oppressive. But it’s easy to avoid punishment; just go with the flow.

LaForge, Pulaski, and Data in the TNG episode “Elementary, Dear Data.”

By contrast, the holodeck punishes those whose imaginations are too fruitful. When characters believe in their dreams, those dreams become malevolent. Consider the recurrent Moriarty character, depicted as the only character ever to escape the holodeck. He actively punishes his creators for giving him an identity, because they were busy dreaming when they should’ve been working. Agent Smith wants Neo to remain in the dream; Moriarty punishes anyone for straying from reality.

This theme becomes most pointed with the character of Lieutenant Barclay, whose holodeck fantasies are presented as inappropriate, but essentially harmless. (More on that elsewhere.) Barclay’s retreat into fantasy is, on some level, a scolding of fanfic writers and cosplayers who rewrite the canonical characters to suit their own sexual or power fantasies. The show basically reprimands fans for getting too deeply immersed in their favorite franchise.

As the holodeck premise becomes more familiar, writers do expand their vision. The Voyager episode “Bride of Chaotica!”, for instance, depicts a photon-based lifeform that thinks the holodeck is real, and the organic crewmembers are invaders. But it still fundamentally scolds Paris and Kim for creating a fantasy so realistic that it threatens to overtake the ship. As always, the sin is dreaming too big.

Star Trek isn’t opposed to play. Episodes show characters using the holodeck as dance studios, dojos, and gymnasia. Some episodes even show characters using the holodeck to practice religious devotion, a rarity in Star Trek. The show only punishes them for caring too deeply about their fantasies. When the holodeck becomes more important than “reality,” the holodeck chastises them and returns them to the real world, where they belong.

The Matrix depicts humanity fleeing the illusion to rediscover reality. The holodeck, by contrast, scolds characters for escaping reality and preferring their illusions. It moralistically demands that characters ground their lives in “reality,” which is defined by uniforms, rank, and work. The holodeck is basically a starched-collared schoolmaster, rapping a ruler on a student’s desk and saying, “Get your head out of the clouds.” Which is, of course, ironic.

You can find this analysis continued in My Vacation is Trying to Kill Me! (Part 2)

Monday, September 26, 2022

The Future is Dead, Long Live the Past!

What does a naïve fascination with Laura Ingalls Wilder have in common with this week’s projected election of a literal Fascist government in Italy? I hear the knives coming out among my few regular readers even as I write this. Yet I contend that America’s attachment to a beatified past reflects the same sentiments that have installed Giorgia Meloni as Italy’s next presumptive Prime Minister.

Smarter critics than me have written about cottagecore, a design aesthetic based on a supposedly better agrarian past. From the architecture and artwork to the now-notorious “prairie dresses” briefly sold at Target, the cottagecore ethos lionizes a time in America’s past when people lived simply, ate food from their own gardens, and didn’t busy themselves with sleek design. Many cottagecore enthusiasts have attested their loyalty to Wilder’s Little House books.

Laura Ingalls Wilder didn’t write her simple novels merely to entertain children, though. She wrote at the urging of her daughter, Rose Wilder Lane, who probably served as unbilled co-author. Lane was also one of the founders of Libertarianism, the political philosophy holding that everything would be better if rules and regulations were rescinded wholesale, and everyone were free to follow their internal moral compass.

Wilder’s Little House books are replete with messages of self-reliance and autonomy. Repeatedly throughout the novels, the characters learn the importance of swallowing their complaints, working hard, and not asking anybody for help. The characters are resolutely unmoored from community, and from the mutual assistance which community entails. One novel opens with the family moving west because, they claim, too many people were around.

Contrast that with Peter Weir’s 1985 movie Witness. Forced from a Back-East city to harbor among the Amish, detective John Book (Harrison Ford) learns the exact opposite of the lessons Wilder taught: restraint, sharing, community. Important moments happen when he learns, for instance, to sip his lemonade, not guzzle it. I can’t be the only American who considered decamping to Amish Country after watching Book participate in an Amish barn raising.

Cottagecore espouses rural simplicity, but not communitarianism. It’s all Laura Ingalls Wilder, no John Book. Adherents believe, to a greater or lesser degree, the popular White American myth of autonomy and rural solitude. But real early rural life was deeply communitarian, because it needed to be. Besides, as historian Nancy Isenberg writes, the Ingalls family probably moved west, not to avoid crowds, but because richer Whites chased them off the land.

Meanwhile, as Americans of every political hue romanticize Oregon Trail pastoralism, Italians have elected a presumptive PM whose own party calls itself the “heirs of Il Duce.” Giorgia Meloni promises to reinstate nearly the entire Mussolini policy agenda, because that ended so well last time. Mussolini literally gave global nationalism the shorthand name of Fascism, and provided the political blueprint that Hitler duplicated and expanded upon.

Meloni cites the endless panoply of evil-bringers which American and British audiences will recognize: immigrants, “globalists,” homosexuals. She promises to restore lost national greatness that definitely existed in the vaguely defined past: make Italy great again, if you please. Fundamental to small-f fascism is a belief that things used to be good, but now they’re not—though that goodness, and that lost time, are always fuzzy and intangible.

Italy isn’t alone in yearning for a more authoritarian past. Vladimir Putin has lamented the Soviet collapse, and attempted to regain Russian greatness by repeating Stalin’s greatest sin, oppressing Ukrainians. Britain’s new PM, Liz Truss, announced her own good-old-days policy by slating Thatcher-level tax cuts, and global currency markets responded with scorn. Because if there’s anything Europe needs, it’s 1980s-level unemployment and labor unrest.

As I’ve recently written, popular culture keeps looking backward because, I suspect, it’s leery about whether we have a future. But pull focus outward to encompass the larger social terrain, and that attitude seems widespread outside Hollywood, too. From gingham dresses to brownshirts, our discourse is dominated by nostalgic longing for a beatified past that never quite really existed. Just ask anybody who survived Jim Crow.

I understand this fear of having no future. Look outside your window: it’s blazing hot and getting hotter, and we’re running out of clean water. Capitalism has created intense poverty, but rosy-eyed alternatives don’t fix injustice; they move it around. A small handful of centibillionaires are so wealthy they’d rather flee Earth than fix its problems. It’s easy to feel like there’s no future. But if we don’t find our future soon, the future will surely find us.

Saturday, September 24, 2022

Quantum Leap: Time Travel Fiction in a World Without a Future

Caitlin Bassett and Raymond Lee in the Quantum Leap soft reboot

I’ve heard good things about the newly-launched soft reboot of the 1980s TV series Quantum Leap. Though only the first episode has streamed at this writing, friends and reviewers who’ve watched it describe it as a worthy relaunch of the story concept, without trying to retell the exact story. Yet I can’t bring myself to watch it. The very thought of yet another soft reboot from the Hollywood dream factories just makes my skin crawl.

Fans use the term “soft reboot” to signify a story that’s technically an extension of the original, but which attempts to shoehorn new characters into the protagonists’ role. The way, for instance, Disney gradually wrote the original trinity of characters out of Star Wars in favor of younger, handsome successors. Or the way Terminator: Dark Fate slowly squeezed out Sarah Connor and the T-101 in favor of a new resistance and a new, looming future terror.

According to reviews, this Quantum Leap exists in parallel to the original. It’s unclear whether any actors from the original series will appear, though Scott Bakula has given interviews from the sidelines. (The only other actor to appear throughout the original series, Dean Stockwell, passed away in 2021.) The new series features entirely new characters, and apparently mentions the original characters, so it’s possible that original protagonist Sam Beckett will reappear at some unspecified future point.

Thus Quantum Leap joins the ranks of revived shows like Roseanne, iCarly, and The X-Files. Those series had clearly delineated final episodes, and came to definitive conclusions, sometimes under poor circumstances. Sure, Quantum Leap was remembered more fondly in its absence than when it actually aired. But Roseanne languished for literally decades, its final season a subject of mockery and scorn, before somebody decided to try again.

But somehow, nostalgia won out over derision.

It’s easy to pooh-pooh these studio reboots. In an era when instant digital interconnectedness puts thousands of writers within easy reach, the studios aren’t seeking new ideas or new thinkers; they’re ransacking the vaults to retool old ideas with minimum effort. But this isn’t a new complaint. Cultural snobs have disparaged the love of remakes and sequels since at least the 1980s, when I began paying attention. This is something worse.

I don’t think the studios believe there’s a future anymore.

Scott Bakula and Dean Stockwell in the original Quantum Leap

This is perhaps remarkable to say when so many reboots, hard and soft, are in the science fiction realm. We currently have four Star Trek series in production, the largest simultaneous output the franchise has ever seen. But the future depicted in Star Trek is the future Gene Roddenberry envisioned sixty years ago. Resurrecting characters like Captain Pike and Mister Spock isn’t visionary, it’s downright cowardly and retrogressive.

Quantum Leap takes this to new heights. Though it’s an essentially optimistic show, grounded in the idea that we can be better people if we’re given the opportunity to make better choices, it’s also necessarily based on regret. Series protagonist Dr. Ben Song wants to rewrite the past and make historic mistakes just disappear—a feeling familiar to anybody who survived middle school. On some level, he just wants to make the past go away.

In fairness, this approach possibly could help audiences deal with today’s issues. Faced with a present where old issues—racism, war, religious nationalism, and environmental devastation—are happening all over again, maybe it helps to revisit past mistakes and consider what choices we could’ve made, but didn’t. It depends on how showrunners handle the developing story. We’ll need to see whether they show imagination and courage, or retreat into low-friction storytelling and cheap fan service.

I’m not entirely averse to reboots and relaunches. Both Doctor Who and Battlestar Galactica came back from the Hollywood graveyard with new approaches to addressing modern issues. When America felt adrift post-9/11, with little unifying us except a poorly defined enemy, Galactica made that mission drift appear literal onscreen. It gave Hollywood appropriate tools to dissect America’s political malaise, and our need to find an identity if we wanted to handle our modern problems competently.

But these are outliers. Most reboots, hard or soft, aren’t addressing today’s problems with today’s tools. The recent Twin Peaks third season or today’s orgiastic Star Wars smorgasbord are attempts to snuggle up in past media products like a security blanket, and avoid addressing today’s issues. Hollywood studios are essentially throwing their hands in the air and refusing to even look at the world around them. And it shows in the reheated product they’re producing.



See also: Quantum Leap, the Enlightenment, and “Laplace’s Demon”

Thursday, September 22, 2022

Creedence Clearwater: Burnin’ Hard and Burnin’ Out

Bob Smeaton (director), Travelin’ Band: Creedence Clearwater Revival Live at the Royal Albert Hall

Promo photo taken during Creedence Clearwater’s only European tour

Fans of the rock band Creedence Clearwater Revival often discuss them in awe-filled, nearly reverential terms. I should know, I’m one of them. They burned fast and hard, producing five albums in 1969 and 1970, working under singer-songwriter John Fogerty’s iron-fisted rule. But they burned out equally quickly: by early 1971, John’s brother Tom Fogerty quit the band, and they barely limped across the finish line a year later.

This documentary depicts CCR’s April 1970 performance at London’s Royal Albert Hall. That concert, though, runs barely 45 minutes, too short to constitute a feature-length film. So to flesh out the runtime, the concert is bookended with a documentary, more hagiography than history, narrated by Jeff Bridges. This recounting presents CCR as “the biggest band in the world,” literal heirs to the Beatles, who disbanded weeks before this concert.

Reviewing something like this, we’re split between two forces. The concert itself is solid, fast-moving, and entertaining, a breathless display of CCR’s aggressively American musical ethos in an iconic British venue. The documentary is… something else. It was clearly written in an attempt to recapture the experience of being a CCR fan in Spring of 1970, and blithely ignores much of what we now know was happening behind the scenes.

Bridges’ linking narration, written by John Harris, starts with the band’s origins in El Cerrito, California. Anchored by kids who’d known each other since middle school, the band, initially known as the Blue Velvets, consciously rejected British Invasion influences and played blues-rock based on Buddy Guy and Leadbelly. Snippets of seldom-heard 45s sound anachronistic for the middle 1960s, granting insight into John Fogerty’s early anti-rock influences.

As engaging as this chronicle is, though, fans can’t ignore that by 1970, fault lines were already developing. We know this, but the documentary apparently doesn’t. Tom Fogerty in particular appears disconnected from the band, stone-faced and out of sync even while singing classics like “Bad Moon Rising” and “Proud Mary.” Worse, the narration is uncritical of Fantasy Records, with whom CCR notoriously had one of rock’s most lopsided contracts.

Therefore we know, but the documentary apparently doesn’t, that the seeds of breakup already existed. Like the Beatles’ Get Back, which arrived decades after the band dissolved yet shows them playing happily on a London rooftop, the documentary depicts the illusions of the moment, not the historical scope. CCR appears only in archive footage, film shot and curated in 1970 by Fantasy Records’ PR department to sell albums, not depict reality.

A still from the performance video

After the throat-clearing, the film transitions to the concert itself. Here’s the part we actually wanted. CCR plays with a musical alacrity most acts only capture briefly: old enough to know their instruments and play with passion bordering on violence, but young enough to withstand the hot lights and screaming crowds. They play (most of) their classics for an audience to whom this music is still new and dangerous.

The 52-year-old footage shows signs of the times. Camera operators keep trying to capture band members, especially John Fogerty, in tight close-ups, a common maneuver in 1970, substantially undercut by Fogerty’s refusal to stand still. John’s brother Tom plays rhythm on a big white arch-top guitar, but mostly stands still, frequently overshadowed by the amp stacks. Camera operators mostly ignore him, which is unfortunate.

I saw John Fogerty perform live in 1997, by which time “John Fogerty” had become a stage character as much as an individual. Like Mark Twain, Fogerty wore his persona consciously, with his affected semi-New Orleans accent and nostalgic ramblings. That Fogerty isn’t on display here. This Fogerty is soft-spoken but hard-rocking, aggressively belting hits with almost no stage banter. He appreciates the audience without courting them.

This is a period piece. The footage, though digitally restored, is aged, and the sound reflects 1970 amplifier technology. But I appreciate one aspect: unlike The Band’s The Last Waltz or the Rolling Stones’ Gimme Shelter, this film doesn’t interrupt the concert. CCR plays straight through, without self-conscious cinematic intrusion. One suspects this might be what it felt like to see them playing at their peak.

In the final moments, Bridges’ narration calls CCR “the biggest band in the world.” But ten months later, they were already disbanding. Fans will watch this performance with a fatalism that the documentary tries to avoid. Notably, no band members were involved with this documentary; it’s a product of the money machine, not art. But it’s also a tight, muscular performance of a band whose work really mattered.

Tuesday, September 20, 2022

The Moral Boundary of “Over There”

Florida Governor Ron DeSantis in his
favorite pose: angrily lecturing the crowd

U.S. military drone pilots aren’t eligible for America’s most prestigious combat awards, no matter how many state enemies they’ve successfully killed. As Eyal Press writes, top brass doesn’t consider kills delivered remotely to equal those delivered in-person. Though drone pilots often rack up higher totals, and also witness more of the carnage they leave behind, they’re only entitled to second-class commendations, and aren’t officially deemed “heroes.”

I thought of these unaccredited veterans this week while watching the unfolding folderol around Florida Governor Ron DeSantis dumping dozens of undocumented immigrants on Martha's Vineyard. Just as kills delivered remotely aren’t regarded as officially “real,” the suffering DeSantis and his allies delivered upon both those migrants and the permanent residents of Martha’s Vineyard is somehow less than real. The chain of causality somehow doesn’t matter.

Based upon the middle-class Christian morality I grew up with, DeSantis is morally culpable not only for the sufferings of those migrants, but the burden imposed upon Martha’s Vineyard residents. Information keeps spilling out about how DeSantis’ operatives outright lied to migrants who only wanted shelter and honest work. Then he stuck Florida taxpayers with the bill for his deception. All to score points with MAGA voters who vocally despise immigrants.

Conservatives apparently think DeSantis scored points against the Vineyard’s rich liberal denizens. (Both Bill Clinton and Barack Obama regularly vacationed on the Vineyard during their presidencies.) But those rich denizens only live there seasonally; the Vineyard’s year-round residents are mainly poor, working-class, and thanks to a fluke of history, disproportionately likely to be born deaf. The rich summer residents went home weeks ago.

Watching everything unfold, I realized: DeSantis doesn’t think what’s happening is his fault. And when I say “fault,” I don’t mean “responsibility”; I mean he sees the chaos of Vineyard residents scrambling to provide food and shelter with no advance warning as the Vineyard’s moral failing. Like an officer in the drone pilots’ corps, he thinks what happens thousands of miles away is somebody else’s doing, even when his own apparatchiks pulled the trigger.

This realization clarified something for me which has always been implicit in modern politics, but never quite so forthright: conservatives see proximity as a necessary component of morality. Concepts like the chain of causality don’t matter. You aren’t responsible for the knock-on effects of your choices; only the individual closest by actually holds culpability. Once those migrants were on DeSantis’ plane and away from him, he stopped bearing any responsibility for them.

President Theodore Roosevelt

These immigrants, coming to America mainly for honest work, didn’t just happen. America has contributed to Latin American poverty and violence at least since the days of the United Fruit Company and Teddy Roosevelt’s “Gunboat Diplomacy,” if not earlier. Mexico has trembled at notorious Norteamericano whims since, arguably, James K. Polk. America stripped Latin Americans of their land and labor; now they come here, looking for work.

But conservatives don’t believe this matters. President Trump’s notorious “shithole countries” comment represents a continuous pattern since at least Ronald Reagan’s disparagement of Salvadorans seeking asylum in America: that Latins, and only Latins, are responsible for continued Latin American poverty. Never mind that the partisans shooting up 1980s El Salvador were literally trained at Fort Benning, Georgia.

Nor is this pattern exclusively international. American poverty is so geographically concentrated that “urban” has become a dog whistle meaning “poor and Black.” Yet our national poverty policy, at least since Reagan, and definitely amplified by Clinton, has been to blame America’s poor for their continued neighborhood blight. From police crackdowns to enforcement of nickel-and-dime regulations on EBT recipients, our government attempts to punish people out of their poverty.

Again, this isn’t new. Even before “redlining” became common parlance, Jane Jacobs observed how banks and governments used neighborhoods’ poverty to justify refusing to do anything to fix the communities. Those in proximity bore fault for that poverty; generational efforts to create poverty stopped being anyone’s responsibility. Those whose laws or financial transactions created that poverty felt free to stick their hands in their pockets and whistle.

This has been the pattern, I now realize, throughout my lifetime. Conservatives like Reagan and Trump, and even centrists like Bill Clinton and Barack “The Drone Ranger” Obama, feel like nothing sticks to them if it simply happens far enough away. The DeSantis Martha’s Vineyard clown show is simply the latest manifestation of this: he made a choice, and took an action, but in his mind it stopped being his fault once it crossed state lines.

Friday, September 16, 2022

Modern Life, and the False Yearning For “Inspiration”

Like many writers, I have wasted countless hours staring at blank pages or flickering screens, waiting for inspiration to overcome me. All writers know this doesn’t work. Though most of us can recount the occasional moment of overwhelming afflatus, when an idea emerges fully formed and we run with it, these are the outliers. Usually, inspiration only happens after the creative act begins: we become inspired after, not before, we write.

I discovered years ago that, to make that supposed magic of inspiration happen, I simply needed to start writing. I set a ten-minute timer, force myself to type with the urgency of a prisoner writing his confession, and somewhere around minute seven or eight, I surprise myself by discovering whatever hidden message I secretly wanted to write all along. I know this; I’ve “discovered” it several times. Yet I keep forgetting, and needing to relearn this important lesson.

How many of us, I wonder, spend our lives wandering largely aimlessly, waiting for inspiration to overtake us? Waiting for God, the universe, or the shared unconscious to speak some message we need, and provide our lives direction? When I watch people trudging through jobs that pay well, but provide no larger satisfaction, or sticking with hobbies, relationships, religions, and other activities they clearly hate, this question always emerges again.

Wandering is a useful activity, I’ll acknowledge. Elijah, the Buddha, Jesus, and Mohammed all needed to spend time wandering, searching for that higher message they’d eventually bring back to the nations. An entire world religion, Taoism, is dedicated to attentive wandering, to finding one’s place by encountering new experiences without a schedule. So I never want anyone to accuse me of disparaging the act of wandering.

However, there’s a difference between wandering purposefully, with an attentive mind seeking answers, and traipsing aimlessly through life, hoping something might emerge, fully formed, and give our lives meaning. Modernity has stripped most people of purpose, whether they once found it through religion, which continues retreating from daily life, or through worldly pursuits like work, which is increasingly automated. We meander, today, because there’s no destination left.

In my childhood, my parents told me writing wasn’t a pursuit for grown-ups. Most people who aim to become writers flail about and never make a profit, they warned me; so act like a responsible adult and learn a trade. Because they provided no guidance in finding a trade, but also no support in becoming an artist, I became neither. Though I think I’m a pretty good writer, I’ve never made my bones in the field, because I never learned the business aspects of art.

(Full disclosure, my parents eventually reversed themselves and said I was a natural-born artist, and should pursue that. But not until I was nearly forty, and my habits had calcified.)

From a political perspective, progressives and leftists like speaking of systems. You’ll hear them harping relentlessly on systemic racism, systemic poverty, systemic injustice, and the systemic kitchen sink. Conservatives condemn them for this, using the language of individualism and autonomy, but their actual policies swap systems for hierarchies, and urge everyone to find their place on the pyramid. Systems or hierarchies: either way, shut up and comply.

I could continue listing ways modernity fails us. Education, by dividing life into discrete subjects, separates art from business, philosophy from science, and pits these disciplines in gladiatorial combat. Therefore too many aspiring artists never learn the fiduciary responsibilities of art-making, while, as we learned in 2008, too many business professionals overlook the human aspects of finance. Then artists and economists both crash.

As in writing, life doesn’t provide ready-made inspiration. Jesus didn’t promise anyone salvation through abstract belief; he told believers: “Go, and do likewise.” We can give mental assent to anything: I’m going to love my neighbor. I’m going to write this book. I’m going to start my business. But until we do, it doesn’t matter. Our brains respond to the stimuli we give them; we change our brains, and arguably our souls, in the doing, not the rumination.

How many people, I wonder, march through somebody else’s life script, hoping that inspiration will emerge? More than a few, surely, and that includes me. But in life, as in writing, inspiration comes second. We live first, with the mistakes and the uncertainty which life entails, then inspiration follows: clarity and insight emerge from what seems, at first, like chaos. Inspiration isn’t a fully formed idea; it’s the moment we see patterns in life’s mess.

Monday, September 12, 2022

The Woman Elizabeth vs. The Queen

Elizabeth II at her coronation in 1953

In the four days since Queen Elizabeth II’s death, I’ve seen two contradictory opinions trending online; and neither is necessarily wrong. While lovers of tradition and continuity mourn Elizabeth’s person, less charitable critics, particularly from current or former colonies, excoriate her part in an imperial machine that devastated countless nations and crippled entire continents. Each side condemns the other for being unwilling to give ground, especially in time of mourning.

I’m unwilling to call either side wrong. The British imperial juggernaut bears massive responsibility for widespread damage. As historian Walter Rodney writes, Europe overall, and Britain particularly, orchestrated an economy of race-based deprivation from which Africa hasn’t recovered; the same broadly applies to South Asia and the South Pacific. Britain didn’t act alone, but no other European colonial power matched Britain’s reach.

However, by Elizabeth’s accession in 1952, that colonial enterprise was already in retreat. Elizabeth’s distant cousin, Lord Louis Mountbatten, had already overseen Britain’s release of authority in India, Pakistan, and Burma. Egypt was already independent, and Britain would relinquish its other African colonies incrementally over the coming decades. They didn’t do so peacefully, sure, nor from simple big-hearted charity. But Elizabeth’s British “Empire” was broke, war-weary, and winding down.

It bears stressing that Britain’s monarchy survives, in no small part, because of its essential powerlessness. While French monarchs were frog-marched to the guillotine, while Kaisers were chased out of Germany in disgrace and Tsars were executed under cover of darkness, British monarchs held office remarkably peacefully, even through two world wars. And that office survives, substantially, because in real-world policy terms, it doesn’t matter that much.

The last British dynasty to actually set policy, the Stuarts, were chased from power in 1688. As Thomas Levenson writes, the two greatest royal prerogatives, the national army and the national purse, passed into Parliamentary control. While the French army and treasury remained the monarch’s personal possessions, the British state separated from the royal person. The throne retains titular authority, but the person holding the throne hasn’t mattered in nearly 350 years.

That’s why Britain triumphed in the Nine Years’ War, despite France’s greater wealth and population. It’s probably also why Britain survived the transition to industrial capitalism with fairly limited civil uprisings and the occasional testy William Blake poem. Nascent capitalism destroyed French and German monarchies, and precipitated the American Civil War. Britain, meanwhile, invented securities futures and kept going pretty smoothly.

The last photo of Elizabeth, taken
at Balmoral, days before her passing

Don’t misunderstand me. This transition to state-based rather than monarchical power wasn’t without consequence. British colonization in North America was a desultory, poorly organized enterprise, but the newly capitalized British state ransacked Africa and India with an industrial efficiency previously unseen in world history. British colonialism overtook the world because of, not in spite of, Britain’s weak monarchy; no other country could’ve done that.

I watch the dueling strains of aftermath unfold fully conscious of this history. There’s the woman, Elizabeth, who famously disdained royal frippery and shook hands with commoners. After catastrophes like the Grenfell Tower fire, while Prime Minister Theresa May issued toothless injunctions from inside Downing Street, Elizabeth actually met with survivors, and took point in gathering statements—acts of leadership frustratingly absent from recent British governments.

Against the woman Elizabeth stands the Queen. Though the monarchical person hasn’t mattered since 1688, the monarchical role remains necessary to validate the ever-changing state. And that role bears responsibility for worldwide colonial devastation and postcolonial malaise. Even in former colonies that have relatively flourished, like the U.S., Canada, and Australia, that prosperity isn’t distributed equally, and few First Peoples enjoy much security.

The woman Elizabeth stands against her rank of Queen. The fact that colonized peoples continue blaming Elizabeth for postcolonial ruination reveals that they need a single, unitary individual to hold the narrative together. Other nations have sought a similar unitary individual: Bonapartism in France, the Führer in Germany, and contemporary strongman leaders like Trump and Putin. Even in postcolonial states, autocrats like Nigeria’s Abdulsalami Abubakar tie the story together.

Thus I can understand the contradictory desires. Some want to mourn Elizabeth, while others want to condemn the Queen. These two stories aren’t contradictory, because history would’ve carved the same path of destruction with or without Elizabeth’s person. But Elizabeth, unlike her Nazi-sympathizer uncle Edward VIII, chose to use her powerless title for ordinary British people’s betterment.

She wasn’t always successful, certainly. But, considering the alternative was more power to people like Margaret Thatcher or Boris Johnson, the world was arguably better with her in it.

Saturday, September 10, 2022

Time and Relative Dimension in Space

When Picasso debuted Les Demoiselles d’Avignon in 1916, it provoked outrage; critics called it hideous, pornographic, and “a loss to French art.” It’s now considered a pathbreaking work of Twentieth Century modernism. But it’s also somewhat overshadowed by time and familiarity. Like me, you’ve probably seen it reprinted in countless art history texts, coffee-table books, and cheap reproductions. It’s become comfy, bland.

Earlier this year, the artificial intelligence program Dall-E Mini, and its altogether more intricate cousin Midjourney, began churning out computer-drawn art online. I scoffed at both, not only because I disbelieved an AI program could make innovative artistic choices, but also because the programs kept getting clogged by users. Like disco or Bill Gates, anything that becomes too popular too quickly usually becomes toxic just as quickly.

However, I’m also curious, and when by happy coincidence I landed on an opportunity to use Dall-E, I grabbed it. As a fan of the British TV series Doctor Who, I decided to test-drive Dall-E by inputting an easy keyword: TARDIS. Then I connected a second keyword, Picasso, just to see what might’ve happened if the notorious Pablo Picasso had painted Britain’s most famous time and space capsule.

The experiment didn’t disappoint.

In Les Demoiselles, Picasso reduced the female form to its most rudimentary, stripping the niceties which art inherited from Renaissance realism. Though artists like Leonardo had rendered human forms in unprecedented detail, their successors had cheapened realism; art had descended from Renaissance verisimilitude, through Rococo frippery, and into cheapjack schlock by the early Twentieth Century. Picasso purged that habit of fake realism from his work.

But more important, in stripping artifice from painting, Picasso stripped artifice from his subjects. His Demoiselles were prostitutes, residents of a brothel on Barcelona’s Carrer d’Avinyó (Avignon Street). Though these women tried to look enticing for men, Picasso captured the feeling beneath their erotic finery; they gaze through their audience with angle-eyed contempt that contrasts with their poses, exposing their scorn for the clientele upon whom they depend for their living.

Except, as I said, that meaning has become murky through familiarity. Picasso spent months perfecting this one painting; by his later career, he cranked out large-scale paintings at a breakneck pace of one canvas per day, hoping to recapture the magic he demonstrated in classics like Les Demoiselles or The Old Guitarist. He became an artistic assembly line, producing the artistic equivalent of wallpaper for a lucrative audience.

When Doctor Who returned to television in 2005, it enjoyed vastly improved technology over how it last appeared, in 1989. Producers could merge shots, allowing the TARDIS to materialize into tracking shots, or permitting the camera to follow Rose from an ordinary London street into the otherworldly TARDIS interior. I remember watching those first episodes, wide-eyed with childlike wonder. The show was as perfect as it possibly could’ve been.

But like Picasso, the BBC had to repeat the wonder of that first impression forever. It needed to constantly create more elaborate monsters, more apocalyptic catastrophes for the Doctor to avert, and more life-altering consequences for the Doctor’s companions to endure. Early story arcs, like Bad Wolf or Harold Saxon, turned into ridiculous multi-season extravaganzas like the Cracks in Time or Missy’s Redemption. Wide-eyed wonder turned to eye-rolling cynicism.

Dall-E’s renderings of the TARDIS in Picasso’s style are crude, especially compared to the detailed realism which Midjourney is capable of producing. Yet in their crudeness, in their simplicity, they reveal Doctor Who in terms I remember from watching low-tech Tom Baker episodes on PBS in 1982. Sweeps of pattern and color, of hard lines against soft contours, remind me why I first loved this series.

Art that once moved us eventually becomes familiar and banal. Mondrian’s Composition series, once a shocking demonstration of color and line, has become so commonplace that it’s used as a literal wallpaper pattern. The same happens with popular art—and I’d consider television “art” in those terms. Doctor Who, like Law & Order or Gunsmoke, used to feel trailblazing and dangerous. But the need to keep producing reduces them to gimcracks.

I sincerely doubt Dall-E or Midjourney can create innovative art. Both programs work from existing algorithms, and replicate choices which previous human artists have made; they don’t make new choices of their own. But, in combining disparate influences, they can nevertheless have a real emotional impact on the audience. Most important, they can remind us why the familiar art with which we’ve grown tragically comfy once mattered so much.

Monday, September 5, 2022

The Rings of Power, and the Power of Trolls

Morfydd Clark as Galadriel in The Rings of Power

Back when Peter Jackson’s Lord of the Rings movies first dropped, I remember disagreeing with this kid at the university I attended. He angrily disputed the characterization of Faramir in The Two Towers, a characterization which, in fairness, was at odds with the novel. “He’s the good son!” this kid argued. “His entire character point is that he isn’t tempted by the Ring; he shrugs off its influence and helps Frodo achieve his goals!”

Everyone around him, me included, tried to reason with him. Faramir, in the novel, literally sits Frodo down with some tea, talks out his history, and finally pledges to aid Frodo’s quest. However, we all agreed, that would’ve been talky and boring in a strictly visual medium like film. Characters solving disagreements with language is satisfying in prose, but not on screen; that’s why the Council of Elrond—20,000 words in Tolkien—was similarly reduced in the first movie.

Nothing doing. This kid adamantly insisted that his mental construct of Faramir’s character, and only his mental construct, deserved to be depicted onscreen. When everyone around him presented carefully constructed, evidence-based reasoning that language-driven scenes, performed mostly sitting down, make boring cinema, he simply got increasingly angry. “But he’s the good son,” this kid shouted, the same thing he’d already been saying.

I recalled this kid, and his angry one-note assertion, with the online reaction to Amazon Studios’ The Rings of Power, which dropped last week. The usual suspects crawled from the online woodwork with time-tested reasons why the adaptation, based on backstory notes which Tolkien published as an appendix, supposedly deviated from the author’s vision. But when they say “the author’s vision,” these amateur critics clearly mean their own mental construct.

Not all comments were bad-faith. Some complained about the mostly English and New Zealand actors playing Harfoots with faux Irish accents almost as off-base as Dick Van Dyke’s notorious Cockney. Meanwhile Owain Arthur as Durin, prince of Khazad-dûm, is an over-the-top caricature of an angry Glaswegian coal miner. Serious fans might question these easy choices in devising characters whom Tolkien didn’t necessarily characterize thus.

However.

An appalling number of critics complain about the presence of non-White actors in their supposedly lily-white Middle Earth. Showrunners for The Rings of Power cast British comedian Lenny Henry as a Harfoot (Hobbit) leader and Puerto Rican actor Ismael Cruz Córdova as an Elven sentry, among others. Unpaid reviewers, many using pseudonyms, have review-bombed the first two episodes on Reddit and Rotten Tomatoes for this shirttail diversity.

Sir Lenny Henry (center) with Sara Zwangobani and Thusitha Jayasundera
in The Rings of Power

Others have attacked Welsh actress Morfydd Clark’s portrayal of young Galadriel. Like that kid in college, these reviewers have a particular image in their heads, and respond with rage when it’s violated. Clark plays Galadriel as a youthful, sometimes vindictive warrior, not the slow-moving pre-Raphaelite monarch Cate Blanchett portrayed in Peter Jackson’s movies. (By contract, this series is unrelated to Jackson’s movies, though they clearly overlap stylistically.)

To constrain these bad-faith reviewers, Amazon has, among other things, begun limiting access to customer reviews. (This didn’t begin with The Rings of Power, but it definitely pushed the cause.) Answering these troll-like reviewers on Twitter and Instagram has become a full-time job for some fans. I wish we could politely decline to engage the trolls, but we can’t. As Kelly Marie Tran learned, the trolls’ combined might and lack of shame have changed the landscape.

Popular social media has made outrage marketing remarkably profitable. Trolls use bad-faith argument to steer the internet’s attention to themselves, usually by provoking normal people into outrage, which lets them claim the moral high ground. Politicians, comedians, and ordinary people who enjoy the spotlight use the internet to antagonize those blessed with empathy and honor. They don’t care about outcomes; they only want attention.

Admittedly, I’m biting the hand that feeds. I get virtually all my blog business through Facebook, Twitter, and Instagram; the utility of read-write social media has made it possible for me to share my opinion worldwide. But I attempt to communicate in good faith, something painfully lacking in anonymous media like Reddit and Chan boards. And because I don’t stoke outrage, probably not many people will ever read this essay.

My college friend wasn’t a troll. When The Two Towers dropped in 2002, social media barely existed, and I doubt he would’ve misused Reddit’s reach if he’d had it. But he had the same bad-faith deafness to reason, and the arrogance to believe his opinion should carry weight. And that kind of attitude is dangerous today, as we keep learning.

Friday, September 2, 2022

The Crater in the Center of Town

A vintage postcard featuring the Palmer House Motel

I stand before the putty-colored monstrosity and wonder: how did the place I love become this? I already knew they’d flattened the Palmer House Motel in Auburn, Nebraska. But I hadn’t known that the new owners intended to replace it with another chain store, another monument to the ways commerce makes every place the same. There’s no sign on it yet, but I recognize the architecture as belonging to Dollar General.

When my truck collapsed beside Highway 75 in February 2020, the Palmer House Motel provided me an inexpensive room in a strange town when I had nowhere else to crash. That’s not a major personal investment, of course. Anybody hypothetically could’ve done that. The motel’s owners also provided me transportation, recommended affordable places to eat, and took time to make me feel welcome. They didn’t have to do any of that.

Palmer House mattered to me when I had nobody else to rely on. When all the possessions I’d accumulated were somewhere else, and I lacked food or a pillow, they offered a place to sleep, a connection to the world, a way that I could, for one scared and desperate weekend, still be part of the world. Palmer House was part of my life for three days, but those three days were embossed on my brain as a frightening encounter that, with their help, I successfully avoided.

And now they’re gone. No longer physically there, no longer part of the larger community fabric. Instead, there’s another standardized retailer, a place without any sense of local identity. They literally flattened the rolling landscape and made the community conform itself to the chain store’s demands, unlike the motel, which used the texture of the land. Palmer House was part of the community, and the people who worked there also lived there.

Before the Interstate channeled most Nebraska traffic into one centralized corridor, Palmer House offered travelers a connection to the community. Even after most traffic moved thirty miles north of Auburn, Palmer House attracted migrant workers and short-term residents, offering them a local address within walking distance of downtown. It offered people a base they could leave every morning and return to every night, right in the middle of town.

Now, it’s been replaced by commercial anonymity, a building useful but without character. A building people can visit without having any connection, where they transit across the space with ghost-like ephemerality. They aren’t part of the community, and the community doesn’t change them. Dollar General offers no opportunity for introspection or identity, like I received when I was stuck in Auburn. There’s just a vacancy at the heart of town.

The criminally underutilized downtown of Auburn, Nebraska

Auburn generally, and Palmer House specifically, offered a chance to look inside myself. A chance to evaluate my principles, and decide whether I approved or disapproved of my life’s choices. Three days wasn’t time enough to truly understand the community, but the community offered me an opportunity to understand myself a little better. An opportunity that, in my mind, is inextricably connected to a specific place.

Auburn has other motels. A trucker’s motel on the far north side of town, for instance. But, like the dollar store, it’s built on flattened land that ignores the natural terrain. Like the dollar store, it’s an imposition, a place people can duck into and then leave without having to see anything that changes their viewpoint. Palmer House was part of the community; the trucker’s motel is a crater, a concerted absence of local experience.

My experience in Auburn was colored by everything that came before, but also everything that came after. During my “lost weekend” at Palmer House, COVID-19 was making its first appearance in America. I had no idea that this introspective weekend would be my last opportunity to travel more than eight miles from my house for the next fourteen months. That weekend with no car, no phone, and limited social contact was a prelude to everything impending.

I learned important lessons that weekend, but sometimes I wonder whether I learned enough. I find myself drifting back into comfy but meaningless patterns of work, screen staring, and sleep. Before Palmer House was demolished, I’d considered going back, not as a scared and desperate wanderer, but as a seeker, to recapture the moments spent wondering. But, like so many, I waited too long. The place I needed is gone.

How many of us, I wonder, can look to the spiritual refuge we need, and find just another Dollar General in its place?