
Monday, March 2, 2026

Will We Ever Get Tired of Re-Fighting Old Battles?

Promo still from the last time someone dragged The X-Files out of the deep freeze

This weekend’s illegal American bombing of Iran arrives hand-in-glove with another cultural announcement: Hulu is relaunching The X-Files. Preliminary announcements call it a “reboot,” but deeper reportage suggests it’s more a soft reboot, a continuation with new leads. Simultaneously, reports suggest there might be a long-awaited season five for Veronica Mars. (This is more ambiguous; the fuzzy wording may simply be misreading Netflix’s acquisition of the series.)

I’ve complained before about the cultural currents behind constant reboots. Pop culture is always behind the times anyway, and the flood of streaming media has made the biggest entertainment conglomerates more timid, not less. But this feels different. The resurrection of two popular franchises, thirty-three and twenty-two years old respectively, amid a “Make America Great Again” culture feels more than timid. It feels like a hasty retreat from reality.

Throughout the Current President’s 2016 campaign, he decried urban violence and burning cities, despite such violence being at near-historic lows. But his rhetoric makes sense in his life context, as the Bronx famously caught fire in the late 1970s, the same time he moved into Manhattan real estate with his purchase of the former Commodore Hotel. The poor future President was simply trapped in the sociopolitical milieu of his thirties, unable to grow.

Similarly, this weekend’s bombing of Iranian civilian targets mirrors the President’s unhealed past. Consider his inability to stop heaving accusations against the Central Park Five, nearly a quarter century after they were exonerated. This President retains grudges and political interpretations molded by a privileged youth and segregated social set. In context, he likely bombed Iran, not really for its nuclear program, but as payback for the 1979 Hostage Crisis.

This has become the default for much American politics. We aren’t facing the past; we’re relitigating it. In the 1980s, both political discourse and mass media desperately wanted to re-fight the Vietnam War, and get it right this time. Franchises like Iron Eagle, Rambo, and Top Gun promised to purge America’s Vietnam disgrace. More recently, Call of Duty and James Bond try to tweak our memory of the Cold War.

Caught in the interregnum between the Cold War and the Global War on Terror, the Clinton decade offered enforced cheerfulness, a frothy meringue of Empire Records and Ben Stiller’s early career. The X-Files directly countered that, maintaining post-Reagan cynicism toward America’s surface culture. Scully and (especially) Mulder walked through neon-soaked midnight landscapes, uniquely able to see the venality that made that era’s party ethos possible.

Kristen Bell in the original network run of Veronica Mars

Veronica Mars pushed this contrast to the extreme. Read superficially, the series presented a stereotyped Southern California panorama, all hypersaturated colors and loud, jangly indie pop soundscapes. Only Veronica and her father—and, eventually, those trapped in their decaying orbit—understood the vulgar horse-trading and human commodities that subsidized Neptune, California’s skin-deep glamour. Veronica, like Mulder, was ready to expose the lie, damn the consequences.

Both franchises took dim views of power structures. Veronica Mars fought plush-bottomed police as often as criminals, while Mulder and sometimes Scully brought official corruption to light despite, not because of, the law. But both presented a morally distinct, binary universe. Neptune’s Sheriff Lamb and the Smoking Man were clearly evil, and needed to be exposed to a public which their shows depicted as passive and sheep-like, desperate for an underdog hero.

Unfortunately, the political tenor has changed. From the impotent government depicted in the 1970s, to the malignant one of the 1990s, the problem has been presented as siloed at the top. The disclosure of the Epstein documents, like the Panama Papers before them, has revealed a network of politicians, capitalists, entertainers, academics, and scientists colluding to support an otherwise decrepit system. The “secret” isn’t secret anymore.

While politicians and media captains want to refight the battles of their, or our, childhood, rapidly unfolding news reveals their vision of the problem as charmingly naïve. Hardly a top-tier capitalist or government insider failed to share information with Epstein. Public intellectuals like Noam Chomsky and Richard Dawkins had their hands in his pockets. The rot isn’t an isolated, partisan tumor. Everyone, everywhere in the system, has been proved complicit.

Veronica Mars and The X-Files helped define a generation’s idea of acceptable villains. They showed our lawkeepers were complicit with lawbreakers in the anarchy most people felt in their ordinary lives. But reality has overtaken the scope these shows made possible. Bringing back the monsters of my twenties is worse than quaint. It offers audiences my age an excuse to avoid the monsters that have revealed themselves in reality.

Monday, April 28, 2025

Further Thoughts on the Futility of Language

Patrick Stewart (left) and Paul Winfield in the Star Trek episode “Darmok”
This essay is a follow-up to my prior essay Some Stray Thoughts on the Futility of Language

The popularity of Star Trek means that, more than most science fiction properties, its references and in-jokes exceed the bounds of genre fandom. Even non-junkies recognize inside references like “Dammit, Jim,” and “Beam me up.” But the unusual specificity of the 1991 episode “Darmok” exceeds those more general references. In that episode, the Enterprise crew encounters a civilization that speaks entirely in metaphors from classical mythology.

Berkeley linguist George Lakoff, in Metaphors We Live By (co-written with philosopher Mark Johnson), contends that much language consists of metaphors. For Lakoff, this begins with certain small-scale metaphors describing concepts we can’t describe directly: in an argument, we might “defend our position” and “attack our opponents.” We “build an argument from the ground up,” make sure we have “a firm foundation.” The debate ends, eventually, when we “see the other person’s point.”

Such first-level metaphors persist across time because, fundamentally, we need them. Formal debate structures shift little, and the figures of speech remain useful, even as the metaphors of siege warfare become obsolete. As long as speakers and authors repeat the metaphors, they retain their currency. Perhaps, if people stopped passing such metaphors on to the next generation, they might fade away, but so far, that hasn’t happened in any way I’ve spotted.

More pliable metaphors arise from cultural currents that might not persevere in the same way. Readers around my age will immediately recognize the metaphor when I say: “Read my lips, no new taxes.” They may even insert President George H.W. Bush’s hybrid Connecticut/Texas accent. For several years in the late 1980s and early 1990s, the “Read my lips” metaphor bespoke a tough, belligerent political stance that stood inviolate… until it didn’t.

In the “Darmok” episode, to communicate human mythic metaphors, Captain Picard describes the rudiments of the Epic of Gilgamesh, humanity’s oldest known surviving work of fiction. Picard emphasizes his familiarity with ancient myth in the denouement by reading the Homeric Hymns, one of the principal sources of Iron Age Greek religious ritual. For Picard, previously established in canon as an archeology fan, the earliest myths represent humanity’s narrative foundation.

But do they? While a nodding familiarity with Homer’s Odyssey and Iliad remains a staple of liberal education, how many people, outside the disciplines of Sumerology and classical studies, read Gilgamesh and the Homeric Hymns? I daresay that most Americans, if they read mythology at all, mostly read Bulfinch’s Mythology and Edith Hamilton’s Mythology, both of which sanitized Greek tradition for the Christian one-room schoolhouse.

The attached graphic uses two cultural metaphors to describe the writer’s political aspirations. The reference to Elvis on the toilet repeats the widespread cultural myth that Elvis Presley, remembered by fans as the King of Rock and Roll, passed away mid-bowel movement. There’s only one problem: he didn’t. Elvis’ loved ones found him unconscious on the bathroom floor, apparently following a heart attack; he was pronounced dead at the hospital later that day.

The drift between Elvis as cultural narrative, and Elvis as historic fact, represents the concept of “mythology” in the literary critical sense. We speak of Christian mythology, the mythology of the Founding Fathers, and the myths of the Jersey Devil and prairie jackalope. These different “mythologies” represent neither facts nor lies, but stories we tell to understand concepts too sweeping to address directly. Storytelling becomes a synecdoche for comprehension.

Similarly, the broad strokes of Weekend at Bernie’s have transcended the movie itself. It’s questionable how many people ever watched the full movie, beyond the trailer. But the underlying premise has become a cultural touchstone. Likewise, one can mention The Crying Game or The Sixth Sense, and most Americans will understand the references, whether they’ve seen the movies or not. The vague outlines have become part of our shared mythology.

But the movies themselves haven’t become so. Especially as streaming services have turned movie-watching into a siloed enterprise, how many people watch older movies of an evening? We recognize Weekend at Bernie’s, released in 1989, as the movie where two doofuses use their boss’s corpse as a backstage pass to moneyed debauchery. But I doubt many people could say what actually happens, beyond the most sweeping generalities.

Both Elvis and Bernie have come unmoored from fact. Their stories, like those of Gilgamesh and Darmok, no longer matter; only the cultural vibe surrounding them survives. Language becomes a shorthand for understanding, but it stops being a vessel of actual meaning. We repeat the cultural references we think we share, irrespective of whether we know what really happened, because the metaphor, not the fact, matters.

Friday, March 14, 2025

How To Invent a Fake Pop Culture

I don’t recall when I first heard the song “Sally Go ’Round the Roses.” I know I first heard Pentangle’s folk singalong arrangement, not the Jaynetts’ Motown-tinged original. Like most listeners my age, who grew up with the mythology of Baby Boomer cultural innovation, I received that generation’s music out of sequence; the 1960s appeared as a single unit, without the history of cultural evolution that defined the decade.

Therefore I didn’t understand how influential the Jaynetts’ original version really was. Its use of syncopated backbeat, gated distortion effects, and enigmatic lyrics was, in 1963, completely innovative. The British Invasion, with the inventive tweaks the Beatles and the Kinks experimented with, hadn’t hit America yet. The original label, Tuff, reportedly hated the song until another label tried to purchase it, causing Tuff to rush-release the record.

Eventually, the track hit number two on the Billboard Hot 100 chart. More important for our purposes, though, a loose collective of San Francisco-based musicians embraced it. Grace Slick recorded a rambling, psychedelic cover with her first band, The Great Society, and tried to recreate its impact with classic Jefferson Airplane tracks like “White Rabbit” and “Somebody To Love.” Much of her career involved trying to recapture that initial rush.

Once one understands that “Sally” came first, its influence becomes audible in other Summer of Love artists, including the Grateful Dead, Creedence Clearwater Revival, Moby Grape, and Big Brother and the Holding Company. These acts all strove to sound loopy and syncopated, and favored lyrics that admitted of multiple interpretations. Much of the “San Francisco Sound” of 1966 to 1973 consisted of riffs and jams on the “Sally” motif.

That’s why it staggered me recently when I discovered that the Jaynetts didn’t exist. Tuff producer Abner Spector crafted “Sally” with two in-house songwriters, an arranger who played most of the instruments, and a roster of contract singers, mostly young Black women. The in-house creative team played around and experimented until they created the song. It didn’t arise from struggling musicians road-testing new material for live audiences.

Grace Slick around 1966, the year she
covered “Sally Go ’Round the Roses”
with the Great Society

A New York-based studio pushed this song out of its assembly-line production system, and it became a hit. Like other bands invented for the studio, including the Monkees and the Grass Roots, the Jaynetts didn’t pay their dues; the studio system willed them into existence. They produced one orphan hit, which somehow travelled across America to create a sound-alike subculture, back when starving musicians could afford San Francisco rent.

Culture corporations, such as the Big Three labels which produce most of America’s pop music, and the Big Five studios which produce most of America’s movies, love to pretend they respond to culture. If lukewarm drivel like The Chainsmokers dominates the Hot 100, labels and radio conglomerates cover their asses by claiming they’re giving the customers what they want. Audiences decide what becomes hits; corporations only produce the product.

But “Sally’s” influence contradicts that claim. Artists respond to what they hear, and when music labels, radio, and Spotify can throttle what gets heard, artists’ ability to create is highly conditional. One recalls, for instance, that journalist Nik Cohn basically lied White disco culture into existence. Likewise, it’s questionable whether Valley Girl culture even existed before Frank and Moon Zappa riffed in Frank’s home studio.

It isn’t only that moneyed interests decide which artists get to record—a seamy but unsurprising reality. Rather, studios create artists in the studio, skimming past the countless ambitious acts playing innumerable bar and club dates while hoping for their breakthrough. This not only spares corporations the difficulty of comparison shopping for new talent, but also leaves them wholly owning culture as subsidiaries of their brand names.

I’ve used music as my yardstick simply because discovering the Jaynetts didn’t exist rattled me recently. But we could extend this argument to multiple artistic forms. How many filmmakers like Kevin Smith, or authors like Hugh Howey, might exist out there, cranking out top-quality innovative art, hoping to become the next fluke success? And how many will quit and get day jobs because the corporations turned inward for talent?

Corporate distribution and its amplifying influence have good and bad effects. One cannot imagine seismic cultural forces like the Beatles without corporations pressing and distributing their records. But hearing Beatles records became a substitute for live music, like mimicking the Jaynetts became a substitute for inventing new culture. The result is the same: “culture” is what corporations sell, not what artists and audiences create together.

Wednesday, December 4, 2024

Capitalism and the Winners’ Society

Jeopardy host Ken Jennings

I just did something I used to do frequently, but haven’t done for years: I watched a full episode of Jeopardy. Television has lost its allure in recent years, and Jeopardy’s appeal to shallow knowledge of inconsistent topics has become an emblem of modern society. Too many people know surface-level flotsam about nearly everything, giving us all a false sense of expertise on topics where we’re profoundly ignorant.

One player, working with speed and confidence, managed to rack up the largest cash haul on the stage; then he hit a Daily Double. He decided to risk it all on a question about corporate ad slogans. Yes, he risked it all, and he lost it all. He went from being in the lead to having nothing.

American capitalism loves winners. I’ve witnessed journalists, economists, and social media stans tripping over themselves to lavish praise on billionaire CEOs. It’s often unintentionally hilarious to watch White boys without two dimes to rub together invent phony narratives to explain why Elon Musk, Jeff Bezos, and Mark Zuckerberg are epoch-making geniuses, not just winners of America’s greenback lottery.

Business writers I've reviewed on this blog have lavished praise on society's winners. They will perform seriocomic contortions to justify why, say, Jack Welch or Sam Walton are intrepid adventurers and the epitome of moneyed masculinity. To critics like me, these paragons of capitalism are highway robbers and guttersnipes who hoard the wealth created by others. But their fluttering groupies insist their favorite CEOs generated their wealth by rubbing two sticks together in the woods.

Elon Musk

Among these praises, one seems pointedly recurrent: billionaires deserve their riches because they risked their own seed capital. Risk looms large in billionaire mythology. Michael J. Sandel quotes philosopher Robert Nozick as saying that all wealth is the product of bold risk-taking and unique innovation, the outcome of intrepid individuals who somehow exist in an economic vacuum.

However, even if we accept the risk mythology, it fumbles on one level: we evaluate risks by their success. It's easy to praise Bill Gates or Steve Jobs for their success, when we don't simultaneously evaluate why the founders of Sun Microsystems and Commodore International flopped so ignominiously. When we only look at those who risked and won, it's easy to think risk leads to reward.

But for every world-renowned success, there are uncountable failures. Eighty percent of American companies fail within five years. A handful of actors become million-dollar stars, but most limp along on day jobs before leaving the industry altogether. For every Facebook, there's a bloodbath of MySpaces, Geocities, and Friendsters. Failure, not success, is the usual outcome for risky courage.

Watching Jeopardy, that one player who lost everything didn't give up. He slowly won back the money he lost, then in Final Jeopardy, gambled everything again. He turned out to be the only player to recognize a question about Dashiell Hammett, and finished the game with almost four times as much as the second-place finisher. In under thirty TV minutes, he went from wearing egg on his face to becoming the reigning champion.

Bill Gates

But that outcome wasn't inevitable. He won, not only because of his own wide-ranging knowledge, but because the rules called a halt to the game before he had a chance to lose everything again. He benefited from writers whose questions corresponded with his backlog of trivia knowledge. And he handled the buzzer effectively, something many past players have said isn't easy.

When Elon Musk and his giddy evangelists call him a self-made billionaire, they overlook the environment that created his wealth. As Giblin and Doctorow write, the monopsony economy that makes Musk's companies possible results from public policy and social order. Regulations written to prevent past economic abuses become barriers to entry for small start-up entrepreneurs. Inequality festers unchecked.

This doesn't mean risk-takers don't deserve reward. Musk, Bezos, Gates, and others did indeed risk their, or their investors’, seed capital, and could've eaten dust. But that doesn't mean they work alone. They used others’ skills, labor, and time. They benefited from technicians trained in state schools. Many, like Musk, received direct government subsidies to offset the costs which their risks carried.

American rhetoric in support of capitalism emphasizes the goodness of taking a risk. But in practice, our economy doesn't reward those who take a risk, but those who win. And victory is always socially conditioned. Success means what customers pay for, not what billionaires prefer. There's literally no difference between Microsoft and Sun Microsystems, except that one guessed right.

Friday, September 6, 2024

An Oasis in the Desert of Reality, Part Two

The original Oasis lineup from 1994, as per NME
This essay is a follow-up to An Oasis in the Desert of Reality

The recently announced Oasis reunion was followed almost immediately by something that’s become widespread in today’s music industry: price gouging. Euphemistically termed “dynamic pricing,” the online structure lets ticket outlets jack prices commensurate with surging demand. This means that, as tickets became available for the paltry seventeen dates around the UK and Ireland, computers hiked prices to £350 ($460) per ticket, outside their working-class audience’s budgets.

American audiences could summarize the exorbitant prices with two words: Taylor Swift. Tickets for Swift’s Eras Tour originally sold for $49. Quickly, however, a combination of forces, including the Ticketmaster/Live Nation merger, unauthorized resale, and limited access, bloated prices to over $4000 in some markets. Swift’s mostly young, mostly female audience base obviously can’t afford such numbers. In both cases, predatory marketing ensured that the fans best positioned to appreciate the respective artists could least afford access.

But both artists share another characteristic: a monolithic grip on their markets. Why? Such concentration perhaps made sense when the Beatles upended the music industry in 1963 and 1964, when fewer media options ensured a more homogenous market. But advances in technology have granted listeners access to more radio stations, innumerable streaming services with nigh-infinite channels, and more opportunities to share music with formerly niche fandoms.

Yet increased diversity produces a paradoxical outcome, embodied in the crushing demand for limited tickets. As listeners have access to more artists, more styles, and the entire history of recorded music, demand has concentrated on a smaller number of artists. I’ve noted this trend before: as the buy-in to create and distribute music has become more affordable, the actual number-one position on the Billboard Hot 100 has become less diverse.

Some of this reflects the way conglomerate corporations throttle access. Sure, entrepreneurial artists can record music in their bedrooms at higher quality than the Beatles dreamed possible, and distribute it worldwide almost for free. But approximately half the music market today travels through one outlet, Spotify. And, as Giblin and Doctorow write, Spotify’s algorithm actively steers listeners away from indie, entrepreneurial, and otherwise non-corporate options.

Taylor Swift in 2023

Three corporations—Sony BMG, Universal, and Warner—currently control over eighty percent of the recorded music market. That’s down from four corporations controlling two-thirds of the market in 2012, the year Universal absorbed EMI. (EMI, in turn, owned Parlophone, the label which published the Beatles.) Without support from the Big Three, artists have limited access to publicity, distribution, or the payola necessary to ascend Spotify’s ladder.

Aspirational musicians might, through good business management, make a middle-class living through constant touring and indie distribution. But without a major-label contract, musicians can’t hope for basic amenities like, say, a full weekend off, much less enough pay to buy a house. And with only three corporate overlords controlling the supposedly large number of major labels, musicians will always have a severe disadvantage.

Sure, the occasional musician might beat the odds and challenge the Big Three oligarchy. Taylor Swift notoriously forced both Spotify and Apple Music to pay overdue royalties to backlist artists a decade ago by withholding her lucrative catalog. But most musicians, even successful artists with chart hits, can’t expect to ever have such influence. Citing Giblin and Doctorow again, many artists with chart hits still bleed money in today’s near-monopoly market.

This structure encourages blandness and conformity, from both artists and audiences. Oasis sounded new and different thirty years ago, but they’re now comfort listening for a middle-aged, middle-income British audience. Taylor Swift samples styles and influences so quickly that she’s become a one-stop buffet with something to please (or displease) everybody. Both artists give audiences what they expect to hear—assuming they can hear anything over the stadium crowds.

Collective problems require collective solutions. Start by supporting politicians who enforce existing antitrust and anti-monopoly laws—and not supporting the rich presidential candidate who brags about not paying contractors. But we also have private solutions, starting with attending concerts and buying merch from local and indie artists who reinvest their revenue into their communities and stagehands. Tricky for some, yes, as most music venues are bars.

The problem isn’t hypothetical; we have changed in response to a diminished market. As Spotify, Live Nation, and the Big Three have throttled the music industry, we listeners have responded by accepting diminished standards, and consuming blander art. Though we have the illusion of choice, we allow the corporations to channel us into a less diverse market, and buy their pre-screened art. Independence is neither cheap nor easy, but it’s urgently necessary.

Wednesday, September 4, 2024

An Oasis in the Desert of Reality

This photo of Liam and Noel Gallagher has accompanied the announced Oasis reunion tour

Perhaps my opinion on the announced Oasis reunion tour is distorted by my being an American who streams British media online. This makes Oasis seem both larger and smaller than they actually were: they planted twenty-four singles in the UK Top Ten, but only one in America. “Wonderwall,” obviously. Two other songs, “Don’t Look Back in Anger” and “Champagne Supernova,” have also had lingering afterlives on alternative radio.

Thus, although I appreciate the panicked rush to purchase reunion tickets, I can’t participate. Oasis represents a place and time, one in which I couldn’t participate because worldwide streaming scarcely existed before the Gallagher brothers stopped working together in 2009. I realize that lost era means something important to those who lived through it. However, the timing seems exceptionally pointed, knowing what I do now.

The Gallagher brothers announced their reunion tour as Oasis almost exactly thirty years after the release of their record-setting debut album, Definitely Maybe. Released on 29 August 1994*, this album outsold every British debut album up to that point, and all four singles went Gold or Platinum. Though Oasis wouldn’t have an American breakout until their second album, they didn’t need one; they already had Beatles-like acclaim on their first try.

Nostalgia vendors always seem to think things were Edenic approximately thirty years ago. As I grew up in America in the 1980s, popular media regaled me with giddy stories of how wonderful things were during the Eisenhower administration. TV series like Happy Days and MASH, or movies like Stand By Me and Back to the Future, though they commented on then-current events, nevertheless pitched a mythology of prelapsarian sock-hop ideals.

Importantly, these shows all spotlighted not how the 1950s were, but how the aging generation remembered them. MASH endeavored to humanize the horrors of war, during a time when Vietnam was still too current for commentary, but it did so in ways that only occasionally included any indigenous population. Happy Days was set in Milwaukee, Wisconsin, one of America’s most racially segregated cities, but featured almost no Black characters.

This whitewashed nostalgia reaches its apotheosis of silliness with Back to the Future, in the prom scene, where White Marty McFly shows a Black backup band how to really rock out. The maneuvers he tries might’ve looked shocking to the White kids on the dance floor, because they lived pre-Jimi Hendrix. But they were hardly new; Marty didn’t do anything Charley Patton and Son House hadn’t invented in the 1920s.

For Oasis to reunite almost exactly thirty years after they deluged British music puts them in the same position. The announced reunion lineup currently includes only the Gallagher brothers, omitting the revolving door of sidemen and rhythm sections that supported them for fifteen years. The band was contentious even before the brothers stopped working together; they never produced a dark horse, like George Harrison, to emerge from the wreckage.

I’ve written before that the nostalgia impulse produces the illusion of inevitability. Because events happened a certain way—because the Beatles hit number one, because Dr. King bested Bull Connor, because America beat the Soviets to the Moon—we believe things had to happen a certain way. This removes human agency from history, giving us false permission to go with the flow, like a dead salmon floating downstream.

Oasis hit the mainstage when John Major’s Tory government was walking wounded. Though Major had received a Parliamentary mandate in 1992, he was already deeply unpopular by 1994, helping Thatcherism limp timidly across history’s finish line. Like their beloved Beatles, who broke just as Alec Douglas-Home’s doomed premiership ushered the Tories out, Oasis arrived to supervise a Conservative collapse.

And now they’re reuniting as Rishi Sunak has nailed shut another Conservative coffin.

But history isn’t inevitable. Tony Blair’s nominally progressive government, like Bill Clinton’s, openly embraced moralism and militarism, before ending in the massive moral sellout of Operation Iraqi Freedom. Likewise, Keir Starmer is possibly the blandest person ever elected PM, winning only because the Tories squandered every advantage. Starmer, like the Oasis reunion, bespeaks a rare British optimism, without much of a plan.

I’d argue that most people don’t really want an Oasis reunion. I strongly doubt anyone wants to watch two White men pushing sixty warble about the troubles of 1994 onstage. Their largely British audience simply wants to time-travel to a moment when they didn’t have to know what Brexit was, who Boris Johnson is, or why a down payment on a London house is triple the average annual income.

*Out of respect, I use British dating conventions in this essay, and only this essay.

Friday, August 30, 2024

Some Overdue Thoughts on Neil Diamond

Neil Diamond in 1971, the year he
released “I Am… I Said”

I started ragging on Neil Diamond’s 1971 top-five hit “I Am… I Said” years before I heard it. Despite its high Billboard ranking, it generally isn’t regarded among Diamond’s greatest hits—let’s acknowledge, it’s no “Solitary Man” or “Sweet Caroline.” It doesn’t get extensive classic rock radio airplay like others of Diamond’s peak career recordings. Even for many fans, it’s largely a cypher.

Therefore, when humorist Dave Barry made it a recurring theme to belittle Neil Diamond in general in the 1990s, and “I Am… I Said” particularly, I didn’t blink. I knew Barry’s mockery was exaggerated for comic effect, because no matter how earnestly over-written Diamond’s hits were, hell, the man still wrote “I’m a Believer” and “Cherry Cherry,” and I’ll fight you if those aren’t classics. But “I Am… I Said”? Surely radio programmers buried it on purpose.

Barry quoted Diamond’s lyrics extensively, particularly the central hearing-impaired chair. He said nothing about Diamond’s music, his life, or the cultural context amidst which Diamond wrote. Barry simply threw out Diamond's refrain lyrics, which aren’t exactly Robert Frost. Without context, and especially without the more subdued stanzas surrounding the refrain, the lyrics looked bathetically ridiculous, like an Angora cat in the rain.

Superficially, I had no reason to believe Dave Barry wasn’t representing Neil Diamond accurately. If I’d thought more deeply, I would’ve realized Barry also pooh-poohed “Cracklin’ Rosie,” which is maybe a bit overproduced but seriously still slaps. Cool, rational thought might’ve told me that, if Barry disparaged a banger like “Cracklin’ Rosie,” maybe his representation of “I Am… I Said” wasn’t wholly reliable.

In my limited defense, I hadn’t turned twenty yet.

Years later, I finally heard the song. When my local radio station started playing the opening riff and first stanza, I clearly identified it as belonging to the 1970s, a decade when hippie utopianism began surrendering to ennui, age, and the realization that it required more than optimism to change the world. Though most artists didn’t record anything quite this melancholy until after 1973, it’s instantly recognizable as a product of its time.

More importantly, “I Am… I Said” is pretty good. It isn’t Diamond’s best, not in a career that produced classics like “Red Red Wine” and “Kentucky Woman,” but it’s a substantial glimpse into the psyche of a man facing his own age and mortality. The contrast between Diamond’s understated, more poetically complex stanzas and the ostentatious orchestra behind his choppy refrain presages later anthems to adult futility, like Nirvana’s “Smells Like Teen Spirit.”

Neil Diamond in 2018, the year his
health forced him to retire from touring

I believed Dave Barry’s criticisms in the 1990s because I hadn’t yet heard Diamond’s song, and I presumed Barry represented the song accurately. I realized Barry, a humorist, might privilege the joke above facts. Yet in 1993, when one couldn’t check YouTube or Spotify to verify the source, I chose to assume Barry was essentially honest. I adopted Barry’s jokes as my own opinion, and repeated them for nearly thirty years.

We all sometimes adopt others’ opinions as our own. Nobody can possibly have encyclopedic knowledge of, say, climate science or presidential politics or big-ticket TV productions. We must trust scholars, critics, friends, and others. When that happens, we must obviously evaluate whether that person’s opinion is trustworthy enough. Is the scholar scholarly enough to be reliable? Has the movie reviewer seen enough movies?

Dave Barry is probably the funniest White person of my lifetime, a man who often extracted comedy from well-written descriptions of furniture. He commanded language to cultivate emotions in readers, without depending on voice and performance, a mark of somebody who thinks deeply about every word and phrase. Because he commanded written English with an ease I find enviable, I presumed Barry must’ve thought equally deeply about his subjects.

It never occurred to me that Barry might’ve misrepresented his subject, or omitted information that would’ve influenced my opinion, such as Diamond turning thirty, divorcing his high school sweetheart, or having little to show for his career. I trusted the evaluation of a critic who, it appears, was more invested in the joke than the facts. Barry’s take-down of Diamond’s lyrics remains hilarious, but frustratingly divorced from reality.

This forces me to ponder: what other untrustworthy “experts” have I trusted? As an ex-Republican, I certainly shouldn’t have trusted P.J. O’Rourke and Thomas Sowell, who influenced my early politics. My parents admitted the ideas they taught me were often informed by fear. Much of adulthood involves purging false teachings from untrustworthy mentors who concealed their agendas.

And that chair totally heard you, dude.

Monday, May 6, 2024

The Matrix and the Messianic Lie

Much of the advance publicity surrounding The Matrix Resurrections focused on Act One’s satirical nature. The movie mocked the production house, Warner Bros., by name, for their demand for a lucrative sequel, whether the art demanded it or not. Warner, in a remarkable show of grace, leaned into that mockery and included it in the PR packet. Sounds cool, I remember thinking, but not compelling; I’ll wait for home video.

Now that it’s streaming, I find myself struck by what the PR omitted: a much more fatalistic tone, admitting that the original trilogy’s prophecies fell flat. The original movie remains relevant and talked-about a quarter century after its release, but its messianic promise of deliverance from corporatized autocracy seems naïve now. We haven’t escaped the machine, this movie warns us. If anything, it’s stronger now than it appeared in 1999.

The movie begins with Neo, having returned to his pre-liberation name of Thomas, working a soulless corporate job, like he did in the first movie. Instead of toiling in the anonymous cube farm, however, he now occupies the corner office, and has personal confabs with the corporate straw-boss. But he’s profoundly dissatisfied, treating his malaise by chronically overdosing psychiatric medications. Then word comes that Warner wants a sequel.

Fans embraced the first movie for two defining characteristics: cutting-edge visual effects, and long, maundering philosophical monologues. We who are old enough to remember the first movie, without the historical baggage that followed, probably remember feeling almost vindicated by the Wachowskis’ take on the decade between the Cold War and 9/11. That decade was slick, shiny, and frequently fun, but also stultifyingly boring in its safety.

Thereafter, some aspects of the first movie seemed downright prophetic. Scenes of urban destruction and gun violence looked eerily like footage of both 9/11 and the War on Terror. It’s hard to watch Neo and Trinity piloting a helicopter gunship into a skyscraper, without remembering the street-to-street fighting of the Siege of Fallujah. The videogame-like violence would become only worse as footage of American drone strikes became cable news fodder.

Unfortunately, if the first movie evidently recognized the boredom of safety, and the violence which boredom begets, the sequels fell flat. They pushed heavily on images of Neo as messianic deliverer, whose unique person promises to challenge the system. As Agent Smith increasingly possesses everyone and everything, remaking the Matrix in his self-serving image, the movie promises that Neo, operating alone, will reset the changes and release the captives.

Christians worldwide have, certainly, believed their singular messiah would bring that promised deliverance. But often, instead of acting boldly in trust that Christ would vindicate the just, Christians used the promise of future deliverance to sit idly by, expecting Jesus to fix everything. Christians tolerated, if not outright participated in, war, slavery, exploitation, and empire. Molding ourselves to the world is okay, if the Messiah will triumph eventually.

Mass media in the post-Matrix decades has embraced the Chosen One myth. Rey Skywalker, Captain America, Katniss Everdeen, and even Keanu Reeves’ own John Wick have raced headlong into pits of vipers which seem insuperably large, and emerged triumphant. Despite occasional interludes, like the cinematic Les Misérables, American corporate media keeps promising a singular messiah that will redeem us from… well, from America, mostly.

Meanwhile, The Matrix Resurrections acknowledges directly that conditions have gotten worse. English-speaking conservative parties repeatedly promise to actively make oppression more oppressive, while progressive parties limply pledge that, under their supervision, things won’t get much worse. War, disease, and poverty have gotten worse, not better, since 1999. As Ian Haney López writes, the machinery of oppression proves infinitely capable of adapting to every direct challenge.

This movie takes that adaptability literally. A machine learning heuristic seizes the heroes that previously challenged the Matrix, and turns them into the Matrix’s driving engines. It uses the very human embodiment of justice to fuel injustice—shades of Republicans appropriating one orphan MLK quote to proclaim themselves the real arbiters of fairness. This movie admits that we who believe in freedom must adapt faster than the oppressors.

Hovering over San Francisco streets, Neo realizes something the original trilogy missed: if he hoards the power of salvation, then he’s already failed. The messianic impulse may begin with one individual, but unless it radiates outward, unless others join the kingdom as priests and kings themselves, messianic deliverance will never arrive. Humans, even messiahs, eventually die. The final shot abjures the sequels, and restores the “we”-centered salvation the first movie promised.

Friday, October 27, 2023

Barbie, Disability, and the Death of Formal Rhetoric

Ben Shapiro expresses his well-thought-out opinion in a totally reasonable manner.

Aristotle defined rhetoric as “the capacity to discover the possible means of persuasion concerning any subject.” Pointy-headed and abstruse, yes, but a reasonably concise description of what I tried to teach in Freshman Composition. When structuring our language around contentious issues or painful controversies, we must think in terms of what will persuade our intended audience. That standard is often subjective, and moves almost whimsically.

Two recent events have re-centered this difficult process for me. I recently witnessed an unpleasant online dispute quickly spiral out of control. A disabled person noticed that a friend’s anecdote about helping a disabled stranger contained certain ableist prejudices. The story was well-intentioned, but fit a genre of short narrative sometimes disparaged as “inspiration porn.” All such stories mean well, but misfire by showcasing able-bodied generosity over disabled autonomy.

As sometimes happens when disadvantaged persons ask for consideration, some observers saw this criticism as personal attack. Like The Former President, who saw kneeling football players as disloyal “sons of bitches,” the OP’s friends closed ranks defensively, lambasting the critic for “attacking” their friend and “ripping him to shreds.” The defensive posture became so energized that they persisted even after the OP cautioned them to back off and cool down.

When Ben Shapiro, the massively online full-time professional offense-taker, protested Greta Gerwig’s Barbie movie this summer by lighting two Barbie dolls on fire, he apparently thought he was making a serious point. The entire internet, however, responded with aggressive disdain. Shapiro evidently thought this fire was an appropriate synecdoche for his internet-friendly outrage. But even his staunchest allies had little support for a petulant boy destroying his sister’s toys.

Shapiro makes most of his living doing personal appearances on college campuses, engaging undergraduates in “debate.” He organizes his public persona around the motto “Facts Don’t Care About Your Feelings.” Superficially, Shapiro seems to advocate Aristotelean rhetoric as persuasion through evidence. Yet the Barbie incident demonstrates something Shapiro’s critics have long noted: he cares only about winning, usually by personally demolishing anyone who disagrees with him.

Proof of the standing stereotype
of what constitutes a disability

The disability debate pushed me into an awkward position. Both the OP and his critic are friends, whom I respect dearly. I struggled to triangulate a position where I supported both while clarifying that I considered the criticisms justified. This meant finding ways to say “you’re wrong” without making the statement personal, and managing the feelings of defenders whose emotions already ran high. Therefore my participation mainly consisted of overthinking and extended paralysis.

Rereading the debate afterward, I noticed something I missed in the moment: the critic and the defenders kept talking past one another. The critic offered copious evidence, including cited sources and hyperlinks. The defenders hand-waved all the evidence, focusing on the perceived personal slight in the original callout. Because the critic intended no personal slight, she never addressed it. Therefore, both sides’ core concerns never got addressed.

When Ben Shapiro mistakes destroying toys for pitching an argument, the core problem probably resides in who he thinks his audience is. Shapiro has garnered acclaim by performing stunts designed to embarrass progressives and dissidents. Such displays help unify his hard-right audience and create a base primed to listen (and, importantly, to buy his advertisers’ sketchy products). But it’s more likely to alienate anyone who doesn’t already agree.

In other words, Ben Shapiro, his Daily Wire media company, and similar massively online conservative outlets like Daily Caller and The Blaze, create loyalty to an ideological brand. As I’ve noted before, these outlets generate an almost religious sense of unity. Sure, the ideological sense of aggrieved White masculinity coaxes new converts through the door. But once inside, the politics generally matter less than the sense that we’re traveling together.

That, I realized (with some pain), happened with the disability debate. While the critic attempted to structure a formal argument supported with evidence, the OP’s defenders formed a perimeter around group loyalty. Rereading the previous sentence, I realize it sounds pejorative. Not so; when disadvantaged groups face systematic challenges, group membership enables them to organize and support one another.

Don’t misunderstand me: I’m not here to call out or condemn anyone individually. Rather, to return to classical rhetoric, I believe the two groups had “mixed stases,” that is, they were having two different arguments. But that’s become the problem with online ideology. Too often, we care more about defending the group than seeking the truth; that goes double for us White able-bodied cishet males. The group becomes paramount; the truth gets lost.

Wednesday, September 20, 2023

Richard Madden and the Systems of the World

Richard Madden in Citadel, with Priyanka Chopra Jonas

Watching Amazon Studios’ recent over-the-top spyfest Citadel, I couldn’t help wondering why the protagonist, Kyle Conroy, looked suspiciously familiar. Oh, yeah, because he’s played by Scottish actor Richard Madden, who attracted global attention in 2018 when the ITV/BBC thriller Bodyguard became an international streaming sensation. Though Madden plays Conroy with an American accent, both stories feature Madden as a war-scarred veteran dragged back into somebody else’s war.

These two vehicles play very differently. Bodyguard is a conventional British police drama: gritty, unsentimental, and character-driven. Citadel is campy and overblown, despite its largely serious tone; it resembles the unintentionally silly James Bond films that murdered Pierce Brosnan’s take on the character, and prompted the series reboot with Daniel Craig. Bodyguard is often visually murky, with jarring handheld camera work, versus Citadel’s oversaturated colors and elaborate sound design.

Importantly, Bodyguard features real-world politics. Madden’s character, Police Sergeant David Budd, fought in Afghanistan, and now works for London’s Metropolitan Police. He’s assigned to protect the Home Secretary, a powerful office within Britain’s Cabinet. Early episodes contrast Budd’s PTSD scars with Secretary Julia Montague’s strict authoritarianism; after an abrupt tonal shift, later episodes pit the Met’s civilian Counter Terrorism Command against MI5’s militarized Security Service.

Citadel features two fictional intelligence agencies. Both the titular Citadel, of which Madden’s Conroy discovers he’s a deep-cover agent, and the enigmatic Manticore believe themselves heroic. Citadel hunts and bags potential terrorists, while Manticore hunts Citadel, which it believes has grown corrupt. Both agencies have elaborate technology, an army of agents, bottomless funds, and global reach, despite being non-state actors. Who, we wonder, bankrolls these feuding Illuminati groups?

What these series share, besides Richard Madden, is a prior assumption that massive, shadowy systems control our lives. David Budd must investigate crimes which could destabilize British government, fighting an enemy that can make evidence vanish from locked rooms and air-gapped computers. Kyle Conroy (dba Mason Kane) must unlock secrets which two quasi-legal agencies want buried, many of which involve himself. Both men ask: am I sure I’m representing the good guys?

From the mid-1980s to the mid-2000s, multiple mass-media properties asked whether our lives are falsified. The Matrix, Dark City, and Star Trek’s later holodeck episodes spotlighted the idea that “reality” is only what we accept as reality, and powerful people can deceive our senses to condition our acceptance. Citadel and Bodyguard signify a shift away from reality itself, onto the people who control our ability to perceive reality.

Richard Madden in Bodyguard, with Keeley Hawes

We live, both series imply, beneath powerful structures that speak in our names, and make moral decisions for us, but which we don’t control. Nobody elected the Citadel, and while Secretary Montague was elected MP, she achieved her executive position through intra-party horse-trading. Violence, strategic deception, and force of law compel us to accept these unelected power structures, because we can do nothing about them except join opposite-number violent organizations.

Perhaps these themes are unsurprising. As we’ve acknowledged systemic concerns like “structural racism” or disaster capitalism, we increasingly understand how little individual control ordinary people have. Politics, economics, and war aren’t gods we can petition in temples; they’re forces, like hurricanes, that destroy everything they encounter. Doing right in politics or economics changes nothing, because we’re individuated and lonely, and the forces are systemic, impersonal, and huge.

Bodyguard and Citadel drew my attention because of Richard Madden, demonstrating how essentially powerless Madden’s characters are, despite their shared dedication to law and justice. But once aware of these themes, I started seeing them everywhere. Heart of Stone, a Netflix showcase for Gal Gadot, features a similar non-state intelligence agency that pervades everything, yet is so elusive that even MI6 can’t root it out.

The recent Equalizer movies with Denzel Washington, Netflix’s The Gray Man with Ryan Gosling, and the Mission: Impossible movies mostly don’t endow non-state actors with the kind of reach (and finances) only available to governments. However, they frequently feature government corruption, incestuous relationships between money and power, and people who profit unfairly from the status quo. These malefactors oppress our heroes, who often go rogue to root out corruption.

Yet these heroes are defined as much by what they can’t do as by what they can. There’s no Chosen One, no Neo or Luke Skywalker to establish a just world. Rachel Stone, Ethan Hunt, Robert McCall, and Richard Madden might remove corrupt operators, but they can’t dismantle unjust systems. They (and therefore we) can only reset broken systems to the status quo ante. Reality now exists, but reality is historically unfree.

Monday, February 27, 2023

Life Under the Un-Free Market

Rebecca Giblin and Cory Doctorow, Chokepoint Capitalism: How Big Tech and Big Content Captured Creative Labor Markets and How We’ll Win Them Back

Only five corporate conglomerates publish about eighty percent of America’s books, a concentration of power largely unchanged since the 1990s. But only one company, Amazon, sells about half of America’s books, and nearly all of America’s self-published and indie books. Australian law professor Rebecca Giblin and Canadian author Cory Doctorow consider this squeeze to be a harbinger of future first-world economics, if something doesn’t change soon.

Concentrations of market authority create what our authors call “chokepoints,” where one company has immense power over consumer access. They admit “chokepoint” is a casual term, and offer the more technical one: monopsony, a little-known economic arrangement in which one or a few buyers control a market. Most monopsonies are also monopolies, since they resell their captive market to us, the general public, who can’t negotiate a better price.

Similar market concentrations obtain throughout the culture industry: three record labels publish most music you listen to, but most audiences now listen mostly through Spotify. And the Ticketmaster/Live Nation merger gave one company a death grip on A-list live music. Five studios own most of Hollywood, but even they’re beholden to the Big Four talent agencies. And Google’s stranglehold on the online ad market gives them power over nearly everyone.

These monopsony markets didn’t just happen. Their controllers manipulated market conditions to their advantage, often by misusing antitrust regulation in ways that contradicted why those regulations were written. But since the 1970s, a new market philosophy has dominated. Invented by Robert Bork and popularized by Chicago School economists, the dominant bipartisan theory holds that monopolies and un-free markets are acceptable, provided consumer costs remain low.

Markets, our authors remind us repeatedly, don’t objectively exist. They consist of laws, traditions, ad hoc regulations, and personal agreements, and therefore we humans invent and reinvent markets constantly. This book’s first half dismantles the long processes through which powerful people have captured markets—and market regulations!—to serve the wealthy resource owners, usually at the expense of creative workers, and to the detriment of their audiences.

Rebecca Giblin (left) and Cory Doctorow

The second half switches gears, tackling ways creative professionals have challenged the monopsony market, and potential ways these approaches could expand. Our authors don’t like traditional liberal regulatory approaches. They extend limited praise to the Biden Administration, which has detailed committed professionals to enforce antitrust regulations which have lain mostly dormant since the 1970s. But those regulations were written, they note, for 19th Century markets.

For instance, though union drives are lawful and protected in America, that only applies to directly employed workers. Independent contractors and freelancers can’t organize, because under 19th Century antitrust law, that’s a “price-fixing cartel,” and totally illegal. Unfortunately, many creative professionals aren’t directly employed. And where they are directly employed (as under the Hollywood studio system), they’re frequently employed through agencies, making their contract status legally squishy.

Our authors prefer a hybrid approach. They definitely believe collective action holds the greatest promise for fixing market concentrations, because where the wealthy control markets through money, that money doesn’t mean much if it can’t buy human labor. Collective action requires quick thinking, though: for every solution creative professionals invent, Big Tech capitalists have demonstrated their ability to eventually manipulate it to their advantage. Modern problems require modern solutions.

They also describe public shaming operations which have worked well. For instance, when Disney bought Lucasfilm, they claimed they bought the rights to everything Lucas did, but not necessarily any obligation to pay creative workers. A loose affiliation of writers and other creatives, led by novelist Alan Dean Foster, pushed a hashtag campaign that humiliated Disney into paying its workers and honoring its contracts. Shaming the rich evidently works.

I’d go further. This book shipped before the Ticketmaster/Live Nation catastrophe saw Taylor Swift tickets going for $4000. Our authors describe how Swift previously motivated her mainly young, energetic audience to overturn Spotify’s pay structure, and distribute royalties fairly. Now, that same audience has pushed Congress to hold corporate America accountable, for the first time in two generations. It’s early to say, but this may be the beginning of a movement.

These authors describe the consequences of monopsony economics on creative workers, and how creative workers have resisted. This may seem abstruse to some audiences. But they establish in Chapter One, and reiterate throughout, that the creative industry is a bellwether for the economy overall. The same forces are already spilling into gig work, and may displace white-collar work soon. This isn’t purely academic; it’s about your economic future, and mine.

Monday, February 28, 2022

The Terminator, the TechNoir, and the Scars of a Generation

When I say “the TechNoir scene in The Terminator,” every GenX reader knows exactly what I mean. Sarah Connor escapes Kyle Reese, whom she mistakes for a violent stalker, by hiding inside a neon-fronted dance club. She sits uncomfortably at a table, awaiting police intervention, while dancers kick up around her, their brightly colored party clothes a stark contrast to the dimly lit club. Unfortunately, waiting patiently makes her a sitting duck for her real attacker, Arnold Schwarzenegger’s Terminator.

This weekend, Vladimir Putin, Russia’s semi-elected dictator-for-life, put Russia’s nuclear arsenal on “special alert.” Putin pulled this move, not in response to his army’s inability to quickly take smaller, less populous Ukraine, but, he said, in response to Western rhetoric surrounding his invasion. That is, Putin probably won’t nuke Ukraine, with its valuable farmland and mineral resources; the implied targets are any NATO countries that come to Ukraine’s defense. He’s put the world on a Cold War footing.

It’s impossible to separate The Terminator from the Cold War. Though the United States had dropped the feel-good maneuver of duck-and-cover drills, with their illusion that one could survive a nuclear blast, by the time I hit elementary school in 1980, we were still daily conscious that the bombs could fall at any minute. Reagan-era rah-rah confidence was double-edged: hard work and determination could take us anywhere, but we were also constantly about to die.

American youth handled this fatalism in divided ways. Much recent 1980s nostalgia, like Netflix’s Stranger Things and ABC’s The Goldbergs, focuses on the brightly colored flamboyance that got captured in a lot of photographs. But smarter critics than me point out that bright colors and garish prints were the exception, not the rule; much of the 1980s was brown, from cars and restaurant decor to the clothing adults wore daily. It was a drab-colored era.

We see this spotlighted in the TechNoir scene. The characters are dressed ostentatiously, the colors bright and the prints vibrant. But the room itself is poorly lit, a problem accentuated by the low-hanging pall of cigarette smoke. Several dancers prance around, doing a strange pony trot in place, bent at the waist so they’re studying their partners’ shoes. It’s almost as though, in the midst of the party atmosphere, the dancers are fleeing something dark and terrible.

This weekend’s nuclear alert pushes the world closer to full nuclear conflict than at any time since Able Archer 83. In 1983, NATO forces held a joint preparedness wargame in the North Sea, between Scotland and Norway. The only problem was, nobody bothered to warn the Warsaw Pact. Viewed from Moscow, the wargame looked like forces organizing for an amphibious assault. Moscow prepared to defend itself from the anticipated attack, with nukes if necessary.

It’s possible to find 1980s movies about nuclear war caused by malice, and in most cases, it’s presented as Soviet malice. Red Dawn and The Day After showcase military forces ranging from incompetent to downright evil. But following Able Archer 83, other movies depicted a military machine unprepared for rapidly changing technology. The Terminator, like WarGames, depicts an American military arsenal hijacked by intelligent computers, and humans unprepared to stop the terror they had created.

Faced with this reality, American youth split. Consider The Breakfast Club, and particularly the collision between Claire Standish and John Bender. Two over-the-top personalities, they represent opposing responses to the times. Claire dresses garishly, lives for sensual pleasure, and skips school to go shopping. Bender wears muted colors, dresses like he’s preparing for a street fight, and antagonizes others. Both manners are nihilistic responses to the expectation that, in nuclear-armed times, there is no future.

The fact that Claire and Bender kiss at the movie’s culmination serves the same purpose that dancing in the TechNoir provides. Of course these two share an attraction; notwithstanding Bender’s abusive behavior, they share fundamental values. Bright colors dancing in the dark, or the brawler and the princess exchanging saliva; it comes to the same end either way. To the extent that the 1980s were garish-colored, it was because those living then expected to die.

The Terminator, with its 1980s collision of party culture and violence, gave way to Terminator 2, in which John Connor starts the movie zooming around, looking for a purpose. The threat of nuclear annihilation was terrible, certainly. But at least it gave us something to live for right now, which the subsequent peace didn’t. The grouchy snarling of 1990s pop culture represented a culture cut loose. Unless something happens, it’s the future we now have.

Thursday, December 23, 2021

J.K. Rowling and Modern Inflexible Morals

A college acquaintance shared a milestone on Facebook this week: her grade-school daughter had read the entire Harry Potter series, in hardback, cover to cover, in under four months. This glimmering-eyed girl finished the books not because some authority had assigned them, or because they provided some material advantage, but simply because the experience of reading brought her joy. Her broad smile, posed with her books, was frankly an inspiration.

She shared this accomplishment the same week that Harry Potter creator J.K. Rowling continued her years-long pattern of punching herself in the face on Twitter. Rowling, a longtime supporter of the Labour and Scottish National Parties, with their broad center-left positions, has cultivated an audience somewhat more progressive than the general population. Which makes this week’s tweets, frankly too distressing to reproduce here, even more upsetting to her core fans.

How to reconcile these two halves? On one hand, Rowling’s works encourage young people to sit patiently with a book, practicing skills of patience and self-discipline that aren’t always rewarded in today’s media-saturated culture of instant gratification. On the other hand, Rowling, once a darling of the economic left, has expressed opinions so reactionary, devoted fans have described her as “transphobic, hateful, harmful and factually inaccurate.”

Her works have attracted audiences of diverse ages for the same reason my friend’s daughter enjoyed them: they’re a pleasure to read. And I mean that on multiple levels. One can appreciate them as childlike diversions, the belief that a magical wonderland exists in parallel with the everyday banality around us. Or as modern parables of ethics, responsibility, and justice. Or as an attempt to reckon with Britain’s wartime history.

This complexity partly explains their appeal. My friend’s daughter probably enjoys these novels for reasons entirely her own, but by having read them, she opens herself to the nuance that comes from rereading them in her future. Like life itself, the interlocking layers of Rowling’s story permit events to reflect the reader back at herself. We, like Harry Potter, are constantly changing, constantly learning, and the story reflects that back to us.

I’ve written before that it’s foolish to expect great artists to be good people. The churning mass of undigested trauma that usually turns humans into artists also generally leaves lifelong scars on souls; the personal stench that follows time-honored authors, from Christopher Marlowe to Ernest Hemingway to [fill-in-the-blank], is inseparable from their creative process. Healthy, well-adjusted minds often aren’t prepared to wallow in the psychological refuse where art is conceived.

It’s tempting to spew time-tested bromides about separating the artist from the art. But in Rowling’s case, that’s virtually impossible. The combination of unprecedentedly popular novels and modern social media has provided Rowling a public platform almost unmatched in world history. The same Twitter feed she used to dribble out story revelations, she now uses to grandstand opinions which her audience finds execrable.

Thus I’m no closer to resolving the underlying conundrum. Like Orson Scott Card before her, Rowling herself lacks the underlying nuance and self-reflection embodied in her work. Both authors reassured youthful readers that their ability to navigate an innately abusive social order didn’t make them as bad as the world around them. Both authors encouraged readers to perceive themselves as something beyond the aggregate weight of their actions.

Yet both authors reduced themselves to slovenly caricatures. Both jettisoned their accrued cultural goodwill and chose to focus with laser-like acuity on forms of queer sexuality that, frankly, didn’t affect them that much. Their works remain in print, testimony to aspirational virtues that, sadly, the authors don’t seem to share. Youth, and former youth, who read and learned from these books, are left to wonder: what now?

By encouraging my friend’s daughter to perceive reading as a meaningful good in its own right, Rowling has accomplished something worthwhile. Today’s culture doesn’t reward slow, self-contained tasks like reading. Our economy insists anything worth doing should be monetized, while our technology and entertainment industries encourage instant gratification. In teaching a rosy-cheeked grade schooler to value reading, Rowling has encouraged her to be a deeper thinker and better person.

That accomplishment, however, cannot be read separately from Rowling’s odious opinions. She preaches a doctrine that insists some people are inherently untrustworthy and must be punished in advance, a position definitely more Voldemort than Dumbledore. Why, it’s almost like Rowling is neither all good nor all bad, neither saint nor sinner. This ambiguity is unacceptable in today’s stark moral climate.

Maybe that’s the lesson to take from this situation.

Friday, July 30, 2021

The Return of Cartoon Morality

A promotional still from Netflix’s Masters of the Universe relaunch

Masters of the Universe wasn’t really my thing in the 1980s. I watched it occasionally when it aired, but it mostly ran when my parents expected me to do homework, so I never followed story arcs (such as they existed in Reagan-era animation) or savvied the characters. So when Netflix proudly announced their intent to resume the show where it ended, my first reaction was to shrug.

I realize I’m at the age when the popular culture of my childhood has become mythologized. I grew up watching the Eisenhower Era and its hangover treated as somehow sacred. Nick at Nite reran Mister Ed, The Donna Reed Show, and Rowan & Martin’s Laugh-In with reverence once reserved for Bach chorales and High Church Mass. The Wonder Years was one of network TV’s biggest hits, and Classic Rock Radio first became a thing.

So maybe I should’ve been better prepared, thirty years later, when the mass media icons of my childhood became similarly fetishized. From Michael Bay’s Transformers movies to Masters of the Universe, I’ve watched the chintzy animation of my childhood, little better than commercials when they first aired, turn into half-serious mass-market art. The result has left me curious and dumbfounded.

He-Man and Optimus Prime began life, of course, as toys. With media largely deregulated during the Reagan Administration, the strict boundary between TV content creators and the advertisers who subsidized them melted away. The syndicated weekday cartoons, few of which lasted beyond a single season, mostly existed to sell toys. And I, desperate to fit in, purchased some of them, particularly Transformers.

Looking back, these shows’ storylines had messages I didn’t necessarily catch as a child. He-Man and Optimus Prime both had American accents, with greater or lesser degrees of slow Southern drawl; Prime practically sounded like a stand-up comedian doing a John Wayne impression. By contrast, Skeletor, Megatron, and Cobra Commander all spoke in British Received Pronunciation, offset by their raspy voices. Villains were easy to spot: they spoke like aristocracy with laryngitis.

The morality of these shows had the complexity of comic strips in Boy Scout magazines. Heroes and villains were unambiguous, and heroism or villainy was simply the characters’ respective natures. Villains lived sumptuously, ate caviar, and shat upon their sidekicks, while heroes ate red meat, slept rough, and had tight friend networks. It’s tough to avoid correlating these stories’ simple morality with late-Cold War propaganda.

Michael Bay with Bumblebee, perennially the most popular Transformer

Therefore, though it’s tempting to treat the recent reboots of She-Ra and Voltron as mere nostalgia, what strikes me is the moral complexity these reboots introduce. The revamped Voltron Force’s attempts to maintain enthusiasm for their mission, set against an empire driven by strange and often poorly defined motivations, seem intended more for adults than children. Are we as Americans, the story seemingly asks, the plucky adolescent heroes, or are we the empire?

I have avoided watching Michael Bay’s Transformers movies, not out of love for cinematic grandeur or distaste for Bay’s notoriously excessive spectacle, but because the Transformers represent a piece of my childhood I’m not proud of. I purchased Transformers toys, not because I wanted to participate in their story, but because I thought I might fit in with other boys. I thought I could purchase my way into normalcy, a late-capitalist attitude that’s hard to shake in adulthood.

Therefore, when I see these artifacts of my childhood revamped for adults, with the complex morality that grown-ups know and fear, I’m left puzzled. We adults know that reality doesn’t break down neatly into good and evil, that villainy is contextual rather than innate, and that carefully placed violence doesn’t really make social problems go away. Yet exactly that attitude is being marketed back to us.

Netflix’s decision, with their Masters of the Universe relaunch, to resume the existing story rather than reboot it as they did with She-Ra, makes me wonder who these products are for. Despite attempts to popularize binary morality with stunts like the War on Terror, we just don’t live in the Cold War anymore; pitting bare-chested American virility against the skeletal Red Menace surrogate makes little sense. Who asked for this?

Then I realize: maybe we did. As the oldest Millennials have already turned 40, and Generation X has rollover IRAs, maybe we wish we really had the moral clarity we imagined in childhood. In a world where our greatest enemy is a virus, and where our fellow citizens are the greatest threat to democracy, maybe we want a bad guy we can punch. Maybe we just miss morality.

Wednesday, June 2, 2021

Racism, Ellie Kemper, and History

Ellie Kemper

Twitter, which is frequently vulnerable to the worst forms of swarming behavior, apparently remembered that Ellie Kemper existed yesterday. The Kimmy Schmidt actress netted attention she probably didn’t want when someone discovered she once won a lite-beer “beauty pageant” organized by a St. Louis civic group with racist ties. The usual responses poured forth like a fountain: Kimmy Schmidt is a Nazi agent! Back off Kimmy, she was only nineteen!

The weird unfolding drama (Kemper hasn’t commented as I write) swirled around questions of exactly how culpable social media should hold Kemper, for having ever been involved with such an organization. Twitter’s short character count excises all nuance, of course, so the controversy devolves into painting Kemper, and the organization that rewarded her, with a broad brush. Everything descends, though, to a simple dichotomy: Kemper is, or isn’t, guilty.

I propose this misses the point. Faced with evidence that a historically racist organization controls one of St. Louis’s major annual summer festivals, tweeters are apparently casting around for someone to blame, individually. Put another way, this news presents evidence that systemic racism continues to exist, and picks society’s winners, in a major American city, and instead of decrying the system, tweeters seek a high-profile individual to blame.

Double-checking sources doesn’t make things any easier. Nearly every accusation cites this Atlantic article from 2014, which never mentions Kemper’s name. Moreover, this source says the formerly segregated organization that awarded Kemper this “pageant” (which, apparently, wasn’t competitive) isn’t nearly so bigoted as her accusers suggest. The article suggests its organizers have struggled, somewhat, with the group’s clearly racist past.

Judging from the source, the Fair Saint Louis, formerly the Veiled Prophet Fair, definitely had its roots in the desire to preserve racial and economic privilege. Author Scott Beauchamp describes how founders wanted to exclude the city’s Black population, poor whites, and trade unionists from power. However, Beauchamp also describes how the organization desegregated twenty years before Kemper was crowned, a year before she was even born.

Saying the group officially desegregated itself only says so much, of course. As journalist Sarah Kendzior writes, St. Louis today is a majority-Black city, but economic might and political authority remain concentrated in White hands. If Fair Saint Louis preserves its role in keeping the wealthy connected, nominally letting Black Missourians through the door doesn’t fix anything. I can’t address that; I prefer Kansas City myself.

St. Louis, Missouri

Several years ago, sociologist Arlie Russell Hochschild’s book Strangers In Their Own Land described the experience of mostly White people living in chronically poor parts of America. Hochschild chose her subjects for their poverty, but also because their regions traditionally supported a conservative political agenda that willfully punished the poor. Why, she wondered, do poor Whites seem to vote for their own continued impoverishment?

Ever the scientist, Hochschild avoided assigning explanations to her subjects’ behavior. She simply observed that people conformed themselves to their environments; even transplants from left-leaning areas, many of them registered Democrats, took on the political and religious leanings of their region. Humans, ever the consummate social animals, came to resemble the places they chose as home. Possibly because doing otherwise was like fighting the tide.

(Matthew Desmond wrote something similar: nobody blames the poor for their plight more than other poor people.)

Therefore, I have difficulty blaming Kemper for what happened when she was nineteen. Already an aspiring actress, she showed up in public and participated in a cheapjack attention grab, hoping it would kick-start her career. (As a trained actor myself, one of the reasons my career stalled at the gate was my unwillingness to participate in such displays.) She was part of her community, and comported herself accordingly.

Whether Fair Saint Louis remains racist, despite desegregating over forty years ago, is irrelevant. So is whether Kemper, at nineteen, had enough individual agency to make decisions about history. What matters is that twenty-two years later, Twitter, a notorious engine of self-righteous dogpiling, decided that blaming Ellie Kemper for the persistence of racism in America would fix… well… anything. Because we still think assigning blame equals solving the problem.

Fair Saint Louis was certainly founded in racism and classism. Then it survived its time because people go along to get along. Now, decades later, the exact same conformist mob mentality seeks to blame Ellie Kemper because America’s racist legacy hasn’t gone away yet. The mob can’t handle that individuals can’t sway systems (ironically enough). The Twitter mob wants an individual to blame. The system just laughs.

Wednesday, April 14, 2021

The Cost of Being a Grown-Up During the Pandemic

Arclight Cinemas' flagship location, on Hollywood Boulevard, before the pandemic

When news broke this week that Decurion Corporation, owner of Pacific Theatres and Arclight Cinemas, would shutter its locations, I initially laughed. Corporations like Decurion have participated in the contraction of the entertainment market for decades, to the point that just five companies now control most of Hollywood, and one company, Disney, controls a one-third share. Did another corporation get too massive to withstand economic setbacks? LOL, whose fault is that?

Almost immediately, though, testimonials began emerging from Decurion’s Southern California market base. People mourned, not the big-ticket Hollywood movies they watched at Pacific and Arclight, but the memories they shared of other people. First dates, chance encounters, and more: people love these cinemas because of the people there. That love derives from human experiences; the facilities and their products seem almost ancillary.

Decurion’s closure removes another place we can meet people we don’t already know. As urban designer Jeff Speck writes, the suburbanization of America has narrowed the number of places we’re allowed to enter without prior invitation. If you’re too old for school, your opportunities to meet new people are generally limited to shopping malls and houses of worship. But those places subdivide people according to shared faith or buying habits.

Cinemas, by contrast, opened new vistas. Sure, the movies were mass-manufactured by Hollywood dream factories, but consider the images of moviegoers from the 1940s and 1950s, the years when American religion was calcifying. People wore their finest to the cinemas, because moviegoing was an opportunity to be in public. It meant an ironically unscripted opportunity to participate in society. Until recently, opening-night lines were still a destination.

Multiplexes swallowed locally owned cinemas after the 1970s, just as malls swallowed locally owned department stores, and Barnes & Noble swallowed local bookstores. Yet even as the business dynamic became more concentrated and inflexible, the experience remained valuable. Cinemas weren’t just places people saw movies; people saw those movies together. I’m old enough to remember the 1990s Star Wars rerelease, so this communal experience still happened fairly recently.

Americans are losing places where we have freedom to be something other than our work. Market consolidation narrows our other opportunities too: without local businesses, we’re losing street theater and guerrilla culture, arts and experiences that aren’t subsidized by corporations. We’re losing opportunities for grown-up spontaneity. And it’s happening because opportunities to be an unmediated adult are not profitable.

Speaking purely anecdotally, I’ve lost count of how often I see one question written on Internet discussion boards: “How do you make friends as an adult?” Sadly, the answer is widely documented, if not widely known. Your friends are the people you spend the most time with, so you make friends by spending time with other adults. Night classes, volunteer activities, and cultural events are where grown-ups go to make friends.

Cinema closures are, therefore, symptomatic of a larger loss: places where we can meet strangers without being steered by algorithms. We can stream movies and TV digitally; we can’t stream humans. Multinational conglomerates absorbed local businesses, then digital marketing and the pandemic submarined the conglomerates. In their place, we’re left with Amazon, Netflix, and Facebook, which are valuable in their way, but occupy no physical place. We’re left with no place at all.

I mustn’t forget myself: Pacific and Arclight are for-profit corporations too, and they sell movies from for-profit Hollywood studios. Like other chain stores we’ve lost since 2008, Decurion is a brand, not a friend. Yet notwithstanding their flaws, these cinemas offered opportunities to meet people and have our own experiences, opportunities increasingly neglected when algorithms decide what options we even get to see. Decurion was imperfect, but it served a role.

After the pandemic, we’ll have choices to make. These include individual choices about how much of our personal money to spend locally, but also society-wide choices about how much consolidation we consider acceptable, how much wealth we permit CEOs to accumulate before they threaten society. As half of Earth’s internet ad revenue moves through two corporations, Facebook and Google, and one-third of digital commerce goes through Amazon, these questions matter.

Because when chain cinemas close, kicking more power to Jeff Bezos and Reed Hastings, we aren’t just losing companies. We’re losing opportunities to meet strangers, broaden our horizons, and have grown-up experiences. It sounds hyperbolic to say our worlds are getting smaller, but that’s the evidence emerging from the anecdotes around Arclight and Pacific. These cinemas aren’t places anymore; they’re memories, which we’ll never get back.

Just as important, we’ll never have chances to make new memories.