Thursday, November 13, 2025

Breaking Rust: the Final Boss of Country Music

In my father’s household, the only music was country music. And the only country music was honky-tonk. Lefty Frizzell, Patsy Cline, and especially Old Hank were the only benchmarks of artistic accomplishment. Dad would permit newer, slightly innovative artists like Alabama or the Oak Ridge Boys into the house, but for all practical purposes, he believed good music ended when the Statler Brothers stopped having chart hits.

I remembered Dad’s rock-ribbed loyalty to old-school authenticity this week, when a generative AI song hit number one on Billboard’s Country Digital Song Sales chart. Breaking Rust appears to be an art project combining nostalgic country sounds with still images; music journalist Billy Dukes tried, without success, to track down the originating artist. Critics, journalists, and armchair philosophers are debating what Breaking Rust means for future commercial music.

“Walk My Walk” is a pleasant, but undistinguished, Nashville hat-act song. Rhythmically, it’s a prison work song, and the lyrics tap into a White working-class motif of “hard livin’ won’t break me.” The digitally generated voice has a deep bass rumble with a nonspecific Southern accent, a sort of dollar-store Johnny Cash. Its message of facing poverty and setbacks with dignity and grace goes back decades in country music.

We shouldn’t over-interpret the track’s success. It topped a relatively obscure chart, not Country Airplay or Hot Country Songs. Because the song exists wholly digitally, and the artist is apparently not signed to a Big Three label, Breaking Rust’s success probably reflects the Spotify algorithm. It bears repeating, as Giblin and Doctorow write, that half of all music in America gets heard through Spotify, giving one algorithm extreme control over American taste.

That’s saying nothing about whether number one means anything anymore. I’ve written previously that, far from reflecting public taste or artistic merit, as it supposedly did during the heyday of Elvis or the Beatles, Billboard now often reserves the top slot for the blandest, most committedly inoffensive tracks. Far from the best song out there, today’s number one is the song most conducive to playing as background music while studying or commuting.

Breaking Rust is the smoothest puree of recent country music: its sound, its lyrical themes, even its mid-tempo walking rhythm. I fear I’m repeating myself, as I’ve recently written something similar about AI-generated television. But I’ll repeat myself: AI “art” inevitably results in what mathematicians call “regression toward the mean.” That is, the larger your core sample, the closer your product falls toward dead center.

The AI-generated portrait of Breaking Rust: the ultimate banal Marlboro Man

Yet for algorithm-driven entertainment businesses, regression isn’t a flaw. Country music itself proves this. The genre was founded by hardscrabble people from poor backgrounds, artists who lived the life they described. Hank and Lefty, Jimmie Rodgers and the original Carter Family, were working people who recorded the songs of their friends and communities. Rodgers recorded his final songs literally on his deathbed, because his railroad work had destroyed his lungs.

But around 1963, when the British Invasion caused a sea change in rock music, several former rockers reinvented themselves as country singers. Johnny Cash, Conway Twitty, Jerry Lee Lewis, and even Elvis tapped into the twangy side of their Deep South artistic influences and became country musicians. But in changing their style, they also changed their market: their songs became much more past-oriented and nostalgic.

Growing up, I listened to songs like Cash’s “Tennessee Flat Top Box,” the Statlers’ “Do You Remember These,” and Johnny Lee’s “Cherokee Fiddle,” all of which yearned for a past uncluttered by responsibility, noise, and haste. Country music’s founders sang about working-class rural living—and, in the alt-country enclaves, many still do. But after 1963, naïve yearning for bygone simplicity dominated the radio mainstream.

Even before generative AI, the system reinforced itself. Longing for a simplified past became the soil where new artists grew. Neotraditionalists like Dwight Yoakam, Mary Chapin Carpenter, and Toby Keith built careers around pretending to be older than they were, and wishing they lived in a countrified candyland that only existed in songwriters’ imaginations. Much as I love Johnny and Dwight, their nostalgia became a trap I needed to escape.

In that regard, Breaking Rust is the inevitable outcome. Country music, which began life as White soul music, has been a nostalgia factory since before I was born. Moving out of my father’s house, I had to unlearn the genre’s myths of honest work, stoicism, and White hardship. Other fans chose not to unlearn it, and Breaking Rust has doubled down, becoming the perfect recursive loop of self-inflicted White persecution.

Sunday, February 9, 2025

One Foot in Bakersfield, One Foot in the Future

Dwight Yoakam, Brighter Days

Dwight Yoakam’s best music, especially his hits from nearly forty years ago, has always striven to make him sound older than he really is. His 1986 breakthrough single, “Honky Tonk Man,” was a cover of a 1956 Johnny Horton barn-burner, and his classic records always strove to sound like Bakersfield, circa 1960. But sounding older means one thing when you’re thirty; how does he maintain that strategy now, at 68?

This, Yoakam’s sixteenth studio album, is emphatically not timeless. Yoakam stages a deliberate callback to his own neo-traditionalist roots, appealing to those who think history has gone badly and want to fix its errors. He channels the twangy, backbeat-heavy Central Valley country music that made him famous. In doing so, he attracts an audience who probably shares my assessment that country music went cockeyed somewhere around 1996.

Despite this, Yoakam avoids the modish cynicism that often accompanies older artists recording nostalgia bait. He’s remarkably optimistic, even on the more melancholy tracks, his expressive sadness often transitory. Perhaps this reflects the two pulls on Yoakam’s artistry: he’s always been artistically (and often ideologically) conservative. But since his last recording, he became a father for the first time, at a sprightly 63.

So his sound remains retro, but he has a pointed hope for the future. The album opener, “Wide Open Heart,” has the aggressive chomping chords that made both country and rock sound so distinctive in Southern California in 1960. But it’s a love song, full of “She’s all mine to love” and “Come on let’s get it done.” Except it’s love for his carefully restored, chrome-plated street racer.

Because yeah, Dwight’s old, but he’s young enough to care. This album brims with red-hot emotions for whatever gives Dwight hope enough to keep moving forward. Many of these songs seem dedicated to his wife, Emily. (Despite several high-profile relationships in the 1990s, he never married until 2020.) Others are dedicated to music and touring, plus a cover of the Carter Family classic “Keep On the Sunny Side,” an ambiguous nod to spirituality.

Dwight Yoakam

The sound remains retro, certainly. He loves crunchy acoustic-electric guitars supplemented with a heavy marmalade of Hammond organ, sounding like something the Wrecking Crew would’ve mass-produced sixty years ago. Most songs maintain a steady 4/4 or 6/8 time; you could line-dance to even the album’s slowest, most navel-gazing tracks. “Can’t Be Wrong” opens with almost the same chords Yoakam used on “Please Please Baby” in 1987.

Yet notwithstanding that conservatism, Dwight sees a brighter future. “A Dream That Never Ends,” with its Laurel Canyon vocal harmonies, implies, despite its title, that love isn’t infinite, and somebody might leave—but he insists he’ll keep believing anyway. “If Only” dreams of what could happen if we shed our carefully constructed cynicism, and includes the eminently quotable line: “If only you’d choose love, love would choose you.”

“California Sky” is perhaps Yoakam’s most thoroughly engineered track here, with its Tex-Mex guitars and slight nod to fatalism. “Hand Me Down Heart” is exactly what you’d expect from the title, a lament for the suffering he’s previously endured. But instead of surrendering to despair, he presents his heart as something capable of healing, worthy of redemption. Second chances are real in Yoakam’s world.

You might mistake Yoakam’s title track for a love song, with its unifying Tom T. Hall riff, but he wrote it for his son, who closes it with half-gibberish lyrics. “Brighter days are what you promised me,” he sings: optimistic, but with implied consequences if the promise falls through. “Bound Away” laments the touring musician’s life, like CCR’s “Travelin’ Band,” but with the recognition that “I’m trying to come home, I’m trying to land.”

At nearly an hour, this album runs almost twice conventional LP length. Despite the vinyl revival, Yoakam knows he’ll mostly be streamed or downloaded, and isn’t circumscribed by physical limits. This gives him freedom to play generously with composition and arrangement. Though he doesn’t do anything revolutionary—no Mike Oldfield half-hour experimentation here—he does plumb the full depth of his conservative ethos.

Listening to this album, we’re aware we’re hearing an artist who hasn’t had a Top-40 hit since 2000, and no breakout smash since 1993. His entire career is a nostalgia circuit for aging fans who need reminding why we embraced him so aggressively nearly forty years ago. Yet within that limit, Yoakam expresses an optimism his 1990s recordings sometimes forgot. Dwight’s old, yes, and so am I, but he reminds me that we both still have a future.

Wednesday, September 4, 2024

An Oasis in the Desert of Reality

This photo of Liam and Noel Gallagher has accompanied announcements of the Oasis reunion tour

Perhaps my opinion on the announced Oasis reunion tour is distorted by my being an American who streams British media online. This makes Oasis seem both larger and smaller than they actually were: they planted twenty-four singles in the UK Top Ten, but only one in America. “Wonderwall,” obviously. Two other songs, “Don’t Look Back in Anger” and “Champagne Supernova,” have also had lingering afterlives on alternative radio.

Thus, although I appreciate the panicked rush to purchase reunion tickets, I can’t participate. Oasis represents a place and time I never shared, because worldwide streaming scarcely existed before the Gallagher brothers stopped working together in 2009. I realize that lost era means something important to those who lived through it. However, the timing seems exceptionally pointed, knowing what I do now.

The Gallagher brothers announced their reunion tour as Oasis almost exactly thirty years after the release of their record-setting debut album, Definitely Maybe. Released on 29 August 1994*, this album outsold any previous British debut, and all four singles went Gold or Platinum. Though Oasis wouldn’t have an American breakout until their second album, they didn’t need one; they already had Beatles-like acclaim on their first try.

Nostalgia vendors always seem to think things were Edenic approximately thirty years ago. Growing up in America in the 1980s, I was regaled by popular media with giddy stories of how wonderful things were during the Eisenhower administration. TV series like Happy Days and MASH, or movies like Stand By Me and Back to the Future, though they commented on then-current events, nevertheless pitched a mythology of prelapsarian sock-hop ideals.

Importantly, these shows all spotlighted not how the 1950s were, but how the aging generation remembered them. MASH endeavored to humanize the horrors of war, during a time when Vietnam was still too current for commentary, but it did so in ways that only occasionally included the indigenous Korean population. Happy Days was set in Milwaukee, Wisconsin, one of America’s most racially segregated cities, but featured almost no Black characters.

This whitewashed nostalgia reaches its apotheosis of silliness in Back to the Future’s prom scene, where White Marty McFly shows a Black backup band how to really rock out. The maneuvers he tries might’ve looked shocking to the White kids on the dance floor, because they lived pre-Jimi Hendrix. But they were hardly new; Marty didn’t do anything Charley Patton and Son House hadn’t invented in the 1920s.

For Oasis to reunite almost exactly thirty years after they deluged British music puts them in the same position. The announced reunion lineup currently includes only the Gallagher brothers, omitting the revolving door of sidemen and rhythm sections that supported them for fifteen years. The band was contentious even before the brothers stopped working together; they never produced a dark horse, like George Harrison, to emerge from the wreckage.

I’ve written before that the nostalgia impulse produces the illusion of inevitability. Because events happened a certain way—because the Beatles hit number one, because Dr. King bested Bull Connor, because America beat the Soviets to the Moon—we believe things had to happen a certain way. This removes human agency from history, giving us false permission to go with the flow, like a dead salmon floating downstream.

Oasis hit the mainstage when John Major’s Tory government was walking wounded. Though Major had received a Parliamentary mandate in 1992, he was already deeply unpopular by 1994, helping Thatcherism limp timidly across history’s finish line. Like their beloved Beatles, who broke through just as Alec Douglas-Home’s doomed premiership ushered the Tories out, Oasis arrived to supervise a Conservative collapse.

And now they’re reuniting as Rishi Sunak has nailed shut another Conservative coffin.

But history isn’t inevitable. Tony Blair’s nominally progressive government, like Bill Clinton’s, openly embraced moralism and militarism, before ending in the massive moral sellout of Operation Iraqi Freedom. Likewise, Keir Starmer is possibly the blandest person ever elected PM, winning only because the Tories squandered every advantage. Starmer, like the Oasis reunion, bespeaks a rare British optimism, without much of a plan.

I’d argue that most people don’t really want an Oasis reunion. I strongly doubt anyone wants to watch two White men pushing sixty warble about the troubles of 1994 onstage. Their largely British audience simply wants to time-travel to a moment when they didn’t have to know what Brexit was, who Boris Johnson was, or why a down payment on a London house is triple the average annual income.

*Out of respect, I use British dating conventions in this essay, and only this essay.

Saturday, March 18, 2023

Your Childhood Nostalgia Is Stupid, and Dangerous

I’ve been thinking about Bobby Harrow*, who held Mrs. Martin’s class at Rancho Elementary School in Spring Valley, California, hostage for two years. He was boisterous and energetic, his mind not circumscribed by such effluvia as coursework or the fact that someone else was talking. His nigh-pathological refusal to sit down, shut up, or stay on topic could derail the entire classroom. As often happens, Mrs. Martin ran her class to mollify Bobby.

Nowadays, of course, we’d diagnose Bobby with ADHD and prescribe pharmaceutical stimulants. But I knew Bobby eight years before I ever encountered the term Attention Deficit Disorder. In the early 1980s, he was simply high-spirited and disruptive, something his teacher needed to handle, and his classmates needed to live with. He was also intensely creative and a natural problem-solver, so Mrs. Martin needed to channel his energy, not just silence him.

I remembered Bobby this week, upon reading this spectacularly slovenly piece of GenX nostalgia:

[screenshot of the tweet]

Judging by the author’s profile, he’s probably approximately my age. Like me, he’s White. Nothing clearly indicates his personal background or childhood, but I’d venture that, like me, he comes from a conventionally suburban upbringing, where parents left the neighborhood daily for work, and school and perhaps the local strip mall were the only factors holding the community together. Neighborhoods like mine, and probably his, weren’t communities, they were mailing addresses.

It wasn’t one neighborhood, either. My family moved around, pursuing my father’s Coast Guard career, so I experienced schools and neighborhoods in California, New York, Louisiana, and Hawaii. My parents always chose suburban neighborhoods, which they deemed “safe”; they’d never have admitted it aloud, probably not even to themselves, but that meant “White.” Therefore my upbringing was reasonably whitewashed, free of contact with unconventional demographic groups.

Because of my suburban raising, I didn’t know gay people existed until high school. Of course, they definitely existed; groups like Lambda Legal, the Gay Liberation Front, and PFLAG predate my sheltered Caucasian childhood. Similarly, I knew racism existed, but because I lived in well-scrubbed suburban neighborhoods, I never saw it; therefore, I never thought about it, and grew up believing it belonged to another historical epoch. I never had to know, so I didn’t.

Yet I definitely knew other things. Though I never encountered the term ADHD until 1991, Bobby Harrow definitely proved it existed; we just didn’t force-feed students like him Ritalin or Adderall to make them compliant. Similarly, I know, because Mrs. Martin told us, that she maintained a classroom first-aid kit that included epinephrine, because at least two classmates had food allergies. And autism definitely existed; the school just quietly dumped those students into Special Ed.

Therefore I find myself both willing and unwilling to forgive the tweeter above. For my first eighteen years, I didn’t have to know “different” people existed; I thought my suburban upbringing was normal. When I discovered it wasn’t, that some people had very different childhood experiences, I chose to learn and grow. Other people, mostly White adults, keep discovering different people have different experiences, and simply shout “nuh-uh!” It’s a thoughtless reaction, but I understand.

But I also can’t forgive people who don’t remember things that definitely existed. The simple claim that other people didn’t have neurological conditions, allergies, or even obesity, thirty years ago, only makes sense if this person and others like him willfully edit their memories. And to suggest that grade-school students sat quietly back then? Preposterous! Then as now, kids hated being confined indoors every day, and got stroppy and rebellious, needing repeated discipline.

When people younger than me complain about how lawless, crazy, and self-indulgent children are “these days,” I know they’ve made a choice. I was alive then; I know these problems existed, and often caused great disruptions. Yet somehow, I repeatedly encounter people who believe that today (whenever today is) has become monumentally worse than their own beloved childhood. And I wonder what kind of velvet-coated angel hatchery they claim they emerged from, free of friction.

Why do ADHD and autism diagnoses seem so common nowadays? Because we bother to look for them, Bradley! Why did nobody have nut allergies or celiac disease back in your school daze? Probably because back then, people with food allergies just died, Susanne! Things aren’t getting worse, we’re just aware that bad conditions exist. If you can’t keep up, that’s a “you problem,” and you need to stop cluttering the discourse with your fake nostalgia.

*not his real name

Monday, September 26, 2022

The Future is Dead, Long Live the Past!

What does a naïve fascination with Laura Ingalls Wilder have in common with this week’s projected election of a literal Fascist government in Italy? I hear the knives coming out among my few regular readers even as I write this. Yet I contend that America’s attachment to a beatified past reflects the same sentiments that have installed Giorgia Meloni as Italy’s next presumptive Prime Minister.

Smarter critics than me have written about cottagecore, a design aesthetic based on a supposedly better agrarian past. From the architecture and artwork to the now-notorious “prairie dresses” briefly sold at Target, the cottagecore ethos lionizes a time in America’s past when people lived simply, ate food from their own gardens, and didn’t busy themselves with sleek design. Many cottagecore enthusiasts have attested their loyalty to Wilder’s Little House books.

Laura Ingalls Wilder didn’t write her simple novels merely to entertain children, though. She wrote at the urging of her daughter, Rose Wilder Lane, who probably served as unbilled co-author. Lane was also one of the founders of Libertarianism, the political philosophy holding that everything would be better if rules and regulations were unilaterally rescinded, and everyone were free to follow their internal moral compass.

Wilder’s Little House books are replete with messages of self-reliance and autonomy. Repeatedly throughout the novels, the characters learn the importance of swallowing their complaints, working hard, and not asking anybody for help. The characters are resolutely unmoored from community, and from the assistance community entails. One novel commences with the family moving west because, they claim, too many people were around.

Contrast that with Peter Weir’s 1985 movie Witness. Forced from a Back-East city to harbor among the Amish, detective John Book (Harrison Ford) learns the exact opposite of the lessons Wilder taught: restraint, sharing, community. Important moments happen when he learns, for instance, to sip his lemonade, not guzzle it. I can’t be the only American who considered decamping to Amish Country after watching Book participate in an Amish barn raising.

Cottagecore espouses rural simplicity, but not communitarianism. It’s all Laura Ingalls Wilder, no John Book. Adherents believe, to a greater or lesser degree, the popular White American myth of autonomy and rural solitude. But real early rural life was deeply communitarian, because it needed to be. Besides, as historian Nancy Isenberg writes, the Ingalls family probably moved west, not to avoid crowds, but because richer Whites chased them off the land.

Meanwhile, as Americans of every political hue romanticize Oregon Trail pastoralism, Italians have elected a presumptive PM whose own party calls itself the “heirs of Il Duce.” Giorgia Meloni promises to reinstate nearly the entire Mussolini policy agenda, because that ended so well last time. Mussolini literally gave global nationalism the shorthand name of Fascism, and provided the political blueprint that Hitler duplicated and expanded upon.

Meloni cites the endless panoply of evil-bringers which American and British audiences will recognize: immigrants, “globalists,” homosexuals. She promises to restore lost national greatness that definitely existed in the vaguely defined past: make Italy great again, if you please. Fundamental to small-F fascism is a belief that things used to be good, but now they’re not, though that goodness, and that lost time, are always fuzzy and intangible.

Italy isn’t alone in yearning for a more authoritarian past. Vladimir Putin has lamented the Soviet collapse, and attempted to regain Russian greatness by repeating Stalin’s greatest sin, oppressing Ukrainians. Britain’s new PM, Liz Truss, announced her own good-old-days policy by slating Thatcher-level tax cuts, and global currency markets responded with scorn. Because if there’s anything Europe needs, it’s 1980s-level unemployment and labor unrest.

As I’ve recently written, popular culture keeps looking backward because, I suspect, it’s leery about whether we have a future. But pulling focus outward to the larger social terrain, I see the same widespread attitude outside Hollywood, too. From gingham dresses to brownshirts, our discourse is dominated by nostalgic longing for a beatified past that never quite existed. Just ask anybody who survived Jim Crow.

I understand this fear of having no future. Look outside your window: it’s blazing hot and getting hotter, and we’re running out of clean water. Capitalism has created intense poverty, but rosy-eyed alternatives don’t fix injustice, they move it around. A small handful of centibillionaires are so wealthy they’d rather flee Earth than fix its problems. It’s easy to feel like there’s no future. But if we don’t find our future soon, the future will surely find us.

Friday, February 11, 2022

The Day I Discovered Time Travel

Ani DiFranco

The Ani DiFranco song ambushed me at a moment when, too sure of myself, I let my mind sprawl like a lover on a divan. I had a channel of women singer-songwriters on my preferred streaming service, playing a regular selection of Dar Williams, Indigo Girls, Patty Griffin. The channel played Ani regularly, but mostly her late-nineties stuff, when she was pushing thirty, tracks like “32 Flavors” or “Untouchable Face.”

Instead, the channel played “Both Hands,” from her eponymous debut, recorded when Ani was only nineteen. I hadn’t heard that track in nearly twenty years. I still have the CDs I bought before college, though like most people, I only blow the dust off them occasionally. They’re part of a world I no longer occupy, a world I occupied only loosely even then. A world I frequently intend to revisit, temporarily, but never quite do.

Until I did. That song played, and I was instantly transported. Like Marcel Proust’s famous madeleine, the song had magical powers. I didn’t just remember being twenty-three, frustrated and disappointed while waiting for my life to commence; I was literally there. I was sitting on a concrete porch in a small Nebraska town, with a battery-powered boombox, watching kids play soccer on the schoolyard across the street and thinking: now what?

“Both Hands,” a lament of self-discovery in a poorly chosen relationship, could only have been written by a teenager. But the recording I first heard, from Ani DiFranco’s first live album, was recorded when she was twenty-six, deep in the thicket of what we euphemistically call adulthood. I’ve seen this phenomenon before, though with DiFranco, I was too young to understand it: songs by youth, about waning adolescence, hit harder from older, wearier voices.

I grew up surrounded by images of Baby Boomers extolling their lost youths. The first generation to grow up in the embracing womb of pervasive television, Boomers have icons that let them time-travel the way I did. One of my earliest memories of TV is a PBS documentary about how cushy the idiot box was for Boomers. It ballyhooed diverse nostalgic images, from Leave It To Beaver and Felix the Cat, to Woodstock and Cronkite at Da Nang.

The blogger as post-adolescent
hippie wannabe. Probably age 25,
somewhere around the year 2000.

I wanted that connection. I wanted to exist on a continuum of nostalgia like I saw praised on public broadcasting, somewhere between Big Bird and Doctor Who. When I finally rebelled against my parents’ performatively countrified upbringing, I embraced a hippie-era ethic of crunchy rock, long hair, and groovy vibes. I’ve written about this before. But I wasn’t really time-traveling then, because I knew the world I glommed onto only through filmed images and photos, which I’d witnessed fairly recently.

Things have changed, though. I’m at the age when I can say “twenty-five years ago” and refer to something that happened when I was already an adult. After I passed through a long, lingering hippie-dip adolescence, Ani DiFranco was the first contemporary artist I landed on, someone recording for a mass audience right then. She ushered in my fondness for indie rock and folk music, contra my parents’ country & western, or the classic rock with which I rebelled.

So there I was, simultaneously in my office, doing my grown-up job, and also on the porch of my first house, listening to Ani and feeling smothered by ennui. Powered only by my brain, I’d slipped outside the bonds of present responsibility, the beige-tinted walls of adult sobriety, and traveled into a version of myself that both somehow was, and wasn’t, considered an adult. But I wasn’t there in the past; I could only watch from outside.

Suddenly, those PBS documentaries praising Boomer childhoods made sense. The intended audiences, then pushing forty, weren’t gleefully watching their younger selves; they were coming to grips with the choices they’d made. Watching myself from outside, I realized I was seeing someone whose life appeared empty of direction, but only superficially. I had millions of choices ahead of me, but I could only live in one direction. Once made, those choices were forever.

As the song ended and I returned to myself, a thought struck me: God and healthcare willing, I’ll someday look back on this moment the same way I just looked back on myself. And I’ll ask myself the same questions: did I make the right choices? Did I do something to make my future self proud? The only difference is, this time I’ll know I’m being watched.

Sorry, Doctor Who: time travel is overrated. It only makes me aware of the present.

Monday, December 13, 2021

Nostalgia Night at the Old-School Science Fiction Buffet

Cassandra Khaw, The All-Consuming World

Someone is scouring the galaxy, executing old forgotten criminals, the dregs of the post-humanist economic underbelly. Technology specialist Rita and her hired gun Maya realize they’re in this unidentified vigilante’s crosshairs. So they do what any reasonably well-informed cyborg criminal masterminds would do: get the old gang back together. This isn’t easy, since some of their fellows haven’t forgotten the old wounds and betrayals of their last, disastrous job.

Cassandra Khaw’s first novel reads like a literary catalog of allusions, callbacks, and nostalgia. How one responds will likely depend on how one feels about Khaw’s source material, and also how one feels about those sources being so transparently displayed. One could make a drinking game of identifying the donor parts that make up this Frankenstein’s Monster of speculative literature. Nor does Khaw conceal these allusions; they’re spray-painted big across the surface.

Maya, who has died and been reborn in a succession of augmented clone bodies for centuries, absolutely loves Rita, her “scientist.” We know because she tells us repeatedly. When Rita sends Maya into no-win situations to recover the Dirty Dozen’s lost children, Maya acquiesces, despite her misgivings. After all, last time the Dirty Dozen fought together, two of their number suffered deaths so catastrophic, even their advanced technology couldn’t help.

I remember reading these basic premises decades ago. Khaw’s jacked, technologically augmented protagonists come directly from cyberpunk, especially Pat Cadigan and Bruce Sterling. And not just tangentially, either: Khaw uses terminology pinched wholesale from Sterling and, with occasional words changed, William Gibson. Khaw’s repurposed Sprawl Cowboys might make speculative fiction readers very comfy, because they challenge seasoned readers minimally.

Moreover, Maya’s picaresque adventures collecting her team’s survivors feel very familiar. The first survivor is now running a Fight Club so on-the-nose that its leader, Ayane, clearly thinks herself Tyler Durden. (An orphaned misquote reveals Khaw pinched David Fincher’s movie, not Chuck Palahniuk’s novel.) Another has squelched her past and remade herself as the galaxy’s most celebrated pop star, a trope found in countless Japanese anime through the years.

One by one, the Dirty Dozen’s barely half-dozen survivors gather on their old ship, nerving themselves, however reluctantly, to fight their unidentified enemy. But as they begin recounting narratives of their former mercenary careers, we start seeing truths these characters have willfully forgotten. Rita has deliberately pitted them against each other, sent them on suicide missions, and otherwise poisoned the well which this supposed sisterhood drinks from.

Cassandra Khaw

It takes until around the novel’s halfway mark before one character voices something I’d suspected for a while: these women (some of whom stopped identifying as women after their retirement) aren’t unified, they’re trauma-bonded. And, as they demonstrate their bizarre admixture of reverence for and fear of Rita, it’s clear they have Stockholm Syndrome. As you’d expect, everyone realizes this but the women themselves, because the sick seldom realize they need healing.

More important, I realized something these characters didn’t: Rita regularly withholds information, and lies as easily as breathing. Their mission’s entire premise turns on the expectation that someone’s killing old mercenaries. Except we have no evidence supporting this premise, besides Rita’s word. Since we, and they, know Rita lies, the question becomes: when will everyone realize their entire mission is fabricated bullshit?

Khaw, a professional RPG designer whose résumé includes Dungeons & Dragons, performs the old Dungeon Master’s trick of framing their narrative through careful omission. We’re assured the Dirty Dozen are outlaws but, in the best cyberpunk tradition, the law is distant and poorly defined, apparently for sale through mercenaries like, say, the Dirty Dozen. We know certain characters are dead because an inveterate liar told us so. The pieces don’t add up.

Meanwhile, Khaw just keeps setting the scene, and setting the scene, and setting the scene. Rather than a singular story, Khaw regales their audience with a succession of vignettes on a theme. Again, like the separate nights of a game campaign. Because the story isn’t unified, our experience isn’t the action, but the characters. We read because we want the characters to learn and grow over the succession of events in their experience. Which, sadly, they mostly don’t do.

I don’t hate this novel. I particularly love Khaw’s narrative voice, a beatnik prose-poem banter that doesn’t so much inform us of events as immerse us in circumstances. But somewhere around page 200, I realized Khaw was still clearing their throat, still introducing characters and scenes for a story that’s always just about to happen. And, I realized, I no longer cared. Roll me a Natural One, I’m done.

Wednesday, June 9, 2021

What If the 1950s, But Sillier?

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 44
Walter Hill (writer-director), Streets of Fire

Glamorous rock star Ellen Aim has returned to her hometown to play a benefit gig before an adoring local crowd. But jealous biker Raven, leader of the Bombers, has other goals: his black-clad greasers rush the stage, overpower Ellen’s entourage, and carry her away like a trophy. Thousands watch helplessly, but one local woman contacts her secret weapon, her brother, the mercenary Tom Cody.

Director and co-writer Walter Hill produced this picture, an epitome of 1980s values, in the immediate wake of his runaway hit 48 Hrs. A slick package of highly choreographed fight scenes, teenage love revisited, and rock aesthetics, it left everyone involved anticipating another smash. Instead, it was dead on arrival, losing millions. Recent trends, however, have led critics to reevaluate this movie, reclassifying it as an ahead-of-its-time beauty of Reagan-era excess.

Tom Cody declares he doesn’t care to rescue Ellen Aim. Why get involved in local gangs and police politics? Some banter with his sister reveals Tom and Ellen were involved, years prior, but when her singing career became lucrative, they drifted apart. Tom carries a grudge. But Ellen’s nebbishy boyfriend, also her manager, offers a brick of cash, and Tom becomes interested. He buys some black-market guns and ventures into the darkest part of town.

Despite its dark premise, this movie’s defining trait is silliness. It presents all action with the depth and complexity of a Looney Tune. Its outdoor sets and streetscapes are so close-in and narrow that you never forget it’s a soundstage. Characters are exactly as deep as the plot requires, letting the script carry them from scene to scene, because they don’t have deep inner motivations; things simply happen because it’s time.

Yet somehow, we viewers feel yoked to the story’s potential. The silliness becomes downright operatic, with its tendency towards Grand Guignol and its elaborate, Tim Burton-like design. As in vintage melodrama, the characters are having enough fun that they see no reason to interrupt the proceedings. They want things to reach their inevitable conclusion because they enjoy being slick, commercial, and drenched in early-MTV sumptuousness.

In essence, this movie is a designer’s vehicle; even the rococo sets remind us we’re participating in conscious art. The nameless city’s streets have an Edward Hopper depth, very close and angular, with bare concrete under painted steel facades (which are clearly plastic and Styrofoam). As in a dream, or a myth, everything is very close together: the city’s worst street is around the corner from its best.

Ellen Aim (Diane Lane) and Tom Cody (Michael Paré) in Streets of Fire

A 1950s aesthetic pervades this film, but not deeply. Shark-fin cars and greaser boots are everywhere, but so are upswept 1980s hairdos and oversaturated music-video colors. An early title card tells us this story happens in “Another time, another place.” That time and place is clearly inside somebody’s head, because this isn’t history; it’s a Reagan-era dreamscape fueled by Top-40 skifflebop and anti-juvenile-delinquency PSAs.

Then we have the fight scenes, for which this movie was written. Unnamed characters fall off motorcycles, get whanged with sledgehammers, and tumble out of moving cars, but nobody is ever really hurt. Like I said, it’s a Looney Tune, a Bugs Bunny caper. We don’t expect realistic consequences for cartoon violence; we expect people’s heads to bounce off pavement like it’s made of rubber. Violence is slapstick, not horrific.

This pervasive silliness is underscored by the movie’s rock-and-roll soundtrack, which almost never stops. Its rockabilly vibes remind us we’re watching somebody’s nostalgic fantasy. (This is the same era when the Stray Cats and the Cramps updated Fifties vibes for a more commercial age.) This movie pines for fast guitars, slick cars, and back-alley rumbles. Like much of its era, it yearns for a simplicity that probably never really existed.

This movie plays out a Reaganite wistfulness for a simplified 1950s, divided between obvious heroes and villains. It pits calm, big-shouldered Tom Cody, the ex-soldier, against greaser Raven and his gangsters; but it also pits Tom’s demonstrative manfulness against Billy Fish, Ellen’s geeky manager and new boyfriend. Tom’s violence works, but it’s also outdated; even he admits the future belongs to people like Billy, not himself.

As stated, this movie landed with a quiet thud. This didn’t bother writer-director Hill, who was massively prolific and moved on to another project. Nearly forty years later, though, fans have reevaluated its legacy. It has more in common with mythologies like Lord of the Rings than the semi-realistic action flicks which dominated 1980s cinema, while also embodying its era’s pining for lost moral simplicity. And it’s also just silly fun.

Wednesday, April 28, 2021

Disneyland is a Fortress and a Temple

When I was small, my father took up latch-hooking as a hobby, to keep himself at home and responsible for his family. I remember one of his largest projects, a three-by-five-foot wall rug with an image of Donald Duck. He made it when I still slept in a cradle, and I kept it on my wall, through multiple moves and homes, until I was eighteen. It was a talisman, not only of the innocence Donald represented, but also of the knowledge that my dad loved me.

Last week, Nevada writer Jonathan VanBoskerck published an op-ed in the Orlando Sentinel whining that “I love Disney World, but wokeness is ruining the experience.” (The Sentinel moved it behind a paywall after its viral flare-up passed, but it’s available on other websites.) I wasn’t inclined to write about VanBoskerck’s entirely predictable, formulaic whimper, because frankly, we’ve heard it before.

Then I remembered my Donald Duck rug.

To his credit, VanBoskerck doesn’t assume Disney used to be apolitical. He contends the massive media conglomerate bearing Walt Disney’s name is drifting from its founder’s vision. A vision which, coincidentally, ratifies his own as “a Christian and a conservative Republican.” Uncle Walt, an ardent capitalist and flag-waving patriot, apparently made GOP voters feel good about themselves; today’s company makes them feel bad and picked on.

This claim stumbles against even a cursory study of media history. Princeton historian Kevin M. Kruse writes that Uncle Walt founded his company as a New Deal Democrat, and gave FDR unpaid advertising in his early featurettes. Only during World War II, and the subsequent Cold War, did Disney’s personal and business philosophy drift right. By 1955, Disney was, with Billy Graham and Ronald Reagan, part of America’s triumvirate of mass-media Christian Conservatism.

Therefore, VanBoskerck isn’t championing Uncle Walt’s vision as it existed, pure and unchanging. He wants to trap Disney, to all but cryogenically freeze him, at the moment which gives VanBoskerck the most comfort. Judging by VanBoskerck’s photo, he was probably born after Disney, the man, passed on; so he’s free to select any moment from the abundant buffet of political positions Disney expressed throughout his career.

The illusion of an apolitical past presumes that what you grew up with is simply normal, a neutral and stationary baseline immune to change. VanBoskerck’s vision of Disney, the man, drips with unspoken prior assumptions. Somehow, the decision to stop circulating Song of the South, for instance, registered as a sort of act of God. Now he has to see the desire for change as motivated by humans, which he identifies as political.

VanBoskerck fundamentally wants to retreat into childhood innocence, to resume never having to know about injustice and suffering. Like me, hanging my dad’s Donald Duck rug long after I’d outgrown its literal significance, he requires a place of moral neutrality where he needn’t take positions, or feel shame for failing to do so. That’s why he inveighs against Disney World specifically, rather than the Disney Corporation: he wants a physical destination without moral weight.

But this assumes the beatified past was simply natural: that his view matched that of children everywhere, that young kids of all eras were driven by the same motivators he was, that things simply existed. It’s a definition of “political” that makes certain groups, like women and children, innately apolitical, even when they’re demonstrably not. That isn’t a basis for making good, meaningful decisions. It’s a reliance on ignorance.

I sympathize with VanBoskerck’s views. I’d love to revisit a time when music and TV and movies and books simply existed, with no need to be aware of the conditions from which they arose. We White people have this idea that everyone widely shared our innocence, and everyone wants to return to that. But we can’t go back, neither as individuals nor as a nation, because that time never really existed.

We all come from a background based on our parents’ economic standing and the choices our forebears made. White people didn’t have to see that when we were kids: we simply saw our families, schools, and neighborhoods, and considered them normal. The mere fact that we, as adults, have material evidence they weren’t, doesn’t change some people’s belief that the past was a morally blank slate.

This desire to flee the present makes perfect sense, of course. But once you can witness the world through another person’s eyes, you can never again fail to see that, for instance, Song of the South was pretty offensive, that Dumbo’s crows were aggressively racist, that Donald Duck was never neutral. My father’s love shielded me for years. But I’m an adult now, with all the burdens that entails.

Friday, March 12, 2021

Dr. Potato le Pew's Cancel Culture Extravaganza

A cross-marketed Potato Head family, based on the Toy Story franchise

“Cancel Culture” is the new “Politically Correct”: an insignificant, half-joking leftist meme which fuddy-duddies have blown way out of proportion. Under ideal circumstances, I would consider both buzzwords frustrating annoyances, promoted by petty people who want to generate controversy where none exists. I haven’t written about it previously, because I consider it a gadfly issue: annoying, but harmless. But things changed this week.

This week, members of the House of Representatives used their allotted floor time, a resource so finite that it’s allocated in the Constitution, to inveigh against recent changes in Dr. Seuss and Mr. Potato Head. During a time of national crisis unmatched since World War II, with pending bills regarding COVID relief and the January 6th insurgency, elected Republicans preferred to waste taxpayer-funded time bellyaching about reading primers and plastic toys.

Smarter writers have already bled ink explaining these accusations have no foundation. I care more about which objects have become emblems of hard-right outrage. These wastrels consume our limited time and attention on books written for preschool children struggling with rudimentary phonics, and a toy potato labeled “Age 2+.” They’ve literally redirected the national discourse to complain about products targeted at kids just learning how to understand words and faces.

Two explanations readily avail themselves. First, maybe top-level Republicans are operating at a level of rudimentary object-impermanence that makes Dr. Seuss illustrations and Mr. Potato Head face-recognition exercises necessary—or maybe they assume their voting base operates at that level. I’d rather not entertain this argument, because it feels like name-calling, and will only poison the well. But it needs acknowledging.

A second, more likely explanation arose this week, when Fox News and its acolytes spotted another supposed victim of “Cancel Culture”: Looney Tunes. With the revelation that lecherous skunk Pepé le Pew was written out of the Space Jam sequel, right-wingers latched onto another supposedly hatcheted figure of their past. This comes eight months after the brouhaha over a mooted Looney Tunes relaunch that disarms Elmer Fudd and Yosemite Sam.

Do people understand that Pepé le Pew is the villain of his story?

Watching elected officials soil themselves over anthropomorphic potatoes, pre-K primers, and Looney Tunes, I put it together: these numbskulls are trying to protect their childhoods. They see things they loved as grade-schoolers, possibly the last time their lives felt stable and unwavering, and they panic. Because their lives seem volatile enough, without their treasured memories being attacked as “problematic.” They believe they’re defending their own past selves.

Back in 1980, Professor Thomas Goodnight identified the overarching presumptions of Left and Right in American politics. Where “liberals” (an often misused term) perceive change as inevitable, and want to manage it, conservatives see all change as decline. I've written about this before. Conservatives believe, incorrectly, that the changes they see happening are new, and want to halt them, before they destabilize the homogenous world they think they grew up in.

I have trouble accepting the argument that Pepé le Pew normalizes sexual assault, or that Elmer Fudd normalizes guns. These characters’ entire role is premised on them being too stupid to understand themselves and their circumstances. Pepé receives repeated, glaring indications that his intended wants no part of him, and apparently never sees them. And I don’t recall Elmer Fudd’s blunderbuss ever hurting anyone but himself.

Notwithstanding my disbelief regarding the logic, we face a problem: this isn’t censorship. No official body made a preemptive decision to make Republicans’ childhood toys go away. Three private corporations made self-interested decisions on how to utilize their own properties, in ways to maximize their profits. Isn’t that what we’ve been told capitalism is? Would they really hijack private property to protect their… their… whatever they’re trying to protect?

Yes, apparently, they would.

Thinking about it, I realized, despite their Cold War-influenced rhetoric, capitalism isn’t a First Principle for conservatives. Fundamentally, they like capitalism, not for itself, but because it’s what they know. Capitalism comes second to their first love, continuity. Ultimately, their fear of racial justice, economic reform, and “Cancel Culture” boils down to distrust of change. They don’t want justice, they want continuity. And they’ll sacrifice everything else to get it.

Sudden, market-driven change has grown-ass men clutching their blankies and calling for their mamas. Because change terrifies them, or at least, they assume change terrifies their voters. I can’t entirely blame them; change always produces unanticipated consequences, and nobody knows where they’ll end up on the other side. But we’re watching what happens when people resist change:

They can’t save the already broken system. The system will just break them.

Thursday, July 18, 2019

“Stranger Things” and the Illusion of Capitalism

The Starcourt Mall, as seen in Stranger Things 3

The producers of Stranger Things presumably thought placing a hidden Soviet research base underneath an American shopping mall in season 3 would make an ironic juxtaposition. After all, the brightly lit, party-like atmosphere couldn’t be further from the grim, featureless government installation in which several characters find themselves trapped through most of the season. What gap could be wider than the one between the Evil Empire and a capitalist acquisition extravaganza?

I propose, though, the producers knew the gap was notably small. Sure, the brightly lit Starcourt Mall, where shoppers walk surrounded by endless choices and seemingly infinite social interaction, looks like an individualistic paradise. It certainly suckered countless suburban youths like me, who grew up during the Reagan years, into believing that these compressed venues of commerce represented the necessary end point of free enterprise. Having uncountable options in one place feels exciting.

However, when your shopping options are limited to which multinational corporation sells you shoes and fast food, the choice is illusory. The Starcourt Mall brims with chain stores, many of which peaked around the time this story is set (1985), but scarcely exist anymore, if at all. Names like Sam Goody, Waldenbooks, Orange Julius, and Radio Shack once tempted GenX into air-conditioned malls with the promise of endless choice, overseen by supposedly benign corporate overlords.

Steve Harrington, who in previous seasons strutted around Hawkins High School with swagger pinched from a Duran Duran video and dated the prettiest girl in school despite treating her like shit, now finds himself thrust into the adult economy. Rather than running things, he, like everyone who ever existed, must abase himself before Mammon. In his case, this means scooping ice cream while wearing a sailor suit so ridiculous, it makes Buster Brown look dignified.

Within moments, though, Steve and his coworker Robin discover that the Starcourt Mall is a Soviet front; suited and booted Communists operate in subterranean caverns, conducting illicit research. Through means that matter to the plot, but not this commentary, Steve and Robin find themselves interrogated by Soviets in full Stalinist regalia, while still wearing their own degrading work uniforms. The parallelism between forms of enforced conformity is glaring.

The Starcourt Mall succeeds, as mall capitalism does, by convincing consumers their every desire is met inside climate-controlled, aesthetically pleasing indoor space. The sloppiness and chaos of downtown capitalism, with its sidewalks and rain, gets subordinated to the mall’s constant sensory overload. And superficial choice reigns supreme; if you dislike Sam Goody, try Tower Records instead. This illusory freedom finds its apotheosis in that acropolis of excess, the Food Court.

Yum yum! I'll have a double scoop of capitalism, please! (Netflix promotional image)

This putatively beats Soviet Communism because you have freedom to choose your employment, commerce, and loyalties. You don’t work the commune, shop at GUM, and salute Party logos. But if Steve and Robin’s employment is circumscribed by a choice between Scoops Ahoy and Taco Bell, how much liberty do they really have? They can’t afford the buy-in to start a funky take-out shawarma joint across the Food Court.

That’s even before the limitations of privately-held public space. Take a picket sign into any American mall, or even a guitar. I dare you. Malls, owned by corporations which usually aren’t headquartered locally, have no legal or moral obligation to respect free speech or regionally generated culture. And if you nominally have the right to say anything, but are precluded from venues where anybody can hear you, are you really free? Really?

Online conspiracy theorists postulate that Stranger Things 3, appearing almost simultaneously with HBO’s Chernobyl miniseries, reflects a growing fear of anti-capitalist rebellion: our media sovereigns must squelch this cultural trend through intellectual mockery, immediately! But actually watching Stranger Things 3 doesn’t make Reagan-era mall capitalism look any better. If Soviet Communists can successfully disguise themselves as corporate maestros, is there any real difference?

(As an aside, malls and other aggregate capitalism have definitely narrowed the economy, but that doesn’t mean pre-mall capitalism was some Eden of entrepreneurial vigor. Anyone who visits communities where city fathers resisted the mall impulse knows that locally owned businesses can be just as imperious and anti-democratic as large corporations. Don’t fall prey to appeals to false nostalgia; unrestricted capitalism has always been deeply anti-individuality.)

Writing between the World Wars, G.K. Chesterton observed that, for ordinary citizens, the difference between capitalism and communism is vanishingly small. The only choice is: do you prefer your life dominated by state bureaucrats or corporate bureaucrats? The Starcourt Mall demonstrates what that means. Steve and Robin as workers, and the people of Hawkins as consumers, are supposedly free under mall capitalism. But they’re free in ways that just don’t matter.

Wednesday, October 11, 2017

Little Pieces of America All Around Us

Yeah? What America is that?
I really, really like Creedence Clearwater Revival. But the reason why is pretty embarrassing: when, at sixteen, I rebelled against my parents’ popular culture, as sixteen-year-olds do, I wasn’t ready to embrace Nirvana and Pearl Jam like my peers. I feared getting into anything “new,” and getting left behind, like my friends who’d previously enjoyed Nu Shooz or Duran Duran. Novelty was risky; old stuff came pre-screened. So I started listening to the oldies station.

As half-hearted rebellions go, mine probably seems mild. Given the recent popularity of steampunk, crypto-fascism, and hipsters dressed as Canadian loggers, digging the rock’n’roll of a prior generation isn’t that bad. Except, I’ve increasingly realized, I didn’t really embrace that generation’s vision. Any listen through Casey Kasem’s back catalog reveals that American Top Forty radio has long been dominated by tedious music, driven by labels and producers who manipulate, rather than listen to, the market.

So yeah, I understand the impulse driving people made uncomfortable by today’s cultural divides. I witness friends, people I like and trust, embracing the “Make America Great Again” motto, creating excuses for everyone from Bill Cosby to Peter Cvjetanovic, and calling anything that doesn’t support their power structure “fake news.” Meanwhile, the political party that represents organized progressives offered voters a choice, in the last presidential primary, between nostalgia for the 1990s and nostalgia for the 1950s.

This massive aversion to risk comes at a time when America’s structure is already changing. Our demographics are in motion, as immigration from Asia, Latin America, and the Middle East give this country an increasingly brown complexion. Our range of media options continues to increase as the carrying capacity of TV and Internet sources improves, and we’re drowning in new ideas. Even commerce has become chaotic, with the hectic panoply of chains and online retailers.

Naturally, a large fraction of Americans retreat into what’s comfortable. Whether that means pining for a sunlit Norman Rockwell townscape, or voting for the candidate who promises to restore what we consider our glory days, or listening to “Bad Moon Rising” with the volume at eleven, we’re seeing the same motivation. People intimidated by change, which happens faster than we can (or choose to) manage, naturally retreat into their favorite version of the past.

I understand this impulse, but I fear it, too. Back in the 1980s, when I began paying attention to social issues, I remember people already complaining that suburban sprawl, with its lack of shared common spaces like parks and downtowns, created vast “communities” bound together only by geographical proximity. Residents sorted themselves into real communities by their workplaces, churches, watering holes, and their children’s schools. Ideas, like people, became unofficially segregated in our diverse America.

Today’s media landscape sees that segregation happening even faster. We watch Fox News or MSNBC and have our favorite prejudices ratified by well-coiffed pundits, and, equally importantly, we see our ideological challengers reduced to manageable caricatures. We choose our radio stations to ensure we hear only what we already know we enjoy, and, as Gretchen Rubin writes, streaming services like Pandora and Spotify actually narrow our exposure. We’ve reduced the avoidance of new ideas to a science.

Nor am I immune to this. After resisting new culture for decades, I embraced indie rock when I was pushing forty. But at a recent concert, I realized: this audience is almost as white as the Charlottesville Nazi rally. I could excuse even that as the natural self-sorting of crowds, except that I’d driven over 320 miles to see this concert, which I’d heard advertised on an out-of-town radio station I listen to online.

Sadly, I have no ready solutions. I see how aversion to novelty reduces me to a stereotype, the middle-aged white “kid” listening to indie with other honkies. But the alternative is switching my listening habits to locally available radio, which not only bores me, but is overwhelmingly owned by out-of-town corporations famously unresponsive to local needs. I could complain that corporations shattered my community… but I’d have to admit they did it with my assistance.

If America is shattered, as the nostalgia vendors claim, then we have broken it, you and I. We could, as many do, pin responsibility on corporations, or government, or millennials. But that’s just punting the issue down the field. We elect a government, but we lack leaders. We join social networks, but we don’t organize. We look at the little pieces of America all around us and, like good little passive citizens, we do… nothing.

Wednesday, September 14, 2011

Toto, I Think We're Not in the 1980s Anymore



Toto’s “Rosanna” has been following me around town lately.

Ever since I used Toto to epitomize flash-in-the-pan culture in a recent blog post, the 1982 arena standard has cropped up everywhere. I can hardly run to the grocery store, spin the dial on the way to work, or sit on hold without David Paich’s plaintive chords hitting me from somewhere. This one song has become a soundtrack for these weeks in my life.

This is hardly the first song to take on unusual proportions in my life. Before this, I couldn’t shake U2’s “Mysterious Ways.” At other times, “Ticket to Ride” and “Whiter Shade of Pale” have redefined ubiquity.

But somehow, this feels different. This almost feels like a deliberate rebuke, a reminder that, despite my flip words, Toto has demonstrated remarkable staying power. After all, though Toto’s popularity in America dwindled after 1984, they remained a powerhouse touring act in Europe and Asia through 2008, and a recent reunion tour set sales records.

Compared to other music from the time, “Rosanna” endures remarkably well. No one would doubt, with its synth licks and rack of guitarists, that this is a period piece; this song could not get recorded today, at least not in this arrangement. But compare Toto to Madonna. She has enjoyed greater public staying power, but her earliest singles, sung in a strange piping soprano, are almost unlistenable.

American culture focuses markedly on the past. “Classic Rock,” a euphemism for what we once called oldies, is one of our most common radio formats. Cable TV is lined wall to wall with channels dedicated to rerunning nostalgic shows from my parents’ childhood. Though I Love Lucy is less common than a few years ago, I can hardly change channels without running into The Andy Griffith Show.

That seems strange for a country that proclaims its national youth and vigor. We tell ourselves that, unlike wheezy old aristocratic Europe, with its centuries of history and baggage, we’re still young, hip, and with-it. Yet somehow, we keep looking backward, like a retiree reliving the glory days.

Back in the 1990s, Chevrolet ran ads touting that only a country as young and vibrant as America could devise something as revolutionary as Chevy’s then-new model year. But Chevy ballyhooed their supposed adolescence to the backing of Jimi Hendrix’s “Fire.” They shouted their youth with the support of a musician who died before I was born.

The decade before Toto’s “Rosanna” was littered with faux nostalgia. Aerosmith’s 1973 “Dream On,” recorded when singer Steven Tyler was only 25, repeats its fatalistic lyrics about how “the past is gone” and “maybe tomorrow, the good Lord will take you away,” with drum-like constancy. Even Bruce Springsteen’s energetic “Born to Run” came from an artist who, at the time, consciously made himself up like a fifties greaser.

Despite its heartbreak themes, “Rosanna” seems almost opposite to that. Like the Beatles, Toto hit the stage with smiles and a set list. They may have had long hair, but they seemed clean-cut and forward-looking. Except, notice the video’s subtheme of the feud between the Jets and Sharks. West Side Story debuted twenty-five years before this video was recorded. Nostalgia lives!

Even setting aside the “classic” radio and TV channels that protect people from encountering anything new, current culture appears past-addled. Tim McGraw’s recent, annoying C&W hit “Back When” features the repeated hook “I miss back when.” Unfortunately, the past he describes largely took place before his birth. He pines for his parents’ youth.

I cannot exempt myself from this criticism. I own the complete Beatles catalog on CD, can sing “Piece of My Heart” from memory, and watch DVDs of Doctor Who episodes first broadcast while I was learning to walk. The past has a way of keeping a firm hold on anyone.

Nor should we abandon the past. A culture without history or tradition must reinvent the wheel every generation. We shouldn’t desire that. Isaac Newton famously said: “If I have seen a little further it is by standing on the shoulders of Giants.”

I only fear that it can become tempting to live in the past. A song that stalks me can beckon, like a siren, calling me to rest on yesterday’s successes. Consider all the rah-rah patriotism reminding Americans of the Moon race, World War II, or the Oregon Trail. There’s a fine line between revering the past and nesting in it.

I miss you, Rosanna. But I have to balance my past against making sure I have a future.