Friday, September 29, 2023

A Shadow on the Center of the World

Martyn Rady, The Middle Kingdoms: A New History of Central Europe

Readers my age will remember “Central Europe” primarily as a principal front in the Cold War. The Berlin Wall, the border between Austria and Hungary, and the proxy fight for control of the Italian government defined the struggle between the NATO and Warsaw Pact nations. Urbane Western Europe, and the countries under its cultural influence, have long disdained Central Europe as a cultural and political backwater, not entirely without cause.

Emeritus history professor Martyn Rady has dedicated his career to Central Europe generally, and Hungary specifically. One suspects the now-retired professor wrote this massive, panoramic one-volume history as a career capstone. His chosen subject is huge, in both geography and timeline, and hindered by the limited early documentation of his chosen region. Rady crafts a readable introduction to regional history, but seldom delves into anything in great depth.

The umbrella term “Central Europe” is vaguely defined, even herein. Rady mostly focuses on the nation-states now found in Germany, Austria, Hungary, Poland, and the buffer states between them. His narrative spills somewhat into Lithuania, Romania, Northern Italy, and the crazy quilt of former Yugoslavian member states. He starts with the Western Roman Empire’s dying throes and, in around 700 pages, brings us to approximately last week.

Rady’s early chapters deal fleetingly with entire centuries. Limited documentation exists on the rampaging Huns, emerging proto-German states, and various invading peoples. Rady dedicates a few pages to the Avar nation, which appears nowhere, then is mentioned intermittently for centuries in Latin histories, then apparently disappears again. This typifies the difficulties Rady faces in compiling authoritative history in vernacular language for non-specialist readers.

German “nations,” like the Saxons and Ostrogoths, arose not as states, but as confederations of clans and tribes. This set the pattern for future national identities, such as the Poles and Czechs, which organized themselves from isolated regions with local needs. The Frankish Charlemagne tried to impose a unitary culture on these loosely related confederations, but died frustrated, and his empire dissolved back into insular tribes. This pattern, too, would repeat itself.

Professor Martyn Rady

Regional aristocrats, some of whom styled themselves kings, found ways to temporarily unify swaths of land. Many had grandiose nation-building aspirations, which often involved establishing colonies in conquered regions. Most modern states, named for their majority populations, are riddled with minority communities speaking German, Polish, Slavonic, and countless newer languages that arose as peoples muddled along under the neglectful hands of slipshod dynasties.

If one theme dominates Rady’s history, it’s the conflict between governments that want to homogenize their states, and ethnic groups reluctant to change. Rady acknowledges that most individuals residing in Central Europe have a smorgasbord of ethnic backgrounds to choose from, and many speak multiple languages. The more states strove to eliminate regional and ethnic differences, the more they created the very conditions they sought to prohibit.

As Rady approaches the present, his history becomes more detailed, as you’d expect once the production and preservation of documents becomes cheaper. Early chapters sometimes cover multiple centuries, but Napoleon and the World Wars each merit their own chapters, as does the interwar period. Some chapters are arranged thematically—industrialization in the Rhine valley under Bismarck, for instance—and others chronologically.

Culture isn’t really part of this book. Rady dedicates a chapter to the profusion of German popular culture in the years between Napoleon and Kaiser Wilhelm, but this is an outlier. Perhaps Rady includes this chapter because pre-Reich German literature is frequently political; he threads the entire chapter through a novel by E.T.A. Hoffmann. But mainly, Rady focuses on politics, statecraft, and nation-building. Culture matters only when it illuminates politics and government.

He also largely elides certain topics, which presumably exceed his perceived remit. He mentions Germany’s colonial empire in the late 19th century, for instance, but having mentioned it, largely moves on. Rady apparently considers himself hemmed in by geography; events outside his physical domain matter only to the extent that they influence events inside it. German colonial history still matters in, say, Tanzania and Rwanda, but isn’t within Rady’s self-appointed scope.

Rady probably writes to leave a legacy outside the Ivory Tower. He’s dedicated thirty-five years to Central Europe, and he offers generalists like me an opportunity to share that knowledge. His narrative is often overly concise, and I wish he lingered on certain topics; for a book the size of a cinder block, it feels remarkably short. Rady offers a plain-English introduction to the topic, sure to whet your appetite without leaving you feeling full.

See also: A Brief History of Germany Before “Germany”

Thursday, September 21, 2023

The New American Serfdom, Part Three

This essay follows Walmart and the New American Serfdom, and The New American Serfdom, Part Two
Serfdom as depicted in the documents of its time

American and world history often plays out with the inevitability of a medieval morality play, at least in public (state) school classrooms. At any given moment, history’s most noxious operators may dominate the scene; but wait long enough, and someone will swing in brandishing our society’s most beloved virtues, and rescue us from tyranny. Dr. King was obviously heroic, while Bull Connor was obviously villainous—nothing more to discuss.

Individual teachers frequently resist this moralism, certainly. But in doing so, they’re fighting the tide. As the late James Loewen demonstrated, scholars write textbooks under corporate supervision, with an eye toward state education boards, which have approval authority over textbook selection. Especially in “red” states, including populous states like Texas and Florida, textbooks admitting to moral ambiguity, or acknowledging that America isn’t relentlessly virtuous, are doomed in advance.

This tendency to reconstruct history as a movement from ignorance to enlightenment, from injustice to parity, is sometimes called “Whig history.” The term comes from British historian Herbert Butterfield, who accused historians sympathetic to Britain’s now-defunct Whig Party of treating all history as a narrative of improvement. To these Whigs, in Butterfield’s telling, we’re forever becoming less monarchical and more democratic, less religious and more secular, less authoritarian and more free.

Whig history infects America’s primary school classrooms. Examining more than a dozen widely used classroom textbooks, Loewen demonstrates that students learn American history as an arc from violence to peace, from slavery to liberty, from colonialism to modernity. This arc overlooks the fact that modernization doesn’t inevitably lead to liberty. English-speaking Americans chose to start keeping slaves, for instance, and institutional slavery grew crueler the closer emancipation came.

When I speak of “serfdom” infecting American economics, it’s important to remember that serfdom isn’t some primordial condition that European monarchies gradually abandoned. Serfdom—an economic system where the manorial lord owns the land, and the land owns the people—only became possible late in feudalism. Human ownership only became possible after manorial lords no longer had to constantly defend their lands against Huns, Avars, and Turks, and could afford to pave roads and standardize measurements.

Early in feudalism, when lords primarily defended against foreign invaders and natural disasters, serfdom would’ve cost too much money and diverted manpower from more pressing issues. Peasants’ lives under early feudalism weren’t bucolic by any means; most lived on porridge three meals a day and died in childbirth or of plague. But at least their lords didn’t own them. If they paid their rent regularly, they lived mostly in peace.

One face of modern serfdom: the American meatpacking plant

Only after Turks no longer threatened Christendom, and international trade routes ensured everyone had access to plentiful food, could manorial lords spare money and time enough to throttle their subjects. The improvements which accompanied peace made it possible for feudal autocrats to curtail freedom in their domains. This was around the time European nations began “discovering” distant lands, keeping chattel slaves, and (officially) burning witches.

This violates official state narratives, of course. I, an ex-Republican, grew up on myths that prosperity creates liberty. Maybe it does, for certain individuals; but in the aggregate, prosperity flows upward, concentrating wealth and power in a bottleneck at the top. This matters today, when we’ve created a new economic class, the centibillionaire. Just as in late feudalism, we’ve created a minority too rich to care about the people whose lives they control.

When Walmart constricts its workers’ wages, it doesn’t only hurt Walmart workers. Because big-box discounters displace locally owned businesses, and the professionals those businesses support, like lawyers and PR personnel, Walmart cutting wages reduces the amount of money circulating in communities. Some large cities can absorb this pressure because they have economic diversity and a broad tax base. But small towns and suburbs have no choice but to accept enforced poverty.

The same applies to other industries. A century ago, America’s meatpacking industry was centralized in Chicago; now it’s diffused nationwide. This has created history’s cheapest, most abundant meat, a boon for America collectively. But it hurts workers individually: if Cargill undercuts wages (and they do, frequently), workers can’t take their skills across the street to Tyson or JBS Swift. The only major employer in town holds its workers hostage.

Perhaps this is why conservative nationalists fight so aggressively to constrain what schools teach: so ordinary Americans can’t see that history is not just, progress is not inevitable, and prosperity is often harmful. Serfdom only ended when the entire feudal economy collapsed, when technology ended agrarian society. Unless we resist this arc, it will happen again, ending in either widespread poverty or social collapse. Or maybe both.

Wednesday, September 20, 2023

Richard Madden and the Systems of the World

Richard Madden in Citadel, with Priyanka Chopra Jonas

Watching Amazon Studios’ recent over-the-top spyfest Citadel, I couldn’t help wondering why the protagonist, Kyle Conroy, looked suspiciously familiar. Oh, yeah, because he’s played by Scottish actor Richard Madden, who attracted global attention in 2018 when the BBC thriller Bodyguard became an international streaming sensation. Though Madden plays Conroy with an American accent, both stories feature Madden as a war-scarred veteran dragged back into somebody else’s war.

These two vehicles play very differently. Bodyguard is a conventional British police drama: gritty, unsentimental, and character-driven. Citadel is campy and overblown, despite its largely serious tone; it resembles the unintentionally silly James Bond films that murdered Pierce Brosnan’s take on the character and prompted the series reboot with Daniel Craig. Bodyguard is often visually murky, with jarring handheld camera work; Citadel favors oversaturated colors and elaborate sound design.

Importantly, Bodyguard features real-world politics. Madden’s character, Police Sergeant David Budd, fought in Afghanistan, and now works for London’s Metropolitan Police. He’s assigned to protect the Home Secretary, a powerful office within Britain’s Cabinet. Early episodes contrast Budd’s PTSD scars with Secretary Julia Montague’s strict authoritarianism; after an abrupt tonal shift, later episodes pit the Met’s civilian Counter Terrorism Command against the militarized Security Service, MI5.

Citadel features two fictional intelligence agencies. Both the titular Citadel, of which Madden’s Conroy discovers he’s a deep-cover agent, and the enigmatic Manticore believe themselves heroic. Citadel hunts and bags potential terrorists, while Manticore hunts Citadel, which it believes has grown corrupt. Both agencies have elaborate technology, an army of agents, bottomless funds, and global reach, despite being non-state actors. Who, we wonder, bankrolls these feuding Illuminati groups?

What these series share, besides Richard Madden, is a prior assumption that massive, shadowy systems control our lives. David Budd must investigate crimes which could destabilize British government, fighting an enemy that can make evidence vanish from locked rooms and air-gapped computers. Kyle Conroy (dba Mason Kane) must unlock secrets which two quasi-legal agencies want buried, many of which involve himself. Both men ask: am I sure I’m representing the good guys?

From the mid-1980s to the mid-2000s, multiple mass-media properties asked whether our lives are falsified. The Matrix, Dark City, and Star Trek’s later holodeck episodes spotlighted the idea that “reality” is only what we accept as reality, and that powerful people can deceive our senses to condition our acceptance. Citadel and Bodyguard signify a shift in focus away from reality itself, onto the people who control our ability to perceive reality.

Richard Madden in Bodyguard, with Keeley Hawes

We live, both series imply, beneath powerful structures that speak in our names, and make moral decisions for us, but which we don’t control. Nobody elected the Citadel, and while Secretary Montague was elected MP, she achieved her executive position through intra-party horse-trading. Violence, strategic deception, and force of law compel us to accept these unelected power structures, because we can do nothing about them except join opposite-number violent organizations.

Perhaps these themes are unsurprising. As we’ve acknowledged systemic concerns like “structural racism” or disaster capitalism, we increasingly understand how little individual control ordinary people have. Politics, economics, and war aren’t gods we can petition in temples; they’re forces, like hurricanes, that destroy everything they encounter. Doing right in politics or economics changes nothing, because we’re individuated and lonely, and the forces are systemic, impersonal, and huge.

Bodyguard and Citadel drew my attention because of Richard Madden, but both demonstrate how essentially powerless Madden’s characters are, despite their shared dedication to law and justice. And once aware of these themes, I started seeing them everywhere. Heart of Stone, a Netflix showcase for Gal Gadot, features a similar non-state intelligence agency that pervades everything, yet is so elusive that even MI6 can’t root it out.

The recent Equalizer movies with Denzel Washington, Netflix’s The Gray Man with Ryan Gosling, and the Mission: Impossible movies mostly don’t credit non-state actors with the kind of reach (and finances) only available to governments. However, they frequently feature government corruption, incestuous relationships between money and power, and people who profit unfairly from the status quo. These malefactors oppress our heroes, who often go rogue to root out corruption.

However, these heroes are defined as much by what they can’t do as by what they can. There’s no Chosen One, no Neo or Luke Skywalker to establish a just world. Rachel Stone, Ethan Hunt, Robert McCall, and Richard Madden’s heroes might remove corrupt operators, but they can’t dismantle unjust systems. They (and therefore we) can only reset broken systems to the status quo ante. Reality now exists, but reality is historically unfree.

Saturday, September 16, 2023

The New American Serfdom, Part Two

This essay is a follow-up to Walmart and the New American Serfdom
One of several stadiums built for the Qatar World Cup, which cost thousands of lives

In the independent Persian Gulf states, a stark and unbridgeable gap exists between the wealthy and the masses. Sort of like everywhere, really. Because of deals brokered by the retreating British and French colonial empires after WWII, a minority of people, descended from those who collaborated with the colonial masters, own the land. We know now, as Europe didn’t in the 1940s, that owning the land means owning oil-drilling rights.

The colonial collaborators who received this massive largesse are now, mostly, dead. Their children and grandchildren own the stakes and, because the empires created make-do monarchies, also own the government. No popular vote will ever compel Gulf States oil aristocrats to pay meaningful taxes or share the abundance created by exhaustive oil drilling. The Gulf States’ per capita GDP is among Earth’s highest, yet most of their residents are chronically poor.

Uneven wealth distribution forces Gulf States potentates to do something, anything, with their money. These countries have become synonymous with grotesque passion projects. Dubai’s man-made “Palm Islands,” golf resorts in the Saudi Arabian desert, and the Burj Khalifa are world-renowned, and mostly abhorred. Most recently, the rush-built stadiums for the 2022 FIFA World Cup in Qatar killed nearly 7,000 migrant workers, because money distorts the value of human life.

Those dead workers have haunted my imagination since that news reached world audiences. Migrants, mostly Indians and Pakistanis from rural agrarian backgrounds, rushed to Doha seeking work, because their homelands didn’t provide it. They, and their families back home, needed money, and Qatar had it. But Qatar had no incentive to protect workers, their skills, or their future productive value. Better to squeeze them now, even if it kills them.

Dedicated readers of American history—real American history, not the Florida version—will recognize this pattern. After the United States abolished slavery in 1865, several states, mostly former Confederate states, utilized a loophole in the 13th Amendment to arrest poor, and mostly Black, residents. These arrestees would be remanded into labor as payment for their “crimes,” which often involved being unemployed or not carrying arbitrary government ID.

Because the “buyers” for these convicted laborers didn’t own them (most buyers were former slaveholders, or their children), they had no incentive to keep their laborers alive and healthy. Shane Bauer and other writers describe taskmasters who literally worked leased convicts to death, denying them food, medical care, watertight lodgings, and other necessities. The mortality rate among leased convicts greatly exceeded that among antebellum slaves.

A photo of leased convicts in the American South, from the Smithsonian collection

Qatari migrant workers and American leased convicts alike had price tags set upon their lives. That price was so low that there was no material incentive to keep workers alive. Although they did highly valued work, including cane cutting, coal mining, and stadium construction, they didn’t own the product they made. And the value of whatever work they did exceeded the price of their own lives, making each human life a net loss on the ledger.

Last time out, I described Walmart cutting workers’ wages while artificially inflating stock prices as evidence of “serfdom.” Under medieval feudalism, peasants owned their own land, but paid their lord a portion of their yield as taxes. Serfs, however, belonged to the land, which their lord owned directly; instead of paying their lord taxes, their lord paid them wages, drawn from the land’s yield. Also, serfs weren’t free to leave.

This new serfdom, for which I lack a meaningful name, resembles medieval serfdom much as convict leasing resembles slavery. Just as convict buyers had no incentive to keep workers alive and healthy, so mass employers like Walmart, or the Qatari government, don’t have to pay living wages or provide basic protections. Unlike slaveholders or medieval lords, modern serf-masters outsource the unpaid labor of raising new workers to working age.

Therefore, modern serfdom creates novel pressures on women, the disabled, and—ironically enough—convicts, who have difficulty getting public-facing jobs. As stated before, employment (rather than work) has become not only a fiscal necessity, but a moral imperative. Those who can’t work outside the home because they have responsibilities, impairments, or a spotty personal history are characterized as not only lazy, but morally deficient.

Again, for American history readers, demagogues and dog-whistlers have always used “moral deficiency” to justify crackdowns on minorities. One common charge against Black Americans who later became leased convicts was “vagrancy.” Nor is this a historical outlier: Florida’s newest public school curriculum asserts that chattel slavery taught Black people job skills—that is, that slavery taught late-capitalist morality, whether anyone wanted to learn or not.

The New American Serfdom, Part Three

Friday, September 15, 2023

No, Don’t Give Up Your Dogs and Cats

My cat Max. Does this floofy face look like it could survive in the wild?

Do dogs exist? This may sound like a frivolous question, but hear me out. Because of selective breeding and other human intervention in canine genetics, the word “dog” refers to a panoply of domesticated creatures. It’s difficult to devise a definition of “dog” expansive enough to include breeds like chihuahuas and huskies, French poodles and St. Bernards, while also excluding closely related species, like wolves, foxes, and coyotes.

This week, former lifestyles editor Ellie Violet Bramley wrote an essay for The Guardian entitled The case against pets: is it time to give up our cats and dogs? Bramley, a dog owner herself, admits pet ownership brings humans great satisfaction and wellbeing. However, the relationship is unequal: pets descended from predator species are unable to roam, hunt, and otherwise fulfill their biological imperatives. Bramley considers these restrictions cruel.

I appreciate Bramley’s point. As a cat dad, I know my boys feel cribbed when kept indoors, and they’ve lost their natural predatory instincts; the pigeons living in the tree outside my door apparently enjoy taunting my boys when I permit them outdoor time. Imagine every rottweiler kept as an apartment pet by city dwellers who think a Saturday trip to the dog park is sufficient exercise. These animals atrophy for lack of a natural environment.

However, Bramley relies on what rhetoricians call an “essentialist” argument, that members of some group have some shared essence that we can’t define, but we know exists. In Bramley’s view, cats and dogs haven’t become essentially separate from their ancestors. Dogs remain, in essence, wolves, while housecats remain African wildcats. This is why evolutionary biologist Richard Dawkins rails against essentialist arguments: they blunt our ability to see change.

Bramley’s essentialism becomes visible if we even briefly imagine releasing our household pets. Will your teacup maltipoo crossbreed successfully bag enough prey to survive, breed, and integrate into the wild? Doubtful. Though Jack London romanticized dogs’ untamed nature in The Call of the Wild, most pet owners know their beloved critters have the survival instincts of a cabbage. They’ve adapted to live with human companionship and support.

I’ve written about this before. First, “nature” doesn’t really exist; humans shape every environment we encounter, for good or ill. Simultaneously, though, “nature” is remarkably resilient, rushing into niches humans leave vacant and creating its own adaptable ecosystems. Household animals have become what James Paul Gee calls man-made monsters, biological entities adapted to live with and serve human needs. Other man-made monsters include pigeons, raccoons, and… humans.

My cat Pele, trying to channel his ancestral African wildcat

A popular internet meme describes cats as “apex predators” trapped in cute, fluffy bodies. But examinations of feral cats’ stomach contents reveal they mostly survive on garbage, not prey. That’s why feral cats proliferate in cities and towns, but scarcely exist in wilderness environments: because without humans, they’re helpless. With proper veterinary care and nutrition, household cats frequently live twenty years, but feral cats’ life expectancy is about three years.

Dogs, especially mutts, are somewhat more adaptable, but not much. The numbers speak for themselves: Britannica estimates there are approximately 65,000–78,000 wolves in North America, while best estimates indicate around ninety million household dogs in the United States alone. Friendly dogs flourish in human companionship, while standoffish wolves suffer. If boredom is the price of prosperity, well, millions of human cube farmers can appreciate that.

If Americans, just Americans, turned their pets loose, most wouldn’t flourish. Millions would stand outside their former owners’ doors, howling to come back inside where the food and water bowls live. Those that accepted their fate would, mostly, get devoured by coyotes, raptors, and other predators, swelling predator numbers and distorting the ecosystem. The few surviving pets would become predators or scavengers themselves.

Human intervention in animal genomics made these changes, and these changes cannot be unmade. Our pets, livestock, and scavenger species exist, regardless of Ellie Violet Bramley’s moral qualms. Because humans created these man-made monsters, we have a responsibility to steward them. This means spaying and neutering, as Bob Barker insisted, but also making sure cute fuzzy animals have the exercise, mental stimulation, and pack companionship they need.

Our ability to steward our domesticated animals parallels our ability to steward the entire environment. Our pets suffer from the same late-capitalist demands currently causing rapid global warming: exorbitant working hours, habitat destruction, and a carbon-dependent economy. Relinquishing our responsibilities to our pets won’t make anyone happier, and could blunt our ability to comprehend the pressures destroying the world outside of our man-made doors.

Thursday, September 14, 2023

Walmart and the New American Serfdom

Last week, Walmart, still America’s largest employer, announced it would reduce starting wages for new hires by a dollar an hour. Though no existing workers will see reduced wages, Walmart retains its fiscal advantage, partly, through constant employee turnover; it’s banking on shedding most current workers quickly and refilling their ranks at lower wages. Under ordinary circumstances, this might anger customers and workers, but would be forgotten quickly.

Except that same Walmart announced a $20 billion stock buyback program in November. That means Walmart had $20 billion in unallocated revenues, and rather than spending it on wages, benefits, or corporate development, they rededicated that money to manipulating stock values by creating artificial scarcity. Contra the Reaganite belief that wealthy people plow revenues into the common good, Walmart is actively using its surplus to take value off the market.

On one level, this demonstrates the falsehood underlying libertarian thinking: most people might do right without compulsion, but the rich absolutely won’t. They became rich through resource hoarding, market manipulation, and wage theft, and they won’t stop unless somebody in authority forces them. On another level, though, this demonstrates not only an economic outcome, but a social principle. We’re witnessing the return of a modern technocratic serfdom.

Throughout my lifetime, Americans have accepted a public ethic that employment is a moral imperative, regardless of anything produced. We expect teenagers to get jobs at McDonald’s, Walmart, or the mall the minute they’re physically capable of working and handling the equipment safely. Not only have we accepted that teenagers should work, to gain putative “skills,” but we accept the obverse: that low-skilled, poorly paying jobs are for teenagers.

This moral imperative to work continues into adulthood. Since the 1990s, America has tied receipt of poverty protections, like EBT and WIC, to having a job—regardless of whether regions and communities have employment available. This commitment is bipartisan: the work requirement was shepherded through Congress by Newt Gingrich, and signed into law by Bill Clinton. Administrations representing both major American political parties have preserved this requirement ever since.

The current administration seems unlikely to reverse this. As Joe Biden said at the 2020 Democratic National Convention: “A job is about a lot more than a paycheck. It’s about your dignity…. It’s about your place in the community.” Biden has repeatedly stressed work’s moral component ever since, speaking some variation of his 2020 bromide: “My economic plan is all about jobs, dignity, respect, and community.”

All due respect to Brandon, but nobody applies to Walmart seeking dignity. The uncomfortable uniforms, Spartan conditions, and dismal hours are compounded by customers who believe themselves entitled to treat poorly compensated employees abusively. People seeking dignity work for themselves or their families—where that’s feasible. People seek employment outside the home because they need to get paid. Dignity and community aren’t negotiable currencies to cover rent and groceries.

So people need employment, but we’ve cheapened work by making employees ubiquitous. Walmart stores live and die by workers’ contributions, but when the corporation has excess revenue, it doesn’t return that money to employees. It enriches those who own stocks, which are inert legal documents that produce no value without workers. The fact that America’s largest employer can devalue wages during a period of surplus revenue demonstrates that America doesn’t value work.

When workers don’t own the product of their labor, and work only for the enrichment of “owners” who don’t manage workplaces, this increasingly resembles serfdom. Conditions are worse than medieval serfdom, though, because lords of the manor at least had an obligation to their serfs’ health and defense. Modern aristocracy, by contrast, has liberty to cut employees loose, retain the value of labor already provided, and return revenues to the ownership class.

This pattern repeats throughout the economy. When schoolteachers in red states struck for better conditions in 2018, and again during the actors’ and writers’ strikes right now, we hear defenders of the status quo insisting that workers should value their work, and gain satisfaction from jobs well done, regardless of pay. Whenever necessary workers demand more respect and better pay, some asshole inevitably says, “just quit.” It’s like clockwork.

Like slavery before it, this modern serfdom isn’t sustainable. Workers demand pay because they have obligations: housing, clothes, food. But modern aristocrats pay below the subsistence level necessary for these goods. Housing prices are worse now than they were before the 2008 housing bubble meltdown. Public health is as bad as it was before COVID-19. After twenty years of lessons, the aristocrats running our lives have learned nothing.

The New American Serfdom, Part Two
The New American Serfdom, Part Three

Saturday, September 9, 2023

The Great Stratford-on-Avon Noise Machine

I don’t like deleting anyone from my online friends lists. I sometimes hold onto friendships that have shuffled along, zombie-like, for years. To get ejected from my social media lists, you generally must pick fights, engage in personal insults, or use my page to spread lies. Recently, I had someone attempt all three, and it involved one of literature’s most tedious questions:

Did Shakespeare write the works of Shakespeare?

Anti-Stratfordianism is a pseudoscience, akin to antivax or Flat Earth conspiracies. Adherents “prove” their propositions, not through evidence, but by “poking holes,” finding supposed inconsistencies in the documented narrative, and “just asking questions.” As my overuse of scare quotes indicates, anti-Stratfordian arguments rely on obfuscations and innuendo, not evidence. Yet my now-ex-friend insisted I must “engage” every specious argument, or surrender the debate.

That, immediately, should’ve been a clue. Flooding the market with unsourced innuendo or anecdotes, then claiming victory on every point somebody can’t immediately rebut, is the tactic of ufologists and Bigfoot hunters, not serious social scientists. Bullshit artists, like Steven Crowder with his “change my mind” schtick, love barraging the unprepared with binders full of photocopied talking points, demanding on-the-spot answers.

My ex-friend began by demanding to know why we should consider Shakespeare the author of Shakespeare, when we have little documentary evidence of his life. As though the absence of documents, in a time when creating and storing documents was expensive, proves anything. In fairness, professional doubt manufacturers use this same technique on Homer, Socrates, Pythagoras, and Jesus Christ. Rhetoricians call this the “argument from ignorance.”

Jon Finch (left) and Francesca Annis in Roman Polanski's Macbeth

Sometimes we must focus on gaps in our knowledge. Law enforcement and counterterrorist experts do this frequently. But formal argument considers this fallacious in most contexts, because absence of knowledge usually proves little, except that nobody can document everything. Shakespeare, a poor boy from the provinces, didn’t merit physical documentation until relatively late in life.

Well, the anti-Stratfordian argues, what about Shakespeare’s lack of education? Most playwrights of the English Renaissance attended Oxford or Cambridge; how could Shakespeare write great literature without academic credentials?

Yes, most late-Elizabethan playwrights attended universities. We remember Christopher Marlowe, Robert Greene, George Peele, and Thomas Nashe as the “University Wits.” Greene wrote a notorious pamphlet condemning Shakespeare as an uneducated bumpkin. However, Greene’s surviving plays are scarcely worth reading. Of the University Wits, only Marlowe bears reading now—and he nearly flunked out of Cambridge, until the Privy Council intervened in the Queen’s name.

Insisting that Shakespeare couldn’t write well without a university education is classist. As a former university composition teacher, I can attest that some people write well without higher education, others write poorly with higher education, and some write well despite higher ed. (When I mentioned social class, my ex-friend said I’d engaged in “ad hominem attack.” Not so, sir. If there’s a fallacy there, it’s hasty generalization.)

The anti-Stratfordian shifts tactics: how could Shakespeare write about foreign lands so authoritatively? We have no evidence he ever left England. (Again with the “argument from ignorance.”)

Except Shakespeare didn’t write authoritatively about foreign lands. In Hamlet, he misnames Denmark’s royal palace, and gives nearly every Danish character a Greek or Latin name. He sets several plays in Italian cities, including Verona, Venice, and Padua. Each features explicitly English scenes, including women speaking in public (verboten in Renaissance Italy), court cases argued on English common law, and common English courtship rituals. Shakespeare’s “foreign lands” are exotic names draped over English scenes.

Kenneth Branagh as Hamlet

Next, the anti-Stratfordian demands I explain how Shakespeare, a poor country boy, could write about aristocracy. How could a provincial merchant’s son understand noble households and aristocratic families?

First, Shakespeare’s playing company had aristocratic sponsorship. As first the Lord Chamberlain’s Men, and later the King’s Men, Shakespeare’s company were members of the royal household, and therefore aristocratic insiders. Even without that, though, the poor have always understood how the rich think, in ways the rich never understand the poor, because the poor have to. Then as now, social class matters.

Finally, my ex-friend deployed the low blow: why was I getting emotional? “You don’t sound,” he wrote, “like a dispassionate academic here.” I realized that he’d gone for the troll argument, saying provocative things until I lost my composure, then crowing over my anger. His statement was a prettied-up version of “U mad bro?” So I blocked him.

I don’t brook bad-faith argument or underhanded tactics. I won’t engage future anti-Stratfordian arguments, and if anyone tries them on me ever again, I’ll show them this narrative, then mute them forever. It’s better than they deserve.

Friday, September 8, 2023

New From the Stephen King Factory!

Sadie Harper (Sophie Thatcher) investigates the dark recesses of her house, in Rob Savage's The Boogeyman

Rob Savage (director), The Boogeyman

Dr. William Harper is better at dispensing psychological advice to his clients than at following his own principles. Recently widowed, he and his two daughters have dealt with his wife’s passing by refusing to deal with it. But a walk-in client tells a story of intense suffering, as a nameless monster has apparently destroyed his family, one by one. Neither man realizes that telling the story has let the monster into Dr. Harper’s house.

This movie takes a similarly titled 1973 Stephen King short story as its inspiration, but not really its source. It dispenses with the story’s driving characters by the end of Act One, and focuses primarily on Dr. Harper’s teenage daughter, Sadie, played by Sophie Thatcher. Sadie tries to maintain balance between her father, who remains terminally mired in denial, and her sister, Sawyer, who has become frightened of her own shadow.

Although this movie doesn’t faithfully adapt King’s story, the screenwriting team of Scott Beck, Bryan Woods, and Mark Heyman have created a story that could’ve come whole-cloth from King’s repertoire. There’s the family conflict, the tension between science and preconscious fears, the monster older than time, and the resolution that… well, shelve that temporarily. Because this particular resolution is less interesting than the journey to reach that point.

The walk-in client, Lester Billings, vandalizes the Harper house, including the late Mrs. Harper’s art studio, which has remained sacrosanct since her meaningless death. William and Sadie then find Billings hanging in Mrs. Harper’s walk-in closet. They think they’ve endured the worst already, but that night, little Sawyer hears voices coming from her closet. The monster has invaded the Harper house, though nobody believes Sawyer. Yet.

Sadie Harper embraces the pain of her mother’s passing ardently. She wears her mother’s clothes, displays her mother’s artwork, and hasn’t thrown away the last brown-bag lunch Mom packed, although it’s gone rancid. Sadie embraces mothering her sister, because the pain and responsibility give her life meaning. But she hasn’t learned to trust others, or to recognize their pain as equally real to hers. Her peers make that process no easier.

Watching the monster’s nature unfold, I couldn’t help recognizing Stephen King components. The monster lives in closets, basements, and disused rooms—the id of suburban American homes. Compare Pennywise, who emerges from the municipal sewer. Like Pennywise, this monster turns out to be older than time; it feasts on its victims’ fear, and once it has touched you, it never entirely goes away. This is a Stephen King monster rewritten by ChatGPT.

Likewise, this movie recycles conventional Stephen King themes of family trauma, isolation, and lack of community. Without visible neighbors or friends to rely upon, or traditional networks like the church or Lions Club, the Harper family must fight their grief alone. However, as each survivor is stuck at a different stage of Freudian development, they’re all ultimately alone. Modernity is lonely, in Stephen King’s world, and trauma makes everything worse.

King’s novels usually feature a congeries of outcasts, loners, and misfits unifying to confront an overwhelming horror. We see this in The Stand, or his Castle Rock novels. But at the risk of repeating myself, the Harper family most closely resembles the Losers’ Club that confronts Pennywise. When we see the monster, it even has the janky, seriocomic tone of Pennywise’s primordial form from the 1990 TV miniseries.

So let’s revisit that resolution we previously shelved. (Spoiler alert.) The Harpers have, together, finally accepted that the Boogeyman is real and inside their house. They put aside their differences, band together, and destroy the monster, though it costs them dearly. Then we see a family-based, very huggy moment where they accept one another—another King trademark—before Sadie discovers the monster is defeated, but not dead.

Stephen King has become such a thoroughly reliable commercial brand that Beck, Woods, and Heyman can create a Stephen King story without King’s actual involvement. Like a sweatshop producing unlicensed Louis Vuitton handbags, this movie is an okay knockoff that should satisfy casual fans, although dedicated purists might notice some wonky stitching. It’s scary, sure, but it works primarily by providing audiences with tropes they know and appreciate.

Some professional critics disparaged this movie’s reliance on jump-scares and conventional haunted-house atmospherics. Fair play, perhaps, though the jump-scares made me jump. But between those jolts, this movie provides a story so familiar, you could wear it like a Snuggie. And that’s exactly what a certain subset of the audience wants: the comfort of knowing they’re watching a familiar, time-tested Stephen King story.