
Saturday, October 25, 2025

The East Wing and America’s Disposable Economy

An NBC News photo shows the East Wing demolition in progress this week

Although a majority of Americans reportedly disapprove of this week’s demolition of the East Wing, the disapproval breaks largely along partisan lines. The White House extension, first built in 1902, stood between the president and his long-sought “ballroom,” and approval of the replacement tracks closely with personal support for the president. Less interesting to me than the exact details, though, are the underlying beliefs the demolition exposes.

I cannot help perceiving the president’s actions through my seven years in the construction industry. Unlike the president, who comes from the moneyed, white-collar end of property development, I worked the jobsites, cordless impact driver in hand, tape measure hanging off my belt. The president works with schematic drawings, models, and CAD renderings. For each of us, the other’s perspective is an abstraction, separate from our particular job.

To architects, contractors, and project managers, only the finished product matters. The process is as ephemeral as pixie dust. Management wants to complete construction on time and under budget; everything else serves that goal. Existing structures, built with outdated technology and, probably, with hand tools, are an impediment. To the property developer, historic preservation is a costly nuisance that impedes progress, while demolition is cost-efficient.

Historic preservation is, ultimately, an expression of sentimentality. The Democratic-aligned objection to the East Wing demolition has focused heavily on historic photographs taken among the picturesque colonnade, or memories of the officials—especially First Ladies—who conducted White House business in those offices. They revere the now-lost East Wing for the same reason you’d rather see the church where your parents were married restored than razed.

But American economics doesn’t place dollar values on sentiment. The president ran all three of his campaigns on rhetoric of cost efficiency, profitability, and business acumen. The philosophy of shareholder value, which has dominated American economics since at least the 1970s, sees ordinary people not as moral agents, but as arbiters of price. A historic building is worth exactly what someone will pay to outbid the developer who’d rather flatten it.

The president with one of the hand-sized architectural models of his grandiose monuments, which he loves waving around for journalists

My first construction industry job involved building a new city high school to replace the nearly seventy-year-old existing building. The city’s cost-benefit analysis deemed the existing building too expensive to save. This decision distressed many people who’d lived in town their entire lives and had deep-seated formative memories associated with the building. But their feelings weren’t enough to justify the tax bill that costly restorations would require.

Many public and municipal buildings, including schools, government offices, and universities, still have exterior facades built from slab marble, sandstone, and other valuable materials worth preserving. But those are merely cosmetic. The rise of steel-frame architecture means that a small number of girders carry the building’s weight. Interiors are built from tin studs, gypsum drywall, and poured concrete, materials with a maximum life expectancy of about fifty years.

Commercial buildings and houses are even worse. Most such buildings constructed after the Eisenhower Administration need significant restoration after around thirty years, because cinder block masonry, OSB subfloors, and aluminum HVAC systems will just rot. Several American suburbs, once desirable and exclusive, are now saddled with the blight of abandoned malls and big-box retail stores: too decrepit to use, too expensive to restore, and too dangerous to demolish.

Nor is this accidental. Since the Levittown building boom of the 1950s, architecture’s core principles have shifted away from durability and toward cost reduction. The White House’s facade hasn’t been much amended since its cornerstone was laid, under George Washington, in 1792. My city’s former high school, by contrast, decayed so spectacularly that I broke pieces off the seventy-year-old facade by kicking them with a steel-toed boot.

Our president claims the East Wing demolition became necessary after discussion with architects, but I don’t believe that. After seven years in the industry, I know that, to the bean counters who control modern design and construction, demolition is always the first choice. Unless government agencies or private conservationists step in to prevent it, contractors regard old buildings as a private cost, not a community asset.

To America’s capitalist class, everything is as disposable as an aluminum soda-pop can. People like our president measure buildings’ value not by their historic legacy, but by their cost efficiency. The East Wing was worth less, to him, than whatever supposed value he can extract from his extravagant “ballroom.” The same will apply, eventually, to your historic downtown, your local schools, or your house.

Nothing, anywhere, is safe from the bulldozers and grandiosity of America’s ruling class.

Sunday, January 19, 2025

Building New Houses Isn’t the Solution

This is what it usually looks like when Americans just build houses.

Supposedly, many Americans voted Donald Trump back into the presidency, in part, to protest the continuing rise in housing costs. Despite promises from both parties, housing prices continue outpacing household incomes, and the ratio of price to income is now worse than it was before the 2008 market collapse. Several pundits, including many I respect, have emerged to repeat the same mantra: let’s build more housing units.

The logic seems facially robust. Presumably, most readers learned the “supply and demand” principle in high school, that price emerges from an equilibrium of how much buyers want something, and how many units sellers have available. Therefore, if prices rise, it follows that supply is scarce—and it is. But this overlooks what forces rendered housing scarce, and what measures markets can take to counteract this scarcity.

Consider, for instance: hedge funds and other financial institutions have purchased private housing as investments. These homes nominally exist, and nominally have value. However, to work as investments, their value must constantly rise faster than the overall market, which, as we’ve seen, it does. If these funds sold their holdings, they’d flood the market, driving prices down. Therefore they must continue hoarding housing off the market.

This interpretation, however, is only part of the story. Many cities that generate significant employment face built-in limits to physical growth. New York is built mostly on a series of islands; Chicago is built on reclaimed swampland; San Francisco is built on a rocky peninsula. Even if BlackRock and other funds divested their holdings, these cities can only expand so far before hitting physical boundaries.

Other cities have policy-based growth limits. Many cities built on abundant flat land, like Omaha, near where I live, have urban designs based on R1 zoning, the principle of favoring freestanding, detached houses on separate lots. Fully eighty percent of Omaha’s land area—and comparable shares in similar cities like Minneapolis, St. Louis, and Tulsa—is legally restricted from mixed-use developments or multifamily housing.

Therefore, building more houses in America’s heartland will require either changing the law or building more urban sprawl. Despite cities having a reputation for car dependence, as Jeff Speck writes, most cars driven in cities come in from suburbs and R1 neighborhoods. And that’s saying nothing about the virgin prairie, swamp, or other ecologically valuable land that developers must destroy to accommodate R1 construction.

And where geography or policy don’t limit development, there’s economics. Rust Belt cities like Detroit, Gary, and Cleveland grew rapidly following World War II, infused with federal wartime and postwar spending. But when government money retreated, and manufacturing supply chains moved to Asia, the cities dwindled again. These cities actually have plentiful housing, most of it rotting, because there’s no employment or other development to generate demand.

Different conditions in different cities reflect the diverse influences that cause cities to develop, or shrink. I’m reminded of James C. Scott, who analyzed the means by which centralized development plans have created inequality, environmental devastation, and social collapse. These redevelopment schemes have shared an imposed quality: scholars, bureaucrats, or well-meaning but purblind revolutionaries thought they knew better than local communities, and simply issued demands.

Scott contrasts centralized planning with “local knowledge.” Old, unplanned cities, like central Bruges, Belgium, or Manhattan south of Houston Street, are so intricate and winding that only lifelong locals understand their street layouts. Yet these unplanned developments reflect regional geography, community, and economics. What seems sloppy and chaotic to the government planner, actually serves local needs to a T.

America’s federal government has a history of funding dystopian development projects. Rapid expansion in the Rust Belt, for instance, led directly to the same cities’ abandonment. Levittown-style suburban sprawl has always required transfusions of government money, only to create joyless “communities” that young residents aspire to leave. Now advocates call for “building more houses,” heedless of local need, planning regulations, or regional economy.

Well-meaning advocates, and their government allies, want to offset harsh economic conditions. But they impose one-size-fits-all policy recommendations that don’t reflect local needs. In some cities, there’s no more land for construction, while in others, new construction will create sprawl that leaves residents isolated and strands the aged or disabled. And aggressive construction, fueled by diesel, will inevitably create environmental devastation.

Creating decentralized, regionally specific solutions is time-consuming, expensive, and difficult. But mass-produced central policies have already created messes we aren’t prepared to repair. We won’t fix the consequences by doubling down on the underlying problem.

Friday, September 6, 2024

An Oasis in the Desert of Reality, Part Two

The original Oasis lineup from 1994, as per NME
This essay is a follow-up to An Oasis in the Desert of Reality

The recently announced Oasis reunion was followed almost immediately by something that’s become widespread in today’s music industry: price gouging. Euphemistically termed “dynamic pricing,” the online sales structure lets ticket outlets jack up prices in step with surging demand. This means that, as tickets became available for the paltry seventeen dates around the UK and Ireland, computers hiked prices to £350 ($460) per ticket, beyond the band’s working-class audience’s budgets.
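
For readers curious how such a scheme works in principle, here is a minimal sketch. It is not Ticketmaster’s or any vendor’s actual algorithm; the face value, cap, and demand figures are all invented for illustration. Demand-responsive pricing amounts to scaling a base price by how far demand outruns supply:

# A minimal sketch of demand-responsive ("dynamic") ticket pricing.
# Hypothetical throughout: no real vendor publishes its formula.
def dynamic_price(base_price, tickets_available, buyers_in_queue, cap=5.0):
    """Scale the face value by demand pressure, capped at a maximum multiplier."""
    demand_ratio = buyers_in_queue / tickets_available
    multiplier = min(max(demand_ratio, 1.0), cap)  # never below face value
    return round(base_price * multiplier, 2)

# Invented example: 70-pound face value, 10,000 seats, 1.4 million fans queuing.
print(dynamic_price(70.00, 10_000, 1_400_000))  # hits the 5x cap: 350.0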

American audiences could summarize the exorbitant prices with two words: Taylor Swift. Tickets for Swift’s Eras Tour originally sold for $49. Quickly, however, a combination of forces, including the Ticketmaster/LiveNation merger, unauthorized resale, and limited access, bloated prices to over $4000 in some markets. Swift’s mostly young, mostly female audience base obviously can’t afford such numbers. In both cases, predatory marketing ensured that the fans best positioned to appreciate the respective artists could least afford access.

But both acts share another characteristic: a monolithic grip on their markets. Why? Such dominance perhaps made sense when the Beatles upended the music industry in 1963 and 1964, when fewer media options ensured a more homogenous market. But advances in technology have granted listeners access to more radio stations, innumerable streaming services with nigh-infinite channels, and more opportunities to share music with formerly niche fandoms.

Yet increased diversity produces a paradoxical outcome, embodied in the crushing demand for limited tickets. As listeners have access to more artists, more styles, and the entire history of recorded music, demand has concentrated on a smaller number of artists. I’ve noted this trend before: as the buy-in to create and distribute music has become more affordable, the actual number-one position on the Billboard Hot 100 has become less diverse.

Some of this reflects the way conglomerate corporations throttle access. Sure, entrepreneurial artists can record music in their bedrooms at higher quality than the Beatles dreamed possible, and distribute it worldwide almost for free. But approximately half the music market today travels through one outlet, Spotify. And, as Giblin and Doctorow write, Spotify’s algorithm actively steers listeners away from indie, entrepreneurial, and otherwise non-corporate options.

Taylor Swift in 2023

Three corporations—Sony, Universal, and Warner—currently control over eighty percent of the recorded music market. That’s down from four corporations controlling two-thirds of the market in 2012, the year Universal absorbed EMI. (EMI, in turn, owned Parlophone, the label which published the Beatles.) Without support from the Big Three, artists have limited access to publicity, distribution, or the payola necessary to ascend Spotify’s ladder.

Aspiring musicians might, through good business management, make a middle-class living through constant touring and indie distribution. But without a major-label contract, musicians can’t hope for basic amenities like, say, a full weekend off, much less enough pay to buy a house. And with only three corporate overlords controlling the supposedly large number of major labels, musicians will always be at a severe disadvantage.

Sure, the occasional musician might beat the odds and challenge the Big Three oligarchy. Taylor Swift notoriously forced both Spotify and Apple Music to pay overdue royalties to backlist artists a decade ago by withholding her lucrative catalog. But most musicians, even successful artists with chart hits, can’t expect to ever have such influence. Citing Giblin and Doctorow again, many artists with chart hits still bleed money in today’s near-monopoly market.

This structure encourages blandness and conformity, from both artists and audiences. Oasis sounded new and different thirty years ago, but they’re now comfort listening for a middle-aged, middle-income British audience. Taylor Swift samples styles and influences so quickly that she’s become a one-stop buffet with something to please (or displease) everybody. Both artists give audiences what they expect to hear—assuming they can hear anything over the stadium crowds.

Collective problems require collective solutions. Start by supporting politicians who enforce existing antitrust and anti-monopoly laws—and not supporting the rich presidential candidate who brags about not paying contractors. But we also have private solutions, starting with attending concerts and buying merch from local and indie artists who reinvest their revenue into their communities and stagehands. Tricky for some, yes, as most music venues are bars.

The problem isn’t hypothetical; we have changed in response to a diminished market. As Spotify, LiveNation, and the Big Three have throttled the music industry, we listeners have responded by accepting diminished standards, and consuming blander art. Though we have the illusion of choice, we allow the corporations to channel us into a less diverse market, and buy their pre-screened art. Independence is neither cheap nor easy, but it’s urgently necessary.

Monday, July 8, 2024

“Deaths of Despair” and the Working American Man

The expression “deaths of despair,” once an exclusively scholarly term, became commonplace somewhere around 2019. It describes the heightened mortality among poor and working-class Americans from alcoholism, drug overdose, and suicide. The problem is both class-based and distinctly American, and apparently mostly affects men. When it entered public awareness, “deaths of despair” were a disproportionately White phenomenon, though recent years have seen it rising among Black Americans.

My post-college career has caromed from education to manufacturing, to construction, to marketing, and most recently, back to manufacturing. (FWIW, knowing how to do high-skilled professional work differs markedly from knowing how to find high-skilled professional work.) This rapid oscillation gives me a binocular perspective on the American economy. While the sources I consulted for this essay emphasized comorbidities like obesity, diet, and poverty, I suggest “deaths of despair” are caused by economic propaganda.

Working-class American men continue marinating in post-WWII messages emphasizing male self-sufficiency. Though most have accepted they’ll probably never again support a spouse and kids on a single paycheck, they continue hearing messaging that they should. This is especially true for younger working-class men raised on digital technology, as the phallocentric world of alt-right messaging fetishizes a Little House-ish strain of bucolic libertarianism that nowadays exists mostly on TV.

Ian Haney López notes that the White Americans most likely to oppose “welfare” and other forms of federalized help are the White Americans most likely to need it. This happens, he writes, because “welfare” is racially coded in American political propaganda: Black people need help, White people should be self-reliant. This claim becomes somewhat flimsier with the recent rise in Black deaths of despair. So I don’t want to call Haney López wrong, but his position needs to be expanded.

In addition to race, federalized help is also coded as criminal. In this telling, accepting government money is a form of stealing. Not corporate subsidies or PPP loans, certainly, which are necessary to maintain a well-oiled market, but any government money you’re eligible to receive is, essentially, picking the taxpayers’ pockets. That’s why thought leaders keep tying EBT to drug tests, criminal background checks, and other forms of screening to identify crimes. Because anybody who needs help is, a priori, a thief.

Therefore, whenever a worker reaches a point of destitution where asking for help becomes necessary, a moral switch flips. The demographic raised on mythology of friendly cops, law’n’order, and “just comply” finds itself asking: “Am I seriously committing a crime?” This trips them into the same guilt spiral healthy people experience whenever the intrusive thought of killing somebody appears. They feel compelled to stop the trajectory.

Unfortunately, the two situations aren’t parallel. When that intrusive “you could just strangle him” thought appears, we short-circuit the process by just not strangling him. The same simple solution doesn’t appear when the impoverished worker starts considering asking for help. Because even if they never actually request help, the poverty doesn’t just disappear. The conditions that made the request necessary linger, and the worker remains as incapable of bootstrapping as ever.

Please note, “crime” is something we do, but “criminal” is something we are. If you strangle somebody, the event has a clearly delineated beginning and end. But you never stop being a murderer, even after you serve your sentence. This becomes especially true when one’s crime isn’t a legal definition, but a moral category—think “crimes against humanity.” The crimes judged at Nuremberg didn’t transgress any written law, but Euro-American society’s definitions of human decency.

Social stigma leaves the criminal wearing an invisible scarlet letter. Unfortunately, in situations of moral judgment, that letter may be so invisible that only the person wearing it sees it. Have you ever seen working-class men asking their buddies for help? It’s almost impossible for most men to seek financial help without lowering their heads and shielding their faces. Yet their buddies, though perhaps not cash-flush, will cheerfully open their wallets to the extent possible. Because they don’t judge the seeker like the seeker judges himself.

American working-class men have internalized a message that seeking help, especially federalized help, is theft. Therefore, needing help equates to “criminality.” And what do we do with criminals? We judge and punish them. If a man’s friends, family, and community don’t punish him in the way he expects, he’ll punish himself, often with the vigor of a medieval flagellant. To prevent “deaths of despair,” we must stop racing after individual self-harm, and change the message working men receive and believe.

Friday, March 24, 2023

The Myth of Laziness in Modern Society

Devon Price, Ph.D., Laziness Does Not Exist

Dr. Devon Price believes we’ve all willingly swallowed “The Laziness Lie,” a narrative that ties human worth to productivity, self-sacrifice, and beauty. If we aren’t constantly economically productive, giving of ourselves for others, and pretty, we’re disparaged as lazy. This matters because “laziness” is framed as an entirely personal fault: if we don’t measure up to somebody else’s arbitrary yardstick, the system is exonerated, and we become individually bad people.

In brief, the behaviors we disdain as laziness in others, and in ourselves, serve solid purposes. When we’re fatigued, bored, and alienated from the fruits of our labors, we mentally check out from work. When loved ones impinge upon our limited mental energy with their demands, we become resistant and resentful of our relationships. When we’re disabled and literally can’t do any more, despite society’s demands to “push through,” we ultimately collapse.

Price therefore asserts that what we malign as laziness, is often our bodies or brains indicating that we’ve pushed too far. Fatigue or mental resistance are how our physical selves broadcast that they’ve been overextended. But our culture and economic structure demand we ignore our feelings as “irrational” and continue overworking ourselves. Our unelected leaders present fatigue, sleep deprivation, and pain as indicators of great moral virtue.

“Laziness” has a relatively recent origin. The word isn’t even attested in English, Dr. Price writes, until the middle fifteenth century—not coincidentally, about the time Western Europeans began keeping chattel slaves. Though global civilizations have always loathed idleness (in various ways), slave-owning economics necessitated inventing the “work ethic,” a codified way of blaming workers for needing rest, even when somebody else owns the fruits of their labors.

Price traces the development of “laziness,” as a moral pejorative, through industrialization, into modern technological society. Under capitalism, basic human needs like housing, food, and sleep, become rewards we earn by sacrificing ourselves to a forty-hour week. Sickness, weariness, and disability become something weak-minded people must endure stoically, not signs that we need to stop. Vacations and free time are luxuries, not human needs.

Devon Price, Ph.D.

Not surprisingly, Dr. Price dedicates around half his book to workplace ethics and “laziness,” and postulates multiple causes and solutions. What capitalist morality calls laziness, Price demonstrates, is a wholly reasonable response to bad pay and lack of autonomy. The mental barriers that demand we occasionally stop working, are only “bad” because they deprive owners of wealth. And workers can resist this exploitation by presenting a united front.

More surprisingly, the other half of Price’s book goes beyond economics. Our relationships with others are often colored by fears that, if we don’t sacrifice enough of ourselves, we don’t love our friends and family enough. We often feel “lazy” and unworthy if we can’t support enough political causes. Even health and beauty frequently reek of “The Laziness Lie,” as disabled people and anybody who’s endured fat-shaming can tell you.

All these situations share a moral component. Whether it’s work, weight loss, or giving ourselves over to demanding kinfolk, we’re expected to perform with the constant, unrelenting efficiency of industrial machinery. But that’s just it, Price writes: we’re not machines. We’re not built from interchangeable parts. We’re complex, unique individuals with personal needs, and when we, or the people around us, ignore those needs, our bodies break down.

(Besides that, machines break down if they don’t rest occasionally.)

As a research psychologist and Loyola University professor, Price structures his arguments eloquently. He shifts seamlessly from individual experiences and case studies that demonstrate his point, outward to statistical analysis showing what those experiences mean for the rest of us. This is classic textbook organization: what’s true for the individual proves portable to the population overall. Only the personal is truly universal.

Admittedly, many of Price’s case studies show evidence of selection bias. Everyone given the opportunity to describe their experience with burnout is college-educated and relatively stable. One person describes homelessness in the past tense; nobody’s pulling doubles at the factory or overnights at McDonald’s to afford childcare and keep a Dodge Challenger with expired tags running. If Price asked me, I’d love more economic diversity in his case studies.

Briefly, Price says, actions have purposes, even if we’re not conscious of them. When people start resisting externally imposed standards of productivity, attacking them (or ourselves) as “lazy” uses moral imperatives to hand-wave away the underlying problem. Until we find and address the purpose behind supposedly lazy behavior, nothing will change. When we accept our resistance as a sign of necessary repairs, we’ll finally fix our economy, and ourselves.

Monday, February 21, 2022

But Someone’s Gotta Do It

Eyal Press, Dirty Work: Essential Jobs and the Hidden Toll of Inequality In America

Few Americans have seen an untrimmed chicken carcass with their own eyes, but we know such things exist. Somebody, somewhere, turns living livestock into the styrofoam-backed meats Americans consume in such quantities. Similarly, somebody has to push the “kill” button on unmanned military drones, someone has to house the prisoners created by America’s “tough on crime” policies, someone has to extract hydrocarbons from the mine. We sure won’t do it.

Journalist and NYU sociologist Eyal Press borrows the term “Dirty Work” from Everett Hughes, whose post-World War II research examined how citizens of totalitarian nations dealt with shared culpability for government atrocities. Ordinary citizens, Hughes determined, created a mental barrier. Some work was necessary, but innately dirty. Therefore a designated underclass handled the distasteful responsibility of, say, conducting pogroms and prosecuting wars of conquest.

Dirty Work, Press emphasizes early, is different from dirty jobs. Some jobs, like construction, agriculture, or infrastructure maintenance, are widely regarded as somewhat crummy. But Dirty Work is morally tainted, and that moral condemnation transfers onto those who do it. But that work remains necessary, because the population wants the work done. Workers performing Dirty Work become agents of society’s collective id, and we reward them with our disgust.

Press focuses mainly on three categories of Dirty Work in America: prison staffers, military drone pilots, and meatpackers. Combining a broad statistical survey of each field with close, intimate stories of people doing the work, he guides readers through these fields’ intricacies. He unpacks the sorts of people who accept, even seek, these jobs. And he demonstrates how they express our collective will, even as we publicly disown them.

Prisons suffer a paradoxical push-pull influence. Like schools, Americans see prisons as necessary means of enforcing laws and creating order. Candidates campaign on pledges to build, and fill, new prisons. But also like schools, prisons are first on the chopping block when budgets get cut. This becomes especially dangerous because, with the mass closure of state-funded asylums in the 1970s, prisons are now America’s leading warehouse for the mentally ill.

Eyal Press

America’s military became reliant on drones because we still expect to maintain global military dominion, but distrust committing troops. Drone pilots, working from bases in places like west Texas or northern California, save the Administration the PR headache of deployments, while keeping America active in stopping militants and terrorists. But because of their technology, individual drone pilots kill far more targets than soldiers ever could. And many suffer catastrophic PTSD.

Industrial meatpacking plants satisfy America’s appetite for cheap, plentiful meat, but at great cost. Animal rights activists love showing footage of how livestock suffer on the killing floors. But plant workers are inordinately likely to be maimed, even killed, while those who emerge physically unhurt suffer PTSD symptoms, including nightmares and hair loss. Like prison guards and drone pilots, they’re despised in their communities, underpaid, and punished for speaking out.

These workers have significant commonalities. Ruralism, for one: prisons, drone bases, and meatpacking plants are built well away from population centers. Prison workers and today’s military are also likely to come from rural areas, where often, these secure government jobs are the only reliable source of income. The workers, and their work, are easily ignored because they remain isolated on the periphery of “polite” American society.

They’re also subject to bipartisan contempt. Whenever whistleblowers in these industries come forward, conservatives ridicule them as crybabies who don’t appreciate the opportunities these jobs offer. But progressives, who historically attribute poverty and crime to systemic, rather than individual, causes, make exceptions for workers doing Dirty Work. Progressives hold these workers individually, not their employers or the laws which regulate them, culpable for their industries’ worst excesses.

Press delves into other Dirty Work, like immigration enforcement and hydrocarbon mining, though in less depth. For comparison, Press has a chapter on “dirty tech,” high-tech innovations that make society less free, like Google making alliances with Communist China, or Facebook selling personal data to Cambridge Analytica. But, Press notes, dirty tech workers aren’t personally tainted. They’re free to quit with minimal penalties, and seldom held individually responsible.

Overall, Press describes a stratified economy where despised workers get thrust into despised work. This exposé channels historically respected journalists, from Upton Sinclair to Eric Schlosser, who have revealed the two-tiered rot in American society. The product is chilling. And maybe, this time, we’ll learn enough from the lessons Press offers not to repeat these mistakes. Though admittedly, the precedent isn’t good.

Thursday, October 7, 2021

Who Owns the Thoughts In Your Head?

Tucker Carlson, corporate puppet

In a week notable for explosive media revelations, corporate meltdowns, and weirdness, yesterday’s news about One America News Network (OANN) stood out. The disclosure by Reuters that OANN receives about 90% of its operating revenue through AT&T and its subsidiaries grabbed me violently. Yes, that same AT&T, which owns WarnerMedia, home of Ted Turner’s CNN, and also HBO, which broadcasts Last Week Tonight with John Oliver.

I can’t stress this enough: one umbrella corporation has investments in America’s conservative, centrist, and left-wing media operations. While OANN has cultivated explicit fondness for Donald Trump and conservative nationalism, Oliver has become so influential in leftist circles that observers have described a “John Oliver effect.” CNN, meanwhile, leans incrementally left, but attempts to maintain the traditional illusion of centrist neutrality.

Exactly where each broadcaster leans, however, matters little overall if they’re beholden to the same corporations. Though AT&T doesn’t own OANN directly, OANN’s dependence on service providers in which Ma Bell owns a controlling interest means that AT&T functionally bears responsibility for the network. And the same parent corporation has demonstrated it will sell audiences whatever pre-sliced bologna will line its pockets. Fundamentally, the corporation doesn’t care.

In today’s media-saturated environment, most audiences have liberty to select the news source they consider most reliable. This means networks actively court audiences by offering them viewpoints audiences consider trustworthy—which, usually, means selling viewers their own opinions, slightly polished. Telling audiences what they already believe has proven to be a lucrative business model. The creation of mass media conglomerates has made moral backbone optional.

John Oliver, corporate puppet

Most citizens lack the resources necessary to find news without a corporate gatekeeper. The accumulation of news sources under owners like William Randolph Hearst or Rupert Murdoch has meant that we depend on owners’ probity to maintain our information. But both Hearst and Murdoch have proven their probity, um, lacking. Both have long histories of massaging publicly available information to suit their preferred outcomes.

Nor is the problem limited to news.

I remember learning some years ago that somebody owns the Bible. Sort of. Though the best-selling English translation is still the King James (Authorized) Version, which is in the public domain, the best-selling translation in contemporary English is the New International Version. And that translation, being a specific arrangement of words, is still under copyright. The NIV Bible belongs to Zondervan, which draws royalties on major text citations and reprints.

Fair enough, but who owns Zondervan? Though founded independently in a farmhouse, Zondervan is currently a subsidiary of HarperCollins, itself a subsidiary of Rupert Murdoch’s NewsCorp. HarperCollins also owns Avon, publisher of Anton Szandor LaVey’s Satanic Bible. The same company sells both sides of the fence. Rejecting religion doesn’t solve the problem; many of public atheist Richard Dawkins’ books are published by Houghton Mifflin, a subsidiary of… HarperCollins.

The corporations we ordinary people depend upon to disseminate morality, democracy, and knowledge, have their fingers in the opposite pie. Though it’s hypothetically possible that these corporations maintain boundaries between their divisions, we essentially have to trust the parent companies that they’ve achieved this. And they haven’t proven themselves wholly trustworthy when they make promises about internal mechanisms.

Rachel Maddow, corporate puppet

One could continue. Two corporations, Molson Coors and Anheuser-Busch InBev, control ninety percent of America’s beer production. This includes such putative competitors as Leinenkugel, Shock Top, and Elysian. Though microbrews appear periodically—just like micropublishers and micronewspapers—it doesn’t take long before market forces, high overhead, and a closed market persuade many to sell out or fold. These are just examples; check sources on your favored industry to find more.

The companies that provide the news, religion, and beer we prefer also probably market the news, religion, and beer we despise. No individual can possibly comb the registers to find the connections, and when shopping, no individual brain can recollect the incestuous networks that control our choices. Overwhelmed and under-informed, we resort to the classic fallback position favored by customers throughout history: buying whatever’s convenient, even if it’s openly toxic.

Collective action seems necessary, but I can’t envision what that looks like. We can’t trust the government to regulate this inbred sludge, since the sludge manufacturers bankroll our politicians. It’s tempting to plead for revolution. But considering how Communism collapsed quickly into Stalinism and Maoism, that option becomes equally unappealing. When the state nationalizes the Inbred Sludge Industry, it becomes invested in making more inbred sludge.

Yet I fear the consequences of surrendering to fatalism. What solutions remain? I’ve run out of ideas.

Thursday, September 9, 2021

What, and Who, Is Art For?

Banksy, Snow, 2018 (source)

Someone recently hit me with a shopworn Banksy quote: “Art should comfort the disturbed and disturb the comfortable.” The anonymous British graffiti artist, whose success is as much a triumph of public relations as artistry, rewrites an axiom beloved by creative professionals, satirists, clergy, and politicians worldwide: “comfort the afflicted and afflict the comfortable.” This chiasmus has precedents in the Bible, where Mary says:

He has brought down rulers from their thrones
    but has lifted up the humble.
He has filled the hungry with good things
    but has sent the rich away empty.

Sounds great, certainly. But one wonders what exactly this means, coming from Banksy. Earlier this year, one of Banksy’s canvases sold for $20 million, a new personal best, and a price range beyond anything we pedestrians could afford. Though Banksy became famous for semi-illicit guerilla work in outdoor spaces, the artist’s ability to continue making public art is subsidized by producing portable canvases which only the insanely wealthy can afford.

I began actor training in the early 2000s, when one couldn’t go thirty feet in any American theatre department without hearing somebody loudly singing excerpts from Jonathan Larson’s Rent. That play’s exhortations against post-Reagan malaise and the racism and casual homophobia of the late Twentieth Century took on an explicitly rebellious edge. The play implied all of Manhattan’s bohemian Alphabet City would rise against stultifying conformity and change the world.

Twenty years later, my acting career sputtered following some poor choices; Alphabet City has gentrified; and though casual homophobia isn’t enshrined in law anymore, the revolution never actually came. As YouTube critic Lindsay Ellis has explained, theatre often embraces the rhetoric of insurrection. But it absolutely requires the financial backing of corporate donors and rich patrons, because soaring overhead means theatre bleeds money most of the time.

Artists, including me, consider ourselves incipient revolutionaries. We have messianic delusions that, like Mary’s Magnificat, we’ll overthrow rulers and raise the proletariat. (Mary sounds almost Marxist.) Yet art regularly loses money, and requires someone else’s generosity to cover the bills. Maybe it’s slightly better in Britain, where public subsidies mean the working class can afford theatre, orchestra, and opera tickets, but even that makes art beholden to the state.

Since my acting career has translated strictly into community theatre, I’ve discovered how risk-averse management frequently is. Not only will boards avoid anything with raunchy themes or controversy, which is perhaps understandable, but they’ll also avoid anything too new. I’ve witnessed decision-makers moot the idea of producing material written locally, but it always dies quietly, as only something road-tested on Broadway will likely pull audience numbers sufficient to entice sponsors.

Jackson Pollock, Blue Poles, 1952 (source)

This means our company will never do anything likely to challenge our community’s religious, political, and economic suppositions. I don’t even mean Augusto Boal’s sometimes self-righteous precepts that theatre shows society to itself, exposing our worst sins and social rot. Sam Wasson writes that Paul Sills and Mike Nichols invented Chicago improv to wrest theatre from elites, yet the companies soon became dependent on premium ticket prices sustained by high demand.

It’s tempting to say these points describe in-person art, which always suffers from high overhead. What, one wonders, about recorded music and movies, which can amortize costs across much larger audiences? Even that doesn’t bear scrutiny, as most recording artists lose money and depend on concerts and personal appearances to get paid. And with Disney’s acquisition of Fox, Lucasfilm, ABC, and Marvel, one company has a stranglehold on Hollywood.

Fundamentally, art needs the system it rebels against to remain solvent. Not even profitable, but just above water. Artists individually may believe we’re doing God’s work, but in the aggregate, like politicians, we kowtow to whoever carries the checkbook. Great art, like a pearl, originates in friction and suffering, yet we wind up defending the system we abhor, because sooner or later, we get hungry. We can’t help becoming complicit.

These complaints aren’t unique to art. I’ve heard parish pastors voice the same struggle: they ultimately can’t drive the moneychangers from their temple, because somebody has to keep the lights on. Politicians campaign against the very fundraising practices that subsidize their eternal reelection bids. I only focus on art here because, as I struggle to find buyers for the manuscripts I’ve written, I still need rent and groceries.

But I’ve surrounded myself with art, and its baggage, too long to read quotes like Banksy’s without flinching. That statement seems directed at me. But, like Banksy, I still need someone to buy my art.

Monday, August 23, 2021

OnlyFans and the Anti-Democracy of Sex Work

OnlyFans' business model has, until now, tacitly depended on user-made porn

Last week’s announcement that pay-to-play social network OnlyFans will discontinue hosting “sexually explicit” material is creating some controversy. First, OnlyFans hasn’t yet provided a meaningful definition of “sexually explicit,” on a site whose entire financial model until now has centered on women with their blouses off. Second, what motivated the change? OnlyFans claims that credit card companies brought the hammer down; MasterCard has responded (I’m paraphrasing) “Nuh-uh!”

OnlyFans launched by promising to exclude the intrusive advertising content that often slows social networks like Facebook or Twitter. But its customer base quickly drifted to the one commodity people are consistently willing to pay for in today’s media-saturated environment: sex. As with pay-cable TV, people will pay for content that ad-based mainstream media won’t supply. And as with drugs, the economic driver for sex work proves persistent.

We’ve seen this happen recently. Craigslist famously shuttered its “erotic services” personal ads because it feared the federal government would use anti-human trafficking laws to act against them. This fear wasn’t unfounded, since the Department of Justice used exactly such laws to seize and dismantle Backpage, a resource sex workers used to find and pre-screen customers. Both actions were taken to assuage fears of “human trafficking,” a notorious bugbear.

Historically, conservatives have objected to sex work generally, and prostitution specifically, because they believe sex workers and their customers are just bad people. Progressives take a more nuanced approach, expressing concerns about coercion and exploitation, which aren’t entirely unfair. But progressives reflexively lump all sex work together with human trafficking, though the two aren’t synonyms. The reasoning from Left and Right differs; the outcomes, for most sex workers, are indistinguishable.

Internet marketing promised to break this Left-Right duopoly. Remembering the giddy effusion directed toward the nascent read-write Web in the 1990s, I recall advocates like Howard Rheingold gushing that, because nobody owned the Internet, it diffused authority to individuals. Many early Web advocates were aging ex-hippies, pleased that technology had finally fulfilled their long-dormant promise to seize authority from The Man and distribute it, with egalitarian fervor, to the masses.

OnlyFans, like BackPage before it, represents a breakout in economic democratization. Individual women, needing income during the pandemic’s economic shakeout, seized their destinies by offering their services online. The Internet gave them the ability to screen customers, to set the terms of their labors, and to decide when or whether they were prepared to work. Internet sex work is almost Marxian in its dedication to individual empowerment.

If, like me, you internalized a narrative from childhood that sex work is degrading and shameful, this might not seem ennobling. But sex work has low barriers to entry, is a legitimate economic expression, and is, at bottom, work. You, individually, may disapprove, but that doesn’t matter. Like alcohol, tobacco, and other vices, the economic drivers behind sex work don’t await popular acceptance. These drivers already exist for women needing work.


The MindGeek cartel controls so many adult sites that the market is essentially not free

Equally importantly, when OnlyFans, Craigslist, and Backpage stop supporting self-employed sex workers, the work doesn’t go away. As with narcotics, large cartels step into the vacuums created when governments squelch independent operators. The largest cartel, Montreal-based MindGeek, already controls such a large fraction of online sex work that it essentially sets the terms for smaller companies; as with Rockefeller’s Standard Oil, startups have to compete directly with MindGeek.

Moralists may applaud OnlyFans’ willingness to deplatform “sexually explicit” content, despite the pledge’s vague wording. But by taking away self-employed women’s ability to flash their tits for pay, OnlyFans has created a backdoor subsidy for MindGeek and other “adult content” oligarchies. Like Standard Oil, DeBeers Diamonds, or Harvey Weinstein, MindGeek has an extensive list of abuse accusations, and because of their market share, nobody dares challenge their misuses of power.

Though OnlyFans sex workers used the corporate-owned platform to find their markets, and paid OnlyFans for the access, the company nevertheless gave women (and others) who needed income an opportunity to set their own terms. It helped democratize sex work. Reducing the number of opportunities to find work doesn’t make either the sellers’ needs or the buyers’ demands disappear. It simply channels the economic pressures onto MindGeek and other monopolies.

MindGeek is the Al Capone of Internet sex work. Removing a platform honest, hardworking women use to score lucrative sex work doesn’t encourage them to find other employment, especially in times like these, when other employment is scarce. It simply increases the power of the cartel to exploit, and profit from, the work of disfranchised laborers. OnlyFans’ anti-porn stance makes everyone less free.

Wednesday, June 23, 2021

The Speculative Bubble and the Birth of Modern Capitalism

Thomas Levenson, Money For Nothing: the Scientists, Fraudsters, and Corrupt Politicians Who Reinvented Money, Panicked a Nation, and Made the World Rich

The South Sea Bubble, an economic surge that helped solidify nascent capitalism in 1720, maybe isn’t as sexy as, say, Holland’s Tulip Mania of 1637. It hasn’t captured popular imagination the same way. Yet by helping invent mature bond markets and consolidating the power of the finance industry in modern politics, it ushered in the industrial “ownership society” we live with today. And it feels chillingly familiar.

Thomas Levenson, MIT Professor of Science Writing, seems an unlikely historian of Great Britain’s newly invented bond markets. His prior works for general audiences have focused on science icons like Galileo, Newton, and Einstein. Yet during the early Eighteenth Century, many British scientists, including Newton, reinvented themselves as financiers and civil servants. Their faith in mathematics and empiricism helped create a “scientific” approach to early modern economics.

This book covers roughly the period from 1693, when Parliament struck a deal with William of Orange to nationalize the public purse, to 1722, when the South Sea Bubble nearly bankrupted Britain. Before 1693, the public purse was basically the monarch’s personal resources, available to boost the crown’s personal glory. Taxes could supplement public funds, but decisions were tied to the monarch personally, making futures trades on public funds virtually impossible.

Once Britain’s national funds belonged to Parliament, it became possible to organize and sell derivatives. London had two markets then: the official Exchange, and the unofficial, unregulated coffee shops of Exchange Alley. Laws creating joint-stock corporations were more restrictive then, meaning fewer companies existed, and trades often occurred informally. Without standards, buyers often didn’t know what futures were really worth; stock-jobbing was a casino.

Into this unregulated market swaggered Parliament, and its new invention, the Bank of England. Britain needed cheap money to support ongoing wars with its traditional rivals, France and Spain. So Parliament began floating derivatives on the Exchange: annuities, pensions, and even lotteries. These derivatives had recently been invented by government mathematicians, including Newton and Edmond Halley (of comet fame), and the market was really an experiment. Failure was always a possibility.

Thomas Levenson

Alongside the government’s gambling, the South Sea Company emerged, willing to take another huge gamble: establishing trade between London and Spain’s Caribbean colonies. One problem: Britain and Spain were at war, again. While waiting for traditional enemies to resolve their differences, South Sea executives derived elaborate business plans based on Newton and Halley’s actuarial tables. Those plans looked so good, Exchange Alley began salivating.

Soon, the South Sea Company began seeing lucrative stock valuations, based not on anything they’d actually manufactured or sold, but upon the putative value of their future plans. Company principals used this valuation as leverage for their own borrowing. In so doing, they invented a rudimentary form of modern credit swap, creating actual monetary prices for promises. South Sea money wasn’t based on gold or land; it was based on futures.

Anyone familiar with the Great Recession will anticipate where this is headed. Because early government attempts to borrow on credit involved annuities with absurdly long payouts, Parliament was saddled with old debt when it needed to finance new. The Bank of England saw a convergence of Parliament’s needs with the South Sea Company, and devised an idea entirely new in Western finance: privatizing the public debt.

Through a Parliamentary action too complex to summarize, the South Sea Company began hoovering up public debt and translating it into common stock. Stockholders began receiving dividends based on government obligations, while the company continued conspicuously doing nothing. To untrained eyes (which included most of Exchange Alley), this looked like free money. Jealous investors began buying in, and stock prices soared. Many people became rich… on paper.
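
To make the mechanics concrete, here is a toy sketch with invented figures (the real 1720 subscription terms were far messier). Roughly: the company could issue £100-par shares equal to the debt it absorbed, but it converted annuitants at the market price, keeping the surplus shares to sell for cash. The higher the stock climbed, the more the company pocketed, which gave it every incentive to talk the price up:

# Toy sketch of the South Sea debt-for-equity conversion.
# All figures invented; the actual 1720 subscription terms were messier.
def convert_annuity(debt_value, market_price, par_value=100):
    """Swap government debt for company stock at the market price.
    The company may issue shares worth debt_value at par, so whatever
    it doesn't owe the annuitant becomes surplus stock to sell."""
    shares_to_annuitant = debt_value / market_price
    shares_authorized = debt_value / par_value
    surplus = shares_authorized - shares_to_annuitant
    return shares_to_annuitant, surplus

# A hypothetical 1,000-pound annuity, converted with the stock at 300:
holder, surplus = convert_annuity(1_000, 300)
print(round(holder, 2), round(surplus, 2))  # 3.33 shares out, 6.67 kept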

Levenson includes an epilogue comparing the South Sea Bubble to the Great Recession, but not extensively. He trusts informed readers to recognize the parallels. What happened on Exchange Alley in 1720 looks rudimentary compared to today’s credit-default swaps and other derivatives, but in principle, the two are basically identical. And when the crash came, London had two choices: repair this newly discovered system, or return to the old ways.

To their credit, Britain’s investors chose wisely.

Alongside this new economy came a new government. As Levenson relates, the monarch becomes increasingly irrelevant, while an eager young politician, Robert Walpole, consolidates power, and becomes something new: a Prime Minister. The rise of early-modern economics helps create the early-modern government. Though Levenson says it’s risky to suggest our world was birthed around 1720, that implication clearly exists.

We’re living in the South Sea’s shadow.

Saturday, June 5, 2021

Hard At Work in Post-Labor America

It’s that time in the crisis cycle again: time for the self-righteous and wealthy to remind everyone how disconnected they really are. As America re-opens, and we have to make painful choices about how to rebuild the wreckage of our former economy, some people start flaunting their privilege by whining about the injustice of it all. I just got accustomed to lockdown, they whimper; why start back up again now?

First, cartoonist and essayist Tim Kreider imposed upon Atlantic readers with “I’m Not Scared to Reenter Society. I’m Just Not Sure I Want To.” Despite its promising title, it doesn’t address the implied question of whether our prior corporatist-hellscape lives were worth returning to; Kreider instead mewls about how a lifestyle of “solitude, idleness, and nihilism” has become more appealing than work. Kreider needs less Pfizer, more Prozac.

Then a survey report emerged, saying that “64% of workers would pass up a $30k raise to work from home.” As big-tech and financial-services companies, which let millions of workers telecommute during the pandemic, desire a return to normality, many workers aren’t interested. They prefer work-from-home conditions, which allow them the liberty to task-shift. Bored writing code or handling customer-service emails? Take fifteen to do laundry or heat a frozen burrito!

Both these voices sound superficially familiar. Who hasn’t yearned, periodically, to shirk work and malinger indoors, watching Netflix? And anybody who’s done white-collar work knows the eight-hour jive is moral rather than practical; we can’t focus that long on tasks for somebody else’s reward. Both the desire to jettison employment altogether, and the desire to work under home conditions, suggest a desire to refocus employment on workers, not bosses.

But.

Both stories bespeak mostly unexamined levels of privilege. Tim Kreider admits early that he doesn’t particularly require an income; he crashes with friends. And the survey found that supposed $30k refusal mainly among “Zillow, Twitter, and Microsoft employees.” We aren’t talking about people making sandwiches or scrubbing toilets, work that can’t be done remotely. Millions of Americans kept working through the pandemic, or didn’t get paid.

While Kreider didn’t need to work, and Microsoft restructured its work requirements, I and millions of others fell into a third category. Our jobs, declared “essential” for America’s thriving economy, kept going, and we needed to show up. These jobs weren’t without price, either; many low-paying service workplaces became superspreader sites. The media called grocery clerks “heroes” for doing their jobs, as if they had a choice.

My construction labor was considered “essential.” But the things I worked to achieve, participation in community arts or amateur sports or just hanging out with friends, became suddenly lethal. Without a family, I moved from my apartment to my job and back like a convict on work-release. I had neither the luxury of chrysalis-like oblivion, like Kreider, nor the liberty to schedule my own day, like the statistical Microsoft workers.

This leaves me with a seemingly inevitable conclusion. I don’t need work tweaked, whether through a raise (though one would be nice) or through telecommuting options. Rather, I question the structure of employment altogether. These sixteen months have made literal what Marx and Lenin described metaphorically: the ownership economy steals all meaning from work, while the rewards go to those who merely own things.

Fast-food franchisees complain that workers refuse employment because the government offers a pitiful stipend. ($300 per week annualizes to $15,600 per year—or approximately what Elon Musk makes every three seconds.) Others claim workers will return when they have other perquisites, like affordable child care, universal health insurance, and paid family leave. All these arguments assume people want rewards which have a detectable price tag.
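
For scale, a back-of-the-envelope sketch using the $7.25 federal minimum wage then in force: the stipend annualizes to roughly a full-time minimum-wage paycheck, which says more about the wage than about the stipend.

# Back-of-the-envelope comparison: pandemic stipend vs. full-time work
# at the $7.25/hour federal minimum wage (the figure in force in 2021).
WEEKLY_STIPEND = 300
FEDERAL_MINIMUM = 7.25
FULL_TIME_HOURS = 40 * 52  # 2,080 hours per year

stipend_per_year = WEEKLY_STIPEND * 52                     # 15,600
minimum_wage_per_year = FEDERAL_MINIMUM * FULL_TIME_HOURS  # 15,080.0
implied_hourly = stipend_per_year / FULL_TIME_HOURS        # 7.5

print(stipend_per_year, minimum_wage_per_year, round(implied_hourly, 2))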

But I’d contend workers want some meaningful connection between work and reward. Not a specific reward; they want control. The pandemic gave Tim Kreider the justification to indulge the moody self-abnegation inherent in his cartoons. It gave the abstract tech-industry workers freedom from being chained to the desk. It gave hourly workers freedom from… having anything to do or anywhere to go after hours. Not exactly the best trade-off for some.

These stories, and the traction they’ve received in the last several days, reflect the relative privilege held by the American punditocracy. The fact that cube farmers don’t want to return to the cube isn’t news; did they ever want to be there? Yet the existential rootlessness that the last sixteen months amplified among the working class remains ignored. Unless, of course, it stops editors from getting their three-martini lunch.

Sunday, May 30, 2021

Freelance Restaurant Workers: a Modest Proposal

Promo still from the award-winning 2007 indie film Waitress

As the “labor shortage” drags on, with both sides blaming each other for America’s struggling service industry, maybe it’s time to reevaluate the market. Our service industry remains beholden to a 19th-century model of employment, where workers are obligated to management, and management in turn dispenses pay, benefits, and task assignments. But Americans’ growing unwillingness to accept that model at today’s pay scale suggests the model has outlived its usefulness.

Sure, I know, millions of Americans will insist it’s the pay scale that’s shuffling on, zombie-like, beyond its productive life expectancy. Our minimum wage hasn’t changed since 2009 ($7.25 per hour), while rent has increased by over half. And for tipped workers—meaning mostly food-service workers—the minimum wage hasn’t shifted since 1991 (a flat $2.13), during which time rents have nearly tripled. Okay, by that narrow, prescriptivist model, our wage structure is egregiously out-of-date.

But clearly, with today’s governmental structure, proposals to update the wage base are a dead letter. Three Administrations, representing both major parties, have essentially shrugged and admitted their options are few: nobody will support higher wages for America’s underserved, they say, so let’s not even try. The Biden Administration campaigned explicitly on improving pay for working Americans, then surrendered three months into its term. Stop wishing on a star.

It’s time to admit: tipped staff don’t need management anymore. Businesses that hire tipped staff, which again means mostly restaurants, should stop hiring staff altogether, and staff should stop applying for these jobs like Oliver Twist with his bowl, begging: “Please, sir.” If waitstaff’s absolute wage floor hasn’t increased since my high school days, they clearly don’t need wages at all. They’re already living on tips; why stop there?

Since over two-thirds of tipped workers’ pay has to come from tips just to equal the federal minimum wage (the $2.13 tipped floor leaves tips covering $5.12 of the $7.25 minimum, roughly 70 percent), and that minimum is already absurdly low, why not the rest? Most food-service work is wholly standardized: the numbered arrangement of tables, the digital systems for entering orders, the dress and behavior codes. Unless your waitstaff wears company-branded clothing, there’s little to distinguish the crew at one restaurant from nearly any other in America.

Restaurateurs should simply maintain a bulletin board with available tables. Aspiring waitstaff arrive during peak hours, claim as many or as few tables as they feel comfortable serving, and voilà! The staffing problem resolves itself, because waitstaff no longer work for the restaurant. They work for their individual customers, get paid in tips, and keep everything they make. Restaurants are off the hook altogether.

Edmonton, Alberta-based chef Serge Belair at work

Owners should embrace this change, because it means they needn’t hire, train, or schedule staff anymore. Workers simply arrive, and work until they believe they’ve earned enough. Workers should favor this because they needn’t feign any particular loyalty to restaurants that provide lousy pay and benefits. Letting waitstaff go completely freelance frees owners and workers alike from the burdens which employment (as opposed to work) brings.

Moreover, freelance status will give waitstaff more authority over the “I don’t tip” clientele. Under current conditions, some people excuse their refusal to tip by saying “it’s the restaurant’s responsibility to pay workers.” If servers go completely freelance, though, customers who refuse to pay them have literally stolen services; they’d be thieves, exactly like customers who skip out on their tab now.

I can anticipate the likely counterarguments. What if not enough waitstaff want to work during lucrative meal rushes, and restaurants find themselves pleading for help? What if workers only want to freelance at posh restaurants where customers are subdued and respectful? For every objection, I have the same response: that sounds like a “you problem,” and you should work to cultivate a more polite, better-paying customer base.

Indeed, changing the service industry’s worker-employer model would, arguably, expose the roots of problems that make such work undesirable now. If some minority of customers behaves boorishly, making work unbearably nasty, maybe ask yourself what you’re doing that rewards such behavior. And if workers are so reluctant to arrive during peak hours that you can’t plan ahead effectively, we return to the same solution everyone offers: pay better.

I’ve had the pleasure of knowing many waiters, bartenders, and baristas; all tell warm tales of regular customers with whom they became friends. Restaurants, coffee shops, and bars often become the beating hearts of local communities. Yet people leave the industry with disheartening frequency, usually for one reason: bad relationships with management. We can solve this problem by removing management completely.

Tuesday, May 11, 2021

The Capitalist War on Free Markets

Nebraska Governor Pete Ricketts

It’s no secret that I dislike Nebraska governor Pete Ricketts. I voted against him twice, and cannot fathom how anybody thought him prepared to administer our state. So last week, when he announced his cheap “beef passport” gimmick, a glamorized rewards card to encourage Nebraskans to eat more beef, I initially disregarded the stunt. Governor Ricketts is notorious for low-risk stunts that draw attention but accomplish little.

Then Wyoming said “hold my beer.”

A bill percolating through Wyoming’s legislature would authorize the state to sue other states that don’t purchase Wyoming coal. Traditionally a ranching state with little manufacturing, Wyoming transitioned into coal-mining and related industries through the 1960s and 1970s. I have cousins working the Wyoming coal pits, so my sympathies certainly lie with coal miners. But this is a mind-bogglingly stupid answer to the situation.

While Pete Ricketts uses the “carrot” approach to manipulating markets, Wyoming governor Mark Gordon uses the “stick.” Ricketts creates meaningless rewards to trick people into believing they have something invested in continuing their behavior. Gordon just threatens those who don’t comply, knowing that even if his threats don’t work, resisting him will be costly. Neither seems willing to let market forces apply, and I find that telling.

These governors, both Republicans, represent the party that nominally believes Capitalism, which they define as market forces, is inherently good. Remember, Republican leadership spent 2020 insisting that providing working Americans a little salutary support during the worst pandemic in living memory was tantamount to creeping Bolshevism. Yet when faced with market forces that don’t reward the status quo, from which their states have profited, they turn virtually Trotskyite.

Retail stores create loyalty programs to tie customers, particularly young customers without time-tested buying habits, to one store. Purchase five hamburgers, get the sixth free! It creates the illusion of commitment. But be serious: people already either eat red meat or don’t. Despite Colorado Governor Jared Polis’ recent attempt to bring back meatless Mondays, no carnivores are likely to forgo beef unless they’re officially compelled to.

Wyoming Governor Mark Gordon

Coal, meanwhile, is disappearing from markets because it’s unnecessary. Burning hydrocarbons is environmentally reckless, especially coal, which is tainted with sulfur. Equally important, cleaner energy generation technologies, like wind, solar, and nuclear, have advanced sufficiently that they’re cheaper than coal. Maintaining coal-burning technology simply to invent a market for a resource that’s costly, dirty, and outdated is far more anti-capitalist than stimulus checks during the pandemic.

Yet here we are. Can you imagine elected officials manipulating markets to keep other technologies afloat? Whale oil, muzzle-loaders, Betamax. Historically, technologies have become outdated, products have fallen out of demand, and people who made those products needed to realign themselves to the changing market. But these Republican governors, one with known Presidential aspirations, have chosen another tack: seizing control of markets to stop unwanted change.

As an ex-Republican, I find that amazeballs. The party I formerly supported believed markets drove morality: that if something was worth money, people would pay for it, and if nobody would pay for it, that proved it worthless. That’s why Republicans resist public recycling programs, research into alternative energy, and other attempts to fix broken markets. A dollar freely spent, to paleoconservatives, is the ultimate yardstick for public morality.

Until, apparently, it isn’t. Ricketts and Gordon both bring their offices’ might to bear in ensuring markets obey their whims. Ricketts, born rich, and Gordon, a successful rancher, have profited mightily from markets that favored their products and services. Yet apparently, both believe the market forces which propelled their families to prominence should continue unchanged forever. When people want less meat or less coal, these governors decide the public’s demand is wrong, not the market.

I say this knowing that literally unrestrained markets have never really propelled American economics. Our government created growth markets by chasing Native Americans off their land, then repackaging the spoils as “homesteads.” We encouraged certain development schemes by putting dams, with their cheap electricity, along the Colorado River. America enjoys (if that’s the word) some of Earth’s cheapest gasoline because our government subsidizes petroleum.

Ricketts and Gordon’s maneuvers aren’t deviations from paleoconservative economics; they simply say aloud what we news junkies have always known. Our government picks winners, then manipulates markets to ensure the early adopters, mostly campaign contributors, are well rewarded. It always has. These aspiring leaders of Republican ideology are merely the first to admit it out loud.

It’s up to us, the voters, to ensure they wear the shameful consequences of their actions for all the world to see.

Tuesday, April 6, 2021

A Dollar For Your Soul, Please

What Lil Nas X Means to Me, Part Two

Max Barry’s 2003 satirical dystopia, Jennifer Government, utilizes a common Cyberpunk trick of name-dropping real-world corporations abusing quasi-governmental power. In the story’s MacGuffin, the Nike shoe corporation plans a “guerrilla” marketing campaign. To create artificial demand for ordinary athletic shoes, the company orchestrates a series of murders, killing teenagers for their shoes. It works, and the unremarkable shoes become a hot commodity, too valuable to wear.

I remembered Barry last week, when Lil Nas X, the only rapper ever to top the Billboard country charts, released his new Satan shoe. Everything about this shoe appears unremarkable. It’s a repurposed Nike Air Max 97, a similarly quotidian shoe retailing, according to the Nike website, for about $170. In collaboration with a New York art collective, LNX alters the shoes with supposed satanic medallions, inflammatory logos, and one drop of human blood. Then he charges over $1,000.

LNX’s Satan shoe channels previous pseudo-scandals created by professional media manipulators. The lightweight blasphemy recalls Madonna’s 1989 “Like a Prayer” video, or virtually everything Marilyn Manson has ever done. Marvel Comics and the band KISS released a 1970s comic book featuring, like the shoe, human blood in the ink. LNX seemingly designed this shoe to ask: How many times can the squares believe the same overhyped bullshit?

The squares responded: at least one more. Prominent Republicans, TV preachers, and conservative pundits flooded social media last Monday with condemnations and repudiations. Like clockwork, predictable sources claimed this cheap publicity stunt meant we’re engaged in a theological struggle for America’s shared soul, or something, and offered proof, proof I say, of the moral cesspool youth culture has become. It’s sadly, dishearteningly predictable.

And I say that as a Christian.

Importantly, while White cishet Christians throw theatrically public tantrums, they’re foregrounding LNX’s message: that, as a gay man raised in a conservative church, he spent years living with an internalized message that God hated him. It’s a dishearteningly familiar experience. Encouraged to consider themselves damned, children embrace that as their identity and, like millions of heavy metal meatheads everywhere, wave their anomie in everybody’s faces, because it’s all they have.

Because, don’t fool yourself, LNX’s demonstrations are every bit as ordinary as the conservative reactions against them. This banality has become an inevitable part of the performance. The headbanger, dungeon master, or walking virgin-whore dichotomy behaves in some predictably provocative manner. Then the squares respond like puppets, moving mechanically through a standardized litany of performative outrage. The sequence is completely scripted, and has grown tiresome with repetition.

The market popularity of blasphemy results in commodified rebellion. Don’t forget, LNX is signed with Columbia Records, the label which previously gave us focus-tested teen rebellion like AC/DC, Rage Against the Machine, and Blue Öyster Cult. Like LNX, these acts promised to spit in the Establishment’s eye, while remaining ensconced within the womb-like security of one of Earth’s largest media conglomerates. Way to screw the system, guys.

Don’t misunderstand me. Lil Nas X’s accusations of traumatic treatment against the church require serious consideration. American Christianity has told outsiders “Jesus loves you,” but qualified that by endorsing racism, sexism, heterosexism, and other us-vs-them behavior. (That’s in the aggregate, certainly; individual congregations vary.) Meanwhile, American church membership is falling precipitously. If Christians want meaningful explanations why, they should start looking in their pulpits and pews.

But, failing that, let’s recognize that rebellion, like conformity, is a commodity token. Zondervan, the publisher which owns the New International Version translation of the Bible, is a subsidiary of media conglomerate HarperCollins. Anton LaVey’s The Satanic Bible is published by Avon Books, a subsidiary of… HarperCollins. The exact same company will sell you righteousness or blasphemy, whichever brings you comfort. They don’t care; they want to get paid.

Because both LNX’s blasphemy, and the resulting Christian retrenchment, follow a predictable script, neither will ultimately move the discussion meaningfully. The only beneficiaries of this conflict are the CEOs and marketeers. While individuals often see themselves as taking bold moral stands on these issues, the monetary transactions which result ultimately redound to people like Kenneth Copeland, on the Christian side, or, on LNX’s side, the shareholders of Sony Music Entertainment.

In Jennifer Government, the murders make ordinary shoes look valuable, and people pay literally thousands for ordinary joggers. But within a few months, the media landscape moves on, people forget, and the shoes get remaindered. Nike doesn’t care; they made their profit months ago. Max Barry’s point bears remembering, for LNX’s allies and enemies alike: brands aren’t your supporters. They’re here to get paid.

See also Part One

Tuesday, March 30, 2021

Okay, But Who REALLY Owns the Internet?

Who bears responsibility when famous people slap their names on other people’s work? In a tweet time-stamped late last Friday, late-night comedian Jimmy Fallon invited singer and internet personality Addison Rae to perform what he called “8 TikTok Dances.” Social media erupted in outrage almost immediately, as the two-minute performance completely erased the choreographers, most of them Black, who actually invented the dances.

I understand the outrage prompted by this performance. Though not outright bigoted, it does bespeak the sublimated racism common in social media algorithms: Black content creators often see their material get more clicks when handled by White peers. But thinking about it, I realized this isn’t unique. Many people still apparently believe Steve Jobs personally invented the iPhone, not the hundreds of anonymous engineers on his payroll.

Digital culture often treats everything as unofficially public domain. It’s difficult to police ownership, because information flows freely, with minimal oversight and few ways of preventing drift. I’ve found my poetry copied onto other people’s blogs and social media pages, not always with my name on it. And I use news photos to decorate blog posts, including this one. The casual anarchy of the internet rewards a limited amount of Wild-West behavior.

Thus digital ownership, to an extent, depends on the honor system. We trust people to care about others’ property, knowing not everyone will. (That’s why programmers invent workarounds like NFTs.) Some people behave recklessly with others’ property, even knowing that creators must control their content to control their finances: they need to own their product if they want to make a living.

In fairness, opposite the concentrated nature of ownership, we have the distributed nature of funding. Thanks to crowdfunding resources, one need only ask to receive a side income; some people make a middle-class living through crowdfunding. This depends on several factors, certainly, as White people and cismen often find it easier to make a living online. But within that stipulation, content creators have a certain amount of creative autonomy.

Fallon’s behavior, though, shows how limited that model remains. His old-media connections offer him power over others’ public exposure. Notice we’re fighting over a TV star’s irresponsibility, because TV still influences what people get to see. Influence peddlers like Fallon still constitute an information bottleneck: as reprehensible as it appears that he’s taken creators’ names off their products, I never would’ve encountered the products without gatekeepers like him.

Old media empires like NBCUniversal, Warner, and especially Disney, retain remarkable authority. They outright own the dwindling number of products that comprise our shared cultural experience, and we permit them to gatekeep what merits our time. Powerful media executives, and their onscreen hand-puppets like Fallon, still filter what gets seen, and we give them profound sway over our tastes, and the tastemaking process.

This behavior didn’t originate with Fallon. Elon Musk has been repeatedly scolded for sharing others’ artwork without credit. But, again, most tech mavens don’t create the works to which they sign their names. Elon Musk, like Steve Jobs, did some technical design decades ago, but both are (or were) business executives claiming credit, via the “Royal We,” for work mostly done by others, who mostly don’t draw residual payments.

Powerful people screen what’s worth watching, listening to, or dancing with. But the powerless and diffuse actually create the products the powerful endorse. In a world suffused with content, we trust gatekeepers to screen our limited attention time. When Fallon says something is worth watching, we trust him, because we have to. And when he backs himself with a pretty woman, he merits that much more of our attention.

Poor Fallon possibly didn’t realize he was stealing. Perhaps he, or his writers’ room, assumed falsely that these dances originated organically and spread via grassroots gossip. But several Twitter users successfully found the original choreographers, so ignorance isn’t a great excuse; he could’ve credited them, he just didn’t. Either way, it proves he requires a greater dose of responsibility than he’s currently showing.

MIT professor Eric von Hippel demonstrates that strict copyright law narrows economic development when applied to technology. End users should have freedom to adapt and improve their products. But he acknowledges this doesn’t apply to art. If artists can’t own their products, they can’t make a living, and therefore can’t dedicate premium mental energy to art. Sadly, White people have often used this limitation to short-sell Black artists.

Digital technology makes such uncredited “borrowings” more likely. Thankfully, tech makes catching them more likely, too.