Sunday, May 30, 2021

Freelance Restaurant Workers: a Modest Proposal

Promo still from the award-winning 2007 indie film Waitress

As the “labor shortage” drags on, with both sides blaming each other for America’s struggling service industry, maybe it’s time to reevaluate the market. Our service industry remains beholden to a 19th-century model of employment, where workers are obligated to management, and management in turn dispenses pay, benefits, and task assignments. But maybe Americans’ growing unwillingness to accept that model at today’s pay scale suggests that model has outlived its usefulness.

Sure, I know, millions of Americans will insist it’s the pay scale that’s shuffling on, zombie-like, beyond its productive life expectancy. Our minimum wage hasn’t changed since 2009, while rent has increased by over half. And for tipped workers—meaning mostly food-service workers—the minimum wage hasn’t shifted since 1991, during which time rents have nearly tripled. Okay, by that narrow, prescriptivist model, our wage structure is egregiously out-of-date.

But clearly, with today’s governmental structure, proposals to update the wage base are a dead letter. Three Administrations, representing both major parties, have essentially shrugged and admitted their options are few: nobody will support higher wages for America’s underpaid, they say, so let’s not even try. The Biden Administration campaigned explicitly on improving pay for working Americans, then surrendered three months into its term. Stop wishing on a star.

It’s time to admit: tipped staff don’t need management anymore. Businesses which hire tipped staff, which again means mostly restaurants, should stop hiring staff altogether, and staff should stop applying for these jobs, like Oliver Twist with his bowl, begging: “Please, sir.” If waitstaff’s absolute wage floor hasn’t increased since my high school days, they clearly don’t need wages at all. They’re already living on tips; why stop there?

Since over two-thirds of tipped workers’ pay has to come from tips just to equal the federal minimum wage, a number that’s already absurdly low, why not the rest? Most food-service work is wholly standardized: the numbered arrangement of tables, the digital systems for entering orders, the dress and behavior codes. Unless your waitstaff wears company-branded clothing, there’s little to distinguish the crew at one restaurant from nearly any other in America.

Restaurateurs should simply maintain a bulletin board with available tables. Aspiring waitstaff arrive during peak hours, claim as many or as few tables as they feel comfortable serving, and voilà! The staffing problem resolves itself, because waitstaff no longer work for the restaurant. They work for their individual customers, get paid in tips, and keep everything they make. Restaurants are off the hook altogether.

Edmonton, Alberta-based chef Serge Belair at work

Owners should embrace this change, because it means they needn’t hire, train, or schedule staff anymore. Workers simply arrive, and work until they believe they’ve earned enough. Workers should favor this because they needn’t feign any particular loyalty to restaurants that provide lousy pay and benefits. Letting waitstaff go completely freelance frees owners and workers alike from the burdens which employment (as opposed to work) brings.

Moreover, freelance status will give waitstaff more authority over the “I don’t tip” clientele. Under current conditions, some people excuse their refusal to tip by saying “it’s the restaurant’s responsibility to pay workers.” If servers go completely freelance, then customers who refuse to pay their servers have literally stolen services. Customers who don’t pay waitstaff under the freelance model would be thieves, exactly like customers who skip out on their tab now.

I can anticipate the likely counterarguments. What if not enough waitstaff want to work during lucrative meal rushes, and restaurants find themselves pleading for help? What if workers only want to freelance at posh restaurants where customers are subdued and respectful? For every objection, I have the same response: that sounds like a “you problem,” and you should work to cultivate a more polite, better-paying customer base.

Indeed, changing the service industry’s worker-employer model would, arguably, expose the roots of problems that make such work undesirable now. If some minority of customers behaves boorishly, making work unbearably nasty, maybe ask yourself what you’re doing that rewards such behavior. And if workers are so reluctant to arrive during peak hours that you can’t plan ahead effectively, we return to the same solution everyone offers: pay better.

I’ve had the pleasure of knowing many waiters, bartenders, and coffee baristas; all tell warm tales of regular customers with whom they became friends. Restaurants, coffee shops, and bars often become the beating hearts of local communities. Yet people leave the industry with disheartening frequency, usually for one reason: bad relationships with management. We can solve this problem by removing management completely.

Saturday, May 29, 2021

What's With People Driving Cars Into Crowds?

A driver plows into a Seattle BLM protest, summer 2020 (source)

This Monday, a Tennessee woman was arrested after driving her car into a COVID-19 vaccine kiosk at the local mall. Though nobody was injured, reports indicate she narrowly missed seven workers. Witnesses report she screamed “No vaccine!” while swerving around orange road cones set up to prevent actions like hers. Police charged her with seven counts of reckless endangerment, though I can’t figure how she escaped attempted murder charges.

The Tennessee event (I won’t honor the assailant by sharing her name) fits a pattern we’ve become sadly accustomed to over the last five years. At the notorious Charlottesville, Virginia, “Unite the Right” rally, an Ohio White supremacist plowed his car into a crowd of counterprotestors, injuring nineteen, and killing Heather Heyer. During the BLM protests following George Floyd’s murder, police and civilian drivers repeatedly struck crowds on camera.

American conservatives have apparently developed a sense of entitlement that permits them to deliberately, even maliciously, attack others using their cars. The specific attachment of antiprogressive sentiment to cars seems curious. I know even other conservatives notice, because several Republican-controlled state legislatures have proposed or passed laws which functionally legalize driving cars into protestors. They basically admit they’re doing it on purpose.

Why cars, though? Why this specific affinity for resisting calls for change by deliberately attacking massed populations with motor vehicles? These attackers must realize that, like the photos of Bull Connor releasing the dogs, or the weeping girl at Kent State, these images of willful, deliberate carnage will only embolden protestors. Some other force must motivate their eagerness to drive cars into crowds… but what?

When mass-market cars first appeared, the general American populace distrusted them. Consider Fitzgerald’s The Great Gatsby, in which narrator Nick Carraway uses Jordan Baker’s reckless driving to indicate her general disdain for others, and Myrtle Wilson is killed by Jay Gatsby’s car. Or Action Comics #1, which features Superman destroying a car, a symbol of the gangsters’ destructive style. Cars, once, belonged to innately bad people.

A century of advertising and PR has amended that perception. Images of SUVs winding through picturesque mountain roads, or overbuilt family sedans doing figure-8s on the Bonneville Salt Flats, have persuaded Americans that cars represent the pinnacle of independence. Locked inside your car, serenaded by your favorite tunes, with your hands on the wheel and your feet on the pedals, you’ve taken ultimate control of your destiny. Congratulations.

NYPD vehicles striking BLM protesters, apparently deliberately, summer 2020 (source)

Recent American myth-makers, including Ronald Reagan, Rush Limbaugh, and Ayn Rand, preach that America’s great destiny is complete, atomized self-control. Entrepreneurs and billionaires are lauded as champions of individuality. Activities like wilderness camping or hiking the Appalachian Trail have become symbols of accomplishment: disconnecting oneself from society and living like a truly autonomous individual. Few people actually do these things, but we celebrate that we could.

Cars have become the apotheosis of American radical individualism. We completely disconnect from others, steer ourselves to our destinations, and captain our destiny, for a few minutes anyway. Would-be leaders, like Elon Musk, openly disparage public transportation, which leaves us dependent on others’ equipment and schedules. Cars, more than houses, more than entrepreneurship, embody American ideals of individualism and self-reliance.

We know this belief is untrue if we consider it for more than five minutes. Most roads aren’t as empty as test tracks or salt flats; especially in cities, car dependency yokes us to long commutes for even routine errands. American car culture leaves us dependent on OPEC oil and imported car parts. If we blow a gasket or lose a spark plug, we’re stranded, especially in most bedroom suburbs, where relief can take hours to arrive.

As a myth, though, car-based independence remains remarkably persistent, evidence be damned. We feel independent while driving. For people who perceive independence and individualism as paramount values, preserving that independence comes first. And when the world tells them that they have to change, that they need to not breathe pathogens on strangers, or that they need to pay a little money to offset the damage from centuries of racism, basically, that threatens their cars.

If Freud lived today, and analyzed the influences driving American society, he might forget the phallus altogether. In certain circles, sex is subordinate to the ethos of independence. (Which might be why the ostentatiously independent keep having inappropriate sex, Matt Gaetz.) And our cars have become the public embodiment of that independence. So when a Tennessee woman is told she has to value others and their health, she lashes out with the natural weapon: her car.

Tuesday, May 25, 2021

Romeo and Juliet in the Kingdom of Politics

Claire Danes and Leonardo DiCaprio in Baz Luhrmann's Romeo + Juliet

I never cottoned to Shakespeare’s Romeo and Juliet when I read it in college. Unlike Macbeth or King Lear, something about R&J sat badly with me. These protagonists came across as self-indulgent, petulant, and deaf to reason. Somehow I never twigged that maybe that was Shakespeare’s intent: that Romeo and Juliet are bad people, molded by bad influences. Never, that is, until last week, when the Anthony Bouchard story broke.

If you missed this story, count your blessings. Bouchard, a Wyoming state senator and leading primary challenger for damaged incumbent Republican Representative Liz Cheney, admitted impregnating his 14-year-old girlfriend when he was 18. The relationship progressed through a teenage marriage and divorce, through personal and relationship strife, to suicide and estrangement. This folderol makes Shakespeare’s “pair of star-crossed lovers” look sober and abstemious.

Importantly for our purposes, Bouchard compares his tempestuous adolescent cyclone to Romeo and Juliet. He clearly means this flatteringly, as if his doomed first marriage were a beautiful love story of teenage innocence—like I once assumed Shakespeare intended. Yet considering the death, estrangement, and other violence left behind, I’ve reevaluated Shakespeare’s play, and realize Bouchard may be more right than he knows.

This idea that Romeo and Juliet is some beautiful story of innocence probably comes from the accumulated myth. Franco Zeffirelli’s 1968 film presents its protagonists as beautiful, youthful lovers whose relationship is impeded by the grown-ups around them, the embodiment of Vietnam-era youth culture. Baz Luhrmann’s 1996 “white gangsta” interpretation contemporizes the same basic themes. These reflect the prettified baggage which Shakespeare’s story has accumulated through the centuries.

Yet one suspects, comparing Shakespeare’s play to Bouchard’s story, that perhaps The Bard anticipated a very different interpretation. Like Romeo, Bouchard used an esoteric legal loophole to wed his absurdly young bride. Things escalated, though. Like Juliet, Bouchard’s first wife destroyed herself. Which leads me to suspect that Juliet, and probably Bouchard’s first wife, had existing unresolved issues she couldn’t handle correctly. Their “romances” were extensions of this.

Wyoming State Senator
Anthony Bouchard

Bouchard’s wife committed suicide, and therefore is beyond questioning. But we can construe, from the fact that she married Bouchard, that she consented to the age-inappropriate relationship, inasmuch as teenagers can consent. Therefore she presumably sought an older boyfriend, looking for whatever psychological comfort he could provide. We can postulate the details; they don’t matter. What matters is the theme.

(Let’s set aside statutory considerations, which do apply to Matt Gaetz. Wyoming’s age-of-consent laws include a four-year “close in age” exception. Bouchard’s behavior is skeevy, but not actually unlawful.)

Shakespeare’s play acknowledges that children have no control of their emotions. They fall in love easily, influenced by the courtly romances then popular throughout Europe. Without adult guidance, which Shakespeare shows early that both lovers lack, teenage emotions quickly go sideways, turning them into instruments of destruction. Kids feel deeply, and mistake their feelings for normal sexuality, which they assuage through reckless, sensual behavior.

We adults forget that teenagers aren’t miniature grown-ups. They have nearly adult bodies and desires, but haven’t learned to manage those desires, a reality compounded because our educational system extends the period of juvenile dependency until the almost-traumatic onset of adulthood. Our culture is bipolar, keeping kids helpless and childlike through young adulthood, then dropping the full burden of maturity upon them on their eighteenth birthdays.

Like Romeo and Juliet, Bouchard and his inappropriately young girlfriend had reckless sex, then had to face the consequences. Their marriage wasn’t happy, because it didn’t really sate the unresolved conflicts inside themselves, as Romeo and Juliet’s wouldn’t have. And like our star-crossed lovers, whose real desire was to die, Bouchard’s wife eventually fulfilled that goal. The living remained to gather the copious debris.

Shakespeare understood something which succeeding generations have forgotten: teenagers don’t have sex for the same reasons adults do. They don’t want to express love, feel intimacy, or even just get off. Plagued with the fallout of protracted adolescence, teenagers are possessed of inner fires which they fear will consume them. Sex offers a false promise of temporary release. But children, by definition, haven’t learned to handle these pressures appropriately.

This story has forced me to reevaluate my college interpretation of Romeo and Juliet. My reaction to them as immature and petulant maybe reflects a deeper truth: that children think themselves more mature than they are, or than society lets them be. Their self-indulgent behavior isn’t incidental. And if their adolescent intemperance isn’t stopped, they grow up thinking themselves entitled to power, because even as adults, consequences don’t matter.

Wednesday, May 19, 2021

Should We Go Back to Church?

Émile Durkheim

Émile Durkheim, the founder of modern sociology, wrote something telling in his last great monograph, The Elementary Forms of Religious Life. Religion, Durkheim observes, basing his argument on pre-literate folk religions, isn’t originally about worshipping God. Indeed, God (or God’s approximate equivalent) is frequently a latecomer to religion. Most religions begin as shared public expressions of community values. The original “God” is the people.

Please don’t misunderstand: Durkheim, who abandoned rabbinical school because he couldn’t pretend to faith he didn’t have, perhaps overgeneralized. Like Freud and Marx before him, he incorrectly assumed all religions essentially resembled his father’s forsaken beliefs. He also wrote amid the fading glow of European empire, and described folk religions as “primitive” and “rudimentary,” with the patronizing air of Motherland imperialism. It’s necessary to adopt Durkheim’s philosophy with discretion.

However, as American society, shuttered for sixteen months, begins opening back up, one decision I and millions of other Americans must make is: should I go back to church? I don’t ask whether we should continue believing in God. Many, like me, watching the arrogant and self-righteous defying mask orders and spreading the disease, must’ve felt comfort in the expectation of eventual justice for those who showed such prideful disdain.

Rather, in questioning whether to return to church, I question what exactly we worship. If Durkheim is right, if “God” is a manifestation of community values, which we ratify with our spoken liturgy, then it bears questioning exactly what values we deepen by speaking them aloud. Durkheim places especial importance on liturgy, which, he notes, isn’t mere words. Religions with clear liturgy enjoy greater loyalty and lesser apostasy over time.

Importantly, the White “Evangelical” churches most likely to have endorsed Former Guy’s administration have also been the Christian branches most likely to reject formal liturgy. The suburban megachurches, televangelist extravaganzas, and satellite church programs most likely to equate Christianity with Republicanism seldom have any consistent liturgy. Ethics are vague, shapeless, and defined primarily by the worship band, not any formal clergy.

I don’t have enough background in sociological research to draw scientific conclusions. However, speaking strictly anecdotally, I’ve observed that certain congregational characteristics appear to correlate with roughly how conservative the worshippers become. Very large congregations, for instance, with little interaction with the pastor, tend to separate into in-groups. Congregations without liturgy tend to find their theology where they can, or invent it on an ad hoc basis.

I attribute this to our human need for guidance. When faced with situations where clearly right answers don’t avail themselves, people look to respected figures for counsel. But when pastors don’t know their congregants, they can only offer bromides. When the only theology believers internalize has to fit a singalong rhythm, it doesn’t encourage deeper thinking. Theological reasoning, without seasoned guidance, becomes superficial, judgmental, and hasty.

People then embrace whatever direction some authority provides. The Former Administration used the language of pop theology effectively: it cited God, prayer, and orthodoxy often, without actually examining them. Essentially it provided the guidance Christians needed, but which they didn’t receive from churches grown too large to know them. That, I believe anecdotally, is why megachurches apparently nurture conservatism: they mimic the form of church, but not the substance.

This matters because, since the 1990s, many mainline congregations, especially those with primarily White membership, have adopted the ethos of Evangelicalism. I’ve watched congregations embrace “contemporary” worship, which mostly means eliminating everything except pop-inspired singalongs and, maybe, a sermon. Inspired by the numbers attracted by lite-beer megachurches like Willow Creek and Saddleback, traditional churches mimic their approach.

Meanwhile, as predominantly White congregations embrace a large, impersonal model that encourages reactionary sectarianism, I started attending a mostly Black congregation in 2017. (When I can: it’s far from my house.) The congregation is small enough that the pastor can speak with everyone after the service. And its liturgy includes reading the full Ten Commandments, in unison, every service. We literally speak our values aloud together.

I fear romanticizing the experience, especially as a White man in a Black congregation. Yet the experience is wholly removed from attending a White congregation that elevates bigness and simplicity. This Black congregation exists in a mainly White town, in a country that still assumes “White” equals “normal.” Coming together and speaking their values gives the congregants strength to face a frequently hostile world, and equally important, a shared identity.

Durkheim’s theories still overgeneralize, and are often patronizing. But watching this congregation, I see what he means, and what’s frequently missing.

Thursday, May 13, 2021

Where Should Actors Draw the Line?

Let’s start with a position everyone should agree on now: it’s wrong for White actors to play Black, Hispanic, or otherwise non-White characters. Full-time professional offense-takers like Megyn Kelly might feign nostalgia for a prior time when Blackface was okay, but given what we’ve learned recently, surely anybody should understand that’s bad, unless they’re paid to not understand. Troweling on makeup to become a racial caricature is wrong and harmful.

However, as a trained actor with some stage experience, I retain important questions. As actors, we’re trained to become somebody we’re not: a different age, social class, nationality. Good people with big hearts practice to become violent, hateful, small people onstage. Likewise, people with embittered, destructive souls sometimes pass for amiable, fatherly figures for years, as Bill Cosby and Kevin Spacey taught us. Actors exist to step outside ourselves.

Where do we draw the line? How far can actors stray from themselves, from their assigned social roles and identity, before we become something offensive? This question strikes me after reading actor Fisher Stevens’ public apology for playing Ben Jabituya in the 1986 comedy movie Short Circuit. Stevens, who is White and Jewish, applied heavy makeup and adopted a pronounced accent to play Jabituya, a computer scientist from India.

The situation makes me uncomfortable because, though I’m nowhere near Stevens’ standing, I’ve also adopted fake accents to play roles. I’ve never done Blackface, a small mercy, since like Megyn Kelly, I grew up in the waning days when people still did that occasionally. I have, however, gotten cast across some questionable social lines. To further this debate, I must ask: do we consider Jewish people White?

I once got cast as an aged Jewish neighbor, whose looping grammar convinced me the character’s first language was Yiddish. So I took the initiative to teach myself a New York Yiddish accent, spent some time reading the Talmud and Sefer Yetzirah, and learned enough rudimentary Yiddish to incorporate it into my dialog. I never would’ve passed in a shul in Williamsburg, Brooklyn, but for a community theatre audience in Nebraska, I was Jewish.

Except, of course, I wasn’t. I, a White man of no fixed ethnic heritage, adopted the identity of an ethnic group that’s been historically ostracized, downtrodden, and subject to multiple attempts at genocide. I offered the character all the respect I could provide, learning his heritage, his religion, his language, and inhabiting the identity to the utmost extent possible. But I’m still a White guy trying to pass as Yiddish.

Memoirist Emily Raboteau, who is mixed-race, writes about a school experience. During a discussion of race, a Jewish friend turned to her and declared: “I’m not White.” Decades later, her friend made aliyah to Israel. Visiting her friend, Raboteau found her living proudly in a house expropriated from a Palestinian family. Her Jewish friend wasn’t White in America, Raboteau realized, but in Israel, she certainly was.

Therefore, I suggest that whether I transgressed racial boundaries by playing Jewish is an unresolved question. Whether Jewish people are White is conditional. And because Jewish people are historically disadvantaged in America, where this play was set, one could make a persuasive case that I performed the equivalent of Blackface. Does it matter that, like Fisher Stevens, I played the role with utmost respect, and strove to avoid playing stereotypes?

I can’t answer that, because I’ve also played White people who were very different from myself. I played a poor White Southerner in To Kill a Mockingbird, a play I found riddled with problems. The entire play seemed designed to excuse its central White characters from responsibility for the violence enacted against Black Southerners. The character I played, however small, was an excessively broad stereotype of Southern Whiteness.

My role in Mockingbird was, I’d contend, more harmful to the categories of people it depicted, than my Yiddish character. My Yiddish character was respectful, honest, and most important, an individual. My White Southerner was a crass stereotype, in an ensemble of crass stereotypes, and equally important, was inaccurate. In portraying that character, I held an entire category of people up for ridicule and derision, irrespective of facts.

Don’t misunderstand me: this isn’t advocating Blackface or its equivalents. But beyond the clear boundary of “don’t change your skin color for a part,” it becomes much murkier. Portraying a character of another ethnicity, heritage, social class, or experience, could become explosive. And unfortunately, as with any experience reaching across social boundaries, the harm is often only visible in retrospect.

Tuesday, May 11, 2021

The Capitalist War on Free Markets

Nebraska Governor Pete Ricketts

It’s no secret that I dislike Nebraska governor Pete Ricketts. I voted against him twice, and cannot fathom how anybody thought him prepared to administer our state. So last week, when he announced his cheap “beef passport” gimmick, a glamorized rewards card to encourage Nebraskans to eat more beef, I originally disregarded his stunt. Governor Ricketts is notorious for low-risk stunts that draw attention but accomplish little.

Then Wyoming said “hold my beer.”

A bill percolating through Wyoming’s legislature would authorize the state to sue other states that don’t purchase Wyoming coal. Traditionally a ranching state with little manufacturing, Wyoming’s economic base transitioned into coal-mining and related industries through the 1960s and 1970s. I have cousins working the Wyoming coal pits, so my sympathies certainly lie with coal miners. But this is a mind-bogglingly stupid answer to the situation.

While Pete Ricketts uses the “carrot” approach to manipulating markets, Wyoming governor Mark Gordon uses the “stick.” Ricketts creates meaningless rewards to trick people into believing they have something invested in continuing their behavior. Gordon just threatens those who don’t comply, knowing that even if his threats don’t work, resisting him will be costly. Neither seems willing to let market forces apply, and I find that telling.

These governors, both Republicans, represent the party that nominally believes Capitalism, which they define as market forces, is inherently good. Remember, Republican leadership spent 2020 insisting that providing working Americans a little salutary support during the worst pandemic in living memory was tantamount to creeping Bolshevism. Yet when faced with market forces that don’t reward the status quo, from which their states have profited, they turn virtually Trotskyite.

Retail stores create loyalty programs to tie customers, particularly young customers without time-tested buying habits, to one store. Purchase five hamburgers, get the sixth free! It creates the illusion of commitment. But be serious: people already either eat red meat or they don’t. Despite Colorado Governor Jared Polis’ recent attempt to bring back meatless Mondays, no carnivores are likely to forgo beef unless they’re officially compelled to.

Wyoming Governor Mark Gordon

Coal, meanwhile, is disappearing from markets because it’s unnecessary. Burning hydrocarbons is environmentally reckless, especially coal, tainted with sulfur. Equally important, cleaner energy generation technologies, like wind, solar, and nuclear, have advanced sufficiently that they’re cheaper than coal. Maintaining coal-burning technology simply to invent a market for a resource that’s costly, dirty, and outdated, is far more anti-capitalist than stimulus checks during the pandemic.

Yet here we are. Can you imagine elected officials manipulating markets to keep other technologies afloat? Whale oil, muzzle-loaders, Betamax. Historically, technologies have become outdated, products have fallen out of demand, and people who made those products needed to realign themselves to the changing market. But these Republican governors, one with known Presidential aspirations, have chosen another tack: seizing control of markets to stop unwanted change.

As an ex-Republican, I find that amazeballs. The party I formerly supported believed markets drove morality: that if something was worth money, people would pay for it, and if nobody will pay for it, that proves it’s worthless. That’s why Republicans resist public recycling programs, research into alternative energy, and other attempts to fix broken markets. A dollar freely spent, to paleoconservatives, is the ultimate yardstick for public morality.

Until, apparently, it isn’t. Ricketts and Gordon both bring their offices’ might to bear in ensuring markets obey their whims. Ricketts, born rich, and Gordon, a successful rancher, have profited mightily from markets that favored their products and services. Yet apparently, both believe the market forces which propelled their families to prominence should continue unchanged forever. When people want less meat or less coal, they conclude the public demand is wrong, not the market.

I say this, knowing literally unrestrained markets have never really propelled American economics. Our government created growth markets by chasing Native Americans off their land, then repackaging the spoils as “homesteads.” We encouraged certain development schemes by putting dams, with their cheap electricity, along the Colorado River. America enjoys (if that’s the word) Earth’s cheapest gasoline because our government subsidizes petroleum.

Ricketts and Gordon’s maneuvers aren’t deviations from paleoconservative economics; they’ve just admitted aloud what we news junkies have always known. Our government picks winners, then manipulates markets to ensure the early adopters, mostly campaign contributors, are well rewarded. It always has. These aspiring leaders of Republican ideology have simply admitted such out loud for the first time.

It’s up to us, the voters, to ensure they wear the shameful consequences of their actions for all the world to see.

Monday, May 10, 2021

Racism, the Individual, and Society

Robin DiAngelo, White Fragility: Why It's So Hard For White People To Talk About Racism

When confronting racism in America today, it isn’t enough to simply ban expressions of personal bigotry. That’s a statement most race scholars can probably accept, but which remains controversial among the general public. Even in American law today, “racism” means personal bigotry; SCOTUS took systemic racism out of consideration with McCleskey v. Kemp, in 1987. American sociologist Robin DiAngelo wants to reverse this trend. But I fear she goes too far in the other direction.

DiAngelo, who works mainly as a diversity consultant for high-dollar corporations, starts with a simple question: why do well-paid White executives, in mostly-White offices, turn defensive, even wrathful, when consultants note their companies’ racial breakdowns? Why do White people refuse to believe that racism includes the inheritance of centuries of inequality, manifested by the fact that Black Americans disproportionately don’t own their homes? Why do Whites refuse to even consider the possibility that racism perseveres?

I synopsize DiAngelo’s extensive conclusion: because naming and discussing systems violates America’s master narrative of individualism. We White Americans want to believe ourselves unbound by limitations of race, class, sex, or the kitchen sink. To suggest that systems exist, from which Whites have profited at others’ expense, offends our core sensibilities. That’s why we respond with defenses of “I’m not racist” or “I’m not responsible for my ancestors’ crimes.” We’re simply making everything individualist again.

DiAngelo finds this unsatisfactory. When White Flight and its doppelgänger, Gentrification, continue shaping urban landscapes, this isn’t individual. When access to good schools and good jobs requires demonstrated proficiency on standardized tests, which mostly evaluate what well-off Whites consider worth knowing, bootstrap ingenuity isn’t good enough. When 55% of White Americans believe anti-White racism is widespread, but few report experiencing it, this reflects a culture-wide phenomenon. And we can only address it together, collectively, systemically.

So far, so good. I agree with DiAngelo’s take on systems, perhaps because I’ve read her position before, in authors like Michael Eric Dyson and Ibram Kendi. (Which reflects another problem: the White people likely to read this book are Whites like me, who already essentially support her position. But I digress…) The legacy of redlining, shoddy education, and mass incarceration continues shadowing Black and Hispanic Americans’ opportunities, even though outright discrimination is now unlawful.

Robin DiAngelo

My problem is, DiAngelo insists racism is only systems. From the beginning, and periodically throughout the book, she repeats the message that individual bigotry isn’t really racism, that only systems which structurally inhibit BIPOC opportunity should count as racism. Even as she recounts narratives from her executive consultancy of Whites refusing to consider their ingrained bias, even while they talk over Black colleagues or otherwise demonstrate personal prejudice, DiAngelo insists racism is nothing but systems.

Perhaps this reflects DiAngelo’s background in academia, and her work with business executives. She mostly interacts with people who’d rather die than openly speak the N-word, or otherwise express individual bigotry. Such people cannot comprehend that systems manifest in their behavior, because they believe their education and economic standing have purged intolerance from their hearts. Because America’s well-off often believe they achieved greatness through their individual efforts, they’re perhaps ill-prepared to believe systems even exist.

But, having divided the last decade between assembly-line work and construction, I’m willing to attest that bigotry still exists. That young BIPOC strivers, desperate to escape their inherited straits, really do hear the N-word spoken aloud among their White peers and co-workers. That even if racism originates in systems, it becomes ingrained among individuals, who pass their bigotry onto their children. And that poor Whites, desperate for any advantage, sometimes still actively push African-Americans down.

Please don’t misunderstand me. I agree with DiAngelo’s beliefs about systemic injustice. But, in stating that individual behavior is socially conditioned, DiAngelo functionally denies individual agency. I can’t do that. Experience teaches me that humans are products of their conditions, and individual beings, at the same time. Without individual agency, nobody can effectively overcome obstacles, or challenge unjust systems. This doesn’t mean systemic racism doesn’t exist, only that individual racism exists too, and needs confronting.

DiAngelo never says anything out-and-out wrong. Her words simply reflect her academic and professional background, as my words reflect my background. For anti-racists interested in how the individual and the systemic interact, I recommend Ibram Kendi, whose positions overlap with DiAngelo’s, but go into much greater depth. DiAngelo’s book overlooks important nuances. I suspect she wrote it as a companion to her consultancy business, not realizing it would take on a life of its own.

Saturday, May 8, 2021

The First Amendment and its Baggage

Charlie Kirk

Charlie Kirk, who built his career decrying the evils of liberal universities, despite dropping out of online Bible college, saw his shadow this week. When Facebook’s fake third-party arbiters recommended extending the former President’s ban at least six more months, Kirk went on Twitter demanding: “The US Supreme Court should overturn the Facebook’s ‘Oversight Board’s’ ‘ruling’ which upholds the outlawing of the 45th President of the United States from social media.”

Ignore, temporarily, that SCOTUS has no authority here, or that Kirk represents the party which claims that whatever large corporations want must perforce be good, unless it’s not. More important, for my money: Kirk’s statement reflects an unspoken assumption about politics. As rhetorician Craig Rood writes about the Second Amendment, the constitutional prerogative has accumulated baggage that goes beyond the written text. The same applies to the First Amendment.

The literal First Amendment runs only 45 words long, enumerating five freedoms, and we’ve argued about those freedoms for 232 years now. What does it mean to neither respect nor abridge religion? The summer of 2020 saw unprecedented protests which challenged American ideas of peaceable assembly or right to petition. The First Amendment, like the Second, appears brief and settled on paper, but it remains altogether contentious.

What, therefore, does “freedom of speech” mean, constitutionally? By claiming SCOTUS should intervene in Facebook’s internal workings, Charlie Kirk apparently believes it means anybody has a God-given right to say anything they want, on anybody’s platform. Kirk certainly doesn’t believe this; as a member of Former Guy’s “1776 Commission,” a willful effort to exclude certain ideas from America’s public schools, he literally levied government authority to silence opinions he disliked.

We’ve seen this before. PragerU, the conservative vlogger channel, repeatedly claims Twitter, YouTube, and other media have “censored” them, claims they make on Twitter and YouTube. They demand privately owned platforms give them unlimited oxygen, because apparently reaching hundreds of thousands of followers isn’t enough. This, to them, constitutes an imposition upon free speech. And they use the platforms they’ve supposedly been banned from to publicize this claim.

Dennis Prager

Kirk and Prager share an unstated premise, beyond their First Amendment interpretation. They don’t define “rights” as something inalienably endowed by our Creator, as Thomas Jefferson famously wrote. To them, rights are something demanded. This isn’t new; the concept of “rights” comes from the British monarch’s official motto: “Dieu et mon droit,” God and my right. “Right” here means one’s right hand, that is, one’s sword hand.

Conservative figureheads yearn for a time when monarchs and aristocrats achieved their rights through violence. That’s why hard-right state legislatures have passed, or are considering, laws that decriminalize hitting protesters with cars. And don’t pretend leftists don’t share this belief. They block traffic, the very behavior that prompts these vehicular attacks, because it works. As Dr. King said, “Freedom is never voluntarily given by the oppressor; it must be demanded by the oppressed.”

Left-wing interpretations of the First Amendment traditionally assume rights simply exist, and when suppressed by authoritarian lawmakers, remain dormant, like tulips in winter. Right-wing interpretations assume rights must be defended, even without serious material challenges; that’s why they scream bloody murder when leftists suggest, timidly or aggressively, that police shouldn’t kill citizens. They’re working from a script in which rights are demanded violently, and assume we are too.

Unfortunately, that’s what happened on January 6th. In a complete surrender to Nietzschean egoism, an angry minority decided they had prerogative to enforce their will upon others, despite losing the election. To the mindset underlying Former Guy’s political movement, a movement which has reset the Republican Party baseline, the majority’s will only matters to the extent that the majority forcefully asserts it. Everything else is immaterial.

When Charlie Kirk insists Facebook has an obligation to platform Former Guy, or Dennis Prager uses Twitter and YouTube to claim he’s censored on Twitter and YouTube, outsiders sense a contradiction. Aren’t they, we ask, the party that believes private property is absolute? Aren’t they the party that believes government intrusion is socialism, and whimpers publicly about judicial overreach? How dare they involve SCOTUS to change corporate policy?

They see no such inconsistency. Private property, government authority, free speech, civil rights: these words have meaning, to modern conservatism, to the extent that anyone asserts them forcefully. Kirk and Prager don't think we should nationalize Facebook, Twitter, and Google. And they don’t think cherry-picking school curricula from the Oval Office is restrictive. To them, real freedom only comes after a bloody fight.