Monday, April 29, 2019

The Day the Revolution Stopped

The Robert E. Lee statue in Charlottesville, Virginia, which
sparked the disastrous "Unite the Right" rally

The Republican Party establishment has a conundrum: it has tried to position itself as the party of Lincoln. Aggressive PR pushes from conservative leaders like Dinesh D’Souza have peddled the idea that Republicans are progressive on race, contra the Democratic Party, by comparing the 1864 Republican platform with the Democrats’ explicitly pro-slavery platforms of the same era. We deserve Black Americans’ loyalty, they proclaim, because, um, Abraham Lincoln!

Which, for all its flaws, could have been a persuasive argument until pretty recently. Not persuasive to most African Americans, certainly, who judge by what each party has actually accomplished; but persuasive to traditionalist Whites who want reassurance they aren’t the moral heirs of Bull Connor. Pretending that party alignments haven’t changed in 150 years allows Republicans, who generally believe history is constant and inelastic, to claim Abraham Lincoln and Ulysses S. Grant still represent their party.

Then Donald Trump opened his pie-hole and praised Robert E. Lee.

CUNY political theorist Corey Robin writes, in his book The Reactionary Mind, that conservatism is an essentially counter-revolutionary theory. That is, conservatives see progressives and leftists attempting to change what they consider an unjust system, and perceive that change as revolutionary. Revolutions are violent, of course, and seeing violence done upon their world, they “stand athwart history,” in William F. Buckley's words, “yelling Stop!”

However, conservatives cannot stop the revolution by simply defending what exists. If the current system weren’t terminally damaged, there would be no revolution; it’s impossible to defend a broken system on the broken system’s terms. So conservatives constantly cast about for explanations of why society should change as little as possible, while softening the conditions that made revolution necessary. Change is necessary, to eliminate the need for change.

Thus American conservatives have, at various times, allied themselves with abolitionism, labor unionism, civil rights, feminism, and other historical movements to correct injustice. More accurately, they ally themselves with the memory of these movements. That’s why, for instance, conservatives claim to be the true heirs of Dr. King, usually by grabbing his “content of their character” quote out of context. The present system thus becomes the fulfillment of past struggles.

Strange to say, I sympathize with this conservative impulse. We cannot fight noble fights forever, lest the fight itself become more important than its goal; Mainland China’s efforts to alleviate the damage caused by Chairman Mao’s excesses prove the inherent risk of what Mao called “permanent revolution.” Maybe we haven’t won yet, but it remains necessary to believe we could, someday, win and stop fighting.


So conservatives try to elide the party realignments of the past 150 years, and claim they’re still “the Party of Lincoln.” Which, actually, is pretty admirable. (I’ll address Lincoln’s moral contradictions another time.) If Republicans still represented the values of Radical Reconstruction, I suspect we’d all get behind them. Well, maybe not all, but enough to make gradualism possible.

However.

While Republicans nominally embrace, if not current change, at least the changes of nostalgia, their president undermines them. Donald Trump lauds the commanding general of a fake nation founded on slavery. He reportedly calls nations with Black and Brown majorities “shithole countries.” He calls participants in an all-white rally that literally marched under swastika flags “very fine people.” President Trump, it seems, can’t stay on his party’s message.

I’d suggest Trump probably doesn’t consider himself racist. Given his historic difficulty staying on-script, if he harbored explicit bigotry, surely he’d have dropped an N-bomb into a hot mike by now. However, unlike his party’s strategists, he defends the status quo in the language of the status quo. If that means protecting outright racists from well-earned social consequences, that doesn’t faze him. He isn’t a counter-revolutionary, he’s just standing still.

American conservatism has spent a half-century trying to shed the stink of racism in a country where the majority now finds out-and-out bigotry odious. Trump apparently hasn’t gotten that message. He still repeats Lost Cause propaganda that’s been extensively debunked; protects a social hierarchy founded on race, sex, and inherited wealth; and does so, unlike his party, in the unambiguous language of refusal to change, despite demonstrated injustice.

If Corey Robin is correct, and conservatism is indeed counter-revolutionary, American conservatives will have to reconcile their counter-revolution with Donald Trump, who’s just a heel-dragger. For the coming generation, whenever any Republicans claim to be the real heirs of Abraham Lincoln or Dr. King, they’ll have to answer for Trump’s words. Until they cut Donald Trump loose, conservatives have surrendered any revolutionary aspirations; they’re the party of stasis.

Sunday, April 28, 2019

The First Church of Pop Culture Celebration


Two weeks ago, my FaceTube and InstaTwit feeds became clogged with everyone’s speculations about the Game of Thrones season premiere. Audiences muttered, whispered, and gossiped with the same vigor previously used when discussing the Royal Baby. As it aired, friends deluged my timeline with live tweets; after it ended, others demanded a virginal, spoiler-free environment for literally days afterward, until they could watch their DVRs or on-demand thingamabobs.

People floated theories much like they do surrounding the Super Bowl, or a presidential election. Things were electric. And then… nothing. At least until nearly two weeks later, when everything I just said could be repeated perfectly, except with Avengers: Endgame.

Mass-media premieres assume the cross-cultural weight once reserved for papal visits and political events. This isn’t coincidental. As a society, we don’t share the same religion, language, or allegiance to Major League Baseball that we once had (at least in mainstream nostalgia). Our politics has become shattered. We don’t even have a unifying interpretation of national history anymore.

This leaves us only one communal experience: we consume the same media. Without the ability to click on the same premium cable TV channel, or queue up for the same cinema, we lack any unifying vision as a people. Consuming mass media has supplanted church, sports, and politics as the one thing keeping us from living like complete strangers who coincidentally occupy the same space.

In itself, I have no problem with this. Societies have always required something unitary to avoid spinning apart under the centrifugal force of selfish desires. My problem arises because these shared experiences are created by for-profit enterprises. HBO, owned by WarnerMedia, and Marvel Studios, owned by Disney, aren’t philanthropic charities building a better community. They exist to squeeze another buck from customers.

Therefore we’re unified, bound together, made a single people, by the effort to keep us passively consuming. Once-shared experiences like Christmas Eve mass, the Lincoln-Douglas debates, and World War II bond drives brought populations together around the thesis of shared sacrifice. Movies and TV gather citizens in subordination to the most extreme manifestations of capitalist excess.

And what extreme manifestations they are. The advance in communications technology, which has made it possible for shoestring entrepreneurs to start film studios and basic cable channels, has paradoxically concentrated power in the hands of corporate conglomerates. Anybody can create content; only massive industrial monoliths can create PR sufficient to get heard over the background noise.

(And I do mean massive: WarnerMedia is owned by AT&T, while Disney’s recently completed Fox acquisition creates a company so large, it will own around half of America’s media landscape.)

In this condition, we ordinary citizens gather under the aegis provided by empires so vast, they make Mansa Musa or Charles V look anemic and unambitious. Recent pushback against rule changes, like the FCC abandoning Net Neutrality, suggests Americans are perhaps more aware of unregulated capitalism’s hidden risks than prior generations. Yet without GoT or the MCU, wholly owned capitalist subsidiaries, we have no shared cultural experience.


Don’t mistake me. I realize many prior social unifiers have been for-profit enterprises. MLB and the NFL are mega-corporations (though they retain outdated legal exemptions from antitrust enforcement). Religion arguably once bound populations together, but any reader of religious history knows self-appointed servants of God have often gotten distracted in service of Mammon.

Nevertheless, these prior enterprises had a commitment to the greater good. Football and baseball players have frequently done goodwill tours in advance of the World Series and Super Bowl. Even at its most corrupt, the church needed to occasionally endow charities and public works, if only to retain tithe-paying parishioners’ willingness to give.

Modern media titans have no such motivation. Though TV channels and movie studios nominally compete, consolidated ownership in fewer and fewer hands makes them functionally as competitive as drug cartels fighting turf wars: that is, more willing to compromise than compete. This has been especially true since the 1990s, as deregulation has produced more “vertical integration” in the industry. That is, more monopoly ownership.

This becomes especially pointed as we witness these most recent manifestations. As with M*A*S*H, Seinfeld, and Harry Potter before them, we’re watching mass-culture movements concentrate around a franchise’s ending.

I don’t begrudge anybody enjoying Marvel movies or Game of Thrones. With religion, civil politics, and participatory sports in retreat, we need something holding us together. But it’s important to remember that these enterprises aren’t benevolent charities bent on community. They exist to part us from our money.

Friday, April 26, 2019

The Inevitable Patterns of American History, Part 3

A portion of America's southern border fence (New York Times photo)

Reading Greg Grandin’s The End of the Myth and Richard Gergel’s Unexampled Courage simultaneously left me with a sense of bleak fatalism… at first. Professor Grandin talks about how America’s frontier and overseas wars have consistently resulted in racial violence at home. Judge Gergel describes one such act of violence, when a decorated soldier, going home, found himself beaten blind for no greater offense than being born Black.

The continuity of patterns probably wouldn’t be lost on either author. What Grandin describes in the broad sweep of history, salted with specific examples, Gergel approaches from the specific, broadening out into larger structures. Reading these books, I can’t help the chill of recognition that America has a longstanding scheme to channel our aggressive tendencies outward, then act surprised when the aggressive people come home more prepared for violence.

However, as tempting as it becomes to see American history as an irresistible trend toward racism, violence, and war, these books offer readers an opportunity to see history’s living dynamic. We aren’t beholden to the past, because we can change; we have changed. And it has happened because individuals, motivated by the belief in their own rightness and America’s stated principles, have demanded Americans do what we know is right.

Professor Grandin writes that military intervention has historically channeled America’s racial animus outward. Much racial language that still permeates our national vocabulary originated in war; I grew up hearing my father repeating Vietnam-era racial descriptions (a fact that, to his credit, now embarrasses him) which I considered simply “normal.” Because inevitably, veterans come home, bearing the propaganda they’ve learned with them.

Nor can national officials claim they don’t know this. Judge Gergel writes that Sgt. Isaac Woodard’s beating, which motivated President Truman to desegregate the military and federal government, came amidst a rash of postwar racial violence. Sgt. Woodard stood out only because he survived his attack. Truman felt pressed to do something because he remembered the outbreaks of racial violence following World War I, in which he served.

But Truman also felt pressed to do something because activists pressed him. Truman sat down with activist leaders, including Walter F. White and Thurgood Marshall, intending to repeat his advisors’ official line that the federal government couldn’t do anything precipitous. We need to act gradually, to introduce legislation and deliberate upon it with modest speed, Truman’s official script went. Until activists confronted Truman with facts, and he rejected his script.

America's frontier myth, as depicted by Currier and Ives

I have difficulty reading this history without seeing everything America’s faced since 2001. Faced with an aggressive desire to do something, though we’re not too sure what, following a national tragedy, America did what it’s always done. It sent troops overseas. Maybe America needed to topple the Taliban; maybe Saddam Hussein overstayed his welcome on the world stage. But the issues in these cases were inarguably domestic American issues.

Except, Grandin writes, this overseas intervention went pear-shaped in ways no prior American military entanglement had. Our invasions of the Philippines in 1898 or Vietnam in 1965 dragged on and became massively unpopular at home, sure. But we’d never seen things go as spectacularly wrong as they did in our post-9/11 interventions, with disasters like Fallujah and Abu Ghraib. These became truly historic cock-ups.

Thus we had veterans, steeped in racist propaganda (and don’t pretend Abu Ghraib was not racist, or not sanctioned), dropped into a postwar America ill-prepared to handle their experiences. Many formed civilian “border patrol” vigilante groups, whose documented activities uncannily resemble Klan lynchings. America’s history of being afraid to harvest the seeds it’s sown continues. If only bold leaders dared to step in and say, “This isn’t my America.”

Sadly, we had three successive presidents, representing both major political parties, who were unwilling to pull a Truman and place justice over expediency. Our current President has actively fanned these flames. Circumstances probably would have changed little had the 2016 election gone the other direction; it’s been less than six months since Hillary Clinton suggested nations should appease their racist elements. Violence begets violence, irrespective of political party.

So yes, America has a history of racism, one that we’ve seen writ large since 2001. And we have presidents, and presidential candidates, urging gradualism, just as Truman’s advisors did. But we have one other thing Truman also had: the American people, believing, however tenuously, in the principles of our founding documents. That’s why, despite the patterns, I can’t surrender to fatalism. Because the patterns of American history aren’t as inevitable as they seem.

Wednesday, April 24, 2019

The Inevitable Patterns of American History, Part 2

Richard Gergel, Unexampled Courage: The Blinding of Sgt. Isaac Woodard and the Awakening of President Harry S. Truman and Judge J. Waties Waring

This essay follows from The Inevitable Patterns of American History, Part 1

Isaac Woodard served three decorated years in the Pacific Theatre of World War II, before returning to segregated America. On February 12, 1946, just hours after receiving his honorable discharge, while still wearing his uniform and carrying his discharge papers, a White police chief pulled Woodard, who was Black, off a Greyhound bus on specious charges. Chief Lynwood Shull then beat Woodard so severely, both his eyeballs burst in their sockets, leaving him permanently blind.

What followed changed America. Racial violence was widespread following World War II, as it often is following America’s overseas wars, but Woodard broke the mold in one important way: he survived his attack. Most victims of racial violence in this era left a lynched corpse, and a close-lipped cadre of white witnesses, who were mostly perpetrators too, unafraid of consequences. Woodard found several allies, from Thurgood Marshall to Orson Welles, and eventually President Harry Truman.

When local prosecutors refused to charge a White lawman for beating a Black man, Truman ordered his Justice Department to commence a civil rights suit. Justice officials, unwilling to offend local law enforcement, made only a perfunctory prosecution. Seated before a White judge, J. Waties Waring, and an all-white jury, the Justice Department proceeded to apparently throw the case. Judge Waring was horrified. Years later, he said he’d never seen a case so deliberately bungled.

Both President Truman and Judge Waring descended from Southern slaveholders; Waring was considered classical Southern gentry. Both, however, had deep-rooted principles of fairness. When their ideals of justice came sideways across Southern race hierarchy, both men needed to decide which credo they’d rather support. Shown evidence that an honored veteran couldn’t get justice on the home front, they flinched. To their credit, both men would rather be honest than class-bound, and both became surprise anti-racists.

Isaac Woodard, in uniform, with his mother.
Archive photo, Georgia State University
Shocked by courts’ inability to provide Woodard justice, Truman engaged with civil rights like no President before. He became close with Walter F. White, head of the NAACP; later, he became the first sitting President to address the NAACP. When a coalition of Northern Republicans and Southern Democrats stonewalled his civil rights bill, Truman used the media to hold them accountable. This secured his second term, and also broke the Dixiecrat stranglehold on his party.

Judge Waring, simultaneously, used his bench to advance civil rights. Once indifferent to Black issues, Waring began reading scholars like Gunnar Myrdal, and became close with Thurgood Marshall. When he almost single-handedly overturned South Carolina’s longstanding Whites-only primary elections, he became a celebrity, appearing on the cover of Time magazine. His reforms had national implications. Then in 1951, already seventy years old, Waring became the first judge to dissent against Plessy v. Ferguson since Justice Harlan’s original dissent in 1896.

According to his press bio, Richard Gergel, author of the present volume, is a federal judge serving in the same Charleston, South Carolina, courthouse where Judge Waring once presided. This gives him access to troves of primary documents, which probably explains why Gergel so thoroughly understands Waring, a man Americans have largely forgotten. Gergel’s survey of Truman’s reforms is electrifying, but also broad. His most specific and historically detailed chapters deal extensively with Judge Waring’s accomplishments.

Truman’s experience desegregating the military and government provided extensive evidence that races could work together in high-pressure environments. Waring’s rulings permitted important civil rights cases to flow into the Supreme Court docket. Between them, the President and the federal judge helped create the conditions that made Brown v. Board of Education possible. That’s not saying these two White government officials caused it to happen. History is too complex for that. But it couldn’t have happened without them.

Evidence suggests that Sergeant Woodard, who lived until 1994, never truly appreciated how his case influenced civil rights history in America. Despite being celebrated by stars like Woody Guthrie and Billie Holiday, and taken on national tour like a rock star, he eventually faded into obscurity, as news figures do, and lived a quiet retirement in New York. Yet both President Truman and Judge Waring explicitly acknowledged Woodard for jolting them out of their indifference.

Isaac Woodard suffered violence, and accidentally became a linchpin of history. He made institutional racism visible, which energized government officials to address the problem. And it’s tragic that this event has receded from memory. As Gergel quotes Truman telling a blue-ribbon panel in 1947, discussing lynchings which still happened when he was a boy: “There is a tendency in this country for that situation to develop again, unless we do something tangible to prevent it.”


See also: Why a Town Is Finally Honoring a Black Veteran Attacked by Its White Police Chief

Monday, April 22, 2019

The Inevitable Patterns of American History, Part 1

Greg Grandin, The End of the Myth: From the Frontier to the Border Wall in the Mind of America

American history has always, on some level, involved pushing the borders. The Founding Fathers talked a good game of liberty and republic, but fomented the Revolution to overrule the king’s refusal to permit homesteading west of the Alleghenies. When White Americans used up the Smoky Mountains and Ohio Valley, they kept pressing west. The continent seemed limitless… right up until giddy expansionism met its limits.

NYU and Yale historian Greg Grandin has long specialized in the parts of American history where optimistic rhetoric runs crosswise against realistic limits. Whether it’s capitalism’s inability to establish private overseas empires, in Fordlandia, or the perverse social forces aboard slave ships, in The Empire of Necessity, he’s translated obscure corners of history into plain vernacular English. This, his most sweeping book yet, examines not an event, but a theme.

The word “frontier” traditionally described a borderland, especially an armed and fortified borderland; ordinary civilians once avoided frontiers. Americans redefined the frontier as a place of opportunity. Before the current America even existed, James Madison created high-minded moral justifications for why perpetual settlement west expanded the potential of democracy. Washington and Jefferson elided moral language; they just pursued illegal land grabs on Native American territory.

But this perpetual migration had a dark side. Once White Americans secured their claims beyond the Alleghenies, and again after Jefferson finalized the Louisiana Purchase, the government used its unquestioned military supremacy to chase Indians off ancestral lands. Somehow, Whites always thought they’d gone as far west as they ever intended, and they’d never need lands they’d just given to Indians in treaty. Somehow, it never worked out that way.

Greg Grandin
Grandin’s early chapters feature an interesting dualism. America’s twin impulses are embodied in two people: Andrew Jackson and John Quincy Adams. Even when these individuals aren’t present, their rhetoric defines the direction of history. Jackson advocated complete libertarian expansion and rejection of limits… for rich White people. Adams predicted this expansion would create new conflicts later; he accurately foresaw Indian wars, the Mexican-American War, even the Civil War, decades ahead.

America’s perpetual westward urge served East-Coast interests perfectly well. While clearing land for Northern industry, and Southern slavery, the western frontier provided an outlet, a “safety valve” they called it, for violent racist tendencies. When circumstances got too heated Back East, they sent poor Whites to fight frontier wars further west, especially the Mexican-American War. Eventually, however, those Whites always came home, suffused in racist propaganda and experienced in war.

Throughout the Nineteenth Century this pattern repeats itself: use racist lingo to justify driving Natives and Mexicans off their land, then resettle that land, often with slave plantations. The impulse aims to push violence outside settled White land. But it always comes home; from race riots in eastern cities, to the Civil War itself, America’s race-baiting rhetoric always created new violence in the home territories, which justified further westward expansion.

Then, abruptly, America hit the Pacific.

Inevitably, America’s myth of limitlessness ran into geography’s limits. Diplomats drew an enforceable border with Mexico, and discovered they had to enforce it. So the aggression had to travel outside America. Beginning with the Spanish-American War (which Grandin calls the War of 1898), America became a world power, engaged in affairs outside its nominal borders. The theatres of conflict changed; results remained the same.

Frederick Jackson Turner floated his “Frontier Hypothesis” in 1893, arguing that American history only makes sense when considered through the frontier. Turner believed frontier living tested Whites’ fortitude, gave Americans boundless optimism, and explained American industriousness. Grandin notes, however, that Turner consciously ignored how Whites only settled the frontier after the military had courteously cleared Indians off the land. This blinkered vision carried into the Twentieth Century.

When the frontier closed, it became a metaphor. Capitalists and socialists alike claimed their philosophy had genuine cowboy heritage. Eventually, as capitalism survived World War II, socialists lost the Cold War PR war. Presidents from LBJ to Ronald Reagan asserted their frontier qualities. Meanwhile, Dr. King anticipated that every bomb dropped in North Vietnam would eventually explode in America. Like John Quincy Adams, King proved right.

Grandin closes with consideration of today’s border conflicts. Vigilante groups, many comprised of veterans, “patrol” the border in cowboy hats and Confederate flags, and have for a century. Now, a critical mass of Americans, once enamored of the frontier, want to build a wall. America’s problems, Grandin says, remain unexamined, for largely continuous reasons. We’ve always postponed the inevitable, and paid for it later. And now we’re doing it again.

Wednesday, April 17, 2019

The Need For Holiness in a Secular World

A photo snapped live of Notre-Dame de Paris burning on Monday evening

I cannot imagine how many millions of people worldwide watched the fire unfolding at the Cathedral of Notre-Dame de Paris. Between straight journalism coverage, and the live-streaming from hundreds of smartphones around the city, we sat transfixed, wondering whether a piece of our physical cultural heritage would survive the night. Coverage of the al-Aqsa Mosque fire was more sporadic, probably because the mosque is harder for outsiders to see.

Most interesting for me, the coverage crossed religious bounds. The most direct reportage of Notre-Dame, not surprisingly, came from France and Britain, the closest large countries. These are countries where the plurality religion is now “no religion.” Yet one needn’t have any specific faith to recognize that the possible destruction of these two iconic sanctuaries cuts into something shared in our culture: something valuable, historic, even—dare I say—holy.

The retreat of religion from modern life has made holiness a loaded concept. I contend, though, that it shouldn’t. Rudolf Otto writes, in The Idea of the Holy, that in its oldest form, calling something holy doesn’t mean calling it “godly” or “pure.” Calling something holy means calling it “separate” or “set apart.” Holy places, holy experiences, holy times are those which we divide from our mundane continuity and recognize as unique.

This may mean having special direct connection to divinity, but that’s only one kind of holiness. We see this perhaps most directly in the ancient Greek playwright Sophocles. Both the aged former king in Oedipus at Colonus and the exiled master warrior in Philoctetes are punished for stepping on “holy ground.” This isn’t ground ritually consecrated, like a church sanctuary; in these plays, the holy ground isn’t even marked.

But it’s set apart from ordinary use, and humans aren’t supposed to walk there.

We all have experience with the holy, or anyway a longing for holiness, even without any particular faith. Some people find holiness in churches, mosques, and temples. Others find holiness in the people who congregate in churches, mosques, and temples. Still others find it in the Louvre, the National Mall in Washington, or their local veterans’ memorial. More would find holiness in the Olympic National Park (supposedly the quietest place in America), hiking from coast to coast, or in a boat on the ocean.

The al-Aqsa Mosque in Jerusalem. (Because of Jerusalem's urban design,
few good photos apparently exist of Monday's brief fire in action.)

What transcendent meaning we ascribe to holiness almost doesn’t matter. Holiness isn’t about getting close to God; it’s about a place carved out in life. A place where we are guests, visitors, a place where we’re invited in but never permanent residents. Both Notre-Dame and al-Aqsa have stood, inviolate, for nearly a millennium, surviving wars, occupations, tyrants, and atheists. They are different from our ordinary lives. Even without God, that makes them holy.

In both cases, modern life rushes right up to the perimeters of their sanctuaries. Nearly every photograph or live-stream of Notre-Dame burning is shot between high-rises, Second Empire apartment blocks, and other new-ish development. The sightlines around al-Aqsa are so crowded that I couldn’t even find a good picture of the fire. But in both cases, modernity crowds around the edges of the holy development, and stops.

Because these places are separate.

We watched these landmarks of holiness burn, fearful that we might be watching a diminution of potential holiness in modern life. Even if we aren’t Christian or Muslim ourselves, we recognize we have diminishing opportunities to experience holiness. Keeping afloat means dedicating longer hours to work. Childrearing expectations have changed, and we’re required to hover over our kids constantly. Very little in life is set apart.

We have fewer places we go simply to be in that place. We have fewer times reserved to exist entirely in that moment, surrounded by people also entirely in that moment. As we discovered at Standing Rock in 2016, those who only value life by its dollar signs are rushing to run pipelines and strip-mines through the few sacred spaces remaining. Modernity cannot stomach something truly set aside.

Thankfully both Notre-Dame and al-Aqsa survived their fires. The spaces remain set apart from modern activities. The surge of mourning we witnessed on Monday and Tuesday, which crossed lines of religion and irreligion, reflects humanity’s desire to step outside ourselves and exist someplace, sometime, outside space and time. Fortunately, such places will continue to exist, at least for a while.

Hopefully, as we reckon with the feelings we felt this week, we’ll also muster gumption enough to create holiness. Because it isn’t enough to mourn when the holy burns.

Monday, April 15, 2019

Do School Resource Officers Do Any Good?

On April 9th, 2019, a St. Lucie County, Florida, sheriff’s deputy rushed a sixth grader outside his school, lifted him overhead, and body-slammed him to the pavement. Another student, safe inside a school bus, captured the incident on his smartphone. It took five days for the story to reach me, a white guy in the hinterlands. And now I find myself wondering what we have public schools for.


News reports of the incident describe a child behaving in an oppositional-defiant manner, disregarding instructions and disrupting classroom activities. They don’t, however, describe a child engaging in violence. Quotes from the incident report describe the youth as “agitated,” as “punching his fist into the palm of his hand.” But they also describe him, as the video shows, as trying to walk or run away. Nothing describes him as a threat.

Therefore, I question the response: not just the school resource officer’s actions, but the school resource officer’s presence. Enough stories like this have emerged in recent years, about adult police officers using extreme physical force on children, mostly children of color, to make me question whether keeping police in schools does any good. These individual anecdotes aren’t data, certainly, but when enough accumulate, a pattern emerges, and that becomes evidence.

Sociologists have spilled copious ink about how most people, including Black people, perceive young Black men as older than White kids the same age. More recent research has determined this is also true about young Black women. From an earlier age, Black youths are perceived as more mature, more sexual, stronger, and more culpable for their actions, than White youths.

To this deputy, this also apparently means an 11-year-old’s body is more capable of withstanding a body slam.

Do I even need to say the follow-up? But they’re not. Not only will body-slamming a child, regardless of race, cause potential lifelong injuries and pain, it also has maladaptive effects on brain development. This youth, this child, will more likely fear interactions with authority figures, and will therefore behave in more oppositional-defiant ways, making future violent confrontations more likely. He’s also now at greater lifetime risk for addiction.

As a once-and-future academic myself, I have many schoolteachers among my friends. They describe incidents like the one which got this student suspended happening increasingly often: as more parents work longer hours, children start having their adolescent rebellious phase earlier and earlier. This child engaged in the most rudimentary form of in-school resistance: refusing to sit still or follow instructions. It sounds downright boring.

Once the Dean got the school resource officer involved, though, it changed the entire tenor of the event. It went from being a case of routine, if unusually early, adolescent insubordination, to a confrontation between the law and a suspected criminal. Any involvement from school resource officers turns any prosaic administrative action into a potential criminal case. That’s exactly what happened here: a trained cop treated a stroppy kid like a hardened violent felon.

The now-notorious image of a school resource officer flinging a teenaged girl
for failing to get off her cell phone, in 2015

Americans accepted the routine assignment of police officers inside schools because we wanted to protect kids against outsiders entering and committing crimes. They’re notoriously pretty poor at doing this, though. To cite only one noteworthy example, the armed, uniformed school resource officer failed to enter Marjory Stoneman Douglas High School while an active shooter was inside. The officer’s inaction probably contributed needlessly to the body count.

Rather than preventing outsiders from committing crimes inside schools, the primary stories we hear emerging regarding school resource officers involve them treating students (and, less often, teachers) as suspects. This gets especially compounded when working with children, who haven’t learned to internalize social norms yet. Incidents which teachers would, or should, see as opportunities for instruction, police will see as crimes that need stopped.

I’m glad my high school didn’t have a resource officer. My peers and I sometimes engaged in behavior that was technically illegal. However, we didn’t need busted; we needed patient adults to explain right, appropriate, socially standardized behavior. When you treat children like children, they have opportunities to correct their actions. When you treat them like criminals, you harden them against the future.

If you send trained police into any situation, they will look for crimes. Just as carpenters sent into schools will find repairs that need done, police will find laws that need enforced, because that’s what they’re trained to see. Presumably that’s what happened in Florida last week. But children aren’t criminals, they’re children. At least, they are until they have a record.
Edit: after publishing this essay, a classmate contacted me and informed me our high school actually had a resource officer. In those pre-Columbine days, he mostly conducted drug enforcement, and little of that; being a resource officer back then wasn't exactly a hardship assignment. The fact that I was free to not know about this makes me feel about the Whitest I've ever been.

Friday, April 12, 2019

Julian Assange: Bit Player In His Own Story

The young, idealistic Julian
Assange we all remember
Watching the storm surrounding WikiLeaks co-founder Julian Assange’s arrest yesterday, I’ve deduced one clear conclusion: something happened. Beyond that, it’s tough to say. Did free journalism and democratic ideals get dragged kicking and screaming from international asylum? Did a rampant tool get evicted from Ecuador’s embassy? Depends on who’s asking. Because Assange himself has become a total cypher; his arrest says more about us than him.

I don’t know about anybody else, but WikiLeaks first crossed my awareness in 2010, when it released reams of documents and hours of footage related to atrocities in Iraq and Afghanistan. Assange quickly became a nuisance for powerful people, ruffling the Obama Administration’s feathers by exposing its ongoing drone assassination campaign. Politicians love talking about transparency; WikiLeaks revealed they’re less friendly toward the practice.

But despite what some critics claimed after yesterday’s arrest, what Assange and WikiLeaks did wasn’t journalism. WikiLeaks simply deluged audiences with raw, undigested information, mostly in terms general audiences couldn’t understand. We still needed experienced journalists to translate piles and piles of documents, written in specialized jargon and flooded with sloppy data, into language us numpties actually speak. If we’re honest, journalists have done a bad job of this.

This became visible during the 2016 campaign. WikiLeaks became a pipeline of information directed from Moscow, through their website, into American news outlets, which kept the “Hillary’s e-mails” story alive despite a paucity of actual information. Though WikiLeaks was probably not coordinating with the GOP campaign, it was clearly carrying water for one side; Donald Trump praised the organization nearly 150 times at campaign events.

This puts people evaluating the Assange arrest in difficult straits. On one hand, the Pentagon Papers precedent says, provided leaked information doesn’t jeopardize national security or put lives in danger, it’s always better for citizens to know. The democratic process requires informed, literate citizens in full command of the facts, because only debate under such conditions has the ability to flush bad ideas from the body politic.

On the other hand, WikiLeaks didn’t release information neutrally, or in service to democracy generally. It’s tough to say whether they knowingly sat on information regarding corruption at the heart of the Trump campaign, including ties between Russia and Paul Manafort, George Papadopoulos, Michael Flynn, and others. Maybe they just didn’t have such data. However, they clearly flooded the information market with information damaging to Democrats, imported wholesale from Russia.

And don’t try telling me Assange didn’t run WikiLeaks during his seven years of Ecuadorian asylum. During that time, the embassy needed to rescind first his phone privileges, then his internet access, because he repeatedly violated their stipulation that he not engage in political maneuvers. Standards of international law require asylum seekers to not engage with outside or homeland politics, a standard Assange and Edward Snowden both disregard flagrantly.

The shambling, zombie-like Julian
Assange we saw getting arrested
Watching the mass-media interpretation of Assange’s arrest, and the professional journalists chin-wagging over exactly what it means, it’s clear everyone imbues Assange with their values. He’s a horrible harbinger of dictatorial influence on global electoral politics. Or he’s a martyr to high-handed authoritarian government, getting railroaded by a conspiracy between the Republicans and the Tories. The language reporters have used since the arrest has been downright Manichean.

I’d contend, though, that it’s possible to say Assange represented the best aspirations for the “marketplace of ideas,” while also conceding that his organization had become a clearinghouse for the worst realities of partisan hackery. Despite the moral dualism of mythology, nobody is entirely good, nor entirely bad. Humans, including journalists, are sloppy amalgams of ideal and expediency. As the First Epistle of John puts it, “If we say we have no sin, we deceive ourselves.”

Furthermore, I’d contend that the hunched-over, unshaven hobo we saw getting evicted from the Ecuadorian embassy yesterday wasn’t the would-be Free Speech Champion who went inside seven years ago. Isolated, lonely, and increasingly desperate, Assange appears to have undergone the same transformation that plagues rock stars and cult leaders. The pure-eyed Assange of 2012 isn’t the broken, captured fugitive of 2019. We all change; some of us decline.

Watching talking heads argue about Assange, I’d assert we aren’t seeing anybody talk about the real human. They’re discussing the ideals he represents to each individual. Assange himself doesn’t matter, probably hasn’t mattered for years. If we look past the figurehead he’s become, and examine the ideals we look for in journalists and other professional truth-tellers, perhaps we can escape the gravitational pull of the current controversy, and fix our long-term problems.

Monday, April 8, 2019

Time For a New American Radicalism

Bernie Sanders
“The Democratic Party Is Radicalizing,” screams a headline from The Atlantic Online, one the magazine apparently loves so much that they’ve re-posted it to social media every six or eight hours for the last five days. It’s written by Peter Wehner, a conservative think-tank veteran and “contributing editor” presumably brought on board to decipher the Right to the Left. His previous viral hit was entitled “What I’ve Gained By Leaving the Republican Party.”

The article includes such red meat for rabble rousers as: “If you want to understand just how radicalized the Democratic Party has become in recent years, look at the ascent of Senator Bernie Sanders of Vermont.” Or: “Alexandria Ocasio-Cortez—now the second-most-famous democratic socialist in America—is the unquestioned star among the base.” Wehner apparently thinks we’ll be shocked to discover America’s nominally progressive party supports progressive candidates.

Set aside Wehner’s tone, which resembles a rough draft for a neo-Nixonian “enemies list.” He clearly thinks asserting that America’s political discourse is “radicalizing” will scare voters into… something. “Radical” has become politics’ greatest bugbear, a monster jumping from behind rocks to gobble centrist discourse right up mwa-ha-ha! The word “radical” deserves its own subclause in Godwin’s Law because, like “Nazi,” speaking that word magically derails discussion.

Except what, exactly, does “radical” mean?

Over a decade ago, I read Shane Claiborne’s book The Irresistible Revolution: Living As an Ordinary Radical. Claiborne describes his Christian community’s mission of living as depicted in Acts, chapters 2 and 4. He calls this being an “ordinary radical.” Ordinary, because it’s something they do every day, a means of living, not some rote observation they perform on Sunday mornings before returning to modern life. But also radical.

Alexandria Ocasio-Cortez
“Radical,” Claiborne explains, descends from the Latin root radix, meaning “root.” It’s also the etymology of “radish,” a widespread root crop. To be radical, Claiborne explains, means to live at something’s roots, whether it’s one’s religious roots, political roots, or cultural roots—and, for Claiborne, all these roots are intertwined, becoming one massive expression of living for True Believers. Which, thinking about it, sounds pretty appealing.

For the sake of argument, let’s accept Claiborne’s definition. If “radicalism” means dwelling at your philosophy’s roots, what roots, exactly, are Democrats seeking? Considering the massive preliminary field of 2020 presidential candidates, we spot an array of Christians and atheists, soft-core capitalists and Emma Goldman socialists, career politicians and young upstarts eager to reinforce what they consider core American values. A veritable smorgasbord of ideological precepts.

If we’re seeking big-D Democratic roots, I propose reclaiming the roots of small-d democracy. And that means returning to one book nearly all of America’s Founders read: The Spirit of the Laws, by Charles, Baron de Montesquieu. Published in 1748, it was translated into English in 1750. Thomas Jefferson and George Washington had copies in their libraries. Transcripts of America’s Constitutional Convention of 1787 reveal the founders quoted it liberally.

Montesquieu compared different political systems at different historical times. He found political structures rested on distinct foundations. Monarchs required citizens who loved honor, while despots required citizens who loved (or feared) the despot. Republics, however, required citizens who loved virtue, who elected to place common good above personal profit, and who saw themselves as both individuals and members of their communities, at the same time.

Thomas Jefferson pinched his famous saying about “life, liberty, and the pursuit of happiness” nearly verbatim from Montesquieu. But he assumed his audience knew Montesquieu intimately, which modern readers generally don’t. Because when Montesquieu identified “life,” he didn’t just mean “not getting killed.” Montesquieu thought free society owed its citizens “a certain subsistence, a proper nourishment, convenient clothing, and a kind of life not incompatible with health.”

Charles, Baron de Montesquieu
(Wikimedia Commons)
In plain English, Montesquieu thought republics couldn’t countenance hungry, naked, homeless citizens. He didn’t mention health care, in an era when “medicine” mostly involved leeches and bloodletting, but one can easily imagine that among his precepts. Montesquieu didn’t think society should make everyone rich; he understood a finite world cannot support infinite desires. But he believed free societies provide a secure bottom limit below which no citizen can ever fall.

If that’s the radicalism Peter Wehner mentions, I can support that. Considering that the two biggest monsters Wehner names, Bernie Sanders and Alexandria Ocasio-Cortez, have based their platforms on making sure America’s most vulnerable citizens aren’t naked and starving, I’d consider that a good description of the current movement’s roots. I’d hope most Americans would agree (as they did during the Eisenhower administration) that these are roots worth watering.

Thursday, April 4, 2019

“Joker” and the Will To Destroy

Joaquin Phoenix's Joker
I initially didn’t intend to watch Joaquin Phoenix’s Joker movie trailer. I had various reasons for this: because recent DC movies have been mostly disappointing, or because the makeup scheme looks too deliberate, or because I just can’t keep abreast of today’s overproduced superhero industry. But like countless Americans, I couldn’t resist social media buzz, and I dialed it up. Now I find myself, shall we say, torn.

Audiences who don’t read comics probably accept one of the various prior big-screen Joker depictions, especially Jack Nicholson’s appearance in Tim Burton’s Batman, as authoritative. But Burton arguably damaged the character, by insisting he have the kind of character arc taught in university-level screenwriting classes: he needed to come from somewhere, and be going somewhere else. In the comics, the Joker doesn’t come from anywhere.

When Joker debuted in comics, in Batman #1 (1940), he had no “real” name. He had no visible motivation, made no demands, and simply announced his intent to kill. Then, having made the announcement, he carried through. This begins the arc which carries through today, though inconsistently: Joker destroys because that’s his nature. Unlike, say, Catwoman, a thief, or Scarecrow, a mad scientist, Joker doesn’t want anything. He just kills.

Heath Ledger's Joker
This doesn’t always sit comfortably with contemporary writing scholars. Good villains, we’re assured in critical literature, have underlying motivations. Even the Joker’s publisher, DC Comics, has broadly accepted the story from Alan Moore’s 1988 graphic novel The Killing Joke, which gives Joker’s insanity roots in a specific incident. It doesn’t clarify or systematize Joker’s actions, but it does establish a concrete inciting moment.

Yet this devalues the character, particularly from the context which birthed him. Gotham, a cartoonized depiction of Depression-era Manhattan, reflected the anomie of its original era pretty accurately. Research at the Santa Fe Institute has persuasively argued that cities, which provide opportunities for random and unanticipated interaction, intensify all aspects of human ingenuity. Large cities create more art, science, and innovation, but also more crime.

The positive and negative of urban life are inseparable. In order to avail themselves of employment, culture, and other life opportunities, city-dwellers knowingly put themselves at risk of crime. They consider that an acceptable risk. Cities, like Gotham, intensify all aspects of life, creating a richer palette for creativity, but also occasionally destroying the unprepared. Batman is prepared for city life, and flourishes, albeit violently. Joker is unprepared, and pays.

We see this somewhat in the Joaquin Phoenix trailer. Joker starts as Arthur Fleck, a momma’s boy and aspiring comedian whose domestic life, apparently nurturing his mother through early-onset dementia, leaves him too depleted to pursue his career. His name provides clues. He might be King Arthur, living like a peasant, awaiting rediscovery and coronation. But ultimately he’s a fleck of a man and, like dandruff, destined to get discarded.

See, that’s actually a pretty good kickoff for a story arc. In the Scorsese movies this trailer visually references, like Taxi Driver and The King of Comedy, a fundamentally admirable but weak character gets crushed and becomes the very thing he previously deplored. Audiences rightly consider these movies classics because they say something important about us: we all contain the capacity to destroy the very virtues we so assiduously create.

Jack Nicholson's Joker
But. Within the Batman duality, Joker isn’t a put-upon everyman. Understanding Batman’s continuity requires a willingness to set aside traditional morality and instead see things through Nietzschean eyes. Nietzsche insisted modernity required superior minds willing to resist reducing everything to “good” and “evil.” Rather, will we complete ourselves by asserting our reality onto the world? Most people, he insisted, won’t have courage enough to demand the world notice them.

Faced with the way modernity strips outdated concepts of meaning, Batman and Joker respond in opposite ways. Batman, like the Greek tragic heroes Nietzsche admired, asserts himself boldly on Gotham, creating order, or pockets of order anyway. Joker, by contrast, surrenders to the one force Nietzsche abhorred, nihilism, and simply tries to sow chaos. Joker doesn’t want anything. He doesn’t demand anything. He just exists to destroy.

Near the trailer’s end, we see Phoenix’s Arthur applying Joker makeup. That says everything. Arthur chooses to become Joker, and re-applies the makeup daily. “Doing Joker” is, to Arthur, an active and continuing choice. But Joker doesn’t choose to do Joker, he Is Joker, a condition of existence independent of will—and thus, in Nietzschean terms, a failure of will. Arthur continues the action; therefore he wants something. Therefore he isn’t the Joker.