Friday, October 28, 2016

Why I Started Buying CDs Again



I did something this weekend that I haven’t done in years: I bought a CD. For a guy who once spent nearly as much on recorded music as on books, that’s saying something. Though I resisted downloading for ages, I don’t recall the exact point when my music collection came to consist of digital encodings with no physical basis. Probably about the time Amazon Prime began offering full music albums at no added cost with their service.

But I impulsively purchased Waiting For the Dawn, by the Mowgli’s [sic]. I downloaded the album over a year ago, but only upon seeing it displayed in a downtown vinyl record store did I realize: I’ve never listened to the whole thing. I simply shelved it in an incorporeal music “library” that includes hundreds of albums I’ve never listened to, by artists I really like, from Gordon Lightfoot to Marian Hill to the Scissor Sisters.

That joins movies I’ve never watched, and e-books I’ve never read, some of which I’ve paid for, others made available “free” through an expensive subscription service. My tablet is a cluttered mess of content, a virtual TBR pile that, if translated into the material world, would crowd me out of my apartment. Why can I not power through this content? I’ve spent a week thinking about that, while I’ve kept the Mowgli’s on near-constant repeat.

Shlomo Benartzi of UCLA writes about studies, going back to the 1980s, of humans’ reading comprehension and retention when reading off a screen. Almost from the time computers became compact and affordable enough to read from, researchers began comparing screen reading with paper reading. And from the beginning, the news has been bad: when reading from a screen, our retention and processing ability drops by half. Human beings read better off printed pages. We probably always have.

Back when the Apple IIe was peak design, researchers attributed this to kludgy 8-bit type. No word on whether the “printed” reading came off a dot-matrix printer. But it gets complicated, because today’s screens aren’t kludgy. Typical laptop and tablet screens now offer resolutions, in dots per inch, greater than most cost-effective printing techniques, often twice the image resolution. Yet outcomes are unchanged. Screen reading from the most current iPad reduces retention by fifty percent, the same figure.

The Mowgli's [sic]

Benartzi suggests this reflects a paradox: excessively clear images enter our brains with minimal resistance. We don’t recall information we don’t struggle to process. Making screens slightly less legible might improve reading retention. He makes a pretty persuasive case, backed with evidence. But it misses an important point: whether clear or kludgy, reading from a screen reduces reading retention by half. This applies to both simple and difficult reading. Screens are the only common factor.


Simply reading digitally makes human brains less receptive to information, and to change, than reading off paper does. I lack conclusive scientific data, but I suspect this happens because digital reading is both cheap and ubiquitous. With just a mouse click or touchscreen tap, the specific information goes away, occupying no space and demanding no commitment. Physical books have mass and require storage. Paper books, with their multisensory experience and material weight, demand our accompanying psychological engagement.

This extends, I propose, to all digital media content. Streaming a Netflix movie requires no commitment, something owning a DVD absolutely demands. Playing a CD makes us listen to the whole thing, or at least consciously program which songs to include; downloading the same content doesn’t. I’ve spoken to a few people since buying this CD, and they agree: they download full albums, then listen to just the singles. Pandora streams or Spotify playlists? Pshaw.

Basically, I’ve concluded that when we don’t commit to owning the physical artifact, we don’t commit to consuming the content. Because downloading a Kindle book doesn’t require us to handle or store a material item, it doesn’t require us to care enough to finish the book. Downloaded music doesn’t commit us to care about anything besides the singles we already know and love. Streaming TV and movies are passive, requiring virtually nothing from us.

Don’t mistake me. After buying that CD, I did what I’ve done for a decade, ripping its content onto my laptop. Owning the item isn’t a magic talisman; we still buy the silver disc for its aesthetic content, not its physical beauty. But owning the thing still affects our thinking. It requires us to care about the entire package, not just the comfortably familiar bits. If you need me, I’ll be off browsing the CD rack.

Thursday, October 20, 2016

This Election Is Not Rigged

I first heard a Presidential election’s legitimacy questioned before me, in real time, in 1992. Oh, sure, the 1876 election still casts a squalid shadow, and the 1824 election, settled in the House of Representatives, probably arose from a secret back-room deal. But history is history; in 1992, I heard someone I trust second-guessing an actual election that hadn’t yet even gone to the Electoral College. That questioner was my dad.

“If people had voted right after the war,” Dad said, referring to Operation Desert Storm, “you would’ve seen this vote go a whole different way.”

Which, on balance, was probably true. Amid the high nationalistic feeling Americans enjoyed (if that’s the word) following our first decisive spanking of Saddam Hussein, George H.W. Bush was practically synonymous with patriotism. But the intervening months saw Bush’s military coalition dissipate, his domestic policy drift, and the American economy wind down. The high feeling of summer 1991 descended into the dejection of fall 1992.

So naturally, my father second-guessed, not the electoral outcome, but the electoral process. A devoted Republican, he thought the election ought to reflect his favored candidate’s career high, not the situation the voters actually faced in November. I was more conservative back then, a reflection not of deep commitment, but of a fear of alienating my father. So naturally, I agreed. If the system didn’t reward my favorite candidate, the fault lay with the system.

Chris Wallace, moderator of the third presidential debate
My father’s message has followed me through this campaign. And it has primarily followed me in Donald Trump’s voice. From the beginning of the primary season, Trump vowed he’d honor the procedural outcome only if he thought he’d been “treated fairly,” a concept he never defined. Into the general election, he’s repeated the claim that the national election would be rigged, implying that only a Trump victory would be truly just and small-d democratic.

I dismissed this language as Dad-like bluster, pure sour grapes from a protest candidate whose ultimate outcome was virtually pre-decided. But in the final presidential debate, Chris Wallace backed Trump into a corner, demanding to know whether the candidate would accept the legal, constitutional outcome. Trump literally replied: “I will look at it at the time.”

Secretary Clinton called this answer “horrifying.” I couldn’t agree more.

What Trump and my father have trouble accepting is that elections are arbitrary, but consistent. There’s no reason why presidential elections should happen on the first Tuesday after the first Monday in November, in years divisible by four, except that we’ve always done it that way. We have a change in government at consistent intervals, and those intervals don’t reflect transitory conditions of popularity or outside issues. Elections happen because it’s time for them to happen.

These victories don’t necessarily mean one person has won the ideological debate. Those can continue for decades. I own a book of essays by Isaac Asimov, entitled Fact and Fancy, in which he discusses global warming. The book is copyrighted 1963, and the specific global warming chapter is dated 1952. So that ideological debate has continued for at least sixty-four years, without reaching any resolution.

(Consider that next time somebody says we’re moving “too fast” on global climate change.)

Other debates have continued even longer. The ebb and flow of America’s race debate has persisted since whites imported the first African slaves in the 1520s. The debate has changed: should slaves be free, should races be equal, how do we define race? What about rights for women and the poor? Where do poor, non-white women fit? The debates mount up. And they don’t get resolved “on time.”

Elections, by contrast, happen at specific intervals. Congressmen need their constituents’ approval every two years to hold office. Presidents (and most state governors), every four. Senators, every six. Some positions, mostly local municipal positions like school boards, may need fresh approval every year. But these intervals happen with absolute, machine-like consistency. They happen because it’s time for them to happen.

And that includes the presidential race. George H.W. Bush lost the election because the election took place in November 1992, not August 1991. The timing must remain immune to the candidate’s personal popularity, because otherwise, skillful politicians could remain in office permanently. President Obama’s current Gallup ratings show majority approval, yet nobody, Republican or Democrat, should wish him installed forever.

So yes, fairness and justice in elections don’t describe the outcome. My favored candidate may lose—and often has. But justice describes the system, regardless of whatever human temporarily occupies the office. And refusal to accept the system makes the individual, therefore, unjust. That alone should be disqualifying.

Tuesday, October 18, 2016

Pardon Billy Bush!

Billy Who?
News came down on Monday, October 17th: Billy Bush, the third-hour host of the Today show, a personality so forgettable that I’d even forgotten he’d been suspended a week prior, was fired. As the Ed McMahon to Donald Trump’s Johnny Carson on that now-notorious Access Hollywood tape, he’d contributed some vulgar and offensive lines to Trump’s praise of sexual assault. So Bush is out, apparently. Meanwhile, Trump is still polling around forty percent.

Upon Bush’s suspension last week, his NBC stablemate Seth Meyers wisecracked that this means Today show hosts, in the lowest-rated hour of broadcast network programming, are held to higher standards than presidential candidates. It’s literally easier to pull somebody from interviewing celebrity poodle groomers, than from contention for America’s highest office. Admittedly, no serious presidential candidate has ever said anything quite so revolting on tape. But still.

Fact is, I’m sympathetic with Billy Bush. Not completely, of course, since you’d expect a TV pseudo-journalist with over a decade’s experience to know not to feed trolls. His sounds of possibly enthusiastic agreement with Donald Trump probably encouraged the classic bully to keep going with his diatribe. But I’ve been in situations where one foul-mouthed goon appeared to dominate a crowd. It’s very hard to resist in that situation.

And when I say “been in situations,” I don’t mean in grade school. In all-male workplace situations, where one dominant personality winds other men up, that dominant asshat probably uses offensive language to control the discussion. Nobody in the crowd wants to be the first to speak against the type-A jackass, because it’s impossible to determine who really agrees, and nobody wants to be the one spoilsport. So things get worse rather than better.

Literally within the last month, in my current job, I’ve stood in conversation with a man who used the N-word. I’ve stood silent, watching two men push each other’s buttons over long-discredited conspiracy theories about Hillary Clinton, while three other men hung back, pretending to be apolitical, lest the two leaders’ increasingly heated tone get turned on us. And I’ve watched two dominant guys use sexual language approaching, but not topping, what Trump got recorded saying.

We’d all like to believe we’d have courage enough to speak against such situations. Ask most white Americans today (most, not all), and they’d probably say that, yes, if they time-traveled to that iconic Woolworth’s counter in Greensboro, North Carolina, of course they’d have courage enough to speak against the bigotry and violence. Surely they’d have sense enough to decry hanging witches in Salem, Massachusetts. Yeah, I’d know to give Mary and Joseph a warm bed.

But until you find yourself in such a situation, I seriously believe you don’t know how you’d respond. All your fine-seeming ethics fizzle and die when you realize you don’t know whether anybody would support you, or whether you’d find yourself the outcast in a situation you can’t leave—trapped on a bus with a rich, spoiled thug like Billy Bush, or trapped in a job you can’t afford to leave, like me.

It’s a situation I wouldn’t wish on anybody. Because for every Rosa Parks, whose singular sacrifice ignited a movement against entrenched bigotry, there are dozens of unnamed martyrs whose sacrifice did nothing. Civil rights historians record that many black Americans before Parks refused to relinquish their seats; most were jailed, and vanished from history. Only a confluence of circumstances, impossible to reconstruct now, made Parks the hero America needed.

Like me, Billy Bush found himself in a closed space, in a crowd of men, while one egomaniac pandered to the crowd’s reptile brains with tales, possibly fake or exaggerated, of sexual conquest. Yeah, Bush said “Whatever you want,” which Melania Trump has construed as “egging [Donald] on.” But nobody believes that. Bush’s short, grammatically incomplete sentences reflect a man scared to resist the tide.

So while I’m not exactly on Billy Bush’s side, I understand his situation. Merely having one woman within earshot would’ve changed the situation altogether. My workplace became, temporarily, a less hostile space when the company hired two young female electricians, and the guys trying to out-dude each other stepped back. And the Trump tape reflects a changed tone when Arianne Zucker appears.

Maybe Billy “Don’t Call Me George” Bush really isn’t that bad. Maybe he is. But he deserves the opportunity to explain himself on air, not just have his press credentials yanked because we’re outraged by Trump. He shouldn’t pay the price because there’s no established mechanism to fire a visibly deranged presidential candidate.

Saturday, October 15, 2016

The Heart of Mathematics; Mathematics of the Heart

1001 Books To Read Before Your Kindle Battery Dies, Part 74
Yoko Ogawa, The Housekeeper and the Professor

When a young single mother gets an agency referral to keep house for an aging, reclusive mathematician, she considers it another by-the-book job. This impression only deepens when she discovers her client has no love for anything but mathematics and number sets. However, when the mathematician discovers she has a young, intellectually struggling son waiting at home, he demands she bring her boy around daily. The change this sudden family wreaks, in a man who otherwise cannot change, is instantaneous and remarkable.

Yoko Ogawa's best-known novel, at least internationally, is the sort of project publishers release for sheer love of books. It's unlikely to be made into a blockbuster film, admits no franchise possibility, and has no fist fights or car chases. (It was adapted into a 2006 film, a quiet, understated study that’s never had an American release.) But it's the kind of book that makes me want to read, and it will enjoy the loyalty of anyone who reads because the word is a joy in itself.

Ogawa creates a world remarkably free of names. The first-person narrator is called only "I," and she keeps house for an invalid genius she only terms "the Professor." These two form a non-traditional family with the Housekeeper's son, nicknamed Root, in "a small city on the Inland Sea." The Housekeeper and her son build a bond with the Professor based on loyalty and his love of teaching. Their every accomplishment brings effusive praise from the old man they're actually caring for.

But the trick is that the Professor has a head injury that has scrambled his limbic system. Nothing entering his head leaves a mark lasting longer than eighty minutes. The Professor needs someone to care for, while the Housekeeper and Root long for a man in their lives to complete their troubled family. The Professor's yin finds the Housekeeper's yang. Root and the Housekeeper are inspired to be better people by the Professor, and seek after his praise, knowing as they do that in eighty minutes he won't even remember.

Besides the sudden, unexpected family, this novel also deals heavily with mathematics. The Professor loves baseball, but because of his injury, cannot watch entire games. Instead, he reconstructs games from stats on baseball cards. This book contains the first clear, vernacular definitions I’d ever encountered of perfect numbers, Napier’s constant, and other math concepts. But this isn’t a math textbook, and has reportedly frustrated teachers who’ve used it as one. Rather, it’s a love story to math’s human implications.
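For the curious, here’s a quick gloss of two of those concepts (my illustration, not the novel’s): a perfect number is one that equals the sum of its own proper divisors, and Napier’s constant e is the limit that emerges from compounding growth ever more finely:

\[
28 = 1 + 2 + 4 + 7 + 14, \qquad e = \lim_{n\to\infty}\left(1 + \frac{1}{n}\right)^{n} \approx 2.71828
\]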

The only proper nouns in this novel are prominent mathematicians and Japanese baseball heroes. The actual characters are named by their humanistic roles: housekeeper, sister-in-law, professor. In this regard, the novel recalls Expressionistic plays of the early Twentieth Century, peopled by characters with names like "Boss," "Stranger," and "Woman #4." Or perhaps it's more like Aesop's fables. But it clearly signals that these characters relate according to their responsibilities, not their identities.

The Professor must surely be one of the most interesting characters in recent literature. Like the mathematical constants he loves, he simply exists, unmoved by a world that continues changing without him. The only changes he faces come late in the novel, when his injury deteriorates further. In some ways, he requires coddling, lest an ignorant world break him. But in his absolute, incontestable stability, he requires others to change, to grow into his reality. Like math, his constancy is a change agent for an uneducated world.

Reputedly, Ogawa based her characterization of the Professor on Paul Erdős, a Hungarian mathematician who eschewed virtually all human contact in favor of math. But Ogawa refuses to make the Professor a bog-standard eccentric. Math, for the Professor, is not a drab science; it's a work of art and a mode of prayer. And it is this love of beauty and spirituality that inspires the Housekeeper and Root. Math is a tool that brings them together as a family and motivates them to reach for something higher.

The cerebral and episodic story, like many Japanese art novels, doesn’t explode like a stick of dynamite. Readers weaned on the cinematic style of paperback American fiction will look for Sturm und Drang that never arrives. But lovers of language magic will find a refreshing rest from breathless American fiction. This novel’s self-selecting audience comprises those who truly love when words change the way we see our world. Stunning, punchy, smart and touching. A book that reminds readers that we read for a reason.

Thursday, October 13, 2016

New Highs in Difficult Christian Scholarship

Oliver D. Crisp and Fred Sanders, editors, Locating Atonement: Explorations in Constructive Dogmatics

Atonement holds an ambiguous position within Christian systematic theology. For Temple-era Jews, atonement bought expiation for particular sins, through participation in Levitical ritual. But Christians believe true atonement forgives all sins, including Original Sin, through Christ’s death on the cross. Ordinary Christians in Sunday pews may find that explanation satisfying. But what, realistically, does that mean? High-minded working theologians have sought a self-contained definition of atonement for two millennia, coming no closer to finding it.

This book reproduces the proceedings of the third annual Los Angeles Theological Conference, January 2015. Twelve scholarly monographs by fourteen authors explore different avenues of what “atonement” means at the transcendent level. These seminarians present copious evidence, marshaled from both Scripture and prior theologians, describing some theory of how Christ’s death and resurrection change us. And though I’ve sometimes written that “this book isn’t for everybody,” this book sets a bold standard in intellectual exclusivity.

Reviewing the twelve individual papers seems daunting, particularly since I, a purely amateur theologian, don’t really understand several. This from a guy who reads Dietrich Bonhoeffer and Alvin Plantinga before bed. But many of these scholarly discursions address prior writers, or rely on terminology non-specialists won’t recognize (“dyothelitism”?). Be prepared, in reading this collection, to fetch your smartphone for cross-references. I can only recommend this book to ambitious, intellectually minded Christian audiences who read cautiously.

We can, without parsing each article, deduce some broad generalizations. For instance, these scholars pretty uniformly reject “penal substitutionary atonement,” the belief that Christ, in dying, took humanity’s sin upon himself. Christ died for our sins, as your evangelical cousin hollers, but Christ didn’t die in our sins. However, after reading this book, I cannot make more specific statements: we have twelve reasons why penal substitutionary atonement doesn’t work, and twelve theories postulating what does.

I encounter some problems when, upon finding something enlightening or problematic within certain articles, I cross-check the authors’ credentials. This book collects a mostly male, entirely Protestant, preponderantly Calvinist roster of seminary scholars and classroom intellectuals. Of the fourteen authors, only three have MDiv degrees, that is, pulpit minister credentials. All currently work as professors or lecturers, a tendency that comes across in their dense prose. This reads like professional scholars writing for other professional scholars.

Therefore, this book lends itself primarily to people who find themselves in teaching positions: scholars, ministers, Sunday School teachers. People who professionally guide others’ development need to understand beyond the merely ordinary level, and that deeper understanding gives them tools to address difficult topics from multiple angles. Anyone who has taught children and other newbies has encountered the dreaded question: “But why?” Having multiple answers helps address and defuse others’ incomprehension.

Not that purely curious readers, like myself, shouldn’t read this book. I shouldn’t say I “enjoyed” it, because it isn’t fun-time reading; it requires reserving hours for dedicated reading and, where necessary, research. But having done so, I do feel more capacious in my understanding, better equipped to discuss difficult topics. And with this book’s thoroughly footnoted (yes, footnoted, a rarity anymore) sources, I also feel I have more options for more inclusive future study.

I feel obliged to note, before anybody can derail the discussion, that knowledge isn’t necessarily faith. Somebody reading this book without prior faith will find its principles confusing and frustrating, especially as it speculates on Platonic ideals rather than concrete precepts that researchers could investigate in the lab. Theologically open readers may finish this book better informed, but not saved. Remember, Christ himself got most frustrated with the chief priests and scholars of his day.

Despite getting flummoxed by the authors’ respective erudition, I find plenty within these monographs that challenges my thinking. Though I feel no closer to achieving real answers, my Socratic wrestling with deep theological matters can perhaps proceed with more informed, refined tools. I now have names for principles that seemed merely airy-fairy before, helping me compress higher theological concepts into mentally comprehensible form. Of course, intellectual comprehension doesn’t hasten salvation. But it helps prevent errors.

As someone who’s complained volubly at times about Christian writing’s tendency toward soft thinking and anti-intellectualism, books like this energize my trust in fellow-travelers. It reminds me that the Christian scholarly tradition, often neglected today behind feeling-oriented bromides and low-impact sermonizing, retains its foundations behind the glittering television facade. Though knowing the answers to hard questions won’t save my soul, caring enough to investigate certainly contributes to a heart open to the Spirit’s sanctifying action.

Tuesday, October 11, 2016

The Politics of "No Politics"

Andy Griffith as Larry "Lonesome" Rhodes, in Elia Kazan's A Face in the Crowd
The Oxford English Dictionary defines “politics” as “Public life and affairs involving matters of authority and government.” This may shock many who, like me, have long associated politics with strict partisan behavior, and the divisive consequences of party membership. But if we take Oxford’s definition, politics is much more inclusive than we’d ordinarily believe, and much more inescapable. We’re all under somebody’s authority; therefore, we’re all subject to politics.

This fact struck me this week, when Los Angeles-based TV producer Paul Papanek shared a Facebook edit of quotes taken from Elia Kazan’s controversial 1957 classic A Face in the Crowd. The movie, about a poor Southern swindler launched to fame by television, attracts opinions as divided today as sixty years ago, when then-unknown star Andy Griffith openly disparaged and humiliated his audience. It also perfectly predicts this year’s presidential contest.


Papanek precedes this clip with the disclaimer: “I try to stay as far away from politics on Facebook as possible.” It seems we have two basic attitudes about politics this year. Either people, like me, unabashedly take sides and attempt to peddle their beliefs in the way they believe morally and intellectually right; or like Papanek, they declare their apolitical tendencies… right before launching into politics without naming names.

I understand the desire to avoid taking sides politically. Standard public etiquette has long insisted that polite people should always avoid talking politics and religion in civilized company, lest somebody take such umbrage at having their position maligned that all conventional civility gets abandoned. There’s little more appalling than watching a respected friend or colleague flipping their shit because they feel obliged to defend God or Party against heathens and blasphemers.

Yet even the etiquette professionals don’t buy that argument, not really. Judith Martin, the columnist known as Miss Manners, first became notorious in fan communities when, in her former capacity as Washington Post media critic, she wrote several columns on the theme that Star Wars is garbage. This began with her observation, about the first movie, that George Lucas had crafted a science fiction universe completely void of non-white people.

If that ain’t political, tell me what is.

So if even etiquette professionals don’t mind stirring the pot occasionally, why do we accept that having political opinions in public is something awful? A writer friend of mine desperately tries to avoid crafting anything “political,” an effort to avoid ginning up strong negative feelings in readers. I appreciate and understand her motivations. This being the Internet, I’ve received personal insults, though not yet threats, for daring to have, and express, an opinion on controversial topics.

We can probably agree that political partisanship, that is, strong and outspoken allegiance to an organized party and its candidates, leads to some pretty ugly behavior. In response to Donald Trump’s newly uncovered comments praising outright sexual assault, I myself have engaged in name-calling (though I don’t believe I was wrong). Having strong, party-friendly political opinions can often bring out the worst in people. We’ve all seen it.

However. When confronted with a choice between Column A, the bigot, and Column B, the prevaricator, we cannot really retreat into Column C, neither of the above. (Yeah, I know Gary Johnson and Jill Stein exist. Let’s keep the conversation streamlined.) If we accept Oxford’s definition, that “politics” is the relationship between people and power, then there is no option of having no political affiliation. The only question is what affiliation we’ll have.

Donald Trump and Hillary Clinton both want to change the American social landscape. Clinton wants to use government authority to defend the powerless in society, while Trump believes giving society’s best tools to the already powerful will lift everyone else up, too. We may publicly avoid endorsing either candidate, lest we start fights, but that isn’t the same as avoiding politics. Because “no change” endorses the present, which is also political.

Industrialized agriculture, carbon-burning technology, the spewing of invisible toxins into America’s air, soil, and water—“no change” means letting these conditions continue. And since these trends are peddled, not by ordinary farmers, drivers, and workers, but by monied interests who profit from the status quo, the people-to-power dynamic persists, regardless of partisan allegiance. “No change” is the party of right now. Which is political.

So yeah, you can avoid choosing the elephants or the jackasses. But that doesn’t make you righteous. That just gives your vote to “Lonesome” Rhodes, the Andy Griffith character above, the guy who profits from you not thinking about things. If that lets you keep the peace at dinner parties, God bless you. But don’t think you’re escaping the political trap by giving your vote to that guy.

Saturday, October 8, 2016

In Praise of Staring At Screens


An observation: when I'm bored, I spend a lot of time staring at my phone, tablet, or other networked technology.

A related observation: when I have something meaningful to do, I ignore my technology.

I thought about this recently when someone close to me trotted out the old canard about how screen-staring drives wedges into polite society. Photos of hundreds of people gazing, oracle-like, into their tiny networked do-funnies have become the iconic image of our age. I've read professional talking heads complain (mostly online, ironically enough) that such behavior promotes isolation, social anxiety, job dissatisfaction, divorce, and worse.

Such claims have long bothered me, for multiple reasons. First, they confuse correlation with causation. People on tech are dissatisfied, but are they dissatisfied because of their tech? Or are they on their tech because they're dissatisfied? Which introduces my main point: I know which of those, for me, drives the other.

At work, when I get foisted onto menial, second-tier jobs, like sweeping floors, I have a hard time not checking social media on my phone every fifteen minutes. Jobs like sweeping, where even visible progress is transient, and I know I'll have to re-sweep the same floor tomorrow, leave me restless, with little to do with my brain, and a nagging desire for contact. Tasks where I have a measure of autonomy and benchmarks for accomplishment, like carpentry, don't make me want to check my phone.

And it isn't just at work. In my private life, if I spend Saturday among friends who lift me up, talking about shared interests, I feel no desire to check my phone. I discovered this when I spent one day this summer with a friend, and realized I hadn't even thought about my phone for six hours. Self-paced days spent writing, working with my hands, or even watching clouds at the park, leave my brain too peaceful to worry about my tech.

Saturdays spent trapped doing nothing, or something that bores me, have the opposite effect. Tedious activities, like scrubbing the toilet, or unpleasant activities, like going into Wal-Mart, have me reaching for my phone before I even intend to. When my family visits, time spent talking invigorates me. But they inevitably, at least once during each visit, want to hit the craft shop. I see everything I want there inside ten minutes, and spend the rest of the time thumbing listlessly at my phone.

The human brain, when presented with no reward, searches for future reward. We instinctively know this if we think about it: the hope of future reward is what makes gambling casinos such a profitable industry. It also fuels certain religious organizations, the ones that promise Heaven, eventually, to believers who don't make waves on Earth. Future rewards can be more powerful motivation than the present before our eyes.

Social media promise that, if we wait long enough, someone we like will say something meaningful. But ten years ago, we had to boot up our computers, at home or work, to access that promise. Now, thanks to mobile technology, that promise travels with us. We now, at all times, experience the nagging fear that someone, somewhere, is saying something interesting... without us.

Sometimes, we have motivation enough to ignore that nagging. Sometimes we don't. Which times are which probably has detectable patterns. Recall the images of phone-staring you've seen recently. If they resemble the ones I've seen, they've been taken on buses, in waiting rooms, and other places people sit silently in groups. Places that thwart the human desire for community.

I suggest people spend hours daily browsing their technology, not because they're morally weak, but because they're bored, and possibly lonely. Even today's "social" activities, noisy events in crowded spaces, leave eager, curious minds bereft. People want human contact, a positive reward, and they'll chase it, whether directly or online.


I realize I'm generalizing outward from my personal experience. I believe my experience is typical, but I have no proof of this. Perhaps I'm an outlier, and don't even realize it. But I truly think, if we consider our experiences, we've all felt the feeling I'm describing, at least occasionally.

If we think people spend too much time staring at their tech, scolding them won't do much good. We already know it hasn't yet, and probably won't start now. Instead, to keep people from getting lost in future expectations, it might help to ask: why do so many people see the present as something to escape?

Thursday, October 6, 2016

Irish Blood on Irish Hands

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 13
Neil Jordan (writer/director), Michael Collins

“War is murder,” Michael Collins (Liam Neeson) barks in the movie titled after him. “Sheer bloody murder! Had you been here the past year, you’d know that.” Collins directs this at Eamon de Valera (Alan Rickman), leader of the Irish independence movement launched in 1919, and eventual founding President of the Irish Republic. This represents the tension throughout this film, between the diplomat and public face of the Irish War of Independence, and the soldier who commanded that war.

The diametric opposite of films like Gandhi and Selma, which selectively celebrate peaceful rebels against unjust governments, Michael Collins unabashedly praises a man willing to kill people, burn buildings, and smash property to achieve his people’s independence. Though auteur director Neil Jordan makes Collins a hero, he doesn’t prettify him, or his struggles. Wars are won, this movie makes plain, by people who have no place in the peace. In fighting, Michael Collins loses everything.

The story unfolds against the background of British occupation in Ireland. The Irish people mainly speak English, and London-style rowhouses pack close against Dublin’s sooty streets. Like Rome and other empires before it, Britain has brought Ireland stability, but at the cost of crushing the occupied people’s identity, usually through violence. Ireland remains peaceful if the Irish stop being Irish. But one peep of demand for equality, justice, or freedom, and leaders meet flying British lead.

Born working-class and hungry in West Cork, Michael Collins rose through Sinn Fein ranks by a combination of strategic insight and public eloquence. But when the Easter Rising of 1916 ends in disaster, and a massive, violent purge of Sinn Fein leadership, Collins survives through sheer luck: the British don’t realize who he is. They turn him loose on Dublin’s foggy streets to resume agitating for independence. He doesn’t take long to develop a following.

With de Valera, Sinn Fein’s intellectual head, imprisoned in Dartmoor, Collins inaugurates a second violent uprising. Where “Dev” prefers direct military confrontations, which untrained, undersupplied Irish cannot possibly win, Collins issues a public warning that collaborators will henceforth be shot on sight. “There’s just one problem,” Collins’ aide Harry Boland (Aidan Quinn) says. “We’ll have to do it.” And so, in public daylit streets, they do. (This movie’s unsentimental depictions of violence may disturb some viewers.)

Alan Rickman (left) and Liam Neeson in Michael Collins
De Valera finds Collins’ tactics appalling. Even when they work to Dev’s favor, springing him from prison and helping him re-establish his underground government, Dev attempts to constrain Collins. Dev wants the world to recognize Ireland’s legitimate independence, and fears Collins’ assassination campaign makes the Irish look depraved and psychopathic. New York-born Dev spends a year in America, wheedling uselessly for Woodrow Wilson’s recognition. Collins spends that year killing quislings. Only one man makes any progress.

Throughout, Collins attempts to retain a normal Irish existence. He tries playing football with the boys, singing at craics, and maintaining a relationship with his love, Kitty Kiernan, played by Julia Roberts. (Roberts, a blatant nod to American markets, has the worst fake Irish accent since that prancing ninny from the Lucky Charms commercials.) But the longer the story continues, the clearer it becomes he can’t be normal. Wartime makes him choose: life, or nation?

We already know Collins wins. But Dev’s last gambit pays huge dividends: rather than negotiating the peace treaty himself, Dev sends Michael Collins, nobody’s idea of a diplomat. This produces a treaty that grants Ireland incomplete independence, partitions the North, and gives Churchill control over Ireland’s military affairs. Overnight, Collins descends from national hero to villain, and civil war erupts. Collins, who spent years fighting for Irish independence, is forced to fight against Irish partisans.

This movie has garnered criticism for historical inaccuracies, particularly lionizing Collins at de Valera’s expense. Some important events are combined despite happening years and miles apart. Soldiers’ fates are presented in ways that didn’t necessarily happen, but make a better screen picture. Even Neil Jordan has admitted fictionalizing the story for better comprehensibility, especially for international markets. For historical purists, I recommend Tim Pat Coogan’s engaging, but huge, Michael Collins: the Man Who Made Ireland.

Americans like to worship heroes who supposedly achieve political ends through non-violent means. We make saints of Dr. King and Mahatma Gandhi, despite our own seven-year Revolution. This movie clarifies that, in global affairs, sometimes victory really matters more than the means. But this doesn’t make Michael Collins an unmixed hero. He gained his people a nation, but lost everything for himself. Neil Jordan stresses that war isn’t glorious, it’s just the price we pay.

Tuesday, October 4, 2016

Imagine a Workplace Built on Trust

Jonathan Raymond, Good Authority: How To Become the Leader Your Team is Waiting For

The common expression “work-life balance” assumes we compartmentalize whatever we do for pay from whatever gives our lives meaning. Jonathan Raymond, former attorney, tech CEO, and spiritual seeker, believes that’s a false division. Good leaders, in Raymond’s field-tested opinion, don’t manage human resources; they uplift people, a process that emboldens employees’ private lives as it improves their work output. But how, in a business climate of swinging machismo, can real leaders distinguish themselves from managers?

I applaud Raymond for avoiding chintzy buzzwords and trademarked Paths To Success. His approach involves self-scrutiny, human understanding, and professional mutuality. Though he uses occasional handy acronyms, and pinches liberally from simplified Jungian psychology, Raymond fundamentally calls managers to know themselves, and their subordinates, a slow, sloppy process that yields abundant rewards. Admittedly, this book is a billboard for Raymond’s management consultancy. But it exceeds other such books I’ve read by being grounded in facts.

For Raymond, leadership involves neither issuing orders and cracking the whip, nor heroically fixing others’ problems. Heroes, he writes, ensure their own job security by keeping subordinates dependent on frequent rescue. Instead, he encourages leaders to be “Less Superman, More Yoda”—that is, to give employees latitude enough to develop into the fully fledged individuals we already know they are. Leaders should avoid “company culture,” a buzzword that apparently offends Raymond, instead mentoring capable workers.

In pursuit of this goal, Raymond cites certain portable tools like the OWNER chart and the Accountability Dial. Accountability looms large in Raymond’s thinking, though he reserves it for fairly late in the book, since “accountability” often devolves into punishments and rewards. Instead, he systematically encourages workers to own their job performance, while demanding managers exercise good (rather than “borrowed”) authority over their charges. His system is portable enough to implement without necessarily hiring consultants.

As Raymond writes early, “the health of a [company’s] culture is equal to the collective ability of the people who work there to feel the impacts of their actions on others.” This requires employees to engage one another as fully human. Not necessarily equal: employees will resist improving their performance if they don’t anticipate consequences for their actions, and your buddy can’t sack you. Instead, when managers and employees consider their relationship mutual, outputs improve.

Jonathan Raymond
This creates certain tensions. Leaders guide and mentor their subordinates, but do so without necessarily prying into workers’ personal lives. (Raymond uses the term “therapist” early, but ambiguously, and doesn’t harp on the concept.) This means meeting workers where they are, rather than bullying or chivvying them into an inferior relationship. It doesn’t mean exempting workers from fallout for their more egregious mistakes; Raymond makes clear that healthy culture requires setting firm boundaries and terminal limits.

Perhaps most important, in Raymond’s less-Superman, more-Yoda model, leaders must relinquish the ideal of invulnerability. They cannot deny their own mistakes or weaknesses while simultaneously requiring workers to come to grips with theirs. Raymond’s OWNER chart, which is so good I’d rather let him explain it, does include “Name the Challenge” and “Embrace Mistakes.” Though he doesn’t use the term, his model relies on mutual honesty, too often a missing quality in today’s workplace environment.

Raymond’s model, and the examples he uses, come from the white-collar technical world. As a former tech CEO, Raymond draws his experience from office work and skilled professional employees. However, with limited fine-tuning, his model, based on straightforward structures of group dynamics and social psychology, should translate into blue-collar work, like factories or construction. Having worked in both categories, I’ve seen the importance of a firm but uplifting hand on the tiller, something frequently missing from manual trades.

Much as I appreciate Raymond’s model, he frequently overlooks his own assumptions. For instance, he assumes management and labor share mutual goals, and simply need to reconcile means. His three leadership archetypes—Fixer, Fighter, and Friend—miss one equally important model, the Foe, who threatens and insults workers into compliance, often preemptively. Similarly, his five employee archetypes overlook the Foe’s favorite self-justification, the Slacker. Maybe a management theory predicated on mutuality can’t accommodate these unilateralists.

So, for organizational leaders who want strong members on resilient teams, Jonathan Raymond offers a structural approach that rewards everyone together. Mutatis mutandis, Raymond’s business approaches could empower schoolteachers, religious and political leaders, activists, and others who would bring out the best in others. Having faced multiple workplaces where management and labor had essentially adversarial relationships, I find Raymond’s vision energizing. Despite its challenges, his vision needs only leaders generous enough to implement it in real life.