Showing posts with label liberal arts.

Saturday, November 19, 2022

The Role of Art in a Divided Society

A still from Robert Wise and Jerome Robbins’ 1961 film of West Side Story

Sometime in the 1990s, I’ve forgotten exactly when, my sister’s high school theater program staged the classic musical West Side Story. Because of course they did; it’s standard theatrical repertoire. The only problem was, her school (she and I attended different high schools) was overwhelmingly White. The drama of urban tension between Hispanic and Irish communities was played out by farmers’ kids of mainly German and Czech heritage.

This meant, as you’d expect, brownface. Students playing the Puerto Rican Sharks gang dyed their hair, darkened their skin, and affected Latino accents. The White Jets, meanwhile, learned a stereotyped “New Yawk” accent and got ducktail haircuts. These students, who were entirely White and had lived in Nebraska most or all of their lives, immersed themselves in playing ethnically mixed East Coast characters, not always in the most sensitive ways.

Around twenty-five years later, my sister recalls that performance with a visible cringe. Troweling on makeup to play ethnically clichéd characters, which seemed broadly acceptable then, is patently unacceptable today. Nobody, except a few high-profile heel-draggers like Megyn Kelly, would pretend otherwise. But without the willingness to play characters who didn’t resemble themselves, I contend, these students would’ve deprived themselves, and their community, of something important.

West Side Story remains important theater, sixty-five years after its debut, because it addresses an important American cultural problem. The Jets and Sharks, defined by their race, attend the same high school and walk the same streets. But they never communicate, because they believe long-held bigoted myths about one another. When Tony and Maria dare fall in love, it transgresses one of America’s most cherished internal borders, the color line.

I’ve written before that teaching youth the humanities matters, because through art and literature, students see other people as fully dimensional human beings, with thoughts, feelings and dreams equal to their own. West Side Story reminds us that anybody, raised on such myths, could wind up believing them, and embracing the violence such division brings. Racism, this play reminds us, isn’t inevitable; it’s a choice we make, and keep making.

Arguably, that’s why White actors playing Brown characters is usually specious. If my sister’s high school had had sufficient Hispanic actors to play the Sharks, they should’ve cast accordingly. No matter how sympathetically those student actors attempted to portray characters who were culturally or racially different from themselves, they’d inevitably resort to stereotypes, sometimes hurtful ones, of groups of people they’d never actually met.

A still from Steven Spielberg’s 2021 film of West Side Story

But simultaneously, if the school refused to perform this play, nobody would’ve had the opportunity to receive its message. Not the student actors, who needed to stretch beyond their limited small-town experience, nor the audience who, in Western Nebraska, seldom get to witness world-class art. Beyond the high school, getting to see top-tier theater means traveling to Omaha or Denver, and most people can’t spare that much money or time.

This raises the question: is the message important enough to accept less-than-optimum messengers? I don’t want to be mistaken for advocating brownface; the specific event I’m remembering belongs to its own time and place, and should remain there. But the event gave students and the community an opportunity to see people whose lives and experiences were wildly different from anything experienced locally. Even if those “people” were actors.

Questions like this will become more important in coming years. In 1957, when West Side Story debuted, Manhattan’s Upper West Side was predominantly working-class, racially mixed, and volatile. Within five years, the combined forces of gentrification and White Flight changed local demographics. By the 1980s, the Upper West Side was heavily populated with yuppies, while the ethnic communities celebrated onstage had been forced into dwindling enclaves.

The White small town where my sister attended high school has experienced something similar: there are now considerably more Hispanic residents, and even a few Black residents. Because the Hispanic residents are mostly agricultural workers, though, they seldom mix substantially with the community. Interactions with what locals call “Mexicans” happen in public places, like grocery stores; the actual community members seldom get to know one another beyond nodding hello.

Artistic expressions like West Side Story will matter more soon, as American society becomes more segregated, more hostile, more like the Sharks and Jets. Opportunities to see “the Other” as equally human to ourselves might make the difference between peace and violence. And sadly, not everybody will have access to racially representative casting choices. Cross-racial casting isn’t ideal, but it’s better than denying audiences the art they need to see.

Monday, July 25, 2022

Thinking About Thinking, and the Meaning of Meaning

Jonathan Haber, Critical Thinking: The MIT Press Essential Knowledge Series

Throughout my teaching career, I regularly heard “critical thinking” extolled as one of my field’s primary goals. In multiple fields, but especially in fundamental core courses like freshman writing (which is what I taught), we repeatedly heard that students should emerge with more refined and practicable critical thinking skills. Seldom did I hear what those skills were or how they were evaluated; their innate goodness was just viscerally understood.

Educational entrepreneur and curriculum writer Jonathan Haber spent his career trying to better understand what critical thinking was, and how its principles could be made portable. This, one of his last publications before his abrupt passing, compiles his insights into an easily readable pamphlet for general or specialist readers. It encompasses the important debates, and explores them in plain English. It’s a good introduction to the necessary components.

Haber opens with the general principles and history of critical thinking. Though descended from the general history of Western intellectual process, critical thinking is a distinctly American distillation of that tradition, based on making mental processes practical. From Plato and Aristotle, to William James and John Dewey, Haber lays out the critical thinking heritage in brief, with an emphasis on useful concepts. It’s fun, exciting, and intellectually dynamic.

What, though, actually is critical thinking? Haber acknowledges the definition remains controversial, but academic consensus exists on several important points. Critical thinking involves reason based on evidence and testing, incorporating both scientific method and rhetorical communication. Useful application of these skills usually boils down to three important traits: “knowledge, skills, and dispositions.” That is, knowing information, using that information productively, and maintaining character traits like curiosity, open-mindedness, and creativity.

Though Haber dedicates an entire chapter to teaching and evaluating critical thinking, he doesn’t do anything as prescriptive as writing lesson plans. Though he describes having written social science curricula himself, he seems to prize individual and institutional autonomy. And he admits that evaluating critical thinking is slippery. Though scholars have written evaluative rubrics, none has achieved widespread use; evaluation is ultimately subjective.

Jonathan Haber

One declaration Haber is willing to make: repeated studies demonstrate that teaching critical thinking explicitly yields better outcomes than teaching it implicitly. Expecting students to absorb critical thinking skills through osmosis, in classes like math, writing, science, and history, generally doesn’t work. Students learn best when teachers explain exactly what skills matter, demonstrate them in action, and give students ample opportunity to practice.

I really like Haber’s process. He directly explains concepts I needed to learn through trial and error, and never wholly figured out how to apply. Though he doesn’t write teachers’ lesson plans for them, he provides enough access to existing resources, and enough keywords for ongoing research, that committed teachers can close that gap themselves. If I’m ever given another opportunity to teach, I’ll apply Haber’s principles from the beginning.

However.

Much as I appreciate Haber’s tutelage, I cannot help noticing shortcomings. First, Haber lavishly praises reason and analysis as benchmarks of critical thinking. He never acknowledges a growing corpus of scholarship, led by researchers like Jonathan Haidt, who contend (with evidence) that most human decision-making is instantaneous and preconscious. Though I think reason can retrain Haidt’s preconscious choices, such retraining must happen openly and deliberately. Which, right now, it isn’t.

Also, Haber praises advances in American critical-thinking education, and discusses how critical thinking makes for better citizens. How to reconcile this civic application with the evidence of increasing political intolerance around us? As critical thinking has become more widespread in American education, our body politic has become more divided, characterized by factionalism, in-group thinking, and violence. Almost like critical thinking in school isn’t enough on its own.

Indeed, one of Haber’s critical thinking virtues is “charity,” understanding the other side in the most forgiving terms possible. In today’s politics, one side desperately tries to play fair, court the center, and make peace; the other doubles down on sectarianism and anger. That side also decries higher-order education as anti-American and evil. You can’t educate people out of insularity when they consider fairness itself an immoral educational goal.

Therefore, let’s read Haber’s guide as introductory, not exhaustive. Haber himself talks about reading others’ claims to find the unspoken premise. In Haber’s case, the unspoken premise is that critical thinking is a challenge to power, not merely a goal: spreading deeper thought undermines some power structures, and those power structures respond by opposing education. Haber’s account is incomplete for not addressing current affairs. But it is, nevertheless, a necessary first step toward actually dealing with the problem.

Tuesday, December 12, 2017

The Liberal Arts Are More Important Now, Not Less

A famous portrait assumed to be Christopher Marlowe
Christopher Marlowe’s play Doctor Faustus opens with an illustrative scene. Newly minted in his doctorate, Faustus chooses several books from his shelves, takes a seat in his study, and… picks his academic specialty. He decides on sorcery only after examining, and discarding, his era’s three legitimate academic fields: Law, Medicine, and Divinity. Remember, he does this after already achieving his doctoral degree.

I recalled this scene while reading Nina Handler’s lament, “Facing My Own Extinction,” in the Chronicle of Higher Education online. Handler, coordinator of English at Holy Names University in Oakland, California, looks both forward and backward as she makes peace with her school’s abolition of the English major. Her school, she laments, has become a preprofessional training seminar, where any class without a career payout gets dismissed as unnecessary.

It’s tempting to defend English on transcendental ideals: that people who read develop greater empathy, or that literature potentially immunizes democracies against tyranny. But these arguments basically persuade the already persuaded. The educational reformers aggressively pushing STEM-focused curricula, subsidized by captains of industry and legislators desperate to not look backward, won’t hear such arguments, because they already disbelieve or dismiss them.

Instead, let’s contemplate the very material rewards that come from a diverse liberal arts curriculum. Two authors I’ve recently reviewed, Christian Madsbjerg and Scott Sonenshein, both write that business success and diverse education go hand in hand. Both authors observe something I’ve observed, though they have better source notes: specialists know how to do one thing well. But they’re incapable of adapting to changing economic, business, or professional conditions.

Christian Madsbjerg
The simple ability to read and understand multiple genres provides one recourse against such inflexibility. By stepping outside ourselves, our narrow range of experiences and specialized training, we learn ways of thinking that keep our minds active and developing. If we can wrap our heads around Gilgamesh, Hamlet, Elizabeth Bennet, and Bigger Thomas, we can also handle economic downturns, changes in job-related technology, and evolving moral values.

This isn’t only about English. In History, for instance, certain ideas are objectively correct, and therefore testable on a Scantron sheet. The last successful invasion of England was in 1066. But why? What made William the Conqueror able to sack England and take the throne, while Hitler’s Operation Sea Lion, much more thoroughly planned and backed with more advanced technology, got abandoned even after the Luftwaffe’s massive air campaign?

Don’t mistake this for abstruse woolgathering. America has a president who thinks Andrew Jackson could've posthumously prevented the Civil War, and an administration that thinks “the lack of an ability to compromise led to the Civil War.” These people, with the ability to guide the economy or take America to war, demonstrate a palpable lack of awareness about the weighty economic, social, historical, and military factors that shaped history, and continue shaping the present.

Perceptive readers will notice I haven’t really made a concrete argument for the English major specifically. Despite name-checking literary characters and historical events, I’ve made a more broad-based argument for a diverse liberal arts education. You’re right. Locking youths into an academic program, and resulting career path, at age 18, seems ridiculous to me. Like Faustus, they should choose their specialization only after gaining a diverse general foundation.

A handful of colleges and universities have followed this path. St. John’s College of Annapolis and Santa Fe comes to mind, as do Thomas Aquinas College and Reed College. These schools have softened or abolished undergraduate major departments, focusing on a diverse grounding in humanities and sciences. By contrast, Holy Names and other schools not only lock students into career tracks, they’re narrowing the number of tracks available.

Scott Sonenshein
Business, citizenry, and just plain humanity require a diverse grounding in the humanities. This includes language arts and social sciences, but also mathematics and physical science. Our society suffers a plague of specialists today. It’s easy to point fingers at lawyers who only know law, or businessmen who only know business. But I’d also include journalists who only attended journalism school, and bureaucrats who have spent decades only in government.

Remember Doctor Faustus. Having chosen his discipline based upon earthly rewards, he gets the knowledge he seeks quickly. But he almost immediately declines into a shadow of himself, conjuring famous shades for kings like a carny barker, and playing horse pranks on foolish yokels. At his death, even God won’t take him back. Because knowledge isn’t supposed to garner worldly payouts; it’s supposed to create a full and rounded soul.

Monday, April 3, 2017

Why We Need Liberal Arts in the Business World

Christian Madsbjerg, Sensemaking: The Power of the Humanities in the Age of the Algorithm

It’s become dogmatic in certain circles to insist that only STEM subjects matter; disciplines like English, Sociology, and Music have become passé. Danish-American strategy consultant Christian Madsbjerg disagrees. Traditional humanities disciplines are not backward or vestigial; in his experience, business professionals and forward-planning capitalists need these fields to function. We’re not all plugged into an algorithm, Madsbjerg writes. Humane arts are necessary in a wealthy society.

We’ve all grown bored with the repetitive claims. America needs more welders and fewer philosophers, Marco Rubio said. Madsbjerg quotes Jeb Bush that psychology majors are headed for jobs at Chick-fil-A. Today’s data-driven world gives us all important answers, and we can accurately predict outcomes if we simply have sufficient information. Businesses run on data, and we need more number crunchers, more code writers. The numbers speak for themselves.

Not so, Madsbjerg writes. Numbers almost never speak for themselves; they need humans to interpret them. In his early chapters, Madsbjerg details several high-profile incidents where numbers, adrift from human context, proved the exact opposite of reality. Software failed to predict movements in population, economy, even disease epidemiology. Only a well-informed human could restore the context these numbers lacked, giving them power to mean anything in the real world.

From this foundation, Madsbjerg builds five formal bromides about how humanities make American business possible. I could list them: statements like “Culture—Not Individuals,” or “The North Star—Not GPS.” But unlike too many business writers, who dispense fortune cookie platitudes with casual disregard, the real joy in reading Madsbjerg comes from his explanations. A schooled philosopher himself, Madsbjerg coaches readers through a thought process, not memorized “skillz drillz.”

Why, for instance, does the Ford Motor Company struggle to sell cars in India to people who are, demographically, almost identical to their core business in America? The answer, which Madsbjerg teases out across several chapters, has roots in cultural circumstances unrelated to cars. Running the bare statistics, middle-class Indian urbanites seemingly resemble their American peers. Understanding the difference requires pausing big data and unpacking respective cultural contexts.

Christian Madsbjerg
This doesn’t mean abandoning technical skills. In my favorite illustration, Madsbjerg describes a Danish architect scouting a location for a prospective Swiss bid. Important aspects of architecture, like engineering properties of glass, steel, and masonry, apply everywhere. But aspects of designing this building, to fit into this business and regional culture, involve understandings not taught in design classrooms. These require understanding language and industry and art—in short, understanding humans.

Madsbjerg, to his credit, does not produce another crinkum-crankum encomium to why liberal arts education makes us better people. I could’ve written that; I probably have. As a business consultant, Madsbjerg maintains focus on economic implications. Liberally educated professionals make better business executives, he insists, because their diverse education allows them to face difficult situations, sift conflicting evidence, and make decisions that improve everyone’s condition.

This requires a complex relationship with information. Business executives who turn data into outcomes don’t simply receive their information; they run it through filters that, for lack of better terminology, resemble anthropology, literature, and art. Business history, and Madsbjerg’s prose, is replete with examples of people, well-trained to do one thing (spreadsheets, double-entry bookkeeping), who stumbled altogether when confronted with the larger picture. Liberally educated professionals can simply adapt better.

And Madsbjerg himself is actually a good example of this. I’ve had several books cross my desk recently, offering to make readers into billionaire business icons; most either bury the audience in source notes and statistics, or tell long, rambling anecdotes that seem largely irrelevant. Madsbjerg, by contrast, creates the kind of balance that makes his advice practical: numbers where they’re necessary, stories where they’re relevant, always couched in comprehensible context.

Humans are sensemaking creatures, Madsbjerg writes, thus his title. Increases in data collection and statistical analysis have made sensemaking more powerful, nuanced, and worthwhile. But data never simply exists as-is; it always comes from somewhere, and requires human intelligence to make it applicable. Without that human intelligence, which comes from understanding literature, social science, and other humanities disciplines, numbers mean nothing, or even create more confusion than clarity.

I’ve read and reviewed several business books recently, and hated more than I care to recount. The worst are often mere billboards for the authors’ consultancies, comprehensible only if a Harvard MBA or the author is present. Madsbjerg has instead created a manifesto. Businesses, money, and data all serve people, he writes, not vice versa. Understanding this makes the difference between success and failure.

Friday, August 21, 2015

Worshipping at the Altar of Celebrity

1001 Books To Read Before Your Kindle Battery Dies, Part 55
Tom Payne, Fame: What the Classics Tell Us About Our Cult of Celebrity

Recent months have witnessed American popular culture consuming the characters it once elevated to mass-media stardom: Bill Cosby’s disregard for women’s autonomy. The gulf between Josh Duggar’s words and his actions. Even Al Gore, once the epitome of stuffed shirt respectability, has come in for blood-chilling accusations. It’s difficult to recall a time when so many vaunted personalities had repugnant secrets disclosed to salacious audiences.

Have we truly produced a generation of celebrities famous for ephemera? Is our cult of fame truly unprecedented in a history of noble, upright heroes? Tom Payne thinks not. Bringing together recent puff journalism, centuries of history, and the Greco-Roman classics, Payne demonstrates that, the more things change, the more they remain the same. Our need to make celebrities, and then to tear them down, seems ingrained in human endeavor.

A Cambridge University graduate, Payne comes from a background well versed in understanding the distant past and the power classics hold over the present. As a former newspaper literary editor, he’s also accustomed to building bridges between books and their audiences. He proves himself a masterful context maker in this, his first book, establishing how trends that appear newfangled and revolutionary are actually, deep down, old hat.

Payne starts with the observation that Britney Spears’ famous shaved head eerily mirrors Greek traditions of womanhood, when brides on the cusp of deflowerment offered their locks as holy sacrifice. Building on that, he finds parallels between how we treat athletes, politicians, celebrity marriages, and celebrity flameouts, and how ancients elevated demigods only to destroy them. He even finds matching jeremiads about the decadent present written in our antique past.

Though he writes with a spirited, even coarse, voice, Payne's work is rich with philosophical weight. Anybody can approach this book, but nobody can really read it without rubbing up against discomforting concepts. Why, he wonders, do we take pleasure in seeing the mighty brought low, a trend we perpetuate with reality TV shows targeted for maximum humiliation? Is this really as different from the democratic process as we might hope?

Fame, in Payne’s figuration, isn’t mere acclaim; it entails elaborate ritual, by which we first elevate, then destroy, our idols. Watching the cold-blooded glee with which online commentators have eviscerated Jared Fogle, it’s easy to assume we’re watching a reasonable response to public wrongdoing. But if Payne’s figuration holds, we’re actually witnessing a trajectory not unlike that taken by mythic heroes, like Achilles and Cassandra: accomplishment to acclaim to sacrifice.


In Greco-Roman times, this trajectory had undeniable religious implications. Only the destruction of the truly mighty had power to appease the gods. Though some heroes brought low could return, like Odysseus, such restoration required an arduous journey through the land of the (literal or figurative) dead. Maybe that’s why audiences love comeback stories, because our celebrities, once restored, have messianic glamour we long to emulate.

Today, the fame arc isn’t necessarily religious, inasmuch as it involves no appeal to transcendence. But if, by religion, we mean the liturgical rites that bind human societies, then fame worship serves the same roles today as in classical times. Consider how we make secular saints of celebrities, Bono for instance, then methodically disparage and destroy their divinity. That structurally counts as religion with no gods.

We treat the beautiful and the good as superior, out of place in our lives. Indeed, we easily confuse beauty and virtue (Payne specifies Angelina Jolie, though he elides her work in the developing world, a serious oversight, I think). Then when we find out that those we have exalted have the same venial shortcomings we do, we pillory them for their weakness. What does this say about us?

I wish Payne explained some of his pop culture references better. For instance, in his desire to build trans-Atlantic appeal, he talks about both American and British culture, forgetting that they aren't wholly interchangeable. Jade Goody is one of Payne's major motifs, yet how many Americans have heard of her? Not me, certainly. Payne explains the classics thoroughly, yet I repeatedly had to Google his more current exemplars.

Still, Payne challenges us to answer hard questions: what primal impulse forces us to sacrifice the idols we have built? What perverse pleasure lets us watch systematized humiliation of our heroes, then apply for the same concourse of fame? Do we have the same primeval urges displayed at the Bacchanalia, and do we, perhaps, want to be sacrificed? Payne offers no easy answers, but implies that the questions matter most.

Monday, June 22, 2015

Critical Thinking for the Uncritical

Martin Cohen, Critical Thinking Skills For Dummies

Any book promising to discuss “Critical Thinking Skills” inevitably faces the same problems faced by books discussing “Democracy,” “Education,” or “God”: no agreed-upon definition. Back during my teaching days, critics demanded we teach Critical Thinking Skills, but what that meant depended on whom you asked. Everything from cultural literacy to scientific thinking to the entire Liberal Studies core could fall under that eternally elastic rubric.

For British philosopher Martin Cohen, Critical Thinking Skills roughly correspond with the medieval Trivium of Rhetoric, Logic, and Grammar. That is, Cohen wants readers to effectively organize, stage, and defend their own arguments and debates, while analyzing the arguments of others. Having attempted, with variable success, to teach the Trivium in Freshman Comp, I applaud Cohen’s motivation. But I seriously question his deployment of facts.

In keeping with the “For Dummies” format, Cohen assumes audiences have no prior familiarity with his topic. He introduces concepts with minimal recourse to jargon, and where he requires technical language, he defines every term. In broad outlines, Cohen’s tutorials make good foundations for self-guided study, and he introduces valuable concepts for broader analysis. He stresses important skills, like questioning one’s own presumptions, testing evidence, and differing structures of logic.

While Cohen focuses on generalities and philosophic principles, he’s engaging and informative. Cohen introduces advanced thinkers and up-to-date research, from Plato to Locke to Benjamin Bloom, to support his positions. Many new concepts Cohen cites excite me, making me want to discover more. But without a Works Cited list, source mining becomes difficult. That’s where problems arise, because I know some sources don’t say what Cohen says they say.

As early as Chapter Two, Cohen makes mistakes I easily identify. He misrepresents Daniel Kahneman and Thomas Kuhn. Though he never cites Kuhn by name, he invokes, incorrectly, the concept of “paradigm shift,” a term Kuhn coined and defined. Not everybody reads such pointy-headed literature, especially not everybody buying a “For Dummies” book, so many people won’t recognize errors. I spot them, though, making me distrust other source citations.

He also defines terms incorrectly. For instance, Cohen defines ad hominem arguments as “Where the views of others are dismissed out of hand.” No they’re not. Ad hominem arguments deflect attention off the claim, and onto the claimant. Though the claimant’s person sometimes matters (elected officials’ party affiliations matter when they criticize one another), serious arguers usually consider ad hominem an attempt to salvage irreparable positions by muddying the waters.

Cohen sadly crossed my biggest line when he accused the BBC of stifling critical thinking on global warming. The BBC recently decided to stop giving climate change deniers equal time, simply because deniers lack scientific basis, and reject a position overwhelmingly shared by working scientists. You’ll literally find more debate in scientific circles about how gravity works than you’ll find about the reality of anthropogenic global warming.

But Cohen insists the BBC should keep the media debate open because deniers exist, and because—seriously—many deniers are “articulate.” That’s a shitty reason to sustain otherwise resolved debates. Many flat earthers, seven-day creationists, and ufologists are also articulate; yet hopefully Cohen would recognize that forcing Neil DeGrasse Tyson to debate Fox Mulder would waste everybody’s time. Critical thinking sometimes requires excluding nuts and extremists from grown-up discussion.

Worse, keeping debates open inevitably rewards the status quo. Hydrocarbon producers profit when we avoid curbing our consumption, but they know you couldn’t find a climate scientist who disputes global warming (and isn’t paid by the hydrocarbon companies) with GPS and a Michelin map. But they don’t need to win the debate; to prevent you changing your carbon-burning ways, they only need to prevent resolution. See Rampton & Stauber for details.

I spot these discrepancies because I’ve read these topics previously. My reading appetites are omnivoracious, and my teaching experience spurred me to unpack very difficult topics so I could convey them. Readers encountering Cohen’s model anew will, mostly, lack my background, and have no reason to realize he makes incorrect, incomplete, or slanted claims. They’ll lack, well, Critical Thinking Skills enough to analyze Cohen’s argument. Wow, very “meta.”

Because concepts like Critical Thinking Skills lack single, enforceable definitions, it’s necessary to evaluate books like this on their own merits. I’ve attempted to do that. And Cohen’s terms don’t accord with his sources. Many buyers, I fear, will miss these gaps altogether. The consequences, for our economy, our freedoms, and our democracy, could be disastrous. Cohen offers solid principles, but undermines himself with incorrect evidence.

SEE ALSO:
Thinking About Thinking is Harder Than You Think

Friday, February 20, 2015

Stupidity For Sale


The abject idiocy of certain people who claim to speak for the public good continues to baffle me.

Late last week, veteran journalist M.D. Kittle wrote a pig-ignorant screed on Wisconsin Reporter, a regional website affiliated with right-wing umbrella group Watchdog.org. Kittle inveighed against any reforms of higher education that persisted in requiring any liberal arts core, insisting that anything other than job skills doesn’t comport with the Wisconsin Idea, a guiding principle of Wisconsin’s higher education system.

I’m accustomed to students complaining about liberal arts requirements. A classmate of mine mocked his additional history prerequisite as a mere money-making racket, a demonstrably nonsensical claim at a land grant university, where tuition barely scratches the surface of costs borne by taxes and endowments. As a teacher, I recall one student bellyaching: “Why study math? I’ll never need to factor polynomials for the rest of my life.”

Kittle takes this complaint, which I understand from students—who by definition don’t recognize their own best interests—and extends it to truly ridiculous ends:
The escalating cost of higher education is due in no small part to an outmoded liberal arts belief that forces computer science majors to take Lithuanian pottery or some other course in order to obtain a degree that is supposed to say the student has the skills to do the job at hand. At the end of the day, it’s safe to say IBM and Microsoft don’t give a damn whether their employees can operate a kiln.
You’re right. Employers don’t care if new hires possess such skills. And, other than arts majors, only an idiot would take such minutely specialized courses. Except at the most high-aspiring research universities, you’ll have difficulty finding anyone who even offers such particular courses to undergraduates. And anybody stuffing their CV with such esoteric subspecialties deserves the ding such choices attract.

However.

Conservatives, like those who run Watchdog.org, formerly advocated restoring firm liberal arts curricula to contemporary universities. The National Review editorial board openly endorsed toughening core studies when Jesse Jackson was organizing protests against “Western Culture” courses. What happened? When did the American Right decide against upholding traditional standards in higher education?

Only in America do families send youth to universities to achieve job skills. America has a highly regarded network of well-developed trade schools, which leaders like President Obama have advocated strengthening. And well they should. While university degree holders have greater lifetime earning potential, trade school graduates have greater immediate earning potential. Tradespeople with associate’s degrees can earn enough, right away, to start paying bills and raising families.

Yet people like Kittle, and the students he cites, want the prestige associated with university credentials. They just don’t want that boring old university education. Rather than elevating themselves to the complexity of university standards, they want universities lowered to mere skills training facilities. Their desire for a la carte education treats universities like shopping malls, and professors as service providers, not mentors or caregivers.

Worse, this attitude is crushingly passive. The desire for mere skills training reduces education to the mere transmission of information from one brain to another, an approach that provably doesn't work. Moreover, students claim they want skills training, but I know they don’t. When I tried lecturing my students, their eyes visibly glazed. When I engaged in dialog and asked questions without obviously correct answers, they came alive again.

Students, by nature, don’t know what they want and need. Important concepts reveal themselves only laterally, often in surprising ways. As I've written before, we never study topics for their own sake. Music is beautiful, but music also relies upon strict mathematical relationships; music is math made tangible. Likewise, literature is a compressed form of thinking, and the ability to comprehend literature is, manifestly, the ability to have empathy.
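
To make that mathematical claim concrete, consider tuning. Western equal temperament fixes every pitch with one exponential formula, and the intervals we hear as consonant land near simple whole-number ratios. Here is a minimal sketch in Python, purely my own illustration rather than anything from Kittle’s article; the A4 = 440 Hz reference is the standard concert-pitch convention:

# Twelve-tone equal temperament: each semitone multiplies frequency
# by 2**(1/12), so twelve semitones (one octave) exactly double it.
A4 = 440.0  # standard concert pitch, in Hz

def pitch(semitones_from_a4: int) -> float:
    """Frequency of the note a given number of semitones above A4."""
    return A4 * 2 ** (semitones_from_a4 / 12)

print(pitch(12))  # 880.0 Hz: the exact 2:1 octave (A5)
print(pitch(7))   # ~659.26 Hz: within a quarter percent of the 3:2 perfect fifth (660 Hz)

The octave doubles exactly; the fifth falls a hair shy of the 3:2 ratio the Pythagoreans described, and that small compromise is precisely the kind of strict mathematical relationship every musician navigates.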

And, yes, Lithuanian pottery is a stupid course. Besides art majors, only somebody unthinking would take that class. But art history courses provide introductory studies in complicated visual communication, absolutely essential for engineers, physicists, and other skilled professionals who deal in spatial relationships. Just because some twenty-year-old doesn’t grasp why liberal studies matter, doesn’t excuse adults indulging their ignorance.

Students who only study their job eventually do their job exactly like everyone else. Employers treasure college grads because they can break the mold. But libertarians like Kittle don’t want such individuality. In reducing education to job training, Kittle by extension reduces schools to industrial parts manufacturing. And those parts are students. I’d consider that sufficient reason to be outraged.

Friday, January 10, 2014

Remember, the Enemy's Gate is Down

Kevin S. Decker (editor), Ender's Game and Philosophy: The Logic Gate is Down

Nearly thirty years on, Orson Scott Card’s Ender’s Game remains not only the author’s most read, most influential book, but a powerful outsider cultural critique. It has achieved crossover success, and often gets read by academics, public policy makers, and general audiences that wouldn’t normally touch science fiction. It has a dedicated intergenerational readership, and organized opposition, probably exceeded only by the Bible and Harry Potter.

And no wonder: despite space opera flourishes, “the Enderverse” touches common human experiences that transcend time, genre, or audience background. Diverse readers see themselves in Card’s tale of a genius, disgusted with his own prowess, molded by a state which exploits his wrathful tendencies. But until Kevin S. Decker collected eighteen new essays from varying disciplines, all centered on Ender’s tale, I’d never realized its place in Western philosophical tradition.

The assembled authors muster a tremendous array of philosophical insight on Ender’s struggle, mixing ancient and modern philosophy, plus domains including governance, mathematics, computer science, and military ethics. Some authors draw on familiar sources, from Aristotle, Aquinas, and Sun Tzu, to Michel Foucault and Hannah Arendt. Other sources don’t ring immediate bells: Friedrich Schelling and G.E.M. Anscombe aren’t household names, but offer remarkably valuable insights.

Card’s story maintains its popularity, and avoids ordinary science fiction obsolescence, by embracing ambiguity. This is remarkable in a frequently sententious genre. Despite science fiction’s general agnosticism, authors like Asimov, Heinlein, and Octavia Butler eagerly spotlight their culminating morals. Card, though he clearly cues our sympathy for Ender, doesn’t flinch from showing how Ender’s propensity for violence and self-delusion makes him a prime Battle School candidate.

Decker’s peanut gallery utilizes this ambiguity to explore topics like war, education, competitive behaviors, and the composition and governance of modern and postmodern societies. Topics Card addresses momentarily, or which vanish into the background of his highly complex narrative, get detailed treatment by serious scholars. Though fans have long loved Card’s Enderverse for its complexity, they’ll surely share my joy at discovering how specific that complexity really is.

Some authors, like Danielle Wylie and Kenneth Wayne Sayles III, use Ender’s story to explicate important concepts in ancient or current philosophy. Others, like Jeremy Proulx and Matthew Brophy, use philosophy to shed new, deeper light on Ender and his struggles. This open, rolling dialog allows credentialed scholars in difficult disciplines to communicate plainly with general audiences. It also lets academic philosophers espouse the uncertainty frequently reserved only for artists.

Importantly, these authors don’t necessarily agree. Kody Cooper, for instance, cites Aquinas and Augustine to justify Ender’s violence. But James Cook, of the US Air Force Academy, draws the opposite conclusions from the same sources, while condemning Battle School for not teaching even rudimentary military ethics. (Some critics, primarily online, find disconcerting Hitler parallels in Ender’s story. Though some of these scholars cite these claims, none embrace them. Godwin’s Law applies in print, too.)

These disagreements make for some of the best reading in an already surprisingly great book. Legitimate scholars, mustering robust support, debate topics like how responsible we can hold Ender for his actions, or whether governments can legitimately manipulate their citizens. Some of Decker’s scholars would exonerate Ender altogether; others suggest he’s self-deluding and culpable. Though all of these authors “like” Card’s novel, they disagree vigorously on what liking a dystopia means.

Be warned: these authors, with their intellectual debates and philosophical hermeneutics, are academics. These eighteen essays, averaging around twelve pages apiece, are serious scholarship, not Ender’s True Hollywood Story. Don’t undertake this book unthinkingly, or mistake it for light beach reading. Expect authors to challenge, threaten, and overwhelm you. Expect to learn by struggling with hard concepts. Expect, frankly, a sit-down version of Battle School.

Though publisher Wiley Blackwell released this book to coincide with the new Ender’s Game movie, it shipped before the movie debuted. These scholars address only the literature, which they address in great depth. They assume you’ve already read the book and recognize flip references to Bonzo, Bean, and Eros. I admit, it’s been a while, and I needed to cross-check my paperback occasionally. Decker’s authors write for Ender fans, not newbs.

Ender’s dedicated audience won’t be surprised to discover how much intellectual intensity Card packed into his book. We’ve loved it for that reason for over a generation. But this book lets fans attach names to concepts, explore ramifications in greater depth, and situate it in our larger cultural tradition. Decker’s authors won’t make you enjoy Ender’s Game; they’ll show you what loving this timeless classic entails.

Friday, November 22, 2013

Dear Sebastian Thrun

An open letter to Sebastian Thrun, former Stanford professor, CEO of Udacity, and pioneer in Massive Open Online Courses (MOOCs). Thrun announced last week that his company would de-emphasize providing online course content for universities, which has worked poorly to date, and turn its focus to specialized corporate training. Thrun's announcement has met cheers and mockery from predictable circles.
I blush to admit, I was one among the cackling chorus of educators dancing circles around the mouldering carcass of your high-minded promises. Your utopian vision of digital education is expensive, resists meaningful measurement, and suffers ebola-like attrition rates. It encourages an essentially private, passive relationship to education, with bleak implications for life and career. We who teach for love of students dreaded the failure of empathy your model betokens.

But once the giddy exhilaration of vindication wore off, I paused to ask myself why I felt so strongly. As a fan of Neil Postman, I initially attributed my doubts to the medium. Like TV, the online environment rewards entertainment, short attention spans, and spectacle. It doesn’t reward independent thought or context. But that doesn’t hold water, or I couldn’t write this blog. Online education’s well-documented limitations must run deeper.

Professor Thrun, you come from an industrial research background. Your contributions to exciting new programs like Google Glass and Google’s newly announced self-driving car approach legendary status. I applaud your accomplishments, because they transform our relationship to information and knowledge. But your public statements reflect your industrial background, openly treating school like a machine shop, and students like interchangeable parts.

Not that you’re alone in such opinions. Many writers utilize industrial metaphors to describe schooling in the coming era. I’ve reviewed some of these books here. But consider what this metaphor implies. You’ve redefined humans to have worth only instrumentally; that is, we derive meaning from our ability to work and make money. This contravenes the reasons we tell students to study liberal arts, because education makes our souls free.

Online education advocates have been appallingly inattentive to the limitations inherent in their model. From government studies to academic guidelines to the popular exposés linked in the prior paragraph, authors excitedly espouse classroom-free learning as education’s liberation. Even before these mass-market analyses, I remember an awestruck series of MacArthur Foundation white papers breathlessly expounding how digital technology would imminently render classrooms, institutional schools, and professional educators obsolete.

These books and studies all share one limitation, however: they evaluate digital learning venues according to responses from people who finished the courses. Even the US Department of Education, a fierce cheerleader for new technology in education, concedes when cornered that these courses feature an attrition rate approaching ninety percent. These courses particularly disadvantage poor students, minorities, and men—the populations already disadvantaged by the current system.

In an interview last week, Professor Thrun, you openly disparaged poor students for entering your company’s courses unprepared. But nobody is born knowing how to “do school.” Children model what they see growing up. I was fortunate enough, as you presumably were, to grow up in a household brimming with books, where my parents modeled self-improvement as a cardinal virtue. Therefore I started school already attuned to the learning process.

Your model places 160,000 students under one or two teachers’ guidance, as occurred with your celebrated Stanford Artificial Intelligence class. But teachers cannot guide 160,000 students. You cannot possibly read 160,000 papers, conduct 160,000 personal counseling sessions, or know 160,000 names. You can only offer standardized tests, which evaluate students’ rote memorization ability, and never determine whether they’ve thought about what you taught them.

Education is not about conveying information from one mind to another—or, it shouldn’t be. We don’t invite students to sit down and passively receive data into their otherwise blank minds. Such behavior invites helplessness and confusion—and, oh look, that’s how most students greeted your classes. Students may parrot your lessons without ever gaining significant understanding. Inexperienced students, especially from underprivileged backgrounds, need personal guidance to make the intuitive leap from fact to insight.

Let me state that another way: ours is not an information economy, because economies rely on scarce and desirable commodities. Information, today, is common as dirt. Rather, ours is a processing economy, where our ability to synthesize separate knowledge increments into greater wholes creates value, as you, Professor, did with Google Glass. Such processing requires guidance, mentorship, and nurturance, not an undifferentiated fact dump.

If I’ve learned anything at the factory, it’s that technology makes human discretion more valuable, not less. Machines are ultimately helpless without humans to guide them. But students entrained to passively receive information cannot guide anything, because they need guidance themselves. You yourself, Professor, have created a world where free-thinking minds making profound logical leaps are more valuable than ever. And you cannot create such minds by making students stare indifferently at a screen.

Wednesday, November 13, 2013

The Year's Best Alice Munro

Elizabeth Strout (editor), The Best American Short Stories 2013

Every year, after I finish reading The Best American Short Stories annual edition, somebody inevitably asks: “Was it any good?” As though that’s a yes-or-no question. I usually respond with: “Depends. What’re you looking for?” Every year seems dominated by some theme, some insight that doesn’t reveal itself initially, but only after scrutinizing multiple stories. This year, your response will depend on how much you like Alice Munro.

Munro became the first Canadian Nobel Laureate in Literature two days after this collection shipped. Pretty good for an author who only writes short stories, in a market where short fiction venues haven’t weathered the digital revolution well. If short fiction has any future in today’s marketplace, it’ll come from authors absorbing Munro’s influence. American literature once needed a thousand Mark Twains; today it needs ten thousand Alice Munros.

Well, this collection offers twenty, including Munro herself. Ironically, Munro’s contribution to this year’s collection, “Train,” is perhaps the most conventional story I’ve read from her. It has her accustomed generational sweep, and eschews climactic peaks, preferring gradual revelatory patterns. Yet she retains a sequential narrative and keeps focus on one defining character. It’s surprisingly linear from today’s most quintessentially non-linear narrative artist.

Though this year’s other featured authors don’t merely mimic Munro, her influence pervades this collection. Like Munro, most of these authors favor introspective narratives that resemble one character’s personal memoirs, rather than action- or dialog-driven external events. Two stories even utilize the diary format. And most authors eschew Freytag’s Pyramid, the movement from exposition to climax to denouement, which one of my writing mentors called the “Male Orgasmic Story Model.”

Instead, Munro and her votaries favor an arc of realization, as characters gradually uncover some concealed truth about who they are. Rather than one glaring moment when truth becomes unavoidable, these stories preponderantly prefer the friction that, with time, produces a pearl. Narrative becomes the process of discovery, not the history of moments. As Lorrie Moore puts it herein, “Mutilation was a language. And vice versa.”

Alice Munro
Different authors use this arc to different purposes. Karl Taro Greenfeld, in “Horned Man,” gradually builds a Poe-ish tension that, in its final moments, never gets resolved, leaving savory dread in readers’ brains. Kirstin Valdez Quade’s “Nemecia” unpacks the influence two cousins exercised on each other, growing up Spanish in the English-speaking Southwest. These stories showcase a dark side to what we might call Munrovian fiction.

Authors like Daniel Alarcón and Suzanne Rivecca display another face. Nobody would mistake any story herein for Pollyannaism, and only fools would seek happy endings between these covers; yet these authors refute hip nihilism. Rivecca’s “Philanthropy” describes the healing a social worker begins when she stops playing socially acceptable roles. Alarcón’s “The Provincials,” though, shows a young actor beginning maturity when he chooses what adult role he wants to play.

Not every author handles Munrovian influence equally well. George Saunders, in “The Semplica-Girl Diaries,” starts an interesting story rolling, poses timely questions… then just stops. I’m reminded of that advice so often given undergraduates: “This story ends where it should be beginning.” David Means’ “The Chair” features a protagonist who receives a spectacular narrative opportunity, but largely finishes where he began, resisting any opportunity for Munro’s powerful revelatory arc.

If this collection suffers one notable weakness, I’d cite its narrow aesthetic range. Of the twenty stories, two magazines, The New Yorker and Granta, contribute nine. The remaining eleven come from generously sponsored glossy magazines; quirky, experimental lit rags stuff the Honorable Mention section. This perhaps explains the preponderance of white and Hispanic authors, particularly semi-celebrities like Jim Shepard and Junot Díaz. (Didja ever think any critic would disparage Hispanic privilege?)

Of these twenty authors, ten teach university-level creative writing. Though I’m nobody to condemn academic writers, this seems a remarkable number, representing prestigious schools like MIT, Vanderbilt, and Stanford. Alice Munro just writes, that’s what she does, and it shows in her distinct vernacular style, which other authors mimic, but seldom capture. Does this reflect the editor’s horizons, or does it reflect who has time to write in today’s economy?

Notwithstanding such momentary hiccups, this year’s eminently readable collection gathers prime examples from today’s prestigious names and looming stars. All “best of” collections have subjective views, reflecting the anthologizer as much as the market. But in today’s turbulent magazine market, this collection demonstrates two important, almost inarguable facts: first, short fiction retains its place in cultural discourse. Second, we have seen the future, and it looks like Alice Munro.

Monday, May 20, 2013

Mom and Dad as Learning Coach

Jen Lilienstein, A Parent's Playbook for Learning

If I learned anything in my teaching years, it’s that most “remedial” students don’t really have a problem with the subject. They have a problem with the system. Teachers and students talk past each other, and even eager students become discouraged because school seems like an adversarial environment. Education innovator Jen Lilienstein wants to give parents and teachers the tools to make kids better learners.

Many learning experts don’t actively analyze students’ learning until roughly high school, or older. Lilienstein focuses on grade school ages, adapting the concept of “multiple intelligences,” as popularized by researchers like Howard Gardner and Thomas Armstrong. This holds that human cognitive abilities, like your child’s learning ability, are separate, distinct components, not one big “mind.” Students have more ready individual access to certain intelligences than others.

The classroom model we take for granted, which all of us who went to public (state) school shared, is not necessarily the best way to learn. Lumping kids together based on age and geography, and stuffing them into a classroom with one teacher who may or may not understand them, is cost-effective, but pedagogically inadequate. Even more so today, when budget cuts pack fifty kids into many urban classrooms.

But unless you can afford to homeschool your kids, which most working parents can’t, you rely on schools to prepare your children for their adult roles. That means parents must translate often prolix concepts into approaches children can understand. Your child hasn’t learned to close that gap. As a former teacher, I can attest that if you and your child don’t close that gap early, you never will.

Lilienstein uses an abbreviated version of the Myers-Briggs Type Indicator (MBTI), an inventory test designed to highlight personality strengths. She divides kids into eight learning categories, each of which could hypothetically subdivide further—use this book as an introduction, not a blueprint. Each learning type has its own distinct processing patterns, and parents and teachers can maximize learning by playing to these strengths.

Imagine your child loves activity learning, like art or sports, but has difficulty with reading. Lilienstein suggests teaching your child to finger-spell words in sign language, as a way to make English an activity. Or what if your kid prefers short bursts of activity over the tedium of book learning? Consider adapting Trivial Pursuit to make learning competitive, ensuring a measurable goal at the end of the process.

And not just your kids; Lilienstein suggests ways her principles can smooth communications with their teachers, too. Though she writes primarily for parents, Lilienstein encourages teachers to participate in the learning customization process. She has a lengthy section on group learning, allowing teachers to partner students with peers whose complementary abilities let them go farther. I don’t fully trust this idea—research on collaborative pedagogy is at best contradictory—but for teachers who share this value, Lilienstein’s analysis will help design better group environments.

Lilienstein divides her book according to learning category, signaled by helpful visual icons. This will especially come in handy for parents whose kids have different learning styles. My parents sincerely tried to help, but because my brain doesn’t work like theirs, their tutoring sessions frequently ended in tears. If they’d had this book thirty years ago, my life might look very different, and our relationship would feel much less strained.

I see two inherent risks with this book. First, kids could easily conclude adults will cater to them. Parents and teachers must emphasize that, while we want to utilize their learning strengths, they must learn to take the initiative. Lilienstein calls this a “playbook,” and illustrates the cover with a coach’s whistle, on purpose: while adults may call the play, students must run it in a field they cannot predict.

Second, parents could approach this book too passively. Many adults, like me, graduated from the “come in, sit down, shut up” pedagogical approach, and we learned to run the system by going along to get along. Teaching our children to be active learners requires breaking our own molds and thinking in innovative ways. We must constantly adapt Lilienstein’s guideposts to children’s growing minds, meaning we must grow, too.

Lilienstein wrote this book as a companion to her website, Kidzmet.com. Consider using both together to ease your child through the difficulty of school. Because we all need to learn, and cannot all afford private tutors, Lilienstein’s thoroughly researched assistance can make the difference between kids frustrated and discouraged by the system, and self-guided learners, ready for adult life.

Monday, May 6, 2013

Who Are U?

Jeffrey J. Selingo, College (Un)bound: The Future of Higher Education and What It Means for Students

Let us start with a statement college professors, homeschool advocates, and Jeffrey Selingo can surely agree upon: American higher education is too expensive. Budget cuts have jacked up tuition, schools spend scarce resources outside the classroom, administrative roles have become patronage plums, and deregulated loans put many working-class students in debt they may never beat. The question becomes: what do we do about it?

Books like this one matter, not because they attempt to answer the question, but because they advance the debate. No 250-page book can truly address all the options. Believe me, several noble attempts have crossed my desk. But they inevitably reflect the authors’ preferences for what American education should resemble. Therein, maybe, lies the problem: American higher ed has become perilously homogeneous.

Selingo, a respected educational journalist, addresses the question from multiple angles, gathering diverse sources with divergent views, reflecting real trends in recent debate. Because he addresses so much, I find myself swinging wildly. At one moment, I pump my fist and shout “Yes! Yes! Yes!” Then the next moment, I palm my face and mutter “No! No! No!” Then I ask myself the real question: why do I feel so strongly?

Education, Selingo says, has suffered in the last decade from a “race to the top” that involves little actual educational content. Highly groomed campuses and pricey sports championships attract new enrollees and alumni donations. But colleges, particularly private colleges, have offset these expenses by hiring adjunct instructors, concentrating efforts on grant-earning grad students, and packing undergrads into lecture halls of questionable pedagogical value.

We could reverse such trends by re-evaluating what education is for. The emphasis on defined disciplines and mandatory curricular trajectories is cost-effective and requires little effort from professors. Prestige majors with putative professional applications lock students into career tracks early. Instead, we should recall that employers, and society, love college grads not for their subject mastery, but for their wide-ranging ability to face new and unprecedented challenges.

I’ve made similar claims myself. But where the rubber meets the road, Selingo has a frustrating tendency to get giddy over unproven options. He especially shares contemporary reformers’ uncritical love for technology. Selingo writes: “Every new study of online learning arrives at essentially the same conclusion: students who take all or part of their classes online perform better than those who take the same course through traditional instruction.”

That’s just not true. Earlier this year, Selingo’s own publication, the Chronicle of Higher Education, ran the headline “Online Courses Could Widen Achievement Gap Among Students.” The more online courses students take, the greater their dropout risk. This especially applies to minorities, men, and students less prepared for the rigors of self-guided education. Multiple studies in multiple sources confirm this.

Consider: Selingo praises Thrun and Norvig’s celebrated 2011 online Stanford course, which attracted 160,000 enrollees worldwide. But according to his own numbers, this class had a completion rate of only 13.75%, barely a third of Fairleigh Dickinson University’s graduation rate, which Selingo calls “dismal.” Sure, a thousand students got job referrals, but would you pay to enroll in such a class for a one-in-160 chance of professional advancement?
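To make the arithmetic explicit, here is a back-of-the-envelope check using only the figures Selingo reports, as quoted above:

$$
160{,}000 \times 0.1375 = 22{,}000 \ \text{completions}, \qquad
\frac{1{,}000 \ \text{referrals}}{160{,}000 \ \text{enrollees}} = \frac{1}{160}.
$$

Twenty-two thousand finishers sounds impressive in the aggregate, but the odds facing any individual enrollee remain long.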

In fairness, Selingo repeatedly ventures in the right direction, but not far enough. He drops a one-sentence reference to St. John’s College of Annapolis and Santa Fe, which is one more sentence than I’ve seen elsewhere. Schools like St. John’s, Deep Springs, and Reed College, which soften or eliminate disciplinary divisions, graduate high numbers of desirable employees. Why not more in this direction?

Similarly, Selingo makes a fleeting reference to competencies earned through “internships outside of the classroom.” I’ve often suggested students could benefit from non-classroom education, particularly vocational students who could learn their fields faster through old-fashioned apprenticeship. I know research exists on this, because I’ve read it in authors like John Taylor Gatto. But Selingo just name-drops it and walks away.

Don’t mistake me. Despite my critical tone above, Selingo says plenty I think educators could stand to hear. We need to remain responsive to our students, providing personalized education that works around especially working-class students’ needs. And we must eschew discipline-based “skillz drillz,” instead empowering students’ higher reasoning ability. I may dispute Selingo’s details, but his thesis is spot on.

On balance, I do recommend this book as part of a well-rounded library on the reforms awaiting American higher ed. We may embrace his principles while rejecting his brass tacks. I simply encourage would-be readers to approach this book with their critical-thinking caps on.

On a similar topic:
Living For the New U
The Next University

Friday, January 18, 2013

The Forgotten Aesop

Chandler A. Phillips, Proverbial Aesop: The Complete Aesopic Proverbs Translated with Commentary

Aesop is remembered for his numerous fables, which are not simply the children’s stories we’ve long assumed, but served an important illustrative function in formal argument. But Chandler Phillips, a physician and engineer as well as a classicist, openly laments the virtual disappearance of Aesop’s proverbs. So, sixty years after the standard corpus of Aesop was compiled, he presents the first authoritative English translation of those proverbs.

Proverbs represent ideas in common circulation, a reinvention of received wisdom which, when compressed into one or two sentences, becomes revitalized. We sometimes think of proverbs as wisdom nuggets, tossed out for momentary consumption. But Phillips contends that proverbs act as compressed fables, and he backs that claim by comparing Aesop’s proverbs to his fables, and to common Greek, Latin, and Arabic wisdom sayings.

But proverbs go beyond encapsulating “the moral of the story.” They also reinvent wisdom, which may come from multiple sources, to make it relevant to new and changing situations. That’s why, Phillips contends, few of Aesop’s proverbs come directly from his more famous fables. Instead, they cast new light on wisdom the proverbist assumes his audience already knows. Aesop’s proverbs may refer to long-lost literary ancestors.

And that’s what makes these nuggets important to us. The loss of direct knowledge across the intervening centuries makes these little sayings valuable, not just in understanding how people thought in Classical times, but in understanding how we see the world, and ourselves in it, today. How have our ideas changed, and how have they stayed the same? How are we essentially similar to our Greek forebears, and how have we invented our own cultural identity?

Some of Aesop’s proverbs make perfect sense to us across the millennia. When he says “The snake sheds its skin, but not its true character,” we don’t have to unpack this. When he says “You may have a doctor as a friend, but you should not continuously need doctoring,” Aesop is speaking to a situation which remains present to us. We know what he means, because we see snakes molt, and we know doctors in our neighborhood.

Aesop, as painted by Diego Velasquez
Other proverbs rely on cultural context. Because we are not surrounded by the language of Greek society and religious ritual, we must spend time working to understand sayings like “A person desiring a cake of melted figs sets their own house on fire.” Such sayings become like Zen koans, which demand our careful contemplation, even as we realize we will never achieve a single “correct” answer. They say something about us, even if ancient truths remain opaque.

People who dismiss fables, proverbs, and parables as “mere” children’s stories or historical relics thus miss the point. These discrete wisdom packets advance current, ongoing debates about ourselves. They may do so indirectly, but as we unpack their densely woven, mythologically allusive content, we grow in understanding, not just of the truths the proverbs convey, but of our own souls.

Our refusal to take proverbs seriously works to our own detriment. All cultures have proverbs of some kind, but serious scholars routinely dismiss such compressed wisdom. I’ve seen biblical proverbs used in Sunday School classes, though the Wisdom Literature scarcely exists in most Protestant lectionaries. The implication, of course, is that only children benefit from compressed nuggets of wisdom; grown-ups need to spend time spinning longer narratives. Yeah, right.

Because these proverbs differ from Aesop’s fables, they bear consideration in their own right. The fables, like the proverbs, were not meant as mere metaphoric instruction for children; they were part of important legal arguments, and the ability to unpack, condense, expand, and create parables and fables was the center of a Greek education. Classicist George A. Kennedy goes into this in more detail in his Progymnasmata.

As these proverbs were created for adults, they deserve treatment that shocks away our simplistic thinking. Which is what Phillips offers: he reinvents this knowledge in a way that transcends the historical moment when the proverbs were written. But instead of telling us directly what they mean, he draws comparisons and allusions so that, like the original Greeks, we can understand them in a more oblique, metaphorical manner.

The proverbs, like the fables, were not meant to be comprehended head-on. Instead, they reveal inner truths, some of which continue to shock and surprise. That is, they do if we contemplate them in the manner Aesop intended, and Phillips enables.

Monday, December 31, 2012

Thinking About Thinking is Harder Than You Think

Steve Siebold, Sex Politics Religion: How Delusional Thinking is Destroying America
Linda Elder and Richard Paul, 30 Days to Better Thinking and Better Living Through Critical Thinking: A Guide for Improving Every Aspect of Your Life, Revised and Expanded


Motivational speaker Steve Siebold has made the media rounds since the Sandy Hook shooting, and before that really, advocating a robust raft of reforms. He has publicly scolded President Obama and Congress for letting partisanship trump meaningful thought. But who hasn’t? His latest book demands citizens engage in deeper, realistic thought, which he evaluates by its conclusions: if you’re thinking, you’ll agree with that paragon of critical thought, Steve Siebold.

Siebold unpacks public morals and public policy, which he lumps under the titular umbrellas of Sex, Politics, and Religion. He further subdivides this into fifty-four subtopics—in a book under 300 pages! How can he possibly examine so many topics, when few merit more than three pages of widely spaced, large-font type? You know the answer. This book stinks of straw man arguments, ad hominem attacks, and doctrinaire thought feeding foregone conclusions.

This produces a haphazard goulash of Ayn Randian libertarian diatribes. Repeatedly, he interjects the phrase “Critical thinking tells us...”, which inevitably precedes an unexamined conclusion. Like his TV persona, Siebold reflexively excludes other viewpoints, dismisses debate in one or two sentences, and considers only arguments which support the position he already held. But he considers himself the distillation of hard thought and realistic (read “cynical”) positions.

In his intro, Siebold promises: “This may rank among the most controversial books ever written.” But his unimaginative, doctrinaire opinions are old hat, even boring. This book epitomizes a man with great pride in his accomplishments, and impatience for divergent reasoning. Despite repeated calls for “critical thinking,” his frankly coercive tone rewards intellectual passivity and groupthink.

Briefly, Siebold exemplifies the three-part structure favored by schoolyard bullies and scolding fathers: “I’m right. You’re wrong. Shut up.” In a time of dangerous political controversy and remarkably banal violence, we need public figures to widen, not narrow, the debate. We must challenge ourselves to new solutions, not anchor ourselves to old ones. Thankfully, such a book exists.

Linda Elder and Richard Paul have dedicated their careers to advocating Critical Thought, which they see not as a series of conclusions, but as a process. We engage in critical thought when we test our assumptions and beliefs; practice intellectual virtues such as humility, honesty, and fairness; and practice discretion in how we receive news and opinions from media, bosses, and politicians. And they admit, this is much harder than it seems.

Fortunately, in their latest book, Elder and Paul have broken the process down into thirty nuggets, designed so you can digest each in one day. Drawing on the same techniques teachers have historically used to parse difficult books, like the Bible or classic Russian literature, their process guides readers through an intellectual labyrinth one step at a time. This encourages readers to unpack their own conclusions, not swallow ready-made opinions.

Human thought, like human muscle action, must be learned through gradual coaching. Just as we may believe some lifting technique makes perfect sense, only to discover too late that it causes severe back pain, a thought may seem reasonable in light of reflexive beliefs and old prejudices. We must learn carefully, over a span of time, which thoughts will result in desirable outcomes, and which we’ll pay dearly for down the line.

Far from expounding their own opinions, Elder and Paul cite many sources. Some expound how critical thinking works: Plato and Aristotle, Émile Durkheim, Eric Hoffer. Others exemplify critical thinking in action: Aquinas, Thomas Paine, HL Mencken, Margaret Mead. The authors’ sources give us key insights into the thought processes we should pursue, and just as important, they give us models to emulate so we know how critical thought may appear.

In their intro, which runs nearly fifty pages, Elder and Paul distill the points of their many prior books into a short rundown on how your mind works—and how it sometimes fails to work. This introduction contrasts very specific technical language with simple diagrams that make concepts comprehensible. One can imagine this as their PowerPoint presentation at corporate gatherings. It’s somewhat intimidating, though; feel free to skip it until you’ve read the rest of the book.

These two books are mirror images. One would exclude new ideas, stifle avenues of thought, and submit all insights to groupthink. The other encourages innovation, opens doors that only seem closed, and neither dominates nor submits. One seems useful in the near term, but will hasten painful consequences. The other requires more effort, but will proffer real solutions, or at least more meaningful debate.

Monday, June 4, 2012

America, Land of the Do-It-Yourself Self

Jack Hitt, Bunch of Amateurs: Inside America's Hidden World of Inventors, Tinkerers, and Job Creators

Andrew Carnegie never went to school. Thomas Edison had no professional credentials whatsoever. America’s greatest innovations have come from the hands of people who the “proper officials” said had no business getting involved. Journalist Jack Hitt asserts that amateurism, the pursuit of a field out of sheer love, without expectation of reward, sits at the heart of American identity.

The word amateur derives from the French for “lover,” and signifies a passion for a subject that no amount of money can approach. And that’s what makes America strong. We made a country without relying on kings or popes. We advanced science, sometimes in the face of proper scholarship. Our best businesses started as shoestring operations. We are a people who built ourselves without waiting for someone to rubber-stamp our enterprises.

Amateurism stands, for Hitt, against “credentialism,” the dogmatic belief that externally bestowed endorsement makes somebody an expert. Credentials often impede innovative thought—a point that isn’t even new this year, since it loomed so large in Jonah Lehrer’s book Imagine. The very process of becoming a professional insider instills habits of thought that ensure the thinker can do the job exactly how it’s always been done, and not one step beyond.

We can see this in Hitt’s book, where professional ornithologists wear blinders that keep them from seeing what even an amateur birder can see: that ain’t an extinct bird returned to earth. Or in the chapter on amateur astronomy, where the professionals must keep grinding out work guaranteed to produce results, while the amateurs have the liberty to study the sky, looking for the kinds of discoveries that rarely come, but actually move human knowledge forward.

We know this, just looking around. We can see that business school graduates make lousy entrepreneurs. Journalism school graduates seldom do meaningful investigations, preferring to repeat official statements with the agreeability of bobblehead dolls. Most physicists’ best work is done before they turn thirty. Outsiders, guerillas, and eager neophytes make the actual inroads that keep America’s greatest disciplines thriving.

Jack Hitt
Hitt notes an important study showing that compensation actually sucks the life from pursuits. From an early age, pay changes the equation that drives our actions. Small children will draw or write or play for the sheer joy of the process; but when rewards, or grown-up approval, enter the equation, creativity and productivity go through the floor. That’s why, when you get a job doing what used to be your hobby, the joy goes out of whatever you used to love.

Not every chapter supports Hitt’s thesis. Indeed, his chapter on amateur archaeology, with its implications of sublimated racism and pseudo-intellectualism, suggests that amateurism contains the roots of powerful abuse. Even in his largely laudatory chapter on amateur genetics, he never quite addresses the risk he himself raises: that some exuberant teen might warp the common cold into the next Black Plague. These chapters serve as cautions against unbridled amateurism.

The do-it-yourself ethos taps into the best America has to offer, but also the worst. It forms the lifeblood of cultural development, allowing those truly passionate about their field to make substantial contributions. But it also allows ignorant crackpots to go off half-cocked, propagating ideas that are dangerous or wildly offensive. Maybe that’s a fair description of America and her people: a nation of unrecognized geniuses and wild-eyed fanatics.

How, then, to counter the worst impulses of amateurism? Hitt offers no suggestions. Instead, he simply reminds us that, for all its risks, amateurism has contributed more to our national well-being than we can possibly calculate. It falls to us to keep up the passionate immersion of amateurism without lapsing into the dangerous extremes of moronic crankery.

Hitt overlaps with several other recent books. In addition to Lehrer, mentioned above, Hitt shares many themes with Susan Cain’s Quiet, Charles Pierce’s Idiot America, and Malcolm Gladwell’s Outliers—must be something in the water. Sometimes they correspond almost verbatim, since they’re quoting the same sources. While Hitt doesn’t necessarily bring new ideas to the table, he brings new and interesting context.

Hitt makes a strong case that Americans’ frontier ethos, where we do it ourselves, and where we make our own expertise, makes us the people we are. He sells his point with careful insight, unexpected dry wit, and spirited narrative panache. If he can get just a few people out of their TV-induced comas and out doing what they love, he will have done a good service to this great land.