A Close Look at Modern Mythology, Pop Culture, Hot Media, Book Reviews, and the Psychology That Makes Our Society Hop
Friday, September 30, 2011
Spinning a Line
About a year ago, I walked into my classroom and found my students chatting away. This particular class had its main door at the back, with the students facing away, so they often wouldn’t notice me enter; some near the front wouldn’t realize I was present until I had nearly reached the front of the room. Thus I could often get insights into their thinking just by entering slowly enough to avoid drawing attention.
That day, one student, Neil (not his real name), a former soldier with a habit of comparing me to a drill instructor, said: “I feel like I’m not really as smart as people think, that I’m just spinning a line of bullshit.” At that moment he saw me enter and fell silent, but several around him nodded and voiced vague agreement before they noticed my presence.
As eighteen pairs of eyes turned to regard me, I asked, “Why did you go silent?”
“Well, I... I don’t know.”
“Is there something shameful,” I asked, “about spinning a line of bull?”
Neil shrugged, in that oversized, two-palms manner that the young often have. “It just seems, I don’t know, dishonest. Like I’m pretending I know something when I really don’t.”
“Well then, let me turn it around. What do you do when you spin a line of bull?”
“I just kind of pull a little bit from over here, and a little bit from over here, and run them together in a way that sounds smart.”
“So you’re combining separate pieces of information to create something new?”
“I guess so.”
“And,” I said, feeling I’d made the triumphant stroke, “how is that not an act of intellectual accomplishment?”
Silence reigned.
“I put it to you,” I said to the class, “that spinning a line of bullshit is the most important skill you will ever pick up in your academic career. That’s all any of us ever do, stringing together disparate pieces of knowledge and synthesizing something new. Mixing two pieces of information to create something where the whole exceeds the sum of its parts—well, that’s the only way anybody ever made any progress on anything.”
Neil offered that oversized shrug again. “But so much of what I come up with gets shot down when we have the class discussion.”
“And?”
“And doesn’t that mean I’ve failed in the process?”
“You created an idea, put it to the test, and found it didn’t work. What’s wrong with that?”
“Don’t we come up with ideas in the hopes of them surviving?”
“Yes, we do. But remember the old story about Thomas Edison, who attempted over two thousand materials trying to find a useful filament for his light bulb. A lab assistant said that they had failed over two thousand times. Edison said they had not; they had definitively proven that two thousand options would not work.”
Nathan in the other corner put his hand up then. (How many times did I have to tell them that, in college, they don’t need permission to speak up in an open discussion?) “Are you saying that failure is good, then?”
“Yes, I am.”
“But it’s...” Nathan waved his hands around, fumbling for words. “It’s failure!”
“Yes, it is. But you have to abolish the idea of ‘failure’ as more than just ‘not success.’ Thomas Watson, the guy who made IBM into a business juggernaut, said: ‘If you want to succeed, double your failure rate.’ Failure means you tried something. It means you took a risk. Failure means you’re still alive, still thinking, and one step closer to achieving your goals.”
“So,” Neil asked, “you want us to fail more?”
“I want you to risk more. And that may mean failing more. Employers favor college-educated job candidates because you have the skill to, as you put it, spin a line of bullshit. You have the breadth of knowledge necessary to step beyond the safe and known, take risks, and jump to the next level.”
Dani, who had the habit of sitting quietly until she had something concrete to say, leaned forward at this point. “You’re spinning a line of bullshit right now, aren’t you?”
“Of course. Do you really think I came in here this afternoon expecting to have this discussion?”
“I suppose not.”
“I’ve been spinning since the moment I came in. Let me ask you, do you like what you’ve heard so far?”
Dani nodded.
“Then let’s get on to today’s planned discussion.”
And we never had a better discussion all semester.
Wednesday, September 28, 2011
The Christian in the Free World—a Survey
Towards a Politics of Imagination, Part Three
Last time, Quaker activist Parker J. Palmer divided life into the private, public, and political spheres, wherein the three most fundamental meeting places are the classroom, the congregation, and the workplace. This time, I’d like to discuss how Christian congregations approach the three spheres.
Many frustrations surely begin in the private realm. This is the space we reserve for ourselves and those we trust—family, friends, and God. But it also bears our greatest disappointments. Most honest adults will agree our lives are not where we would like them to be. Caryn Dahlstrand Rivadeneira, author of Grumble Hallelujah, believes we can do something about that.
If we believe God takes an active interest in human affairs, then surely God shares our grief that life lets us down. This gives us permission to weep for the life we expected, freeing us from our burdens and letting us move onto life’s next stage. When we do that, we shed the shame which impedes us.
For many, private life offers the firm foundation to face the public and political spheres confident in our beliefs and positions. When private life shackles us to past expectations, be it in family, work, or friendship, we aren’t free to serve God. Drawing on the Book of Lamentations, Rivadeneira insists we have a God-given right to face the present by mourning the past.
Unfortunately, for many Christians, church is the last place we feel free to speak our frustrations. Too often, congregations treat disappointment as apostasy. We need to shed that attitude, because Christianity’s public sphere, the congregation, should heal and nurture. I feel this lack keenly at the heart of Ronnie Floyd’s Our Last Great Hope.
Reverend Floyd has a long history of working to advance the Great Commission. Unfortunately, he and I disagree on the word “disciple.” He holds forth at length on the importance of witness and outreach. While evangelism is an important aspect of Christianity, Floyd uses it in a way that I fear diminishes others we meet in our public lives.
Floyd makes a persuasive case that Christian outreach follows a concentric pattern, from families to communities, through the nation, into the whole world. However, he presents this outreach, and indeed all public interactions, as an urgent opportunity to make converts. This saps “the other” of basic humanity.
If I see everyone I meet as a potential convert, I reduce all other persons to numbers on a checklist. I make all other persons unequal to me, because they must submit to my point of view. Christ calls us to make disciples, not converts. Discipleship is a relationship based on sharing, teaching, and nurturance.
The public sphere does not exist to give us a soapbox to transform the world. It exists to let us build relationships with equals. I may try to persuade equals to my opinion; but if I see my opinion as the only one, and strangers as either with me or ripe for change, I make them unequal, even subordinate. This defeats my own purpose.
This also explains the frustration many Christians have in the political arena. As journalist Alisa Harris reveals in Raised Right, many sincere Christians equate their faith with a certain outlook—often conservative—and, because they believe one is absolute, the other must follow suit. Sadly, reality has a way of upsetting that apple cart.
More memoir than manifesto, Harris’ narrative contrasts her childhood certainty with the struggle she encountered carrying Christian conservatism into adulthood. While she received her views from parents and church leaders, claims of Ronald Reagan’s goodness and liberalism’s perfidy held water easily. But real life doesn’t permit such certitude.
College and working life provided tensions that Harris couldn’t reconcile with rightist dogma. Some people, she learned, are so downtrodden that they need government assistance. Some capitalists don’t run their businesses along Christian lines. The tension forced her to reevaluate her presumptions.
Harris embodies a movement many Christians experience: from family to adult life to political engagement. Her struggle to create a mature politics around her Christian faith mirrors my own struggle years ago, and if Harris is like me, she has more upheaval to come. Conventional liberalism will prove as unsatisfying as inherited conservatism.
Good Christianity carries the same demands as Parker J. Palmer says good citizenship carries. If we believe in America’s Christian heritage, as many do, then bolstering our faith will make us better citizens. Here’s hoping people of faith have the courage to engage the spheres of life with complete fairness.
Part One
Part Two
Monday, September 26, 2011
Democracy, the Heart, and All Things Healing
Towards a Politics of Imagination, Part Two
Parker J. Palmer starts from the idea that democracy remains an ideal worth striving after. He doesn’t sugar-coat his opinion, or pretend America hasn’t frequently failed to uphold “the better angels of our nature,” in the words of the Abraham Lincoln quote that lives so close to his heart. He’s forthright in Healing the Heart of Democracy that we have miles to go. But we also have a goal worth pursuing, and a people more than ready for the pursuit.
In the wake of the Gabrielle Giffords shooting, our ongoing overseas fighting, and two polarizing presidential elections, many Americans feel disillusioned with politics altogether. Not Palmer. Like you and me, Palmer sees parties who ignore the electorate, office holders who talk past each other, and media figures who sell advertising by ginning up partisan rancor. But he sees this not as failure, but as unprecedented opportunity.
Palmer believes democracy depends vitally on certain “habits of the heart” among its citizens. Too often, we permit our elected, business, and media leaders to behave unchecked, secure in the illusion that, if they do anything wrong, we can vote them out, move our money, or change channels. But recent history proves that such passivity permits the unscrupulous to run society unchallenged. It’s time free people stepped up.
Many before Palmer have said the same. Such jeremiads have become almost comical in their repetition. But Palmer goes further. Instead of telling us to “get involved,” he describes what such involvement looks like. He advocates gathering around certain virtues, such as a “sense of curiosity, responsibility, and agency.” And he proposes a plan to bring such virtues back into view.
It starts by recognizing that we all have a stake in the public sphere. Palmer emphasizes that the “public” sphere is where citizens meet as equals and build meaningful relationships that flow freely; don’t confuse the public with the political sphere, which is hierarchical and power-based. We can never meet as equals in the political sphere. But work, commerce, religion, and other places of adult equality form democracy’s true beating heart.
Unfortunately, we permit “leaders” of questionable merit to devalue the public sphere, until the private sphere squeezes it out. We no longer go to marketplaces where goods, ideas, and relationships flow freely across economic and cultural barriers. Our air-conditioned, privately held malls take their place. Dissent gets squelched, both by private ownership of public space, and by permitting strident partisan propaganda to replace frank discussion.
Such an isolated world leaves us vulnerable. We worship via television, amuse ourselves on home entertainment centers, and hear our politicians through screens, rather than going to church, the cinema, or open-air rallies. We pay the price for that isolation when we seek to fill our loneliness with ever-increasing spending, vacuous demagoguery, and faux relationships made to order through the media. Then it all comes crashing down, as it did in 2008.
Public life happens most in three venues: school, worship, and work. Sadly, leaders who don’t have our best interests at heart suborn all three venues. School, as I’ve lamented, has become hierarchical and encourages passivity rather than engagement. Worship works best when it welcomes the outsider’s perspective, yet that perspective seldom gets past the gatekeepers. And as long as workers and managers maintain an adversarial attitude, work remains a diminished public space.
Parker J. Palmer’s vision of how to remedy these inequities will not come easily. Despite his eloquent explanations throughout his book, he acknowledges that we will face resistance. Those who benefit from the current state of affairs will push back actively, while those who could gain from democratic renewal will offer only inertia—at least at first. To succeed, Palmer says, we need to invest for years, even generations.
Palmer’s intentions could easily be mistaken, especially in today’s climate, which rewards partisan rancor and snap decisions. To his credit, Palmer concedes his own prejudices early. But he works hard to remain neutral on partisan issues. He admits that people of good character come down on either side of contentious issues. The result, though, matters to him less than the character.
Democracy has suffered recently because we think we have achieved our Founders’ goals once and for all. Palmer reminds us that democracy is not a state of being; it is a process. And if we want to pass a free and peaceful society on to the next generation, we need to take a guiding hand in maintaining that peace, and those freedoms, right now, and every day to come.
Part One
Wednesday, September 21, 2011
Five Lessons Factory Work Taught Me About School
Students often come to college thinking an education will save them from blue-collar work. But working at the factory while continuing to teach, though very difficult, has granted me important new insights into how I teach, and how they learn. Let me share just the most significant.
1. Pain is a sign that you are learning.
Setting components on the assembly line looks easy until you do it. When the machine runs at a standard forty-five units per minute, rest assured, on your first day, you will fail to keep up. Even after months on the job, minor setbacks can knock me off the production line. You run the machine, but the machine can easily wind up running you.
Yet watching a colleague with a year’s seniority, I noticed he didn’t even look at the components as he set them on the line. He just lifted them off the pallet, set them on the line, and boom, there it was. Trying to keep up with him proved physically painful. But I embraced the pain, and I’ve made significant strides in keeping up with the machine.
Students often avoid learning’s natural discomfort because they think, if it hurts, it’s harmful. But as Srinivasan Pillay says in Your Brain and Business, the process of growing new neurons, of your brain becoming more complex, feels painful at first. Diving into this pain is necessary to cultivate a more sophisticated mind.
2. Pay attention to the present; the past and future will pay attention to themselves.
At the standard running rate, I have only one and one-half seconds with each unit on the line, to evaluate it, decide whether it’s ready for the process, and fit it for the next station on the line. I can only do that if the unit before me occupies my whole attention. If I double-check prior units, or look ahead to the next unit, I can’t pay attention to the one in front of me, and I can’t do the best possible job. Work is downright Buddhist that way. We must remain mindful of the present, because the future and the past can crowd our consciousness detrimentally.
Students often become so entangled in what they think their finished product should look like that they never start. Others look backward, constantly correcting perceived error, and thus never finish because their attention is anchored backward. I do not mean that students should not plan, nor that they should avoid evaluation and revision, but when they let these steps crowd out the actual process, they bog down in ideals and never make inroads on the task before them.
3. Tension is desirable.
My job involves two work spaces. In one, the line runs rapidly and work keeps me moving. In the other, work runs stop-and-start, parts are often missing, and the pace lags. The first is harder, the second more restful, so you’d think I’d prefer the second. Yet without continuing tension, the second is often tedious, and eight hours passes with excruciating slowness.
Most students, and many teachers, think they prefer a leisurely, peaceable classroom. But I can tell the difference in output between my early students, when I ran a loose ship, and my recent students, whose feet I’ve held to the fire. Tension may seem unpleasant in the near term, but that extra little edge can often make the difference between real learning and sluggish tedium.
4. Rest often.
Some management theories contend that laborers should work with machine-like ferocity. Yet when machines run continuously for shift after shift, they break. When the metal gets strained, joints overheat, or the governing computer can’t manage the demands, the machines go ping and stop running. I’ve seen laborers do the psychological equivalent on the factory floor.
Many students push themselves to appalling lengths. In addition to full-time studies, my students often maintain jobs (some full time) and a robust social schedule. These workloads take a toll on their health, appearance, and job performance. Unfortunately, among those competing demands, they most likely consider school the lowest priority.
Lack of rest, including but not limited to lack of sleep, causes musculoskeletal disorders, psychological distress, and endocrine imbalances. Henry Thompson, in The Stress Effect, catalogs the damages wrought by lack of rest, including weight problems, diminished judgment, and a stunted prefrontal cortex. Whether you work or study, make rest a high priority.
5. Pee regularly.
Work, like learning, is difficult enough. Don’t compound that by facing the day with the frustration of a clenched sphincter.
Monday, September 19, 2011
Dissecting China for Fun and Profit: Troy Parfitt's Tourism Journalism
Public pundits have made copious noise recently about how the West must prepare for the day, probably within our lifetimes, when we will accept second-class status to mainland China. In the last decade, China supplanted Germany as Earth’s third largest economy, then quickly surged past Japan for number two. In less than a generation, this former colossus, pushed to the brink of ruin a century ago, stands poised to knock America off the economic peak.
Troy Parfitt doesn’t buy it. An international English teacher, this Canadian has watched China for years from his perch in Taiwan, a brief swim away. In Why China Will Never Rule the World: Travels in the Two Chinas, he investigates this debate from the ground level. By combining travelogue, journalism, and editorial commentary, he provides insight into a country many Westerners still consider opaque and mystical.
He also calls into question what values make social and political judgments possible.
Parfitt uses the P.J. O’Rourke technique of cultural inquiry. This approach involves getting a hotel room in the nation in question, wandering the streets, talking to locals, and offering personal observations. Like O’Rourke, Parfitt relies on wiseacre comments and stand-up comedy narrative to create a story. Despite his journalistic purposes, Parfitt is no dispassionate reporter; he inserts himself into a story still unfolding, narrating from the belly of the beast.
This technique requires an essential trade-off. Parfitt’s immediacy and human touch necessarily incorporate his bias into the story. He says in his introduction: “In spite of my prejudices, I honestly tried to approach the experience with as open a mind as possible.” To his credit, Parfitt does a better job than O’Rourke did in the Shanghai chapter of Eat the Rich, which reduces China to broad “Inscrutable Orient” stereotypes.
But when Parfitt says of a group of domestic tourists, “I have seen brighter-looking ostriches,” I wonder how reliable I can consider his narrative. Sure, it’s a funny line. But can I really draw meaningful conclusions about all of China from such observations? I lived in Hawaii many years ago, and I know that domestic tourists’ wide-eyed passivity is by no stretch a Chinese characteristic.
Parfitt exceeds O’Rourke in his depth of investigation. Where O’Rourke stays in posh hotels for a week or two, bankrolling guides and translators while mocking locals from a secure balcony with minibar, Parfitt actually moves among the people he investigates. He took the time to learn Mandarin before entering the country, and he circulated among the population for three months. This let him sample a broad cross-section of Chinese culture.
And unlike O’Rourke, who comically hates everything, Parfitt concedes that he likes a lot. He admires architecture, relishes local cuisine, and has a brief romance with a winsome Chinese lass. He presents China, not as a series of xenophobic ethnic clichés or abstruse economic statistics, but as a place occupied by real humans.
Unfortunately, he also sees them as real humans who primarily fail to uphold his Western standards. He wants swift service, smiles all around, and cab drivers who can negotiate Hong Kong streets in English. He wants standards of professionalism that didn’t even exist in the Western world a century ago. And he looks down on Chinese who don’t snap to. Though I can’t call Parfitt racist (he denigrates everyone equally), he certainly sees the world through his own particular lenses.
Parfitt concedes his own prejudices, and I have mine. One of mine is summed up in a quote from anthropologist Wade Davis: “Other cultures are not failed attempts at being you; they are unique manifestations of the human spirit.” Parfitt sees China’s peculiarities as shortcomings. I suspect China is different because it’s different. It manifests a unique interpretation of human potential.
If Parfitt sees dirty sushi bars and locals who resent outsiders as signs that China is a nation of missed opportunities, I’d offer him a tour of the shabby side of any North American city. If he considers China politically moribund because it honors Chairman Mao decades after his death, I defy him to explain American politicians’ appeals to the Founding Fathers. Anyone can find the lousy side of anywhere, if they look hard enough.
Parfitt’s book provides an interesting look at a certain aspect of Chinese culture. And it offers a valid dissenting view in what may be our generation’s most important debate. But it presents only one facet of a complex value judgment, and it must be read as such. Otherwise, it only serves to muddy an already murky debate.
Friday, September 16, 2011
Mike Smith Saves the World from the Weather
Living in Tornado Alley, I’ve often felt grateful for the loud sirens and TV’s color-coded Doppler radar displays. Meteorologist Mike Smith, who pioneered many of the technologies that have saved lives in the nation’s midland, looks back over a pathbreaking career, and decades of weather history, in his debut book, Warnings: The True Story of How Science Tamed the Weather.
I didn’t realize that, as recently as the 1950s, the Weather Bureau—now the National Weather Service—not only didn’t predict tornadoes and hurricanes; it flatly forbade such forecasts. The Bureau feared you and I were too irrational to handle such knowledge. Hundreds of people sat blindly unaware in the path of truly horrific weather because the government thought public panic was riskier than mass destruction.
Members of my generation grew up with the idea that weather prediction was a reliable applied science, that forecasts would continue to improve, and that we had a right to know when destructive weather menaced our homes. Smith combines history and memoir to describe the changes that made such an attitude possible.
Smith pays particular attention to tornadoes. A survivor of the 1957 Ruskin Heights tornado, which killed dozens and flattened a Kansas City suburb, he demonstrates a special affinity for the subject. That affinity played out early in his career when, as a young weatherman, he failed to utilize the newest technology and left much of Oklahoma City vulnerable to a significant tornado outbreak.
Throughout history, humans have stood vulnerable to weather phenomena. Only recently have we had technology to plan for the weather in any concrete way. Victims of the 1900 Galveston Hurricane or the Great Hurricane of 1780 had no warning before their homes washed out to sea. When Dorothy trembled before the gruesome twister, that was no literary device. People literally lived in fear of the weather.
Our technology also makes us vulnerable to weather in entirely new ways. Smith spends several chapters on Delta Flight 191, which got caught in a controversial phenomenon called a “microburst,” a highly localized storm that pushes cold, wet air into the ground with such force that a jumbo jet could get sucked along like driftwood. Only high tech could leave us at the mercy of such weather.
Smith believes that technology also frees us from such risks. Hundreds of people died in air disasters caused by microbursts, but none since 1994. We can recognize and anticipate such risks in a way we couldn’t in 1985, when Delta 191 flew through what the captain thought was just rain.
And that’s what Smith means when he says science has tamed the weather. We have the knowledge to spot disasters in advance, steer the most vulnerable out of the path of destruction, and prevent loss of life. Though weather remains beyond our control, we no longer have to live in fear of wind and water.
In one telling thread, Smith compares the 1955 Udall, Kansas, tornado with a nearly identical outbreak that hit Greensburg, Kansas, in 2007. The two tornadoes followed such similar paths that, superimposed, you could easily confuse one for the other. Each flattened its town. Yet Greensburg’s residents survived to rebuild; Udall, as Smith says, “died in its sleep.” This story isn’t just informative; it’s touching.
Strangely, though we regard the weather as the ultimate impersonal truth, Smith describes the controversies that surround weather prediction. Personalities like Robert Miller and Ted Fujita have polarized the meteorological community. Though these debates linger outside public view, they have forced their way into how we perceive the weather, and how we shield ourselves from it.
Fujita’s theories, for instance, have entered popular culture through the movie Twister, to which Smith returns time and again. We toss around terms like F-4 and F-5 casually, as we did this year following the Joplin tornado, without realizing the battles that went into the scale’s general acceptance. Fujita faced resistance throughout his career that seems appallingly unscientific to outsiders.
Smith skillfully makes this and other controversies seem not just important, but exciting. Meteorology, in his telling, has the same bare-knuckle energy we see in politics or sports. These battles, many of which Smith himself fought in, reveal how much of our modern, weather-safe lifestyle is contingent on personalities, and could have gone another way.
While weather forecasters often appear starchy and bland, Smith makes the weather into an urgent concern, and a remarkable victory. This story turns the weather into a quest, and meteorologists into the most unlikely heroes in recent literature.
Wednesday, September 14, 2011
Toto, I Think We're Not in the 1980s Anymore
Toto’s “Rosanna” has been following me around town lately.
Ever since I used Toto to epitomize flash-in-the-pan culture in a recent blog post, the 1982 arena standard has cropped up everywhere. I can hardly run to the grocery store, spin the dial on the way to work, or sit on hold without David Paich’s plaintive chords hitting me from somewhere. This one song has become a soundtrack for these weeks in my life.
This is hardly the first song to take on unusual proportions in my life. Before this, I couldn’t shake U2’s “Mysterious Ways.” At other times, “Ticket to Ride” and “A Whiter Shade of Pale” have redefined ubiquity.
But somehow, this feels different. This almost feels like a deliberate rebuke, a reminder that, despite my flip words, Toto has demonstrated remarkable staying power. After all, though Toto’s popularity in America dwindled after 1984, they remained a powerhouse touring act in Europe and Asia through 2008, and a recent reunion tour set sales records.
Compared to other music from the time, “Rosanna” endures remarkably well. No one would doubt, with its synth licks and rack of guitarists, that this is a period piece; this song could not get recorded today, at least not in this arrangement. But compare Toto to Madonna. She has enjoyed greater public staying power, but her earliest singles, sung in a strange piping soprano, are almost unlistenable.
American culture focuses markedly on the past. “Classic Rock,” a euphemism for what we once called oldies, is one of our most common radio formats. Cable TV is lined wall to wall with channels dedicated to rerunning nostalgic shows from my parents’ childhood. Though I Love Lucy is less common than a few years ago, I can hardly change channels without running into The Andy Griffith Show.
That seems strange for a country that proclaims its national youth and vigor. We tell ourselves that, unlike wheezy old aristocratic Europe, with its centuries of history and baggage, we’re still young, hip, and with-it. Yet somehow, we keep looking backward, like a retiree reliving the glory days.
Back in the 1990s, Chevrolet ran ads touting that only a country as young and vibrant as America could devise something as revolutionary as Chevy’s then-new model year. But Chevy ballyhooed their supposed adolescence to the backing of Jimi Hendrix’s “Fire.” They shouted their youth with the support of a musician who died before I was born.
The decade before Toto’s “Rosanna” was littered with faux nostalgia. Aerosmith’s 1973 “Dream On,” recorded when singer Steven Tyler was only 25, repeats its fatalistic lyrics about how “the past is gone” and “maybe tomorrow, the good Lord will take you away,” with drum-like constancy. Even Bruce Springsteen’s energetic “Born to Run” came from an artist who, at the time, consciously made himself up like a fifties greaser.
Despite its heartbreak themes, “Rosanna” seems almost opposite to that. Like the Beatles, Toto hit the stage with smiles and a set list. They may have had long hair, but they seemed clean-cut and forward-looking. Except, notice the video’s subtheme of the feud between the Jets and Sharks. West Side Story debuted twenty-five years before this video was recorded. Nostalgia lives!
Even setting aside the “classic” radio and TV channels that protect people from encountering anything new, current culture appears past-addled. Tim McGraw’s recent annoying C&W hit “Back When” features the repeated hook “I miss back when.” Unfortunately, the past he describes substantially occurred before his birth. He pines for his parents’ youth.
I cannot exempt myself from this criticism. I own the complete Beatles catalog on CD, can sing “Piece of My Heart” from memory, and watch DVDs of Doctor Who episodes first broadcast while I was learning to walk. The past has a way of keeping a firm hold on anyone.
Nor should we abandon the past. A culture without history or tradition must reinvent the wheel every generation. We shouldn’t desire that. Isaac Newton famously said: “If I have seen a little further it is by standing on the shoulders of Giants.”
I only fear it can become tempting to live in the past. A song stalking me can beckon, like a siren, to rest on yesterday’s successes. Consider all the rah-rah patriotism reminding Americans of the Moon race, World War II, or the Oregon Trail. There’s a fine line between revering the past, and nesting in it.
I miss you, Rosanna. But I have to balance my past against making sure I have a future.
Monday, September 12, 2011
"AHA" Said the Writer, and Kept On Writing
Writers are introspective by nature. Since our one constant resource is ourselves, we spend countless hours contemplating that source. Even nonfiction writers, dealing in areas outside themselves, see everything through the lens of their senses and experiences. So when the Aha Moment people contacted me, I immediately wondered when I had my greatest inspiration.
My blog first attracted their attention. I had no idea the production company toured the country, soliciting opinions from everybody it could find; I’d never thought about where those ads came from. Ads are ads, and they enter my house hoping to part me from my money. Why should they care what a writing teacher from the provinces thinks?
Although I get consistently positive marks from my students, and many express the desire to keep working with me, I’ve never gotten past the “pretender syndrome,” a common shortcoming among teachers. We often feel we don’t belong in front of the class. We don’t know enough—about our subject, classroom management, psychology, or whatever—to really count as working teachers.
And that’s when I remembered James (not his real name).
Four semesters ago, I thought James wasn’t paying attention in class. I tried to persuade him to take writing seriously, to respect himself enough to invest in his education. I did everything short of begging to engage him with the class. Though he showed up consistently, he sat quietly, staring at me, and repeatedly got papers in late.
At the end of the semester, after I’d distributed the student evaluations and retired to my office, James sauntered into the room. Most students are glad to be well quit of the class, and often never say so much as “hello” to me once they’re done, not even the ones who did well in class. I’m an imposition on their lives, one which ends after only a few months. So I was surprised to see James there.
He walked over, hands in his pockets, studying his shoes. I wondered if he felt bad about something. After a moment, James finally opened his mouth. “Mr. Nenstiel, I just wanted you to know, I went to a small rural high school. The teachers all figured we’d go to community college, get jobs, and never pay any attention to education again. So they never made us write much of anything.
“So I want you to know,” he continued, “that you’ve made me write more in one semester than all my teachers put together made me write all through high school. I think I only wrote two full-length papers in four years.”
James went quiet again. I assumed, as who wouldn’t, that he was about to tell me what a son of a bitch I was for working him so hard. I’d made him write three times as much, in one class in one semester, as he’d written throughout high school. No one likes being forced to do anything.
Finally, James lifted his head to face me and finish his thought.
“I wanted to thank you for that, Mr. Nenstiel. Because I always thought I was kind of stupid. I’d always been treated like I was stupid, and my parents and my teachers never encouraged me to set my sights high. But when you required me to dig down inside so I could keep coming up with something to say, I realized how much I actually had going on. I think, maybe, I’m kind of a smart guy.”
I blinked. I never expected to hear anything like that. I shook his outstretched hand, and he turned and left my office. I haven’t seen him since.
That was my Aha Moment. I couldn’t compress that into a sound bite, but in that moment, I realized what an effect I have long after I leave my students’ lives. For good or for ill, I’m now a part of them, and they’re now a part of me.
I still feel adrift when I try to anticipate and meet my students’ needs. Jargon like “lesson plan” and “pedagogy” only conceals that teaching is a stunt performed without a net. But while I still struggle, I no longer despair. Despite my doubts, I realized, this was the legacy I was leaving my students. This was the mark I was leaving on the world. I was a teacher, and this was my lesson.
So now I keep writing, and keep talking, because that was my Aha. And you have become part of what I leave behind.
Friday, September 9, 2011
Jessa Pruit's Jams and Preserves
As I write this, we’re in the final stretch to complete the full production of Jessa Pruit’s Jams and Preserves, a debut comedy organized as a fundraiser for First Lutheran Church, in Kearney, Nebraska. By the time you read this, the whole process may well be over. Nearly two months’ effort went into only three showings on a makeshift stage with a shoestring budget. And no one has more on the line than me.
Though this is neither the first play I’ve written to receive a full staging, nor the first play I’ve directed, the production nevertheless brims with significant firsts. This is my first full-length staging as a writer, my first full-length comedy, my first play longer than a one-act as a director, and the first play I’ve directed in nine years. The experience has hit me with uncounted lessons in quick succession.
Most importantly, this is the first time I’ve directed my own work. I once swore I would never do that, because I read a newspaper report on a cast that grew frustrated with Edward Albee’s inability to direct his own classic, Who’s Afraid of Virginia Woolf? Because Albee focused on the language to the exclusion of action, the actors had to push against their own director to create a dynamic stage picture. I didn’t want to be that guy.
But, when I was offered the opportunity to direct a play as a fundraiser for the church, I have to admit, my first loyalties showed their faces. I write; that’s what I do, that’s who I am. So I made my directing conditional upon permission to write my own play. And that’s what I did, though, without any audition process, I had to accept every actor who wanted to participate.
The learning curve in this process has been constant. For instance, I’ve had to work under deadline pressure like never before. I had barely a week between learning who my cast would be and needing to present them with a script. I’ve had to write with rehearsals already in progress, giving my actors their scripts in dribs and drabs. They’ve been remarkably accommodating, too, especially since my cast received their final rewrites only a week before they were scheduled to act off book.
Because of the volunteer nature of the process, I’ve scrambled to compromise in the most productive ways. As a director, I see my role not as a dictator, but as a surrogate audience, and my job as helping my actors achieve the best show they can put on. Because most of my cast has very limited acting experience, and some are acting for the first time, that process has meant finding new ways to communicate that don’t apply with seasoned actors.
But I’ve enjoyed the trade-off that, as playwright, I could help my actors put on their best show by showcasing their greatest strengths. Because this is a church activity, and because I already know most of the performers on stage, I’ve been able to write roles that reflect their speech rhythms and natural mannerisms. Not that I have my actors playing themselves; just that they don’t have to wrap their tongues around somebody else’s style of speaking.
The writer-director finds himself in a strange position. Those roles are usually separated, and even when the writer is present to fine-tune a world debut production, the writer is usually expected to sit in the back of the auditorium, taking notes, and only communicating with the actors through the director. Even in this public art, the writer fills a private role, while the director’s is external, communicative, and public.
Mingling the two roles means I need to wear two hats at once, which is never easy for anyone. I must trust to the artistic vision of the words, and stand fast where I believe changes would weaken the show, even as I must cater to my actors by giving them what they need. The writer and director can usually avoid cast pressures by blaming limits on one another. Not so when you fill both roles.
Yet I’d recommend the experience to any writer, director, or theatre professional. I have the joy of seeing the skeleton of my writing, which is always just a guidepost for someone else’s imagination, augmented by my actors. So many times I’ve watched my cast, realized they’ve done something I never even dreamed of, and felt simple joy. They’ve taken my work so much further than I ever could.
Wednesday, September 7, 2011
Can Hank Hanegraaff Close the God Debate?
We in the logic-chopping business have a name for discussions which seem unending and unable to reveal a final conclusion. We call these “essentially contested” debates. That means that the positions in the debate gain their definition from the controversy, not from their ability to erase opposing positions. These discussions cannot, and indeed must not, end, because if they end, the positions lose their compass.
Of all the essentially contested debates, none looms larger than God’s nature and existence. Radio personality Hank Hanegraaff, in Has God Spoken?: Proof of the Bible’s Divine Inspiration, weighs in on the question, providing reasons why, at the very least, scriptural evidence is internally consistent and reliable, if you accept that God has any validity. What he doesn’t provide, notwithstanding his subtitle, is any actual proof.
Clearly an experienced arguer, Hanegraaff structures his systematic apologetics not just by the facts, but according to an approach that will make his concepts memorable. He relies on clever mnemonics to make abstruse concepts, like typological prophecy and Assyriology, seem straightforward. Religiously literate as I am, I still encountered concepts I’d never seen before, and he makes them stick in my head.
But as he writes, Hanegraaff reveals his own limitations. Time and again, he feels the need to answer his opponents, especially Bart Ehrman, with whom Hanegraaff has an apparent ax to grind. I admit, Ehrman’s pedantic sophistry bores me, too. But Hanegraaff returns to him so often that he reveals how dependent he is on those he would refute.
At various times, Hanegraaff characterizes Ehrman as “benighted,” “a fool in his folly,” and “the shock doc.” He also turns his disdain onto Christopher Hitchens, John Dominic Crossan, the Jesus Seminar, and Barack Obama, among others. He spends so much time and effort on dismantling others’ claims and disparaging his opponents’ presuppositions that, without anyone to oppose, he would clearly run out of anything to say.
Thus he can never really provide “proof” of divine inspiration. Even laying aside God’s abstract nature, which resists scientific scrutiny and therefore cannot be either proved or disproved, Hanegraaff’s core claims rely on refuting others’ claims. This does not mean that Hanegraaff is a weak arguer or that his claims are hollow, but it means that the heart of his book lies outside its covers.
The concept of proof innately assumes that we can close a debate. By providing sufficient evidence to construct an airtight case—say, that the accused really did murder the victim—we preclude all other options, and have nothing more to say. While a few ornery revisionists may try to exonerate John Wilkes Booth, most people consider the evidence, concur that it yields only one conclusion, and call the case proved.
The Bible cannot yield such proof. It can offer copious evidence, and many great minds have spent twenty centuries examining, collating, and bolstering such evidence. Yet it relies on the central premise that a transcendent force called “God” caused the entire universe, and continues to take an active interest in human affairs. All the evidence Scripture offers stands or falls on that central premise.
And that’s where Hanegraaff stumbles. He constructs an admirable network of evidence, one that I find highly persuasive, and even a pleasure to read. But when he approaches that core premise, he turns circular. We can regard Scripture as authoritative, he says, because it was inspired by God. We can assume God’s existence because it is attested in Scripture.
To his credit, Hanegraaff is not dogmatic. He has no more interest in millenarian alarmists than in secular hair-splitters. He scorches Harold Camping in a manner both witty and just. He demands that you don’t take his word for anything, but examine Scripture and earthly evidence, because understanding comes holistically. Despite his acceptance of the core God premise, he expects readers to take an active hand in their own inquiries.
But he tries to close the case, and that’s where he fumbles. He refutes others, and now others will refute him. As American philosopher Eric Hoffer observes in The True Believer, mass movements rely on opponents to give themselves shape. Hanegraaff could not have written this book without an Ehrman to oppose. Why should he think he gets the last word?
As a Christian, I find Hanegraaff’s web of evidence persuasive. But my friend Roger, agnostic from an early age, would not, because he doesn’t share that core premise. And that’s why, good as this book is, it falls short of its promise of proof.
Monday, September 5, 2011
Tracy Seeley's Private Prairie Fire
We who live in America’s “flyover country” struggle with outsiders’ broad stereotypes. We’re regularly disparaged as uncultured, uneducated, and backward. I’ve had Californians ask me point blank if Nebraska has electricity and indoor plumbing. Seriously. And few people mock the Heartland more than people who have left the area themselves. Which is why I struggle to decide what to make of Tracy Seeley’s My Ruby Slippers: The Road Back to Kansas.
Diagnosed with cancer and fresh out of a relationship just after 9/11, Seeley decided to investigate the baker’s dozen addresses listed in her baby book. Why did her family relocate four times in less than two years, moving relentlessly before Seeley was old enough to understand? Though she begins in Colorado, her search focuses mostly on the state she tried hardest to expunge from her history: Kansas.
When she left for graduate school in Texas and an academic career on both coasts, Seeley worked hard to suppress her Midland vowels and prairie mannerisms. Texas may have its own hick mythology, but it has a panache that left Kansas holding the Dust Bowl. So Seeley is as surprised as anyone when the state she denied for so long proves more of a home than the San Francisco where she’s immersed herself for decades.
Or so she says. Her subtext doesn’t correspond with her thesis. She talks, for instance, about rediscovering houses and neighborhoods which, lingering at the fringes of her consciousness, continue to mold her identity. Each new discovery places a new piece in the jigsaw puzzle of her conflicted childhood, colored by her father’s vague ambitions, her mother’s long-suffering perseverance, and a history of permanent impermanence.
Yet even while embracing her heritage, she describes the Kansas and Colorado landscape as nothing but “wheat, wheat, wheat, wheat, corn,” and calls the ambitious German Lutherans who built cathedrals on the gamble of future population infusions “mad.” She says she finally sees how the Great Plains has produced true culture, even if unrecognized, then characterizes her childhood neighbors by their “beige carpets and plastic hall runners.”
I can’t blame Seeley for this ambivalence. I’ve scarcely met a prairie dweller who doesn’t praise small town life in one breath and cuss the lack of amenities in the next. We openly embrace city slickers’ characterizations of us as hillbillies with the same aggression workers use in slapping back at bosses, yet when people ask where we’re from, we act embarrassed and evade.
Every fall, my regional university receives an infusion of ranchers’ children from rural western Nebraska. Many of them, especially men, insist religiously on wearing their boots, Stetsons, and Wrangler shirts just like home... until they realize that, even in Kearney, Nebraska, we pretend to greater worldliness. By the end of October, farm and ranch wear largely disappears from campus as students race to not be seen as “countrified.”
But while I can’t fault Seeley’s ambivalence, I question why she put it on paper. She yearns to be equally at home in Kansas and California, yet insists on bringing arugula salads and organic tahini to the plains, resisting the beefy diet favored by laborers. Her attitude remains colonial, wanting others to conform to her expectations and making little effort to speak the language of the people she claims as neighbors.
In describing “Children of the Corn,” Stephen King recounts getting lost near Thedford, Nebraska, amid miles of interchangeable, oppressive cornfields. Surely, he says, nothing good comes from such expanses. Notably, he never mentions meeting or speaking with even one Nebraskan. Other horror writers, from Lovecraft and Lord Dunsany to Peter Straub and Clive Barker, bewail the menace urbanites find inherent in the countryside.
On a less horrific note, the British film Cold Comfort Farm features Kate Beckinsale as Flora, a posh Londoner who remakes her country kin in her image. Though intended satirically, Cold Comfort Farm reflects very real attitudes, on both sides of the Atlantic: if them hicks could be as urbane as us, they’d know to leave the farm. No more provincial attitude can I imagine.
Reading this, I may appear to dislike Seeley’s narrative. Not so: it has its redeeming qualities, in its author’s journey of self-discovery, and the struggles she undertakes to rebuild herself after cancer. Yet in achieving these goals, she relies on the same creaky coastal self-infatuation that has kept the Heartland embarrassed by itself for generations.
We who live in the Great Plains deserve to be taken seriously. And we deserve it as much from ourselves as from outsiders.
In the interest of full disclosure, this book is published by a University of Nebraska Press imprint. I teach at a University of Nebraska campus. I received a copy of this book to review, at the publisher's expense, which has not influenced my opinion in any way, and has nothing to do with my professional academic standing.
Friday, September 2, 2011
All Work and All Play
When I recently met a factory executive where I work, I cannot pretend I was shocked that he didn’t know my name. Thousands of workers labor around the clock in a facility as large as a minor-league sports arena, and I haven’t been there long enough to distinguish myself yet. But I was shocked when I mentioned my job title, and he not only knew who I was, but could quote from my employee file.
In a modern condensed industrial environment, where people work elbow to elbow and often get identified in company files according to what they do, work becomes an intensely impersonal endeavor. Humans love work, and define ourselves by our labor, as I’ve said before. When we meet new people, we ask: “So what do you do for a living?” Yet industrial and post-industrial society hampers work’s simple joy.
Barbara Garson’s All the Livelong Day observes the gap between workers’ and managers’ expectations. A Socialist, Garson began her research considering work an unfair imposition by a lopsided economy. But she discovered how much people treasure consequential work. As she writes, “The crime of modern industry is not forcing us to work, but denying us real work. For no matter what tricks people play on themselves to make the day's work meaningful, management seems determined to remind them, ‘You are just tools for our use.’”
If my assembly line runs a standard 45 units per minute, then each unit sits before me for barely a second and a third. During that time I must evaluate the unit’s quality, decide if it meets standards, and fit it for the next station in line. This often means only attaching one part; a filter that costs $12.95 may have fourteen people involved just in assembly, to say nothing of components before I see them or after they leave me.
In such an environment, we must re-evaluate “pride of craftsmanship.” I cannot claim to make the 18,000 filters rolling off my line nightly, any more than any individual can. Thus, I find pride in, for instance, being able to outpace the machine, or maintain a steady rhythm, or perform two tasks at once. My peers and I often do one job with the left hand and another with the right.
But not every trick we perfect improves our work. Especially in the most tedious areas, workers often play games intended to establish themselves apart from their jobs. Like the manager who knew me only by my title, many supervisors identify workers by their assigned machines: Forklift Driver. Paint Line Maintenance. Can Press Operator.
In attempting to separate their identities from their jobs, such workers frequently earn supervisors’ ire. Because the company exists to turn low-valued raw materials into high-valued commodities, workers’ games and playfulness seem extraneous; in managers’ eyes, if workers spent more time working and less time noodling, the company would turn more profit.
Some managers regard anything less than working flat out for eight straight hours as theft of company time, an attitude attested in Barbara Ehrenreich’s Nickel and Dimed. One manager (I won’t name names) reprimanded me for pausing to crack my back, a necessary time investment since the assembly line is built for someone six inches shorter than me. Back pain is part of my work routine; thus, so is back pain relief.
Whenever somebody mentions unions in America today, nay-sayers claim that workers don’t need representation; they’re paid according to their work, and if they want better pay and benefits, they should work harder. But, as Joshua Holland notes in The Fifteen Biggest Lies about the Economy, there is not now, and arguably never has been, any relationship between the hardest work and the highest pay.
My factory sometimes runs so fast that seasoned workers can’t match the pace. Work orders demand such high output that actually meeting them would cause repetitive stress injuries. I work two jobs and can’t afford to change the oil in my truck, but I’ve been called on the carpet for mistakes I could avoid if the machines ran at a marginally slower pace.
As managers disparage workers, workers retrench into their identities. Mutual resentment builds, and the gulf between management and labor gets reinforced. It’s a vicious cycle, created by the fact that individuals don’t own the product of their labors, and nourished because we don’t really know each other.
I love work. I enjoy my job. But I am not my job. And that makes me a threat to a system that thrives on anonymity.