
Friday, May 13, 2022

Children Building Cities out of Sand


We quickly learned how to find a place on the playground’s perimeter, the six of us, where we could build our cities made of sand. We were all in fifth and sixth grade, at a standard prefab elementary school in Southern California, one of countless identical schools in countless identical suburbs erected hastily in the building boom of the 1970s and early 1980s. But we built the cities we wanted to live in.

The playground sand was dry and granular, basically low-density road gravel, not suitable for building. That didn’t stop us. We swept it down to create a level surface, and marked out roads. Then we scooped up double-handfuls of sand and began building our elaborate urban arcologies: massive arenas for stout-hearted competitions, for instance, or grand halls for universities or laboratories or kings. Our cities were vast places of epic architecture.

We weren’t building cities, of course. We were building stories. We built the kinds of metropolises we wanted to occupy, cities designed to inspire awe and motivation. All our roads were majestic boulevards; all our buildings were vast palaces of art, science, and leadership. Utopian cities of great aspiration, where somehow, the fiddling business of cities—sewage removal and building maintenance, for instance—happened automatically, outside the story.

Nobody outside our group wanted us doing this, of course. Though we were within the playground fence, adults frequently scolded us for getting literally as far from the classroom buildings as rules permitted. They cited fatuous claims that maybe we wouldn’t hear the bell summoning us back to class, or kidnappers might leap the fence and abscond with us before adults could intervene. Don’t you want, they asked, to play over here with the other kids?

I don’t blame those teachers. Given what I now know about liability insurance and bad PR in the years following the Adam Walsh case, they were bombarded with demands to keep students safe no matter what. They meant well. They just couldn’t comprehend that the places they wanted us to play, and the games they encouraged, were noisy, crowded, and hectic. We didn’t want to run around; we wanted to build and tell stories.

I have considerably less sympathy for the other kids. Frequently, if adults didn’t compel us to abandon our stories and play “accepted” games among the crowds, other students would find ways to thwart our inventions. Sometimes they’d outright lie, claiming adults had summoned us back, with threats of punishment. Other times they just ran through our cityscapes, dragging their feet and making the maximum mess possible.

Didn’t matter much. Either way, we saw bigger kids maliciously destroy our cities, our stories.

We learned, as a result, to dream and tell stories surreptitiously. We continued building our science-fictional cities, but learned to keep one eye out for interference. Whether it came from well-meaning authority figures, or mean-spirited peers who got pleasure from destroying what we’d built, interference was always present at the margins. If our aspirations became too independent, well, then we aspired illicitly.

As a grade schooler, I enjoyed academic subjects, but I didn’t necessarily enjoy them at officially approved times. Schools have designated times to read and write, perform math or science, and endure the omnipresent phys-ed. Having spent time as a teacher myself, I understand the importance of having everybody working on the same wavelength simultaneously. In chronically short-staffed, cash-poor schools, independent learning wasn’t much of an option.

But I wanted to write when I wrote, not when my well-meaning teacher said writing was appropriate. I invented elaborate stories for myself involving math, far more interesting than the “skillz drillz” practiced in the textbooks. The approach advocated in textbooks worked, insofar as kids learned basic skills sufficiently to ace standardized tests. But only when allowed to play with words and numbers like toys was I able to care enough to actually master the concepts.

Our sandcastle cities were the manifestation of this principle. We learned collaboration by building the cities together. We learned math by determining how high we could stack unstable playground sand to make our cathedrals of innovation. As our stories became increasingly elaborate, we learned important language arts skills, in persuading teammates why having this boulevard here, not somewhere else, served the city.

Sometimes I wonder what became of my fellow builders. The military reassigned my father every two years, so most of my childhood friends retreated to the anonymity from which they arose. How many of them are architects, storytellers, schoolteachers now?

And how many of them daily apply the skills we learned at the farthest verge, building cities out of sand?

Monday, February 7, 2022

How To Change Your Mind (Maybe)

Adam Grant, Think Again: The Power of Knowing What You Don't Know

America is arguably plagued with a crisis of overconfidence. Politicians, business professionals, and pundits can’t be shaken from their opinions. Everyone from bankers to athletic coaches keeps supporting their investments, even after they’ve proven themselves unreliable. Why are so many people unwilling to change their minds? And is there any way to reverse this apparently culture-wide aversion to basic rethinking?

I had two different opinions about Wharton organizational psychologist Adam Grant’s latest book. While reading, I felt very genial toward his message, that constantly reëvaluating our own beliefs strengthens our position, and makes it more likely that we’re ultimately correct. He uses the latest research from psychology and behavioral economics to justify his position, but he restates that research in plain English, so we ordinary readers grasp the import of his message.

However, I took few notes while reading, and when I sat to write this review, I realized I couldn’t remember very much of what he’d said. Grant made what felt, as I was reading, like several important and substantive points. However, he made them in ways that had no mental adhesion, that slid off my recollection like bugs off a windshield. If I can’t recall them, just two hours later, have his points really changed my mind?

Not that Grant says nothing memorable. Throughout his book, he emphasizes the importance of “thinking like a scientist.” This means using knowledge, not as absolute truth, but as the foundation for hypothesis and experiment. Good scientists test all knowledge against emerging evidence, and when the evidence requires it, they change their minds. To scientists, being proven wrong isn’t evidence of weakness; it’s a sign that they’re still growing.

Grant contrasts thinking like a scientist to the other roles our thinking often falls into: preacher, prosecutor, and politician. That is, we preach the truth of our own understanding, prosecute supposed flaws in others’ understanding, and politically hedge between these extremes to make our understanding useful. All these, Grant admits, are useful roles, in their place. But too often, we let them dominate us, and we pay for it.


This broad outline makes sense. The extreme intellectual and political intolerance which dominates modern discourse comes from people seeing their ideas, not as tentative expressions of the best evidence, but as extensions of their own identity. Too many people respond to friendly intellectual challenges like they’d respond to a bear attack. This individualistic approach is personally harmful, and it prevents us from getting any closer to resolving our differences.

In justifying this outline, however, Grant caroms wildly. In an early chapter, he draws an analogy between rigid, hidebound thinking and the way the once-dominant BlackBerry collapsed because it couldn’t adapt to changing tides. It’s a valid analogy. Except he lays out the premise, then abandons it for so long that, by the time he returned to it, I’d forgotten it and had to skip backward to remember what he was talking about.

He repeats this hit-and-run technique with multiple examples: Apple Computer, Pixar Animation, the anti-vax movement, his own cousin’s medical career. He dips in, makes his point, and zooms away, in a manner that arguably would work well in a TED Talk or other oral format. In a book, where audiences expect thoughtful authors to unpack weighty topics with appropriate gravity, the product just looks chaotic.

Maybe it’s the format. Like multiple scholars writing nonfiction for a general-interest market (Malcolm Gladwell and Naomi Klein come to mind), Professor Grant feels obligated to make his narrative fast-moving, concise, and peppy. However, he’s addressing a topic where his audience has precast opinions, and where he often has to overcome deep intransigence. As Grant himself notes, “There's a fine line between heroic persistence and foolish stubbornness.”

I found plenty in Grant’s writing that stuck with me. His chapter on Motivational Interviewing, a technique of persuasion through asking questions rather than making statements, gave me plenty to consider. Particularly since I formerly struggled to overcome my students’ ingrained beliefs by making statements, which they found easy to ignore, I really enjoyed Grant’s insights into this topic, and will definitely read further and utilize this technique.

On balance, I’m glad I read this book. Professor Grant gives readers important insights, encourages us to reframe our own thinking processes, and shows us how successful rethinkers have made their approach systemic. Perhaps I’ll reread the book, taking more notes the second time. But this book didn’t really change me, ironically. In today’s busy, time-crunched environment, most readers won’t give books that second chance.

Monday, April 6, 2015

Taming the Screen-Eyed Monster

Golden Krishna, The Best Interface Is No Interface

Design professional Golden Krishna has become frustrated with graphical user interfaces. The novelty has worn off putting every important function into a smartphone app, and the ubiquity of touchscreens has made ordinary people subservient to their technology. Think about it: does your refrigerator really need WiFi compatibility and a streaming Pandora feed?  Even better, is driving enhanced when drivers have in-dash Facebook demanding their attention?

Krishna comes from a background in User Experience (UX), a design paradigm emphasizing how we can maximize users’ positive response to new technology. This often parallels another discipline, User Interface (UI), which specifically focuses on graphical user interfaces—or as they’re called in the industry, “interfaces.” These two disciplines have become so entwined that many job-seeker websites now advertise UX/UI as a single field, confining end-user experiences to a screen.

No, says Krishna, this is wrong. This attitude encourages sameness, resulting in finished products not sufficiently differentiated, and poorly attuned to user needs. Design meetings begin with enthusiastic goals to re-envision some task we all undertake; they finish by creating another smartphone app, impractical website (with fifty-page usage agreement), or another screen stuck somewhere it doesn’t belong. Graphic interfaces on curbside trash cans? Really?

Rather than repeating past success, Krishna advocates three core principles:
  1. “Embrace Typical Processes Instead of Screens”
  2. “Leverage Computers Instead of Serving Them”
  3. “Adapt to Individuals”
Krishna refines these three principles into new outlooks on the design process. He asks his colleagues questions that have gone largely unasked: is this process better, more efficient, more useful than what came before? (Is a smartphone app to unlock your car more practical than your key? No.) Can simple, screen-free technology make difficult tasks simpler? Can household technology learn user preferences—without cumbersome, insulting screen apps?

In some respects, Krishna’s vision overlaps with those of prior visionaries and critics; Jaron Lanier springs to mind. Both inveigh against technological passivity. Computers and other doodads are fine, Krishna asserts, if they serve human needs and make human life simpler. But addictively colorful phone apps, unhelpful multistep processes for simple tasks, and ad space colonizing screens like Spanish moss have made life palpably less simple and enjoyable.

Technology is capable of learning human needs. While it’s impossible for designers to create separate experiences for the millions, potentially billions, of individual users, technology is capable of adapting itself. Krishna cites several examples, from a simple fuzzy-logic home thermostat, to Deep Blue, the chess-playing computer that beat Garry Kasparov, of devices and systems that see human uniqueness as a virtue, not a bug.
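For a sense of how little machinery such adaptation actually requires, here is a toy sketch in Python. The class, the smoothing factor, and the whole design are my own illustration, not a description of any product Krishna profiles:

    class LearningThermostat:
        """Toy thermostat that nudges its target toward the user's manual corrections."""

        def __init__(self, setpoint=20.0, learning_rate=0.2):
            self.setpoint = setpoint            # current target temperature, Celsius
            self.learning_rate = learning_rate  # weight given to each correction

        def user_adjusts(self, chosen_temp):
            """Each manual override pulls the learned setpoint toward the user's choice."""
            self.setpoint += self.learning_rate * (chosen_temp - self.setpoint)

        def should_heat(self, room_temp):
            """The only 'interface' the user ever sees: the room is warm, or it isn't."""
            return room_temp < self.setpoint

    thermostat = LearningThermostat()
    for override in (22.0, 22.5, 22.0):   # a few evenings of manual corrections
        thermostat.user_adjusts(override)
    print(round(thermostat.setpoint, 1))  # 21.1, drifting toward the user's habit

The user never opens an app or reads a screen; the device simply remembers being corrected, which is roughly what “Adapt to Individuals” asks for.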

To emphasize his message, Krishna has made this book a paragon of design. Though running north of 200 pages plus back matter, Krishna’s text is actually much shorter, with visual diagrams, photos, dialogs, and non-traditional use of white space. He writes with the compressed energy of a TED talk, and uses his book to demonstrate his principles. He doesn’t wallow in nitty-gritty tutorials. Instead, he invites readers to share an evolving vision.

A prior reviewer wrote: “Make no mistake: This is a sermon. It's not a practical guide. It's not a set of concrete steps to improve.” If I may speak for Mr. Krishna, that’s essentially the point. UX/UI has become dominated by step-by-step instructions and closed-process approaches, which render customers and designers both functionally passive. Krishna speaks against that technique, demanding content creators and experience designers remain actively engaged with their product.

Krishna’s stated principles will undoubtedly receive much criticism. Not just from those whose career paths rely on tech companies doing what they’ve always done, either. He repeatedly stresses the importance of design ethics, of prioritizing users’ well-being above “monetizing eyeballs.” Can you imagine, say, Mark Zuckerberg telling shareholders that this quarter’s dividends have gone down because he’d rather do right by users than sell ad space?

Me neither.

That said, he’s not wrong. Today’s epidemic of people glued eyes-first to laptops, tablets, and phones didn’t just happen; UX/UI professionals designed it. Enrapt audiences are good customers and, more accurately, good product which corporations can tranche and resell to ad peddlers (see also Marc Goodman). Much as I enjoy Facebook, it’s impossible to deny that first-generation coders didn’t have our best interests in mind.

No, this isn’t a how-to book. It’s a vision of what Golden Krishna believes computers should be capable of. It’s a manifesto for future designers to apply themselves to making technology simpler for us, not dominant over us. It’s a vision of a future in which I’d willingly live.

Monday, October 13, 2014

In the Kingdom of the Newbies

Liz Wiseman, Rookie Smarts: Why Learning Beats Knowing in the New Game of Work

Did you ever read a book and think the author missed her own point? Say, an author praising the ingenuity of beginners, whose unclouded vision opens doors in today’s fast-moving economy? Liz Wiseman says plenty I find laudable in this book, but suffers the very tunnel vision she attributes to others. She’s so eager to extol the contributions rookies make in contemporary business, she misses that her evidence points to a related, but very separate, conclusion.

Wiseman, a management consultant and businesswoman of varied CV, covers much the same ground Shane Snow and Jack Hitt explored recently. However, where Snow and Hitt are journalists, Wiseman, an entrepreneur and researcher, brings hard analytical sophistication to her process. She makes a persuasive case that, in disciplines where innovative thinking matters, new players and career shifters bring strategic advantages which credentialed experts often miss.

Rookies accomplish this, Wiseman argues, through aggressive networking, diversifying the knowledge base, and seeking guidance where needed. She writes: “Aware of his [sic] own lack of knowledge, the rookie embarks on a desperate, focused, diligent search, hunting for experts who can teach him and guide his way.” Oh, wait, so experts really are necessary? Rookies benefit from their willingness to defer to experience?

That suggests, not that rookies beat veterans, but that rookies and veterans need one another, forming a symbiotic relationship where each advances the other. Indeed, where each lacks the other, catastrophic consequences frequently ensue. Untutored newbies created the Clinton-era tech stock bubble. Grizzled old hands with minimal tendency to ask plainspoken questions tanked the financial and housing sectors. Imagine if either had simply shown basic willingness to listen.

Shane Snow addressed this very topic (I had significant problems with Snow, but this wasn’t one). Though one-on-one mentorships tend to perpetuate old habits, a diffuse program where senior workers counsel up-and-comers encourages newbies to take chances, learn more, and do better. Though neither Snow’s journalism nor Wiseman’s research proves it, common sense suggests such relationships also keep veterans open to rookies’ innate wide-eyed wonder.

Further, Wiseman repeatedly extols “humility” as a rookie virtue. Rookies, she insists, are naturally humble, where veterans are cocksure, shunning advice. I say: can be. We’ve all known noobs who accept, even solicit, guidance, and pundits who talk without listening. We’ve also known old warhorses who maintain the cheerful mindset of perpetual students, and novices who prove the adage, “A little learning is a dangerous thing.”

In my varied career, I’ve seen:
  • Apprentice actors who argue with directors, believing themselves unrecognized Pacinos;
  • Freshman Comp students who demand top marks because they received all A’s in high school;
  • Recent nursing graduates who unilaterally countermand doctors’ orders;
  • Graduate students picking fights with otherwise generous professors in defense of theories discredited decades ago;
  • Writing workshop participants who eagerly give criticism, but turn deaf when receiving it; and
  • Factory noobs who need bandages or splints because they reach around basic safeguards.
And I must admit, at various times, these people have been me.

Wiseman talks up “green belt syndrome,” a martial arts term for student fighters who, having earned an intermediate rank, believe themselves born samurai. Wiseman clearly thinks this makes them scrappy and indomitable. But martial artists call it a “syndrome” deliberately: GBS sufferers frequently pick fights they’re unqualified to win, jeopardizing themselves and others. Some people require periodic ass-beatings to instill needed humility.

So, if neither rookie humility nor teamwork is a foregone conclusion, what remains? Neither innocence nor experience seems sufficient, whether judged by common sense or by Wiseman’s exposition. Indeed, from Wiseman’s own evidence, I draw a contrasting conclusion: the necessity of every experience level within complex organizations. Apprentice triumphalism is as unwarranted as professional self-satisfaction. Rookies need expert guidance; veterans need unfiltered newbie eyes.

Even Wiseman acknowledges this early: “Rookie smarts isn’t defined by age or by experience level,” she writes; “it is a state of mind.” Complex organizations benefit from occasional transfusions of fresh blood, whether from new hires or internal reshuffles. This doesn’t mean putting your best shellbacks to pasture, because new blood needs old. But it does require never becoming so enamored of past triumph that you miss the approaching future.

In my favorite quote, Wiseman writes, “What we know might mask what we don’t know and impede our ability to learn and perform.” I agree; I’ve seen Taylorist managers submarine their own operations by refusing floor-level advice. But that doesn’t make the diametrical opposite true. Wiseman’s so focused on rookie contributions that she apparently misses the two-way nature of the relationship.

Saturday, September 13, 2014

An Open Letter to My New Favorite Author

Shane Snow, Smartcuts: How Hackers, Innovators, and Icons Accelerate Success
Note: this commentary is a continuation of my previous review, Success on the Installment Plan.
Dear Mr. Snow:

On August 12th, 2014, shortly after I published my pre-release review of your first book, you contacted me personally. Among other things, you asked: “what [do] you think would have solved that ‘half an argument’ issue?” And: “were there chapters where you feel the ‘half an argument’ thing wasn't a problem?” Having taken a month to contemplate your questions, I think I’m finally ready to venture an answer.

Let me first thank you for your courteous, intelligently self-critical message. I’ve suffered recently from authors who think they’re owed positive reviews simply for publishing something, or accuse me of vitriolic bias for disagreeing, or aggressively attempt to squelch and silence my response. Your gentlemanly willingness to keep civil, engage in dialog with opposing viewpoints, and solicit further feedback suggests you’ll go far.

Therefore, after careful consideration, I must conclude my problem isn’t with your book specifically. That is, while your book embodies problems I’ve seen increasingly often recently, my problem is the trend upon which you ride. I’m troubled by the popularity of a secularized pseudo-Calvinist determinism that treats success and failure as foreordained, business and life circumstances as transferable, and life as free from contingency.

Essayist, businessman, and hedge fund manager Nassim Nicholas Taleb identifies three fallacies that impede our ability to analyze economic, social, and cultural movements:
  1. “The illusion of understanding,” the belief that reality is, in full, comprehensible;
  2. “The retrospective distortion,” the tendency to evaluate events afterward, seeking linear narrative and clear cause-and-effect relationships; and
  3. “The overvaluation of factual information,” the assumption that, with sufficient facts, we can preclude flukes and fortuity from all decisions.
This tripod accurately describes my problem with many business theorists’ writings, including Clayton Christensen, Seth Godin, Josh Linkner, and now you. By assuming we can retrospectively reconstruct success, and market it generically, we systemically dismiss life’s unpredictable circumstances. We treat every success as happening in a vacuum. (I have significant problems with Taleb, too. But that’s for another diatribe altogether.)

The model you utilize in writing your book essentially involves finding people you deem successful, admirable, and worthy of emulation; tracing the path they followed to arrive where they are; and urging us to do much the same. Certainly, in describing business pioneers like Elon Musk, or cultural innovators like J.J. Abrams, I cannot fault your facts. Yet in stepping outside your text, I cannot avoid noticing significant omissions.

Consider: your profiles frequently involve what your subjects reveal in direct interviews, official press biographies, and other forms of self-reporting. You never ask yourself why successful people report themselves certain ways. Smarter people than me have observed that simply being wealthy changes how people think. They write biographies to justify themselves, or propound moral principles, or sell product. Factual accuracy ranks low in their priorities.

I’ll revisit an example from my first review. Because I have a theatre degree myself, your Jimmy Fallon example speaks to me directly. I’m intimately familiar with performance—not just the love of engaging an audience, but the frustration of turning one’s love into one’s career. Therefore, it bothers me that you spend pages and pages and pages on Fallon (and several on Louis CK), but none whatsoever on the thousands of aspiring comedians forced to quit every year.

Examining Fallon’s success, and nobody else’s failure, creates the retrospective illusion that Fallon succeeded because he had to succeed. Numerous comedians followed his exact arc. But they didn’t play Zanies the night network scouts visited, or they died onstage the night somebody else killed, or they auditioned for SNL the day Lorne Michaels ate bad pastrami, or any of ten thousand circumstances not encompassed by Fallon’s official biography.

Because your work spells out success anecdotes in exhaustive detail, while giving only nodding recognition to failures in similar fields, it creates an illusion of false control. In your personal e-mail, you acknowledge that “business indeed is often like gambling...but that there are ways to make smarter bets, and that's through pattern recognition.” But remember the bromide: the house always wins. Your best chapters aren’t about gambling at all.

I particularly like your chapter on how Eli Pariser parlayed Upworthy.com into a competitive Web venture by co-opting techniques from listicle writers and spam merchants. My favorite bit has you describing how he floats multiple link titles to experimentally test which attracts the most clicks. But that’s almost exactly the opposite of gambling: he starts with a desired outcome, tests variables, factors for contingencies, and thinks like an engineer. Can you not see that?
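To make the contrast concrete, here’s the experiment in miniature: a Python sketch of headline testing. The variants, traffic figures, and uniform random split are invented for illustration; Upworthy’s actual tooling was surely more elaborate:

    import random

    # Hypothetical headline variants for one story (invented for illustration).
    variants = {
        "A": "You Won't Believe What This Mayor Did Next",
        "B": "One Small Town Just Solved Its Housing Crisis",
    }
    impressions = {key: 0 for key in variants}
    clicks = {key: 0 for key in variants}

    def serve_headline():
        """Assign an incoming visitor to a headline variant at random."""
        key = random.choice(list(variants))
        impressions[key] += 1
        return key

    # Simulate a day of traffic against made-up "true" click rates.
    true_rates = {"A": 0.03, "B": 0.05}
    for _ in range(10_000):
        key = serve_headline()
        if random.random() < true_rates[key]:
            clicks[key] += 1

    # Choose the winner by observed click-through rate, not by hunch.
    ctr = {key: clicks[key] / impressions[key] for key in variants}
    winner = max(ctr, key=ctr.get)
    print(f"CTRs: {ctr} -> run everywhere with variant {winner}")

The outcome is fixed in advance, a single variable is isolated, and chance is measured rather than courted. That’s engineering, not roulette.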

Benedict Carey’s How We Learn, published the same day your book was, focuses on how people can systematically improve themselves, their careers, and their chances in real life. It’s truly possible to reduce happenstance to an acceptable level. But to do so, we must plot opportunities prospectively, not retrospectively. And we must chart our own course, not assume we can replicate something somebody else already did.

If we could genericize success and market it broadly, somebody already would have. While I don’t mind using others’ stories as inspiration or exemplar, we must resist the temptation to think we can follow somebody else’s path to success. “Overnight successes” grow from long, painful investments. You hint at the difficult parts of slow learning, Mr. Snow, but your text spotlights dramatic high points. In short, your book needs more process, less spectacle.

Wednesday, September 10, 2014

Success on the Installment Plan

Shane Snow, Smartcuts: How Hackers, Innovators, and Icons Accelerate Success

Something cracked inside me partway through reading this book. I was digging Snow’s exhortations about knowing your field, and finding efficient ways to improve yourself for maximum desirable outcome, and “working smarter and achieving more—without creating negative externalities.” Wow, externalities: Snow even knows my favorite buzzwords. I really thought this guy might be in a weight class with Malcolm Gladwell or Charles Duhigg.

Then, at almost exactly the halfway point, specifically in chapter five, something rang hollow. Snow builds a metaphor comparing business trends to surfing. Good surfing waves come in groups, called “sets,” and sometimes the first wave in the set is the best. Sometimes, though, competitive surfers know to wait for the second or third wave. There’s no scientific or measurable way of knowing which; serious competitors develop gut instinct.

Likewise, Snow writes, being first on some popular bandwagon is sometimes economically advantageous. Other times, savvy innovators know to hang back, letting someone else pay the costs of creating new markets, then slide in later to reap the reward. How to tell? There’s no true way. Either you know or you don’t, and the difference between runaway success and abysmal failure lies mainly in your ability to guesstimate the signs.

In other fields, we call this “gambling.” We excoriate bankers whose half-educated guesses on economic movements nearly imploded America’s economy in 2007. Perhaps the difference between Goldman Sachs, whose name became a veritable cussword after TARP funds prevented their abject collapse, and dubstep millionaire Skrillex, whom Snow praises, is that Skrillex won his bets. (Snow mentions, but doesn’t much explicate, economic second-wavers like Gmail and Twitter.)

I realized: Snow cherry-picks winners, and rebuilds their development arc retrospectively. He essentially predicts the past, treating various winners’ triumphs as inevitable because they won. There’s little sense of the contingencies that contribute to success. Because Snow spends little time on those who fail in the same fields, except to occasionally name-check them, we get little idea what separates Snow’s extolled successes from similar failures.

Lemme give just one example: Snow explicates Jimmy Fallon’s run up comedy’s ladder, which was fast: he went from live stand-up, to SNL, to Late Night, to the Tonight Show, and he isn’t even forty yet. Fallon enjoyed a committed mentor who refined his performance, sheared useless place-holding dates from his performance schedule, and focused on what performances would further Fallon’s goal: getting on SNL. Snow says everything exactly right.

But.

Thousands of people become stand-up comedians yearly. Most are young, starry-eyed strivers; some are second-careerists revitalizing themselves. Most will, as Snow describes of Louis CK, informally apprentice themselves to George Carlin or Bill Cosby recordings. Thirty or forty will get on SNL, and others will snag writing contracts, sitcom development deals, or HBO specials and lucrative tours. Most will give up because they get hungry. What’s the difference?

The answer defies simple formulae, but to approach it, let me use another comparison from Snow. Two YouTube users became overnight stars: Paul “Double Rainbow” Vasquez couldn’t replicate his success, while makeup artist Michelle Phan did. It takes Snow an entire chapter to reach the seemingly obvious conclusion: Phan’s content was useful. Consider who becomes YouTube stars: comedians, opinionators, educators. People who produce practical, topical, or consistently entertaining content.

Jimmy Fallon succeeded because he cultivated (and still cultivates) a roster of impersonations that play well on after-dark TV. Note that Fallon’s TV career has flourished, but his very brief cinematic career cratered. Snow never mentions Fallon’s biggest failure, the movie Taxi, which Fallon himself admitted on NPR “wasn’t supposed to be a tragedy.” Even with a committed mentor, Fallon’s success relies on his ability to avoid repeating failures.

Snow essentially collects a robust selection of anecdotes which prove his desired point, spots the similarities between them, and treats these as definitive. But note, he starts with the conclusion, and builds the reasoning retroactively. We in the logic-chopping business call this “shooting the barn,” from the metaphor of a supposed Texas sharpshooter who opened up into his barn wall, then painted a target around the largest cluster of holes.

In fairness, Snow writes well. I read cover-to-cover in two sittings. And he makes many good points, albeit mostly in isolation. The problem isn’t any one thing Snow says, but the overall pattern, which doesn’t hold together. Malcolm Gladwell incorporates counter-evidence in his reasoning, and addresses it. Snow basically considers his message so self-evident that he doesn’t bother. This book just feels like half a statement, waiting for more.

Friday, October 11, 2013

eSpelling

I first encountered the term “email” in a Freshman Composition textbook, in an essay by digital journalism pioneer Michael Kinsley. Not that I’d never encountered the concept before. By 1999, when I started college late, digital communication was busily revolutionizing how people interacted, allowing them to send text-based messages internationally, but also giving junk advertisers unprecedented access to ordinary citizens’ information.

No, I understood the idea of “email.” The orthography, however, took me by surprise. I’d always written it “e-mail,” emphasizing the separate pronunciation of the first syllable, /ē’-māl/. Kinsley’s spelling made the word look like it should be pronounced /ə-māl’/, with the emphasis on the second syllable. It resembled the common French name Emile, more than anything electronically analogous to the Post Office, which I still religiously used back then.

This digitally motivated shift in English orthography has not been entirely even. The corporate name Google has become a lower-case verb because of Google’s overwhelming market presence, and because English needed such a word. The comic strip character Barney Google probably helped ease us into saying “to google” when we needed to search the Internet. Yet the spelling of “to google” remains entirely consistent with established English usage.

Not that Kinsley holds dominion over spelling. The MLA Stylesheet, and other print-based usage guides, favor “e-mail,” the spelling that still seems most consistent with pronunciation to me. The English Language & Usage community on Stack Exchange indicates that my preferred spelling remains more widely used. Yet the AP Stylebook changed its standards in 2011, now favoring “email” for journalists online and in print. As go journalists, so, probably, will go the vernacular.

English is notoriously difficult for second-language learners to master. Its frequent borrowings from other languages create lopsided spelling shifts, as a word from French may align with another from Japanese, all arranged in an essentially Germanic grammar. Advancing digital technology only compounds this confusion, because we aren’t borrowing from a real language. We lack any precedent for how to spell or organize new words.


Only that initial vowel, usually e or i, has changed how we write. We absorb inconsistencies with remarkable ease: for instance, we’ve never decided how to describe commerce occurring in an entirely digital format. Do we call it “e-commerce,” “ecommerce,” or “eCommerce”? My computer’s digital spell-check accepts the first and third options, but balks at the second, though it had no problem with how I wrote “email” earlier.

Trade names ease some of this confusion. Brands like iShares, iPad, or eBay have easily pronounceable names, because the offset capital letter emphasizes the initial vowel’s separate status. Nobody seriously expects to shop on /ə-bā’/, using an /’i-pæd/. Yet the capital letter doesn’t just steer pronunciation; it also signals these terms’ identity as wholly owned brand names. Nobody “owns” email, though it has many providers, so it won’t get capitalized.

Even the brand-name workaround doesn’t always work. E*Trade is pronounced much like email, yet the Wal-Mart-ish asterisk creates an entirely new orthography, separate from email or eBay. It’s hard to say how seriously to take this, however, since the company is still formally organized as “E-Trade Financial Corporation.” The asterisk may be a mere stylistic flourish invented by advertisers. Mercifully, few others have mimicked it so far.

Electronic books may only compound this inconsistency. Not only do we have no agreed means of describing these products (e-books? ebooks? eBooks?), but the large and growing number of programming formats means that books written in one form, Kindle for instance, are unreadable in other formats, like iPad or Kobo. Authors’ surface orthography proves only the tip of the readability iceberg, and people favoring one format develop their own dialect.

While e-book entrepreneurs frantically try to invent new formats, they strive to keep them compatible with old formats, like HTML and PDF. Thus English retains its single shared past, when computers had to be compatible or they’d be useless; but it cruises toward a divided future. The market will certainly shake out certain formats (Rocket eBooks still exist, for instance, but only vestigially). But new technology will hasten new confusions.



English has never had an institution like l’Académie française, which regulates and purifies French. We’ve never needed or wanted such an institution, and its lack keeps English adaptable to changing times. Yet technology’s rapid and accelerating shifts mean we’ll face new ideas and applications daily. Without mutual standards for new, tech-based language, English will only become harder and more opaque. Earth’s most widespread language deserves such a backstop.

Monday, June 4, 2012

America, Land of the Do-It-Yourself Self

Jack Hitt, Bunch of Amateurs: Inside America's Hidden World of Inventors, Tinkerers, and Job Creators

Andrew Carnegie never went to school. Thomas Edison had no professional credentials whatsoever. America’s greatest innovations have come from the hands of people who, the “proper officials” said, had no business getting involved. Journalist Jack Hitt asserts that amateurism, the pursuit of a field out of sheer love, without expectation of reward, sits at the heart of American identity.

Amateur derives from a French term for love, and signifies that we have a passion for some subject that no amount of money can approach. And that’s what makes America strong. We made a country without relying on kings or popes. We advanced science, sometimes in the face of proper scholarship. Our best businesses started as shoestring operations. We are a people who built ourselves without waiting for someone to rubber-stamp our enterprises.

Amateurism stands, for Hitt, against “credentialism,” the dogmatic belief that externally bestowed endorsement makes somebody an expert. Credentials often impede innovative thought—a point that isn’t even new this year, since it figured so large in Jonah Lehrer’s book Imagine. The very process of becoming a professional insider instills habits of thought that ensure the thinker can do the job exactly how it’s always been done, and not one step beyond.

We can see this in Hitt’s book, when professional ornithologists wear blinders that keep them from seeing what even an amateur birder can see: that ain’t an extinct bird returned to earth. Or in the chapter on amateur astronomy, when the professionals have to keep doing work that will produce results. The amateurs have the liberty to study the sky, looking for the kinds of discoveries that rarely come, but actually move human knowledge forward.

We know this, just looking around. We can see that business school graduates make lousy entrepreneurs. Journalism school graduates seldom do meaningful investigations, preferring to repeat official statements with the agreeability of bobblehead dolls. Most physicists’ best work is done before they turn thirty. Outsiders, guerillas, and eager neophytes make the actual inroads that keep America’s greatest disciplines thriving.

Hitt notes an important study showing that compensation actually sucks the life from pursuits. From an early age, pay changes the equation that drives our actions. Small children will draw or write or play for the sheer joy of the process; but when rewards, or grown-up approval, gets into the equation, creativity and productivity go through the floor. That’s why, when you get a job doing what used to be your hobby, the joy goes out of whatever you used to love.

Not every chapter supports Hitt’s thesis. Indeed, his chapter on amateur archaeology, with its implications of sublimated racism and pseudo-intellectualism, suggests that amateurism contains the roots of powerful abuse. Even in his largely laudatory chapter on amateur genetics, he never quite addresses the risk he brings up of some exuberant teen warping the common cold into the next Black Plague. These serve as cautions against unbridled amateurism.

The do-it-yourself ethos taps into the best America has to offer, but also the worst. It forms the lifeblood of cultural development, allowing those truly passionate about their field to make substantial contributions. But it also allows ignorant crackpots to go off half-cocked, propagating ideas that are dangerous or wildly offensive. Maybe that’s a fair description of America and her people: a nation of unrecognized geniuses and wild-eyed fanatics.

How, then, to counter the worst impulses of amateurism? Hitt has no suggestions. Instead, he simply reminds us that, for all its risks, amateurism has contributed more to our national well-being than we can possibly calculate. It falls to us to ensure that we keep up the passionate immersion of amateurism, without lapsing into the dangerous extremes of moronic crankery.

Hitt overlaps with several other recent books. In addition to Lehrer, mentioned above, Hitt shares many themes with Susan Cain’s Quiet, Charles Pierce’s Idiot America, and Malcolm Gladwell’s Outliers—must be something in the water. Sometimes they correspond almost verbatim, since they’re quoting the same sources. While Hitt doesn’t necessarily bring new ideas to the table, he brings new and interesting context.

Hitt makes a strong case that Americans’ frontier ethos, where we do it ourselves, and where we make our own expertise, makes us the people we are. He sells his point with careful insight, unexpected dry wit, and spirited narrative panache. If he can get just a few people out of their TV-induced comas and out doing what they love, he will have done a good service to this great land.

Friday, May 25, 2012

One Possible Cure For Small Town Malaise

Ever since Sinclair Lewis diagnosed the “village virus” in his 1920 classic Main Street, America has remained deeply divided in its feelings about small town life. On the one hand, we often treat small towns and rural areas as bastions of earthy virtues and interlaced community. On the other, small towns often produce small minds, and serve as seething cauldrons of resentment. Unfortunately, both views are right, which means both views are wrong.

Anyone who has lived in America’s small towns recently, however, knows our rural communities are certainly one thing: marginal. This has been the state of American village life since at least the Eisenhower era, when cheap cars and postwar prosperity led to a concentration of industrial might in metropolitan areas. Small towns became stopovers on American transportation routes—and, with the rise of Interstates and cheap air travel, became not even that.

Nowadays, John Mellencamp serenades his happy memories of small town life, though his tours stay in cities large enough to support arena venues. Bill Clinton touted his birth in tiny Hope, Arkansas, while eliding that, when he was a boy, his mother moved the family to the resort suburb of Hot Springs, or that he left Arkansas altogether to commence his career. People tout small town origins, but have to leave to make something of themselves.

Jonah Lehrer, author of Imagine: How Creativity Works, describes studies that have indicated why some of the world’s largest cities have also proven the most creatively fertile. Where large, diverse populations interact, people have the opportunity to discover new points of view, prod others to greater accomplishments, and test each other’s capacities. Simply put, as population increases, productivity increases disproportionately: doubling a city’s size more than doubles its output.
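If memory serves, Lehrer’s source here is Geoffrey West and Luís Bettencourt’s urban-scaling research, usually stated as a power law; the exponent below is my recollection of that literature, not a figure quoted from Lehrer’s text:

    Y = Y_0 N^{\beta}, \qquad \beta \approx 1.15
    \qquad\Longrightarrow\qquad \frac{Y(2N)}{Y(N)} = 2^{1.15} \approx 2.2

where N is population and Y is almost any socioeconomic output: wages, patents, new businesses. Doubling the population yields not twice but roughly 2.2 times the output.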

This thesis has its limitations. Were size the only relevant variable, Lagos, Nigeria, would be as creative and prosperous as London, which is about the same size. But if we look at those cities and neighborhoods that have proven the most productive over the years—Greenwich Village, Haight Ashbury, Bloomsbury, the Left Bank—we see they aren’t just populous. They’re also arranged to facilitate interaction among diverse populations.

Manhattan’s White Horse Tavern, famous for Dylan Thomas’s fatal drinking binge, began life as a longshoreman’s watering hole. Hampstead pubs famously draw artists, laborers, businessmen, and tourists. People meet one another across economic and social lines. Even the streets favor interactions, since crowded main roads slow traffic and advantage pedestrians. How many novels, paintings, and business ideas were conceived in Paris Metro stations, I wonder?


By contrast, the small towns where I’ve lived have a self-segregating tendency. We have working class bars, professional bars, student bars, sports bars, and their clientele never mixes. Different coffee shops and tea houses, different restaurants and businesses, cater to distinct customer bases, deepening divides. Too often, small town dwellers never meet anyone particularly different from themselves.

This is heightened by village layouts. Low land values and minimal space competition means towns sprawl. Without a car, small town dwellers are stranded, but with cars, they never meet anyone they don’t want to. People can ensure they only frequent businesses that cater to them, not ones near where they live or work. One of my town’s major coffee shops is drive-thru only, so you can enjoy your mocha frappe without any messy human entanglements.

Where communities have economy enough to encourage new building, that tendency becomes more pronounced. Single-use developments include similarly sized houses on identical lots, forbidding mixing of economic strata. They also seldom have restaurants, bars, shops, or hangouts where people can meet. They may have a few lots zoned commercial, but building costs ensure only chain businesses, like fast food franchises, can afford to move in.

In Nebraska, where I live, politicians invest much hair-pulling in wondering what it takes to keep ambitious, educated young people from leaving the state. Yet large-scale economic development funds exclusively create pedestrian-hostile towns which minimize human interaction. Even for those whose education and connections ensure economic mobility, social mobility has dwindled to insignificance. Free flow of ideas just doesn’t happen.


Shifting design priorities to boost interaction would alleviate at least part of the ennui that blankets many small American towns. If people could sit down to eat and drink in the same space as a callused tradesman and a necktied attorney, they would encounter new ideas, which they could then experiment with, until they created something truly new. If people had to walk to work, they could enjoy the surprise of simply saying hello to people they don’t see every day.

Such changes wouldn’t be a silver bullet to save America’s small communities. Many towns need to modernize their infrastructure and overcome cultural habits that defy the times. But if people could meet newer, more diverse groups, such towns would be well on their way to achieving those much-needed goals. The longer we wait, the harder it will be to overcome inertia.

Wednesday, November 2, 2011

Companies That Know Our Needs Before We Do

Why did the iPod and Kindle create an appetite in consumers while the Zune and the Sony Reader didn’t? Why do grocery buyers regard visiting Safeway as a chore, but look forward to shopping at Wegmans? Adrian Slywotzky investigates these questions in Demand: Creating What People Love Before They Know They Want It. But he overtalks the point and, unfortunately, blurs the line between business writing and advertising.

Consider: did you have any idea you were unhappy with Lackluster Video before Reed Hastings launched Netflix? Did you feel the need for constant social networking before you signed up for Facebook? Of course not. Yet the creative minds behind these breakthroughs recognized needs so completely unmet that we hadn’t even noticed we had them yet. Now we can’t imagine our lives without these cushy conveniences.

Slywotzky admits from the beginning that he has no simple formula for what he calls “demand creators,” those producers and vendors who sell what we never knew we needed. Though he identifies several areas where future developments will happen, he can’t give you a step-by-step guide to creating demand. Demand creation relies on creativity and insight. Our best hope is to learn from those who have done it before.

It may seem that many demand creators have only one great leap in them. Slywotzky published this book before Netflix’s recent flame-out. While Reed Hastings had a good idea, his follow-through hasn’t been exemplary. He had one idea that revolutionized the media distribution business, but he has peddled a great deal of confusion lately. Can we really learn anything from someone who misjudged his audience so badly?

Yet remember how many times Steve Jobs or Jeff Bezos hit the nail on the head. Each correctly anticipated not only what customers wanted, but how developing technology made fulfilling those wants possible. And technology isn’t even necessary. The Wegmans grocery chain has stayed ahead of the competition for decades by simply making their stores a destination, and by not expanding faster than economic realities permit.

Some demand creation requires routine awareness. Jeff Bezos recognized the Kindle’s potential when he saw a Sony Reader, which was well designed but poorly pitched, and realized it could benefit from Amazon.com’s infrastructure. Other demand creation is more studious. California-based CareMore delivers managed medical care to the elderly at steep discounts while improving quality because market research returned results that appear downright counterintuitive.

For all their differences, though, demand creators share an ethic of recognizing their customers as real people with real needs. Even non-profits like Teach For America or the Seattle Opera recognize their audiences, in very real ways, as customers. The institutions have something people need—such as education or culture—and their first responsibility is to match their product with others’ needs.

But I have difficulty understanding the mental habits Slywotzky wants me to gain because his segmented structure doesn’t let me spot patterns. He discusses two or three companies per chapter, from profitable industries to philanthropic institutions, but by the next chapter, those examples vanish. I’d love to know how Zipcar embodies Slywotzky’s various virtues, but once he’s done with it, he never mentions it again.

And his praise of companies he admires goes on at length. His treatment of Tetra Pak and the Kindle, in particular, goes well beyond what we need to understand these products’ success. Slywotzky’s stories mount up, with such unflagging praise that he starts to sound like a PR flack. Is he paid by the line? My eyes glaze over, and I find myself wondering whether I’ve really learned anything in the last fifteen or twenty pages.

Business writing must respect that business innovators lead busy lives. Their workweeks generally exceed forty hours, and time spent reading should be remunerative in some way, whether by teaching something useful or providing needed psychological refreshment. Writers must put every anecdote, every discursion, every rumination on trial for its life. Anything that doesn’t advance the point must go.

This relatively long book could use a firm editor. It’s nearly a third longer than similar business books, and even at that length, feels disjointed. Much as I appreciate the individual points, and much as they advance ongoing discussions, this book doesn’t go nearly as far as it should. Inquisitive readers can gain plenty from reading, but the book simultaneously runs far longer than it should and doesn’t do its topics justice.

I’ve identified a demand I didn’t know I had. I demand this necessary book, with the numerous kinks worked out.