Wednesday, August 8, 2018

Alex Jones and the New Techno-Government

[Photo: Alex Jones]
Facebook and Google have given me massive ethical twitches recently. As two of Earth’s biggest websites, they draw massive amounts of business into their webs every day. It’s virtually impossible to communicate with a mass English-speaking audience without going through these two companies. But as private companies, they have the ability to enforce personal, often arbitrary “community standards” on content produced by ordinary citizens. They have become the ultimate privatization of the public sphere.

This most readily manifests in “adult” content. YouTube (a Google subsidiary) and Facebook both assiduously screen images of boobs, sexual content, vulgar or violent memes, and anything else that might curl Aunt Mabel’s hair. I have no problem with that, theoretically. Except I do, because both Google and Facebook are so vast, and screen so much of the undifferentiated content ordinary people receive, that making this content disappear from their sites makes it basically disappear forever.

So I’m conflicted about Facebook and YouTube’s decision Monday to scrub notorious troll Alex Jones. This moon-faced whack-a-mole, famous for preaching everything from “Pizzagate” to Sandy Hook “crisis actors” to claims that Democrats plan to launch a “Second Civil War,” finally pushed even Facebook and Google’s studiously neutral content critics too far. They’ve decided to starve him of oxygen. Part of me wants to shout: “Thank God! Maybe we can get serious, grown-up discussion going again!”

Except…

Between them, Google (which owns YouTube) and Facebook (which owns Instagram) control over half of planet Earth’s Internet advertising revenue. They aren’t just content gatekeepers; they profit handsomely from deciding what you and I see. Though neither company has official state standing, both have power and reach autocrats like Vladimir Putin and Xi Jinping must drool over. Making somebody disappear from these sites has consequences so far-reaching, the word “censorship” isn’t out of line.

How do we process entities like this, which have greater reach than William Randolph Hearst or Rupert Murdoch ever dreamed of? Google and Facebook have state-like power, but no state-like democratic oversight. Most shareholders have no idea what goes into “community standards” on these sites. Even the people enforcing those standards make snap decisions. Try this experiment: report a friend’s perfectly innocuous statement for a violation. Betcha it’ll disappear, because monitors can’t actually read every reported violation.

[Photo: Mark Zuckerberg]
This isn’t even an issue of whether certain speech is acceptable. Like most First Amendment absolutists, I draw the line at incitements to violence. Saying something like “Person X is stupid and shouldn’t breed” is offensive, and deserves scolding. Saying “Get your rifles, Person X is gonna die” crosses the line between speech and action. And when somebody has a platform reaching millions of listeners, simply saying anything requires a diligent conscience and constant scrutiny.

Yet as we saw in 2016, during the first great “fake news” wave, propagandists can produce meaningless, fact-free gibberish that nevertheless motivates a base already primed for anger. We’ve seen what angry people do: they carry guns into pizza joints to verify conspiracy theories. They shoot up a roomful of journalists. They hector the parents of a murdered child so badly that the parents have to go into hiding. This isn’t free-speech fallout; it’s the consequence of actions.

Google, Facebook, and their subsidiaries thus find themselves in a precarious situation: they are private companies with the reach and influence once exclusive to governments. To do business and remain viable, they must exercise the discretion of a state. Their so-called community standards, like “no boobies where children can see them,” now carry semi-governmental weight. Companies no longer just live by community standards; they now set them.

This is simultaneously comforting and horrifying. It means companies now step up and take responsibility for the ways people use their products, even when people use those products recklessly. Corporations have too often sworn off culpability for their products: think gun and cigarette manufacturers. If Facebook and Google can own their products’ behavior, and enforce some minimum of accountability, then maybe so can Philip Morris or Smith & Wesson. I mean, probably not, but maybe. Someday.

Yet when governments silence unacceptable speech, we understand who they answer to. When states say flag burning and incitement aren’t protected speech acts, we (hopefully) realize the government answers to its people. (Pipe down, North Korea.) Corporations don’t. Sergey Brin and Mark Zuckerberg are accountable to somebody, I assume, but who? And what ensures they use their unelected, state-like authority reasonably? These questions should scare even those of us who are happy to see Alex Jones silenced.
