Thursday, July 28, 2022

Algorithm & Blues

Gretchen Rubin

Has anybody actually calculated the useful life expectancy of a Pandora channel? You probably know what I mean. You input some artist you’ve recently discovered, and Pandora starts giving you sufficiently similar artists. You like Nick Drake (let’s say hypothetically), so it introduces you to Ralph McTell, John Cale, and Donovan. Yippee! But at some vague point, it starts giving you only artists you already know. It stops being useful.

Gretchen Rubin writes that Pandora’s own corporate executives agree that their product is a mixed blessing. (Rubin interviewed one such executive before Sirius XM acquired Pandora.) The service introduces listeners to artists they’d never encounter through commercial FM radio, particularly if their seed artist is esoteric, international, or regional. But Pandora doesn’t introduce listeners to new kinds of artists. We hear more artists, but less diversity.

Pandora, like similar streaming services such as Spotify and Deezer, makes our worlds paradoxically deeper, but narrower. By contrast, FM radio or the original MTV showcased a greater range of artists but didn’t explore any of them particularly deeply: they made our worlds broader, but shallower. Because broadcast and cable services need to attract large audiences to maintain ad revenue, getting too particular drives away sweet, sweet dollar signs.

I noticed this pattern when social media began aggressively showing me ads from Ben Shapiro’s Daily Wire. The Facebook and Twitter algorithms observe me using Christian language and discussing my faith, and assume I’m conservative. I keep playing whack-a-mole to drive these ads away. Meanwhile, YouTube (a Google subsidiary) knows I like leftist videos, so it assumes I’m an atheist and aggressively pushes videos by Christopher Hitchens and Sam Harris.

My Netflix and Amazon Prime recommendations, once a labyrinth of new discoveries, have become ingrown, like a toenail. As I grow bored with the once-vast panoply of options, I don’t know where to begin looking for something new because, unlike the FM radio and broadcast TV that once dominated our lives, the streaming services don’t vary. To break free of the enforced monotony, I have to flee the paid services altogether.

Sasha Issenberg

This observation probably seems obvious to most people today. But the longer I ruminate on it, the more I realize it has become a defining reality of daily life. Our employment options have been constrained by mass corporate consolidation, especially for older workers with more responsibilities. Anybody who’s looked for work recently knows that, as the employment market has become more concentrated, filling out a single application can take an entire day.

Likewise, the ubiquity of single-use urban design built around feeder roads means we have little need, much less opportunity, to visit most of the city we dwell in. Despite living in a city of under fifteen square miles myself, I recently realized entire neighborhoods are terra incognita to me, because there’s no reason to enter them without an invitation. My daily world has shrunk to a small handful of streets and prefab buildings.

This trend reaches its culmination in political parties. In America, if you favor free markets, you’d better join the Republicans and also favor hoarding guns and shitting on the environment. If you favor racial equality, you’ll join the Democrats and favor abortion on demand and hiking the minimum wage. Sure, third parties exist, but in America’s winner-take-all system, if you hope to win, you’d better buy into the duopoly.

Nor is this accidental. Political parties, and the super PACs that bankroll them, have channeled massive algorithmic machinery into politicking. As Sasha Issenberg writes, numerous private companies maintain databases derived from your purchase records, social media behavior, and other sources, and use them to predict your political allegiances. Parties use these databases to steer what information you see, making your political views more siloed, more extreme, and frequently more intolerant.

Free will, if humans truly have it, is always conditioned. That’s not revolutionary to say; even I’ve said something similar. But as algorithms control what information we’re permitted to discover, the conditions circumscribing our choices become more extreme. Companies like Facebook, Google, and Amazon make first-order profit by making our options deeper, but narrower, just like Pandora. Their second-order profit comes from selling those limits to the rich and powerful.

If public discussions have become more polarized and intolerant—and evidence suggests they have—then these controlling algorithms bear a sizable share of the responsibility. By making sure we never encounter new ideas, new people, or even new music, these algorithms reduce us to crude caricatures of our once-vibrant selves. Our horizons shrivel. And unfortunately, resisting on an individual basis seems nearly impossible.
