page 196

First, Gade developed an internal debugging tool to understand exactly how the recommendation algorithm worked. It could tell why a piece of content was being promoted at a given time. Each piece of content came appended with a small link that displayed a few reasons for its appearance in the feed, the variables that made the algorithm register it.

page 197

The logic was basic, mostly revolving around that dominant metric of engagement—what is already popular gets even more exposure. But the feature at least made the feed seem more coherent.
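
The engagement logic and the "why am I seeing this" link can be sketched together as a toy scoring function: popularity signals compound into a score, and the explanation is simply the variables contributing most to that score. All signal names and weights below are hypothetical illustrations, not the platform's actual model.

```python
# Toy engagement-based ranker with a "why promoted" explainer.
# All signal names and weights are hypothetical, chosen for illustration.
WEIGHTS = {
    "likes": 1.0,
    "comments": 3.0,        # comments weighted above likes as stronger engagement
    "shares": 5.0,
    "friend_interacted": 10.0,
}

def score(post: dict) -> float:
    """What is already popular gets even more exposure: signals simply add up."""
    return sum(WEIGHTS[k] * post.get(k, 0) for k in WEIGHTS)

def why_promoted(post: dict, top_n: int = 2) -> list[str]:
    """The small appended link: the variables contributing most to the score."""
    contributions = {k: WEIGHTS[k] * post.get(k, 0) for k in WEIGHTS}
    top = sorted(contributions, key=contributions.get, reverse=True)
    return [k for k in top[:top_n] if contributions[k] > 0]

post = {"likes": 120, "comments": 8, "shares": 2, "friend_interacted": 1}
print(score(post))          # 164.0
print(why_promoted(post))   # ['likes', 'comments']
```

Surfacing `why_promoted` alongside each ranked post is, in miniature, what made the feed seem more coherent: the explanation costs nothing extra, because it reuses the same variables the ranker already computed.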

page 200

Communications Decency Act with a piece called Section 230.

page 200

But in the social media era, it has also allowed the tech companies that have supplanted traditional media businesses to operate without the safeguards of traditional media. Section 230 makes a distinction between an open platform, like Facebook, and what users publish on it. “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” the law states.

page 201

The conflicting cases brought up a fundamental paradox: Internet services that did nothing to filter the content going to users were legally protected, while services that did try to filter the content, even just for basic quality or safety, were not protected.

page 202

Social networks displaced traditional publishers by absorbing advertising revenue, siloing content into algorithmic feeds, and mediating the relationship between publishers and their consumers. The consequence is that traditional media companies have been decimated, losing most of their revenue when compared to decades past, and publications are now forced to maintain themselves as responsible publishers with much smaller staffs. Even in their restricted circumstances, traditional media companies continue to hold responsibility for every piece of content they publish. Meanwhile, digital platforms could claim they were not media companies at all with the excuse of Section 230.

page 202

Section 230 has served as a shield, distancing social networks from what individual users post on their platforms. That ranges from #MeToo investigations to racist comments or threats of violence. The law has become the target of growing skepticism, and lawsuits are attempting to hold social networks responsible for what they distribute.

page 203

Algorithmic feeds help automatically distribute misinformation and can speed ideological radicalization, feeding users ever more extreme content in a single category. The problem with Section 230 is that ultimately, and strangely, the law makes it so that no one is currently responsible for the effects of algorithmic recommendations.

page 204

One of our only possibilities is switching to another platform, and yet that, too, has already been limited by the problem of monopolization. Our relationship to algorithmic feeds feels like a trap: we can neither influence them nor escape them.

page 204

Repealing Section 230 would not be a panacea. In some ways, it protects users’ free speech online and allows digital platforms as we know them to exist without being sued into oblivion for every stray tweeted insult or accusation.

page 204

The justices probed the uses and capabilities of algorithmic recommendations and debated if algorithms can be considered “neutral” (I would argue they cannot), but the general incomprehension was palpable.

page 205

The problem comes down to determining which kinds of content should be able to travel so quickly and frictionlessly across Filterworld, and which should be slowed down or stopped entirely.

page 206

Open-source software like Mastodon, which allows its users to create and host their own Twitter-like social networks, provides one hint at what might be to come. But Mastodon also demonstrates some of the disadvantages of such different infrastructure. Audiences are smaller on self-hosted platforms, and interactions are more difficult. You might not be able to find any kind of content that you want. There isn’t the possibility, or threat, of viral fame. But those are the trade-offs that we may have to make for a more sustainable overall digital culture.

page 206

The spread of misinformation online during the COVID-19 pandemic drove a viral craze for Ivermectin, a drug that is most often used for horses. The patients who took it were hurt much more than helped, sometimes even hospitalized because they ingested the medicine. The Ivermectin stories spread far because they attracted engagement, in part as a politicized issue fueled by the rhetoric coming from Trump and his administration. The misinformation turned out to be optimized for the algorithmic equation, creating the kind of instant interaction that prompts even more promotion.

page 206

Facebook outsources much of its human moderation to a company called Accenture, which employs thousands of moderators around the world, including in countries like Portugal and Malaysia. These laborers face daily exposure to deaths on camera, recorded abuse, and child pornography. They keep the feeds clean for the rest of us, exposing themselves to psychic harm the way trash pickers digging through international electronic waste dumped in Ghana and elsewhere are exposed to poisonous chemicals. The toxic material doesn’t just magically vanish because of the mediation of the algorithm. Once again, the human labor is obscured.

page 207

Tarleton Gillespie, a technology scholar at Cornell University who is now a principal researcher for Microsoft, explained what that gap means. “Platforms have offered us up something that is meant for intimate, connected speech. Then they moderate it in a statistically broad, systematic way,” Gillespie told me. There’s little room for considering the specific nature of a given piece of content, since it lacks context and becomes atomized in the overall feed.

page 210

Algorithmic feeds accelerate these worst impulses, not just on an individual level but an aggregate one, across all the users of a social network. Titillating material—content that might be violent, provocative, or misleading—can be easier to discover than material that is more boring but also more valuable.

page 210

individualization, whether driven by our own actions or an algorithm. But that seemingly more democratic and low-hierarchy dynamic has also given us a sense that the old laws and regulations don’t apply, precisely because we can decide when to watch or listen to something and when to choose another source. We might have more independence, but we ultimately have less protection as consumers.

page 211

Rather than deciding which content is positive or negative, any kind of content that goes viral—accelerating quickly via recommendations—would be limited instead of promoted more, slowing down its spread and giving moderators more time to judge if it’s appropriate before it reaches massive audiences. Circuit breakers could also bring us back to a less globalized media ecosystem, when pieces of content stayed more firmly within their original contexts.
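
The circuit-breaker idea is content-neutral and can be sketched in a few lines: track how fast an item is spreading, and once its velocity crosses a threshold, stop amplifying it and queue it for human review. The window and threshold below are arbitrary assumptions, not values from any platform or proposal.

```python
from collections import deque

class CircuitBreaker:
    """Toy viral circuit breaker: judges velocity of spread, not content."""

    def __init__(self, threshold: int = 1000, window: int = 5):
        self.threshold = threshold            # max shares tolerated per window
        self.history = deque(maxlen=window)   # shares observed in recent ticks

    def record_tick(self, shares_this_tick: int) -> None:
        self.history.append(shares_this_tick)

    def tripped(self) -> bool:
        # Once spread accelerates past the threshold, the platform would stop
        # recommending the item and hold it for moderator review instead.
        return sum(self.history) > self.threshold

breaker = CircuitBreaker(threshold=1000, window=5)
for shares in [10, 50, 200, 400, 500]:   # hypothetical shares per hour
    breaker.record_tick(shares)
print(breaker.tripped())   # True: 1,160 shares in the window exceeds 1,000
```

Nothing in `tripped` inspects what the content says; that judgment is deferred to the moderators the breaker buys time for.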

page 212

The only way to escape the tracking was to hide from it, using an incognito web browser or faking your identity with a virtual private network. Even then, digital platforms often offered fewer features if you weren’t logged in to an account—which, of course, was continuously tracked.

page 212

In April 2016, the European Union adopted a law called the General Data Protection Regulation.

page 213

GDPR, as it’s usually abbreviated, uses the term “data subject” to describe all of us users. It means anyone who is identifiable from their data online, whether their name, location, or “one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.” Data subject might be legalese, but it’s an evocative phrase. In philosophy, a “subject” is any entity with agency and a unique personal experience. Data seems to stand in contrast to that; it’s an immaterial, inanimate thing that documents experience—a record of something that has happened. GDPR recognizes that these days, we are our data—data both documents what we have done and influences what we are able to do, or are most likely to do, in the future, oftentimes through algorithmic decisions. Thus, we should have some of the same kinds of control over it, and rights to it, that we have over our physical bodies. The law outlines a list of “data subject rights,” like fundamental human rights, that digital platforms have to comply with. The first is the right to transparency, forcing companies to respond to users who request information about how and why their data is being used in “clear and plain language.” The second ensures that users can make that request, giving them a “right of access” to information about which forms of data are collected, when tracking happens, and how long data is stored, as well as to request copies of the data itself. In 2017, Judith Duportail, a journalist working for The Guardian, used GDPR to request all of the data that Tinder had on her—which amounted to eight hundred pages, including her Facebook likes; metadata about every match and conversation she had on the platform; and over seventeen hundred messages. (She was quite a data subject.) Duportail was “amazed by how much information I was voluntarily disclosing,” she wrote. 
But she shouldn’t have been: such data is the fuel for Tinder’s product itself, and self-disclosure is the trade-off we make for automated efficiency.

page 213

Romantic matchups—another kind of algorithmic recommendation—require intimate knowledge.

page 213

“right to rectification”

page 213

“right to erasure”

page 214

The fourth group is about opting out, giving users a “right to object,” to choose to no longer be tracked.

page 215

Though these penalties pale in comparison to tech companies’ annual revenue, they do demonstrate how GDPR can provoke a certain amount of compliance.

page 215

While a bill of data rights sounds like an ideal solution to Filterworld, the law has been disappointing in other ways. All it takes is the click of an “accept all cookies” button for a user to be tracked just as much as they were before.

page 215

Frictionlessness is always the Filterworld ideal—as soon as you slow down, you might just reconsider what you’re clicking on and giving your data away.

page 215

I’m guilty of this passivity myself. When such GDPR notices began popping up on American websites, I most often just clicked to give away my data. If there was a website I liked to read—Eater, for example, the national food publication—I accepted the tracking readily, because I figured, perhaps wrongly, that I could trust the site.

page 216

In part it was out of laziness and the tricks of interface design. The opt-in button is often darker and more prominent than the opt-out, so my brain took a beat too long to understand which was which.

page 216

“This whole kind of individual responsibility type mechanism that the GDPR creates isn’t really effective,” Leerssen continued.

page 216

According to Leerssen, they are “command and control regulations, where the government is telling the industry what to do, rather than leaving it to a matter of user choice.”

page 216

The Digital Services Act, which was approved in July 2022 and goes into effect in 2024, provides for some of the same kinds of transparency and communication around recommendations that GDPR does for data: Platforms “should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritized for them.” But it also says that algorithmic feeds must be customizable, enabling users to change the balance of variables at will or choose a feed that doesn’t leverage personal data at all: “options that are not based on profiling of the recipient.”
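
That customization requirement, including the non-profiling option, amounts to letting the user choose the feed's sort key. A minimal sketch, with hypothetical post fields and ranking names of my own (nothing here is drawn from the law's text or any platform's API):

```python
from datetime import datetime

posts = [
    {"id": "a", "posted": datetime(2024, 1, 3), "engagement": 900, "match": 0.5},
    {"id": "b", "posted": datetime(2024, 1, 5), "engagement": 100, "match": 0.2},
    {"id": "c", "posted": datetime(2024, 1, 4), "engagement": 500, "match": 0.9},
]

RANKINGS = {
    # "match" stands in for a per-user relevance score derived from profiling.
    "personalized": lambda p: p["match"],
    # Popularity only: no personal data, but still engagement-driven.
    "engagement": lambda p: p["engagement"],
    # Newest first: an option "not based on profiling of the recipient."
    "chronological": lambda p: p["posted"],
}

def feed(posts: list[dict], ranking: str) -> list[str]:
    """Return post ids in the order the chosen ranking would display them."""
    return [p["id"] for p in sorted(posts, key=RANKINGS[ranking], reverse=True)]

print(feed(posts, "chronological"))  # ['b', 'c', 'a']
print(feed(posts, "engagement"))     # ['a', 'c', 'b']
print(feed(posts, "personalized"))   # ['c', 'a', 'b']
```

The same three posts surface in three different orders; under the DSA, the user, not the platform, would pick which of these orderings they see.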

page 216

The tech giants are labeled “gatekeepers.” The law bans combining data from different services operated by one company, such as Meta’s Facebook and WhatsApp, without the user’s consent. It also prohibits “self-preferencing,” the way that Google and Amazon at times promote their own products under the guise of neutral automated recommendations, as in search results—one of the tricks that reinforces the homogeneity of the current Internet.

page 217

But that option would only be available for users in the EU—because the US has been much slower to adopt such legislation. When I saw the headlines, I was jealous. It was suddenly as if only EU residents could breathe pollution-free air.

page 219

It was always a struggle to get Facebook to cooperate with Social Science One—the company’s lawyers hid behind the argument that they were violating user privacy by sharing any data with outside analysts.

page 219

Frances Haugen’s Facebook whistleblowing lit a fire under him, and he released the draft the day she testified in front of the Senate.

page 219

PATA forces social media companies to supply data to researchers, via requests that are vetted by the National Science Foundation. Companies that don’t comply would then lose the protections of Section 230 and be liable for everything on their platforms.

page 219

Platform Accountability and Transparency Act,

page 220

Unfortunately, we do not have a constitutional right to personal taste. Therefore, we also must change our own habits, becoming more aware of how we consume culture and how we can resist the passive pathways of algorithmic feeds.

page 221

There are already ways to pay artists directly for what they create online. Bandcamp serves as the digital equivalent of an indie record store for independent musicians; users can buy digital files and streaming access directly instead of Spotify mediating the exchange. Patreon enables creators to paywall anything they choose to, whether writing, images, or audio. It offers a linear feed of posts that the user subscribes to monetarily, a much more powerful relationship than a Twitter follow. Substack does the same for email newsletter subscriptions.

page 221

In 2008, the Wired editor Kevin Kelly famously wrote that a creator needs to find only “1,000 true fans” to fund their work and allow them to make a living—one thousand people who might pay them $100 a year. It’s an entirely different model from the larger digital platforms, where audiences have to be as big as possible.

page 221

The most powerful choice might be the simplest one: Stop lending your attention to platforms that exploit it.

page 221

But the more dramatic option is to log out entirely and figure out how to sustain culture offline once more.

page 222

I feel somewhat stifled online these days, in part because I can’t express myself as much as I once could, in the days of personal blogs and slowly developed conversation with other people. The templates are too restrictive, and the pace is too fast. Even though the technology was much worse two decades ago, the experience, or the ecosystem, had its advantages.

page 222

First, we need to seek out the appropriate digital structures, and then we need to carry out the daily labor of determining a new way of living online.

page 224

one form of algorithmic anxiety is about feeling misunderstood by algorithmic recommendations, another is feeling hijacked by them, feeling like you couldn’t escape them if you tried. Perhaps too much now depends on these feeds, and their influence

page 224

Algorithmic recommendations are addictive because they are always subtly confirming your own cultural, political, and social biases, warping your surroundings into a mirror image of yourself while doing the same for everyone else.

page 224

Ultimately, my sense of self was beholden to the responses I got from my invisible audiences, whose attention was algorithmically mediated, too.

page 225

Yet when I thought twice about my anxieties, I realized they were relatively

page 226

To publicly acknowledge the cleanse was to jinx it, not to mention an example of the kind of self-aggrandizement that social media encourages. No one cares when you stop tweeting; the algorithm will simply slot in the content of some more willing participant, because, in Filterworld, everyone is replaceable.

page 228

Because this system was so entrenched, publication websites had dismantled their home pages to the point that they often featured only a few stories on the screen at a time, with a maximum of images and a minimum of text. When I browsed them, I felt like an unexpected visitor, someone who wasn’t supposed to be there. The sites all but shouted: Don’t you know you’re supposed to be on Facebook or Twitter!?

page 229

In the absence of recommendations, I was left with things that I intentionally chose to consume, like email newsletters. The digital equivalent of hand-printed pamphlets, these missives offered a way to connect directly with publications or writers who I wanted to hear from—voices that I trusted.

page 230

In 1998, the Japanese artist and writer Yoshitoshi ABe helped create an anime television show called Serial Experiments Lain.

page 230

Lain, as it’s usually shortened, is a fable of life in the Internet era. In the show, a teenage girl named Lain discovers a virtual realm called “the Wired.”

page 231

These required much more labor to find and consume what you like than the frictionless avenues of algorithmic feeds. While avoiding that labor may be convenient, it also makes our personal tastes flimsier, less hard-won.

page 232

One paper described communities of consumption as a form of “mutual learning”—we collectively figure out what it is that we’re looking for and how to find it. The likes of Twitter and Facebook, with their unstable interfaces and manipulative algorithms, are less conducive to mutual learning.

page 233

The slower and more careful approach is to seek out these seams of culture yourself and chart your path, bookmarking accounts, connecting with other people interested in the same things, and comparing notes, the way I did on anime forums or the early days of Twitter, before it became too vast to maintain a grasp on. This is a more conscious and intentional form of consumption—a form that was mandatory before feeds made it so easy to outsource our choices about what to consume online. It recalls the term connoisseur. In an art history context, the descriptor dates as far back as the eighteenth century, when connoisseurship referred to amateur collectors who could tell which artist painted a work based solely on looking at it.

page 233

They sought out the artist’s signature gestures in a given work, which they had studied and cataloged. Connoisseurs developed expert knowledge, largely through the act of consumption.

page 239

Teenagers are more open to new experiences, regardless of what technology they use to consume them, and have the tendency and time to indulge obsessions, to become connoisseurs. But I’ve realized that what I appreciated so much about those online interactions is that they were built on person-to-person recommendations, not automated ones. Someone had to care enough to tell me what they liked, and I had to care enough to trust them and give it a fair try.

page 238

I must admit that my admiration for this period is driven in part by nostalgia.

page 239

In the guise of speeding it up, it actually impedes that organic development of culture and instead prioritizes flatness and sameness, the aesthetics that are the most transmissible across the networks of digital platforms. In a way, this book is an attempt to recapture recommendations from recommender systems. We should talk even more about the things we like, experience them together, and build up our own careful collections of likes and dislikes. Not for the sake of fine-tuning an algorithm, but for our collective satisfaction.

page 239

Recommending things is a professional human job, after all.

page 239

moment and expanding the boundaries of what is considered tasteful. You might find them in a boutique, at an art museum, on a radio station, or behind the scenes at a movie theater. These professional recommenders are called curators.

page 239

that we avoid homogeneity. They guide our consumption. Though the word might get overused on the Internet, what we really need is more curation—the cultivation and deployment of personal taste.

page 241

Those decades saw “the rise of the curator as creator,” as the museum-studies scholar Bruce Altshuler put it in his 1994 book The Avant-Garde in Exhibition.

page 241

In a sense, the individual star curators are the opposite of recommendation algorithms: they utilize all of their knowledge, expertise, and experience in order to determine what to show us and how to do it, with utmost sensitivity and humanity.

page 242

In the social network era, we’ve all had to curate our identities, in the sense of selecting which pieces of content best represent us on a profile page.

page 242

When we don’t have to make a selection, doing so, or knowing that someone else did, becomes a kind of luxury, albeit a pathetic one.

page 243

To determine the role of curators in the algorithmic gridlock of Filterworld, I met up with Paola Antonelli, a curator who joined the Museum of Modern Art in 1994 and is now the senior curator of its Department of Architecture and Design as well as a director of research and development. Antonelli is one of the most innovative curators of our time, and I’m lucky to have known her for more than a decade, carrying on a meandering conversation about art, design, technology, and the future of culture.

page 245

Curators must also respect their audience’s capacity to think for themselves. Antonelli tries to leave her exhibition theses open-ended, “90 percent baked.” The remaining 10 percent gives space to the audience to bring their own experience to the work, completing the idea or argument on their own. When things are too predetermined or set in a template, audiences are alienated, because they feel no agency. “I believe that my job is not to tell people what’s good and what’s bad, but rather, it’s to stimulate their own critical sense,” she continued. Just as a chef’s amuse-bouche wakes up the appetite so one can better appreciate the meal that follows, the curator’s selection stimulates our senses to consider what’s in front of us. This kind of holistic sensitivity is something an algorithmic feed is incapable of replicating.

page 258

Curation is an analog process that can’t be fully automated or scaled up the way that social network feeds have been. It ultimately comes down to humans approving, selecting, and arranging things.

page 261

Another step toward a more curated Internet is to think more carefully about the business models that drive the platforms we use.

page 261

“If you are not paying for it, you’re not the customer; you’re the product being sold.” When digital platforms are free to use and make money through advertising, content is reduced to a way of attracting attention. When you are paying directly for the content itself, however, the content is more economically sustainable and tends to have more resources invested into it, which is better for both creators and consumers.

page 265

We turn to art to seek connection, yet algorithmic feeds give us pure consumption. Truly connecting requires slowing down too much, to the point of falling out of the feed’s grip. You can’t stay in an algorithmic flow state while reading a CD booklet.

page 274

To resist Filterworld, we must become our own curators once more and take responsibility for what we’re consuming. Regaining that control isn’t so hard. You make a personal choice and begin to intentionally seek out your own cultural rabbit hole, which leads you in new directions, to yet more independent decisions. They compound over time into a sense of taste, and ultimately into a sense of self.

page 276

Culture has to follow the dominant modes of perception of a given era. While a twentieth-century building might have been designed to be photographed, the twenty-first-century work of art is “designed for reproducibility” through algorithmic feeds, like Patrick Janelle’s cortado glamour shots on Instagram or Nigel Kabvina’s cooking videos on TikTok. They each contribute and conform to a generic, flattened, reproducible aesthetic. Hence the general state of ennui and exhaustion, the sense that nothing new is forthcoming.

page 277

Like water flowing into a pot, the creative impulse changes to fit the shape of the containers that we have for it, and the most common containers now are the feeds of Facebook, Instagram, Twitter, Spotify, YouTube, and TikTok.

page 277

In terms of how culture reaches us, algorithmic recommendations have supplanted the human news editor, the retail boutique buyer, the gallery curator, the radio DJ—people whose individual taste we relied on to highlight the unusual and the innovative.

page 279

Resistance to algorithmic frictionlessness requires an act of willpower, a choice to move through the world in a different way. It doesn’t have to be a dramatic one.

page 279

It opened in 1994, not long after the first Starbucks opened in D.C., which was the chain’s first location on the East Coast. I had never been inside; the name, clip art–style logo, and dark interior were turnoffs. It wasn’t Instagrammable.

page 280

By moving away from the mindset of passive consumption and thinking about a post-algorithmic digital ecosystem, we begin to construct that alternative, demonstrating that the influence of algorithms is neither inevitable nor permanent.