content moderation


103 results

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media

by Tarleton Gillespie  · 25 Jun 2018  · 390pp  · 109,519 words

Tweets,” a recurring feature in which celebrities read aloud to the camera hateful tweets they had received. Journalists began to examine the hidden labor behind content moderation, most notably a 2014 Wired report by Adrian Chen documenting the experiences of Filipino workers who scrubbed U.S. social media platforms for dollars a

before beginning to challenge platforms for their moderation policies and their responsibility for the Internet’s many troubles, let’s start with a simple reminder. Content moderation is hard. This should be obvious, but it is easily forgotten. Moderation is hard because it is resource intensive and relentless; because it requires making

is frank or explicit but socially valuable, or material simply removed in error. One of the biggest challenges platforms face is establishing and enforcing a content moderation regime that can address both extremes simultaneously. The rules must account for the most egregious atrocities as well as material that is questionable but defensible

media platforms. To move this reconsideration forward, we need to examine the moderation apparatus that has been built over the past decade: the policies of content moderation, the sociotechnical mechanisms for its enforcement, the business expectations it must serve, the justifications articulated to support it. We need to look into why this

I would add a second set that, while they do not neatly fit the definition of platform, grapple with many of the same challenges of content moderation in platformlike ways: recommendation and rating sites like Yelp and TripAdvisor; exchange platforms that help share goods, services, funds, or labor, like Etsy, Kickstarter, Craigslist

the company). Costolo himself stepped down just a few months later. But without absolving Twitter of responsibility, it’s not just that Twitter “sucked” at content moderation. Twitter was grappling with the legacy of a particular configuration of rights and responsibilities, or a relative absence of responsibilities, that was already two decades

or behavior are undercutting the once sturdy principle of safe harbor articulated in Section 230. PLATFORMS ARE NOT LIKE TRADITIONAL INTERMEDIARIES The early logic of content moderation, and particularly the robust safe harbor protections offered to intermediaries by U.S. law, makes sense in the context of the early ideals of the

, multiple reviewers, and multiple platforms—and we have at least a fuzzy sense of the enormity of this undertaking. At this size, certain approaches to content moderation are practically impossible. For instance, there is simply too much content and activity to conduct proactive review, in which a moderator would examine each contribution

have grown in scale and ambition, traditional community management has become increasingly untenable. This means that the approaches social media platforms take, toward not just content moderation but all types of information management, are tied to this immense scale. Content is policed at scale, and most complaints are fielded at scale. More

this, some platforms and third parties are pairing automatic detection with editorial oversight, in ways that must give up some of the dreams of automating content moderation but can come closer to addressing the challenge of scale. The most effective automatic detection techniques are the ones that know what they’re looking

removal, remains seductive, even as the limitations become increasingly apparent. Platform managers find it appealing, because they want to be rid of the work that content moderation requires and because these companies are filled with engineers, who often prefer to solve social problems with smarter technology. Policy makers find automated moderation appealing

just as troubling is the presumptions it rests on—eugenics—and its promise of predetermination. What is troubling about Faception is also troubling about AI content moderation more broadly. It will always fail to anticipate cultural innovation: new words, new forms of expression, new tactics, new associations, new threats. But as we

kinds of punishments are enforced, and the philosophical approach their platforms take to governance itself. In their earliest days, many platforms did not anticipate that content moderation would be a significant problem. Some began with relatively homogenous user populations who shared values and norms with one another and with the developers—for

In fact, in the early days of a platform, it was not unusual for there to be no one in an official position to handle content moderation. Often content moderation at a platform was handled either by user support or community relations teams, generally more focused on offering users technical assistance; as a part

overseen by a few hundred, largely white, largely young, tech-savvy Californians who occupy a small and tight social and professional circle. Contract worker doing content moderation for U.S. tech companies at TaskUs, an American outsourcing tech company located in the Taguig district of Manila, Philippines. Photograph by Moises Saman

time, this process was still a secret carefully guarded by most platforms, but increasingly platform managers are sensing a need to demonstrate their commitment to content moderation. Still, platforms continue to reveal little about their process, or how big a problem it is. And third-party companies that employ these moderators are

their approach or effectiveness, these independent grassroots efforts constitute another part of the labor that, in its current incarnation, supports and executes the process of content moderation. There are moments in which platforms recognize, or are forced to acknowledge, that they need outside expertise. The array of issues platforms find themselves adjudicating

sexual content and nudity”—which end up generating different age ratings.63 THE LABOR AND THE LOGISTICS Each social media platform has cobbled together a content moderation process that draws on the labor of some combination of company employees, temporary crowdworkers, outsourced review teams, legal and expert consultants, community managers, flaggers, admins

labor raises two additional challenges: the coordination of these efforts and the translation of both concerns and criteria between the different layers. The challenge of content moderation, then, is as much about the coordination of work as it is about making judgments. First, decisions the press or disgruntled users see as mistaken

the platform themselves. Some content policy managers are hoping that more sophisticated data collection and analysis will help them better address the logistical challenges of content moderation. Platforms manage themselves through the gathering of data, and they gather an immense amount of it. This is no less true for the procedures of

judgment, conviction, and suspicion on both sides. As more and more women faced deletions and sometimes suspensions, some were “discovering” that there was in fact content moderation at work behind the platform. But this did not mean that they clearly understood how it worked. Deletions and suspensions were generally accompanied by short

Facebook breastfeeding community was able to provoke a change in the rules that acknowledged them and their values. Nearly all aspects of the process of content moderation—setting the rules, fielding complaints, making decisions, and addressing appeals—are kept entirely hidden from users. Decisions are made behind closed doors, and built into

issues we’ve seen has outstripped our existing processes for governing the community. Mark Zuckerberg, chairman and CEO of Facebook, “Building Global Community,” February 2017 Content moderation is such a complex sociotechnical undertaking that, all things considered, it’s amazing that it works at all, and as well as it does. Even

their legal liability to consider their greater obligations to the public. IMPROVING MODERATION There are many things social media companies could do to improve their content moderation: More human moderators. More expert human moderators. More diverse human moderators. More transparency in the process. Better tools for users to block bad actors. Better

function at a scale and under a set of expectations that increasingly demands automation. Yet the kinds of decisions that platforms must make, especially in content moderation, are precisely the kinds of decisions that should not be automated, and perhaps cannot be. They are judgments of value, meaning, importance, and offense. They

at the start of this chapter. Every platform promises to offer something in contrast to something else—and as such, every platform promises moderation.12 Content moderation is a key part of what social media platforms do that is different, that distinguishes them from the open web: they moderate (removal, filtering, suspension

enforcement of the implicit contract, and right now it is pushing platforms away from the safe harbors they have enjoyed.14 Rethinking content moderation might begin with this recognition, that content moderation is the essential offer platforms make, and part of how they tune the public discourse they purport to host. Platforms could be

. 37Adrian Chen, “The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed,” Wired, October 23, 2014, https://www.wired.com/2014/10/content-moderation/. 38Charlie Warzel, “‘A Honeypot For Assholes’: Inside Twitter’s 10-Year Failure to Stop Harassment,” Buzzfeed, August 11, 2016, https://www.buzzfeed.com/charliewarzel/a

Online Free Speech?”; Matamoros-Fernández, “Platformed Racism”; Milosevic, “Social Media Companies’ Cyberbullying Policies”; Obar and Wildman, “Social Media Definition and the Governance Challenge”; Roberts, “Commercial Content Moderation”; Roth, “‘No Overly Suggestive Photos of Any Kind’”; Tushnet, “Power without Responsibility.” 75Kennedy et al., “Regulation and Social Practice Online”; Lingel, Digital Countercultures and the

-facebook-reddit-censorship-free-speech; Sarah Roberts, “Social Media’s Silent Filter,” Atlantic, March 8, 2017, https://www.theatlantic.com/technology/archive/2017/03/commercial-content-moderation/518796/. 20Aarti Shahani, “From Hate Speech to Fake News: The Content Crisis Facing Mark Zuckerberg,” NPR All Tech Considered, November 17, 2016, http://www.npr

-clickworker-platform-how-half-a-million-people-are-training-ai-for-pennies-per-task/. 27Irani, “The Cultural Work of Microwork,” 726. 28Ibid., 725. 29Roberts, “Commercial Content Moderation.” 30Brad Stone, “Concern for Those Who Screen the Web for Barbarity,” New York Times, July 18, 2010, http://www.nytimes.com/2010/07/19/technology

Allows Bodily Fluids,” Gawker, February 16, 2012, http://gawker.com/5885836/. 33Gray, “Your Job Is about to Get ‘Taskified.’” 34Downey, “Making Media Work.” 35Roberts, “Content Moderation.” 36Roberts, “Commercial Content Moderation.” 37Postigo, “America Online Volunteers.” 38Dibbell, “A Rape in Cyberspace.” 39Dutton, “Network Rules of Order.” 40Bergstrom, “‘Don’t Feed the Troll’”; Lampe and Resnick, “Slash

. Cambridge: MIT Press. ———. 2015. Reading the Comments: Likers, Haters, and Manipulators at the Bottom of the Web. Cambridge: MIT Press. ROBERTS, SARAH T. 2016. “Commercial Content Moderation: Digital Laborers’ Dirty Work.” In Intersectional Internet: Race, Sex, Class and Culture Online, ed. Safiya Umoja Noble and Brendesha Tynes, 147–59. New York: Peter

Lang. http://ir.lib.uwo.ca/commpub/12/?utm_source=ir.lib.uwo.ca%2Fcommpub%2F12&utm_medium=PDF&utm_campaign=PDFCoverPages. ———. 2017. “Content Moderation.” http://escholarship.org/uc/item/7371c1hf.pdf. ROCHET, JEAN-CHARLES, AND JEAN TIROLE. 2003. “Platform Competition in Two-Sided Markets.” Journal of the European Economic

Action on Social Media Platforms.” Media and Communication 5 (3): 28–36. ———. Forthcoming. “A Conceptual Framework for the (Private) Public Sphere: Surveying User Experiences of Content Moderation on Social Media Platforms.” Manuscript submitted for publication. WOOLGAR, STEVE. 1990. “Configuring the User: The Case of Usability Trials.” Sociological Review 38 (S1). WU, TIM

) Aftenposten, (i), (ii)n17 Airtime (video chat platform), (i) algorithms: that select, curate, and recommend, (i), (ii), (iii), (iv), (v), (vi); and the automation of content moderation, (i); biases of, (i), (ii)n76 Amazon sales rankings, moderation of, (i) Android (Google), (i) Angwin, Julia, (i) anorexia. See self-harm AOL chatrooms, moderation

also pornography The Daily Dot, (i) data: user data as an economic imperative for platforms, (i), (ii), (iii), (iv), (v), (vi), (vii), (viii); data about content moderation, (i), (ii), (iii), (iv), (v), (vi), (vii); training data for automated moderation, (i); user data as a way to filter away objectionable content, (i); user

research), (i) Jobs, Steve (Apple), (i), (ii), (iii) Karp, David (Tumblr), (i) Kickstarter, (i), (ii), (iii)n5 Klonick, Kate, (i) Kwasnica, Emma, (i) labor of content moderation, (i); by flaggers, (i), (ii), (iii), (iv), (v)n43; by crowdworkers, (i), (ii), (iii); by internal policy teams, (i); psychological impact of, (i); by community

with, (i), (ii) LinkedIn (Microsoft), (i), (ii), (iii), (iv) Livejournal, (i), (ii), (iii), (iv), (v), (vi)n8 livestreaming, moderation of, (i), (ii) logistics of content moderation work, (i), (ii), (iii) machine learning, (i), (ii), (iii) MacKinnon, Rebecca, (i) magazines. See traditional media Malaby, Tom, (i) marketplace of ideas, (i), (ii) Matias

of, (i), (ii) Newsvine (news discussion platform), (i), (ii) Nexus One (Google), (i) Ning (social networking platform), (i), (ii) nipples, as a contested issue for content moderation, (i), (ii), (iii) NIPSA / Not in the Public Site Area (Flickr guideline), (i) nonconsensual pornography. See revenge porn norms: in society more broadly, (i), (ii

of, (i); ill-fit for platforms, (i), (ii); suggestions for, (i) Safesearch / safe mode, (i), (ii), (iii), (iv), (v)n78 scale, as a challenge for content moderation, (i), (ii), (iii), (iv), (v), (vi), (vii), (viii), (ix) Schiller, Phil (Apple), (i) search engines: moderation of, (i), (ii), (iii); law regarding, (i), (ii), (iii

Gilded Rage: Elon Musk and the Radicalization of Silicon Valley

by Jacob Silverman  · 9 Oct 2025  · 312pp  · 103,645 words

employees knew what they were doing. Musk took things a step further, arguing that some Twitter employees were acting with malice, especially anyone working on content moderation or trust and safety. These departments work on interlocking, perhaps intractable, problems related to free speech, harassment, privacy, platform manipulation, and disinformation operations. Some of

different political regimes and cultural contexts. That ambiguity and uncertainty didn’t fit with Musk’s militant vision of a new Twitter, where anything resembling content moderation was immediately consigned away as censorship. It also meant that people who worked in these roles were inherently suspect. Yoel Roth, Twitter’s head of

abusers, the more his following grew.17 18 Firing more than half of Twitter employees, Musk transformed how the platform operated. He chiseled away at content moderation, disbanding teams working on trust and safety, and then claimed he was doing more to protect children from sex trafficking and abusive material than the

, here, here conspiracy and Sam Bankman-Fried here, here, here, here and James Beeks here and the Oath Keepers here and Solano County here, here content moderation, Twitter here, here corporatism here, here Covello, Jim here Covid-19 here, here, here, here Craft Ventures here, here, here, here, here, here, here Crooks

Extremely Hardcore: Inside Elon Musk's Twitter

by Zoë Schiffer  · 13 Feb 2024  · 343pp  · 92,693 words

Here” 14. “Hide-and-Seek” 15. “Some Things Are Priceless” Part II HELLSITE 16. “Let That Sink In!” 17. “I Understand How Computers Work” 18. “Content Moderation Is a Product” 19. “The Bird Is Freed” 20. “We Truly Cobbled It Together” 21. “Comedy Is Now Legal on Twitter” 22. “Please Print” 23

wokeness for making the streaming service unwatchable. The stakes were even higher at Twitter. To Musk, the future of democracy was on the line. Leftist content moderation, office lattes, twenty-week parental leave, conservative “shadowbanning,” holiday breaks, regular janitorial services: all artifacts of Twitter’s woke culture that needed to be uprooted

—made him exceedingly bad at running Twitter. His impulsiveness did not play well with advertisers. His thirst for speed alarmed regulators. And his opposition to content moderation alienated regular users. Six months after the deal closed, Twitter had lost two thirds of its value and found itself in hot water with lawmakers

and Europe. As Twitter’s business went into free fall, Musk’s reputation took a hit. To many, he’d become more edgelord, less visionary. “Content moderation is really hard and apparently harder than rocket science,” noted Evelyn Douek, an assistant professor at Stanford Law School, who studies online speech regulation. This

high—and Twitter seemed partly to blame. Suddenly, working at Twitter wasn’t cool, it was embarrassing. Employees demanded that the company double down on content moderation to balance newsworthiness against the threat of potential violence. Trump wasn’t a regular user, they argued, he deserved—he required—special treatment. In 2019

his dream job at Twitter. The platform was at the height of its influence and just beginning to grapple in earnest with free speech and content moderation, particularly in parts of the world that lacked free speech protections. Roth sat at a small desk in the San Francisco office, in front of

turbocharging its potential. Anyone who worked at Twitter could have told him these goals were in direct conflict. Users might disagree about the type of content moderation they wanted to see, but few wanted no moderation whatsoever, which could devolve into a harassment-laden hellscape. More importantly, Twitter was hamstrung by outside

stores, regulators, and (in all likelihood) advertisers. Yoel Roth’s trust and safety team knew this intimately. For years, they’d been working to balance content moderation with free speech. If Twitter failed to enforce its policies, it would get a call from Apple, hinting that the company’s next app release

.” A lot of things would surprise Yue in the coming weeks. But Musk’s focus on cost cutting wasn’t one of them. CHAPTER 18 “Content Moderation Is a Product” With the deal poised to finally close, advertisers were growing increasingly concerned that a free speech absolutist was going to take one

was an advertiser’s nightmare. “Advertisers play an underappreciated role in content moderation,” says Evelyn Douek, a professor and speech regulation expert. “So much of the content moderation discourse has always been a highfalutin discussion on free speech, on safety versus voice. But content moderation is a product, and brand safety has always been a key

he wants, but the fact is that he’s been transparent about his intention to re-platform Twitter’s most toxic users and to gut content moderation,” said Andrew Graham, founder of the brand consultancy Bread & Law. “You don’t ‘debate’ whether, for example, vaccines work, or whether people of certain races

arrived at the second floor of the building, Yoni Ramon, a Tesla security engineer, stopped him. “How do I get access to Twitter’s internal content moderation systems?” Ramon asked without introduction. Roth blinked. “You don’t,” he said. “That’s not going to happen.” He explained that Twitter was operating under

was going to try to sabotage Twitter. Roth suggested a solution: restricting broad access to Twitter’s internal systems while keeping it enabled for critical content moderation functions. “OK, you’re going to tell that to Elon,” Ramon said. Moments later, Roth sat face-to-face with Musk, walking him through Twitter

was sensitive to the risks of offline violence in the Brazilian election and wanted to make sure that we didn’t mess with Twitter’s content-moderation capabilities,” Roth told Newton. “It was like a dream come true.” * * * — That night, Musk started clearing out the executive ranks. He fired Parag Agrawal; Ned

, which he described on This American Life to Casey Newton. He wouldn’t break the law, or undermine an election, or “take arbitrary or unilateral content-moderation action.” But when he finally quit on November 10, 2022, it wasn’t for anything he’d written on the list. It had only been

her the truth. It wasn’t just one thing. There were the growing concerns over the FTC, the escalating fight with Apple over Twitter’s content moderation policies, and the absurd fiasco of Twitter Blue. Irwin laid it on thick, telling Roth he was Twitter’s only hope. Roth got in his

checks & balances” Transparency around future plans that will affect user or brand safety, including changes to community guidelines and moderation policies A commitment to “effective content moderation,” and ability to enforce the platform’s rules Employees on the sales team were at a loss. Ads made up 90 percent of Twitter’s

was buying Twitter to enable freedom of speech and freedom of expression and that he wouldn’t come in and do the same sort of content moderation that was done by the old regime.” That same evening, Musk found a way to walk back the bans, asking his followers when he should

Meta, which had seen a slower increase in requests and hadn’t meaningfully upped its compliance. Instead, it appeared that while Twitter was rolling back content moderation rules in the US, it was caving to authoritarian leaders more readily than it had under the leadership of Jack Dorsey, Parag Agrawal, and Vijaya

we’ll make sure it isn’t counted.’ That’s not a right at all!” Boreing tweeted. The Daily Wire was well versed in using content-moderation disputes to its advantage. “@elonmusk is not beholden to conservatives. He has the right to run his business as he sees fit. But if Twitter

Generations: The Real Differences Between Gen Z, Millennials, Gen X, Boomers, and Silents—and What They Mean for America's Future

by Jean M. Twenge  · 25 Apr 2023  · 541pp  · 173,676 words

. After leaking a trove of documents to the Wall Street Journal, Haugen testified that Facebook regularly placed profits over safety. The company, she says, relaxed content moderation around misinformation after the 2020 election was over, likely contributing to the January 6, 2021, attempt to take over the Capitol. Facebook profits from anger

Everything for Everyone: The Radical Tradition That Is Shaping the Next Economy

by Nathan Schneider  · 10 Sep 2018  · 326pp  · 91,559 words

our relationships expect that we pay with our personal data. The internet’s so-called sharing economy requires its permanently part-time delivery drivers and content moderators to relinquish rights that used to be part of the social contracts workers could expect. Yet a real sharing economy has been at work all

Breaking Twitter: Elon Musk and the Most Controversial Corporate Takeover in History

by Ben Mezrich  · 6 Nov 2023  · 279pp  · 85,453 words

, but the philosophical direction Jack believed the company was heading. As a manager, Jack had been notoriously hands-off when it came to issues like content moderation and the banning of problematic accounts; he’d always been a free speech advocate, and especially during the tumult of the recent elections, he’d

the internet were real and many, and more often than not, inhabited by teenagers. In his work, Yoel had always been careful to emphasize that content moderation shouldn’t be governed by “dictatorial edict.” As he would later explain, in an interview with tech journalist Kara Swisher, what was important wasn’t

Yoel and his work on the safety and moderation front as integral to the platform; he’d even tweeted that he intended to launch “a content moderation council, with widely diverse viewpoints to decide on moderation and account reinstatements.” Although Jessica had only texted with Yoel briefly as she’d left the

British ad company. Robin would want to get Elon out in front of the biggest clients so that he could personally assuage their concerns about content moderation policies, about the personnel changes that were obviously coming, and about his plans for Twitter’s future. “Yes, that,” Kahill said. “But I think he

alive.” He added that if Twitter was “ruled by dictatorial edict” then there was no need for him at the company. The fact that the “Content Moderation Council” Elon had promised had never materialized was one component of his decision, but it seemed the Blue Check fiasco was Yoel’s final straw

Always Day One: How the Tech Titans Plan to Stay on Top Forever

by Alex Kantrowitz  · 6 Apr 2020  · 260pp  · 67,823 words

the integration of these new “inputs” on the ground level. Mitchell heads Facebook’s risk and response team, which works to find vulnerabilities in its content-moderation systems. Birch is a program manager in its strategic response team, which coordinates Facebook’s response to crises across divisions. And Lavin is a former

model of, 81–82 artificial intelligence/machine learning at, 75–81, 88 and Cambridge Analytica, 83, 84, 158 and congressional investigations, 83–84, 85–86 content moderation at, 77–81, 86 contractors’ wages at, 155 dominance of, 3 Engineer’s Mindset at, 16 and Facebook Groups, 69–70, 201 and Facebook Live

Elon Musk

by Walter Isaacson  · 11 Sep 2023  · 562pp  · 201,502 words

, either out of calculation or his compulsion to be brutally honest, what he really thought: that they were wrong to kick off Trump, that their content moderation policies crossed the line into unjustifiable censorship, that the staff had been infected by the woke-mind virus, that people should show up to work

do that. Instead, he was rather conciliatory on these hot-button issues. Leslie Berland, the chief marketing officer of Twitter, began with the issue of content moderation. Instead of simply invoking his mantra about the goodness of free speech, Musk went deeper and made a distinction between what people should be allowed

were rushing to close the deal. Hearing his French accent, I realized he was the same Ben—Ben San Souci—who had asked Musk about content moderation at the coffee-bar visit. An engineer by demeanor, he wasn’t a natural networker, but he was suddenly being swept into the inner circle

off. He also fired most of the human resources managers. And that was just round one in what would be a three-round bloodbath. 84 Content Moderation Twitter, October 27–30, 2022 Clockwise, from top left: With Kanye West at SpaceX; Yoel Roth; Jason Calacanis; David Sacks Council of one The musician

teaching Musk a series of lessons about the complexity of free speech and the downsides of impulsive policymaking. Alongside the layoff decisions, the issue of content moderation dominated Musk’s first week at Twitter. He had been waving the banner of free speech, but he was learning that his views were too

October, a few weeks before he was due to take over Twitter, Musk had raised in one of our conversations the idea of creating a content moderation council that would decide these issues. He wanted diverse voices on it from around the world, and he described the type of members he had

, “No, it’s not really a priority now.” Yoel Roth When Musk fired Twitter’s chief legal officer Vijaya Gadde, the task of dealing with content moderation, and the equally difficult task of dealing with Musk, fell to a somewhat academic but cheerful, fresh-faced thirty-five-year-old named Yoel Roth

secure things. At least show me what the tools look like.” Roth thought that was reasonable. He pulled out his laptop and showed Ramon the content moderation tools that Twitter used and recommended some measures they could take to guard against an insider threat. “Can you be trusted?” Ramon suddenly said, looking

. “But you should know this is a major culture war issue.” There was a lot of advertiser concern about how Musk was going to handle content moderation. “If the very first thing that he does is remove Twitter’s hateful conduct policy related to misgendering, I don’t believe it will go

to reassure advertisers who were starting to flee Twitter. “To be super clear,” he tweeted, “we have not yet made any changes to Twitter’s content moderation policies.” As he does with people he considers his inner circle, Musk began texting Roth regularly with questions and suggestions. Even when a spate of

reassuring to cajoling to threatening. “Twitter has had a massive drop in revenue, due to activist groups pressuring advertisers, even though nothing has changed with content moderation and we did everything we could to appease the activists,” he tweeted after the meetings. “They’re trying to destroy free speech in America.” Space

2–10, 2022 A presentation in the conference room James Musk, Dhaval Shroff, and Andrew Musk assessing engineers Thermonuclear Yoel Roth and most of the content moderation team had survived round one of the layoffs and firings. Given the battle against racist trolling and the revolt among advertisers, it seemed prudent not

put a warning on any offending tweet, lower its visibility, and not let it be retweeted. Musk approved. Musk then suggested an additional idea for content moderation. Twitter had a little-known feature called “Bird Watch.” It allowed users to put corrections or contextual statements on tweets they found false. Musk loved

for judicious reflection about media bias and the complexities of content moderation, except that it got caught in the vortex that these days sends people scurrying into their tribal bunkers on talk shows and social media. Musk

.” I think the second half of his sentence is more true than the first. The Twitter Files brought some transparency to how Twitter had handled content moderation, but they also showed how difficult the task can be. The FBI, for example, flagged Twitter that some accounts tweeting negatively about vaccines and Ukraine

Cybertruck in Nevada, apologized. He had forgotten that he was due to fly to New Orleans to meet with President Macron to talk about European content moderation regulations. He asked Shroff to come back that evening. As he was waiting for Macron, Musk sent Shroff texts pushing their meeting later. “I’m

Elluswamy, Tim Zaman, Phil Duan. Kate Conger, Mike Isaac, Ryan Mac, and Tiffany Hsu, “Two Weeks of Chaos,” New York Times, Nov. 11, 2022. 84. Content Moderation: Author’s interviews with Yoel Roth, David Sacks, Jason Calacanis, Elon Musk, Jared Birchall, Yoni Ramon. Cat Zakrzewski, Faiz Siddiqui, and Joseph Menn, “Musk’s

, 578–80 EM’s reconciliation with Kimbal and, 346 EM’s resistance to authority and, 379, 417–18 Tesla and, 408, 417–18, 441 Twitter content moderation and, 572–73 Cramer, Jim, 293 Crawford, Esther, 509, 540 Crider, Johnna, 290 Culture, The novels (Banks), 400 Cyberpunk video games, 310, 318, 485 Daimler

, 554 EM’s management of Twitter advertiser boycotts and, 537–38 advertisers and, 533–35, 537–38, 547, 559–60, 580 Apple and, 559–60 content moderation and, 524–31, 537, 554, 566, 567, 572–73, 574–77 desk-siding, 552 EM’s demon mode and, 537–39 EM’s management style

, 444 Roiland, Justin, 564 Romney, Mitt, 424 Rope, Keith, 82 Rosen, Harold, 125–26 Ross, Rick, 484 Roth, Yoel, 523 advertiser boycotts and, 537–38 content moderation and, 525–30, 531, 537, 567, 568, 572 departure of, 542–44, 579 EM’s attacks on, 579–80 EM’s impulsive tweets and, 533

for, 261, 420, 443, 555 EM’s joke about supporting, 462 EM’s meeting with, 261–62 Errol Musk on, 579 Nosek and, 423 Twitter content moderation and, 568 Twitter reinstatement issue, 554–55 Trump, Ivanka, 418 Tsuga, Kazuhiro, 221–22 Turing, Alan, 240, 595 Twitter AI chatbots and, 601 as AI

Flask Web Development: Developing Web Applications With Python

by Miguel Grinberg  · 12 May 2014  · 420pp  · 61,808 words

with extra powers to help keep the application running smoothly. Administrators are the best example, but in many cases middle-level power users such as content moderators exist as well. There are several ways to implement roles in an application. The appropriate method largely depends on how many roles need to be
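A common way to implement the roles this excerpt describes is with permission bit flags combined on each role. A minimal Python sketch of that pattern (the `Permission` and `Role` names are illustrative, not necessarily the book's exact API):

```python
# Role-based permissions via bit flags: each permission is a distinct
# power of two, and a role's permission set is the bitwise OR of its flags.

class Permission:
    FOLLOW = 1
    COMMENT = 2
    WRITE = 4
    MODERATE = 8   # the "middle-level power user" tier: content moderators
    ADMIN = 16


class Role:
    def __init__(self, name, permissions=0):
        self.name = name
        self.permissions = permissions

    def add_permission(self, perm):
        self.permissions |= perm

    def has_permission(self, perm):
        # True only if every bit in `perm` is set on this role
        return self.permissions & perm == perm


# Ordinary users can follow, comment, and write; moderators add MODERATE;
# administrators get everything the moderator has plus ADMIN.
user = Role("User", Permission.FOLLOW | Permission.COMMENT | Permission.WRITE)
moderator = Role("Moderator", user.permissions | Permission.MODERATE)
admin = Role("Administrator", moderator.permissions | Permission.ADMIN)
```

The appeal of bit flags is that checking a permission is a single bitwise AND, and adding a new permission later only requires defining the next unused bit, not changing the schema.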

Laziness Does Not Exist

by Devon Price  · 5 Jan 2021  · 362pp  · 87,462 words

, “The Trauma Floor: The Secret Lives of Facebook Moderators in America,” Verge, February 25, 2019, https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona. 9. “APA Stress in America Survey: US at ‘Lowest Point We Can Remember’; Future of Nation Most Commonly Reported Source

We Are the Nerds: The Birth and Tumultuous Life of Reddit, the Internet's Culture Laboratory

by Christine Lagorio-Chafkin  · 1 Oct 2018

Forward: Notes on the Future of Our Democracy

by Andrew Yang  · 15 Nov 2021

The Internet of Garbage

by Sarah Jeong  · 14 Jul 2015  · 81pp  · 24,626 words

Facebook: The Inside Story

by Steven Levy  · 25 Feb 2020  · 706pp  · 202,591 words

Likewar: The Weaponization of Social Media

by Peter Warren Singer and Emerson T. Brooking  · 15 Mar 2018

Ghost Work: How to Stop Silicon Valley From Building a New Global Underclass

by Mary L. Gray and Siddharth Suri  · 6 May 2019  · 346pp  · 97,330 words

Digital Empires: The Global Battle to Regulate Technology

by Anu Bradford  · 25 Sep 2023  · 898pp  · 236,779 words

Empire of AI: Dreams and Nightmares in Sam Altman's OpenAI

by Karen Hao  · 19 May 2025  · 660pp  · 179,531 words

Battle for the Bird: Jack Dorsey, Elon Musk, and the $44 Billion Fight for Twitter's Soul

by Kurt Wagner  · 20 Feb 2024  · 332pp  · 127,754 words

The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World

by Max Fisher  · 5 Sep 2022  · 439pp  · 131,081 words

Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination

by Mark Bergen  · 5 Sep 2022  · 642pp  · 141,888 words

Character Limit: How Elon Musk Destroyed Twitter

by Kate Conger and Ryan Mac  · 17 Sep 2024

The Science of Hate: How Prejudice Becomes Hate and What We Can Do to Stop It

by Matthew Williams  · 23 Mar 2021  · 592pp  · 125,186 words

System Error: Where Big Tech Went Wrong and How We Can Reboot

by Rob Reich, Mehran Sahami and Jeremy M. Weinstein  · 6 Sep 2021

Working in Public: The Making and Maintenance of Open Source Software

by Nadia Eghbal  · 3 Aug 2020  · 1,136pp  · 73,489 words

Coders: The Making of a New Tribe and the Remaking of the World

by Clive Thompson  · 26 Mar 2019  · 499pp  · 144,278 words

Your Computer Is on Fire

by Thomas S. Mullaney, Benjamin Peters, Mar Hicks and Kavita Philip  · 9 Mar 2021  · 661pp  · 156,009 words

Lurking: How a Person Became a User

by Joanne McNeil  · 25 Feb 2020  · 239pp  · 80,319 words

Gamification by Design: Implementing Game Mechanics in Web and Mobile Apps

by Gabe Zichermann and Christopher Cunningham  · 14 Aug 2011  · 145pp  · 40,897 words

Internet for the People: The Fight for Our Digital Future

by Ben Tarnoff  · 13 Jun 2022  · 234pp  · 67,589 words

Enshittification: Why Everything Suddenly Got Worse and What to Do About It

by Cory Doctorow  · 6 Oct 2025  · 313pp  · 94,415 words

The Constitution of Knowledge: A Defense of Truth

by Jonathan Rauch  · 21 Jun 2021  · 446pp  · 109,157 words

Nexus: A Brief History of Information Networks From the Stone Age to AI

by Yuval Noah Harari  · 9 Sep 2024  · 566pp  · 169,013 words

The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health--And How We Must Adapt

by Sinan Aral  · 14 Sep 2020  · 475pp  · 134,707 words

The Lonely Century: How Isolation Imperils Our Future

by Noreena Hertz  · 13 May 2020  · 506pp  · 133,134 words

Terms of Service: Social Media and the Price of Constant Connection

by Jacob Silverman  · 17 Mar 2015  · 527pp  · 147,690 words

Reset

by Ronald J. Deibert  · 14 Aug 2020

The Smartphone Society

by Nicole Aschoff

Superbloom: How Technologies of Connection Tear Us Apart

by Nicholas Carr  · 28 Jan 2025  · 231pp  · 85,135 words

Code Dependent: Living in the Shadow of AI

by Madhumita Murgia  · 20 Mar 2024  · 336pp  · 91,806 words

Stories Are Weapons: Psychological Warfare and the American Mind

by Annalee Newitz  · 3 Jun 2024  · 251pp  · 68,713 words

The Age of Surveillance Capitalism

by Shoshana Zuboff  · 15 Jan 2019  · 918pp  · 257,605 words

Zucked: Waking Up to the Facebook Catastrophe

by Roger McNamee  · 1 Jan 2019  · 382pp  · 105,819 words

Super Thinking: The Big Book of Mental Models

by Gabriel Weinberg and Lauren McCann  · 17 Jun 2019

Uncanny Valley: A Memoir

by Anna Wiener  · 14 Jan 2020  · 237pp  · 74,109 words

An Ugly Truth: Inside Facebook's Battle for Domination

by Sheera Frenkel and Cecilia Kang  · 12 Jul 2021  · 372pp  · 100,947 words

The Gig Economy: A Critical Introduction

by Jamie Woodcock and Mark Graham  · 17 Jan 2020  · 207pp  · 59,298 words

Off the Edge: Flat Earthers, Conspiracy Culture, and Why People Will Believe Anything

by Kelly Weill  · 22 Feb 2022

In Covid's Wake: How Our Politics Failed Us

by Stephen Macedo and Frances Lee  · 10 Mar 2025  · 393pp  · 146,371 words

Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism

by Sarah Wynn-Williams  · 11 Mar 2025  · 370pp  · 115,318 words

Futureproof: 9 Rules for Humans in the Age of Automation

by Kevin Roose  · 9 Mar 2021  · 208pp  · 57,602 words

Brotopia: Breaking Up the Boys' Club of Silicon Valley

by Emily Chang  · 6 Feb 2018  · 334pp  · 104,382 words

The Age of AI: And Our Human Future

by Henry A Kissinger, Eric Schmidt and Daniel Huttenlocher  · 2 Nov 2021  · 194pp  · 57,434 words

Algorithms of Oppression: How Search Engines Reinforce Racism

by Safiya Umoja Noble  · 8 Jan 2018  · 290pp  · 73,000 words

The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future

by Orly Lobel  · 17 Oct 2022  · 370pp  · 112,809 words

For Profit: A History of Corporations

by William Magnuson  · 8 Nov 2022  · 356pp  · 116,083 words

Sandy Hook: An American Tragedy and the Battle for Truth

by Elizabeth Williamson  · 8 Mar 2022  · 574pp  · 148,233 words

Broad Band: The Untold Story of the Women Who Made the Internet

by Claire L. Evans  · 6 Mar 2018  · 371pp  · 93,570 words

Death Glitch: How Techno-Solutionism Fails Us in This Life and Beyond

by Tamara Kneese  · 14 Aug 2023  · 284pp  · 75,744 words

New Laws of Robotics: Defending Human Expertise in the Age of AI

by Frank Pasquale  · 14 May 2020  · 1,172pp  · 114,305 words

Algospeak: How Social Media Is Transforming the Future of Language

by Adam Aleksic  · 15 Jul 2025  · 278pp  · 71,701 words

Blood in the Machine: The Origins of the Rebellion Against Big Tech

by Brian Merchant  · 25 Sep 2023  · 524pp  · 154,652 words

Dangerous Ideas: A Brief History of Censorship in the West, From the Ancients to Fake News

by Eric Berkowitz  · 3 May 2021  · 412pp  · 115,048 words

Filterworld: How Algorithms Flattened Culture

by Kyle Chayka  · 15 Jan 2024  · 321pp  · 105,480 words

Four Battlegrounds

by Paul Scharre  · 18 Jan 2023

Doppelganger: A Trip Into the Mirror World

by Naomi Klein  · 11 Sep 2023

Them and Us: How Immigrants and Locals Can Thrive Together

by Philippe Legrain  · 14 Oct 2020  · 521pp  · 110,286 words

The Business of Platforms: Strategy in the Age of Digital Competition, Innovation, and Power

by Michael A. Cusumano, Annabelle Gawer and David B. Yoffie  · 6 May 2019  · 328pp  · 84,682 words

The Optimist: Sam Altman, OpenAI, and the Race to Invent the Future

by Keach Hagey  · 19 May 2025  · 439pp  · 125,379 words

Amateurs!: How We Built Internet Culture and Why It Matters

by Joanna Walsh  · 22 Sep 2025  · 255pp  · 80,203 words

How to Stand Up to a Dictator

by Maria Ressa  · 19 Oct 2022

Care: The Highest Stage of Capitalism

by Premilla Nadasen  · 10 Oct 2023  · 288pp  · 82,972 words

No Filter: The Inside Story of Instagram

by Sarah Frier  · 13 Apr 2020  · 484pp  · 114,613 words

Woke, Inc: Inside Corporate America's Social Justice Scam

by Vivek Ramaswamy  · 16 Aug 2021  · 344pp  · 104,522 words

We Are Bellingcat: Global Crime, Online Sleuths, and the Bold Future of News

by Eliot Higgins  · 2 Mar 2021  · 277pp  · 70,506 words

Nothing but Net: 10 Timeless Stock-Picking Lessons From One of Wall Street’s Top Tech Analysts

by Mark Mahaney  · 9 Nov 2021  · 311pp  · 90,172 words

Twitter and Tear Gas: The Power and Fragility of Networked Protest

by Zeynep Tufekci  · 14 May 2017  · 444pp  · 130,646 words

Badvertising

by Andrew Simms  · 314pp  · 81,529 words

Consent of the Networked: The Worldwide Struggle for Internet Freedom

by Rebecca MacKinnon  · 31 Jan 2012  · 390pp  · 96,624 words

Superminds: The Surprising Power of People and Computers Thinking Together

by Thomas W. Malone  · 14 May 2018  · 344pp  · 104,077 words

Platform Capitalism

by Nick Srnicek  · 22 Dec 2016  · 116pp  · 31,356 words

The Metaverse: And How It Will Revolutionize Everything

by Matthew Ball  · 18 Jul 2022  · 412pp  · 116,685 words

Attention Factory: The Story of TikTok and China's ByteDance

by Matthew Brennan  · 9 Oct 2020  · 282pp  · 63,385 words

Rationality: What It Is, Why It Seems Scarce, Why It Matters

by Steven Pinker  · 14 Oct 2021  · 533pp  · 125,495 words

Humans as a Service: The Promise and Perils of Work in the Gig Economy

by Jeremias Prassl  · 7 May 2018  · 491pp  · 77,650 words

The Autonomous Revolution: Reclaiming the Future We’ve Sold to Machines

by William Davidow and Michael Malone  · 18 Feb 2020  · 304pp  · 80,143 words

Genius Makers: The Mavericks Who Brought A. I. To Google, Facebook, and the World

by Cade Metz  · 15 Mar 2021  · 414pp  · 109,622 words

Ways of Being: Beyond Human Intelligence

by James Bridle  · 6 Apr 2022  · 502pp  · 132,062 words

American Marxism

by Mark R. Levin  · 12 Jul 2021  · 314pp  · 88,524 words

Searches: Selfhood in the Digital Age

by Vauhini Vara  · 8 Apr 2025  · 301pp  · 105,209 words

Exponential: How Accelerating Technology Is Leaving Us Behind and What to Do About It

by Azeem Azhar  · 6 Sep 2021  · 447pp  · 111,991 words

The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism

by Nick Couldry and Ulises A. Mejias  · 19 Aug 2019  · 458pp  · 116,832 words

The Measure of Progress: Counting What Really Matters

by Diane Coyle  · 15 Apr 2025  · 321pp  · 112,477 words

Vassal State

by Angus Hanton  · 25 Mar 2024  · 277pp  · 81,718 words

Lab Rats: How Silicon Valley Made Work Miserable for the Rest of Us

by Dan Lyons  · 22 Oct 2018  · 252pp  · 78,780 words

AI 2041: Ten Visions for Our Future

by Kai-Fu Lee and Qiufan Chen  · 13 Sep 2021

The Big Fix: How Companies Capture Markets and Harm Canadians

by Denise Hearn and Vass Bednar  · 14 Oct 2024  · 175pp  · 46,192 words

Beautiful Data: The Stories Behind Elegant Data Solutions

by Toby Segaran and Jeff Hammerbacher  · 1 Jul 2009

The Wires of War: Technology and the Global Struggle for Power

by Jacob Helberg  · 11 Oct 2021  · 521pp  · 118,183 words

Walled Culture: How Big Content Uses Technology and the Law to Lock Down Culture and Keep Creators Poor

by Glyn Moody  · 26 Sep 2022  · 295pp  · 66,912 words

How to Do Nothing

by Jenny Odell  · 8 Apr 2019  · 243pp  · 76,686 words

Artificial Unintelligence: How Computers Misunderstand the World

by Meredith Broussard  · 19 Apr 2018  · 245pp  · 83,272 words

Ours to Hack and to Own: The Rise of Platform Cooperativism, a New Vision for the Future of Work and a Fairer Internet

by Trebor Scholz and Nathan Schneider  · 14 Aug 2017  · 237pp  · 67,154 words