Section 230


description: US legislation governing liability for content on Internet sites

50 results

pages: 898 words: 236,779

Digital Empires: The Global Battle to Regulate Technology
by Anu Bradford
Published 25 Sep 2023

Tech Companies Protected by the Courts: Judicial Interpretation of Section 230

While Congress played a key role in setting tech companies free by legislating Section 230, the US judiciary has been crucial in protecting, and even expanding, the freedoms embedded in that legislation. US courts have upheld both the content and the rationale behind Section 230’s two-way liability shield, including in a famous case, Zeran v. America Online, Inc., decided by a US federal appeals court in 1997.69 The Zeran case concerned AOL’s liability over defamatory statements a third party had posted on AOL’s message board. The court upheld AOL’s Section 230 immunity, highlighting the need to protect “freedom of speech in the new and burgeoning internet medium” and noting how imposing liability on AOL would constitute “intrusive government regulation of speech.”70 In interpreting Section 230 immunity and the congressional intent behind it, the court emphasized two key objectives: first, Section 230 incentivizes the platforms to filter obscenity and other offensive material without the fear of liability.

Post (Jan. 18, 2021), https://www.washingtonpost.com/politics/2021/01/18/biden-section-230/. 119.See Eric Johnson, Nancy Pelosi Says Trump’s Tweets “Cheapened the Presidency”—and the Media Encourages Him, Vox (Apr. 12, 2019), https://www.vox.com/2019/4/12/18307957/nancy-pelosi-donald-trump-twitter-tweet-cheap-freak-presidency-kara-swisher-decode-podcast-interview (addressing Rep. Pelosi’s perspective on Section 230). 120.See David Morar & Chris Riley, A Guide for Conceptualizing the Debate Over Section 230, Brookings Inst. (Apr. 9, 2021), https://www.brookings.edu/techstream/a-guide-for-conceptualizing-the-debate-over-section-230/. 121.Jerrold Nadler, Chairman, Comm. on the Judiciary, Investigation of Competition in Digital Markets 133 (2020). 122.Press Release, Jerrold Nadler, Chairman, H.

shareType=nongift. 168.See generally Tim Wu, The Curse of Bigness: Antitrust in the New Gilded Age (2018). 169.Carrie Goldberg, Herrick v. Grindr: Why Section 230 of the Communications Decency Act Must Be Fixed, Lawfare (Aug. 14, 2019), https://www.lawfareblog.com/herrick-v-grindr-why-section-230-communications-decency-act-must-be-fixed. 170.Herrick v. Grindr LLC, 306 F. Supp. 3d 579 (S.D.N.Y. 2018), aff’d, 765 F. App’x 586 (2d Cir. 2019), cert. denied, 140 S. Ct. 221 (2019); Goldberg, supra note 169. 171.Doe v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016), cert. denied, 2017 WL 69715 (No. 16-276). 172.Press Release, Xavier Becerra, Attorney General of California, Attorney General Becerra Announces Shutdown of World’s Largest Online Sex Trafficking Website, Backpage.com (Apr. 9, 2018), https://oag.ca.gov/news/press-releases/attorney-general-becerra-announces-shutdown-world%E2%80%99s-largest-online-sex. 173.Citron & Wittes, supra note 58, at 453. 174.Cass Sunstein, Is Social Media Good or Bad for Democracy?

pages: 344 words: 104,522

Woke, Inc: Inside Corporate America's Social Justice Scam
by Vivek Ramaswamy
Published 16 Aug 2021

Those who are familiar with this law in Silicon Valley generally agree that companies like Google, Facebook, and Twitter would have never become large-scale behemoths without Section 230. The irony is that they generally cite that fact as an argument in favor of Section 230, not an argument against it. Before going further, it’s worth pausing to note what Section 230 actually says. The law has two key provisions. Section 230(c)(1) says that platforms are not to be “treated as the publisher or speaker” of any information provided by their users. This means that if someone tweets something disparaging about you on Twitter, you can sue that person, but you can’t sue Twitter. Section 230(c)(2) is the so-called “Good Samaritan” provision, which immunizes platforms from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”32 In a nutshell, it’s this second part that gives social media companies the power to censor material on their sites.

He posted the following on Twitter, ironically enough: “It’s pretty simple: if Twitter and Google and the rest are going to editorialize and censor and act like traditional publishers, they should be treated like traditional publishers and stop receiving the special carve out from the federal government in Section 230.”34 Interestingly, the movement to scrap Section 230 has become increasingly bipartisan, with even President Joe Biden suggesting a repeal of the statute.35 Silicon Valley’s defenders of Section 230 claim that social media companies like Facebook and Twitter could have never gotten off the ground in a big way if they were also responsible for fighting lawsuits from angry users who sue those companies for being defamed on their sites.

In an ultimate irony, Jack Dorsey in his Senate testimony invoked that very fact in arguing against the repeal of Section 230. He warned that if Section 230 were to disappear, then only massive tech companies capable of shouldering the financial burdens associated with new legal liabilities would continue to exist, choking out smaller companies and startups that try to compete. It would leave “only a small number of giant and well-funded technology companies,” he testified.36 Dorsey’s lack of self-awareness is laughable: the only way that the current Silicon Valley behemoths ever became behemoths was due to the existence of Section 230 in the first place. But this is at best a lesson for the future.

pages: 390 words: 109,519

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media
by Tarleton Gillespie
Published 25 Jun 2018

There are many who, even now, strongly defend Section 230. The “permissionless innovation” it provides arguably made the development of the web, and contemporary Silicon Valley, possible; some see it as essential for that to continue.74 As David Post remarked, “No other sentence in the U.S. Code, I would assert, has been responsible for the creation of more value than that one.”75 But among defenders of Section 230, there is a tendency to paint even the smallest reconsideration as if it would lead to the shuttering of the Internet, the end of digital culture, and the collapse of the sharing economy. Without Section 230 in place, some say, the risk of liability will drive platforms to either remove everything that seems the slightest bit risky, or turn a blind eye.

In a phrase common to their terms of service agreements (and many advantageous legal contracts), social media platforms can claim “the right but not the responsibility” to remove users and delete content. This is classic legal language, designed to protect a provider from as much liability as possible while ensuring it the most discretionary power.33 But the phrase captures the enviable position that the Section 230 safe harbor offers. And it is an apt description for the legal and cultural standing that platforms have enjoyed since, particularly in the United States. Section 230 extends a legislative distinction common to U.S. telecommunication law between publishers that provide information (and therefore can be held liable for it) and distributors that merely circulate the information of others (and thus should not be held liable)—known as the “content/conduit” distinction.34 Since ISPs offered “access” to the Internet, and did not produce the content they help circulate, the law prioritized the free movement of information, and limited ISPs’ liability for the content users circulated through them.35 As with telephone systems, holding an intermediary liable for what users say or do might be an incentive to monitor users proactively and shut down anything that looked risky.

If social media platforms are neither conduit nor content, then legal arrangements premised on those categories may be insufficient. One possibility would be to recommit to, even double down on, Section 230, but with a sober and unflinching eye for which platforms, or aspects of platforms, warrant it and which exceed it. If a platform offers to connect you to friends or followers, and deliver what they say to you and what you say to them, then it is a conduit. This would enjoy Section 230 safe harbor, and could include the good faith moderation that safe harbor anticipated. But the moment that a platform begins to select some content over others, based not on a judgment of relevance to a search query but in the spirit of enhancing the value of the experience and keeping users on the site, it has become a hybrid.

pages: 321 words: 105,480

Filterworld: How Algorithms Flattened Culture
by Kyle Chayka
Published 15 Jan 2024

They can do that because of the United States’s 1996 Telecommunications Act, which included the Communications Decency Act and, within it, a provision called Section 230. Section 230 allowed the Internet to grow exponentially over the past decades. Our digital world wouldn’t be the same without it. But in the social media era, it has also allowed the tech companies that have supplanted traditional media businesses to operate without the safeguards of traditional media. Section 230 makes a distinction between an open platform, like Facebook, and what users publish on it. “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” the law states.

“If that rule was going to take hold, then the internet would become the Wild West and nobody would have any incentive to keep the internet civil,” Cox later told Wired. Cox and Wyden’s Section 230 allowed digital platforms to mediate some content, particularly anything “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable,” without being held liable for its publication. In other words, the online distributors could “in good faith” interfere with content as long as it seemed to be for the general good of the users. Section 230 was signed into law by President Bill Clinton in early 1996. Yet mainstream social media in the 2010s bears a very different relationship to Section 230. In 1996, the Internet was still a relatively niche experience for its sixteen million or so users.

Meanwhile, digital platforms could claim they were not media companies at all with the excuse of Section 230. The separations between roles blurred. In this new ecosystem, digital platforms took on some of the functions of publishers in deciding which content reached consumers. If CompuServe could claim neutrality in its court case, since it didn’t influence what Rumorville published, Facebook has much less of a semblance of neutrality. The algorithm’s curatorial actions are akin to a newspaper choosing which stories to put on the front page. Section 230 has served as a shield, distancing social networks from what individual users post on their platforms.

pages: 412 words: 115,048

Dangerous Ideas: A Brief History of Censorship in the West, From the Ancients to Fake News
by Eric Berkowitz
Published 3 May 2021

An example is the controversy over social media censorship. While Donald Trump’s attacks on Section 230 of the Communications Decency Act, one of the pillars of speech protections on social media, lost momentum with his election loss, the issues exploded anew in January 2021, when Twitter banned him, along with seventy thousand QAnon accounts, following a pro-Trump attack on the US Capitol building. The immediate results were a drop in online misinformation, the migration of extremist content to the web’s darker reaches, and bitter criticism of “Big Tech” and Section 230 from almost all corners. But even if Section 230 survives intact in the short term, the push to substantially restrict online speech may well bear fruit.

The First Amendment was already a high barrier against most government incursions, but in 1996, Congress built another rampart with Section 230 of the Communications Decency Act, a statute giving Internet companies and social media platforms additional and sometimes overlapping protections against liability. The law immunizes website operators from lawsuits for ads and most user-generated content, from political diatribes and videos of police brutality to vicious reviews of restaurants and plumbers. Thanks to Section 230, platforms can moderate their sites without the risk of being called to account for their users’ and advertisers’ false, hateful, or defamatory posts, and also amplify or take down posts, or even terminate user accounts, without legal exposure.

Congress—intent on fostering a “vibrant and competitive free market” for the Internet “unfettered by . . . regulation,” and worried that rulings such as the one against Prodigy would create a disincentive for the nascent industry to remove harmful content—adopted Section 230.76 Under cover of this statute, sites have been able to manage online operations without having to evaluate every one of the countless posts that appear each day, while at the same time amplifying and profiting from those posts, even the hateful and obnoxious ones, with strategic ad placements. The courts have applied Section 230 widely, striking down challenges both when sites remove user content and when they leave it up. In 2019, for example, a federal appellate court in New York held that the law even bars claims of civil terrorism.

pages: 309 words: 81,243

The Authoritarian Moment: How the Left Weaponized America's Institutions Against Dissent
by Ben Shapiro
Published 26 Jul 2021

The New York Times, for example, can be held liable as a publisher for information appearing in its pages. The New York Times’ comments section, however, does not create liability—if a user posts defamatory material in the comments, the Times does not suddenly become responsible. The purpose of Section 230, then, was to open up conversation by shielding online platforms from legal liability for third parties posting content. Section 230 itself states as much: the goal of the section is to strengthen the internet as “a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.”15 As the Electronic Frontier Foundation describes, “This legal and policy framework has allowed for YouTube and Vimeo users to upload their own videos, Amazon and Yelp to offer countless user reviews, craigslist to host classified ads, and Facebook and Twitter to offer social networking to hundreds of millions of Internet users.”16 There is one problem, however: the stark divide between platforms for third-party content and publishers who select their content begins to erode when platforms restrict the content third parties can post.

Thus, for example, a New York court found in 1995 that Prodigy, a web services company with a public bulletin board, became a publisher when it moderated that board for “offensiveness and ‘bad taste.’”17 In reaction, Section 230 created an extremely broad carve-out for platforms to remove offending content; bipartisan legislators wanted to protect platforms from liability just for curating content in order to avoid seamy or ugly content. Thus Section 230 provides that no platform shall incur liability based on “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”18 At the beginning, our major social media companies understood full well the intent behind Section 230.

Now social media are quickly becoming less like open meeting places and more like the town squares in Puritan New England circa 1720: less free exchange of ideas, more mobs and stocks. The saga of the social media platforms begins with the implementation of the much-maligned and misunderstood Section 230 of the Communications Decency Act in 1996. The section was designed to distinguish between material for which online platforms could be held responsible and material for which they could not. The most essential part of the law reads, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

pages: 661 words: 156,009

Your Computer Is on Fire
by Thomas S. Mullaney , Benjamin Peters , Mar Hicks and Kavita Philip
Published 9 Mar 2021

Unlike publications like newspapers that are accountable for the content they print, online services would be relieved of this liability. Section 230 had two purposes: the first was to “encourage the unfettered and unregulated development of free speech on the Internet,” as one judge put it; the other was to allow online services to implement their own standards for policing content and provide for child safety.24 Yet not all agree that Section 230 has been a harmless tool for self-expression. As legal scholar Mary Graw Leary describes, just who receives the greatest benefit from Section 230 may depend on whom you ask or who you are. She contextualizes Section 230 as having primarily benefited the tech companies themselves, explaining, “In Reno v.

Unlike television and radio broadcasters in the United States, who can be held liable and fined for the material they allow to pass over their channels, social media firms have been traditionally afforded a very specific legal status that both largely grants them immunity in terms of responsibility for the content they host and disseminate and also affords them the discretion to intervene upon that content as they see fit.22 This particular legal status of “internet intermediary” dates to the Communications Decency Act of 1996 and to its Section 230, in particular.23 Specifically, this portion of the CDA has set the tone for how internet intermediaries such as ISPs, search engines, and now social media platforms have been defined in the law for liability purposes. The Electronic Frontier Foundation, a major supporter of Section 230, describes its history as such: Worried about the future of free speech online and responding directly to Stratton Oakmont, Representatives Chris Cox (R-CA) and Ron Wyden (D-OR) introduced an amendment to the Communications Decency Act that would end up becoming Section 230. The amendment specifically made sure that “providers of an interactive computer service” would not be treated as publishers of third-party content.

Jeff Kosseff, The Twenty-Six Words That Created the Internet (Ithaca, NY: Cornell University Press, 2019). For an extended treatment of the origin and impact of Section 230, see Kosseff’s monograph on the subject, which he calls a “biography” of the statute. 24. Electronic Frontier Foundation, “CDA 230: Legislative History” (September 18, 2012), https://www.eff.org/issues/cda230/legislative-history. 25. Mary Graw Leary, “The Indecency and Injustice of Section 230 of the Communications Decency Act,” Harvard Journal of Law & Public Policy 41, no. 2 (2018): 559. 26. Leary, “The Indecency and Injustice of Section 230 of the Communications Decency Act,” 559. 27. Kate Klonick, “The New Governors: The People, Rules, and Processes Governing Online Speech,” Harvard Law Review 131 (2018): 1598–1670. 28.

pages: 475 words: 134,707

The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health--And How We Must Adapt
by Sinan Aral
Published 14 Sep 2020

Senate that would require the Federal Trade Commission, a five-person commission of presidential appointees, to certify the political neutrality of a platform before it received Section 230 protections under the CDA. Such a measure would impose government oversight over social media moderation by requiring platforms to appease the speech requirements of the commission to avoid the civil litigation that Section 230 shields them from. If Congress does not act to reform Section 230 to protect conservative voices, President Trump has signaled he will issue an executive order directing the Federal Communications Commission to clarify when Section 230 applies, in an attempt to police social media moderation and curb this perceived anticonservative bias.

Recognizing the need to incentivize platforms to moderate content, Section 230 provided platforms the protections they needed to make tough content-moderation decisions without the fear of civil liability. When we understand this history, it becomes clear how Section 230 helps maintain free speech and the quality of our communication ecosystem. If the platforms were liable for the harm done by any of their 3 billion users across trillions of daily messages, then social media and many other Internet services, including Wikipedia, and the commenting sections of many newspapers would likely become unworkable overnight. In this sense, Section 230 makes social media platforms, commenting on online newspapers, and even Wikipedia possible.

And while users of the banned subreddits migrated to other subreddits, “the migrants did not bring hate speech with them to their new communities, nor did the longtime residents [of those communities] pick it up from them. Reddit did not ‘spread the infection.’” Both legislation and content moderation by the platforms seem to work. So how can we incentivize the appropriate actors to act? Much of the discussion in the United States centers on Section 230 of the Communications Decency Act (CDA). Section 230 establishes broad immunity for social media platforms and other “interactive computer services” from civil liability based on what users post. Some advocates incorrectly interpret this law as absolving social media platforms of a responsibility to moderate user-generated content.

Reset
by Ronald J. Deibert
Published 14 Aug 2020

One cannot speak about regulating social media without considering section 230 of the U.S. Communications Decency Act. This legislation protects online intermediaries that host or republish speech from being held liable for what users say and do over their platforms, in the way that more traditional publishers are. Section 230 was not only critical to the early development of the internet, it has helped maximize innovation and online free expression. With the exception of certain criminal and intellectual property claims, section 230 ensures that the platforms are not subject to endless and potentially chilling lawsuits.

With the exception of certain criminal and intellectual property claims, section 230 ensures that the platforms are not subject to endless and potentially chilling lawsuits. Now that the platforms have matured and routinely manipulate their content, some commentators have called for additional exceptions to section 230’s immunity clauses.446 To be sure, the prospect of revising section 230 is daunting; it would be extraordinarily complex legislatively and would entail huge potential pitfalls. (To wit: in May 2020, Donald Trump publicly advocated abolishing section 230 entirely as part of a personal vendetta against Twitter for flagging his tweets.) Not only are liability protections critical to ensuring free speech online, they reduce barriers for new entrants to the online space, which is essential for competition.

Retrieved from https://datasociety.net/library/securitize-counter-securitize/ Some commentators have called for additional exceptions to Section 230’s immunity clauses: Sylvain, O. (2018, April 1). Discriminatory designs on user data. Retrieved from https://knightcolumbia.org/content/discriminatory-designs-user-data; Citron, D. K., & Penney, J. (2019, January 2). When law frees us to speak. Fordham Law Review, 87(6). Retrieved from https://ssrn.com/abstract=3309227; Citron, D.K., & Wittes, B. (2018). The problem isn’t just backpage: Revising Section 230 immunity. Georgetown Law Tech Review, 453. Retrieved from https://georgetownlawtechreview.org/wp-content/uploads/2018/07/2.2-Citron-Wittes-453-73.pdf; The most accessible and comprehensive account of Section 230 of the CDA is Kosseff, J. (2019).

pages: 574 words: 148,233

Sandy Hook: An American Tragedy and the Battle for Truth
by Elizabeth Williamson
Published 8 Mar 2022

That might sound good, but it’s based on a false reading of the First Amendment: Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances. The First Amendment shields Americans’ freedom of speech against government interference, not Facebook interference. As private businesses, social media companies make their own rules about what users can and can’t say on their platforms. But the law, namely Section 230 of the Communications Decency Act of 1996, protects them from liability when they fail to enforce them. Newspapers, television, and news websites enjoy First Amendment protection for the content they publish and air. But the same body of law holds them responsible for its truth. If they knowingly or recklessly spread falsehoods that defame individuals or businesses, they can be sued in the states, whose definitions of “defamation” differ.

Although Facebook, Twitter, Google, and YouTube are fast becoming most Americans’ main source of news and information, federal law protects them from being sued for any defamatory content they distribute.[11] In 1996, Congress recognized the internet as an extraordinary information and educational resource for Americans, and its potential as “a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.” So they tried to protect the internet from excessive government regulation, giving it nearly unfettered possibilities for growth. Section 230 immunizes social platforms from liability by treating them not as publishers but as mere pipelines for user-created content.[12] Here’s the relevant part: No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

Free speech principles, they claim, compel them to surrender platforms connecting one-third of the world to democracy’s worst actors. The argument betrays either shocking ignorance of the Constitution, monstrous cynicism, or both. The companies have spent billions on lobbying and marketing campaigns aimed at preserving their Section 230 immunity. Reminding Congress of 230’s original intent, they rhapsodize about their good deeds. Mark Zuckerberg pivots to the social justice movements and marriages fostered by Facebook when asked about its role in enabling Russian election interference, the neo-Nazis gathering on Facebook groups, and the murders streamed on Facebook Live.

pages: 265 words: 69,310

What's Yours Is Mine: Against the Sharing Economy
by Tom Slee
Published 18 Nov 2015

As Forbes’ Jeff Bercovici reported, “Uber likes this system because it enjoys being able to say all of its drivers have near-perfect ratings. But it’s a harsh one for drivers, and also for ­customers, who find themselves repeatedly forced to choose between guilt, spite and ignorance.” While Uber dictates the behavior of its drivers in more and more specific ways, it still takes none of the responsibility when things go wrong. Section 230 of the “Communications Decency Act” may seem like an odd law to protect the company, but here’s how it works. The law was initially introduced to say that blogs and other user-content sites such as YouTube were not responsible for content posted by their users. Fair enough. But now Uber says it’s not a taxi company, it just runs a web site and an app, and puts drivers in touch with riders.

Uber’s ability to provide value to its consumers comes not only from its technology but also, as we saw in Chapter 4, from its ability to externalize costs. Beyond this, the company keeps costs down by running at a loss in order to foster growth. It’s a common thread throughout the Sharing Economy. Section 230 of the Communications Decency Act says that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”4 The law means that bloggers are not legally responsible for what commenters write on their site, that YouTube is not legally responsible for videos that users upload, Facebook is not responsible for what its users post, and so on.

But the law has repeatedly been interpreted broadly, extending protection to all kinds of online platforms,5 and Sharing Economy companies have argued that the law means that they are not responsible for the actions of their service providers, or for what goes on between service providers and customers. Presenting themselves as marketplaces, as technology companies, and not as service providers, allows the claim that Section 230 applies to them. Taxi companies may be responsible for taxi rides, but if you’re a ride-sharing provider you don’t want that expense; hotels and B&Bs may be responsible for what happens to their guests, but if you’re Airbnb you don’t want that kind of liability. It may seem far-fetched that a company like Uber, which is now experimenting with taking roughly 30% of the fare for each ride and which includes a $1 “safety fee” in its price, has no responsibility when things go wrong on that ride, and when Uber was designated as a Transportation Network Company in California, the question was left open.6 But for Sharing Economy companies Section 230 is a good start until courts show otherwise.

pages: 642 words: 141,888

Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination
by Mark Bergen
Published 5 Sep 2022

“They better be careful,” the president huffed in August, “because they can’t do that to people.” Trump allies attacked Silicon Valley platforms for abusing their protected status under Section 230, the law that shielded user-generated websites from liability. Ted Cruz, a Texas Republican, berated Mark Zuckerberg in a hearing for failing to operate Facebook as a “neutral public forum,” as Section 230 dictated. In fact, the law did not dictate this, but this blustering threat still worked. Within Google, members of its policy team were instructed to be extra cautious about anything Section 230 related and anything that might make Google appear as if it were taking on the role of a publisher.

YouTube Helped,” The New York Times, October 23, 2017, https://www.nytimes.com/2017/10/23/technology/youtube-russia-rt.html. GO TO NOTE REFERENCE IN TEXT the law did not dictate this: Catherine Padhi, “Ted Cruz vs. Section 230: Misrepresenting the Communications Decency Act,” Lawfare, April 20, 2018, https://www.lawfareblog.com/ted-cruz-vs-section-230-misrepresenting-communications-decency-act. GO TO NOTE REFERENCE IN TEXT blustering threat still worked: The law actually dictates that websites are clear to make “good faith” efforts to restrict material considered “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”

Religious right scolds wanted to wipe out sex and other ills, free-market champions wanted fewer impediments to commerce, and media lobbyists wanted protection from intellectual property theft. The resulting flurry of muddled laws included two whoppers that defined the modern internet. In 1996 the Communications Decency Act went after “obscene and indecent” materials online and included a short provision, Section 230, that gave websites permission to remove smut and shielded them from liability for posts users wrote. In 1998 the Digital Millennium Copyright Act (DMCA) provided ways for owners of intellectual property, like songs and movies, to claim rights online. In theory, the laws protected websites from lawsuits and copyright trouble.

pages: 270 words: 79,180

The Middleman Economy: How Brokers, Agents, Dealers, and Everyday Matchmakers Create Value and Profit
by Marina Krakovsky
Published 14 Sep 2015

A little legal backstory will help. In the United States, a website cannot generally be held liable for the actions of its users, whether or not there are consequences beyond the site; the reason is Section 230, an influential provision of the Communications Decency Act of 1996. Most laws impose greater legal obligations on businesses, says Eric Goldman, a law professor at Santa Clara University who has published widely on Internet governance—but Section 230 says that Internet businesses can maintain their immunity from liability even if they do nothing to police users. That’s because lawmakers didn’t want to stifle technical innovation, and they also didn’t want to create a perverse incentive for websites to do too little screening and policing.

Without legal immunity, Goldman explains, “Congress was concerned that websites would feel like if they tried to help and failed, they would face more liability than if they didn’t try to help.” The goal of Section 230, therefore, was to get websites to do more policing. That way, if you’re a website and you fail in your efforts, “you’re not on the hook for failure, and the idea is that websites would do more.”36 Section 230 appears to have worked as intended: we have indeed seen impressive leaps of innovation among Internet businesses in recent years. Strong reputation systems that keep users accountable have become standard among the most successful middleman businesses—think Airbnb, eBay, TaskRabbit, even Uber; these systems have raised the bar for other middlemen.

See Paul A. Pavlou and David Gefen, “Psychological Contract Violation in Online Marketplaces: Antecedents, Consequences, and Moderating Role,” Information Systems Research 16, no. 4 (2005): 272–99. 36. There are exceptions to the protective power of Section 230. For one thing, the government can prosecute a website under federal criminal law. Section 230 also doesn’t cover intellectual property claims. Also, another legal scholar, James Grimmelmann of the University of Maryland, told me that other enforcers, such as the FTC and attorneys general, can go after websites for making misleading statements—“promises that get taken away by the fine print.”

pages: 81 words: 24,626

The Internet of Garbage
by Sarah Jeong
Published 14 Jul 2015

Theoretically, DMCA safe harbor protects the little guys, ensuring that the Internet will continue to evolve, flourish, and provide ever-new options for consumers. The DMCA is also one of the handful of ways you force an online intermediary to remove content. The Communications Decency Act, Section 230 Under present law, DMCA works in lockstep with Section 230 of the Communications Decency Act, which generally immunizes services from legal liability for the posts of their users. Thanks to CDA 230, if someone tweets something defamatory about the Church of Scientology, Twitter can’t be sued for defamation. There are very few exceptions to CDA 230.

It’s often very difficult to target the poster directly. They might be anonymous. They might have disappeared. They might live in a different country. So usually, when seeking to delete something off the Web, wronged individuals go after the platform that hosts the content. The problem is that those platforms are mostly immunized through Section 230 of the Communications Decency Act (described in detail below). The biggest gaping hole in CDA 230, however, is copyright. That’s where most of the action regarding legally-required deletion on the Internet happens, and all of that is regulated by the DMCA. The Digital Millennium Copyright Act The Digital Millennium Copyright Act, among other things, provides “safe harbor” to third-party intermediaries so long as they comply with notice-and-takedown procedures.

pages: 619 words: 177,548

Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity
by Daron Acemoglu and Simon Johnson
Published 15 May 2023

They can become powerful civil-society associations and contribute to the emergence of a broader social movement, especially if combined with the other measures we are proposing. Repeal Section 230 of the Communications Decency Act. Central to the regulation of the tech industry is Section 230 of the 1996 Communications Decency Act, which protects internet platforms against legal action or regulation because of the content they host. As Section 230 explicitly states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

This means that government regulation has to play a role, and a first step in this would be to repeal Section 230 and make platforms accountable when they promote such material. The emphasis here is important. Even with much better monitoring, it would be unrealistic to expect that Facebook can eliminate all posts that include misinformation or hate speech. Yet it is not too much to expect that their algorithms should not give such material a much broader platform by “boosting” it and actively recommending it to other users, and this is what the repeal of Section 230 should target. We should also add that such a relaxation of Section 230 protection would be most effective for platforms such as Facebook and YouTube that use algorithmic promotion of content, and is less relevant for other social media, such as Twitter, where direct promotion is less relevant.

For Twitter, experimenting with different regulation strategies, requiring the monitoring of the most heavily subscribed accounts, may be necessary. Digital Advertising Tax. Even getting rid of Section 230 is not enough, however, because it leaves unchanged the business model of internet platforms. We advocate a nontrivial digital advertising tax to encourage alternative business models, such as those based on subscription, instead of the currently prevailing model that largely relies on individualized targeted digital advertising.

Likewar: The Weaponization of Social Media
by Peter Warren Singer and Emerson T. Brooking
Published 15 Mar 2018

Two younger U.S. representatives—Chris Cox, a Republican from California, and Ron Wyden, a Democrat from Oregon—realized that unless something was done to protect websites that tried to police their own networks, the entire internet would be paralyzed by fear of lawsuits and prison time. Their consequent amendment became 47 U.S.C. § 230 (1996), known as Section 230. It was, in the words of Wired magazine, “the most important law in tech.” Section 230 provided “protection for ‘Good Samaritan’ blocking and screening of offensive material,” essentially ruling that no website could be held accountable for the speech of its users. And no website that made a “good-faith” effort to enforce applicable U.S. regulations could be punished, even if its efforts fell short.

This wasn’t a problem—it was the whole reason services like Blogger existed: to share the panoply of human expression, emotions, and beliefs. By contrast, violations of intellectual property rights were covered not by the permissive Section 230, but by the much stricter 1998 Digital Millennium Copyright Act (DMCA). This law imposed a maximum prison sentence of five years or a fine of $500,000 for the first offense of posting material for which someone else held a copyright. Fortunately, much like Section 230, the law also included a “safe harbor” provision. If websites promptly responded to a takedown notice filed by the copyright holder—without pausing to consider the merits of the request—they could avoid being sued or jailed.

Reno v. American Civil Liberties Union (1997) was the first and most important Supreme Court case to involve the internet. In a unanimous decision, the justices basically laughed the CDA out the door, noting that it massively violated the First Amendment. The only part of the CDA that survived was Section 230. Over the ensuing years, it would be consistently challenged and upheld. With each successful defense, its legal standing grew stronger. Outside of two specific exemptions (federal criminal law and intellectual property), the internet was mostly left to govern itself. As a result, most early corporate censorship—more politely known as “content moderation”—would come not because of government mandate, but to avoid involving government in the first place.

System Error: Where Big Tech Went Wrong and How We Can Reboot
by Rob Reich , Mehran Sahami and Jeremy M. Weinstein
Published 6 Sep 2021

The battle over net neutrality is ultimately a fight about whether your internet service provider has the right to speed up or slow down content passing through its network in order to make more money or favor particular providers. Similarly, we now face complicated but hugely consequential questions about a notorious provision called Section 230 of the Communications Decency Act that was appended to the 1996 Telecommunications Act. With a few exceptions, Section 230 immunizes websites and internet service providers from legal liability stemming from any content posted by users. Whereas newspapers and television programs are content creators and therefore responsible for what they print or broadcast, internet service providers and social media companies can distribute user-generated content without incurring legal responsibility, even when that content is hateful, libelous, false, or vulgar.

The Future of Platform Immunity Addressing the pollution of our information ecosystem also means figuring out what obligations companies should have beyond fealty to their shareholders and adherence to existing laws. In short, should government hold companies to a higher standard of behavior in order to protect democracy? Any conversation about corporate behavior in the United States must begin with Section 230 of the Communications Decency Act, known as CDA 230 for short. This provision is nothing short of the oxygen that has enabled internet platforms to grow. Passed in 1996, CDA 230 immunizes providers of interactive computer services for liability arising from user-generated content. More specifically, it says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker,” thus enabling platforms to facilitate the posting and sharing of content without significant concern about legal liability.

The court considered Prodigy to be more than a simple distributor of content because its use of automated curation tools and guidelines for posting were a “conscious choice, to gain the benefits of editorial control.” The decision sent shock waves through the growing internet business, raising fears that the new platforms would become targets of lawsuits. In response, a bipartisan effort in Congress added what became Section 230 to a bill already under consideration to regulate the access of minors to indecent online content. Daphne Keller, a former associate general counsel for Google, is one of the world’s leading experts on the issue of “intermediary liability,” the technical term for the extent to which platforms bear any legal responsibility for the content that they post, share, or curate.

Forward: Notes on the Future of Our Democracy
by Andrew Yang
Published 15 Nov 2021

These platforms actually have a few interrelated problems: market incentives that maximize engagement and addiction, misinformation and deepfakes, and data-enabled targeted advertising. There is a big rule when it comes to social media platforms you might have heard of, thanks to Donald Trump: section 230 of the Communications Decency Act (CDA). Section 230 says that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” It has been interpreted to mean that platforms that publish content provided by someone else—say, a user of a forum—don’t have any legal responsibility for the published content.

And it’s one reason you’ll constantly hear Facebook argue that it’s a “platform, not a publisher.” Being a publisher would bring with it the civic responsibilities and legal liabilities that any newspaper or television network bears every day. In 2020 major politicians on both sides of the aisle began either calling to or threatening to “repeal section 230.” This has been due to dissatisfaction with Facebook’s and Twitter’s treatment of various types of content. Conservatives complain that conservative-facing content—most recently, QAnon-related posts, calls by Trump that suggested violence, medical misinformation, and now Trump himself—has been singled out for censorship or redaction.

In response, Facebook’s CEO, Mark Zuckerberg, reportedly told people that these companies account for a tiny slice of Facebook’s revenue and there was nothing to worry about: the top hundred advertisers on Facebook are less than 6 percent of the company’s revenue. The Stop Hate for Profit campaign continued with celebrities like Rosario Dawson and Katy Perry taking a break from Instagram. The reliance of companies on the safe harbor from civil liability afforded by section 230 indicates something incredibly important: the Communications Decency Act can easily be amended to hold social media companies more responsible for the content they publish and to force them to share their data with the public so we know what they know. Right now, we are sadly reliant upon limited third-party data and bland corporate communications about the extent of various problems while the social media companies themselves have unfathomably large data troves.

pages: 290 words: 73,000

Algorithms of Oppression: How Search Engines Reinforce Racism
by Safiya Umoja Noble
Published 8 Jan 2018

At one time, the FCC enforced decency standards for media content, particularly in television, radio, and print. Many political interventions over indecency and pornography on the web have occurred since the mid-1990s, with the 1996 Communications Decency Act (CDA) being the most visible and widely contested example, particularly section 230 with respect to immunity for online companies, which cannot be found liable for content posted by third parties. Section 230 is specifically designed to protect children from online pornography while granting the greatest rights to freedom of expression, which it does by shielding from liability Internet service providers, search engines, and any other Internet sites that carry content from other people, organizations, or businesses—companies such as Google, Facebook, Verizon, AT&T, Wordpress, and Wikipedia—all of which are exempt from liability under the act.14 These were the same protections afforded to Hunter Moore and his revenge-porn site discussed in chapter 4.

These protections of immunity are mostly upheld by the Zeran v. America Online, Inc. (1997) ruling in the U.S. Court of Appeals for the Fourth Circuit, which found that companies are not the responsible parties or authors of problematic material distributed over their hardware, software, or infrastructure, even though section 230 was intended to have these companies self-censor indecent material. Instead, the courts have ruled that they cannot hold companies liable for not self-censoring or removing content. Complicating the issues in the 1996 act is the distinction between “computer service providers” (nonmediated content) and “information providers” (mediated content).16 During the congressional hearings that led to the Federal Trade Commission investigation of Google, the reporter Matthew Ingram suggested in a September 2011 article that “it would be hard for anyone to prove that the company’s free services have injured consumers.”17 But Ingram is arguably defining “injury” a little too narrowly.

Zimmer (Eds.), Web Searching: Multidisciplinary Perspectives, 11–34. Dordrecht, The Netherlands: Springer. Dicken-Garcia, H. (1998). The Internet and Continuing Historical Discourse. Journalism and Mass Communication Quarterly, 75, 19–27. Dickinson, G. M. (2010). An Interpretive Framework for Narrower Immunity under Section 230 of the Communications Decency Act. Harvard Journal of Law and Public Policy, 33(2), 863–883. DiMaggio, P., Hargittai, E., Neuman, W. R., and Robinson, J. P. (2001). Social Implications of the Internet. Annual Review of Sociology, 27, 307–336. Dines, G. (2010). Pornland: How Porn Has Hijacked Our Sexuality.

pages: 314 words: 88,524

American Marxism
by Mark R. Levin
Published 12 Jul 2021

Sun, September 18, 2020, https://www.the-sun.com/news/us-news/1495061/china-document-8-million-training-detention-camps/ (April 25, 2021). 53 “Church leaders seek Home Depot boycott on Georgia voting law,” Canadian Press, April 21, 2021, https://www.msn.com/en-ca/money/topstories/church-leaders-seek-home-depot-boycott-on-georgia-voting-law/ar-BB1fRzT0 (April 25, 2021). 54 Evie Fordham, “Goya ‘buy-cott’ begins as customers load up on product after Trump backlash,” Fox Business, July 12, 2020, https://www.foxbusiness.com/markets/goya-food-sales-trump-controversy (April 25, 2021). 55 Mann, “CEOs answer the call of the woke by pivoting to ‘stakeholder’ capitalism.” 56 John Binder, “Wall Street, Corporations Team Up with Soros-Funded Group to Pressure States Against Election Reforms,” Breitbart, April 13, 2021, https://www.breitbart.com/politics/2021/04/13/wall-street-corporations-team-up-with-soros-funded-group-to-pressure-states-against-election-reforms/ (April 25, 2021). 57 David Aaro, “Ron DeSantis pushes bill aimed to take power away from Big Tech,” Fox News, February 16, 2021, https://www.foxnews.com/tech/desantis-pushes-bill-to-aimed-to-take-power-away-from-big-tech (April 25, 2021). 58 Rachel Bovard, “Section 230 protects Big Tech from lawsuits. But it was never supposed to be bulletproof,” USA Today, December 13, 2020, https://www.usatoday.com/story/opinion/2020/12/13/section-230-big-tech-free-speech-donald-trump-column/3883191001/ (April 25, 2021). 59 Ibid. 
60 John Solomon, “Zuckerberg money used to pay election judges, grow vote in Democrat stronghold, memos reveal,” Just the News, October 20, 2020, https://justthenews.com/politics-policy/elections/memos-show-zuckerberg-money-used-massively-grow-vote-democrat-stronghold (April 25, 2021); Libby Emmons, “BREAKING: Project Veritas exposes Google manager admitting to election interference,” Post Millennial, October 19, 2020, https://thepostmillennial.com/breaking-project-veritas-exposes-google-manager-admitting-to-election-influence (April 25, 2021). 61 Unlike most of the corporations listed, the Fox news platforms, such as the Fox News Channel, for which I host a Sunday program, and the Fox Business Channel, were actually created, not acquired, by Fox. 62 Levin, Liberty and Tyranny, 114. 63 Ibid., 115. 64 Maydeen Merino, “ ‘Net Zero Is Not Enough’: John Kerry Says We Need to Remove Carbon Dioxide from the Atmosphere,” Daily Caller, April 22, 2021, https://dailycaller.com/2021/04/22/john-kerry-remove-carbon-atmosphere-leaders-summit-climate-change/ (April 25, 2021). 65 “The Government Is on My Property.

Moreover, you can monitor Big Tech’s oligopolists’ censorship activities by using the Media Research Center’s FreeSpeechAmerica Project and its Censortrack website, found here: https://censortrack.org/. However, the root cause of Big Tech’s power and abuse goes back to the protection granted it by Congress in 1996 under Section 230 of the Communications Decency Act. As Rachel Bovard of the Conservative Partnership Institute (CPI) explains: It “protects the Big Tech companies from being sued for the content users post on their sites. The law also creates a liability shield for the platforms to ‘restrict access to or availability of material that the provider or user considers to be… objectionable, whether or not such material is constitutionally protected.’ ”58 She adds: “A handful of Big Tech companies are now controlling the flow of most information in a free society, and they are doing so aided and abetted by government policy.

That these are merely private companies exercising their First Amendment rights is a reductive framing which ignores that they do so in a manner that is privileged—they are immune to liabilities to which other First Amendment actors like newspapers are subject—and also that these content moderation decisions occur at an extraordinary and unparalleled scale.”59 Thus, when Republicans next control Congress and the presidency, they must be aggressively pressured to withdraw Section 230 immunity from Big Tech, which President Trump attempted to do but was thwarted by his own party. Moreover, Facebook billionaire Mark Zuckerberg’s interference with and attempted manipulation of elections, including the presidential election in 2020 with hundreds of millions in targeted contributions, as well as Google’s manipulation of algorithms, must be investigated and outlawed both at the federal and state level.60 You can contact friendly state legislators and file complaints against corporations that make what are effectively in-kind contributions with various federal and state agencies and, again, show up at their shareholder meetings and be heard.

pages: 918 words: 257,605

The Age of Surveillance Capitalism
by Shoshana Zuboff
Published 15 Jan 2019

“No provider or user of an interactive computer service,” the statute reads, “shall be treated as the publisher or speaker of any information provided by another information content provider.”49 This is the regulatory framework that enables a site such as TripAdvisor to include negative hotel reviews and permits Twitter’s aggressive trolls to roam free without either company being held to the standards of accountability that typically guide news organizations. Section 230 institutionalized the idea that websites are not publishers but rather “intermediaries.” As one journalist put it, “To sue an online platform over an obscene blog post would be like suing the New York Public Library for carrying a copy of Lolita.”50 As we shall see, this reasoning collapses once surveillance capitalism enters the scene. Section 230’s hands-off stance toward companies perfectly converged with the reigning ideology and practice of “self-regulation,” leaving the internet companies, and eventually the surveillance capitalists among them, free to do what they pleased.

Pasquale, “The Automated Public Sphere” (Legal Studies research paper, University of Maryland, November 10, 2017). 47. Ammori, “The ‘New’ New York Times,” 2259–60. 48. Adam Winkler, We the Corporations (New York: W. W. Norton, 2018), xxi. 49. “Section 230 of the Communications Decency Act,” Electronic Frontier Foundation, n.d., https://www.eff.org/issues/cda230. 50. Christopher Zara, “The Most Important Law in Tech Has a Problem,” Wired, January 3, 2017. 51. David S. Ardia, “Free Speech Savior or Shield for Scoundrels: An Empirical Study of Intermediary Immunity Under Section 230 of the Communications Decency Act” (SSRN Scholarly Paper, Rochester, NY: Social Science Research Network, June 16, 2010), https://papers.ssrn.com/abstract=1625820. 52.

The Constitution is exploited to shelter a range of novel practices that are antidemocratic in their aims and consequences and fundamentally destructive of the enduring First Amendment values intended to protect the individual from abusive power. In the US, congressional statutes have played an equally or perhaps even more important role in sheltering surveillance capitalism from scrutiny. The most celebrated of these is a legislative statute known as Section 230 of the Communications Decency Act of 1996, which shields website owners from lawsuits and state prosecution for user-generated content. “No provider or user of an interactive computer service,” the statute reads, “shall be treated as the publisher or speaker of any information provided by another information content provider.”49

pages: 380 words: 109,724

Don't Be Evil: How Big Tech Betrayed Its Founding Principles--And All of US
by Rana Foroohar
Published 5 Nov 2019

Consider Section 230 of the Communications Decency Act, which gave technology companies exemption from liability for what people do and say on their platforms. In 1996, when this law was crafted, no one could have predicted that it would come to serve as a legal loophole for companies like Backpage.com that deliberately created—and profited handsomely from—a platform for online sex trafficking. On August 1, 2017, a bipartisan group of senators, led by Democrat Claire McCaskill of Missouri and Republican Rob Portman of Ohio, introduced legislation that would create a carve-out in section 230 for tech firms that knowingly facilitate sex trafficking, meaning they could be held responsible for that.

But he just repeated, with a touch of condescension: ‘Yes, but we can’t code for it, so it can’t be done.’ ” The message was that the debate would be held on the technologist’s terms, or not at all.17 A lot of people—including many of our elected leaders in Washington—have bought into that argument. Perhaps that’s why, from the beginning, the rules have favored the industry over the consumers they supposedly serve. The most notable example of “special” rules that benefit Big Tech is the get-out-of-jail-free card provided by section 230 of the Communications Decency Act of 1996 (CDA), which exempts tech firms from liability for nearly all kinds of illegal content or actions perpetrated by their users (there are a few small exceptions for copyright violations and certain federal crimes). In the early days of the commercial Internet, back in the mid-1990s, one of the refrains we heard over and over from Silicon Valley was the notion that the Internet was like the town square—a passive and neutral conduit for thoughts and activities—and that because the online platforms were, by this definition, public spaces, the companies who ran them were not responsible for what happened there.

That video was then uploaded 1.5 million times within twenty-four hours on Facebook and uploaded on YouTube at the rate of one video per second.2 As she put it in a powerful speech following the episode, “We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published. They are the publisher, not just the postman. It cannot be a case of all profit, no responsibility.”3 The Communications Decency Act section 230 exceptions that allow platforms to get away with the dissemination of hate and violence in a way that no other type of media can are ripe for review. Rethinking them will be tricky: There is a risk that platforms will be overzealous in the policing of hate speech if they are on the hook for it legally, and that could in turn have a chilling effect on free speech in general.

pages: 484 words: 114,613

No Filter: The Inside Story of Instagram
by Sarah Frier
Published 13 Apr 2020

So they brainstormed a way to automatically detect the worst content and prevent it from going up, to preserve Instagram’s fledgling brand. “Don’t do that!” Zollman said. “If we start proactively reviewing content, we are legally liable for all of it. If anyone found out, we’d have to personally review every piece of content before it goes up, which is impossible.” She was right. According to Section 230 of the Communications Decency Act, nobody who provided an “interactive computer service” was considered the “publisher or speaker” of the information, legally speaking, unless they exerted editorial control before that content was posted. The 1996 law was Congress’s attempt to regulate pornographic material on the Internet, but was also crucial to protecting internet companies from legal liability for things like defamation.

Zollman knew this because at Formspring, she’d gone with her boss to a meeting with Del Harvey, the person in charge of dealing with these same legal issues at Twitter. “Del Harvey” was a professional pseudonym to protect the employee from the throngs of angry internet users she made rules for. The Section 230 law was the one thing that had stuck with Zollman from the meeting. Still, Zollman didn’t want Instagram to ignore these posts. She knew from Formspring how a dark culture could grow if untended, and how Instagram had become an escape for her. The number of Instagram users was still small enough that Riedel and Zollman could personally click through all of the damaging content to decide what to do, finishing the job in shifts.

Internet,” Wired, August 14, 2017, https://www.wired.com/2017/08/instagram-kevin-systrom-wants-to-clean-up-the-internet/. After just nine months, the app: M. G. Siegler, “The Latest Crazy Instagram Stats: 150 Million Photos, 15 per Second, 80% Filtered,” TechCrunch, August 3, 2011, https://techcrunch.com/2011/08/03/instagram-150-million/. According to Section 230 of the Communications Decency Act: Protection for private blocking and screening of offensive material, 47 U.S. Code § 230 (1996). 3 | THE SURPRISE “He chose us, not the other way around.”: Dan Rose, interview with the author, Facebook headquarters, December 18, 2018. Google had bought YouTube for $1.6 billion: Associated Press, “Google Buys YouTube for $1.65 Billion,” NBC News, October 10, 2006, http://www.nbcnews.com/id/15196982/ns/business-us_business/t/google-buys-youtube-billion/#.XX9Q96d7Hox.

pages: 398 words: 86,023

The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia
by Andrew Lih
Published 5 Jul 2010

But the U.S. Internet provider BellSouth would not assist him. The editorial illustrated on a very public level what Wikipedia was and was not on the hook for. Fortunately, the U.S. law Section 230 of the Communications Decency Act protects Wikipedia from being held liable for the content posted to it. As a forum and provider of the virtual space, and not the editorial content, it is protected. Seigenthaler wrote: Section 230 of the Communications Decency Act, passed in 1996, specifically states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker.” That legalese means that, unlike print and broadcast companies, online service providers cannot be sued for disseminating defamatory attacks on citizens posted by others. While Wikipedia was in the clear legally, it was the worst public relations black eye so far.

There are scores of articles with unsourced statements written by anonymous individuals. Will there be a big lawsuit that will put community members in jeopardy? Editors are legally responsible for their edits, and the Wikimedia Foundation, protected in the United States under the Section 230 safe harbor provisions of the Communications Decency Act, is not responsible for content on the site. It is likely an editor will be on the receiving end of a civil action sooner or later. There has already been at least one criminal case.100 However, if one makes an edit to another section of the article, is one then responsible for the previous edits to which one added a change?

pages: 239 words: 80,319

Lurking: How a Person Became a User
by Joanne McNeil
Published 25 Feb 2020

The app was reapproved for the App Store following this ban. Another factor that may have contributed to Tumblr’s decision was the Stop Enabling Sex Traffickers Act (SESTA) and Fight Online Sex Trafficking Act (FOSTA), passed in Congress that year, which amended Section 230 of the 1996 Communications Decency Act to carve out an exception for “sex trafficking.” To back up, Section 230 is how social sharing is even legally possible in the United States. It classifies platforms more as libraries than publishers. As the Electronic Frontier Foundation explains it, “This legal and policy framework has allowed for YouTube and Vimeo users to upload their own videos, Amazon and Yelp to offer countless user reviews, Craigslist to host classified ads, and Facebook and Twitter to offer social networking to hundreds of millions of Internet users.”

Several websites have covered Tumblr’s decline (Gita Jackson, “In 2018, Tumblr Is a Joyless Black Hole,” Kotaku, July 2, 2018; Brian Feldman, “Tumblr’s Unclear Future Shows That There’s No Money in Internet Culture,” New York, June 28, 2017; Seth Fiegerman, “How Yahoo Derailed Tumblr,” Mashable, June 15, 2016; Joe Porter, “Tumblr was removed from Apple’s App Store over child pornography issues,” The Verge, November 20, 2018), but, as I write this, WordPress is set to buy it, so there’s some hope for a turnaround. The EFF quote comes from an explainer on its website, “Section 230 of the Communications Decency Act.” Sara M. Watson’s report “Toward a Constructive Technology Criticism” is available to read on the Columbia Journalism Review website (October 4, 2016). Winona Ryder appeared on Late Night with Jimmy Fallon on January 10, 2011. 5. CLASH A Twitter service known as Topsy indicated there were 75,465 #SolidarityIsForWhiteWomen tweets, according to Susana Loza’s “Hashtag Feminism, #SolidarityIsForWhiteWomen, and the Other #FemFuture,” which appeared in issue no. 5 of Ada: Journal of Gender, New Media, and Technology.

pages: 445 words: 135,648

Nothing Personal: My Secret Life in the Dating App Inferno
by Nancy Jo Sales
Published 17 May 2021

When dating apps are promoted as neutral, and dating app dates are seen as somehow naturally occurring events, then it takes Big Dating companies off the hook for any responsibility for the sexual assaults and rapes that regularly happen through their platforms—responsibility they already know they don’t bear in a legal sense. That’s because of Section 230 of the 1996 Communications Decency Act, which protects Internet service providers from legal action based on third-party actions. Controversial Section 230 in effect immunizes both service providers and social media companies from liability for wrongful acts committed by their users. If anything is ever to change in terms of protecting dating app users from harm, then Section 230 is going to have to be amended, in order to “deny bad Samaritans…immunity,” says Boston University School of Law professor Danielle Citron, a leading expert on the issue.

Last modified September 16, 2019. www.cnn.com/2019/09/16/health/sexual-initiation-forced-united-states-study/index.html. Citron, Danielle Keats. Hate Crimes in Cyberspace. Cambridge, MA: Harvard University Press, 2014. Citron, Danielle Keats, and Benjamin Wittes. “The Internet Will Not Break: Denying Bad Samaritans Section 230 Immunity.” Fordham Law Review, forthcoming. University of Maryland Legal Studies Research Paper No. 2017-22, July 24, 2017. Clifford, Catherine. “How a Tinder Founder Came Up with Swiping and Changed Dating Forever.” CNBC, January 6, 2017. www.cnbc.com/2017/01/06/how-a-tinder-founder-came-up-with-swiping-and-changed-dating-forever.html.

pages: 370 words: 112,809

The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future
by Orly Lobel
Published 17 Oct 2022

In 1996, in the early days of the internet, the Communications Decency Act (CDA) was enacted to protect providers of an interactive computer service from civil liability for another’s actions. In just a few words, Section 230 of the CDA provided a shield for online platforms to deflect responsibility for the content published on them: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In 2018, Congress carved out an exception to Section 230 with the Fight Online Sex Trafficking Act and Stop Enabling Sex Traffickers Act (FOSTA-SESTA). The wisdom of imposing liability on online platforms for trafficking activities on their sites is still debated.

Social scientists, for example, must work with data scientists to provide context and ask the critical questions about definitions, the sources of data, and the interpretation of patterns. There are accelerated, heated debates and numerous legislative proposals to tighten the regulation of digital technology, including to amend Section 230 of the U.S. Communications Decency Act of 1996 in ways that would limit digital platform immunity and require platforms to moderate illegal content. These proposals also include more transparency over what data is collected. The European Union is leading the way with its 2018 General Data Protection Regulation (GDPR) and subsequent proposed reforms.

pages: 234 words: 67,589

Internet for the People: The Fight for Our Digital Future
by Ben Tarnoff
Published 13 Jun 2022

Even as their interactions are being subtly (or unsubtly) structured by the design of the user interface and the code underneath, they enjoy a feeling of autonomy, a feeling of being free to express themselves. The power of the social media mall thus rests on a strange kind of sovereignty: the sort that pretends it doesn’t exist. This disavowal has its legal basis in the famous Section 230, passed into law as part of the Communications Decency Act of 1996. Section 230 shields online services of all kinds, as well as ISPs, from legal liability for the speech they circulate. The protection it affords is especially vital for the owners of social media malls, who can disclaim responsibility for the activities of their users even as they are covertly involved in shaping those activities.

pages: 1,136 words: 73,489

Working in Public: The Making and Maintenance of Open Source Software
by Nadia Eghbal
Published 3 Aug 2020

She compares the problem to traffic congestion caused by automobiles, where each person wants to drive their own car, but in doing so increases the congestion experienced by others, and, eventually, themself.214

Community moderation

Software does not exist in a vacuum. As it acquires more users, some of those people will inevitably use it in undesirable ways. From a legal perspective, software providers have been historically absolved from addressing this cost. Section 230 of the United States Communications Decency Act (CDA) protects most platforms from liabilities associated with the content uploaded by their users: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”216 In open source, all popular licenses contain an “as is” clause, which protects creators from liability in the case of harm arising from the code’s use.

Charlotte Hess and Elinor Ostrom (Cambridge, MA: MIT Press, 2011), 34. 204 Donald Stufft (@dstufft), “PyPI ‘costs’ like 2-3 million dollars. . .,” Twitter, May 11, 2019, 5:10 p.m., https://twitter.com/dstufft/status/1127320131359653890. 205 Donald Stufft (@dstufft), “The first full month of PyPI/PSF . . .,” Twitter, July 21, 2017, 1:29 p.m., https://twitter.com/dstufft/status/888450899357704192. 206 Donald Stufft (@dstufft), “April ‘bill’ for Fastly . . .,” Twitter, May 11, 2019, 5:26 p.m., https://twitter.com/dstufft/status/1127324217622638599. 207 Drew DeVault, “The Path to Sustainably Working on FOSS Full-Time,” Drew DeVault’s Blog, February 24, 2018, https://drewdevault.com/2018/02/24/The-road-to-sustainable-FOSS.html. 208 Werner Vogels, “Eventually Consistent,” Communications of the ACM 52, no. 1 (January 2009): 40, https://doi.org/10.1145/1435417.1435432. 209 Lily Hay Newman, “GitHub Survived the Biggest DDoS Attack Ever Recorded,” Wired, March 1, 2018, https://www.wired.com/story/github-ddos-memcached/ 210 Meira Gebel, “In 15 Years Facebook Has Amassed 2.3 Billion Users - More Than Followers of Christianity,” Business Insider, February 4, 2019, https://www.businessinsider.com/facebook-has-2-billion-plus-users-after-15-years-2019-2. 211 Barry Schwartz, “Google: We Can’t Have Customer Service Because . . .,” Search Engine Roundtable, August 24, 2011, https://www.seroundtable.com/google-support-staff-limits-13916.html. 212 Nolan Lawson, “What It Feels Like to Be an Open-Source Maintainer,” Read the Tea Leaves, March 5, 2017, https://nolanlawson.com/2017/03/05/what-it-feels-like-to-be-an-open-source-maintainer/. 213 Frederick Brooks, The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition, 2nd ed. (Reading: Addison-Wesley, 1995), 121. 214 Devon Zuegel, “The City Guide to Open Source,” Increment 9, May 2019, https://increment.com/open-source/the-city-guide-to-open-source/. 
215 Robert Glass, Facts and Fallacies of Software Engineering (Boston: Addison-Wesley, 2010), 174. 216 “Section 230 of the Communications Decency Act,” Electronic Frontier Foundation, n.d., https://www.eff.org/issues/cda230. 217 Ben Balter, “Open Source License Usage on GitHub.com,” The GitHub Blog, GitHub, March 9, 2015, https://github.blog/2015-03-09-open-source-license-usage-on-github-com/. 218 “The MIT License,” Open Source Initiative, n.d., https://opensource.org/licenses/MIT. 219 “Open Source Survey,” Open Source Survey, 2017, https://opensourcesurvey.org/2017/. 220 “The Developer Coefficient: A $300B Opportunity for Business,” Stripe, September 2018, https://stripe.com/reports/developer-coefficient-2018. 221 Ensmenger, “When Good Software Goes Bad.” 222 “About Required Status Checks,” GitHub Help, n.d., https://help.github.com/en/github/administering-a-repository/about-required-status-checks. 223 Alan Zeino, “Faster Together: Uber Engineering’s IOS Monorepo,” Uber Engineering (blog), March 6, 2017, https://eng.uber.com/ios-monorepo/ 224 Steve Klabnik (@steveklabnik), “Today I glanced at some numbers . . .,” Twitter, June 14, 2019, 11:17 a.m., https://twitter.com/steveklabnik/status/1139552342842458112. 225 “The State of the Octoverse,” GitHub, 2019, https://octoverse.github.com/. 226 Russ Cox, “Our Software Dependency Problem,” Research!

pages: 336 words: 91,806

Code Dependent: Living in the Shadow of AI
by Madhumita Murgia
Published 20 Mar 2024

But her biggest problem, she said, is that there are no legal incentives for the web’s largest platforms, such as Google, YouTube, Facebook, Instagram, and others – let alone specialist pornography websites – to curtail the distribution of non-consensual and AI-generated pornography, and keep users safe. She has become obsessed with holding internet platforms accountable for the abuse that occurs online. In particular, Goldberg is going after Section 230, a part of the Communications Decency Act in the United States which protects online platforms from liability for any third-party content posted on them. ‘This is one of the biggest problems evolving on the Internet. The fact that the most powerful companies in the history of the universe, in terms of wealth but also the information they have about us, the idea that they are immune from liability, when no other industry is,’ she said.

Then She Scanned Her Face Online’, CNN Business, May 24, 2022, https://edition.cnn.com/2022/05/24/tech/cher-scarlett-facial-recognition-trauma/index.html. 12 Carrie Goldberg, Nobody’s Victim: Fighting Psychos, Stalkers, Pervs, and Trolls (Little, Brown and Company, 2019). 13 Margaret Talbot, ‘The Attorney Fighting Revenge Porn’, The New Yorker, November 27, 2016, https://www.newyorker.com/magazine/2016/12/05/the-attorney-fighting-revenge-porn. 14 ‘Section 230’, EFF, n.d., https://www.eff.org/issues/cda230. 15 Haleluya Hadero, ‘Deepfake Porn Could Be a Growing Problem Amid AI Race’, Associated Press News, April 16, 2023, https://apnews.com/article/deepfake-porn-celebrities-dalle-stable-diffusion-midjourney-ai-e7935e9922cda82fbcfb1e1a88d9443a. 16 Ibid. 17 Molly Williams, ‘Sheffield Writer Launches Campaign over “Deepfake Porn” after Finding Own Face Used in Violent Sexual Images’, The Star News, July 21, 2021, https://www.thestar.co.uk/news/politics/sheffield-writer-launches-campaign-over-deepfake-porn-after-finding-own-face-used-in-violent-sexual-images-3295029. 18 ‘Facts and Figures: Women’s Leadership and Political Participation’, The United Nations Entity for Gender Equality and the Empowerment of Women, March 7, 2023, https://www.unwomen.org/en/what-we-do/leadership-and-political-participation/facts-and-figures. 19 Jeffery Dastin, ‘Amazon Scraps Secret AI Recruiting Tool That Showed Bias against Women’, Reuters, October 11, 2018, https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G. 20 Mary Ann Sieghart, The Authority Gap: Why Women Are Still Taken Less Seriously Than Men, and What We Can Do about It (Transworld, 2021). 21 Steven Feldstein, ‘How Artificial Intelligence Systems Could Threaten Democracy’, Carnegie Endowment for International Peace, April 24, 2019, https://carnegieendowment.org/2019/04/24/how-artificial-intelligence-systems-could-threaten-democracy-pub-78984. 
22 ‘Deepfakes, Synthetic Media and Generative AI’, WITNESS, 2018, https://www.gen-ai.witness.org/. 23 Yinka Bokinni, ‘Inside the Metaverse’ (United Kingdom: Channel 4, April 25, 2022). 24 Yinka Bokinni, ‘A Barrage of Assault, Racism and Rape Jokes: My Nightmare Trip into the Metaverse’, Guardian, April 25, 2022, https://www.theguardian.com/tv-and-radio/2022/apr/25/a-barrage-of-assault-racism-and-jokes-my-nightmare-trip-into-the-metaverse.

pages: 372 words: 100,947

An Ugly Truth: Inside Facebook's Battle for Domination
by Sheera Frenkel and Cecilia Kang
Published 12 Jul 2021

Within hours, the platform had put a warning label on the tweet, indicating that Trump was glorifying violence. Trump fired back, threatening regulations and the abolishment of an important legal shield for internet companies known as Section 230. “Twitter is doing nothing about all of the lies & propaganda being put out by China or the Radical Left Democrat Party. They have targeted Republicans, Conservatives & the President of the United States. Section 230 should be revoked by Congress. Until then, it will be regulated!” he tweeted. Over the past year, Twitter had been gradually diverging from Facebook on speech policies. It had banned political ads soon after Zuckerberg’s Georgetown declaration to protect all political speech, and it had begun to label tweets in which Trump spread false or misleading information around the elections; earlier in the week, Twitter had attached a fact-checking label to two tweets by the president that discredited the integrity of mail-in voting.

pages: 137 words: 35,041

Free Speech And Why It Matters
by Andrew Doyle
Published 24 Feb 2021

They claim to be platforms committed to the principle of free speech, and yet at the same time behave like publishers who seek to enforce limitations on the opinions that may be expressed. Whenever the likes of Twitter or Facebook are sued for libellous material posted by their users, they invariably cite Section 230 of the Communications Decency Act, which ensures that they are not legally responsible for content that they fail to remove. The law was crafted out of an understanding that, given the proliferation of comment sections on news websites, it was always unfeasible to expect media outlets to be able to ensure that illegal content would not be uploaded.

pages: 446 words: 109,157

The Constitution of Knowledge: A Defense of Truth
by Jonathan Rauch
Published 21 Jun 2021

On the policy front, companies and organizations of many kinds sifted questions like whether and how to place boundaries on user-uploaded content; how much anonymity is too much; how to define and weigh free-speech commitments; where to draw lines between outspokenness and harassment. On the legal front, a vigorous debate emerged over the future of Section 230, a U.S. statute shielding internet companies from civil liability for user-generated content—an exemption which was not available to offline publishers, and which either allowed free speech to flourish online or underwrote online harassment and fraud, depending on your point of view. (There was truth in both views.)

See academics and academia Schultz, Steve, 250 science: biases in, 73–75, 196–97; conservative media on, 175; creed wars and, 63–65; fallibilism and, 58–59; globalization of knowledge and, 68–70; reality-based communities and, 62–68, 70–73; republican virtues of, 112–13; social networks and, 65–68, 70–73. See also liberal science scientism, 113–14 Scott, Andrew, 72–73 Scottish Enlightenment, 62 secondary boycotts, 216, 219–20 Section 230 (internet company liability statute), 147 self-censorship, 11, 13, 36, 220–23, 240 shaming. See canceling and cancel culture Shapiro, Barbara J., 102, 121 Sherif, Muzafer, 34–35, 71 Shields, Jon A., 226 Silverglate, Harvey, 243–44 Simmons, Ruth, 208–09 Simpson, Thomas, 237 Singal, Jesse, 212, 218 skepticism, 57–58 Skripal, Sergei, 165 slavery, 81, 253–54 Sloman, Steven: The Knowledge Illusion (with Fernbach), 34, 72 smearing.

pages: 482 words: 121,173

Tools and Weapons: The Promise and the Peril of the Digital Age
by Brad Smith and Carol Ann Browne
Published 9 Sep 2019

Warner, “Potential Policy Proposals for Regulation of Social Media and Technology Firms” (draft white paper, Senate Intelligence Committee, 2018), https://www.scribd.com/document/385137394/MRW-Social-Media-Regulation-Proposals-Developed. Back to note reference 14. When Congress passed the Communications Decency Act in 1996, it included section 230(c)(1), which states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230, at https://www.law.cornell.edu/uscode/text/47/230. As one author has noted, “When first enacted by Congress, section 230 was intended to foster openness and innovation in the World Wide Web by giving websites broad legal protections and allowing the Internet to grow as a true marketplace of ideas.

pages: 516 words: 116,875

Greater: Britain After the Storm
by Penny Mordaunt and Chris Lewis
Published 19 May 2021

Niall Ferguson explains how social media escaped liability in his book The Square and the Tower: Networks and Power, from the Freemasons to Facebook. They were exempted from publishers’ liability under Section 230 of the American Telecommunications Act of 1996, which categorises them as telecommunications firms rather than publishers. He argues that companies are entitled to Section 230’s protection – but only if they uphold the diversity of discourse envisaged by Congress. A drive for the reinvention of public discourse could sit at the centre of an effort to modernise our democracy, acting as a rallying call for other areas of British life to follow suit.

pages: 569 words: 156,139

Amazon Unbound: Jeff Bezos and the Invention of a Global Empire
by Brad Stone
Published 10 May 2021

The report also proposed that Congress elevate the requirements for big tech companies to get approvals for mergers, so that dominant firms would have to disclose even smaller acquisitions and show how they are “necessary for serving the public interest.” Not mentioned in the report, to quell the chaos of the Amazon Marketplace, lawmakers could also reform the notorious Section 230 of the Communications Decency Act, which currently holds that internet providers like Amazon are not liable for the legal infractions of their users. Changes to Section 230 could force Amazon to be accountable for fraudulent or unsafe products sold on its site by third-party sellers. Regulators could also compel Amazon to verify sellers with a tax ID number or require them to put down a security deposit, which they’d forfeit with any signs of fraud (Alibaba’s Tmall website does this).

The Identity Trap: A Story of Ideas and Power in Our Time
by Yascha Mounk
Published 26 Sep 2023

Social media platforms would also remain free to use algorithms that amplify content on the basis of such properties as being less divisive, for example by favoring posts that elicit few negative responses. But they would no longer be able to ban users or delete posts because of the substantive political views they express. If companies decline to regulate themselves in a clear and transparent way, legislators should consider stepping in. In the United States, Section 230 now ensures that platforms like Twitter and Facebook are not legally liable for the content posted by their users; some other jurisdictions have adopted similar rules. If major platforms continue to censor their users in capricious or untransparent ways, democratic governments should (even though this too would have drawbacks) give serious thought to attaching clear conditions to this immunity.

It’s perfectly fine for social media platforms to concentrate on particular forms of content, branding themselves as politically progressive or conservative. But in that case the analogy to traditional publishers is even clearer. So social media platforms with an explicit political lean should be able to delete content they do not like, but precisely for that reason they should, like other publishers, not enjoy Section 230 protections. GO TO NOTE REFERENCE IN TEXT illegal, extreme, or uncivil behavior: For a discussion of how social media platforms could continue to censor forms of expression, like libel or child pornography, that are actually illegal without having to act as ideological censors, see David French, “A Better Way to Ban Alex Jones,” New York Times, Aug. 7, 2018, www.nytimes.com/2018/08/07/opinion/alex-jones-infowars-facebook.html.

The Smartphone Society
by Nicole Aschoff

If Facebook did admit it was a media company, it would have to act like one, vetting stories, checking facts, considering ethics—all the boring stuff that a company like the Washington Post or CNN does. By not self-identifying as a media company Facebook avoids liability for any nastiness that pops up on its site: section 230 of the Communications Decency Act enables companies such as Facebook to avoid “intermediary liability” for the things people say or do on their platforms and, on the flip side, allows them to act as “Good Samaritans,” policing their sites as they see fit. Nonetheless, after a year of denying that fake news was a problem, Facebook quietly started reducing the amount of news people see in their newsfeed.

pages: 223 words: 71,414

Abolish Silicon Valley: How to Liberate Technology From Capitalism
by Wendy Liu
Published 22 Mar 2020

A softer version of this, which could be useful as a transitional step, could be to mandate an open API for companies in crowded spaces, so that their intellectual property essentially becomes commoditised — something that is already happening with legislation targeting scooter apps in Washington, DC.14 Companies that choose not to make their code publicly viewable and contestable should be held liable for any harm caused, even indirectly. In the US, an amendment to Section 230 of the 1996 Communications Decency Act could make platforms featuring user-generated content accountable for any content explicitly promoted in a recommendation engine.15 This would be most relevant for YouTube recommendations, which have come under fire for being gateways for extremist content.16 The equivalent of the Freedom of Information Act (FOIA) for private companies above a certain size would also be useful, so that internal decisions about product or corporate strategy are archived and disclosed upon request.

pages: 743 words: 201,651

Free Speech: Ten Principles for a Connected World
by Timothy Garton Ash
Published 23 May 2016

But America’s First Amendment culture, as we know it today, is in fact the product of judicial rulings, laws and political decisions over the 100 years since the First World War, and mainly from the last half century. One that has been crucial to global internet freedom is buried away in section 230 of the Communications Decency Act, the very law against an earlier, more restrictive version of which Barlow’s broadside was directed. Section 230 states that ‘no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider’.57 So the intermediary is not responsible.

pages: 295 words: 87,204

The Capitalist Manifesto
by Johan Norberg
Published 14 Jun 2023

But for startups with few employees and little capital, these regulations act as direct barriers to entry. Research on the US economy shows that market concentration grew the most in sectors where regulations increased the fastest.31 It is in this context that we should see Facebook’s new-found interest in abolishing section 230 of the Communications Decency Act, which means that American platforms are allowed to moderate content without making themselves open to legal action for what others publish on their site. Without it, platforms that lacked the resources for very strict moderation, ensuring that nothing ever slips through, would be deterred from taking any action against even hateful speech and harassment, since moderating at all could expose them to liability.

pages: 305 words: 101,093

Who Owns This Sentence?: A History of Copyrights and Wrongs
by David Bellos and Alexandre Montagu
Published 23 Jan 2024

However, another senator, Ron Wyden, saw that Exon’s stringent proposals risked stifling a whole new adventure in its cradle. He succeeded in adding to it a special provision to protect Internet service providers from legal liability for material that their users post on the web. The wording he had inserted under Section 230 states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”, thus paving the way for the “safe harbor” provisions of the Digital Millennium Copyright Act, which now apply to all kinds of online sites, including social media.

pages: 1,172 words: 114,305

New Laws of Robotics: Defending Human Expertise in the Age of AI
by Frank Pasquale
Published 14 May 2020

Zeynep Tufekci, “Mark Zuckerberg Is in Denial,” New York Times, November 15, 2016, http://www.nytimes.com/2016/11/15/opinion/mark-zuckerberg-is-in-denial.html?_r=2. 26. Olivier Sylvain, “Intermediary Design Duties,” Connecticut Law Review 50, no. 1 (2018): 203; Danielle Keats Citron and Mary Anne Franks, “The Internet as a Speech Machine and Other Myths Confounding Section 230 Reform” (working paper, Public Law Research Paper No. 20-8, Boston University School of Law, Massachusetts, 2020); Carrie Goldberg, Nobody’s Victim: Fighting Psychos, Stalkers, Pervs, and Trolls (New York: Plume, 2019), 38. 27. Craig Silverman, “Facebook Is Turning to Fact-Checkers to Fight Fake News,” BuzzFeed News, December 15, 2016, https://www.buzzfeed.com/craigsilverman/facebook-and-fact-checkers-fight-fake-news?

pages: 401 words: 112,589

Flowers of Fire: The Inside Story of South Korea's Feminist Movement and What It Means for Women's Rights Worldwide
by Hawon Jung
Published 21 Mar 2023

Others called the law insufficient, since it only targets open, public chatrooms and not private chatrooms. †A similar push to hold tech platforms more accountable is underway in the United States as well. Several lawmakers and experts on online harassment and hate speech have called for the amendment of a key internet law—Section 230 of the Communications Decency Act—that protects tech firms from liability over content their users post, including sexually abusive materials. *Of course, men are not immune from the threats of cyber sexual abuse, either. News of sextortionists posing as women online to lure other men into sending explicit photos or videos and using the footage to blackmail them has made increasingly frequent headlines.

pages: 462 words: 129,022

People, Power, and Profits: Progressive Capitalism for an Age of Discontent
by Joseph E. Stiglitz
Published 22 Apr 2019

For instance, if an individual has a repeat grocery order, that kind of information could be stored—but not used for other purposes. 31. Anonymizing data may not suffice. Since the Big Data companies can figure out who the individual is, if they are given enough data about the individual, some of the information in the data set itself will have to be stripped away. 32. Platforms were granted immunity under section 230 of the Communications Decency Act. Liability associated with the posting of defamatory articles could easily bankrupt the platforms, so it might be necessary to impose some limitations on their liability—enough to provide them some incentive to exercise care over what is posted, but not so much as to make it impossible for them to operate.

pages: 559 words: 155,777

The Sinner and the Saint: Dostoevsky and the Gentleman Murderer Who Inspired a Masterpiece
by Kevin Birmingham
Published 16 Nov 2021

“After invoking the help”: Nicholas I quoted in Riasanovsky, Nicholas I and Official Nationality, 5. See also Saunders, “Pyrrhic Victory,” 137. “organ of the government”: Ruud, Fighting Words, 87. “Gentlemen! I have no police”: Riasanovsky, Nicholas I and Official Nationality, 209. “feared and respected”: Benckendorff quoted in Monas, Third Section, 230. “slaves of the tsar”: Marshall Poe, “What Did Russians Mean When They Called Themselves ‘Slaves of the Tsar’?,” Slavic Review 57, no. 3 (Autumn 1998): 585–608. “What is your idea”: “Dostoevsky’s Testimony,” in Dostoevsky as Reformer, 46. “monastic-industrial discipline”: Fourier, “Phalanstery,” 138.

pages: 655 words: 156,367

The Rise and Fall of the Neoliberal Order: America and the World in the Free Market Era
by Gary Gerstle
Published 14 Oct 2022

This exemption held even if such content was deemed to be hateful, false, or incendiary. Internet providers thus acquired broad immunity against suits that might be filed regarding the damage that such content might cause individuals, groups, or institutions. This was Section 509 of the Telecommunications Act, later incorporated as Section 230 of the revised Communications Act of 1934.70 Throughout the legislative effort to reform telecommunications, virtually no one of influence in either the Democratic or Republican party dared suggest that the broadcast/cable/satellite spectrum was a public good owned by the American people, or that corporations seeking access to it ought to be regarded (and regulated) as public utilities.

pages: 568 words: 164,014

Dawn of the Code War: America's Battle Against Russia, China, and the Rising Global Cyber Threat
by John P. Carlin and Garrett M. Graff
Published 15 Oct 2018

See also Advanced Persistent Threat 1 Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction (WMD Commission), 187–188 Committee on Foreign Investment in the United States (CFIUS), 352, 399 Communications Decency Act, 97 Communist Party, 147 community-wide assessment, 326 Comprehensive National Cybersecurity Initiative (CNCI), 156–157, 173 CompuServe, 96 Computer Crime and Intellectual Property Section (CCIPS), 75–77, 98, 132–134, 199 Computer Emergency Response Team Coordination Center (CERT/CC), 95, 150, 353 Computer Fraud and Abuse Act, 90, 94, 116, 133n, 143n, 201 Computer Intrusion Center, 107 Computer Intrusion Squad, 122 computer virus, 88, 93, 111, 119–120, 281 computerization, 353 Conficker Worm, 158–159 confidence levels, 330–332 Consumer Privacy Bill of Rights, 345, 363 Cook, Tim, 344 copyright, 65, 67 corporate espionage, 162–163 Coulter, Chris, 355 counterespionage, 197–198 Counterfeit Library, 114 counterintelligence, 191 Counterintelligence and Export Control Section, 230 counterproliferation, 315, 341 counterterrorism, 191 Counterterrorism Security Group (CSG), 325 Craig, James, 284, 285, 288, 289 credit card fraud, 115 CRG. See Cyber Response Group Criminal Division (FBI), 143 Crist, David, 212 critical infrastructure, 60–61, 344, 344n, 351 Critical Infrastructure Working Group, 174 Crocker, Steve, 80 CrowdStrike, 195, 256, 271, 272, 293, 296 Cruz, Ted, 266 CryptoLocker, 291–292 CSG.

pages: 541 words: 173,676

Generations: The Real Differences Between Gen Z, Millennials, Gen X, Boomers, and Silents—and What They Mean for America's Future
by Jean M. Twenge
Published 25 Apr 2023

Just as many generational trends can be traced to technology, much of what divides us now can be traced to social media—perhaps not as the sole cause, but as a crucial contributor. Thus far, the internet in general and social media in particular have been relatively unregulated. A piece of federal law called Section 230 means content providers, like Meta, cannot be sued for what people post on their platforms, hampering many attempts to regulate the apps. However, it’s possible social media companies could be sued for the design of their platforms—or persuaded in other ways to change them for the betterment of society.

The Code: Silicon Valley and the Remaking of America
by Margaret O'Mara
Published 8 Jul 2019

Kara Swisher and Elizabeth Corcoran, “Gingrich Condemns On-Line Decency Act,” The Washington Post, June 22, 1995, D8; Steve Lohr, “A Complex Medium That Will Be Hard to Regulate,” The New York Times, June 13, 1996, B10; Nat Hentoff, “The Senate’s Cybercensors,” The Washington Post, July 1, 1995, A27; 47 U.S. Code, Section 230. 15. Daniel S. Greenberg, “Porn Does the Internet,” The Washington Post, July 16, 1997, A19. 16. Elizabeth Darling, “Farewell to David Packard,” Palo Alto Times, April 3, 1996, https://www.paloaltoonline.com/weekly/morgue/news/1996_Apr_3.PACKARD.html, archived at https://perma.cc/5B2A-HDPE. 17.