Filter Bubble

back to index

description: intellectual isolation involving search engines

115 results

pages: 274 words: 75,846

The Filter Bubble: What the Internet Is Hiding From You
by Eli Pariser
Published 11 May 2011

Together, these engines create a unique universe of information for each of us—what I’ve come to call a filter bubble—which fundamentally alters the way we encounter ideas and information. Of course, to some extent we’ve always consumed media that appealed to our interests and avocations and ignored much of the rest. But the filter bubble introduces three dynamics we’ve never dealt with before. First, you’re alone in it. A cable channel that caters to a narrow interest (say, golf) has other viewers with whom you share a frame of reference. But you’re the only person in your bubble. In an age when shared information is the bedrock of shared experience, the filter bubble is a centrifugal force, pulling us apart.

We can now begin to see how the filter bubble is actually working, where it’s falling short, and what that means for our daily lives and our society. Every technology has an interface, Stanford law professor Ryan Calo told me, a place where you end and the technology begins. And when the technology’s job is to show you the world, it ends up sitting between you and reality, like a camera lens. That’s a powerful position, Calo says. “There are lots of ways for it to skew your perception of the world.” And that’s precisely what the filter bubble does. THE FILTER BUBBLE’S costs are both personal and cultural.

Personalization can get in the way of creativity and innovation in three ways. First, the filter bubble artificially limits the size of our “solution horizon”—the mental space in which we search for solutions to problems. Second, the information environment inside the filter bubble will tend to lack some of the key traits that spur creativity. Creativity is a context-dependent trait: We’re more likely to come up with new ideas in some environments than in others; the contexts that filtering creates aren’t the ones best suited to creative thinking. Finally, the filter bubble encourages a more passive approach to acquiring information, which is at odds with the kind of exploration that leads to discovery.

pages: 382 words: 105,819

Zucked: Waking Up to the Facebook Catastrophe
by Roger McNamee
Published 1 Jan 2019

Giving users what they want sounds like a great idea, but it has at least one unfortunate by-product: filter bubbles. There is a high correlation between the presence of filter bubbles and polarization. To be clear, I am not suggesting that filter bubbles create polarization, but I believe they have a negative impact on public discourse and politics because filter bubbles isolate the people stuck in them. Filter bubbles exist outside Facebook and Google, but gains in attention for Facebook and Google are increasing the influence of their filter bubbles relative to others. Everyone on Facebook has friends and family, but many are also members of Groups.

And yet people believed it, one so deeply that he, in his own words, “self-investigated it” and fired three bullets into the pizza parlor. How can that happen? Filter bubbles. What differentiates filter bubbles from normal group activity is intellectual isolation. Filter bubbles exist wherever people are surrounded by people who share the same beliefs and where there is a way to keep out ideas that are inconsistent with those beliefs. They prey on trust and amplify it. They can happen on television when the content is ideologically extreme. Platforms have little incentive to eliminate filter bubbles because they improve metrics that matter: time on site, engagement, sharing. They can create the illusion of consensus where none exists.

At the time I did not see a way for me to act on Eli’s insight at Facebook. I no longer had regular contact with Zuck, much less inside information. I was not up to speed on the engineering priorities that had created filter bubbles or about plans for monetizing them. But Eli’s talk percolated in my mind. There was no good way to spin filter bubbles. All I could do was hope that Zuck and Sheryl would have the sense not to use them in ways that would harm users. (You can listen to Eli Pariser’s “Beware Online ‘Filter Bubbles’” talk for yourself on TED.com.) Meanwhile, Facebook marched on. Google introduced its own social network, Google+, in June 2011, with considerable fanfare.

pages: 592 words: 125,186

The Science of Hate: How Prejudice Becomes Hate and What We Can Do to Stop It
by Matthew Williams
Published 23 Mar 2021

The tactic of piggy-backing on a mainstream news item to push an alt- or far-right agenda has resulted in some figures becoming YouTube stars, including Infowars’ Paul Joseph Watson. Extreme filter bubbles Right-wing political and far-right figures are known to use online filter bubbles to drum up support for their campaigns. Columbia University Professor Jonathan Albright was one of the first scientists to map the alt-right ‘fake news’ filter bubble, identifying where links to the sites were being shared. He found thousands of web-pages and millions of links spread over not only Facebook, YouTube and Twitter but also the New York Times, Washington Post and many other mainstream sites.7 This filter bubble was occupied not solely by alt-right ‘keyboard warriors’, but also major political figures, and rising international stars in the alt-right movement.

But the use of new ‘deep learning’ technology that is informed by billions of user behaviours a day means extreme videos will continue to be recommended if they are popular with site visitors. Filter bubbles and our bias Research on internet ‘filter bubbles’, often used interchangeably with the term ‘echo chambers’,§ has established that partisan information sources are amplified in online networks of like-minded social media users, where they go largely unchallenged due to ranking algorithms filtering out any challenging posts.9 Data science shows these filter bubbles are resilient accelerators of prejudice, reinforcing and amplifying extreme viewpoints on both sides of the spectrum.

Looking at over half a million tweets covering the issues of gun control, same-sex marriage and climate change, New York University’s Social Perception and Evaluation Lab found that hateful posts related to these issues increased retweeting within filter bubbles, but not between them. The lack of inter-filter bubble retweeting is facilitated by Twitter’s ‘timeline’ algorithm which prioritises content from the accounts that users most frequently engage with (via retweeting or liking). Given that these behaviours are highly biased towards accounts that share users’ views, exposure to challenging content is minimised by the algorithm. Filter bubbles therefore become further entrenched via a form of online confirmation bias, facilitated by posts and reposts that contain emotional content in line with held views on deeply moral issues.10 It therefore seems likely that at points in time when such issues come to the fore, say during court cases, political votes or following a school shooting, occupants of filter bubbles (likely to be a significant number of us who don’t sit on the fence) hunker down and polarise the debate.

pages: 476 words: 125,219

Digital Disconnect: How Capitalism Is Turning the Internet Against Democracy
by Robert W. McChesney
Published 5 Mar 2013

Madrigal, “I’m Being Followed.” 182. Turow, Daily You, 69, 118–22. 183. Madrigal, “I’m Being Followed.” 184. Pariser, Filter Bubble, 49. 185. Alan D. Mutter, “Retailers Are Routing Around the Media,” Reflections of a Newsosaur, Mar. 13, 2012, newsosaur.blogspot.com/2012/03/retailers-are-routing-around-media.html. 186. “Mad Men Are Watching You.” 187. Turow, Daily You, 159. 188. Pariser, Filter Bubble, 120–21. 189. “The All-Telling Eye,” The Economist, Oct. 22, 2011, 100–101. 190. Pariser, Filter Bubble, 120–21. 191. Alan D. Mutter, “Newspaper Digital Ad Share Hits All-Time Low,” Reflections of a Newsosaur, Apr. 23, 2012, newsosaur.blogspot.com/2012/04/newspaper-digital-ad-share-hits-all.html. 192.

It seems to me that even if we could network all the potential aliens in the galaxy—quadrillions of them, perhaps—and get each of them to contribute some seconds to a physics wiki, we would not replicate the achievements of even one mediocre physicist, much less a great one.”32 In his 2011 book, The Filter Bubble, Eli Pariser argues that because of the way Google and social media have evolved, Internet users are increasingly and mostly unknowingly led into a personalized world that reinforces their known preferences. This “filter bubble” each of us resides in undermines the common ground needed for community and democratic politics; it also eliminates “‘meaning threats,’ the confusing, unsettling occurrences that fuel our desire to understand and acquire new ideas.”

Acxiom combines extensive offline data with online data. See Natasha Singer, “A Data Giant Is Mapping, and Sharing, the Consumer Genome,” New York Times, Sunday Business Section, June 17, 2012, 1, 8. How comprehensive is their data set? In The Filter Bubble, 42–43, Pariser writes about how the Bush White House discovered that Acxiom had more data on eleven of the nineteen 9-11 hijackers than the entire U.S. government did. 126. Pariser, Filter Bubble, 7. See also Noam Cohen, “It’s Tracking Your Every Move and You May Not Even Know It,” New York Times, Mar. 26, 2011, A1, A3. 127. Peter Maass and Megha Rajagopalan, “That’s No Phone, That’s My Tracker,” New York Times, July 13, 2012. 128.

pages: 276 words: 81,153

Outnumbered: From Facebook and Google to Fake News and Filter-Bubbles – the Algorithms That Control Our Lives
by David Sumpter
Published 18 Jun 2018

Start on a conservative page then 20 clicks later you will probably still be reading conservative material. Each set of bloggers had created their own world, within which their views reverberated. Filter bubbles came later and are still developing. The difference between ‘filtered’ and ‘echoed’ cavities lies in whether they are created by algorithms or by people. While the bloggers chose the links to different blogs, algorithms based on our likes, our web searches and our browsing history do not involve an active choice on our part. It is these algorithms that can potentially create a filter bubble.1 Each action you make in your web browser is used to decide what to show you next. Whenever you share an article from, for example, the Guardian newspaper, Facebook updates its databases to reflect the fact that you are interested in the Guardian.

C., Haddadi, H. and Seto, M. C. 2016. ‘A first look at user activity on Tinder.’ Advances in Social Networks Analysis and Mining (ASONAM), 2016 IEEE/ACM International Conference pp. 461–6. IEEE. Chapter 11 : Bubbling Up 1 In his book and TED Talk on the filter bubble, Eli Pariser revealed the extent to which our online activities are personalised. Pariser, Eli. 2011. The Filter Bubble: How the new personalized web is changing what we read and how we think. Penguin. Google, Facebook and other big Internet companies store data documenting the choices we make when we browse online and then use it to decide what to show us in the future. 2 https://newsroom.fb.com/news/2016/04/news-feed-fyi-from-f8-how-news-feed-works 3 www.techcrunch.com/2016/09/06/ultimate-guide-to-the-news-feed 4 In the model, the probability a user chooses the Guardian at time t is equal to where G(t) is the number of times the user has already chosen the Guardian and T(t) is the number of times the user has chosen the Telegraph.

During the months that followed my visit to Google in May 2016, I started to see a new type of maths story in the newspapers. An uncertainty was spreading across Europe and the US. Google’s search engine was making racist autocomplete suggestions; Twitterbots were spreading fake news; Stephen Hawking was worried about artificial intelligence; far-right groups were living in algorithmically created filter-bubbles; Facebook was measuring our personalities, and these were being exploited to target voters. One after another, the stories of the dangers of algorithms accumulated. Even the mathematicians’ ability to make predictions was called into question as statistical models got both Brexit and Trump wrong.

pages: 137 words: 38,925

The Death of Truth: Notes on Falsehood in the Age of Trump
by Michiko Kakutani
Published 17 Jul 2018

“binary tribal world”: Sykes, “How the Right Lost Its Mind and Embraced Donald Trump”; Sykes, “Charlie Sykes on Where the Right Went Wrong.” “In the new Right media culture”: Charles Sykes, How the Right Lost Its Mind (New York: St. Martin’s Press, 2017), 180. “With Google personalized”: Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011), 3. “an endless you-loop”: Ibid., 16. “If algorithms are going to curate”: Eli Pariser, “Beware Online ‘Filter Bubbles,’ ” TED2011, ted.com. 7. ATTENTION DEFICIT “When you want to know”: William Gibson, Zero History (New York: Putnam, 2010), 212. Tim Berners-Lee: “History of the Web: Sir Tim Berners-Lee,” World Wide Web Foundation.

What’s alarming to the contemporary reader is that Arendt’s words increasingly sound less like a dispatch from another century than a chilling mirror of the political and cultural landscape we inhabit today—a world in which fake news and lies are pumped out in industrial volume by Russian troll factories, emitted in an endless stream from the mouth and Twitter feed of the president of the United States, and sent flying across the world through social media accounts at lightning speed. Nationalism, tribalism, dislocation, fears of social change, and the hatred of outsiders are on the rise again as people, locked in their partisan silos and filter bubbles, are losing a sense of shared reality and the ability to communicate across social and sectarian lines. This is not to draw a direct analogy between today’s circumstances and the overwhelming horrors of the World War II era but to look at some of the conditions and attitudes—what Margaret Atwood has called the “danger flags” in Orwell’s 1984 and Animal Farm—that make a people susceptible to demagoguery and political manipulation, and nations easy prey for would-be autocrats.

Part of the problem is an “asymmetry of passion” on social media: while most people won’t devote hours to writing posts that reinforce the obvious, DiResta says, “passionate truthers and extremists produce copious amounts of content in their commitment to ‘wake up the sheeple.’ ” Recommendation engines, she adds, help connect conspiracy theorists with one another to the point that “we are long past merely partisan filter bubbles and well into the realm of siloed communities that experience their own reality and operate with their own facts.” At this point, she concludes, “the Internet doesn’t just reflect reality anymore; it shapes it.” 5 THE CO-OPTING OF LANGUAGE Without clear language, there is no standard of truth.

pages: 270 words: 79,992

The End of Big: How the Internet Makes David the New Goliath
by Nicco Mele
Published 14 Apr 2013

I chuckled—it was a surprise, and I’m pretty sure my distinguished “friend” had no idea that the “Social Reader” was reporting back his every click to people like me. The Filter Bubble Entertainment consumption within our new social networking life also increasingly threatens to render us more isolated from one another. In the past, every American watched one of three television channels and read a daily newspaper, even if only for sports scores and coupons. A big, shared public sphere existed in which politicians, policy makers, leaders, and public intellectuals could argue and debate—what we commonly call the court of public opinion. By contrast, with the End of Big we inhabit a “filter bubble” in which our digital media sources—primarily Google and Facebook—serve up content based on what they think we want to read.32 Even your newsfeed on Facebook is algorithmically engineered to give you the material you’re most likely to click on, creating a perverse kind of digital narcissism, always serving you up the updates you want most.

Right now, we already have all the fracturing we can take—thanks to cultural trends associated with radical connectivity. Our former sense of citizenship, of belonging to a larger commonwealth, has given way to the “filter bubble” we now inhabit, in which our digital media sources serve up content based on what they think we want to read, creating a perverse kind of digital narcissism. Eli Pariser opens his book The Filter Bubble by describing two friends with similar demographic profiles who each google “BP” in the midst of the oil company’s disastrous Gulf of Mexico oil spill. One of his friends gets stock quotes and links to the company’s annual report; the other gets news articles about the spill and environmental activist alerts.

By contrast, with the End of Big we inhabit a “filter bubble” in which our digital media sources—primarily Google and Facebook—serve up content based on what they think we want to read.32 Even your newsfeed on Facebook is algorithmically engineered to give you the material you’re most likely to click on, creating a perverse kind of digital narcissism, always serving you up the updates you want most. Nicholas Negroponte called it the “Daily Me,” but it was Eli Pariser who coined the term “filter bubble” in his book of the same name.33 The danger of the personalization of entertainment becomes clear when we remember entertainment’s traditional social functions. We need quality entertainment not just because it’s fun but also because it brings us together as a democratic society. This is one of the few areas of life where we share a common bond—and it’s rapidly going away.

pages: 475 words: 134,707

The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health--And How We Must Adapt
by Sinan Aral
Published 14 Sep 2020

Second, algorithmic curation caused a filter bubble that significantly reduced news diversity. Readers who came back to the website multiple times were randomly assigned to the algorithm or to the humans each time. When they were assigned to the algorithm, the filter bubble set in, and they read more narrowly. When they were assigned to the page edited by humans, they read more widely. Third, algorithmic curation didn’t just narrow readers’ options for diverse content, it caused their reading choices to narrow as well. In other words, the filter bubble was not limited to the fourth slot in the newsfeed—it spilled over into consumption choices more generally.

In contrast, what it means to be a Democrat or a Republican has become more homogenized within the two parties. Finally, Levy found that Facebook’s newsfeed algorithm does create a filter bubble. The algorithm is less likely to supply people with news that runs counter to their preexisting attitudes. Even though the experiment inspired people to read opposing viewpoints, Facebook’s algorithm continued to supply them with content that leaned toward their prior political views despite their having subscribed to counterattitudinal sources. Those who claim the Hype Machine is polarizing argue that it creates filter bubbles that in turn polarize us. While no hard evidence directly confirms or denies the Hype Machine’s role in polarization, evidence from multiple experimental studies shows that the machine’s recommendation algorithms create filter bubbles of polarized content consumption.

mod=article_inline; Aaron Zitner and Dante Chinni, “Democrats and Republicans Live in Different Worlds,” Wall Street Journal, September 20, 2019. partisans also select into right- or left-leaning media audiences: Kevin Arceneaux and Martin Johnson, Changing Minds or Changing Channels? Partisan News in an Age of Choice (Chicago: University of Chicago Press, 2013). “filter bubbles” of polarized content: Cass R. Sunstein, Republic.com (Princeton: Princeton University Press, 2001); Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (London: Penguin UK, 2011). Some studies find small increases in polarization with Internet use: Yphtach Lelkes, Gaurav Sood, and Shanto Iyengar, “The Hostile Audience: The Effect of Access to Broadband Internet on Partisan Affect,” American Journal of Political Science 61, no. 1 (2017): 5–20.

pages: 145 words: 41,453

You Are What You Read
by Jodie Jackson
Published 3 Apr 2019

This personalisation of the news is not unique to Yahoo; many other news organisations have been doing the same in a bid to build audience engagement. Eli Pariser, activist and chief executive of Upworthy, termed this ‘the filter bubble’. He describes it as ‘your own personal, unique universe of information that you live in online’.10 Eli points out that although this digital universe is controlled for you, it is not created by you – you don’t decide what enters your filter bubble and you don’t see what gets left out. The ‘give them what they want’ mentality the news industry has pushed through social media ends up inflating our filter bubble rather than bursting it. In our modern consumer environment, we have an immediate availability of products, entertainment, food and information.

Notes 1 Harbison, F., quoted in Teheranian, M., Communications policy for national development, Routledge, London, 2016. 2 Schramm, W., Mass media and national development, Stanford University Press, Stanford, 1973. 3 The Total Audience Report: Q1 2016, Nielsen.com, available at: http://www.nielsen.com/us/en/insights/reports/2016/the-total-audience-report-q1-2016.html 4 Johnson, C., The Information Diet: A Case for Conscious Consumption, O’Reilly Media, London, 2015. 5 Ibid., p. 31. 6 Ibid., p. 35. 7 Merrill, J., The Elite Press, Pitman, New York, 1968, p. 20. 8 Mitchell, A., Stocking, G. and Matsa, K., ‘Long-Form Reading Shows Signs of Life in Our Mobile News World’, Pew Research Center for Journalism & Media, 5 May 2016. 9 Shearer, E. and Gottfried, J., News Use Across Social Media Platforms 2017, Pew Research Center’s Journalism Project, 2018, available at: http://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/ 10 Pariser, E., Beware online ‘filter bubbles’, Ted.com, 2018, available at: https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles 11 Vinderslev, A., BuzzFeed: The top 10 examples of BuzzFeed doing native advertising, Native Advertising Institute, 2018, assssssvailable at: https://nativeadvertisinginstitute.com/blog/10-examples-buzzfeed-native-advertising/ RESOURCES News organisations: ‘BBC World Hacks’ BRIGHT Magazine The Correspondent Delayed Gratification INKLINE Monocle News Deeply The Optimist Daily Positive News Solutions Journalism Network Sparknews ‘The Upside’ (by the Guardian) The Week ‘What’s Working’ (by the Huffington Post) YES!

Refuse to accept that there is only one way that the news should be; refuse to accept that negative news is the only narrative worth telling; refuse to accept that the news ‘is the way that it is’ and instead decide that it should be more balanced in its coverage. And then start making changes and choices that reflect this. There are six effective ways we can change our media diet in a way that will help us become more informed, engaged and empowered: 1. Become a conscious consumer 2. Read/watch good-quality journalism 3. Burst your filter bubble 4. Be prepared to pay for content 5. Read beyond the news 6. Read solutions-focused news 1. Become a conscious consumer The freedom of the press is sacrosanct in most modern-day democracies. This freedom provides an awfully long lead on what kind of product they are able to produce. They have almost free reign to exploit our appetite for sensationalised bad news.

pages: 283 words: 85,824

The People's Platform: Taking Back Power and Culture in the Digital Age
by Astra Taylor
Published 4 Mar 2014

We are not purposefully retreating into our own distinct worlds, becoming more insular. Instead, invisible filter bubbles are imposed on us. Online, no action goes untracked. Our prior choices are compiled, feeding the ids of what we could call algorithmic superegos—systems that determine what we see and what we don’t, channeling us toward certain choices while cutting others off. And while they may make the Internet less overwhelming, these algorithms are not neutral. “The rush to build the filter bubble is absolutely driven by commercial interests,” Pariser warns. “It’s becoming clearer and clearer that if you want to have lots of people use your Web site, you need to provide them with personally relevant information, and if you want to make the most money on ads, you need to provide them with relevant ads.”41 Ironically, what distinguishes this process from what Nicholas Negroponte enthusiastically described as “the Daily me”—the ability of individuals to customize their media diets thanks to digital technology—is that the personalization trend is not driven by individual demand but by the pursuit of profit via targeted advertising.

In the popular imagination, either the Internet has freed us from the stifling grip of the old, top-down mass media model, transforming consumers into producers and putting citizens on par with the powerful, or we have stumbled into a new trap, a social media hall of mirrors made up of personalized feeds, “filter bubbles,” narcissistic chatter, and half-truths. Young people are invoked to lend credence to both views: in the first scenario, they are portrayed as empowered and agile media connoisseurs who, refusing to passively consume news products handed down from on high, insist on contributing to the conversation; in the second, they are portrayed as pliant and ill-informed, mistaking what happens to interest them for what is actually important.

New mechanisms have emerged that sift through the chaos of online content, shaping it into a targeted stream. As a consequence, our exposure to difference may actually decrease. Eli Pariser, the former executive director of MoveOn.org and founder of the viral content site Upworthy, calls this problem the “filter bubble,” a phenomenon that stems from the efforts of new-media companies to track the things we like and try to give us more of the same. These mechanisms are “prediction engines,” Pariser says, “constantly creating and refining a theory of who you are and what you’ll want next.”40 This kind of personalization is already part of our daily experience in innumerable ways.

The Smartphone Society
by Nicole Aschoff

See, for example, Toplensky, “EU Fines Google €2.4bn over Abuse of Search Dominance”; Waters, Toplensky, and Ram, “Brussels’ €2.4bn Fine Could Lead to Damages Cases and Probes in Other Areas of Search;” Barker and Khan, “EU Fines Google Record €4.3bn over Android.” 45. For a good synopsis of Pariser’s ideas, see Beware Online Filter Bubbles, video of TED Talk, https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=en; see also Pariser, The Filter Bubble. 46. Zuckerberg himself used to refer to Facebook as a “social utility,” but in recent years he has eschewed this terminology, possibly because the implications of Facebook’s being a utility are far afield from his vision for the company. 47.

When we pull out our phones to search for, say, the presidential election, we most likely get very different results from another person sitting next to us doing an identical search on their phone. The information we receive is tailored to us on the basis of our past searches, websites visited, our current location, what type of phone we’re using, and a host of other factors. As Pariser says, we each live in a “filter bubble”—our “own personal unique universe of information.” The idea of a standard Google in which everyone typing the same query gets the same search results simply no longer exists.45 Personalization is not limited to Google; Facebook, Yahoo, Twitter, even the websites of major newspapers are using personalization to increase user engagement.

He says, “Social media makes it easier for people to surround themselves (virtually) with the opinions of likeminded others and insulate themselves from competing views.” Likening social media to a disease vector, Sunstein surmises that it is “potentially dangerous for democracy and social peace.”56 In these days of filter bubbles and algorithmically generated search results it takes effort to seek out opposing political views. Our social media feeds largely show us political content that we already “like.” If we don’t hear other people’s perspective, can we agree on a collective political project? Others say much of the behavior associated with modern social movement organizing is not real politics.

pages: 234 words: 67,589

Internet for the People: The Fight for Our Digital Future
by Ben Tarnoff
Published 13 Jun 2022

“Trading up the chain”: Alice Marwick and Rebecca Lewis, “Media Manipulation and Disinformation Online,” Data & Society Research Institute, May 15, 2017. 142, This messiness is manifest … Lack of evidence for “filter bubbles”: Peter M. Dahlgren, “A Critical Review of Filter Bubbles and a Comparison with Selective Exposure,” Nordicom Review 42, no. 1 (2021): 15–33; Axel Bruns, “Filter Bubble,” Internet Policy Review 8, no. 4 (2019). “Widespread heterogeneity …”: P. M. Krafft and Joan Donovan, “Disinformation by Design: The Use of Evidence Collages and Platform Filtering in a Media Manipulation Campaign,” Political Communication 37, no. 2 (2020): 195.

It is designed to elicit strong emotions like anger so that it can spread more easily. It spreads more easily because social media sites are optimized for user engagement—capturing more user attention means collecting more advertising dollars—and content that provokes and enrages naturally gets more engagement. Finally, algorithmically generated echo chambers—“filter bubbles”—create perfect vectors for such content. Insulated from the moderating influence of mainstream media, and from the dissenting views of other users, these bubbles become breeding grounds for extremism. There are elements of truth to this story. But it is mostly misleading. It makes a number of questionable assumptions, and rests on an oversimplified, mechanistic model of how software and psychology and politics interact.

The information that people encounter online matters, but its relationship with their beliefs is nonlinear: Dylann Roof did not became a neo-Nazi on the basis of a single Google search. Human beings are complicated and contradictory. So is the process whereby they acquire their ideological frames. This messiness is manifest in online spaces, contrary to the “filter bubbles” thesis—which, like the theory that polarization is produced by social media, has scant evidence to support it. People can and do find like-minded interlocutors on the internet, and the algorithms that underpin social media feeds and recommendation systems can contribute to these clusterings.

pages: 380 words: 109,724

Don't Be Evil: How Big Tech Betrayed Its Founding Principles--And All of US
by Rana Foroohar
Published 5 Nov 2019

In these and many other senses, the digital revolution is a miraculous and welcome development. But in order to ultimately reap the benefits of technology in a broad way, we need a level playing field, so that the next generation of innovators is allowed to thrive. We don’t yet live in that world. Big Tech has reshaped labor markets, exacerbated income inequality, and pushed us into filter bubbles in which we get only the information that confirms the opinions we already have. But it hasn’t provided solutions for these problems. Instead of enlightening us, it is narrowing our view; instead of bringing us together, it is tearing us apart. With each buzz and beep of our phones, each automatically downloaded video, each new contact popping up in our digital networks, we get just a glimmer of a vast new world that is, frankly, beyond most people’s understanding, a bizarre land of information and misinformation, of trends and tweets, and of high-speed surveillance technology that has become the new normal.

The Data-Industrial Complex A few years back, Guillaume Chaslot, a former engineer for YouTube who is now at the Center for Humane Technology, a group of Silicon Valley refugees who are working to create less harmful business models for Big Tech, was part of an internal project at YouTube, the content platform owned by Google,2 to develop algorithms that would increase the diversity and quality of content seen by users. It was an initiative that had begun in response to the “filter bubbles” that were proliferating online, in which people would end up watching the same mindless or even toxic content again and again, because algorithms that tracked them as they clicked on cat videos or white supremacist propaganda once would suggest the same type of content again and again, assuming (often correctly) that this was what would keep them coming back and watching more—thus allowing YouTube to make more money from the advertising sold against that content.

It was one of many, many letters that he and other senators have written in recent years, trying to get Big Tech to change its behavior. But simply making a few half-hearted efforts on the margins—adding a few human watchdogs here and there, or reiterating their supposed commitment to quality content over propaganda—is akin to trying to treat an aggressive cancer with a multivitamin. Why? Because these problems—filter bubbles, fake news, data breaches, and fraud—are all at the center of the most malignant—and profitable—business model in the world: that of data mining and hyper-targeted advertising.10 The Aura of Science The Cambridge Analytica scandal, whereby it was revealed that the Facebook platform had been exploited by foreign actors to influence the outcome of the 2016 presidential election—precipitated a huge rise in public awareness of how social media and its advertising-driven revenue model could pose a threat to liberal democracy.

The Internet Trap: How the Digital Economy Builds Monopolies and Undermines Democracy
by Matthew Hindman
Published 24 Sep 2018

Facebook in particular has emphasized hyperpersonalization, with Facebook founder and ceo Mark Zuckerberg stating that “a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.”5 With the rise of the iPad and its imitators, Negroponte’s idea that all of this personalized content would be sent to a thin, lightweight, “magical” tablet device has been partially realized, too. Scholarship such as Siva Vaidhyanathan’s The Googlization of Everything and Joe Turow’s The Daily You has viewed the trend toward personalized content and ubiquitous filtering as a part of a worrying concentration of corporate power. Eli Pariser’s bestselling book The Filter Bubble voices similar worries. But for journalism and media scholarship as a whole, as Barbie Zelizer has noted, there has been surprisingly little work on recommender systems.6 To the extent that algorithmic news filtering has been discussed at all, it was long unhelpfully lumped with a grab bag of different site features under the heading of “interactivity.”7 Research by Neil Thurman and Steve Schifferes has provided a taxonomy of different forms of personalization and has chronicled their (mostly growing) deployment across different news sites.8 Even Thurman and Schifferes’s work, however, has said little about recommender systems because traditional news organizations lagged in deploying them.

After years of effort, the contest had ended in a photo finish. The Lessons of the Netflix Prize Why should those interested in online audiences or digital news care about the Netflix Prize? One answer is that these recommender systems now have enormous influence on democratic discourse. In his book The Filter Bubble progressive activist Eli Pariser reported that posts from conservative friends were systematically excluded from his Facebook feed. This sort of filtering heightens concerns about partisan echo chambers, and it might make it harder for citizens to seek out opposing views even if they are inclined to.

In one way, however, Netflix’s example calls into question claims that filtering technologies will end up promoting echo chambers and eliminating serendipitous exposure. Such worries have been a centerpiece of scholarship on personalized news over the past decade (see earlier discussion). One of Pariser’s key claims about what he terms the “filter bubble” is that it is ostensibly invisible to users.32 Netflix, however, tries hard to make users aware of its recommendation system: “We want members to be aware of how we are adapting to their tastes. This not only promotes trust in the system, but encourages members to give feedback that will result in better recommendations.”33 Netflix also attempts to explain (in an oversimplified way) why specific movies are recommended, typically highlighting its recommendations’ similarity to movies the user has already rated.

pages: 268 words: 75,850

The Formula: How Algorithms Solve All Our Problems-And Create More
by Luke Dormehl
Published 4 Nov 2014

“I’m interested in trying to subvert all of that; removing the clutter and noise to create a more efficient way to help users gain access to things.” The problem, of course, is that in order to save you time by removing the “clutter” of the online world, Nara’s algorithms must make constant decisions on behalf of the user about what it is that they should and should not see. This effect is often called the “filter bubble.” In his book of the same title, Eli Pariser notes how two different users searching for the same thing using Google will receive very different sets of results.36 A liberal who types “BP” into his browser might get information about the April 2010 oil spill in the Gulf of Mexico, while a conservative typing the same two letters is more likely to receive investment information about the oil company.

Unlike the libertarian technologist’s pipe dream of a world that is free, flat and open to all voices, a key component of code and algorithmic culture is software’s task of sorting, classifying and creating hierarchies. Since so much of the revenue of companies like Google depends on the cognitive capital generated by users, this “software sorting” immediately does away with the idea that there is no such thing as a digital caste system. As with the “filter bubble,” it can be difficult to tell whether the endless distinctions made regarding geo-demographic profiles are helpful examples of mass customization or exclusionary examples of coded discrimination. Philosopher Félix Guattari imagined the city in which a person was free to leave their apartment, their street or their neighborhood thanks to an electronic security card that raised barriers at each intersection.

“[A] map was just a map, and you got the same one for New York City, whether you were searching for the Empire State Building or the coffee shop down the street. What if, instead, you had a map that’s unique to you, always adapting to the task you want to perform right this minute?” But while this might be helpful in some senses, its intrinsic “filter bubble” effect may also result in users experiencing less of the serendipitous discovery than they would by using a traditional map. Like the algorithmic matching of a dating site, only those people and places determined on your behalf as suitable or desirable will show up.33 As such, while applying The Formula to the field of cartography might be a logical step for Google, it is potentially troubling.

pages: 259 words: 73,193

The End of Absence: Reclaiming What We've Lost in a World of Constant Connection
by Michael Harris
Published 6 Aug 2014

There’s always going to be both comfort food and something surprising.” Roman’s insistence on tastemaking flies in the face of most content providers, who seek only to gratify the known desires of users. And it’s an impulse that could go a long way toward countering something that Internet activist Eli Pariser has coined “the filter bubble.” Here’s how a filter bubble works: Since 2009, Google has been anticipating the search results that you’d personally find most interesting and has been promoting those results each time you search, exposing you to a narrower and narrower vision of the universe. In 2013, Google announced that Google Maps would do the same, making it easier to find things Google thinks you’d like and harder to find things you haven’t encountered before.

A restaurateur in Ottawa’s famous ByWard Market: “Marisol Simoes Jailed: Co-owner of Kinki and Mambo in Ottawa Gets 90 Days for Defamation,” Huffington Post, accessed January 16, 2014, http://www.huffingtonpost.ca/2012/11/16/marisol-simoes-jailed_n_2146205.html. “Today’s internet is killing our culture”: Andrew Keen, The Cult of the Amateur (New York: Doubleday/Currency, 2007). “the filter bubble”: Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin Press, 2011). Google announced that Google Maps: Evegny Morozov, “My Map or Yours?,” Slate, accessed September 4, 2013, http://www.slate.com/articles/technology/future_tense/2013/05/google_maps_personalization_will_hurt_public_space_and_engagement.html.

Eventually, the information you’re dealing with absolutely feels more personalized; it confirms your beliefs, your biases, your experiences. And it does this to the detriment of your personal evolution. Personalization—the glorification of your own taste, your own opinion—can be deadly to real learning. Only if sites like Songza continue to insist on “surprise” content will we escape the filter bubble. Praising and valuing those rare expert opinions may still be the best way to expose ourselves to the new, the adventurous, the truly revelatory. • • • • • Commensurate with the devaluing of expert opinion is the hypervaluing of amateur, public opinion—for its very amateurism. Often a comment field will be freckled with the acronym IMHO, which stands for the innocuous phrase “in my honest opinion” (or, alternatively, “in my humble opinion”).

pages: 151 words: 39,757

Ten Arguments for Deleting Your Social Media Accounts Right Now
by Jaron Lanier
Published 28 May 2018

Armies of trolls and fake trolls would game the system and add enough cruel podcast snippets to the mix that your digest would become indigestible. Even the sweetest snippets would become mere garnishes on a cruel, paranoid, enraging, and crazy-making sonic soup. Or, maybe your aggregated podcast will be a filter bubble. It will include only voices you agree with—except they won’t really be voices, because the content will all be mushed together into a stream of fragments, a caricature of what listeners supposedly hold in common. You wouldn’t even live in the same universe as someone listening to a different aggregation.

I still believe that it’s possible for tech to serve the cause of empathy. If a better future society involves better tech at all, empathy will be involved. But BUMMER is precisely tuned to ruin the capacity for empathy. DIGITALLY IMPOSED SOCIAL NUMBNESS A common and correct criticism of BUMMER is that it creates “filter bubbles.”3 Your own views are soothingly reinforced, except when you are presented with the most irritating versions of opposing views, as calculated by algorithms. Soothe or savage: whatever best keeps your attention. You are drawn into a corral with other people who can be maximally engaged along with you as a group.

(But, to review, the term should be “manipulate,” not “engage,” since it’s done in the service of unknown third parties who pay BUMMER companies to change your behavior. Otherwise, what are they paying for? What else could Facebook say it’s being paid tens of billions of dollars to do?) On the face of it, filter bubbles are bad, because you see the world in tunnel vision. But are they really new? Surely there were damaging and annoying forms of exclusionary social communication that predate BUMMER, including the use of racist “dog whistles” in politics. For example, in the 1988 American presidential election, politicians famously used the story of a black man named Willie Horton who had committed crimes after a prison furlough in order to evoke latent racism in the electorate.

pages: 437 words: 105,934

#Republic: Divided Democracy in the Age of Social Media
by Cass R. Sunstein
Published 7 Mar 2017

doid=2488388.2488435 (accessed August 23, 2016). In a similar vein, a 2015 paper takes initial steps toward visualizing filter bubbles in Google and Bing searches, finding that both search engines do create filter bubbles; the authors, however, note that it appears that filter bubbles may be stronger within certain topics (such as results of searches about jobs) than within others (such as results from searches about asthma). Tawanna R. Dillahunt, Christopher A. Brooks, and Samarth Gulati, “Detecting and Visualizing Filter Bubbles in Google and Bing,” Proceedings of the Thirty-Third Annual ACM Conference: Extended Abstracts on Human Factors in Computing Systems (New York: Association for Computing Machinery, 2015), 1851–56, http://dl.acm.org/citation.cfm?

(working paper, MIT Sloan School, Cambridge, MA, 1996), http://web.mit.edu/marshall/www/papers/CyberBalkans.pdf (accessed August 23, 2016). 2.In a provocative 2011 book, Eli Pariser popularized a theory of “filter bubbles” in which he posited that due to the effects of algorithmic filtering, Internet users are likely to be provided with information that conforms to their existing interests and, in effect, is isolated from differing viewpoints. We continue to obtain evidence on the phenomenon. Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin Press, 2011). A 2013 paper measures the effect of search personalization on Google, concluding that 11.7 percent of Google search results differed between users due to personalization—a finding that the authors describe as “significant personalization.”

Adamic, “Exposure to Ideologically Diverse News and Opinion on Facebook,” Science 348, no. 6239 (2015): 1130–32. 46.Ibid., 1132. 47.Eytan Bakshy, Solomon Messing, and Lada Adamic, “Exposure to Diverse Information on Facebook,” Research at Facebook, May 7, 2015, https://research.facebook.com/blog/exposure-to-diverse-information-on-facebook/ (accessed September 6, 2016). 48.See, for example, Chris Cillizza, “Why Facebook’s News Feed Changes Are Bad News for News, Washington Post, June 29, 2016, https://www.washingtonpost.com/news/the-fix/wp/2016/06/29/why-facebooks-news-feed-changes-are-bad-news/?tid=sm_tw_pp&wprss=rss_the-fix (accessed September 6, 2016). 49.Quoted in Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin Press, 2011), 1. 50.Moshe Blank and Jie Xu, “News Feed FYI: More Articles You Want to Spend Time Viewing,” Facebook Newsroom, April 21, 2016, http://newsroom.fb.com/news/2016/04/news-feed-fyi-more-articles-you-want-to-spend-time-viewing// (accessed September 6, 2016). 51.See, for example, Alessandro Bessi, Fabiana Zollo, Michela Del Vicario, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi, “Trend of Narratives in the Age of Misinformation,” PLOS ONE 10, no. 8 (2015): 1–16, http://journals.plos.org/plosone/article/asset?

pages: 321 words: 105,480

Filterworld: How Algorithms Flattened Culture
by Kyle Chayka
Published 15 Jan 2024

In 2011, the writer and Internet activist Eli Pariser published his book The Filter Bubble, describing how algorithmic recommendations and other digital communication routes can silo Internet users into encountering only ideologies that match their own. The concept of filter bubbles has been debated over the decade since then, particularly in the context of political news media. Some evaluations, like Axel Bruns’s 2019 book Are Filter Bubbles Real?, have concluded that their effects are limited. Other scientific studies, like a 2016 investigation of filter bubbles in Public Opinion Quarterly, found that there is a degree of “ideological segregation,” particularly when it comes to opinion-driven content.

Other scientific studies, like a 2016 investigation of filter bubbles in Public Opinion Quarterly, found that there is a degree of “ideological segregation,” particularly when it comes to opinion-driven content. Yet culture and cultural taste have different dynamics than political content and ideological beliefs online; even though they travel through the same feeds, they are driven by different incentives. While political filter bubbles silo users into opposing factions by disagreement, cultural recommendations bring them together toward the goal of building larger and larger audiences for the lowest-common-denominator material. Algorithmic culture congregates in the center, because the decision to consume a piece of culture is rarely motivated by hate or conflict.

After the 2016 election of Donald Trump, the American public became slightly more aware of how we were being manipulated by algorithmic feeds. Democrats couldn’t understand how anyone had voted for Trump, given that their Facebook and Twitter feeds didn’t promote as many posts from the other side of the political spectrum, creating one of Eli Pariser’s filter bubbles, a digital echo chamber. Online, they lived in an illusion of total agreement that Trump was ridiculous. At the same time, his supporters were surrounded by content that reinforced their own views—another form of homogeneity. Recommender systems had sorted the audiences into two neat categories that didn’t need to overlap, whereas in a human-edited newspaper or television news program, some mutual exposure might have been more likely.

pages: 327 words: 88,121

The Vanishing Neighbor: The Transformation of American Community
by Marc J. Dunkelman
Published 3 Aug 2014

No longer do we have to trudge to the library and pull out a volume of the Reader’s Guide to Periodical Literature to find an article detailing the electoral landscape in the panhandle of Michigan; with a few keystrokes from our living room couch we can call up every article ever written on the subject. The advent of social networking has only served to press the point further. As Eli Pariser recently argued in The Filter Bubble, without our even knowing it, many software companies have found ways to predict the information we want and provide it to the exclusion of everything else.5 As a further convenience (of sorts), the previous searches need not be related; it’s now possible, for example, to associate our political sensibilities with our taste in restaurants and alcohol.

Nevertheless, it seems less likely that networks of individuals ensconced in conversations on topics with which they agree will yield as many big breakthroughs. And that marks our central challenge. No one can claim credibly that Americans are intellectually isolated today, even if we are caught in what Eli Pariser once termed “filter bubbles.”36 What remains to be seen—and what ought to worry anyone looking at the future of economic growth—is whether the new arena for the Medici effect will, in the end, prove as effective as the old. The manner in which Americans organized themselves—what Tocqueville saw as such a crucial element of American exceptionalism—had an altogether underappreciated effect on the growth and dynamism of the American economy.

Mann and Norman J. Ornstein, It’s Even Worse Than It Looks: How the American Constitutional System Collided with the New Politics of Extremism (New York: Basic Books, 2012), 59. 4Bill Carter, “Prime-Time Ratings Bring Speculation of a Shift in Habits,” New York Times, April 23, 2012. 5Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011), 6–10. 6Cramer, Ruby. “2 Charts That Explain What Your Food Says About Your Politics,” Buzzfeed.com, October 31, 2012, http://www.buzzfeed.com/rubycramer/2-charts-that-explain-what-your-food-says-about-yo. 7Natasha Singer, “Your Online Attention, Bought in an Instant,” New York Times, November 17, 2012. 8Kenneth T.

pages: 285 words: 86,853

What Algorithms Want: Imagination in the Age of Computing
by Ed Finn
Published 10 Mar 2017

Gillespie, “The Relevance of Algorithms”; Pariser, The Filter Bubble; Galloway, Protocol. 78. Turner, From Counterculture to Cyberculture, 17. 79. Galloway, Gaming, 76. 80. For more on this debate see Hayles, My Mother Was a Computer; Hansen, Bodies in Code, among many others. The topic is explored through a number of seminal cyberpunk novels, such as: Gibson, Neuromancer; Sterling, Schismatrix; and, of course, Stephenson, Snow Crash. 81. Bogost and Montfort, “Platform Studies.” 82. Kirschenbaum, Mechanisms, 10–11. 83. Ibid., 12–13. 84. Bogost, “The Cathedral of Computation.” 85. Pariser, The Filter Bubble. 86. Stephenson, Snow Crash, 434. 87.

Golumbia stands in here for a range of critics who argue that the de facto result of computational culture, at least if we do not intervene, is to reinforce state power. In later chapters we will examine more closely the “false personalization” that Tarleton Gillespie cautions against, extending Internet activist and author Eli Pariser’s argument in The Filter Bubble, as well as media theorist Alexander Galloway’s elegant framing of the political consequences of protocol.77 But Turner’s From Counterculture to Cyberculture offers a compelling view of how countercultural impulses were woven into the fabric of computational culture from the beginning. The figure of the hacker draws its lineage in part from the freewheeling discourse of the industrial research labs of the 1940s; the facilities that first created the opportunities for young people to not only work but play with computers.78 But, as Galloway has argued, the cybernetic paradigm has recast the playful magic of computation in a new light.

They reshape the spaces within which we see ourselves. Our literal and metaphorical footprints through real and virtual systems of information and exchange are used to shape the horizon ahead through tailored search results, recommendations, and other adaptive systems, or what Pariser calls the “filter bubble.”85 But when algorithms cross the threshold from prediction to determination, from modeling to building cultural structures, we find ourselves revising reality to accommodate their discrepancies. In any system dependent on abstraction there is a remainder, a set of discarded information—the différance, or the crucial distinction and deferral of meaning that goes on between the map and the territory.

Likewar: The Weaponization of Social Media
by Peter Warren Singer and Emerson T. Brooking
Published 15 Mar 2018

So subtle was the code that governed user experience on these platforms, most people had no clue that the information they saw might differ drastically from what others were seeing. Online activist Eli Pariser described the effect, and its dangerous consequences, in his 2011 book, The Filter Bubble. “You’re the only person in your bubble,” he wrote. “In an age when shared information is the bedrock of shared experience, the filter bubble is the centrifugal force, pulling us apart.” Yet, even as social media users are torn from a shared reality into a reality-distorting bubble, they rarely want for company. With a few keystrokes, the internet can connect like-minded people over vast distances and even bridge language barriers.

16,000,” New York Times, June 25, 2012, http://www.nytimes.com/2012/06/26/technology/in-a-big-network-of-computers-evidence-of-machine-learning.html. 249 “We never told it”: Ibid. 250 where to put the traffic lights: Erik Brynjolfsson and Andrew McAfee, “The Business of Artificial Intelligence,” Harvard Business Review, July 2017, https://hbr.org/2017/07/the-business-of-artificial-intelligence. 250 more than a million: Joaquin Quiñonero Candela, “Building Scalable Systems to Understand Content,” Facebook Code, February 2, 2017, https://code.facebook.com/posts/1259786714075766/building-scalable-systems-to-understand-content/. 250 wearing a black shirt: Ibid. 250 80 percent: “An Update on Our Commitment to Fight Violent Extremist Content Online,” Official Blog, YouTube, October 17, 2017, https://youtube.googleblog.com/2017/10/an-update-on-our-commitment-to-fight.html. 250 “attack scale”: Andy Greenberg, “Inside Google’s Internet Justice League and Its AI-Powered War on Trolls,” Wired, September 19, 2016, https://www.wired.com/2016/09/inside-googles-internet-justice-league-ai-powered-war-trolls/. 251 about 90 percent: Ibid. 251 thoughts of suicide: Vanessa Callison-Burch, Jennifer Guadagno, and Antigone Davis, “Building a Safer Community with New Suicide Prevention Tools,” Facebook Newsroom, March 1, 2017, https://newsroom.fb.com/news/2017/03/building-a-safer-community-with-new-suicide-prevention-tools/%20https://newsroom.fb.com/news/2017/03/building-a-safer-community-with-new-suicide-prevention-tools/. 251 database of facts: Jonathan Stray, “The Age of the Cyborg,” Columbia Journalism Review, Fall/Winter 2016, https://www.cjr.org/analysis/cyborg_virtual_reality_reuters_tracer.php. 251 managing the “trade-offs”: Kurt Wagner, “Facebook’s AI Boss: Facebook Could Fix Its Filter Bubble If It Wanted To,” Recode, December 1, 2016, https://www.recode.net/2016/12/1/13800270/facebook-filter-bubble-fix-technology-yann-lecun. 251 Their more advanced version: For a good overview, see “Cleverbot Data for Machine Learning,” Existor, accessed March 20, 2018, https://www.existor.com/products/cleverbot-data-for-machine-learning/. 252 machine-driven communications tools: Matt Chessen, “Understanding the Psychology Behind Computational Propaganda,” in Can Public Diplomacy Survive the Internet?

Yet these 4 billion flesh-and-blood netizens have now been joined by a vast number of digital beings, designed to distort and amplify, to confuse and distract. The attention economy may have been built by humans, but it is now ruled by algorithms—some with agendas all their own. Today, the ideas that shape battles, votes, and even our views of reality itself are propelled to prominence by this whirring combination of filter bubbles and homophily—an endless tide of misinformation and mysterious designs of bots. To master this system, one must understand how it works. But one must also understand why certain ideas take hold. The ensuing answers to these questions reveal the foundations of what may seem to be a bizarre new online world, but is actually an inescapable kind of war. 6 Win the Net, Win the Day The New Wars for Attention . . . and Power Media weapons [can] actually be more potent than atomic bombs.

pages: 243 words: 76,686

How to Do Nothing
by Jenny Odell
Published 8 Apr 2019

This chapter is also based on my personal experience learning about my bioregion for the first time, a new pattern of attention applied to the place I’ve lived in my entire life. If we can use attention to inhabit a new plane of reality, it follows that we might meet each other there by paying attention to the same things and to each other. In Chapter 5, I examine and try to dissolve the limits that the “filter bubble” has placed on how we view the people around us. Then I’ll ask you to stretch it even further, extending the same attention to the more-than-human world. Ultimately, I argue for a view of the self and of identity that is the opposite of the personal brand: an unstable, shapeshifting thing determined by interactions with others and with different kinds of places.

As Gordon Hempton, an acoustic ecologist who records natural soundscapes, put it: “Silence is not the absence of something but the presence of everything.”23 Unfortunately, our constant engagement with the attention economy means that this is something many of us (myself included) may have to relearn. Even with the problem of the filter bubble aside, the platforms that we use to communicate with each other do not encourage listening. Instead they reward shouting and oversimple reaction: of having a “take” after having read a single headline. I alluded earlier to the problem of speed, but this is also a problem both of listening and of bodies.

Compared to the algorithms that recommend friends to us based on instrumental qualities—things we like, things we’ve bought, friends in common—geographical proximity is different, placing us near people we have no “obvious” instrumental reason to care about, who are neither family nor friends (nor, sometimes, even potential friends). I want to propose several reasons we should not only register, but care about and co-inhabit a reality with, the people who live around us being left out of our filter bubbles. And of course, I mean not only social media bubbles, but the filters we create with our own perception and non-perception, involving the kind of attention (or lack thereof) that I’ve described so far. * * * — THE MOST OBVIOUS answer is that we should care about those around us because we are beholden to each other in a practical sense.

pages: 324 words: 96,491

Messing With the Enemy: Surviving in a Social Media World of Hackers, Terrorists, Russians, and Fake News
by Clint Watts
Published 28 May 2018

My experiences with the crowd—watching the mobs that toppled dictators during the Arab Spring, the hordes that joined ISIS, the counterterrorism punditry that missed the rise of ISIS, and the political swarms duped by Russia in the 2016 presidential election—lead me to believe that crowds are increasingly dumb, driven by ideology, desire, ambition, fear, and hatred, or what might collectively be referred to as “preferences.” Eli Pariser, the head of the viral content website Upworthy, noted in his book The Filter Bubble the emergence and danger of social media and internet search engine algorithms selectively feeding users information designed to suit their preferences. Over time, these “filter bubbles” create echo chambers, blocking out alternative viewpoints and facts that don’t conform to the cultural and ideological preferences of users. Pariser recognized that filter bubbles would create “the impression that our narrow self-interest is all that exists.”1 The internet brought people together, but social media preferences have now driven people apart through the creation of preference bubbles—the next extension of Pariser’s filter bubbles.

Counter-Efforts Must Advance, Lumpkin Says”, interview by Renee Montagne, Morning Edition, NPR, https://www.npr.org/2016/02/01/465106713/as-isis-evolves-u-s-counter-efforts-must-advance-lumpkin-says. CHAPTER 9: FROM PREFERENCE BUBBLES TO SOCIAL INCEPTION 1. Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, Penguin Books (April 24, 2012). https://www.amazon.com/Filter-Bubble-Personalized-Changing-Think/dp/0143121235. 2. Tom Nichols, “The Death Of Expertise,” The Federalist (January 17, 2014) http://thefederalist.com/2014/01/17/the-death-of-expertise. 3. Thomas Jocelyn, “Abu Qatada Provides Jihadists with Ideological Guidance from a Jordanian Prison,” The Long War Journal (April 30, 2014). https://www.longwarjournal.org/archives/2014/04/jihadist_ideologue_p.php; and Spencer Ackerman, Shiv Malik, Ali Younes, and Mustafa Khalili, “Al-Qa’ida ‘Cut Off and Ripped Apart by Isis,’” The Guardian (June 15, 2015). https://www.theguardian.com/world/2015/jun/10/isis-onslaught-has-broken-al-qaida-its-spiritual-leaders-admit. 4.

Pariser recognized that filter bubbles would create “the impression that our narrow self-interest is all that exists.”1 The internet brought people together, but social media preferences have now driven people apart through the creation of preference bubbles—the next extension of Pariser’s filter bubbles. Preference bubbles result not only from social media algorithms feeding people more of what they want, but also people choosing more of what they like in the virtual world, leading to physical changes in the real world. In sum, our social media tails in the virtual world wag our dog in the real world. Preference bubbles arise subtly from three converging biases that collectively and powerfully herd like-minded people and harden their views as hundreds and thousands of retweets, likes, and clicks aggregate an audience’s preferences.

pages: 531 words: 125,069

The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure
by Greg Lukianoff and Jonathan Haidt
Published 14 Jun 2018

Retrieved from http://genforwardsurvey.com/assets/uploads/2017/09/NBC-GenForward-Toplines-September-2017-Final.pdf 11. Iyengar & Krupenkin (2018). 12. Pariser (2011). A “filter bubble” is what happens when the algorithms that websites use to predict your interests based on your reading/viewing habits work to avoid showing you alternative viewpoints. See: El-Bermawy, M. (2016, November 18). Your filter bubble is destroying democracy. Wired. Retrieved from https://www.wired.com/2016/11/filter-bubble-destroying-democracy 13. Mann & Ornstein (2012). 14. Levitsky, S., & Ziblatt, D. (2018, January 27). How wobbly is our democracy?

By the 1990s, there was a cable news channel for most points on the political spectrum, and by the early 2000s there was a website or discussion group for every conceivable interest group and grievance. By the 2010s, most Americans were using social media sites like Facebook and Twitter, which make it easy to encase oneself within an echo chamber. And then there’s the “filter bubble,” in which search engines and YouTube algorithms are designed to give you more of what you seem to be interested in, leading conservatives and progressives into disconnected moral matrices backed up by mutually contradictory informational worlds.12 Both the physical and the electronic isolation from people we disagree with allow the forces of confirmation bias, groupthink, and tribalism to push us still further apart.

Clinical & Experimental Immunology, 160, 1–9. Ostrom, E. (1990). Governing the commons: The evolution of institutions for collective action. New York, NY: Cambridge University Press. Ostrom, V. (1997). The meaning of democracy and the vulnerability of democracies. Ann Arbor: University of Michigan Press. Pariser, E. (2011). The filter bubble: How the new personalized web is changing what we read and how we think. New York, NY: Penguin Press. Pavlac, B. A. (2009). Witch hunts in the Western world: Persecution and punishment from the Inquisition through the Salem trials. Westport, CT: Greenwood Press. Peterson, C., Maier, S.

Super Thinking: The Big Book of Mental Models
by Gabriel Weinberg and Lauren McCann
Published 17 Jun 2019

Many people don’t realize that they are getting tailored results based on what a mathematical algorithm thinks would increase their clicks, as opposed to a more objective set of ranked results. The Filter Bubble When you put many similar filter bubbles together, you get echo chambers, where the same ideas seem to bounce around the same groups of people, echoing around the collective chambers of these connected filter bubbles. Echo chambers result in increased partisanship, as people have less and less exposure to alternative viewpoints. And because of availability bias, they consistently overestimate the percentage of people who hold the same opinions.

Or you might just consider the interactions you have had with her personally, as opposed to getting a more holistic view based on interactions with other colleagues with different frames of reference. With the rise of personalized recommendations and news feeds on the internet, availability bias has become a more and more pernicious problem. Online this model is called the filter bubble, a term coined by author Eli Pariser, who wrote a book on it with the same name. Because of availability bias, you’re likely to click on things you’re already familiar with, and so Google, Facebook, and many other companies tend to show you more of what they think you already know and like.

Use Ockham’s razor and Hanlon’s razor to begin investigating the simplest objective explanations. Then test your theories by de-risking your assumptions, avoiding premature optimization. Attempt to think gray in an effort to consistently avoid confirmation bias. Actively seek out other perspectives by including the Devil’s advocate position and bypassing the filter bubble. Consider the adage “You are what you eat.” You need to take in a variety of foods to be a healthy person. Likewise, taking in a variety of perspectives will help you become a super thinker. 2 Anything That Can Go Wrong, Will ALL YOUR ACTIONS HAVE CONSEQUENCES, but sometimes those consequences are unexpected.

pages: 579 words: 160,351

Breaking News: The Remaking of Journalism and Why It Matters Now
by Alan Rusbridger
Published 14 Oct 2018

Some thought we were drowning in too much news; others feared we were in danger of becoming newsless. Some believed we had too much free news: others, that paid-for news was leaving behind it a long caravan of ignorance. No one could agree on one narrative. The old media were lazy and corrupt: and/or the new players were greedy and secretive. We were newly penned into filter bubbles: rubbish – they had always been there. There was a new democracy of information: bunkum – the mob were now in control. The old elites were dying: read your history – power simply changes shape. On this most people could agree: we were now up to our necks in a seething, ever churning ocean of information; some of it true, much of it wrong.

Two economists, Fred Hirsch and David Gordon, explored how this worked in their 1975 book Newspaper Money: ‘Newspapers are among the few products for which one buyer’s money is not as good as another’s in the eyes of the seller: 8p a day from a reader earning £5,000 a year is worth much more than 8p from a reader earning £2,000 a year, because a newspaper is not just selling its editorial product to its readers, but is selling its readers’ incomes to advertisers.’ This created a pressure on newspaper managements to ‘upgrade’ the readership.6 ‘Safety today lies in the appeal to a cohesive socio-economic group, the more well-heeled the better.’ It was a type of filter bubble. ‘Minorities with high spending power find themselves excellently catered for,’ wrote Hirsch and Gordon (who, lest he sounds a trifle Marxist, later went on to run the Economist). ‘Minorities who have less pull on advertisers find themselves neglected.’7 The Guardian’s problem in 1975, in the eyes of Hirsch and Gordon, was that it didn’t have the advantages of high-income readers – the opposite of the Financial Times, which could print a tiny number of copies to a very select readership . . . and still make healthy profits.

As did the downsides. It was, obviously, not necessarily good at complexity – though it could link to complexity. It could be frustratingly reductive. It didn’t patiently and painstakingly report, in the way a good news organisation still did. It was to some extent parasitical. Wrongly used, it could create ‘filter bubbles’ in which those users who wanted to narrow their minds could simply reinforce their own echoing prejudices. It could be used by autocrats to undermine the truth as readily as it could be used to publish it. The full glare of the world’s attention could focus on a single unstable piece of information.

pages: 307 words: 101,998

IRL: Finding Realness, Meaning, and Belonging in Our Digital Lives
by Chris Stedman
Published 19 Oct 2020

“However, weaker ties may be far flung and composed of people with varying political and social ties.” In other words, weak ties can help us get out of our in-groups and filter bubbles—a problem social media enables, but may also be able to help us combat, if we use it mindfully. When we put distance between ourselves and others—something we often do to feel safer and more secure—we ironically end up hurting ourselves because we isolate ourselves from people who can help us become more fully human. Online, we sometimes create filter bubbles and echo chambers when what we need is to be surrounded by people who see the world in different ways than we do.

—is only answerable if you look at the emotional truths behind it. It’s debatable whether our increasingly digital lives cause us to care less about the planet, Ozaksut said, but one thing that is very clear to him is that many of the lines that used to define our communities have degraded. We see it in the development of an alternate news reality, the filter bubbles we enter online, and the dissolution of certain forms of offline community. Ozaksut traced much of this to the “mass systemic collapse in the perceived integrity of institutions, from Major League Baseball to the Catholic Church.” At the same time as this collapse, algorithm-based online communities came up, and these end up atomizing us along lines we don’t always recognize.

Yes, he acknowledged, more people are now able to become more aware of climate change; the New York Times climate desk, for example, is doing incredible work bringing the climate crisis into people’s social media feeds, he said. But if you’re getting the NYT climate newsletter or seeing their content on social media, it’s probably because you’ve already signed up for it, or you’re in Facebook groups where people are already alarmed about climate change. So, once again, the issue is our online filter bubbles and the inflammatory content that drives them. Ozaksut’s observation that our social media platforms boost divisive content—and how that’s a more urgent issue than the distance social media puts between us and the physical world—confirmed my sense that we face a monumental issue when trying to use the internet to become more real.

pages: 257 words: 77,030

A Manual for Creating Atheists
by Peter Boghossian
Published 1 Nov 2013

Everyone believes this about polygamy, and those who don’t are just wackos.” Clustering thus increases the confidence value that we implicitly assign to a belief—we become more certain our beliefs are true. Further complicating this clustering phenomenon is what American online organizer Eli Pariser terms “filter bubbles” (Pariser, 2012). A “filter bubble” describes the phenomena of online portals—like Google and Facebook—predicting and delivering customized information users want based upon algorithms that take preexisting data into account (e.g., previous searches, type of computer one owns, and geographical location). Consequently, and unbeknownst to the user, the information users see is in ideological conformity with their beliefs.

“It’s all over the Internet,” or “I’m sure it’s true, I just Googled it this morning and saw for myself,” gains new meaning as one is unwittingly subject to selective information that lends credence to one’s beliefs as confirming “evidence” appears at the top of one’s Google search. Combine clustering in like-minded communities with filter bubbles, then put that on top of a cognitive architecture that predisposes one to belief (Shermer, 2012) and favors confirmation bias, then throw in the fact that critical thinking and reasoning require far more intellectual labor than acceptance of simple solutions and platitudes, then liberally sprinkle the virulence of certain belief systems, then infuse with the idea that holding certain beliefs and using certain processes of reasoning are moral acts, and then lay this entire mixture upon the difficulty of just trying to make a living and get through the day with any time for reflection, and voilà: Doxastic closure!

On A+, with a comment about Richard Carrier’s intemperance. Rationally Speaking. Retrieved from http://rationallyspeaking.blogspot.com/2012/08/on-with-comment-about-richard-carriers.html Plantinga, A. (2000). Warranted Christian belief. New York, NY: Oxford University Press. Pariser, E. (2012). The filter bubble: What the Internet is hiding from you. New York, NY: The Penguin Group. Previc, F. H. (2006). The role of the extrapersonal brain systems in religious activity. Consciousness and Cognition, 15, 500–539. Prochaska, J. O., Norcross, J. C., & DiClemente, C. C. (1994). Changing for good: The revolutionary program that explains the six stages of change and teaches you how to free yourself from bad habits.

pages: 290 words: 73,000

Algorithms of Oppression: How Search Engines Reinforce Racism
by Safiya Umoja Noble
Published 8 Jan 2018

Of course, Google Search is an advertising company, not a reliable information company. At the very least, we must ask when we find these kinds of results, Is this the best information? For whom? We must ask ourselves who the intended audience is for a variety of things we find, and question the legitimacy of being in a “filter bubble,”3 when we do not want racism and sexism, yet they still find their way to us. The implications of algorithmic decision making of this sort extend to other types of queries in Google and other digital media platforms, and they are the beginning of a much-needed reassessment of information as a public good.

In short, we must fight to suspend the circulation of racist and sexist material that is used to erode our civil and human rights. I hope this book provides some steps toward doing so. NOTES INTRODUCTION 1. Matsakis, 2017. 2. See Peterson, 2014. 3. This term was coined by Eli Pariser in his book The Filter Bubble (2011). 4. See Dewey, 2015. 5. I use phrases such as “the N-word” or “n*gger” rather than explicitly using the spelling of a racial epithet in my scholarship. As a regular practice, I also do not cite or promote non–African American scholars or research that flagrantly uses the racial epithet in lieu of alternative phrasings. 6.

Communications: European Journal of Communication Research, 36(3), 335–352. Pacey, A. (1983). The Culture of Technology. Cambridge, MA: MIT Press. Palmer, C. L., and Malone, C. K. (2001). Elaborate Isolation: Metastructures of Knowledge about Women. Information Society, 17(3), 179–194. Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. New York: Penguin. Pasquale, F. (2015). The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press. Pavlik, J. V. (1996). New Media Technology: Cultural and Commercial Perspectives. Boston: Allyn and Bacon.

Traffic: Genius, Rivalry, and Delusion in the Billion-Dollar Race to Go Viral
by Ben Smith
Published 2 May 2023

After the 2008 election, he had stopped running MoveOn to write a book, The Filter Bubble, where he warned presciently that the subtle personalization pioneered by Facebook and other platforms would “serve up a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown.” But Eli wasn’t interested in being just a pundit. He wanted to get back into the work of politics. In November of 2011, he saw an opportunity to test out how MoveOn could turn filter bubbles into a source of progressive power.

Jonah forwarded me the exchange, adding, “It is really fun collaborating with Facebook’s team on how News Feed should work.” When Facebook figured out how to purge clickbait headlines, Upworthy’s traffic took the hit. Eli occupied a strange position. He had built his company on top of Facebook’s algorithms, but he was also a Facebook critic. His own startup had proved out his dark theory about the filter bubble. Facebook kept trying to refute his argument: their own research, they said, showed that the information you saw on the site was dictated by whom you chose to be friends with—not by Facebook’s algorithms. But Facebook’s research contained a curious detail, one of the company’s own analysts confided in Eli: relatively small right-wing websites seemed to be getting far more “engagement” by Facebook’s metrics than huge liberal ones, like Upworthy, or The Huffington Post, or BuzzFeed.

Go to note reference in text He’d promised Breitbart: McKay Coppins, “Breitbart’s Inheritors Battle over His Legacy,” BuzzFeed, October 22, 2012, https://www.buzzfeed.com/mckaycoppins/breitbarts-inheritors-battle-over-his-legacy. Go to note reference in text he warned presciently: Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin, 2011). Go to note reference in text As Iowa debated: Angie Aker, “Two Lesbians Raised a Baby and This Is What They Got,” MoveOn, November 30, 2011, https://front.moveon.org/two-lesbians-raised-a-baby-and-this-is-what-they-got.

pages: 322 words: 84,752

Pax Technica: How the Internet of Things May Set Us Free or Lock Us Up
by Philip N. Howard
Published 27 Apr 2015

They tend to be run by volunteers and strapped for cash; rarely do they have the resources to invest in good information infrastructure. The world’s authoritarian governments are better positioned than civil society groups for the internet of things. Bots and Simulations Any device network we build will create some kind of what Eli Pariser calls a filter bubble around us.31 We will be choosing which devices to connect, and those devices will both collect information about us and provide information to us. But the danger is not so much that our information supplies may be constrained by the devices we purposefully select. It is the danger that our information supplies may be manipulated by people and scripts we don’t know about.

Brian Krebs, “Amnesty International Site Serving Java Exploit,” Krebs on Security, December 22, 2011, accessed September 30, 2014, http://krebsonsecurity.com/2011/12/amnesty-international-site-serving-java-exploit/. 30. @indiankanoon, “IK Servers Are Getting DDoSed Using the DNS Reflection Attack,” Indian Kanoon (October 19, 2013), accessed September 30, 2014, https://twitter.com/indiankanoon/status/391497714451492865. 31. Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (London: Penguin, 2011). 32. Keith Wagstaff, “1 in 10 Twitter Accounts Is Fake, Say Researchers,” NBC News, November 26, 2013, accessed September 30, 2014, http://www.nbcnews.com/technology/1-10-twitter-accounts-fake-say-researchers-2D11655362; Won Kim et al., “On Botnets,” in Proceedings of the 12th International Conference on Information Integration and Web-Based Applications and Services (New York: ACM, 2010), 5–10, accessed September 30, 2014, http://dl.acm.org/citation.cfm?

See also Green Movement/Revolution (Iran) computational propaganda, 205 Congo (Democratic Republic of Congo): anarchy in, 94; elections in, 117; governance in, 104–5 connective action, 149, 168–73, 175–76, 230; China’s censors blocking, 188; downsides of, 218–19; qualities of, 173–74 connective insecurity, 182 connective security, 176, 218, 230 consumer electronics: market for, 57–58; uploading usage data, 212–14 copyright infringement, 214 core, 147 corruption, 82–83, 96–97, 142–43, 216 criminal organizations, 96; dictators’ support of, 93–94; IT support for, 236; operating in place of governments, 80 crisis, information control following, 120–21 crisis mapping, 70–71, 101, 239 crowd sourcing, 21, 70, 101, 195, 295 crypto-clans, 149, 173–74, 239, 295 cryptography, 24, 237 Cuba, 92; digital dilemma in, 87; dissidence in, 62; pro-regime social media in, 201; protests in, 86; USAID involvement in, 31 cultural production, 15 cyberattacks, 38–39, 41, 190, 201–2, 295 cybercrime, 97 cyberdeterrence, 110, 148, 153–57, 233–34 cyberespionage, 40, 41, 117, 190 cybernationalism, 218 cybersecurity, partners in, 41 cyberterrorism, 43 cyberwarfare, 34, 153–54, 203, 295 dark data, 140 data: access to, xiii–xiv; aftermarket for, 245–47; beneficiaries of, 247–48; categorizing of, 141; compromised, 247; exchange value of, 55; global trade in, 181; hoarding of, 246; levels of, 141; mining of, 144, 179, 181, 245–47; monitoring systems for, 212; sharing of, 233 data shadow, 242 Datys, 215 dead pool, for dictators, 91–92 default passwords, 3 Deibert, Ron, 179 democracies: elections in, technology and, 127–28; law enforcement in, technology leaking to, 222; leaders in, susceptible to viral public outrage, 128; political independence from, 145; technology policy in, 133, 134; threats in, to internet of things, 184 democracy: xxii–xxiii, 296; adapting expectations for, 243; advocates of, keeping political memes alive, 124; bots as threat to, 208, 209–11; early stages of, 108; exporting of, 148; internet linked to, 167–68; political victories in, 128–29; predictors of, 167; technology diffusion and, 8 democratization, 50 denial-of-service attacks, 2, 4, 201, 203, 204, 208, 237 development projects, 101–2 device networks, 52; adoption patterns for, 168; battles over, 47; competition between, 162; data trails from, 114; eliminating market discrimination, 123; expansion of, botnets and, 234; filter bubbles and, 202; foreign affairs and, 249–50; governance delivery and, 100–102; governments’ attempts to control, 56–59; human security and, 175–76; impact of, on global crises, 20; linking civic groups to groups in need, 68; malware on, 113–14; market efficiency and, 158; political information and, 175; relevance of, 8; risks of, weighed against benefits, 219; spread of, 45–46, 53; standards setting for, 229; weaponizing of, 112–13 devices: betraying owners’ behavior, 227; consuming most internet bandwidth, 233; networked, attacks on, 115–16; relationships between, and the political order, 34; sharing information, 236 device tithing, 250, 296 dictators: aging of, 73, 86, 91–93, 99; changes for, 67–68, 72–73, 91; control by, 78; facing digital dilemma, 73, 86–88; hardware and software for, 162 digital activism, 86, 169, 221, 296 digital clubs, 174, 239, 296 digital cosmopolitanism, 138 digital dilemma, 73, 84–88, 151–52, 252, 296 digital exchange, 55, 56 digital images, undermining political ideology, 125 digital mapping, 139 digital media: aligned with social networks, 138; government use of, 1–2; impacts of, on international affairs, 230; learning cycles with, 155–57; as research tool, 240; as sources and conduits of power, 231 digital natives, 46–47 digital networks: bots’ domination of, 34; connections exposed by, 71–72; decentralized, 117–18; domestic political battles on, 116; moderating political opinion, 130 digital rights management, 213–14, 226–27 digital technology: flow of, 10–11; usage patterns for, 9 dirty networks, 67, 72, 177–78, 296; adapting device networks, 216; collapse of, 98–99; connections among, 97; demise of, 92; exposure of, 98; nodes in, 93–94.

pages: 523 words: 112,185

Doing Data Science: Straight Talk From the Frontline
by Cathy O'Neil and Rachel Schutt
Published 8 Oct 2013

philosophy of, Philosophy of Exploratory Data Analysis–Philosophy of Exploratory Data Analysis tools of, Exploratory Data Analysis exponential downweighting, Working out a Volatility Measurement, Exponential Downweighting formula for, Exponential Downweighting exponential random graph models (ERGMs), A Second Example of Random Graphs: The Exponential Random Graph Model inference for, Inference for ERGMs exponential random graphs, A Second Example of Random Graphs: The Exponential Random Graph Model F F-score, Evaluation Facebook, Big Data and Data Science Hype, The Current Landscape (with a Little History), Populations and Samples of Big Data, Gabriel Tarde, Social Networks and Data Journalism data science in, The Current Landscape (with a Little History) Kaggle and, Their Customers real-time streaming data, Populations and Samples of Big Data false negative rate, Pick an evaluation metric false positive rate, Pick an evaluation metric fault tolerance, Word Frequency Problem feature construction, How to Be a Good Modeler feature extraction, William Cukierski, Example: User Retention feature generation, Example: User Retention feature selection, William Cukierski, Feature Selection–User Retention: Interpretability Versus Predictive Power challenges in, Challenges in features and learning criticisms of, Random Forests decision tree algorithm, The Decision Tree Algorithm decision trees, Embedded Methods: Decision Trees entropy, Entropy filters, Filters random forests and, Random Forests–Random Forests user retention example, Example: User Retention–Example: User Retention wrapper, Wrappers–In practice feature selection methods, Example: User Retention embedded, Example: User Retention filters, Example: User Retention wrappers, Example: User Retention feature transformation, How to Be a Good Modeler features, Feature Selection feedback loops, Evaluation feedback loops in financial models, The Financial Modeling Feedback Loop–The Financial Modeling Feedback Loop filter bubble thought experiment, Thought Experiment: Filter Bubbles filters, Example: User Retention filters, ordering features with, Filters finalizing models, In-Sample, Out-of-Sample, and Causality financial data exercise, Exercise: Financial Data financial modeling autocorrelation, correcting, A Baby Model–A Baby Model financial data exercise, Exercise: Financial Data GetGlue exercise, Exercise: GetGlue and Timestamped Event Data–Exercise: Financial Data logistic regression in, Why Regression?

test sets, Training and test sets tests, Code readability and reusability text data, Populations and Samples of Big Data text-mining models, Thought Experiment: Meta-Definition TF-IDF vectors, Naive No Longer thought experiments access to medical records, Thought Experiment automated statistician, Thought Experiment: Automated Statistician chaos simulation, Thought Experiment: How Would You Simulate Chaos? data science as a science, Thought Experiments–Media 6 Degrees Exercise Egyptian politics, Thought Experiment filter bubble, Thought Experiment: Filter Bubbles human powered airplane, Thought Experiment image recognition, Scraping the Web: APIs and Other Tools large-scale network analysis, Moving from Descriptive to Predictive learning by example, Thought Experiment: Learning by Example–How About k-nearest Neighbors? medical data, Thought Experiment, Closing Thought Experiment meta-definitions, Thought Experiment: Meta-Definition personal data collection, Mark’s Thought Experiment privacy, concerns/understanding of, Thought Experiment: What Is the Best Way to Decrease Concern and Increase Understanding and Control?

Or if they have enough ratings, you can decide not to update the rest of them. As with any machine learning model, you should perform cross-validation for this model—leave out a bit and see how you did, which we’ve discussed throughout the book. This is a way of testing overfitting problems. Thought Experiment: Filter Bubbles What are the implications of using error minimization to predict preferences? How does presentation of recommendations affect the feedback collected? For example, can you end up in local maxima with rich-get-richer effects? In other words, does showing certain items at the beginning give them an unfair advantage over other things?

pages: 527 words: 147,690

Terms of Service: Social Media and the Price of Constant Connection
by Jacob Silverman
Published 17 Mar 2015

Upworthy was founded by Eli Pariser, who previously was the executive director of liberal advocacy organization MoveOn before going on to write The Filter Bubble, a book warning that search algorithms, by taking into account our preferences, browser histories, locations, and other personal information, limit our ability to access diverse points of view. This all became richly ironic when Pariser founded Upworthy in March 2012. Upworthy is a self-proclaimed liberal site trafficking only in positive and uplifting messages; by its very construction, it’s dedicated to pushing one political point of view bound in a certain package. It’s confined itself to its own filter bubble. Many media outlets and writers have political biases, acknowledged or otherwise, but Pariser’s own history of writing about the dangers of being exposed to a narrow range of views makes this a curious enterprise, though his past career as a political operative goes some way toward explaining it.

The rise of the HBO series Girls owes much to the fact that it chronicles the lives of the same type of people who would cover it professionally—young writers and creative people living in Brooklyn.) Thisness is the flattery of representation in concentrated form. Thisness allows us to feel like pieces of media were made for us. Thisness is your own personal filter bubble, showing you exactly what you want. (It also allows BuzzFeed to divide their readers into highly specific, personalized categories, aiding in future targeting efforts.) It’s not just for entertainment. Outrage and grievance also play well, which is why talk radio and strident political blogging have boomed in the last fifteen years.

(By default, Vortex gives “Narnia” as a user’s location, though that can be adjusted as part of the game.) Although Vortex wasn’t released into the wild—it was a thesis project, it had some security holes, and browser companies would’ve been unlikely to allow it—it was social-media rebellion at its best. As Law wrote, “The ‘Internet’ does not exist. Instead, it is many overlapping filter bubbles which selectively curate us into data objects to be consumed and purchased by advertisers.” Her program, even if it never was made publicly available, brilliantly illuminated these points. It’s the kind of project that’s deeply revealing of how the surveillance economy works: by arranging us into limited categories and subgroups that can easily be managed and monitored, with our data and attention bought and sold accordingly.

pages: 521 words: 118,183

The Wires of War: Technology and the Global Struggle for Power
by Jacob Helberg
Published 11 Oct 2021

“My friends wouldn’t post something that’s not true.”15 The latter point hints at a particularly problematic element of social media: the consolidation of so-called filter bubbles. These cocoons are partly the product of human nature. As psychologists know, we have an “implicit bias” to trust members of our own group, compounded by “confirmation bias” that inclines us to seek out and believe information that reinforces our existing views—and reject information that doesn’t. But filter bubbles can be exacerbated by technology. While social media companies are not responsible for creating cognitive biases, technology can inflame them.

The following year, a short Teen Vogue clip of the Parkland anti-gun violence activist Emma González ripping up a paper target from a gun range was altered to depict her ripping up the Constitution.34 Trolls have circulated several manipulated videos of House Speaker Nancy Pelosi seemingly slurring her speech, leading platforms like Facebook and Twitter to remove the clips or label them as “partly false.” A prominent Republican congressman circulated a doctored video of health care activist Ady Barkan—who suffers from a neurodegenerative disease and uses a computerized voice to talk—literally putting words into his computerized voice.35 Unsurprisingly, these narratives flourish within the filter bubbles and inflamed partisan discourse of our democratic society. Just imagine the consequences. When Syrian hackers hijacked the Associated Press Twitter account in 2013 to tweet fake breaking news—that a White House explosion had injured President Obama—the stock market lost $136 billion in just three minutes.36 Several years later, Pakistani defense minister Khawaja Asif fell for a fake news story alleging that a former Israeli defense minister had threatened Pakistan with a nuclear attack.

112 In the battle for those six inches inside our heads, better digital literacy is the defense we need. Break Out of Hyper-Partisan Bubbles Yet even greater digital literacy won’t overcome one of the greatest challenges we face—our willingness to believe any information that comes from our own tribe or filter bubble. One 2018 study demonstrated that we’re more likely to trust people from our own political group even when it comes to completely unrelated tasks like sorting shapes.113 This poses several challenges. On the front-end, partisanship and polarization accelerate the spread of fake news. That too-good-to-be-true quote certainly sounds like something a Republican would say, you might think, as you click Retweet and send a mistruth zinging through cyberspace.

pages: 390 words: 96,624

Consent of the Networked: The Worldwide Struggle for Internet Freedom
by Rebecca MacKinnon
Published 31 Jan 2012

Also see John Pomfret, “In China, Google Users Worry They May Lose an Engine of Progress,” Washington Post, March 20, 2010, www.washingtonpost.com/wp-dyn/content/article/2010/03/19/AR2010031900986.html (accessed June 21, 2011). 9 geopolitical vision for a digitally networked world: Eric Schmidt and Jared Cohen, “The Digital Disruption: Connectivity and the Diffusion of Power,” Foreign Affairs 89, no. 6 (November/December 2010), 75–85. 10 In his book The Filter Bubble, Eli Pariser: Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011). 10 Siva Vaidhyanathan warns: Siva Vaidhyanathan, The Googlization of Everything (And Why We Should Worry) (Berkeley: University of California Press, 2011). 10 As Harvard’s Joseph Nye points out in The Future of Power: Joseph S.

Companies argue that collecting a wide array of personal data is necessary to serve people better, in ways most people have shown that they want. Critics argue that companies have gone far beyond what most citizens actually want—when they have a chance to understand what is really going on. In his book The Filter Bubble, Eli Pariser warns that search engines and social networks manipulate what we find and who we interact with on the Web in a way that maximizes our value to advertisers but that is likely to minimize the chances that we will be exposed to a sufficiently diverse range of news and views that we need as citizens to make informed political and economic choices.

Buzz Cade, Marilyn Cafferty, Jack Calvert Calyx Internet Access Cameron, David Castells, Manuel Cellan-Jones, Rory Censorship of categories of Internet traffic of data and text messages freedom vs. security Google and national-level filtering systems opposition to national censorship by private intermediaries Center for Democracy and Technology Center for Information Technology Policy, Princeton Center for Internet and Society Chandler, Mark Chaos Computer Club Charter of Human Rights and Principles for the Internet Chesterman, Simon Chile, net neutrality law in China, Internet in censorship controls on political information e-commerce market “e-parliament” Google government-friendly communities “great firewall of China” paradox of patriotic hackers role of government social networking sites China Digital Times, on censorship of Google pullout China Netcom China Unicom Chinese Communist Party control of dissent by economic leadership of use of Internet by Christopher, Warren Cisco Systems, sales of surveillance systems to authoritarian governments Citizen-centric society, threats to Citizen Lab, University of Toronto Citron, Danielle “Civil Rights in an Information Age” (Citron) Cleek, Ashley Clinton, Hillary The Cluetrain Manifesto (Searls) Code and Other Laws of Cyberspace (Lessig) Cognitive Surplus (Shirky) Cohen, Jared Comcast Committee to Protect Journalists Commotion Wireless Communication Power (Castells) “Communique on Principles for Internet Policymaking” (OECD) Contemporary Business News Copyright enforcement Anti-Counterfeiting Trade Agreement (ACTA) concerns about abuse of effects of lobbying on free expression and HADOPI as rationale for surveillance and censorship website shutdowns Council of Europe Crabgrass “Creating Passionate Users” (Sierra blog) Creative Commons Cryptome Cyber-separatism Cybersecurity and Internet Freedom Act of 2011 Dailymotion Daum Deep packet inspection (DPI) Deibert, Ronald Democracy Forum Democracy in America (de Tocqueville) Democracy movements, origin of Democratizing Innovation (von Hippel) Deng Xiaoping Deng Yujiao Denial of service attacks (DDoS) Denton, Nick Department of Homeland Security (DHS) Desai, Nitin Diaspora Diebold Election Systems Digital bonapartism defined in Russia Digital commons activism and licensing need for protection of individual rights role of technical protocols of Digital Due Process (DDP) bill Digital Economy Act (2010) Digital Justice Coalition Digital Millennium Copyright Act (DMCA) Digital real estate Discipline & Punish (Foucault) Doctorow, Cory Doherty, Will Domain name system (DNS) Domini DPI (deep packet inspection) Drumbeat Drummond, David Drupal Dui Hua Foundation Dynamic Coalition on Internet Rights and Principles Dynaweb e-G8 conference ECPA (Electronics Communications Privacy Act of 1986) Egypt activism in government control of mobile phones in surveillance in use of Tor in Egyptian Blogs Aggregator 18th Brumaire of Louis Napoleon (Marx) El-Fattah, Alaa Abd El-Hamalawy, Hossam Electronic Frontier Foundation (EFF) Electronic Industry Code of Conduct Electronics Communications Privacy Act of 1986 (ECPA) Ericsson Estrada, Joseph European Digital Rights Initiative (EDRI) European Service Providers Association EveryDNS Exodus International Extractive Industries Transparency Initiative ExxonMobil Facebook activism and addition of new encryption and security settings attitude toward anonymity digital commons and effectiveness for activists in Egypt inconsistency of policy enforcement lobby to update ECPA politicians and privacy issues protection from hate and harassment “Quit Facebook Day,” real-name policy The Facebook Effect (Kirkpatrick) Facebook Zero Fair Labor Association (FLA) Falun Gong FBI (Federal Bureau of Investigation) F&C Asset Management FCC (Federal Communications Commission) Federal Trade Commission Federalist No. 10 Feriani, Samir The File (Ash) The Filter Bubble (Pariser) Financial Times, on anonymity FinFisher Fiore, Mark Firefox Flickr, removal of photos of Egyptian state security agents from Folksam Ford Motor Company Foreign Affairs on Google’s vision of networked world on US obsession with circumvention Foreign Intelligence Surveillance Act (FISA) Amendments Act (2008) Foucault, Michel Franken, Al Free Press Freedom House Freedom of Connection—Freedom of Expression: The Changing Legal and Regulatory Ecology Shaping the Internet (UNESCO) FreedomBox Freegate Frydman, Gilles “Fugitivus,” The Future of Power (Nye) Gaddafi, Muammar Gamma International UK Ltd.

pages: 371 words: 108,317

The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future
by Kevin Kelly
Published 6 Jun 2016

engineered a way to automatically map one’s position in the field of choices visually, to make the bubble visible, which made it easier for someone to climb out of their filter bubble by making small tweaks in certain directions. Second in the ideal approach, I’d like to know what my friends like that I don’t know about. In many ways, Twitter and Facebook serve up this filter. By following your friends, you get effortless updates on the things they find cool enough to share. The ease of shouting out a recommendation via a text or photo is so easy from a phone that we are surprised when someone loves something new but doesn’t share it. But friends can also act like a filter bubble if they are too much like you. Close friends can make an echo chamber, amplifying the same choices.

The cognification is based on subtle details of my (and others’) behavior that only a sleepless obsessive machine might notice. The danger of being rewarded with only what you already like, however, is that you can spin into an egotistical spiral, becoming blind to anything slightly different, even if you’d love it. This is called a filter bubble. The technical term is “overfitting.” You get stuck at a lower than optimal peak because you behave as if you have arrived at the top, ignoring the adjacent environment. There’s a lot of evidence this occurs in the political realm as well: Readers of one political stripe who depend only on a simple filter of “more like this” rarely if ever read books outside their stripe.

See also advertising commodity attention, 177–79 commodity prices, 189 communications and decentralization, 118–19, 129–31 and dematerialization, 110–11 and free markets, 146 inevitable aspects of, 3 oral communication, 204 and platforms, 125 complexity and digital storage capacity, 265–66 computers, 128, 231 connectivity, 276, 292, 294–95 consumer data, 256 content creation advertisements, 184–85 custom music, 77 early questions about, 17 and editors, 148–51, 152, 153 and emergence of user-generated content, 19, 21–22, 184–85, 269–74, 276 and Google search engines, 146–47 and hierarchical/nonhierarchical infrastructures, 148–54 impulse for, 22–23 and screen culture, 88 and sharing economy, 139 value of, 149 convergence, 291, 296 cookies, 180, 254 cooperation, 139–40, 146, 151 copper prices, 189 copying digital data and copy protection, 73 and creative remixing, 206–9 and file sharing sites, 136 free/ubiquitous flow of, 61–62, 66–68, 80, 256 generatives that add value to, 68–73 and reproductive imperative, 87 and uncopiable values, 67–68 copyright, 207–8 corporate monopolies, 294 coveillance, 259–64 Cox, Michael, 286–87 Craigslist, 145 Creative Commons licensing, 136, 139 crowdfunding, 156–61 crowdsourcing, 185 Cunningham, Ward, 135–36 curators, 150, 167, 183 customer support, 21 cyberconflict, 252, 275 dark energy and matter, 284 “dark” information, 258 Darwin, Charles, 243 data analysis and lifelogging, 250–51 “database cinema,” 200 data informing artificial intelligence, 39, 40 decentralization, 118–21 and answer-generating technologies, 289 and bottom-up participation, 154 and collaboration, 142, 143 of communication systems, 129–31 and digital socialism, 137 and emergence of the “holos,” 295 and online advertising, 182–85 and platforms, 125 and startups, 116–17 and top-down vs. bottom-up management, 153 Deep Blue, 41 deep-learning algorithms, 40 DeepMind, 32, 37, 40 deep reinforcement machine learning, 32–33 dematerialization, 110–14, 125, 131 diagnoses and diagnostic technology, 31–32, 239, 243–44 diaries and lifelogging, 248–49 Dick, Philip K., 255 diet tracking, 238 Digg, 136, 149 digitization of data, 258 directional sense, 243 discoverability, 72–73, 101 DNA sequencing, 69 documentaries, updating of, 82 domain names, 25–26 Doritos, 185 Downton Abbey (series), 282 drones, 227, 252 Dropbox, 32 drug research, 241 DVDs, 205 Dyson, Esther, 186 Eagleman, David, 225 e-banks, 254 eBay, 154, 158, 185, 263, 272, 274 ebooks and readers, 91–96 and accessibility vs. ownership, 112 advantages of, 93–95 bookshelves for, 100 fluidities of, 79 interconnectedness of, 95–96, 98, 99–100, 101–2, 104 and just-in-time purchasing, 65 liquidity of, 93 tagging content in, 98 and tracking technology, 254 echo chambers, 170 economy, 21, 65, 67–68, 136–38, 193 ecosystems of interdependent products and services, 123–24 editors, 148–51, 152, 153 education, 90, 232 Einstein, Albert, 288 electrical outlets, 253 email, 186–87, 239–40 embedded technology, 221 embodiment, 71, 224 emergent phenomena, 276–77, 295–97 emotion recognition, 220 employment and displaced workers, 49–50, 57–58 Eno, Brian, 221 entertainment costs, 190 epic failures, 278 e-retailers, 253 etiquette, social, 3–4 evolution, 247 e-wallets, 254 experience, value of, 190 expertise, 279 exports, U.S., 62 extraordinary events, 277–79 eye tracking, 219–20 Facebook and aggregated information, 147 and artificial intelligence, 32, 39, 40 and “click-dreaming,” 280 cloud of, 128, 129 and collaboration, 273 and consumer attention system, 179, 184 and creative remixing, 199, 203 face recognition of, 39, 254 and filtering systems, 170, 171 flows of posts through, 63 and future searchability, 24 and interactivity, 235 and intermediation of content, 150 and lifestreaming, 246 and likes, 140 nonhierarchical infrastructure of, 152 number of users, 143, 144 as platform ecosystem, 123 and sharing economy, 139, 144, 145 and tracking technology, 239–40 and user-generated content, 21–22, 109, 138 facial recognition, 39, 40, 43, 220, 254 fan fiction, 194, 210 fear of technology, 191 Felton, Nicholas, 239–40 Fifield, William, 288 films and film industry, 196–99, 201–2 filtering, 165–91 and advertising, 179–89 differing approaches to, 168–75 filter bubble, 170 and storage capacity, 165–67 and superabundance of choices, 167–68 and value of attention, 175–79 findability of information, 203–7 firewalls, 294 first-in-line access, 68 first-person view (FPV), 227 fitness tracking, 238, 246, 255 fixity, 78–81 Flickr, 139, 199 Flows and flowing, 61–83 and engagement of users, 81–82 and free/ubiquitous copies, 61–62, 66–68 and generative values, 68–73 move from fixity to, 78–81 in real time, 64–65 and screen culture, 88 and sharing, 8 stages of, 80–81 streaming, 66, 74–75, 82 and users’ creations, 73–74, 75–78 fluidity, 66, 79, 282 food as service (FaS), 113–14 footnotes, 201 411 information service, 285 Foursquare, 139, 246 fraud, 184 freelancers (prosumers), 113, 115, 116–17, 148, 149 Freeman, Eric, 244–45 fungibility of digital data, 195 future, blindness to, 14–22 Galaxy phones, 219 gatekeepers, 167 Gates, Bill, 135, 136 gaze tracking, 219–20 Gelernter, David, 244–46 General Electric, 160 generatives, 68–73 genetics, 69, 238, 284 Gibson, William, 214 gifs, 195 global connectivity, 275, 276, 292 gluten, 241 GM, 185 goods, fixed, 62, 65 Google AdSense ads, 179–81 and artificial intelligence, 32, 36–37, 40 book scanning projects, 208 cloud of, 128, 129 and consumer attention system, 179, 184 and coveillance, 262 and facial recognition technology, 254 and filtering systems, 172, 188 and future searchability, 24 Google Drive, 126 Google Glass, 217, 224, 247, 250 Google Now, 287 Google Photo, 43 and intellectual property law, 208–9 and lifelogging, 250–51, 254 and lifestreaming, 247–48 and photo captioning, 51 quantity of searches, 285–86 and smart technology, 223–25 translator apps of, 51 and users’ usage patterns, 21, 146–47 and virtual reality technology, 215, 216–17 and visual intelligence, 203 government, 167, 175–76, 252, 255, 261–64 GPS technology, 226, 274 graphics processing units (GPU), 38–39, 40 Greene, Alan, 31–32, 238 grocery shopping, 62, 253 Guinness Book of World Records, 278 hackers, 252 Hall, Storrs, 264–65 Halo, 227 Hammerbacher, Jeff, 280 hand motion tracking, 222 haptic feedback, 233–34 harassment, online, 264 hard singularity, 296 Harry Potter series, 204, 209–10 Hartsell, Camille, 252 hashtags, 140 Hawking, Stephen, 44 health-related websites, 179–81 health tracking, 173, 238–40, 250 heat detection, 226 hierarchies, 148–54, 289 High Fidelity, 219 Hinton, Geoff, 40 historical documents, 101 hive mind, 153, 154, 272, 281 Hockney, David, 155 Hollywood films, 196–99 holodeck simulations, 211–12 HoloLens, 216 the “holos,” 292–97 home surveillance, 253 HotWired, 18, 149, 150 humanity, defining, 48–49 hyperlinking antifacts highlighted by, 279 of books, 95, 99 of cloud data, 125–26 and creative remixing, 201–2 early theories on, 18–19, 21 and Google search engines, 146–47 IBM, 30–31, 40, 41, 128, 287 identity passwords, 220, 235 IMAX technology, 211, 217 implantable technology, 225 indexing data, 258 individualism, 271 industrialization, 49–50, 57 industrial revolution, 189 industrial robots, 52–53 information production, 257–64.

pages: 390 words: 109,519

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media
by Tarleton Gillespie
Published 25 Jun 2018

I could subscribe to an array of these collective lenses: I don’t want to see videos that more than X users have categorized as violent.4 Trusted organizations could develop and manage their own collective lenses: imagine a lens run by the Southern Poverty Law Center to avoid content that allied users have marked as “racist,” or one from Factcheck.org filtering out “disputed” news articles. This would not help the “filter bubble” problem; it might in fact exacerbate it. Then again, users would be choosing what not to see, rather than having it deleted on their behalf. It would prioritize those who want a curated experience over those who take advantage of an uncurated one. Protect Users as They Move across Platforms Little of what a user does to curate and defend her experience on one platform can easily be exported to others.

Control.” 47Deibert, et al., Access Denied; Deibert et al., Access Controlled. 48Helmi Noman, “Sex, Social Mores, and Keyword Filtering: Microsoft Bing in the ‘Arabian Countries,’” OpenNet initiative, 2010, https://opennet.net/sex-social-mores-and-keyword-filtering-microsoft-bing-arabian-countries. 49Rebecca Rosen, “What to Make of Google’s Decision to Block the ‘Innocence of Muslims’ Movie,” Atlantic, September 14, 2012, https://www.theatlantic.com/technology/archive/2012/09/what-to-make-of-googles-decision-to-block-the-innocence-of-muslims-movie/262395/. 50Bill Chappell, “Google Maps Displays Crimean Border Differently in Russia, U.S.,” NPR, April 12, 2014, http://www.npr.org/blogs/thetwo-way/2014/04/12/302337754/google-maps-displays-crimean-border-differently-in-russia-u-s. 51Twitter, “Tweets Still Must Flow,” Twitter Blog, January 26, 2012, https://blog.twitter.com/official/en_us/a/2012/tweets-still-must-flow.html; Eva Galperin, “What Does Twitter’s Country-by-Country Takedown System Mean for Freedom of Expression?” Electronic Frontier Foundation, January 27, 2012, https://www.eff.org/deeplinks/2012/01/what-does-twitter%E2%80%99s-country-country-takedown-system-mean-freedom-expression. 52Many thanks to Nick Seaver for this observation. 53Schudson, Advertising, the Uneasy Persuasion. 54Pariser, The Filter Bubble; Sunstein, Republic.com 2.0. 55Ananny, “The Curious Connection between Apps for Gay Men and Sex Offenders.” CHAPTER 8 WHAT PLATFORMS ARE, AND WHAT THEY SHOULD BE 1Flyverbom, “Digital Age Transparency.” 2Twitter has since changed the “egg” icon that used to represent accounts that had not added a profile photo, because it had become associated with trolling.

How Users Matter: The Co-Construction of Users and Technology. Cambridge: MIT Press. PALFREY, JOHN. 2010. “Four Phases of Internet Regulation.” Social Research: An International Quarterly, 77 (3): 981–96. PAPACHARISSI, ZIZI. 2015. “We Have Always Been Social.” Social Media + Society 1 (1): 2056305115581185. PARISER, ELI. 2011. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. New York: Penguin. PASQUALE, FRANK. 2015. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge: Harvard University Press. ———. 2016. “Platform Neutrality: Enhancing Freedom of Expression in Spheres of Private Power.”

pages: 619 words: 177,548

Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity
by Daron Acemoglu and Simon Johnson
Published 15 May 2023

Eli Pariser, internet activist and executive director of MoveOn.org, reported in a TED talk in 2010 that although he followed many liberal and conservative news sites, after a while he noticed he was directed more and more to liberal sites because the algorithm had noticed he was a little more likely to click on them. He coined the term filter bubble to describe how algorithm filters were creating an artificial space in which people heard only voices that were already aligned with their political views. Filter bubbles have pernicious effects. Facebook’s algorithm is more likely to show right-wing content to users who have a right-leaning ideology, and vice versa for left-wingers. Researchers have documented that the resulting filter bubbles exacerbate the spread of misinformation on the social media site because people are influenced by the news items they see. These filter-bubble effects go beyond social media.

In the run-up to the election, Facebook was also mired in controversy because of a doctored video of House Speaker Nancy Pelosi, giving the impression that she was drunk or ill, slurring her words and sounding unwell in general. The fake video was promoted by Trump allies, including Rudy Giuliani, and the hashtag #DrunkNancy began to trend. It soon went viral and attracted more than two million views. Crazy conspiracy theories, such as those that came from QAnon, circulated uninterrupted in the platform’s filter bubbles as well. Documents provided to the US Congress and the Securities and Exchange Commission by former Facebook employee Frances Haugen reveal that Facebook executives were often informed of these developments. As Facebook came under increasing pressure, its vice president of global affairs and communications, former British deputy prime minister Nick Clegg, defended the company’s policies, stating that a social media platform should be viewed as a tennis court: “Our job is to make sure the court is ready—the surface is flat, the lines painted, the net at the correct height.

Some are optimistic that new technologies, such as Web 3.0 or the metaverse, can provide different dynamics. But as long as the current business model of tech companies and the surveillance obsession of governments prevail, they are more likely to further exacerbate these trends, creating even more powerful filter bubbles and a wider wedge with reality. It is late, but perhaps not too late. Chapter 11 outlines how the tide can be turned and which specific policy proposals hold promise for such a transformation. 11 Redirecting Technology Computers are mostly used against people instead of for people used to control people instead of to free them time to change all that— we need a… PEOPLE’S COMPUTER COMPANY —first newsletter of the People’s Computer Company, October 1972 (italics in original) Most of the things worth doing in the world had been declared impossible before they were done.

pages: 262 words: 69,328

The Great Wave: The Era of Radical Disruption and the Rise of the Outsider
by Michiko Kakutani
Published 20 Feb 2024

In The Origins of Totalitarianism (1951), she wrote that the kind of loyalty demanded by leaders like Stalin and Hitler “can be expected only from the completely isolated human being who, without any other social ties to family, friends, comrades, or even mere acquaintances, derives his sense of having a place in the world only from his belonging to a movement, his membership in the party.” Arendt, as usual, was writing as both a brilliant historian and a prescient analyst. Though it was published more than half a century before social media sealed us in insulated filter bubbles, The Origins of Totalitarianism explains why so many people, feeling dislocated in today’s world of seismic change, fall prey to the lies of autocrats. It explains why Trump’s rhetoric of fear and dispossession took root among voters who felt marginalized by changing cultural values, new economic hardships, and the perceived loss of their own status

A world of rabbit holes and magical thinking, where Trump and Putin have weaponized the most noxious aspects of digital age technology—like data overload, social media echo chambers, viral memes—to foment chaos, and redefine reality. As we’ll see in the next chapter, unforeseen side-effects of decisions made in the early days of Silicon Valley have made it easy for bad actors to exploit the internet, and easy for users to insulate themselves in filter bubbles where alternate realities thrive. They have learned that hyperbole and sensationalism get more clicks on social media and gaming platforms than reasoned, reasonable posts. And just as the anonymity of the web has enabled trolling, so the mediating effect of screens has made people feel insulated from the consequences of their online words and actions—a worrying dynamic when people, on average, are spending more than six hours a day on their phones and computers.

We are also becoming addicted to the little dopamine hits we get when a stranger “likes” something we posted, and feel spikes of irritation or worse when we are trolled. While the internet was meant to connect people across cultures and vocations, it has grown into an ever-expanding Borgesian maze that is endlessly subdivided into tiny, soundproofed filter bubbles, where we are only in touch with like-minded folks who share our prejudices and interests. As Marshall McLuhan predicted in 1969, emerging electronic technology had a decentralizing effect, and promoted “discontinuity and diversity and division”; in fact, the hallmarks of the new “global village” were less “uniformity and tranquility” than “conflict and discord.”

pages: 267 words: 82,580

The Dark Net
by Jamie Bartlett
Published 20 Aug 2014

In January 2014, Robinson was convicted of mortgage fraud and sentenced to eighteen months in prison. At the time of writing – June 2014 – he is out on early release. p.69 ‘Creating our own realities is nothing new . . .’ The American academic Eli Pariser has documented something he calls online ‘the filter bubble’: people increasingly surround themselves with information that corroborates their own world view and reduces their exposure to conflicting information. Pariser, E., The Filter Bubble: What the Internet is Hiding From You. In the UK, we already have what is called a ‘reality–perception gap’. For example, in a 2011 survey, 62 per cent of respondents thought of ‘asylum seekers’ when asked what they associate with immigrants.

Boyd, D., It’s Complicated: The Social Lives of Networked Teens. An incredibly useful and clear-eyed account of young people’s relationship with social networks. Hafner, K. and Lyon, M., When Wizards Stay Up Late: The Origins of the Internet. Krotoski, A., Untangling the Web: What the Internet is Doing to You. Pariser, E., The Filter Bubble: What the Internet is Hiding from You. Suler, J., ‘The Online Disinhibition Effect’, in CyberPsychology and Behaviour. An extremely influential theory about what effect that communicating from behind a screen has on us. Turkle, S., The Second Self; Life On the Screen and Alone Together. Sherry Turkle is without question one of the world’s experts on this subject, and someone whose studies on the impact of computers on human behaviour and identity are required reading.

pages: 304 words: 80,143

The Autonomous Revolution: Reclaiming the Future We’ve Sold to Machines
by William Davidow and Michael Malone
Published 18 Feb 2020

We suspect that information would put us at the bottom of the candidate list of any job opening for which we might apply. Algorithms constrain our lives in virtual space as well, whether we have done anything regrettable or not. They analyze our interests and select the things we see. In doing so, they limit the range of things to which we might be exposed. As Eli Pariser puts it in his book The Filter Bubble, “you click on a link, which signals your interest in something, which means you are more likely to see articles about that topic” and then “you become trapped in a loop.”22 You are being shown a distorted view of the world. In a very tangible sense, you are the subject of discrimination. If you’re having trouble finding a job as a software engineer, it may be because you got a low score from the Gild, a company that predicts the skills of programmers by evaluating the open-source codes they have written, the language they use on LinkedIn, and how they answer questions on software social forums.23 Algorithmic prisons are not new.

“Muslim-Amercian Group Criticizes TSA Plan and Profiling,” CNN, January 4, 2010, http://www.cnn.com/2010/CRIME/01/04/tsa.measures.muslims/ (accessed June 27, 2019); “TSA’s New Screening Targets Certain Passengers for ‘Enhanced’ Checks,” DailyTech, October 23, 2013, http://www.dailytech.com/TSAs+New+Screening+Targets+Certain+Passengers+for+Enhanced+Checks/article33597.htm. 22. Eli Pariser, The Filter Bubble (New York: Penguin Press, 2011), 125. 23. Don Peck, “They’re Watching You at Work,” The Atlantic, December 2013, http://www.theatlantic.com/magazine/archive/2013/12/theyre-watching-you-at-work/354681/ (accessed June 27, 2019). 24. “Equifax,” Wikipedia, http://en.wikipedia.org/wiki/Equifax (accessed June 27, 2019). 25.

See race and ethnicity European Union, 14, 128–129 expertise, impairment with, 2–3 Facebook, 43, 65, 70 addictive design elements of, 144 BAADD practices of, 88, 90, 91 content governance policies of, 168 cyber currency under, 10 emotion detection technology, 115 employee to user ratio for, 86, 105 evolution unpredictability of, 180 freemium business model profiting, 122–123 narcissistic personality proliferation on, 146–147 revenue, 150 Snapchat competition with, 91 usage decline, 154 facial recognition, 116 fake news, 18, 150, 168, 169–170 farming, 25, 152, 159, 160. See also Agricultural Revolution FarmVille, 140 FBI, 40, 124 Filter Bubble, The (Pariser), 125 financial crisis (2008), 73–74, 106–107, 178–179 financial industry: asset and hedge fund management automation in, 77–78 automatons/automation in, 10, 43, 77–78, 81–83, 102 business model transformations in, 74–83 cash to credit evolution in, 41–42 cybercrime, 39–40, 75–76, 78–80, 171–172, 177–178 employment in, 73, 102 Iceland’s reforms in, 178–179 information equivalence in, 74, 76–77, 83 mobile/cyber payment systems in, 10, 76–77, 80–82, 83, 171–172, 186 money transfer automation in, 78 non-monetizable productivity impacts on, 61 online banking evolution in, 10–11, 39–40 peer-to-peer payments systems in, 10, 76–77, 80–81, 83, 186 robo-advisers in, 10, 77, 82–83 structural transformations in, 10–11 substitutional equivalences in, 39–40, 41–42, 43, 45, 72 virtualization future in, 81–83, 102 virtual tellers, 81–82.

pages: 281 words: 83,505

Palaces for the People: How Social Infrastructure Can Help Fight Inequality, Polarization, and the Decline of Civic Life
by Eric Klinenberg
Published 10 Sep 2018

,” American Journal of Sociology 102, no. 3 (1996): 690–755. that confirm their beliefs: On inequality and class segregation, see Sean Reardon and Kendra Bischoff, “Income Inequality and Income Segregation,” American Journal of Sociology 116, no. 4 (2011): 1092–153. On the filter bubble, see Eli Pariser, The Filter Bubble (New York: Penguin Press, 2011). “discrimination based on race”: Shanto Iyengar and Sean Westwood, “Fear and Loathing Across Party Lines,” American Journal of Political Science 59, no. 3 (2015): 690–707. “threaten the nation’s well-being”: Pew Research Center, “Partisanship and Political Animosity in 2016,” June 22, 2016, http://www.people-press.org/​2016/​06/​22/​partisanship-and-political-animosity-in-2016/.

Citizens identified strongly with or against one of the two major parties, but—with exceptions for issues such as abortion, sexual morality, and capital punishment—they generally did not have firm or extreme views on most major policy matters. That’s changed in the last decades, however, as social inequality and class segregation have deepened, national news programs that transcended ideological lines have lost viewers, and the Internet has generated the rise of “filter bubbles,” where everyone can find facts and opinions that confirm their beliefs. All of this feeds the kind of in-group connection that social scientists call “bonding social capital,” but starves us of the “bridging social capital” we need to live together. Since 2008, Americans have become deeply divided on a wide variety of issues, and on some, such as climate change, criminal justice, and immigration, leading political officials and popular media personalities champion “alternative facts” over mainstream scientific findings.

pages: 288 words: 81,253

Thinking in Bets
by Annie Duke
Published 6 Feb 2018

It provides the promise of access to a greater diversity of information sources and opinions than we’ve ever had available, yet we gravitate toward sources that confirm our beliefs, that agree with us. Every flavor is out there, but we tend to stick with our favorite. Making matters worse, many social media sites tailor our Internet experience to show us more of what we already like. Author Eli Pariser developed the term “filter bubble” in his 2011 book of the same name to describe the process of how companies like Google and Facebook use algorithms to keep pushing us in the directions we’re already headed. By collecting our search, browsing, and similar data from our friends and correspondents, they give users headlines and links that cater to what they’ve divined as our preferences.

“Possible Selves as Roadmaps.” Journal of Research in Personality 38, no. 2 (April 2004): 130–49. Oyserman, Daphna, Mesmin Destin, and Sheida Novin. “The Context-Sensitive Future Self: Possible Selves Motivate in Context, Not Otherwise.” Self and Identity 14, no. 2 (March 2015): 173–88. Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You. New York: Penguin, 2011. Paulos, John. Innumeracy: Mathematical Illiteracy and Its Consequences. New York: Hill & Wang, 1989. Pollan, Michael. In Defense of Food: An Eater’s Manifesto. New York: Penguin, 2008. Pollan, Michael, “History of Nutritionism” in “Michael Pollan and ‘In Defense of Food: The Omnivore’s Solution,’” Otis Lecture at Bates College, Lewiston, Maine, October 27, 2008. http://www.bates.edu/food/foods-importance/omnivores-solution/history-of-nutritionism.

Strangelove, 19, 243n Duarte, José, 146 Duhigg, Charles, 106–7, 109, 115 Dyson, Freeman, 243n Dyson, George, 243n Easterbrook, Frank, 227–29 echo chambers, 128–29, 141, 144–45, 206 Eli Lilly, 150 Ellis Island, 53 emotions, 191, 194–96, 198–200 employees, 48 hiring of, 42–43, 145, 172n Epstein, Theo, 100 ESPN, 91, 99 evolution, 11–13, 51–52, 64, 110 natural selection in, 91n–92n, 103 experience: learning and, 78–80, 82, 88, 89, 91, 93–95 processing of, 57–59 experts, 28, 78–79, 149–50, 158–59 exploratory thought, 128–29, 134 extinct species, 67–68 Facebook, 61 fake news, 60 false positives and false negatives, 12, 52 fat, 54–55, 62, 85–86, 164–65 feedback, 78–82 Fey, Tina, 250n Feynman, Richard, 72n, 156, 166, 170 filter bubble, 61 Firestein, Stuart, 27, 246n FiveThirtyEight.com, 6, 32, 230, 245n flat tire, 190–91, 194–96, 200 Florida Marlins, 98–100 food and diet, 54, 202–3, 221 fat in, 54–55, 62, 85–86, 164–65 SnackWell’s, 85–86, 179 football, 159 Super Bowl, 5–7, 10, 22, 46, 48, 165–66, 216–18, 241n–42n Foreign Service, 139–40 FoxSports.com, 6 Freedom of Information Act, 157 Freeman Spogli Institute for International Studies, 185 free press and free speech, 156 future, 46, 79–81, 88, 174, 178 age-progression software and, 184–87 backcasting from, 218–22, 225, 226 betting on, 79–80, 209 imagining, 183, 221 negative, working backward from, 221–26 reconnaissance and, 207–12, 218 retirement planning and, 182, 184–86, 203 scenario planning and, 209–18 see also outcomes; time travel, mental gambling, 4, 39, 45, 194–96 negotiated settlements in, 40 slot machines, 87–88 see also poker game theory, 19–21, 23, 242n, 243n General Electric, 150 generalized characterizations, 205 genes, 49, 83, 91n–92n, 103 Gibney, Alex, 99 Gilbert, Daniel, 50–52, 104 goals, 82, 84, 108, 111, 174 adjusting, 226 working backward from, 220–24 Goldman, William, 26, 244n golf, 83, 109, 247n Gonzalez, Alex, 100 Google, 52–53, 61, 150, 184 grain bin explosion, 228–29 grant prospecting, 213–15, 217–18 gratification, immediate or delayed, 182n, 226 Green Bay Packers, 159 Grey, David, 135 groups: for decision making and truthseeking, see decision groups reasoning styles in, 128–29 groupthink, 128 habits, 105–11, 115, 117, 133, 134, 160, 188 loops in, 106–7, 109 Haidt, Jonathan, 104, 129–30, 145–46, 157 Half-Life of Facts, The (Arbesman), 67–68, 219 Hamm, Mia, 109 Hansen, Gus, 244n happiness, 104, 231 individual moments and, 190–91, 193–94, 199 Happiness Hypothesis, The (Haidt), 104 Harrington, Dan, 244n, 248n Harvard Business Review, 96, 219 Hastorf, Albert, 57–58 Hearst, William Randolph, 60 heart disease, 55, 164–65 Heider, Fritz, 89 Hellmuth, Phil, 90–91 Hennigan, John, 37–43, 48, 79, 135 Heterodox Academy, 146–49, 153, 172n Hills, The, 119 hindsight bias, 10, 26, 212, 227–31 home buyers, 202 Howard, Ronald, 190n Huxley, Aldous, 78–79 Ignorance: How It Drives Science (Firestein), 27 immigrants, 53, 140 improvisation, 173–74, 207, 250n information, 28, 34, 70, 136 beliefs and, 49–53, 55–56, 59–61, 66, 70, 94, 138–39 communication of, 166, 173 disinformation, 60 hidden and incomplete, 20–23, 25, 26, 33–35, 45, 81, 87, 158 new, 55–56, 61, 70, 173 processing of, 55, 59, 62, 123, 165 sharing of, 156–60 source of, 161 Institute for Advanced Study, 243n, 246n intellectual property, 157 International Academy of Trial Lawyers (IATL), 93 Internet, 60–61, 145 investments, 44–45, 191–93, 195, 196, 203 IQ, 147 Ira the Whale, 135 irrationality, see rationality and irrationality Ivey, Phil, 105–6, 108, 112, 116n JAMA Internal Medicine, 164 Jenner, Brody, 120 Jentz v.

pages: 297 words: 84,009

Big Business: A Love Letter to an American Anti-Hero
by Tyler Cowen
Published 8 Apr 2019

I can’t say I was unhappy with these expenditures, as they did drive some traffic to our site. But it was hardly possible to manipulate people like zombies, and we have moved away from buying the ads, even though they helped give us an early boost. That is a much more typical Facebook advertising story than what you might hear about these days. The idea of a “filter bubble” is another criticism of Facebook, but it is not supported by the facts. So many times I have heard that Facebook or other social media put us in worlds where conservatives listen only to conservatives and progressives only to progressives, or some such similar complaint about echo chambers. Maybe at times that feels true, but the numbers just don’t support the fear, at least not so far.

class Clinton, Hillary Coase, Ronald cognition cognitive dissonance cognitive efficiency cognitive strengths Collison, Patrick and John compensating differential conspiracy theories control firms co-ops copyright corporations attempts to sway public opinion downside of personalization public dislike of Countrywide “creative destruction” credit cards credit card information credit card system privacy and crony capitalism business influence on government class and multinational corporations overview privilege and state monopoly status quo bias See also capitalism cryptocurrencies See also Bitcoin Csikszentmihalyi, Mihaly Curry, Stephen CVS cybersecurity “daily effective experiences” See also Kahneman, Daniel; Krueger, Alan Daley, William Damaske, Sarah Damore, James daycare defense spending DejaNews Democratic Party Desan, Mathieu Deutsche Bank discrimination Dollar General Dow Scrubbing Bubbles Dream of the Red Chamber DuckDuckGo Dying for a Paycheck (Pfeffer) eBay education email employment/unemployment European Union ex post Exxon eyeglass companies Facebook advertising and AI and “anti-diversity memo” censorship and China and competition and complaints about employees “filter bubble” income inequality and information and innovation and monopoly and News Feed politics and privacy and Russian-manipulated content venture capital and See also Zuckerberg, Mark facial recognition technology “fake news” See also media Fama, Eugene fast-food Fehr, Ernst Ferguson, Niall financial crisis financial sector America as tax and banking haven American stock performance banks “too big” global importance of US growth information technology and intermediation overview venture capital and American innovation Financial Times fintech flow Ford Motor Company Foreign Corrupt Practices Act Foroohar, Rana fraud, businesses and CEOs in laboratory games comparative perspective cross-cultural game theory nonprofits vs. for-profits overview research on corporate behavior spread of information and tax gap trust and free trade French, Kenneth Friedman, Milton Friendster Fritzon, Katarina fundraising Gabaix, Xavier Gates, Bill GDP General Agreement on Tariffs and Trade General Electric General Motors Gilens, Martin Glass-Steagall Act Gmail Goetzmann, William N.

profitability short-termism and venture capitalism and See also income; nonprofit institutions publishing Rand, Ayn Reagan, Ronald See also Republican Party Renaissance rent Republican Party See also Reagan, Ronald; Trump, Donald resale price maintenance (RPM) risk-taking Rite Aid Romney, Mitt Russia Sanders, Bernie Sara Lee Saudi Arabia Scrubbing Bubbles (animated characters) sexual harassment Shell shell companies Shephard, Alex short-termism Shu, Pian smartphones See also Apple Smith, Adam Smyth, Joshua M. Snapchat social media 2016 election and advertising and big business and CEOs and economy and effect on American society “filter bubble” and generational influence privacy and trust and workers and See also Facebook; Instagram; Twitter social responsibility Social Security socialism SpaceX See also Musk, Elon Spool Staiger, Douglas O. Starbucks start-ups Stephens-Davidowitz, Seth stockbrokers Stripe Sturzenegger, Federico subsidies SunTrust superachievers superficiality superfirms super-low See prices supermarkets superstar companies supply-and-demand model supply chains, global Supreme Court See also Citizens United decision SWIFT wire transfer Symantec Tabarrok, Alex Taiwan take-down of internet content talent attracting benefits and CEOs and corporations and education and finance and innovation and monopoly and venture capitalism and tariffs TARP bailout program taxes America as tax and banking haven business tax cuts Earned Income Tax Credit filing globalization and individual income tax lobbying and mobility and tax avoidance tax exemptions tax gap tax rates tax reform See also income; Internal Revenue Service T-bills See also U.S.

pages: 439 words: 131,081

The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World
by Max Fisher
Published 5 Sep 2022

They settled on the end of 2016, four years away, and Goodrow later pledged to resign if he failed. It set YouTube hurtling toward a self-imposed deadline, its executives and engineers bent on pushing content that would hook users for as long as possible, in parallel with a presidential election in which its influence would prove fateful. 2. Filter Bubbles CHASLOT WAS NOT the only one in the Valley worried about the consequences of algorithms. A certain phrase had circulated, as shorthand for those concerns, since the previous summer, in 2011. One morning that May, as Chaslot labored over his workstation at Google’s Los Angeles office, his corporate chiefs filed into a convention hall across town, where a thirty-year-old activist named Eli Pariser walked on stage to warn the audience of tech executives and engineers that their algorithms might threaten democracy itself.

No surprise: he is a progressive activist who for several years ran the left-wing organizing site MoveOn.org. The change probably increased his time on Facebook. But was this good for him, to show him only posts that spoke to his preexisting biases? Was it good for society? He had a name for the effect: filter bubbles. The simplest algorithmic sorting can alter people’s attitudes severely enough to swing elections. In one 2015 experiment, Americans were told to choose between two fictional candidates by researching them online. Each participant was shown the same thirty search results on a Google mockup, but in different orders.

They had read all about it on Facebook, which was where they often discussed the “refugee situation,” one said. Traunstein leans liberal but is politically split, and I asked the woman if she ever got into arguments about refugees online. She seemed confused by the question. “Everyone feels this way,” she said. Her filter bubble, unanimous in fear, had become her reality. She, like Wassermann and his online friends, like Wolff’s other students, like the locals that Guske implored to take down racist falsehoods, were the submerged mass of an iceberg of society-wide social media radicalization. Denkhaus, the firefighter-arsonist, was just its tip.

pages: 170 words: 49,193

The People vs Tech: How the Internet Is Killing Democracy (And How We Save It)
by Jamie Bartlett
Published 4 Apr 2018

The basics of what this is doing to politics is now fairly well-trodden stuff: the splintering of established mainstream news and a surge of misinformation allows people to personalise their sources in ways that play to their pre-existing biases.5 Faced with infinite connection, we find the like-minded people and ideas, and huddle together. Brand new phrases have entered the lexicon to describe all this: filter bubbles, echo chambers and fake news. It’s no coincidence that ‘post-truth’ was the word of the year in 2016. At times ‘post-truth’ has become a convenient way to explain complicated events with a simple single phrase. In some circles it has become a slightly patronising new orthodoxy to say that stupid proles have been duped by misinformation on the internet into voting for things like Brexit or Trump.

Suler argues that because we don’t know or see the people we are speaking to (and they don’t know or see us), because communication is instant, seemingly without rules or accountability, and because it all takes place in what feels like an alternative reality, we do things we wouldn’t in real life. Suler calls this ‘toxic disinhibition’. This is what all the articles about ‘echo chambers’ and ‘filter bubbles’ miss. The internet doesn’t only create small tribes: it also gives easy access to enemy tribes. I see opposing views to mine online all the time; they rarely change my mind, and more often simply confirm my belief that I am the only sane person in a sea of internet idiots. * * * • • • It’s unfair to lay all this at the door of Big Tech, since much of this is a human, not technological, weakness.

pages: 23 words: 5,264

Designing Great Data Products
by Jeremy Howard , Mike Loukides and Margit Zwemer
Published 23 Mar 2012

This is not to say that Amazon’s recommendation engine could not have made the same connection; the problem is that this helpful recommendation will be buried far down in the recommendation feed, beneath books that have more obvious similarities to “Beloved.” The objective is to escape a recommendation filter bubble, a term which was originally coined by Eli Pariser to describe the tendency of personalized news feeds to only display articles that are blandly popular or further confirm the readers’ existing biases. As with the AltaVista-Google example, the lever a bookseller can control is the ranking of the recommendations.

pages: 420 words: 100,811

We Are Data: Algorithms and the Making of Our Digital Selves
by John Cheney-Lippold
Published 1 May 2017

When who we are is made from shifting founts of measurable-type meaning, the network power of algorithmic machines, citing Galloway and philosopher Eugene Thacker, “set[s] the terms within which practices may possibly occur.”82 This mode of shifting algorithmic identity is what I have previously called soft biopolitics, which I will describe further in chapter 2.83 In the preceding analysis of user profiling, I’m not referring to critiques of personalization technologies like those spelled out in legal scholar Cass Sunnstein’s Republic.com 2.0 or author Eli Pariser’s The Filter Bubble, in which data profiles construct self-affirming echo chambers of targeted advertisements and content according to one’s presumed identity, political affiliation, or interests.84 I’m instead describing a form of control that is much more indirect and unapparent. It’s a form that moves the goal posts of what is and what is not true, algorithmically regulating discursive construction largely beyond our gaze and most often without our comprehension.

Alexander R. Galloway and Eugene Thacker, The Exploit: A Theory of Networks (Minneapolis: University of Minnesota Press, 2007), 42. 83. Cheney-Lippold, “New Algorithmic Identity.” 84. Cass Sunstein, Republic.com 2.0 (Princeton, NJ: Princeton University Press, 2009); and Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin Books, 2012). 85. Michel Foucault, Power/Knowledge (Brighton, UK: Harvester, 1980). 86. Judith Butler, Excitable Speech: A Politics of the Performative (New York: Routledge, 1997), 34. 87. Although Flanagan talks explicitly about women’s bodies, I use her concept to engage with the lived virtuality of these “creatures.”

Nigel Thrift, Knowing Capitalism (Thousand Oaks, CA: Sage, 2005), 172. 40. Aihwa Ong, Neoliberalism as Exception: Mutations in Citizenship and Sovereignty (Durham, NC: Duke University Press, 2006), 124. 41. Tung Hui-Hu, A Prehistory of the Cloud (Cambridge, MA: MIT Press, 2015), 146. 42. Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin Books, 2012). 43. Antoinette Rouvroy, “The End(s) of Critique: Data Behaviourism versus Due Process,” in Privacy, Due Process, and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology, ed.

pages: 180 words: 55,805

The Price of Tomorrow: Why Deflation Is the Key to an Abundant Future
by Jeff Booth
Published 14 Jan 2020

We know that our thoughts and actions are highly influenced by what we see, read, and listen to, so that means that the technology that targets us naturally creates filter bubbles. We might not even realize where an idea was implanted in our brains in the first place. As we click on the things we like, these bubbles reinforce on themselves—deepening the connections in our brains and hardening our views. We rarely look outside our own bubble of reality; and when we do, the people in other filter bubbles look downright crazy. It could be religion, politics, economics, race, or any number of other divisions. The conjunction of those three things—1) Maslow’s hierarchy, where many in the world or even our own backyard are at very different stages of the pyramid; 2) technology that targets us individually and therefore reinforces belief patterns; and 3) a natural tendency in humans to create “us versus them”—has the potential to create a dangerous reinforcing loop where hate and division reign.

pages: 598 words: 134,339

Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World
by Bruce Schneier
Published 2 Mar 2015

The first listing in a Google search: Chitika Online Advertising Network (7 Jun 2013), “The value of Google result positioning,” https://cdn2.hubspot.net/hub/239330/file-61331237-pdf/ChitikaInsights-ValueofGoogleResultsPositioning.pdf. the Internet you see: Joseph Turow (2013), The Daily You: How the New Advertising Industry Is Defining Your Identity and Your Worth, Yale University Press, http://yalepress.yale.edu/yupbooks/book.asp?isbn=9780300165012. the “filter bubble”: Eli Pariser (2011), The Filter Bubble: What the Internet Is Hiding from You, Penguin Books, http://www.thefilterbubble.com. on a large scale it’s harmful: Cass Sunstein (2009), Republic.com 2.0, Princeton University Press, http://press.princeton.edu/titles/8468.html. We don’t want to live: To be fair, this trend is older and more general than the Internet.

The first listing in a Google search result gets a third of the clicks, and if you’re not on the first page, you might as well not exist. The result is that the Internet you see is increasingly tailored to what your profile indicates your interests are. This leads to a phenomenon that political activist Eli Pariser has called the “filter bubble”: an Internet optimized to your preferences, where you never have to encounter an opinion you don’t agree with. You might think that’s not too bad, but on a large scale it’s harmful. We don’t want to live in a society where everybody only ever reads things that reinforce their existing opinions, where we never have spontaneous encounters that enliven, confound, confront, and teach us.

W., 230 Bush, George W., 230 business models, surveillance-based, 50, 56, 113–14, 206 Buzzfeed, 28–29 cable companies, surveillance by, 47–48 CALEA (Communications Assistance for Law Enforcement Act; 1994), 83, 120, 165 need for repeal of, 182 Callahan, Mary Ellen, 162–63 Cameron, David, 222, 228 Canada, in international intelligence partnerships, 76 Caproni, Valerie, 83 Carnegie Mellon University, 41 Carter, Jimmy, 230 cash registers, as computers, 14 cell phone metadata: NSA collection of, 20–21, 36, 37, 62, 138, 339 Stanford University experiment on, 21–22 cell phones: GPS-enabled, 3, 14 multiple functions of, 46 NSA’s remote activation of, 30 as surveillance devices, 1–3, 14, 28, 39, 46–47, 62, 100, 216–17, 219, 339 wiretapping of, 148 censorship, 94–95, 106–7, 187–88 self-, 95, 96 Census Bureau, US, 197 Central Intelligence Agency (CIA), 67 in domestic surveillance operations, 104 Senate Intelligence Committee hacked by, 102 Chambers, John, 122 Charter of Fundamental Rights of the European Union, 232, 364 chat services, 13, 83, 119, 226 government surveillance of, 29, 62, 81 checks and balances: oversight and, 175 secrecy and, 100 Chicago Police Department, 160 China: censorship in, 94, 95, 150–51, 187, 237 cyberattacks from, 42, 73, 132, 142, 148, 149, 180 50 Cent Party in, 114 mass surveillance by, 70, 86, 140, 209 Uighur terrorists in, 219, 287 ChoicePoint, 79, 116 Christie, Chris, 102 Church committee, 176 Cisco, 85, 122 Clapper, James, 129, 130, 336 Clinton, Hillary, 101, 106 Clinton administration, 120 Clipper Chip, 120–21 cloud computing, 5, 59, 60 consumer rights and, 60, 221 government surveillance and, 122 incriminating materials and, 59, 272 CNET, 125 Cobham, 3, 244 Code of Fair Information Practices (1973), 194 Code Pink, 104 Cohen, Jared, 4 COINTELPRO, 103 Cold War, 63, 71, 75, 207, 229 “collect,” NSA’s use of term, 129 Comcast, 358 as information middleman, 57 surveillance by, 48–49 commons, as lacking on Internet, 188–89 communication: computers as devices for, 13–14 ephemeral vs. recorded, 127–29 Communications Assistance for Law Enforcement Act see CALEA Communications Security Establishment Canada (CSEC), 40–41 Communists, Communism, 92–93 fall of, 63 complexity, as enemy of security, 141 Comprehensive National Cybersecurity Initiative, 69 computers, computing: cash registers as, 14 as communication devices, 13–14 cost of, 24 data as by-product of, 3–4, 5, 13–19 increasing power of, 35 smartphones as, 14 see also electronic devices Computer Security Act (1987), 187 COMSEC (communications security), 164–65 Congress, US, 237 NSA oversight by, 172–76 privacy laws and, 198–99 secrecy and, 100 “connect-the-dots” metaphor, 136, 139, 322 consent, as lacking in mass surveillance, 5, 20, 51 Consent of the Networked (MacKinnon), 210, 212 Constitution, US: Bill of Rights of, 210 First Amendment of, 189 Fourth Amendment of, 67, 156, 170 warrant process and, 92, 179, 184 Consumer Privacy Bill of Rights (proposed), 201, 202 consumer rights: cloud computing and, 30 data collection and, 192–93, 200–203, 211 convenience, surveillance exchanged for, 4, 49, 51, 58–59, 60–61 cookies, 47–48, 49 correlation of, 49 correlation, of data sets, 40–45, 49, 133, 263–64 Counterintelligence Field Activity, 69, 104 counterterrorism: excessive secrecy in, 171 as FBI mission, 184, 186 fear and, 222, 226, 227–30 mass surveillance as ineffective tool in, 137–40, 228 as NSA mission, 63, 65–66, 184, 222 NSA’s claimed successes in, 325 Creative Cloud, 60 credit bureaus, as data brokers, 52 credit card companies, data collected by, 14, 23–24 credit card fraud, 116, 313 data mining and, 136–37 credit cards, RFID chips on, 29 credit scores, 112–13, 159, 196 Credit Suisse, 35–36 CREDO Mobile, 207 Cryptocat, 215 cryptography, see encryption cultural change: systemic imperfection and, 163–64 transparency and, 161 Customer Relations Management (CRM), 51–52 customer scores, 110–11 Cyber Command, US, 75, 146, 180–81, 186, 187 cybercrime, increasing scale of, 116–19, 142 cyber sovereignty, 187–88 cyberwarfare, 74–75, 81, 132, 220 arms race in, 180–81 attack vs. defense in, 140–43 collateral damage from, 150–51 military role in, 185–86 NIST’s proposed defensive role in, 186–87 see also Cyber Command, US Dalai Lama, 72 Daniel, Jon, 101 data: analysis of, see data mining as by-product of computing, 3–4, 5, 13–19 historical, 35–37 increasing amount of, 18–19 see also metadata data broker industry, 2, 5, 41, 48, 51–53, 79, 234 correction of errors in, 269 customer scores in, 110–11 lack of consent in, 5, 51 data collection, 234 accountability and, 193, 196, 197–99 benefits of, 8, 190 fiduciary responsibility and, 204–5 government regulation and, 197–99 harms from, 8 health and, 16 limits on, 191, 192, 199–200, 202, 206 NSA definition of, 129, 320 opt-in vs. opt-out consent in, 198 respect for context in, 201 rights of individuals in, 192–93, 200–203, 211, 232 salience of, 203–4 security safeguards in, 192, 193–95, 202, 211 from social networking sites, 200–201 specification of purpose in, 192 see also mass surveillance Dataium, 195–96 data mining, 33–45 adversarial relationships and, 138–39 algorithmic-based, 129–31, 136–37, 159, 196 anonymity and, 42–45 correlation of data sets in, 40–45, 49, 133 credit card fraud and, 136–37 of historical data, 35–37 inferences from, see inferences, from data mining limits on uses of, 191, 192, 195–97, 206 personalized advertising and, 33, 35, 38 political campaigns and, 33, 54 quality assurance and, 34, 54, 136–37, 192, 194, 202 relationship mapping in, 37–38 security threats and, 136–40 tax fraud and, 137 data storage: capacity for, 18–19 cloud-based, 5, 59 limits on, 191, 199–200, 206 low cost of, 5, 18, 24, 144, 206 “save everything” model of, 34 Datensparsamkeit, 200 de-anonymizing, by correlation of data sets, 43–44, 263–64 Declaration of the Rights of Man and of the Citizen, 210 Defense Department, US: Counterintelligence Field Activity of, 69, 104 Cyber Command of, 75 domestic surveillance by, 69, 184 Defentek, 3 delete, right to, 201–2 democracy: government surveillance and, 6, 95, 97–99, 161–62, 172–73 whistleblowers as essential to, 178 demographic information, data brokers and, 52 denial-of-service attacks, 75 Department of Homeland Security, US, 27, 162–63, 295–96 deportation, discrimination and, 93 DigiNotar, hacking of, 71–72 direct marketing, 52 discrimination: corporate surveillance and, 109–13 government surveillance and, 4, 6, 93, 103–4 in pricing, 109–10 DNA sequencing, 16 de-anonymizing of, 44 DNS injection, 150–51 Doctorow, Cory, 217 “Do Not Track” debate, 80 Do Not Track law, California, 233 DoNotTrackMe, 49 “Don’t Ask Don’t Tell” policy, 197 DoubleClick, 48 Drake, Thomas, 101 Dread Pirate Roberts (Ross Ulbricht), 105 drone helicopters, 25, 29 micro-, 253 drone strikes, mass surveillance and, 94 Drug Enforcement Administration (DEA), 104, 105 Dubai, 27, 43 DuckDuckGo, 124 due process, 168, 184 Duffy, Tim, 227 East Germany, 23 eBay, 57–58 Economist, 91 EDGEHILL, 85 education, collection of data and, 8 Eisenhower, Dwight D., 230 Elbit Systems, 81 Elcomsoft, 150 electronic devices, vendor control of, 59–60 Ello, 124 Ellsberg, Daniel, 101 e-mail, 119, 226 local vs. cloud storage of, 31 Emanuel, Rahm, 234 encryption, 85–86, 224, 344 backdoors and, 86, 120–21, 123, 147–48, 169, 182, 314 business competitiveness and, 119–24 increased corporate use of, 208, 224 individual use of, 215 key length in, 143 NIST and, 186–87 NSA and, 144, 186 NSA undermining of standards for, 148–49 secrecy and, 171 value of, 143–44 Engel, Tobias, 3 Environmental Protection Agency (EPA), pollution regulation by, 194–95 ephemerality, of communication, 127–29 Epsilon, 41 Equifax, 53 error rates, in data mining, 34, 54, 136–37, 269 espionage, 63, 73, 74, 76, 158 surveillance vs., 170, 183–84 Espionage Act (1917), 101 Estonia, cyberattacks on, 75, 132 Ethiopia, 73 European Charter, 169 European Court of Justice, 202, 222 European Parliament, 76 European Union (EU), 195, 200, 202, 226, 238 Charter of Fundamental Rights of, 232, 364 Data Protection Directive of, 19, 79, 80, 159, 191, 209 data retention rules in, 222 Exact Data, 42 executive branch: abuses of power by, 234–35 secrecy of, 100, 170 Executive Order 12333, 65, 173 Facebook, 58, 59, 93, 198 customer scores and, 111 data collection by, 19, 31, 41, 123, 200, 201, 204 as information middleman, 57 manipulation of posts on, 115 paid placements on, 114 real name policy of, 49 Facebook, surveillance by: data-based inferences of, 34, 258 Like button and, 48 relationship mapping by, 37–38 tagged photo database of, 41 face recognition, automatic, 27, 29, 31, 41, 211 fair information practices, 194, 211 fair lending laws, 196 false positives, 137, 138, 140, 323–24 Farrell, Henry, 60 FASCIA, 3 fatalism, mass surveillance and, 224–25 fear: government surveillance and, 4, 7, 95–97, 135, 156–57, 182–83, 222, 226, 227–30 media and, 229 politicians and, 222, 228 privacy trumped by, 228 social norms and, 227–30 Federal Bureau of Investigation (FBI): CALEA and, 83, 120 COINTELPRO program of, 103 cost to business of surveillance by, 121–22 counterterrorism as mission of, 184, 186 data mining by, 42 GPS tracking by, 26, 95 historical data stored by, 36 illegal spying by, 175 IMSI-catchers used by, 165 legitimate surveillance by, 184 Muslim Americans surveilled by, 103 PATRIOT Act and, 173–74 phone company databases demanded by, 27, 67 surveillance of all communications as goal of, 83 warrantless surveillance by, 67–68, 209 wiretapping by, 24, 27, 83, 171 Federal Communications Commission (FCC), 198 Federal Trade Commission, US (FTC), 46–47, 53, 117, 198 Feinstein, Diane, 172 Ferguson, Mo., 160 fiduciary responsibility, data collection and, 204–5 50 Cent Party, 114 FileVault, 215 filter bubble, 114–15 FinFisher, 81 First Unitarian Church of Los Angeles, 91 FISA (Foreign Intelligence Surveillance Act; 1978), 273 FISA Amendments Act (2008), 171, 273, 275–76 Section 702 of, 65–66, 173, 174–75, 261 FISA Court, 122, 171 NSA misrepresentations to, 172, 337 secret warrants of, 174, 175–76, 177 transparency needed in, 177 fishing expeditions, 92, 93 Fitbit, 16, 112 Five Eyes, 76 Flame, 72 FlashBlock, 49 flash cookies, 49 Ford Motor Company, GPS data collected by, 29 Foreign Intelligence Surveillance Act (FISA; 1978), 273 see also FISA Amendments Act Forrester Research, 122 Fortinet, 82 Fox-IT, 72 France, government surveillance in, 79 France Télécom, 79 free association, government surveillance and, 2, 39, 96 freedom, see liberty Freeh, Louis, 314 free services: overvaluing of, 50 surveillance exchanged for, 4, 49–51, 58–59, 60–61, 226, 235 free speech: as constitutional right, 189, 344 government surveillance and, 6, 94–95, 96, 97–99 Internet and, 189 frequent flyer miles, 219 Froomkin, Michael, 198 FTC, see Federal Trade Commission, US fusion centers, 69, 104 gag orders, 100, 122 Gamma Group, 81 Gandy, Oscar, 111 Gates, Bill, 128 gay rights, 97 GCHQ, see Government Communications Headquarters Geer, Dan, 205 genetic data, 36 geofencing, 39–40 geopolitical conflicts, and need for surveillance, 219–20 Georgia, Republic of, cyberattacks on, 75 Germany: Internet control and, 188 NSA surveillance of, 76, 77, 122–23, 151, 160–61, 183, 184 surveillance of citizens by, 350 US relations with, 151, 234 Ghafoor, Asim, 103 GhostNet, 72 Gill, Faisal, 103 Gmail, 31, 38, 50, 58, 219 context-sensitive advertising in, 129–30, 142–43 encryption of, 215, 216 government surveillance of, 62, 83, 148 GoldenShores Technologies, 46–47 Goldsmith, Jack, 165, 228 Google, 15, 27, 44, 48, 54, 221, 235, 272 customer loyalty to, 58 data mining by, 38 data storage capacity of, 18 government demands for data from, 208 impermissible search ad policy of, 55 increased encryption by, 208 as information middleman, 57 linked data sets of, 50 NSA hacking of, 85, 208 PageRank algorithm of, 196 paid search results on, 113–14 search data collected by, 22–23, 31, 123, 202 transparency reports of, 207 see also Gmail Google Analytics, 31, 48, 233 Google Calendar, 58 Google Docs, 58 Google Glass, 16, 27, 41 Google Plus, 50 real name policy of, 49 surveillance by, 48 Google stalking, 230 Gore, Al, 53 government: checks and balances in, 100, 175 surveillance by, see mass surveillance, government Government Accountability Office, 30 Government Communications Headquarters (GCHQ): cyberattacks by, 149 encryption programs and, 85 location data used by, 3 mass surveillance by, 69, 79, 175, 182, 234 government databases, hacking of, 73, 117, 313 GPS: automobile companies’ use of, 29–30 FBI use of, 26, 95 police use of, 26 in smart phones, 3, 14 Grayson, Alan, 172 Great Firewall (Golden Shield), 94, 95, 150–51, 187, 237 Greece, wiretapping of government cell phones in, 148 greenhouse gas emissions, 17 Greenwald, Glenn, 20 Grindr, 259 Guardian, Snowden documents published by, 20, 67, 149 habeas corpus, 229 hackers, hacking, 42–43, 71–74, 216, 313 of government databases, 73, 117, 313 by NSA, 85 privately-made technology for, 73, 81 see also cyberwarfare Hacking Team, 73, 81, 149–50 HAPPYFOOT, 3 Harris Corporation, 68 Harris Poll, 96 Hayden, Michael, 23, 147, 162 health: effect of constant surveillance on, 127 mass surveillance and, 16, 41–42 healthcare data, privacy of, 193 HelloSpy, 3, 245 Hewlett-Packard, 112 Hill, Raquel, 44 hindsight bias, 322 Hobbes, Thomas, 210 Home Depot, 110, 116 homosexuality, 97 Hoover, J.

pages: 677 words: 206,548

Future Crimes: Everything Is Connected, Everyone Is Vulnerable and What We Can Do About It
by Marc Goodman
Published 24 Feb 2015

Simply stated, Facebook, Google, and other Internet companies know that if they provide you the “right” stuff, you’ll spend more time on their sites and click on more links, allowing them to serve you up more ads. Facebook is by no means alone in this game, and Google too quantifies all your prior searches and, more important, what you’ve clicked on, in order to customize your online experience. In his book The Filter Bubble, the technology researcher Eli Pariser carefully documented the phenomenon. Getting you the “right” results is big business, and millions of computer algorithms are dedicated to the task. Google reportedly has at least fifty-seven separate personalization signals it tracks and considers before answering your questions, potentially to include the type of computer you are on, the browser you are using, the time of day, the resolution of your computer monitor, messages received in Gmail, videos watched on YouTube, and your physical location.

As a result, our digital lives, mediated through a sea of screens, are being actively manipulated and filtered on a daily basis in ways that are both opaque and indecipherable. This fundamental shift in the way information flows online shapes not only the way we are informed but the way we view the world. Most of us are living in filter bubbles, and we don’t even realize it. Around the world, nations are increasingly deciding what data citizens should be able to access and what information should be prohibited. Using compelling arguments such as “protecting national security,” “ensuring intellectual property rights,” “preserving religious values,” and the perennial favorite, “saving the children,” governments are ever expanding their national firewalls for the purpose of Internet censorship.

With each successive generation, we grow deeply comfortable, even if only unconsciously, with blindly following the directions provided to us by machines. Garbage in, garbage out has been supplanted by garbage in, gospel out: if the computer says so, it must be true. The problem with such reasoning is that we as a society are relying on incorrect data all the time, a festering problem that will come back to bite us. Filter bubbles, invisible search engine censorship, national firewalls, and faulty data mean we have a fundamental integrity problem with the way we see the world, or more precisely with the way the world is presented to us, mediated through our screens. When Seeing Ain’t Believing In the preceding chapters, we focused extensively on what happens when your data leak and your information confidentiality is breached.

pages: 202 words: 59,883

Age of Context: Mobile, Sensors, Data and the Future of Privacy
by Robert Scoble and Shel Israel
Published 4 Sep 2013

There is a dark side to these growing capabilities. We should watch for the unintended consequences that always seem to accompany significant change. The potential for data abuse and the loss of privacy head the list of concerns. Eli Pariser wrote a passionate and sincere argument about the loss of privacy in his 2011 book, The Filter Bubble. Pariser took a dark view of the fact that virtually every online site collects, shares and sells user data. He talked about how large organizations use data to stereotype people and then assume they know what we want to see and hear. By getting our eyeballs to stick to their web pages they then get us to click on ads they target at us.

In fact, there is truth to what he had to say and people should consider Pariser’s perspective as they make their own decisions about what to do and not do in the Age of Context. In our view, though, Pariser presented a one-sided perspective on a multi-sided and highly granular issue. The Filter Bubble overlooked the world-improving changes that big data is making. As Neo’s Eifrem sees it, “Fundamentally, companies like Neo build hammers. You can use them to build or to smash. Yes, there will be abuses and we must be vigilant about that, but the best solution to empowering people to find and learn what they need is contained in the new databases.

pages: 446 words: 109,157

The Constitution of Knowledge: A Defense of Truth
by Jonathan Rauch
Published 21 Jun 2021

He and Morgan Marietta in their 2019 book, One Nation, Two Realities: Dueling Facts in American Democracy, use the term “dueling fact perceptions” to describe partisan and polarized versions of reality. “The result is a further retreat into like-minded bubbles, creating more distance and distrust, fostering greater polarization.” Scholars have returned mixed findings about the prevalence of filter bubbles in social media. The polarizing and filtering effects of partisan television and talk radio are better documented and larger than the bubble-making effects of social media, according to Barker. But social media are harder to study, precisely because no two people ever see the same reality. The Constitution of Knowledge is designed to force disagreements toward comparison and resolution, insisting on contestability and building one reality out of many; digital media was designed to do just the opposite, undermining contestability and splintering one reality into many.

“There is little overlap in the news sources they turn to and trust.”53 The researchers added an important caveat: “The study also suggests that in America today, it is virtually impossible to live in an ideological bubble. Most Americans rely on an array of outlets—with varying audience profiles—for political news. And many consistent conservatives and liberals hear dissenting political views in their everyday lives.” Media bubbles, like filter bubbles, were not completely sealed. But they were growing more polarized, more isolated, and more radicalized—especially on the right. The point is not that only conservatives lived in echo-chambers or believed fake news; plenty of progressives did, too. Nor is the point that conservative media were biased and other media were not.

As I spoke with academics and university students, I lost count of the number who echoed Tuvel’s worry. In journalism and law and certain other sectors of the reality-based community, coercive conformity had made inroads, but in universities it was reconfiguring the whole intellectual landscape, for students and professors alike. As a professor told me, “There is no bigger filter bubble than any selective university in the United States. It is definitely the case that at these institutions, which are supposed to be founded on the idea of a marketplace of ideas, there are all kinds of expressions you can’t say now. Anything that relates to race or gender, you had best keep your mouth shut if you have a point of view that deviates from the predominant woke one.

pages: 371 words: 109,320

News and How to Use It: What to Believe in a Fake News World
by Alan Rusbridger
Published 26 Nov 2020

There, most newspapers attempt to be as balanced as they can, in their reporting at least, and it is the broadcasters – think Fox News and much of talk radio – which many regard as wildly biased. Both printed and broadcast media in the UK and US accuse social media platforms of creating echo chambers (SEE: ECHO CHAMBERS) or filter bubbles. This is usually done without irony. Bias, along with inaccuracy, is one of the main reasons citizens give for no longer trusting mainstream media. But the most biased media outlets are often the most popular. People appear not to like the idea of bias, but in practice they lap it up – provided it confirms their own.

Whether all journalists have lived up to the exhortation is a different thing. And ‘appealing to the enlightened force of public opinion’ is a noble ambition, although sometimes difficult to sustain either as a business model or in an age of populist politics. E ECHO CHAMBERS Do we now live in filter bubbles, listening only to the pre-selected voices who share our views and prejudices? This is a persistent criticism of the digital age – and it is a myth, at least according to computer and social scientists who have explored how people are behaving online. Two Oxford Internet Institute academics, Elizabeth Dubois and Grant Blank, published research in 2018 arguing that, in fact, we live in a ‘high-choice environment’ which leads most internet users to more diverse content and perspectives.

Imagine, by contrast, a world in which carefully pre-selected information (based to a large extent on your political leanings) was packaged up and delivered to your doorstep in a format which required you to accept one version of events. The only way of escaping that particular echo chamber was to make a trip out to the newsagent and purchase an alternative viewpoint in a different newspaper. Now, that really was a filter bubble. EDITORS Piers Morgan started freelancing in national newspapers at the age of twenty-three after studying journalism at Harlow College and a period on local newspapers in South London. For a while he wrote a showbiz column, hobnobbing with celebrities, often placing himself at the centre of stories.

pages: 606 words: 157,120

To Save Everything, Click Here: The Folly of Technological Solutionism
by Evgeny Morozov
Published 15 Nov 2013

For Johnson, it seems that the project of pursuing media reform through collective action happily coexists with the project of seeking better understanding of our consumption practices via self-tracking; those two seem to run on two separate tracks without much overlap. “Should corporations building personalization algorithms include mutations to break a reader’s filter bubble? . . . Absolutely. But readers should also accept responsibility for their actions and make efforts to consume a responsible, nonhomogenous [sic] diet, too,” argues Johnson. Perhaps this pervasive emphasis on personal responsibility and individual salvation is the outcome of the Protestant streak in geek mentality documented by Chris Kelty.

By relying on nudges and other similar tricks, it might suddenly become possible to get people to pay attention to Africa or North Korea. At first, such proposals flourished in the context of increasing “serendipity”—which is believed to be under perpetual assault by digital technologies. Thus, Eli Pariser in his Filter Bubble writes that “engineers . . . can solve for serendipity by designing filtering systems . . . to expose people to topics outside their normal experience.” How exactly would it work? Pariser wants Internet companies to actively serve content that they know you haven’t consumed—but think you should.

Daston and Peter Galison, Objectivity (New York: Zone Books, 2010), 115–191. 160 “Algorithms may bring us new artists”: Steiner, Automate This, 88. 161 As Joseph Turow points out: Joseph Turow, The Daily You: How the New Advertising Industry Is Defining Your Identity and Your Worth (New Haven, CT: Yale University Press, 2012). 162 “when an ad is served”: ibid., 126. 162 “a basketball fan receiving an ad”: ibid., 126. 162 “trying to figure out how”: ibid., 124. 162 “lots of firms are beginning to create”: ibid., 124–125. 162 “be nimble in the use of data”: ibid., 125. 162 “We are entering a world”: ibid., 7. 163 already employ algorithms to produce stories automatically: for more on this, see my Slate column: Evgeny Morozov, “A Robot Stole My Pulitzer,” Slate, March 19, 2012, http://www.slate.com/articles/technology/future_tense/2012/03/narrative_science_robot_journalists_customized_news_and_the_danger_to_civil_discourse_.html. 163 “I often wonder how many people”: Katy Waldman, “Popping the Myth of the Filter Bubble,” Slate, April 13, 2012, http://www.slate.com/articles/news_and_politics/intelligence_squared/2012/04/the_next_slate_intelligence_squared_debate_is_april_17_why_jacob_weisberg_rejects_the_idea_that_the_internet_is_closing_our_minds_in_politics_.single.html . 163 “beneficial inefficiency” that accompanied: David Karpf, The MoveOn Effect: The Unexpected Transformation of American Political Advocacy, 1st ed.

System Error: Where Big Tech Went Wrong and How We Can Reboot
by Rob Reich , Mehran Sahami and Jeremy M. Weinstein
Published 6 Sep 2021

The conversation has shifted to the other pole. Humans are being replaced by machines, and the future of work is uncertain. Private companies surveil in ways that governments never even contemplated and profit handsomely in the process. The internet ecosystem feeds hate and intolerance with its echo chambers and filter bubbles. The conclusion seems inescapable: our technological future is grim. However, we must resist this temptation to think in extremes. Both techno-utopianism and -dystopianism are all too facile and simplistic outlooks for our complex age. Instead of taking the easy way out or throwing our hands up in the air, we must rise to the defining challenge of our era: harnessing technological progress to serve rather than subvert the interests of individuals and societies.

Should it matter if the identical material is circulated by US citizens rather than foreign agents? Should the speech of elected leaders or candidates for public office be treated differently than the speech of ordinary citizens? Just as important, we must confront the algorithmic sorting of users into filter bubbles that contribute to growing polarization, extremism, and decreasing social trust, all of which threaten the health of democracy. An unwavering commitment to free speech in the digital age can thwart a third value: individual dignity. The internet is what made possible the situation faced by Nicole Silverberg, an online deluge of misogynistic and hate speech.

But the evidence so far is decidedly mixed—neither a full endorsement of Sunstein’s concerns nor an exoneration of the platforms and the speech they host and distribute. Let’s begin with the major worry of the pessimists: that online interactions diminish people’s exposure to diverse views, encasing users in filter bubbles of the like-minded and thereby exacerbating polarization. If this view were correct, the consequences for democracy would be devastating. However, the evidence points in another direction. As one recent analysis concluded, “Even if most political exchanges on social media take place among people with similar ideas, cross-cutting interactions are more frequent than commonly believed, exposure to diverse news is higher than through other types of media, and ranking algorithms do not have a large impact on the ideological balance of news consumption.”

pages: 412 words: 115,048

Dangerous Ideas: A Brief History of Censorship in the West, From the Ancients to Fake News
by Eric Berkowitz
Published 3 May 2021

“So long as platform profits rely on keeping users on-platform as long as possible,” conclude Gary and Soltani, “controversial and harmful speech will continue to proliferate.”73 In August 2020, despite its public renewal of intention to remove hateful content, particularly by right-wing militias, Facebook left up a militia’s page stating its members’ intention to kill protestors in Wisconsin.74 Another result of the attention economy is that we now exist in filter bubbles online: users’ beliefs and prejudices are intensified with a fire hose of sympathetic and often strident material, while inconsistent views are overwhelmed or diverted before users see them. “Platforms have little incentive to eliminate filter bubbles,” explains McNamee, “because they improve metrics that matter: time on site, engagement, sharing.”75 That perpetual sense of indignation we feel on social media is exhausting, but it keeps us tethered to our laptops and phones.

The most recent entry concerning climate change involved the government’s inserting misleading language into at least nine different scientific reports a few days after it had paused a six-year study of measures to reduce climate-related flood risks to New York and New Jersey. Evidently it is better not to prepare than to acknowledge that there is a problem. The censorship and undermining of climate science under the Bush and Trump administrations contributed to widespread refusals among many Americans to accept plain, verified facts. In certain filter bubbles, the dismissal of scientific findings—even those with urgent, widespread public health implications—has become a way to assert one’s “freedom” from the perceived intrusions of liberal politics and unwanted constructions of reality. As of June 2020, a similar partisan divide had arisen in the US over whether to accept the dangers of coronavirus contagion.

pages: 404 words: 115,108

They Don't Represent Us: Reclaiming Our Democracy
by Lawrence Lessig
Published 5 Nov 2019

In the slogan of the day, the Internet “interprets censorship as damage and routes around it.”45 Using a protocol called “RSS,” people could mash together their own newspaper. And as artificial intelligence (AI) got great, the newspapers themselves would evolve to give us what we wanted in real time. The news no less than anything else on the Internet became part of the “Filter Bubble,” to use CEO of Upworthy Eli Pariser’s powerful and apt description of the epistemology of the emerging Net.46 We were as we wanted to be, and the technologies of the Net worked hard to give us exactly what we wanted to see. No doubt the Internet had banished the censor (at least for relatively developed democracies: the authoritarians were quick to learn how to use the technology to better control their people).

The dark: Jonathan Zittrain, The Future of the Internet and How to Stop It (New Haven, CT: Yale University Press, 2008); Jaron Lanier, You Are Not a Gadget: A Manifesto (New York: Knopf, 2010). 45.The quote is from John Gilmore, one of the founders of EFF. See “John Gilmore,” Wikipedia, available at link #96. 46.Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin Press, 2011). 47.Zeynep Tufekci, Twitter and Tear Gas: The Power and Fragility of Networked Protest (New Haven, CT: Yale University Press, 2017), 31, 270. 48.David W. Moore, The Opinion Makers: An Insider Exposes the Truth Behind the Polls (Boston: Beacon Press, 2008), Kindle edition, loc. 292–93. 49.Anthony Downs, An Economic Theory of Democracy (New York: Harper & Row, 1957). 50.On the limits of even this rationality—and of democracy generally—see Achen and Bartels, Democracy for Realists, 30; Kirby Goidel, America’s Failing Experiment: How We the People Have Become the Problem (Lanham, MD: Rowman & Littlefield, 2014), 57.

Adams, 37 multimember districts, 152–153, 299n19 overturning vetoes, 33 pharmaceutical company contributions, 46 political character from districts, 28–29 Reform Caucus, 256–258 representative representatives, 151–155 representing the people, 3, 4 swing district numbers, 23 See also Congress Iceland deliberative polling, 177, 306n7 “ideal speech situation,” 173, 305n1 ignorance about politics about, 83–88, 100–109, 120 after quitting Facebook, [117n108], 292n108 conventional wisdom ignorance, 235 democracy weakened, 135–136, 174 monarchs veiled in privacy, 137 opinions constructed from knowledge, 128–133, 134–135, 228, 294n125, 295n127 technology manifesting, 137–138 voting ignorantly knowingly, 194–196 See also knowledge intellectual property, [111n92], 113, 290n92 Internet advertisers and information, 91–92, 98–99, 101, 105–107, 120–123 antitrust law, 112, 291n93 Brexit, 216–219, 313n52 cheap speech effects, [93n61], 286n61 data collection evolution, 114–117 data monetization, 109–113 exposure to different thinking, 181 fiduciary duty applied to, 212–219 as “Filter Bubble,” 82 as gambling addiction, 115–117 for information, 81–83, 85–86, 201–203 media educating public, 85–86, 101–104, 173, 192, 201–203, 204–212, 288nn71–72 Mongolian users, 175 news consumption in 2016 election, 98–103 polarization what we want, 121–123 shared understanding gone, 83–87, 98, 127, 131, 173, 204–212 See also data; Facebook Ireland deliberative polling, 177–178 IssueOne.org, 258 Jefferson, Thomas, 37–38 journalism incentives, 103–109, 123–125.

pages: 281 words: 71,242

World Without Mind: The Existential Threat of Big Tech
by Franklin Foer
Published 31 Aug 2017

Facebook puts it this way: “By enabling people from diverse backgrounds to easily connect and share their ideas, we can decrease world conflict in the short and long term.” But we know this is an illusion. Facebook leads us to a destination that is the precise opposite of its proclaimed ideal. It creates a condition that Eli Pariser has called the “Filter Bubble.” Facebook’s algorithms supply us with the material that we like to read and will feel moved to share. It’s not hard to see the intellectual and political perils of this impulse. The algorithms unwittingly supply readers with texts and videos that merely confirm deeply felt beliefs and biases; the algorithms suppress contrary opinions that might agitate a user.

S., 159, 175, 219 Engelbart, Doug, 20–21 engineers, 13, 16, 21–22, 45, 83 and editorial process, 106, 155 at Facebook, 59, 63, 73–75 at Google, 50, 52–53, 55, 124–25 mind-set of, 63, 77 profession of, 43–44, 61–63 write algorithms, 73–75 England, 43–44, 115, 164–66, 168, 213–14 Enlightenment, 161–62, 218, 230 entertainment industry, 86, 107–8, 158 environment, 4, 17, 200, 204, 223 Epstein, Jason, 173 Facebook, 14, 30, 93, 102, 205, 221 and advertising, 210–11 and algorithms, 5, 7, 63, 69, 72–75, 90, 177–78 avoids taxes, 197 and copyright, 166 dominance of, 4–5, 29, 78, 81–82, 192 experiments on users, 74–77 and false news, 91–92, 217 “Filter Bubble” of, 177–79 founding of, 59–61, 134 goals/projects of, 1–2, 5, 72–73, 89 and journalism, 131–32 managed top-down, 56–57 manipulation of, 126 and the media, 6–7, 74, 90, 92, 140, 150, 155 and politics, 123 regulation of, 194, 203–4 “trending on,” 140, 147 and writers, 171, 174 Facebook’s News Feed, 72–75, 160 Federal Communications Commission (FCC), 108, 125–26, 185, 201 Federal Trade Commission, 121, 198, 203 Fischer, Steven Roger, 226–27 Fitzgerald, F.

The Ethical Algorithm: The Science of Socially Aware Algorithm Design
by Michael Kearns and Aaron Roth
Published 3 Oct 2019

Platforms such as Facebook also apply the same powerful machine learning techniques to build individual profiles of user interests based on collective data, and use these models to choose what appears in their News Feed. The aforementioned narrowing that can occur in this process has oftentimes been referred to as an “echo chamber” (or “filter bubble”)—meaning that users are being shown articles and information that align with or echo their existing beliefs and prejudices. And what articles appear on a user’s News Feed can further reinforce the existing behavior that led the algorithm to select them in the first place, resulting in another feedback loop.

Strangelove (film), 100 drug use data, 51–52 Dwork, Cynthia, 26, 36 dynamic effects, 194 echo chamber equilibrium, 123–26 economics applied to dating apps, 94–96 economic scarcity and dating apps, 94 and scope of topics covered, 19–20 The Economist, 145–46 email scams, 137–41, 154 embarrassing polls, 40–45 emergent phenomena, 10 empirical machine learning, 76 employment decisions, 15 encryption, 31–34, 37 Equifax, 32 equilibrium states and dating apps, 95–96 echo chamber equilibrium, 123–25 and game theory, 97–101 and navigation problems, 103 and two-route navigation problems, 107 and user preferences, 97 error rates in algorithms and differential privacy, 42–43 error-minimization, 70, 75, 78–79 and facial recognition, 15–16 and fairness vs. accuracy, 78–84 and image recognition competition, 165–66 and scientific research, 136 ethical principles and accuracy vs. fairness balance, 82–84, 192–93 and adaptive data analysis, 159–60 and algorithmic morality, 176–78 and algorithms as regulatory measure, 16–17 and concerns about algorithm use, 3–4 current state of ethics research, 169–70 design of ethical algorithms, 189–90, 193–95 ethics boards, 179 infancy of ethical algorithm field, 21 and machine vs. human learning, 6–7 and scope of topics covered, 19–21 and threat of optimization gone awry, 179 and unique challenges of algorithms, 7 European Union, 15 existential threat of machine learning, 179–82, 189–90 exploitation, 71–72 exploration period, 70–72, 93 exponential intelligence explosion, 185–88 Facebook advertising, 14–15 and design of algorithms, 4–5 and differential privacy, 51–52 and echo chamber equilibrium, 124–25 and image recognition algorithms, 145–46 News Feed, 8, 19–20 profit motive, 191–92 and promotion of diversity, 125 facial recognition, 15–16 fairness accuracy/fairness trade-off, 63, 69, 74–84, 87, 192–93 and algorithmic morality, 175–76 and algorithms as regulatory measure, 16–18 and biases, 57–63 and concerns about algorithm use, 3 and current state of ethics research, 169–70 and data collection bias, 90–93 and dating apps, 96–97 definitions of, 69–72 design of ethical algorithms, 190 differing notions of, 84–86 and dynamic effects of algorithms, 193–94 “fairness gerrymandering,” 86–90, 134–35 and forbidden inputs, 66–69 and goals of ethics research, 171 and “merit,” 72–74 and scope of topics covered, 18–21 and statistical parity, 69–72 and supervised machine learning, 63–64 and theoretical computer science field, 13 and threat of optimization gone awry, 184–85 and vectors, 65–66 “fake news,” 124–25 false negatives, 73–74, 84–85 false positives, 84–85, 189–90, 193 false rejections, 73, 91, 171 family data, 54–56 “fast takeoff” scenario, 185–88 FATE—fairness, accuracy, transparency, and ethics, 16–17 Federal Bureau of Investigation (FBI), 49–50 feedback loops, 19–20, 92, 95–96, 184–85 Felten, Ed, 187–88 filter bubble, 124 financial status data, 65–66 Fitbits, 50–51 fitness tracking data, 50–51 food science, 143–45, 158–59 forbidden inputs, 66–69 foreign policy, 15 forensic evidence, 54–56 formalization of goals, 194. See also precise specification goal Gale, David, 129 Gale-Shapley algorithm, 129–30 games and game theory and algorithmic self-play, 131–35 and cooperative solutions, 113–15 and current state of ethics research, 169–70 and dating preferences, 94–97 and equilibrium states, 97–101 and “fairness gerrymandering,” 89 gamesmanship in college admissions, 129–30 and injecting diversity, 125–26 and matching markets, 126–30 and the “Maxwell solution,” 105–13 and navigation problems, 101–6 and news filtering, 124–26 and online shopping algorithms, 116–21, 123–24 and scientific research, 135–36 and scope of topics covered, 19–20 and threat of optimization gone awry, 180 variety of applications for, 115–16 “garden of the forking path,” 159–64, 166–68.

pages: 270 words: 71,659

The Right Side of History
by Ben Shapiro
Published 11 Feb 2019

Frank Newport, “In US, 87% Approve of Black-White Intermarriage, vs. 4% in 1958,” Gallup.com, July 25, 2013, http://news.gallup.com/poll/163697/approve-marriage-blacks-whites.aspx. 17. “Race Relations,” Gallup.com, http://news.gallup.com/poll/1687/race-relations.aspx. 18. Mostafa El-Bermawy, “Your Filter Bubble Is Destroying Democracy,” Wired.com, November 18, 2016, https://www.wired.com/2016/11/filter-bubble-destroying-democracy/. 19. Levi Boxell, Matthew Gentzkow, and Jesse M. Shapiro, “Is the Internet Causing Political Polarization? Evidence from Demographics,” Brown.edu, March 2017, https://www.brown.edu/Research/Shapiro/pdfs/age-polars.pdf. 20.

pages: 312 words: 93,504

Common Knowledge?: An Ethnography of Wikipedia
by Dariusz Jemielniak
Published 13 May 2014

Yet the thesis that the open-collaboration phenomenon leads univocally and definitely to liberating consumers from traditional neoliberal institutions and economics seems risky. Moreover, the theoretical democratization of knowledge production may be simply a reenactment of the established system (König, 2012), as discussed in Chapter 2. As Eli E. Pariser’s recent work convincingly shows, the free access to information may just as well be threatened by “filter bubbles” (2011) and corporate monopolization of knowledge, not only supporting the old establishment but also adding new layers to it. Wikipedia seems to be, willingly or not, in the middle of a major ideological clash: Today powerful and highly profitable corporations such as Microsoft and Google are battling for a greater presence and power on the internet.

Journal of Computer-Mediated Communication, 3(1). doi:10.1111/j.1083-6101.1997.tb00065.x Panciera, K., Halfaker, A., & Terveen, L. (2009). Wikipedians are born, not made: A study of power editors on Wikipedia. Paper presented at the GROUP ’09 Proceedings of the ACM 2009 International conference on Supporting Group Work, New York. Pariser, E. E. (2011). The filter bubble: What the Internet is hiding from you. New York: Penguin Press. Parvaz, D. (2011, January 15). Look it up: Wikipedia turns 10. AlJazeera. Retrieved from http://www.aljazeera.com/indepth/features/2011/01/201111571716655385.html Pegg, D., & Wright, D. (2011, December 8). Wikipedia founder attacks Bell Pottinger for “ethical blindness.”

See also Gdańsk/Danzig edit war; RfAs (requests for adminship); Wales, Jimmy “Jimbo” entry barriers for new users, 101 EQ (rules of etiquette), 18 Errant (user), 44–45 Essjay controversy: and apology to Wales, 111; final results of, 117; legalistic solution to, 123; and nature of Essjay’s wrongdoing, 113–114; and resignation from Wikipedia positions, 113; and use of false “authority,” 109–112, 114 ethical breaching experiments, 164–167 ethnographic project, this book as, 193–194 etiquette, 18, 94 exclusionism, 23 Facebook, 95–96, 172 faceless, Wikipedia as, 183 face-to-face contact, online communities with and without, 77 facilitator role in conflict resolution and consensus, 82–83 false personae, 111, 113–114, 117–118 Faraj, Samer, 59–60, 85 FDC (Funds Dissemination Committee), 131–132, 197 featured-article designation, 24 Fetchcomms (user), 46 filibustering, 20 Filipacchi, Amanda, 16 “filter bubbles,” 189 “filter then publish” or vice versa, 183 five pillars, 96, 98 flagged revisions, 136 F/LOSS (free/libre and open-sourcesoftware) movement, 2; and copyright laws, 22; as created by software professionals, 107; forking in, 145; gender gap in, 16; leadership philosophy of, 174–176; patterns of partipation within, 39; as supporting Internet freedom and opposing censorship, 141; voluntary work and immaterial labor in, 230n7 FORG (forgive and forget) norm, 18 forking, 126, 133, 144–148, 179 formalization of rules, 120–121, 124, 151, 174 formal mediation, 61 fossilized procedures, 121 Foucault, Michel, 54 founder role in organizational development, 154–155, 174, 178–179 “founder’s seat” on board of trustees, 129 founding principles, 162 FoxNews.com, 167, 171 2 8 6    I n d e x freedom of information debates, 151 French Wikipedia, 12, 15, 77, 146 fund-raising and distribution, 130, 131–133 Funds Dissemination Committee (FDC), 131–132, 197 Future Search, 64 GAME (gaming the system) rule, 20 Ganga/Ganges issue, 76 Gardner, Sue: on controversial-content filtering, 145–146; and Quaker consensus model, 62; on Wales and Wikiversity issue, 165–166; on WMF accountability, 132; as WMF executive director, 129, 154–155 gatekeeping, 14, 17 Gdansk (user), 68, 73 Gdańsk/Danzig edit war, 64; active parties in, 67–68; beginning of, 65–67; called lame, 71; community attempts to resolve, 68–70; escalation of, 67–73; inability of consensus to end, 59, 70–71; mediation request in, 71; and peace without consensus, 74–76, 78; and stalemate, 80; votes during, 73, 74–75 “geeks,” 188 Geertz, Clifford, 195, 196 gender: attempts to address gender gap, 229n8; of editors, 14–16, 191, 229n8, 231–232n12; effects of gender bias, 16, 77; Homopedia, 5; and self-disclosure as optional online, 25, 117, 199 German Wikipedia, 11, 12, 15, 146, 234n8 Germany.

pages: 510 words: 120,048

Who Owns the Future?
by Jaron Lanier
Published 6 May 2013

Bloggers will notice when a candidate is quoted out of context in a campaign commercial. Similarly, journalists will eventually notice when inflammatory anti-Islamic videos have been faked and dubbed. That is not an entirely dysfunctional means of making up for lost context, but it does mean that corrections and context are trapped within online “filter bubbles.” It is not a given that those who might be predisposed to believe in a deceptive mash-up’s point of view will be exposed to a factual correction about what was mashed. Of course, there’s no guarantee that a person who wants to believe in an idea would actually follow the link to see if a mash-up was deceptive, but at least the link would be right there in front of them.

• A book won’t necessarily be the same for each person who reads it or if the same person reads it twice. On the one hand this will mean better updates for some kinds of information and fewer encounters with typos, but on the other will deemphasize the rhythm and poetics of prose, minimize the stakes of declaring a manuscript complete, and expand the “filter bubble” effect. • The means to find reading material will be where business battles are fought. The fights often will not be pretty. The interface between readers and books will be contested and often corrupted by spam and deception. • Writing a book won’t mean as much. Some will think of this as a democratic, antielitist benefit, and others will think of it as a lowering of standards

W., 149 Bush, Vannevar, 221n business data, 112–13, 150, 189 business plans, 107–8, 117–20, 154, 169–74, 175, 236, 258, 301–2, 344–45 cached data mirrors, 223 Cage, John, 212 California, University of, at Berkeley, 104, 107–8, 111, 172 call centers, 177n Caltech, 94, 184 Cambridge, Mass., 157–58 cameras, 2, 10, 89, 265, 309–11, 319 capital flows, 37, 43–45, 47, 49, 201, 329, 355–56 capitalism, 11, 16–17, 20, 43–46, 47, 49, 66–67, 79, 208, 243, 246–48, 258, 260–63, 272, 273n, 277, 329 capital resources, 86 “captured” populations, 170–71 carbon credits, 87, 88, 298–99, 300, 301–3, 314 cartels, 158 Catholic Church, 190 cell phones, 34n, 39, 85, 87, 162, 172, 182n, 192, 229, 269n, 273, 314, 315, 331 Central Intelligence Agency (CIA), 199–200 chance, 23n change acceleration, 10, 21, 130–33, 136, 193–95, 217 chaos, 165–66, 273n, 331 cheating, 120, 335 Chicago, Ill., 47 China, 54, 70, 85, 87, 199, 200, 201, 208 Christianity, 190, 193–94, 293 Christian Science, 293 civility, 293–94 civilization, 123, 255, 300, 311, 336 civil liberties, 317–24 classified ads, 177n click-through counts, 183, 286, 347 clothing, 89, 260, 367 Cloud Atlas, 165 cloud processors and storage, 11–12, 19, 20, 42, 62, 88, 92, 100, 110, 113, 121, 124n, 144, 146n, 147, 149, 151, 153–54, 168, 203, 209, 245–46, 255, 258, 261–62, 274, 284, 292, 306, 311–13, 326, 338, 347–48, 350, 359 code, 112, 272 cognition, 111–12, 195–96, 260, 312–13, 314, 315, 328 Cold War, 189 Coleman, Ornette, 353 collectives, 358–60 collusion, 65–66, 72, 169–74, 255, 350–51 Columbia Records, 161n commercial rights, 317–24, 347 commissions, 184 communications industry, 258 communism, 136, 153, 291, 344 compensatory servers, 64 competition, 42, 60, 81, 143–44, 147, 153, 180, 181, 187–88, 246–48, 326 complexity, 53–54 Computer Lib/Dream Machines (Nelson), 229 computer programmers, 113–14, 123, 286n computers: artificiality of, 130, 134 calculations by, 146n, 147–48, 149, 151 cloud processors and, see cloud processors and storage development of, 53, 129–30 as machines, 22–25, 123, 129–30, 155, 158, 165–66, 178, 191, 193, 195, 248, 257–58, 261, 328 memory of, 146n networks of, see digital networks parallel, 147–48, 149, 151 personal (PCs), 158, 182n, 214, 223, 229 programming of, 113–14, 120, 123, 157, 180, 193, 248, 272, 286n, 342, 362–63 remote, 11–12 reversible, 143n security of, 175, 345–46 servers for, 12n, 15, 31, 53–57, 71–72, 95–96, 143–44, 171, 180, 183, 206, 245, 358 software for, 7, 9, 11, 14, 17, 68, 86, 99, 100–101, 128, 129, 147, 154, 155, 165, 172–73, 177–78, 182, 192, 234, 236, 241–42, 258, 262, 273–74, 283, 331, 347, 357 user interface for, 362–63, 364 computer science, 113–14, 120, 123, 157, 180, 193, 248, 272, 286n, 342, 362–63 conflicts of interest, 62n Confucianism, 214, 215–16 connectivity, 171–72, 184–85, 273, 296n, 309, 316, 331 consciousness, 195–96 conservatism, 148, 149–51, 153, 204, 208, 249, 251, 253, 256, 293 construction industry, 151 consultants, 69–72 consumer electronics, 85–86, 162 consumer-facing sites, 179–80, 182, 216 consumers, see economies, consumer “content farms,” 120 contracts, 79–82, 172, 182, 183–84, 246–48, 314, 347, 352–53 copyright, 44, 47, 49, 60, 61, 96, 183, 206, 207, 224–26, 239–40, 263–64 corporations, 265–67, 307, 314, 348–51 correlations, 75–76, 114–15, 192, 274–75 correlative algorithms, 75–76 corruption, 31, 48, 77, 235, 257, 341n cost comparisons, 64 cost-effectiveness, 136–37 cost externalization, 59n countercultures, 24 Craigslist, 177n credit, 52, 116, 177, 193, 287–90, 305, 320, 337–38 credit cards, 185, 186, 269n credit ratings, 52, 116, 177, 193, 320 creepiness, 305–24 crime, 48, 307, 311, 319–21 crowdsourcing, 21, 86, 119–20, 356 cryptography, 14 currencies, 286–87 customer service, 177 cyberactivism, 14, 199, 200–201, 210, 308–9, 335–36, 339 cyberattacks, 201 cybernetics, 230 cyberpunk literature, 309, 356n Daedalus, 22 data, 12, 20, 50–54, 55, 71–76, 92, 167n, 174, 176–77, 178, 196, 223, 234–35, 246–48, 256–58, 271–75, 286–87, 292, 300, 307, 316, 317–24, 347 see also big data databases, 20, 71–72, 75–76, 178, 192, 203 data copying, 50–51 data mining, 120 dating services, 108–9, 113, 167–68, 274–75, 286 Deadpool, 189 death denial, 193, 218, 253, 263–64, 325–31, 367 death tolls, 134 debt, 29, 30n, 54, 92, 95, 96 Debt: The First 5,000 Years (Graeber), 30n decision-making, 63–64, 184, 266, 269–75, 284n decision reduction, 266, 269–71, 284n deconstructionism, 131 democracy, 9, 32–33, 44, 90–92, 120, 200, 202–4, 207, 208, 209, 209, 210, 246–48, 277–78, 321, 324, 336, 342 Democratic Party, 202 demonetization, 172, 176n, 186, 207, 260–61 denial of service, 171–72, 312–13, 315 depopulation, 97, 133 depressions, economic, 69–70, 75, 135, 151–52, 288, 299 deprinters, 88 derivatives funds, 56, 60, 149, 153, 155, 301 determinism, 125, 143, 156, 166–68, 202, 328, 361 devaluation, 15–16, 19–21 developed world, 53–54 Diamond, Jared, 134 dice throws, 23n Dick, Philip K., 18, 137 differential pricing, 63–64 digital cameras, 2 digital networks, 2–3, 9, 11, 12, 14, 15, 16, 17, 19–21, 31, 35, 49, 50–51, 53, 54–55, 56, 57, 59, 60–61, 66–67, 69–71, 74, 75, 77–80, 92, 96, 99, 107–8, 118–19, 120, 122, 129–30, 133n, 136–37, 143–48, 192, 199, 209, 221–30, 234, 235, 245–51, 259, 277, 278, 286–87, 308–9, 316, 337, 345, 349, 350, 355, 366–67 design of, 40–45 educational, 92–97 effects, 99, 153, 169–74, 179, 181–82, 183, 186, 207, 305, 362–63, 366 elite, 15, 31, 54–55, 60, 122, 201 graph-shaped, 214, 242–43 medical, 98–99 nodes of, 156, 227, 230, 241–43, 350 power of, 147–49, 167 punishing vs. rewarding, 169–74, 182, 183 tree-shaped, 241–42, 243, 246 see also Internet digital rights movement, 225–26 digital technology, 2–3, 7–8, 15–16, 18, 31, 40, 43, 50–51, 132, 208 dignity, 51–52, 73–74, 92, 209, 239, 253–64, 280, 319, 365–66 direct current (DC), 327 disease, 110 disenfranchisement, 15–16 dossiers, personal, 109, 318 dot-com bubble, 186, 301 double-blind tests, 112 Drexler, Eric, 162 DSM, 124n dualism, 194–95 Duncan, Isadora, 214 Dyson, George, 192 dystopias, 130, 137–38 earthquakes, 266 Eastern Religion, 211–17 eBay, 173, 176, 177n, 180, 241, 343 eBooks, 113, 246–47, 352–60 eBureau, 109 economic avatars, 283–85, 302, 337–38 economics, 1–3, 15, 22, 37, 38, 40–41, 42, 67, 122, 143, 148–52, 153, 155–56, 204, 208, 209, 236, 259, 274, 288, 298–99, 311, 362n, 363 economies: austerity in, 96, 115, 125, 151, 152, 204, 208 barter system for, 20, 57 collusion in, 65–66, 72, 169–74, 255, 350–51 competition in, 42, 60, 81, 143–44, 147, 153, 180, 181, 187–88, 246–48, 326 consumer, 16–17, 43, 54, 56n, 62, 63–65, 72–74, 85–86, 98, 114, 117, 154, 162, 173–74, 177, 179–80, 182, 192, 193, 215, 216, 223, 227, 241, 246, 247, 248–64, 271–72, 273, 286–88, 293, 323, 347–48, 349, 355–56, 357, 358–60 depressions in, 69–70, 75, 135, 151–52, 288, 299 dignity in, 51–52, 73–74, 92, 209, 239, 253–64, 280, 319, 365–66 distributions in, 37–45 of education, 92–97 efficiency in, 39, 42–43, 53, 61, 66–67, 71–74, 88, 90, 97, 118, 123, 155, 176n, 187–88, 191, 236, 246, 310, 349 entrepreneurial, 14, 57, 79, 82, 100–106, 116, 117–20, 122, 128, 148–49, 166, 167, 183, 200, 234, 241–43, 248, 274, 326, 359 equilibrium in, 148–51 financial sector in, 7n, 29–31, 35, 38, 45, 49, 50, 52, 54, 56–67, 69–70, 74–80, 82, 115, 116–20, 148n, 153–54, 155, 179–85, 200, 208, 218, 254, 257, 258, 277–78, 298, 299–300, 301, 336–37, 344–45, 348, 350 freedom and, 32–33, 90–92, 277–78, 336 global, 33n, 153–56, 173, 201, 214–15, 280 government oversight of, 44, 45–46, 49, 79–80, 96, 151–52, 158, 199, 205–6, 234–35, 240, 246, 248–51, 299–300, 307, 317, 341, 345–46, 350–51 growth in, 32, 43–45, 53–54, 119, 149–51, 236, 256–57, 270–71, 274–75, 291–94, 350 of health care, 98–99, 100, 153–54 historical analysis of, 29–31, 37–38, 69–70 humanistic, 194, 209, 233–351 361–367 of human labor, 85, 86, 87, 88, 99–100, 257–58, 292 identity in, 82, 283–90, 305, 306, 307, 315–16 inclusiveness of, 291–94 information, 1–3, 8–9, 15–17, 18, 19–20, 21, 35, 60–61, 92–97, 118, 185, 188, 201, 207, 209, 241–43, 245–46, 246–48, 256–58, 263, 283–87, 291–303, 331, 361–67 leadership in, 341–51 legal issues for, 49, 74–78 levees in, 43–45, 46, 47, 48, 49–50, 52, 92, 94, 96, 98, 108, 171, 176n, 224–25, 239–43, 253–54, 263, 345 local advantages in, 64, 94–95, 143–44, 153–56, 173, 203, 280 market, 16–17, 20, 23–24, 33–34, 38, 39, 43–46, 47, 50–52, 66–67, 75, 108, 118–19, 126, 136, 143, 144–48, 151–52, 155, 156, 167, 202, 207, 221–22, 240, 246–48, 254–57, 261, 262–63, 266, 277–78, 288, 292–93, 297–300, 318, 324, 326, 329, 344, 354, 355–56; see also capitalism mathematical analysis of, 40–41 models of, 40–41, 148–52, 153, 155–56 monopolies in, 60, 65–66, 169–74, 181–82, 187–88, 190, 202, 326, 350 morality and, 29–34, 35, 42, 50–52, 54, 71–74, 252–64 Nelsonian, 335, 349–50 neutrality in, 286–87 optimization of, 144–47, 148, 153, 154–55, 167, 202, 203 outcomes in, 40–41, 144–45 political impact of, 21, 47–48, 96, 149–51, 155, 167, 295–96 pricing strategies in, 1–2, 43, 60–66, 72–74, 145, 147–48, 158, 169–74, 226, 261, 272–75, 289, 317–24, 331, 337–38 productivity of, 7, 56–57, 134–35 profit margins in, 59n, 71–72, 76–78, 94–95, 116, 177n, 178, 179, 207, 258, 274–75, 321–22 public perception of, 66n, 79–80, 149–50 recessions in, 31, 54, 60, 76–77, 79, 151–52, 167, 204, 311, 336–37 regulation of, 37–38, 44, 45–46, 49–50, 54, 56, 69–70, 77–78, 266n, 274, 299–300, 311, 321–22, 350–51 risk in, 54, 55, 57, 59–63, 71–72, 85, 117, 118–19, 120, 156, 170–71, 179, 183–84, 188, 242, 277–81, 284, 337, 350 scams in, 119–21, 186, 275n, 287–88, 299–300 self-destructive, 60–61 social aspect of, 37–38, 40, 148–52, 153, 154–56 stimulus methods for, 151–52 sustainable, 235–37, 285–87 transformation of, 280–94, 341–51 trust as factor in, 32–34, 35, 42, 51–52 value in, 21, 33–35, 52, 61, 64–67, 73n, 108, 283–90, 299–300, 321–22, 364 variables in, 149–50 vendors in, 71–74 Edison, Thomas, 263, 327 editors, 92 education, 92–97, 98, 120, 150, 201 efficiency, 39, 42–43, 53, 61, 66–67, 71–74, 88, 90, 97, 118, 123, 155, 176n, 187–88, 191, 236, 246, 310, 349 Egypt, 95 eHarmony, 167–68 Einstein, Albert, 208n, 364 elderly, 97–100, 133, 269, 296n, 346 elections, 202–4, 249, 251 electricity, 131, 327 Electronic Frontier Foundation, 184 “elevator pitch,” 233, 342, 361 Eloi, 137 employment, 2, 7–8, 11, 22, 56–57, 60, 71–74, 79, 85–106, 117, 123, 135, 149, 151–52, 178, 201, 234, 257–58, 321–22, 331, 343 encryption, 14–15, 175, 239–40, 305–8, 345 Encyclopaedia Britannica, 338 End of History, The (Fukuyama), 165 endoscopes, 11 end-use license agreements (EULAs), 79–82, 314 energy landscapes, 145–48, 152, 209, 336, 350 energy sector, 43, 55–56, 90, 144, 258, 301–3 Engelbart, Doug, 215 engineering, 113–14, 120, 123–24, 157, 180, 192, 193, 194, 217, 218, 248, 272, 286n, 326, 342, 362–63 Enlightenment, 35, 255 enneagrams, 124n, 215 Enron Corp., 49, 74–75 entertainment industry, 7, 66, 109, 120, 135, 136, 185–86, 258, 260 see also mass media entrepreneurship, 14, 57, 79, 82, 100–106, 116, 117–20, 122, 128, 148–49, 166, 167, 183, 200, 234, 241–43, 248, 274, 326, 359 entropy, 55–56, 143, 183–84 environmental issues, 32 equilibrium, 148–51 Erlich, Paul, 132 est, 214 Ethernet, 229 Etsy, 343 Europe, 45, 54, 77, 199 evolution, 131, 137–38, 144, 146–47 exclusion principle, 181, 202 Expedia, 65 experiments, scientific, 112 experts, 88, 94–95, 124, 133–34, 178, 325–31, 341, 342 externalization, 59n Facebook, 2, 8, 14, 20, 56–57, 93, 109, 154, 169, 171, 174, 180, 181, 188, 190–91, 200n, 204, 206, 207, 209, 210, 214, 215, 217, 227, 242–43, 246, 248, 249, 251, 270, 280, 286, 306, 309, 310, 313, 314, 317, 318, 322, 326, 329, 341, 343, 344, 346, 347–48, 366 facial recognition, 305n, 309–10 factories, 43, 85–86, 88, 135 famine, 17, 132 Fannie Mae, 69 fascism, 159–60 fashion, 89, 260 feedback, 112, 162, 169, 203, 298, 301–3, 363–64, 365 fees, service, 81, 82 feudalism, 79 Feynman, Richard, 94 file sharing, 50–52, 61, 74, 78, 88, 100, 223–30, 239–40, 253–64, 277, 317–24, 335, 349 “filter bubbles,” 225, 357 filters, 119–20, 200, 225, 356–57 financial crisis (2008), 76–77, 115, 148n financial services, 7n, 29–31, 35, 38, 45, 49, 50, 52, 54, 56–67, 69–70, 74–80, 82, 115, 116–20, 148n, 153–54, 155, 179–85, 200, 208, 218, 254, 257, 258, 277–78, 298, 299–300, 301, 336–37, 344–45, 348, 350 firewalls, 305 first-class economic citizens, 246, 247, 248–51, 273, 286–87, 323, 349, 355–56 Flightfox, 64 fluctuations, 76–78 flu outbreaks, 110, 120 fMRI, 111–12 food supplies, 17, 123, 131 “Fool on the Hill, The,” 213 Ford, Henry, 43 Ford, Martin, 56n Forster, E.

pages: 320 words: 87,853

The Black Box Society: The Secret Algorithms That Control Money and Information
by Frank Pasquale
Published 17 Nov 2014

Google results have become so very particular that it is increasingly difficult to assess how much of any given subject or controversy any of us actually sees. We see what we have trained Google to show us and what Google gradually conditions us to expect. Entrepreneur Eli Pariser calls this phenomenon “the filter bubble” and worries that all this personalization has serious side effects, namely increased insularity and reinforced prejudice.121 So intense is the personalization of search results, for instance, that when British Petroleum’s (BP) massive oil spill was dominating cable news in the summer of 2010, searches for “BP” on Google led some users to fierce denunciations of the company’s environmental track record, and others to investment opportunities in the company.122 Only the search engineers at the Googleplex can reliably track who’s seeing what and why.

Pasquale and Tara Adams Ragone, “The Future of HIPAA in the Cloud,” Stanford Technology Law Review (forthcoming 2014). Available at http://papers.ssrn.com /sol3/papers.cfm?abstract _id=2298158. 48. A company called Acxiom has 1,600 pieces of information about 98 percent of U.S. adults, gathered from thousands of sources. Eli Pariser, The Filter Bubble (New York: Penguin, 2011), 3. At least some of them are healthindicative or health-predictive. Daniel J. Solove, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet (New Haven, CT: Yale University Press, 2008); Natasha Singer, “You for Sale: Mapping the Consumer Genome,” New York Times, June 16, 2012; Nicolas P.

Kevin Kelly, What Technology Wants (New York: Viking, 2010), 331 (“each time we click a link we strengthen a node somewhere in the supercomputer’s mind, thereby programming . . . it”); Trebor Scholz, ed., Digital Labor: The Internet as Playground and Factory (New York: Routledge, 2013); Jaron Lanier, Who Owns the Future? (New York: Simon & Schuster, 2013); Jessica Weisberg, “Should Facebook Pay Its Users?,” The Nation, January 14, 2014 NOTES TO PAGES 79–81 257 (quoting manifesto “WE WANT TO CALL WORK WHAT IS WORK SO THAT EVENTUALLY WE MIGHT REDISCOVER WHAT FRIENDSHIP IS”). 121. Eli Pariser, The Filter Bubble (New York: Penguin, 2011). 122. Ibid., 6–7. 123. Fortunately, one has written a work of fiction to suggest what could go wrong. Shumeet Baluja, The Silicon Jungle: A Novel of Deception, Power, and Internet Intrigue (Princeton, NJ: Princeton University Press, 2011). 124. Cathy O’Neil, “When Accurate Modeling Is Not Good,” Mathbabe (blog), December 12, 2012, http://mathbabe.org/2012/12/12/when-accurate -modeling-is-not-good/ (analyzing the work of a casino CEO concerned with predictive analytics). 125.

pages: 280 words: 76,638

Rebel Ideas: The Power of Diverse Thinking
by Matthew Syed
Published 9 Sep 2019

By getting their news from Facebook, and other platforms, where friends share cultural and political leanings, people are more exposed to people who agree with them, and evidence that supports their views. They are less exposed to opposing perspectives. The dynamics of fine-sorting can be magnified by a subtler phenomenon: the so-called filter bubble. This is where various algorithms, such as those inside Google, invisibly personalise our searches, giving us more of what we already believe, and further limiting our access to diverse viewpoints.10 This is the digital equivalent of the Bahns experiment, but at a higher level of gearing. The sheer interconnectivity of the Internet has facilitated enhanced political fine-tuning.

Also see: https://iop.harvard.edu/forum/im-not-racist-examining-white-nationalist-efforts-normalize-hate https://www.youtube.com/watch?v=LMEG9jgNj5M 5 Data provided by the academic Angela Bahns, personal correspondence. 6 https://www.ncbi.nlm.nih.gov/pubmed/26828831 7 Data provided by Bahns, measured in 2009. 8 Conversation with the author. 9 http://www.columbia.edu/~pi17/mixer.pdf 10 Eli Pariser, The Filter Bubble: What the Internet is Hiding from You (Viking, 2011). 11 https://qz.com/302616/see-how-red-tweeters-and-blue-tweeters-ignore-each-other-on-ferguson/ 12 https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6140520/ 13 https://www.tandfonline.com/doi/pdf/10.1080/1369118X.2018.1428656 14 Kathleen Hall Jamieson and Joseph N.

pages: 573 words: 142,376

Whole Earth: The Many Lives of Stewart Brand
by John Markoff
Published 22 Mar 2022

One key idea that was deeply embedded in the Media Lab was the notion of personalized media. What Negroponte called the Daily Me, the concept that each person’s newsfeed would be personalized, was seen as an inevitable future with little understanding of the darker reality that would become known as filter bubbles several decades later. One of the people Brand met during his three-month stay was Danny Hillis, a cerebral young supercomputer designer who was a protégé of Minsky’s. Several years earlier Hillis had founded Thinking Machines, a radical supercomputer company based on the idea of linking tens of thousands of small microprocessor chips to work on problems simultaneously.

Brand had an early inkling of some of the forces that would radically reshape the world three decades later when the cyberworld the WELL presaged became the world’s predominant communications channel. What would drive Brand away from the WELL foreshadowed on a microscale the online culture of trolls, filter bubbles, disinformation, surveillance, and censorship that has come to deeply trouble the entire world in the past decade. Brand’s disaffection was complicated by the fact that the WELL’s growth had been strong up until 1991 and then slowed visibly, and it was not clear whether the slowdown was due to the national recession or some other reason.

Spencer, 216 Brown, Jerry, 98, 225–27, 230, 231, 348 Brussell, Mae, 215 Burning Man festival, 109, 328 Burrows, George Lord (great-grandfather), 8–9 Burrows, Lorenzo (great-great-grandfather), 7 Butler, Katy, 247–48 butterfly effect, 361 C Caffe Trieste, 48, 74 California, University of, at Berkeley, 25, 135, 302 California Museum of Science and Industry, 91 California Water Atlas, 227 Callahan, Michael, 94, 106–7, 137 Calthorpe, Peter, 129, 246, 256, 302, 305, 306, 307, 318, 341 Cannery Row (Steinbeck), 18, 46, 243 Cape Breton Island, Canada, Jennings and SB’s home on, 199–200, 207–9, 218, 234–35 Capra, Frank, 28 Capra, Fritjof, 295, 297 Carlston, Doug, 271, 325, 328–29 Carroll, Jon, 357 Cassady, Neal, 69, 126, 131–32, 143 Center for Advanced Study in the Behavioral Sciences, 45–46, 208 Chappell, Walter, 116, 119 Chernobyl nuclear disaster, 282 Chicago 8, trial of, 177, 188 Chippewas, 7 choice, freedom of, 42–43 Church, George, 360 Churchill, Winston, 291 CIA, 202, 298 City Lights Books, 37, 50, 74 Clear Creek, 205 climate change, 338–39 SB and, 4, 338, 339, 342, 347, 348–50, 357, 361 Schwartz and, 338–40, 342 Clock Library, 314, 325, 326, 327–28, 329, 362 Coate, John, 309 coevolution, 46, 217, 219, 222, 223, 232 CoEvolution Quarterly, 6, 85, 175, 208, 221–22, 227, 229, 230, 236, 240–41, 251, 254, 255–56, 257, 289 Black Panther–edited issues of, 229 Butler’s article on Baker scandal in, 247–48 demise of, 261 Gaia hypothesis story in, 230, 349 Kelly’s cover article in, 254 Kleiner’s cyberspace article in, 240 as money-losing venture, 233, 240 O’Neill’s space colonies story in, 231–32 provocative viewpoint of, 228–29 SB’s idiosyncratic editorial style at, 229, 249 SB’s separation from, 271 Schweickart’s “No Frames, No Boundaries” reprinted in, 225 Co-Existence Bagel Shop, 37, 48, 74 Collyns, Napier, 277, 295 Commoner, Barry, 206 commune movement, 139–40, 154, 177 communications technology, 133, 241, 282, 290, 298 Community Memory, 265 complex systems, learning by, 274, 277, 279–80, 284, 289 computer conferencing, 151, 240, 251–52, 263, 264, 265, 266 computer networks, see cyberspace; internet computers, computing: counterculture and, 185–86 Engelbart’s “bootstrapping” vision of, 151, 153 hobbyists in, 147, 158, 196, 198, 213, 230, 266–67 personal, see personal computers predicted exponential increase in powers of, 152–53 SB’s growing interest in, 145–46, 168, 266, 280 conservationism, of SB, 4, 340 Contact Is the Only Love (Stern), 93, 136 counterculture, 2, 10, 71, 75, 143, 146, 176, 177–78, 202, 216, 228 computing community and, 185–86 demise of, 181, 241 political-psychedelic divide in, 145, 152 SB’s negative reassessment of, 297 Trips Festival as catalyst for, 127–28, 130 Whole Earth Catalog in, 173–74 Counterculture Green (Kirk), 135 Coyote, Peter, 127, 225, 226, 287–88, 297, 358, 361 Creative Initiative Foundation, 41 Creative Philanthropy seminar, SB’s organizing of, 250 creativity, LSD and, 72, 76–77 Crooks family, 65–66 Crosby, David, 189–90 Crumb, R., 215, 228 Curwen, Darcy, 22, 23 cybernetics, 2, 4, 169, 208, 213, 216–17, 226, 273 Cybernetics (Wiener), 169, 226 cyberspace, 54, 84, 212, 240, 254–55, 258, 261, 279, 298–99 anonymity and pseudonymity in, 266 dangers of, 293, 315 dystopian aspects of, 308, 310–11 gold rush mentality and, 293, 323 impact of, 295–96 SB and, 4, 251–52, 282, 293 see also internet D Dalton, Richard, 261 Daumal, René, 186 Deadheads, 265–66 Defense Department, US, 315, 338 de Geus, Arie, 274, 285, 289 Delattre, Pierre, 47, 74 deserts, SB’s attraction to, 108–10, 114 Desert Solitaire (Abbey), 181 desktop publishing, 164–65 Detroit Free Press, 10, 172–73 Dick Cavett Show, SB’s appearance on, 192 Diehl, Digby, 200 Diggers, 206 Direct Medical Knowledge, 326, 333, 342 DiRuscio, Jim, 324–25 Divine Right’s Trip (Norman), 193, 223 DNA Direct, 343 Dome Cookbook (Baer), 162, 180 Doors of Perception, The (Huxley), 28 dot-com bubble, 296, 326, 333–34, 348 Doubleday, 253, 256, 257 Drop City (commune), 154, 162 “Drugs and the Arts” panel (SUNY Buffalo), 177 Duffy, Frank, 319 Durkee, Aurora, 180 Durkee, Barbara and Stephen, 61, 105, 119–20, 137, 139, 162, 177, 179, 180, 186, 229 Garnerville studio of, 60, 66, 69, 105, 106–7 SB’s friendship with, 51–52, 59–60, 66, 67, 133 in USCO, 106–7 Dvorak, John, 259, 260 Dymax, 147–48 Dynabook, 212 Dyson, Esther, 315, 325 E Eames, Charles, 44–45, 96, 113 Earth, viewed from space: SB’s campaign for photograph of, 134–35, 164 SB’s revelatory vision of, 1, 6, 362 Earth Day (1970), 182, 190, 364 Edson, Joanna, 75–76 Edson, John, 14, 18–20, 38–39, 75 education: intersection of technology and, 144, 145 see also alternative education movement; learning, act of Education Automation (Fuller), 169 Education Innovations Faire, 149 Ehrlich, Paul, 46, 177, 341, 360–61, 364 SB as influenced by, 28, 45, 47, 187, 188, 206, 222–23 EIES (Electronic Information Exchange System), 240, 251–52, 264, 266 Electric Kool-Aid Acid Test, The (Wolfe), 5, 88, 111, 121, 125, 170, 181 “Electric Kool-Aid Management Consultant, The” (Fortune profile of SB), 297 Electronic Frontier Foundation, 325 endangered species, 2, 360–61 Engelbart, Douglas, 83, 138, 146, 151–53, 158, 186–87, 230, 292, 361 “Mother of All Demos” by, 171–72 oNLine System of, 151, 156, 197, 212 SB influenced by, 150, 153, 185, 364 English, Bill, 160, 171, 185, 203, 211 English, Roberta, 160 Eno, Brian, 305, 306, 314, 319, 320, 325, 327, 336, 342, 353, 354 “Environmental Heresies” (Brand), 341–42 environmental movement, 2, 71, 159, 204 activist approach to, 181–82, 187, 188, 201–2, 297, 347 conservation vs. preservation in, 340 SB’s break with, 246, 336, 341, 347 SB’s role in, 4, 9–10, 180, 181–82, 201, 202, 204–7, 284, 347 Esalen Institute, 71–72, 84, 138, 176, 185 Esquire, 88, 146, 183, 247, 250 Essential Whole Earth Catalog, 286 Evans, Dave, 146, 156, 180, 185–87, 212 Exploratorium, 194–98 extinct species, revival of, 359–60 F Fadiman, Jeff, 38, 44, 59, 62–63, 64 Fadiman, Jim, 72–73, 77–78, 80, 84, 89, 97, 98, 101 Fall Joint Computer Conference (San Francisco; 1968), 171–72 Fano, Robert, 46, 273 Fariña, Mimi, 141, 237 Farm (commune), 257 Ferlinghetti, Lawrence, 50, 71 Field, Eric, 44, 53 Fillmore Auditorium (San Francisco), 125–26, 128, 130 filter bubbles, 279, 308 Fluegelman, Andrew, 220, 221–22, 269 Foer, Franklin, 5 Foreign Policy, 356 Fort Benning, SB at, 53–58 Fort Dix, SB at, 58–63, 64, 65–68 Fortune, 297, 339 “Four Changes” (Snyder), 187 Francis, Sharon, 105, 112 Frank, Delbert, 86–87 Frank, Robert, 179, 188, 199–200, 218 Fraunhofer, Joseph Ritter von, 108 Free Speech Movement, 135, 175 From Bauhaus to Our House (Wolfe), 304–5 “Fruits of a Scholar’s Paradise” (Brand; unpublished), 45–46, 208 Fukushima nuclear disaster, 355–58 Fuller, Buckminster, 134, 147, 162, 169, 175, 176, 217 SB influenced by, 132, 138, 146, 150, 168–69, 222, 243–44, 363–64 Fulton, Katherine, 318 futurists, 262, 273 SB as, 258–59, 280, 323 G Gaia hypothesis, 230, 349 games, SB’s interest in, 84, 120, 129–30, 149, 210, 211, 217, 219–21, 236–37 Gandhi, Mohandas K., 53 Garcia, Jerry, 128, 158 Garnerville, N.Y., Durkee/USCO studio at, 60, 66, 69, 105, 106–7, 136, 154 Gaskin, Stephen, 257 Gause, Gregory, 46 GBN, see Global Business Network genetic engineering, 341, 344, 360–61 geodesic domes, 176, 217 Georgia-Pacific, 9, 29 Gerbode Valley, Calif., 219–21, 236–37 Getty Museum, 329 Gibbons, Euell, 138 Gibson, William, 262, 294, 315 Gilman, Nils, 297–98 Gilmore, John, 325, 352 Ginsberg, Allen, 33–34, 50, 69, 77, 94, 177, 237 Global Business Network (GBN), 291, 295–300, 311, 313, 335, 340 Brand and Schwartz as co-founders of, 291–92 climate change scenario of, 338–39 SB consulting position at, 296, 298, 305, 314, 315–16, 323–24, 343, 354 globalization, 295–96, 346 global warming, see climate change GMO foods, 2, 344, 347, 357 Godwin, Mike, 308 Golden Gate National Recreation Area, 237, 360 Gone (Kirkland), 359 Gottlieb, Lou, 140 government, SB’s evolving view of, 166, 227, 348 Graham, Bill, 124, 125, 128, 130–31, 143 Grand Canyon, SB’s visit to, 19–20 Grateful Dead, 24, 123, 125, 126, 130, 131, 141, 158, 160, 189, 265–66 Great Basin National Park, 329–30 “Great Bus Race, The,” SB at, 181 Gregorian, Vartan, 27 Griffin, Susan, 295, 297 Griffith, Saul, 349–50 Gross, Cathleen, 286, 289 Grossman, Henry, 63 H hackers, hacker culture, 25, 84, 147–48, 150, 261, 267, 268–69, 273, 293, 294 SB’s Rolling Stone article on, 46, 211–13, 217, 250 Hackers: Heroes of the Computer Revolution (Levy), 266–67, 268, 270 Hackers Conference, 266–70, 326 Haight-Ashbury (San Francisco neighborhood), 74, 75, 128 Halpern, Sue, 241–42 Harman, Willis, 41, 42, 72, 73, 77, 273 Harner, Michael, 101, 118, 129 Harper’s Magazine, 46, 213, 228 SB’s Bateson profile in, 216–17 Whole Earth Epilog proposal of, 218, 219, 222 Harris, David, 149, 162, 299 Harvey, Brian, 268–69 Hawken, Paul, 247–48, 250, 281, 286, 290, 299, 332, 333, 334 Hayden Planetarium, 91, 92, 105 Hayes, Denis, 351 Healy, Mary Jean, 205, 206, 207 Heard, Gerald, 41–42, 84 Hells Angels, 120 Herbert, Anne, 230–31, 241, 255 Hershey, Hal, 183 Hertsgaard, Mark, 357 Hertzfeld, Andy, 267 Hewlett, William, 156 Hewlett-Packard, 147 Hickel, Walter, 206 Higgins Lake, Mich., Brand family camp at, 7, 8, 9, 10–11, 21, 30, 209, 289–90, 326, 327 Hillis, Danny, 289, 301, 305, 315, 336 Long Now Clock and, 313–14, 316–17, 325–26, 327, 328, 329, 330, 333, 362, 363 Thinking Machines founded by, 279–80, 288 Hippies, Indians, and the Fight for Red Power (Smith), 118 Hoagland, Edward, 201 Hoffer, Eric, 32 Hoffman, Abbie, 177–78, 214, 299 Hog Farm commune, 159, 181, 186, 188, 202, 205, 206, 220 Homebrew Computer Club, 147, 158, 198, 230, 266–67 Homo Ludens (Huizinga), 220 Hopcroft, David, 275 Hopi Indians, 100, 139, 205 Horvitz, Robert, 6 House Committee on Education and Labor, SB at hearing of, 190–91 Household Earth, see Life Forum How Buildings Learn (SB’s UC Berkeley seminar), 302 How Buildings Learn (BBC documentary), 320 How Buildings Learn (Brand), 291, 300–301, 304–7, 310, 312, 317–19, 323, 324, 331 How to Be Rich Well (SB book proposal), 344–46 Hubbard, Al, 42, 77, 273 Huerfano Valley, Colo., 139–40 Huizinga, Johan, 220 human potential movement, 71, 73, 84 humans: freedom of choice of, 42–43 as morally responsible for care of natural world, 42, 347, 349, 360, 361 SB’s speculations about fate of, 38–39 Human Use of Human Beings, The (Wiener), 160 Hunger Show (Life-Raft Earth), 187–88, 189, 203, 263 Huxley, Aldous, 28, 33, 41, 72, 144, 226 hypertext, concept of, 172, 230, 292, 293 I IBM, 91, 92, 96, 108, 211 I Ching, 89–90, 117, 153, 197, 253 Idaho, University of, 21 identity, fake, cyberspace and, 266 II Cybernetic Frontiers (Brand), 46, 213, 217, 221 Iktomi (Ivan Drift), 96–97 Illich, Ivan, 196 Independent, 353 information, personalization of, 279 information sharing, 180 information technology, 299–300, 315 information theory, 273 “Information wants to be free,” 270, 299, 301 information warfare, 315 In Our Time (Hemingway), 11 Institute for International Relations (IIR), 27, 34, 35, 37 Institute for the Future, 315 intelligence augmentation (IA), 83, 185, 187 International Federation of Internal Freedom, 89 International Foundation for Advanced Study, LSD experiments at, 42, 72, 73, 76–82, 273 internet, 146, 151, 279, 293, 314, 316, 326 ARPANET as forerunner of, 212 impact of, 295–96, 323 libertarianism and, 5, 348 see also cyberspace Internet Archive, 330, 332 Internet of Things, 279 Interval Research Corporation, 321–23 “Is Environmentalism Dead?”

pages: 743 words: 201,651

Free Speech: Ten Principles for a Connected World
by Timothy Garton Ash
Published 23 May 2016

If the three philosophers did not already have their heads in different clouds, they soon would have.145 The situation is constantly evolving but, at this writing, Google will customise your search results on the basis of your location and—if you are logged in rather than actively opting to search anonymously—your personal search history, as well as information drawn from those of your email accounts and social networks to which it has access. The last is the ‘social’ component of customisation, including what your friends are interested in. If you and I search for exactly the same term, we will get different results. And if we are not careful, we will each hive off into our own individual ‘filter bubble’.146 More broadly, across the internet there is a risk of fragmentation into thousands of tiny ‘information cocoons’: echo chambers where the news and opinions we see are only those favoured by the like-minded and our only newspaper is the Daily Me. At the extreme, you have the Norwegian mass murderer Anders Behring Breivik, whose anti-Muslim fury was reinforced by constantly revisiting a handful of hysterical sites about the threats Islam and multiculturalism posed to Europe and by his own tiny crowd of online correspondents.

A 15-year-old British schoolgirl who flew to join the Islamic State terrorist organisation in Syria had, it turned out, been following 74 radical and fundamentalist Islamist Twitter accounts.16 Unlike in the physical world, the internet makes it easy for the conspiracy theorist to find the 957 other people across the planet who share his or her particular poisoned fantasy. The increasingly personalised nature of internet searches on Google and other search engines can exacerbate the problem, with everyone disappearing into his or her own ‘filter bubble’.17 This is a much broader problem, to which we will return, but it clearly affects the analysis of speech and violence. Thus, for example, the Norwegian mass murderer Anders Behring Breivik was reinforced in his paranoid views by obsessive reading of anti-Islamic and anti-multiculturalist websites such as Pamela Geller’s ‘Atlas Shrugs’ and Robert Spencer’s ‘Gates of Vienna’, from which he quoted in his online ‘crusader’ manifesto.18 Does that mean such sites should be blocked and their content censored?

The most alarming aspect of this is the threat to our privacy, which I discuss more in chapter 7, but it can also impair our pursuit of knowledge. If the effect of search personalisation is to give a higher ranking to sites we and our online contacts have previously viewed, then we are in danger of being hived off into ‘filter bubbles’ of the like-minded. Google will reply that it is just giving people what they want—a more personalised, customised service. But that is only half the story. The other half is that Google is giving advertisers what they want: the capacity to target individual consumers ever more precisely. If information is power, personalised information is also money.

pages: 361 words: 81,068

The Internet Is Not the Answer
by Andrew Keen
Published 5 Jan 2015

Just as Instagram enables us to take photos that are dishonest advertisements for ourselves, so search engines like Google provide us with links to sites tailored to confirm our own mostly ill-informed views about the world. Eli Pariser, the former president of MoveOn.org, describes the echo-chamber effect of personalized algorithms as “The Filter Bubble.”41 The Internet might be a village, Pariser says, but there’s nothing global about it. This is confirmed by a 2013 study by the Massachusetts Institute of Technology showing that the vast majority of Internet and cell phone communication takes place inside a hundred-mile radius of our homes and by a 2014 Pew Research and Rutgers University report revealing that social media actually stifles debate between people of differnet opinions.42 But the reality of the Web is probably even more selfie-centric than the MIT report suggests.

See, for example, his latest book: Future Perfect: The Case for Progress in a Networked Age (New York: Riverhead, 2012). 36 Tom Standage, Writing on the Wall: Social Media—The First 2,000 Years (New York: Bloomsbury, 2013). 37 Ibid., epilogue, pp. 240–51. 38 Williams, “The Agony of Instagram.” 39 Rhiannon Lucy Coslett and Holly Baxter, “Smug Shots and Selfies: The Rise of Internet Self-Obsession,” Guardian, December 6, 2013. 40 Nicholas Carr, “Is Google Making Us Stupid?,” Atlantic, July/August 2008. Also see Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (New York; Norton, 2011). 41 Eli Pariser, The Filter Bubble: What the Internet Is Hiding From You (Penguin, 2011). See also my June 2011 TechCrunchTV interview with Eli Pariser: Andrew Keen, “Keen On . . . Eli Pariser: Have Progressives Lost Faith in the Internet?,” TechCrunch, June 15, 2011, techcrunch.com/2011/06/15/keen-on-eli-pariser-have-progressives-lost-faith-in-the-internet-tctv. 42 Claire Carter, “Global Village of Technology a Myth as Study Shows Most Online Communication Limited to 100-Mile Radius,” BBC, December 18, 2013; Claire Cain Miller, “How Social Media Silences Debate,” New York Times, August 26, 2014. 43 Josh Constine, “The Data Factory—How Your Free Labor Lets Tech Giants Grow the Wealth Gap.” 44 Derek Thompson, “Google’s CEO: ‘The Laws Are Written by Lobbyists,’” Atlantic, October 1, 2010. 45 James Surowiecki, “Gross Domestic Freebie,” New Yorker, November 25, 2013. 46 Monica Anderson, “At Newspapers, Photographers Feel the Brunt of Job Cuts,” Pew Research Center, November 11, 2013. 47 Robert Reich, “Robert Reich: WhatsApp Is Everything Wrong with the U.S.

pages: 308 words: 85,880

How to Fix the Future: Staying Human in the Digital Age
by Andrew Keen
Published 1 Mar 2018

With Facebook as our new front page on the world, we are simply being refed our own biases by networked software owned by a $350 billion data company that resolutely refuses to acknowledge itself as a media company because that would require it to employ armies of real people as curators. It would also make Facebook legally liable for the advertising that appears on its network. So what we see and read on social media, therefore, is what we want to see and read. No wonder everything now seems so inevitable to so many people. This echo chamber effect, the so-called filter bubble,27 has created a hall of mirrors, a “post-truth” media landscape dominated by fake news and other forms of online propaganda. Thus the disturbing success of Trump, Brexit, and the alt-right movement; thus the virulence of Putin’s troll factories, networked ISIS recruiters, and the other mostly anonymous racists, misogynists, and bullies sowing digital hatred and violence.

Emily Bell, “Facebook Is Eating the World,” Columbia Journalism Review, March 7, 2016. 24. Ibid. 25. Margot E. Kaminski and Kate Klonick, “Facebook, Free Expression and the Power of a Leak,” New York Times, June 27, 2017. 26. John Herrman, “Media Websites Battle Faltering Ad Revenue and Traffic,” New York Times, April 17, 2016. 27. Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (Penguin, 2011). 28. “2017 Edelman TRUST BAROMETER,” Edelman.com, January 15, 3017. 29. Allister Heath, “Fake News Is Killing People’s Minds, Says Apple Boss Tim Cook,” Telegraph, February 10, 2017. 30. Nir Eyal, Hooked: How to Build Habit-Forming Products (Portfolio, 2014). 31.

pages: 295 words: 87,204

The Capitalist Manifesto
by Johan Norberg
Published 14 Jun 2023

The Left thinks these are platforms for right-wing lunatics and disinformation, the Right thinks they are politically correct leftists who invented cancel culture. Many are angry at Apple for charging too much or at Facebook and Google for not charging (‘then you are the product’). At best, social media is just a stupid waste of time. At worst it is a machine that creates polarization, filter bubbles, loneliness and social pressure, and exists only to glue our eyeballs to ads. Whatever you think about these companies, when you look at how many of us flock to their platforms and what we state that we like about them, there is no doubt that they have created enormous value. An interview-based survey showed that if some of the most common services disappeared, people would be willing to pay imaginary sums for them.

Or to quickly learn how to make a red-wine sauce, fix a zip, read an annual report, get rid of a stain or dress up for a funeral. In addition, it has exposed us to a whole universe of ideas, cultures, research, stories and music. It might be worth some craziness. There is a lot of talk about us getting caught up in filter bubbles online, but we are actually much more exposed to opposing views than before because those who share our interests in a certain dimension do not necessarily do so in other areas. An index of how isolated we are from opposing views shows that people who read morning newspapers are somewhat more isolated than those who read news online, while workplaces, neighbourhoods and families are even more homogenized opinion bubbles.

pages: 88 words: 22,980

One Way Forward: The Outsider's Guide to Fixing the Republic
by Lawrence Lessig
Published 12 Feb 2012

We all have joined our i-enabled organization of choice: the Tea Party or MoveOn, Drudge or Huffington Post. We all get our daily fix of fury, from e-mail lists or podcasts, from news sites or blogs. We tune in to the message we want. We tune out the message we can’t stomach. Indeed, as Eli Pariser so powerfully demonstrates in his 2011 book The Filter Bubble, the machines themselves help us tune out. There’s no such thing as “a Google search”; there’s only “my searches on Google.” Google remembers the sort of stuff I’m interested in. Those interests help determine the search results that Google gives me. And thus are my search results different from yours: once again, the business model of polarization, made perfect by the amazing Google.

The Myth of Artificial Intelligence: Why Computers Can't Think the Way We Do
by Erik J. Larson
Published 5 Apr 2021

If thousands or millions of examples of proper logins by employees can be grouped or clustered, the odd ones sitting outside the cluster attract attention. They might be illegal or improper attempts, then. Again, machine learning figures out what’s normal—and thus what is abnormal—by analyzing frequencies. The frequency assumption explains “filter bubbles” in personalized content online, as well. Someone who despises right-leaning politics eventually receives only left-leaning opinions and other news content. The deep learning–based system controlling this outcome is actually just training a model that, over time, recognizes the patterns of the news you like.

computer), 222–224 deep reinforcement learning, 125, 127 Dostoevsky, Fyodor, 64 Dreyfus, Hubert, 48, 74 earthquake prediction, 260–261 Eco, Umberto, 186 Edison, Thomas, 45 Einstein, Albert, 239, 276 ELIZA (computer program), 58–59, 192–193, 229 email, filtering spam in, 134–135 empirical constraint, 146–149, 173 Enigma (code making machine), 21, 23–24 entity recognition, 137 Etzioni, Oren, 129, 143–144 Eugene Goostman (computer program), 191–195, 214–216 evolutionary technology, 41–42 Ex Machina (film, Garland), 61, 78–80, 82, 84, 277 Facebook, 147, 229, 243 facts, data turned into, 291n12 Farecast (firm), 143–144 feature extraction, 146–147 Ferrucci, Dave, 222, 226 filter bubbles, 151 financial markets, 124 Fisch, Max H., 96–97 Fodor, Jerry, 53 formal systems, 284n6 Frankenstein (fictional character), 238 Frankenstein: Or, a Modern Prometheus (novel, Shelly), 238, 280 frequency assumptions, 150–154, 173 Fully Automated High-Quality Machine Translation, 48 functions, 139 Galileo, 160 gambler’s fallacy, 122 games, 125–126 Gardner, Dan, 69–70 Garland, Alex, 79, 80, 289n16 Gates, Bill, 75 general intelligence, 2, 31, 36; abduction in, 4; in machines, 38; nonexistance of, 27; possible theory of, 271 General Problem Solver (AI program), 51 Germany: Enigma machine of, 23–24; during World War II, 20–21 Go (game), 125, 131, 161–162 Gödel, Kurt, 11, 22, 239; incompleteness theorems of, 12–15; Turing on, 16–18 Golden, Rebecca, 250 Good, I.

pages: 94 words: 26,453

The End of Nice: How to Be Human in a World Run by Robots (Kindle Single)
by Richard Newton
Published 11 Apr 2015

The search results you get will be different to the results for an identical search made by me. In fact, so much insight can be derived from your online behaviour that Google and other organisations can ensure you get news that makes you happy… or even angry the way you like to be angry. It’s a process described by Eli Pariser in his book The Filter Bubble: “When technology’s job is to show you the world, it ends up sitting between you and reality, like a camera lens.” Pariser wrote this following Google’s decision, in December 2009, to begin customising its search results for each user. Instead of giving you the most broadly popular result, Google would try to predict what you are most likely to click on.

pages: 561 words: 157,589

WTF?: What's the Future and Why It's Up to Us
by Tim O'Reilly
Published 9 Oct 2017

They were a tiny proportion of the total content shared on the site, he argued. Fake news is the stuff of tabloids. Marginal, once the subject of ridicule. How could it come to play such a large role in shaping our collective future? At the very least, the 2016 US presidential election showed what Eli Pariser had called “the filter bubble” in full force. Social media algorithms, driven by “likes,” show each person more of what they respond to positively, confirming their biases, reinforcing their beliefs, and encouraging them to associate online with like-minded people. The Wall Street Journal created an eye-opening site called Blue Feed/Red Feed that used Facebook’s own research data on the political preferences of its users to create side-by-side live feeds of hyperpartisan stories shown to each group.

The assertions getting consensus across all groups, or within specific groups, float to the top and are seen more often—just like content on Facebook, but with visibility into what percentage of others agreed or disagreed with them. This is very different from Facebook likes because participants can see the filter bubble–like graph of those who agree and disagree with a common set of assertions. Participants can click through to view the statements that shape a particular cluster. And as participants agree or disagree with various statements, their avatars move on the graph, toward or away from another cluster.

abstraction vs. reality, 21–22 achievement aspiring to be better, 357–58 climate change scenario, 360–63 creating more value than you capture, 17, 104, 246, 249–50, 291–92, 296–97, 354–55 developing a robust strategy, 358–67 and disruptive technology, 351–52 rising to great challenges, 369–72 spotting opportunities, 367–69 taking the long view, 355–57 technology and the future scenario, 364–67 working on what matters to you, 352–54 Active/X, 10 advertising, 79–81, 161–62, 225 Afghanistan, 116–17 Agrarian Justice (Paine), 306 agricultural productivity, 326 AI (artificial intelligence), x, xx–xxi, xxiv, 232–36 and cybercrime, 208–9 expanding research, 231–32 human fears about, ix, xv–xvi, 300 machine learning, 155, 163–69, 235–36, 334–36 personal agents, xiii, xiv–xvi, 82, 232, 233 and power efficiency, 302 social purpose for, 353–54 system design leads to predictable outcomes, 238–41 Airbnb, x, 64, 75, 97–98, 293 Akerlof, George, 249 Albright, Jonathan, 207–8 algorithms, xx, xxiv, 68 filter bubble, 199–200 human judgment vs. fact checking with, 211 management by, 59–61, 68 minimum-wage mandate vs. market-based algorithms, 197–98 and regulations, 180–81 trust in algorithmic systems, 224–28 and whac-a-mole (fake news), 201–9, 211 See also AI Alibaba, 294 Allchin, Jim, 24 AlphaGo, 165, 167 Altman, Sam, 306, 307 Alvarez, José, 197 Amazon, xi, 9, 34, 52–53, 90, 95, 103 1-Click e-commerce patent, 71–75 accessibility of data leads to AWS, 110–13 Andon Cord, 117–18 continuous improvement, 120–21, 122 and DevOps, 121–23 and electricity, 121, 124 on Linux operating system, 24 long-term investment priority, 245 and machine learning, 166 as a platform, 111–13 promise theory, 114–17 superior data and search returns, 39–40 teams, 113–14 uses for automation, 91–92 Amazon Echo, 82 Amazon Flex delivery service, 94 Amazon Go app, 78, 79 “Amazon’s Stranglehold” (LaVecchia and Mitchell), 103 Amazon Way, The (Rossman), 117 Amazon Web Services (AWS), 110–11 Andon Cord, 117 Andreessen, Marc, 15 Android smartphones, 52, 101 AOL, 276–77 Apache server, 99 Apple, xiii, xiv, 32, 53, 78, 101, 128, 136, 313, 321–22 Application Program Interface (API), 26, 128 Arora, Ashish, 246 artificial general intelligence, 233–34 art market, 312–19 Art of the Long View, The (Schwartz), 359 asylum application, automated, 332 AT&T, 6–7 augmented reality, xviii–xix, 344–45 augmented workers, 320–21, 326–32 access to opportunities, 332–34 cognitive augmentation/cyborgs, 321–22 importance of learning, 334–36 neurotech interfaces, 328–32 at Uber or Lyft, 58–59, 69–70, 332 See also education/training; employees Autodesk, 327–28 Autor, David, 305–6 Avent, Ryan, 304, 348–49 Bad Samaritans (Chang), 134 Baer, Steve, 295–97 Baird, Zoë, 342 Baldwin, Laura, 262, 349 bankruptcy for profit, 249 Basecamp, 287 Battelle, John, 29, 161 Bayha, Carla, 10, 355 Beam, Inc., 48–49, 51 Behr, Kevin, 122 Belenzon, Sharon, 246 beliefs, truth, and fake news, 210–14, 220–24 benefit corporations “B corps,” 292, 293 Berkeley Unix project, 6–7, 16 Berners-Lee, Tim, 99 Bersin, Josh, 111 Bessen, James, 345–47 Bezos, Jeff, 44, 71–75, 110–13, 114–15, 124, 366–67 Bharat, Krishna, 215 big data, 155–56, 163, 325, 326–27, 335–36 Blecharczyk, Nathan, 97–98 Blyth, Mark, 239 Boston, Massachusetts, 138–40 Bostrom, Nick, 234 Bouganim, Ron, 140 Bowling Alone (Putnam), 218–19 Boyd, John, 209 Bregman, Rutger, 307 Brin, David, 177, 179 Brin, Sergey, 132, 157, 160, 289–90 Browder, Josh, 332 Brown, John Seely, 341 Brynjolfsson, Erik, 303 Bucheit, Paul, 306–7, 308, 309 Buffett, Warren, 225, 242–43, 265, 272 “Building Global Community” (Zuckerberg), 218 Burdick, Brad, 126 Burgess, Mark, 114, 115 businesses declining R&D, 245–46 economic impact reports, 290–95 fitness function, 226, 239–41, 274, 352 limiting CEO salaries, 247 management, xxi, 153–54, 247, 279–80 and media content, 226–28 social conscience squashed, 240–41 startups, 41, 186, 247, 275, 279, 282–85, 316 stock price vs. long-term investment, 242–50 and tragedy of the commons, 249–50 uncertain job opportunities, 301–2 See also financial markets Business Insider, 211–12 business model mapping, 48–51, 57–61, 62–70 Cabulous, 56 Cadwalladr, Carole, 202–3, 214 Camp, Garrett, 54, 75 Car2go, 85 Carlsen, Magnus, 330 Carr, Nicholas, 64 Casey, Liam, 66 “Cathedral and the Bazaar” (Raymond), 8–9 central banks, xxi–xxii centralization and decentralization, 105–8 Central Park, New York, 132–33 Cerf, Vint, 107 Chan, Priscilla, 302–3 Chase, Robin, 84–85 Chastanet, Vidal, 371 Chesky, Brian, 97–98 chess and AI, 330 Chinese companies, 53 Chrapaty, Debra, 121 Christensen, Clayton, 24–25, 33–34, 315, 331, 351 Church, George, 328 Clark, Dave, 107 climate change, 300, 302, 360–63 Climate Corporation, 326 Cline, Craig, 29 “Clothesline Paradox, The” (Baer), 295–97 cloud computing, 35, 41, 53, 78, 84, 110–11, 119 Coase, Ronald, 89 Code for America, 138–44, 147, 148–49, 187, 222 Cohen, Stephen, 134 Cohler, Matt, 54 Collins, Jim, 352 combinatorial effects, 96–98 Common Gateway Interface (CGI), 81 communication, 44–45, 84, 90, 114, 115–19, 117 community.

pages: 422 words: 104,457

Dragnet Nation: A Quest for Privacy, Security, and Freedom in a World of Relentless Surveillance
by Julia Angwin
Published 25 Feb 2014

It claimed the FTC was unfairly penalizing: Brent Kendall, “FTC Fires Back in Cybersecurity Case,” Wall Street Journal, Law Blog, May 24, 2013, http://blogs.wsj.com/law/2013/05/24/ftc-fires-back-in-cybersecurity-case/. I call this type of mass customization: Internet activist Eli Pariser calls this phenomenon the “filter bubble.” Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011). Consider this: searching for a traditionally black-sounding name: Latanya Sweeney, “Discrimination in Online Ad Delivery” (Working Paper Series, Harvard University, Cambridge, Massachusetts, January 28, 2013), https://papers.ssrn.com/sol3/papers.cfm?

pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All
by Robert Elliott Smith
Published 26 Jun 2019

This creates powerful feedback loops, the net effect of which are informational segregation that can be easily exploited to divide people in much the same way real-world segregation and prejudice has divided people in the past. This isn’t just conjecture. We can see informational segregation technically within online communities, precisely because of their technical nature, and we can see the effects on the power and profit of the entities involved. ‘Echo chambers’ and ‘filter bubbles’ have emerged online, creating sub-communities that spontaneously share only narrow, self-reinforcing points of view. These closed, polarized communities can be shown to be exceptionally effective market segments, both in the realms of politics and commerce. The hidden detail in all this is how algorithms not only exploits but drives this segregation.

Regardless of whether algorithms present ‘true’ or ‘fake’ news, they will still be working towards their primary directive: the maximization of value. Another unforeseen emergent phenomena of coupling neoliberal economic values to highly efficient, optimizing algorithms is the segregation of people into more closed communities (so-called ‘filter bubbles’ and ‘echo chambers’). While selective organization of the online social network is something that is done by human beings, their ‘unfriending’ actions can’t be disconnected from the emotional states generated by the news stories algorithms deliver and the comment wars that ensue. This escalation is in itself profitable, as the segregation of people into effective market segments commands more premium advertising rates.

pages: 396 words: 117,149

The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World
by Pedro Domingos
Published 21 Sep 2015

Before you buy a car, the digital you will go over every one of its specs, discuss them with the manufacturer, and study everything anyone in the world has said about that car and its alternatives. Your digital half will be like power steering for your life: it goes where you want to go but with less effort from you. This does not mean that you’ll end up in a “filter bubble,” seeing only what you reliably like, with no room for the unexpected; the digital you knows better than that. Part of its brief is to leave some things open to chance, to expose you to new experiences, and to look for serendipity. Even more interesting, the process doesn’t end when you find a car, a house, a doctor, a date, or a job.

See also Cancer drugs Duhigg, Charles, 223 Dynamic programming, 220 Eastwood, Clint, 65 Echolocation, 26, 299 Eddington, Arthur, 75 Effect, law of, 218 eHarmony, 265 Eigenfaces, 215 80/20 rule, 43 Einstein, Albert, 75, 200 Eldredge, Niles, 127 Electronic circuits, genetic programming and, 133–134 Eliza (help desk), 198 EM (expectation maximization) algorithm, 209–210 Emotions, learning and, 218 Empathy-eliciting robots, 285 Empiricists, 57–58 Employment, effect of machine learning on, 276–279 Enlightenment, rationalism vs. empiricism, 58 Entropy, 87 Epinions, 231 Equations, 4, 50 Essay on Population (Malthus), 178, 235 Ethics, robot armies and, 280–281 Eugene Onegin (Pushkin), 153–154 “Explaining away” phenomenon, 163 Evaluation learning algorithms and, 283 Markov logic networks and, 249 Master Algorithm and, 239, 241, 243 Evolution, 28–29, 121–142 Baldwinian, 139 Darwin’s algorithm, 122–128 human-directed, 286–289, 311 Master Algorithm and, 28–29 of robots, 121–122, 137, 303 role of sex in, 134–137 technological, 136–137 See also Genetic algorithms Evolutionaries, 51, 52, 54 Alchemy and, 252–253 exploration-exploitation dilemma, 128–130, 221 further reading, 303–304 genetic programming and, 52 Holland and, 127 Master Algorithm and, 240–241 nature and, 137–139 Evolutionary computation, 121–142 Evolutionary robotics, 121–122, 303 Exclusive-OR function (XOR), 100–101, 112, 195 Exploration-exploitation dilemma, 128–130, 221 Exponential function, machine learning and, 73–74 The Extended Phenotype (Dawkins), 284 Facebook, 44, 291 data and, 14, 274 facial recognition technology, 179–180 machine learning and, 11 relational learning and, 230 sharing via, 271–272 Facial identification, 179–180, 182 False discovery rate, 77, 301 Farming, as analogy for machine learning, 6–7 Feature selection, 188–189 Feature template, 248 Feature weighting, 189 Ferret brain rewiring, 26, 299 Feynman, Richard, 4 Filter bubble, 270 Filtering spam, rule for, 125–127 First principal component of the data, 214 Fisher, Ronald, 122 Fitness Fisher on, 122 in genetic programming, 132 Master Algorithm and, 243 neural learning and, 138–139 sex and, 135 Fitness function, 123–124 Fitness maximum, genetic algorithms and, 127–128, 129 Fix, Evelyn, 178–179, 186 Fodor, Jerry, 38 Forecasting, S curves and, 106 Foundation Medicine, 41, 261 Foundation (Asimov), 232 Fractal geometry, 30, 300 Freakonomics (Dubner & Levitt), 275 Frequentist interpretation of probability, 149 Freund, Yoav, 238 Friedman, Milton, 151 Frontiers, 185, 187, 191, 196 “Funes the Memorious” (Borges), 71 Futility of bias-free learning, 64 FuturICT project, 258 Galileo, 14, 72 Galois, Évariste, 200 Game theory, machine learning and, 20 Gaming, reinforcement learning and, 222 Gates, Bill, 22, 55, 152 GECCO (Genetic and Evolutionary Computing Conference), 136 Gene expression microarrays, 84–85 Generalizations, choosing, 60, 61 Generative model, Bayesian network as, 159 Gene regulation, Bayesian networks and, 159 Genetic algorithms, 122–128 Alchemy and, 252 backpropagation vs., 128 building blocks and, 128–129, 134 schemas, 129 survival of the fittest programs, 131–134 The Genetical Theory of Natural Selection (Fisher), 122 Genetic programming, 52, 131–133, 240, 244, 245, 252, 303–304 sex and, 134–137 Genetic Programming (Koza), 136 Genetic search, 241, 243, 249 Genome, poverty of, 27 Gentner, Dedre, 199 Ghani, Rayid, 17 The Ghost Map (Johnson), 182–183 Gibson, William, 289 Gift economy, 279 Gleevec, 84 Global Alliance for Genomics and Health, 261 Gödel, Escher, Bach (Hofstadter), 200 Good, I.

pages: 397 words: 110,130

Smarter Than You Think: How Technology Is Changing Our Minds for the Better
by Clive Thompson
Published 11 Sep 2013

If we don’t engage in that sort of work, it has repercussions. It’s easier to lean into homophily, connecting online to people who are demographically similar: the same age, class, ethnicity and race, even the same profession. Homophily is deeply embedded in our psychology, and as Eli Pariser adroitly points out in The Filter Bubble, digital tools can make homophily worse, narrowing our worldview. For example, Facebook’s news feed analyzes which contacts you most pay attention to and highlights their updates in your “top stories” feed, so you’re liable to hear more and more often from the same small set of people. (Worse, as I’ve discovered, it seems to drop from view the people whom you almost never check in on—which means your weakest ties gradually vanish from sight.)

Peter Diamandis, the head of the X Prize Foundation: Peter H. Diamandis, “Instant Gratification,” in Is the Internet Changing the Way You Think?: The Net’s Impact on Our Minds and Future, ed. John Brockman (New York: HarperCollins, 2011), 214. Facebook’s news feed analyzes: Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York, Penguin, 2011), 37–38, 217–43. people who are heavily socially active online: Lee Rainie and Barry Wellman, Networked: The New Social Operating System (Cambridge, MA: MIT Press, 2012), Kindle edition. Consider the case of Maureen Evans: Some of my writing here appeared in “Clive Thompson in Praise of Obscurity,” Wired, February 2010, accessed March 26, 2013, www.wired.com/magazine/2010/01/st_thompson_obscurity/.

pages: 434 words: 117,327

Can It Happen Here?: Authoritarianism in America
by Cass R. Sunstein
Published 6 Mar 2018

Politically Motivated Selective Exposure among Internet News Users,” Journal of Computer-Mediated Communication 14 (2009): 265–85. 22. Timur Kuran and Cass R. Sunstein, “Availability Cascades and Risk Regulation,” Stanford Law Review 51 (1999): 683–768. 23. Seth Flaxman, Sharad Goel, and Justin M. Rao, “Filter Bubbles, Echo Chambers, and Online News Consumption,” Public Opinion Quarterly 80 (2016): SI298–320. 24. Conor Friedersdorf, “The Value of Fighting Attacks on Free Speech Early and Often,” Atlantic, January 6, 2017; Jonah Goldberg, “Free Speech Isn’t Always a Tool of Virtue,” National Review, June 21, 2017. 25.

On the dynamics of preference and knowledge falsification, see Timur Kuran, Private Truths, Public Lies: The Social Consequences of Preference Falsification (Cambridge, MA: Harvard University Press, 1995). 26. A mainstream news outlet is one for which the two-party fraction of its readership that voted Republican in the last presidential election is between 0.3 and 0.7. For the three outlets mentioned, the shares are 0.42, 0.45, and 0.47, respectively (Flaxman, Goel, and Rao, “Filter Bubbles,” Table 2). 27. Herman Wong, “ ‘I’m Glad He Got Shot’: Nebraska Democrat Caught on Tape Criticizing Rep. Steve Scalise,” Washington Post, June 23, 2017. 28. Kate Dailey, “Fred Phelps: How Westboro Pastor Spread ‘God Hates Fags,’” BBC News Magazine, March 21, 2014. 29. Lloyd Grove, “How Breitbart Unleashes Hate Mobs to Threaten, Dox, and Troll Trump Critics,” Daily Beast, March 1, 2016. 30.

pages: 396 words: 113,613

Chokepoint Capitalism
by Rebecca Giblin and Cory Doctorow
Published 26 Sep 2022

It’s a grand unified theory of a decades-long, corporate-led hollowing out of creative culture. It will make you angry, and it should.” —ANDY GREENBERG, author of Tracers in the Dark “Not just a fascinating tour of the hidden mechanics of the platform era, from Spotify playlists to Prince’s name change, but a compelling agenda to break Big Tech’s hold.” —ELI PARISER, author of The Filter Bubble and cofounder of Avaaz “A tome for the times … The revolution will not be spotified!” —CHRISTOPHER COE, artist and cofounder of Awesome Soundwave “A masterwork … This is also a useful handbook to take on that power structure… . Both frightening and uplifting.” —DAVID A. GOODMAN, former president of the WGA West “If you have ever wondered why the web feels increasingly stale, Chokepoint Capitalism outlines in great detail how it is being denied fresh air… .

Scott Timberg, Culture Crash: The Killing of the Creative Class (New Haven, CT: Yale University Press, 2015), 171. 9. David Pidgeon, “Where Did the Money Go? Guardian Buys Its Own Ad Inventory,” Mediatel News, Oct. 4, 2016, https://mediatel.co.uk/news/2016/10/04/where-did-the-money-go-guardian-buys-its-own-ad-inventory. 10. Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin, 2011), 49. 11. Competition and Markets Authority, “Online Platforms,” 8. 12. Tim Hwang, Subprime Attention Crisis: Advertising and the Time Bomb at the Heart of the Internet (New York: Farrar, Straus and Giroux, 2020), 99. 13. Matt Stoller, “Spotify Is Mimicking Google’s and Facebook’s Strategy: Will It Ruin Podcasting?”

pages: 159 words: 42,401

Snowden's Box: Trust in the Age of Surveillance
by Jessica Bruder and Dale Maharidge
Published 29 Mar 2020

No matter what browser you use, it’s worth remembering: Google and other commercial search engines act like giant vacuum cleaners, hoovering up your queries as an endless stream of market research. Their algorithms deliver results based on what they think you want, creating the kind of echo chamber technologists call a “filter bubble.” If you don’t want your searches tracked, try a private browser like DuckDuckGo — and note how different the results are. Some privacy-minded folks take one extra step, using a VPN — virtual private network — that camouflages online traffic. VPNs are services — some free, some subscription-based — which raises a whole new set of issues around trust and vulnerability.

pages: 525 words: 116,295

The New Digital Age: Transforming Nations, Businesses, and Our Lives
by Eric Schmidt and Jared Cohen
Published 22 Apr 2013

See “Cartoons from the Issue,” New Yorker, October 8, 2012, http://www.newyorker.com/humor/issuecartoons/2012/10/08/cartoons_20121001#slide=5. CHAPTER 2 THE FUTURE OF IDENTITY, CITIZENSHIP AND REPORTING While many worry about the phenomenon of confirmation bias: Eli Pariser describes this as a “filter bubble” in his book The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011). a recent Ohio State University study: R. Kelly Garrett and Paul Resnick, “Resisting Political Fragmentation on the Internet,” Daedalus 140, no. 4 (Fall 2011): 108–120, doi:10.1162/DAED_a_00118. famously dissected how ethnically popular names: Steven D.

pages: 444 words: 130,646

Twitter and Tear Gas: The Power and Fragility of Networked Protest
by Zeynep Tufekci
Published 14 May 2017

However, there is no transparency in algorithmic filtering: how is one to know whether Facebook is showing Ferguson news to everyone else but him or her, whether there is just no interest in the topic, or whether it is the algorithmic feedback cycle that is depressing the updates in favor of a more algorithm-friendly topic, like the ALS charity campaign? Algorithmic filtering can produce complex effects. It can result in more polarization and at the same time deepen the filter bubble.44 The bias toward “Like” on Facebook promotes the echo-chamber effect, making it more likely that one sees posts one already agrees with. Of course, this builds upon the pre-existing human tendency to gravitate toward topics and positions one already agrees with—confirmation bias—which is well demonstrated in social science research.

Trevor Timm (@trevortimm), “#Ferguson livestream has almost 40K viewers right now. For comparison, that’s almost 10% of CNN’s average viewership,” August 13, 2014, https://twitter.com/trevortimm/status/499742916315582464. 43. Tufekci, “Algorithmic Harms beyond Facebook and Google.” 44. For a prescient exploration of this danger, see Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin Books, 2012). For a deep dive into how homophily and cosmopolitanism do and don’t operate online, see Ethan Zuckerman, Digital Cosmopolitans: Why We Think the Internet Connects Us, Why It Doesn’t, and How to Rewire It (New York: W.

pages: 505 words: 138,917

Open: The Story of Human Progress
by Johan Norberg
Published 14 Sep 2020

Many assume the problem is that online you only follow those who already share your views. I don’t think so. People who read news online are more likely to come across opposing views than those who read news in print, and those with the strongest views are more not less likely to seek out the other side.40 Filter bubbles are not new. In the era when every household had a newspaper, you only heard from political opponents in the vile abstracts journalists of your tribe wrote. More than a century ago, the Swedish author August Strindberg complained that newspapers and books only gave the reader more arguments for what he already believed, and that everything that contradicted those beliefs was excluded, so ‘he never gets out of his standard diving dress’.41 In those days the opponent was a distant danger that you mostly talked about disparagingly with your friends, not an immediate threat.

(Fukuyama), 362–5 End of Work, The (Rifkin), 312 Engels, Friedrich, 33, 36, 162, 206, 247, 256 English Civil War (1642–1651), 148, 183, 184, 201 Enigma machine, 124–6 Enlightenment, 4, 5, 6, 13, 103, 154–60, 165–6, 195–6 Environmental Performance Index, 327 Ephesus, 45 Epic of Gilgamesh, The, 38 Epicurus, 134–5 Epstein, Richard, 320 equality matching, 262–6, 267 Erasmus, 152 Erdogan, Recep Tayyip, 354 Ethiopia, 72, 130 ethnocentrism, 219, 271 Etruscan civilization (c. 900–27 BC), 43 Eubulus, 47 eugenics, 109 Euphrates river, 37 Euripides, 132 European Organization for Nuclear Research, 306 European Parliament, 325 European Union (EU) Brexit (2016–), 9, 14, 118, 238, 240–41, 349, 354, 379 common currency, 280–81 freedom of movement, 118, 343 migration crisis (2015–), 10, 114, 115, 342–3, 358 subsidies in, 280 trade and, 272 United States, trade with, 19 Evans, Oliver, 203 Evolution of God, The (Wright), 249 evolutionary psychology, 14, 23, 225 exoticism, 84 Expressionism, 198 Facebook, 239, 309 Falwell, Jerry, 113–14 Farage, Nigel, 241 farming, see agriculture Fascist Italy (1922–1943), 105, 219 FedEx, 319 Feifer, Jason, 290–92 Fenway Park, Boston, 223 Ferdinand II, King of Aragon, 97, 98, 106 Ferguson, Charles, 314 Fermi, Enrico, 105 Ferney, France, 153 feudalism, 92, 194, 202, 208 fight-or-flight instinct, 15, 346, 348–9 filter bubbles, 239 financial crisis (2008), 10, 15, 62, 254, 333, 358, 359–60 fire, control of, 32–3, 76 Flanders, 208 fluyts, 100 Flynn effect, 109 Fogel, Robert, 276 folk economics, 258–62 football, 223–4, 245–6 Forbes, 274 Ford, Henry, 203 Fortune 500 companies, 82 Fox News, 82, 302, 354 France, 151 American Revolutionary War (1775–83), 201 automation in, 313 Cathars, 94, 142 Cobden–Chevalier Treaty (1860), 53–4 corruption in, 345 Dutch War (1672–8), 101 Encyclopédie, 154 free zones in, 180–81 Huguenots, persecution of, 97, 99, 101, 158, 193 immigration in, 115 Jews, persecution of, 96, 97, 254 languages in, 289 Minitel, 313 Revolution (1789–99), 201, 292 Royal Academy of Sciences, 156 ruin follies, 287 St Bartholomew’s Day massacre (1572), 97 Thököly Uprising (1678–85), 137 Uber in, 320 University of Paris, 140, 141–2, 143 Francis I, Emperor of Austria-Hungary, 178 Franciscans, 144 Franklin, Benjamin, 107 Franks, 92 free speech, 127, 131–2, 160, 163–5, 343 Chicago principles, 164–5 emigration for, 152–3 university campuses, 163–5 free trade, see under trade Fried, Dan, 289 Friedman, Benjamin, 253 Friedman, David, 284 Friedman, Thomas, 325 Friedrich Wilhelm I, King of Prussia, 153 Fukuyama, Francis, 362–5 Fulda, Germany, 179, 180 Future and Its Enemies, The (Postrel), 300 Future of Nostalgia, The (Boym), 288 Galatia, 90 Galaxy Zoo, 80 Galilei, Galileo, 146, 150 Gallup, 164 game theory, 26 Gandhi, Indira, 326 gas lighting, 297 Gates, William ‘Bill’, 274, 277, 309 Gauls, 90, 91, 92 gay rights, 113, 336 Geary, Patrick, 288–9 gender equality, 113, 114 General Motors, 64 generations baby-boom generation (1946–64), 294, 340 generation X (1965–80), 340 immigration and, 106, 110–11, 113–14 interwar generation (1928–45), 340 millennial generation (1981–96), 340 nostalgia and, 291, 293–4, 296 genetically modified organisms (GMO), 299, 301 Geneva, Switzerland, 152, 153 Genghis Khan, 94–5, 96, 174 Genoa, Republic of (1005–1797), 73, 178 George II, King of Great Britain and Ireland, 193 George III, King of Great Britain and Ireland, 103, 193 George Mason University, 257, 258 Georgia, 365 Georgia, United States, 349 German Conservative Party, 254 Germany automatic looms, 179 Berlin Wall, fall of (1989), 10, 340, 341, 363, 364 Bronze Age migration, 75 budget deficits, 60 COVID-19 pandemic (2019–20), 12 guilds in, 190 immigration in, 114, 115 Jews, persecution of, 99, 104–6, 109, 220, 233 migration crisis (2015–), 342–3 Nazi period (1933–45), 104–6, 109, 124, 220, 233, 353 Neolithic migration, 74 protectionism in, 314 Reichstag fire (1933), 353 Thirty Years War (1618–48), 150 United States, migration to, 104, 107–8, 111 Weimar period (1918–33), 353 al-Ghazali, 139 Gholia, 89 Gibbon, Edward, 90 Gilder, George, 314 Gilgamesh, 38 Gillis, John, 291 Gingrich, Newton, 313 Gini coefficient, 273 Gintis, Herbert, 36 global history, 13 global price crisis (2010–11), 11 global warming, 75, 323, 325, 326–34 globalization, 4, 55, 270 backlashes against, 9, 14, 54, 57 cities and, 35 classical world, 43–50 conspiracy theories on, 323 disease and, 11, 77–9 United States and, 19 Westernization, 4 Glorious Revolution (1688), 101, 185–8, 190, 193 Goa, India, 146–7 golden nugget theory, 5 Golden Rule, 251–2 Golding, William, 219, 243, 244 Goldstone, Jack, 5, 133, 353 Goodness Paradox, The (Wrangham), 227 Google, 309, 311 Gordon, Thomas, 201 Göring, Hermann, 106 gossip, 229 Goths, 92 Gottlieb, Anthony, 135 Great Awakening (1730–55), 102 Great Depression (1929–39), 54–5, 56, 254 Great Enrichment, 167, 204 Great Recession (2007–9), 254–5, 358, 359–60 Great Transformation, The (Polanyi), 37 Great Vanishing, 134–5 Great Wall of China, 178 Greece, ancient, 127–32, 169 Athens, 47, 53, 89, 90, 131–2, 134 Axial Age, 129 cosmopolitanism, 87–8 golden nugget theory, 5 Ionian enlightenment, 127–9 Mycenae, 88 philosophy, 13, 70, 127–32, 134–5, 136 Phoenicians, relations with, 43, 44, 45, 46 science, 127–32, 136 Sparta, 47, 54, 90, 132 trade, attitudes towards, 47, 54 xenophobia in, 90 Green New Deal, 302 Greene, Joshua, 216, 259 Greenland, 51 Gregorian calendar, 137, 152 Gregory IX, Pope, 142 Gregory XIII, Pope, 152 gross domestic product (GDP), 68–9, 257, 278–9 Grotius, Hugo, 147, 152–3 groupthink, 83 Guangzhou, Guangdong, 352 guilds, 190 Gutenberg, Johannes, 146 Haber, Fritz, 105 Habsburg Empire (1282–1918) anti-Semitism in, 254 Austria, 151, 179, 190 refugees, 99 Spain, 98–9, 208 Hadrian, Roman Emperor, 91 Hadrian’s Wall, 47 Hagley Park, West Midlands, 286–7 Haidt, Jonathan, 163, 229, 344, 348, 357 Haile Selassie, Emperor of Ethiopia, 72 Hamas, 365 Hangzhou, Zhejiang, 173 Hanseatic League (1358–1862), 53 Hanson, Robin, 282 Hanway, Jonas, 298 Happy Days, 294 Harari, Yuval Noah, 38 Harriot, Thomas, 150 Hartsoeker, Nicolaas, 159 Harvard Business Review, 313 Harvard University, 116, 122, 137, 253, 309, 313 Haskell, Thomas, 206 Hässelby, Stockholm, 217–18, 245 Hayashi, Stuart, 370 Hayek, Friedrich, 1, 7, 29, 300, 325 Hebrew Bible, 248–50 Hegel, Georg Wilhelm Friedrich, 288, 365 Helm, Dieter, 328, 331 Henrich, Joseph, 36 Hercules, 87 Herodotus, 132 Hewlett-Packard, 304 Higgs, Robert, 337 Hill, Christopher, 182 Hinduism, 136, 149, 354 von Hippel, William, 24, 25, 262, 284 Hippocrates, 128 Hispanic people, 110–11 Hitler, Adolf, 104–5, 353 Hobbes, Thomas, 9, 152, 226 Hofer, Johannes, 288 Holmgren, Pär, 325 Holocaust (1941–5), 109, 220 Holy Roman Empire (800–1806), 155, 181, 288 Homestead Acts, 171 Homo economicus, 34, 36 Homo erectus, 76, 267 Homo sapiens, 3, 21, 23, 30–33, 76, 259–62, 282, 371 homosexuality, 79, 113–14, 336 Homs, Syria, 82 Honeywell, 303 Hong Kong, 53, 235, 316 Hoover, Herbert, 55 horseshoes, 203 House of Wisdom, Baghdad, 136 Household Narrative, The, 297 housing, 375–6 Huguenots, 97, 99, 101, 158, 193 human rights, 87, 147, 213 humanitarianism, 204–7 Hume, David, 151, 154, 194 Hungary, 105, 190, 235, 237, 354, 357 hunkering down, 121, 165 Huns, 93 hunter-gatherer societies death rate, 9 disease and, 78 division of labour and, 29, 32, 40–41, 57 equality matching, 262–3, 265 inbreeding and, 78 isolation and, 52 migration, 73–4, 78–9 physical fallacy, 268 race and, 232 trade, 265 tyranny of cousins, 230 Huntington, Samuel, 110, 362–3, 365–6 Hussein, Saddam, 345 Hussey, Edward, 287 Hutchins, Robert Maynard, 165 Hutus, 230–31 Hypatia, 134 hyper-fast stars, 80 IBM, 305, 307, 319 Ibn al-Haytham, 156 Ibn Hayyan, Jabir, 156 Ibn Rushd, 137–8, 143, 144, 145 ice core drilling, 49 Identity & Violence (Sen), 231 identity politics, 241 al-Idrisi, Muhammad, 137 immigration birth rates and, 115 crime and, 110, 119 culture and, 69–73, 116, 119, 120–23 disgust and, 336, 371 division of labour and, 117 empires and, 84–106 European migration crisis (2015–), 10, 114, 115, 118, 342–3 exoticism, 84 GDP and, 68 innovation and, 81–4 Islam and, 112–14, 255 labour market and, 115, 116–19 opposition to, 69, 70, 114–23, 223, 254–5 productivity and, 68, 81, 117, 204 protectionism and, 66–7 self-selection and, 107, 112 skilled vs unskilled, 66, 82, 102, 116, 117 trade and, 35, 66–7, 234–5 tribalism and, 223, 235–6, 240, 243 urban vs rural areas, 114 welfare and, 118, 281 zero-sum thinking and, 254–5, 259 immigration in United States, 102–14 crime and, 110, 119 innovation and, 81–2, 202 overestimation of, 115, 223 tribalism and, 223, 240 zero-sum thinking and, 254–5, 259 In Defence of Global Capitalism (Norberg), 270 in vitro fertilization, 298–9 inbreeding, 78 India, 42, 45, 46, 56, 75, 129, 136, 140, 146, 270 Arabic numerals, 70, 137 engineering in, 269 Hindu nationalism, 354 industrialization, 207 Maurya Empire (323–184 BC), 53 Mughal Empire (1526–1857), 98, 148, 149, 215 national stereotypes, 235 Pakistan, relations with, 366 pollution in, 326 poverty in, 276, 326 Indo-European language, 75 Indonesia, 41 Industrial Revolution; industrialization, 5, 6, 13, 54, 132, 180, 339 in Britain, 182, 188–99, 202 in China, 169, 172–3, 207 climate change and, 326 in Dutch Republic, 101 in India, 207 in Japan, 71 in United States, 202, 291–2 in Vietnam, 207 inequality, 273, 349 Inglehart, Ronald, 339 ingroups and outgroups, 217–47 fluidity, 230–38 political, 224–5, 238–42 zero-sum relationships and, 252–5 Innocent III, Pope, 233 InnoCentive, 126–7 innovation, 4, 6, 10, 27, 80 ancient world, 32, 42, 44, 46 authoritarianism and, 318 bureaucratic inertia and, 318–21 canon and, 195 cities and, 40, 53, 79 creative destruction, 57, 179, 182, 190 cultural evolution, 28 immigration and 81–4 patent systems, 189–90 population and, 27, 51, 53 Schumpeterian profits, 273–5 resistance to, 10, 179–81 zero-sum thinking and, 266–9 Inquisition, 150 France, 94, 143 Portugal, 100 Spain, 97, 98 intellectual property, 58 Intergalactic Computer Network, 307 International Monetary Fund (IMF), 117 Internet, 57, 275, 278, 306–11, 312, 313 interwar generation (1928–45), 340 Inuit, 22, 51 Ionian enlightenment, 127–9 IQ (intelligence quotient), 109 Iran, 365 Ireland, 104, 108–9, 111, 112, 379 iron, 172 Isabella I, Queen of Castile, 97 Isaiah, 46 Isaura Palaia, Galatia, 90 Isenberg, Daniel, 296 Isis, 89 Islam; Islamic world Arab Spring (2011), 10, 342 clash of civilizations narrative, 237, 365 conflict within, 365 efflorescence, 6, 53, 136–41 fundamentalism, 112, 134, 139, 351 Koran, 137, 250–51 migration from, 112–14 orthodox backlash, 148–9 philosophy, 5, 13 science, 70, 132, 136–41 values in, 112, 113 Islamic State, 351, 365–6 Islamic world, 5, 6, 13, 53, 70 Israel, 111, 365 Italy, 6, 151, 169 anti-Semitism in, 254 Fascist period (1922–1943), 105 Genoa, Republic of (1005–1797), 73, 178 guilds in, 190 Lombard League (1167–1250), 181 Ötzi, 1–2, 8–9, 73, 74 Padua, 144, 146 Papacy in, 155, 181 Renaissance, 6, 150, 153, 169 United States, migration to, 104, 109 Venice, Republic of (697–1797), 53, 144, 152, 174, 181 Jacobs, Jane, 39–40, 79, 264 James II and VII, King of England, Scotland and Ireland, 185–6 Jamestown, Virginia, 200 Japan housing in, 376 kimonos, 73 Meiji Restoration (1868), 53, 70–71 protectionism, 314 Tokugawa Shogunate (1600–1868), 54 United States, migration to, 104, 236, 335 Japanning, 156 JavaScript, 310 jealous emulation, 154–7 jeans, 73 Jefferson, Thomas, 103, 184, 201, 205 Jenner, Edward, 296 Jerusalem, 87, 251 Jesus, 250 Jews in Abbasid Caliphate, 136 anti-Semitism, 254–5, 356 Ashkenazim, 99 Babylonian captivity, 87, 249 Bible, 46, 72, 248–50 Black Death and, 355–6 in Britain, 101, 193 in Dutch Republic, 99, 100, 150 in Germany, 99, 104–6, 109, 111, 254 Inquisition and, 97, 98 in Israel, 111 Mongol invasion and, 95 Muhammed and, 251 Nazirites, 72 in Ottoman Empire, 98 persecution of, 11, 95–7, 109, 220, 233, 251, 355–6 in Poland, 111, 220 in Roman Empire, 90, 93, 94 Sephardim, 99 in Song Empire, 170 in Spain, 97, 98, 99, 140 in United States, 102, 109 Jim Crow laws (1877–1965), 106, 254 Job Buddy, 375 Jobless Future, The (Aronowitz), 312 Jobs, Steven, 82, 304 John Chrysostom, 135 John III Sobieski, King of Poland, 237, 238 Johnson, Samuel, 191, 197 Johnson, Steven, 306 Jones, Rhys, 51 Joule, James Prescott, 196 Judaism, 46, 72, 93, 94, 96, 97 Jupiter, 145 Jurchen people, 172 Justinian I, Byzantine Emperor, 134, 224 Kahn, Robert, 307 Kandinsky, Wassily, 220–21, 289 Kant, Immanuel, 154 Karakorum, Mongol Empire, 96 al-Karaouine, Morocco, 137 Kearney, Denis, 109 keels, 44 Kenya, 21–2 Khayyam, Omar, 137 al-Khwarizmi, 137 Kiesling, Lynne, 328 Kim Jong-il, 314–15 kimonos, 73 King, Martin Luther, 19 King, Steven, 111 Kipling, Rudyard, 70 Klee, Paul, 220–21, 289 Know-Nothings, 108–9 Kodak, 319 Koran, 137, 250–51 Kramer, Samuel Noah, 37, 292 Krastev, Ivan, 342–3 Krugman, Paul, 309 Ku Klux Klan, 254 Kublai Khan, 174 Kurds, 136 Kushim, 37–8 labour mobility, 69, 374–7 lacquerware, 156 lactose, 75 Lao Tzu, 129 lapis lazuli, 70 Late Bronze-Age Collapse (1200–1150 BC), 44, 49, 54 Lebanon, 43, 236 Lee, William, 179 leisure, 199 Lenin, Vladimir, 256 Lesbos, 141 Levellers, 183–4, 186 Leviathan (Hobbes), 152 Levinovitz, Alan Jay, 290 Levy, David, 205 Lewis, David Levering, 140 Libanius, 49 liberalism, 14, 183, 334–40 colonialism and, 214 disgust and, 335, 336 dynamism and, 301 economic, 185, 336 Islam and, 112–14 security and, 334–40, 378 slave trade and, 205 universities and, 163 Libya, 48, 89, 366 Licklider, Joseph Carl Robnett, 307 life expectancy, 4, 169, 339 light bulbs, 297 Lilburne, John, 183 Lincoln, Abraham, 203 Lind, Amanda, 72 Lindsey, Brink, 301 literacy, 15, 57, 168 in Britain, 188, 198 in China, 148 in Dark Ages, 50 empathy and, 246–7 in Greece, 128–9 in Renaissance, 146, 148 Lithuania, 238 Little Ice Age, 148 lobbying, 280, 329 Locke, John, 100, 152, 185, 186, 201 Lombard League, 181 London, England, 190, 193–4, 197 7/7 bombings (2005), 341 London Bridge stabbings (2019), 120 Long Depression (1873–86), 253–4 Lord of the Flies (Golding), 219, 243, 244 Lord’s Resistance Army, 365 Louis IX, King of France, 96 Louis XIV, King of France, 237 Louis XVI, King of France, 201 love, 199 Lucas, Robert, 167 Lucy, 24–5 Lugh, 89 Lul, 111 Luther, Martin, 150, 356 Lutheranism, 99, 356 Lüthi, Max, 351 Lysenko, Trofim, 162 Lyttelton family, 286 Macartney Mission (1793), 176 Macedonian Empire (808–148 BC), 84, 87–9 Madison, James, 337 madrasas, 138 Madrid train bombings (2004), 341 Maduro, Nicolás, 354, 380 Magna Carta (1215), 5 Magris, Claudio, 219 Malacca, 100 Maltesholm School, Hässelby, 217–18, 245 mammoths, 76 Manchester United, 246 Manichaeism, 93 Mann, Thomas, 79 Mansfield, Edward, 271 Mao Zedong, 53, 162, 315, 316, 317, 355 Marcus Aurelius, Roman Emperor, 91 Marduk, 87 de Mariana, Juan, 147 markets, 37 humanitarianism and, 204, 206 immigration and, 68 tribalism, 247 ultimatum game, 34–5 Marley, Robert ‘Bob’, 72 marriage, 199 Marshall, Thurgood, 335 Marx, Karl, 33, 36, 162, 169, 247, 255–6 Marxism, 33, 36, 162, 182, 256, 268 Mary II, Queen of England, Scotland and Ireland, 186, 193 Maryland, United States, 349 Maslow, Abraham, 339, 341 al-Masudi, 136 mathematics, 70, 134, 135, 137, 156 Maurya Empire (323–184 BC), 53 Mauss, Marcel, 71 McCarthy, Joseph, 335 McCarthy, Kevin, 108 McCloskey, Deirdre, 167, 189, 191–2, 198 McConnell, Addison Mitchell ‘Mitch’, 108 McKinsey, 313 measles, 77 media, 346–9, 370 Medicaid, 119 Medina, 251 Medusa, 88 Meiji Restoration (1868), 53, 70–71 Mencken, Henry Louis, 325, 353 Mercury, 89 Merkel, Angela, 343 Mesopotamia, 37–43, 45, 70, 292–3 Metaphysics (Aristotle), 142 Mexico, 73, 77, 257 United States, migration to, 110, 122, 223, 240, 255 Miami, Florida, 120 Micro-80 computers, 304 Microsoft, 305–6, 309 middle class, 60–61 Migration Advisory Committee, UK, 118 Miletus, 127 militarism, 214 Mill, John Stuart, 124, 160, 164, 176, 319 millennial generation (1981–96), 340 Milton, John, 150 Ming Empire (1368–1644), 54, 148, 175, 177–8, 179, 215 minimal group paradigm, 220–22 Minitel, 313 Mobutu Sese Seko, 187 Mokyr, Joel, 157, 195, 196–7 Molyneux, Stefan, 84 Mongol Empire (1206–1368), 53, 84, 94–7, 138, 139, 173–4, 352–3 monopolies, 182, 189 Monte Testaccio, 48 Montesquieu, 89, 94 Moral Consequences of Growth, The (Friedman), 253 Moral Man and Immoral Society (Niebuhr), 253 Moriscos, 97 mortgages, 375 Moscow Institute of Electronic Engineering, 304 most-favoured-nations clause, 53–4 Mughal Empire (1526–1857), 98, 148, 149, 215 Muhammed, Prophet of Islam, 251 Murray, William Vans, 104 Muslims migration of, 112–14, 170, 255 persecution of, 97, 106, 233, 355 Mutz, Diana, 271 Mycenae, 88 Myth of Nations, The (Geary), 288–9 Myth of the Rational Voter, The (Caplan), 258 Naipaul, Vidiadhar Surajprasad, 167 Napoleonic Wars (1803–15), 288 National Aeronautics and Space Administration (NASA), 126, 127 National Library of Medicine, US, 12 National Science Foundation, US, 313 National Security Agency, US, 313 national stereotypes, 235 nationalism, 9, 11, 13, 16 civic nationalism, 377–8 clash of civilizations narrative, 237 cultural purity and, 69, 70, 71, 352 immigration and, 69, 70, 82 nostalgia and, 287–8, 351 World War I (1914–18), 214 zero-sum thinking, 253, 254, 259, 272 nativism, 14, 122, 176, 223, 254, 349–51, 358 Natural History Museum, London, 124, 125 Naturalism, 198 Nazi Germany (1933–45), 104–6, 109, 124, 220, 233, 353 Nazirites, 72 Neanderthals, 30–33, 75, 76 Nebuchadnezzar, Babylonian Emperor, 46 neckties, 72 negative income tax, 374–5 Neilson, James Beaumont, 194 Nemeth, Charlan, 83 Neo-Classicism, 198 Neolithic period (c. 10,000–4500 BC), 74 Netflix, 309, 310 Netherlands, 99 von Neumann, John, 105 neurasthenia, 291 New Atlantis (Bacon), 147 New Guinea, 41 New Testament, 250 New York, United States crime in, 246, 334 September 11 attacks (2001), 10, 114, 340–42 New York Times, 291, 297, 325 New York University, 223 New York Yankees, 223 Newcomen, Thomas, 196 Newton, Isaac, 158–9, 201 Nicomachean Ethics (Aristotle), 131 Niebuhr, Reinhold, 253 Nietzsche, Friedrich, 365 Nîmes, France, 73 Nineteen Eighty-Four (Orwell), 230, 368 Nineveh, Assyria, 248–9 Nixey, Catherine, 134 Nobel Prize, 82, 105, 276 non-market societies, 34, 35 Nordhaus, William, 273–4 North American Free Trade Agreement (NAFTA), 63, 64 North Carolina, United States, 102 North Korea, 54, 314–15, 366 North Star, 44 nostalgia, 14, 286–95, 313, 351 Not Fit for Our Society (Schrag), 107 novels, 188–9, 246–7 nuclear power, 301, 327, 328, 329, 332 nuclear weapons, 105, 290, 306 O’Rourke, Patrick Jake, 280 Oannes, 267 Obama, Barack, 66, 240, 329 obsidian, 22, 29 occupational licensing, 376–7 Ögedei Khan, 96 Ogilvie, Sheilagh, 179 Oklahoma, United States, 218–19 Old Testament, 46, 72, 248–50 olive oil, 48 Olorgesailie, 21–2 omnivores, 299 On Liberty (Mill), 160 one-year-old children, 26 open society, 6 open-mindedness, 35, 112 Opening of the mouth’ rite, 70 Orbán, Viktor, 354, 380 de Orta, Garcia, 146–7 Orwell, George, 230, 368 Osman II, Ottoman Sultan, 148 Ottoman Empire (1299–1923), 84, 94, 98, 148, 215, 220, 237, 353 Ötzi, 1–2, 8–9, 73, 74 overpopulation, 81, 160 Overton, Richard, 183 Pacific islands, 52 Paine, Thomas, 56, 158, 247 Pakistan, 70, 366 Pallas Athena, 89 Pallavicino, Ferrante, 150 Palmer, Tom Gordon, 15 Panthers and Pythons, 243–4 Papacy, 102, 142, 143, 152, 155, 178 Papin, Denis, 179, 180 Paris, France exiles in, 152, 153 University of Paris, 140, 141–2, 143 parochialism, 216 patent systems, 58, 82, 189–90, 203, 314 in Britain, 179, 189–90, 203, 314 in China, 58 in France, 189 immigrants and, 82 in Netherlands, 189 in United States, 203 PayPal, 310 Peasants’ Revolt (1381), 208 peer review, 127 Pence, Michael, 108 penny universities, 166 Pericles, 131 Permissionless Innovation (Thierer), 299 Perry, Gina, 243 Perseus, 87–8 Persia, ancient, 84, 86–7, 88, 95, 129, 215 Abbasid period (750–1258), 136 Achaemenid Empire (550–330 BC), 86–7, 88 Greeks, influence on, 129 Mongols, influence on, 95 Safavid Empire (1501–1736), 149 Sasanian Empire (224–651), 134 personality traits, 7 Pertinax, Roman Emperor, 91 Pessimists Archive, 290, 297, 298 Pessinuntia, 89 Peters, Margaret, 66 Peterson Institute for International Economics, 60 Petty, William, 296 Philip II King of Spain, 98 Phoenicia (2500–539 BC), 43–6, 49, 70, 128–9 Phoenicia dye, 44 Phrygians, 89 physical fallacy, 267–8 Physics (Aristotle), 142 Pietists, 153 Pinker, Steven, 23, 243, 266, 324 Plague of Justinian (541–750), 77 Plato, 130, 131, 132, 134, 352 pluralism, 85, 129, 357 Plutarch, 45–6 Poland Battle of Vienna (1683), 237, 238 Dutch Republic, migration to, 99 Holocaust (1941–5), 220 immigration, 116 Israel, migration to, 111 United Kingdom, migration to, 120 United States, migration to, 108, 109 Polanyi, Karl, 37 polio, 293 pollution, 326, 347 Polo, Marco, 174 Popper, Karl, 6, 26, 127, 129, 130, 182–3, 237, 362 population density, 28 populism, 9, 13, 14, 16, 324, 379–82 authoritarianism and, 325, 350–51 complexity and, 324 nostalgia and, 295, 324, 351 trade and, 19 zero-sum thinking and, 254, 259, 274 pornography, 113, 336 Portugal Empire (1415–1999), 100, 146–7, 178 guilds in, 190 Inquisition, 100 Postrel, Virginia, 300, 312, 326 pound locks, 172 poverty, 4, 168, 213, 270 in Britain, 256 in China, 4, 316 immigration and, 66, 69, 81, 121 in Japan, 71 Jeff Bezos test, 275–9 Preston, Lancashire, 190 priests, 41, 128 printing, 146, 153, 171 Pritchard, James Bennett, 43 productivity cities and, 40 foreign trade and, 57, 59, 63 free goods and, 278 immigration and, 68, 81, 117, 204 programming, 8 Progress (Norberg), 12–13 progressives, 286, 300–302 Proserpina, 89 protectionism, 13, 15, 16, 54–5 Great Depression (1929–39), 54–5 immigration and, 66–7 Internet and, 314 Trump administration (2017–), 19, 57–8 Protestantism, 99, 104, 148, 149, 153, 169, 178, 237 Prussia (1701–1918), 153, 288 Psychological Science, 335 Puerto Rico, 80 Pufendorf, Samuel, 147 purchasing power, 59, 61, 63, 66, 198 Puritanism, 99, 102 Putin, Vladimir, 14, 353–4 Putnam, Robert, 121, 165 Pythagoras, 137 Pythons and Panthers, 243–4 al-Qaeda, 351 Qianlong, Qing Emperor, 153 Qing Empire (1644–1912), 148, 149, 151, 153, 175–7, 179 Quakers, 99, 102, 206 Quarantelli, Enrico, 338 Quarterly Journal of Economics, The, 63 race; racism, 76–7, 206, 231–4, 358–9 railways, 53, 179, 202, 296, 297 Rammstein, 274 RAND Corporation, 307 Raphael, 137 Rastafari, 72 Rattlers and Eagles, 218–19, 236, 243, 252 reactive aggression, 227–8 Reagan, Ronald, 63, 111 Realism, 198 realistic conflict theory, 222 Reconquista (711–1492), 139 Red Genies, 236 Red Sea, 75 Reformation, 148, 155 refugees crime and, 119 European migration crisis (2015–), 10, 114, 115, 281, 342–3 integration of, 117–18 German Jews (1933–45), 104–6, 109 Rembrandt, 99 reminiscence bump, 294 Renaissance, 5, 6, 132, 143, 145–6, 149–50, 215 Republic of Letters, 157–9, 165, 195 Republic, The (Plato), 352 Republican Party, 164, 225, 238, 240, 301 Reynell, Carew, 184 Reynolds, Glenn, 308 Ridley, Matthew, 20–21, 80 right to work laws, 65 Rizzo, Frank, 334 Road to Serfdom, The (Hayek), 325 Robbers Cave experiment (1954), 218–19, 236, 243, 252, 371 Robbins, Caroline, 200–201 Robertson, Marion Gordon ‘Pat’, 114 Robinson, James, 185, 187, 200 rock paper scissors, 26 Rogers, Will, 282 Roman Law, 5 Romanticism, 198, 287, 296–7 Rome, ancient, 47–50, 89–94, 132 Antonine Plague (165–80), 77 assimilation, 91–2 chariot racing, 224 Christianity in, 90, 93–4, 133–4 citizenship, 91 cosmopolitanism, 89–91 fall of, 54, 94 gods in, 89–90 golden nugget theory, 5 globalization, 45–6, 47–50 haircuts, 72 Latin alphabet, 45 philosophy, 70, 136 Phoenicians, relations with, 43, 44 Sabines, relations with, 89 Social War (91–88 BC), 91 trousers, attitudes towards, 92 Romulus, 89, 90 Rotterdam, Holland, 158 Rousseau, Jean-Jacques, 226 Royal Navy, 205 Royal Society, 156, 157, 158, 196 Rubin, Paul, 258 ruin follies, 286–7 rule of law, 68, 189, 269, 334, 343, 358, 379 Rumbold, Richard, 183–4 Rushdie, Salman, 73 Ruskin, John, 206, 297 Russia Imperial period (1721–1917), 154, 289–90 Israel, migration to, 111 Mongol period (1237–1368), 95, 352 Orthodox Christianity, 155 Putin period (1999–), 14, 15, 347, 353–4, 365, 367 Soviet period (1917–91), 162, 302–5, 315, 317 United States, relations with, 236 Yamnaya people, 74–5 Rust Belt, 58, 62, 64–6, 349 Rwandan Genocide (1994), 230–31 Sabines, 89 Safavid Empire (1501–1736), 149 safety of wings, 374 Saint-Sever, France, 180 Salamanca school, 147, 150 Sanders, Bernard, 302 Santa Fe Institute, 216 SARS (severe acute respiratory syndrome), 3, 162 Saudi Arabia, 365 Scandinavia Bronze Age migration, 75 Neolithic migration, 74 United States, migration to, 104, 108 see also Sweden scapegoats, 11, 83, 253, 268, 349, 355–61 Black Death (1346–53), 352, 355–6 Great Recession (2007–9), 255 Mongol invasion (1241), 95 Schmandt-Besserat, Denise, 38 School of Athens, The (Raphael), 137 School of Salamanca, 147, 150 Schrag, Peter, 107 Schrödinger, Erwin, 105, 128, 129, 132 Schumpeter, Joseph, 277 Schumpeterian profits, 273–5 science, 127–66 in China, 4, 13, 70, 153, 156, 162–3, 169–73 Christianity and, 133–5, 141–6, 149–50 Enlightenment, 154–9 experiments, 156–7 Great Vanishing, 134–5 in Greece, 127–32 jealous emulation and, 154–7 in Islamic world, 70, 132, 136–41 Renaissance, 145–6 Republic of Letters, 157–9, 165, 195 sclera, 25 Scotland, 101, 194 Scotney Castle, Kent, 287 Sculley, John, 304 sea peoples, 43 sea snails, 44 Seinfeld, Jerry, 224 Seleucid Empire (312–63 BC), 88 self-esteem, 372, 379 Sen, Amartya, 231 Seneca, 49, 91 Sephardic Jews, 99 September 11 attacks (2001), 10, 114, 340–42, 363 Septimius Severus, Roman Emperor, 91 Servius, Publius, 90 Seven Wonders of the World, 45 Seville, Spain, 91, 139 sex bonobos and, 226 encoding and, 233 inbreeding, 78 views on, 113, 336 SGML (Standard Generalized Markup Language), 307 Shaftesbury, Lord, see Cooper, Anthony Ashley Sherif, Muzafer, 219, 220, 222, 243, 252 Shia Islam, 149 Shining, The, 335 shirts, 72 Siberia, 76 Sicily, 89 Sierra Leone, 365 Siger of Brabant, 143, 144 Sikhism, 149 Silicon Valley, 311 Silk Road, 171, 174, 352 silver processing, 49 Simler, Kevin, 282 Simmel, Georg, 266 Simon, Julian, 81 Simple Rules for a Complex World (Epstein), 320 Singapore, 53 skilled workers, 36, 45, 66, 95, 97, 101, 117 Slater, Samuel, 202 slavery, 86, 156, 205–6, 232 in British Empire, 182, 199, 200, 205 in Mesopotamia, 40, 41, 43 in Rome, 47, 48 in Sparta, 54 in United States, 103, 106, 205, 232 smallpox, 77, 197, 293, 296 Smith, Adam, 21, 59, 192, 194, 205, 280 Smith, Fred, 319 smoke detectors, 234 Smoot–Hawley Tariff Act (1930), 55 snack boxes, 20 Snow, Charles Percy, 105 social media, 239, 347, 370 social status, 281–5 Social War (91–88 BC), 91 Socrates, 130, 131–2, 330 solar power, 328, 329, 331, 332 Solomon, King of Israel, 38, 45 Solyndra, 329 Song Empire (960–1279), 53, 169–75 Sony, 319 Soros, George, 323 South Korea, 314, 366 South Sudan, 365 Soviet Union (1922–91), 162, 302–5, 315, 317 Sovu, Rwanda, 231 Sowell, Thomas, 267–8 Spain, 97–101, 184, 207 Almohad Caliphate (1121–1269), 137–8 amphorae production, 48 al-Andalus (711–1492), 97, 137–9, 140 Columbus’ voyages (1492–1503), 178 Dutch Revolt (1568–1648), 98–9, 101 Empire (1492–1976), 147, 178, 182 guilds in, 190 Inquisition (1478–1834), 97, 98 Jews, persecution of, 97–8, 106, 140 Madrid train bombings (2004), 341 Muslims, persecution of, 97, 106 Reconquista (711–1492), 97, 138–9, 140 regional authorities, 152 Roman period (c.218 BC–472 AD), 48, 91 Salamanca school, 147, 150 sombreros, 73 Uber in, 320 vaqueros, 73 Spanish flu (1918–19), 77 Sparta, 47, 54, 90, 132 Spencer, Herbert, 165, 214 Spinoza, Baruch, 100, 150, 153 Spitalfields, London, 190 sports, 199, 223–4, 232–3, 245–6 Sri Lanka, 100, 365 St Bartholomew’s Day massacre (1572), 97 St Louis, SS, 109 Standage, Tom, 166 Stanford University, 307, 311 Star Trek, 246, 259 stasists, 301–2 Statute of Labourers (1351), 208 steam engine, 179, 180, 189, 194, 203, 296 steamships, 53, 202 Stenner, Karen, 242, 343, 348, 350, 357 Stockholm, Sweden, 217–18 Stranger Things, 294 Strasbourg, France, 153 strategic tolerance, 86–96 Strindberg, August, 239 Suarez, Francisco, 147 suits, 72 Sumer (4500–1900 BC), 37–43, 45, 55, 292–3 Summers, Larry, 329 Sunni Islam, 148, 149, 238, 365 superpowers, 338–9 supply chains, 11, 62, 66 Sweden DNA in, 73 Green Party, 325 Lind dreadlocks affair (2019), 72 immigration in, 114, 115, 118, 281 manufacturing in, 65 Muslim community, 114 Neolithic migration, 74 refugees in, 118, 281, 342 United States, migration to, 107 Sweden Democrats, 281 swine flu, 3 Switzerland, 152, 153 Sylvester II, Pope, 137 Symbolism, 198 Syria, 42, 82, 342, 365, 366 tabula rasa, 225 Tacitus, 91 Taiwan, 316, 366 Taizu, Song Emperor, 170 Tajfel, Henri, 220, 221–2 Tandy, Geoffrey, 124–6 Tang Empire (618–907), 84, 170, 177, 352 Tanzania, 257 Taoism, 129, 149 tariffs, 15, 56, 373 Anglo–French Treaty (1860), 53–4 Great Depression (1929–39), 54–5 Obama’s tyre tariffs (2009), 66 Trump’s steel tariffs (2018), 272 Tasmania, 50–53, 54 Tatars, 238 taxation in Britain, 72, 187, 188, 189 carbon tax, 330–31 crony capitalism and, 279–80 immigration and, 69 negative income tax, 374–5 in Song Empire, 172 in Spanish Netherlands, 98 Taylor, Robert, 306 TCP/IP protocol, 307 technology, 296–9 automation, 63, 312–13 computers, 302–14 decline, 51–2 Internet, 57, 275, 278, 306–11, 312 nostalgia and, 296–9, 313 technocrats, 299–300, 312, 313–14, 326–9 technological decline, 51–2 telescopes, 145–6 Teller, Edward, 105 Temple of Artemis, Ephesus, 45 Temple of Serapis, Alexandria, 134 Tencent, 311 terrorism, 10, 114, 229, 340–41, 363 Tetlock, Philip, 160 textiles, 172–3 Thales, 127 Thierer, Adam, 299 third-party punishment game, 35 Thirty Years War (1618–48), 72, 97, 148, 150 Thomas Aquinas, Saint, 142–3, 144–5 Thoreau, Henry David, 203 Thracians, 130 Thucydides, 131, 132 Tiangong Kaiwu, 153 Tibetans, 85 Tierra del Fuego, 52–3 Tigris river, 37, 139 Timurid Empire (1370–1507), 139 tin, 42 Tokugawa Shogunate (1600–1868), 54 Toledo, Spain, 140 tolerance, 86–114, 129 Tomasello, Michael, 25 ‘too big to fail’, 280 Tower of Babel, 39 Toynbee, Arnold, 382 trade, 13, 19–23, 28–9, 129, 140, 363, 373 backlashes against, 19, 54–67, 254 benefit–cost ratio, 60, 61, 62 Britain, 181–99 competitive advantage, 28–9 division of labour and, 28, 31, 57 Great Depression (1929–39), 54–5 Greece, ancient, 47 humanitarianism and, 204–7 Mesopotania, 37–43 migration and, 35, 66–7, 234–5 morality of, 33–6 Phoenicia, 43–6 Rome, ancient, 47–50 snack boxes, 20 United States, 19, 57–8, 202–3 zero-sum thinking and, 248, 252–66, 270–72 trade unions, 64, 65, 272, 374 Trajan, Roman Emperor, 91 Trans-Pacific Partnership, 58 Transparency International, 381 Treaty of Trianon (1920), 354 Treaty of Versailles (1919), 353 Trenchard, John, 201 Treschow, Michael, 65 Trevor-Roper, Hugh, 215, 356 tribalism, 14, 217–47, 362, 368–72 fluid, 230–38 political, 224–5, 238–42, 378, 379 media and, 348, 370 threats and, 241, 350, 370 Trollboda School, Hässelby, 218 Trump, Donald, 9, 14, 240, 313, 321, 322, 354, 365, 367, 380 immigration, views on, 223 presidential election (2016), 238, 241, 242, 349, 350 stasism, 301, 302 steel tariffs (2018), 272 trade, views on, 19, 57–8 zero-sum attitude, 248 Tunisia, 45, 48 Turing, Alan, 124 Turkey; Turks, 70, 74, 136, 156, 354, 357, 365 turtle theory, 121–2 Tutsis, 230–31 Twilight Zone, The, 260–61 Twitter, 84, 239, 245 Two Treatises of Government (Locke), 186, 201 tyranny of cousins, 229, 230 tyre tariffs, 66 Tyre, 45 Uber, 319–20 Uganda, 365 Ukraine, 75, 116, 365 ultimatum game, 34–6 umbrellas, 298 uncertainty, 321–6 unemployment, 62, 373–4, 376, 377 ‘unicorns’, 82 United Auto Workers, 64 United Kingdom, see Britain United Nations, 327 United States, 199–203 Afghanistan War (2001–14), 345 America First, 19, 272 automation in, 313 Bureau of Labor Statistics, 65 California Gold Rush (1848–1855), 104 China, trade with, 19, 57, 58–9, 62–3, 64 Chinese Exclusion Act (1882), 254 citizenship, 103 Civil War (1861–5), 109 climate change polices in, 328 Constitution (1789), 102, 202 consumer price index, 277 COVID-19 pandemic (2019–20), 12 crime in, 110, 119, 120, 346 Declaration of Independence (1776), 103, 201, 202 dynamism in, 301–2 Federalist Party, 103 free trade gains, 60, 61 Great Depression (1929–39), 54–5, 254 gross domestic product (GDP), 257 Homestead Acts, 171 housing in, 376 immigration, see immigration in United States Industrial Revolution, 202, 291–2 innovation in, 53, 203, 298–9 intellectual property in, 58 Internet in, 306–14 Iraq War (2003–11), 345 Jim Crow laws (1877–1965), 106, 254 Know-Nothings, 108–9 Ku Klux Klan, 254 labour mobility in, 374, 376–7 lobbying in, 280, 329 Manhattan Project (1942–6), 105 manufacturing, 62–6 McCarthy era (1947–57), 335 Medicaid, 119 middle class, 60–61 NAFTA, 63, 64 National Library of Medicine, 12 national stereotypes, 235, 236 nostalgia in, 290–92, 294 open society, 169, 199–203 patent system, 203 political tribalism in, 224–5, 238, 240 populist movement, 254 presidential election (2016), 238, 241, 242, 349, 350 railways, 202 Revolutionary War (1775–83), 102–3, 200–201 Robbers Cave experiment (1954), 218–19, 236, 243, 252, 371 Rust Belt, 58, 62, 64–6, 349 Saudi Arabia, relations with, 365 Senate, 108 September 11 attacks (2001), 10, 114, 340–42, 363 slavery in, 103, 106, 205 Smoot–Hawley Tariff Act (1930), 55 Supreme Court, 108, 335 tariffs, 66, 272 trade deficits, 60, 270 Trump administration (2017–), see Trump, Donald unemployment in, 373, 376 universities, 163–5, 241 Vietnam War (1955–75), 345 Watergate scandal (1972–4), 345 World War II (1939–45), 56, 64, 335 Yankees, 58 United Steelworkers, 64, 272 universal basic income (UBI), 374, 375 universities, 140 University Bologna, 140 University of California, Berkeley, 311 University of Cambridge, 140 University of Chicago, 165 University of Leeds, 357 University of London, 201 University of Marburg, 153 University of Oxford, 140, 144, 145, 328 University of Padua, 144, 146 University of Paris, 140, 141–2, 143 University of Pennsylvania, 271 University of Salamanca, 140 University of Toulouse, 144 unskilled workers, 36, 66, 102, 117 untranslatable words, 288 Ur, 55 urbanization, see cities Uruk, Sumer, 39 US Steel, 64 Usher, Abbott Payson, 196 Uyghurs, 85, 174 vaccines, 12, 296, 299 Vandals, 92 Vanini, Lucilio, 150 vaqueros, 73 Vargas Llosa, Mario, 213, 261 Vatican Palace, 137 Vavilov, Nikolai, 162 Venezuela, 354 Venice, Republic of (697–1797), 53, 144, 152, 174, 181 Vermeer, Johannes, 99 Vespucci, Amerigo, 146 Vienna, Austria, 95, 237, 238 Vienna Congress (1815), 288 Vietnam, 171, 207, 270, 345 Virgil, 91 Virginia Company, 200 vitamin D, 74 de Vitoria, Francisco, 147 Vladimir’s choice, 221, 252, 271 Voltaire, 153, 193 Walton, Sam, 277 Wang, Nina, 315 War of the Polish Succession (1733–8), 289–90 Ward-Perkins, Bryan, 50 warfare, 216–17, 243 Warren, Elizabeth, 302 washing of hands, 10, 335 Washington, George, 103, 205 Washington, DC, United States, 280 Watergate scandal (1972–4), 345 Watson, John, 291 Watson, Peter, 79 Watt, James, 172, 189, 194, 274 Weatherford, Jack, 95 Web of Science, 159 Weber, Maximilian, 204 WeChat, 311 Weekly Standard, 312 welfare systems, 118, 281, 374 Wengrow, David, 42 West Africa Squadron, 205 Western Roman Empire (395–480), 94, 135 Westernization, 4–5 Wheelan, Charles, 20 Whig Party, 185, 201 White House Science Council, 313 white supremacists, 84, 351, 367 Whitechapel, London, 190 Who Are We?

pages: 196 words: 54,339

Team Human
by Douglas Rushkoff
Published 22 Jan 2019

A society functioning on these platforms tends toward similarly discrete formulations. Like or unlike? Black or white? Rich or poor? Agree or disagree? In a self-reinforcing feedback loop, each choice we make is noticed and acted upon by the algorithms personalizing our news feeds, further isolating each one of us in our own ideological filter bubble. The internet reinforces its core element: the binary. It makes us take sides. 41. Digital media push us apart, but they also seem to push us backward. Something about this landscape has encouraged the regressive sentiments of the populist, nationalist, and nativist movements characterizing our time.

pages: 501 words: 145,943

If Mayors Ruled the World: Dysfunctional Nations, Rising Cities
by Benjamin R. Barber
Published 5 Nov 2013

The new technology allows us to assemble en masse anywhere and at the speed of light, but the billion on the Internet gather only as individuals in small coteries of friends and family; others, aliens, and enemies are not welcome. The web removes all physical limits from deliberation and common decision making but seems to reinforce social ghettoization and groupthink, as Eli Pariser shows in The Filter Bubble, his book on Google and search engines.21 Enthusiasts cite the Arab Spring and Occupy Wall Street as exemplars of how technology can catalyze democracy. The culture of the web, they say, is embedded in the genes of contemporary rebels, defining not just how dissidents organize as free agents but how they think, how they understand their freedom.

Barber, Strong Democracy: Participatory Politics for a New Age, Princeton, NJ: Princeton University Press, 1984, p. 274. 20. Herbert Hoover, cited in Langdon Winner, “The Internet and Dreams of Democratic Renewal,” in The Civic Web: Online Politics and Democratic Values, ed. David M. Anderson and Michael Cornfield. Oxford: Rowman and Littlefield, 2003, p. 168. 21. Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, New York: Penguin Books, 2012. William F. Baker, former president of WNET, thus speaks of “Google’s Internet grab,” and suggests the issue is monopoly in this “dominant new information medium.” “Google’s Internet Grab,” The Nation, February 11, 2013. 22.

pages: 205 words: 61,903

Survival of the Richest: Escape Fantasies of the Tech Billionaires
by Douglas Rushkoff
Published 7 Sep 2022

Well-meaning efforts to use computers to make prison sentencing more fair yielded algorithms that put Black people in jail for longer than whites who had committed the same crimes. Simple algorithms that bring people the kinds of news stories they are most likely to read have wreaked havoc on our civic and political lives, leading to filter bubbles, alienation, and the unchecked proliferation of fake news. None of this was intended by the technologists who programmed these systems or the people who put their faith in these game-changing improvements over the ways we were doing things before. Technosolutions are extremely attractive to politicians and philanthropists like Michael Bloomberg, Reid Hoffman, the Ford Foundation, or Bill Gates, who take a data-driven approach to problem solving.

pages: 199 words: 63,724

The Passenger: Berlin
by The Passenger
Published 8 Jun 2021

Oldenburg writes that it is essential to the workings of a democracy because it provides a public space where below and above, left and right, outer and inner all mingle. And through encounters and contact with others it reminds us that none of us is the norm, whether we’re rich or poor, a secondary-school teacher or a foreman, a regular or someone who’s just wandered in. The hardening towards people who live or think differently that we find in online filter bubbles and echo chambers can be ironed out here. Every city and every country has their own version of this; think of the Vienna coffee house (recognised by UNESCO as having ‘intangible cultural heritage’) or the British pub or free house, which expresses its function in its very name. WHY SHOULD WE MOURN THE KNEIPE?

pages: 212 words: 69,846

The Nation City: Why Mayors Are Now Running the World
by Rahm Emanuel
Published 25 Feb 2020

When the left later countered with MSNBC, the balkanization of the news—and the information spreading and inflaming partisanship that came with it—was complete. This led to an opening for a presidential candidate like, say, Donald Trump to publicly describe any news that he didn’t like as “fake.” In recent years the explosion of social media, with its filter bubbles, its fast-spreading conspiracy theories, its vulnerability to hacking, its easily doctored photos, and its proliferation of ill-thought-out hot takes, has only exacerbated this problem, to the point where it just might have swung a presidential election. Aggressive gerrymandering, which began to increase in the early 2000s, has also played a big role in the nation-state’s decline.

pages: 210 words: 65,833

This Is Not Normal: The Collapse of Liberal Britain
by William Davies
Published 28 Sep 2020

Britain is a long way from the US experience, thanks principally to the presence of the BBC, which, for all its faults, still performs a basic function in providing a common informational experience. It is treated as a primary source of news by 60 per cent of people in the UK. Even 42 per cent of Brexit Party and UKIP voters get their news from the BBC. The panic surrounding echo chambers and so-called filter bubbles is largely groundless. If we think of an echo chamber as a sealed environment, which only circulates opinions and facts that are agreeable to its participants, it is a rather implausible phenomenon. Research by the Oxford Internet Institute suggests that just 8 per cent of the UK public are at risk of becoming trapped in such a clique.11 Trust in the media is low, but this entrenched scepticism long predates the internet or contemporary populism.

pages: 237 words: 65,794

Mining Social Media: Finding Stories in Internet Data
by Lam Thuy Vo
Published 21 Nov 2019

This information was tailored to each Facebook account, meaning that it was available only through looking at Linder and Cooper’s Facebook news feeds. NOTE You can read the article, “This Conservative Mom and Liberal Daughter Were Surprised by How Different Their Facebook Feeds Are,” from BuzzFeed News at https://www.buzzfeed.com/lamvo/facebook-filter-bubbles-liberal-daughter-conservative-mom/. Figure 5-1: A graphic of whose posts show up the most on Linder’s and Cooper’s feeds, respectively Ethical Considerations for Data Scraping Social media companies set data restrictions based on what they deem appropriate for their business interests, for their users’ privacy concerns, and for other reasons.

pages: 278 words: 70,416

Smartcuts: How Hackers, Innovators, and Icons Accelerate Success
by Shane Snow
Published 8 Sep 2014

At the time I checked the Facebook statistics, the story had received 12 shares; however that count may increase as more people discover it. 53 the top story on the hugely popular blog: The BuzzFeed article about ’90s side characters is by Dave Stopera, “20 Supporting Characters from ’90s TV Shows Then and Now,” BuzzFeed, March 27, 2012, http://www.buzzfeed.com/daves4/20-supporting-actors-from-90s-tv-shows-then-and-n (accessed May 27, 2013). I actually did laugh at the Olmec reference. 54 a mellow, unshaven author: Eli Pariser is author of a fascinating book about one of the darker effects of the “personalized” Internet, The Filter Bubble: What the Internet Is Hiding from You (Penguin Press, 2011). 54 including Facebook cofounder: Upworthy’s investor information can be found at CrunchBase, Upworthy, http://www.crunchbase.com/company/upworthy (accessed February 15, 2014). 55 the week after Upworthy launched: The baby meerkats and other disappointed animals can be found at Jack Shepherd, “33 Animals Who Are Extremely Disappointed in You,” BuzzFeed, April 10, 2012, http://www.buzzfeed.com/expresident/animals-who-are-extremely-disappointed-in-you (accessed May 27, 2013). 56 The little comedy theater: You can learn everything you want about Kelly Leonard, executive director of The Second City, and the school itself at The Second City, https://www.secondcity.com.

Raw Data Is an Oxymoron
by Lisa Gitelman
Published 25 Jan 2013

Will data follow the model of genetic materials, with data becoming the intellectual property of a data broker who had altered it in some fashion? Proposed policy solutions thus far include improved securitization, transparency and informed consent, expiration dates and storage limits, and the regulation of data centers. 28. On the era of personalization, see Eli Pariser, The Filter Bubble:What the Internet Is Hiding from You (New York: Penguin Books, 2011). 29. Kevin D. Haggerty and Richard V. Ericson, The New Politics of Surveillance and Visibility (Toronto: University of Toronto Press, 2006), 4. Dataveillance and Countervailance 30. Matthew Fuller, Media Ecologies: Materialist Energies in Art and Technoculture (Cambridge, MA: MIT Press, 2005), 149. 31.

pages: 286 words: 79,305

99%: Mass Impoverishment and How We Can End It
by Mark Thomas
Published 7 Aug 2019

According to Caleb Gardner: Forty-four per cent of US adults get news on the [Facebook] site, and 61 per cent of millennials… if that doesn’t frighten you, you don’t know enough about Facebook’s algorithm. If you have a parent who’s a Trump supporter, they are seeing a completely different set of news items than you are.21 The algorithm that worried Gardner also worried Eli Pariser, author of The Filter Bubble: What The Internet Is Hiding From You. As Pariser put it: Democracy requires citizens to see things from one another’s point of view, but instead we are more and more enclosed in our own bubbles. Democracy requires reliance on shared facts; instead we are being offered parallel but separate universes.

pages: 337 words: 86,320

Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are
by Seth Stephens-Davidowitz
Published 8 May 2017

Where did Donald Trump’s unexpected support come from? When Ann Landers asked her readers in 1976 whether they regretted having children and was shocked to find that a majority did, was she misled by an unrepresentative, self-selected sample? Is the internet to blame for that redundantly named crisis of the late 2010s, the “filter bubble”? What triggers hate crimes? Do people seek jokes to cheer themselves up? And though I like to think that nothing can shock me, I was shocked aplenty by what the internet reveals about human sexuality—including the discovery that every month a certain number of women search for “humping stuffed animals.”

pages: 297 words: 83,651

The Twittering Machine
by Richard Seymour
Published 20 Aug 2019

What if it doesn’t work that way? What if information is like sugar, and a high-information diet is a benchmark of cultural poverty? What if information, beyond a certain point, is toxic? One is struck, therefore, by the palpable timidity of commonplace diagnoses of ‘fake news’, opinion silos, filter bubbles and the ‘post-truth’ society. This, the ‘sour grapes’ theory of communications, is sensationalism. But all sensationalism is a form of understatement, all moral panic a form of trivialization, and this is glaringly so in the case of our ‘fake news’ panic.74 The problem is not the lies, but a crash in meaning.

Refuge: Transforming a Broken Refugee System
by Alexander Betts and Paul Collier
Published 29 Mar 2017

If the underlying purpose of the refugee regime is a duty of rescue and a pathway to autonomy, then the collective challenge should be how we can effectively and efficiently provide those rights to all refugees, rather than a different (and inapt) set of rights for an arbitrarily privileged few. The right to seek asylum is not the same thing as an absolute right to freedom of movement. Although it has become popular among advocacy organizations and within the liberal filter bubble to see being a refugee as necessarily conferring an unimpeded right to travel, this is neither ethically nor legally credible. Aside from going down a general ‘open borders’ route, the only refugee-specific argument one could use to justify an exceptional, absolute right to migrate is that because refugees have generally had such a difficult time we might wish to just let them have a ‘free pass’ in terms of migration.

pages: 339 words: 88,732

The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies
by Erik Brynjolfsson and Andrew McAfee
Published 20 Jan 2014

On gaming, see Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (New York: W. W. Norton & Company, 2011); on cyberbalkanization, see Marshall van Alstyne and Erik Brynjolfsson, “Electronic Communities: Global Villages or Cyberbalkanization?” ICIS 1996 Proceedings, December 31, 1996, http://aisel.aisnet.org/icis1996/5; and Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin, 2012); on social isolation see Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other (New York: Basic Books, 2012); and Robert D. Putnam, Bowling Alone: The Collapse and Revival of American Community, 1st ed.

pages: 284 words: 92,688

Disrupted: My Misadventure in the Start-Up Bubble
by Dan Lyons
Published 4 Apr 2016

Instead, like Hollywood, or Wall Street, Silicon Valley has become a metaphorical name for an industry, one that exists in Los Angeles, Seattle, New York, Boston, and countless other places, as well as the San Francisco Bay Area. The term bubble, as I use it, refers not only to the economic bubble in which the valuation of some tech start-ups went crazy but also to the mindset of the people working inside technology companies, the true believers and Kool-Aid drinkers, the people who live inside their own filter bubble, brimming with self-confidence and self-regard, impervious to criticism, immunized against reality, unaware of how ridiculous they appear to the outside world. HubSpot, where I worked from April 2013 to December 2014, was part of that bubble. In November 2014, the company floated a successful IPO, and it now has a market value of nearly $2 billion.

pages: 349 words: 95,972

Messy: The Power of Disorder to Transform Our Lives
by Tim Harford
Published 3 Oct 2016

(Many of the tweets made false claims, which were rapidly retweeted.)31 Pierson’s analysis showed that the two groups, with very different views of the world, barely interacted.32 From the middle of one of these groups, surrounded by outrage expressed by like-minded people, it is easy to believe that the world agrees with you. Of course the Internet is full of contrary viewpoints that might challenge our assumptions and encourage us to think more deeply, but few of us realize that we might have to get out and look for those viewpoints. In the words of author and digital activist Eli Pariser, a “filter bubble” exists to give us more of what we already believe. It is sometimes hard to see that bubble for what it is. When our stream of social media updates fits tidily into our preconceptions, we are hardly likely to mess it up by seeking out the people who disagree. The pattern repeats endlessly: we gain new choices about whom to listen to, whom to trust, and whom to befriend—and we use those new choices to tighten the circle around us to people who are more and more like us.

pages: 349 words: 98,868

Nervous States: Democracy and the Decline of Reason
by William Davies
Published 26 Feb 2019

Kennedy International Airport, New York, x, xiii, 41 Johns Hopkins University, 176 Jones, Alexander, 131 Kant, Immanuel, 128, 130 Kemelmacher-Shlizerman, Ira, 188 Kennedy Jr., Robert, 23 Kepler, Johannes, 35 Keynes, John Maynard, 165 King Jr., Martin Luther, 21, 224 knowledge economy, 84, 85, 88, 151–2, 217 known knowns, 132, 138 Koch, Charles and David, 154, 164, 174 Korean War (1950–53), 178 Kraepelin, Emil, 139 Kurzweil, Ray, 183–4 Labour Party, 5, 6, 65, 80, 81, 221 Lagarde, Christine, 64 Le Bon, Gustave, 8–12, 13, 15, 16, 20, 24, 25, 38 Le Pen, Marine, 27, 79, 87, 92, 101–2 Leadbeater, Charles, 84 Leeds, West Yorkshire, 85 Leicester, Leicestershire, 85 Leviathan (Hobbes), 34, 39, 45 liberal elites, 20, 58, 88, 89, 161 libertarianism, 15, 151, 154, 158, 164, 173, 196, 209, 226 Liberty Fund, 158 Libya, 143 lie-detection technology, 136 life expectancy, 62, 68–71, 72, 92, 100–101, 115, 224 Lindemann, Frederick Alexander, 1st Viscount Cherwell, 138 Lloyds Bank, 29 London, England bills of mortality, 68–71, 75, 79–80, 81, 89, 127 Blitz (1940–41), 119, 143, 180 EU referendum (2016), 85 Great Fire (1666), 67 Grenfell Tower fire (2017), 10 and gross domestic product (GDP), 77, 78 housing crisis, 84 insurance sector, 59 knowledge economy, 84 life expectancy, 100 newspapers, early, 48 Oxford Circus terror scare (2017), ix–x, xiii, 41 plagues, 67–71, 75, 79–80, 81, 89, 127 Unite for Europe march (2017), 23 London School of Economics (LSE), 160 loss aversion, 145 Louis XIV, King of France, 73, 127 Louisiana, United States, 151, 221 Ludwig von Mises Institute, 154 MacLean, Nancy, 158 Macron, Emmanuel, 33 mainstream media, 197 “Make America Great Again,” 76, 145 Manchester, England, 85 Mann, Geoff, 214 maps, 182 March For Our Lives (2018), 21 March for Science (2017), 23–5, 27, 28, 210, 211 marketing, 14, 139–41, 143, 148, 169 Mars, 175, 226 Marxism, 163 Massachusetts Institute of Technology (MIT), 179 Mayer, Jane, 158 McCarthy, Joseph, 137 McGill Pain Questionnaire, 104 McKibben, William “Bill,” 213 Megaface, 188–9 memes, 15, 194 Menger, Carl, 154 mental illness, 103, 107–17, 139 mercenaries, 126 Mercer, Robert, 174, 175 Mexico, 145 Million-Man March (1995), 4 mind-reading technology, 136 see also telepathy Mirowski, Philip, 158 von Mises, Ludwig, 154–63, 166, 172, 173 Missing Migrants Project, 225 mobilization, 5, 7, 126–31 and Corbyn, 81 and elections, 81, 124 and experts, 27–8 and Internet, 15 and Le Bon’s crowd psychology, 11, 12, 16, 20 and loss, 145 and Napoleonic Wars, xv, 127–30, 141, 144 and Occupy movement, 5 and populism, 16, 22, 60 and violence, opposition to, 21 Moniteur Universel, Le, 142 monopoly on violence, 42 Mont Pelerin Society, 163, 164 moral emotion, 21 morphine, 105 multiculturalism, 84 Murs, Oliver “Olly,” ix Musk, Elon, 175, 176, 178, 183, 226 Nanchang, Jiangxi, 13 Napoleonic Wars (1803–15), 126–30 chappe system, 129, 182 and conscription, 87, 126–7, 129 and disruption, 170–71, 173, 174, 175, 226 and great leader ideal, 146–8 and intelligence, 134 and mobilization, xv, 126–30, 141, 144 and nationalism, 87, 128, 129, 144, 183, 211 and propaganda, 142 Russia, invasion of (1812), 128, 133 Spain, invasion of (1808), 128 National Aeronautics and Space Administration (NASA), 23, 175 National Audit Office (NAO), 29–30 national citizenship, 71 National Defense Research Committee, 180 National Health Service (NHS), 30, 93 National Park Service, 4 National Security Agency (NSA), 152 national sovereignty, 34, 53 nationalism, 87, 141, 210–12 and conservatism, 144 and disempowerment, 118–19 and elites, 22–3, 60–61, 145 ethnic, 15 and health, 92, 211–12, 224 and imagined communities, 87 and inequality, 78 and loss, 145 and markets, 167 and promises, 221 and resentment, 145, 197, 198 and war, 7, 20–21, 118–19, 143–6, 210–11 nativism, 61 natural philosophy, 35–6 nature, 86 see also environment Nazi Germany (1933–45), 137, 138, 154 Netherlands, 48, 56, 129 Neurable, 176 neural networking, 216 Neuralink, 176 neurasthenia, 139 Neurath, Otto, 153–4, 157, 160 neurochemistry, 108, 111, 112 neuroimaging, 176–8, 181 Nevada, United States, 194 new atheism, 209 New Orleans, Louisiana, 151 New Right, 164 New York, United States and climate change, 205 and gross domestic product (GDP), 78 housing crisis, 84 JFK Airport terror scare (2016), x, xiii, 41 knowledge economy, 84 September 11 attacks (2001), 17, 18 New York Times, 3, 27, 85 newspapers, 48, 71 Newton, Isaac, 35 Nietzsche, Friedrich, 217 Nixon, Robert, 206 no-platforming, 22, 208 Nobel Prize, 158–9 non-combatants, 43, 143, 204 non-violence, 224 North Atlantic Treaty Organization (NATO), 123, 145, 214 North Carolina, United States, 84 Northern Ireland, 43, 85 Northern League, 61 Northern Rock, 29 Norwich, Norfolk, 85 nostalgia, xiv, 143, 145, 210, 223 “Not in my name,” 27 nuclear weapons, 132, 135, 137, 180, 183, 192, 196, 204 nudge techniques, 13 Obama, Barack, 3, 24, 76, 77, 79, 158, 172 Obamacare, 172 objectivity, xiv, 13, 75, 136, 223 and crowd-based politics, 5, 7, 24–5 and death, 94 and Descartes, 37 and experts, trust in, 28, 32, 33, 51, 53, 64, 86, 89 and Hayek, 163, 164, 170 and markets, 169, 170 and photography, 8 and Scientific Revolution, 48, 49 and statistics, 72, 74, 75, 82, 88 and telepathic communication, 179 and war, 58, 125, 134, 135, 136, 146 Occupy movement, 5, 10, 24, 61 Oedipus complex, 109 Office for National Statistics, 63, 133 Ohio, United States, 116 oil crisis (1973), 166 “On Computable Numbers” (Turing), 181 On War (Clausewitz), 130 Open Society and Its Enemies, The (Popper), 171 opiates, 105, 116, 172–3 opinion polling, 65, 80–81, 191 Orbán, Viktor, 87, 146 Organisation for Economic Co-operation and Development (OECD), 72 Oxford, Oxfordshire, 85 Oxford Circus terror scare (2017), ix–x, xiii, 41 Oxford University, 56, 151 OxyContin, 105, 116 pacifism, 8, 20, 44, 151 pain, 102–19, 172–3, 224 see also chronic pain painkillers, 104, 105, 116, 172–3 Palantir, 151, 152, 175, 190 parabiosis, 149 Paris climate accord (2015), 205, 207 Paris Commune (1871), 8 Parkland attack (2018), 21 Patriot Act (2001), 137 Paul, Ronald, 154 PayPal, 149 Peace of Westphalia (1648), 34, 53 peer reviewing, 48, 139, 195, 208 penicillin, 94 Pentagon, 130, 132, 135, 136, 214, 216 pesticides, 205 Petty, William, 55–9, 67, 73, 85, 167 pharmacology, 142 Pielke Jr., Roger, 24, 25 Piketty, Thomas, 74 Pinker, Stephen, 207 plagues, 56, 67–71, 75, 79–80, 81, 89, 95 pleasure principle, 70, 109, 110, 224 pneumonia, 37, 67 Podemos, 5, 202 Poland, 20, 34, 60 Polanyi, Michael, 163 political anatomy, 57 Political Arithmetick (Petty), 58, 59 political correctness, 20, 27, 145 Popper, Karl, 163, 171 populism xvii, 211–12, 214, 220, 225–6 and central banks, 33 and crowd-based politics, 12 and democracy, 202 and elites/experts, 26, 33, 50, 152, 197, 210, 215 and empathy, 118 and health, 99, 101–2, 224–5 and immediate action, 216 in Kansas (1880s), 220 and markets, 167 and private companies, 174 and promises, 221 and resentment, 145 and statistics, 90 and unemployment, 88 and war, 148, 212 Porter, Michael, 84 post-traumatic stress disorder (PTSD), 111–14, 117, 209 post-truth, 167, 224 Potsdam Conference (1945), 138 power vs. violence, 19, 219 predictive policing, 151 presidential election, US (2016), xiv and climate change, 214 and data, 190 and education, 85 and free trade, 79 and health, 92, 99 and immigration, 79, 145 and inequality, 76–7 and Internet, 190, 197, 199 “Make America Great Again,” 76, 145 and opinion polling, 65, 80 and promises, 221 and relative deprivation, 88 and Russia, 199 and statistics, 63 and Yellen, 33 prisoners of war, 43 promises, 25, 31, 39–42, 45–7, 51, 52, 217–18, 221–2 Propaganda (Bernays), 14–15 propaganda, 8, 14–16, 83, 124–5, 141, 142, 143 property rights, 158, 167 Protestantism, 34, 35, 45, 215 Prussia (1525–1947), 8, 127–30, 133–4, 135, 142 psychiatry, 107, 139 psychoanalysis, 107, 139 Psychology of Crowds, The (Le Bon), 9–12, 13, 15, 16, 20, 24, 25 psychosomatic, 103 public-spending cuts, 100–101 punishment, 90, 92–3, 94, 95, 108 Purdue, 105 Putin, Vladimir, 145, 183 al-Qaeda, 136 quality of life, 74, 104 quantitative easing, 31–2, 222 quants, 190 radical statistics, 74 RAND Corporation, 183 RBS, 29 Reagan, Ronald, 15, 77, 154, 160, 163, 166 real-time knowledge, xvi, 112, 131, 134, 153, 154, 165–70 Reason Foundation, 158 Red Vienna, 154, 155 Rees-Mogg, Jacob, 33, 61 refugee crisis (2015–), 60, 225 relative deprivation, 88 representative democracy, 7, 12, 14–15, 25–8, 61, 202 Republican Party, 77, 79, 85, 154, 160, 163, 166, 172 research and development (R&D), 133 Research Triangle, North Carolina, 84 resentment, 5, 226 of elites/experts, 32, 52, 61, 86, 88–9, 161, 186, 201 and nationalism/populism, 5, 144–6, 148, 197, 198 and pain, 94 Ridley, Matt, 209 right to remain silent, 44 Road to Serfdom, The (Hayek), 160, 166 Robinson, Tommy, ix Roosevelt, Franklin Delano, 52 Royal Exchange, 67 Royal Society, 48–52, 56, 68, 86, 133, 137, 186, 208, 218 Rumsfeld, Donald, 132 Russian Empire (1721–1917), 128, 133 Russian Federation (1991–) and artificial intelligence, 183 Gerasimov Doctrine, 43, 123, 125, 126 and information war, 196 life expectancy, 100, 115 and national humiliation, 145 Skripal poisoning (2018), 43 and social media, 15, 18, 199 troll farms, 199 Russian Revolution (1917), 155 Russian SFSR (1917–91), 132, 133, 135–8, 155, 177, 180, 182–3 safe spaces, 22, 208 Sands, Robert “Bobby,” 43 Saxony, 90 scarlet fever, 67 Scarry, Elaine, 102–3 scenting, 135, 180 Schneier, Bruce, 185 Schumpeter, Joseph, 156–7, 162 Scientific Revolution, 48–52, 62, 66, 95, 204, 207, 218 scientist, coining of term, 133 SCL, 175 Scotland, 64, 85, 172 search engines, xvi Second World War, see World War II securitization of loans, 218 seismology, 135 self-employment, 82 self-esteem, 88–90, 175, 212 self-harm, 44, 114–15, 117, 146, 225 self-help, 107 self-interest, 26, 41, 44, 61, 114, 141, 146 Semi-Automatic Ground Environment (SAGE), 180, 182, 200 sentiment analysis, xiii, 12–13, 140, 188 September 11 attacks (2001), 17, 18 shell shock, 109–10 Shrecker, Ted, 226 Silicon Fen, Cambridgeshire, 84 Silicon Valley, California, xvi, 219 and data, 55, 151, 185–93, 199–201 and disruption, 149–51, 175, 226 and entrepreneurship, 149–51 and fascism, 203 and immortality, 149, 183–4, 224, 226 and monopolies, 174, 220 and singularity, 183–4 and telepathy, 176–8, 181, 185, 186, 221 and weaponization, 18, 219 singularity, 184 Siri, 187 Skripal poisoning (2018), 43 slavery, 59, 224 smallpox, 67 smart cities, 190, 199 smartphone addiction, 112, 186–7 snowflakes, 22, 113 social indicators, 74 social justice warriors (SJWs), 131 social media and crowd psychology, 6 emotional artificial intelligence, 12–13, 140–41 and engagement, 7 filter bubbles, 66 and propaganda, 15, 18, 81, 124 and PTSD, 113 and sentiment analysis, 12 trolls, 18, 20–22, 27, 40, 123, 146, 148, 194–8, 199, 209 weaponization of, 18, 19, 22, 194–5 socialism, 8, 20, 154–6, 158, 160 calculation debate, 154–6, 158, 160 Socialism (Mises), 160 Society for Freedom in Science, 163 South Africa, 103 sovereignty, 34, 53 Soviet Russia (1917–91), 132, 133, 135–8, 177, 180, 182–3 Spain, 5, 34, 84, 128, 202 speed of knowledge, xvi, 112, 124, 131, 134, 136, 153, 154, 165–70 Spicer, Sean, 3, 5 spy planes, 136, 152 Stalin, Joseph, 138 Stanford University, 179 statactivism, 74 statistics, 62–91, 161, 186 status, 88–90 Stoermer, Eugene, 206 strong man leaders, 16 suicide, 100, 101, 115 suicide bombing, 44, 146 superbugs, 205 surveillance, 185–93, 219 Sweden, 34 Switzerland, 164 Sydenham, Thomas, 96 Syriza, 5 tacit knowledge, 162 talking cure, 107 taxation, 158 Tea Party, 32, 50, 61, 221 technocracy, 53–8, 59, 60, 61, 78, 87, 89, 90, 211 teenage girls, 113, 114 telepathy, 39, 176–9, 181, 185, 186 terrorism, 17–18, 151, 185 Charlottesville attack (2017), 20 emergency powers, 42 JFK Airport terror scare (2016), x, xiii, 41 Oxford Circus terror scare (2017), ix–x, xiii, 41 September 11 attacks (2001), 17, 18 suicide bombing, 44, 146 vehicle-ramming attacks, 17 war on terror, 131, 136, 196 Thames Valley, England, 85 Thatcher, Margaret, 154, 160, 163, 166 Thiel, Peter, 26, 149–51, 153, 156, 174, 190 Thirty Years War (1618–48), 34, 45, 53, 126 Tokyo, Japan, x torture, 92–3 total wars, 129, 142–3 Treaty of Westphalia (1648), 34, 53 trends, xvi, 168 trigger warnings, 22, 113 trolls, 18, 20–22, 27, 40, 123, 146, 148, 194–8, 199, 209 Trump, Donald, xiv and Bannon, 21, 60–61 and climate change, 207 and education, 85 election campaign (2016), see under presidential election, US and free trade, 79 and health, 92, 99 and immigration, 145 inauguration (2017), 3–5, 6, 9, 10 and inequality, 76–7 “Make America Great Again,” 76, 145 and March for Science (2017), 23, 24, 210 and media, 27 and opinion polling, 65, 80 and Paris climate accord, 207 and promises, 221 and relative deprivation, 88 and statistics, 63 and Yellen, 33 Tsipras, Alexis, 5 Turing, Alan, 181, 183 Twitter and Corbyn’s rallies, 6 and JFK Airport terror scare (2016), x and Oxford Circus terror scare (2017), ix–x and Russia, 18 and sentiment analysis, 188 and trends, xvi and trolls, 194, 195 Uber, 49, 185, 186, 187, 188, 191, 192 UK Independence Party, 65, 92, 202 underemployment, 82 unemployment, 61, 62, 72, 78, 81–3, 87, 88, 203 United Kingdom austerity, 100 Bank of England, 32, 33, 64 Blitz (1940–41), 119, 143, 180 Brexit (2016–), see under Brexit Cameron government (2010–16), 33, 73, 100 Center for Policy Studies, 164 Civil Service, 33 climate-gate (2009), 195 Corbyn’s rallies, 5, 6 Dunkirk evacuation (1940), 119 education, 85 financial crisis (2007–9), 29–32, 100 first past the post, 13 general election (2015), 80, 81 general election (2017), 6, 65, 80, 81, 221 Grenfell Tower fire (2017), 10 gross domestic product (GDP), 77, 79 immigration, 63, 65 Irish hunger strike (1981), 43 life expectancy, 100 National Audit Office (NAO), 29 National Health Service (NHS), 30, 93 Office for National Statistics, 63, 133 and opiates, 105 Oxford Circus terror scare (2017), ix–x, xiii, 41 and pain, 102, 105 Palantir, 151 Potsdam Conference (1945), 138 quantitative easing, 31–2 Royal Society, 138 Scottish independence referendum (2014), 64 Skripal poisoning (2018), 43 Society for Freedom in Science, 163 Thatcher government (1979–90), 154, 160, 163, 166 and torture, 92 Treasury, 61, 64 unemployment, 83 Unite for Europe march (2017), 23 World War II (1939–45), 114, 119, 138, 143, 180 see also England United Nations, 72, 222 United States Bayh–Dole Act (1980), 152 Black Lives Matter, 10, 225 BP oil spill (2010), 89 Bush Jr. administration (2001–9), 77, 136 Bush Sr administration (1989–93), 77 Bureau of Labor, 74 Central Intelligence Agency (CIA), 3, 136, 151, 199 Charlottesville attack (2017), 20 Civil War (1861–5), 105, 142 and climate change, 207, 214 Clinton administration (1993–2001), 77 Cold War, see Cold War Defense Advanced Research Projects Agency (DARPA), 176, 178 Defense Intelligence Agency, 177 drug abuse, 43, 100, 105, 115–16, 131, 172–3 education, 85 Federal Bureau of Investigation (FBI), 137 Federal Reserve, 33 Fifth Amendment (1789), 44 financial crisis (2007–9), 31–2, 82, 158 first past the post, 13 Government Accountability Office, 29 gross domestic product (GDP), 75–7, 82 health, 92, 99–100, 101, 103, 105, 107, 115–16, 158, 172–3 Heritage Foundation, 164, 214 Iraq War (2003–11), 74, 132 JFK Airport terror scare (2016), x, xiii, 41 Kansas populists (1880s), 220 libertarianism, 15, 151, 154, 158, 164, 173 life expectancy, 100, 101 March For Our Lives (2018), 21 March for Science (2017), 23–5, 27, 28, 210 McCarthyism (1947–56), 137 Million-Man March (1995), 4 National Aeronautics and Space Administration (NASA), 23, 175 National Defense Research Committee, 180 National Park Service, 4 National Security Agency (NSA), 152 Obama administration (2009–17), 3, 24, 76, 77, 79, 158 Occupy Wall Street (2011), 5, 10, 61 and opiates, 105, 172–3 and pain, 103, 105, 107, 172–3 Palantir, 151, 152, 175, 190 Paris climate accord (2015), 205, 207 Parkland attack (2018), 21 Patriot Act (2001), 137 Pentagon, 130, 132, 135, 136, 214, 216 presidential election (2016), see under presidential election, US psychiatry, 107, 111 quantitative easing, 31–2 Reagan administration (1981–9), 15, 77, 154, 160, 163, 166 Rumsfeld’s “unknown unknowns” speech (2002), 132 Semi-Automatic Ground Environment (SAGE), 180, 182, 200 September 11 attacks (2001), 17, 18 Tea Party, 32, 50, 61, 221 and torture, 93 Trump administration (2017–), see under Trump, Donald unemployment, 83 Vietnam War (1955–75), 111, 130, 136, 138, 143, 205 World War I (1914–18), 137 World War II (1939–45), 137, 180 universal basic income, 221 universities, 151–2, 164, 169–70 University of Cambridge, 84, 151 University of Chicago, 160 University of East Anglia, 195 University of Oxford, 56, 151 University of Vienna, 160 University of Washington, 188 unknown knowns, 132, 133, 136, 138, 141, 192, 212 unknown unknowns, 132, 133, 138 “Use of Knowledge in Society, The” (Hayek), 161 V2 flying bomb, 137 vaccines, 23, 95 de Vauban, Sébastien Le Prestre, Marquis de Vauban, 73 vehicle-ramming attacks, 17 Vesalius, Andreas, 96 Vienna, Austria, 153–5, 159 Vietnam War (1955–75), 111, 130, 136, 138, 143, 205 violence vs. power, 19, 219 viral marketing, 12 virtual reality, 183 virtue signaling, 194 voice recognition, 187 Vote Leave, 50, 93 Wainright, Joel, 214 Wales, 77, 90 Wall Street, New York, 33, 190 War College, Berlin, 128 “War Economy” (Neurath), 153–4 war on drugs, 43, 131 war on terror, 131, 136, 196 Watts, Jay, 115 weaponization, 18–20, 22, 26, 75, 118, 123, 194, 219, 223 weapons of mass destruction, 132 wearable technology, 173 weather control, 204 “What Is An Emotion?”

Data and the City
by Rob Kitchin,Tracey P. Lauriault,Gavin McArdle
Published 2 Aug 2017

Matsuda (eds), Personal, Portable, Pedestrian: Mobile Phones in Japanese Life. Cambridge, MA: MIT Press, pp. 257–276. Pallot, M., Trousse, B., Senach, B. and Scapin, D. (2010) ‘Living Lab research landscape: From user centred design and user experience towards user cocreation’, in First European Summer School ‘Living Labs’, Paris. Pariser, E. (2011) The Filter Bubble: What the Internet is Hiding from You. New York: Penguin. Park, R. (1969) ‘The city: Suggestions for investigation of human behavior in the urban environment’, in R. Sennett (ed.), Classic Essays on the Culture of Cities. New York: Appleton-Century-Crofts, pp. 91–130. Ratti, C. and Townsend, A. (2011) ‘The social nexus’, Scientific American 305(3): 42–48.

pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity
by Amy Webb
Published 5 Mar 2019

When the internet shifted from academia and government to the private sector in the 1990s, we let it propagate freely instead of treating it like a regulated utility or financial system. Back then, lawmakers didn’t think much about how all the data we’d generate on the internet might be used. So now it’s impossible to comply with every legal permutation while our previous filter bubbles expanded to fit geographic borders. This helped the promotion and propagation of fake news. Because bad actors are using generative algorithms, and because depending on region, we’re all getting different versions of news content, we don’t know what or whom to trust. Every one of the world’s most venerable news organizations has been tricked more than once, as trained journalists have a difficult time verifying videos of global leaders and everyday people alike.

pages: 418 words: 102,597

Being You: A New Science of Consciousness
by Anil Seth
Published 29 Aug 2021

At the extreme end of the spectrum, what horror could be unleashed if an AI system were put in charge of nuclear weapons, or of the internet backbone? There are also ethical concerns about the psychological and behavioural consequences of AI and machine learning. Privacy invasion by deepfakes, behaviour modification by predictive algorithms, and belief distortion in the filter bubbles and echo chambers of social media are just a few of the many forces that pull at the fabric of our societies. By unleashing these forces we are willingly ceding our identities and autonomy to faceless data-corporations in a vast uncontrolled global experiment. Against this background, ethical discussions about machine consciousness can appear indulgent and abstruse.

pages: 371 words: 107,141

You've Been Played: How Corporations, Governments, and Schools Use Games to Control Us All
by Adrian Hon
Published 14 Sep 2022

“Hyperlocal” apps like Nextdoor have been implicated in racial profiling by hosting hostile discussions about people of colour spotted in the neighbourhood; it would be easy for gamified AR equivalents to make profiling individuals even faster and more rewarding by upvoting and downvoting anyone in sight.3 And when dating and networking are already treated like games by pickup artists and LinkedIn, it’s inevitable we’ll see an even broader gamification of real-life social interactions. Most worryingly, AR will allow anyone—including conspiracy theorists and cults—to alter reality to suit their beliefs, like a supercharged ARG. With political polarisation on the rise in the West, it feels as if we already inhabit separate online worlds and “filter bubbles” from each other. Extending that divide to the real world bodes ill for all societies. After their abortive attempts to win the consumer AR market with half-baked hardware, Microsoft and Google have focused their ambitions on the government and workplace.4 Microsoft’s gamification of office and productivity software will undoubtedly become part of its wide-ranging AR plans, spanning virtual meetings and collaborative workspaces.

pages: 389 words: 119,487

21 Lessons for the 21st Century
by Yuval Noah Harari
Published 29 Aug 2018

Stroebel, ‘Father–Daughter Incest: Data from an Anonymous Computerized Survey’, Journal of Child Sexual Abuse 21:2 (2010), 176–99. 15. Ignorance 1 Steven A. Sloman and Philip Fernbach, The Knowledge Illusion: Why We Never Think Alone (New York: Riverhead Books, 2017); Greene, Moral Tribes, op. cit. 2 Sloman and Fernbach, The Knowledge Illusion, op. cit., 20. 3 Eli Pariser, The Filter Bubble (London: Penguin Books, 2012); Greene, Moral Tribes, op. cit. 4 Greene, Moral Tribes, op. cit.; Dan M. Kahan, ‘The Polarizing Impact of Science Literacy and Numeracy on Perceived Climate Change Risks’, Nature Climate Change 2 (2012), 732–5. But for a contrary view, see Sophie Guy et al., ‘Investigating the Effects of Knowledge and Ideology on Climate Change Beliefs’, European Journal of Social Psychology 44:5 (2014), 421–9. 5 Arlie Russell Hochschild, Strangers in Their Own Land: Anger and Mourning on the American Right (New York: The New Press, 2016). 16.

pages: 1,172 words: 114,305

New Laws of Robotics: Defending Human Expertise in the Age of AI
by Frank Pasquale
Published 14 May 2020

When so much of the circulation of media has been automated, new methods of structuring communication may be necessary to maintain community, democracy, and social consensus about basic facts and values. As Mark Andrejevic’s compelling work demonstrates, automated media “poses profound challenges to the civic disposition required for self-government.”6 These concerns go well beyond the classic “filter bubble” problem, to the habits of mind required to make any improvement in new media actually meaningful in the public sphere. The new laws of robotics also address the larger political economy of media. The first new law of robotics commends policies that maintain journalists’ professional status (seeing AI as a tool to help them rather than replace them).7 Content moderators need better training, pay, and working conditions if they are to reliably direct AI in new social media contexts.

pages: 364 words: 119,398

Men Who Hate Women: From Incels to Pickup Artists, the Truth About Extreme Misogyny and How It Affects Us All
by Laura Bates
Published 2 Sep 2020

Videos about jogging led to videos about running ultramarathons.’10 A Wall Street Journal investigation revealed the same phenomenon.11 Of course, for those looking for videos about fun dance moves, say, or cookery techniques, this is a relatively harmless pattern. But, for impressionable young people who start out looking at quite mainstream political content, it has much more serious implications. Chaslot told the Daily Beast he very quickly realised that ‘YouTube’s recommendation was putting people into filter bubbles… There was no way out.’ In a 2019 New York Times interview, YouTube’s chief product officer, Neal Mohan, denied that the platform created a ‘rabbit hole’ effect, saying that it offered a full spectrum of content and opinion, and that watch time was not the only feature used by the site’s recommendation systems.

pages: 370 words: 112,809

The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future
by Orly Lobel
Published 17 Oct 2022

Other uses would require companies to report risk assessments and explanations of how the algorithms are making decisions, including safeguarding the technology through ongoing human oversight. Consumers also would have a right to see disclosures that they are chatting with or seeing images produced by AI. In the United States, a new bipartisan bill, the Filter Bubble Transparency Act, would require the largest platforms to provide greater transparency on algorithmic processes and allow users to view content without secret curation by algorithms. France and the United Kingdom have plans to require that all algorithms used by the government are disclosed to the public.

pages: 538 words: 121,670

Republic, Lost: How Money Corrupts Congress--And a Plan to Stop It
by Lawrence Lessig
Published 4 Oct 2011

I don’t pretend to offer any solution to bad faith, though as I emphasize in “Against Transparency” (New Republic, Oct. 9, 2009), the most obvious solution is to eliminate the suggestion that there may be a conflict. 27. Florence T. Bourgeois, Srinivas Murthy, and Kenneth D. Mandl, “Outcome Reporting Among Drug Trials Registered in ClinicalTrials.gov,” Annals of Internal Medicine 153 no. 3 (Aug. 3, 2010): 158–66, 159, available at link #14. 28. Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (forthcoming, New York: Penguin Press, 2011), 28. 29. Top 1000 Sites—DoubleClick Ad Planner, available at link #15. The $150 million is calculated as follows: $1 per thousand page views, an estimated fourteen billion page views per month, times twelve months is at least $150 million. 30.

pages: 742 words: 137,937

The Future of the Professions: How Technology Will Transform the Work of Human Experts
by Richard Susskind and Daniel Susskind
Published 24 Aug 2015

Palfrey, John, and Urs Gasser, Born Digital (New York: Basic Books, 2008). Paliwala, Abdul (ed.), A History of Legal Informatics (Zaragoza: Prensas Universitarias de Zaragoza, 2010). Panel on Fair Access to the Professions, Unleashing Aspiration (London: Cabinet Office, 2009). Parfit, Derek, Reasons and Persons (Oxford: Clarendon Press, 1987). Pariser, Eli, The Filter Bubble (London: Penguin Books, 2012). Parsons, Matthew, Effective Knowledge Management for Law Firms (New York: Oxford University Press, 2004). Parsons, Talcott, ‘The Professions and Social Structure’, Social Forces, 17: 4 (1939), 457–67. Parsons, Talcott, The Social System (New York: Free Press, 1951).

pages: 533

Future Politics: Living Together in a World Transformed by Tech
by Jamie Susskind
Published 3 Sep 2018

CoinDesk, 3 Mar. 2016 <http://www.coindesk.com/blockchain-startup-aims-to-secure-1-million-estonian-health-records/> (accessed 30 Nov. 2017). Papacharissi, Zizi A. A Private Sphere: Democracy in a Digital Age. Cambridge: Polity Press, 2013. Parijs, Philippe van, and Yannick Vanderborght. Basic Income: A Radical Proposal for a Free Society and a Sane Economy. Cambridge, Mass: Harvard, 2017. Pariser, Eli. The Filter Bubble: What the Internet is Hiding from You. London: Penguin, 2011. Pasquale, Frank. The Black Box Society: The Secret Algorithms that Control Money and Information. Cambridge, Mass: Harvard University Press, 2015. Pasquale, Frank. ‘From Holocaust Denial to Hitler Admiration, Google’s Algorithm is Dangerous’.

pages: 788 words: 223,004

Merchants of Truth: The Business of News and the Fight for Facts
by Jill Abramson
Published 5 Feb 2019

All the while, the site’s content-serving algorithm bore the increasing burden of matching coverage from various viewpoints to the right users in order to keep everyone happy, or at least engaged, during what was shaping up to be a bitterly divisive 2016 presidential campaign. As the national narrative frayed into contradictory story lines, Facebook gave news providers an ever greater ability to cater to the camp that already bought into their spin. These were the so-called filter bubbles that Eli Pariser had warned about. They weakened democracy by causing the base of common knowledge and truth to wither away, replaced by intensely siloed and refined bands of information designed to appeal to ideological leanings. All the while, Facebook was increasing the competition among news outlets.

pages: 898 words: 236,779

Digital Empires: The Global Battle to Regulate Technology
by Anu Bradford
Published 25 Sep 2023

Instead of nurturing inclusive democracy, online engagement has often increased societal polarization. Cass Sunstein has warned about the perils of polarization resulting from online platforms delivering a highly “personalized experience” for each user.174 Social media has become a venue characterized by filter bubbles and “information cocoons” where citizens are no longer exposed to alternative viewpoints.175 This feeds social divisions and nurtures more extremist ideas. Deprived of shared conversations and experiences, a society cannot address social problems collectively. This undermines the potential for the internet to be a genuine public sphere where a conversation enhances understanding and paves the way for compromises.176 Social media not only feeds polarization, but also risks simply lowering the quality of information that citizens consume.