Filter Bubble


description: a state of intellectual isolation that can result from personalized search and recommendation algorithms

127 results

The Filter Bubble: What the Internet Is Hiding From You

by Eli Pariser  · 11 May 2011  · 274pp  · 75,846 words

You Want, Whether You Want It or Not · Chapter 8: Escape from the City of Ghettos · Acknowledgements · Further Reading · Notes · Index · Advance Praise for The Filter Bubble: “Internet firms increasingly show us less of the wide world, locating us in the neighborhood of the familiar. The risk, as Eli Pariser shows,

and more tailored to our needs, all the time. The risk, Eli Pariser reveals, is that we increasingly won’t see other perspectives. In The Filter Bubble, he shows us how the trend could reinforce partisan and narrow mindsets, and points the way to a greater online diversity of perspective.” —Craig Newmark

light on so many of our daily encounters.” —Bill McKibben, author of The End of Nature and Eaarth and founder of 350.org “The Filter Bubble shows how unintended consequences of well-meaning online designs can impose profound and sudden changes on politics. All agree that the Internet is a potent

smartest person I know thinking about the relationship of digital technology to participation in the democratic process—he is also the most experienced. The Filter Bubble reveals how the world we encounter is shaped by programs whose very purpose is to narrow what we see and increase the predictability of our

if it is not showing up in your recommended reads on Amazon.” —Douglas Rushkoff, author of Life Inc. and Program or Be Programmed “In The Filter Bubble, Eli Pariser reveals the news slogan of the personalized Internet: Only the news that fits you we print.” —George Lakoff, author of Don’t

who were both). Throughout my investigation, I was struck by the lengths one has to go to in order to fully see what personalization and filter bubbles do. When I interviewed Jonathan McPhie, Google’s point man on search personalization, he suggested that it was nearly impossible to guess how the

practice: Personalization is already much more a part of our daily experience than many of us realize. We can now begin to see how the filter bubble is actually working, where it’s falling short, and what that means for our daily lives and our society. Every technology has an interface,

powerful position, Calo says. “There are lots of ways for it to skew your perception of the world.” And that’s precisely what the filter bubble does. THE FILTER BUBBLE’S costs are both personal and cultural. There are direct consequences for those of us who use personalized filters (and soon enough, most of

us will, whether we realize it or not). And there are societal consequences, which emerge when masses of people begin to live a filter-bubbled life. One of the best ways to understand how filters shape our individual experience is to think in terms of our information diet. As sociologist

amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown. In the filter bubble, there’s less room for the chance encounters that bring insight and learning. Creativity is often sparked by the collision of ideas from different disciplines

always warranted, and when decisions are made on the basis of this data that affect you negatively, they’re usually not revealed. Ultimately, the filter bubble can affect your ability to choose how you want to live. To be the author of your life, professor Yochai Benkler argues, you have to

data companies with a mission of churning out information about their readers’ preferences—unless, in other words, they could adapt themselves to the personalized, filter-bubble world—they were sunk. NEWS SHAPES OUR sense of the world, of what’s important, of the scale and color and character of our problems

present system has a sense of ethics and public responsibility baked in, however imperfectly. But though it’s playing some of the same roles, the filter bubble does not. A New Middleman New York Times critic Jon Pareles calls the 2000s the disintermediation decade. Disintermediation—the elimination of middlemen—is “the thing

to fade, and its front page mixes the articles the group thinks are most important with your personal preferences and behavior—a marriage of the filter bubble and the most-popular list. Las Últimas Noticias, a major paper in Chile, began basing its content entirely on what readers clicked on in

a scary extreme, it’s surrender.” Of Apple and Afghanistan Google News pays more attention to political news than many of the creators of the filter bubble. After all, it draws in large part on the decisions of professional editors. But even in Google News, stories about Apple trump stories about

things are of similar importance to developments in Afghanistan. But this Apple-centric ranking is indicative of what the combination of popular lists and the filter bubble will leave out: Things that are important but complicated. “If traffic ends up guiding coverage,” the Washington Post’s ombudsman writes, “will The Post

a funhouse mirror reflecting a funhouse mirror reflecting reality. This distorting effect is one of the challenges posed by personalized filters. Like a lens, the filter bubble invisibly transforms the world we experience by controlling what we see and don’t see. It interferes with the interplay between our mental processes and

come anywhere close. In two important ways, personalized filters can upset this cognitive balance between strengthening our existing ideas and acquiring new ones. First, the filter bubble surrounds us with ideas with which we’re already familiar (and already agree), making us overconfident in our mental frameworks. Second, it removes from our

world and the world to fit our schemata, and it’s in properly balancing the two processes that growth occurs and knowledge is built. The filter bubble tends to dramatically amplify confirmation bias—in a way, it’s designed to. Consuming information that conforms to our ideas of the world is

likely to follow political news. Therefore, people with more education can actually become mis-educated.” And while this phenomenon has always been true, the filter bubble automates it. In the bubble, the proportion of content that validates what you know goes way up. Which brings us to the second way the

a result we become curious about its contents. But to feel curiosity, we have to be conscious that something’s being hidden. Because the filter bubble hides things invisibly, we’re not as compelled to learn about what we don’t know. As University of Virginia media studies professor and Google

Adderall society, in which hyperfocus displaces general knowledge and synthesis. Personalization can get in the way of creativity and innovation in three ways. First, the filter bubble artificially limits the size of our “solution horizon”—the mental space in which we search for solutions to problems. Second, the information environment inside the

new ideas in some environments than in others; the contexts that filtering creates aren’t the ones best suited to creative thinking. Finally, the filter bubble encourages a more passive approach to acquiring information, which is at odds with the kind of exploration that leads to discovery. When your doorstep is

become powerful enough to win against the grand masters of chess. Narrowing the solution horizon, in other words, was key. In a way, the filter bubble is a prosthetic solution horizon: It provides you with an information environment that’s highly relevant to whatever problem you’re working on. Often, this

requires the bisociation of ideas that are indirectly related—as when Page applied the logic of academic citation to the problem of Web search—the filter bubble may narrow your vision too much. What’s more, some of the most important creative breakthroughs are spurred by the introduction of the entirely

Pasteurs of the world often have no idea what they’re looking for. The biggest breakthroughs are sometimes the ones that we least expect. The filter bubble still offers the opportunity for some serendipity, of course. If you’re interested in football and local politics, you might still see a story

filter, it’s nearly impossible to sort the usefully serendipitous and randomly provocative from the just plain irrelevant. The second way in which the filter bubble can dampen creativity is by removing some of the diversity that prompts us to think in new and innovative ways. In one of the standard

away from entries on LSD, Teflon, Parkinson’s disease, Sri Lanka, Isaac Newton, and about two hundred other topics of comparable diversity.” But the filter bubble has dramatically changed the informational physics that determines which ideas we come in contact with. And the new, personalized Web may no longer be as

—the way one can hop from article to article on Wikipedia—are friendly to the divergent part of that process. But the rise of the filter bubble means that increasingly the convergent, synthetic part of the process is built in. Battelle calls Google a “database of intentions,” each query representing something

too few—you can find yourself overwhelmed by the number of options or paralyzed by the paradox of choice. But the basic point remains: The filter bubble doesn’t just reflect your identity. It also illustrates what choices you have. Students who go to Ivy League colleges see targeted advertisements for

personal feeds of professional scientists might feature articles about contests that amateurs never become aware of. By illustrating some possibilities and blocking out others, the filter bubble has a hand in your decisions. And in turn, it shapes who you become. A Bad Theory of You The way that personalization shapes identity

bias, mixing “should” stories with “want” stories and encouraging us to dig into the difficult but rewarding work of understanding complex problems. But the filter bubble tends to do the opposite: Because it’s our present self that’s doing all the clicking, the set of preferences it reflects is necessarily

do—when they are able to accurately gauge the workings of your psyche—things get even weirder. Targeting Your Weak Spots The logic of the filter bubble today is still fairly rudimentary: People who bought the Iron Man DVD are likely to buy Iron Man II; people who enjoy cookbooks will

The Godfather: Part II, they’ll like The Godfather: Part III. But the overfitting problem gets to one of the central, irreducible problems of the filter bubble: Overfitting and stereotyping are synonyms. The term stereotyping (which in this sense comes from Walter Lippmann, incidentally) is often used to refer to malicious xenophobic

opinions outright, it’ll increasingly revolve around second-order censorship—the manipulation of curation, context, and the flow of information and attention. And because the filter bubble is primarily controlled by a few centralized companies, it’s not as difficult to adjust this flow on an individual-by-individual basis as you

places, we might be able to see traces of this kind of thing happening now—sentiment being algorithmically shifted over time. But if the filter bubble might make shifting perspectives easier in a future Iraq or Panama, Rendon was clearly concerned about the impact of self-sorting and personalized filtering for

the obvious exceptions. But even putting aside intentional manipulation, the rise of filtering has a number of unintended yet serious consequences for democracies. In the filter bubble, the public sphere—the realm in which common problems are identified and addressed—is just less relevant. For one thing, there’s the problem of

, while conversations that could introduce me to new ideas are obscured. Of course, friendly doesn’t describe all of the stories that pierce the filter bubble and shape our sense of the political world. As a progressive political news junkie, I get plenty of news about Sarah Palin and Glenn Beck

found that stories that aroused strong feelings—awe, anxiety, anger, happiness—were much more likely to be shared. If television gives us a “mean world,” filter bubbles give us an “emotional world.” One of the troubling side effects of the friendly world syndrome is that some important public problems will disappear. Few

cruise line industry” that was in violation of their general guidelines about taste. Apparently, advertisers that implicated corporations in public issues weren’t welcome. The filter bubble will often block out the things in our society that are important but complex or unpleasant. It renders them invisible. And it’s not just

advertising is half a decade behind the state of the art in commercial advertising, most of this change is still to come. But for starters, filter-bubble politics could effectively make even more of us into single issue voters. Like personalized media, personalized advertising is a two-way street: I may

up the cost of these ads, making it too costly for campaigns to ever engage the other side. The most serious political problem posed by filter bubbles is that they make it increasingly difficult to have a public argument. As the number of different segments and messages increases, it becomes harder

build. To paraphrase Spider-Man creator Stan Lee, with great power comes great responsibility. But the programmers who brought us the Internet and now the filter bubble aren’t always game to take on that responsibility. The Hacker Jargon File, an online repository of geek culture, puts it this way: “Hackers

’s a future where advertisers will develop ever more powerful and reality-bending ways to make sure their products are seen. The days when the filter bubble disappears when we step away from our computers, in other words, are numbered. The Robot with Gaydar Stanford Law professor Ryan Calo thinks a

the end of naive empiricism, of the world as we see it, and the beginning of something far more mutable and weird: a real-world filter bubble that will be increasingly difficult to escape. Losing Control There’s plenty to love about this ubiquitously personalized future. Smart devices, from vacuum cleaners

s showing up and why, whereas Facebook makes it nearly impossible. All other things being equal, if you’re concerned about having control over your filter bubble, better to use services like Twitter than services like Facebook. We live in an increasingly algorithmic society, where our public functions, from police databases to

literate enough to understand what most basic bits of code are doing. Changing our own behavior is a part of the process of bursting the filter bubble. But it’s of limited use unless the companies that are propelling personalization forward change as well. What Companies Can Do It’s understandable

the pressure of monetization pulls them in a different direction. What Governments and Citizens Can Do There’s plenty that the companies that power the filter bubble can do to mitigate the negative consequences of personalization—the ideas above are just a start. But ultimately, some of these problems are too

Outnumbered: From Facebook and Google to Fake News and Filter-Bubbles – the Algorithms That Control Our Lives

by David Sumpter  · 18 Jun 2018  · 276pp  · 81,153 words

was making racist autocomplete suggestions; Twitterbots were spreading fake news; Stephen Hawking was worried about artificial intelligence; far-right groups were living in algorithmically created filter-bubbles; Facebook was measuring our personalities, and these were being exploited to target voters. One after another, the stories of the dangers of algorithms accumulated. Even

media outlets that ran mathsy-sounding stories about the isolation and polarisation created by algorithms. To start with, the discussion was of echo chambers and filter bubbles. The theory was that Facebook and Google were personalising our searches to such an extent that we only saw what we wanted to see. The

lot of my colleagues seemed to think so. I wasn’t so sure. Long before academics and journalists were worrying about the Trump and Clinton filter bubbles, two young computer scientists were already looking at how political campaigns shaped and were shaped by the Internet. In 2004, Lada Adamic and Natalie Glance

provided by Twitter. When commentators analyse this vast array of social media they usually come back to two key themes: the echo chamber and the filter bubble. These concepts are related but slightly different. The 2004 political blogs network is a primitive example of an echo chamber. Bloggers linked to other bloggers

20 clicks later you will probably still be reading conservative material. Each set of bloggers had created their own world, within which their views reverberated. Filter bubbles came later and are still developing. The difference between ‘filtered’ and ‘echoed’ cavities lies in whether they are created by algorithms or by people. While

, our web searches and our browsing history do not involve an active choice on our part. It is these algorithms that can potentially create a filter bubble.1 Each action you make in your web browser is used to decide what to show you next. Whenever you share an article from, for

actually means it has done a principal component analysis of your ‘likes’). So while the model shows that there is a risk that Facebook creates filter bubbles, it does not prove that all its users are trapped in bubbles. I wanted to find out how well the simplified filter model captures reality

she took a sabbatical to work at Facebook, where she ended up staying as a data scientist. Working at Facebook allowed Lada to test the filter bubble hypothesis on mainstream politics. Together with two other Facebook scientists, Lada looked at the network of connections between politically affiliated friends on Facebook. Friendship networks

to our own opinions. Moreover, American conservatives, a group often accused of being closed-minded, were exposed to slightly more contradictory opinions than liberals. The filter bubble test was published in the leading scientific journal Science. It was just one example of how, during the first half of this decade, Facebook took

fellow maths-football nerds about the game, and the maths and stats used to analyse it. This small part of the Twitterverse is both a filter bubble and an echo chamber. I know that Twitter filters my feed so that the biggest geeks are at the top of my home page. Other

pp. 461–6. IEEE. Chapter 11: Bubbling Up 1 In his book and TED Talk on the filter bubble, Eli Pariser revealed the extent to which our online activities are personalised. Pariser, Eli. 2011. The Filter Bubble: How the new personalized web is changing what we read and how we think. Penguin. Google, Facebook

Lecture: Interdependence, Communication, and Aggregation: Transforming Voters into Electorates.’ PS: Political Science & Politics 50.1: 3–11. 3 DiFranzo, D. and Gloria-Garcia, K. 2017. ‘Filter bubbles and fake news.’ XRDS: Crossroads, The ACM Magazine for Students 23, no. 3: 32–5. 4 Jackson, D., Thorsen, E. and Wring, D. 2016. ‘EU


The Seven Rules of Trust: A Blueprint for Building Things That Last

by Jimmy Wales  · 28 Oct 2025  · 216pp  · 60,419 words

, the democratization of content. I liked the idea that anybody could contribute. I live on the Upper West Side of Manhattan, in the most extreme filter bubble there is in the world. The fact that I was able to see videos people made at home, on YouTube, I thought it was fantastic

Disrupted: My Misadventure in the Start-Up Bubble

by Dan Lyons  · 4 Apr 2016  · 284pp  · 92,688 words

also to the mindset of the people working inside technology companies, the true believers and Kool-Aid drinkers, the people who live inside their own filter bubble, brimming with self-confidence and self-regard, impervious to criticism, immunized against reality, unaware of how ridiculous they appear to the outside world. HubSpot, where

Boom: Bubbles and the End of Stagnation

by Byrne Hobart and Tobias Huber  · 29 Oct 2024  · 292pp  · 106,826 words

innovation, bubbles are uniquely suited to incubate and accelerate future technologies that can break through stagnation and accelerate growth. A taxonomy of bubbles: Speculative versus filter bubbles Looking at bubbles more closely, we can differentiate between two kinds of bubbles, although they are less distinct than they might initially seem. One kind

justification for asset prices is the expectation that someone else will be willing to pay even more. The other kind of bubble is a filter bubble. Participants in filter bubbles wall themselves off from opinions they disagree with and become increasingly convinced that their viewpoints reflect the one true way to understand the world

. Filter bubbles are generally seen as dangerous; indeed, they can be a source of conspiracy theories, misinformation, and pathological behavior. But they also represent a filtering out

they put enough resources to work on behalf of their vision, they just might attain it—or create something unexpected but still extraordinarily valuable. Even filter bubbles are capable of creating positive feedback loops. On the one hand, they can describe toxic circumstances, such as severe political partisanship. But a

also be an environment that leads to the pursuit of new ideas, despite widespread beliefs that they won’t work. Yes, QAnon exists in a filter bubble, but so did Moderna when it rejected the consensus that an mRNA vaccine was impractical in the short term. The sheer volume of information in

the world means that we need some kind of filter bubble just to keep things coherent. The real question is whether we’re filtering out good information or bad. Both speculative and

filter bubbles depend on information flows and feedback loops. In a financial bubble, price changes are taken as evidence that the theories behind the bubble are true.

often die down after a while, whether through internal contradictions or because the bubble needs an external source of energy to sustain itself. A political filter bubble can fall apart when it prompts participants to do obviously crazy things. (Readers are welcome to fill in their own example of what constitutes crazy

are hard at work doing the other nine-tenths. Mean-reversion versus inflection bubbles Both kinds of bubbles—the classic speculative financial bubble and the filter bubble—can lead to good and bad outcomes. Which outcome arises often depends on whether the bubble in question is a mean-reversion bubble or an

as Meta Platforms is that the company is committed to ruling the next platform it runs on. Likewise, regulations and narratives can be seen as filter bubbles subject to self-reinforcing dynamics, reflecting the perception that there is only one chance to get things right. In the 1990s, some of the Atari

; they can also be about behaviors, like getting more people in the organization to think about problems in the same way and creating a healthy filter bubble in which everyone can cooperate. Since bubbles are temporary phenomena with powerful long-term effects, you should fear missing out. Financial FOMO at its worst

“desktop” computer, the “what you see is what you get” (WYSIWYG) word processor, and Ethernet. Each of these innovative environments possessed elements of speculative and filter bubbles, enabling them to transform a vague vision of corporate destiny into a specific vision of the future. 237 All of these R&D labs benefited

deliberately kept secret, transformative corporate R&D projects always face an uphill climb from a skeptical public and worried competition. But that also induces the filter bubble phenomenon so necessary for transformative innovation. Some of the people working at OpenAI, Anthropic, and other AI companies feel like they’re part of a

Messing With the Enemy: Surviving in a Social Media World of Hackers, Terrorists, Russians, and Fake News

by Clint Watts  · 28 May 2018  · 324pp  · 96,491 words

hatred, or what might collectively be referred to as “preferences.” Eli Pariser, the head of the viral content website Upworthy, noted in his book The Filter Bubble the emergence and danger of social media and internet search engine algorithms selectively feeding users information designed to suit their preferences. Over time, these

echo chambers, blocking out alternative viewpoints and facts that don’t conform to the cultural and ideological preferences of users. Pariser recognized that filter bubbles would create “the impression that our narrow self-interest is all that exists.”1 The internet brought people together, but social media preferences have now

driven people apart through the creation of preference bubbles—the next extension of Pariser’s filter bubbles. Preference bubbles result not only from social media algorithms feeding people more of what they want, but also people choosing more of what they like

/01/465106713/as-isis-evolves-u-s-counter-efforts-must-advance-lumpkin-says. CHAPTER 9: FROM PREFERENCE BUBBLES TO SOCIAL INCEPTION 1. Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, Penguin Books (April 24, 2012). https://www.amazon.com

/Filter-Bubble-Personalized-Changing-Think/dp/0143121235. 2. Tom Nichols, “The Death Of Expertise,” The Federalist (January 17, 2014) http://thefederalist.com/2014/01/17/the-death-

99%: Mass Impoverishment and How We Can End It

by Mark Thomas  · 7 Aug 2019  · 286pp  · 79,305 words

are seeing a completely different set of news items than you are.21 The algorithm that worried Gardner also worried Eli Pariser, author of The Filter Bubble: What The Internet Is Hiding From You. As Pariser put it: Democracy requires citizens to see things from one another’s point of view, but

Zucked: Waking Up to the Facebook Catastrophe

by Roger McNamee  · 1 Jan 2019  · 382pp  · 105,819 words

way on their own. Users would think they were seeing a balance of content when in fact they were trapped in what Eli called a “filter bubble” created and enforced by algorithms. He hypothesized that giving algorithms gatekeeping power without also requiring civic responsibility would lead to unexpected, negative consequences. Other publishers

were jumping on board the personalization bandwagon. There might be no way for users to escape from filter bubbles. Eli’s conclusion? If platforms are going to be gatekeepers, they need to program a sense of civic responsibility into their algorithms. They need to

. I no longer had regular contact with Zuck, much less inside information. I was not up to speed on the engineering priorities that had created filter bubbles or about plans for monetizing them. But Eli’s talk percolated in my mind. There was no good way to spin

and Sheryl would have the sense not to use them in ways that would harm users. (You can listen to Eli Pariser’s “Beware Online ‘Filter Bubbles’” talk for yourself on TED.com.) Meanwhile, Facebook marched on. Google introduced its own social network, Google+, in June 2011, with considerable fanfare. By the

I would have liked to understand the problem. More than four years of relentless success at Facebook had bred overconfidence. The company was in a filter bubble of its own. Every day, there were more users, spending more time on the site, generating more revenue and earnings, which pushed the stock to

cues and feedback loops that would normally cause a bully to experience shunning or disgust by their peers are not present. Adults get locked into filter bubbles, which Wikipedia defines as “a state of intellectual isolation that can result from personalized searches when a website algorithm selectively guesses what information a user

to see based on information about the user, such as location, past click-behavior and search history.” Filter bubbles promote engagement, which makes them central to the business models of Facebook and Google. But filter bubbles are not unique to internet platforms. They can also be found on any journalistic medium that reinforces the

preexisting beliefs of its audience, while suppressing any stories that might contradict them. Partisan TV channels like Fox News and MSNBC maintain powerful filter bubbles, but they cannot match the impact of Facebook and Google because television is a one-way, broadcast medium. It does not allow for personalization, interactivity

, sharing, or groups. In the context of Facebook, filter bubbles have several elements. In the endless pursuit of engagement, Facebook’s AI and algorithms feed each of us a steady diet of content similar to

want sounds like a great idea, but it has at least one unfortunate by-product: filter bubbles. There is a high correlation between the presence of filter bubbles and polarization. To be clear, I am not suggesting that filter bubbles create polarization, but I believe they have a negative impact on public discourse and politics because

more time on the site. Once a person identifies with an extreme position on an internet platform, he or she will be subject to both filter bubbles and human nature. A steady flow of ideas that confirm beliefs will lead many users to make choices that exclude other ideas both online and

off. As I learned from Clint Watts, a national security consultant for the FBI, the self-imposed blocking of ideas is called a preference bubble. Filter bubbles are imposed by others, while a preference bubble is a choice. By definition, a preference bubble takes users to a bad place, and they may

change. Preference bubbles can be all-encompassing, especially if a platform like Facebook or Google amplifies them with a steady diet of reinforcing content. Like filter bubbles, preference bubbles increase time on site, which is a driver of revenue. In a preference bubble, users create an alternative reality, built around values shared

ones that are incontrovertible. This is how a large minority of Americans abandoned newspapers in favor of talk radio and websites that peddle conspiracy theories. Filter bubbles and preference bubbles undermine democracy by eliminating the last vestiges of common ground among a huge percentage of Americans. The tribe is all that matters

ensure that users who like one piece of disinformation will be fed more disinformation. Fed enough disinformation, users will eventually wind up first in a filter bubble and then in a preference bubble. If you are a bad actor and you want to manipulate people in a preference bubble, all you have

even aware of Tristan’s ideas. They certainly had not offered an invitation to speak. Then a miracle occurred. Eli Pariser, whose legendary presentation on filter bubbles had mesmerized the TED audience in 2011, independently suggested to TED curator Chris Anderson that he add Tristan to the program. And so it happened

. For those who engaged frequently with the Group, the effect would be to make beliefs more rigid and more extreme. The Group would create a filter bubble, where the troll, the bots, and the other members would coalesce around an idea floated by the troll. We also shared a hypothesis that the

that the Russians—and Trump—would have focused on activating a minority—Trump’s base—while simultaneously suppressing the vote of the majority. Facebook’s filter bubbles and Groups would have made that job relatively easy. We didn’t know how much impact the Russians would have had. However, roughly four million

, “What should we do?” Tristan didn’t miss a beat. “Hold a hearing and make Mark Zuckerberg testify under oath. Make him justify profiting from filter bubbles, brain hacking, and election interference.” Washington needed help dealing with the threat of interference through social networks, and Senator Warner asked that we support him

sport, philosophy, and idea. The ones built around extreme ideas—disinformation, fake news, conspiracy theories, hate speech—become filter bubbles, reinforcing the shared value, intensifying emotional attachments to it. Thanks to Groups and filter bubbles, inflammatory posts can reach huge numbers of like-minded people on Facebook with only a little spending. The Russian

go for the mass market, which means Facebook. The Russians would have placed the story in the Facebook Groups they controlled, counting on Facebook’s filter bubbles to ensure widespread acceptance of the veracity of the story, as well as widespread sharing. Trolls and bots help, but the most successful disinformation and

he, in his own words, “self-investigated it” and fired three bullets into the pizza parlor. How can that happen? Filter bubbles. What differentiates filter bubbles from normal group activity is intellectual isolation. Filter bubbles exist wherever people are surrounded by people who share the same beliefs and where there is a way to keep out

beliefs. They prey on trust and amplify it. They can happen on television when the content is ideologically extreme. Platforms have little incentive to eliminate filter bubbles because they improve metrics that matter: time on site, engagement, sharing. They can create the illusion of consensus where none exists. This was particularly true

Russian interference, when the use of trolls and bots would have increased the illusion of consensus for human members of affected Facebook Groups. People in filter bubbles can be manipulated. They share at least one core value with the members of their group. Having a shared value fosters trust in the other

this is possible because users trust what they find on social media. They trust it because it appears to originate from friends and, thanks to filter bubbles, conforms to each user’s preexisting beliefs. Each user has his or her own Truman Show, tailored to press emotional buttons, including those associated with

fear and anger. While endless confirmation of their preexisting beliefs sounds appealing, it undermines democracy. The Russians exploited user trust and filter bubbles to sow discord, to reduce faith in democracy and government, and, ultimately, to favor one candidate over the other. The Russian interference succeeded beyond any

and continues to succeed because many key stakeholders in our government have been slow to acknowledge it, taking no meaningful steps to prevent a recurrence. Filter bubbles and preference bubbles undermine critical thinking. Worse still, the damage can persist even if the user abandons the platform that helped to foster them. Listening

the world’s population. On its best day, IBM’s monopoly was limited to governments and the largest corporations. Thanks to brain hacking and the filter bubbles that result from it, Facebook’s influence over consumers may be greater than any single business before it. Persuading 40 percent of the world’s

downsides. The design of devices should deliver utility without dependence. The design of applications and platforms should respect the user, limiting the effects of existing filter bubbles and preventing new ones from forming. Done right, every internet platform would be a new bicycle for the mind. In data privacy, a really useful

responsibility for third-party content on the platform, they had the effect of promoting the primary elements of filter bubbles—family, friends, and Groups—at the expense of the content most likely to pierce a filter bubble, journalism. The changes seemed like a step backward. Had they been implemented in 2015, they would likely

to News Feed in January that reduced the weight of journalistic sources almost certainly made at least one aspect of preventing interference—the piercing of filter bubbles—much harder. Zuck agreed to testify at two congressional hearings, one a joint session of the Senate Judiciary and Commerce committees, the other with the House

Facebook has become the most important platform for news and politics. The impact of Facebook on public discourse is unprecedented, thanks to its Truman Shows, filter bubbles, and manipulation of attention. No one elected the employees of Facebook, but their actions can have a decisive impact on our democracy. In almost every

far too often. Only the well-connected or highly visible can recover from these mistakes. And then there is the impact of Truman Shows and filter bubbles, the primary consequence of which seems to be polarization. As amplifiers of disinformation and hate speech, Facebook, Google, and Twitter have effectively abridged the rights

Cambodia. Facebook remains a threat to public health. Users get addicted. They get jealous when friends show off their beautiful lives. They get stuck in filter bubbles and, in some cases, preference bubbles. Facebook helps them get into these states, but it cannot get them out. Preference bubbles redefine identity. To break

regulations to level the economic playing field would require years to implement. One of the really important steps is to require the option of a filter bubble–free view of Facebook News Feed and Google search results. What I have in mind is a button on every page that enables users to

look like? There are opportunities as far as the eye can see. Human-driven social networks would enable sharing with friends, but without massive surveillance, filter bubbles, and data insecurity. They would require a different business model, perhaps based on subscriptions. Given the ubiquity of social media and the threats posed by

other app they offer. They acquire your credit card data, as well as data from other offline sources. They use artificial intelligence to create a filter bubble of things you like in your search results. They use their sea of data to crush competitors. Google’s most glaring problems are in YouTube

with little understanding of our government and how it is supposed to work. Even if Facebook were to change its business model to discourage new filter bubbles, what could it do about the ones that already exist? What can Facebook do about the fact that humans prefer disinformation to information? What can

not dominate the information sphere as they did in 2016. It owes the world its best effort to prioritize facts over disinformation and to pierce filter bubbles of nonsense. This is not a capricious request. Facebook’s inattention to its potential impact on democracy enabled unprecedented interference in the United States and

essential value of compromise, which starts with listening to and acknowledging the legitimacy of different points of view. If we are to break out of filter bubbles and preference bubbles—whether online or off—we need to invest the time necessary to be better informed. We need to be appropriately skeptical about

would be for users to force change. Users have leverage because internet platforms cannot survive without their attention. In a perfect world, users would escape filter bubbles, not allow technology to mediate their relationships, and commit to active citizenship. The midterm elections provided evidence that an increasing number of users in the

in technology, kids do not develop normal social skills. Teenagers use social media technology to bully one another. Adults seek out the warm embrace of filter bubbles that erode critical-thinking skills. Seduced by convenience, users surrender personal data for ever less benefit. The platforms themselves have failed to safeguard personal data

about the dark side of social media began in 2011 with Eli Pariser’s groundbreaking TED Talk on filter bubbles. I recommend the video of that talk, as well as Eli’s book, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2012). The book that energized me in

–89 emotions and, 9, 11, 68, 69, 75, 233, 270 F8 conference, 217–18 fake accounts deleted by, 229, 230, 265 FarmVille, 184, 195, 196 filter bubbles and, 78, 87, 90, 96, 116–19, 124–27, 141, 166, 209, 215, 245, 257, 258 financial value of, 9, 284 Free Basics, 179–80

Communications Commission, 136 Federal Trade Commission (FTC), 113–14, 136, 182–84, 188–90, 200, 286 Feinstein, Dianne, 132 Fernando, Randima, 167 Ferrell, Will, 156 filter bubbles, 66–67, 87, 94, 109, 125–27, 157–58, 246, 269, 278–81 extreme views and, 91, 93 Facebook and, 78, 87, 90, 96, 116

consumer purchases and, 67–68 content moderation and, 237 content suppliers harmed by, 283 DoubleClick and, 138–39 European Union and, 138, 260, 281–82 filter bubbles and, 87, 90 Global Data Protection Regulation and, 221, 259–60 Gmail, 39, 68, 253, 261, 271, 285 Google+, 67, 69 Home, 108, 236, 262

The Black Box Society: The Secret Algorithms That Control Money and Information

by Frank Pasquale  · 17 Nov 2014  · 320pp  · 87,853 words

We see what we have trained Google to show us and what Google gradually conditions us to expect. Entrepreneur Eli Pariser calls this phenomenon “the filter bubble” and worries that all this personalization has serious side effects, namely increased insularity and reinforced prejudice.121 So intense is the personalization of search results

A company called Acxiom has 1,600 pieces of information about 98 percent of U.S. adults, gathered from thousands of sources. Eli Pariser, The Filter Bubble (New York: Penguin, 2011), 3. At least some of them are health-indicative or health-predictive. Daniel J. Solove, The Future of Reputation: Gossip, Rumor, and

–81 257 (quoting manifesto “WE WANT TO CALL WORK WHAT IS WORK SO THAT EVENTUALLY WE MIGHT REDISCOVER WHAT FRIENDSHIP IS”). 121. Eli Pariser, The Filter Bubble (New York: Penguin, 2011). 122. Ibid., 6–7. 123. Fortunately, one has written a work of fiction to suggest what could go wrong. Shumeet Baluja

Big Business: A Love Letter to an American Anti-Hero

by Tyler Cowen  · 8 Apr 2019  · 297pp  · 84,009 words

us an early boost. That is a much more typical Facebook advertising story than what you might hear about these days. The idea of a “filter bubble” is another criticism of Facebook, but it is not supported by the facts. So many times I have heard that Facebook or other social media

/unemployment European Union ex post Exxon eyeglass companies Facebook advertising and AI and “anti-diversity memo” censorship and China and competition and complaints about employees “filter bubble” income inequality and information and innovation and monopoly and News Feed politics and privacy and Russian-manipulated content venture capital and See also Zuckerberg, Mark

Apple Smith, Adam Smyth, Joshua M. Snapchat social media 2016 election and advertising and big business and CEOs and economy and effect on American society “filter bubble” and generational influence privacy and trust and workers and See also Facebook; Instagram; Twitter social responsibility Social Security socialism SpaceX See also Musk, Elon Spool

Survival of the Richest: Escape Fantasies of the Tech Billionaires

by Douglas Rushkoff  · 7 Sep 2022  · 205pp  · 61,903 words

The Constitution of Knowledge: A Defense of Truth

by Jonathan Rauch  · 21 Jun 2021  · 446pp  · 109,157 words

Nervous States: Democracy and the Decline of Reason

by William Davies  · 26 Feb 2019  · 349pp  · 98,868 words

The End of Big: How the Internet Makes David the New Goliath

by Nicco Mele  · 14 Apr 2013  · 270pp  · 79,992 words

Traffic: Genius, Rivalry, and Delusion in the Billion-Dollar Race to Go Viral

by Ben Smith  · 2 May 2023

Digital Disconnect: How Capitalism Is Turning the Internet Against Democracy

by Robert W. McChesney  · 5 Mar 2013  · 476pp  · 125,219 words

Internet for the People: The Fight for Our Digital Future

by Ben Tarnoff  · 13 Jun 2022  · 234pp  · 67,589 words

Don't Be Evil: How Big Tech Betrayed Its Founding Principles--And All of Us

by Rana Foroohar  · 5 Nov 2019  · 380pp  · 109,724 words

The Science of Hate: How Prejudice Becomes Hate and What We Can Do to Stop It

by Matthew Williams  · 23 Mar 2021  · 592pp  · 125,186 words

The People's Platform: Taking Back Power and Culture in the Digital Age

by Astra Taylor  · 4 Mar 2014  · 283pp  · 85,824 words

The Price of Tomorrow: Why Deflation Is the Key to an Abundant Future

by Jeff Booth  · 14 Jan 2020  · 180pp  · 55,805 words

The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health--And How We Must Adapt

by Sinan Aral  · 14 Sep 2020  · 475pp  · 134,707 words

Likewar: The Weaponization of Social Media

by Peter Warren Singer and Emerson T. Brooking  · 15 Mar 2018

Whole Earth: The Many Lives of Stewart Brand

by John Markoff  · 22 Mar 2022  · 573pp  · 142,376 words

Algospeak: How Social Media Is Transforming the Future of Language

by Adam Aleksic  · 15 Jul 2025  · 278pp  · 71,701 words

The Vanishing Neighbor: The Transformation of American Community

by Marc J. Dunkelman  · 3 Aug 2014  · 327pp  · 88,121 words

Palaces for the People: How Social Infrastructure Can Help Fight Inequality, Polarization, and the Decline of Civic Life

by Eric Klinenberg  · 10 Sep 2018  · 281pp  · 83,505 words

The Smartphone Society

by Nicole Aschoff

Breaking News: The Remaking of Journalism and Why It Matters Now

by Alan Rusbridger  · 14 Oct 2018  · 579pp  · 160,351 words

The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World

by Max Fisher  · 5 Sep 2022  · 439pp  · 131,081 words

Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism

by Sarah Wynn-Williams  · 11 Mar 2025  · 370pp  · 115,318 words

Collaborative Society

by Dariusz Jemielniak and Aleksandra Przegalinska  · 18 Feb 2020  · 187pp  · 50,083 words

Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity

by Daron Acemoglu and Simon Johnson  · 15 May 2023  · 619pp  · 177,548 words

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity

by Amy Webb  · 5 Mar 2019  · 340pp  · 97,723 words

Who Owns the Future?

by Jaron Lanier  · 6 May 2013  · 510pp  · 120,048 words

WTF?: What's the Future and Why It's Up to Us

by Tim O'Reilly  · 9 Oct 2017  · 561pp  · 157,589 words

Rebel Ideas: The Power of Diverse Thinking

by Matthew Syed  · 9 Sep 2019  · 280pp  · 76,638 words

The Nation City: Why Mayors Are Now Running the World

by Rahm Emanuel  · 25 Feb 2020  · 212pp  · 69,846 words

This Is Not Normal: The Collapse of Liberal Britain

by William Davies  · 28 Sep 2020  · 210pp  · 65,833 words

The Wires of War: Technology and the Global Struggle for Power

by Jacob Helberg  · 11 Oct 2021  · 521pp  · 118,183 words

The Capitalist Manifesto

by Johan Norberg  · 14 Jun 2023  · 295pp  · 87,204 words

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All

by Robert Elliott Smith  · 26 Jun 2019  · 370pp  · 107,983 words

The Internet Is Not the Answer

by Andrew Keen  · 5 Jan 2015  · 361pp  · 81,068 words

Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World

by Bruce Schneier  · 2 Mar 2015  · 598pp  · 134,339 words

The Passenger: Berlin

by The Passenger  · 8 Jun 2021  · 199pp  · 63,724 words

These Strange New Minds: How AI Learned to Talk and What It Means

by Christopher Summerfield  · 11 Mar 2025  · 412pp  · 122,298 words

Ten Arguments for Deleting Your Social Media Accounts Right Now

by Jaron Lanier  · 28 May 2018  · 151pp  · 39,757 words

Dragnet Nation: A Quest for Privacy, Security, and Freedom in a World of Relentless Surveillance

by Julia Angwin  · 25 Feb 2014  · 422pp  · 104,457 words

Future Crimes: Everything Is Connected, Everyone Is Vulnerable and What We Can Do About It

by Marc Goodman  · 24 Feb 2015  · 677pp  · 206,548 words

The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure

by Greg Lukianoff and Jonathan Haidt  · 14 Jun 2018  · 531pp  · 125,069 words

Character Limit: How Elon Musk Destroyed Twitter

by Kate Conger and Ryan Mac  · 17 Sep 2024

Supremacy: AI, ChatGPT, and the Race That Will Change the World

by Parmy Olson  · 284pp  · 96,087 words

IRL: Finding Realness, Meaning, and Belonging in Our Digital Lives

by Chris Stedman  · 19 Oct 2020  · 307pp  · 101,998 words

What Algorithms Want: Imagination in the Age of Computing

by Ed Finn  · 10 Mar 2017  · 285pp  · 86,853 words

Terms of Service: Social Media and the Price of Constant Connection

by Jacob Silverman  · 17 Mar 2015  · 527pp  · 147,690 words

#Republic: Divided Democracy in the Age of Social Media

by Cass R. Sunstein  · 7 Mar 2017  · 437pp  · 105,934 words

The New Digital Age: Transforming Nations, Businesses, and Our Lives

by Eric Schmidt and Jared Cohen  · 22 Apr 2013  · 525pp  · 116,295 words

A Manual for Creating Atheists

by Peter Boghossian  · 1 Nov 2013  · 257pp  · 77,030 words

The End of Absence: Reclaiming What We've Lost in a World of Constant Connection

by Michael Harris  · 6 Aug 2014  · 259pp  · 73,193 words

You Are What You Read

by Jodie Jackson  · 3 Apr 2019  · 145pp  · 41,453 words

The People vs Tech: How the Internet Is Killing Democracy (And How We Save It)

by Jamie Bartlett  · 4 Apr 2018  · 170pp  · 49,193 words

Thinking in Bets

by Annie Duke  · 6 Feb 2018  · 288pp  · 81,253 words

Merchants of Truth: The Business of News and the Fight for Facts

by Jill Abramson  · 5 Feb 2019  · 788pp  · 223,004 words

Twitter and Tear Gas: The Power and Fragility of Networked Protest

by Zeynep Tufekci  · 14 May 2017  · 444pp  · 130,646 words

The Formula: How Algorithms Solve All Our Problems-And Create More

by Luke Dormehl  · 4 Nov 2014  · 268pp  · 75,850 words

Chokepoint Capitalism

by Rebecca Giblin and Cory Doctorow  · 26 Sep 2022  · 396pp  · 113,613 words

The Death of Truth: Notes on Falsehood in the Age of Trump

by Michiko Kakutani  · 17 Jul 2018  · 137pp  · 38,925 words

The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future

by Kevin Kelly  · 6 Jun 2016  · 371pp  · 108,317 words

To Save Everything, Click Here: The Folly of Technological Solutionism

by Evgeny Morozov  · 15 Nov 2013  · 606pp  · 157,120 words

The Internet Trap: How the Digital Economy Builds Monopolies and Undermines Democracy

by Matthew Hindman  · 24 Sep 2018

They Don't Represent Us: Reclaiming Our Democracy

by Lawrence Lessig  · 5 Nov 2019  · 404pp  · 115,108 words

Filterworld: How Algorithms Flattened Culture

by Kyle Chayka  · 15 Jan 2024  · 321pp  · 105,480 words

The Myth of Artificial Intelligence: Why Computers Can't Think the Way We Do

by Erik J. Larson  · 5 Apr 2021

Super Thinking: The Big Book of Mental Models

by Gabriel Weinberg and Lauren McCann  · 17 Jun 2019

The Right Side of History

by Ben Shapiro  · 11 Feb 2019  · 270pp  · 71,659 words

How to Fix the Future: Staying Human in the Digital Age

by Andrew Keen  · 1 Mar 2018  · 308pp  · 85,880 words

How to Do Nothing

by Jenny Odell  · 8 Apr 2019  · 243pp  · 76,686 words

Doing Data Science: Straight Talk From the Frontline

by Cathy O'Neil and Rachel Schutt  · 8 Oct 2013  · 523pp  · 112,185 words

Pax Technica: How the Internet of Things May Set Us Free or Lock Us Up

by Philip N. Howard  · 27 Apr 2015  · 322pp  · 84,752 words

Free Speech: Ten Principles for a Connected World

by Timothy Garton Ash  · 23 May 2016  · 743pp  · 201,651 words

Raw Data Is an Oxymoron

by Lisa Gitelman  · 25 Jan 2013

Dangerous Ideas: A Brief History of Censorship in the West, From the Ancients to Fake News

by Eric Berkowitz  · 3 May 2021  · 412pp  · 115,048 words

Can It Happen Here?: Authoritarianism in America

by Cass R. Sunstein  · 6 Mar 2018  · 434pp  · 117,327 words

Consent of the Networked: The Worldwide Struggle for Internet Freedom

by Rebecca MacKinnon  · 31 Jan 2012  · 390pp  · 96,624 words

Algorithms of Oppression: How Search Engines Reinforce Racism

by Safiya Umoja Noble  · 8 Jan 2018  · 290pp  · 73,000 words

Smarter Than You Think: How Technology Is Changing Our Minds for the Better

by Clive Thompson  · 11 Sep 2013  · 397pp  · 110,130 words

System Error: Where Big Tech Went Wrong and How We Can Reboot

by Rob Reich, Mehran Sahami and Jeremy M. Weinstein  · 6 Sep 2021

The Future of the Professions: How Technology Will Transform the Work of Human Experts

by Richard Susskind and Daniel Susskind  · 24 Aug 2015  · 742pp  · 137,937 words

Messy: The Power of Disorder to Transform Our Lives

by Tim Harford  · 3 Oct 2016  · 349pp  · 95,972 words

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media

by Tarleton Gillespie  · 25 Jun 2018  · 390pp  · 109,519 words

The Great Wave: The Era of Radical Disruption and the Rise of the Outsider

by Michiko Kakutani  · 20 Feb 2024  · 262pp  · 69,328 words

The Autonomous Revolution: Reclaiming the Future We’ve Sold to Machines

by William Davidow and Michael Malone  · 18 Feb 2020  · 304pp  · 80,143 words

New Laws of Robotics: Defending Human Expertise in the Age of AI

by Frank Pasquale  · 14 May 2020  · 1,172pp  · 114,305 words

Superbloom: How Technologies of Connection Tear Us Apart

by Nicholas Carr  · 28 Jan 2025  · 231pp  · 85,135 words

Men Who Hate Women: From Incels to Pickup Artists, the Truth About Extreme Misogyny and How It Affects Us All

by Laura Bates  · 2 Sep 2020  · 364pp  · 119,398 words

The Dark Net

by Jamie Bartlett  · 20 Aug 2014  · 267pp  · 82,580 words

Age of Context: Mobile, Sensors, Data and the Future of Privacy

by Robert Scoble and Shel Israel  · 4 Sep 2013  · 202pp  · 59,883 words

News and How to Use It: What to Believe in a Fake News World

by Alan Rusbridger  · 26 Nov 2020  · 371pp  · 109,320 words

Digital Empires: The Global Battle to Regulate Technology

by Anu Bradford  · 25 Sep 2023  · 898pp  · 236,779 words

Team Human

by Douglas Rushkoff  · 22 Jan 2019  · 196pp  · 54,339 words

Open: The Story of Human Progress

by Johan Norberg  · 14 Sep 2020  · 505pp  · 138,917 words

The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future

by Orly Lobel  · 17 Oct 2022  · 370pp  · 112,809 words

Smartcuts: How Hackers, Innovators, and Icons Accelerate Success

by Shane Snow  · 8 Sep 2014  · 278pp  · 70,416 words

We Are Data: Algorithms and the Making of Our Digital Selves

by John Cheney-Lippold  · 1 May 2017  · 420pp  · 100,811 words

Searches: Selfhood in the Digital Age

by Vauhini Vara  · 8 Apr 2025  · 301pp  · 105,209 words

Common Knowledge?: An Ethnography of Wikipedia

by Dariusz Jemielniak  · 13 May 2014  · 312pp  · 93,504 words

If Mayors Ruled the World: Dysfunctional Nations, Rising Cities

by Benjamin R. Barber  · 5 Nov 2013  · 501pp  · 145,943 words

21 Lessons for the 21st Century

by Yuval Noah Harari  · 29 Aug 2018  · 389pp  · 119,487 words

The End of Nice: How to Be Human in a World Run by Robots (Kindle Single)

by Richard Newton  · 11 Apr 2015  · 94pp  · 26,453 words

The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies

by Erik Brynjolfsson and Andrew McAfee  · 20 Jan 2014  · 339pp  · 88,732 words

Republic, Lost: How Money Corrupts Congress--And a Plan to Stop It

by Lawrence Lessig  · 4 Oct 2011  · 538pp  · 121,670 words

Designing Great Data Products

by Jeremy Howard, Mike Loukides and Margit Zwemer  · 23 Mar 2012  · 23pp  · 5,264 words

World Without Mind: The Existential Threat of Big Tech

by Franklin Foer  · 31 Aug 2017  · 281pp  · 71,242 words

The Ethical Algorithm: The Science of Socially Aware Algorithm Design

by Michael Kearns and Aaron Roth  · 3 Oct 2019

The Twittering Machine

by Richard Seymour  · 20 Aug 2019  · 297pp  · 83,651 words

Co-Intelligence: Living and Working With AI

by Ethan Mollick  · 2 Apr 2024  · 189pp  · 58,076 words

The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World

by Pedro Domingos  · 21 Sep 2015  · 396pp  · 117,149 words

Snowden's Box: Trust in the Age of Surveillance

by Jessica Bruder and Dale Maharidge  · 29 Mar 2020  · 159pp  · 42,401 words

One Way Forward: The Outsider's Guide to Fixing the Republic

by Lawrence Lessig  · 12 Feb 2012  · 88pp  · 22,980 words

Refuge: Transforming a Broken Refugee System

by Alexander Betts and Paul Collier  · 29 Mar 2017

The Rationalist's Guide to the Galaxy: Superintelligent AI and the Geeks Who Are Trying to Save Humanity's Future

by Tom Chivers  · 12 Jun 2019  · 289pp  · 92,714 words

Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are

by Seth Stephens-Davidowitz  · 8 May 2017  · 337pp  · 86,320 words

Being You: A New Science of Consciousness

by Anil Seth  · 29 Aug 2021  · 418pp  · 102,597 words

You've Been Played: How Corporations, Governments, and Schools Use Games to Control Us All

by Adrian Hon  · 14 Sep 2022  · 371pp  · 107,141 words

Data and the City

by Rob Kitchin, Tracey P. Lauriault and Gavin McArdle  · 2 Aug 2017

Mining Social Media: Finding Stories in Internet Data

by Lam Thuy Vo  · 21 Nov 2019  · 237pp  · 65,794 words

Future Politics: Living Together in a World Transformed by Tech

by Jamie Susskind  · 3 Sep 2018  · 533pp