Singularitarianism

back to index

description: belief in an incipient technological singularity

31 results

Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World

by James D. Miller  · 14 Jun 2012  · 377pp  · 97,144 words

intelligence forces all smart entities to have human-like values. We might not even be safe if an ultra-AI shares our morality, since, as Singularitarian Michael Anissimov wrote: We probably make thousands of species extinct per year through our pursuit of instrumental goals, why is it so hard to imagine

should try to instill friendliness in the first ultra-AI that we create. Predicting rain doesn’t count; building arks does. —Warren Buffett89 How many Singularitarians does it take to change a light bulb? Zero! You can’t change a light bulb that’s brighter than you. —Eliezer Yudkowsky90 CHAPTER 4

in Kansas, the castrated men were found to live on average 14 years longer than their uncastrated fellows. To the best of my knowledge, no Singularitarian, not even Ray Kurzweil or bullet-eater Robin Hanson, is following the castration path to long life. WHAT HAPPENS WHEN MANY THINK IMMORTALITY IS NEAR

the future to raise the status of people with trait X, or this group might have an irrationally high opinion of trait X. Lots of Singularitarians have extremely high measured intelligence. Hanson thinks that some futurists also have a cognitive bias toward expecting “an unrealistic degree of [self-sufficiency] or independence

we are extremely special. In general, humans have a desire to feel special, so you should be distrustful of a group of people, such as Singularitarians, who have formulated supposedly logical reasons why they are special. Also, since you are probably not special, you should be distrustful of any theory that

believe in a supernatural world in which eternal life is possible. The views of many Singularity believers do have a strong religious flavor. After all, Singularitarians fear that an unfriendly ultra-AI (Devil) will destroy mankind (send us to hell) and think that Yudkowsky and the Singularity Institute he founded (church

president Stephen Van Sickle told me that there are one or more famous businesspeople, not associated with the Singularity movement, who are [secret] Alcor members. Singularitarians are the only group that cryonicists have really won over. Probably in excess of 100 million people have heard of cryonics, yet the cryonics movement

, 115 sexbots, 193–95 sex drives, 195 sex-selective abortions, 194 sexually transmitted diseases, 194 Shakespeare, 21 Shaw, George Bernard, 84 Shulman, Carl, 147, 202 Singularitarians, 215 Singularity AI as smart as humans and/or augmenting human intelligence, x AI-centered, 209 AI-induced, 21 bad, avoid making, 58–59 bad

Live Work Work Work Die: A Journey Into the Savage Heart of Silicon Valley

by Corey Pein  · 23 Apr 2018  · 282pp  · 81,873 words

further questions. What did it mean that the leaders of a corporation more powerful than most governments were willing to tacitly endorse what Kurzweil called Singularitarianism? Was it not extraordinary that these iconic and influential tech magnates would lend credence to the Kurzweilian prophecy that further human evolution meant an irreversible

machines and the sacrifice of our individual biological identities to an immortal hive mind? Was such a thing actually possible? If there was substance to Singularitarianism, then the ascension of Kurzweil at Google would one day be seen as a decisive moment in history, analogous to the Roman emperor Constantine’s

Page and a number of other early Google employees, Kurzweil had lent his imprimatur to a new venture called Singularity University, which aimed to spread Singularitarian thinking among a globally diverse student body composed of current and future leaders in the public and private sectors. Back in Mountain View, I had

, that was what persuaded their employers to pick up the tab for a midweek conference in Europe’s adult Disneyland. The broad managerial interest in Singularitarian thinking was articulated at the outset of the Summit by John Hagel, an executive from Deloitte’s “Center for the Edge,” which helped “senior executives

was the answer. Deloitte had sponsored the Singularity Summit because, in a tall glass tower somewhere, accounting majors were debating the actuarial implications of the Singularitarian future, when millions of transhuman policyholders might enjoy indefinite lifespans, and running cost-benefit analyses on investments in supposedly imminent tech ventures such as extraplanetary

humanism” that informs restriction on the genetic engineering of human fetuses. In 2012, an interviewer prompted Kurzweil to account for the difference between his transhumanist Singularitarian vision and the old eugenics programs. “Eugenics was—first of all, the technology of it didn’t work, and was antihuman,” Kurzweil said. “It involved

meager buffet, or maybe it was the Cheese, but I’d noticed that after several hours in a dark room absorbing an overwhelming torrent of Singularitarian propaganda, I had lost all capacity to recognize the bizarre, the outlandish, or the abominable. It was a giddy feeling. I feared I might soon

power they wield and their critical ability, which must be estimated as null,” he wrote. If, as Ellul has it, technology is the state religion, Singularitarianism must be seen as its most extreme and fanatical sect. It is the Opus Dei of the postwar church of gadget worship. Ray Kurzweil may

be the best-known prophet of this order, but he was not the first. The true father of Singularitarianism is a sci-fi author and retired mathematics professor from Wisconsin named Vernor Vinge. His earliest written exposition of the idea appeared in the January

as a last resort. Kurzweil’s morbid obsession with disease and death led him into the depths of tech-abetted unconventional medicine, where many a Singularitarian followed. He received a diabetes diagnosis at age thirty-five. Displeased with insulin treatment, he set out to find a better way. The result was

hairpiece? An unfortunate dye job? Or maybe Kurzweil had finally stumbled across a real miracle pill? * * * I am by no means the first to label Singularitarianism a new religion or a cult. Kurzweil himself has said the comparison was “understandable,” given the preoccupation with mortality. However, he rejects the argument that

his sect is religious in nature, because he did not come to it as a spiritual seeker. Rather, Kurzweil writes, he became a Singularitarian as a result of “practical” efforts to make “optimal tactical decisions in launching technology enterprises.” Startups showed him the way! Being a

Singularitarian, Kurzweil claims, “is not a matter of faith but one of understanding.” This is a refrain Singularitarians share with Scientologists, for L. Ron Hubbard always marketed his doctrines as “technology.” This tic makes

Singularitarians impossible to argue with. Because they believe that they have arrived at their beliefs scientifically, anyone who disputes their

in business, politics, and military affairs, its leaders might seem clownish. But they are serious, dangerously so. There was always something fundamentally misanthropic about the Singularitarian vision, with its drive for the elimination of the body and its echoes of Christian millenarianism. A Scottish science fiction writer, Ken MacLeod, has been

who embraced the Singularity, especially its apocalyptic overtones. In subsequent years, MacLeod found himself increasingly horrified by the deranged utilitarianism he found among the online Singularitarians. Perhaps the most extreme example came in a post by a programmer named Robert J. Bradbury, five days after the September 11, 2001, terrorist attacks

as a “generous, driven and often outspoken individual” who “railed against the needless deaths of people the world over.” Being eccentrics, longtime inhabitants of the Singularitarian subculture were willing to overlook clear expressions of lunacy among their own. Consider young Mike Anissimov, who, before coming out as a Hitler fanboy, ingratiated

freedom.” By freedom, he meant escape. Or, as it was more commonly known among Silicon Valley elite, exit. The religious-apocalyptic mentality of the techie Singularitarians found synthesis with their intrinsic social frustration in the political act of secession. Thiel eventually abandoned Seasteading as too impractical, reckoning that there were other

is the private sphere’s gain. To the extent that these companies accelerate American decline, they accumulate more power for themselves—and, per the dominant Singularitarian ideology, they expect that power to last in perpetuity. * * * The future looks rosy indeed for the kings of Big Tech, grim though it may appear

getaway.” Thiel, it seemed, would have company. Another Davos participant, the former World Bank economist Stewart Wallis, linked elite fears of wealth confiscation to the Singularitarian dream of space colonization. “If they can get off to another planet, some of them would,” Wallis said. “The rich are worried and they should

upon the people of the world for the sake of some faux utopia. Whole countries would be broken in the mad scramble for a profitable Singularitarian future. And not just the little countries. After leaving the Bay Area, while I was writing this book in September 2016, I joined my wife

the Silicon Valley tech companies. Which is why, as much as I’d like to laugh off each and every wild prediction made by the Singularitarians I met in Amsterdam, I’m obliged instead to concede that seemingly impossible new technologies will no doubt emerge—either despite the urgent political and

pedigrees and a shocking disregard for history, politics, language, and culture, to say nothing of the struggles of the poor. It is no wonder the Singularitarian fantasies have captured the imaginations of the world’s most zealously self-interested businesspeople: these visions promise ultimate, permanent power. The stated ambitions of America

The Singularity Is Near: When Humans Transcend Biology

by Ray Kurzweil  · 14 Jul 2005  · 761pp  · 231,902 words

own particular life. I regard someone who understands the Singularity and who has reflected on its implications for his or her own life as a "singularitarian."1 I can understand why many observers do not readily embrace the obvious implications of what I have called the law of accelerating returns (the

of one hand clapping? MOLLY 2004: Hmmm, so the Singularity is what the Zen masters had in mind all along. CHAPTER SEVEN Ich bin ein Singularitarian The most common of all follies is to believe passionately in the palpably not true. —H. L. MENCKEN Philosophies of life rooted in centuries-old

Western tradition. But the modern difference is that now everyone notices the pace of progress on some level, not simply the visionaries. —JOHN SMART A Singularitarian is someone who understands the Singularity and has reflected on its meaning for his or her own life. I have been engaged in such reflection

from there to reflect on the impact of these crucial changes on social and cultural institutions and on my own life. So, while being a Singularitarian is not a matter of faith but one of understanding, pondering the scientific trends I've discussed in this book inescapably engenders new perspectives on

that traditional religions have attempted to address: the nature of mortality and immortality, the purpose of our lives, and intelligence in the universe. Being a Singularitarian has often been an alienating and lonely experience for me because most people I encounter do not share my outlook. Most "big thinkers" are totally

of the corners of our eyes. As Max More states, the last thing we need is another dogma, nor do we need another cult, so Singularitarianism is not a system of beliefs or unified viewpoints. While it is fundamentally an understanding of basic technology trends, it is simultaneously an insight that

causes one to rethink everything, from the nature of health and wealth to the nature of death and self. To me, being a Singularitarian means many things, of which the following is a small sampling. These reflections articulate my personal philosophy, not a proposal for a new doctrine. ·We

problems is on the horizon, there may be a tendency to grow detached from mundane, present-day concerns. I share More's antipathy toward "passive Singularitarianism." One reason for a proactive stance is that technology is a double-edged sword and as such always has the potential of going awry as

Golden Braid (New York: Basic Books, 1979). Chapter One: The Six Epochs 1. According to the Transtopia site (http://transtopia.org/faq.html#1.11), "Singularitarian" was "originally defined by Mark Plus ('91) to mean 'one who believes the concept of a Singularity.'" Another definition of this term is "'Singularity activist

the Singularity'; that is, one who acts so as to bring about a Singularity [Mark Plus, 1991; Singularitarian Principles, Eliezer Yudkowsky, 2000]." There is not universal agreement on this definition, and many Transhumanists are still Singularitarians in the original sense—that is, "believers in the Singularity concept" rather than "activists" or "friends

." Eliezer S. Yudkowsky, in The Singularitarian Principles, version 1.0.2 (January 1, 2000), http://yudkowsky.net/sing/principles.ext

.html, proposed an alternate definition: "A Singularitarian is someone who believes that technologically creating a greater-than-human intelligence is desirable, and

who works to that end. A Singularitarian is friend, advocate, defender, and agent of the future known as the Singularity." My view: one can advance the Singularity and in particular make it

totalitarian and fundamentalist belief systems and ideologies, and creating knowledge in all of its diverse forms: music, art, literature, science, and technology. I regard a Singularitarian as someone who understands the transformations that are coming in this century and who has reflected on their implications for his or her own life

Hologram," Scientific American 289.2 (August 2003): 58–65, http://www.sciam.com/article.cfm?articleID=000AF072-4891-1F0A-97AE80A84189EEDF. Chapter Seven: Ich bin ein Singularitarian 1. In Jay W. Richards et al., Are We Spiritual Machines? Ray Kurzweil vs. the Critics of Strong A.I. (Seattle: Discovery Institute, 2002), introduction

Our Final Invention: Artificial Intelligence and the End of the Human Era

by James Barrat  · 30 Sep 2013  · 294pp  · 81,292 words

that Write Programs 6. Four Basic Drives 7. The Intelligence Explosion 8. The Point of No Return 9. The Law of Accelerating Returns 10. The Singularitarian 11. A Hard Takeoff 12. The Last Complication 13. Unknowable by Nature 14. The End of the Human Era 15. The Cyber Ecosystem 16. AGI

consider how mankind’s greatest problems—hunger, disease, even death itself—may be conquered. That’s the vision espoused by Ray Kurzweil and promulgated by “Singularitarians.” Singularitarians are those who anticipate that mostly good things will emerge from the accelerated future. Their “singularity” sounds too rosy for Vinge. “We’re playing a

a tireless if rather mechanical promoter. He’s the den-master for a lot of young men, and some women, living on the singularity edge. Singularitarians tend to be twenty- and thirty-somethings, male, and childless. For the most part, they’re smart white guys who’ve heard the call of

offers no degrees and isn’t accredited. But it promises “a broad, cross-disciplinary understanding of the biggest ideas and issues in transformative technologies.”) Many Singularitarians are too smart and self-directed to get in line for traditional education anyway. And many are addled wing nuts few colleges or universities would

invite on campus. Some Singularitarians have adopted rationality as the major tenet of their creed. They believe that greater logical and reasoning abilities, particularly among tomorrow’s decision makers, decreases

an apocalyptic religion, including rituals of purification, eschewing frail human bodies, anticipating eternal life, and an uncontested (somewhat) charismatic leader. I wholeheartedly agree with the Singularitarian idea that AI is the most important thing we could be thinking about right now. But when it comes to immortality talk, I get off

the bus. Dreams about eternal life throw out a powerful distortion field. Too many Singularitarians believe that the confluence of technologies presently accelerating will not yield the kinds of disasters we might anticipate from any of them individually, nor the

, a vanguard, perhaps a remnant who make it to the Promised Land? These religious themes are all present in the rhetoric and rationalities of the Singularitarians, even if the pre- and post-millennialist interpretations aren’t consistently developed, as is certainly the case with pre-scientific Messianic movements. Unlike Good’s

the dynamics of doublings expressed by LOAR, AGI will take the world stage (and I mean take) much sooner than we think. Chapter Ten The Singularitarian In contrast with our intellect, computers double their performance every eighteen months. So the danger is real that they could develop intelligence and take over

weapons and technologies that exist today. We’ll have to develop, side by side with augmentation, a science for choosing candidates for intelligence enhancement. The Singularitarians’ conceit that anyone who can afford it will enjoy superintelligence through brain augmentation is a virtual guarantee that everyone else will have to live at

About Cochlear Implants,” last modified June 7, 2010, http://www.nidcd.nih.gov/health/hearing/pages/coch_moreon.aspx (accessed September 15, 2011). 10: THE SINGULARITARIAN In contrast with our intellect: McAuliffe, Wendy, “Hawking warns of AI world takeover,” ZDNet, September 3, 2001, http://www.zdnet.co.uk/news/application-development

the National Academy of Sciences, no. 26 (January 2012), http://www.pnas.org/content/early/2012/02/21/1118373109.abstract (accessed February 11, 2012). The Singularitarians’ conceit: Incidentally, there’s some interesting writing on the Web about the concept of a Singleton. Conceived by ethicist Nick Bostrom, a “Singleton” is a

, October 4, 2010, http://www.nytimes.com/2010/10/05/science/05compute.html?pagewanted=all (accessed September 28, 2011). Many, especially those at MIRI: Some Singularitarians want to get to AGI as soon as possible, owing to its potential to alleviate human suffering. This is Ray Kurzweil’s position. Others feel

.) Searle, John self-awareness Self-Aware Systems self-improvement self-preservation September 11 attacks serial processing SETI (Search for Extraterrestrial Intelligence) Shostak, Seth Silicon Valley Singularitarians Singularity definitions of Kurzweil and technological Singularity Is Near, The (Kurzweil) Singularity Summit Singularity University Sir Groovy Siri 60 Minutes Skilling, Jeffrey Smart Action smart

Utopia Is Creepy: And Other Provocations

by Nicholas Carr  · 5 Sep 2016  · 391pp  · 105,382 words

man immortal at the instant of his obsolescence—has been called “the rapture of the geeks.” But to Ray Kurzweil, the most famous of the Singularitarians, it’s no joke. In an interview in Rolling Stone, Kurzweil describes how, in the wake of the Singularity, it will be possible not only

our mind children is to give them the freedom to be tempted. Besides, how is a computer supposed to have an intelligent conversation with the Singularitarians if it can’t use the word “bullshit”? MAX LEVCHIN HAS PLANS FOR US January 30, 2013 “I SOMETIMES IMAGINE THE low-use troughs of

–15 informality eschewed by, 197–98, 215 wealthy lifestyle of, 16–17, 195 Simonite, Tom, 136–37 simulation, see virtual world Singer, Peter, 267 Singularity, Singularitarians, 69, 147 sitcoms, 59 situational overload, 90–92 skimming, 233 “Slaves to the Smartphone,” 308–9 Slee, Tom, 61, 84 SLExchange, 26 slot machines, 218

More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity

by Adam Becker  · 14 Jun 2025  · 381pp  · 119,533 words

human could. One of those tasks, naturally, would be the design of the next AGI. And this is the mechanism by which Kurzweil and other “singularitarians” see the Singularity coming to fruition. Say that first AGI, built by humans, took thirty years to create. Building an even better, smarter one might

of the implications of the Singularity.… We’ll all have fantastic power compared to what we have today.”138 The way Kurzweil and his fellow singularitarians talk about the technology to come makes it seem like they’re playing a video game like Civilization, where there is a technology tree laid

the basic features of our own real world, or a real world destroyed and reshaped to more closely resemble an immersive computer game. To the singularitarians, the largest and most crucial difference, the one they keep coming back to in their descriptions, is the end of death. Kurzweil’s work is

read that sentence, that this was how I would be spending the rest of my life. It was just so obvious. I’ve been a Singularitarian ever since.”24 Writing his thoughts on the subject in an essay he posted to his personal website titled “Staring into the Singularity,” Yudkowsky made

sort of speculate about AI.”82 These comments echo some of the arguments against the Singularity. That’s no surprise, since the rationalists closely resemble singularitarians. They have the same sort of paradise in view. They just think that there’s a particular roadblock in the way that needs to be

the social and cultural environment that supports their intelligence.”89 Thus, the rationalists are in the same position as Kurzweil and the rest of the singularitarians: they are making extraordinary claims, and they don’t have the extraordinary evidence to back them up. The best they can do is a handful

as an example of that “wonderful” dream. But there’s another connection between AGI and racism, one that arguably runs deeper. The rationalists, like the singularitarians and other proponents of the power of AGI, frequently “[blur] the concept of general intelligence with the concepts of mind or consciousness,” wrote David Golumbia

inborn general intelligence—as part of a larger plan to improve humanity, a plan that also included many of the same themes as the rationalists, singularitarians, Extropians, and other modern transhumanists and futurists. Even the word “transhumanism” was first popularized in its modern sense by a eugenicist, Julian Huxley (the brother

talking with Sandberg, a senior research fellow at FHI, with an office not far from MacAskill’s. Sandberg has been a fixture in transhumanist and singularitarian communities for decades, going back to the days of the Extropian email listserv in the 1990s, where he first encountered a teenager by the name

an argument.”87 This promise of a benevolent godhead, a superintelligent AI that foresees and solves all human problems, is the same goal that the singularitarians and the rationalists have: the reduction of all problems to judicious application of computer science. More broadly, it’s the dream of technology as salvation

promise of eternal life in space through technology—and its colonialist and eugenicist logic—has a clear ideological link to modern movements like transhumanism and singularitarianism. Through cosmism’s influence on twentieth-century science fiction, the link is historical as well. Timnit Gebru and Émile Torres have dubbed this set of

related ideologies (traced throughout this book) the TESCREAL bundle: Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism, and Longtermism. Gebru and Torres have done extensive work linking these ideologies to each other and to the core of racist

World Without Mind: The Existential Threat of Big Tech

by Franklin Foer  · 31 Aug 2017  · 281pp  · 71,242 words

, down to his financial ledgers, in anticipation of the day he can resurrect him. When the anthropologist of religion Robert Geraci studied Kurzweil and other singularitarians, he noticed how precisely their belief seemed to echo Christian apocalyptic texts. “Apocalyptic AI is the legitimate heir to these religious promises, not a bastardized

out little to no hope that computers will ever acquire anything approximating human consciousness. Then there are the revolutionaries who gravitate toward Kurzweil and the singularitarian view. They aim to build computers with either “artificial general intelligence” or “strong AI.” For most of Google’s history, it trained its efforts on

. In 2011, Page shifted himself back into the corner office, the CEO job he held at Google’s birth. And he redirected the company toward singularitarian goals. Over the years, he had befriended Kurzweil and worked with him on assorted projects. After he returned to his old job, Page hired Kurzweil

free ride. “If I were a student, this is where I would like to be,” Page has said. The company has indulged a slew of singularitarian obsessions. It has, for instance, invested heavily in Calico, a start-up that wants to solve the problem of death, as opposed to tackling comparatively

100 Plus: How the Coming Age of Longevity Will Change Everything, From Careers and Relationships to Family And

by Sonia Arrison  · 22 Aug 2011  · 381pp  · 78,467 words

even starts with a discussion of his religious upbringing and the first time he imagined that a computer could think. Aside from the title of “singularitarian,” he calls himself a “patternist” who “views patterns of information as the fundamental reality” (my italics).79 He argues that he knows the purpose of

close to God as I can imagine.”81 For those wondering what rituals this religion might have, aside from reading the relevant texts and attending singularitarian and transhumanist-themed conferences, big ones include carefully taking vitamin supplements, exercising, and perhaps even signing up for cryonics and wearing a bracelet identifying

than my own thoughts,” and it’s “my personal leap of faith” that “I believe in the existence of the universe.”83 The argument that singularitarianism can be viewed as religion doesn’t mean that it is wrong or somehow less legitimate. In a YouTube video taken at a conference where

posted the YouTube video gave it the title “Ray Kurzweil: The Singularity Is Not a Religion.” Although it might indeed be true that “being a singularitarian is not a matter of faith” and that Kurzweil did not come to his “perspective as a result of searching for an alternative to customary

freeing of our thinking from the severe limitations of its biological form to be an essential spiritual undertaking.”86 This exercise of looking at how singularitarians or transhumanists have built a set of ideas that can be modeled into a working religion demonstrates what at least one strong contemporary religion looks

at NASA’s Moffett Field in California, was named Singularity University, and he was appointed chancellor. (People often confuse Singularity University with the philosophical movement “singularitarianism,” but the two are not the same, as indicated in Chapter 7.) SU’s mission is practical: “to assemble, educate and inspire leaders who strive

Global Catastrophic Risks

by Nick Bostrom and Milan M. Cirkovic  · 2 Jul 2008

physical laws will work. In the same way, Vinge said, greater-than-human machine intelligence, multiplying exponentially, would make everything about our world unpredictable. Most Singularitarians, like Vinge and Kurzweil, have focused on the emergence of superhuman machine intelligence. But the even more fundamental concept is exponential technological progress, with the

superior mental abilities) after the Singularity. The rest of humanity may however be 'left behind'. This secular 'left behind' narrative is very explicit in the Singularitarian writings of computer scientist Hans Moravec ( 1990, 2000) . For Moravec the human race will be superseded by our robot children, among whom some of us

himself holding a sign with that slogan, referencing the classic cartoon image of the End Times street prophet, most Singularitarians angrily reject such comparisons, insisting their expectations are based solely on rational, scientific extrapolation. Other Singularitarians, however, embrace parallels with religious millennialism. John Smart, founder and director of the California-based Acceleration Studies

to one another and to machine intelligence in the emerging global telecommunications web, leading to the emergence of collective intelligence. This emergent collectivist form of Singularitarianism was proposed also by Peter Russell (1983) in The Global Brain, and Gregory Stock (1993) in Metaman. Smart (2007) argues that the scenario of an

-millennia! 'Omega Point' of union with God. Computer scientist Juergen Schmidhuber (2006) also has adopted Chardin's 'Omega' to refer to the Singularity. For most Singularitarians, as for most millennialists, the process of technological innovation is depicted as autonomous of human agency, and wars, technology bans, energy crises or simple incompetence

. 96 Signor-Lipps effect, palaeontology 120 Simulation Argument 138-40 Singh-Manoux, A. et al. 61-2 single-event toy model 122-4 Singularitarianism 79-80, 340 The Singularity is Near, Kurzweil, R. 79, 80, 82, 361 Sioux, Ghost Dance 78 sitter allele, Drosophila melanogaster 63 SIV (simian

The Rationalist's Guide to the Galaxy: Superintelligent AI and the Geeks Who Are Trying to Save Humanity's Future

by Tom Chivers  · 12 Jun 2019  · 289pp  · 92,714 words

intelligent systems start improving themselves fast enough, our usual ways of predicting the future – our assumptions that tomorrow will be essentially like today – will, say singularitarians, break down. The computer scientist and science-fiction writer Vernor Vinge wrote in 1983: ‘We will soon create intelligences greater than our own. When this

instead. Yudkowsky was not the first person to think about what would come after humans. He was firmly part of the traditions of transhumanist and singularitarian thinking, which had been around for years when he was writing ‘Staring into the Singularity’; some of the ideas they hurled about had existed for

in another way – but in its entirety, as humanity. We need a name for this new belief. Perhaps transhumanism will serve.’8 But transhumanism and singularitarianism really took off as philosophies in the last decades of the twentieth century. There were various different, and to some degree competing, ideas of what

brain to a computer, or linking human brains via computers, to improve human cognition – were a constant topic. All of this, naturally, overlapped with the ‘singularitarian’ vision of a world in which superintelligent AI or other technological advances rendered human life unrecognisable (but unrecognisable, they’d have said, in a good

We’ll get on to why the Rationalists think that AI is so dangerous soon. But first we should look at why they, and the singularitarians who came before them, are also so keen on it. The gamble, they think, is between extinction and godhood. According to the Rationalists, getting AI

to find out about the world.’ Eliezer Yudkowsky has addressed this before, in an interview with the science writer John Horgan, who had previously called singularitarianism a ‘religious rather than scientific vision’.2 ‘You’re trying to forecast empirical facts by psychoanalysing people,’ he told Horgan. ‘This never works. Suppose we

Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots

by John Markoff  · 24 Aug 2015  · 413pp  · 119,587 words

New Laws of Robotics: Defending Human Expertise in the Age of AI

by Frank Pasquale  · 14 May 2020  · 1,172pp  · 114,305 words

The Transhumanist Reader

by Max More and Natasha Vita-More  · 4 Mar 2013  · 798pp  · 240,182 words

To Be a Machine: Adventures Among Cyborgs, Utopians, Hackers, and the Futurists Solving the Modest Problem of Death

by Mark O'Connell  · 28 Feb 2017  · 252pp  · 79,452 words

Artificial Intelligence: A Modern Approach

by Stuart Russell and Peter Norvig  · 14 Jul 2019  · 2,466pp  · 668,761 words

Rationality: From AI to Zombies

by Eliezer Yudkowsky  · 11 Mar 2015  · 1,737pp  · 491,616 words

Whiplash: How to Survive Our Faster Future

by Joi Ito and Jeff Howe  · 6 Dec 2016  · 254pp  · 76,064 words

Accelerando

by Charles Stross  · 22 Jan 2005  · 489pp  · 148,885 words

Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software

by Scott Rosenberg  · 2 Jan 2006  · 394pp  · 118,929 words

Survival of the Richest: Escape Fantasies of the Tech Billionaires

by Douglas Rushkoff  · 7 Sep 2022  · 205pp  · 61,903 words

The Politics of Bitcoin: Software as Right-Wing Extremism

by David Golumbia  · 25 Sep 2016  · 87pp  · 25,823 words

Attack of the 50 Foot Blockchain: Bitcoin, Blockchain, Ethereum & Smart Contracts

by David Gerard  · 23 Jul 2017  · 309pp  · 54,839 words

Grand Transitions: How the Modern World Was Made

by Vaclav Smil  · 2 Mar 2021  · 1,324pp  · 159,290 words

Artificial Intelligence: A Guide for Thinking Humans

by Melanie Mitchell  · 14 Oct 2019  · 350pp  · 98,077 words

Pandora's Brain

by Calum Chace  · 4 Feb 2014  · 345pp  · 104,404 words

On the Edge: The Art of Risking Everything

by Nate Silver  · 12 Aug 2024  · 848pp  · 227,015 words

Boom: Bubbles and the End of Stagnation

by Byrne Hobart and Tobias Huber  · 29 Oct 2024  · 292pp  · 106,826 words

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy

by George Gilder  · 16 Jul 2018  · 332pp  · 93,672 words

When Computers Can Think: The Artificial Intelligence Singularity

by Anthony Berglas, William Black, Samantha Thalind, Max Scratchmann and Michelle Estes  · 28 Feb 2015

Zero to One: Notes on Startups, or How to Build the Future

by Peter Thiel and Blake Masters  · 15 Sep 2014  · 185pp  · 43,609 words

Amateurs!: How We Built Internet Culture and Why It Matters

by Joanna Walsh  · 22 Sep 2025  · 255pp  · 80,203 words