by Terrence J. Sejnowski · 27 Sep 2018
Printed and bound in the United States of America. Library of Congress Cataloging-in-Publication Data Names: Sejnowski, Terrence J. (Terrence Joseph), author. Title: The deep learning revolution / Terrence J. Sejnowski. Description: Cambridge, MA : The MIT Press, 2018. | Includes bibliographical references and index. Identifiers: LCCN 2017044863 | ISBN 9780262038034 (hardcover : alk. paper)
…
If you use voice recognition on an Android phone or Google Translate on the Internet, you have communicated with neural networks1 trained by deep learning. In the last few years, deep learning has generated enough profit for Google to cover the costs of all its futuristic projects at Google X, including self-driving cars
…
and Functional Architecture in the Cat’s Visual Cortex,” which reported for the first time the response properties of single neurons recorded with a microelectrode. Deep learning networks have an architecture similar to the hierarchy of areas in the visual cortex. 1969—Marvin Minsky and Seymour Papert published Perceptrons, which pointed
…
data; information can be used to create knowledge; knowledge leads to understanding; and understanding leads to wisdom. Welcome to the brave new world of deep learning.1 Deep learning is a branch of machine learning that has its roots in mathematics, computer science, and neuroscience. Deep networks learn from data the way that babies
…
s tensor processing unit (TPU) is now deployed on servers around the world, delivering an order-of-magnitude improvement in performance for deep learning applications. An example of how quickly deep learning can change the landscape is the impact it has had on language translation—a holy grail for artificial intelligence since it depends
…
on the ability to understand a sentence. The recently unveiled new version of Google Translate based on deep learning represents a quantum leap improvement in the quality of translation between natural languages. Almost overnight, language translation went from a fragmented hit-and-miss
…
jumble of phrases to seamless sentences (figure 1.3). Previous computer methods searched for combinations of words that could be translated together, but deep learning looks for dependencies across whole sentences. Alerted to the sudden improvement of Google Translate, on November 18, 2016, Jun Rekimoto at the University of
…
leopard. No one has ever explained what leopard wanted at that altitude.10 (Hemingway is #1.) The next step will be to train larger deep learning networks on paragraphs to improve continuity across sentences. Words have long cultural histories. Vladimir Nabokov, the Russian writer and English-language novelist who wrote Lolita
…
on slides is done by experts who make mistakes, mistakes that have deadly consequences. This is a pattern recognition problem at which deep learning should excel. And indeed, a deep learning network trained on a large dataset of slides for which ground truth was known reached an accuracy of 0.925, good but not
…
review, or discovery, will be taken over by artificial intelligence, which can sort through thousands of documents for legal evidence without getting tired. Automated deep learning systems will also help law firms comply with the increasing complexity of governmental regulations. They will make legal advice available for the average person who
…
given to the winner at the end of a sequence of moves, which paradoxically can improve decisions made much earlier. When coupled with many powerful deep learning networks, this leads to many domain-dependent bits of intelligence. And, indeed, cases have been made for different domain-dependent kinds of intelligence: social, emotional,
…
steadily increasing since the advent of computer programs that play at championship levels, and so has the machine-augmented intelligence of the human players.40 Deep learning will boost the intelligence not just of scientific investigators but of workers in all professions. Scientific instruments are generating data at a prodigious rate. Elementary particle
…
than almost anyone else in the world and will not forget anything, becoming, in effect, your virtual doppelganger. By pressing both Internet tracking and deep learning into service, the educational opportunities for the children of today’s children will be better than the best available today to wealthy families. These grandchildren
…
beginning to benefit. Alexa, a wildly popular digital assistant operating in tandem with the Amazon Echo smart speaker, responds to natural language requests based on deep learning. Amazon Web Services (AWS) has introduced toolboxes called “Lex,” “Polly,” and “Comprehend” that make it easy to develop the same natural language interfaces based
…
networks in the 1980s and as the president of the Neural Information Processing Systems (NIPS) Foundation, which has overseen discoveries in machine learning and deep learning over the last thirty years. My colleagues and I in the neural network community were for many years the underdogs, but our persistence and patience
…
earlier level. The decision demon weighs the degree of excitement and importance of its informants. This form of evidence evaluation is a metaphor for current deep learning networks, which have many more levels. From Peter H. Lindsay and Donald A. Norman, Human Information Processing: An Introduction to Psychology, 2nd ed. (New
…
based on the architecture of the visual system that used convolutional filters and a simple form of Hebbian plasticity and was a direct precursor of deep learning networks. And, for a third, Teuvo Kohonen, an electrical engineer at Helsinki University, developed a self-organizing network that could learn to cluster similar
…
networks was possible. 1986—David Rumelhart and Geoffrey Hinton publish “Learning Internal Representations by Error-Propagation,” which introduced the “backprop” learning algorithm now used for deep learning. 1988—Richard Sutton publishes “Learning to Predict by the Methods of Temporal Differences” in Machine Learning. Temporal difference learning is now believed to be the
…
2012 paper “ImageNet Classification with Deep Convolutional Neural Networks” reduces the error rate for correctly classifying objects in images by 18 percent. 2017—AlphaGo, a deep learning network program, beats Ke Jie, the world champion at Go. Chapter 6: The Cocktail Party Problem
…
open. Recent experiments on neural network learning of language support the gradual acquisition of inflectional morphology, consistent with human learning.12 The success of deep learning with Google Translate and other natural language applications in capturing the nuances of language further supports the possibility that brains do not need to use
…
many networks yield the same behavior, the key to understanding them is the learning algorithms used by brains, which should be easier to discover. Understanding Deep Learning Whereas, in convex optimization problems, there are no local minima and convergence is guaranteed to the global minimum, in nonconvex optimization problems, this is
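The contrast the passage draws can be stated compactly. The inequality below is the standard definition of convexity, not notation from the book:

```latex
% A function f is convex when, for all x, y and t in [0,1],
% the chord lies above the graph. Every local minimum of a convex
% f is global, so gradient descent is guaranteed to converge; the
% nonconvex cost surfaces of deep networks carry no such guarantee.
\[
  f\bigl(t x + (1-t)\,y\bigr) \;\le\; t\,f(x) + (1-t)\,f(y),
  \qquad t \in [0,1].
\]
```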
…
a network than those we receive from humans? Recall that consciousness does not have access to the inner workings of brains. Deep learning networks typically provide not one but several leading predictions in rank order, which gives us some information about the confidence of a conclusion. Supervised neural
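The book does not spell out a mechanism for these ranked predictions, but a common one is to report the top few softmax outputs of a classifier. A minimal sketch, with made-up class names:

```python
import numpy as np

def top_k_predictions(logits, labels, k=3):
    """Turn a network's raw output scores into a ranked list, with
    softmax probabilities as a rough confidence signal."""
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    order = np.argsort(probs)[::-1][:k]    # indices of the k largest
    return [(labels[i], float(probs[i])) for i in order]

# Hypothetical classifier scores over three classes:
scores = np.array([2.0, 1.2, 0.1])
print(top_k_predictions(scores, ["leopard", "cheetah", "jaguar"]))
# [('leopard', 0.63), ('cheetah', 0.28), ('jaguar', 0.09)]
```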
…
zip codes on letters, using the Modified National Institute of Standards and Technology (MNIST) Figure 9.1 Geoffrey Hinton and Yann LeCun have mastered deep learning. This photo was taken at a meeting of the Neural Computation and Adaptive Perception Program of the Canadian Institute for Advanced Research around 2000, a
…
what the distributed representations at the top of the hierarchy were meant to accomplish. This illustrates the potential for fruitful symbiotic relationships between biology and deep learning. Deep Learning Meets the Visual Hierarchy A philosopher of mind, Patricia Churchland specializes in neurophilosophy at UC San Diego.13 That knowledge ultimately depends on how
…
following their intuitions; the theory of thermodynamics that explained how the engines worked came later, along with improvements in their efficiency. The analysis of deep learning networks by physicists and mathematicians is well under way. Working Memory and Persistence of Activity Neuroscience has come a long way since
…
-range dependencies are preserved selectively. This version of working memory in neural networks lay dormant for twenty years until it was awakened and implemented in deep learning networks, where it has been spectacularly successful in many domains that depend on learning sequences of inputs and outputs, such as movies, music, movements,
…
at Amherst, on difficult problems in reinforcement learning, a branch of machine learning inspired by associative learning in animal experiments (figure 10.2). Unlike a deep learning network, whose only job is to transform inputs into outputs, a reinforcement network interacts in a closed loop with the environment, receiving sensory input,
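A minimal sketch of the closed loop described here; the Environment interface (reset/step) is an assumption for illustration, not a specific library:

```python
def run_episode(env, policy, learn):
    """Agent-environment loop: act, observe the consequence and the
    reward, and learn from that feedback -- unlike a deep learning
    network that only maps inputs to outputs."""
    obs = env.reset()                     # initial sensory input
    done, total_reward = False, 0.0
    while not done:
        action = policy(obs)              # choose an action
        next_obs, reward, done = env.step(action)  # world responds
        learn(obs, action, reward, next_obs)       # update from feedback
        obs, total_reward = next_obs, total_reward + reward
    return total_reward
```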
…
championship Go translate to solving other complex problems? Much of human learning is based on observation and mimicry, and we need far fewer examples than deep learning does to learn to recognize a new object. Unlabeled sensory data are abundant, and powerful unsupervised learning algorithms might use these data to advantage before
…
any supervision takes place. In chapter 7, an unsupervised version of the Boltzmann learning algorithm was used to initialize deep learning networks; in chapter 6, independent component analysis (ICA), an unsupervised learning algorithm, extracted sparse population codes from natural images; and in chapter 9,
…
at Figure 11.1 Logo of the Neural Information Processing Systems conferences. Founded thirty years ago, NIPS conferences are the premier conferences on machine and deep learning. Courtesy of the NIPS Foundation. Figure 11.2 Edward “Ed” Posner at Caltech, who founded the NIPS conferences, which
…
new Halıcıoğlu Data Science Institute. Master’s in Data Science degrees (MDSs) are becoming as popular as MBAs. Deep Learning at the Gaming Table Deep learning came of age at the 2012 NIPS Conference at Lake Tahoe (figure 11.3). Geoffrey Hinton, an early pioneer in neural networks, and
…
temporal segmentation to video,19 a performance good Figure 12.6 Marian Stewart-Bartlett demonstrating facial expression analysis. The time lines are the output of deep learning networks that are recognizing facial expression for happiness, sadness, surprise, fear, anger, and disgust. Courtesy of Marian Stewart-Bartlett. Robert Wright/LDV Vision Summit 2015.
…
a company called “Emotient” to commercialize the automatic analysis of facial expressions. Paul Ekman and I served on its Scientific Advisory Board. Emotient developed deep learning networks that had an accuracy of 96 percent in real time and with natural behavior, under a broad range of lighting conditions, and with nonfrontal
…
accomplish.19 But who could have predicted how well neural networks would scale in their performance? The Wolfram language that supports Mathematica now also supports deep learning applications, one of which was the first to provide online object recognition in images.20 Stephen introduced me to Beatrice Golomb, who was working
…
world. But now there are openly available alternatives to TensorFlow: CNTK from Microsoft, MXNet, backed by Amazon and other major Internet companies, and other viable deep learning programs, such as Caffe, Theano, and PyTorch. Hot Chips In 2011, I organized “Growing High Performance Computing in a Green Environment,” a symposium sponsored
…
need to move forward, not look backward. At every step along the way, adding a new feature from brain architecture has boosted the functionality of deep learning networks: the hierarchy of cortical areas; the brain’s coupling of deep with reinforcement learning; working memory in recurrent cortical networks; and long-term
…
after Minsky’s death, Alex Graves, Greg Wayne, and colleagues, researchers at DeepMind, achieved the next step toward a general artificial intelligence based on deep learning by adding a dynamic external memory.25 Activity patterns can only be stored temporarily in a deep recurrent neural network, which makes it difficult to
…
systems across spatial and temporal scales: gene networks, metabolic networks, immune networks, neural networks, and social networks—it’s networks all the way down. Deep learning depends on optimizing a cost function. What are the cost functions in nature? The inverse of cost in evolution is called fitness, but that is
…
1992; and many other foundational books on machine learning, including Richard Sutton and Andrew Barto’s Reinforcement Learning: An Introduction, and the leading textbook Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville. The Press’s Robert Prior helped guide the present volume around many an unexpected bend in its
…
I did not know this at the time. Recommended Reading An Introduction to Neuroscience The Deep Learning Revolution only briefly touches on neuroscience, which is itself a vast field with a rapidly advancing scientific frontier. The part of neuroscience most relevant to
…
deep neural networks, 35 depends on optimizing a cost function, 267 meets the visual hierarchy, 132–133 origin and roots of, 3 understanding, 119–122 Deep learning systems, 159. See also specific topics DeepLensing, 21 DeepMind, 17, 20, 154. See also AlphaGo Deepstack, 15, 24 Index Defense Advanced Research Projects Agency
…
Goldilocks problem in, 112 Language acquisition, 184. See also Chomsky, Noam Language disorders, 190 Language translation. See Translation 331 Larochelle, Hugo, 302n4 Law firms, automated deep learning systems and, 15 Lawrence, David T., 44f, 291n9 Learning, 258. See also specific topics Chomsky and, 248f, 249f, 250, 251 forms of, 154–159
…
201, 267. See also Multilayer learning algorithms; Unsupervised learning algorithms; specific algorithms building a new generation of chips to run, 205 complex systems and, 196 deep learning and, 133, 140–141, 201 explored through simulations of small networks, 258 explosion of, 110 in historical context, 137, 172–173 unifying concepts and,
…
Minsky, Marvin Lee Petascale computing, 206, 208 Peterson, Roger Tory, 30f, 290n3 Phonemes, 113, 114f, 115, 116, 158 Piantoni, Giovanni, 227, 228f Picture captioning with deep learning, 135, 136f Pinker, Steven, 300n11 Pitts, Walter H., 106, 200, 298n21, 312n11 Planning Workshop on Facial Expression Understanding, 180–181 Plasticity critical period of, in
by David G. W. Birch and Victoria Richardson · 28 Apr 2024 · 249pp · 74,201 words
about this direction in investing is because historical robo-advising was essentially jazzed-up machine learning. The custobots that Gartner is talking about will use deep-learning algorithms to deliver something very different, and they will require very different services from financial institutions. As an obvious example, companies may have to provide
by Maximilian Kasy · 15 Jan 2025 · 209pp · 63,332 words
Story Misses 3. What This Book Does Part II. How AI Works 4. What Is Artificial Intelligence? 5. Supervised Learning 6. Overfitting and Underfitting 7. Deep Learning 8. The Exploration/Exploitation Trade-Off 9. Key Ideas to Remember Part III. Machine Power 10. Social Welfare 11. The Means of Prediction 12. Agents
…
called cross-validation. Supervised learning relies on picking the model that does best, according to this cross-validation criterion. One method for making predictions uses deep learning, a type of supervised learning that is based on training neural nets. Neural nets allow for modeling very complicated relationships. They have been extremely successful
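A sketch of the cross-validation criterion just described, assuming a generic model object with fit/predict methods (the interface is illustrative, not from the book):

```python
import numpy as np

def cross_val_error(make_model, X, y, n_folds=5, seed=0):
    """k-fold cross-validation: hold out each fold in turn, train on
    the rest, and average the out-of-sample squared error."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), n_folds)
    errors = []
    for i, test in enumerate(folds):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        model = make_model()
        model.fit(X[train], y[train])
        errors.append(np.mean((model.predict(X[test]) - y[test]) ** 2))
    return np.mean(errors)  # pick the model with the lowest value
```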
…
relatively unsuccessful academic niche since the 1960s. This dramatically changed in the early 2000s. Based on scaling of data and computational power, the methods of deep learning in neural networks reached a critical point. The performance of these machine learning methods suddenly beat all other approaches to the problems of AI. This
…
future. Availability of data, however, might be such a limiting factor, in ways that vary widely across domains. On one extreme, where big successes using deep learning were possible, are games such as chess and go, where almost infinite data can be generated by computer self-play. A close second might be
…
of prediction errors on the holdout sample. A variant of this approach (of using sample splitting for tuning), called early stopping, is often used in deep learning, which we discuss in the next chapter. In early stopping, during the training of a neural network, predictions are repeatedly updated a little bit, in
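A sketch of early stopping as described, with a hypothetical model interface (gradient_step, error, and weights are assumptions for illustration):

```python
def train_with_early_stopping(model, train_data, holdout_data,
                              max_steps=10_000, patience=20):
    """Nudge predictions a little at a time; stop once the holdout
    error has failed to improve for `patience` consecutive steps."""
    best_err, best_weights, stalled = float("inf"), model.weights, 0
    for _ in range(max_steps):
        model.gradient_step(train_data)   # small update on training data
        err = model.error(holdout_data)   # check the held-out sample
        if err < best_err:
            best_err, best_weights, stalled = err, model.weights, 0
        else:
            stalled += 1
            if stalled >= patience:
                break                     # further training would overfit
    model.weights = best_weights          # keep the best version seen
    return model
```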
…
attractive in settings with very large datasets and complicated models, where the cost of computation is a concern. This includes the most successful applications of deep learning. Deep learning forms the basis of almost all the recent spectacular successes of AI. It is also one of the most mystified branches of machine learning, evoking
…
created artificial brains. Our technical discussion will serve to dispel some of this mystification. But first we should answer the question, What is deep learning, actually? Stay tuned! 7 Deep Learning Alchemists have long dreamed of creating artificial life. In the early sixteenth century, the Swiss alchemist and medical pioneer Paracelsus provided a recipe
…
least artificial neural nets, since the 1960s. These researchers were adherents of what used to be known as the connectionist paradigm, which is now called deep learning. The connectionist paradigm stood in contrast to the symbolic paradigm. The latter, which was once dominant in the field of AI, sought to build AI
…
based on higher-level abstractions, instead of reducing intelligence to the level of neurons. The connectionist paradigm, deep learning, and artificial neural nets were the subject of a somewhat obscure academic niche field within AI for several decades. Now, since shortly after the turn
…
of the millennium, they dominate most branches of AI. John Hopfield and Geoffrey Hinton, two of the pioneers of deep learning from its time of obscurity, received the 2024 Nobel Prize in Physics. Why did this belated success of neural nets occur? Some recent innovations notwithstanding
…
, many of the ideas behind deep learning have been around for quite a while. Rather than involving major conceptual breakthroughs, the delayed success of neural nets can be attributed to the increased
…
growth has been sustained since the early 1970s, but there might be fundamental physical limits preventing the continuation of this trend in the future. How Deep Learning Works Deep learning is a special case of supervised learning, where artificial neural networks are used to make predictions. We have already discussed numerous examples of prediction
…
are predicted from images of faces), predicting translated text in one language from text in another language, and so forth. The neural networks used in deep learning define a particular class of prediction functions. A function takes a bunch of numbers—pixel values, say, or some numerical encoding of text—and returns
…
. If the network has multiple functions layered on top of each other like this, the network is called deep. This is the technical meaning of deep learning: Complicated prediction functions are built from a chain of simpler functions. (The technical meaning aside, calling this approach deep was also good marketing. The term
…
deep learning clearly resonated with investors and journalists, arguably more than connectionist paradigm could.) Why would we call a prediction function built in this way, as a
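The technical meaning given above, a chain of simpler functions, fits in a few lines of code. A minimal sketch (the ReLU layers here are one common choice, not specified in the text):

```python
import numpy as np

def layer(W, b):
    """One simple function: an affine map followed by a nonlinearity."""
    return lambda x: np.maximum(0.0, W @ x + b)  # ReLU activation

def deep_network(layers):
    """'Deep' in the technical sense: functions layered on top of
    each other, each feeding its output to the next."""
    def predict(x):
        for f in layers:
            x = f(x)
        return x
    return predict

rng = np.random.default_rng(0)
net = deep_network([layer(rng.normal(size=(4, 3)), np.zeros(4)),
                    layer(rng.normal(size=(2, 4)), np.zeros(2))])
print(net(np.array([0.5, -1.0, 2.0])))  # a prediction from two chained layers
```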
…
network depends on a set of numbers called weights. If we change the weights, we get different predictions for the same inputs. The goal of deep learning is to find weights that deliver good predictions. In the neural network interpretation, the weights describe the strength of synaptic connections between different neurons. Learning
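A toy illustration of "finding weights that deliver good predictions," using plain gradient descent on a linear predictor, a deliberately minimal stand-in for a neural network:

```python
import numpy as np

def fit_weights(X, y, steps=2000, lr=0.05):
    """Repeatedly nudge the weights downhill on the mean squared
    prediction error until the predictions are good."""
    w = np.zeros(X.shape[1])                  # initial weights
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of the error
        w -= lr * grad                         # small step downhill
    return w

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])
print(fit_weights(X, y))  # converges toward the weights [1, 2]
```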
…
. But relative to other methods in machine learning, this procedure is quite simple, in terms of the calculations involved. Long before the recent rise of deep learning, researchers in optimization and statistics had devised a wide range of more complicated and sophisticated ways to learn, improve predictions, and avoid overfitting. It is
…
the relative simplicity of the calculations involved in deep learning (compared to its technical antecedents) that has allowed it to scale up so much. Its simplicity makes it possible to deal with massive amounts of
…
time. Availability of this specialized hardware has contributed to the rise of deep learning. Neural Nets Are Not Artificial Brains It is tempting to think of deep learning and neural nets as artificial brains. Don’t. Despite the original biological inspiration, modern deep learning has little in common with biological neural nets, or with the process
…
of learning in human brains or animal brains. Three differences between biological and artificial neural nets stand out. First, the way deep learning algorithms update the parameters of a neural net propagates information backward through the network. This is something that does not happen for biological neurons. Information
…
parallel. These functions are quite different from functions that would model biological neurons more realistically. Instead of thinking of deep learning in terms of artificial brains, it is more useful to think of deep learning as the craft of building complicated functions out of simpler functions—to model complicated relationships, as encountered in domains
…
earlier in the sentence. Transformer models learn to pay attention to this relevant context. The development of such specialized architectures for neural nets, and of deep learning more generally, is based on a large amount of tinkering and trial and error. It does indeed resemble alchemy at times, in the spirit of
…
Paracelsus or of the Muslim alchemists of the Jabirian corpus. There is something rather unsettling about this state of deep learning—at least for people with a taste for theory and an interest in machine learning, like me: Neural nets work better than they should, according
…
figure 3, models that are too complex tend to overfit, and deliver poor predictions out-of-sample. How is it possible that the models of deep learning can have many more parameters (weights) than observations and still deliver good predictions, seemingly avoiding the fate of overfitting? And why do they often extrapolate
…
regarding how neural networks solve (prediction) problems does not prevent us from having a broad discussion regarding which (prediction) problems they should be solving. In deep learning, as in AI more broadly, such a discussion needs to be the starting point for democratic governance. Self-Supervised Learning and Generative AI As noted
…
of realistic images has also made great advances, notably in algorithms such as Stable Diffusion. Both text generation and image generation build on self-supervised deep learning. The specific model architectures used in these domains are fairly recent inventions. Only time will tell, but one might conjecture that these architectures will be
…
models Source: Wikipedia, “Large Language Model,” accessed October 1, 2024, https://en.wikipedia.org/wiki/Large_language_model Language modeling is one success story of deep learning; image generation is another one. The most successful algorithms for image generation today use diffusion models. Diffusion models are based on the following idea: Start
…
control the foundation model (e.g., OpenAI/Microsoft) and the user who chooses the prompt. Neural nets and deep learning have had surprising successes in many domains, one of which is generative AI. But deep learning remains within the framework of supervised learning—that is, of prediction. There is something very important that is
…
. This is also a very data-hungry approach. This approach can build on the machinery of supervised learning that we discussed before—in particular, on deep learning. Deep learning is indeed what programs like TD-Gammon and AlphaGo used. They trained neural networks to predict the probability of winning, in the recursive manner described
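A toy version of that recursive prediction idea, in the spirit of temporal-difference value updates; the dictionary-of-positions setup is illustrative, not how TD-Gammon or AlphaGo actually represent values:

```python
def td_update(values, positions, outcome, alpha=0.1):
    """After one game, nudge each position's estimated win probability
    toward the estimate for the position that followed it; the final
    position is nudged toward the actual outcome (1 = win, 0 = loss)."""
    targets = [values.get(p, 0.5) for p in positions[1:]] + [outcome]
    for pos, target in zip(positions, targets):
        v = values.get(pos, 0.5)                # default: 50-50
        values[pos] = v + alpha * (target - v)  # temporal-difference step
    return values

values = td_update({}, ["opening", "midgame", "endgame"], outcome=1.0)
print(values)  # estimates drift toward the observed win
```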
…
, to the extent that they exist, thus need to rely on approaches other than pure reinforcement learning. There is a more general lesson here. Current deep-learning-based methods in AI need lots of data. There are some settings where data have become readily available, for instance via self-play in simulated
…
environments or by scraping the internet for enormous databases of text or images. In the contexts where such enormous amounts of data have become available, deep learning has exceeded all expectations. At the same time, there might be fundamental limits to how much data can be generated in many domains. We can
…
its central processing unit, or CPU. The CPU is where all the actual calculations take place that let a computer run. Machine learning, and especially deep learning, requires a very large number of calculations during the training of new models. The calculations involved are surprisingly simple, however. For the most part, these
…
calculations reduce to a large number of multiplications and additions. This is especially true for modern deep learning. Specialized processors have been developed in recent years that are able to process a huge number of multiplications and additions in parallel (at the same
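The point about parallel multiply-adds, made concrete: a single layer's forward pass is one matrix multiplication, and each output entry is just a sum of products. The sizes below are arbitrary:

```python
import numpy as np

activations = np.random.rand(512, 1024)  # a batch of layer inputs
weights = np.random.rand(1024, 256)      # one layer's weight matrix
out = activations @ weights              # ~512 * 1024 * 256 multiply-adds
print(out.shape)                         # (512, 256)
```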
…
graphics requires a great deal of multiplications and additions that are like those needed for training neural nets. GPUs and TPUs are essential for modern deep learning. Even though there are a few companies that can produce these chips, the market for GPUs is almost entirely controlled by one company, NVIDIA, which
…
AI. This energy hunger has deeper causes, according to the following argument by Geoffrey Hinton, one of the pioneers of deep learning (and winner of the 2024 Nobel Prize in Physics). Deep learning—and machine learning more generally—uses digital computers. Digital computers consist of many transistors, which are used to store and process
…
. Computer scientists are often worried about the robustness of their systems, and about vulnerabilities to adversarial attacks. In the context of image classification, for example, deep learning algorithms might be very successful at identifying different kinds of animals in photos. In the test and training data, they might always correctly distinguish whether
…
still limited. Some of the best mathematicians and theoretical physicists are currently trying to make progress on these questions. If a theoretical understanding of how deep learning works were necessary for democratic control, our prospects would be bleak. But such a theoretical understanding is not necessary. What is necessary is a broadly
…
Spree in Gaza.” +972 Magazine, April 3, 2024. https://www.972mag.com/lavender-ai-israeli-army-gaza/. Bartlett, P. L., A. Montanari, and A. Rakhlin. “Deep Learning: A Statistical Viewpoint.” Preprint, arXiv, March 16, 2021, https://doi.org/10.48550/arXiv.2103.09177. Berger, J. Statistical Decision Theory and Bayesian Inference. Springer
…
–4 (2018): 219–354. Friedman, J., T. Hastie, and R. Tibshirani. The Elements of Statistical Learning. Springer, 2001. Goodfellow, I., Y. Bengio, and A. Courville. Deep Learning. MIT Press, 2016. Jurafsky, D., and J. H. Martin. Speech and Language Processing. 3rd ed. Accessed September 2024. https://web.stanford.edu/~jurafsky/slp3/. Kasy
…
commons, 86 Creator, The (film), 3 cross-validation, 11 data: as basis of AI, 15, 26; as component of the means of prediction, 84–89; deep learning dependent on large amounts of, 45, 63–65; democratic governance of, 16, 145–47; direct management of, 146; externalities generated by, 88–89, 109, 142
…
economics, 24–25; explainability of, 176, 186–88; ingredients of, 23–24; political and economic aspects of, 25–26 decision trees, 178 Deep Blue, 61 deep learning, 11, 27, 44–56; and artificial vs. biological neural nets, 50; computational capacity required for, 49–50, 89–90; connectionist paradigm as basis of, 44
…
compared to, 26; factors in success of, 26–27; as the generation of externalities, 15–16, 88, 109, 143; limitations of, 87–88. See also deep learning; supervised learning marginal productivity, 150–55, 166–67 Maria Theresa, empress of Austria, 19 market power, 104, 109, 154–55 Marx, Karl, 82 Mastodon, 106
…
, 50, 94; building, 50–51; and complex modeling, 11, 41, 50; in diffusion models, 54; energy consumption of, 93; illustration of, 47; role of, in deep learning, 11, 27, 45–50, 56 news media, as change agent, 106–8 +972 Magazine, 31, 133 Nobel Prize, 45, 93, 132, 150, 155 Nordic Graphic
…
of decision functions and, 178 statistics, 26–27 stochastic gradient descent, 49 stock options, 126 stratified sampling, 198 supervised learning, 10, 29–36. See also deep learning; prediction; self-supervised learning surveillance capitalism, 85 survey responses, 136–38 Sweden, 158–60 Swedish Trade Union Confederation, 159 symbolic paradigm, 44 taste-based discrimination
by Tim Wu · 4 Nov 2025 · 246pp · 65,143 words
problems with Rosenblatt’s Perceptron. After devising a solution to its most famous limit in the 1980s, in 2006 he co-introduced the concept of “deep learning,” which did much to enhance the ability of neural networks to educate themselves. Since its 1958 debut, the trademark of a neural network has been
…
coauthors wrote, a neural network “allows a machine to be fed with raw data and to automatically discover the representations needed for detection or classification.” Deep learning added the idea of approaching such pattern recognition in stages, beginning with lower-level categories (like letters) before higher-level categories (like words). Like much
…
AI capable of categorizing new and unseen images correctly. Previous winners had relied mainly on handcrafted code. But Hinton’s three-person team deployed a deep-learning neural net named AlexNet, with eight layers and some 60 million parameters. “AlexNet didn’t just win; it dominated,” wrote one observer, showing that
…
“deep learning was more than a pipe dream.”[10] AlexNet’s victory might have seemed a vindication of Rosenblatt’s ideas from the 1950s. Yet some remained
…
dubious. Gary Marcus, a cognitive scientist at NYU, wrote a dismissive article in The New Yorker after the event. “Deep learning,” he announced, “takes us, at best, only a small step toward the creation of truly intelligent machines.”[11] He expressed skepticism that one could “build
…
a machine that could understand stories” using deep learning. “Hinton has built a better ladder, but a better ladder doesn’t necessarily get you to the moon.” But the main tech platforms thought differently
…
-layer Perceptron with a backpropagation algorithm, but that solution was described only in 1986. “AlexNet and ImageNet: The Birth of Deep Learning,” Pinecone, accessed November 22, 2024, https://www.pinecone.io/learn/series/image-search/imagenet/. Gary Marcus, “Is
…
‘Deep Learning’ a Revolution in Artificial Intelligence?,” New Yorker, November 25, 2012, https://www.newyorker.com/news/news-desk/is-deep-learning-a-revolution-in-artificial-intelligence. Cade Metz, “ ‘The Godfather of A
…
, 95–99 chatbots, 20, 79, 88, 90, 94–95, 98–99, 101 Connectionist, 91–93 data extraction for/by, 4, 84, 88, 93–95, 98 deep learning by, 93–94 distinguishing humans from, 6, 142–43 as economic equalizer, 6–7, 144 emotional attachments to, 100–101 ImageNet data used for, 84
…
, 80–81 of human biometrics, 142–44 predictive, 68, 85–88 The Death and Life of Great American Cities (Jacobs), 19 Deep Blue computer, 92 deep learning, AI, 93–94 DeepMind firm, 98 DeepSeek chatbot, 94–95, 101 de Garton, Ms., 162–63 Denmark, land ownership in, 127–29 dependence, 72, 140
by James Ashton · 11 May 2023 · 401pp · 113,586 words
focus on diversifying the product portfolio so that it targeted key markets and increased investment in each of them. When Nvidia and Arm partnered on ‘deep learning inferencing’ in March 2018 – essentially helping to train computers to think like a human brain – Haas was front and centre. At Arm’s 2018 TechCon
by Gavin Hackeling · 31 Oct 2014
use either hand-engineered feature extraction methods that are applicable to many different problems, or automatically learn features without supervision using techniques such as deep learning. We will focus on the former in the next section. Extracting points of interest as features The feature
by Douglas Rushkoff · 1 Mar 2016 · 366pp · 94,209 words
the Bay Area, is the sort of business for which the flex corp structure works well. Vicarious operates in the field of artificial intelligence and deep learning; its most celebrated project to date is an attempt to crack CAPTCHAs (those annoying tests of whether a user is human) using AI. Vicarious claims
by Vivek Wadhwa and Alex Salkever · 2 Apr 2017 · 181pp · 52,147 words
kill switch on its A.I. systems.12 Other researchers are developing tools to visualize the otherwise impenetrable code in machine-generated algorithms built using Deep Learning systems. So the question that we must always be able to answer in the affirmative is whether we can stop it. With both A.I
by Thomas W. Malone · 14 May 2018 · 344pp · 104,077 words
20,000 categories of objects, including human faces, human bodies, and… cat faces.19 This system used a particularly promising approach to machine learning called deep learning, which loosely simulates the way the different layers of neurons in a brain are connected to one another. Neuromorphic Computing Still another intriguing approach to
by Susanne Foitzik and Olaf Fritsche · 5 Apr 2021 · 335pp · 86,900 words
new combinations within Ophiocordyceps. I. Myrmecophilous hirsutelloid species. Studies in Mycology, 90, 119–60. Fredericksen, M. A. et al. (2017). Three-dimensional visualization and a deep-learning model reveal complex fungal parasite networks in behaviorally manipulated ants. Proceedings of the National Academy of Sciences USA, 114, 12590–95. Hughes, D. P. et