description: field of computer science and linguistics
220 results
by Yuli Vasiliev · 2 Apr 2020
NATURAL LANGUAGE PROCESSING WITH PYTHON AND SPACY: A Practical Introduction, by Yuli Vasiliev. San Francisco. Copyright © 2020 by Yuli Vasiliev. All rights reserved. No part of this work may be reproduced or transmitted in any form
…
information contained in it. About the Author Yuli Vasiliev is a programmer, freelance writer, and consultant specializing in open source development, Oracle database technologies, and natural language processing (NLP). Currently, he works as a consultant for the bot project Porphyry. The bot implements NLP techniques to give meaningful responses to user questions. A
…
BI analytics and developing machine learning models for the Online Partnerships Group at Google, specializing in mobile app monetization. BRIEF CONTENTS Introduction Chapter 1: How Natural Language Processing Works Chapter 2: The Text-Processing Pipeline Chapter 3: Working with Container Objects and Customizing spaCy Chapter 4: Extracting and Using Linguistic Features Chapter 5
…
spaCy Library Who Should Read This Book? What’s in the Book? 1 HOW NATURAL LANGUAGE PROCESSING WORKS How Can Computers Understand Language? Mapping Words and Numbers with Word Embedding Using Machine Learning for Natural Language Processing Why Use Machine Learning for Natural Language Processing? What Is a Statistical Model in NLP? Neural Network Models Convolutional Neural Networks for
…
they’re getting smarter. Even so, very few people understand how these robots work or how they might use these technologies in their own projects. Natural language processing (NLP)—a branch of artificial intelligence that helps machines understand and respond to human language—is the key technology that lies at the heart of
…
real-world problems, such as analyzing sentences, capturing the meaning of a text, composing original texts, and even building your own chatbot. Using Python for Natural Language Processing If you want to develop an NLP application, you can choose among a wide range of tools and technologies. All the examples in this book
…
you have a good understanding of NLP concepts and some basic programming, the examples will be even easier to follow. What’s in the Book? Natural Language Processing with Python and spaCy begins with a brief introduction to the basic elements and methods of the NLP technology used to process and analyze natural
…
” sections in each chapter will help you reinforce the material you just learned. Here’s what you’ll find in each chapter: Chapter 1: How Natural Language Processing Works Provides a brief introduction to the basic elements of NLP technology. It describes the machine learning techniques that generate the data NLP libraries use
…
and syntax elements discussed most frequently in the book. Readers who don’t come from linguistic backgrounds can use it as a reference. 1 HOW NATURAL LANGUAGE PROCESSING WORKS In the 19th century, explorers discovered rongorongo, a system of mysterious glyphs on the island of Rapa Nui (commonly known as Easter Island). Researchers
…
which they describe things—you most likely won’t understand the other aspects of their life, including what they do and why they do it. Natural language processing (NLP) is a subfield of artificial intelligence that tries to process and analyze natural language data. It includes teaching machines to interact with humans in
…
. In this book, you’ll use the Python programming language to build a natural language processor with spaCy, the leading open source Python library for natural language processing. But before you get started, this chapter outlines what goes on behind the scenes of building a natural language processor. How Can Computers Understand Language
…
calculating semantic similarity is outside the scope of this book, Chapter 5 will cover working with word vectors in more detail. Using Machine Learning for Natural Language Processing You can generate the numbers to put in the vectors using a machine learning algorithm. Machine learning, a subfield of artificial intelligence, creates computer systems
…
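The excerpt above describes generating the numbers that go into word vectors. As a minimal, self-contained illustration of how such vectors are compared, here is cosine similarity over toy 3-dimensional vectors — the values are invented for illustration and are not real learned embeddings:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * nor_v) if False else dot / (norm_u * norm_v)

# Toy 3-dimensional "embeddings" (invented values, purely illustrative):
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

print(cosine_similarity(vectors["king"], vectors["queen"]))  # close to 1
print(cosine_similarity(vectors["king"], vectors["apple"]))  # much lower
```

Real embeddings work the same way, only with hundreds of dimensions learned from data rather than three hand-picked ones.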
pairs of words. For example, the one shown here tells us that the verb “sent” agrees with the pronoun “she.” Why Use Machine Learning for Natural Language Processing? Your algorithm’s predictions aren’t statements of fact; they’re typically calculated with a degree of certainty. To achieve a higher degree of accuracy
…
the syntactic structure in new, previously unseen text data. Figure 1-3 summarizes how language processing works for natural languages and programming languages, respectively. A natural language processing system uses an underlying statistical model to make predictions about the meaning of input text and then generates an appropriate response. In contrast, a compiler
…
NOUN 78% 22% Of course, you’ll get other figures for the word “count” used in another context. Statistical language modeling is vital to many natural language processing tasks, such as natural language generation and natural language understanding. For this reason, a statistical model lies at the heart of virtually any NLP application
…
can easily implement these steps programmatically. We’ll describe this process in detail in Chapter 8. Summary In this chapter, you learned the basics of natural language processing. You now know that, unlike humans, machines use vector-based representations of words, which allow you to perform math on natural language units, including words
…
–95 money, 49–50 multiple intents, 113–114 MySQL databases, 135–138 N named entities, 29, 72–74 named entity recognition, 4, 29, 143–144 natural language processing (NLP), xv–xvi, 1–2 Natural Language Toolkit (NLTK), xvi ner component, 42 ner.add_label() method, 41 neural networks, 9–11 nlp.begin_training
…
word2int() function, 134 word2vec algorithm, xvi word-based dependency grammars, 186 working environment set up, 16 X XML, 129 Y yes/no questions, 56–57 Natural Language Processing with Python and spaCy is set in New Baskerville, Futura, Dogma, and TheSansMono Condensed. RESOURCES Visit https://nostarch.com/nlppython/ for errata and more information
…
-1-59327-992-9 PHONE: 800.420.7240 OR 415.863.9900 EMAIL: SALES@NOSTARCH.COM WEB: WWW.NOSTARCH.COM BUILD YOUR OWN NLP APPLICATIONS Natural Language Processing with Python and spaCy will show you how to create NLP applications like chatbots, text-condensing scripts, and order-processing tools quickly and easily. You
…
Python and spaCy. ABOUT THE AUTHOR Yuli Vasiliev is a programmer, freelance writer, and consultant who specializes in open source development, Oracle database technologies, and natural language processing. THE FINEST IN GEEK ENTERTAINMENT™ www.nostarch.com
by Steven Bird, Ewan Klein and Edward Loper · 15 Dec 2009 · 504pp · 89,238 words
Python Steven Bird, Ewan Klein, and Edward Loper Beijing • Cambridge • Farnham • Köln • Sebastopol • Taipei • Tokyo Natural Language Processing with Python by Steven Bird, Ewan Klein, and Edward Loper Copyright © 2009 Steven Bird, Ewan Klein, and Edward Loper. All rights reserved. Printed in the
…
Printing History: June 2009: First Edition. Nutshell Handbook, the Nutshell Handbook logo, and the O’Reilly logo are registered trademarks of O’Reilly Media, Inc. Natural Language Processing with Python, the image of a right whale, and related trade dress are trademarks of O’Reilly Media, Inc. Many of the designations used by
…
Afterword: The Language Challenge 441. Bibliography 449. NLTK Index 459. General Index 463. Preface This is a book about Natural Language Processing. By “natural language” we mean a language that is used for everyday communication by humans; languages such as English, Hindi, or Portuguese. In contrast to
…
mathematical notations, natural languages have evolved as they pass from generation to generation, and are hard to pin down with explicit rules. We will take Natural Language Processing—or NLP for short—in a wide sense to cover any kind of computer manipulation of natural language. At one extreme, it could be as
…
provides a highly accessible introduction to the field of NLP. It can be used for individual study or as the textbook for a course on natural language processing or computational linguistics, or as a supplement to courses in artificial intelligence, text mining, or corpus linguistics. The book is intensely practical, containing hundreds of
…
of Python resources at http://docs.python.org/. New to Python? Experienced programmers can quickly learn enough Python using this book to get immersed in natural language processing. All relevant Python features are carefully explained and exemplified, and you will quickly come to appreciate Python’s suitability for this application area. The language
…
/about/success/. NLTK defines an infrastructure that can be used to build NLP programs in Python. It provides basic classes for representing data relevant to natural language processing; standard interfaces for performing tasks such as part-of-speech tagging, syntactic parsing, and text classification; and standard implementations for each task that can be
…
install. Third, we have tried to avoid clever programming tricks, since we believe that clear implementations are preferable to ingenious yet indecipherable ones. For Instructors Natural Language Processing is often taught within the confines of a single-semester course at the advanced undergraduate level or postgraduate level. Many instructors have found that it
…
product’s documentation does require permission. We appreciate, but do not require, attribution. An attribution usually includes the title, author, publisher, and ISBN. For example: “Natural Language Processing with Python, by Steven Bird, Ewan Klein, and Edward Loper. Copyright 2009 Steven Bird, Ewan Klein, and Edward Loper, 978-0-596-51649-9.” If
…
a text? 3. What tools and techniques does the Python programming language provide for such work? 4. What are some of the interesting challenges of natural language processing? This chapter is divided into sections that skip between two quite different styles. In the “computing with language” sections, we will take on some linguistically
…
language technologies. We’ll take the opportunity now to step back from the nitty-gritty of code in order to paint a bigger picture of natural language processing. At a purely practical level, we all need help to navigate the universe of information locked up in text on the Web. Search engines have
…
more arguments inside parentheses, like this: mult(3, 4), e.g., len(text1). 1.7 Further Reading This chapter has introduced new concepts in programming, natural language processing, and linguistics, all mixed in together. Many of them are consolidated in the following chapters. However, you may also want to consult the online materials
…
, and on NLP more generally, you might like to consult one of the following excellent books: • Indurkhya, Nitin and Fred Damerau (eds., 2010) Handbook of Natural Language Processing (second edition), Chapman & Hall/CRC. • Jurafsky, Daniel and James Martin (2008) Speech and Language Processing (second edition), Prentice Hall. • Mitkov, Ruslan (ed., 2002) The Oxford
…
do? Can you think of a practical application for this? CHAPTER 2 Accessing Text Corpora and Lexical Resources Practical work in Natural Language Processing typically uses large bodies of linguistic data, or corpora. The goal of this chapter is to answer the following questions: 1. What are some useful
…
every word of the text. For example, the following text-to-speech function looks up each word of the text in the pronunciation dictionary: >>> text = ['natural', 'language', 'processing'] >>> [ph for w in text for ph in prondict[w][0]] ['N', 'AE1', 'CH', 'ER0', 'AH0', 'L', 'L', 'AE1', 'NG', 'G', 'W', 'AH0', 'JH', 'P
…
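The text-to-speech lookup in the excerpt above uses NLTK's CMU Pronouncing Dictionary. The same idiom can be reproduced without NLTK by standing in a tiny hand-built dictionary with the structure the excerpt shows — each word maps to a list of pronunciations, each pronunciation a list of phonemes (the entries below are copied from the excerpt's output, not a complete dictionary):

```python
# A tiny stand-in for NLTK's CMU Pronouncing Dictionary (nltk.corpus.cmudict):
# word -> list of pronunciations, each a list of ARPAbet phonemes.
prondict = {
    "natural":    [["N", "AE1", "CH", "ER0", "AH0", "L"]],
    "language":   [["L", "AE1", "NG", "G", "W", "AH0", "JH"]],
    "processing": [["P", "R", "AA1", "S", "EH0", "S", "IH0", "NG"]],
}

text = ["natural", "language", "processing"]
# For each word, take its first pronunciation ([0]) and flatten the phonemes:
phonemes = [ph for w in text for ph in prondict[w][0]]
print(phonemes)
```

With the real corpus you would write `prondict = nltk.corpus.cmudict.dict()` after downloading the `cmudict` data; the list comprehension is unchanged.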
are written on this topic, and we only have space to introduce some key concepts and elaborate on the approaches that are most prevalent in natural language processing. The best-known strategy is known as divide-and-conquer. We attack a problem of size n by dividing it into two problems of size
…
0.0037 seconds, or three orders of magnitude faster! Dynamic Programming Dynamic programming is a general technique for designing algorithms which is widely used in natural language processing. The term “programming” is used in a different sense to what you might expect, to mean planning or scheduling. Dynamic programming is used when a
…
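As a concrete illustration of dynamic programming, here is a minimal memoized Levenshtein edit distance — a standard DP example in NLP, sketched here for illustration rather than taken from the book above:

```python
from functools import lru_cache

def edit_distance(s, t):
    """Levenshtein distance via dynamic programming (memoized recursion).
    Each subproblem d(i, j) = distance between s[:i] and t[:j] is solved once."""
    @lru_cache(maxsize=None)
    def d(i, j):
        if i == 0:
            return j          # insert all of t[:j]
        if j == 0:
            return i          # delete all of s[:i]
        cost = 0 if s[i - 1] == t[j - 1] else 1
        return min(d(i - 1, j) + 1,         # deletion
                   d(i, j - 1) + 1,         # insertion
                   d(i - 1, j - 1) + cost)  # match or substitution
    return d(len(s), len(t))

print(edit_distance("kitten", "sitting"))  # 3
```

Without memoization the recursion is exponential; caching each (i, j) pair makes it O(len(s) × len(t)), which is the "three orders of magnitude faster" effect the excerpt describes.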
words in text. The goal of this chapter is to answer the following questions: 1. What are lexical categories, and how are they used in natural language processing? 2. What is a good Python data structure for storing words and their categories? 3. How can we automatically tag each word of a text
…
to syntax or to world knowledge. Despite these imperfections, part-of-speech tagging has played a central role in the rise of statistical approaches to natural language processing. In the early 1990s, the surprising accuracy of statistical taggers was a striking demonstration that it was possible to
…
.3, to improve the accuracy of the evaluations.) CHAPTER 6 Learning to Classify Text Detecting patterns is a central part of Natural Language Processing. Words ending in -ed tend to be past tense verbs (Chapter 5). Frequent use of will is indicative of news text (Chapter 3). These observable
…
them into components for practical language technologies. We hope that the Natural Language Toolkit (NLTK) has served to open up the exciting endeavor of practical natural language processing to a broader audience than before. In spite of all that has come before, language presents us with far more than a temporary challenge for
…
, and Carnap. This work led to the notion of language as a formal system amenable to automatic processing. Three later developments laid the foundation for natural language processing. The first was formal language theory. This defined a language as a set of strings accepted by a class of automata, such as context-free
…
more profound and elusive than striving for fluency in as many programming languages as possible. This book has covered many topics in the field of Natural Language Processing. Most of the examples have used Python and English. However, it would be unfortunate if readers concluded that NLP is about how to write Python
…
, 16:155–170, 1990. [Baldwin & Kim, 2010] Timothy Baldwin and Su Nam Kim. Multiword Expressions. In Nitin Indurkhya and Fred J. Damerau, editors, Handbook of Natural Language Processing, second edition. Morgan and Claypool, 2010. [Beazley, 2006] David M. Beazley. Python Essential Reference. Developer’s Library. Sams Publishing, third edition, 2006. [Biber et al
…
in the box on the table. American Journal of Computational Linguistics, 8:139–149, 1982. [Cohen and Hunter, 2004] K. Bretonnel Cohen and Lawrence Hunter. Natural language processing and systems biology. In Werner Dubitzky and Francisco Azuaje, editors, Artificial Intelligence Methods and Tools for Systems Biology, pages 147–174. Springer Verlag, 2004. [Cole
…
, 1997] Ronald Cole, editor. Survey of the State of the Art in Human Language Technology. Studies in Natural Language Processing. Cambridge University Press, 1997. [Copestake, 2002] Ann Copestake. Implementing Typed Feature Structure Grammars. CSLI Publications, Stanford, CA, 2002. [Corbett, 2006] Greville G. Corbett. Agreement. Cambridge
…
of Lecture Notes in Computer Science, pages 177–190. Springer, 2006. [Dale et al., 2000] Robert Dale, Hermann Moisl, and Harold Somers, editors. Handbook of Natural Language Processing. Marcel Dekker, 2000. [Dalrymple, 2001] Mary Dalrymple. Lexical Functional Grammar, volume 34 of Syntax and Semantics. Academic Press, New York, 2001. [Dalrymple et
…
and David Thomas. The Pragmatic Programmer: From Journeyman to Master. Addison Wesley, 2000. [Indurkhya and Damerau, 2010] Nitin Indurkhya and Fred Damerau, editors. Handbook of Natural Language Processing. CRC Press, Taylor and Francis Group, second edition, 2010. [Jackendoff, 1977] Ray Jackendoff. X-Syntax: a Study of Phrase Structure. Number 2 in Linguistic Inquiry
…
: Tools for Analyzing Talk. Mahwah, NJ: Lawrence Erlbaum, second edition, 1995. [http://childes.psy.cmu.edu/]. [Madnani, 2007] Nitin Madnani. Getting started on natural language processing with Python. ACM Crossroads, 13(4), 2007. [Manning, 2003] Christopher Manning. Probabilistic syntax. In Probabilistic Linguistics, pages 289–341. MIT Press, Cambridge, MA, 2003. [Manning
…
and Schütze, 1999] Christopher Manning and Hinrich Schütze. Foundations of Statistical Natural Language Processing. MIT Press, Cambridge, MA, 1999. [Manning et al., 2008] Christopher Manning, Prabhakar Raghavan, and Hinrich Schütze. Introduction to Information Retrieval. Cambridge University Press, 2008. [McCawley
…
and Coordinator of the European Network of Excellence in Human Language Technologies (ELSNET). Edward Loper has recently completed a Ph.D. on machine learning for natural language processing at the University of Pennsylvania. Edward was a student in Steven’s graduate course on computational linguistics in the fall of 2000, and went on
…
addition to NLTK, he has helped develop two packages for documenting and testing Python software, epydoc and doctest. Colophon The animal on the cover of Natural Language Processing with Python is a right whale, the rarest of all large whales. It is identifiable by its enormous head, which can measure up to one
by Maximilian Kasy · 15 Jan 2025 · 209pp · 63,332 words
learning as the craft of building complicated functions out of simpler functions—to model complicated relationships, as encountered in domains such as image recognition and natural language processing. There are different ways that the functions making up artificial neural nets can be built. How to best do this depends on the application domain
…
. Such neural nets leverage the fact that an image can be shifted left or right, up or down, and still represent the same object. For natural language processing, transformer networks, another variant of neural networks that use self-attention, form the backbone of recent large language models. Such transformer networks are based on
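The self-attention mechanism mentioned above can be sketched in a few lines. This toy version omits the learned query/key/value projections of a real transformer, so each output is just a similarity-weighted average of the input token vectors; the input values are invented for illustration:

```python
import math

def softmax(xs):
    """Turn raw scores into weights that are positive and sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(X):
    """Minimal self-attention: queries, keys, and values are the token
    vectors themselves (no learned projections). Each output row is a
    weighted average of all rows of X, weighted by scaled dot-product
    similarity to the current token."""
    d = len(X[0])
    out = []
    for q in X:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in X]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, X)) for j in range(d)])
    return out

# Three toy 2-dimensional token vectors (invented values):
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
print(self_attention(X))
```

A real transformer layer adds learned projection matrices, multiple heads, and a feed-forward sublayer on top of this core weighted-average step.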
by Liz Pelly · 7 Jan 2025 · 293pp · 104,461 words
-occurrence between each mood and song” is computed, and finally, the association is calculated using an equation called “pointwise mutual information,” which is part of natural language processing, a subfield of AI that tries to understand human language. The same in-house researchers wrote that their “palette of moods” included 287 terms like
…
cultures were organizing music by granular descriptors, they were fulfilling a similar role, sorting music so it could be better perceived by algorithmic methods like natural language processing. The pop music philosopher and scholar Robin James, in theorizing why music culture has become entwined with the language of “vibes,” has argued that in
by George Zarkadakis · 7 Mar 2016 · 405pp · 117,219 words
understood it, and returned an answer. For this to happen, Watson’s designers exploited the whole arsenal of AI tools and techniques, including machine learning, natural language processing and knowledge representation. What the success of their creation demonstrated was that brute computing force could overcome the obstacles that the AI pioneers faced in
by Stuart Russell and Peter Norvig · 14 Jul 2019 · 2,466pp · 668,761 words
Mansinghka (Section 18.4, Programs as Probability Models); •Ian Goodfellow (Chapter 22, Deep Learning); •Jacob Devlin and Mei-Wing Chang (Chapter 25, Deep Learning for Natural Language Processing); •Anca Dragan (Chapter 26, Robotics); •Jitendra Malik and David Forsyth (Chapter 27, Computer Vision). Then some key roles: •Cynthia Yeung and Malika Cantor (project management
…
24.2 Grammar 24.3 Parsing 24.4 Augmented Grammars 24.5 Complications of Real Natural Language 24.6 Natural Language Tasks Summary Bibliographical and Historical Notes 25 Deep Learning for Natural Language Processing 25.1 Word Embeddings 25.2 Recurrent Neural Networks for NLP 25.3 Sequence-to-Sequence Models 25.4 The Transformer Architecture 25.5 Pretraining and Transfer Learning 25.6 State
…
now, we note that programming a computer to pass a rigorously applied test provides plenty to work on. The computer would need the following capabilities: •natural language processing to communicate successfully in a human language; •knowledge representation to store what it knows or hears; •automated reasoning to answer questions and to draw new
…
. Modern linguistics and AI, then, were “born” at about the same time, and grew up together, intersecting in a hybrid field called computational linguistics or natural language processing. The problem of understanding language turned out to be considerably more complex than it seemed in 1957. Understanding language requires an understanding of the subject
…
appreciation for data, statistical modeling, optimization, and machine learning was the gradual reunification of subfields such as computer vision, robotics, speech recognition, multiagent systems, and natural language processing that had become somewhat separate from core AI. The process of reintegration has yielded significant benefits both in terms of applications—for example, the deployment
…
a year. The most popular category was machine learning. (Machine learning papers in arXiv.org doubled every year from 2009 to 2017.) Computer vision and natural language processing were the next most popular. •Sentiment: About 70% of news articles on AI are neutral, but articles with positive tone increased from 12% in 2016
…
techniques in biology journals rather than just computer science journals. ILP has made contributions to other sciences besides biology. One of the most important is natural language processing, where ILP has been used to extract complex relational information from text. Summary This chapter has investigated various ways in which prior knowledge can help
…
a hybrid (top-down and bottom-up) approach to inverse entailment and has been applied to a number of practical problems, particularly in biology and natural language processing. Muggleton (2000) describes an extension of PROGOL to handle uncertainty in the form of stochastic logic programs. A formal analysis of ILP methods appears in
…
practically usable forms of RNN. They have demonstrated excellent performance on a wide range of tasks including speech recognition and handwriting recognition. Their use in natural language processing is discussed in Chapter 25. 22.7Unsupervised Learning and Transfer Learning The deep learning systems we have discussed so far are based on supervised learning
…
in-depth explanations, we refer the reader to the relevant chapters: Chapter 23 for the use of deep learning in reinforcement learning systems, Chapter 25 for natural language processing, Chapter 27 (particularly Section 27.4) for computer vision, and Chapter 26 for robotics. 22.8.1 Vision We begin with computer vision, which is the
…
has to do it in real time with near-perfect accuracy. 22.8.2 Natural language processing Deep learning has also had a huge impact on natural language processing (NLP) applications such as machine translation and speech recognition. Some advantages of deep learning for these applications include the possibility of end-to-end learning
…
back-propagation algorithm implements a gradient descent in parameter space to minimize the loss function. •Deep learning works well for visual object recognition, speech recognition, natural language processing, and reinforcement learning in complex environments. •Convolutional networks are particularly well suited for image processing and other tasks where the data have a grid topology
…
(s). This is not such a serious problem, because a robot using R' will behave just like a robot using the “correct” R. CHAPTER 24 NATURAL LANGUAGE PROCESSING In which we see how a computer can use natural language to communicate with humans and learn from what they have written. About 100,000
…
to grow; it is our main means of passing along cultural, legal, scientific, and technological knowledge. There are three primary reasons for computers to do natural language processing (NLP): •To communicate with humans. In many situations it is convenient for humans to use speech to interact with computers, and in most situations it
…
speaker has chosen a given string of words. (For handwritten or typed communication, we have the problem of optical character recognition.) 24.6 Natural Language Tasks Natural language processing is a big field, deserving an entire textbook or two of its own (Goldberg, 2017; Jurafsky and Martin, 2020). In this section we briefly describe
…
of language, which we will cover in Chapter 25. Work on applications of language processing is presented at the biennial Applied Natural Language Processing conference (ANLP), the conference on Empirical Methods in Natural Language Processing (EMNLP), and the journal Natural Language Engineering. A broad range of NLP work appears in the journal Computational Linguistics and its
…
the accusative case. Many languages also make another distinction with a dative case for words in the indirect object position. CHAPTER 25 DEEP LEARNING FOR NATURAL LANGUAGE PROCESSING In which deep neural networks perform a variety of language tasks, capturing the structure of natural language as well as its fluidity. Chapter 24 explained
…
adding over 10^18 bytes every day. Hundreds of high-quality data sets are available for a range of tasks in computer vision, speech recognition, and natural language processing. If the data you need is not already available, you can often assemble it from other sources, or engage humans to label data for you
…
of the The European Conference on Machine Learning ECP Proceedings of the European Conference on Planning EMNLP Proceedings of the Conference on Empirical Methods in Natural Language Processing FGCS Proceedings of the International Conference on Fifth Generation Computer Systems FOCS Proceedings of the Annual Symposium on Foundations of Computer Science GECCO Proceedings of
…
.). (2016). Handbook of Computational Social Choice. Cambridge University Press. Brants, T. (2000). TnT: A statistical part-of-speech tagger. In Proc. Sixth Conference on Applied Natural Language Processing. Brants, T., Popat, A. C., Xu, P., Och, F. J., and Dean, J. (2007). Large language models in machine translation. In EMNLP-CoNLL-07. Bratko
…
Sanskrit and artificial intelligence. AIMag, 6, 32–39. Brill, E. (1992). A simple rule-based part of speech tagger. In Proc. Third Conference on Applied Natural Language Processing. Brin, D. (1998). The Transparent Society. Perseus. Brin, S. and Page, L. (1998). The anatomy of a large-scale hypertextual web search engine. In Proc
…
Mathematical Logic. Princeton University Press. Church, K. (1988). A stochastic parts program and noun phrase parser for unrestricted texts. In Proc. Second Conference on Applied Natural Language Processing. Church, K. and Patil, R. (1982). Coping with syntactic ambiguity or how to put the block in the box on the table. Computational Linguistics, 8
…
, R., and Katz, S. (1999). Self-stabilizing distributed constraint satisfaction. Chicago J. of Theoretical Computer Science, 1999. Collins, M. (1999). Head-driven Statistical Models for Natural Language Processing. Ph.D. thesis, University of Pennsylvania. Collins, M. and Duffy, K. (2002). New ranking algorithms for parsing and tagging: Kernels over discrete structures, and the
…
. (2006). Reach for A*: Efficient point-to-point shortest path algorithms. In Workshop on algorithm engineering and experiments. Goldberg, Y. (2017). Neural network methods for natural language processing. Synthesis Lectures on Human Language Technologies, 10. Goldberg, Y., Zhao, K., and Huang, L. (2013). Efficient implementation of beam-search incremental parsers. In ACL-13
…
programming with a description logic. In Proc. IJCAI-03 Configuration Workshop. Jurafsky, D. and Martin, J. H. (2020). Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition (3rd edition). Prentice-Hall. Kadane, J. B. and Simon, H. A. (1977). Optimal strategies for a class of constrained sequential
…
. Addison-Wesley. Manne, A. S. (1960). Linear programming and sequential decisions. Management Science, 6, 259–267. Manning, C. and Schutze, H. (1999). Foundations of Statistical Natural Language Processing. MIT Press. Manning, C., Raghavan, P., and Schutze, H. (2008). Introduction to Information Retrieval. Cambridge University Press. Mannion, M. (2002). Using first-order logic for
…
. Ablex. Moore, R. C. and DeNero, J. (2011). L1 and L2 regularization for multiclass hinge loss models. In Symposium on Machine Learning in Speech and Natural Language Processing. Moravčík, M., Schmid, M., Burch, N., Lisy, V., Morrill, D., Bard, N., Davis, T., Waugh, K., Johanson, M., and Bowling, M. (2017). Deepstack: Expert-level
…
. (2018). NLP’s ImageNet moment has arrived. The Gradient, July 8. Ruder, S., Peters, M. E., Swayamdipta, S., and Wolf, T. (2019). Transfer learning in natural language processing. In COLING-19. Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323, 533–536. Rumelhart
…
, 635 Nash folk theorems, 607 NATACHATA (chatbot), 1035 Natarajan, S., 551, 1094 Natarajan, V., 48, 1104 naturalism, 24 natural kind, 338 natural language inference, 931 natural language processing (NLP), 20, 874, 874–931 natural numbers, 286 natural stupidity, 347 Nau, D. S., 222, 224, 399, 400, 402, 1094, Navruzyan, A., 137, 161, 1106
…
., 638, 639, 1108 Niv, Y., 873, 1092, 1108 Nivre, J., 890, 904, 1102, 1108 Nixon, R., 352 Nixon diamond, 352 Niyogi, S., 296, 1114 NLP (natural language processing), 20, 874, 874–931 no-good, 180 no-regret learning, 722 NOAH (planning system), 398, 400 Nobel Prize, 28, 29, 40 Nocedal, J., 735, 1088
by Geoffrey C. Bowker · 24 Aug 2000
, has made one of the chief attacks on the NIC scheme. She believes that rather than standardized nursing language, computer scientists should develop natural language processing tools so that nurse narratives can be interpreted. Grobe argues for the abandonment of any goal of producing “a single
by Mark Lutz · 5 Jan 2011
and HTML text parsing Parsers: grammars Custom language parsers, both handcoded and generated Embedding Running Python code with eval and exec built-ins And more Natural language processing For simpler tasks, Python’s built-in string object is often all we really need. Python strings can be indexed, concatenated, sliced, and processed with
…
parsers but are sufficient for many language tasks. For more on YAPPS, see http://theory.stanford.edu/~amitp/Yapps or search the Web at large. Natural language processing Even more demanding language analysis tasks require techniques developed in artificial intelligence research, such as semantic analysis and machine learning. For instance, the Natural Language
…
Toolkit, or NLTK, is an open source suite of Python libraries and programs for symbolic and statistical natural language processing. It applies linguistic techniques to textual data, and it can be used in the development of natural language recognition software and systems. For much more
…
on this subject, be sure to also see the O’Reilly book Natural Language Processing with Python, which explores, among other things, ways to use NLTK in Python. Not every system’s users will pose questions in a natural language
…
in Dictionaries creating, Running Strings in Dictionaries running code strings with, Running Code Strings with Results and Namespaces, Running Code Strings with Results and Namespaces natural language processing, Advanced Language Tools nested structures, Nested structures, Uploading Local Trees, Uploading Local Trees, Pickled Objects, Pickling in Action dictionaries, Nested structures pickling, Pickled Objects, Pickling
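The Programming Python excerpts above mention running Python code strings with the eval and exec built-ins, with results collected in namespace dictionaries. A minimal standard-library sketch of that idea (the text string here is an invented example, not taken from the book):

```python
# exec runs a code string inside a dictionary that serves as its
# namespace; any names the string assigns can be read back afterward.
code = """
words = text.split()
count = len(words)
"""

# Seed the namespace with an input variable the code string expects.
namespace = {"text": "natural language processing with Python"}
exec(code, namespace)  # run the string; results land in the dict

print(namespace["count"])  # number of whitespace-separated tokens
print(namespace["words"])  # the token list itself
```

Passing an explicit dictionary to exec keeps the executed string's names out of the caller's own namespace, which is the pattern the book's "Results and Namespaces" index entries refer to.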
by Ben Goertzel and Pei Wang · 1 Jan 2007 · 303pp · 67,891 words
), according to the experience of the system. In the language understanding process, NARS will not have separate parsing and semantic mapping phases, as in many other natural language processing systems. Instead, for an input sentence, the recognition of its syntactic structure and the recognition of its semantic structure will be carried out hand-in
…
necessary condition for being intelligent. Since the aim of NARS is not to accurately duplicate human behaviors so as to pass the Turing Test [5], natural language processing is optional for the system. 3.3 Education NARS processes tasks using available knowledge, though the system is not designed with a ready-made knowledge
…
, when there are fewer premises, more intensive and accurate reasoning may be carried out. To easily accept input from, and send output to, natural language processing software systems. PLN implements a wide array of first-order and higher-order inference rules including (but not limited to) deduction, Bayes’ Rule, unification, intensional
by Steven Pinker · 1 Jan 1994 · 661pp · 187,613 words
America since 1925. Washington, D.C.: American Council of Learned Societies. Jordan, M. I., & Rosenbaum, D. 1989. Action. In Posner, 1989. Joshi, A. K. 1991. Natural language processing. Science, 253, 1242–1249. Kaplan, R. 1972. Augmented transition networks as psychological models of sentence comprehension. Artificial Intelligence, 3, 77–100. Kaplan, S. 1992. Environmental
by Ray Kurzweil · 31 Dec 1998 · 696pp · 143,736 words
by Peter Van-Roy and Seif Haridi · 15 Feb 2004 · 931pp · 79,142 words
by Dipanjan Sarkar · 1 Dec 2016
by Geoffrey C. Bowker and Susan Leigh Star · 25 Aug 2000 · 357pp · 125,142 words
by Martin Kleppmann · 16 Mar 2017 · 1,237pp · 227,370 words
by Brian Christian · 5 Oct 2020 · 625pp · 167,349 words
by Shoshana Zuboff · 15 Jan 2019 · 918pp · 257,605 words
by Martin Ford · 16 Nov 2018 · 586pp · 186,548 words
by Jim Blandy and Jason Orendorff · 21 Nov 2017 · 1,331pp · 183,137 words
by Terrence J. Sejnowski · 27 Sep 2018
by Ray Kurzweil · 14 Jul 2005 · 761pp · 231,902 words
by Toby Segaran and Jeff Hammerbacher · 1 Jul 2009
by Thierry Poibeau · 14 Sep 2017 · 174pp · 56,405 words
by David Golumbia · 31 Mar 2009 · 268pp · 109,447 words
by Paul Scharre · 18 Jan 2023
by Q. Ethan McCallum · 14 Nov 2012 · 398pp · 86,855 words
by Julie Steele · 20 Apr 2010
by Daron Acemoglu and Simon Johnson · 15 May 2023 · 619pp · 177,548 words
by Martin Ford · 13 Sep 2021 · 288pp · 86,995 words
by Mehmed Kantardzić · 2 Jan 2003 · 721pp · 197,134 words
by Ed Finn · 10 Mar 2017 · 285pp · 86,853 words
by Paul R. Daugherty and H. James Wilson · 15 Jan 2018 · 523pp · 61,179 words
by Erik J. Larson · 5 Apr 2021
by Martin Campbell-Kelly and Nathan Ensmenger · 29 Jul 2013 · 528pp · 146,459 words
by Christopher Allen and Julie Moronuki · 1 Jan 2015 · 1,076pp · 67,364 words
by Martin Kleppmann · 17 Apr 2017
by Viktor Mayer-Schonberger and Kenneth Cukier · 5 Mar 2013 · 304pp · 82,395 words
by Trey Grainger and Timothy Potter · 14 Sep 2014 · 1,085pp · 219,144 words
by Eric Siegel · 19 Feb 2013 · 502pp · 107,657 words
by Matthew A. Russell · 15 Jan 2011 · 541pp · 109,698 words
by James Bridle · 18 Jun 2018 · 301pp · 85,263 words
by David J. Leinweber · 31 Dec 2008 · 402pp · 110,972 words
by Nicholas Carr · 5 Sep 2016 · 391pp · 105,382 words
by Geoff Cox and Alex McLean · 9 Nov 2012
by Jeremy Rifkin · 31 Mar 2014 · 565pp · 151,129 words
by Rob Kitchin · 25 Aug 2014
by Henry A Kissinger, Eric Schmidt and Daniel Huttenlocher · 2 Nov 2021 · 194pp · 57,434 words
by Marcus Du Sautoy · 7 Mar 2019 · 337pp · 103,522 words
by Jamie Susskind · 3 Sep 2018 · 533pp
by Jacob Turner · 29 Oct 2018 · 688pp · 147,571 words
by James Vlahos · 1 Mar 2019 · 392pp · 108,745 words
by Jiawei Han, Micheline Kamber and Jian Pei · 21 Jun 2011
by Brian Dear · 14 Jun 2017 · 708pp · 223,211 words
by Orly Lobel · 17 Oct 2022 · 370pp · 112,809 words
by Jj Geewax · 19 Jul 2021 · 725pp · 168,262 words
by Anil Ananthaswamy · 15 Jul 2024 · 416pp · 118,522 words
by Eric Redmond, Jim Wilson and Jim R. Wilson · 7 May 2012 · 713pp · 93,944 words
by Sinan Aral · 14 Sep 2020 · 475pp · 134,707 words
by Veljko Krunic · 29 Mar 2020
by Joshua Cooper Ramo · 16 May 2016 · 326pp · 103,170 words
by Kory Stamper · 14 Mar 2017 · 341pp · 95,752 words
by Amy Webb · 5 Mar 2019 · 340pp · 97,723 words
by Stephen Baker · 11 Aug 2008 · 265pp · 74,000 words
by Kariappa Bheemaiah · 26 Feb 2017 · 492pp · 118,882 words
by Gary Price, Chris Sherman and Danny Sullivan · 2 Jan 2003 · 481pp · 121,669 words
by Erik Westra · 23 May 2013
by Andrew McAfee and Erik Brynjolfsson · 26 Jun 2017 · 472pp · 117,093 words
by Adam Greenfield · 29 May 2017 · 410pp · 119,823 words
by Lane Greene · 15 Dec 2018 · 284pp · 84,169 words
by Kai-Fu Lee · 14 Sep 2018 · 307pp · 88,180 words
by Doug Turnbull and John Berryman · 30 Apr 2016 · 593pp · 118,995 words
by James Barrat · 30 Sep 2013 · 294pp · 81,292 words
by Alan Cooper · 24 Feb 2004 · 193pp · 98,671 words
by Steven Levy · 25 Feb 2020 · 706pp · 202,591 words
by Cathy O'Neil and Rachel Schutt · 8 Oct 2013 · 523pp · 112,185 words
by Parag Khanna · 5 Feb 2019 · 496pp · 131,938 words
by Nick Polson and James Scott · 14 May 2018 · 301pp · 85,126 words
by Melanie Mitchell · 14 Oct 2019 · 350pp · 98,077 words
by Richard Rumelt · 27 Apr 2022 · 363pp · 109,834 words
by Barbara Tversky · 20 May 2019 · 426pp · 117,027 words
by Jing Tsu · 18 Jan 2022 · 408pp · 105,715 words
by Nicole Kobie · 3 Jul 2024 · 348pp · 119,358 words
by Dariusz Jemielniak and Aleksandra Przegalinska · 18 Feb 2020 · 187pp · 50,083 words
by Kenneth Payne · 16 Jun 2021 · 339pp · 92,785 words
by Tsedal Neeley · 14 Oct 2021 · 223pp · 60,936 words
by Michael Bhaskar · 2 Nov 2021
by Kai-Fu Lee and Qiufan Chen · 13 Sep 2021
by Erik Brynjolfsson and Andrew McAfee · 20 Jan 2014 · 339pp · 88,732 words
by Yasha Levine · 6 Feb 2018 · 474pp · 130,575 words
by Thomas H. Davenport · 4 Feb 2014
by Drew Conway and John Myles White · 10 Feb 2012 · 451pp · 103,606 words
by Brian Merchant · 19 Jun 2017 · 416pp · 129,308 words
by Unknown · 13 Jan 2012 · 470pp · 109,589 words
by Kevin Kelly · 6 Jun 2016 · 371pp · 108,317 words
by Tom Standage · 27 Nov 2018 · 215pp · 59,188 words
by David Bellos · 10 Oct 2011 · 396pp · 107,814 words
by Marc Goodman · 24 Feb 2015 · 677pp · 206,548 words
by Karl Fogel · 13 Oct 2005
by Robert Wachter · 7 Apr 2015 · 309pp · 114,984 words
by Jacob Ward · 25 Jan 2022 · 292pp · 94,660 words
by Mariya Yao, Adelyn Zhou and Marlene Jia · 1 Jun 2018 · 161pp · 39,526 words
by Drew Conway and John Myles White · 25 Oct 2011 · 163pp · 42,402 words
by Jeanette Winterson · 15 Mar 2021 · 256pp · 73,068 words
by Jacqueline Kazil · 4 Feb 2016
by Kevin Carey · 3 Mar 2015 · 319pp · 90,965 words
by Robert Elliott Smith · 26 Jun 2019 · 370pp · 107,983 words
by Anthony Berglas, William Black, Samantha Thalind, Max Scratchmann and Michelle Estes · 28 Feb 2015
by Mark Gardener · 13 Jun 2012
by John Brockman · 5 Oct 2015 · 481pp · 125,946 words
by Nicole Perlroth · 9 Feb 2021 · 651pp · 186,130 words
by Lam Thuy Vo · 21 Nov 2019 · 237pp · 65,794 words
by Eli Berman, Joseph H. Felter, Jacob N. Shapiro and Vestal Mcintyre · 12 May 2018 · 517pp · 147,591 words
by Pedro Gairifo Santos · 7 Nov 2011 · 353pp · 104,146 words
by Luke Dormehl · 4 Nov 2014 · 268pp · 75,850 words
by Daniel Susskind · 14 Jan 2020 · 419pp · 109,241 words
by Thomas S. Mullaney, Benjamin Peters, Mar Hicks and Kavita Philip · 9 Mar 2021 · 661pp · 156,009 words
by Luke Dormehl · 10 Aug 2016 · 252pp · 74,167 words
by John E. Kelly Iii · 23 Sep 2013 · 118pp · 35,663 words
by Scott E. Page · 27 Nov 2018 · 543pp · 153,550 words
by Mary L. Gray and Siddharth Suri · 6 May 2019 · 346pp · 97,330 words
by Ben Tarnoff · 13 Jun 2022 · 234pp · 67,589 words
by Karen Hao · 19 May 2025 · 660pp · 179,531 words
by Duncan J. Watts · 28 Mar 2011 · 327pp · 103,336 words
by Christopher Summerfield · 11 Mar 2025 · 412pp · 122,298 words
by Brett Scott · 4 Jul 2022 · 308pp · 85,850 words
by David Sumpter · 18 Jun 2018 · 276pp · 81,153 words
by Jacob Helberg · 11 Oct 2021 · 521pp · 118,183 words
by Ariel Ezrachi and Maurice E. Stucke · 30 Nov 2016
by Peter H. Diamandis and Steven Kotler · 28 Jan 2020 · 501pp · 114,888 words
by Barrett Brown · 8 Jul 2024 · 332pp · 110,397 words
by James Pustejovsky and Amber Stubbs · 14 Oct 2012 · 502pp · 107,510 words
by Adam Goucher and Tim Riley · 13 Oct 2009 · 351pp · 123,876 words
by Pedro Domingos · 21 Sep 2015 · 396pp · 117,149 words
by Viktor Mayer-Schönberger and Thomas Ramge · 27 Feb 2018 · 267pp · 72,552 words
by John Markoff · 24 Aug 2015 · 413pp · 119,587 words
by Richard Susskind and Daniel Susskind · 24 Aug 2015 · 742pp · 137,937 words
by Tom Slee · 18 Nov 2015 · 265pp · 69,310 words
by Ryan Mitchell · 14 Jun 2015 · 255pp · 78,207 words
by Sebastien Donadio · 7 Nov 2019
by Nicholas Carr · 28 Jan 2025 · 231pp · 85,135 words
by Joanna Walsh · 22 Sep 2025 · 255pp · 80,203 words
by Jimmy Soni · 22 Feb 2022 · 505pp · 161,581 words
by Ash Fontana · 4 May 2021 · 296pp · 66,815 words
by Katie Hafner and Matthew Lyon · 1 Jan 1996 · 352pp · 96,532 words
by Eric Topol · 1 Jan 2019 · 424pp · 114,905 words
by Frank J. Ohlhorst · 28 Nov 2012 · 133pp · 42,254 words
by Lisa Sanders · 15 Jan 2009 · 314pp · 101,034 words
by Rizwan Virk · 31 Mar 2019 · 315pp · 89,861 words
by Eric Topol · 6 Jan 2015 · 588pp · 131,025 words
by Steve Lohr · 10 Mar 2015 · 239pp · 70,206 words
by Jan Kunigk, Ian Buss, Paul Wilkinson and Lars George · 8 Jan 2019 · 1,409pp · 205,237 words
by Valliappa Lakshmanan, Sara Robinson and Michael Munn · 31 Oct 2020
by Yarden Katz
by Maria Ressa · 19 Oct 2022
by Pete Warden · 20 Sep 2011 · 58pp · 12,386 words
by Igor Tulchinsky · 30 Sep 2019 · 321pp
by Benjamin Bengfort, Rebecca Bilbro and Tony Ojeda · 10 Jun 2018 · 125pp · 27,675 words
by Gregory Zuckerman · 5 Nov 2019 · 407pp · 104,622 words
by Adam Aleksic · 15 Jul 2025 · 278pp · 71,701 words
by Alan Rusbridger · 14 Oct 2018 · 579pp · 160,351 words
by Roger Bootle · 4 Sep 2019 · 374pp · 111,284 words
by Parmy Olson · 284pp · 96,087 words
by Sonja Thiel and Johannes C. Bernhardt · 31 Dec 2023 · 321pp · 113,564 words
by Ethan Mollick · 2 Apr 2024 · 189pp · 58,076 words
by Meredith Broussard · 19 Apr 2018 · 245pp · 83,272 words
by Gretchen McCulloch · 22 Jul 2019 · 413pp · 106,479 words
by Brad Smith and Carol Ann Browne · 9 Sep 2019 · 482pp · 121,173 words
by Sebastian Mallaby · 9 Jun 2010 · 584pp · 187,436 words
by Brian Dumaine · 11 May 2020 · 411pp · 98,128 words
by Zdravko Markov and Daniel T. Larose · 5 Apr 2007
by Thomas H. Davenport and Julia Kirby · 23 May 2016 · 347pp · 97,721 words
by Aaron Brown and Eric Kim · 10 Oct 2011 · 483pp · 141,836 words
by Brian Christian and Tom Griffiths · 4 Apr 2016 · 523pp · 143,139 words
by Leslie Sikos · 10 Jul 2015
by Aurélien Géron · 13 Mar 2017 · 1,331pp · 163,200 words
by Federico Pistono · 14 Oct 2012 · 245pp · 64,288 words
by Frank Pasquale · 14 May 2020 · 1,172pp · 114,305 words
by Keach Hagey · 19 May 2025 · 439pp · 125,379 words
by Matthew Brennan · 9 Oct 2020 · 282pp · 63,385 words
by Nouriel Roubini · 17 Oct 2022 · 328pp · 96,678 words
by Christine Lagorio-Chafkin · 1 Oct 2018
by Joel Grus · 13 Apr 2015 · 579pp · 76,657 words
by William Hertling · 9 Apr 2014 · 247pp · 71,698 words
by David Easley, Marcos López de Prado and Maureen O'Hara · 28 Sep 2013
by Calum Chace · 17 Jul 2016 · 477pp · 75,408 words
by Tom Standage · 31 Aug 2005
by Nicholas Carr · 28 Sep 2014 · 308pp · 84,713 words
by Sergey Young · 23 Aug 2021 · 326pp · 88,968 words
by Ronald J. Deibert · 14 Aug 2020
by Dennis Yi Tenen · 6 Feb 2024 · 169pp · 41,887 words
by Raj M. Shah and Christopher Kirchhoff · 8 Jul 2024 · 272pp · 103,638 words
by Madeline Ashby · 28 Jul 2012 · 343pp · 93,544 words
by Matthew Hindman · 24 Sep 2018
by Richard Watson · 5 Nov 2013 · 219pp · 63,495 words
by Thomas Philippon · 29 Oct 2019 · 401pp · 109,892 words
by Yolande Strengers and Jenny Kennedy · 14 Apr 2020
by Alec Ross · 13 Sep 2021 · 363pp · 109,077 words
by Femi Anthony · 21 Jun 2015 · 589pp · 69,193 words
by Salim Ismail and Yuri van Geest · 17 Oct 2014 · 292pp · 85,151 words
by Tien Tzuo and Gabe Weisert · 4 Jun 2018 · 244pp · 66,977 words
by Jerry Kaplan · 3 Aug 2015 · 237pp · 64,411 words
by Yves Hilpisch · 8 Dec 2020 · 1,082pp · 87,792 words
by Carl Benedikt Frey · 17 Jun 2019 · 626pp · 167,836 words
by Danielle Dimartino Booth · 14 Feb 2017 · 479pp · 113,510 words
by Mariana Mazzucato · 1 Jan 2011 · 382pp · 92,138 words
by Sheera Frenkel and Cecilia Kang · 12 Jul 2021 · 372pp · 100,947 words
by Alan Murray · 15 Dec 2022 · 263pp · 77,786 words
by Eva Dou · 14 Jan 2025 · 394pp · 110,159 words
by Currid · 9 Nov 2010 · 332pp · 91,780 words
by Raúl Garreta and Guillermo Moncecchi · 14 Sep 2013 · 122pp · 29,286 words
by Gavin Hackeling · 31 Oct 2014
by Lawrence Ingrassia · 28 Jan 2020 · 290pp · 90,057 words
by Matthew A. Russell · 15 Feb 2011 · 71pp · 14,237 words
by Sara Wachter-Boettcher · 9 Oct 2017 · 223pp · 60,909 words
by Ali Tamaseb · 14 Sep 2021 · 251pp · 80,831 words
by Jonathan Taplin · 17 Apr 2017 · 222pp · 70,132 words
by Nick Srnicek · 22 Dec 2016 · 116pp · 31,356 words
by Rakesh Vidya Chandra and Bala Subrahmanyam Varanasi · 16 Jun 2015 · 134pp · 29,488 words
by Kendall Kim · 31 May 2007 · 224pp · 13,238 words
by Jon Bruner · 27 Mar 2013 · 49pp · 12,968 words