by Jimmy Soni and Rob Goodman · 17 Jul 2017 · 415pp · 114,840 words
had their reward. —W. H. AUDEN INTRODUCTION * * * * * * The thin, white-haired man had spent hours wandering in and out of meetings at the International Information Theory Symposium in Brighton, England, before the rumors of his identity began to proliferate. At first the autograph seekers came in a trickle, and then they
…
vision of technology, a vision that treated different types of machinery (radar, amplifiers, electric motors, computers) in analytically similar terms—paving the way for information theory, systems engineering, and classical control theory. These efforts produced not only new weapons but also a vision of signals and systems. Through ideas and through
…
about the three Bell Labs wunderkinds: It turns out that there were three certified geniuses at BTL [Bell Telephone Laboratories] at the same time, Claude Shannon of information theory fame, John Pierce, of communication satellite and traveling wave amplifier fame, and Barney. Apparently the three of those people were intellectually INSUFFERABLE. They were
…
of his life working with the conceptual tools that Hartley built, and for the better part of his life, much of his public identity—“Claude Shannon, Father of Information Theory”—was bound up in having been the one who extended Hartley’s ideas far beyond what Hartley, or anyone, could have imagined. Aside
…
to Shannon’s work came out soon thereafter. Thus, beginning with the small but dedicated readership of the Bell System Technical Journal, news of information theory rippled through the mathematical and engineering worlds. It piqued the interest of one reader in particular, who would become Shannon’s most important popularizer:
…
would or wouldn’t receive credit. Debates in his field mattered to him less for their opportunities to assert “ownership” of information theory than for their bearing on the substance of information theory itself. Credit, in the end, counted less than accuracy. 20 * * * * * * A Transformative Year Shannon turned thirty-two in 1948.
…
in the popular press and applied in a diverse array of fields, sometimes with only the loosest appreciation for what information theory actually meant. For theoretical work as suggestive as information theory—which to a casual reader might appear to offer a rubric for everything from mass media to geology—appropriation and misappropriation
…
group’s newsletter. De Rosa’s “In Which Fields Do We Graze?” was a genuine query of his colleagues working in information theory: The expansion of the applications of Information Theory to fields other than radio and wired communications has been so rapid that oftentimes the bounds within which the Professional Group interests
…
. —Ludwig Wittgenstein I’m a machine and you’re a machine, and we both think, don’t we? —Claude Shannon If Shannon had peculiar work habits before the publication of his information theory, his growing reputation granted him the license to indulge those peculiarities without reservation. After 1948, the Bell Labs bureaucracy could
…
ignoring letters, colleagues, and projects, and spending his time and attention absorbed by the puzzles that interested him most. Shannon had earned this right—information theory was painstaking work—and he found himself drawn now to new problems and fresh horizons, including some that seemed, to colleagues, borderline ridiculous for someone
…
Even sympathetic audiences and eponymous venues frightened him. In 1973, for instance, Shannon was invited to give the first Claude Shannon lecture in Ashkelon, Israel, for the Institute of Electrical and Electronics Engineers Information Theory Society. “I have never seen such stage fright,” the mathematician Elwyn Berlekamp recalled. “It never would have occurred
…
see the world. * * * In part, the invitations and recognitions kept pouring in because the technological developments of the 1970s had awakened the world to information theory’s importance. In the immediate aftermath of Shannon’s “Mathematical Theory of Communication,” said an MIT student of that era, Tom Kailath, “we always thought
…
world-class engineer says that “all the advanced signal processing that enables us to send high-speed data was done as an outgrowth of Claude Shannon’s work on information theory,” the statement rings true to people in the know—and means very little to the untrained. Yet there is value in rethinking
…
Claude Shannon—but not in the way we’d imagine. Consider him not only as a distant forefather of the digital era, but as one of
…
36 On February 6, 1967, President Lyndon B. Johnson presented Claude Shannon with the National Medal of Science in honor of his “brilliant contributions to the mathematical theories of communications and information processing.” 37 Shannon’s early MIT lectures on information theory attracted packed houses, but none drew a bigger crowd than his
…
Carl Sagan, Pale Blue Dot: A Vision of the Human Future in Space (New York: Random House, 1994), 6. “bandwagon”: Claude Shannon, “The Bandwagon,” IRE Transactions—Information Theory 2, no. 1 (1956): 3. “XFOML RXKHRJFFJUJ”: Claude Shannon, “A Mathematical Theory of Communication,” in Claude Elwood Shannon: Collected Papers, ed. N. J. A. Sloane and Aaron D
…
“As a young boy”: Shannon, interviewed by Hagemeyer, February 28, 1977. “He and my brother”: Quoted in Julie Kettlewell, “Gaylord Honors ‘Father to the Information Theory,’ ” Otsego Herald Times, September 3, 1998. “Claude was the brains”: Quoted in Melinda Cerny, “Engineering Industry Honors Shannon, His Hometown,” Otsego Herald Times, September 3
…
H. Smith to Karl Compton, April 11, 1939, Office of the President Records, MIT Archive, cited in Erico Marui Guizzo, “The Essential Message: Claude Shannon and the Making of Information Theory” (MS diss., Massachusetts Institute of Technology, 2003), 13. “Somehow I doubt”: Letter from Compton to Smith, April 13, 1939, Office of the
…
Shannon, interviewed by Hagemeyer, February 28, 1977. “appears to have taken”: Pierce, An Introduction to Information Theory, 40. “It came as a bomb”: Pierce, “The Early Days of Information Theory,” 4. Chapter 16: The Bomb “The fundamental problem”: Claude Shannon, “A Mathematical Theory of Communication,” in Claude Elwood Shannon: Collected Papers, 5. “selected from a
…
of Secret Writing (New York: Macmillan, 1953), 749. “I wrote”: Price, “Oral History: Claude E. Shannon.” “Roughly, redundancy means”: Ibid., 744. “MST PPL”: Shannon, “Information Theory,” in Encyclopaedia Britannica, 14th ed., reprinted in Claude Elwood Shannon: Collected Papers, 216. “When we write English”: Shannon, “Mathematical Theory,” 25. “certain known results”: Ibid
…
the importance to both of the concept of redundancy. “A S-M-A-L-L”: Gleick, The Information, 230. Shannon explained: Shannon, “Information Theory,” 216. like transmitting power: Massey, “Information Theory,” 27. Kahn illustrates this point: Kahn, The Codebreakers, 747. a code like this: This example is cited in Guizzo, “The Essential Message
…
that time”: Robert Gallager, interviewed by the authors, August 8, 2014. founded a new field and solved most of its problems: David J. C. MacKay, Information Theory, Inference, and Learning Algorithms (Cambridge: Cambridge University Press, 2003), 14. information reduces “entropy”: Specifically, in Shannon’s terms, entropy can be thought of as
…
in Eugene Chiu et al., “Mathematical Theory of Claude Shannon,” December 2001, web.mit.edu/6.933/www/Fall2001/Shannon1.pdf. “some of his early papers”: Betty Shannon, interviewed by the authors, November 12, 2015. Chapter 21: TMI “Great scientific theories”: Francis Bello, “The Information Theory,” Fortune, December 1953, 136–58. “Much as
…
I wish”: Quoted in Kline, The Cybernetics Moment, 124. “It may be no exaggeration”: Bello, “The Information Theory,” 136. “Gaylord native son”: “Gaylord’s Claude Shannon: ‘Einstein of Mathematical Theory,’ ” Gaylord Herald Times, October 11, 2000. “There were many”: Poundstone, Fortune’s Formula, 15. “What kind of man
…
Collected Papers, xxviii. “The expansion of the applications”: L. A. de Rosa, “In Which Fields Do We Graze?,” IRE Transactions on Information Theory 1, no. 3 (1955): 2. “Information theory has,” etc.: Shannon, “The Bandwagon,” 3. “Claude Shannon was”: Robert G. Gallager, “Claude E. Shannon: A Retrospective on His Life, Work, and Impact,” IEEE Transactions on
…
Information Theory 47, no. 7 (2001): 2694. “He got a little irritated”: Quoted in Omar Aftab et al., “Information Theory and the Digital Age,” 10,
…
Shannon, interviewed by the authors, December 9, 2015. “golden age of information theory”: Thomas Kailath, interviewed by the authors, June 2, 2016. “The intellectual content”: Anthony Ephremides, interviewed by the authors, May 31, 2016. “I believe that scientists”: “Profile of Claude Shannon—Interview by Anthony Liversidge,” in Claude Elwood Shannon: Collected Papers, xxiii.
…
of California Television, “Claude Shannon: Father of the Information Age,” 2002, www.youtube.com/watch?v=z2Whj_nL-x8. “He just felt that people”: Quoted in ibid. “Since our retirement”: Letter from Shannon to Ben [last name unknown], November 15, 1980, Shannon Papers. “we always thought that information theory”: Thomas Kailath, interviewed
…
of the Information Age.” “In our age”: Quoted in Mark Semenovich Pinsker, “Reflections of Some Shannon Lecturers,” IEEE Information Theory Society Newsletter, Summer 1998, 22. “For him, the harder a problem”: George Johnson, “Claude Shannon, Mathematician, Dies at 84,” New York Times, February 27, 2001. “Courage is one of the things”: Richard Hamming,
…
by the authors, August 7, 2014. “Shannon’s puzzle-solving”: Robert Gallager, “The Impact of Information Theory on Information Technology,” lecture slides, February 28, 2006. “Dear Dennis”: Quoted in John Horgan, “Poetic Masterpiece of Claude Shannon, Father of Information Theory, Published for the First Time,” Scientific American, March 28, 2011, blogs.scientificamerican.com/cross-check/
…
poetic-masterpiece-of-claude-shannon-father-of-information-theory-published-for-the-first-time. Acknowledgments “Modern man”: Arthur Koestler, The Act of Creation
…
(London: Hutchinson, 1976), 264. Bibliography Books and Articles Aftab, Omar, et al. “Information Theory and the Digital Age.” web.mit.edu/6.933/www/Fall2001
…
Mathematical Monthly 41, no. 3 (March 1934): 188–89. “Enrico Rastelli.” Vanity Fair, February 1932, 49. Ephremides, Anthony. “Claude E. Shannon 1916–2001.” IEEE Information Theory Society Newsletter, March 2001. Feynman, Richard P. Surely You’re Joking, Mr. Feynman! Reprint ed. New York: Norton, 1997. Fisher, Lawrence. “Bernard M. Oliver Is
…
: A Retrospective on His Life, Work, and Impact.” IEEE Transactions on Information Theory 47, no. 7 (2001): 2681–95. ———. “The Impact of Information Theory on Information Technology.” Lecture slides. February 28, 2006. “Gaylord Locals.” Otsego County Herald Times, November 15, 1934. “Gaylord’s Claude Shannon: ‘Einstein of Mathematical Theory.’ ” Gaylord Herald Times, October 11, 2000.
…
).” Science, April 20, 2001. Graham, C. Wallace, et al., eds. 1934 Michiganensian. Ann Arbor, Michigan, 1934. Guizzo, Erico Marui. “The Essential Message: Claude Shannon and the Making of Information Theory.” MS diss., Massachusetts Institute of Technology, 2003. Hamming, Richard. “You and Your Research.” Lecture, Bell Communications Research Colloquium Seminar, March 7, 1986. www
…
Horgan, John. “Claude E. Shannon: Unicyclist, Juggler, and Father of Information Theory.” Scientific American, January 1990. ———. “Poetic Masterpiece of Claude Shannon, Father of Information Theory, Published for the First Time.” Scientific American, March 28, 2011. blogs.scientificamerican.com/cross-check/poetic-masterpiece-of-claude-shannon-father-of-information-theory-published-for-the-first-time/. Hunt, Bruce J. “Scientists, Engineers, and
…
2001, 18–22. Kaplan, Fred. “Scientists at War.” American Heritage 34, no. 4 (June 1983): 49–64. Kettlewell, Julie. “Gaylord Honors ‘Father to the Information Theory.’ ” Otsego Herald Times, September 3, 1998. Kimball, Warren F., ed. Churchill and Roosevelt: The Complete Correspondence. Vol. 3. Princeton, NJ: Princeton University Press, 1984. Kipling
…
. The Calculus for Engineers. London: Edward Arnold, 1897. Pierce, John. “Creative Thinking.” Lecture. 1951. ———. “The Early Days of Information Theory.” IEEE Transactions on Information Theory 19, no. 1 (1973): 3–8. ———. An Introduction to Information Theory: Symbols, Signals, and Noise. 2nd ed. New York: Dover, 1980. Pinsker, Mark Semenovich. “Reflections of Some Shannon Lecturers.” IEEE
…
New Brunswick, NJ: Transaction, 1993. Shannon, Claude Elwood. “The Bandwagon.” IRE Transactions—Information Theory 2, no. 1 (1956): 3. ———. Claude Elwood Shannon: Collected Papers. Ed. N. J. A. Sloane and Aaron D. Wyner. New York: IEEE Press, 1992. ———. Claude Shannon’s Miscellaneous Writings. Ed. N. J. A. Sloane and Aaron D. Wyner. Murray
…
Control” (NDRC project), 85–89 “Mathematical Theory of Communication, A” (Shannon), xiii–xiv, 138, 235, 262 response to, 165–69, 172–74 see also information theory Mathematical Theory of Communication, The (Shannon and Weaver), 168–69, 172–74 “Mathematical Theory of Cryptography—Case 208078, A” (Shannon), 101–2 Mathematician’s Apology
…
honors and prizes awarded to, 257–59, 261–62 on human-machine interactions, 207–9 impatience of, 113–14 as inconsistent correspondent, 200 information theory invented by, see information theory at Institute for Advanced Study, 74–80, 162 intellectual courage of, 277–78 intuitive thought process of, 184–85, 230, 232–33,
…
ISBN 9781476766690 (trade pbk. : alk. paper) | ISBN 9781476766706 (ebook) Subjects: LCSH: Shannon, Claude Elwood, 1916–2001. | Mathematicians—United States—Biography. | Electrical engineers—United States—Biography. | Information theory. Classification: LCC QA29.S423 S66 2017 (print) | LCC QA29.S423 (ebook) | DDC 003/.54092 [B] —dc23 LC record available at https://lccn.loc.gov/2016050944
by Paul J. Nahin · 27 Oct 2012 · 229pp · 67,599 words
The Logician and the Engineer Frontispiece: Reproduced by arrangement with the artist. The Logician and the Engineer How George Boole and Claude Shannon Created the Information Age PAUL J. NAHIN PRINCETON UNIVERSITY PRESS PRINCETON AND OXFORD Copyright © 2013 by Princeton University Press Published by Princeton University Press, 41
…
press.princeton.edu All Rights Reserved Library of Congress Cataloging-in-Publication Data Nahin, Paul J. The logician and the engineer : how George Boole and Claude Shannon created the information age / Paul J. Nahin. pages cm. Includes bibliographical references and index. ISBN 978-0-691-15100-7 1. Boole, George, 1815–1864. 2
…
dorms Contents Preface 1 What You Need to Know to Read This Book Notes and References 2 Introduction Notes and References 3 George Boole and Claude Shannon: Two Mini-Biographies 3.1 The Mathematician 3.2 The Electrical Engineer Notes and References 4 Boolean Algebra 4.1 Boole’s Early Interest in
…
an amazing intellectual “collaboration” between two men who never met. The Englishman George Boole lived his entire life within the nineteenth century, while the American Claude Shannon was born in the twentieth and died at the beginning of the twenty-first. Boole, of course, never knew Shannon, but he was one of
…
intimately entangled. Later in his life Shannon’s name did become uniquely attached to the new science of information theory, but even then you’ll see as you read this book how the mathematics of information theory—probability theory—was a deep, parallel interest of Boole’s as well. What Boole and Shannon created
…
This Book If a little knowledge is dangerous, where is the man who has so much as to be out of danger? —Thomas Huxley (1877) Claude Shannon’s very technical understanding of information … is boring—it’s dry. —James Gleick, in a 2011 interview about his book The Information, expressing a view
…
grasp. My example is from a minor classic in electrical circuit theory, a problem studied in a 1956 paper coauthored by the mathematical electrical engineer Claude Shannon. (I specifically mention Shannon here because— besides being mentioned by Gleick—he is a central character in this book.) That paper opens with the following
…
required one additional contribution, one that didn’t come about until decades later, in 1938, with the work of the American electrical engineer and mathematician Claude Shannon (1916–2001). That was the year Shannon published a famous paper (based on his MIT master’s thesis) on how to implement Boole’s mathematics
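The core of Shannon’s 1938 insight can be sketched in a few lines of Python (an illustrative gloss, not code from any of these books): switches wired in series behave as Boolean AND, switches in parallel as Boolean OR, and an inverted switch as NOT, so circuit analysis becomes Boole’s algebra.

```python
# Sketch of Shannon's 1938 mapping from relay circuits to Boolean algebra.
# A closed switch is True (conducting), an open switch is False.
# Function names here are illustrative, not Shannon's notation.

def series(*switches):
    # Current flows through a series chain only if every switch is closed: AND.
    return all(switches)

def parallel(*switches):
    # Current flows through a parallel bank if any switch is closed: OR.
    return any(switches)

def inverted(switch):
    # A break contact: closed exactly when its controlling switch is open: NOT.
    return not switch

# Example circuit: conducts iff (a AND b) OR (NOT c).
a, b, c = True, False, False
print(parallel(series(a, b), inverted(c)))  # True
```

Because the mapping is exact, simplifying the Boolean expression simplifies the physical circuit, which was precisely the economy Shannon offered telephone engineers.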
…
girls (notice the plural) love bananas, then the correct answer would (again, by grammar, not math) be 1, 6, and 6. 3 George Boole and Claude Shannon Two Mini-Biographies 3.1 THE MATHEMATICIAN1 “Oh please, we are playing at lions and we want a good lion who can roar well. Do
…
-short life at exceeding even his childhood friends’ great expectations. As you’ll learn in the next mini-biography, the other hero in this book, Claude Shannon, had an extensive, first-class formal education, with graduate degrees in both electrical engineering and mathematics. Boole, on the other hand, was essentially self-taught
…
year, Boole’s name would shine forever, while it is the Athenaeum that has vanished. 3.2 THE ELECTRICAL ENGINEER9 It is no exaggeration that Claude Shannon was the Father of the Information Age and his intellectual achievement one of the greatest of the 20th century. – Notices of the American Mathematical Society
…
6) that real engineers could actually read and understand it. It was simply a tour de force, simultaneously founding the entirely new research field of information theory, posing and solving some extremely difficult problems, and pointing its readers toward other problems that remained unanswered. Einstein is famous for many sayings, but one
…
so great that by 1953 the Institute of Radio Engineers formed a Professional Group on Information Theory with a journal of its own [IRE Transactions on Information Theory]. … Information theory was a glamor science for many years. It was popularly supposed that information theory held the key to progress in remote fields to which in fact it did
…
what he had wrought, and in 1956 he authored an editorial plea, “The Bandwagon,” in the new IRE Transactions for a more restrained application of information theory. It fell mostly on deaf ears and blind eyes, however, and it took a second editorial by someone else to have an impact. Written two
…
professor, Peter Elias (1923–2001), the hilarious “Two Famous Papers” mocked both ends of the spectrum. The first (fictional) paper, “Information Theory, Photosynthesis and Religion,” was a laugh at those who thought information theory was applicable to every imaginable problem. And the second (fictional) paper, “The Optimum Linear Mean Square Filter for Separating Sinusoidally
…
Modulated Triangular Signals from Randomly Sampled Stationary Gaussian Noise, with Applications to a Problem in Radar,” poked fun at those who used information theory as simply an exotic way to solve problems already solved years earlier by more traditional methods. Elias’s essay had immediate influence on improving the
…
who has ever seen it in action. It is the Ultimate Machine—the End of the Line. Beyond it there is Nothing. It sits on Claude Shannon’s desk driving people mad. Nothing could look simpler. It is merely a small wooden casket the size and shape of a cigar box, with
…
supervised a very small number of graduate theses. His fascination with gadgets never waned at MIT, and he did produce a few more results in information theory. Interestingly, however, and in contradiction to his Omni declaration of “no interest in money”—a sentiment he repeated in a 1990 Scientific American profile (“I
…
colleague John L. Kelly Jr. and the mathematician Ed Thorp) have today been enthusiastically embraced. The Transactions on Information Theory began publishing papers on portfolio theory in the 1980s, and many PhDs in information theory have since found employment with Wall Street investment firms. Shannon himself became wealthy by applying his ideas to his
…
, cheerful man, but one who could not recognize his own handwritten papers. And there, on February 24, 2001, just shy of his eighty-fifth birthday, Claude Shannon died a death perhaps even more cruel than had been Boole’s. He had lived his life blessed with a brain of rare magnificence, but
…
me were (a) Robert G. Gallager, “Claude Elwood Shannon,” Proceedings of the American Philosophical Society, June 2003, pp. 187–191. (b) Anthony Liversidge, “Profile of Claude Shannon,” Omni, August 1987 (reprinted in Shannon’s Collected Papers, N.J.A. Sloane and Aaron D. Wyner, editors, IEEE Press, 1993). (c) Solomon W. Golomb
…
, January 2002, pp. 8–16. (d) James F. Crow, “Shannon’s Brief Foray into Genetics,” Genetics, November 2001, pp. 915–917. 10. Liversidge, “Profile of Claude Shannon.” 4 Boolean Algebra They who are acquainted with the present state of the theory of Symbolical Algebra, are aware that the validity of the processes
…
fields at the same time,” he says. He adds, after a moment of reflection, “I’ve always loved that word, Boolean.” — In a profile of Claude Shannon, Scientific American (January 1990) 5.1 DIGITAL TECHNOLOGY: RELAYS VERSUS ELECTRONICS Today’s digital circuitry is built with electronic technology that the telephone engineers of
…
be actually closed and of contacts that should be open to be actually open. … A relay [with these faults] will be called a crummy relay. — Claude Shannon, in his 1956 paper, “Reliable Circuits Using Less Reliable Relays” 6.1 A COMMON MATHEMATICAL INTEREST Boole and Shannon shared a deep interest in the
…
with equations. Sometimes this dual interest got him into trouble with pure mathematicians. In what has since become an infamous episode in the lore of information theory, the University of Illinois probability expert J. L. Doob wrote a review (in a 1949 issue of Mathematical Reviews) of Shannon’s Bell System Technical
…
on the attack again, now with an even harsher voice. Writing a guest editorial in, of all places, the Institute of Radio Engineers Transactions on Information Theory, he asked (after complaining about the lack of “theoretical results”), “Can it be that the existence of a mathematical basis [to
…
information theory] is irrelevant?” Talk about bringing the camel inside the tent and having it continue to aim in the wrong direction! In general, electrical engineers and
…
Principia, an achievement even greater than his switching theory use of Boolean algebra, and more than fifty years after Doob’s sneer the Transactions on Information Theory is still in business. 6.2 SOME FUNDAMENTAL PROBABILITY CONCEPTS This chapter will not make you a probability expert; indeed, the only mathematical background in
…
here would, somewhat paradoxically, be endorsed by the purist Doob. Indeed, in the same editorial in which he suggested that there is no need in information theory for a mathematical theory, he ended with “Can it be… that there is a context in which the word ‘information’ is accepted by general agreement
…
can find out that there is an error, why can it not find out where it is? —Richard W. Hamming, in his book Coding and Information Theory (1980) 7.1 CHANNEL CAPACITY, SHANNON’S THEOREM, AND ERROR-DETECTION THEORY The entire point of Shannon’s 1948 “A Mathematical Theory of Communication” was
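Hamming’s question above (a code can know *that* an error occurred without knowing *where*) is exactly the situation with a single parity bit, the simplest error-detecting code. A minimal sketch, not Shannon’s or Hamming’s actual constructions:

```python
# A single even-parity bit: detects any one flipped bit, but cannot locate it.
# Illustrative sketch of the simplest error-detecting code.

def add_parity(bits):
    # Append a bit so that the total number of 1s is even.
    return bits + [sum(bits) % 2]

def check_parity(word):
    # An odd count of 1s means some single-bit error occurred in transit.
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])
assert check_parity(word)       # clean transmission passes
word[2] ^= 1                    # one bit flips in the channel
print(check_parity(word))       # False: error detected, location unknown
```

Locating (and thus correcting) the error takes more redundancy, which is what Hamming’s codes add, and Shannon’s theorem bounds how much redundancy any reliable scheme needs for a given noisy channel.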
…
older but still quite nice tutorial essay on this part of Shannon’s work is by one of his Bell Labs colleagues, E. N. Gilbert, “Information Theory after 18 Years,” Science, April 15, 1966, pp. 320–326. Coding procedures have, in the more than sixty years since Shannon’s “Mathematical Theory” appeared
…
speeds. Poets may decry this, but it isn’t poetry that makes your e-mail possible; it’s Shannon’s “boring” (Gleick’s word) mathematical information theory. 6. When used this way, the XOR is often called a controlled-NOT (CNOT) gate, and we’ll see it and a more sophisticated version
…
play music to it! — Both by Alan Turing, during a two-month visit in early 1943 to Bell Labs (New York City), where he met Claude Shannon and found they had a common interest in how computing machines might imitate the human brain A very small percentage of the population produces the
…
idea into the brain, you will get [alas!] half an idea out. There are other people who … produce two ideas for each idea sent in. —Claude Shannon, in a March 1952 talk at Bell Labs on creativity, during which he explained how he arrived at the logic circuitry for a machine that
…
from more than a few missteps. At the MIT Museum (Cambridge, MA) curatorial assistant Ariel Weinberg was of great help in obtaining the photo of Claude Shannon, while at University College (Cork, Ireland) archivist Carol Quinn provided gracious support in my quest for a photo of George Boole. Artist Randy Glasbergen allowed
…
, Sidney relay: crummy; theory of Riordan, John RS flip-flop sample (point); (space) shannon (information unit). See also bit Shannon, Catherine (sister of Claude) Shannon, Claude (father of Claude) Shannon, Claude Elwood; codes by; life of; on probability; his salesmen and engineers puzzle; and switches; and time machines; and Turing machines Shannon, Mabel (mother
…
of Claude) Shannon-Hagelbarger theorem Sheffer, Henry Shestakov, Victor Shor, Peter. See also algorithm (Shor’s) Sklansky, Jack source rate sphere-packing spooky-action-at-a-distance. See
by George Gilder · 16 Jul 2018 · 332pp · 93,672 words
trade. Since Claude Shannon in 1948 and Peter Drucker in the 1950s, we have all spoken of the information economy as if it were a new idea. But both Newton’s physics and his gold standard were information systems. More specifically, the Newtonian system is what we call today an information theory. Newton’s
…
biographers typically underestimate his achievement in establishing the information theory of money on a firm foundation. As one writes, Watching over the minting of a nation’s coin, catching a
…
. The mathematicians and philosophers might talk on for decades, unaware that they had been decapitated. Their successors talk on even today. But the triumphs of information theory and technology had put an end to the idea of a determinist and complete mathematical system for the universe. At the time, the leading champion
…
This recognition would liberate von Neumann himself. Not only could men discover algorithms, they could compose them. The new vision ultimately led to a new information theory of biology, anticipated in principle by von Neumann and developed most fully by Hubert Yockey,10 in which human beings might eventually reprogram parts of
…
electrons, computer logic could not escape self-referential loops as its own logical structures informed its own algorithms.12 Gödel’s insights led directly to Claude Shannon’s information theory, which underlies all computers and networks today. Conceiving the bit as the basic unit of digital computation, Shannon defined information as surprising bits—that
…
is also assured by the one-way passage of thermodynamic entropy. Gödel’s work, and Turing’s, led to Gregory Chaitin’s concept of algorithmic information theory. This important breakthrough tested the “complexity” of a message by the length of the computer program needed to generate it. Chaitin proved that physical laws
…
math, the mathematics that comes after Gödel, 1931, and Turing, 1936, open not closed math, the math of creativity. . . . ”13 That is the mathematics of information theory, of which Chaitin is the supreme living exponent. Cleaving all information is the great divide between creativity and determinism, between information entropy of surprise and
…
between consciousness and machines. Not only was a new science born but also a new economy, based on a new system of the world—the information theory articulated in 1948 by Shannon on the foundations first launched in a room in Königsberg in September 1930. This new system of the world was
…
Hector Garcia Molina supervised the doctoral work of the founders. Rollerblading down the corridors of Stanford’s computer science pantheon in the madcap spirit of Claude Shannon, the Google founders consorted with such academic giants as Donald Knuth, the conceptual king of software, Bill Dally, a trailblazer of parallel computation, and even
…
impressive portent for Google’s human face-recognition project or for car vision systems that need flawlessly to identify remote objects in real time. As Claude Shannon showed, these success rates of 95 percent, or even 99.999 percent, are deceptive, because you have no way of telling which instances are the
…
of von Neumann and Gödel early in the last century or with the breakthroughs in information theory of Claude Shannon, Gregory Chaitin, Andrei Kolmogorov, and John R. Pierce. In a series of powerful arguments, Chaitin, the inventor of algorithmic information theory, has translated Gödel into modern terms. When Silicon Valley’s AI theorists push the logic
…
the time of his death in 1922, he had turned his precursors’ improvisations into a full-fledged system. Markovian techniques, which pervade the science of information theory, are behind the dominant advances of the Google era, from big data and cloud computing to speech recognition and machine learning. In an early triumph
…
properties could be grasped mathematically and predicted without knowing the particular language. In focusing on patterns of vowels and consonants, Markov came close to anticipating Claude Shannon’s information metric. Shannon’s theory treated all transmitters across a communications channel as Markov processes.3 Refining and extending Markov’s discoveries through the
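A Markov source of the kind Shannon used to model transmitters is easy to sketch: the probability of each symbol depends only on the symbol before it. This toy character-level trainer and generator is an illustration of the idea, not code from the book:

```python
import random
from collections import defaultdict

# A first-order Markov model of text: each character's distribution depends
# only on the preceding character. Illustrative sketch of the source models
# Shannon adopted from Markov.

def train(text):
    transitions = defaultdict(list)
    for prev, nxt in zip(text, text[1:]):
        transitions[prev].append(nxt)   # duplicates encode frequencies
    return transitions

def generate(transitions, start, n, seed=0):
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        choices = transitions.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return "".join(out)

model = train("the theory of the information of the era")
print(generate(model, "t", 20))
```

Even this crude model reproduces local statistics (letter pairs) of its training text, which is why Shannon’s increasingly ordered approximations to English looked eerily language-like.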
…
his company. Mercer’s IBM boss, Fred Jelinek, was a protégé of the MIT information theorist Robert Fano and a student of Claude Shannon. He saw speech recognition as an information theory problem—an acoustic signal and a noisy channel. Citing the content-neutral concept behind his speech-recognition successes, Jelinek proudly declared, “Every
…
Brook Harbor off Long Island Sound. This in Markov terms is an “absorbing state” (no further turns). You have arrived. In my investigation of computers, information theory, Markov, and money, I imagined that I had penetrated to the secret heart of Google’s intellectual regime across the continent in Silicon Valley. I
…
model adds profits, the economic manifestation of entropy—the unexpected dimension of returns beyond the interest rate, which reflects average and predictable returns. Derived from Claude Shannon’s information theory, entropy in my model is surprise. Small and temporary anomalies are unsurprising and low-entropy. Correcting for leverage, I contend that profits that merely
…
the unwary into sterile fields of algorithmic finance. At IBM, by contrast, Mercer and his colleagues under Jelinek achieved a permanent advance in computer science, information theory, and speech recognition. Their discoveries are behind the Siri system in your iPhone, hands-free calling in your car, and the growing success of machine
…
great minds of the era, who imagine that information is order, or, as they sometimes put it, revealing their incomprehension, negentropy. In both thermodynamics and information theory, entropy is disorder, not order. Order defines the expected bits, the redundancy. Entropy measures the unexpected ones and gauges the information, measured by the degrees
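The claim that entropy measures the unexpected bits has a precise form: Shannon’s H = −Σ p·log₂ p, the average surprisal of a source. A brief illustrative computation (not drawn from the book):

```python
from math import log2

# Shannon entropy: the average surprisal, in bits, of a symbol source.
# Maximal for a uniform ("disordered") source, zero for a fully
# predictable one. Illustrative sketch.

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))     # 1.0 bit: a fair coin flip
print(entropy([0.99, 0.01]))   # ~0.08 bits: nearly no surprise
print(entropy([0.25] * 4))     # 2.0 bits: four equally likely symbols
```

The skewed source carries almost no information per symbol because its output is almost always the expected one, which is the sense in which order is redundancy and entropy gauges surprise.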
…
it all, he might say, “More dimensions—I have no need for that hypothesis.” Eclipsing consciousness, freedom of choice, and surprise, this faith ultimately defies information theory itself. Information depends on the range of freedom of choice and the surprise that can be perceived only by a conscious being. This materialist superstition
…
legatee of Shannon’s vision. Like Shannon, he can move seamlessly between the light and dark sides of information, between communication and cryptography. Shannon’s information theory, like Turing’s computational vision, began with an understanding of codes. His first major paper, “A Mathematical Theory of Cryptography” (1945), proved that a perfect
…
same time, reading the same news, watching the same performers. Invisibly they share your interest in trail races or tax rates or Leonard Cohen or Information Theory or track and field or Joan Didion or network processors or March Madness or Art Tatum or beautiful women or Yo-Yo Ma or the
…
all communications. That act was based on the assumption that technology defines and identifies service: that conduit defines content. This concept is irreconcilable both with Information Theory and with the Internet that is based on it. A product of the long-gone days when broadcast meant TV and copper wires defined telephony
…
recentralization of computing and ensures the emergence of a new architecture. Lo and behold, here it is. It is based on the same cryptography that Claude Shannon and Alan Turing developed during World War II. It now provides a new computer architecture founded on blockchains, mathematical hashes, and the array of associated
…
age of human accomplishment. Propelling such advances will be a shift of focus from the fruits of computation to its roots in trust and security. Information theory always expounded reality from two sides. On one side it measures and enables communication, transmission, redundancy, and reliable copying across time and space. On the
…
machine; on the other side, it is a truth machine, attempting to resolve the ground states of the world. In 1948, when Shannon developed his Information Theory at Bell Labs, the world was preoccupied with communication across a noisy channel. Questions of truth and consequences deferred to questions of signal
…
and noise. Information Theory began with “Communication Theory of Secrecy Systems.” This paper proved that a perfectly random one-time pad constitutes an unbreakable code. It is a pillar
…
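The one-time-pad claim above can be illustrated with a minimal sketch (the function name is mine, not Shannon’s). Because XOR is its own inverse, encryption and decryption are the same operation; a truly random key, as long as the message and used only once, leaks nothing about the plaintext:

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte.
    Applying the same key twice recovers the original: (p ^ k) ^ k == p."""
    assert len(key) == len(data), "one-time pad key must match message length"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # fresh, uniformly random key
ciphertext = otp_xor(message, key)
recovered = otp_xor(ciphertext, key)      # decryption is the same XOR
assert recovered == message
```

Reusing the key even once destroys the guarantee, which is why practical systems trade perfect secrecy for shorter, reusable keys.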
molecules). Ludwig Boltzmann (1844–1906) identified this difference with missing information, or uncertainty about the arrangement of the molecules, thus opening the way for Claude Shannon and information theory. Both forms of entropy register disorder. Boltzmann’s entropy is analog and governed by the natural logarithm e, while Shannon’s entropy is digital and
…
governed by log 2. Chaitin’s Law: Gregory Chaitin, inventor of algorithmic information theory, ordains that you cannot use static, eternal, perfect mathematics to model dynamic creative life. Determinist math traps the mathematician in a mechanical process that cannot
…
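The base of the logarithm is the only formal difference noted above: log base 2 yields entropy in bits (Shannon), the natural log yields nats (Boltzmann), and the two differ by a constant factor of ln 2. A small sketch (function name hypothetical):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy of a discrete distribution.
    base=2 gives bits; base=math.e gives nats."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

fair_coin = [0.5, 0.5]
print(entropy(fair_coin))            # 1.0 bit
print(entropy(fair_coin, math.e))    # ~0.693 nats (= ln 2)
```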
” that used numbers to encode and prove algorithms also expressed in numbers. This invention, absorbed by von Neumann and Alan Turing, launched computer science and information theory and enabled the development of the Internet and the blockchain. Gold: The monetary element, atomic number 79, tested over centuries and found uniquely suitable as
…
estate are now nine times global GDP. That’s not capitalism; that’s hypertrophy of finance. Information Theory: Begun by Kurt Gödel when he made logic into functional mathematics and algorithms. Information theory evolved through the minds of Claude Shannon (1916–2001) and Alan Turing (1912–1954) into its current role as mathematical philosophy. It depicts
…
—that is, the larger the set of possible messages—the greater the composer’s choice and the higher the entropy and information of the message. Information theory both enables and describes our digital and analog world. Main Street: The symbol of the real economy of workers paid hourly or monthly and sealed
…
, such as parallel processing, multi-threading, lower voltages, and three-dimensional chip architectures. As a learning curve, Moore’s Law is an important principle of information theory. Noise: Interference in a message. Any influence of the conduit on the content: An undesired disturbance in a communications channel. Noise is commonly the distortion
…
Poverty, Life after Television, Knowledge and Power, and The Scandal of Money. A founding fellow of the Discovery Institute, where he began his study of information theory, and an influential venture investor, he lives with his wife in western Massachusetts. LIKE REGNERY ON FACEBOOK FOLLOW US ON TWITTER Notes Prologue: Back to
…
. William Briggs, Uncertainty: The Soul of Modeling, Probability and Statistics (Switzerland: Springer International Publishing, 2016), 32. 10. Hubert Yockey, Information Theory, Evolution, and the Origin of Life (New York: Cambridge University Press, 2005); see also Yockey, Information Theory and Molecular Biology (1992). 11. George Dyson, Turing’s Cathedral: The Origins of the Digital Universe (New York: Pantheon
…
A. Knopf, 2016). Wu, Tim. The Master Switch: The Rise and Fall of Information Empires revised paperback edition (New York: Vintage Books, 2011). Yockey, Hubert. Information Theory, Evolution, and the Origin of Life (New York: Cambridge University Press, 2005). Periodicals Andreessen, Marc. “Why Bitcoin Matters,” The New York Times, January 21, 2014
by Benjamin Peters · 2 Jun 2016 · 518pp · 107,836 words
Wiener, Warren McCulloch, and Donald MacKay, the technical and technocratic insights into a summary set of cybernetic sciences—operations research, systems theory, game theory, and information theory—presented themselves with seemingly cosmological force, delivering balance to a postwar world riven by rage. Modern computing talk owes a fair amount to these cybernetic
…
Wiener himself, the mathematician and game theorist John von Neumann, leading anthropologist Margaret Mead and her then husband Gregory Bateson, founding information theorist and engineer Claude Shannon, sociologist-statistician and communication theorist Paul Lazarsfeld, psychologist and computer scientist J.C.R. Licklider, as well as influential psychiatrists, psychoanalysts, and philosophers such as
…
immensely influential:13 Von Neumann pioneered much of the digital architecture for the computer as well as cold war game theory;14 Shannon founded American information theory; Bateson facilitated the adaptation of cybernetics in anthropology and the American counterculture;15 Lazarsfeld fashioned much of postwar American mass communication research;16 and of
…
rendered its vocabulary fecund for other sibling fields embedded in U.S. military-industrial research.18 Take, for example, the contemporary fields of information theory and game theory. Mainstream American information theory, following Bell Labs engineer Claude E. Shannon’s 1948 mathematical theory of communication, concentrates on the efficient and reliable measurement and transmission
…
whole.21 The founders of these fields disagreed about the limits and relationships between the three fields. Shannon insisted on keeping the technical principles of information theory separate from the more sweeping scope of cybernetics, Von Neumann did not rigorously distinguish between the three, and Wiener defended his grouping of the other
…
computation communication sciences. Shannon did not accept the label of cybernetics, and he also did not accept the label others had given to his own “information theory,” preferring to the end of his life his original emphasis on “mathematical theory of communication.” Each of these sciences sought to theorize the technical means
…
in 1950); Jacques Lacan’s turning to mathematical concepts; Roland Barthes’s turn to schematic accounts of communication; Gilles Deleuze’s abandonment of meaning, with Claude Shannon’s information theory in hand; Felix Guattari’s, Michel Foucault’s, and other French theorists’ experimentation with terms such as encoding, decoding, information, and communication.34 Postmodern
…
French theory owes a deep debt to postwar information theory and the cybernetic sciences. In England, cybernetics took on a different character in the form of the Ratio Club, a small but potent gathering of
…
Pask, and management cyberneticist Stafford Beer (who also features prominently in the Chilean cybernetic situation described below). Interdisciplinary discussions ranged widely across themes such as information theory, probability, pattern recognition, artifacts that act (such as William Ross Ashby’s homeostat and W. Grey Walter’s robotic tortoises), and philosophy. Among their guests
…
a practical field of human-machine applications into a society well suited to adopt them. The coauthors also integrated and expanded the stochastic analysis of Claude Shannon’s information theory while simultaneously stripping Wiener’s organism-machine analogy of its political potency.71 Wiener’s core analogies between animal and machine, machine and mind
…
rational foe to the point where the positions are reversed and foe and friend become indistinguishable.101 Cybernetics—like its sister disciplines of game theory, information theory, and others—appears as a method for rationalizing the enemy, distributing structural strategy evenly across opponents and flattening the chances that an enemy will have
…
), and, the most prominent of the Soviet cybernetic social sciences, “economic cybernetics” (discussed in later chapters).120 By 1967, the range of cybernetic sections enveloped information theory, information systems, bionics, chemistry, psychology, energy systems, transportation, and justice, with semiotics joining the linguistic section and medicine uniting with biology. Sheltering a huddling crowd
…
His vision describes a technical future that was obvious to information theorists, who were the technocratic twin of cyberneticists and could be traced back to Claude Shannon of Bell Labs and his seminal 1948 article “A Mathematical Theory of Communication.” (Kharkevich was himself a leading information theorist and specialist in noise reduction
…
domination of the Soviet school of chess over the Americans was an expression of superior long-range socialist planning. In 1968, having been influenced by Claude Shannon’s less well-known 1950 work on computer chess, Botvinnik published An Algorithm for Chess, which successfully demonstrated how to algorithmically organize attacks against an
…
, increase the installation of computers by 100 to 130 percent, build computer centers for collective use, create integrated information banks, and significantly increase research in information theory, cybernetics, microelectronics, and radio physics. The passage of time has allowed some reflection on the sources of these challenges. In 1999, Fedorenko contemplated the stubborn
…
benefited from the valuable comments of Geof Bowker, Peter Sachs Collopy, Paul Edwards, Bernard Geoghegan, Lydia Liu, Eden Medina, and Mara Mills on cybernetics and information theory, while Alex Bochannek, Elena Doshlygina, Michael Gordin, Loren Graham, Martin Kragh, Adam Leeds, Ksenia Tatarchenko, and others have taught me much about the Soviet situation
…
the Soviet Union,” Science Studies 4 (1974): 299–337; and David Mindell, Jerome Segal, and Slava Gerovitch, “From Communications Engineering to Communications Science: Cybernetics and Information Theory in the United States, France, and the Soviet Union,” in Science and Ideology: A Comparative History, ed. Mark Walker, 66–96 (New York: Routledge, 2003
…
Society 24 (2007): 27–46; Lydia Liu, “The Cybernetic Unconscious: Rethinking Lacan, Poe, and French Theory,” Critical Inquiry 36 (2010): 288–320; Bernard Geoghegan, “From Information Theory to French Theory: Jakobson, Lévi-Strauss, and the Cybernetic Apparatus,” Critical Inquiry 38 (2011): 96–126. On cybernetics in Britain, see Andrew Pickering, The Cybernetic
…
, “How to Be Universal: Some Cybernetic Strategies, 1943–70”; Galison, “The Ontology of the Enemy”; and J. R. Pierce, “The Early Days of Information Theory,” IEEE Transactions on Information Theory 19 (1) (1973): 3–8; and especially Ronald R. Kline, The Cybernetics Moment, Or Why We Call Our Age the Information Age (Baltimore, MD
…
Chicago Press, 2013). 22. Claude E. Shannon, “The Bandwagon,” IRE Transactions on Information Theory 2 (1) (1956): 3. See also Pierce, “The Early Days of Information Theory”; Norbert Wiener, “What Is Information Theory?,” IRE Transactions on Information Theory 2 (1956): 48; Ronald R. Kline, “What Is Information Theory a Theory Of? Boundary Work among Scientists in the United States and
…
la notion scientifique d’information (Paris: Syllepse, 2003). 33. Mindell, Segal, and Gerovitch, “From Communications Engineering to Communications Science.” 34. Ibid. See also Geoghegan, “From Information Theory to French Theory”; Céline LaFontaine, “The Cybernetic Matrix of French Theory”; and LaFontaine, L’empire cybernétique: des machines à penser à la pensée machine (Paris
…
), 32. 45. Aleksandr Kharkevich, “Informatsia i tekhnika” [“Information and Technology”], Kommunist 17 (1962): 94. 46. Ibid. For an example of his earlier and largely technocratic information theory work, see Aleksandr A. Kharkevich, “Basic Features of a General Theory of Communication,” Radiotekhnika [Radio Engineering] 9 (5) (1954). For the CIA document, see Conway
…
: Routledge, 2000. Geoghegan, Bernard. The Cybernetic Apparatus: Media, Liberalism, and the Reform of the Human Sciences. Ph.D. diss., Northwestern University, 2012. Geoghegan, Bernard. “From Information Theory to French Theory: Jakobson, Lévi-Strauss, and the Cybernetic Apparatus.” Critical Inquiry 38 (2011): 96–126. Geoghegan, Bernard. “The Historiographic Conceptualization of Information: A Critical
…
R. The Cybernetics Moment, Or Why We Call Our Age the Information Age. Baltimore, MD: Johns Hopkins University Press, 2015. Kline, Ronald R. “What Is Information Theory a Theory Of? Boundary Work among Scientists in the United States and Britain during the Cold War.” In The History and Heritage of Scientific and
…
, and Computing before Cybernetics. Baltimore: Johns Hopkins University Press, 2002. Mindell, David, Jerome Segal, and Slava Gerovitch. “From Communications Engineering to Communications Science: Cybernetics and Information Theory in the United States, France, and the Soviet Union.” In Science and Ideology: A Comparative History. Edited by Mark Walker, 66–96. New York: Routledge
…
, 2004. Pickering, Andrew. The Cybernetic Brain: Sketches of Another Future. Chicago: University of Chicago Press, 2010. Pierce, J. R. “The Early Days of Information Theory,” IEEE Transactions on Information Theory 19 (1) (1973): 3–8. Pipes, Richard. Russian Conservatism and its Critics: A Study in Political Culture. New Haven: Yale University Press, 2005. Poletaev
…
, Claude E. “A Mathematical Theory of Communication.” Bell System Technical Journal 27 (1948): 379–423, 623–656. Shannon, Claude E. “The Bandwagon,” IRE Transactions on Information Theory 2 (1) (1956): 3. Shipler, David. Russia: Broken Idols, Solemn Dreams. New York: Times Books, 1983. Shirky, Clay. Cognitive Surplus: Creativity and Generosity in a
…
Technology,” 98 Information-coordination problems, 61–62, 67 Information index, 66 Information science, 39–40 Information systems, 19 “Information Technology in the National Economy,” 107 Information theory, 20–21, 37, 98–100 Infrastructural inversion, 8 Institute directors, 140 Institute for Telecommunications, 185 Institute for the Problems of the Transmission of Information (IPPI
by Fred Turner · 31 Aug 2006 · 339pp · 57,031 words
Wiener, the world, like the anti-aircraft predictor, was composed of systems linked by, and to some extent made out of, messages. Drawing on Claude Shannon’s information theory (published in 1948, but likely familiar to Wiener much earlier), Wiener defined messages as “forms of pattern and organization.”37 Like Shannon’s information, Wiener
…
complete misconception of the facts that their underlings possess.”38 Both Cybernetics and The Human Use of Human Beings were best sellers, and together with Claude Shannon and Warren Weaver’s 1949 Mathematical Theory of Communication, they sparked a decade’s worth of debate about the proper role of computers in society
…
fundamentals of butterfly ecology and systems-oriented approaches to evolutionary biology. These preoccupations reflected the extraordinary influence of cybernetics and information theory on American biology following World War II. At the level of microbiology, information theory provided a new language with which to understand heredity. Under its influence, genes and sequences of DNA became information
…
of text to be read and decoded. In the 1950s, as Lily Kay has pointed out, microbiology became “a communication science, allied to cybernetics, information theory, and computers.” Information theory also exerted a tremendous pull on biological studies of organisms and their interaction. Before World War II, biologists often focused on the study of
…
Cage and Rauschenberg, he was right: for them, the making of art had become the building of systems of pattern and randomness, and thus, in Claude Shannon’s sense, of information.13 For Stewart Brand, such insights echoed Paul Ehrlich’s systems view of the natural world. They also offered new models
…
considered closed systems—when he was a naval officer. Yet his writings also bear the imprint of cold war–era military-industrial information theory. For Fuller, as for Wiener and the systems analysts of later decades, the material world consisted of information patterns made manifest. The patterns could be
…
the I Ching’s sayings as clues to a set of otherwise invisible probabilities, he could also act in concert with the probabilistic outlook of information theory. He could become a Comprehensive Designer, using the informational energies of the world to transform the “system” that was his life and, according to New
…
“energy” common to the mythos of LSD and multimedia theater, but also the celebration of form, system, and homeostasis common to cybernetics, population biology, and information theory. In this way, like the Catalog itself, they bridged high science and counterculture. In the Catalog, the products of these worlds might be juxtaposed on
…
level, the turn toward coevolution marked a return to the systems orientation of the Whole Earth Catalog. At another, it represented a shift both in information theory and in its relationship to the New Communalist critique of technocracy. To the communities among which Stewart Brand moved in the 1960s—USCO, the downtown
…
did, that the human brain and the computer could model one another. Even so, the RLE, like the Rad Lab, offered a rich soup of information theory and rhetoric, much of it growing out of the cybernetic intuition that digital systems and natural systems might model one another. Over the next two
…
containing on average forty items.42 The first category, “Prime Information,” played the role of the former “Whole Systems” section. It featured books on cybernetics, information theory, and even “whole systems,” as the earlier section had. But it also included reviews of newer work, such as Richard Dawkins’s The Selfish Gene
…
communes such as Libre and the Farm invoked as they gathered to build alternative communities. On the other hand, the technophilic orientation of cybernetics and information theory, together with the example of idiosyncratic technocrats such as Buckminster Fuller, offered the youth of the 1960s a solution to another dilemma as well. Although
…
as ordinary businesspeople and manufacturers, but as a countercultural elite. In the 1960s, these sorts of legitimacy exchange had allowed for the promiscuous mingling of information theory and other systems-oriented doctrines, particularly psychedelic mysticism and disciplines derived from Buddhism and other Eastern traditions. In the 1990s, they facilitated the fusion of
…
little in the way of language with which to describe, let alone confront, a less-than-egalitarian distribution of resources. The same was true of information theory and the universal rhetoric of cybernetics. In both cases, human power was an individual possession, born of the proper use of technologies for the amplification
…
Theory, 3. For Bertalanffy, cybernetics was only one root of systems theory, albeit an important one. Others included the servomechanisms of the nineteenth century, Claude Shannon’s information theory, Von Neumann and Morgenstern’s game theory, and the increasing need in the post–World War II world to monitor and control large systems for
…
a large-scale shift in American art away from formalism and the construction of objects and toward the use of technology and the embrace of information theory. Jack Burnham, an art critic and professor at Northwestern University, was perhaps the most articulate spokesman for this vision. In 1968, in “Systems Esthetics,” he
…
described the rise of a “systems esthetic” in American art. In keeping with the discoveries of cybernetics and information theory, he explained, artists had come to see themselves as engineers and scientists. At one level, they embraced the mission of scientific research, seeking to reveal
…
Designer, 56–57, 58, 244; Dymaxion principle, 113; geodesic dome, 65, 94; Ideas and Integrities, 56, 57, 83; imprint of cold war–era military-industrial information theory on, 58; key influence on Whole Earth community, 4, 43, 49, 80, 82, 89, 243; notion of the world as an information system, 57–58
…
paradox of, 136–37; free dissemination of, 137 Information Processing Techniques Office, 108 Information Superhighway, 219 information system, material world imagined as, 15 information theory: and American art, 268n13; of Claude Shannon, 265n43; and microbiology, 43–44 Information Week, 131 Innis, Harold, 52, 269n21 Institute for Advanced Study, 185 Intel, 212 Intercontinental Ballistic Missile
by Benoit Mandelbrot · 30 Oct 2012
saw on no other occasion. My restless curiosity led me to read works that were widely discussed when they appeared: Mathematical Theory of Communication by Claude Shannon, Cybernetics, or Control and Communication in the Animal and the Machine by Norbert Wiener, and Theory of Games and Economic Behavior by John von Neumann
…
a serious vein, liaising was a good opportunity to scout for Ph.D. topics. At Caltech, I had read the seed papers in which Claude Shannon founded information theory, and I badly wanted to know more. A get-together in London on this topic attracted me greatly, so I asked if I could attend
…
, which is grammar. In one of the very few clear-cut eureka moments of my life, I saw that it might be deeply linked to information theory and hence to statistical thermodynamics—and became hooked on power law distributions for life. Those “details” had eluded not only Zipf—not trained as a
…
intrinsically messy and suffers more from soulless order than from surrounding physical decay. Controversial Balance Between Conjecture and Proof Claude Shannon (1916–2001) was the intellectual leader whose wartime work, published in 1948, created information theory and provided RLE with an intellectual backbone. His work on noiseless channels was a point of departure for
…
for full credit, he nominated Landau for a prize for this work. Tisza and I interacted intensely for a few years after a symposium on information theory held at MIT in the summer of 1956. The paper I presented there described an axiomatic for statistical thermodynamics that developed from the second half
…
confuse with that of a “real” course, I settled on a misnamed groupe de recherche that resided in my briefcase and consisted of lectures on information theory that I gave and published. In research, I kept running around. France happened to be abuzz with great political theater, thanks to its prime minister
by Brian Christian · 5 Oct 2020 · 625pp · 167,349 words
based on the statistics of the language itself? Constructing these kinds of predictive models has long been a grail for computational linguists.54 (Indeed, Claude Shannon founded information theory in the 1940s on a mathematical analysis of this very sort, noticing that some missing words are more predictable than others, and attempting to quantify
…
, and Curiosity, Berlyne notes that a proper study of curiosity first began to emerge in the late 1940s; it is no coincidence, he argues, that information theory and neuroscience also came into their own at the same time.21 A proper understanding of curiosity appears only to be possible at the interdisciplinary
…
on nights or weekends. There was too much else to do.23 His ideas, in particular the agenda of reaching out to both neuroscience and information theory for clues, would inspire succeeding generations of psychologists for the latter half of the twentieth century, and in the twenty-first they would come full
…
be done with it.”46 For Schmidhuber, just as it was for Berlyne in the ’60s, this idea of learning has its mathematical roots in information theory and—to Schmidhuber’s mind in particular—the notion of data compression: that a more readily understood world is more concisely compressible. In fact, for
…
extend the inverse-reinforcement-learning framework. In 2008, then–PhD student Brian Ziebart and his collaborators at Carnegie Mellon developed a method using ideas from information theory. Instead of assuming that the experts we observe are totally perfect, we can imagine that they are simply more likely to take an action the
…
in Human Behavior. 20. Berlyne, Conflict, Arousal, and Curiosity. 21. And see, for instance, Berlyne’s own “Uncertainty and Conflict: A Point of Contact Between Information-Theory and Behavior-Theory Concepts.” 22. For a twenty-first-century overview of “interest” as a psychological subject, see, e.g., Silvia, Exploring the Psychology of
…
://www.youtube.com/watch?v=kN6ifrqwIMY. 28. Ziebart et al., “Maximum Entropy Inverse Reinforcement Learning,” which leverages the principle of maximum entropy derived from Jaynes, “Information Theory and Statistical Mechanics.” See also Ziebart, Bagnell, and Dey, “Modeling Interaction via the Principle of Maximum Causal Entropy.” 29. See Billard, Calinon, and Guenter, “Discriminative
…
. ———. “‘Interest’ as a Psychological Concept.” British Journal of Psychology, General Section 39, no. 4 (1949): 184–95. ———. “Uncertainty and Conflict: A Point of Contact Between Information-Theory and Behavior-Theory Concepts.” Psychological Review 64, no. 6 (1957): 329–39. Bernoulli, Daniel. “Specimen theoriae novae de mensura sortis.” Comentarii Academiae Scientarum Imperialis Petropolitanae
…
Character Recognition System Using Decision Functions.” IRE Transactions on Electronic Computers 4 (1957): 247–54. ———. “On Optimum Recognition Error and Reject Tradeoff.” IEEE Transactions on Information Theory 16, no. 1 (1970): 41–46. Christian, Brian, and Tom Griffiths. Algorithms to Live By. Henry Holt, 2016. Christiano, Paul F., Jan Leike, Tom Brown
…
Pargetter. “Oughts, Options, and Actualism.” Philosophical Review 95, no. 2 (1986): 233–55. James, William. Psychology: Briefer Course. New York: Macmillan, 1892. Jaynes, Edwin T. “Information Theory and Statistical Mechanics.” Physical Review 106, no. 4 (1957): 620–30. Jefferson, Thomas. Notes on the State of Virginia. Paris, 1785. Jelinek, Fred, and Robert
…
University, 214–15 indirect normativity, 223 infant development. See child development inference, 251–53, 269, 323–24, 385n39, 398nn29–30 See also inverse reinforcement learning information theory, 34–35, 188, 197–98, 260–61 Innocent IX (Pope), 303 Institute for Ophthalmic Research, 287 intelligence artificial general intelligence and, 209 reinforcement learning and
by Jiawei Han, Micheline Kamber and Jian Pei · 21 Jun 2011
, can be inferred from either the appearance or absence of indicators. Mutual information is one of several possible weighting functions. It is widely used in information theory to measure the mutual dependence of two random variables. Intuitively, it measures how much information one random variable conveys about the other. Given two frequent
…
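As a sketch of the weighting function described above, mutual information can be computed directly from a joint probability table; the implementation below is mine, not the book’s:

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x) p(y)) ), in bits.
    `joint` is a 2-D list giving the joint distribution p(x, y)."""
    px = [sum(row) for row in joint]            # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (column sums)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Independent variables carry zero mutual information...
independent = [[0.25, 0.25], [0.25, 0.25]]
# ...while perfectly correlated binary variables share a full bit.
correlated = [[0.5, 0.0], [0.0, 0.5]]
print(mutual_information(independent))  # 0.0
print(mutual_information(correlated))   # 1.0
```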
of tuples in D and , respectively. Information Gain ID3 uses information gain as its attribute selection measure. This measure is based on pioneering work by Claude Shannon on information theory, which studied the value or “information content” of messages. Let node N represent or hold the tuples of partition D. The attribute with the
…
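ID3’s attribute selection described above can be sketched as follows: information gain is the entropy of the class distribution minus the expected entropy after splitting on the attribute (hypothetical code, not the book’s):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of the class-label distribution."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Gain(A) = Info(D) - sum over values v of (|D_v|/|D|) * Info(D_v)."""
    total = entropy(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(part) / len(labels) * entropy(part)
                    for part in partitions.values())
    return total - remainder

# Toy data: attribute 0 perfectly predicts the class; attribute 1 is useless.
rows = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")]
labels = ["yes", "yes", "no", "no"]
print(information_gain(rows, labels, 0))  # 1.0
print(information_gain(rows, labels, 1))  # 0.0
```

ID3 splits on the attribute with the highest gain, here attribute 0.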
detailed discussion on attribute selection measures, see Kononenko and Hong [KH97]. Information gain was proposed by Quinlan [Qui86] and is based on pioneering work on information theory by Shannon and Weaver [SW49]. The gain ratio, proposed as an extension to information gain, is described as part of C4.5 (Quinlan [Qui93]). The
…
Gini index was proposed for CART in Breiman, Friedman, Olshen, and Stone [BFOS84]. The G-statistic, based on information theory, is given in Sokal and Rohlf [SR81]. Comparisons of attribute selection measures include Buntine and Niblett [BN92], Fayyad and Irani [FI92], Kononenko [Kon95], Loh and
…
. For movies, they may look for similar genres, directors, or actors. For articles, they may look for similar terms. Content-based methods are rooted in information theory. They make use of keywords (describing the items) and user profiles that contain information about users' tastes and needs. Such profiles may be obtained explicitly
…
, A.; Green, P.; Carroll, J., k-modes clustering, J. Classification 18 (2001) 35–55. [CH67] Cover, T.; Hart, P., Nearest neighbor pattern classification, IEEE Trans. Information Theory 13 (1967) 21–27. [CH92] Cooper, G.; Herskovits, E., A Bayesian method for the induction of probabilistic networks from data, Machine Learning 9 (1992) 309
…
., Towards on-line analytical mining in large databases, SIGMOD Record 27 (1998) 97–107. [Har68] Hart, P.E., The condensed nearest neighbor rule, IEEE Trans. Information Theory 14 (1968) 515–516. [Har72] Hartigan, J., Direct clustering of a data matrix, J. American Stat. Assoc. 67 (1972) 123–129. [Har75] Hartigan, J.A
…
Systems Design and Implementation (OSDI’04) San Francisco, CA. (Dec. 2004), pp. 20–22. [Llo57] Lloyd, S.P., Least squares quantization in PCM, IEEE Trans. Information Theory 28 (1982) 128–137; (original version: Technical Report, Bell Labs, 1957). [LLS00] Lim, T.-S.; Loh, W.-Y.; Shih, Y.-S., A comparison of prediction
by Kenneth Cukier, Viktor Mayer-Schönberger and Francis de Véricourt · 10 May 2021 · 291pp · 80,068 words
of Thought (Princeton, NJ: Princeton University Press, 2017). Reframing the economy: For a fascinating reframing of economics through the lens of Claude Shannon’s information theory, see: George Gilder, Knowledge and Power: The Information Theory of Capitalism and How It Is Revolutionizing our World (Washington, DC: Regnery, 2013). The idea of a “circular economy” is another
by William Poundstone · 18 Sep 2006 · 389pp · 109,207 words
into their own language, and then it soon turns into something completely different. —Johann Wolfgang von Goethe CONTENTS Prologue: The Wire Service PART ONE: ENTROPY Claude Shannon • Project X • Emmanuel Kimmel • Edward Thorp • Toy Room • Roulette • Gambler’s Ruin • Randomness, Disorder, Uncertainty • The Bandwagon • John Kelly, Jr. • Private Wire • Minus Sign
…
1956. Shannon had done what practically no one else had done since the Renaissance. He had single-handedly invented an important new science. Shannon’s information theory is an abstract science of communication that lies behind computers, the Internet, and all digital media. “It’s said that it is one of
…
Shannon’s greatest accomplishment. Shannon’s supreme opus, information theory, turned out to be one of those all-encompassing ideas that sweep up everything in history’s path. In the 1960s, 1970s, and 1980s, scarcely a year went by without a digital “trend” that made Claude Shannon more relevant than ever. The transistor, the
…
journalists and pundits trying to make sense of the digital juggernaut. Shannon’s reputation burgeoned. Largely on the strength of his groundbreaking 1948 paper establishing information theory, Shannon collected honorary degrees for the rest of his life. He kept the gowns on a revolving dry cleaner’s rack he built in his
…
’s ideas in the postwar years. Shannon later said that thinking about how to conceal messages with random noise motivated some of the insights of information theory. “A secrecy system is almost identical with a noisy communications system,” he claimed. The two lines of inquiry “were so close together you couldn’
…
in late 1947 or early 1948, before Bell Labs unveiled the transistor on June 30—and just about the time Shannon’s classic paper on information theory appeared. There is minor scandal associated with that paper. Shannon published “A Mathematical Theory of Communication” in a 1948 issue of the Bell System
…
) understood the teaching as having an ulterior motive. It was supposed to allow Shannon the free time to begin writing a long-anticipated book on information theory. “I am having a very enjoyable time here at M.I.T.,” Shannon wrote his Bell Labs boss, Hendrik Bode. “The seminar is going
…
some of the fundamental properties of general systems for the transmission of intelligence, including telephony, radio, television, telegraphy, etc.” This letter describes the beginning of information theory. As Shannon would ultimately realize, his theory of communication has surprising relevance to the problem of gambler’s ruin. Before Shannon, most engineers did not
…
his Massachusetts home “Entropy House”—a name whose appropriateness was apparent to all who set eyes on its interior. “I didn’t like the term ‘information theory,’” Robert Fano said. “Claude didn’t like it either.” But the familiar word “information” proved too appealing. It was this term that has stuck,
…
, “No Shannon, no Napster.” By the 1950s, the general press started to pick up on the importance of Shannon’s work. Fortune magazine declared information theory to be one of humanity’s “proudest and rarest creations, a great scientific theory which could profoundly and rapidly alter man’s view of the
…
world.” The very name “information theory” sounded expansive and open-ended. In the 1950s and 1960s, it was often used to embrace computer science, artificial intelligence, and robotics (fields that
…
fascinated Shannon but which he considered distinct from information theory). Thinkers intuited a cultural revolution with computers, networks, and mass media at its base. “The word communication will be used here in a very broad
…
the theater, the ballet, and in fact all human behavior.” These words were written by Shannon’s former employer Warren Weaver. Weaver’s essay presented information theory as a humanistic discipline—perhaps misleadingly so. Strongly influenced by Shannon, media theorist Marshall McLuhan coined the term “information age” in Understanding Media (1964). Oracular
…
(still analog in the 1960s) were changing the world. It implied, more presciently than McLuhan could have known, that Claude Shannon was a prime mover in that revolution. There were earnest attempts to apply information theory to semantics, linguistics, psychology, economics, management, quantum physics, literary criticism, garden design, music, the visual arts, and
…
Many of these artists were acquainted with at least the name of Claude Shannon and the conceptual gist of his theory. To people like Cage and Rauschenberg, who were exploring how minimal a work of music or art may be, information theory appeared to have something to say—even if no one was ever
…
entirely sure what. Shannon came to feel that information theory had been over-sold. In a 1956 editorial he gently derided the information theory “bandwagon.” People who did not understand the theory deeply were seizing on it as a trendy metaphor and
…
It was time, Elias acidly wrote, to stop publishing papers with titles like “Information Theory, Photosynthesis, and Religion.” To Shannon, Wiener, and Elias, the question of information theory’s relevance was more narrowly defined than it was for Marshall McLuhan. Does information theory have deep relevance to any field outside of communications? The answer, it appeared
…
, is yes. That is what a physicist named John Kelly described, in a paper he titled “Information Theory and Gambling.” John Kelly, Jr. IN 1894 THE CITY FATHERS of Corsicana, Texas, were drilling a new well. They struck oil instead of water. Corsicana
…
s career covered a variety of fields. He started out studying ways to compress television data. This brought him into Shannon’s new discipline of information theory, which Kelly probably absorbed through his own reading. Kelly was drawn into a line of research that had proven to be a black hole
…
contestant, suggesting that someone had inside information. In any case, Kelly was able to connect the $64,000 Question con to a theoretical question about information theory. Shannon’s theory, born of cryptography, pertains exclusively to coded messages. Some wondered whether the theory could apply in situations where no coding was involved
…
detected an unwholesome moral tone in Kelly’s article. He had submitted it to the Bell System Technical Journal. The executives worried about the title, “Information Theory and Gambling.” They feared the press might get hold of the article and conclude that Bell Labs was doing work to benefit illegal bookies. That
…
short run. Even people using a proportional betting system can, for all intents and purposes, go broke. Shannon invoked the law of large numbers throughout information theory. In a noisy communications channel where every bit is uncertain, the one certain thing is playing the percentages. Kelly used an analogous approach to make
…
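The proportional-betting idea running through the Kelly passages above can be sketched numerically. This is a minimal illustration of the Kelly fraction and of the "go broke in the short run" point, not anything from the book itself; the function names and parameters are my own.

```python
import random

def kelly_fraction(p, b):
    """Kelly's optimal fraction of bankroll to wager on a bet won
    with probability p at net odds b (win b per 1 unit staked)."""
    return p - (1 - p) / b

def simulate(p, b, frac, n_bets, bankroll=1.0, seed=0):
    """Grow a bankroll by staking a fixed fraction of it each round."""
    rng = random.Random(seed)
    for _ in range(n_bets):
        stake = frac * bankroll
        bankroll += stake * b if rng.random() < p else -stake
    return bankroll

# Even-money bet (b = 1) won 55% of the time: Kelly says stake 10%.
f = kelly_fraction(0.55, 1.0)  # 0.55 - 0.45/1 = 0.10

# Betting the Kelly fraction compounds; betting everything each round
# is wiped out by the first loss, which the law of large numbers
# makes all but certain over many bets.
steady = simulate(0.55, 1.0, f, 1000)
reckless = simulate(0.55, 1.0, 1.0, 100)  # almost surely 0.0
```

The fraction-of-bankroll stake is what keeps the bettor solvent: a string of losses shrinks the stake along with the bankroll, so ruin is approached asymptotically rather than hit outright.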
and part to a wildly disconnected set of stock market musings. Shannon wondered about the statistical structure of the market’s random walk and whether information theory could provide useful insights. He mentions such diverse names as Bachelier, (Benjamin) Graham and (David) Dodd, (John) Magee, A. W. Jones, (Oskar) Morgenstern, and (
…
saw the Kelly formula as the mathematical essence of arbitrage. In the spring term of 1956, Shannon gave a class at MIT called Seminar on Information Theory. One lecture was titled “The Portfolio Problem.” The lecture is documented only by a mimeographed lecture handout saved by student W. Wesley Peterson (now
…
small chance of doing much better, or much worse, than usual. Statisticians are at home with both types of probability distributions, and both arise in information theory. Kelly’s tale of a gambler with inside tips presupposes exactly what the efficient market theory denies. No one is supposed to have advance knowledge
…
was being started by AT&T, the Bell Journal of Economics and Management Science. This journal was an acknowledgment of how profoundly quantitative methods from information theory and physical science were transforming formerly alien fields like finance. Thorp considers the Merton paper “a masterpiece.” “I never thought about credit, actually,” Thorp
…
rested for ten years. “Our analysis enables us to dispel a fallacy,” wrote Paul Samuelson in 1969, that has been borrowed into portfolio theory from information theory of the Shannon type. Associated with independent discoveries by J. B. Williams, John Kelly, and H. A. Latané is the notion that if one
…
than the one over “information theory of the Shannon type.” Arguing alongside Samuelson were people in his MIT circle, most notably Robert C. Merton. The opposition of these thinkers to the Kelly criterion deserved to be taken seriously and was—by academia and by Wall Street professionals. Claude Shannon was not party to the
…
recruiting mathematicians and information theorists,” wrote Elwyn Berlekamp. Tragically, Shannon saw little of the 1990s’ developments in mathematical finance or the equally impressive developments in information theory. His memory lapses worsened and were diagnosed as symptoms of Alzheimer’s disease. Shannon would be driving in the car and realize he did not
…
volatile stocks. In marketing his fund, Cover has run into resistance from conventionally trained economists and financial advisers. For many people in finance, terms like information theory and the long run still raise red flags. A Wharton School professor was quizzing Cover on behalf of potential investor Gordon Getty (who did not
…
Collection, Manuscript Division, Library of Congress (hereafter “Shannon’s papers, LOC.”) Appearance with beard: Photograph in Shannon’s papers, LOC. Dixieland music: Biographical film, Claude Shannon: Father of the Information Age, produced by UCSD Jacobs School, 2002. Juggled four or five balls, small hands: Liversidge 1987 and Elwyn Berlekamp in “Reflections
…
state amplifier”: Liversidge 1987. Meeting, courtship of Moore: Betty Shannon, interview. “One was married, and the other”: Betty Shannon, interview. Planned to write book on information theory: See letters between Riordan and Shannon dated Feb. 9 and 20, 1956, Shannon’s papers, LOC. Riordan pitched the book to an editor from John
…
al. 2001, 59. “He slept when he felt like sleeping”: Chiu, Lin, Mcferron, et al. 2001, 45. Minsky comment about why Shannon quit working on information theory: Liversidge 1987. Fano on Shannon’s knowledge of problems: Reported by Boris Tsybakov on http://chnm.gmu.edu/tools/surveys/responses/80/. “I just developed
…
“proudest and rarest creations”: Quoted in Liversidge 1987. “This, of course, involves not only”: Shannon 1949. Influence on garden design: Liversidge 1987. Scientology cites Shannon, information theory: www.dianeticstheevolutionofascience.org/chapters/eos_glossary.pdf. Philip K. Dick appears to allude to this odd blend of science, religion, and science fiction in his
…
that Hubbard made any such claim. See discussion at www.religio.de/therapie/sc/relstart.html. “Information Theory, Photosynthesis, and Religion”: Elias 1958. Date of birth: See short bio in “Contributors” section of IRE Transactions on Information Theory, Feb. 1962, 189. Kelly early biography: B. F. Logan, interview; 1930 census record for Corsicana,
…
Dostoyevsky 1966. “You’ve heard of Kuhn’s paradigm shift?”: Wilcox, interview. BIBLIOGRAPHY Aftab, Omar, Pearl Cheung, Austin Kim, Sneha Thakkar, and Neelima Yeddanapudi (2001). “Information Theory: Information Theory and the Digital Age.” Cambridge: Massachusetts Institute of Technology. Bachelier, Louis (1900). Théorie de la Spéculation. Paris: Gauthier-Villars. An English translation appears in Cootner
…
161–66. Bellman, R., and R. Kalaba (1957). “On the Role of Dynamic Programming in Statistical Communication Theory.” IRE Transactions of the Professional Group on Information Theory, IT-3 no. 3:197–203. Bennett, Charles H. (1982). “The Thermodynamics of Computation—A Review.” International Journal of Theoretical Physics 21:905–40. Benter
…
2001. Cover, Thomas (1991). “Universal Portfolios.” Mathematical Finance 1, no. 1:1–29. ———(1998a). “Shannon and Investment.” IEEE Information Theory Society Newsletter, Golden Jubilee issue (Summer 1998), 10–11. ———(1998b). “Shannon Reminiscences.” IEEE Information Theory Society Newsletter, Golden Jubilee issue (Summer 1998), 18–19. Cutler, D. M., M. Poterba, and L. H. Summers (1989
…
is available at www.gutenberg.org/etext/2197. Dunbar, Nicholas (2000). Inventing Money. New York: Wiley. Elias, Peter (1958). “Two Famous Papers.” IRE Transactions on Information Theory, Sept. 1958, 99. Epstein, Richard A. (1995). Theory of Gambling and Statistical Logic. Revised ed. San Diego: Academic Press. Evans, Robert (1994). The Kid
…
sold for large sums on eBay. A reprint is available from Ziemba (ziemba@interchange.ubc.ca). Hershberg, Philip I. (n.d. [1986]). “Claude Shannon on Investment Information: The Father of Information Theory Describes His Investment Picks.” Unpublished article. Hicks, Jerry (1982). “Blackjack’s No. 1 Guru.” Los Angeles Times, July 25, 1982. Hiltzik,
…
). “A New Interpretation of Information Rate.” Bell System Technical Journal, 917–26. ———, and O. G. Selfridge (1962), “Sophistication in Computers: A Disagreement.” IRE Transactions on Information Theory, Feb. 1962, 78–80. “J. L. Kelly, Physicist, 41” (obituary) (1965). Newark Evening News, Mar. 19, 1965, 15. Kendall, Maurice G. (1953). “The Analysis
…
2003. Perold, André F. (1999). “Long-Term Capital Management, L.P.” Harvard Business School. www.hbsp.harvard.edu. Pierce, John R. (1980). An Introduction to Information Theory: Symbols, Signals and Noise. New York: Dover, 1980. (Revised edition of 1961 book titled Symbols, Signals and Noise.) Quaife, Art (1993). “Rational Portfolio Determination.” Trans
…
Shannon Lecturers” (various authors, 1998). IEEE Information Theory Society Newsletter, Summer 1998, 16–21. Reid, Ed, and Ovid Demaris (1963). The Green Felt Jungle. New York: Trident Press. Roberts, Stanley (1978). “Welcome, Dr. Thorp.” Gambling Times, Aug. 1978, 11–14. Rogers, Everett M. (n.d.). “Claude Shannon’s Cryptography Research During World War II
…
, July and Oct. 1948, 379–423, 623–56. ———(1949). The Mathematical Theory of Communication. Urbana: University of Illinois Press. ———(1956a). “The Bandwagon.” IRE Transactions on Information Theory, June 1956, 3. ———(1956b). “The Portfolio Problem.” Unpublished lecture notes, Shannon’s papers, LOC. ———(1993). Claude Elwood Shannon: Collected Papers. Eds. Neil J. A.
by David A. Mindell · 10 Oct 2002 · 759pp · 166,687 words
by M. Mitchell Waldrop · 14 Apr 2001
by Paul Sen · 16 Mar 2021 · 444pp · 111,837 words
by Jon Gertner · 15 Mar 2012 · 550pp · 154,725 words
by Howard Rheingold · 14 May 2000 · 352pp · 120,202 words
by Sharon Bertsch McGrayne · 16 May 2011 · 561pp · 120,899 words
by James Gleick · 1 Mar 2011 · 855pp · 178,507 words
by David Kahn · 1 Feb 1963 · 1,799pp · 532,462 words
by George Zarkadakis · 7 Mar 2016 · 405pp · 117,219 words
by Edward O. Thorp · 15 Nov 2016 · 505pp · 142,118 words
by Walter Isaacson · 6 Oct 2014 · 720pp · 197,129 words
by John Brockman · 19 Feb 2019 · 339pp · 94,769 words
by Brian Christian · 1 Mar 2011 · 370pp · 94,968 words
by George Dyson · 28 Mar 2012 · 463pp · 118,936 words
by David A. Sinclair and Matthew D. Laplante · 9 Sep 2019
by Steven Johnson · 15 Nov 2016 · 322pp · 88,197 words
by William Poundstone · 267pp · 71,941 words
by T. R. Reid · 18 Dec 2007 · 293pp · 91,110 words
by Matthew Cobb · 6 Jul 2015 · 608pp · 150,324 words
by Steven Levy · 15 Jan 2002 · 468pp · 137,055 words
by Donald Ervin Knuth · 15 Jan 1998
by Jordan Ellenberg · 14 May 2021 · 665pp · 159,350 words
by Stuart Russell and Peter Norvig · 14 Jul 2019 · 2,466pp · 668,761 words
by Terrence J. Sejnowski · 27 Sep 2018
by Cal Newport · 2 Mar 2021 · 350pp · 90,898 words
by David Golumbia · 31 Mar 2009 · 268pp · 109,447 words
by Daniel C. Dennett · 7 Feb 2017 · 573pp · 157,767 words
by Jack D. Schwager · 24 Apr 2012 · 272pp · 19,172 words
by Belinda Barnet · 14 Jul 2013 · 193pp · 19,478 words
by John MacCormick and Chris Bishop · 27 Dec 2011 · 250pp · 73,574 words
by Bruce Schneier · 10 Nov 1993
by Chris Bernhardt · 12 May 2016 · 210pp · 62,771 words
by Sean M. Carroll · 15 Jan 2010 · 634pp · 185,116 words
by Melanie Mitchell · 31 Mar 2009 · 524pp · 120,182 words
by George Gilder · 23 Feb 2016 · 209pp · 53,236 words
by Alec Nevala-Lee · 22 Oct 2018 · 622pp · 169,014 words
by Steven Johnson · 329pp · 88,954 words
by Cesar Hidalgo · 1 Jun 2015 · 242pp · 68,019 words
by Amy Webb · 5 Mar 2019 · 340pp · 97,723 words
by Stuart Russell · 7 Oct 2019 · 416pp · 112,268 words
by Nathan L. Ensmenger · 31 Jul 2010 · 429pp · 114,726 words
by Marcus Du Sautoy · 7 Mar 2019 · 337pp · 103,522 words
by Paul Davies · 31 Jan 2019 · 253pp · 83,473 words
by Nate Silver · 31 Aug 2012 · 829pp · 186,976 words
by Robert Elliott Smith · 26 Jun 2019 · 370pp · 107,983 words
by Erik J. Larson · 5 Apr 2021
by James Owen Weatherall · 2 Jan 2013 · 338pp · 106,936 words
by Luke Dormehl · 10 Aug 2016 · 252pp · 74,167 words
by James Gleick · 18 Oct 2011 · 396pp · 112,748 words
by Kevin Carey · 3 Mar 2015 · 319pp · 90,965 words
by Adam Kucharski · 23 Feb 2016 · 360pp · 85,321 words
by Daniel J. Levitin · 18 Aug 2014 · 685pp · 203,949 words
by Richard Dawkins · 1 Jan 2004 · 460pp · 107,712 words
by Charles Petzold · 28 Sep 1999 · 566pp · 122,184 words
by Michael Bhaskar · 2 Nov 2021
by Byron Reese · 23 Apr 2018 · 294pp · 96,661 words
by Lee Smolin · 31 Mar 2019 · 385pp · 98,015 words
by Allen C. Benello · 7 Dec 2016
by Scott Patterson · 2 Feb 2010 · 374pp · 114,600 words
by Lance Fortnow · 30 Mar 2013 · 236pp · 50,763 words
by David Epstein · 1 Mar 2019 · 406pp · 109,794 words
by J. Doyne Farmer · 24 Apr 2024 · 406pp · 114,438 words
by Benjamin Breen · 16 Jan 2024 · 384pp · 118,573 words
by Jeff Booth · 14 Jan 2020 · 180pp · 55,805 words
by Neil A. Gershenfeld · 15 Feb 1999 · 238pp · 46 words
by Gregory Zuckerman · 5 Nov 2019 · 407pp · 104,622 words
by Clive Thompson · 11 Sep 2013 · 397pp · 110,130 words
by Nicole Kobie · 3 Jul 2024 · 348pp · 119,358 words
by Michael P. Lynch · 21 Mar 2016 · 230pp · 61,702 words
by Kariappa Bheemaiah · 26 Feb 2017 · 492pp · 118,882 words
by Luciano Floridi · 25 Feb 2010 · 137pp · 36,231 words
by Thierry Poibeau · 14 Sep 2017 · 174pp · 56,405 words
by Melanie Mitchell · 14 Oct 2019 · 350pp · 98,077 words
by John Markoff · 22 Mar 2022 · 573pp · 142,376 words
by Marcos Lopez de Prado · 2 Feb 2018 · 571pp · 105,054 words
by Feng Gu · 26 Jun 2016
by David Bellos · 10 Oct 2011 · 396pp · 107,814 words
by Anil Seth · 29 Aug 2021 · 418pp · 102,597 words
by Jimmy Soni · 22 Feb 2022 · 505pp · 161,581 words
by John Markoff · 24 Aug 2015 · 413pp · 119,587 words
by Brian Bagnall · 13 Sep 2005 · 781pp · 226,928 words
by Douglas Coupland · 29 Sep 2014 · 124pp · 36,360 words
by Bill Gates, Nathan Myhrvold and Peter Rinearson · 15 Nov 1995 · 317pp · 101,074 words
by Ray Kurzweil · 13 Nov 2012 · 372pp · 101,174 words
by Anil Ananthaswamy · 15 Jul 2024 · 416pp · 118,522 words
by Foster Provost and Tom Fawcett · 30 Jun 2013 · 660pp · 141,595 words
by Alex Bellos · 3 Apr 2011 · 437pp · 132,041 words
by Ed Finn · 10 Mar 2017 · 285pp · 86,853 words
by John Seely Brown and Paul Duguid · 2 Feb 2000 · 791pp · 85,159 words
by Michael S. Malone · 20 Jul 2021
by Nassim Nicholas Taleb · 20 Feb 2018 · 306pp · 82,765 words
by Thomas Rid · 27 Jun 2016 · 509pp · 132,327 words
by Jim Jansen · 25 Jul 2011 · 298pp · 43,745 words
by Douglas R. Dechow · 2 Jul 2015 · 223pp · 52,808 words
by Donald MacKenzie · 24 May 2021 · 400pp · 121,988 words
by James Lovelock · 27 Aug 2019 · 94pp · 33,179 words
by Dariusz Jemielniak and Aleksandra Przegalinska · 18 Feb 2020 · 187pp · 50,083 words
by Ananyo Bhattacharya · 6 Oct 2021 · 476pp · 121,460 words
by Pedro Domingos · 21 Sep 2015 · 396pp · 117,149 words
by Douglas B. Laney · 4 Sep 2017 · 374pp · 94,508 words
by Richard Dawkins · 21 Sep 2009
by Ray Kurzweil · 14 Jul 2005 · 761pp · 231,902 words
by Andy Kessler · 13 Jun 2005 · 218pp · 63,471 words
by Freeman Dyson · 1 Jan 2006 · 332pp · 109,213 words
by Joi Ito and Jeff Howe · 6 Dec 2016 · 254pp · 76,064 words
by Martin Ford · 13 Sep 2021 · 288pp · 86,995 words
by Rob Kitchin, Tracey P. Lauriault and Gavin McArdle · 2 Aug 2017
by Emanuel Derman · 1 Jan 2004 · 313pp · 101,403 words
by William Thorndike · 14 Sep 2012 · 330pp · 59,335 words
by Steven Levy · 23 Oct 2006 · 297pp · 89,820 words
by Paul Mason · 29 Jul 2015 · 378pp · 110,518 words
by Brian Merchant · 19 Jun 2017 · 416pp · 129,308 words