Mathematical Theory of Communication

description: 1948 article by Claude Shannon

79 results

pages: 137 words: 36,231

Information: A Very Short Introduction
by Luciano Floridi
Published 25 Feb 2010

As such, it has had a profound impact on the analyses of the various kinds of information, to which it has provided both the technical vocabulary and at least the initial conceptual framework. It would be impossible to understand the nature of information without grasping at least its main gist. This is the task of the present chapter. The mathematical theory of communication (MTC) treats information as data communication, with the primary aim of devising efficient ways of encoding and transferring data. It has its origin in the field of electrical engineering, as the study of communication limits, and develops a quantitative approach to information. To have an intuitive sense of the approach, let us return to our example.

Noise extends the informee's freedom of choice in selecting a message, but it is an undesirable freedom and some redundancy can help to limit it. That is why the manual of John's car includes both verbal explanations and pictures to convey (slightly redundantly) the same information. Some conceptual implications of the mathematical theory of communication For the mathematical theory of communication (MTC), information is only a selection of one symbol from a set of possible symbols, so a simple way of grasping how MTC quantifies information is by considering the number of yes/no questions required to determine what the source is communicating. One question is sufficient to determine the output of a fair coin, which therefore is said to produce one bit of information.
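The yes/no-question picture is easy to make concrete: assuming all symbols are equally likely, the number of questions needed to pin one down is the base-2 logarithm of the number of alternatives. A minimal sketch (the function name `bits` is mine, not MTC's):

```python
import math

def bits(n_outcomes: int) -> float:
    """Yes/no questions needed to single out one of n equally likely outcomes."""
    return math.log2(n_outcomes)

print(bits(2))   # fair coin -> 1.0 bit
print(bits(8))   # one symbol out of eight -> 3.0 bits
```

Three questions suffice for eight alternatives because each answer halves the remaining possibilities.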

I am very grateful to the Akademie der Wissenschaften in Göttingen, for the privilege of being elected Gauss Professor during the academic year 2008–9, and to the University of Hertfordshire, for having been generous with my teaching schedule while visiting Göttingen and completing this book.

Figures:
1 A typical information life cycle
2 A map of information concepts
3 Analogue, digital, and binary data
4 Types of data/information
5 Environmental data/information
6 Information as semantic content
7 The mathematical theory of communication (MTC)
8 Communication model
9 Factual semantic information
10 Virtual information in natural deduction
11 Physical information
12 Maxwell's demon
13 Biological information
14 DNA and the genetic code
15 Genetic information
16 Abstract scheme of a neuron
17 Economic information
18 A simple application of Bayes' theorem
19 The 'External' R(esource) P(roduct) T(arget) Model
20 The 'Internal' R(esource) P(roduct) T(arget) Model

Tables:
1 The General Definition of Information (GDI)
2 Decimal and binary notations of positive integers
3 Example of binary encoding
4 Environmental information
5 Examples of communication devices and their information power
6 The definition of factual semantic information
7 The normal form of a typical prisoner's dilemma

The goal of this volume is to provide an outline of what information is, of its manifold nature, of the roles that it plays in several scientific contexts, and of the social and ethical issues raised by its growing importance.

pages: 415 words: 114,840

A Mind at Play: How Claude Shannon Invented the Information Age
by Jimmy Soni and Rob Goodman
Published 17 Jul 2017

By 1952, the University of Illinois had succeeded in acquiring a digital computer, and simultaneously, it was awarded a large federal contract for the study of “communication theory.” * * * The publication of The Mathematical Theory of Communication stands as one of the defining moments in the history of information theory, and not only on account of its commercial success. Even the title sent an important message: in the span of a year, Shannon’s original “A Mathematical Theory of Communication” had become the definitive “The Mathematical Theory of Communication.” As electrical engineer and information theorist Robert Gallager pointed out, the subtle change in the article’s context, from one of several articles in a technical journal to centerpiece of a book, was a mark of supremacy.

It would be his second, and greatest, feat of abstraction. Before the publication of his “Mathematical Theory of Communication,” scientists could track the movement of electrons in a wire, but the possibility that the very idea they stood for could be measured and manipulated just as objectively would have to wait until it was proved by Shannon. It was summed up in his recognition that all information, no matter the source, the sender, the recipient, or the meaning, could be efficiently represented by a sequence of bits: information’s fundamental unit. Before the “Mathematical Theory of Communication,” a century of common sense and engineering trial and error said that noise—the physical world’s tax on our messages—had to be lived with.

Shannon, The” (Brewer), 207–8 Manning, Charlie, 228 Marconi, Guglielmo, 265 Maric, Mileva, 184 Marietta College, 168 Marshall, George, 97 Masonic Hall (Philadelphia), 210 Massachusetts Institute of Technology (MIT), 20, 32–34, 38, 53, 59, 154, 187, 201, 215, 226, 227, 244, 262, 266, 279 Bush at, xii, 22, 28, 29, 34–35, 49 CS as full professor at, 225, 228–33, 234–35, 236, 239, 240–41, 244, 246, 248, 261, 262, 276 CS as graduate student at, xii, 32, 34, 45–49, 61, 74, 94, 177 CS as visiting professor at, 223–25 Juggling Club at, 248–49, 268–69 Radiation Laboratory (Rad Lab) at, 167 Massey, James, 158 Mathematical Review, 172 “Mathematical Studies Relating to Fire Control” (NDRC project), 85–89 “Mathematical Theory of Communication, A” (Shannon), xiii–xiv, 138, 235, 262 response to, 165–69, 172–74 see also information theory Mathematical Theory of Communication, The (Shannon and Weaver), 168–69, 172–74 “Mathematical Theory of Cryptography—Case 208078, A” (Shannon), 101–2 Mathematician’s Apology, A (Hardy), 172 mathematics, mathematicians: communications and, 76–77, 86 in industry, 68–71 juggling and, 249–50, 251, 254–55 pure, 69, 171–72 in World War II, 86–90, 92–93 Mathematics of Juggling, The (Polster), 250 Maxwell, James Clerk, 162n Mead, Debra, 268 Mead, Margaret, 206 messenger RNA, 140 Michigan, xii, 7, 8, 14, 32, 35 Michigan, University of, 14–15, 269 College of Engineering at, 15–16 CS at, 13, 15–20, 35, 39–43 Literary College at, 15 Michigan Central Railroad, 7 Mill, John Stuart, 9 Minckler, Rex, 105 Mindell, David, 88–89 Minsky, Marvin, 171, 277 missiles, German development of, 86, 88 Mitchell, Silas Weir, 210 Monticello, 227 Moore, Betty, see Shannon, Betty Moore More, Trenchard, 229 Morgenstern, Oskar, 240 Morristown, N.J., 184 Morse, Marston, 74–75 Morse, Samuel, 146 Morse code, 10–11, 132, 145, 146, 159n, 236 Motorola, 242 Moulton, Maria, 110, 113, 114, 115 CS’s relationship with, 111 Mount Auburn Cemetery, Cambridge, 272 Müllabfuhrwortmaschine, 148n 
Murray State University, 264 Muslims, 250 Mystic Lake, 245 Nasar, Sylvia, 77, 170, 181, 195 Nash, John, 77–78, 170, 172, 181, 263 National Center for Atmospheric Research (NCAR), 65 National Defense Research Committee (NDRC), 81–82, 83, 89 CS’s work for, 87–89 National Medal of Science, 258 National Register of Historic Places, 227 National Research Fellowship, 63 National Security Agency (NSA), 96, 100, 194, 196–98 National Security Scientific Advisory Board, 196–98 nature-nurture problem, 53 Navajo Indians, in World War II cryptology, 98 Naval Academy, U.S., 194 Navy, U.S., 15, 85, 105, 194 Neuhoff, David, 269 Nevada, 246 New Deal, 182 New Hampshire, 63 New Jersey, 223, 226 New Jersey College for Women, 183 Newton, Isaac, xi, xii, 23, 25, 32, 54, 89, 122, 136, 180, 217, 265, 279 New York, N.Y., 12, 62, 63, 79, 91, 104–5, 126, 170, 177, 223 CS’s apartment in, 110–11, 136, 182 jazz scene in, 110 New Yorker, 209 New York State, 27 New York Times, 48, 178, 182, 273 nicknames, 153 Nobel, Alfred, 48 Nobel Prize, 67, 68, 170, 188, 263, 264 noise, xiv, 86, 309n quantification of, 136 redundancy and, 158–59 signal vs., 119–20, 123–24, 126, 127, 156–61, 179 North Korea, 197 Nyquist, Harry, 126–30, 131, 132, 133, 134, 138, 141, 144, 157, 308n fax prototype of, 126–27 Ogden, John, 11–12 Oliver, Bernard “Barney,” 67, 111–12, 113, 259 Omni, 188 one-time pads, 101–2 On the Origin of Species (Darwin), 170 Oppenheimer, J.

pages: 229 words: 67,599

The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age
by Paul J. Nahin
Published 27 Oct 2012

At the end of his year at Princeton, and once again alone, Shannon accepted an offer to return to Bell Labs as a full-time member of the technical staff in the mathematical research group. There he would enjoy an astonishingly creative fifteen years, including the production of his masterpiece—what Scientific American called “the Magna Carta of the information age”—the 1948 “A Mathematical Theory of Communication.” Initially his work at Bell Labs dealt with anti-aircraft fire-control systems, the need for which had grown in importance with the appearance of the 400 mph German pulse-jet V1 “flying robot bomb,” the world’s first cruise missile. (The German V2 rocket—the world’s first ballistic missile—is also often lumped in with the V1 as driving fire-control system development during Shannon’s day, but it would be quite difficult to shoot down a V2 today, during its 2,000 mph terminal atmospheric reentry phase, much less with 1940s gun technology!)

In 1945 Shannon wrote a classified (“Confidential,” which really isn’t very “secret”) report, “A Mathematical Theory of Cryptography,” which was declassified in 1949, when it appeared in The Bell System Technical Journal under the new title “Communication Theory of Secrecy Systems.” Some historians of science have speculated that it was his work in cryptography that led to Shannon’s 1948 masterpiece, “A Mathematical Theory of Communication,” of which you’ll find a (very) brief discussion in Chapter 7. Shannon himself, however, was always quite clear on this, crediting papers published in The Bell System Technical Journal in the 1920s by Bell Labs scientists Ralph Hartley (1888–1970) and Harry Nyquist (1890–1976); indeed, he specifically credits both men in his “Mathematical Theory.”

And even before the war, Shannon wrote a letter, dated February 16, 1939, to Bush on “some of the fundamental properties of general systems for the transmission of intelligence” in which the work of Hartley is mentioned. “A Mathematical Theory of Communication” stunned the engineering world; it was written with such clarity and freedom from obfuscating mathematics (much to the irritation of some pure mathematicians!—see Chapter 6) that real engineers could actually read and understand it. It was simply a tour de force, simultaneously founding the entirely new research field of information theory, posing and solving some extremely difficult problems, and pointing its readers toward other problems that remained unanswered.

pages: 855 words: 178,507

The Information: A History, a Theory, a Flood
by James Gleick
Published 1 Mar 2011

But it was only the second most significant development of that year. The transistor was only hardware. An invention even more profound and more fundamental came in a monograph spread across seventy-nine pages of The Bell System Technical Journal in July and October. No one bothered with a press release. It carried a title both simple and grand—“A Mathematical Theory of Communication”—and the message was hard to summarize. But it was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen in this case not by committee but by the lone author, a thirty-two-year-old named Claude Shannon.♦ The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity—a fundamental unit of measure.

He and Betty began dating in 1948 and married early in 1949. Just then he was the scientist everyone was talking about. THE WEST STREET HEADQUARTERS OF BELL LABORATORIES, WITH TRAINS OF THE HIGH LINE RUNNING THROUGH Few libraries carried The Bell System Technical Journal, so researchers heard about “A Mathematical Theory of Communication” the traditional way, by word of mouth, and obtained copies the traditional way, by writing directly to the author for an offprint. Many scientists used preprinted postcards for such requests, and these arrived in growing volume over the next year. Not everyone understood the paper.

.”♦ Weaver had headed the government’s applied mathematics research during the war, supervising the fire-control project as well as nascent work in electronic calculating machines. In 1949 he wrote up an appreciative and not too technical essay about Shannon’s theory for Scientific American, and late that year the two pieces—Weaver’s essay and Shannon’s monograph—were published together as a book, now titled with a grander first word The Mathematical Theory of Communication. To John Robinson Pierce, the Bell Labs engineer who had been watching the simultaneous gestation of the transistor and Shannon’s paper, it was the latter that “came as a bomb, and something of a delayed action bomb.”♦ Where a layman might have said that the fundamental problem of communication is to make oneself understood—to convey meaning—Shannon set the stage differently: The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.♦ “Point” was a carefully chosen word: the origin and destination of a message could be separated in space or in time; information storage, as in a phonograph record, counts as a communication.

pages: 759 words: 166,687

Between Human and Machine: Feedback, Control, and Computing Before Cybernetics
by David A. Mindell
Published 10 Oct 2002

See also Bode and Shannon, “Simplified Derivation of Linear Least Square Smoothing and Prediction Theory,” 425, which addresses Wiener’s prediction in more detail; and Blackman, Linear Data-Smoothing and Prediction , an extension of the 1948 work. 22. Shannon, “Mathematical Theory of Communication”; Shannon and Weaver, Mathematical Theory of Communication . 23. Shannon and Weaver, Mathematical Theory of Communication . For Shannon’s early work see Shannon to Bush, 16 February 1939, in Shannon, Claude Elwood Shannon , 455–56. 24. Shannon, “Mathematical Theory of Communication,” 53n. The relationship between Shannon’s and Wiener’s work is more complex than described here. In a 1987 interview Shannon said, “I don’t think Wiener had much to do with information theory.

Claude Elwood Shannon: Collected Papers. Ed. N. J. A. Sloane and A. D. Wyner. New York: IEEE Press, 1993. ———. “A Mathematical Theory of Communication.” Bell System Technical Journal 27 (July–October 1948): 379–423, 623–56. ———. “Mathematical Theory of the Differential Analyzer.” Journal of Mathematics and Physics 20 (December 1941): 337–54. ———. “A Symbolic Analysis of Relay Switching Circuits.” Transactions of the AIEE 57 (1938): 713–23. Shannon, Claude Elwood, and Warren Weaver. The Mathematical Theory of Communication. Chicago: University of Illinois Press, 1949. Shapin, Steven, and Simon Schaffer. Leviathan and the Air-Pump: Hobbes, Boyle, and the Experimental Life.

As always, the continuous nature of these processes makes the choice of beginning and ending somewhat arbitrary. This narrative ends in 1945, when the Office of Scientific Research and Development (OSRD) closed down, and with the subsequent publication of Cybernetics, Claude Shannon’s “Mathematical Theory of Communication,” and the Massachusetts Institute of Technology (MIT) Radiation Laboratory series of textbooks on radar, electronics, and servomechanisms. These and other publications helped spread the results of the war’s massive research and development projects and laid foundations for a new era of communications, control, and computing.

pages: 518 words: 107,836

How Not to Network a Nation: The Uneasy History of the Soviet Internet (Information Policy)
by Benjamin Peters
Published 2 Jun 2016

The first methodological hallmark of cybernetics is that it is not one thing but that its key concepts, especially human-machine interaction and feedback, outline a kind of vocabulary for working analogically across different systems—computational, mechanical, neurological, organic, social—that rendered its vocabulary fecund for other sibling fields embedded in U.S. military-industrial research.18 Take, for example, the contemporary fields of information theory and game theory. Mainstream American information theory, following Bell Labs engineer Claude E. Shannon’s 1948 mathematical theory of communication, concentrates on the efficient and reliable measurement and transmission of data.19 Perhaps its central seminal contribution is the theorizing of a statistical framework for understanding all data transmissions. All communication messages became a question of probabilities and stochastic analysis, and the term information abandoned its ordinary meaning of relevant facts and took on a new definition as a technical measure of the likelihood that a message contains something ordered or surprising.
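That statistical redefinition can be stated compactly: a source with symbol probabilities p_i carries, on average, H = −Σ p_i log₂ p_i bits per symbol, so surprising (improbable) symbols contribute more information than predictable ones. A small sketch of that measure (my illustration of the standard formula, not a quotation from the text):

```python
import math

def entropy(probs):
    """Average information per symbol, H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit per toss
print(entropy([0.9, 0.1]))   # biased coin: ~0.47 bits -- less surprising on average
```

A maximally predictable source (one symbol with probability 1) has entropy zero: it tells the receiver nothing new.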

Shannon insisted on keeping the technical principles of information theory separate from the more sweeping scope of cybernetics, Von Neumann did not rigorously distinguish between the three, and Wiener defended his grouping of the other two research fields under the cybernetics umbrella, even as (especially after the mid-1950s) many information theorists and game theorists objected to any conflation of these fields.22 All three fields presented overlapping rational and generalized models of communication, or a “theory of messages” fit for application, even though no one—not even the founders—knew the exact limits of these computation-and-communication sciences. Shannon did not accept the label of cybernetics, and he also did not accept the label others had given to his own “information theory,” preferring to the end of his life his original emphasis on the “mathematical theory of communication.” Each of these sciences sought to theorize the technical means by which communication could be controlled. The cybernetic sciences, especially but not exclusively in the Soviet case, emerge as a communication science in search of self-governing systems. Although it has never been clear (perhaps even to cyberneticists) what cyberneticists could do exactly, it also never has been obvious what cybernetics could not do (perhaps even the definition of cybernetics is self-governing).

The coauthors also integrated and expanded the stochastic analysis of Claude Shannon’s information theory while simultaneously stripping Wiener’s organism-machine analogy of its political potency.71 Wiener’s core analogies between animal and machine, machine and mind were stressed as analogies—or how “self-organizing logical processes [appeared] similar to the processes of human thought” but were not synonyms. At the same time, the article scripts his language of control, feedback, and automated systems in the machine and organism into the common language of information, or Shannon’s mathematical theory of communication. For Kitov, this “doctrine of information” took on wholesale the task of universalizing statistical control in machines and minds. It did so by preferring the “automatic high-speed electronic calculating machine” (that is, computer) to Wiener’s original base analogy for cybernetic comparisons—the servomechanism.

pages: 339 words: 94,769

Possible Minds: Twenty-Five Ways of Looking at AI
by John Brockman
Published 19 Feb 2019

As a twenty-four-year-old, when I first encountered Wiener’s ideas and met his colleagues at the MIT meeting I describe in the book’s introduction, I was hardly interested in Wiener’s warnings or admonitions. What drove my curiosity was the stark, radical nature of his view of life, based on the mathematical theory of communications in which the message was nonlinear: According to Wiener, “new concepts of communication and control involved a new interpretation of man, of man’s knowledge of the universe, and of society.” And that led to my first book, which took information theory—the mathematical theory of communications—as a model for all human experience. In a recent conversation, Peter told me he was beginning to write a book—about building, crashing, and thinking—that considers the black-box nature of cybernetics and how it represents what he thinks of as “the fundamental transformation of learning, machine learning, cybernetics, and the self.”

But that has never been the biggest factor: Consider, say, ancient Athens versus the rest of the world at the time. * Alfred, Lord Tennyson, “The Revenge” (1878). * Norbert Wiener, “A Scientist Rebels,” Atlantic Monthly, January 1947. * Warren Weaver, “Recent Contributions to the Mathematical Theory of Communication,” in Claude Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949), 8 (emphasis in original). Shannon’s 1948 papers were republished in the same volume. * Matthew Arnold, Culture and Anarchy, ed. Jane Garnett (Oxford, UK: Oxford University Press, 2006). * The Human Use of Human Beings (Boston: Houghton Mifflin, 1954), 17–18

“Ned” Hall and Edmund Carpenter, I started reading avidly in the fields of information theory, cybernetics, and systems theory. McLuhan suggested I read biologist J. Z. Young’s Doubt and Certainty in Science, in which he said that we create tools and we mold ourselves through our use of them. The other text he recommended was Warren Weaver and Claude Shannon’s 1949 paper “Recent Contributions to the Mathematical Theory of Communication,” which begins: “The word communication will be used here in a very broad sense to include all of the procedures by which one mind may affect another. This, of course, involves not only written and oral speech, but also music, the pictorial arts, the theater, the ballet, and in fact all human behavior.”

pages: 370 words: 94,968

The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive
by Brian Christian
Published 1 Mar 2011

The Turing test, bless it, has now given us a yardstick for this shame. A Mathematical Theory of Communication It seems, at first glance, that information theory—the science of data transmission, data encryption, and data compression—would be mostly a question of engineering, having little to do with the psychological and philosophical questions that surround the Turing test and AI. But these two ships turn out to be sailing quite the same seas. The landmark paper that launched information theory is Claude Shannon’s 1948 “A Mathematical Theory of Communication,” and as it happens, this notion of scientifically evaluating “communication” binds information theory and the Turing test to each other from the get-go.

Cell phones rely heavily on “prediction” algorithms to facilitate text-message typing: guessing what word you’re attempting to write, auto-correcting typos (sometimes overzealously), and the like—this is data compression in action. One of the startling results that Shannon found in “A Mathematical Theory of Communication” is that text prediction and text generation turn out to be mathematically equivalent. A phone that could consistently anticipate what you were intending to write, or at least that could do as well as a human, would be just as intelligent as the program that could write you back like a human.
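The equivalence is easy to glimpse with a toy model (my illustration, not Shannon's construction): estimate the conditional distribution of the next character given the previous one, then use that single table either to predict (take the most likely continuation) or to generate (sample from it).

```python
import random
from collections import Counter, defaultdict

def build_model(text, order=1):
    """Count which character follows each length-`order` context."""
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def predict(model, context):
    """Prediction: the most likely next character after `context`."""
    return model[context].most_common(1)[0][0]

def generate(model, context, n):
    """Generation: sample n characters from the same conditional distribution."""
    out = context
    for _ in range(n):
        counts = model[out[-len(context):]]
        chars, weights = zip(*counts.items())
        out += random.choices(chars, weights)[0]
    return out

model = build_model("abababab")
print(predict(model, "a"))      # 'b' -- best guess at the next character
print(generate(model, "a", 5))  # 'ababab' -- the same statistics, used generatively
```

The same counts drive both functions, which is the point: a model good enough to predict text is, by construction, a model good enough to produce text with the same statistics.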

For more, see Hofstadter’s I Am a Strange Loop. 56 Benjamin Seider, Gilad Hirschberger, Kristin Nelson, and Robert Levenson, “We Can Work It Out: Age Differences in Relational Pronouns, Physiology, and Behavior in Marital Conflict,” Psychology and Aging 24, no. 3 (September 2009), pp. 604–13. 10. High Surprisal 1 Claude Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal 27 (1948), pp. 379–423, 623–56. 2 average American teenager: Katie Hafner, “Texting May Be Taking a Toll,” New York Times, May 25, 2009. 3 The two are in fact related: For more information on the connections between Shannon (information) entropy and thermodynamic entropy, see, e.g., Edwin Jaynes, “Information Theory and Statistical Mechanics,” Physical Review 106, no. 4, (May 1957), pp. 620–30; and Edwin Jaynes, “Information Theory and Statistical Mechanics II,” Physical Review 108, no. 2 (October 1957), pp. 171–90. 4 Donald Barthelme, “Not-Knowing,” in Not-Knowing: The Essays and Interviews of Donald Barthelme, edited by Kim Herzinger (New York: Random House, 1997). 5 Jonathan Safran Foer, Extremely Loud and Incredibly Close (Boston: Houghton Mifflin, 2005). 6 The cloze test comes originally from W.

pages: 339 words: 57,031

From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism
by Fred Turner
Published 31 Aug 2006

In particular, they should avoid adhering to a strictly top-down style of communication: “Otherwise,” wrote Wiener, “the top officials may find that they have based their policy on a complete misconception of the facts that their underlings possess.”38 Both Cybernetics and The Human Use of Human Beings were best sellers, and together with Claude Shannon and Warren Weaver’s 1949 Mathematical Theory of Communication, they sparked a decade’s worth of debate about the proper role of computers in society. Given the many pages they devoted to analyzing complex mathematical formulas, Cybernetics and The Mathematical Theory of Communication would not seem to be likely candidates for popular acclaim. Yet, in an America that had recently defeated Nazi Germany and Hirohito’s Japan and invented a weapon that could eradicate life on earth, the computational metaphor that underlay these books gave voice to two issues then much in the public eye: the sudden importance of science and its ambiguous social potential.

Heims, John Von Neumann and Norbert Wiener, 184. 37. Rosenblueth, Wiener, and Bigelow, “Behavior, Purpose, and Teleology”; Galison, “Ontology of the Enemy,” 247; Wiener, Cybernetics, 15, 21. Shannon published his theory in a 1948 article, “Mathematical Theory of Communication.” Shannon’s theories rose to public prominence through his 1949 collaboration with Warren Weaver in The Mathematical Theory of Communication. There is some controversy over the question of how much Wiener’s theory of messages owes to Shannon’s theory of information. For detailed, if differing, accounts of this question, see Waldrop, Dream Machine, 75 – 82; and Conway and Siegelman, Dark Hero of the Information Age, 185 –92. 38.

“The Origins of the Lawrence Berkeley Laboratory.” In Big Science: The Growth of Large-Scale Research, edited by Peter Galison and Bruce Hevly, 21– 45. Stanford, CA: Stanford University Press, 1992. Shannon, Claude. “A Mathematical Theory of Communication.” Bell System Technical Journal 27, no. 3 ( July 1948): 379 – 423; no. 4 (October 1948): 623 –56. Shannon, Claude, and Warren Weaver. The Mathematical Theory of Communication. Urbana: University of Illinois Press, 1949. Shiller, Robert J. Irrational Exuberance. New York: Broadway Books, 2001. Siegel, Lenny, and John Markoff. The High Cost of High Tech: The Dark Side of the Chip.

Lifespan: Why We Age—and Why We Don't Have To
by David A. Sinclair and Matthew D. Laplante
Published 9 Sep 2019

No one was more acutely disturbed by the problem of information loss than Claude Shannon, an electrical engineer from the Massachusetts Institute of Technology (MIT) in Boston. Having lived through World War II, Shannon knew firsthand how the introduction of “noise” into analog radio transmissions could cost lives. After the war, he wrote a short but profound scientific paper called “A Mathematical Theory of Communication” on how to preserve information, which many consider the foundation of Information Theory. If there is one paper that propelled us into the digital, wireless world in which we now live, that would be it. Shannon’s primary intention, of course, was to improve the robustness of electronic and radio communications between two points.

As Shannon put it, “This observer notes the errors in the recovered message and transmits data to the receiving point over a ‘correction channel’ to enable the receiver to correct the errors.” Though it may sound like esoteric language from the 1940s, what dawned on me in 2014 is that Shannon’s “A Mathematical Theory of Communication” is relevant to the Information Theory of Aging. In Shannon’s drawing, there are three different components that have analogs in biology: • The “source” of the information is the egg and sperm, from your parents. • The “transmitter” is the epigenome, transmitting analog information through space and time

This fundamental information, laid down early in life, is able to tell the body how to be young again—the equivalent of a backup of the original data. CLAUDE SHANNON’S 1948 SOLUTION TO RECOVERING LOST INFORMATION DURING DATA TRANSMISSIONS LED TO CELL PHONES AND THE INTERNET. It may also be the solution to reversing aging. Source: C. E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal 27, no. 3 (July 1948): 379–423 and 27, no. 4 (October 1948): 623–56. To end aging as we know it, we need to find three more things that Shannon knew were essential for a signal to be restored even if it is obscured by noise: • An “observer” who records the original data • The original “correction data” • And a “correcting device” to restore the original signal I believe we may have finally found the biological correcting device.
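Shannon's correction channel is subtler than this, but the flavor of redundancy-based restoration can be sketched with the simplest possible scheme, a majority-vote repetition code (my toy example, not Shannon's actual construction):

```python
from collections import Counter

def encode(bits, r=3):
    """Transmitter adds redundancy: repeat each bit r times."""
    return [b for b in bits for _ in range(r)]

def decode(received, r=3):
    """Correcting device: majority vote within each block restores the bit."""
    return [Counter(received[i:i + r]).most_common(1)[0][0]
            for i in range(0, len(received), r)]

sent = encode([1, 0, 1])   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
noisy = list(sent)
noisy[1] ^= 1              # noise flips one symbol in transit
print(decode(noisy))       # [1, 0, 1] -- original message recovered
```

The redundant copies play the role of the "correction data": as long as noise corrupts fewer than half the copies in a block, the original signal survives.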

pages: 267 words: 71,941

How to Predict the Unpredictable
by William Poundstone

This was one of the founding documents of the computer age. Shannon spent a fellowship at the Institute for Advanced Study, Princeton. His first wife, Norma, once poured tea for Albert Einstein, who, she recalled, “told me I was married to a brilliant, brilliant man.” That was before Shannon published the work for which he’s most renowned, “A Mathematical Theory of Communication.” The 1948 paper established the science of information theory. In Shannon’s revolutionary vision, information is one of the world’s fundamentals, on a par with matter and energy, and subject to laws of its own. These laws became the foundation of the Internet and all digital media.

“You should do something on that”: Gertner 2012, 196. Founding document: Harvard psychologist Howard Gardner called it “possibly the most important, and also the most famous, master’s thesis of the century.” “told me I was married to a brilliant, brilliant man”: Gertner 2012, 121. “A Mathematical Theory of Communication”: Shannon 1948. “We hope that research in the design”: Shannon 1955, 448. Short film about Theseus: Gertner 2012, 140. As far as I can tell, the film is not presently on YouTube. Lego block version of ultimate machine: See www.youtube.com/watch?v=gKsP4vuTg2c “My characterization of his smartness”: Gertner 2012, 143.

Finance, Sept. 21, 2012. finance.yahoo.com/blogs/the-exchange/cracking-pin-code-easy-1-2-3-4-130143629.html. Schiffman, Nathaniel (2005). Abracadabra! Amherst, NY: Prometheus Books. Schroeder, Manfred (1992). Fractals, Chaos, Power Laws: Minutes from an Infinite Paradise. New York: W.H. Freeman. Shannon, C.E. (1948). “A Mathematical Theory of Communication.” Bell System Technical Journal, Jul. and Oct. 1948, 379–423; 623–656. Shannon, Claude (1953). “A Mind-Reading (?) Machine.” Bell Laboratories memorandum, Mar. 18, 1953. ——— (1955). “Game-Playing Machines.” Journal of the Franklin Institute 250, 447–453. Sharpe, Steven (2002). “Re-examining Stock Valuation and Inflation: The Implication of Analysts’ Earning Forecasts.”

pages: 389 words: 109,207

Fortune's Formula: The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street
by William Poundstone
Published 18 Sep 2006

“Collapse of Eifuku Master Trust Happened in Seven Trading Days.” Wall Street Journal, Apr. 10, 2003. Serwer, Andy (2003). “Where the Money’s Really Made.” Fortune, Mar. 31, 2003, 106+. Shannon, Claude Elwood. Manuscript collection. Library of Congress. ———(1948). “A Mathematical Theory of Communication.” Bell System Technical Journal, July and Oct. 1948, 379–423, 623–56. ———(1949). The Mathematical Theory of Communication. Urbana: University of Illinois Press. ———(1956a). “The Bandwagon.” IRE Transactions on Information Theory, June 1956, 3. ———(1956b). “The Portfolio Problem.” Unpublished lecture notes, Shannon’s papers, LOC. ———(1993).

The transistor was the hardware that would make so many applications of Shannon’s theory a reality. This incident would have been in late 1947 or early 1948, before Bell Labs unveiled the transistor on June 30—and just about the time Shannon’s classic paper on information theory appeared. There is a minor scandal associated with that paper. Shannon published “A Mathematical Theory of Communication” in a 1948 issue of the Bell System Technical Journal. He was then thirty-two years old. Most of the work had been done years earlier, from about 1939 to 1943. Shannon told few people what he was doing. He habitually worked with his office door closed. As Bell Labs people gradually learned of this work, they were astonished that Shannon had devised such an important result and then sat on it.

In the 1950s and 1960s, it was often used to embrace computer science, artificial intelligence, and robotics (fields that fascinated Shannon but which he considered distinct from information theory). Thinkers intuited a cultural revolution with computers, networks, and mass media at its base. “The word communication will be used here in a very broad sense to include all of the procedures by which one mind may affect another,” begins the introduction to a 1949 book, The Mathematical Theory of Communication, reprinting Shannon’s paper. “This, of course, involves not only written and oral speech, but also music, the pictorial arts, the theater, the ballet, and in fact all human behavior.” These words were written by Shannon’s former employer Warren Weaver. Weaver’s essay presented information theory as a humanistic discipline—perhaps misleadingly so.

pages: 329 words: 88,954

Emergence
by Steven Johnson

Five years after his interactions with Turing, Shannon published a long essay in the Bell System Technical Journal that was quickly repackaged as a book called The Mathematical Theory of Communication. Dense with equations and arcane chapter titles such as “Discrete Noiseless Systems,” the book managed to become something of a cult classic, and the discipline it spawned—information theory—had a profound impact on scientific and technological research that followed, on both a theoretical and practical level. The Mathematical Theory of Communication contained an elegant, layman’s introduction to Shannon’s theory, penned by the esteemed scientist Warren Weaver, who had early on grasped the significance of Shannon’s work.

Classic Essays on the Culture of Cities. Englewood Cliffs, N.J.: Prentice-Hall, 1969. Shah, A. M., B. S. Baviskar, and E. A. Ramaswamy. Complex Organizations and Urban Communities. Vol. 3 of Social Structure and Change. New Delhi, London, and Thousand Oaks, Calif.: Sage Publications, Inc., 1996. Shannon, Claude E. The Mathematical Theory of Communication. Urbana: University of Illinois Press, 1998. Shapiro, Andrew L. The Control Revolution: How the Internet Is Putting Individuals in Charge and Changing the World We Know. New York: Century Foundation Books, 1999. Stephenson, Neal. Cryptonomicon. New York: Avon Books, 1999. Stopfer, Mark, Seetha Bhagavan, Brian H.

From the book’s index: Mathematical Theory of Communication, The (Shannon), 45–47.

pages: 242 words: 68,019

Why Information Grows: The Evolution of Order, From Atoms to Economies
by Cesar Hidalgo
Published 1 Jun 2015

It is physical order, like what distinguishes different shuffles of a deck of cards. What surprises most people, however, is that information in this technical sense is meaningless, a point that, much like the physicality of information, is often misunderstood. In 1949 Claude Shannon and Warren Weaver published a short book entitled The Mathematical Theory of Communication. In its first section, Weaver described the conceptual aspects of information; in the second, Shannon laid out the mathematics of what we now know as information theory. For information theory to be properly understood, Shannon and Weaver needed to detach the word information from its colloquial meaning.

Friedrich Hayek, “The Use of Knowledge in Society,” American Economic Review 35, no. 4 (1945): 519–530. 5. George A. Akerlof, “The Market for ‘Lemons’: Quality Uncertainty and the Market Mechanism,” Quarterly Journal of Economics 84, no. 3 (1970): 488–500. 6. Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1963), 8. 7. Ibid., 31. 8. The formula for Boltzmann’s entropy (SB) is SB = kB ln(W) where kB is Boltzmann’s constant, which has units of energy over temperature, and W is the number of microstates corresponding to a given macrostate. Gibbs generalized the formula for entropy by defining it in terms of the probability that a system would be in a microstate (pi), instead of the total number of equivalent microstates (W).
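The Gibbs generalization the note describes but does not write out can be stated explicitly; this is my reconstruction in the note’s own symbols (kB Boltzmann’s constant, pi the probability of microstate i), not a quotation from the book:

```latex
S_B = k_B \ln W
\qquad\longrightarrow\qquad
S_G = -k_B \sum_i p_i \ln p_i
```

When all W microstates are equally likely, pi = 1/W and the sum collapses to SG = kB ln W, recovering Boltzmann's form.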

From the book’s index: The Mathematical Theory of Communication (Shannon & Weaver), xv–xvi.

On Language: Chomsky's Classic Works Language and Responsibility and Reflections on Language in One Volume
by Noam Chomsky and Mitsou Ronat
Published 26 Jul 2011

.: Well, at the end of the forties and the beginning of the fifties, there were important developments in the mathematical theory of communication, information theory, and the theory of automata. Technically, models such as finite state Markov sources were proposed . . . Very often it was supposed that these models were appropriate for the description of language. Jakobson referred to this vaguely, but Hockett utilized them quite explicitly. In 1955 he proposed a theory of language structure based on a Markov source model borrowed from the mathematical theory of communication. Similar theories were developed by psychologists, engineers, mathematicians.

The first version of this manuscript, completed in 1955, involved a good deal of formalization but no mathematics. Shortly after, I moved from the Society of Fellows at Harvard to the Research Laboratory of Electronics at MIT. There, there was a great deal of quite justified interest in the mathematical theory of communication, and also a great deal of—less justified—faith in the potential for the study of language offered by Markov source models and the like, which had aroused considerable enthusiasm among engineers, mathematical psychologists, and some linguists. As soon as the question was clearly formulated, it was immediately obvious that these models were not adequate for the representation of language.

These theories were then very much in fashion, and they even aroused a certain degree of euphoria, I think it is fair to say. In the intellectual milieu of Cambridge there was a great impact of the remarkable technological developments associated with World War II. Computers, electronics, acoustics, mathematical theory of communication, cybernetics, all the technological approaches to human behavior enjoyed an extraordinary vogue. The human sciences were being reconstructed on the basis of these concepts. It was all connected. As a student at Harvard in the early 1950s all of this had a great effect on me. Some people, myself included, were rather concerned about these developments, in part for political reasons, at least as far as my personal motivations were concerned.

pages: 524 words: 120,182

Complexity: A Guided Tour
by Melanie Mitchell
Published 31 Mar 2009

“The actual equation”: In the equation for Boltzmann’s entropy, S = k log W, S is entropy, W is the number of possible microstates corresponding to a given macrostate, and k is “Boltzmann’s constant,” a number used to put entropy into standard units. “In his 1948 paper ‘A Mathematical Theory of Communication’ ”: Shannon, C., A mathematical theory of communication. The Bell System Technical Journal, 27, 1948, pp. 379–423, 623–656. “efforts to marry communication theory”: Pierce, J. R., An Introduction to Information Theory: Symbols, Signals, and Noise. New York: Dover, 1980, p. 24. (First edition, 1961.) Chapter 4 “Quo facto”: Leibniz, G. (1890).

One of the most important problems for AT&T was to figure out how to transmit signals more quickly and reliably over telegraph and telephone wires. Claude Shannon, 1916–2001. (Reprinted with permission of Lucent Technologies Inc./Bell Labs.) Shannon’s mathematical solution to this problem was the beginning of what is now called information theory. In his 1948 paper “A Mathematical Theory of Communication,” Shannon gave a narrow definition of information and proved a very important theorem, which gave the maximum possible transmission rate of information over a given channel (wire or other medium), even if there are errors in transmission caused by noise on the channel. This maximum transmission rate is called the channel capacity.
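For the simplest noisy channel, the binary symmetric channel that flips each bit with probability p, Shannon's channel capacity has a closed form, C = 1 − H(p), where H is the binary entropy function. A brief illustrative sketch (my own, not from Mitchell's book; the function names are hypothetical):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Channel capacity (bits per channel use) of a binary symmetric
    channel that flips each transmitted bit with probability p."""
    return 1.0 - binary_entropy(p)

# A noiseless channel carries one full bit per use; a channel that
# flips bits half the time carries pure noise and no information.
print(bsc_capacity(0.0))   # 1.0
print(bsc_capacity(0.5))   # 0.0
print(round(bsc_capacity(0.11), 3))
```

Note that capacity depends only on the channel's statistics, matching the text: it is the ceiling on reliable transmission rate, regardless of the coding scheme used.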

Proceedings of the American Meteorological Society, 8th Conference on Numerical Weather Prediction, Baltimore, MD, 1988. Shalizi, C. Networks and Netwars, 2005. Essay at [http://www.cscs.umich.edu/~crshalizi/weblog/347.html]. Shalizi, C. Power Law Distributions, 1/f noise, Long-Memory Time Series, 2007. Essay at [http://cscs.umich.edu/~crshalizi/notebooks/power-laws.html]. Shannon, C. A mathematical theory of communication. The Bell System Technical Journal, 27, 1948, pp. 379–423, 623–656. Shaw, G. B. Annajanska, the Bolshevik Empress. Whitefish, MT: Kessinger Publishing, 2004. (Originally published 1919.) Shouse, B. Getting the behavior of social insects to compute. Science, 295(5564), 2002, p. 2357.

pages: 332 words: 93,672

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy
by George Gilder
Published 16 Jul 2018

Abu-Mostafa, Malik Magdon-Ismail, Tsuan-Tien Lin, Learning from Data: A Short Course (AMLbook.com). Abu-Mostafa introduced me to his mastery of machine learning in a fascinating dinner at the Caltech Athenaeum in February 2013. 5. John Markoff, “How Many Computers to Identify a Cat? 16,000,” New York Times, June 25, 2012. 6. Claude Elwood Shannon, “A Mathematical Theory of Communication,” published in the Bell System Technical Journal in October 1948 and available in N. J. A. Sloane and Aaron D. Wyner, eds., Shannon Collected Papers (Piscataway, N.J.: IEEE Press, 1993), section 12: “Equivocation and Channel Capacity,” 33. 7. Thiel continued his critique in his revelatory book Zero to One: Notes on Startups, or How to Build the Future (New York: Crown, 2014). 8.

This paper has become the sixth-most-cited in the entire corpus of computer science. 2. Philipp von Hilgers and Amy N. Langville, “The Five Greatest Applications of Markov Chains,” in Amy N. Langville and William J. Stewart, eds., Proceedings of the Markov Anniversary Meeting (Altadena, Calif.: Boson Books, 2006), 156–57. 3. Claude Elwood Shannon, “A Mathematical Theory of Communication,” in The Bell System Technical Journal, October 1948, section 4, “Graphical Representation of a Markoff Process,” in Collected Papers (Piscataway, N.J.: IEEE Press, 1993), 15. “Stochastic processes of the type described above (“The Discrete Noiseless Channel”) are known mathematically as discrete Markov processes. . . .
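The “discrete Markov processes” Shannon invokes can be made concrete in a few lines: learn a first-order word chain from sample text, then sample it to generate Shannon-style approximations to the source. A toy sketch under my own naming, not code from any of the works cited here:

```python
import random
from collections import defaultdict

def build_chain(words):
    """First-order Markov chain over words: for each word, the list of
    words observed to follow it (duplicates preserve frequencies)."""
    chain = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length, rng):
    """Random walk on the chain, stopping early at a dead end."""
    out = [start]
    for _ in range(length - 1):
        choices = chain.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

text = "the cat sat on the mat and the dog sat on the log".split()
chain = build_chain(text)
print(generate(chain, "the", 8, random.Random(0)))
```

With a large enough training corpus, such chains produce the increasingly English-like “approximations” Shannon exhibited in the 1948 paper.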

Also, The Secret Life: Three True Stories of the Digital Age (New York: Farrar, Strauss, & Giroux, 2017). Rabiner, Lawrence. “Hidden Markov Models,” Proceedings of the IEEE, February 1989. Roberts, Jeff John, and Adam Lashinsky. “Hacked: How Companies Fight Back,” Fortune, June 22, 2017. Shannon, Claude Elwood. “A Mathematical Theory of Communication,” The Bell System Technical Journal, October 1948. Tredennick, Nick, and Brion Shimamoto. “Embedded Systems and the Microprocessor,” Microprocessor Report (Cahners), April 24, 2000. von Hilgers, Philipp, and Amy Langville. “The Five Greatest Applications of Markov Chains,” Proceedings of the Markov Anniversary Meeting.

pages: 372 words: 101,174

How to Create a Mind: The Secret of Human Thought Revealed
by Ray Kurzweil
Published 13 Nov 2012

If we consider that error rates escalate rapidly with increased communication and that a single-bit error can destroy the integrity of a process, digital computation was doomed—or so it seemed at the time. Remarkably, that was the common view until American mathematician Claude Shannon (1916–2001) came along and demonstrated how we can create arbitrarily accurate communication using even the most unreliable communication channels. What Shannon stated in his landmark paper “A Mathematical Theory of Communication,” published in the Bell System Technical Journal in July and October 1948, and in particular in his noisy-channel coding theorem, was that if you have available a channel with any error rate (except for exactly 50 percent per bit, which would mean that the channel was just transmitting pure noise), you are able to transmit a message with an error rate as low as you desire.
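The theorem’s promise, vanishing error over a noisy channel, can be glimpsed even with the crudest scheme: repeat each bit and take a majority vote. A toy simulation (my sketch, not Shannon’s construction, and far less efficient than the capacity-approaching codes his theorem guarantees; the 20 percent flip rate is an arbitrary choice for illustration):

```python
import random

def send_bit(bit, flip_prob, rng):
    """Transmit one bit over a binary symmetric channel."""
    return bit ^ (rng.random() < flip_prob)

def send_with_repetition(bit, n, flip_prob, rng):
    """Send the bit n times (n odd); decode by majority vote."""
    votes = sum(send_bit(bit, flip_prob, rng) for _ in range(n))
    return 1 if votes > n / 2 else 0

rng = random.Random(0)
flip_prob = 0.2  # 20% of raw bits arrive corrupted
for n in (1, 3, 9, 21):
    errors = sum(send_with_repetition(1, n, flip_prob, rng) != 1
                 for _ in range(10_000))
    print(f"repeat x{n}: error rate {errors / 10_000:.4f}")
```

Each added repetition buys reliability at the cost of rate; Shannon’s result is that cleverer codes get the reliability without driving the rate to zero.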

Turing, “On Computable Numbers, with an Application to the Entscheidungsproblem: A Correction,” Proceedings of the London Mathematical Society 43 (1938): 544–46. 3. John von Neumann, “First Draft of a Report on the EDVAC,” Moore School of Electrical Engineering, University of Pennsylvania, June 30, 1945. Claude E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal, July and October 1948. 4. Jeremy Bernstein, The Analytical Engine: Computers—Past, Present, and Future, rev. ed. (New York: William Morrow & Co., 1981). 5. “Japan’s K Computer Tops 10 Petaflop/s to Stay Atop TOP500 List,” Top 500, November 11, 2011, http://top500.org/lists/2011/11/press-release. 6.

From the book’s index: “Mathematical Theory of Communication, A” (Shannon), 184.
66, 91, 141–42 prediction by, 50–51, 52, 58, 60, 66–67 redundancy of, 42, 43, 48, 91 sequential processing of information by, 266 simultaneous firings of, 57–58, 57, 146 size parameters in, 42, 49–50, 60, 61, 66, 67, 73–74, 91–92, 173 size variability parameters in, 42, 67, 73–74, 91–92, 173 of sounds, 48 thresholds of, 48, 52–53, 60, 66, 67, 111–12, 173 total number of, 38, 40, 41, 113, 123, 280 universal algorithm of, 111, 275 pattern recognition theory of mind (PRTM), 5–6, 8, 11, 34–74, 79, 80, 86, 92, 111, 172, 217 patterns: hierarchical ordering of, 41–53 higher-level patterns attached to, 43, 45, 66, 67 input in, 41, 42, 44, 66, 67 learning of, 63–64, 90 name of, 42–43 output of, 42, 44, 66, 67 redundancy and, 64 specific areas of neocortex associated with, 86–87, 89–90, 91, 111, 152 storing of, 64–65 structure of, 41–53 Patterns, Inc., 156 Pavlov, Ivan Petrovich, 216 Penrose, Roger, 207–8, 274 perceptions, as influenced by expectations and interpretations, 31 perceptrons, 131–35 Perceptrons (Minsky and Papert), 134–35, 134 phenylethylamine, 118 Philosophical Investigations (Wittgenstein), 221 phonemes, 61, 135, 137, 146, 152 photons, 20–21 physics, 37 computational capacity and, 281, 316n–19n laws of, 37, 267 standard model of, 2 see also quantum mechanics Pinker, Steven, 76–77, 278 pituitary gland, 77 Plato, 212, 221, 231 pleasure, in old and new brains, 104–8 Poggio, Tomaso, 85, 159 posterior ventromedial nucleus (VMpo), 99–100, 99 prairie vole, 119 predictable outcomes, determined outcomes vs., 26, 239 President’s Council of Advisors on Science and Technology, 269 price/performance, of computation, 4–5, 250–51, 257, 257, 267–68, 301n–3n Principia Mathematica (Russell and Whitehead), 181 probability fields, 218–19, 235–36 professional knowledge, 39–40 proteins, reverse-engineering of, 4–5 qualia, 203–5, 210, 211 quality of life, perception of, 277–78 quantum computing, 207–9, 274 quantum mechanics, 218–19 observation in, 218–19, 235–36 randomness vs. 
determinism in, 236 Quinlan, Karen Ann, 101 Ramachandran, Vilayanur Subramanian “Rama,” 230 random access memory: growth in, 259, 260, 301n–3n, 306n–7n three-dimensional, 268 randomness, determinism and, 236 rationalization, see confabulation reality, hierarchical nature of, 4, 56, 90, 94, 172 recursion, 3, 7–8, 56, 65, 91, 153, 156, 177, 188 “Red” (Oluseum), 204 redundancy, 9, 39–40, 64, 184, 185, 197, 224 in genome, 271, 314n, 315n of memories, 59 of pattern recognition modules, 42, 43, 48, 91 thinking and, 57 religious ecstasy, 118 “Report to the President and Congress, Designing a Digital Future” (President’s Council of Advisors on Science and Technology), 269 retina, 95 reverse-engineering: of biological systems, 4–5 of human brain, see brain, human, computer emulation of; neocortex, digital Rosenblatt, Frank, 131, 133, 134, 135, 191 Roska, Boton, 94 Rothblatt, Martine, 278 routine tasks, as series of hierarchical steps, 32–33 Rowling, J.

pages: 608 words: 150,324

Life's Greatest Secret: The Race to Crack the Genetic Code
by Matthew Cobb
Published 6 Jul 2015

W., ‘Grandparental effects in marine sticklebacks: transgenerational plasticity across multiple generations’, Journal of Evolutionary Biology, vol. 27, 2014, pp. 2297–307. Shannon, C. E., An Algebra for Theoretical Genetics, unpublished PhD thesis, Massachusetts Institute of Technology, 1940. Shannon, C. E., ‘A mathematical theory of communication’, The Bell System Technical Journal, vol. 27, 1948a, pp. 379–423. Shannon, C. E., ‘A mathematical theory of communication’, The Bell System Technical Journal, vol. 27, 1948b, pp. 623–56. Shannon, C. E. and Weaver, W., The Mathematical Theory of Communication, Urbana, University of Illinois Press, 1949. Shapiro, B. and Hofreiter, M., ‘A paleogenomic perspective on evolution and gene function: new insights from ancient DNA’, Science, vol. 343, 2014, article 1236573.

A review of Shannon’s ‘A mathematical theory of cryptography’ from 1945’, Cryptologia, vol. 23, 1999, pp. 261–6. Roche, J., ‘Notice nécrologique: André Boivin (1895–1949)’, Bulletin de la Société de Chimie Biologique, vol. 31, 1949, pp. 1564–7. Rogers, E. M., ‘Claude Shannon’s cryptography research during World War II and the mathematical theory of communication’, Proceedings, IEEE 28th International Carnahan Conference on Security Technology, 1994, pp. 1–5. Rogozin, I. B., Carmel, L., Csuros, M. and Koonin, E. V., ‘Origin and evolution of spliceosomal introns’, Biology Direct, vol. 7, 2012, p. 11. Romiguier, J., Ranwez, V., Douzery, E. J.

pages: 444 words: 111,837

Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe
by Paul Sen
Published 16 Mar 2021

The environment at Bell Labs enabled Shannon’s remarkable brain to put together his varied experiences—barbed-wire telegraphy, SIGSALY, his conversations with Alan Turing—and come up with one of the greatest scientific insights of the modern age. In 1948, he revealed his thoughts in a paper entitled “A Mathematical Theory of Communication,” published in the Bell Labs technical journal. Less than thirty pages long, Shannon’s paper enabled humans, for the first time, to measure information in a completely objective and clearly defined way. What does this mean? A photograph, a novel, and a painting are examples of information.

Shannon: An Interview Conducted by Robert Price,” July 28, 1982. “We had dreams”: From an interview Shannon gave to Friedrich-Wilhelm Hagemeyer, February 28, 1977. “I had talked to him several times”: From “Shannon: An Interview by Price.” “They never told me”: As quoted in Mind at Play by Soni and Goodman. “A Mathematical Theory of Communication”: From Bell System Technical Journal 27 (1948). “reproducing at one point”: From the above paper. Shannon pointed the similarity out to John von Neumann: This anecdote originates in a 1971 article, “Energy and Information,” Scientific American, by Myron Tribus and Edward C. McIrvine.

See Thomson, William, Baron Kelvin Kelvin scale (absolute temperature scale), 63–66 kelvin unit of measurement, 66 kinetic energy Boltzmann’s research on, 99–100, 102 coin-swapping analogy for, 100–102 kinetic theory Bernoulli’s development of, 78 Boltzmann’s knowledge of, 97, 98 Clausius’s paper supporting, 78–82, 97 heat flow through conduction and, 133 Loschmidt’s knowledge of, 97 Maxwell’s testing of, 86, 88–90, 92–93, 97 Klein, Felix, 130 Korodi, Albert, 164–65 Kraft (energy), Helmholtz on the conservation of, 47–49, 51–52 La Amistad slave ship, 104–5 Landauer, Rolf, 192–96 background and education of, 193 change in the physical universe when using information and, 183 information processing research of, 192–93 thermodynamic cost of a bit and, 194–97 Landauer limit, 196 languages change in physical universe when using, 183 letter-frequency patterns in, 177–79 letter-pair patterns in, 179 mathematics as a universal language, 105 redundancies in, 181–82 spoken word statistical patterns in, 180 Lavoisier, Antoine, 10, 44 Lavoisier, Marie-Anne, 10 Lectures on Gas Theory (Boltzmann), 143, 147 Levor, Norma, 171–73 light black holes and speed of, 228, 230 cavity radiators measurements of, 138–41 Einstein’s paper on heuristic argument on, 145–48 Einstein’s research on speed of, 152–54 electric current flows in light bulbs for, 133, 136 electromagnetic wave oscillation rate and color of, 135–36 heat transfer using radiation and, 133–34 Maxwell’s description of, as electromagnetic wave, 135, 152 photoelectric effect with, 146 sun’s emission of, 137 light bulbs, electric current in, 133, 136 light quanta, Einstein’s research on, 147, 160, 161, 233 Linde, Carl, 111 liquids, Bernoulli’s theory about movement of particles in, 79–80 London Electrical Society, 24 Loschmidt, Josef, 97, 99, 116–17, 127 loudspeakers, Einstein’s invention of, 162 Louis XVIII, King of France, 8 Lovelock, James, 244 Lunar Society, 5 Lutz, Eric, 196 Lyell, Charles, 70–71 Lynas, Mark, 244 Mach, 
Ernst, 124–26, 130, 151–52 Macmillan’s Magazine, 71 Magnus, Gustav, 51–52, 231 Manchester, textile factories in, 2, 23 Manchester University, 204, 208, 212, 213 Mansell, Robert, 38 Maric, Mileva, 144–45 Marischal College, Aberdeen, 83, 84, 90 Marschak, Jacob, 166 Massachusetts Institute of Technology (MIT), 171–72 mathematical analysis Bekenstein’s black hole entropy and event horizon research with, 234 Bernoulli’s study of fluids using, 76–77 Clausius’s use of, 52, 67, 68, 75, 78, 79 critiques of Darwin’s theory of evolution and, 71 Fourier’s description of the behavior of heat using, 33–34 French steam power efficiency and, 6 Lazare Carnot’s research on physics of waterpower using, 11 Maxwell’s analysis of Clausius’s papers using, 86, 88 Maxwell’s analysis of kinetic theory using, 89–90, 92–93 Maxwell’s use of probabilities and bell curves in, 86–88 Sadi Carnot’s use of, 11 temperature behavior and effects investigated using, 66 Thomson’s analysis of age of the earth using, 72 Thomson’s heat death of the universe using, 61 mathematical equations Clausius’s definition of entropy using, 68 Einstein’s papers with, 222, 224, 225 Maxwell’s description of electromagnetism using, 134 for phyllotaxis, in Turing’s research, 212 Turing’s pattern formation in morphogenesis and, 208–9 “Mathematical Theory of Communication, A” (Shannon), 174–76, 179 mathematics British teaching of, 5 computer calculations in, 171 Gibbs’s education in, 95, 105 Maxwell’s education in, 83, 84 Noether’s recognition as woman in, 155 Noether’s research’s impact on, 159 Thomson’s education in, 34 Turing’s education in, 201–2 as a universal language, 105 Maxwell, James Clerk, xi, 83–94, 187–90 background and education of, 83–85 Clausius’s work on kinetic theory of heat and, 82, 86, 88 Einstein’s familiarity with, 107, 144, 152, 153 Einstein’s ideas on the speed of light and, 152 electromagnetism research of, 93, 134–37, 152, 153, 187 Gibbs’s papers and, 117, 118 health and death of, 86, 94 heat 
research of, 84, 134–36 kinetic theory testing approach of, 86, 88–90, 92–93 Loschmidt’s knowledge of ideas of, 97 Marischal College appointment and later loss by, 83, 84, 90 marriage of, 84–85, 91, 94 physics laboratory founded by, 93–94 probabilities and bell curves used by, 86–88, 89 relationship between gas viscosity and pressure tested by, 89–90, 92–93, 187 Szilard’s thought experiment (Szilard’s demon) and, 190 Thomson’s paper on, 189–90 thought experiment (Maxwell’s demon) on flaws in thermodynamic theory by, 187–90, 192 wife Katherine’s collaboration with, 85, 91–92, 93, 187 Maxwell, Katherine Dewar collaboration with husband, 85, 91–92, 93, 187 marriage of, 84–85, 91, 94 Maxwell’s demon, 187–90, 192 Mayer, Julius Robert, 47 mechanics, symmetry in, 157–58 Meitner, Lise, 132 mercury-based thermometers, 63 Mertens, Franz, 155 microphotography, 30 microwave frequencies kiln temperature and, 137 light and, 135 microwave radiation, invisible long-wavelength, 138 Millikan, Robert, 168–69 mining industry duty concept in, 4, 10 steam power in Britain and, 2–3, 10 molecules Einstein’s research on proof of, 143, 144, 148–50 energeticism on existence of, 125–26 Perrin’s study of particle drift of, 151 phenomenalism debate about, 124–25 scientific community’s acceptance of, after Einstein’s research, 151–52 momentum, in laws of mechanics, 158 Moore, Betty, 174 morphogenesis cannibal-missionary model in, 208, 216–17 diffusion in, 206, 209, 215 face formation in, 216–17 feedback in, 206–8, 210, 216 fruit fly larva study in, 215 hand shapes and, 217 pattern formation theory on, 207–10, 214, 217 promise of future developments in, 217–18 Turing’s paper on, 205–11, 212, 215–16, 217 Wolpert’s PI model in, 214–16, 217 motive power of steam engines Carnot’s description of, 12, 14, 15, 25 Carnot’s legacy of discovery of, 20 Carnot’s theory of an ideal engine and, 15–19, 247–52 Watt’s condenser and, 13–14 Murray, Arnold, 211 muscle movement, chemical causes of, 45–46 muscle 
power, other sources of power replacing, 12, 18, 241 Napoleon, 1, 2, 6, 8 National Conservatory of Arts and Crafts, Paris, 6, 7, 8, 9 nationalism, Boltzmann and rise of, 126 National Physical Laboratory (NPL), 204 natural philosophy, 5, 34, 35, 83, 84, 90, 242 natural selection, 71 “Nature of the Motion We Call Heat, The” (Clausius), 78–79 Nazism, 165–66, 193, 202 Newcomen, Thomas, 3 Newcomen engines, 3–4 Newman, Max, 204 Newton, Isaac, 60, 70, 76 Einstein’s general theory of relativity and, 220, 222, 224 on Einstein’s thought experiment on falling objects and gravity, 222 theory of gravity of, 86, 156, 220, 221–22, 225 Newtonian mechanics, Bernoulli’s research on gases using, 76–77 Newton’s laws British universities’ teaching of, 5 French approach to, as foundation, 6 Maxwell’s approach to testing kinetic theory using, 86 Nobel Prize, 142, 145 Noether, Emmy, xi, 155–59, 165–66 background and education of, 155–56 Einstein’s theories and, 156–57, 159 Noether’s theorem, 157–59 “Notes on Mathematics, Physics, and Other Subjects” (S.

pages: 230 words: 61,702

The Internet of Us: Knowing More and Understanding Less in the Age of Big Data
by Michael P. Lynch
Published 21 Mar 2016

Recent books reflecting these themes include Weinberger, Too Big to Know; Bilton, I Live in the Future; Rifkin, The Zero Marginal Cost Society; Rudder, Dataclysm. 11. James, Pragmatism, 164–65. 12. Wieseltier, “Among the Disrupted.” 13. A few other sticks have been placed in the stream, for example: Carr, The Shallows; Roberts, The Impulse Society; Sunstein, Republic.com 2.0. 14. Shannon, “A Mathematical Theory of Communication.” See Gleick, The Information, for discussion. 15. Grimm, “Is Understanding a Species of Knowledge?”; Kvanvig, The Value of Knowledge, 185–96. 16. Bostrom, “Are We Living in a Computer Simulation?” Bostrom’s argument is ingeniously simple: Assume that in the future, some culture of super beings eventually reaches technological “maturity.”

Helbing. “A Network Framework of Cultural History.” Science 345, no. 6196 (2014): 558–62. Schuster, Jack H., and Martin J. Finkelstein. The American Faculty: The Restructuring of Academic Work and Careers. Baltimore: Johns Hopkins University Press, 2006. Shannon, Claude Elwood. “A Mathematical Theory of Communication.” ACM SIGMOBILE Mobile Computing and Communications Review 5, no. 1 (2001): 3–55. Silverman, Craig, ed. Verification Handbook: An Ultimate Guideline to Digital Age Sourcing for Emergency Coverage. Netherlands: European Journalism Centre, 2014. Solzhenitsyn, Aleksandr. Cancer Ward. 1968.

pages: 463 words: 118,936

Darwin Among the Machines
by George Dyson
Published 28 Mar 2012

By repeated sorting and other iterated functions, primitive punched-card machines could perform complex operations, but, like the original Turing machine, they had only a small number of possible states. The fundamental unit of information was the bit; its explicit definition as the contraction of “binary digit” was first noted in an internal Bell Laboratories memo written by John W. Tukey on 9 January 1947,20 and first published in Claude Shannon’s Mathematical Theory of Communication in 1948.21 Shannon’s definition was foreshadowed by Vannevar Bush’s analysis, in 1936, of the number of “bits of information” that could be stored on a punched card. In those days bits were assigned only fleetingly to electrical or electronic form. Most bits, most of the time, were bits of paper (or bits of missing paper, represented by the chad that was carted off to landfills by the ton).

Austrian, Herman Hollerith: Forgotten Giant of Information Processing (New York: Columbia University Press, 1982), 39–40. 18.Emmanuel Scheyer, “When Perforated Paper Goes to Work: How Strips of Paper Can Endow Inanimate Machines with Brains of Their Own,” Scientific American 127 (December 1922): 395. 19.Vannevar Bush, “Instrumental Analysis,” Bulletin of the American Mathematical Society 42 (October 1936): 652. 20.John W. Tukey, 9 January 1947, “Sequential Conversion of Continuous Data to Digital Data,” in Henry S. Tropp, “Origin of the Term Bit,” Annals of the History of Computing 6, no. 2 (April 1984): 153–154. 21.Claude E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal 27 (July and October 1948): 379–423, 623–656. 22.Bush, “Instrumental Analysis,” 653–654. 23.Irving J. Good, “Pioneering Work on Computers at Bletchley,” in Nicholas Metropolis, J. Howlett, and Gian-Carlo Rota, eds., A History of Computing in the Twentieth Century (New York: Academic Press, 1980), 35. 24.Peter Hilton, “Reminiscences of Bletchley Park, 1942–1945,” in A Century of Mathematics in America, part 1 (Providence, R.I.: American Mathematical Society, 1988), 293–294. 25.Diana Payne, “The Bombes,” in F.

; digital computers; electronics; human-machine systems; self-organizing systems; telecommunications; Turing machine abandoned, 93 evolution of, and Erasmus Darwin, 21 human subservience to, 25–26, 33, 226–27 Lamarckian tendencies among, 30 Leibniz and Babbage on coded descriptions of, 38 miniaturization of, 15, 173–74 relational and differential (Smee), 47, 171 sanctuary from, 17 self-reproducing, 31, 76, 108–109, 172, 175, 185, 191 symbiosis with, 10, 12, 172, 179, 224, 226–27 ultraintelligent, 72, 171, 205, 209 virtual, 125, 127, 128, 185 in World War I, 193, 221 MacPhail, Malcolm, on Turing, 58 Macy (Cybernetics) conferences, 101 magic, and artificial intelligence, 212–14 Malebranche, Nicolas (1638–1715), 181 Malthus, Thomas (1766–1834), 162 Manchester Mark I (and “baby” Mark I) computer, 67, 70, 104 Manchester University, 69–70, 104–105, 118, 119, 204 Mandl, Alex, on AT&T, 9 Manhattan Project, 76. see also Los Alamos mapping (of information), 38, 133, 137–38, 216, 225, 228 between genotype and phenotype, 118–19, 216, 225 Marchant (mechanical calculator), 84 Margulis, Lynn, 12, 113 Marquand, Allan (1853–1924), 58–59 Marschak, Jacob, on von Neumann, 154 materialism, of Hobbes, 3, 5, 51, 227 Mathematical Analysis of Logic (Boole), 43 Mathematical Foundations of Quantum Mechanics (von Neumann), 77 mathematical tables, and digital computing, 39–40 Mathematical Theory of Communication (Shannon), 61 mathematics. see also algebra; arithmetic; digital computers; formal systems; game theory; incompleteness; logic foundations of, 6–7, 43–44, 49–50, 53–58, 156, philosophy of, 7, 9, 38, 39, 41–44, 49–50, 130, 168, 190, 218, 228 truth and proof in, 39, 42, 49–50, 53–54, 56, 168, 228 matrix biological, 19 computational, 177, 183, 215 Matthew, Patrick (1790–1874), 17 Mauchly, John W. (1907–1980), 81, 82, 85, 90, 98 McClelland, J.

pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence
by George Zarkadakis
Published 7 Mar 2016

They convert signals that exist in the physical world into binary representations of ‘0s’ and ‘1s’.24 In binary code ‘0’ denotes the absence of a signal and ‘1’ the presence of a signal. Every time you use your smartphone to take a picture, light captured by your phone’s camera is converted into binary digits and stored in the memory. Digital information is a long, long sequence of zeros and ones. Shannon’s breakthrough idea in his seminal paper ‘A Mathematical Theory of Communication’25 was to borrow the probabilistic mathematics of thermodynamics and apply them to the new field of telecommunications. Thermodynamics describes how molecules move as they heat up or cool down. The greater the heat, the more energetic the molecules become. A key concept in thermodynamics is how ordered the system of molecules is, or how evenly they are spread around at a given temperature.
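The entropy measure the excerpt alludes to can be sketched in a few lines of Python (an illustrative aside, not from the book; the function name and sample strings are my own): treat the observed symbol frequencies of a message as probabilities, and average the bits each symbol contributes.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Bits per symbol, estimated from the symbol frequencies in the message."""
    counts = Counter(message)
    total = len(message)
    # p * log2(1/p), summed over symbols: rarer symbols contribute more bits.
    return sum((n / total) * math.log2(total / n) for n in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 — a perfectly predictable message
print(shannon_entropy("abab"))  # 1.0 — two equally likely symbols, one bit each
```

Just as higher temperature spreads molecules more evenly over their possible states, a flatter symbol distribution yields higher entropy — the parallel Shannon exploited.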

For now, I want to focus on four individuals who took part in the Macy Conferences, and whose work laid the foundations for Artificial Intelligence: Norbert Wiener, Claude Shannon, Warren McCulloch and John von Neumann. We have already met the first two. Norbert Wiener was the grand visionary of cybernetics. Inspired by mechanical control systems, such as artillery targeting and servomechanisms, as well as Claude Shannon’s mathematical theory of communication and information, he articulated the theory of cybernetics in his landmark book, Cybernetics, of 1948.4 Godfather number two, Claude Shannon, was the genius who gave us information theory. We saw how Wiener and Shannon pondered on the ontology of information, and how they decided to regard it as something beyond matter and energy.

Both Minsky and Shannon worked together at Bell Labs in 1952. 24Perhaps the title of my book holds an ‘encrypted’ message. If you put the initials of the title together (IOOI), and transform them to binary numbers (1001), you get the decimal result 9 (whatever that means …)! 25Shannon, C. E., and Weaver, W. (1949), The Mathematical Theory of Communication. Champaign: University of Illinois Press. Shannon co-wrote the book with Warren Weaver, a pioneer in machine translation. 26I am rephrasing here an example given by Katherine Hayles in her 1999 book, How We Became Posthuman. 27The number of cells in our body is estimated between 5 billion and 200 billion.

pages: 429 words: 114,726

The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise
by Nathan L. Ensmenger
Published 31 Jul 2010

Not only did it lay claim to the valuable intellectual territory suggested by the commonsense understanding of information as knowledge or data but it also linked the discipline to the specific formulation of information developed in the late 1940s by the mathematician Claude Shannon. In his seminal book with Warren Weaver from 1949, A Mathematical Theory of Communication, Shannon had defined information in terms of the physical concept of negative entropy.61 His information theory appealed to scientists in a wide variety of disciplines, and for a time it appeared as if information might serve as a broadly unifying concept in the sciences.62 But despite its intellectual appeal, Shannon’s mathematical definition of information was never widely applicable outside of communications engineering.

Michael Mahoney, “Software as Science–Science as Software,” in Mapping the History of Computing: Software Issues, ed. Ulf Hashagen, Reinhard Keil-Slawik, and Arthur Norberg (Berlin: Springer-Verlag, 2002), 25–48. 60. ACM Curriculum Committee, “An Undergraduate Program in Computer Science.” 61. Claude Shannon and Warren Weaver, A Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949). 62. Lily Kay, “Who Wrote the Book of Life? Information and the Transformation of Molecular Biology,” Science in Context 8 (1995): 609–634; Ronald Kline, “Cybernetics, Management Science, and Technology Policy: The Emergence of ‘Information Technology’ as a Keyword, 1948–1985,” Technology and Culture 47, no. 3 (2006): 513–535. 63.

CBI 116, “Institute for Certification of Computer Professionals Records, 1960–1993,” Box 1, Folder 30, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis. 17th RAND Symposium: Problems of the AFIPS Societies Revisited (1975), CBI 78, “RAND Symposia on Computing Transcripts,” Box 3, Folder 7, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis. Shannon, Claude, and Warren Weaver. A Mathematical Theory of Communication. Urbana: University of Illinois Press, 1949. Shapiro, Stuart. “Splitting the Difference: The Historical Necessity of Synthesis in Software Engineering.” IEEE Annals of the History of Computing 19 (1) (1997): 20–54. Shapiro, Stuart, and Steven Woolgar. “Balancing acts: reconciling competing visions of the way software technologists work.”

pages: 254 words: 76,064

Whiplash: How to Survive Our Faster Future
by Joi Ito and Jeff Howe
Published 6 Dec 2016

Although the project remained secret until the 1970s, and all of the records associated with it were destroyed, several of the people who had worked on the project went on to build the next generation of digital computers.31 Much of their work was informed by two papers published by Claude Shannon in the late 1940s, “A Mathematical Theory of Communication”32 and “Communication Theory of Secrecy Systems,” 33 which established the field of information theory and proved that any theoretically unbreakable cipher must share the characteristics of the one-time pad. Originally developed in the late nineteenth century, and rediscovered near the end of the First World War, the one-time pad requires that both the sender and the receiver have a key made up of a string of random digits at least the length of the message.
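In modern terms the one-time pad described above reduces to a byte-wise XOR. A minimal sketch (my own illustration; the sample message and the use of `os.urandom` for the key are assumptions, not from the book):

```python
import os

def one_time_pad(data: bytes, key: bytes) -> bytes:
    """XOR each message byte with a key byte; the same call decrypts."""
    if len(key) < len(data):
        raise ValueError("key must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = os.urandom(len(message))  # truly random, used once, never reused

ciphertext = one_time_pad(message, key)
assert one_time_pad(ciphertext, key) == message  # XOR is its own inverse
```

The security rests entirely on the key being random, secret, and never reused — which is why distributing key material, not the cipher itself, was the practical obstacle.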

Kindle Edition, chapter 1: “The Cipher of Mary, Queen of Scots.” 25 Ibid. 26 Pierre Berloquin, Hidden Codes & Grand Designs: Secret Languages from Ancient Times to Modern Day (New York: Sterling Publishing Company, Inc., 2008). 27 Singh, The Code Book. 28 Ibid. 29 Singh, The Code Book, chapter 2: “Le Chiffre Indéchiffrable”; Richard A. Mollin, An Introduction to Cryptography (Boca Raton, FL: CRC Press, 2000). 30 Singh, The Code Book. 31 Singh, The Code Book, chapter 6: “Alice and Bob Go Public.” 32 C. E. Shannon, “A Mathematical Theory of Communication,” SIGMOBILE Mobile Computing and Communications Review 5, no. 1 (January 2001): 3–55, doi:10.1145/584091.584093. 33 C. E. Shannon, “Communication Theory of Secrecy Systems,” Bell System Technical Journal 28, no. 4 (October 1, 1949): 656–715, doi:10.1002/j.1538-7305.1949.tb00928.x. 34 B.

Cognitive Gadgets: The Cultural Evolution of Thinking
by Cecilia Heyes
Published 15 Apr 2018

Culture and the sequence of steps in theory of mind development. Developmental Psychology, 47(5), 1239–1247. Shanks, D. R. (2010). Learning: from association to cognition. Annual Review of Psychology, 61, 273–301. Shannon, C. E. (1949). The mathematical theory of communication. In C. E. Shannon and W. Weaver (eds.), The Mathematical Theory of Communication. Urbana: University of Illinois Press. Shea, N. (2009). Imitation as an inheritance system. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 364, 2429–2443. Shea, N. (2013). Inherited representations are read in development.

pages: 253 words: 83,473

The Demon in the Machine: How Hidden Webs of Information Are Finally Solving the Mystery of Life
by Paul Davies
Published 31 Jan 2019

The project began as war work: if you are cursed with a hissing radio or a crackly telephone line, what is the best strategy you can adopt to get word through with the least chance of error? Shannon set out to study how information can be encoded so as to minimize the risk of garbling a message. The project culminated in 1949 with the publication of The Mathematical Theory of Communication.2 The book was released without fanfare but history will judge that it represented a pivotal event in science, one that goes right to the heart of Schrödinger’s question ‘What is life?’ Shannon’s starting point was to adopt a mathematically rigorous definition of information. The one he chose turned on the notion of uncertainty.
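Shannon's uncertainty-based definition can be put numerically (an aside of mine, not Davies's text): the information carried by an outcome of probability p is log2(1/p) bits, so the unlikely outcomes are the informative ones.

```python
import math

def surprisal(p: float) -> float:
    """Information, in bits, gained by observing an event of probability p."""
    return math.log2(1 / p)

print(surprisal(1 / 2))  # 1.0 bit — a fair coin toss
print(surprisal(1 / 6))  # ~2.585 bits — one face of a fair die
print(surprisal(1.0))    # 0.0 bits — a certain event tells you nothing
```

Averaging this quantity over all possible messages from a source gives the source's entropy, the central measure of the 1949 book.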

(Springer, 2018), pp. 17–27. 8. Eric Smith, ‘Chemical Carnot cycles, Landauer’s principle and the thermodynamics of natural selection’, Talk/Lecture, Bariloche Complex Systems Summer School (2008). 2. ENTER THE DEMON 1. Peter Hoffman, Life’s Ratchet (Basic Books, 2012), p. 136. 2. Claude Shannon, The Mathematical Theory of Communication (University of Illinois Press, 1949). 3. Christoph Adami, ‘What is information?’, Philosophical Transactions of The Royal Society, A 374: 20150230 (2016). 4. Leo Szilárd, ‘On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings’, Zeitschrift fur Physik, 53, 840–56 (1929). 5.

pages: 350 words: 90,898

A World Without Email: Reimagining Work in an Age of Communication Overload
by Cal Newport
Published 2 Mar 2021

Perhaps his largest intellectual leap was his 1937 MIT master’s thesis, which he submitted at the age of twenty-one and, among other contributions, laid the foundation for all of digital electronics.1 But it’s toward another of his most famous works that I’ll turn our attention now, as it will prove useful in our quest to move beyond the hyperactive hive mind workflow. I’m talking about Shannon’s invention of information. To be more precise, Shannon wasn’t the first person to talk carefully about information or to try to quantify it. But his 1948 paper, “A Mathematical Theory of Communication,” established a framework called information theory that fixed the flaws of earlier attempts to study this topic formally and provided the tools that ended up making the modern digital communication revolution possible. Underlying this framework is a simple but profound idea: by adding complexity to the rules we use to structure our communication, the actual amount of information required by the interactions can be reduced.
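Newport's point — more elaborate coding rules can shrink the bits actually transmitted — can be sketched with a toy prefix code (the codewords and sample message are my own, chosen for illustration):

```python
# A fixed-length rule: every symbol costs 2 bits, however common it is.
fixed = {"a": "00", "b": "01", "c": "10", "d": "11"}

# A more complex, prefix-free rule: frequent symbols get shorter codewords.
variable = {"a": "0", "b": "10", "c": "110", "d": "111"}

def encoded_bits(code: dict[str, str], text: str) -> int:
    """Total bits needed to transmit text under the given codebook."""
    return sum(len(code[ch]) for ch in text)

message = "aaaaaaab" * 100 + "cd"  # skewed: mostly "a"

print(encoded_bits(fixed, message))     # 1604
print(encoded_bits(variable, message))  # 906 — same message, ~44% fewer bits
```

The decoding rule is more complicated, but the transmission is nearly halved — the trade-off Shannon's framework made precise.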

“Mathematical Theory of Communication, A” (Shannon), 179–80, 183

pages: 293 words: 91,110

The Chip: How Two Americans Invented the Microchip and Launched a Revolution
by T. R. Reid
Published 18 Dec 2007

His master’s thesis, in 1937, demonstrated how computerized mathematical circuits should be designed; this youthful piece of work not only served as the cornerstone of computer architecture from then on, but also launched a new academic discipline known as switching theory. Ten years later, as a researcher at Bell Labs, Shannon got to thinking about efficient means of electronic communications (for example, how to send the largest number of telephone conversations through a single wire). He published another seminal paper, “A Mathematical Theory of Communication,” that launched an even more important new academic discipline known as information theory; today information theory is fundamental not only in electronics and computer science but also in linguistics, sociology, and numerous other fields. You could argue that Claude Shannon was the Alexander Graham Bell of the cellular phone, because mobile communications would be impossible without the basic formulas of information theory that Shannon devised.

Dover Press deserves our gratitude for keeping in print a paperback version of George Boole’s masterpiece, The Laws of Thought (New York: Dover Publications, 1951). There is as yet no biography of Claude Shannon, but a reader might be interested in the book that launched the burgeoning field of information theory—that is, Claude E. Shannon, The Mathematical Theory of Communication (Champaign: University of Illinois Press, 1949). Computer history is just now emerging as an academic discipline of its own, and there will no doubt be some fine books written on the work of von Neumann, Turing, and other computer pioneers. There is a good general history in Joseph C.

Turing's Cathedral
by George Dyson
Published 6 Mar 2012

Tukey shortly after he joined von Neumann’s project in November of 1945. The existence of a fundamental unit of communicable information, representing a single distinction between two alternatives, was defined rigorously by information theorist Claude Shannon in his then-secret Mathematical Theory of Cryptography of 1945, expanded into his Mathematical Theory of Communication of 1948. “Any difference that makes a difference” is how cybernetician Gregory Bateson translated Shannon’s definition into informal terms.2 To a digital computer, the only difference that makes a difference is the difference between a zero and a one. That two symbols were sufficient for encoding all communication had been established by Francis Bacon in 1623.
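Bacon's two-symbol observation translates directly into modern terms (a 0/1 rendering of my own; Bacon himself used the letters a and b): five binary places give 2⁵ = 32 codewords, more than enough for a 26-letter alphabet.

```python
def to_biliteral(text):
    """Encode uppercase A-Z as five-place binary strings (A=00000)."""
    return " ".join(format(ord(c) - ord("A"), "05b") for c in text)

print(to_biliteral("BIT"))  # 00001 01000 10011
```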

Army Ordnance and the Aberdeen Proving Ground) was one of the principal organizers of the army’s Electronic Numerical Integrator and Computer, or ENIAC, whose existence would not be made public until February 1946. Statistician John Tukey (of Princeton University and Bell Laboratories) provided a direct link to Claude Shannon, whose mathematical theory of communication showed how a computer built from unreliable components could be made to function reliably from one cycle to the next. Jan Rajchman and Arthur Vance were engineers, and George Brown a statistician, from RCA. This first meeting of the Institute for Advanced Study’s Electronic Computer Project established principles that would guide the destiny of computing for the next sixty years.

Mathematical Theory of Communication (Shannon, 1948); Mathematical Theory of Cryptography (Shannon, 1945)

pages: 665 words: 159,350

Shape: The Hidden Geometry of Information, Biology, Strategy, Democracy, and Everything Else
by Jordan Ellenberg
Published 14 May 2021

Then you can use a random number generator to select the next letter; there should be a 14.7% chance it is S, an 11.3% chance it is T, and so on. Having chosen your next letter (T, say), you have your next bigram (NT) and you can proceed as before, as long as you like. Shannon’s paper “A Mathematical Theory of Communication” (the one that launched the entire field of information theory) was written in 1948 and thus did not have access to 3.5 trillion letters of English text in a modern magnetic storage system. So he estimated the Markov chain in a different way. If the bigram before him were ON, he would take a book off the shelf and look through it until he found the letters O and N in succession.
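The procedure Ellenberg describes is easy to mimic with a few lines of code. The sketch below (an illustrative simplification, conditioning on just the previous letter rather than a full bigram) trains a Markov chain on a scrap of text and generates new text by weighted random choice, in the same spirit as Shannon's book-sampling trick:

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Count, for each letter, how often each successor follows it."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, n, seed=0):
    """Walk the chain: sample each next letter by its conditional frequency."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        followers = counts[out[-1]]
        letters, weights = zip(*followers.items())
        out.append(rng.choices(letters, weights=weights)[0])
    return "".join(out)

counts = train_bigrams("the theory that the thing then")
print(generate(counts, "t", 20))
```

With a real corpus instead of this toy training string, the output starts to resemble plausible (if meaningless) English, which was Shannon's point.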

Norvig, “English Letter Frequency Counts: Mayzner Revisited, or ETAOIN SRHLDCU,” 2013, available at http://norvig.com/mayzner.html. Some of the bigram and trigram frequencies are taken from Norvig’s earlier “Natural Language Corpus Data,” in Beautiful Data, eds. T. Segaran and J. Hammerbacher (Sebastopol, CA: O’Reilly, 2009). It was the engineer: Claude E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal 27, no. 3 (1948): 388. Here’s some text: All the Markov-chain-generated text here was carried out by Brian Hayes’s incredibly fun “Drivel Generator,” available at http://bit-player.org/wp-content/extras/drivel/drivel.html, using public baby-name data from the U.S.

“Mathematical Theory of Communication, A” (Shannon), 93

Language and Mind
by Noam Chomsky
Published 1 Jan 1968

There were few so benighted as to question the possibility, in fact the immediacy, of a final solution to the problem of converting speech into writing by available engineering technique. And just a few years later, it was jubilantly discovered that machine translation and automatic abstracting were also just around the corner. For those who sought a more mathematical formulation of the basic processes, there was the newly developed mathematical theory of communication, which, it was widely believed in the early 1950s, had provided a fundamental concept – the concept of “information” – that would unify the social and behavioral sciences and permit the development of a solid and satisfactory mathematical theory of human behavior on a probabilistic base.

It has, I believe, become quite clear that if we are ever to understand how language is used or acquired, then we must abstract for separate and independent study a cognitive system, a system of knowledge and belief, that develops in early childhood and that interacts with many other factors to determine the kinds of behavior that we observe; to introduce a technical term, we must isolate and study the system of linguistic competence that underlies behavior but that is not realized in any direct or simple way in behavior. And this system of linguistic competence is qualitatively different from anything that can be described in terms of the taxonomic methods of structural linguistics, the concepts of S-R psychology, or the notions developed within the mathematical theory of communication or the theory of simple automata. The theories and models that were developed to describe simple and immediately given phenomena cannot incorporate the real system of linguistic competence; “extrapolation” for simple descriptions cannot approach the reality of linguistic competence; mental structures are not simply “more of the same” but are qualitatively different from the complex networks and structures that can be developed by elaboration of the concepts that seemed so promising to many scientists just a few years ago.

pages: 1,799 words: 532,462

The Codebreakers: The Comprehensive History of Secret Communication From Ancient Times to the Internet
by David Kahn
Published 1 Feb 1963

I’d worked on communication systems and I was appointed to some of the committees studying cryptanalytic techniques. The work on both the mathematical theory of communications and the cryptology went forward concurrently from about 1941. I worked on both of them together and I had some of the ideas while working on the other. I wouldn’t say one came before the other—they were so close together you couldn’t separate them.” Though the work on both was substantially complete by about 1944, he continued polishing them until their publication as separate papers in the abstruse Bell System Technical Journal in 1948 and 1949. Both articles—“A Mathematical Theory of Communication” and “Communication Theory of Secrecy Systems”—present their ideas in densely mathematical form, pocked with phrases like “this inverse must exist uniquely” and expressions like “T_iR_j(T_kR_l)^{-1}T_mR_n.”

An example of the theory’s use in history is Roberta Wohlstetter’s Pearl Harbor, which fruitfully uses the concepts “signal” and “noise” to help explain the catastrophe. 744 redundancy: “A Mathematical Theory of Communication,” Colin Cherry, On Human Communication (New York: John Wiley & Sons, 1957), 115-120, 180-187; George A. Miller, Language and Communication (New York: McGraw-Hill, 1951), chs. 4 and 5. The latter are extremely valuable books. 745 four-letter language: adapted from G. T. Guilbaud, What Is Cybernetics?, trans. by Valerie MacKay (New York: Grove Press (Evergreen), 1960), 102. 745 “Two extremes of redundancy”: “A Mathematical Theory of Communication,” §7. 745 Dewey’s count: Relative Frequency of English Speech Sounds (Cambridge, Mass.: Harvard University Press, 1923), 17-19. 746 voiced stops: George K.

What is perhaps most striking is that the eight shorter counts average to 18.0 per cent—a difference of only one e per thousand letters from the Kaeding standard. Thus does language cleave to its statistical norms! Why? The answer may be found within the theory formulated after World War II that not only explains cryptanalysis but also extends far beyond. It is called “information theory” or, sometimes, a “mathematical theory of communication.” It deals in general with the mathematical laws that govern systems designed to communicate information. Originating in transmission problems of telephony and telegraphy, it has grown to embrace virtually all information-processing devices, from standard communications systems to electronic computers and servomechanisms, and even the nerve networks of animals and men.
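The counts Kahn describes, and the way any sizable text "cleaves to its statistical norms," are simple to reproduce (a sketch of my own, not Kahn's tabulation):

```python
from collections import Counter

def letter_frequencies(text):
    """Relative frequency of each letter, case-folded; non-letters ignored."""
    letters = [c for c in text.lower() if c.isalpha()]
    total = len(letters)
    return {c: n / total for c, n in Counter(letters).items()}

sample = "thus does language cleave to its statistical norms"
top = sorted(letter_frequencies(sample).items(), key=lambda kv: -kv[1])[:3]
print(top)
```

Run over a few thousand letters of ordinary prose instead of this short sample, the frequencies settle toward the stable per-language norms that make frequency analysis, and hence classical cryptanalysis, possible.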

pages: 94 words: 33,179

Novacene: The Coming Age of Hyperintelligence
by James Lovelock
Published 27 Aug 2019

He felt so strongly about this that he asked for the simple formula expressing his thoughts to be carved on his gravestone. The first attempt to tackle information scientifically was in the 1940s, when the American mathematician and engineer Claude Shannon was working on cryptography. In 1948 this work resulted in his article ‘A Mathematical Theory of Communication’, a primary document of post-war technology. Information theory is now at the centre of mathematics, computer science and many other disciplines. The basic unit of information is the bit, which can have a value of zero or one, as in true or false, on or off, yes or no. I see a bit as primarily an engineering term, the tiniest thing from which all else is constructed.

pages: 124 words: 36,360

Kitten Clone: Inside Alcatel-Lucent
by Douglas Coupland
Published 29 Sep 2014

But we couldn’t have done anything too practical with the transistor unless someone figured out that information needs to be converted into ones and zeroes to be processed properly. That happened here, and was shared with the world in the July and October 1948 issues of the Bell System Technical Journal, in Claude Shannon’s seminal ‘A Mathematical Theory of Communication.’ In the next decades, lasers and optical fibre were developed to overcome the limitations of copper wiring; cellular-based wireless communication was developed to maximize the amount of information that could be passed within a system. An iPhone 4, if made with vacuum tubes, would fill one-quarter of the Grand Canyon.”

pages: 396 words: 112,748

Chaos: Making a New Science
by James Gleick
Published 18 Oct 2011

The patterns revealed a stretching and folding that led back to the horseshoe map of Smale. THE MOST CHARACTERISTICALLY Santa Cruzian imprint on chaos research involved a piece of mathematics cum philosophy known as information theory, invented in the late 1940s by a researcher at the Bell Telephone Laboratories, Claude Shannon. Shannon called his work “The Mathematical Theory of Communication,” but it concerned a rather special quantity called information, and the name information theory stuck. The theory was a product of the electronic age. Communication lines and radio transmissions were carrying a certain thing, and computers would soon be storing this same thing on punch cards or magnetic cylinders, and the thing was neither knowledge nor meaning.

DOYNE FARMER Farmer is the main figure and Packard is a secondary figure in The Eudemonic Pie, the story of the roulette project, written by a sometime associate of the group. PHYSICS AT SANTA CRUZ Burke, Farmer, Crutchfield. “GIZMO-ORIENTED” Shaw. FORD HAD ALREADY DECIDED Ford. THEY REALIZED THAT MANY SORTS Shaw, Farmer. INFORMATION THEORY The classic text, still quite readable, is Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois, 1963), with a helpful introduction by Weaver. “WHEN ONE MEETS THE CONCEPT” Ibid., p. 13. NORMAN PACKARD WAS READING Packard. IN DECEMBER 1977 Shaw. WHEN LORENZ WALKED INTO THE ROOM Shaw, Farmer. HE FINALLY MAILED HIS PAPER “Strange Attractors, Chaotic Behavior, and Information Flow.”

pages: 169 words: 41,887

Literary Theory for Robots: How Computers Learned to Write
by Dennis Yi Tenen
Published 6 Feb 2024

Richards, The Meaning of Meaning: A Study of the Influence of Language Upon Thought and of the Science of Symbolism (New York: Harcourt, Brace, 1936), 47. 103 “As for my reasons”: L. I. Emelyakh, “Delo ob otluchenii ot tserkvi akademika A. A. Markova” [The case of the excommunication from the church of academician A. A. Markov], Voprosy istorii religii i ateizma 2 (1954): 397–­411. 106 “These semantic aspects of communication”: Claude Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal 27, no. 3 (1948): 379. 110 The program was capable of: Charles M. Vossler and Neil M. Branston, “The Use of Context for Correcting Garbled English Text,” in Proceedings of the 1964 ACM 19th National Conference (New York: Association for Computing Machinery, 1964), 42.401–­42.4013.

pages: 561 words: 120,899

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant From Two Centuries of Controversy
by Sharon Bertsch McGrayne
Published 16 May 2011

One problem complemented the other; the purpose of information is to reduce uncertainty while the purpose of encryption is to increase it. Shannon was using Bayesian approaches for both. He said, “Bell Labs were working on secrecy systems. I’d work on communications systems and I was appointed to some of the committees studying cryptanalytic techniques. The work on both the mathematical theory of communications and the cryptography went forward concurrently from about 1941. I worked on both of them together and I had some of the ideas while working on the other. I wouldn’t say that one came before the other—they were so close together you couldn’t separate them.”32 Shannon’s efforts united telegraph, telephone, radio, and television communication into one mathematical theory of information.

Rejewski M. (1981) How Polish mathematicians deciphered the Enigma. Annals of the History of Computing (3) 223. Rukhin, Andrew L. (1990) Kolmogorov’s contributions to mathematical statistics. Annals of Statistics (18:3) 1011–16. Sales, Tony. www.codesandciphers.org.uk/aescv.htm. Shannon, Claude E. (July, October 1948) A mathematical theory of communication. Bell System Technical Journal (27) 379–423, 623–56. ———. (1949) Communication theory of secrecy systems. netlab.cs.ucla.edu/wiki/files/Shannon1949.pdf. Acc. March 31, 2007. Shiryaev, Albert N. (1989) Kolmogorov: Life and Creative Activities. Annals of Probability (17:3) 866–944. ———. (1991) Everything about Kolmogorov was unusual.

pages: 566 words: 122,184

Code: The Hidden Language of Computer Hardware and Software
by Charles Petzold
Published 28 Sep 1999

Although this circuit contains nothing that wasn't invented in the nineteenth century, nobody in that century ever realized that Boolean expressions could be directly realized in electrical circuits. This equivalence wasn't discovered until the 1930s, most notably by Claude Elwood Shannon (born 1916), whose famous 1938 M.I.T. master's thesis was entitled "A Symbolic Analysis of Relay and Switching Circuits." (Ten years later, Shannon's article "The Mathematical Theory of Communication" was the first publication that used the word bit to mean binary digit.) Prior to 1938, people knew that when you wired two switches in series, both switches had to be closed for current to flow, and when you wired two switches in parallel, one or the other had to be closed. But nobody had shown with Shannon's clarity and rigor that electrical engineers could use all the tools of Boolean algebra to design circuits with switches.

Claude Shannon (born 1916) was another influential thinker. In Chapter 11, I discussed his 1938 master's thesis, which established the relationship between switches, relays, and Boolean algebra. In 1948, while working for Bell Telephone Laboratories, he published a paper in the Bell System Technical Journal entitled "A Mathematical Theory of Communication" that not only introduced the word bit in print but established a field of study today known as information theory. Information theory is concerned with transmitting digital information in the presence of noise (which usually prevents all the information from getting through) and how to compensate for that.
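Compensating for noise, the problem Petzold names, can be illustrated with the simplest possible error-correcting scheme (a toy repetition code of my own choosing, far cruder than the codes Shannon's theorems actually promise): send every bit three times and decode by majority vote, so any single flipped bit per triple is corrected.

```python
def encode(bits):
    """Repetition code: transmit each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each triple recovers the original bit."""
    triples = [received[i:i + 3] for i in range(0, len(received), 3)]
    return [1 if sum(t) >= 2 else 0 for t in triples]

sent = encode([1, 0, 1])  # [1,1,1, 0,0,0, 1,1,1]
noisy = sent[:]
noisy[1] ^= 1             # noise flips one bit in transit
print(decode(noisy))      # [1, 0, 1]
```

The price is threefold redundancy for one-bit-per-triple protection; information theory's deeper result is that far more efficient trade-offs exist.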

pages: 180 words: 55,805

The Price of Tomorrow: Why Deflation Is the Key to an Abundant Future
by Jeff Booth
Published 14 Jan 2020

Available at digitallibrary.hsp.org/index.php/Detail/objects/9792. 45. Voltaire, Le Siècle de Louis XIV (1752). 46. Karl Popper, as quoted by Mark Damazer, “In Our Time’s Greatest Philosopher Vote,” In Our Time (BBC 4). 47. “The Babbage Engine,” Computer History Museum. computer history.org/babbage. 48. Claude E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal, 1948. 49. John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon, “A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence,” August 31, 1955. Available at www-formal.stanford.edu/jmc/history/dartmouth/dartmouth.html. 50.

pages: 550 words: 154,725

The Idea Factory: Bell Labs and the Great Age of American Innovation
by Jon Gertner
Published 15 Mar 2012

What he’d been working on at home during the early 1940s had become a long, elegant manuscript by 1947, and one day soon after the press conference in lower Manhattan unveiling the invention of the transistor, in July 1948, the first part of Shannon’s manuscript was published as a paper in the Bell System Technical Journal; a second installment appeared in the Journal that October.28 “A Mathematical Theory of Communication”—“the magna carta of the information age,” as Scientific American later called it—wasn’t about one particular thing, but rather about general rules and unifying ideas. “He was always searching for deep and fundamental relations,” Shannon’s colleague Brock McMillan explains. And here he had found them.

“Mathematical Theory of Communication, A” (Shannon), 127–32, 133; “Mathematical Theory of Cryptography, A” (Shannon), 124, 125, 131

pages: 573 words: 157,767

From Bacteria to Bach and Back: The Evolution of Minds
by Daniel C. Dennett
Published 7 Feb 2017

“Vervet Monkey Alarm Calls: Semantic Communication in a Free-Ranging Primate.” Animal Behaviour 28 (4): 1070–1094. Shanahan, Murray. 2010. Embodiment and the Inner Life. New York: Oxford University Press. Shannon, Claude Elwood. 1948. “A Mathematical Theory of Communication.” Bell System Technical Journal 27 (3): 379–423. Shannon, Claude Elwood, and Warren Weaver. 1949. The Mathematical Theory of Communication. Urbana: University of Illinois Press. Shepard, Roger N., and Jacqueline Metzler. 1971. “Mental Rotation of Three Dimensional Objects.” Science 171 (3972): 701–703. Shepard, Roger N., and Lynn A. Cooper. 1982. Mental Images and Their Transformations.

The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal
by M. Mitchell Waldrop
Published 14 Apr 2001

It's telling that Shannon felt no need to explain why; in 1939 Bush would have known that communication engineering was still mostly a matter of trial and error, lore, and rules of thumb. Shannon's tacit hope was that the field could be transformed from an art into a science, that a rigorous mathematical theory of communication would provide engineers with the tools to design their systems with assurance. And indeed, the approach he outlined in the remainder of his letter would prove remarkably prescient, comprising many of the ideas he would publish in 1948 and giving a hint that he had already thought about most of the rest.

Scientific American agreed wholeheartedly: "[Information theory encompasses] all of the procedures by which one mind may affect another," held an article written by the director of the Rockefeller Foundation's Natural Sciences Division, Warren Weaver. "[It] involves not only written and oral speech, but also music, the pictorial arts, the theatre, the ballet, and in fact all human behavior." Shannon's fellow mathematicians were enthralled. "A Mathematical Theory of Communication" had created a whole new domain of applied mathematics at a stroke, and suddenly there were a million questions to play with. What was the information capacity of a two-way channel? What did information theory have to say about cryptography? What were the most efficient encodings for English text as opposed to numerical data?

Academic bulletin boards soon began to display announcements for seminars, conferences, and courses on information theory. (Weaver would subsequently arrange to have a collection of Shannon's papers published as a book, with his own article serving as the introduction and with himself billed as full coauthor. Published in 1949, The Mathematical Theory of Communication by Shannon and Weaver exposed the theory to a much wider audience and became such a standard reference in the field that many people to this day refer to its subject matter (incorrectly) as "Shannon-Weaver" information theory.)

pages: 210 words: 62,771

Turing's Vision: The Birth of Computer Science
by Chris Bernhardt
Published 12 May 2016

Rejewski, when asked about this, is said to have replied that he couldn’t think of a better name. 8. Both Shannon and Turing were interested in using ideas from probability to extract information from data. Shannon would later extend some of his wartime work and write the groundbreaking paper “A Mathematical Theory of Communication,” one of the foundations of information theory. Turing wrote several articles on the application of probability to cryptography. These were classified and are only now being made available to the public (two papers were declassified in 2012 and are available at http://www.nationalarchives.gov.uk). 9.

pages: 625 words: 167,349

The Alignment Problem: Machine Learning and Human Values
by Brian Christian
Published 5 Oct 2020

One is to predict a missing word given its context, and the other is the reverse: to predict contextual words from a given word. These methods are referred to as “continuous bag-of-words” (CBOW) and “skip-gram,” respectively. For simplicity, we focus our discussion on the former, but both approaches have advantages, though they tend to result ultimately in fairly similar models. 55. Shannon, “A Mathematical Theory of Communication.” 56. See Jelinek and Mercer, “Interpolated Estimation of Markov Source Parameters from Sparse Data,” and Katz, “Estimation of Probabilities from Sparse Data for the Language Model Component of a Speech Recognizer”; for an overview, see Manning and Schütze, Foundations of Statistical Natural Language Processing. 57.

“‘Along an Imperfectly-Lighted Path’: Practical Rationality and Normative Uncertainty.” PhD thesis, Rutgers University, 2010. Shafto, Patrick, Noah D. Goodman, and Thomas L. Griffiths. “A Rational Account of Pedagogical Reasoning: Teaching by, and Learning from, Examples.” Cognitive Psychology 71 (2014): 55–89. Shannon, Claude E. “A Mathematical Theory of Communication.” Bell System Technical Journal 27, no. 3 (July 1948): 379–423. Shaw, J. Cliff, Allen Newell, Herbert A. Simon, and T. O. Ellis. “A Command Structure for Complex Information Processing.” In Proceedings of the May 6–8, 1958, Western Joint Computer Conference: Contrasts in Computers, 119–28.

pages: 222 words: 74,587

Paper Machines: About Cards & Catalogs, 1548-1929
by Markus Krajewski and Peter Krapp
Published 18 Aug 2011

The Evolution of Bibliographic Systems in the United States, 1876–1945. Library Trends 25 (1):293–309. Seeck, Otto. 1924. Laterculum. In Paulys Realencyclopädie der classischen Altertumswissenschaft, ed. Georg Wissowa. Vol. 12, col. 904–907. Stuttgart: Metzler. Shannon, Claude E. [1948] 1993. A Mathematical Theory of Communication. In Claude Elwood Shannon: Collected Papers, ed. N. J. A. Sloane and Aaron D. Wyner, 5–83. New York: IEEE Press. Sherman, E. W. 1916. History and Growth of Indexing Department. L.B. Monthly News 24:42–44. Supplement: 40th Anniversary Number 1876–1916. Sibley, John Langdon. 1863. Librarian’s Report, 12 July, 1861.

pages: 250 words: 73,574

Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers
by John MacCormick and Chris Bishop
Published 27 Dec 2011

Hamming we have met already: it was his annoyance at the weekend crashes of a company computer that led directly to his invention of the first error-correcting codes, now known as Hamming codes. However, error-correcting codes are just one part of a larger discipline called information theory, and most computer scientists trace the birth of the field of information theory to a 1948 paper by Claude Shannon. This extraordinary paper, entitled “A Mathematical Theory of Communication,” is described in one biography of Shannon as “the Magna Carta of the information age.” Irving Reed (co-inventor of the Reed-Solomon codes mentioned below) said of the same paper: “Few other works of this century have had greater impact on science and engineering. By this landmark paper…he has altered most profoundly all aspects of communication theory and practice.”

pages: 294 words: 82,438

Simple Rules: How to Thrive in a Complex World
by Donald Sull and Kathleen M. Eisenhardt
Published 20 Apr 2015

He wrote a seminal: Warren Weaver, “Translation” (unpublished memorandum, Rockefeller Foundation, July 15, 1949), available at Machine Translation Archive, http://www.mt-archive.info/Weaver-1949.pdf; and Matt Novak, “The Cold War Origins of Google Translate,” BBC Online, May 30, 2012, http://www.bbc.com/future/story/20120529-a-cold-war-google-translate. Weaver also coauthored, with Claude E. Shannon, The Mathematical Theory of Communication (Champaign: University of Illinois Press, 1949), which laid out the principles required to build modern telecommunications networks, including the Internet. When India and: Justin Gillis, “Norman Borlaug, Plant Scientist Who Fought Famine, Dies at 95,” New York Times, September 13, 2009. In his 1948 article: Weaver, “Science and Complexity,” 536–44.

pages: 791 words: 85,159

Social Life of Information
by John Seely Brown and Paul Duguid
Published 2 Feb 2000

Regional Advantage: Culture and Competition in Silicon Valley and Route 128. Cambridge: Harvard University Press. Schwartz, Evan I. 1998. "Shopbot Pandemonium." Wired [Online] 6.12 (December). Available: http://www.wired.com/wired/archive/6.12/mustread.html [1999, July 21]. Shannon, Claude E., and Warren Weaver. 1964. The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press. Shao, Maria. 1995. "Beyond Reengineering." Boston Globe, 12 November, sec. A, p. 21. Shapiro, Carl, and Hal R. Varian. 1999. Information Rules: A Strategic Guide to the Network Economy. Boston: Harvard Business School Press. Sherman, William H. 1995.

The Fractalist
by Benoit Mandelbrot
Published 30 Oct 2012

I never seriously thought of moving over, but I felt energized and kept looking for analogous openings closer to my strengths. The timing was ideal because several new developments that had been “bottled up” by war conditions were being revealed in a kind of fireworks I saw on no other occasion. My restless curiosity led me to read works that were widely discussed when they appeared: Mathematical Theory of Communication by Claude Shannon, Cybernetics, or Control and Communication in the Animal and the Machine by Norbert Wiener, and Theory of Games and Economic Behavior by John von Neumann and Oskar Morgenstern. Except for a fleeting thought that I might return to mathematics in 1949 via the University of Chicago, I was beginning to think that the examples of Wiener and von Neumann might guide me to an idea big enough to make me, in some way, the Delbrück of a new field.

The End of Accounting and the Path Forward for Investors and Managers (Wiley Finance)
by Feng Gu
Published 26 Jun 2016

For a given information source, we include only events that do not coincide with the events of other information sources (e.g., we exclude managers’ forecasts that occur on the same day as quarterly earnings are announced). Sample firms included in Figure 4.1 are all US-listed companies with the required data, obtained from Compustat, CRSP, I/B/E/S First Call, and the S&P SEC Filings Database. NOTES 1. Claude Shannon and Warren Weaver, The Mathematical Theory of Communication (Champaign–Urbana: University of Illinois Press, 1949). 2. Mathematically, the amount of information conveyed by a message is measured in communication theory by the logarithm of the ratio of the posterior (after the message was received) to the prior (before the message was received) probabilities of the event (e.g., rain at 3:00 pm tomorrow) occurring.
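The log-ratio measure described in note 2 above can be sketched in a few lines, using the posterior-over-prior convention standard in communication theory. This is an illustrative sketch, not code from the book; the function name and probabilities are hypothetical:

```python
import math

def message_information(prior: float, posterior: float) -> float:
    """Information (in bits) a message conveys about an event, measured as
    the log of the ratio of posterior to prior probability of that event."""
    if not (0.0 < prior <= 1.0 and 0.0 < posterior <= 1.0):
        raise ValueError("probabilities must lie in (0, 1]")
    return math.log2(posterior / prior)

# Hypothetical example: a forecast raises the probability of rain
# at 3:00 pm tomorrow from 0.25 to certainty (1.0).
bits = message_information(prior=0.25, posterior=1.0)
print(bits)  # 2.0 bits

# A message that leaves the probability unchanged conveys nothing.
print(message_information(prior=0.5, posterior=0.5))  # 0.0 bits
```

A message that merely confirms what was already near-certain scores close to zero, which matches the book's point that routine announcements carry little information.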

Noam Chomsky: A Life of Dissent
by Robert F. Barsky
Published 2 Feb 1997

The stipend that accompanied his position meant that he no longer had to support himself with nonacademic jobs. In the early 1950s, debate was raging over the breakthroughs that new technology was promising in the understanding of human behavior. Computers, electronics, acoustics, mathematical theories of communication and cybernetics were all in vogue, and researchers were busy exploiting them. Chomsky, a graduate student in his early twenties, was uneasy with this activity: "Some people, myself included, were rather concerned about these developments, in part for political reasons, at least as far as my motivations were concerned because this whole complex of ideas seemed linked to potentially quite dangerous political currents: manipulative, and connected with behaviorist concepts of human nature" (Language and Politics 44).

pages: 362 words: 97,862

Physics in Mind: A Quantum View of the Brain
by Werner Loewenstein
Published 29 Jan 2013

In From Complexity to Life, edited by Gregersen, N. H. New York: Oxford University Press. Chaitin, G. J. 1987. Algorithmic Information Theory. Cambridge: Cambridge University Press. Chothia, C. 1992. One thousand families for the molecular biologists. Nature 357:543–544. Shannon, C. E., and Weaver, W. 1959. The Mathematical Theory of Communication. Urbana: University of Illinois Press. Wilson, A. C., Ochman, H., and Prager, E. M. 1987. Molecular time scale for evolution. Trends in Genetics 3:241–247. Zurek, W. H. 1989. Algorithmic randomness and physical entropy. Physical Review A 40:4731–4751. A Quantum Random Generator of Molecular Form Haseltine, N. 1983.

pages: 313 words: 101,403

My Life as a Quant: Reflections on Physics and Finance
by Emanuel Derman
Published 1 Jan 2004

In physics and engineering the Labs was an experimental and theoretical powerhouse, producing research in electronics and information theory that made possible many of the subsequent advances in communications. Bardeen, Brattain, and Shockley had invented the transistor there in 1947, and Claude Shannon published his landmark paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in 1948. There were fundamental discoveries made, too: Penzias and Wilson won the Nobel Prize for discovering the cosmic radiation left behind by the Big Bang, as predicted by Robert Herman. Even during the period I worked there, Horst Stormer, now at Columbia University, did the research on the quantum Hall effect that recently won him a share of the Nobel Prize.

pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity
by Amy Webb
Published 5 Mar 2019

“The Lovelace 2.0 Test of Artificial Creativity and Intelligence.” https://arxiv.org/pdf/1410.6142.pdf. Schneier, B. “The Internet of Things Is Wildly Insecure—and Often Unpatchable.” Wired, January 6, 2014. https://www.wired.com/2014/01/theres-no-good-way-to-patch-the-Internet-of-things-and-thats-a-huge-problem/. Shannon, C., and W. Weaver. The Mathematical Theory of Communication. Urbana: University of Illinois Press, 1963. Singer, P. Wired for War: The Robotics Revolution and Conflict in the 21st Century. London: Penguin Press, 2009. Stanford University. “One Hundred Year Study on Artificial Intelligence (AI100).” https://ai100.stanford.edu/. Toffler, A.

pages: 571 words: 105,054

Advances in Financial Machine Learning
by Marcos Lopez de Prado
Published 2 Feb 2018

Meucci, A. (2009): “Managing diversification.” Risk Magazine, Vol. 22, pp. 74–79. Norwich, K. (2003): Information, Sensation and Perception, 1st ed. Academic Press. Ornstein, D.S. and B. Weiss (1993): “Entropy and data compression schemes.” IEEE Transactions on Information Theory, Vol. 39, pp. 78–83. Shannon, C. (1948): “A mathematical theory of communication.” Bell System Technical Journal, Vol. 27, No. 3, pp. 379–423. Ziv, J. and A. Lempel (1978): “Compression of individual sequences via variable-rate coding.” IEEE Transactions on Information Theory, Vol. 24, No. 5, pp. 530–536. Bibliography Easley, D., R. Engle, M. O'Hara, and L.

pages: 268 words: 109,447

The Cultural Logic of Computation
by David Golumbia
Published 31 Mar 2009

Minds, Brains and Science. Cambridge, MA: Harvard University Press. ———. 1992. The Rediscovery of the Mind. Cambridge, MA: The MIT Press. Shannon, Claude. 1951. “Prediction and Entropy of Printed English.” Bell Systems Technical Journal 30, 50–64. Shannon, Claude, and Warren Weaver. 1949. The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press. Sharp, Duane E. 2003. Customer Relationship Management Systems Handbook. Boca Raton, FL: Auerbach/CRC Press. Shields, Rob. 2003. The Virtual. New York: Routledge. Shirky, Clay. 2008. Here Comes Everybody: The Power of Organizing Without Organizations.

pages: 460 words: 107,712

A Devil's Chaplain: Selected Writings
by Richard Dawkins
Published 1 Jan 2004

You could spend a lifetime reading in this ancient library and die unsated by the wonder of it. 1 See ‘Unfinished Correspondence with a Darwinian Heavyweight’ (pp. 256–62). 2 The producers never deigned to send me a copy: I completely forgot about it until an American colleague called it to my attention. 3 See Barry Williams, ‘Creationist deception exposed’, the Skeptic 18 (1998), 3, pp. 7–10, for an account of how my long pause (trying to decide whether to throw them out) was made to look like hesitant inability to answer the question, followed by an apparently evasive answer to a completely different question. 4 It is important not to blame Shannon for my verbal and intuitive way of expressing what I think of as the essence of his idea. Mathematical readers should go straight to the original, C. Shannon and W. Weaver, The Mathematical Theory of Communication (University of Illinois Press, 1949). Claude Shannon, by the way, had an imaginative sense of humour. He once built a box with a single switch on the outside. If you threw the switch, the lid of the box slowly opened, a mechanical hand appeared, reached down and switched off the box.

pages: 332 words: 109,213

The Scientist as Rebel
by Freeman Dyson
Published 1 Jan 2006

And for the last ten years of his life, as he traveled from country to country preaching the gospel of cybernetics, he used analog language almost exclusively. In spite of his original intentions, cybernetics became a theory of analog processes. Meanwhile, also in 1948, Claude Shannon published his classic pair of papers with the title “A Mathematical Theory of Communication,” in The Bell System Technical Journal. Shannon’s theory was a theory of digital communication, using many of Wiener’s ideas but applying them in a new direction. Shannon’s theory was mathematically elegant, clear, and easy to apply to practical problems of communication. It was far more user-friendly than cybernetics.

pages: 338 words: 106,936

The Physics of Wall Street: A Brief History of Predicting the Unpredictable
by James Owen Weatherall
Published 2 Jan 2013

“A History of the Efficient Market Hypothesis.” University College London Department of Computer Science Research Note. Available at http://www-typo3.cs.ucl.ac.uk/fileadmin/UCL-CS/images/Research_Student_Information/RN_11_04.pdf. Shannon, Claude Elwood, and Warren Weaver. 1949. A Mathematical Theory of Communication. Champaign: University of Illinois Press. Sharpe, William. 1964. “Capital Asset Prices: A Theory of Market Equilibrium Under Conditions of Risk.” Journal of Finance 19 (3): 425–42. Sheehan, Frederick J. 2010. Panderer to Power: The Untold Story of How Alan Greenspan Enriched Wall Street and Left a Legacy of Recession.

The Deep Learning Revolution (The MIT Press)
by Terrence J. Sejnowski
Published 27 Sep 2018

The latest growth spurt has been fueled by the widespread availability of big data, and the story of NIPS has been one of preparing for this day to come. III Technological and Scientific Impact Timeline 1971—Noam Chomsky publishes “The Case against B. F. Skinner” in the New York Review of Books, an essay that steered a generation of cognitive scientists away from learning. 1982—Claude Shannon publishes the seminal book A Mathematical Theory of Communication, which laid the foundation for modern digital communication. 1989—Carver Mead publishes Analog VLSI and Neural Systems, founding the field of neuromorphic engineering, which builds computer chips inspired by biology. 2002—Stephen Wolfram publishes A New Kind of Science, which explored the computational capabilities of cellular automata, algorithms that are even simpler than neural networks but still capable of powerful computing. 2005—Sebastian Thrun’s team wins the DARPA Grand Challenge for an autonomous vehicle. 2008—Tobias Delbrück develops a highly successful spiking retina chip called the “Dynamic Vision Sensor” (DVS) that uses asynchronous spikes rather than synchronous frames used in current digital cameras. 2013—U.S.

pages: 298 words: 43,745

Understanding Sponsored Search: Core Elements of Keyword Advertising
by Jim Jansen
Published 25 Jul 2011

Information Technology and Management, vol. 9(3), pp. 201–214. [20] Jansen, B. J. and Spink, A. 2005. “How Are We Searching the World Wide Web? A Comparison of Nine Search Engine Transaction Logs.” Information Processing & Management, vol. 42(1), pp. 248–263. [21] Shannon, C. E. 1948. “A Mathematical Theory of Communication.” Bell System Technical Journal, vol. 27(July/October), pp. 379–423, 623–656. [22] Miller, G. A. 1956. “The Magical Number Seven Plus or Minus Two: Some Limits on Our Capacity for Processing Information.” Psychological Review, vol. 63(1), pp. 81–97. [23] Zipf, G. K. 1949. Human Behavior and the Principle of Least Effort.

pages: 494 words: 142,285

The Future of Ideas: The Fate of the Commons in a Connected World
by Lawrence Lessig
Published 14 Jul 2001

(“Profit maximization will force competitive band managers to devise better technical means to increase wireless communications traffic.”) 18 The source of the skepticism is traced, as Yochai Benkler describes, to Claude E. Shannon, “Communication in the Presence of Noise,” Proceedings of the IRE 37 (1949): 10, and Claude E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal 27 (1948): 379 and 623 (two-part publication). “These articles lay out the theoretical underpinnings of direct sequencing spread spectrum.” Benkler, Overcoming Agoraphobia, 323, note 171. 19 In order to be called “spread spectrum,” two conditions must be met: (1) the transmitted signal bandwidth is greater than the minimal information bandwidth needed to successfully transmit the signal; and (2) some function other than the information itself is being employed to determine the resultant transmitted bandwidth.

pages: 468 words: 137,055

Crypto: How the Code Rebels Beat the Government Saving Privacy in the Digital Age
by Steven Levy
Published 15 Jan 2002

After receiving an MIT doctorate in 1940, he had worked for Bell Telephone Laboratories during the war, specializing in secrecy systems. The work was classified, of course, but in the late part of the decade the two key papers in Shannon’s wartime work found their way into the public domain. In 1948, Shannon’s seminal article on information, “A Mathematical Theory of Communication,” ran in the Bell System Technical Journal, and subtly set the stage for the digital epoch. A year later, “Communication Theory of Secrecy Systems” appeared in the same journal. Both efforts were highly technical; those without advanced math degrees could barely venture a few paragraphs without being snared in a thicket of thorny equations and formulas.

pages: 592 words: 152,445

The Woman Who Smashed Codes: A True Story of Love, Spies, and the Unlikely Heroine Who Outwitted America's Enemies
by Jason Fagone
Published 25 Sep 2017

secret NSA projects Transcript of Solomon Kullback oral history interview with NSA, August 26, 1982. Kullback discusses the NSA’s interest in Shannon’s research and says, “We had very close contacts with the Bell Laboratories. They were very, let’s say, willing to work along with us.” communicating through a noisy system C. E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal 27, no. 3 (July 1948), http://ieeexplore.ieee.org/document/6773024/. CHAPTER 5: THE ESCAPE PLOT 93 “To be your North Star” ESF to WFF, February 7, 1917, box 2, folder 1, ESF Collection. 94 “I miss you infinitely” Ibid. “I shall work for you” Ibid.

pages: 660 words: 141,595

Data Science for Business: What You Need to Know About Data Mining and Data-Analytic Thinking
by Foster Provost and Tom Fawcett
Published 30 Jun 2013

Sengupta, S. (2012). Facebook’s prospects may rest on trove of data. Shakhnarovich, G., Darrell, T., & Indyk, P.(Eds., 2005). Nearest-Neighbor Methods in Learning and Vision. Neural Information Processing Series. The MIT Press, Cambridge, Massachusetts, USA. Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423. Shearer, C. (2000). The CRISP-DM model: The new blueprint for data mining. Journal of Data Warehousing, 5(4), 13–22. Shmueli, G. (2010). To explain or to predict?. Statistical Science, 25(3), 289–310. Silver, N. (2012). The Signal and the Noise.

pages: 547 words: 173,909

Deep Utopia: Life and Meaning in a Solved World
by Nick Bostrom
Published 26 Mar 2024

The World as Will and Representation (E. F. J. Payne, Trans.; Vol. 1). New York: Dover Publications. Shakespeare, W. [1604] 1898. Measure for Measure. London: Bliss & Sands. Shakespeare, W. [1607] 1918. The Tragedy of Macbeth (C. M. Lewis, Ed.). New Haven: Yale University Press. Shannon, C. E. 1948. “A Mathematical Theory of Communication”. The Bell System Technical Journal, 27(3), 379–423. Shulman, C., & Bostrom, N. 2021. “Sharing the World with Digital Minds”. In S. Clarke, H. Zohny, & J. Savulescu (Eds.), Rethinking Moral Status (pp. 306–326). Oxford: Oxford University Press. Skidelsky, E., & Skidelsky, R. 2012.

pages: 634 words: 185,116

From eternity to here: the quest for the ultimate theory of time
by Sean M. Carroll
Published 15 Jan 2010

Decoding the Universe: How the New Science of Information Is Explaining Everything in the Cosmos, from Our Brains to Black Holes. New York: Viking, 2006. Sethna, J. P. Statistical Mechanics: Entropy, Order Parameters, and Complexity. Oxford: Oxford University Press, 2006. Shalizi, C. R. Notebooks (2009). http://www.cscs.umich.edu/~crshalizi/notebooks/. Shannon, C. E. “A Mathematical Theory of Communication.” Bell System Technical Journal 27 (1948): 379–423 and 623–56. Singh, S. Big Bang: The Origin of the Universe. New York: Fourth Estate, 2004. Sklar, L. Physics and Chance: Philosophical Issues in the Foundations of Statistical Mechanics. Cambridge: Cambridge University Press, 1993.

pages: 685 words: 203,949

The Organized Mind: Thinking Straight in the Age of Information Overload
by Daniel J. Levitin
Published 18 Aug 2014

Journal of Consumer Research, 27(2), 233–248. information you don’t care about and can’t use Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. Cambridge, UK: Cambridge University Press. developed information theory in the 1940s Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–423, 623–656. See also, Cover, T. M., & Thomas, J. A. (2006). Elements of information theory (2nd ed.). New York, NY: Wiley-Interscience. and, Hartley, R. V. L. (1928). Transmission of information. The Bell System Technical Journal, 7(3), 535–563.

The Chomsky Reader
by Noam Chomsky
Published 11 Sep 1987

For example, Skinner believes that “information theory” ran into a “problem when an inner ‘processor’ had to be invented to convert input into output” (p. 18). This is a strange way of describing the matter; “information theory” ran into no such “problem.” Rather, the consideration of “inner processors” in the mathematical theory of communication or its applications to psychology followed normal scientific and engineering practice. Suppose that an investigator is presented with a device whose functioning he does not understand, and suppose that through experiment he can obtain information about input-output relations for this device.

Energy and Civilization: A History
by Vaclav Smil
Published 11 May 2017

Frothingham. Cambridge: Riverside Press. Sexton, A. H. 1897. Fuel and Refractory Materials. London: Vlackie and Son. Sharma, R. 2012. Wheat Cultivation Practices: With Special Reference to Nitrogen and Weed Management. Saarbrücken: LAP Lambert Academic Publishing. Shannon, C. E. 1948. A mathematical theory of communication. Bell System Technical Journal 27:379–423, 623–656. Sheehan, G. W. 1985. Whaling as an organizing focus in Northwestern Eskimo society. In Prehistoric Hunter-Gatherers, ed. T. D. Price and J. A. Brown, 123–154. Orlando, FL: Academic Press. Sheldon, C. D. 1958. The Rise of the Merchant Class in Tokugawa Japan, 1600–1868: An Introductory Survey.

pages: 781 words: 226,928

Commodore: A Company on the Edge
by Brian Bagnall
Published 13 Sep 2005

Over the years, he filled his beachside house with juggling robots, maze-solving robot mice, chess-playing programs, mind-reading machines, and an electric chair to transport his children down to the lake. In 1948, while working at Bell Labs, Shannon produced a groundbreaking paper, A Mathematical Theory of Communication. In it, he rigorously analyzed the concept of Information Theory and how pictures, words, sounds and other media are transmitted using a stream of ones and zeros. He even coined the word “bit.” Peddle was enchanted with his theories. “Today, you take this for granted, but you have to remember that someone had to dream all this up,” he says.
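The idea the passage describes, that pictures, words, and sounds can all be carried as streams of ones and zeros, rests on Shannon's measure of information in bits. A minimal sketch of that measure (the entropy of a symbol stream), with toy inputs invented here for illustration:

```python
import math
from collections import Counter

def entropy_bits(text: str) -> float:
    """Shannon entropy of a symbol stream, in bits per symbol:
    H = -sum(p * log2(p)) over the observed symbol frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A stream of equally likely 0s and 1s carries one bit per symbol...
print(entropy_bits("0101010101"))  # 1.0
# ...while a highly repetitive stream carries far less per symbol.
print(entropy_bits("0000000001"))  # ~0.47
```

The entropy is what sets the floor on how few bits per symbol any encoding of the stream can use, which is why the measure mattered for transmitting media efficiently.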

pages: 761 words: 231,902

The Singularity Is Near: When Humans Transcend Biology
by Ray Kurzweil
Published 14 Jul 2005

Compressing files is a key aspect of both data transmission (such as a music or text file over the Internet) and data storage. The smaller the file is, the less time it will take to transmit and the less space it will require. The mathematician Claude Shannon, often called the father of information theory, defined the basic theory of data compression in his paper "A Mathematical Theory of Communication," The Bell System Technical Journal 27 (July–October 1948): 379–423, 623–56. Data compression is possible because of factors such as redundancy (repetition) and probability of appearance of character combinations in data. For example, silence in an audio file could be replaced by a value that indicates the duration of the silence, and letter combinations in a text file could be replaced with coded identifiers in the compressed file.
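Kurzweil's silence example corresponds to run-length encoding, the simplest way of exploiting redundancy: each run of a repeated value is replaced by the value and its duration. A toy sketch, assuming a hypothetical symbol 'S' standing in for a silent audio sample (this scheme is illustrative, not the one used by any particular audio format):

```python
def rle_encode(data: str) -> list[tuple[str, int]]:
    """Replace each run of a repeated symbol with a (symbol, run length) pair."""
    runs: list[tuple[str, int]] = []
    for ch in data:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((ch, 1))  # start a new run
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, run length) pairs back into the original stream."""
    return "".join(ch * n for ch, n in runs)

# A long stretch of silence compresses to a single (value, duration) pair.
signal = "ab" + "S" * 1000 + "cd"
encoded = rle_encode(signal)
print(len(encoded))  # 5 runs instead of 1004 samples
assert rle_decode(encoded) == signal  # lossless: decoding restores the input
```

The same principle, replacing frequent patterns with shorter codes, underlies the letter-combination substitution Kurzweil mentions for text files.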

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems
by Martin Kleppmann
Published 17 Apr 2017

Lockwood: “Hadoop’s Uncomfortable Fit in HPC,” glennklockwood.blogspot.co.uk, May 16, 2014. [11] John von Neumann: “Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Components,” in Automata Studies (AM-34), edited by Claude E. Shannon and John McCarthy, Princeton University Press, 1956. ISBN: 978-0-691-07916-5 [12] Richard W. Hamming: The Art of Doing Science and Engineering. Taylor & Francis, 1997. ISBN: 978-9-056-99500-3 [13] Claude E. Shannon: “A Mathematical Theory of Communication,” The Bell System Technical Journal, volume 27, number 3, pages 379–423 and 623–656, July 1948. [14] Peter Bailis and Kyle Kingsbury: “The Network Is Reliable,” ACM Queue, volume 12, number 7, pages 48–55, July 2014. doi:10.1145/2639988.2639988 [15] Joshua B. Leners, Trinabh Gupta, Marcos K.

pages: 1,237 words: 227,370

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems
by Martin Kleppmann
Published 16 Mar 2017

[11] John von Neumann: “Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Components,” in Automata Studies (AM-34), edited by Claude E. Shannon and John McCarthy, Princeton University Press, 1956. ISBN: 978-0-691-07916-5 [12] Richard W. Hamming: The Art of Doing Science and Engineering. Taylor & Francis, 1997. ISBN: 978-9-056-99500-3 [13] Claude E. Shannon: “A Mathematical Theory of Communication,” The Bell System Technical Journal, volume 27, number 3, pages 379–423 and 623–656, July 1948. [14] Peter Bailis and Kyle Kingsbury: “The Network Is Reliable,” ACM Queue, volume 12, number 7, pages 48-55, July 2014. doi:10.1145/2639988.2639988 [15] Joshua B. Leners, Trinabh Gupta, Marcos K.

pages: 851 words: 247,711

The Atlantic and Its Enemies: A History of the Cold War
by Norman Stone
Published 15 Feb 2010

The Dupont Company, the first of America’s great ones, already employed a hundred technicians in gleaming new buildings, and the example was followed by General Electric, where a mathematical genius tinkered away at Schenectady in New York State, and helped produce the cathode tube and the high-frequency alternator which made commercial broadcasting possible. General Electric then set up the National Broadcasting Corporation. Bell Laboratories was the research side of AT&T and it produced sixteen Nobel Prizes over the half-century: even the theory of information technology came from there, with a paper in 1948 called ‘A Mathematical Theory of Communication’, by Claude Elwood Shannon (like other mathematicians a considerable eccentric, who rode along the corridors on a specially made bicycle which enabled him along the way to juggle balls). As Kenneth and William Hopper say, these vastly successful companies ‘achieved a delicate . . . balance between . . . shareholders, employees, managers, suppliers, customers - and researchers’.

Data Mining: Concepts and Techniques: Concepts and Techniques
by Jiawei Han , Micheline Kamber and Jian Pei
Published 21 Jun 2011

Royal Statistical Society 36 (1974) 111–147. [SVA97] Srikant, R.; Vu, Q.; Agrawal, R., Mining association rules with item constraints, In: Proc. 1997 Int. Conf. Knowledge Discovery and Data Mining (KDD’97) Newport Beach, CA. (Aug. 1997), pp. 67–73. [SW49] Shannon, C.E.; Weaver, W., The Mathematical Theory of Communication. (1949) University of Illinois Press . [Swe88] Swets, J., Measuring the accuracy of diagnostic systems, Science 240 (1988) 1285–1293. [Swi98] Swiniarski, R., Rough sets and principal component analysis and their applications in feature extraction and selection, data model building and classification, In: (Editors: Pal, S.K.; Skowron, A.)

Applied Cryptography: Protocols, Algorithms, and Source Code in C
by Bruce Schneier
Published 10 Nov 1993

Vuillemin, “Hardware Speedups in Long Integer Multiplication,” Proceedings of the 2nd Annual ACM Symposium on Parallel Algorithms and Architectures, 1990, pp. 138–145. 1430. D. Shanks, Solved and Unsolved Problems in Number Theory, Washington D.C.: Spartan, 1962. 1431. C.E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal, v. 27, n. 4, 1948, pp. 379–423, 623–656. 1432. C.E. Shannon, “Communication Theory of Secrecy Systems,” Bell System Technical Journal, v. 28, n. 4, 1949, pp. 656–715. 1433. C.E. Shannon, Collected Papers: Claude Elwood Shannon, N.J.A. Sloane and A.D.

pages: 2,466 words: 668,761

Artificial Intelligence: A Modern Approach
by Stuart Russell and Peter Norvig
Published 14 Jul 2019

Shani, G., Pineau, J., and Kaplow, R. (2013). A survey of point-based POMDP solvers. Autonomous Agents and Multi-Agent Systems, 27, 1–51. Shankar, N. (1986). Proof-Checking Metamathematics. Ph.D. thesis, Computer Science Department, University of Texas at Austin. Shannon, C. E. and Weaver, W. (1949). The Mathematical Theory of Communication. University of Illinois Press. Shannon, C. E. (1950). Programming a computer for playing chess. Philosophical Magazine, 41, 256–275. Shapley, S. (1953b). Stochastic games. PNAS, 39, 1095–1100. Sharan, R. V. and Moir, T. J. (2016). An overview of applications and advancements in automatic sound recognition.