Entropy House

4 results

pages: 415 words: 114,840

A Mind at Play: How Claude Shannon Invented the Information Age
by Jimmy Soni and Rob Goodman
Published 17 Jul 2017

Nearly every story about him, from 1957 on, situated him at the house on the lake—usually in the two-story addition that the Shannons built as an all-purpose room for gadget storage and display, a space media profiles often dubbed the “toy room,” but which his daughter Peggy and her two older brothers simply called “Dad’s room.” The Shannons gave their home a name: Entropy House. Claude’s status as a mathematical luminary would make it a pilgrimage site for students and colleagues, especially as his on-campus responsibilities dwindled toward nothing.

* * *

Even at MIT, Shannon bent his work around his hobbies and enthusiasms. “Although he continued to supervise students, he was not really a co-worker, in the normal sense of the term, as he always seemed to maintain a degree of distance from his fellow associates,” wrote one fellow faculty member.

Of her many memories of her father, that would prove a distinctive one for Shannon’s daughter, Peggy: “He did a lot of work at home so he would only go into the office to teach and to meet with graduate students, but if he didn’t have to be there, he didn’t spend much time at MIT. So my sense growing up was that he was around a lot. It was different from a lot of working people.” Entropy House became his office; students dropped by, seeking feedback on projects but just as often looking to see what the Sage of Winchester had cooked up at his in-home laboratory. Even more conventional professors and old Bell Labs hands would make the trek to Winchester, and Shannon would walk them from room to room, all the while showing off his collection of contraptions and oddities.

“Oddly enough, I don’t think he even realized what it turned into. . . . He would have been absolutely astounded,” Betty said. And he would surely have been pleased by the 1993 announcement of codes whose performance very nearly approached, but did not break, the Shannon Limit, had the news found any purchase on him. From 1983 to 1993, Shannon continued to live at Entropy House and carry on as well as he could. Perhaps it says something about the depth of his character that, even in the last stages of his decline, much of his natural personality remained intact. “The sides of his personality that seemed to get stronger were the sweet, boyish, playful sides. . . . We were lucky,” Peggy noted.
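For context on the “Shannon Limit” mentioned above (a standard formula, not from the excerpt): Shannon’s 1948 coding theorem fixes the maximum rate at which information can cross a noisy channel with arbitrarily small error, and the 1993 announcement was presumably the turbo codes of Berrou, Glavieux, and Thitimajshima, the first practical codes to come within a fraction of a decibel of that limit. For a band-limited channel with Gaussian noise, the limit takes the Shannon–Hartley form:

```latex
% Shannon–Hartley capacity of a band-limited Gaussian channel:
% C = achievable rate in bits per second, B = bandwidth in Hz,
% S/N = signal-to-noise power ratio.
C = B \log_2\!\left(1 + \frac{S}{N}\right)
```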

pages: 389 words: 109,207

Fortune's Formula: The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street
by William Poundstone
Published 18 Sep 2006

The equation for entropy in physics takes the same form as the equation for information in Shannon’s theory. (Both are logarithms of a probability measure.) Shannon accepted von Neumann’s suggestion. He used both the word “entropy” and its usual algebraic symbol, H. Shannon later christened his Massachusetts home “Entropy House”—a name whose appropriateness was apparent to all who set eyes on its interior. “I didn’t like the term ‘information theory,’” Robert Fano said. “Claude didn’t like it either.” But the familiar word “information” proved too appealing. It is this term that has stuck, both for Shannon’s theory and for its measure of message content.
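To make the formal parallel explicit (the standard textbook forms, which the passage describes only in words):

```latex
% Gibbs entropy in statistical mechanics, over microstate probabilities p_i:
S = -k_B \sum_i p_i \ln p_i
% Shannon's information measure, in bits, over message probabilities p_i:
H = -\sum_i p_i \log_2 p_i
```

Both are expectations of the logarithm of a probability; they differ only in units, Boltzmann’s constant and the natural log versus the base-2 log.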

Orange juice analogy: I’ve loosely adapted a statement in Kelly and Selfridge 1962: “It is impossible (practically) to make good synthetic orange juice.”
“an important influence on my life”: “A Conversation with Claude Shannon,” transcript of interview with Robert Price, Dec. 20, 1983, Shannon’s papers, LOC.
“Entropy House”: Rogers n.d.
“I didn’t like the term”: Aftab, Cheung, Kim, et al. 2001.
“To make the chance of error”: Waldrop 2001.
Use more bandwidth, more power: Aftab, Cheung, Kim, et al. 2001, 15.
“No Shannon, no Napster”: Waldrop 2001.
“proudest and rarest creations”: Quoted in Liversidge 1987.
“This, of course, involves not only”: Shannon 1949.

pages: 332 words: 93,672

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy
by George Gilder
Published 16 Jul 2018

How a software programmer can miss the essence of his own trade is a mystery, but Chesterton understood the myopia of the expert:

The . . . argument of the expert, that the man who is trained should be the man who is trusted, would be absolutely unanswerable if it were really true that the man who studied a thing and practiced it every day went on seeing more and more of its significance. But he does not. He goes on seeing less and less of its significance.

The materialist superstition is a strange growth in an age of information. Writing from his home, which he named “Entropy House,” Shannon showed that information itself is measured by unexpected bits—by its surprisal. This is a form of disorder echoing the disorder of thermodynamic entropy. Information is surprise. A determinist machine is by definition free from surprise. The answers are always implicit in the questions.
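To put “surprisal” in concrete terms (a minimal sketch; the helper names are illustrative, not from the book): an outcome of probability p carries -log2(p) bits of information, and Shannon entropy is the surprisal one expects on average.

```python
import math

def surprisal(p: float) -> float:
    """Information, in bits, carried by an outcome of probability p."""
    return -math.log2(p)

def entropy(dist: list[float]) -> float:
    """Shannon entropy: the expected surprisal of a distribution."""
    return sum(p * surprisal(p) for p in dist if p > 0)

# Rarer outcomes carry more bits of surprise.
print(surprisal(0.5))       # 1.0
print(surprisal(1 / 1024))  # 10.0

# A fair coin is maximally surprising; a biased coin is more predictable.
print(entropy([0.5, 0.5]))    # 1.0 bit per flip
print(entropy([0.99, 0.01]))  # ~0.081 bits per flip
```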

pages: 374 words: 114,600

The Quants
by Scott Patterson
Published 2 Feb 2010

He gestured for Thorp to sit down again. “Continue.” Several hours later, Thorp stepped out of Shannon’s office into the darkening November night. Thorp started paying regular visits to Shannon’s home later that November as the two scientists set to work on the roulette problem. Shannon called his home “Entropy House,” a nod to a core concept in information theory, borrowed from the second law of thermodynamics. The law of entropy essentially means everything in the universe will eventually turn into a homogeneous, undifferentiated goop. In information theory, Shannon used entropy as a way to discover order within the apparent chaos of strings of seemingly random numbers.
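As a minimal illustration of that last point (a sketch under my own assumptions; the helper empirical_entropy is hypothetical, not anything from the book): estimating the entropy of a symbol stream from observed frequencies shows whether the stream is truly random or carries exploitable structure, since any bias pushes the empirical entropy below its maximum.

```python
from collections import Counter
import math

def empirical_entropy(s: str) -> float:
    """Estimate bits per symbol from observed symbol frequencies."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

# A balanced binary stream looks maximally random: 1 bit per symbol.
print(empirical_entropy("0110100110010110"))  # 1.0

# A biased stream betrays structure: below 1 bit per symbol, the
# "random" source is partly predictable, and hence exploitable.
print(empirical_entropy("0001000000000010"))  # ~0.54
```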