computer age


295 results

pages: 528 words: 146,459

Computer: A History of the Information Machine
by Martin Campbell-Kelly and Nathan Ensmenger
Published 29 Jul 2013

Thus, for the cost of a couple of 407s, the 1401 provided the printing capacity of four standard accounting machines—and the flexibility of a stored-program computer was included, as it were, for free. The new printing technology was an unanticipated motive for IBM’s customers to enter the computer age, but no less real for that. For a while, IBM was the victim of its own success, as customer after customer decided to turn in old-fashioned accounting machines and replace them with computers. In this decision they were aided and abetted by IBM’s industrial designers, who excelled themselves by echoing the modernity and appeal of the new computer age: out went the round-cornered steel-gray punched-card machines, and in came the square-cornered light-blue computer cabinets.

With these changes we hope that, for the next several years, the third edition of Computer will continue to serve as an authoritative, semi-popular history of computing.

INTRODUCTION

IN JANUARY 1983, Time magazine selected the personal computer as its Man of the Year, and public fascination with the computer has continued to grow ever since. That year was not, however, the beginning of the computer age. Nor was it even the first time that Time had featured a computer on its cover. Thirty-three years earlier, in January 1950, the cover had sported an anthropomorphized image of a computer wearing a navy captain’s hat to draw readers’ attention to the feature story, about a calculator built at Harvard University for the US Navy.

But now we can see that it is important to the history of computing in that it pioneered three key features of the office-machine industry and the computer industry that succeeded it: the perfection of the product and low-cost manufacture, a sales organization to sell the product, and a training organization to enable workers to use the technology.

THE RANDS

Left to itself, it is unlikely that the Remington Typewriter Company would have succeeded in the computer age—certainly none of the other typewriter companies did. However, in 1927 Remington became part of a conglomerate, Remington Rand, organized by James Rand Sr. and his son James Jr. The Rands were the inventor-entrepreneur proprietors of the Rand Kardex Company, the world’s leading supplier of record-keeping systems.

pages: 224 words: 12,941

From Gutenberg to Google: electronic representations of literary texts
by Peter L. Shillingsburg
Published 15 Jan 2006

Works by other scholars form greater and more important enabling contexts, knowledge of which might help the reader to assess my arguments for their intended effects. This book could be seen as the third book of a trilogy that was not intended as such, but which seems to me to have happened accidentally. My Scholarly Editing in the Computer Age (1984, revised in 1986 and again in 1996) attempted to survey the prevailing notions about the nature of literary texts that propelled and guided scholarly editors. Its idea most relevant to the present work is that literary works are traditionally viewed from one of five rather different and mutually exclusive “orientations” which depend on how one posits authority for or ownership of the text.

It can be questioned whether textuality, in the constrained form of print, has been allowed to reveal its nature fully. It can still be argued that texts were not constrained by print technology but, instead, were designed specifically for print technology. This argument might hold that while electronic media have provided novelists and poets in the computer age with new visions about how and what to write, it would be inappropriate to drag texts written with print design in mind – indeed, written with no notion of any alternative “condition of being” other than print – into an electronic environment with some notion of releasing them from the constraints of print.

They are also bibliographers and they know how to conduct literary and historical research. But they are usually not also librarians, typesetters, printers, publishers, book designers, programmers, web-masters, or systems analysts. In the days of print editions, some editors undertook some of those production roles, and in the computer age, some editors try to program and design interfaces. In both book design and electronic presentations, textual scholarship often visibly outdistances the ability of these same persons’ amateur technical attempts at beauty and dexterity. Yet, in many cases, textual critics, whose business it is to study the composition, revision, publication, and transmission of texts, have had to adopt these other roles just to get the fruits of their textual labor produced at all or produced with scholarly quality control.

pages: 468 words: 137,055

Crypto: How the Code Rebels Beat the Government Saving Privacy in the Digital Age
by Steven Levy
Published 15 Jan 2002

As more people used computers, wireless telephones, and other electronic devices, they would demand cryptography. Just as the invention of the telegraph upped the cryptographic ante by moving messages thousands of miles in the open, presenting a ripe opportunity for eavesdroppers of every stripe, the computer age would be moving billions of messages previously committed to paper into the realm of bits. Unencrypted, those bits were low-hanging fruit for snoopers. Could cryptography, that science kept intentionally opaque by the forces of government, help out? The answer was as clear as plaintext. Of course it could!

It was a perpetual kind of voyage of discovery because he kept checking out these people. And sometimes he’d say, ‘I want you to stand here to listen. I don’t want anybody to see you but I just want you to listen.’ So I went on some of these encounters. But basically I didn’t have a clue what he was up to.” Sometimes Diffie would try to explain his motivations to her. The computer age, he told Mary, held terrible implications for privacy. As these machines become ascendant, and we use them for everyday communication, he warned, we may never experience privacy as we know it today. His apocalyptic tone unsettled Mary, but she wanted to hear more. Eventually, Mary understood how Diffie’s mission mixed the political with the personal.

Ultimately, it was only by questioning the conventional rules of cryptography and finding some of them “stupid” that Diffie made his breakthroughs. A case in point: the belief that the workings of a secure cryptosystem had to be treated with utmost secrecy. That might have held true for military organizations, but in the computer age, that didn’t make sense. There would be unlimited users who needed a system for privacy; obviously, such a system would have to be distributed so widely that potential crackers would have no trouble getting their hands on it and would have plenty of opportunity to practice attacking it. Instead, the secrecy had to rest somewhere else in the system.
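The design rule Diffie arrived at — publish the cryptosystem, concentrate all secrecy in a key — is how standard cryptographic libraries are built today. A minimal illustration using Python's standard-library hmac module (the key and message are invented for the example):

```python
import hashlib
import hmac

# The algorithm (HMAC-SHA256) is completely public; only the key
# is secret. An attacker can study the code freely, yet cannot
# forge a valid tag without knowing the key.
key = b"this-key-is-the-only-secret"
message = b"meet at the usual place"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# The recipient, holding the same key, recomputes the tag and
# compares in constant time.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
print(hmac.compare_digest(tag, expected))  # True
```

Change a single byte of the key and the tags no longer match — the system's security survives full disclosure of its workings, exactly as the passage argues it must.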

A People’s History of Computing in the United States
by Joy Lisi Rankin

Coordinated Science Lab at the University of Illinois, “1950s: The Classified Years,” http://csl.illinois.edu/about-lab/1950s-classified-years, archived at perma.cc/N3PC-5XGN. 7. Bethany Anderson, “The Birth of the Computer Age at Illinois,” University of Illinois Archives, 2013, http://archives.library.illinois.edu/blog/birth-of-the-computer-age, archived at perma.cc/RJ57-9NV8. 8. Donald Bitzer, oral history interview by Mollie Price on August 17, 1982, Charles Babbage Institute. 9. Bitzer, interview; PLATO Quarterly Progress Report for June–August 1960, Box 22, CBI PLATO Collection. 10.

Constructing Youth Hackers in Family Computing Magazines (1983–1987).” International Journal of Communication 8 (2014): 673–698. Alpert, Daniel, and Donald Bitzer. “Advances in Computer-Based Education.” Science 167, no. 3925 (1970): 1582–1590. Anderson, Bethany. “The Birth of the Computer Age at Illinois.” University of Illinois Archives (2013). http://archives.library.illinois.edu/blog/birth-of-the-computer-age. Archived at perma.cc/RJ57-9NV8. Anderson, Terry H. The Movement and the Sixties: Protest in America from Greensboro to Wounded Knee. New York: Oxford University Press, 1995. Aspray, William. Computing before Computers. Ames: Iowa State University Press, 1990. ———.

Information about Secondary School Project schools, the NSF grant, and its purpose in Information File for the Computation Center, Rauner Library; and (for example) Kiewit Comments 1, no. 5 (June 19, 1967); Kiewit Comments 2, no. 6 (August 19, 1968); Kiewit Comments 2, no. 9 (November 22, 1968); Kiewit Comments 3, no. 2 (February 20, 1969); Kiewit Comments 3, no. 5 (June 1, 1969); Kiewit Comments 3, no. 6 (September 20, 1969); and Kiewit Comments 4, no. 6 (September 28, 1970). 62. “Loomis Catches Up with Computer Age,” Loomis Bulletin, July 1968, Loomis Chaffee School Archives; G. Albert Higgins, “BASIC: A New Language at Mount Hermon,” The Bulletin of Northfield and Mount Hermon Schools (Spring 1967), clipping in Folder DA 29 (7841) 5 of 6, labeled: Miscellaneous Time-Sharing, Rauner Library. 63. Clippings from Valley News, Caledonia Record, Recorder-Gazette (Greenfield, MA), Gazette (Haverhill, MA), Eagle (Claremont, NH), Evening Eagle-Tribune (Lawrence, MA), Herald (Portsmouth, NH), Union (Springfield, MA), Monitor & New Hampshire Patriot (Concord, NH), Telegraph (Nashua, NH), Sunday Globe (Boston, MA), Connecticut Valley Times Reporter (Bellows Falls, VT), Hartford Courant, Mascoma Week (Canaan, NH), in Folder 1964–67, Box 4242, RDCCS, DA-181, Rauner Library.

pages: 239 words: 70,206

Data-Ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else
by Steve Lohr
Published 10 Mar 2015

Decisions of all kinds will increasingly be made based on data and analysis rather than on experience and intuition—more science and less gut feel. Throughout history, technological change has challenged traditional practices, ways of educating people, and even ways of understanding the world. In 1959, at the dawn of the modern computer age, the English chemist and novelist C. P. Snow delivered a lecture at Cambridge University, “The Two Cultures.” In it, Snow dissected the differences and observed the widening gap between two camps, the sciences and the humanities. The schism between scientific and “literary intellectuals,” he warned, threatened to stymie economic and social progress, if those in the humanities remained ignorant of the advances in science and their implications.

Productivity gains—more wealth created per hour of labor—are the fuel of rising living standards, and a by-product of the efficiency that technology is supposed to generate. The conundrum raised the question of whether all of the investment in, and enthusiasm for, digital technology was justified. Robert Solow, a Nobel Prize–winning economist, tartly summed up the quandary in the late 1980s, when he wrote, “You can see the computer age everywhere but in the productivity statistics.” Solow’s critique became known as the productivity paradox. Brynjolfsson, a technology optimist, has two answers for the skeptics. First, he argues, the official statistics do not fully capture the benefits of digital innovation. And second, he says that in technology, revolutions take time.

Uncertainty and experimentation while pursuing a new set of problems and opportunities are how disciplines emerge in technology. In the postwar years, big computers were the disruptive technology of the day, with the potential to transform scientific research, business, and government operations. To really create a computer age, skilled people and new tools and techniques were needed. In the 1960s, universities responded with programs in computer science, a new discipline that combined mathematics and electrical engineering. We see a similar pattern with data science. It is certainly where established academic departments, like statistics and computer science, are headed, and have been for a while.

pages: 434 words: 135,226

The Music of the Primes
by Marcus Du Sautoy
Published 26 Apr 2004

2 The Atoms of Arithmetic
3 Riemann’s Imaginary Mathematical Looking-Glass
4 The Riemann Hypothesis: From Random Primes to Orderly Zeros
5 The Mathematical Relay Race: Realising Riemann’s Revolution
6 Ramanujan, the Mathematical Mystic
7 Mathematical Exodus: From Göttingen to Princeton
8 Machines of the Mind
9 The Computer Age: From the Mind to the Desktop
10 Cracking Numbers and Codes
11 From Orderly Zeros to Quantum Chaos
12 The Missing Piece of the Jigsaw
Acknowledgements
Further Reading
Illustration and Text Credits
Index

CHAPTER ONE Who Wants To Be a Millionaire?

Instead it was based on solid calculation and theoretical ideas that Riemann had chosen not to reveal to the world. Within a few years of Siegel’s discovery of Riemann’s secret formula, it would be used by Hardy’s students in Cambridge to confirm that the first 1,041 zeros were on Riemann’s line. The formula, however, would truly come into its own with the dawn of the computer age. It is rather odd that it took mathematicians so long to realise that Riemann’s notes might contain such gems. There are certainly clues in Riemann’s ten-page paper, and in letters he wrote to other mathematicians at the time, that he was sitting on something. In the paper he mentions a new formula but goes on to say that he ‘has not yet sufficiently simplified it to announce it’.

Turing’s addiction for real-life inventions had infused his theoretical considerations. Although the universal machine was only a machine of the mind, his description of it sounded like the plan for an actual contraption. A friend of his joked that if it were ever built it would probably fill the Albert Hall. The universal machine marked the dawn of the computer age, which would equip mathematicians with a new tool in their exploration of the universe of numbers. Even during his lifetime, Turing appreciated the impact that real computing machines might have on investigating the primes. What he could not have foreseen was the role that his theoretical machine would later play in unearthing one of the Holy Grails of mathematics.
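The "machine of the mind" described above is simple enough to simulate directly. A sketch in Python, under assumptions of my own (the rule format, the blank symbol `_`, and the sample machine — a unary incrementer — are illustrative choices, not Turing's notation):

```python
def run_turing_machine(rules, tape, state="start", head=0, max_steps=1000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left) or +1 (right). Execution stops when the
    machine enters the state 'halt' (or after max_steps).
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")  # '_' stands for a blank cell
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    # Read the tape back in left-to-right order for display.
    contents = "".join(cells[i] for i in sorted(cells)).strip("_")
    return state, contents

# A tiny machine that appends one '1' to a unary number:
rules = {
    ("start", "1"): ("1", +1, "start"),  # scan right across the 1s
    ("start", "_"): ("1", +1, "halt"),   # write a 1 at the end, halt
}
state, result = run_turing_machine(rules, "111")
print(state, result)  # halt 1111
```

Feeding the simulator a different rule table changes what it computes without changing the simulator itself — a small-scale echo of the universal machine's trick of treating a machine's description as just more input.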

pages: 343 words: 102,846

Trees on Mars: Our Obsession With the Future
by Hal Niedzviecki
Published 15 Mar 2015

But the highlight for the audience was a speech-controlled train that could back up, stop, and move forward on command. The whole experience happened in a pavilion called the “House of Magic.”34 The prevailing attitude—future-as-passive-spectacle in which the forces of government and industry shape the time to come while “man conforms”—continued even up through the 1950s and ’60s, the dawn of the computer age and the golden age of a certain kind of future-longing we now look back at with nostalgia. Consider something like this 1956 ad in Scientific American for General Motors New Direction Ball Bearings: A week’s shopping in minutes! And you haven’t moved from your car. It’s that simple at the Drive-In Market of tomorrow.

Death, like so many other things, is the future. And the future is just chaos waiting to be shaped into the information. In the big-data age, the future, far more than the past and the present, is open to quantification, manipulation, alteration and disruption. This—not technological progress or the rise of the computing age—is the great story of the postmodern era. Am I exaggerating our faith in the power of abstraction—that ruptured awareness that abruptly separated us from the rest of the living creatures on Earth—to allow us to know and then control all things? Well we’ve already established that under the auspices of future our largest most powerful corporations are investing billions in schemes to convert the world to data.

In particular, Ada Lovelace, daughter of the poet Lord Byron and Babbage’s mathematically gifted muse, enthused that Babbage’s analytical engine would not just calculate numbers, it would perform operations on “any process which alters the mutual relation of two or more things.” She went on: “This is the most general definition, and would include all subjects in the universe.”18 Lovelace, dubbed everything from “the prophet of the computer age” to “the enchantress of numbers,” had that rare combination of mathematical genius and inherited gift for language; as a result she had a vision for what a language of pure information could make possible—just about anything. She writes with beauty and scope about the information landscape to come: “A new, a vast, and a powerful language is developed . . . in which to wield its truths so that these may become of more speedy and accurate practical application for the purposes of mankind than the means hitherto in our possession have rendered possible.”19 Around the same time, Pierre-Simon Laplace, the great French astronomer and mathematician, an advocate for Newtonian principles, wrote of a new kind of “intelligence” that “would embrace in the same formula the movements of the greatest bodies of the universe and those of the lightest atom; for it, nothing would be uncertain and the future, as the past, would be present to its eyes.”20 Uncertainty banished.

pages: 414 words: 109,622

Genius Makers: The Mavericks Who Brought A. I. To Google, Facebook, and the World
by Cade Metz
Published 15 Mar 2021

“It changed the way many others looked at it, too.” Some researchers, most notably Demis Hassabis, the young neuroscientist behind DeepMind, even believed they were on their way to building a machine that could do anything the human brain could do, only better, a possibility that has captured the imagination since the earliest days of the computer age. No one was quite sure when this machine would arrive. But even in the near term, with the rise of machines that were still a very long way from true intelligence, the social implications were far greater than anyone realized. Powerful technologies have always fascinated and frightened humanity, and humanity has gambled on them time and again.

A team from the University of Washington, including one researcher who soon moved to Facebook, used a neural network to build a video that put new words in the mouth of Barack Obama. At a start-up in China, engineers used similar techniques to turn Donald Trump into a Chinese speaker. It was not that fake images were a new thing. People had been using technology to doctor photographs since the dawn of photographs, and in the computer age, tools like Photoshop gave nearly anyone the power to edit both photos and videos. But because the new deep learning methods could learn the task on their own—or at least part of the task—they threatened to make the editing that much easier. Rather than paying farms of people to create and distribute fake images and fake video, political campaigns, nation-states, activists, and insurrectionists could potentially build systems that did the job automatically.

“When I was an undergrad at King’s College Cambridge, Les Valiant, who won the Turing Award in 2010, lived in the adjacent room on X staircase,” the tweet read. “He just told me that Turing lived on X staircase when he was a fellow at King’s and probably wrote his 1936 paper there.” This was the paper that helped launch the computer age. The awards ceremony was held two months later in the Grand Ballroom of the Palace Hotel in downtown San Francisco. Jeff Dean attended in black tie. So did Mike Schroepfer. A waitstaff in white jackets served dinner to more than five hundred guests sitting at round tables with white tablecloths, and as they ate, various other awards were presented to more than a dozen engineers, programmers, and researchers from across industry and academia.

pages: 397 words: 110,222

Habeas Data: Privacy vs. The Rise of Surveillance Tech
by Cyrus Farivar
Published 7 May 2018

This notion ended up becoming: Office of the Press Secretary, “Statement by the Press Secretary,” April 16, 1993. Available at: http://cd.textfiles.com/hackersencyc/PC/CRYPTO/CLIPPER.TXT. Clipper chip would not: John Markoff, “Big Brother and the Computer Age,” The New York Times, May 6, 1993. Available at: http://www.nytimes.com/1993/05/06/business/big-brother-and-the-computer-age.html. Many of them—notably FBI: Levy, p. 245. In 1995, Kallstrom: James C. McKinley, Jr., “Wiretap Expert Named to Head New York City Office of F.B.I.,” The New York Times, February 17, 1995. Available at: http://www.nytimes.com/1995/02/17/nyregion/wiretap-expert-named-to-head-new-york-city-office-of-fbi.html.

But at the same time: Anthony Ramirez, “FBI’s Proposal on Wiretaps Criticized by Federal Agency,” The New York Times, January 15, 1993. Available at: http://www.nytimes.com/1993/01/15/us/fbi-s-proposal-on-wiretaps-criticized-by-federal-agency.html. And in the end, the FBI’s efforts: John Markoff, “Big Brother and the Computer Age,” The New York Times, May 6, 1993. Available at: http://www.nytimes.com/1993/05/06/business/big-brother-and-the-computer-age.html. The law primarily targeted: Nate Anderson, The Internet Police (W. W. Norton & Company, 2014), p. 107. Crucially, the law does: 47 U.S. Code § 1002. Available at: https://www.law.cornell.edu/uscode/text/47/1002. In late June 1996: Philip Zimmermann, “Testimony of Philip R.

But it is almost impossible to think of late 18th-century situations that are analogous to what took place in this case. (Is it possible to imagine a case in which a constable secreted himself somewhere in a coach and remained there for a period of time in order to monitor the movements of the coach’s owner?). In any case, the Alito wing pointed out that in the pre-computer age, there was an inherent mechanism that made such broad surveillance untenable: economics. “Traditional surveillance for any extended period of time was difficult and costly and therefore rarely undertaken,” Alito continued. The surveillance at issue in this case—constant monitoring of the location of a vehicle for four weeks—would have required a large team of agents, multiple vehicles, and perhaps aerial assistance.

pages: 413 words: 119,587

Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots
by John Markoff
Published 24 Aug 2015

He also presented more than a glimmer of the theoretical possibility and practical impact of machine learning: “The limitations of such a machine are simply those of an understanding of the objects to be attained, and of the potentialities of each stage of the processes by which they are to be attained, and of our power to make logically determinate combinations of those processes to achieve our ends. Roughly speaking, if we can do anything in a clear and intelligible way, we can do it by machine.”12 At the dawn of the computer age, Wiener could see and clearly articulate that automation had the potential of reducing the value of a “routine” factory employee to where “he is not worth hiring at any price,” and that as a result “we are in for an industrial revolution of unmitigated cruelty.” Not only did he have early dark forebodings of the computer revolution, but he foresaw something else that was even more chilling: “If we move in the direction of making machines which learn and whose behavior is modified by experience, we must face the fact that every degree of independence we give the machine is a degree of possible defiance of our wishes.

His machine would be composed of “hands,” “sensory organs,” “memory,” and a “brain.”1 Shockley’s inspiration for a humanlike factory robot was that assembly work often consists of a myriad of constantly changing unique motions performed by a skilled human worker, and that such a robot was the breakthrough needed to completely replace human labor. His insight was striking because it came at the very dawn of the computer age, before the impact of the technology had been grasped by most of the pioneering engineers. At the time it was only a half decade since ENIAC, the first general purpose digital computer, had been heralded in the popular press as a “giant brain,” and just two years after Norbert Wiener had written his landmark Cybernetics, announcing the opening of the Information Age.

Mind Children: The Future of Robot and Human Intelligence (1988) contains an early detailed argument that the robots that he has loved since childhood are in the process of evolving into an independent intelligent species. A decade later he refined the argument in Robot: Mere Machine to Transcendent Mind (1998). Significantly, although it is not widely known, Doug Engelbart had made the same observation, that computers would increase in power exponentially, at the dawn of the interactive computing age in 1960.33 He used this insight to launch the SRI-based augmentation research project that would help lead ultimately to both personal computing and the Internet. In contrast, Moravec built on his lifelong romance with robots. Though he has tempered his optimism, his overall faith never wavered.

pages: 436 words: 127,642

When Einstein Walked With Gödel: Excursions to the Edge of Thought
by Jim Holt
Published 14 May 2018

And in their place came acceptance. All this vast majesty of creation—it had to mean something. And then I meant something too. Yes, smaller than the smallest, I meant something too. To God, there is no zero. I still exist. And so, one feels, does the infinitesimal.

PART VI Heroism, Tragedy, and the Computer Age

14 The Ada Perplex: Was Byron’s Daughter the First Coder?

The programming language that the U.S. Department of Defense uses to control its military systems is named Ada, after Ada Byron, the daughter of Lord Byron. This act of nomenclature was not meant to be entirely fanciful. Augusta Ada Byron, who became by marriage the Countess of Lovelace, is widely supposed to have produced the first specimen of what would later be called computer programming.

Augusta Ada Byron, who became by marriage the Countess of Lovelace, is widely supposed to have produced the first specimen of what would later be called computer programming. In her lifetime, she was deemed a mathematical prodigy, the Enchantress of Numbers. After her death in 1852—at the age of thirty-six, just like her father—popular biographers hymned her intellect and Byronic pedigree. With the coming of the computer age, Ada’s posthumous renown expanded to new proportions. She has been hailed as a technological visionary; credited with the invention of binary arithmetic; made into the cult goddess of cyber feminism. It is an index of her prestige as a scientific virtuosa that Tom Stoppard, in his play Arcadia, presents a character based upon Ada groping her way toward the law of entropy and the theory of chaos, not to mention a proof of Fermat’s last theorem.

But the halting problem, it turned out, was merely the decision problem in disguise. Turing was able to prove that no computing machine of the kind he envisaged could solve the decision problem. Reasoning could not be reduced to computation after all. But the death of Leibniz’s dream turned out to be the birth of the computer age. The boldest idea to emerge from Turing’s analysis was that of a universal Turing machine: one that, when furnished with the number describing the mechanism of any particular Turing machine, would perfectly mimic its behavior. In effect, the “hardware” of a special-purpose computer could be translated into “software” and then entered like data into the universal machine, where it would be run as a program.
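Turing's impossibility argument can be paraphrased in modern code. A sketch of the standard textbook diagonalization (the function names and the toy candidates are my own, not Turing's formulation):

```python
# Suppose, for contradiction, someone hands us a total, correct
# function halts(program, arg) that returns True exactly when
# program(arg) eventually stops. We could then build this
# adversarial program:

def make_paradox(halts):
    def paradox(program):
        # Do the opposite of whatever halts() predicts for a
        # program run on its own source.
        if halts(program, program):
            while True:   # predicted to halt -> loop forever
                pass
        return "done"     # predicted to loop -> halt at once
    return paradox

# Running paradox on itself is contradictory: if halts says it
# halts, it loops; if halts says it loops, it halts. So no such
# halts() can exist. Any naive candidate is refuted on some input:

def never_halts(program, arg):
    return False  # claims nothing ever halts

paradox = make_paradox(never_halts)
print(paradox(paradox))  # "done" -- it halted, refuting never_halts
```

The self-reference here — a program fed its own description — is the same move that powers the universal machine: once programs are data, a program can take itself as input, and the contradiction follows.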

pages: 626 words: 167,836

The Technology Trap: Capital, Labor, and Power in the Age of Automation
by Carl Benedikt Frey
Published 17 Jun 2019

Before the computer revolution, the Bureau of Labor Statistics described the role of the secretary as follows: “Secretaries relieve their employers of routine duties so they can work on more important matters. Although most secretaries type, take shorthand, and deal with callers, the time spent on these duties varies in different types of organizations.”15 The impact of the computer age becomes evident when we look at the description of the same job from the same source in the 2000s: “As technology continues to expand in offices across the Nation, the role of the secretary has greatly evolved. Office automation and organizational restructuring have led secretaries to assume a wide range of new responsibilities once reserved for managerial and professional staff.

But consistent with what we know from studies in neuroscience pointing out that women perform better in interactive and social settings, women have adjusted much better than men to an increasingly interactive world of work.33 Instead of being pushed back into low-wage service jobs, where women had traditionally been dominant, many have moved up into professional and managerial jobs. Women also are more likely to graduate from college than men, and consequently their skills are more suitable for the computer age. Indeed, while men have found themselves increasingly likely to be replaced by computer-controlled machines, women are more likely to use a computer at work.34 The rising share of women in the professions and the decline of male-dominated blue-collar sectors have allowed many women to overtake their male counterparts in terms of career advancement.

When a prototype has been developed and operations have become more standardized, it makes economic sense to relocate to places where real estate is cheaper and the costs of production lower. New jobs, in other words, will eventually spread to other locations. As long as nursery cities do not churn out new jobs faster than they diffuse geographically, convergence in employment will follow. But new jobs spread only as they become standardized, and since the beginning of the computer age, jobs that have become standardized have not diffused across the country: they have either been automated away or sent abroad. The flourishing cities of America have become nursery cities for innovation. But the rest is done abroad or by machines. The places where work has been replaced, rather than complemented, by machines, are the ones that are in decline.

From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry
by Martin Campbell-Kelly
Published 15 Jan 2003

Gallagher, Management Information Systems and the Computer (American Management Association, 1961), esp. pp. 150–176. 42. McKenney, Waves of Change, p. 105. 43. Bashe et al., IBM’s Early Computers, p. 518. 44. McKenney, Waves of Change, p. 111. 45. Gilbert Burck, The Computer Age (Harper & Row, 1965), p. 31. 46. Bashe et al., IBM’s Early Computers, p. 521. 47. This appellation appears in Burck, The Computer Age, p. 34. 48. R. W. Parker, “The SABRE System,” Datamation, September 1965: 49–52. 49. “A Survey of Airline Reservation Systems,” Datamation, June 1962: 53–55. 50. The original SABRE software was very long-lived. The same code base was still being used more than two decades later, in 1987, when the system had expanded to process over 1,000 messages per second on a system that used eight 3090 mainframes, to support 12,000 agent terminals.

See Thomas Kuhn, The Structure of Scientific Revolutions (University of Chicago Press, 1962); Thomas Parke Hughes, Networks of Power: Electrification in Western Society, 1880–1930 (Johns Hopkins University Press, 1983); Nathan Rosenberg, Inside the Black Box: Technology and Economics (Cambridge University Press, 1982). 26. Douglas K. Smith and R. C. Alexander, Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer (Morrow, 1988); Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (HarperBusiness, 1999). 27. Steven Levy, Insanely Great: The Life and Times of Macintosh, the Computer That Changed Everything (Penguin, 1994). 28. See, e.g., “A Fierce Battle Brews Over the Simplest Software Yet,” Business Week, November 21, 1983: 61–63. 29. Phil Lemmons, “A Guided Tour of VisiOn,” Byte, June 1983: 256ff. 30.

The Mythical Man-Month: Essays on Software Engineering. Addison-Wesley, 1975. Brooks, John. The Go-Go Years: The Drama and Crashing Finale of Wall Street's Bullish 60s. Wiley, 1973, 1999. Burck, Gilbert. "The Assault on Fortress I.B.M." Fortune, June 1964: 112–116, 196, 198, 200, 202, 207. Burck, Gilbert. The Computer Age. Harper & Row, 1965. Burck, Gilbert. "The Computer Industry's Great Expectations." Fortune, August 1968: 92–97, 142, 145–146. Burton Grad Associates Inc. Evolution of the US Packaged Software Industry. Tarrytown, N.Y.: Burton Grad Associates Inc., 1992. Business Communications Corp.

AC/DC: The Savage Tale of the First Standards War
by Tom McNichol
Published 31 Aug 2006

As a result, the AC/DC war serves as a cautionary tale for the Information Age, which produces ever more arcane disputes over technical standards. In a standards war, the appeal is always to fear, whether it's the fear of being killed, as it was in the AC/DC battle, or the palpable dread of the computer age, the fear of being left behind. 1 FIRST SPARKS The story of electricity begins with a bang, the biggest of them all. The unimaginably enormous event that created the universe nearly 14 billion years ago gave birth to matter, energy, and time itself. The Big Bang was not an explosion in space but of space itself, a cataclysm occurring everywhere at once.

The future of computing lies in making digital devices truly portable, so that users can communicate on any device, anytime, from anywhere in the world. To build the "always connected" world, devices will have to be untethered from wires, including the wall outlet, and powered by long-lasting rechargeable batteries or fuel cells. In short, a move from AC to DC. The Industrial Age was powered almost exclusively by AC, but the Computer Age may well turn out to be DC's revenge. If Edison were alive today, he'd no doubt be in the thick of the effort to come up with a powerful and portable "box of electricity" to power electronic devices and even automobiles for days or weeks on a single charge.

D., 114 Aeneid (Virgil), 7 Alt, Whitey, 144, 145 Alternating current (AC): accidental deaths attributed to, 88–89, 92, 107, 116, 119; author's childhood experience with, 1–2; Brown's claims linking execution to, 117, 118; Brown's writings on dangers of, 89–90, 107–108, 116–117; compared to DC, 66, 80; demonstration of animal-killing power of, 108–110; Edison's opinion of, 66–67, 77, 84–85, 118–120, 131, 170; experiments on relative dangers of DC vs., 90–91, 92–95, 97–106; long-distance transmission of, in Germany, 130–131; patents for, purchased by Edison's company, 120; recommended use of, to execute criminals, 110–112; reliance of modern life on, 3, 173–174; as standard by 1930s, 173; Tesla's Columbian Exposition demonstration of, 138–139 Alternating current (AC) system: first power plant using, 82; Gaulard-Gibbs, 66, 81; increasing number of power plants using, 91, 108, 114, 130–131; installed at hydroelectric power plants, 129–130, 140, 141–142; national-scale conceptualization of, 121; proposal to limit voltage in, 89–90, 117, 119–120; technical papers as defense for, 108; Westinghouse's development of, 81–83; winning in marketplace, 108, 114, 131 Amber, 7, 9 Animal experiments: on calves, 108–109, 115; on dogs, 90–91, 92–95, 97–106, 115; on horses, ii, 109–110, 115 Ansonia Brass & Copper Company, 63 Arc lamps, 41–42, 88 Automobile, electric, 155–158, 159–161 B Bantu tribesmen, view of lightning, 8 Batchelor, Charles, 75 Batteries: in Computer Age, 177–178; Edison's "A," 159–161; Edison's continued work on, 168; Edison's "E," 155–158; efforts to increase longevity of, 178; first rechargeable, 156; invention of, 22 Baum, Frank L., 136 Bible, on lightning, 8 Black Elk, 8–9 Blount, J. F., 144 Boxing Cats (film), 152 Brown, Harold: background of, 87–88; demonstrated AC's power to kill animals, ii, 108–110, 115; demonstrated electrical resistance, 96–97; described DC-powered utopia, 117; linked AC to execution, 117, 118; procured AC generators for death chair, 115–116; relationship with Edison, 87, 88, 91–92, 102, 103, 112, 119, 123, 171; showed danger of DC vs.

pages: 309 words: 114,984

The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age
by Robert Wachter
Published 7 Apr 2015

—Captain Chesley “Sully” Sullenberger speaker; consultant; author of Highest Duty and Making a Difference; pilot of US Airways 1549, the “Miracle on the Hudson” “With vivid stories and sharp analysis, Wachter exposes the good, the bad, and the ugly of electronic health records and all things electronic in the complex settings of hospitals, physician offices, and pharmacies. Everyone will learn from Wachter’s intelligent assessment and become a believer that, despite today’s glitches and frustrations, the future computer age will make medicine much better for us all.” —Ezekiel J. Emanuel, MD, PhD Vice Provost for Global Initiatives and Chair, Departments of Medical Ethics and Health Policy, University of Pennsylvania “In Bob Wachter, I recognize a fellow mindful optimist: someone who understands the immense power of digital technologies, yet also realizes just how hard it is to incorporate them into complicated, high-stakes environments full of people who don’t like being told what to do by a computer.

On our Maslow’s hierarchy of needs, just finding the right test result for the right patient was a small, sweet triumph. We didn’t dare hope for more. For those of us whose formative years were spent rummaging through shoeboxes, how could we help but greet healthcare’s reluctant, subsidized entry into the computer age with unalloyed enthusiasm? Yet once we clinicians started using computers to actually deliver care, it dawned on us that something was deeply wrong. Why were doctors no longer making eye contact with their patients? How could one of America’s leading hospitals (my own) give a teenager a 39-fold overdose of a common antibiotic, despite (scratch that— because of) a state-of-the-art computerized prescribing system?

The history of technology tells us that it is these financial, environmental, and organizational factors, rather than the digital wizardry itself, that determine the success and impact of new IT tools. This phenomenon is known as the “productivity paradox” of information technology. 38 The name comes from the fact that Gross and Tecco decided to launch the organization while sitting in Harvard Business School’s Rock Hall. Chapter 26 The Productivity Paradox You can see the computer age everywhere but in the productivity statistics. —Nobel Prize–winning MIT economist Robert Solow, writing in 1987 Between the time David Blumenthal stepped down as national coordinator for health IT and became CEO of the Commonwealth Fund, he returned to Boston from 2011 to 2013 to manage the transition of Partners HealthCare from a homegrown electronic health record to the one made by Epic.

pages: 720 words: 197,129

The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution
by Walter Isaacson
Published 6 Oct 2014

In the case of computers, there were many such incremental advances made by faceless engineers at places like IBM. But that was not enough. Although the machines that IBM produced in the early twentieth century could compile data, they were not what we would call computers. They weren’t even particularly adroit calculators. They were lame. In addition to those hundreds of minor advances, the birth of the computer age required some larger imaginative leaps from creative visionaries. DIGITAL BEATS ANALOG The machines devised by Hollerith and Babbage were digital, meaning they calculated using digits: discrete and distinct integers such as 0, 1, 2, 3. In their machines, the integers were added and subtracted using cogs and wheels that clicked one digit at a time, like counters.

Bush’s machine, however, was not destined to be a major advance in computing history because it was an analog device. In fact, it turned out to be the last gasp for analog computing, at least for many decades. New approaches, technologies, and theories began to emerge in 1937, exactly a hundred years after Babbage first published his paper on the Analytical Engine. It would become an annus mirabilis of the computer age, and the result would be the triumph of four properties, somewhat interrelated, that would define modern computing: DIGITAL. A fundamental trait of the computer revolution was that it was based on digital, not analog, computers. This occurred for many reasons, as we shall soon see, including simultaneous advances in logic theory, circuits, and electronic on-off switches that made a digital rather than an analog approach more fruitful.

The almost-working machine was put into storage in the basement of the physics building at Iowa State, and a few years later no one seemed to remember what it did. When the space was needed for other uses in 1948, a graduate student dismantled it, not realizing what it was, and discarded most of the parts.37 Many early histories of the computer age do not even mention Atanasoff. Even if it had worked properly, his machine had limitations. The vacuum-tube circuit made lightning-fast calculations, but the mechanically rotated memory units slowed down the process enormously. So did the system for burning holes in the punch cards, even when it worked.

pages: 291 words: 81,703

Average Is Over: Powering America Beyond the Age of the Great Stagnation
by Tyler Cowen
Published 11 Sep 2013

Magnus Carlsen is, as I write, the highest rated player in the world and arguably the most impressive chess prodigy of all time, having attained grandmaster status at thirteen and world number one status at age nineteen, the latter a record. He is from Tønsberg, in southern Norway, and prior to the computer age Norway has no record of producing top chess players at all. Even Oslo (Carlsen now lives on its outskirts) is a relatively small metropolitan area of fewer than 1.5 million people. Carlsen, of course, had the chance to play chess over the internet. Many more young chess players come from the far reaches of the globe, including distant parts of China and India.

One day soon we will look back and see that we produced two nations, a fantastically successful nation, working in the technologically dynamic sectors, and everyone else. Average is over. Notes For the opening quotation, see D.T. Max, "The Prince's Gambit: A Chess Star Emerges for the Post-Computer Age," The New Yorker, March 21, 2011. Chapter 1: Work and Wages in iWorld For the figures on wages of college graduates, see Heidi Shierholz, Natalie Sabadish, and Hilary Wething, "The Class of 2012: Labor Market for Young Graduates Remains Grim," Economic Policy Institute, May 3, 2012, http://www.epi.org/publication/bp340-labor-market-young-graduates/.

Glickman, “Sex Differences in Intellectual Performance: Analysis of a Large Cohort of Competitive Chess Players,” Psychological Science, December 2006, 17(12): 1040–46. For the quotations on looking at all chess through the eyes of the computer, see D.T. Max, “The Prince’s Gambit: A Chess Star Emerges for the Post-Computer Age,” The New Yorker, March 21, 2011. Chapter 7: The New Office: Regular, Stupid, and Frustrating For various reports on the failures of GPS, see Tom Vanderbilt, “It Wasn’t Me, Officer! It Was My GPS: What Happens When We Blame Our Navigation Devices for Our Car Crashes,” Slate, June 9, 2010.

pages: 615 words: 168,775

Troublemakers: Silicon Valley's Coming of Age
by Leslie Berlin
Published 7 Nov 2017

George Pake, “Research and Development Management and the Establishment of Xerox Palo Alto Research Center,” Remarks for the IEEE Convocation “The Second Century Begins,” January 1985, XPA. 13. Pake’s role in increased salaries: Michael Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: Harper-Collins, 2000): 61; Taylor’s role: performance reviews, RWT. 14. Taylor’s title: Performance Appraisal Notice, March 1, 1971, RWT. 15. In May 1971, Taylor talked about “Xerox’s ‘Information Company’ intent,” and he described the emphases for both CSL and SSL as “prototype systems experiments, especially with regard to library systems, office systems, medical systems, and educational systems.”

They appear to be notes for a talk he planned to give upon returning to PARC after Boca Raton. 18. Taylor, interview by author, July 18, 2014. 19. Performance reviews, RWT. “Because Bob is good” (in footnote): Sproull to Sutherland (acting director of PARC), May 4, 1977, RWT. 20. Michael Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: HarperCollins, 2000): 379. Beyond the startling comparison (and Hiltzik points it out as such), Pake rarely spoke a word against Taylor. However, when Pake made a list of “pioneering managers” at PARC, he never elevated Taylor above “associate manager,” and he listed that qualified title before Taylor’s name, while every other manager was named before his title was given.

The demos, which have taken on mythical status in the history of Silicon Valley, have been detailed and rehashed so many times that only the broadest outlines are necessary here. The best, most detailed account of the PARC/Apple demo is in Michael Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: HarperCollins, 2000). 10. Roughly fifty Xerox stores carried Apple computers, along with the Xerox 820 and machines from Hewlett-Packard and Osborne, until July 1982, when the agreement between the parent companies expired. Scott Mace and Paul Freiberger, “Xerox Stores Take Aim at Retail Computer Market,” Info World, March 29, 1982; Jeff Brown, “Apples Picked off Shelves of Xerox Corp’s Retail Stores,” Info World, July 26, 1982; Bertil Nordin to Arthur Rock, 12 July 1982, AR. 11.

pages: 159 words: 45,073

GDP: A Brief but Affectionate History
by Diane Coyle
Published 23 Feb 2014

This epoch of the information and communications revolution has spanned forty years. THE NEW ECONOMY BOOM It was obvious by the mid-1980s that a lot of businesses were buying and using computers, but what effect this was having on the economy was not at all apparent. Robert Solow wrote a frequently quoted New York Times Book Review article in 1987 claiming, "You can see the computer age everywhere but in the productivity statistics."4 In fact, it took the convergence of a number of separate streams of technological innovation, plus the investment in new computer and communications equipment, plus the reorganization of businesses to use these new tools, before any benefit in terms of productivity or GDP could occur.

The Sino-U.S. bilateral trade imbalance has been greatly inflated,” according to one study of the statistics.7 Value-added trade statistics are now becoming available, and their study is likely to change the big picture we hold in our minds about the shape of the world economy. PRODUCTIVITY If economists were to play a game of word association, the one that would leap to mind on hearing productivity would be puzzle. I already quoted Robert Solow’s famous 1987 version of the productivity puzzle: “You can see the computer age everywhere but in the productivity figures.” As discussed in chapter 5, the New Economy era from the mid-1990s to 2001 did see productivity growth increase in the official figures, although that has slowed down again in the postcrisis economy. But a different “puzzle” may have emerged in the United Kingdom: despite more or less zero GDP growth since 2008, employment has increased.

The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal
by M. Mitchell Waldrop
Published 14 Apr 2001

Not only was it huge, being eight feet tall, fifty-one feet long, and two feet thick, but it had a sleek, shiny, sci-fi look; at the insistence of Watson, who was a past master at public relations, the machine had been encased in a futuristic stainless-steel-and-glass skin. The reporters instantly dubbed it "the electronic brain," a phrase that Aiken despised. But for better or worse, the name stuck, and the American public had its first introduction to the computer age. The Automatic Sequence Controlled Calculator acquired its "Mark I" designation a year later, when Aiken and his team began work on an upgraded Mark II for the navy. (There would eventually be a Mark III and a Mark IV as well.) None of these machines would have much impact on the development of computer hardware per se; Aiken's insistence on features such as base-10 arithmetic was just too idiosyncratic.

The whole unedifying saga would drag on for another year, ending only in April 1947, when exasperated army attorneys at last threw out everybody's patent claims on the ground that von Neumann's "First Draft" paper represented prior public disclosure. They decreed that the stored-program idea rightfully belonged in the public domain. And there it has remained. That was probably just as well. However fierce the controversy surrounding its birth, the stored-program concept now ranks as one of the great ideas of the computer age, arguably the great idea. By rendering software completely abstract and decoupling it from the physical hardware, the stored-program concept has had the paradoxical effect of making software into something that is almost physically tangible. Software has become a medium that can be molded, sculpted, and engineered on its own terms.

Indeed, he went on, looking back from the imagined viewpoint of the year 2000, "[the electronic commons] has supplanted the postal system for letters, the dial-tone phone system for conversations and teleconferences, stand-alone batch-processing and time-sharing systems for computing…" The occasion was a series of essays on the future of computing, collected by Mike Dertouzos and his deputy Joel Moses and published as The Computer Age: A Twenty-Year View. Lick's forty-page chapter, entitled "Computers and Government," was one of the longest in the book. In addition to his fantasy about the Multinet/Internet, it included a very thorough overview of the policy issues raised by information technology—an analysis that stands up pretty well today.

pages: 371 words: 93,570

Broad Band: The Untold Story of the Women Who Made the Internet
by Claire L. Evans
Published 6 Mar 2018

“That brain of mine is something more than merely mortal,” she boasted as she sorted out all the ways the machine could deduce Bernoulli numbers. “Before ten years are over, the Devil’s in it if I have not sucked out some of the life-blood from the mysteries of this universe, in a way that no purely mortal lips or brains could do.” The Analytical Engine would never be completed, but it represents the conceptual dawn of the computer age. The four components of its design—input, storage, processing, and output—remain core components of all computers today, and the strikingly original notes that Ada prepared to explain this new kind of machine would presage the literature of computer science by nearly a century. To demonstrate how the engine could calculate Bernoulli Numbers without any assistance from a “human hand or head,” she wrote mathematical proofs that many scholars characterize as the first computer programs ever written, and all for a machine that never even existed.

Floating in and out of reality with doses of laudanum, wine, and chloroform, she echoed the family chord of recklessness and tragedy. “I do dread that horrible struggle, which I fear is in the Byron blood,” she wrote to her mother. “I don’t think we die easy.” Like her father’s, Ada’s work outlived her, although it would be nearly a century before it was properly recognized. It took until the beginning of the computer age, when the magnitude of their prescience became undeniable, for her Notes to be republished, in a British computing symposium; its editor marveled, in 1953, that “her ideas are so modern that they have become of great topical interest once again.” Ada was lucky to have been born wealthy, noble, and relatively idle.

“It is a known fact,” Babbage proclaimed: Charles Babbage, Passages from the Life of a Philosopher (New Brunswick, NJ: Rutgers University Press, 1994), 116–17. writing of a “store” to hold the numbers: Ibid., 117. “very costly toy”: Gleick, The Information, 101–5. “mad, bad, and dangerous to know”: Betty Alexander Toole, Ada, the Enchantress of Numbers: Prophet of the Computer Age (Mill Valley, CA: Strawberry Press, 1992), 6. “Oh, my poor dear child!”: Ibid., 21. “I do not believe that my father was”: Ibid., 156–57. “aptitude for grasping the strong points”: B. V. Bowden, “A Brief History of Computation,” in Faster Than Thought: A Symposium on Digital Computing Machines, ed.

pages: 223 words: 52,808

Intertwingled: The Work and Influence of Ted Nelson (History of Computing)
by Douglas R. Dechow
Published 2 Jul 2015

In addition, members of the Chapman University Department of English assisted in producing the Festschrift; graduate students Danny De Maio and Tatiana Servin transcribed several of the talks, and Dr. Anna Leahy provided editing for those talks. Douglas R. Dechow Daniele C. Struppa Orange, CA February 7, 2015 Contents Part I Artistic Contributions 1 The Computer Age Ed Subitzky 2 Odes to Ted Nelson Ben Shneiderman Part II Peer Histories 3 The Two-Eyed Man Alan Kay 4 Ted Nelson’s Xanadu Ken Knowlton 5 Hanging Out with Ted Nelson Brewster Kahle 6 Riffing on Ted Nelson—Hypermind Peter Schmideg and Laurie Spiegel 7 Intertwingled Inspiration Andrew Pam 8 An Advanced Book for Beginners Dick Heiser Part III Hypertext and Ted Nelson-Influenced Research 9 The Importance of Ted’s Vision Belinda Barnet 10 Data, Metadata, and Ted Christine L.

Subitzky, Chapman University, Orange, CA, USA Ed Subitzky Noah Wardrip-Fruin, Department of Computational Media, University of California, Santa Cruz, CA, USA Part I Artistic Contributions © The Author(s) 2015 Douglas R. Dechow and Daniele C. Struppa (eds.), Intertwingled, History of Computing, 10.1007/978-3-319-16925-5_1 1. The Computer Age Ed Subitzky (New York, USA; deceased), cartoonist and humor writer Open Access This chapter is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited

pages: 559 words: 157,112

Dealers of Lightning
by Michael A. Hiltzik
Published 27 Apr 2000

DEALERS OF LIGHTNING Xerox PARC and the Dawn of the Computer Age Michael Hiltzik To Deborah, Andrew, and David Contents Cast of Characters v Timeline ix Introduction The Time Machine xv Part I: Prodigies 1 Chapter 1 The Impresario 3 Chapter 2 McColough’s Folly 21 Chapter 3 The House on Porter Drive 33 Chapter 4 Utopia 52 Chapter 5 Berkeley’s Second System 68 Chapter 6 “Not Your Normal Person” 80 Chapter 7 The Clone 97 Chapter 8 The Future Invented 117 Part II: Inventors 125 Chapter 9 The Refugee 127 Chapter 10 Beating the Dealer 145 Chapter 11 Spacewar 155 Chapter 12 Thacker’s Bet 163 Chapter 13 The Bobbsey Twins Build a Network 178 Chapter 14 What You See Is What You Get 194 Chapter 15 On the Lunatic Fringe 211 Chapter 16 The Pariahs 229 Chapter 17 The Big Machine 242 Part III: Messengers 257 Chapter 18 Futures Day 259 Chapter 19 Future Plus One 274 Chapter 20 The Worm That Ate the Ethernet 289 Chapter 21 The Silicon Revolution 300 Chapter 22 The Crisis of Biggerism 314 Chapter 23 Steve Jobs Gets His Show and Tell 329 Chapter 24 Supernova 346 Chapter 25 Blindsided 361 Chapter 26 Exit the Impresario 371 Epilogue Did Xerox Blow It?

Determined in principle to move into the digital world but yoked in practice to the marketing of the copier machine (and unable to juggle two balls at once), Xerox management regarded PARC’s achievements first with bemusement, then uneasiness, and finally hostility. Because Xerox never fully understood the potential value of PARC’s technology, it stood frozen on the threshold of new markets while its rivals—including big, lumbering IBM—shot past into the computer age. Yet this relationship is too easily, and too often, simplified. Legend becomes myth and myth becomes caricature—which soon enough gains a sort of liturgical certitude. PARC today remains a convenient cudgel with which to beat big business in general and Xerox in particular for their myriad sins, including imaginary ones, of corporate myopia and profligacy.

In 1945 Bush had turned his attention to the scientific advances produced in the name of war and to how they might serve the peace. The result was a small masterpiece of scientific augury entitled “As We May Think,” which appeared in the July 1945 issue of The Atlantic Monthly. “As We May Think” remains one of the few genuinely seminal documents of the computer age. Even today it stands out as a work of meticulous scientific and social analysis. The contemporary reader is struck by its pragmatism and farsightedness, expressed without a hint of platitude or utopianism, those common afflictions of writing about the future. Bush was not interested in drawing magical pictures in the air; he was busy scrutinizing the new technologies of the postwar world to see how they might relieve society’s pressing burdens.

pages: 1,104 words: 302,176

The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War (The Princeton Economic History of the Western World)
by Robert J. Gordon
Published 12 Jan 2016

Long-term Productivity Growth,” called attention to the mid-century peak in the U.S. growth process. At the same time there was clear evidence that the long post-1972 slump in U.S. productivity growth was over, at least temporarily, as the annual growth rate of labor productivity soared in the late 1990s. I was skeptical, however, that the inventions of the computer age would turn out to be as important for long-run economic growth as electricity, the internal combustion engine, and the other “great inventions” of the late nineteenth century. My skepticism took the form of an article, also published in 2000, titled “Does the ‘New Economy’ Measure Up to the Great Inventions of the Past?”

We can state this puzzle in two symmetric ways: Why was TFP growth so slow before 1920? Why was it so fast during the fifty years after 1920? The leading hypothesis is that of Paul David, who provided a now well-known analogy between the evolution of electric machinery and of the electronic computer.14 In 1987, Robert Solow quipped, “We can see the computer age everywhere but in the productivity statistics.”15 David responded, in effect: “Just wait”—suggesting that the previous example of the electric dynamo and other electric machinery implied that a long gestation period could intervene between a major invention and its payoff in productivity growth.

However, an examination of railroad timetables tells a surprising story about speed, which improved steadily from 1870 to 1940. Improvements came from mergers, interconnections, better switching, roller bearings, and eventually, in the 1930s, the conversion from inefficient steam locomotives to diesel–electric propulsion and air-conditioned passenger cars. In the pre-computer age, planning rail trips relied heavily on The Official Guide of the Railways, which dates back to 1868. The guide provides a unique window on a world that no longer exists, at least within the United States, of an extremely dense railroad network that connected almost every city and town, no matter how small.25 As an example of this density, the local train between Portland and Bangor, Maine, in 1900 made thirty-two stops along its 135-mile route (one stop every 4.2 miles) and required five hours to do so, for an average speed of twenty-seven miles per hour.

pages: 187 words: 55,801

The New Division of Labor: How Computers Are Creating the Next Job Market
by Frank Levy and Richard J. Murnane
Published 11 Apr 2004

HD6331.L48 2004 331.1—dc22 2003065497 British Library Cataloging-in-Publication Data is available This book has been composed in Dante Printed on acid-free paper. pup.princeton.edu www.russellsage.org Printed in the United States of America CONTENTS Acknowledgments vii CHAPTER 1 New Divisions of Labor 1 PART I Computers and the Economy CHAPTER 2 Why People Still Matter 13 CHAPTER 3 How Computers Change Work and Pay 31 PART II The Skills Employers Value CHAPTER 4 Expert Thinking 57 CHAPTER 5 Complex Communication 76 PART III How Skills Are Taught CHAPTER 6 Enabling Skills 99 CHAPTER 7 Computers and the Teaching of Skills 109 CHAPTER 8 Standards-Based Education Reform in the Computer Age 131 CHAPTER 9 The Next Ten Years 149 Notes 159 Index 169 ACKNOWLEDGMENTS MANY PEOPLE HELPED US DURING THE YEARS THAT WE worked on this book. First and foremost, we thank our friend and colleague David Autor, professor of economics at MIT. David has had a long-standing interest in the impacts of computers on work.

For rich discussions of the asymmetric information and self-selection ideas, see Daron Acemoglu and Jorn-Steffan Pischke, "Beyond Becker: Training in Imperfect Labour Markets," Economic Journal 109, no. 453 (February 1999): F112–42; and David Autor, "Why Do Temporary Help Firms Provide Free General Skills Training?" Quarterly Journal of Economics 116, no. 4 (November 2001): 1409–48. CHAPTER 8. Standards-Based Education Reform in the Computer Age 1. The wage data come from the following Economic Policy Institute website: http://www.epinet.org/content.cfm/datazone dznational. 2. Expressed in constant 2000–01 dollars, the relevant numbers are $4,427 for the 1969–70 school year and $7,653 for the 1989–90 school year. These figures are taken from the Digest of Education Statistics 2001, p. 191, table 167. 3.

pages: 351 words: 107,966

The Secret Life of Bletchley Park: The WWII Codebreaking Centre and the Men and Women Who Worked There
by Sinclair McKay
Published 24 May 2010

The austere wooden huts on the lawns and in the meadows played host to some of the most gifted – and quirky – individuals of their generation. Not only were there long-standing cryptographers of great genius; there were also fresh, brilliant young minds, such as Alan Turing, whose work was destined to shape the coming computer age, and the future of technology. Also at Bletchley Park were thousands of dedicated people, mostly young, many drawn straight from university. Some came straight from sixth form. As the war progressed, numbers grew. Alongside the academics, there were platoons of female translators and hundreds of eager Wrens, there to operate the fearsomely complicated prototype computing machines; there was also a substantial number of well-bred debutantes, sought out upon the social grapevine, and equally determined to do their bit.

The one name that shines out in terms of engineering ingenuity was Tommy Flowers, familiarly known as the ‘clever cockney’. There are some who argue that the name should be known in every household – for, they believe, he was the man who realised the dreams of Alan Turing and truly brought the computer age into being. In 1943, Bletchley Park had seen the establishment of a new section known as ‘The Newmanry’. It was set up under the aegis of mathematician Professor Max Newman from St John’s College, Cambridge, and the idea of it was to find ways of applying more advanced machinery to codebreaking work.

Flowers himself said: ‘It was a feat made possible by the absolute priority they were given to command materials and services and the prodigious efforts of the laboratory staff, many of whom did nothing but work, eat and sleep for weeks and months on end except for one half day a week … the US also contributed valves and an electric typewriter under the lend-lease’.8 And so this monster, this Colossus, was delivered to Bletchley in January 1944; and with it, many argue, came the dawn of the computer age. For this was more than just a huge, elaborate counting machine; it worked to a program, via electronic valve pulses and delicate, complex circuits, at a rate hitherto unimagined, opening up the Lorenz messages at a terrific rate. Tommy Flowers was vindicated; the work he did proved utterly invaluable.

pages: 408 words: 105,715

Kingdom of Characters: The Language Revolution That Made China Modern
by Jing Tsu
Published 18 Jan 2022

As China entered a new era of internationalization, a new technological integration challenge loomed. It would dwarf the earlier challenges of telegraphy. As much as the Chinese script revolution had accomplished up to this point, it would still need to take several more steps to keep pace with the coming computing age. Now with a second identity in alphabetic form, the Chinese script had never been in a better position to propel China forward. The question was how far and how fast it could make that next leap.

six

ENTERING INTO THE COMPUTER (1979)

converting input and output

Even without being able to see the July sun, Zhi Bingyi felt its smoldering heat between his back and a thin, sweat-drenched straw mat.

The data for calculating and controlling flight paths, military targets, and geographical positioning, or tracking agricultural and industrial output, had to be collected first. Yet all the existing records, documents, and reports were in Chinese. It became clear that in order to be part of the computing age at all, the Chinese script would have to be rendered digitally. Western computing technology was also moving in the direction of text processing and communication, not just running large-scale calculations. Converting human language scripts into digital form was the next frontier. The arms race during the Cold War was advancing the state of computing technology in both the Soviet Union and the United States.

During Zhi’s incarceration, China was in the throes of its biggest social and political upheaval yet and hardly had the resources to make such a bid for the future. But for a country so far behind the Western world, science and technology were not just a barrier. They were viewed as essential for helping China leapfrog out of backwardness and speed up the process of modernization. China was doubly invested in exploring the computing age. Of the countless obstacles that lay in the country’s path, the Chinese language was the one that could stall the state’s ambitious plans before they even got off the ground. The challenge was multifaceted: to devise a code for Chinese that was easy for humans to remember and use and that could be entered into a machine via punched tape or keyboard; to find a way for the machine to store the massive amount of information required to identify and reproduce Chinese characters; and to retrieve and restore the script with pinpoint precision, on paper or on a screen.

pages: 843 words: 223,858

The Rise of the Network Society
by Manuel Castells
Published 31 Aug 1996

When it was turned on, its electricity consumption was so high that Philadelphia’s lighting twinkled.47 Yet the first commercial version of this primitive machine, UNIVAC I, produced in 1951 by the same team, then under the Remington Rand brand name, was extremely successful in processing the 1950 US census. IBM, also supported by military contracts and relying partly on MIT research, overcame its early reservations about the computer age and entered the race in 1953 with its 701 vacuum tube machine. In 1958, when Sperry Rand introduced a second-generation computer mainframe machine, IBM immediately followed up with its 7090 model. But it was only in 1964 that IBM, with its 360/370 mainframe computer, came to dominate the computer industry, populated by new (Control Data, Digital) and old (Sperry, Honeywell, Burroughs, NCR) business-machine companies.

Furthermore, since the mid-1980s, microcomputers cannot be conceived of in isolation: they perform in networks, with increasing mobility, on the basis of portable computers. This extraordinary versatility, and the capacity to add memory and processing capacity by sharing computing power in an electronic network, decisively shifted the computer age in the 1990s from centralized data storage and processing to networked, interactive computer power-sharing. Not only did the whole technological system change, but its social and organizational interactions as well. Thus, the average cost of processing information fell from around $75 per million operations in 1960 to less than one-hundredth of a cent in 1990.

Ball-Rokeach, Sandra J. and Cantor, Muriel (eds) (1986) Media, Audience and Social Structure, Beverly Hills, CA: Sage. Banegas, Jesus (ed.) (1993) La industria de la información: situación actual y perspectivas, Madrid: Fundesco. Bar, François (1990) “Configuring the telecommunications infrastructure for the computer age: the economics of network control”, unpublished PhD dissertation, Berkeley, CA: University of California. —— (1992) “Network flexibility: a new challenge for telecom policy”, Communications and Strategies, special issue, June: 111–22. —— and Borrus, M. (1993) The Future of Networking, Berkeley, CA: University of California, BRIE working paper. —— and —— with Coriat, Benjamin (1991) Information Networks and Competitive Advantage: Issues for Government Policy and Corporate Strategy Development, Brussels: Commission of European Communities, DGIII–BRIE–OECD Research Program.

pages: 567 words: 171,072

The Greatest Capitalist Who Ever Lived: Tom Watson Jr. And the Epic Story of How IBM Created the Digital Age
by Ralph Watson McElvenny and Marc Wortman
Published 14 Oct 2023

It was the biggest privately financed commercial project ever undertaken.”28 The total amount spent on the 360, equivalent to almost $50 billion today, was more than double the cost of the Manhattan Project, the development of the atomic bomb, and twice IBM’s total revenue in 1962, the year when the company embarked on the New Product Line project.29 With a veritable torrent of cash pouring out and delays in deliveries continuing, IBM would not see any significant return on the System/360 for months, if not years for some models, to offset its capital outlay. Meanwhile, growth in existing revenues from clients’ now-obsolete computers, aged tabulators, and popular Selectric typewriters could not keep up with out-of-control expenses. Company cash coffers were dwindling like water from a leaky bucket while the fire raged. Every morning, Watson arrived at his Armonk office to more disturbing news that set his pulse racing. There was no getting around the trouble.

Five years after the first shipments began, IBM had installed about five thousand of its medium and large System/360 models and more than ten thousand of its lower-end lines within the US, and about 50 percent of that total in the rest of the world.10 At the high end, the 360 could perform one million instructions per second, and even the slowest models could handle 75,000. And with this newfound power, at prices most sizable businesses could afford, myriad companies and entire industries moved into the computer age. IBM estimated that by 1970, at least three thousand different types of businesses and scientific research organizations relied on one or more of the System/360’s nineteen models.11 From airline reservations, credit card transactions, and inventory control, to stock sales, loan interest setting, and insurance rate determinations, and from high-speed data telecommunications, air traffic control, and educational testing to rocket guidance and biomedical data analysis, virtually any process or operation incorporating a database could be automated, calculated, communicated, and executed almost instantaneously.

Catherine Fredman with Gideon Gartner, About Gartner: The Making of a Billion-Dollar IT Advisory Firm (Lemonade Heroes, 2014), 128; quoted in Cortada, IBM, 279. 10. Ralph W. McElvenny. 11. Louis V. Gerstner Jr., Who Says Elephants Can’t Dance?: Inside IBM’s Historic Turnaround (New York: HarperBusiness, 2002), 182. 12. “Thomas J. Watson Jr.; Led IBM into Computer Age,” Los Angeles Times, January 1, 1994. 13. Jeannette Watson, It’s My Party: A Memoir (Brooklyn: Turtle Point Press, 2017), 56. 14. Conversation with Ralph W. McElvenny. 15. Conversation with Ralph W. McElvenny. 16. Ralph W. McElvenny; family sources. 17. “Thomas Watson, Jr. Remembered,” IBM, www.ibm.com/ibm/history/exhibits/watsonjr/watsonjr_remembered.html.

User Friendly: How the Hidden Rules of Design Are Changing the Way We Live, Work & Play
by Cliff Kuang and Robert Fabricant
Published 7 Nov 2019

What user-centered design did was to build a sensing process that gave companies a way to mimic that of the inventor. The gospel of innovation, and the imperative to innovate or be washed away by the rising tide of competition, rings hollow unless you have some mechanism for finding new ideas. The beauty of the design process as articulated at the dawn of the computer age was that we could all innovate, if only we knew how to empathize. Industrial empathy arose precisely when a new wave of technology arrived that few people understood, and that almost no one had ever bought for themselves. But when empathy becomes an imperative, then the question becomes: With whom should you empathize?

Jane Fulton Suri, “Saving Lives Through Design,” Ergonomics in Design (Summer 2000): 2–10. 15. Interview with Bill Atkinson and Andy Hertzfeld, May 14, 2018. 16. Interview with Bruce Horn, May 9, 2018. 17. Interview with Atkinson; Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: HarperCollins, 1999), 332–45. 18. Interview with Hertzfeld. 19. Hiltzik, Dealers of Lightning, 340. 20. Alan Kay, “A Personal Computer for Children of All Ages,” Proceedings of the ACM National Conference, Xerox Palo Alto Research Center, 1972, Viewpoints Research Institute, http://worrydream.com/refs/Kay%20-%20A%20Personal%20Computer%20for%20Children%20of%20All%20Ages.pdf. 21.

New York Times, May 7, 2011. www.nytimes.com/2011/05/08/technology/08class.html. Hempel, Jessi. “What Happened to Facebook’s Grand Plan to Wire the World?” Wired, May 17, 2018. www.wired.com/story/what-happened-to-facebooks-grand-plan-to-wire-the-world/. Hiltzik, Michael A. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperCollins, 1999. Hounshell, David A. From the American System to Mass Production, 1800–1932. Baltimore: Johns Hopkins University Press, 1985. Hunt, Morton M. “The Course Where Students Lose Earthly Shackles.” Life, May 16, 1955. Hutchins, Edwin. Cognition in the Wild. Cambridge, MA: MIT Press, 1995.

pages: 193 words: 19,478

Memory Machines: The Evolution of Hypertext
by Belinda Barnet
Published 14 Jul 2013

Such an observation would be entirely unnecessary for a hypertext system now, but in 1986, by contrast, most ‘computer people’ who were tinkering with networks were doing AI.9 They go on to note with no small sense of irony that ‘we may be among a very small number of current software developers who do not claim to be doing AI’ (Bolter and Joyce 1986, 34). Bolter published a book shortly after his fellowship at Yale that would become a classic in computing studies, Turing’s Man: Western Culture in the Computer Age (1984). In Turing’s Man, he sets out to ‘foster a process of crossfertilization’ between computing science and the humanities and to explore the cultural impact of computing (Bolter 1984, xii).10 He also introduces some ideas around ‘spatial’ writing that would recur and grow in importance in his later work: in particular, the relationship between early Greek systems of place memory loci (the art of memory) and electronic writing.

In Hypertext ’91 Proceedings, edited by John J. Leggett, 243–60. San Antonio: ACM. Bogost, Ian. 2010. ‘Cow Clicker: The Making of an Obsession’. Ian Bogost Blog, 21 July. Online: http://www.bogost.com/blog/cow_clicker_1.shtml (accessed March 2012). Bolter, Jay David. 1984. Turing’s Man: Western Culture in the Computer Age. Chapel Hill: University of North Carolina Press. . 1985. ‘The Idea of Literature in the Electronic Medium’. Topic 39: 23–34. . 1991. Writing Space: The Computer, Hypertext and the History of Writing. Hillsdale: Lawrence Erlbaum Associates. Bolter, Jay David and Richard Grusin. 2000. Remediation: Understanding New Media.

pages: 271 words: 77,448

Humans Are Underrated: What High Achievers Know That Brilliant Machines Never Will
by Geoff Colvin
Published 3 Aug 2015

A likely explanation is that the whole notion seems kind of weird—learning how to do something that we innately do. Most people wouldn’t even consider empathy a skill; they’d say it’s a trait, something you just have. We will see that, in this, it’s like many of the skills that turn out to be the high-value skills of the computer age—very deeply human, widely regarded as traits, not skills, and the kinds of things we don’t even think of as trainable. But we can get better at them—extraordinarily better—if we’re willing to think about them in a new way. In fact, we know a great deal about how it’s done. There exists a vast store of knowledge about how to make ordinary people much, much better at some essential abilities of human interaction, including the ones that will prove most valuable in the coming economy, and this knowledge resides in a most unexpected place.

In a world like that . . . Loukas Karabarbounis and Brent Neiman, “The Global Decline of the Labor Share,” National Bureau of Economic Research, October 2013. The authors find that “the decrease in the relative price of investment goods, often attributed to advances in information technology and the computer age, induced firms to shift away from labor and toward capital.” Economists aren’t the only experts . . . Quotations in this paragraph are from Pew Research Center, op. cit. Microsoft founder Bill Gates . . . He made these remarks during a session at the American Enterprise Institute in Washington, D.C., 13 March 2014.

pages: 238 words: 77,730

Final Jeopardy: Man vs. Machine and the Quest to Know Everything
by Stephen Baker
Published 17 Feb 2011

Naturally, a Jeopardy computer would run on IBM hardware. But the heart of the system, like IBM itself, would be the software created to answer difficult questions. A Jeopardy machine would also respond to another change in technology: the move toward human language. For most of the first half-century of the computer age, machines specialized in orderly rows of numbers and words. If the buyers in a database were listed in one column, the products in another, and the prices in a third, everything was clear: Computers could run the numbers in a flash. But if one of the customers showed up as “Don” in one transaction and “Donny” in another, the computer viewed them as two people: The two names represented different strings of ones and zeros, and therefore Don ≠ Donny.
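The Don/Donny problem the excerpt describes can be sketched in a few lines. This is an illustrative aside, not IBM's actual method: exact string comparison sees two different customers, while an approximate-matching technique (here, Python's standard-library `SequenceMatcher`, one of many options) scores how alike the names are instead.

```python
from difflib import SequenceMatcher

# Exact comparison: to the computer, these are simply different strings of bits.
assert "Don" != "Donny"

def similarity(a: str, b: str) -> float:
    """Score two names between 0.0 (nothing shared) and 1.0 (identical)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# "Don" and "Donny" share most of their characters, so they score high;
# "Don" and "Margaret" share none, so they score near zero.
print(similarity("Don", "Donny"))
print(similarity("Don", "Margaret"))
```

A system that merges records above some similarity threshold would treat Don and Donny as one buyer, at the cost of occasionally merging genuinely distinct people.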

In its scope, Cyc was as ambitious as the eighteenth-century French encyclopedists, headed by Denis Diderot, who attempted to catalogue all of modern knowledge (which had grown significantly since the days of Aristotle). Cyc, headed by a computer scientist named Douglas Lenat, aspired to fill a similar role for the computer age. It would lay out the relationships of practically everything, from plants to presidents, so that intelligent machines could make inferences. If they knew, for example, that Ukraine produced wheat, that wheat was a plant, and that plants died without water, it could infer that a historic drought in Ukraine would curtail wheat production.

pages: 437 words: 132,041

Alex's Adventures in Numberland
by Alex Bellos
Published 3 Apr 2011

He wants simply to show people that there is an alternative to the decimal system, and that perhaps it suits their needs better. He knows that the chances of the world abandoning dix for douze are non-existent. The change would be both confusing and expensive. And decimal works well enough for most people – especially in the computer age, where mental arithmetic skills are less required generally. ‘I would say that dozenal is the optimum base for general computation, for everyday use,’ he added, ‘but I am not here to convert anybody.’ An immediate goal of the DSA is to get the numerals for dek and el into Unicode, the repertoire of text characters used by most computers.
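Converting a number into dozenal needs only repeated division by twelve. A minimal sketch, using ‘X’ for dek (ten) and ‘E’ for el (eleven) as stand-in glyphs for the characters the DSA hopes to see in Unicode:

```python
def to_dozenal(n: int) -> str:
    """Render a non-negative integer in base twelve.
    'X' and 'E' are placeholder glyphs for dek (10) and el (11)."""
    digits = "0123456789XE"
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, 12)   # peel off the least significant base-12 digit
        out.append(digits[r])
    return "".join(reversed(out))

print(to_dozenal(11))   # el
print(to_dozenal(12))   # one dozen
print(to_dozenal(144))  # one gross
```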

When we arrived at his house, his wife made us a cup of tea and we retired to his study, where he presented me with a wooden 1970s Faber-Castell slide-rule with a magnolia-coloured plastic finish. The rule was the size of a normal 30cm ruler and had a sliding middle section. On it, several different scales were marked in tiny writing. It also had a transparent movable cursor marked with a hairline. The shape and feel of the Faber-Castell were deeply evocative of a kind of post-war, pre-computer-age nerdiness – when geeks had shirts, ties and pocket protectors rather than T-shirts, sneakers and iPods. I went to secondary school in the 1980s, by which time slide-rules were no longer used, so Hopp gave me a quick tutorial. He recommended that as a beginner I should use the log scale from 1 to 100 on the main ruler and adjacent log scale from 1 to 100 on the sliding middle section.

In 1876, two and a half centuries after Mersenne wrote his list, the French number theorist Édouard Lucas devised a method that was able to check whether numbers written 2^n – 1 are prime, and he found that Mersenne was wrong about 67 and that he had left out 61, 89 and 107. Amazingly, however, Mersenne had been right about 127. Lucas used his method to prove that 2^127 – 1, or 170,141,183,460,469,231,731,687,303,715,884,105,727, was prime. This was the highest-known prime number until the computer age. Lucas, however, was unable to determine if 2^257 – 1 was prime or not; the number was simply too large to work on with pencil and paper. Despite its patches of error, Mersenne’s list immortalized him; and now a prime that can be written in the form 2^n – 1 is known as a Mersenne prime. Whether 2^257 – 1 is prime would not be settled until 1952, using the Lucas method, but with a big assist.
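Lucas's method survives today, in the streamlined form later given by Derrick Lehmer, as the Lucas–Lehmer test; a modern computer runs in milliseconds the calculation that defeated Lucas's pencil and paper. A minimal sketch:

```python
def is_mersenne_prime(p: int) -> bool:
    """Lucas–Lehmer test: for an odd prime p, 2**p - 1 is prime
    iff s == 0 after p - 2 iterations of s -> s*s - 2 (mod 2**p - 1)."""
    if p == 2:
        return True  # 2**2 - 1 = 3 is prime
    m = (1 << p) - 1  # the Mersenne number 2**p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Mersenne was right about 127 but wrong about 67; he missed 61, 89 and 107.
print([p for p in (61, 67, 89, 107, 127, 257) if is_mersenne_prime(p)])
# → [61, 89, 107, 127]  (and 2**257 - 1, settled in 1952, is composite)
```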

pages: 464 words: 127,283

Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia
by Anthony M. Townsend
Published 29 Sep 2013

Manned by a small team of attendants from the Royal Sappers and Miners, the British military’s engineering corps, the vents were adjusted every two hours based on readings from fourteen thermostats placed throughout the structure.1 While far from automatic, the Crystal Palace’s ventilation system showed how mechanical controls and sensors could work together to dynamically reconfigure an entire, massive building in response to changes in the environment. Paxton’s contraption was a harbinger of the automation revolution that will transform the buildings and cities we live in over the coming decades. More than a century later, at the dawn of the computer age, a design for a very different kind of gathering space spurred another bold leap into building automation. Howard Gilman was the heir to a paper-making fortune but his true avocation was philanthropist and patron of the arts. Gilman lavished his family fortune on a variety of causes, supporting trailblazers in dance, photography, and wildlife preservation.

Cybernetic thinking inspired new directions in engineering, biology, neuroscience, organizational studies, and sociology. Cybernetics underpinned the plotline for Foundation, but advances in computing provided the props. Just weeks before the 1945 American nuclear strikes on Hiroshima and Nagasaki, Vannevar Bush published a seminal article in The Atlantic that laid out a road map for the computer age. Bush was a technological authority without equal, an MIT man who during World War II had directed the entire US scientific effort, including the Manhattan Project that developed the nuclear weapons used against Japan. Like Asimov’s psychohistorians, who wielded tablet computers as cognitive prosthetics as they built their socioeconomic simulations, Bush believed that the new thinking machines would liberate the creative work of cyberneticians from the drudgery of computation.

But as data mining and recommendations move to the forefront, Foursquare runs the risk of becoming a quixotic attempt to compute serendipity and spontaneity. The city of Foursquare might look like a lattice, but is it becoming an elaborate tree traced by hidden algorithms? Instead of urging us to explore on our own, will it guide us down a predetermined path based on what we might buy? The DIY City For most people the computer age began with the IBM PC, which went on sale in 1981. True geeks, however, date the opening shots of the personal-computer revolution to the launch of the MITS Altair 8800 in 1975. The Altair dramatically democratized access to computing power. At the time, Intel’s Intellec-8 computer cost $2,400 in its base configuration (and as much as $10,000 with all the add-ons needed to develop software for it).

Paper Knowledge: Toward a Media History of Documents
by Lisa Gitelman
Published 26 Mar 2014

Ivan Edward Sutherland, “Sketchpad, A Man-Machine Graphical Communication System,” PhD diss., Massachusetts Institute of Technology, 1963. 28. Thierry Bardini, Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing (Stanford, CA: Stanford University Press, 2000), 84. See also Michael Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: Harper Collins, 1999), 90–91. 29. Ivan Edward Sutherland, “Sketchpad, A Man-Machine Graphical Communication System,” PhD diss., Massachusetts Institute of Technology, 1963, 70–71. 30. Ivan Edward Sutherland, “Sketchpad: A Man-Machine Graphical Communication System,” AFIPS Proceedings 23 (1963): 335. 31.

Chapel Hill: published in association with the American Antiquarian Society by the University of North Carolina Press, 2009. Hilderbrand, Lucas. Inherent Vice: Bootleg Histories of Videotape and Copyright. Durham, NC: Duke University Press, 2009. Hiltzik, Michael. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: Harper Collins, 1999. Hockey, Susan. “The History of Humanities Computing.” In A Companion to the Digital Humanities, edited by Susan Schreibman, Ray Siemens, and John Unsworth, 3–19. Malden, MA: Blackwell, 2004. Hofmeyr, Isabel. Gandhi’s Printing Press: Experiments in Slow Reading.

pages: 361 words: 83,886

Inside the Robot Kingdom: Japan, Mechatronics and the Coming Robotopia
by Frederik L. Schodt
Published 31 Mar 1988

Why, though, if karakuri became only a form of art and entertainment, are they given so much attention in Japan today? Tatsukawa, the professor partly responsible for re-popularizing them, suggests that modern people, being surrounded by cold, impersonal machines, long for technology with a more human face. The cute, simple karakuri help satisfy this craving. He also notes that since our computer age has made us reliant on millions of automatic and mechatronic devices, there is an overpowering interest in simple, easy-to-understand automatic and autonomous mechanisms. Finally, he points out how Japanese "used to think that automata only existed in Europe. Realizing that Japan also had this technological capability in the Edo period has increased interest, because karakuri can be seen as a point where Japan's machine civilization began."9 The Roots of Modern Japanese Technology When Commodore Perry and his fleet steamed into Uraga Bay in 1853, demanding trading rights at the point of a gun, Japan's nearly two hundred and fifty years of isolation was effectively ended.

The solution was easy—to rotate employees among different tasks and help them grasp the workings of the total system—but the problem was by no means limited to Star Micronics.22 * * * * * * * * * * * * While robots should also liberate humans from hazardous work, sometimes they themselves are the hazard. Robots only do as commanded by their programmers or operators, yet they are distinctly different from pre-computer-age industrial machines such as cranes or fork lift trucks that operate under direct human control. The robot's movements are set in advance. If equipped with sensors, a robot may begin moving through a programmed sequence or change its moves according to the information they feed it. But a malfunction due to component failure or electronic interference from other machines may make an industrial robot arm that seems at rest suddenly start moving, at a speed or in a direction never intended.

pages: 261 words: 81,802

The Trouble With Billionaires
by Linda McQuaig
Published 1 May 2013

Back in that early postwar period, the enormous economic gains were more widely shared, as a result of the more egalitarian ethos that produced more powerful unions, more progressive taxation and more generous social programmes. Today, the stupendous economic gains made possible by the technological advances of the computer age have been almost entirely captured and retained by a tiny elite, with little of the wealth flowing back to society through the tax system. This seems grossly unfair. It also appears to have led some of the lucky few to develop a false sense of their own contribution. We take issue, for instance, with Lew Frankfort, the CEO of Coach, who argued that today’s billionaires deserve their fortune because they had the ‘vision, lateral thinking, courage and an ability to see things’ that was necessary to succeed in the ‘technological age’.

Each passport has a unique identification number. Every time a person crosses a border, her number is swiped into a computer, which instantly discloses information about her. Transmitting an electronic record of all payments made by all banks would be no more complicated than that. It turns out that among the benefits of the computer age are not only video games but also the easy tracking and taxation of the gigantic fortunes of the world’s billionaires. People wanting to conceal income from authorities would then find it no easier to move money undetected around the world than to travel without a passport. Checkmate. Support the international implementation of a Financial Transaction Tax, sometimes referred to as the ‘Robin Hood Tax’ or the ‘Tobin Tax’ The idea of curbing financial speculation by imposing a tax on financial transactions was first proposed by John Maynard Keynes in 1936 during the Great Depression.

pages: 304 words: 82,395

Big Data: A Revolution That Will Transform How We Live, Work, and Think
by Viktor Mayer-Schonberger and Kenneth Cukier
Published 5 Mar 2013

Knowing this, it could then treat translation as a giant math problem, with the computer figuring out probabilities to determine what word best substitutes for another between languages. Of course Google was not the only organization that dreamed of bringing the richness of the world’s written heritage into the computer age, and it was hardly the first to try. Project Gutenberg, a volunteer initiative to place public domain works online as early as 1971, was all about making the texts available for people to read, but it didn’t consider ancillary uses of treating the words as data. It was about reading, not reusing.

A worldview we thought was made of causes is being challenged by a preponderance of correlations. The possession of knowledge, which once meant an understanding of the past, is coming to mean an ability to predict the future. These issues are much more significant than the ones that presented themselves when we prepared to exploit e-commerce, live with the Internet, enter the computer age, or take up the abacus. The idea that our quest to understand causes may be overrated—that in many cases it may be more advantageous to eschew why in favor of what—suggests that the matters are fundamental to our society and our existence. The challenges posed by big data may not have set answers, either.

pages: 361 words: 81,068

The Internet Is Not the Answer
by Andrew Keen
Published 5 Jan 2015

The Stanford University political scientist Francis Fukuyama, assuming that the great debate between capitalists and socialists over the best way to organize industrial society had finally been settled, described the moment that the Wall came down as the “End of History.” But the converse is actually true. Nineteen eighty-nine actually represents the birth of a new period of history, the Networked Computer Age. The Internet has created new values, new wealth, new debates, new elites, new scarcities, new markets, and above all, a new kind of economy. Well-intentioned technologists like Vannevar Bush, Norbert Wiener, J. C. R. Licklider, Paul Baran, Robert Kahn, and Tim Berners-Lee had little interest in money, but one of the most significant consequences of their creation has been the radical reshaping of economic life.

But even Solow, whose research was mostly based on productivity improvements from the 1940s, ’50s, and ’60s, later became more skeptical of labor’s ability to maintain its parity with capital in terms of reaping the rewards of more economic productivity. In a 1987 New York Times Book Review piece titled “We’d Better Watch Out,” he acknowledged that what he called “Programmable Automation” hadn’t increased labor productivity. “You can see the computer age everywhere,” he memorably put it, “but in the productivity statistics.”73 Timothy Noah, the author of The Great Divergence, a well-received book on America’s growing inequality crisis, admits that computer technology does create jobs. But these, he says, are “for highly skilled, affluent workers,” whereas the digital revolution is destroying the jobs of “moderately skilled middle class workers.”74 The influential University of California, Berkeley economist and blogger J.

Humble Pi: A Comedy of Maths Errors
by Matt Parker
Published 7 Mar 2019

If you can write a short computer program to generate a sequence of numbers, then that sequence cannot be random. If the only way to communicate a sequence is to print it out in its entirety, then you’ve got some randomness on your hands. And printing random numbers out is sometimes the best option.

Let’s get physical

Before the computer age, lists of random numbers had to be generated in advance and printed as books for people to buy. I say ‘before the computer age’, but when I was at school in the 1990s we still had books with tables of random numbers in them. Handheld calculators, and indeed handheld computers, have come a long way since then. But, for true randomness, the printed page is hard to beat.
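The short-program criterion above is easy to demonstrate. A linear congruential generator, one of the oldest pseudorandom recipes, regenerates an arbitrarily long "random-looking" sequence from a few lines and a seed, so by that criterion its output is pseudorandom, not random (the multiplier and increment below are the widely published Numerical Recipes constants):

```python
def lcg(seed: int, n: int, a: int = 1664525, c: int = 1013904223, m: int = 2**32):
    """Minimal linear congruential generator: x -> (a*x + c) mod m."""
    out = []
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x)
    return out

# The same short program plus the same seed yields the identical
# sequence every time - exactly what a truly random source cannot do.
assert lcg(42, 1000) == lcg(42, 1000)
```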

pages: 308 words: 85,880

How to Fix the Future: Staying Human in the Digital Age
by Andrew Keen
Published 1 Mar 2018

The whole spectacle—the dilapidated room, the mesmerized audience, the pixelated face flickering on the giant screen—recalls for me one of television’s most iconic commercials, the Super Bowl XVIII slot for the Apple Macintosh computer. In this January 1984 advertisement for the machine that launched the personal computer age, a man on a similarly large screen in a similarly decrepit room addresses a crowd of similarly transfixed people. But in the Macintosh commercial the man is a version of Big Brother, the omniscient tyrant from Orwell’s twentieth-century dystopian novel Nineteen Eighty-Four. The young man on the Berlin screen, in contrast, is an enemy of authoritarianism.

Reback has a keen sense of history. In his very readable history of antitrust, Free the Market! Why Only Government Can Keep the Marketplace Competitive,19 he explicitly compares what he calls the “excesses” of nineteenth-century “robber baron” industrial capitalism to the networked economics of the computer age. Yet Reback is too sophisticated a student of modern history to fall into the trap of seeing it endlessly repeat itself, like an annoying internet meme. So yes, he acknowledges, Google is, in many ways, the new Microsoft, and the great challenge for antitrust lawyers like himself, as it was in the 1990s with the so-called Beast of Redmond, is to build a case against the Mountain View leviathan.

pages: 696 words: 143,736

The Age of Spiritual Machines: When Computers Exceed Human Intelligence
by Ray Kurzweil
Published 31 Dec 1998

As transistor die sizes decrease, the electrons streaming through the transistor have less distance to travel, hence the switching speed of the transistor increases. So exponentially improving speed is the first strand. Reduced transistor die sizes also enable chip manufacturers to squeeze a greater number of transistors onto an integrated circuit, so exponentially improving densities of computation is the second strand. In the early years of the computer age, it was primarily the first strand—increasing circuit speeds—that improved the overall computation rate of computers. During the 1990s, however, advanced microprocessors began using a form of parallel processing called pipelining, in which multiple calculations were performed at the same time (some mainframes going back to the 1970s used this technique).

Collins, eds. Representation and Understanding. New York: Academic Press, 1975. Boden, Margaret. Artificial Intelligence and Natural Man. New York: Basic Books, 1977. ____. The Creative Mind: Myths & Mechanisms. New York: Basic Books, 1991. Bolter, J. David. Turing’s Man: Western Culture in the Computer Age. Chapel Hill: The University of North Carolina Press, 1984. Boole, George. An Investigation of the Laws of Thought on Which Are Founded the Mathematical Theories of Logic and Probabilities. 1854. Reprint. Peru, IL: Open Court Publishing, 1952. Botvinnik, M. M. Computers in Chess: Solving Inexact Search Problems.

New York: Copernicus, 1997. Depew, David J. and Bruce H. Weber, eds. Evolution at a Crossroads. Cambridge, MA: MIT Press, 1985. Dertouzos, Michael. What Will Be: How the New World of Information Will Change Our Lives. New York: HarperCollins, 1997. Dertouzos, Michael L., and Joel Moses, eds. The Computer Age: A Twenty-Year View. Cambridge, MA: MIT Press, 1979. Descartes, R. Discourse on Method, Optics, Geometry, and Meteorology. 1637. Reprint. Indianapolis, IN: Bobbs-Merrill, 1956. _____. Meditations on First Philosophy. Paris: Michel Soly, 1641. _____. Treatise on Man. Paris, 1664. Devlin, Keith.

pages: 515 words: 152,128

Material World: A Substantial Story of Our Past and Future
by Ed Conway
Published 15 Jun 2023

It is not merely deployed as the foundations of buildings or the substance with which countries extend their territory. The story of sand is not only one of astounding scale, but one of astounding miniaturisation. For the very same atom that comprises the backbone of concrete is the atom upon which we etched and formed the computer age. The paradox of sand is that today we make one of humankind’s most precious creations from one of nature’s most abundant elements. But doing so involves a set of transformations even more astounding than that of glass or of concrete. It also involves one of the most marvellous journeys in the world. 3 The Longest Journey There are many extraordinary journeys in the natural world.

For, by combining enough of these switches, each one a tiny, physical manifestation of the binary code, zero or one, you could create a computer on a tiny piece of silicon, chipped off a circular wafer (hence ‘chips’). These leaps of innovation, from the switch itself to the ‘integrated circuit’, the first of which was etched on to silicon by Robert Noyce at Fairchild Semiconductor in 1959, represented the physical foundation of the computing age. That word, physical, matters here. Sometimes innovations are the fruit of a simple brainwave. As historian Anton Howes has pointed out, there was no inherent reason why the flying shuttle – John Kay’s 1733 invention which revolutionised the weaving of wool – couldn’t have been produced thousands of years earlier.

Over the past century and a bit Chuqui, as locals call it, has produced more copper than any other such mine in history. Of every 13 grams of copper ever mined and refined from this planet’s crust, at least one of them came from here.3 In the past century and a quarter, countless companies have come and gone and business empires have risen and fallen. The age of electricity took flight, the computer age was born and matured and electric cars began to replace their petroleum counterparts, but throughout it all, Chuqui has endured. Year after year, billions of tonnes of rock have been torn from the ground and refined into hundreds of thousands of tonnes of pure metal. From this hole which few in the world have heard of, let alone seen, came the copper that helped power the twentieth century.

pages: 339 words: 88,732

The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies
by Erik Brynjolfsson and Andrew McAfee
Published 20 Jan 2014

As Intel executive Mike Marberry puts it, “If you’re only using the same technology, then in principle you run into limits. The truth is we’ve been modifying the technology every five or seven years for 40 years, and there’s no end in sight for being able to do that.”6 This constant modification has made Moore’s Law the central phenomenon of the computer age. Think of it as a steady drumbeat in the background of the economy. Charting the Power of Constant Doubling Once this doubling has been going on for some time, the later numbers overwhelm the earlier ones, making them appear irrelevant. To see this, let’s look at a hypothetical example. Imagine that Erik gives Andy a tribble, the fuzzy creature with a high reproductive rate made famous in an episode of Star Trek.
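The claim that "the later numbers overwhelm the earlier ones" is easy to verify (an illustrative sketch, not the authors' code):

```python
# After repeated doubling, the newest term alone exceeds the sum of all
# earlier terms combined: 2**n > 2**0 + 2**1 + ... + 2**(n-1) == 2**n - 1.

def doublings(start, n):
    """Return [start, 2*start, 4*start, ...]: n doublings, n+1 terms."""
    return [start * 2**k for k in range(n + 1)]

seq = doublings(1, 40)                  # forty doublings of a single unit
latest, all_earlier = seq[-1], sum(seq[:-1])
assert latest == all_earlier + 1        # the last doubling tops everything before it
```

This is why, on a chart of Moore's law, every earlier era looks like a flat line hugging zero.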

FIGURE 7.1: Labor Productivity

Productivity improvement was particularly rapid in the middle part of the twentieth century, especially the 1940s, 50s, and 60s, as the technologies of the first machine age, from electricity to the internal combustion engine, started firing on all cylinders. However, in 1973 productivity growth slowed down (see figure 7.1). In 1987, Bob Solow himself noted that the slowdown seemed to coincide with the early days of the computer revolution, famously remarking, “We see the computer age everywhere, except in the productivity statistics.”4 In 1993, Erik published an article evaluating the “Productivity Paradox” that noted that computers were still a small share of the economy and that complementary innovations were typically needed before general purpose technologies like IT had their real impact.5 Later work taking into account more detailed data on productivity and IT use among individual firms revealed a strong and significant correlation: the heaviest IT users were dramatically more productive than their competitors.6 By the mid-1990s, these benefits were big enough to become visible in the overall U.S. economy, which experienced a general productivity surge.

pages: 288 words: 92,175

Rise of the Rocket Girls: The Women Who Propelled Us, From Missiles to the Moon to Mars
by Nathalia Holt
Published 4 Apr 2016

Launch Vehicles (Lexington: University Press of Kentucky, 2002). More information about the Fibonacci sequence can be found in Alfred S. Posamentier and Ingmar Lehmann, The (Fabulous) Fibonacci Numbers (Amherst, NY: Prometheus Books, 2007). The history of the IBM 701 can be found in Paul E. Ceruzzi, Beyond the Limits: Flight Enters the Computer Age (Cambridge, MA: MIT Press, 1989), and Emerson W. Pugh, Building IBM: Shaping an Industry and Its Technology (Cambridge, MA: MIT Press, 1995). Janez Lawson and Elaine Chappell were both sent to the IBM training school, as reported in JPL’s Lab-Oratory newsletter, February 1953. Remembrances of hearing magnetic-tape audio recordings during World War II are from John T.

The IBM 1620’s nickname of CADET was facetiously said to stand for “Can’t Add, Doesn’t Even Try” because it had no digital circuit that performed addition functions, which meant that operators had to look up their answers in tables instead, as described in Richard Vernon Andree, Computer Programming and Related Mathematics (Hoboken, NJ: John Wiley, 1966). A missing bar in the program was partly responsible for the Mariner accident, as reported in Ceruzzi, Beyond the Limits: Flight Enters the Computer Age (Cambridge, MA: MIT Press, 1989). Arthur C. Clarke mistakenly said that Mariner 1 was “wrecked by the most expensive hyphen in history” in The Promise of Space (New York: Berkley, 1955), and similar reports have been made elsewhere. Ceruzzi explains how the Mariner 1 failure was a “combination of a hardware failure and software bug.”

pages: 474 words: 87,687

Stealth
by Peter Westwick
Published 22 Nov 2019

Rich and Leo Janos, Skunk Works: A Personal Memoir of My Years at Lockheed (New York: Little, Brown, 1994), 306. 30 Waaland interview with Volker Janssen, November 10, 2010. 31 Cashen 1 and 2; Waaland interview with Westwick; Kinnu; Hal Maninger interview, September 28, 2017. 32 Waaland interview with Westwick. 33 Cashen 1; Waaland interview with Westwick. 34 Sybil Francis, “Warhead Politics: Livermore and the Competitive System of Nuclear Weapon Design” (PhD dissertation, MIT, 1996); Anne Fitzpatrick, “Igniting the Elements: The Los Alamos Thermonuclear Project, 1942–1952,” Los Alamos National Laboratory, LA-13577-T, July 1999. 35 Paul Ceruzzi, Beyond the Limits: Flight Enters the Computer Age (Cambridge, MA: MIT Press, 1989), 20–30. 36 Cashen 2; Ken Dyson and Robert Loschke interview, January 9, 2012. 37 Waaland interview with Westwick. Chapter six: showdown at ratscat 1 Ryan H. Edgington, Range Wars: The Environmental Contest for White Sands Missile Range (Lincoln: University of Nebraska Press, 2014). 2 H.

(Ben Rich papers, box 2, folder 10, Huntington Library). 23 Overholser interview. 24 Waaland interview with Janssen; Cashen 1. 25 Rebecca Grant, B-2: The Spirit of Innovation, Northrop Grumman Aeronautical Systems, report NGAS 13–0405 (2013, available at www.northropgrumman.com/Capabilities/B2SpiritBomber/Documents/pageDocuments/B-2-Spirit-of-Innovation.pdf), 11. Chapter seven: have blue and the f-117a 1 David A. Mindell, Digital Apollo: Human and Machine in Spaceflight (Cambridge, MA: MIT Press, 2008); Paul Ceruzzi, Beyond the Limits: Flight Enters the Computer Age (Cambridge, MA: MIT Press, 1989), 191–95. On test pilots as cowboys: Westwick, “An Album of Early Southern California Aviation,” in Blue Sky Metropolis: The Aerospace Century in Southern California, ed. Peter J. Westwick (Berkeley: University of California Press, 2012), 24. 2 Robert Loschke and Ken Dyson interview, January 9, 2012. 3 Robert Ferguson, NASA’s First A: Aeronautics from 1958 to 2008 (Washington, DC: NASA, 2013), 107–13. 4 Loschke-Dyson interview; Robert Loschke emails to Westwick, January 30 and February 1, 2019. 5 Sherman N.

pages: 89 words: 31,802

Headache and Backache
by Ole Larsen
Published 24 Apr 2011

There are many tips for protecting the cartilage discs, but I’d like to especially emphasise two of these: Tarzan then and now A natural lifestyle involves varied movements, produces normal muscular tension and gives relaxation. An unnatural lifestyle imprisons the body, especially in the computer age. Break free and become a Tarzan! ● Get “up on your marks” and place your spine in a neutral position with low disc pressure If you’re reading this book sitting down, stop for a second and think about the position of your spine right now! You’re probably sitting slumped forward with a round back, like a flower that’s begun to wilt.

pages: 544 words: 168,076

Red Plenty
by Francis Spufford
Published 1 Jan 2007

Available at www.sovietcomputing.com. See also D.A.Pospelov & Ya. Fet, Essays on the History of Computer Science in Russia (Novosibirsk: Scientific Publication Centre of the RAS, 1998), and the chapter about Lebedev and the very first Soviet computer in Mike Hally, Electronic Brains: Stories from the Dawn of the Computer Age (London: Granta, 2005), pp. 137–60. 2 And, more secretly still, an M-40 exists, and an M-50 too: for Lebedev’s computers for the Soviet missile-defence project, and the imaginary Moscow in the Kazakh desert, see Malinovsky, Pioneers of Soviet Computing, pp. 101–3. For ‘military cybernetics’ in general, see Gerovitch, From Newspeak to Cyberspeak. 3 ‘We can shoot down a fly in outer space, you know’: Malinovsky, Pioneers of Soviet Computing, p. 103 4 Remembering the story his rival Izaak Bruk told him: see Malinovsky, Pioneers of Soviet Computing, p. 70, which does not however specify the codename flower the vacuum tube buyer had to mention.

Stuart, Russian and Soviet Economic Performance and Structure, 6th edn (Reading MA: Addison-Wesley, 1998) Jukka Gronow, Caviar with Champagne: Common Luxury and the Ideals of the Good Life in Stalin’s Russia (Oxford: Berg, 2003) Gregory Grossman, ed., Value and Plan: Economic Calculation and Organization in Eastern Europe (Berkeley CA: University of California Press, 1960) Vasily Grossman, Life and Fate, translated by Robert Chandler (London: Harvill, 1995) —, Forever Flowing, translated by Thomas P. Whitney (New York: Harper & Row, 1972) P. Charles Hachten, ‘Property Relations and the Economic Organization of Soviet Russia, 1941 to 1948: Volume One’, PhD thesis, University of Chicago 2005 Mike Hally, Electronic Brains: Stories from the Dawn of the Computer Age (London: Granta, 2005) John Pearce Hardt, ed., Mathematics and Computers in Soviet Economic Planning (New Haven CT: Yale University Press, 1967) Robert L. Heilbroner, The Worldly Philosophers: The Lives, Times and Ideas of the Great Economic Thinkers, 4th edn (New York: Simon & Schuster, 1971) Jochen Hellbeck, Revolution on My Mind: Writing a Diary Under Stalin (Cambridge MA: Harvard University Press, 2006) Fiona Hill and Clifford Gaddy, The Siberian Curse: How Communist Planners Left Russia Out in the Cold (Washington DC: Brookings Institution Press, 2003) Walter Hixson, Parting the Curtain: Propaganda, Culture, and the Cold War, 1945–1961 (New York: St Martin’s Press, 1997) Geoffrey M.

pages: 855 words: 178,507

The Information: A History, a Theory, a Flood
by James Gleick
Published 1 Mar 2011

: Byron to Augusta Leigh, 12 October 1823, in Leslie A. Marchand, ed., Byron’s Letters and Journals, vol. 9 (London: John Murray, 1973–94), 47. ♦ “I AM GOING TO BEGIN MY PAPER WINGS”: Ada to Lady Byron, 3 February 1828, in Betty Alexandra Toole, Ada, the Enchantress of Numbers: Prophet of the Computer Age (Mill Valley, Calif.: Strawberry Press, 1998), 25. ♦ “MISS STAMP DESIRES ME TO SAY”: Ada to Lady Byron, 2 April 1828, ibid., 27. ♦ “WHEN I AM WEAK”: Ada to Mary Somerville, 20 February 1835, ibid., 55. ♦ AN “OLD MONKEY”: Ibid., 33. ♦ “WHILE OTHER VISITORS GAZED”: Sophia Elizabeth De Morgan, Memoir of Augustus De Morgan (London: Longmans, Green, 1882), 89

Toronto: Oxford University Press, 1986. Boden, Margaret A. Mind as Machine: A History of Cognitive Science. Oxford: Oxford University Press, 2006. Bollobás, Béla, and Oliver Riordan. Percolation. Cambridge: Cambridge University Press, 2006. Bolter, J. David. Turing’s Man: Western Culture in the Computer Age. Chapel Hill: University of North Carolina Press, 1984. Boole, George. “The Calculus of Logic.” Cambridge and Dublin Mathematical Journal 3 (1848): 183–98. ———. An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities. London: Walton & Maberly, 1854. ———.

In Proceedings of the 2nd IEEE International Symposium on Wearable Computers. Washington, D.C.: IEEE Computer Society, 1998. Toole, Betty Alexandra. “Ada Byron, Lady Lovelace, an Analyst and Metaphysician.” IEEE Annals of the History of Computing 18, no. 3 (1996): 4–12. ———. Ada, the Enchantress of Numbers: Prophet of the Computer Age. Mill Valley, Calif.: Strawberry Press, 1998. Tufte, Edward R. “The Cognitive Style of PowerPoint.” Cheshire, Conn.: Graphics Press, 2003. Turing, Alan M. “On Computable Numbers, with an Application to the Entscheidungsproblem.” Proceedings of the London Mathematical Society 42 (1936): 230–65. ———.

pages: 407 words: 103,501

The Digital Divide: Arguments for and Against Facebook, Google, Texting, and the Age of Social Networking
by Mark Bauerlein
Published 7 Sep 2011

Many of these Natives rarely enter a library, let alone look something up in a traditional encyclopedia; they use Google, Yahoo, and other online search engines. The neural networks in the brains of these Digital Natives differ dramatically from those of Digital Immigrants: people—including all baby boomers—who came to the digital/computer age as adults but whose basic brain wiring was laid down during a time when direct social interaction was the norm. The extent of their early technological communication and entertainment involved the radio, telephone, and TV. As a consequence of this overwhelming and early high-tech stimulation of the Digital Native’s brain, we are witnessing the beginning of a deeply divided brain gap between younger and older minds—in just one generation.

Conservation biologist Oliver Pergams at the University of Illinois recently found a highly significant correlation between how much time people spend with new technology, such as video gaming, Internet surfing, and video watching, and the decline in per capita visits to national parks. Digital Natives are snapping up the newest electronic gadgets and toys with glee and often putting them to use in the workplace. Their parents’ generation of Digital Immigrants tends to step more reluctantly into the computer age, not because they don’t want to make their lives more efficient through the Internet and portable devices but because these devices may feel unfamiliar and might upset their routine at first. During this pivotal point in brain evolution, Natives and Immigrants alike can learn the tools they need to take charge of their lives and their brains, while both preserving their humanity and keeping up with the latest technology.

pages: 343 words: 101,563

The Uninhabitable Earth: Life After Warming
by David Wallace-Wells
Published 19 Feb 2019

* * * — That technology might liberate us, collectively, from the strain of labor and material privation is a dream at least as old as John Maynard Keynes, who predicted his grandchildren would work only fifteen-hour weeks, and yet never ultimately fulfilled. In 1987, the year he won the Nobel Prize, economist Robert Solow famously commented, “You can see the computer age everywhere but in the productivity statistics.” This has been, even more so, the experience of most of those living in the developed world in the decades since—rapid technological change transforming nearly every aspect of everyday life, and yet yielding little or no tangible improvement in any conventional measures of economic well-being.

as old as John Maynard Keynes: Keynes extended the prediction—much, much talked about ever since—in an essay notably published in 1930, just after the stock market crash of 1929: John Maynard Keynes, “Economic Possibilities for Our Grandchildren,” Nation and Athenaeum, October 11 and 18, 1930. “You can see the computer age”: This line first appeared in Robert M. Solow, “We’d Better Watch Out,” review of Manufacturing Matters by Stephen S. Cohen and John Zysman, The New York Times Book Review, July 12, 1987. a million transatlantic flights: Alex Hern, “Bitcoin’s Energy Usage Is Huge—We Can’t Afford to Ignore It,” The Guardian, January 17, 2018.

pages: 350 words: 98,077

Artificial Intelligence: A Guide for Thinking Humans
by Melanie Mitchell
Published 14 Oct 2019

FIGURE 5: Plots showing how many kilos of rice are needed for each chess square in order to fulfill the sage’s request; A, squares 1–24 (with y-axis showing hundreds of kilos); B, squares 24–64 (with y-axis showing tens of trillions of kilos)

Exponential Progress in Computers

For Ray Kurzweil, the computer age has provided a real-world counterpart to the exponential fable. In 1965, Gordon Moore, cofounder of Intel Corporation, identified a trend that has come to be known as Moore’s law: the number of components on a computer chip doubles approximately every one to two years. In other words, the components are getting exponentially smaller (and cheaper), and computer speed and memory are increasing at an exponential rate.
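The figure's arithmetic can be reproduced in a few lines (my sketch; the 25 mg per-grain mass is an assumed value for illustration and is not taken from the book):

```python
# One grain on square 1, doubling every square: square k holds 2**(k-1)
# grains. GRAIN_KG is an assumed per-grain mass, for illustration only.

GRAIN_KG = 25e-6  # kilograms per grain (assumption: 25 mg)

def kilos_on_square(k):
    """Kilograms of rice on square k of the chessboard (1-indexed)."""
    return 2**(k - 1) * GRAIN_KG

# Early squares weigh grams; under this assumption square 64 alone holds
# roughly 2.3e14 kg, which is why the figure needs two wildly different
# y-axis scales for squares 1-24 and squares 24-64.
```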

In particular, the best-known reinforcement-learning successes have been in the domain of game playing. Applying reinforcement learning to games is the topic of the next chapter. 9 Game On Since the earliest days of AI, enthusiasts have been obsessed with creating programs that can beat humans at games. In the late 1940s, both Alan Turing and Claude Shannon, two founders of the computer age, wrote programs to play chess before there were even computers that could run their code. In the decades that followed, many a young game fanatic has been driven to learn to program in order to get computers to play their favorite game, whether it be checkers, chess, backgammon, Go, poker, or, more recently, video games.

pages: 268 words: 109,447

The Cultural Logic of Computation
by David Golumbia
Published 31 Mar 2009

I focus on the institutional effects of computing not merely to ensure that cultural criticism fully addresses our moment; rather, I am convinced both intellectually and experientially that computers have different effects and meanings when seen from the nodes of institutional power than from the ones they have when seen from other perspectives. If an unassailable slogan of the computing age is that “computers empower users,” the question I want to raise is not what happens when individuals are empowered in this fashion (a question that has been widely treated in literature of many different sorts), but instead what happens when powerful institutions—corporations, governments, schools—embrace computationalism as a working philosophy.

These thinkers were lumped together at the time under the term mechanists as opposed to vitalists (who thought living matter was different in kind from mechanisms like watches), and it is the mechanists we associate especially with the rest of possessive-individualist doctrine. A contemporary name for this doctrine, I have been suggesting, is computationalism. Contrary to the views of advocates and critics alike that the computer age should be characterized by concepts like “decentralizing, globalizing, harmonizing, and empowering” (Negroponte 1995, 229), it seems more plausible that the widespread striating effects of computerization walk hand-in-hand with other, familiar concepts from the histories of rationalism.

pages: 391 words: 105,382

Utopia Is Creepy: And Other Provocations
by Nicholas Carr
Published 5 Sep 2016

To my surprise (and, I admit, delight), bloggers swarmed around the piece like phagocytes. Within days it had been viewed by thousands and had sprouted a tail of comments. So began my argument with—what should I call it? There are so many choices: the digital age, the information age, the internet age, the computer age, the connected age, the Google age, the emoji age, the cloud age, the smartphone age, the data age, the Facebook age, the robot age, the posthuman age. The more names we pin on it, the more vaporous it seems. If nothing else, it is an age tailored to the talents of the brand manager. I’ll just call it Now.

The paper’s predicament highlights a broader issue about the web’s tenacious but malleable memory. Viktor Mayer-Schönberger, a public policy professor at Harvard, tells Hoyt that newspapers “should program their archives to ‘forget’ some information, just as humans do.” Through the ages, humans have generally remembered the important stuff and forgotten the trivial, he said. The computer age has turned that upside down. Now, everything lasts forever, whether it is insignificant or important, ancient or recent, complete or overtaken by events. Following Mayer-Schönberger’s logic, the Times could program some items, like news briefs, which generate a surprising number of the complaints, to expire, at least for wide public access, in a relatively short time.

pages: 331 words: 104,366

Deep Thinking: Where Machine Intelligence Ends and Human Creativity Begins
by Garry Kasparov
Published 1 May 2017

Some of these questions have been answered while others are more passionately disputed than ever. CHAPTER 2 RISE OF THE CHESS MACHINES IN 1968, when the 2001 book and movie were created, it was not yet a foregone conclusion that computers would come to dominate humans at chess, or anything else beyond rote automation and calculation. As you might expect from the dawn of the computer age, predictions about machine potential were all over the map. Utopian dreams about the fully automated world just around the corner shared column space with dystopian nightmares of, well, pretty much the same thing. This is a critical point to keep in mind before we criticize or praise anyone for their predictions, and before we make our own.

The basic formula hasn’t changed since 1949, when the American mathematician and engineer Claude Shannon wrote a paper describing how it might be done. In “Programming a Computer for Playing Chess,” he proposed a “computing routine or ‘program’” for use on the sort of general-purpose computer Alan Turing had theorized years earlier. You can tell how early it was in the computer age that Shannon put the word “program” in quotation marks as jargon. As with many who followed him, Shannon was slightly apologetic at proposing a chess-playing device of “perhaps no practical importance.” But he saw the theoretical value of such a machine in other areas, from routing phone calls to language translation.
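The "basic formula" Shannon described, searching the game tree while the two players alternately maximize and minimize a position score, can be sketched on a toy tree (a hypothetical example; real programs, then and now, cut the search off at a fixed depth and apply a static evaluation function at the horizon):

```python
# Toy minimax on a hand-built game tree. Leaves are numeric evaluation
# scores; internal nodes are lists of child positions. Not a chess
# engine, just the skeleton of the search Shannon proposed.

def minimax(node, maximizing):
    """Return the best score achievable from `node` with optimal play."""
    if not isinstance(node, list):
        return node  # leaf: static evaluation of the position
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

tree = [[3, 5], [2, 9]]   # two candidate moves, each with two opponent replies
best = minimax(tree, maximizing=True)
# first branch guarantees min(3, 5) = 3; second only min(2, 9) = 2
assert best == 3
```

Everything since, alpha-beta pruning, deeper search, better evaluation, refines this skeleton rather than replacing it.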

pages: 484 words: 104,873

Rise of the Robots: Technology and the Threat of a Jobless Future
by Martin Ford
Published 4 May 2015

The decline in labor’s share in China—the country that most of us assume is hoovering up all the work—was especially precipitous, falling at three times the rate in the United States. Karabarbounis and Neiman concluded that these global declines in labor’s share resulted from “efficiency gains in capital producing sectors, often attributed to advances in information technology and the computer age.”23 The authors also noted that a stable labor share of income continues to be “a fundamental feature of macro-economic models.”24 In other words, just as economists do not seem to have fully assimilated the implications of the circa-1973 divergence of productivity and wage growth, they are apparently still quite happy to build Bowley’s Law into the equations they use to model the economy.

Computers are getting dramatically better at performing specialized, routine, and predictable tasks, and it seems very likely that they will soon be poised to outperform many of the people now employed to do these things. Progress in the human economy has resulted largely from occupational specialization, or as Adam Smith would say, “the division of labour.” One of the paradoxes of progress in the computer age is that as work becomes ever more specialized, it may, in many cases, also become more susceptible to automation. Many experts would say that, in terms of general intelligence, today’s best technology barely outperforms an insect. And yet, insects do not make a habit of landing jet aircraft, booking dinner reservations, or trading on Wall Street.

pages: 440 words: 108,137

The Meritocracy Myth
by Stephen J. McNamee
Published 17 Jul 2013

In short, the future would belong to the nerds. In some ways, Bell’s predictions seem to have been realized with the advance of the computer age. The computer has become the leading edge of the information society. Computer-related industries, populated by young, technically competent experts, flourished in the digital e-boom of the 1980s and 1990s. The postindustrial society would presumably create demand for a more highly educated labor force. While it is true that the computer age ushered in a new genre of occupational specialties, it is also true that the bulk of the expansion of new jobs, as we have seen, has actually been very low tech.

pages: 518 words: 107,836

How Not to Network a Nation: The Uneasy History of the Soviet Internet (Information Policy)
by Benjamin Peters
Published 2 Jun 2016

See, for example, Clay Shirky, Cognitive Surplus: Creativity and Generosity in a Connected Age (New York: Penguin Books, 2010), and Yochai Benkler, The Wealth of Networks: How Social Production Transforms Markets and Freedom (New Haven: Yale University Press, 2006), 35–132. 8. Anatoly Kitov, Electronnie tsifrovie mashini [Electronic Digital Machines] (Moscow: Radioeletronika Nauka, 1956). 9. Charles Eames and Ray Eames, A Computer Perspective: Background to the Computer Age (Cambridge: Harvard University Press, 1973), 64, 96–97. 10. Marc Raeff, The Well-Ordered Police State: Social and Institutional Change through Law in the Germanies and Russia, 1600–1800 (New Haven: Yale University Press, 1983); Jacob Soll, The Information Master: Jean-Baptiste Colbert’s Secret State Intelligence System (Ann Arbor: University of Michigan Press, 2009). 11.

B. DeBevoise. Cambridge: MIT Press, 2009. Dyker, David. Restructuring the Soviet Economy. New York: Routledge, 1991. Dyson, George. Turing’s Cathedral: The Origins of the Digital Universe. Pantheon Books: New York, 2012. Eames, Charles, and Ray Eames. A Computer Perspective: Background to the Computer Age. Cambridge: Harvard University Press, 1973. Edmonds, David, and John Eidinow. Bobby Fischer Goes to War: How a Lone American Star Defeated the Soviet Chess Machine. New York: Harper Perennial, 2005. Edwards, Paul N. The Closed World: Computers and the Politics of Discourse in Cold War America.

pages: 404 words: 110,942

A Place for Everything: The Curious History of Alphabetical Order
by Judith Flanders
Published 6 Feb 2020

Designed for the use of merchants (5th edn, London, T.H., 1727) Leedham-Green, Elisabeth, ‘A Catalogue of Caius College Library, 1569’, Transactions of the Cambridge Bibliographical Society, 8, 1, 1981, pp. 29–41 Le Men, Ségolène, Les abécédaires français illustrés du XIXe siècle (Paris, Promodis, 1984) Lendinara, Patrizia, Loredana Lazzari and Claudia Di Sciacca, eds., Rethinking and Recontextualizing Glosses: New Perspectives in the Study of Late Anglo-Saxon Glossography (Turnhout, Brepols, 2011) Lerner, Fred, The Story of Libraries: From the Invention of Writing to the Computer Age (New York, Continuum, 2009) Lieshout, H.H.M. van, The Making of Pierre Bayle’s Dictionnaire historique et critique (Amsterdam and Utrecht, APA-Holland University Press, 2001) Locke, John, ‘A New Method of Making Commonplace Books’ (London, J. Greenwood, 1706) Logan, Robert K., The Alphabet Effect: The Impact of the Phonetic Alphabet on the Development of Western Civilization (New York, St Martin’s Press, 1986) Loveland, Jeff, The European Encyclopedia: From 1650 to the Twenty-first Century (Cambridge, Cambridge University Press, 2019) Lund, Roger D., ‘The Eel of Science: Index Learning, Scriblerian Satire, and the Rise of Information Culture’, Eighteenth-century Life, vol. 22, no. 2, 1998, pp. 18–39 Lyall, R.J., ‘Materials: The Paper Revolution’, in Book Production and Publishing in Britain, 1375–1475, Jeremy Griffiths and Derek Pearsall, eds.

Others assert without hesitation that Callimachus was the author of the Pinakes, and they were ordered alphabetically. See, for example, Margaret Zeegers and Deirdre Barron, Gatekeepers of Knowledge: A Consideration of the Library, the Book and the Scholar in the Western World (Oxford, Chandos, 2010), pp. 12–13; Fred Lerner, The Story of Libraries: From the Invention of Writing to the Computer Age (New York, Continuum, 2009), pp. 16–17. 26. Hatzimichali, ‘Encyclopaedism’, in König and Woolf, Encyclopaedism from Antiquity, pp. 76–7. 27. Daly, Contributions, pp. 23, 40. 28. Ibid., pp. 45–50, 75. 29. Werner Hüllen, English Dictionaries, 800–1700: The Topical Tradition (Oxford, Clarendon Press, 2006), pp. 30–31. 30.

pages: 407 words: 104,622

The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution
by Gregory Zuckerman
Published 5 Nov 2019

But Simons hoped rigorous testing and sophisticated predictive models, based on statistical analysis rather than eyeballing price charts, might help him escape the fate of the chart adherents who had crashed and burned. But Simons didn’t realize that others were busy crafting similar strategies, some using their own high-powered computers and mathematical algorithms. Several of these traders already had made enormous progress, suggesting that Simons was playing catch-up. Indeed, as soon as the computer age dawned, there were investors, up bright and early, using computers to solve markets. As early as 1965, Barron’s magazine spoke of the “immeasurable” rewards computers could render investors, and how the machines were capable of relieving an analyst of “dreary labor, freeing him for more creative activity.”

IBM knew it faced a real problem the day the wife of a member of the chess team, who taught at a Catholic college, spoke with the college’s president, an elderly nun, and the sister kept referring to IBM’s amazing “Deep Throat” program. IBM ran a contest to rename the chess machine, choosing Brown’s own submission, Deep Blue, a nod to IBM’s longtime nickname, Big Blue. A few years later, in 1997, millions would watch on television as Deep Blue defeated Garry Kasparov, the chess world champion, a signal that the computing age had truly arrived.6 Brown, Mercer, and the rest of the team made progress enabling computers to transcribe speech. Later, Brown realized probabilistic mathematical models also could be used for translation. Using data that included thousands of pages of Canadian parliamentary proceedings featuring paired passages in French and English, the IBM team made headway toward translating text between languages.

The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley
by Leslie Berlin
Published 9 Jun 2005

Uses for integrated circuits: “A Briefing on Integrated Circuits”; “Engineers Eye Integrated Consumer Products,” Television Digest, 30 March 1964, 7–8; Michael F. Wolff, “When Will Integrated Circuits Go Civilian? Good Guess: 1965,” Electronics, 10 May 1963, 20–24. 37. Cost of IBM System/360: Campbell-Kelly and Aspray, Computer, 140. Only computers anyone would need: Don Palfreman and Doron Swade, The Dream Machine: Exploring the Computer Age (London: BBC Books, 1993), 78–80. For a contemporary description of the IBM System/360 series, see International Business Machines, Introduction to IBM Data Processing Systems (White Plains, N.Y.: 1969).

Competitive Edge: The Semiconductor Industry in the U.S. and Japan. Stanford: Stanford University Press, 1984. Ozaki, Robert S. “How Japanese Industrial Policy Works.” In The Industrial Policy Debate, ed. Chalmers Johnson, 47–70. San Francisco: ICS Press, 1984. Palfreman, Don, and Doron Swade. The Dream Machine: Exploring the Computer Age. London: BBC Books, 1993. Parks, Martha Smith. Microelectronics in the 1970s. Rockwell International Corporation, 1974. Patterson, James T. Grand Expectations: The United States, 1945–1974. New York: Oxford University Press, 1996. Pitti, Stephen J. The Devil in Silicon Valley: Northern California, Race, and Mexican Americans.

“Dissatisfaction as a Spur to Career,” New York Times, 15 December 1976. Noyce, Robert N. “False Hopes and High-Tech Fiction.” Harvard Business Review, January–February 1990, 31–34. ———. “Microelectronics.” Scientific American, September 1977. ———. “Hardware Prospects and Limitations.” In The Computer Age: A Twenty-Year View, ed. Michael Dertouzos and Joel Moses, 321–337. Cambridge: MIT Press, 1979. ———. “Competition and Cooperation—A Prescription for the Eighties.” Research Management, March 1982, 13–17. Noyce, Robert, and Marcian E. Hoff, Jr. [Ted Hoff]. “A History of Microprocessor Development at Intel,” IEEE Micro, February 1981.

pages: 598 words: 183,531

Hackers: Heroes of the Computer Revolution - 25th Anniversary Edition
by Steven Levy
Published 18 May 2010

There was no subject in the media hotter than computers in the early 1980s, and with the New York public relations firm helping channel the dazzled inquisitors, a steady stream of long-distance phone calls and even long-distance visitors began to arrive in Oakhurst that autumn. This included an “NBC Magazine” camera crew which flew to Oakhurst from New York City to document this thriving computer-age company for its video magazine show. NBC shot the requisite footage of Roberta mapping a new adventure game at her home, Ken going over his phone messages, Ken and Roberta touring the building site of their new home. But the NBC producer was particularly anxious to speak to the heart of the company: the young programmers.

IBM was power and money. I wanted both,” she said. Electronic Arts seemed the way. The products and philosophy of the company would be truth and beauty, and the company founders would all be powerful and rich. And the programmers, who would be treated with the respect they deserved as the artists of the computer age, would be elevated to the status of rock or movie stars. This message managed to find its way around the Applefest, enough so knots of programmers began gathering outside the Convention Hall for the buses that supposedly would take them to the Stanford Court Hotel, where Electronic Arts was throwing its big party.

Tom Tatum, former lawyer, lobbyist, and Carter campaign aide, now a leading purveyor of video “docusports” programming, thought he had serendipitously latched on to a jackpot bigger than that of any slot machine in the casino only yards away from where he stood. “This is the event where Hollywood meets the Computer Age,” said Tom Tatum to the crowd of reporters and computer tradespeople in town for the Comdex show. “The ultra-contest of the eighties.” Tom Tatum’s creation was called Wizard vs. Wizards. It was to be a television contest where game designers play each other’s games for a set of prizes. Tatum had gathered programmers from companies like On-Line and Sirius because he sensed the arrival of a new kind of hero, one who fought with brains instead of muscle, one who represented America’s bold willingness to stay ahead of the rest of the world in the technological battle of supremacy: the hacker.

pages: 450 words: 113,173

The Age of Entitlement: America Since the Sixties
by Christopher Caldwell
Published 21 Jan 2020

The distance abolished was the kind that is in people’s heads. Looked at this way, computers have been not so much an expression of America’s historic ingenuity as an alternative to it. In his history of economic growth in the United States, the Northwestern University economist Robert Gordon found no special productivity boost from the computer age. Outside of Silicon Valley, according to the economist Edmund Phelps, American innovation “would narrow to a trickle” after the 1960s. In 1969, U.S. Industries, Inc., had promised that within a decade the 1960s would seem like the Dark Ages, once Americans got used to “automatic highways—computerized kitchens—person-to-person television . . . food from under the sea.”

The first harvest of innovation came from Japan, starting with the Panasonic Toot-a-Loop AM-radio-and-bracelet, and culminating in the Sony Walkman, the great bridge product from the industrial to the information economy, released in the final weeks of the 1970s. Artists were generally as blind as anyone else to the upshot of computers, even as the computer age approached. Stanley Kubrick’s 2001: A Space Odyssey (1968) correctly anticipated both voiceprint security identification and, seemingly, the end of the Cold War. Kubrick, however, viewed twentieth-century American corporations as immortal, placing a Howard Johnson’s motel in a space station that could be reached by a Pan Am moon shuttle.

pages: 453 words: 117,893

What Would the Great Economists Do?: How Twelve Brilliant Minds Would Solve Today's Biggest Problems
by Linda Yueh
Published 4 Jun 2018

For instance, the strong period of growth in the 1950s and 60s is associated with post-war technological advances, such as widespread air travel and industrial robots. Curiously, recent technological improvements, centred on computing, information and communication technologies (ICT) and the internet, do not seem to have raised productivity across the economy. Solow’s 1987 observation that ‘You can see the computer age everywhere but in the productivity statistics’ is known as the Solow paradox.6 He revisited this question decades later, but concluded that we still do not know, as the role of computing is still evolving. Solow points out that since our lives and work have been transformed by computers, this technology should have improved our productivity.

It’s challenging, as seen in Japan, and some factors like demographics are difficult to alter, but the above suggestions can help and the advent of new technologies could be game changing. Solow would probably view the debate over whether the technologies of the digital era are as productive as the steam engine or electrification of the earlier industrial revolutions as being related to investment. If the computer age is to increase productivity and so lead to a stronger phase of economic growth, it will require investment in not just R&D but also people’s skills and firms’ practices to embed those technologies into how businesses operate. The basic tenets of Robert Solow’s model of economic growth point the way forward.

Decoding Organization: Bletchley Park, Codebreaking and Organization Studies
by Christopher Grey
Published 22 Mar 2012

(Hill, 2004: 43) The shoeboxes reflected the general tightness of resources, but of course the use of a paper-based system of any sort reflects the necessities of a pre-computerization era. BP is well known as the place where arguably the first semi-programmable electronic computer – Colossus – was developed (Goldstine, 1993; Copeland, 2001, 2006) and in this sense stands on the cusp of the computer age, but for the most part it used pre-computer technologies. Clearly, an equivalent organization now would use computerized databases to store and retrieve the information held in indexes and almost all of the human labour of indexing would disappear (for that matter, much of the more skilled work at BP would now be readily computerized).

‘The Turing bombe’, The Rutherford Journal 3, www.rutherfordjournal.org/article030108.html. Child, J. 1972. ‘Organizational structure, environment and performance: The role of strategic choice’, Sociology 6: 1–22. Clayton, A. 1980. The Enemy is Listening: The Story of the Y Service. London: Hutchinson. Copeland, B. J. 2001. ‘Colossus and the dawning of the computer age’ in Smith, M. and Erskine, R. (eds.), Action This Day. London: Bantam, pp. 342–69. Copeland, B. J. (ed.) 2004. The Essential Turing. Oxford: Oxford University Press. Copeland, B. J. 2006. Colossus. The Secrets of Bletchley Park’s Codebreaking Computers. Oxford: Oxford University Press. Clark, P. and Rowlinson, M. 2004.

Hacking Capitalism
by Johan Söderberg

Natalie Rothstein, “The Introduction of the Jacquard Loom to Great Britain,” in Veronika Gervers, ed., Studies in Textile History—In Memory of Harold B. Burnham (Toronto: Alger Press, 1977). 5. For a historical account of the Luddite uprising, see Kirkpatrick Sale, Rebels Against the Future—The Luddites and Their War on the Industrial Revolution, Lessons for the Computer Age (Reading, Mass.: Addison-Wesley Publishing Company, 1995). 6. Even if machine breaking could not stop industrial capitalism, Eric Hobsbawm estimated that the implementation of labour-saving technologies in local areas was held back due to sabotage. Furthermore, the breaking of machines was part of a more general strategy of ‘collective bargaining by riot’, as he called it, which could also include burning the employer’s stock and home.

———. No-Collar—The Human Workplace and its Hidden Costs, Philadelphia: Temple University Press, 2004. Sahlins, Marshall. Stone Age Economics, Chicago: Aldine Publishing Company, 1972. Sale, Kirkpatrick. Rebels Against the Future—The Luddites and Their War on the Industrial Revolution—Lessons for the Computer Age, Reading, Mass.: Addison-Wesley Publishing Company, 1995. Salus, Peter. A Quarter Century of Unix, Reading, Mass.: Addison-Wesley, 1994. Sassen, Saskia. Losing Control?—Sovereignty in an Age of Globalization, New York: Columbia University Press, 1996. Schell, Bernadette, and John Dodge. The Hacking of America—Who’s Doing it, Why, and How, London: Quorum Books, 2002.

pages: 829 words: 186,976

The Signal and the Noise: Why So Many Predictions Fail-But Some Don't
by Nate Silver
Published 31 Aug 2012

The last forty years of human history imply that it can still take a long time to translate information into useful knowledge, and that if we are not careful, we may take a step back in the meantime. The term “information age” is not particularly new. It started to come into more widespread use in the late 1970s. The related term “computer age” was used earlier still, starting in about 1970.28 It was at around this time that computers began to be used more commonly in laboratories and academic settings, even if they had not yet become common as home appliances. This time it did not take three hundred years before the growth in information technology began to produce tangible benefits to human society.

In 1971, for instance, it was claimed that we would be able to predict earthquakes within a decade,29 a problem that we are no closer to solving forty years later. Instead, the computer boom of the 1970s and 1980s produced a temporary decline in economic and scientific productivity. Economists termed this the productivity paradox. “You can see the computer age everywhere but in the productivity statistics,” wrote the economist Robert Solow in 1987.30 The United States experienced four distinct recessions between 1969 and 1982.31 The late 1980s were a stronger period for our economy, but less so for countries elsewhere in the world. Scientific progress is harder to measure than economic progress.32 But one mark of it is the number of patents produced, especially relative to the investment in research and development.

—Present (Berkeley, CA: University of California Press, 1988). http://econ161.berkeley.edu/TCEH/1998_Draft/World_GDP/Estimating_World_GDP.html. 27. Figure 1-2 is drawn from DeLong’s estimates, although converted to 2010 U.S. dollars rather than 1990 U.S. dollars as per his original. 28. Google Books Ngram Viewer. http://books.google.com/ngrams/graph?content=information+age%2C+computer+age&year_start=1800&year_end=2000&corpus=0&smoothing=3. 29. Susan Hough, Predicting the Unpredictable: The Tumultuous Science of Earthquake Prediction (Princeton: Princeton University Press, Kindle edition, 2009), locations 862–869. 30. Robert M. Solow, “We’d Better Watch Out,” New York Times Book Review, July 12, 1987. http://www.standupeconomist.com/pdf/misc/solow-computer-productivity.pdf. 31.

The Code: Silicon Valley and the Remaking of America
by Margaret O'Mara
Published 8 Jul 2019

On the relationship between Vietnam-era countercultural politics and the emergence of the personal computer, as well as a much deeper dive into the lives and careers of the people discussed in this chapter, see Fred Turner, From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism (Chicago: The University of Chicago Press, 2006); Markoff, What the Dormouse Said; and Michael Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: HarperBusiness, 1999). 13. Lee Felsenstein, “Resource One/Community Memory—1972–1973,” http://www.leefelsenstein.com/?page_id=44 archived at https://perma.cc/4K8U-2BG3; Turner, From Counterculture to Cyberculture, 69–102; Claire L. Evans, Broad Band: The Untold Story of the Women Who Made the Internet (New York: Portfolio / Penguin, 2018), 95–108. 14.

Burt McMurtry, interview with the author, January 15, 2015; Leena Rao, “Sand Hill Road’s Consiglieres: August Capital,” TechCrunch, June 14, 2014, https://techcrunch.com/2014/06/14/sand-hill-roads-consiglieres-august-capital/, archived at https://perma.cc/6DN4-DERQ. 8. Charles Simonyi, interview with the author, October 4, 2017, Bellevue, Wash.; Michael Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: HarperBusiness, 1999), 194–210; Michael Swaine and Paul Freiberger, Fire in the Valley: The Birth and Death of the Personal Computer, 3rd ed. (Raleigh, N.C.: The Pragmatic Bookshelf, 2014), 271. 9. Simonyi interview; Charles Simonyi, oral history interview by Grady Booch, February 6, 2008, CHM, 30–34; Manes and Andrews, Gates, 167. 10.

Bureau of the Census, Robert Kominski, Current Population Reports, Special Studies, Series P-23, No. 155, Computer Use in the United States: 1984 (Washington, D.C.: U.S. Government Printing Office, 1988). 31. Author interview with former associate of RMI, Inc., August 6, 2018. 32. Mike Hogan, “Fighting for the Heavyweight Title,” California Business, November 1984, 78–93; Computer Age, December 12, 1983, quoted in Thomas & Company, “Competitive Dynamics”; Hogan, “Fighting for the Heavyweight Title.” 33. SRI’s Values and Lifestyles (VALS) program was relied upon heavily by Apple for its market research. See Macintosh Product Introduction Plan, October 7, 1983, M1007, Series 7, Box 13, FF 21, SU. 34.

pages: 144 words: 43,356

Surviving AI: The Promise and Peril of Artificial Intelligence
by Calum Chace
Published 28 Jul 2015

Some people argue that the fears are over-done because technology is not actually advancing as fast as the excitable folk in Silicon Valley suppose. It is true that economists have long struggled to record the productivity improvements that would be expected from the massive investments in information technology of the last half-century; this failure prompted economist Robert Solow to remark back in 1987 that “You can see the computer age everywhere but in the productivity statistics.” (Of the various explanations for this phenomenon, the one which seems most plausible to me is that there is an increase in productivity, but for some reason our economic measurements don’t catch it. When I started work in the early 1980s we used to spend hours each day looking for information by searching in files and phoning each other up.

The History and Uncertain Future of Handwriting
by Anne Trubek
Published 5 Sep 2016

There is no logical explanation for the rise of all these different hands; it was a cultural phenomenon. To understand, consider how many different website designs or themes are available to someone who wants to set up a site using WordPress. Each one has its own look and its own associations, and when there is no consensus as to how words should look, there is enormous choice. Early in the computer age, people reveled in choosing a different font to best clothe their letters. That has settled down now, as Times New Roman—another “New Roman” script, like Carolingian minuscule and humanist—has become a de facto standard. Chapter 7 RIGHTEOUS, MANLY HANDS Geneva-on-the-Lake, Ohio, is a tired resort town on the shores of Lake Erie.

pages: 481 words: 121,300

Why geography matters: three challenges facing America : climate change, the rise of China, and global terrorism
by Harm J. De Blij
Published 15 Nov 2007

Nineteenth-century maps tend to represent science, not art, although many were still hand colored for clarity. But the twentieth century witnessed the revolution that would transform cartography and is still under way: the introduction of photography from airplanes, the launching of image-transmitting orbital satellites, and the coming of the computer age. In the process, the very definition of the term "map" has changed. In traditional works on cartography such as The Nature of Maps by Arthur H. Robinson and Barbara B. Petchenik (1976) or P. C. and J. O. Muehrcke's Map Use (1997) the authors define a map as "a graphic representation of the milieu" or "any geographical image of environment."

See also specific countries and borders, 111-13, 119, 182-83, 262-65 decolonization, 37, 112, 184, 197, 265, 276 and globalization, 9, 111, 197 Colombia, 180 comet hypothesis, 58, 59 Committee on Foreign Names, 40 Common Market (European Economic Community), 210, 211 Commonwealth of Independent States (CIS), 251 communism. See also Cold War; Soviet Union and Africa, 266 and cartography, 35 of China, 125, 127, 139 Iron Curtain, 197, 198, 209 and U.S. intervention, 197, 209, 276 compass rose, 27-28 computer age, 24 Confucius (Kongfuzi or Kongzi), 138 Congo and Belgium, 162, 263, 265 Islam in, 185, 186 and Mobutu regime, 268, 269 in Pangaea, 61 conical projections in maps, 32, 33 continental drift, 54-57, 55 Convention on Law of the Sea, 115, 148, 252, 281 Corsicans, 152, 206, 207, 218, 220, 222 Costa Rica, 180 Côte d'Ivoire, 176, 183-84, 185, 186, 260, 269 Council of Ministers, 215, 216 countries, size of, 30, 33, 34 Cretaceous period, 62, 64 Crete, 76, 128 crime, 248-49 Crimea, Ukraine, 206 Croatia conflicts, 207 creation of, 109 devolutionary pressures, 206 and European Union, 218, 225, 227 Islam in, 169 Cro-Magnons, 70-71 crustal (sea-floor) spreading, 54-55 Cuba, 114, 176, 180, 181 Cyclades, 76 cylindrical projections in maps, 33 Cyprus conflict, 222 and European Union, 217, 218, 225 name, 37 Czechoslovakia (former), 226 Czech Republic, 212, 213, 218 Dagestan Republic, Russia, 246 Dalai Lama, 137 Dardanelles Strait, 236 Dar es Salaam, Tanzania, 181, 186 Darfur Province, 104, 165, 183 Davao International Airport attack (2003), 160 Davis, William Morris, 118 Dayton model, 195 decay, 10 deception in maps, 36 Defense Mapping Agency, 40 deglaciation, 76 delimitation, 119 demarcation of boundaries, 120 democracy in Africa, 268, 270 (see also specific countries) China's reaction to, 131 in Germany, 145, 148, 276 in Hong Kong, 142, 143 in Iraq, 112, 145, 194, 195, 276-77 in Japan, 145, 146, 148, 194, 195 in Nigeria, 255, 268, 269-70 and population, 94 in Russia, ix, 204, 205, 231, 250, 251, 256 in South Africa, 255, 268 in South Korea, 129 in Taiwan, 131 in United States, 148, 195, 276, 281 Democratic Republic of Congo, 185, 186 dendrochronology (tree ring research), 78 Deng Xiaoping economy, 129, 141 "One Child Only" program, 137, 143 policies, 127, 137, 141 political administration, 139 Denmark, 110, 169, 210, 213 Denmark Strait, 80 D'Estaing, Giscard, 214 devolutionary pressures, 206, 218, 220, 221-22, 228 Devonian period, 59 Diamond, Jared, 7-8, 90, 259 diets and nutrition, 100 dinosaurs, 59, 62, 102 disease.

pages: 472 words: 117,093

Machine, Platform, Crowd: Harnessing Our Digital Future
by Andrew McAfee and Erik Brynjolfsson
Published 26 Jun 2017

Phase one of the second machine age describes a time when digital technologies demonstrably had an impact on the business world by taking over large amounts of routine work—tasks like processing payroll, welding car body parts together, and sending invoices to customers. In July of 1987 the MIT economist Robert Solow, who later that year would win a Nobel prize for his work on the sources of economic growth, wrote, “You can see the computer age everywhere but in the productivity statistics.” By the mid-1990s, that was no longer true; productivity started to grow much faster, and a large amount of research (some of it conducted by Erik‡‡ and his colleagues) revealed that computers and other digital technologies were a main reason why.

id=8186897092162507742&hl=en. 13 Within a few hours the campaign raised: Jonathan Shieber, “GE FirstBuild Launches Indiegogo Campaign for Next Generation Icemaker,” TechCrunch, July 28, 2015, https://techcrunch.com/2015/07/28/ge-firstbuild-launches-indiegogo-campaign-for-next-generation-icemaker. 13 in excess of $1.3 million: Samantha Hurst, “FirstBuild’s Opal Nugget Ice Maker Captures $1.3M during First Week on Indiegogo,” Crowdfund Insider, August 3, 2015, http://www.crowdfundinsider.com/2015/08/72196-firstbuilds-opal-nugget-ice-maker-captures-1-3m-during-first-week-on-indiegogo. 13 the Opal campaign had attracted more than $2.7 million: “FirstBuild Launches Affordable Nugget Ice Machine,” Louisville Business First, July 22, 2015, http://www.bizjournals.com/louisville/news/2015/07/22/firstbuild-launches-affordable-nugget-ice-machine.html. 13 more than 5,000 preorder customers: Indiegogo, “Opal Nugget Ice Maker.” 16 “You can see the computer age”: Robert M. Solow, “We’d Better Watch Out,” New York Times Book Review, July 12, 1987, http://www.standupeconomist.com/pdf/misc/solow-computer-productivity.pdf. 17 more than a billion users of smartphones: Don Reisinger, “Worldwide Smartphone User Base Hits 1 Billion,” CNET, October 17, 2012, https://www.cnet.com/news/worldwide-smartphone-user-base-hits-1-billion. 18 more than 40% of the adults: Jacob Poushter, “Smartphone Ownership and Internet Usage Continues to Climb in Emerging Economies,” Pew Research Center, February 22, 2016, http://www.pewglobal.org/2016/02/22/smartphone-ownership-and-internet-usage-continues-to-climb-in-emerging-economies. 18 approximately 1.5 billion more were sold: Tess Stynes, “IDC Cuts Outlook for 2016 Global Smartphone Shipments,” Wall Street Journal, September 1, 2016, http://www.wsj.com/articles/idc-cuts-outlook-for-2016-global-smartphone-shipments-1472740414. 19 “There were many factories”: Warren D.

pages: 409 words: 125,611

The Great Divide: Unequal Societies and What We Can Do About Them
by Joseph E. Stiglitz
Published 15 Mar 2015

What is happening today is analogous to developments a few decades ago, early in the era of personal computers. In 1987, the economist Robert Solow—awarded the Nobel Prize for his pioneering work on growth—lamented, “You can see the computer age everywhere but in the productivity statistics.” There are several possible explanations for this. Perhaps GDP does not really capture the improvements in living standards that computer-age innovation is engendering. Or perhaps this innovation is less significant than its enthusiasts believe. As it turns out, there is some truth in both perspectives. Recall how a few years ago, just before the collapse of Lehman Brothers, the financial sector prided itself on its innovativeness.

pages: 165 words: 45,397

Speculative Everything: Design, Fiction, and Social Dreaming
by Anthony Dunne and Fiona Raby
Published 22 Nov 2013

As the science fiction writer Frederik Pohl once remarked, a good writer does not think up only the automobile but also the traffic jam. Just as ergonomics emerged during the mechanical age to ensure a better physical fit between our bodies and machines, and user-friendliness came about during the computer age to ensure a better fit between our minds and computers, ethics needs to be at the forefront of working with biological technologies. We need to zoom out and consider what it means to be human and how to manage our changing relationship to nature and our new powers over life. This shift in focus requires new design methods, roles, and contexts.

pages: 175 words: 45,815

Automation and the Future of Work
by Aaron Benanav
Published 3 Nov 2020

It is on this basis that commentators typically cite rapidly rising labor productivity, rather than an influx of low-cost imports from abroad, as the primary cause of industrial job loss in advanced economies.10 On closer inspection, however, this explanation also turns out to be inadequate. Manufacturing productivity has been growing at a sluggish pace for decades, leading economist Robert Solow to quip, “We see the computer age everywhere, except in the productivity statistics.”11 Automation theorists discuss this “productivity paradox” as a problem for their account—explaining it in terms of weak demand for products, or the persistent availability of low-wage workers—but they understate its true significance. This is partly due to the appearance of steady labor-productivity growth in US manufacturing, at an average rate of around 3 percent per year since 1950.

pages: 372 words: 152

The End of Work
by Jeremy Rifkin
Published 28 Dec 1994

More than half of all black workers held positions in the four job categories where companies made net employment cuts: office and clerical, skilled, semi-skilled and laborers."34 John Johnson, the director of labor for the National Association for the Advancement of Colored People (NAACP), says that "what the whites often don't realize is that while they are in a recession, blacks are in a depression."35 More than forty years ago, at the dawn of the computer age, the father of cybernetics, Norbert Wiener, warned of the likely adverse consequences of the new automation technologies. "Let us remember," he said, "that the automatic machine ... is the precise economic equivalent of slave labor. Any labor which competes with slave labor must accept the economic consequences of slave labor."36 Not surprisingly, the first community to be devastated by the cybernetics revolution was black America.

Borrus opined that "for every company using computers right, there is one using them wrong—and the two negate each other."9 American corporations and companies around the world had been structured one hundred years earlier to produce and distribute goods and services in an age of rail transport and telephone and postal communication. Their organizational apparatus proved wholly inadequate to deal with the speed, agility, and information-gathering ability of computer-age technology. OLD-FASHIONED MANAGEMENT Modern management had its birth in the railroad industry in the 1850s. In the early years, railroads ran their trains along a single track. Keeping "track" of train movements became critical to maintaining safe passage along the line. When the Western Railroad experienced a series of accidents on its Hudson River rail, culminating in a head-on crash on October 4, 1841, that killed a passenger and conductor, the company responded to the growing safety problem by instituting elaborate changes in its organizational management, including a more systematic process of data collection from its roadmasters and faster dissemination of vital scheduling information to its train crews.

pages: 675 words: 141,667

Open Standards and the Digital Age: History, Ideology, and Networks (Cambridge Studies in the Emergence of Global Enterprise)
by Andrew L. Russell
Published 27 Apr 2014

After all, many standards for American telephone and telegraph networks in the nineteenth century were established within the corporate hierarchies of AT&T and Western Union, and the self-conscious movement for open systems and open standards did not begin until the 1970s. But the key principles and formative practices of open standards – due process, consensus, and a balance of interests – were not inventions of the computer age; rather, their roots stretch back to the late nineteenth century, when engineers first experimented with specialized committees to set industrywide standards. In order to understand the technological and ideological history of the twenty-first-century digital age, it is necessary to disrupt the familiar linear narrative of communication networks (telegraph to telephone to Internet) and explore how the key principles and formative practices of industrial standardization emerged from a variety of American industrial practices in the late nineteenth century.

Software and protocols, which were new, were left to graduate students.19 The split between hardware and software was a recent development in computing, and perhaps just as significant for the history of standardization as for the history of computing.20 Indeed, most instances of compatibility standards to this point – with the minor yet telling exceptions of programming languages such as FORTRAN discussed in Chapter 5 – concerned interfaces between tangible objects such as nuts and bolts or electrical plugs. In the computer age, the boundaries between technologies and organizations increasingly became problems that could be solved by software. We will see how the speed with which software could be written and deployed introduced new dynamics into negotiations over standards that had previously implied changes in the manufacture of artifacts.

pages: 193 words: 47,808

The Flat White Economy
by Douglas McWilliams
Published 15 Feb 2015

IT investment as an enabling technology In the 1980s, economic studies seemed unable to find much evidence that information technology was making much difference to economic growth. This was known at the time as the ‘productivity paradox.’6 Nobel laureate Robert Solow famously quipped “You can see the computer age everywhere except in the productivity statistics”.7 But micro studies since have provided compelling evidence that IT is not only a technology that enhances growth but also one that enables further productivity gains. A Google search lists over 64,000 references for ‘information technology as an enabler’.8 The modern thinking about IT investment is that it encourages improved and innovative products, services and methods of production.

pages: 237 words: 50,758

Obliquity: Why Our Goals Are Best Achieved Indirectly
by John Kay
Published 30 Apr 2010

Chapter 10 COMPLEXITY—How the World Is Too Complex for Directness to Be Direct Computers don’t do obliquity. Computers work through prescribed routines of any degree of complexity in a direct, linear manner with incredible speed and accuracy. Sudoku is an easy problem for a computer, and chess seems not much harder. At the dawn of the computer age, some people really believed that not just sudoku and chess but lives, loves and businesses could be efficiently run by computer. Herbert Simon, a pioneer of artificial intelligence, wrote (in 1958) that: there are now in the world machines that think, that learn and that create. Moreover, their ability to do other things is going to increase rapidly until—in a visible future—the range of problems they can handle will be coextensive with the range to which the human mind has been applied.1 Such a machine is the murderous computer HAL, star of Stanley Kubrick’s film 2001: A Space Odyssey (released in 1968).

pages: 209 words: 53,175

The Psychology of Money: Timeless Lessons on Wealth, Greed, and Happiness
by Morgan Housel
Published 7 Sep 2020

You might be on the clock for fewer hours than you would in 1950. But it feels like you’re working 24/7. Derek Thompson of The Atlantic once described it like this: If the operating equipment of the 21st century is a portable device, this means the modern factory is not a place at all. It is the day itself. The computer age has liberated the tools of productivity from the office. Most knowledge workers, whose laptops and smartphones are portable all-purpose media-making machines, can theoretically be as productive at 2 p.m. in the main office as at 2 a.m. in a Tokyo WeWork or at midnight on the couch.²⁹ Compared to generations prior, control over your time has diminished.

pages: 495 words: 154,046

The Rights of the People
by David K. Shipler
Published 18 Apr 2011

And that is not the standard for issuance of an NSL.”40 A BIGGER HAYSTACK Following the revelations in the 1970s about FBI snooping, the agency reportedly stopped amassing huge files on people who were not part of criminal or counterintelligence investigations. In the first place, everything was on paper, one former agent noted, and warehouse space was limited. But it doesn’t take warehouses to store data in a computer age, and since 9/11, law enforcement and intelligence officers have understood one very bold lesson: They risk more criticism from Congress and the public by missing an attack than by violating privacy. So, they collect. “As a criminal investigator, my goal is to gather evidence necessary to prosecute a bad guy,” to get “one step closer to putting that guy in handcuffs and going to court,” said Mike German, the former undercover agent who infiltrated domestic militia groups for the FBI.

• • • “If a burglar brings us documents he stole from someone’s house,” said a federal prosecutor in California, “the law is clear that we can use that information. There’s no Fourth Amendment violation, as long as we didn’t instigate the burglary.” So, the private, high-tech burglar has flourished in the computer age like a digital bounty hunter. Immune from the Fourth Amendment, a hacker with a moral cause can burrow into people’s online crimes for the sheer satisfaction of seeing the criminals put away. Even if he violates anti-hacking laws, he hardly risks arrest by police who are grateful for his tips and files of electronic evidence.

pages: 464 words: 155,696

Becoming Steve Jobs: The Evolution of a Reckless Upstart Into a Visionary Leader
by Brent Schlender and Rick Tetzeli
Published 24 Mar 2015

We also relied on passages from the following books: Gates, by Stephen Manes and Paul Andrews; Odyssey: Pepsi to Apple, A Journey of Adventure, Ideas, and the Future, by John Sculley; The Bite in the Apple: A Memoir of My Life with Steve Jobs, by Chrisann Brennan; Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Company, by Owen W. Linzmayer; Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age, by Michael A. Hiltzik; and Insanely Great: The Life and Times of Macintosh, the Computer That Changed Everything, by Steven Levy; as well as Moritz’s The Little Kingdom, and Wozniak and Smith’s iWoz. Other journalistic sources included “The Fall of Steve” by Bro Uttal, published in Fortune on August 5, 1985; and the PBS television documentary The Entrepreneurs, broadcast in 1986.

Stuttgart: Arnoldsche Verlagsanstalt, 2014. Grove, Andrew S. Swimming Across: A Memoir. New York: Grand Central Publishing, 2001. Hertzfeld, Andy. Revolution in the Valley: The Insanely Great Story of How the Mac Was Made. Sebastopol, CA: O’Reilly Media, 2004. Hiltzik, Michael A. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperBusiness, 1999. Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011. Kahney, Leander. Jony Ive: The Man Behind Apple’s Greatest Products. New York: Portfolio Hardcover, 2013. Krueger, Myron W. Artificial Reality II. Boston: Addison-Wesley Professional, 1991. Lashinsky, Adam.

pages: 519 words: 142,646

Track Changes
by Matthew G. Kirschenbaum
Published 1 May 2016

But this was by no means true of elsewhere in the world, even other Anglophone nations. By contrast, for example, Joe Moran suggests that in the United Kingdom it was the September 1985 debut of Alan Sugar’s Amstrad PCW 8256 that marked “the tipping point when many writers, published and aspiring, made the trek to Dixons, where it was exclusively sold, and joined the computer age.” The Amstrad, which Sugar had designed after seeing word processors in Tokyo, came with a twelve-inch screen, a dot matrix printer, and its own word processing program, called LocoScript. It cost £399. See Moran, “Typewriter, You’re Fired! How Writers Learned to Love the Computer,” The Guardian, August 28, 2015, http://www.theguardian.com/books/2015/aug/28/how-amstrad-word-processor-encouraged-writers-use-computers. 14.

Catano, “Poetry and Computers: Experimenting with the Communal Text,” Computers and the Humanities 13, no. 4 (October–December 1979): 269–275; see also Catano, “Computer-Based Writing: Navigating the Fluid Text,” College Composition and Communication 36, no. 3 (October 1985): 309–316. 12. Larry Tesler, interview with the author, October 11, 2013. 13. Ibid. 14. Ibid. 15. The best overview of Xerox PARC’s history and associated innovations is Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: Harper, 1999). 16. This episode (and Apple’s subsequent development of the technologies) is recounted in detail in Steven Levy, Insanely Great: The Life and Times of Macintosh, the Computer That Changed Everything (New York: Penguin, 1994), 77–103. For video of Tesler describing the demo in 2011, see Philip Elmer-DeWitt, Fortune, August 24, 2014, embedded YouTube video, http://fortune.com/2014/08/24/raw-footage-larry-tesler-on-steve-jobs-visit-to-xerox-parc/. 17.

pages: 543 words: 153,550

Model Thinker: What You Need to Know to Make Data Work for You
by Scott E. Page
Published 27 Nov 2018

As the personal computer market grew, other companies developed software compatible with DOS, providing more positive feedbacks. These events—the success of DOS, the growth of the personal computer market, and the development of software running on the DOS platform—can be thought of as one color of ball being consistently drawn from the urn. Each outcome made the next more likely. The computer age may have been inevitable, but Microsoft’s central role and the growth of the personal computer represent one of many potential paths. We can contrast the path dependence of Microsoft’s growth with the assassination of Archduke Franz Ferdinand on June 28, 1914, which many see as a tipping point that led to World War I.
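The urn process Page invokes here is the classic Pólya urn: each draw of a color adds another ball of that color, so early chance draws compound into long-run lock-in. A minimal sketch of that dynamic, assuming a two-color urn (the function name and parameters are my own, purely for illustration):

```python
import random

def polya_urn(steps, seed=0):
    """Simulate a simple two-color Polya urn.

    Start with one ball of each color; each draw puts the drawn ball
    back plus one more of the same color, so the probability of a color
    rises with every draw of that color -- positive feedback.
    """
    rng = random.Random(seed)
    counts = {"red": 1, "blue": 1}
    for _ in range(steps):
        total = counts["red"] + counts["blue"]
        color = "red" if rng.random() < counts["red"] / total else "blue"
        counts[color] += 1
    return counts

# Runs that differ only in their early luck (the seed) settle on very
# different long-run shares, even though the rule is identical.
shares = [polya_urn(1000, seed=s)["red"] / 1002 for s in range(5)]
```

Each seed stands in for a different sequence of early events (DOS succeeding, software following); the rule never changes, but the path taken does, which is the sense in which Microsoft's dominance was one of many potential outcomes.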

Quarterly Journal of Economics 87, no. 3: 355–374. Squicciarini, Mara, and Nico Voigtländer. 2015. “Human Capital and Industrialization: Evidence from the Age of Enlightenment.” Quarterly Journal of Economics 130, no. 4: 1825–1883. Starfield, Anthony, Karl Smith, and Andrew Bleloch. 1994. How to Model It: Problem Solving for the Computer Age. Minneapolis, MN: Burgess International. Stein, Richard A. 2011. “Superspreaders in Infectious Diseases.” International Journal of Infectious Diseases 15, no. 8: e510–e513. Sterman, John D. 2000. Business Dynamics: Systems Thinking and Modeling for a Complex World. New York: McGraw-Hill. Sterman, John. 2006.

pages: 524 words: 154,652

Blood in the Machine: The Origins of the Rebellion Against Big Tech
by Brian Merchant
Published 25 Sep 2023

Land of Lost Content: The Luddite Revolt. London: Heinemann, 1986. Romilly, Samuel. The Life of Sir Samuel Romilly Written by Himself, with a Selection from His Correspondence. 2 vols. London: John Murray, 1842. Sale, Kirkpatrick. Rebels against the Future: The Luddites and Their War on the Industrial Revolution; Lessons for the Computer Age. Reading, MA: Addison-Wesley, 1995. Sandler, Matt. The Black Romantic Revolution: Abolitionist Poets at the End of Slavery. London: Verso, 2020. Say, Jean-Baptiste. A Treatise on Political Economy. Trans. Charles Robert Trinsop. London, 1821. First published as Traité d’économie politique, ou simple exposition de la manière dont se forment les richesses (1803).

Imagine millions of ordinary people Scenes like the one that opens the book, as well as numerous newspaper accounts, are detailed in nearly every major Luddite history, including Alan Brooke and Leslie Kipling, Liberty or Death: Radicals, Republicans and Luddites 1728–1828 (Huddersfield, UK: Huddersfield Local History Society, 2012); Frank Peel, The Risings of the Luddites, Chartists, and Plugdrawers (Heckmondwike, UK, 1880); and Kirkpatrick Sale, Rebels against the Future: The Luddites and Their War on the Industrial Revolution; Lessons for the Computer Age (Reading, MA: Addison-Wesley, 1995). The scene describes the way the Luddites typically assembled before embarking on a raid against an entrepreneur who ran automating machinery. 2. The public cheered F[rank]. O[gley]. Darvall, Popular Disturbances and Public Order in Regency England: Being an Account of the Luddite and Other Disorders in England During the Years 1811–1817 and of the Attitude and Activity of the Authorities (London: Oxford University Press, 1934; reprint, 1969), introduction. 3.

pages: 223 words: 58,732

The Retreat of Western Liberalism
by Edward Luce
Published 20 Apr 2017

But the growth vanished almost as quickly as it came. We are still awaiting the productivity gains we were assured would result from the digital economy. With the exception of most of the 1990s, productivity growth has never recaptured the rates it achieved in the post-war decades. ‘You can see the computer age everywhere but in the productivity statistics,’ said Robert Solow, the Nobel Prize-winning economist. Peter Thiel, the Silicon Valley billionaire, who has controversially backed Donald Trump, put it more vividly: ‘We wanted flying cars, instead we got 140 characters [Twitter].’ That may be about to change, with the acceleration of the robot revolution and the spread of artificial intelligence.

pages: 204 words: 58,565

Keeping Up With the Quants: Your Guide to Understanding and Using Analytics
by Thomas H. Davenport and Jinho Kim
Published 10 Jun 2013

Rama Ramakrishnan, “Three Ways to Analytic Impact,” The Analytic Age blog, July 26, 2011, http://blog.ramakrishnan.com/. 10. People v. Collins, 68 Cal. 2d 319 (1968); http://scholar.google.com/scholar_case?case=2393563144534950884; “People v. Collins,” http://en.wikipedia.org/wiki/People_v._Collins. Chapter 3 1. A. M. Starfield, Karl A. Smith, and A. L. Bleloch, How to Model It: Problem Solving for the Computer Age (New York: McGraw-Hill, 1994), 19. 2. George Box and Norman R. Draper, Empirical Model-Building and Response Surfaces (New York: Wiley, 1987), 424. 3. Garth Sundem, Geek Logik: 50 Foolproof Equations for Everyday Life (New York: Workman, 2006). 4. Minnie Brashears, Mark Twain, Son of Missouri (Whitefish, MT: Kessinger Publishing, 2007). 5.

pages: 194 words: 57,434

The Age of AI: And Our Human Future
by Henry A Kissinger , Eric Schmidt and Daniel Huttenlocher
Published 2 Nov 2021

Europe, unlike China and the United States, has yet to create homegrown global network platforms or cultivate the sort of domestic digital technology industry that has supported the development of major platforms elsewhere. Still, Europe commands the attention of the major network platform operators with its leading companies and universities, its tradition of Enlightenment exploration, which laid essential foundations for the computer age, its sizable market, and a regulatory apparatus that is formidable in its ability to innovate and impose legal requirements. Yet Europe continues to face disadvantages for the initial scaling of new network platforms because of its need to serve many languages and national regulatory apparatuses in order to reach its combined market.

pages: 864 words: 272,918

Palo Alto: A History of California, Capitalism, and the World
by Malcolm Harris
Published 14 Feb 2023

The path from the vacuum-tube triode to the silicon integrated circuit was historically short—a matter of a few decades—and geographically even shorter. Hewlett-Packard managed to bridge the two technologies cleanly, and that stability amid the churning start-up seas helped make it the region’s signature firm, even as its followers overtook it in size. Unlike Litton or Varian, HP made it from the tube to the computer age on its own terms. More than Shockley or even Fred Terman, H and P were suited to the era. Offshore To understand the emergence of Silicon Valley, we need to grasp more than the procession of inventors and investors who came to define the region. We also need to understand the role Silicon Valley played in the transition from a chaotic global order based on national rivalries and alliances to the Cold War’s bipolar reorganization.

Ibid., 22. 10. “The 1968 Demo—Interactive,” Doug Engelbart Institute, https://dougengelbart.org/content/view/374/464. 11. Leslie Berlin, Troublemakers: Silicon Valley’s Coming of Age (New York: Simon & Schuster, 2017), 29. 12. Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: HarperBusiness, 1999), 67, 78. 13. Engelbart, “Augmenting Human Intellect,” para. 4d. 14. John Markoff, What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry (New York: Viking, 2005), 26–28. 15. Jay Stevens, Storming Heaven: LSD and the American Dream, Perennial Library (New York: Harper & Row, 1995), 178. 16.

Margaret O’Mara, The Code: Silicon Valley and the Remaking of America (New York: Penguin Press, 2019), 189. 12. Steve Johnson, “What You Didn’t Know about Apple’s ‘1984’ Super Bowl Ad,” Chicago Tribune, February 2, 2017. 13. Sol Libes, “Bytelines,” BYTE, June 1961, 208. 14. Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: HarperBusiness, 1999), 335. 15. Brice Carnahan and James O. Wilkes, The IBM Personal Computers and the Michigan Terminal System (Ann Arbor: University of Michigan College of Engineering, 1984), 1.32. 16. Hiltzik, Dealers of Lightning, 344–45. 17. Andrew Pollack, “Big I.B.M. Has Done It Again,” New York Times, March 27, 1983, https://www.nytimes.com/1983/03/27/business/big-ibm-has-done-it-again.html. 18.

pages: 219 words: 63,495

50 Future Ideas You Really Need to Know
by Richard Watson
Published 5 Nov 2013

And don’t think you’re safe at work either: 75 percent of US companies monitor employees’ email and 30 percent track keystrokes and the amount of time employees spend on their computer. Monitoring employee activity isn’t new, but it is becoming more pervasive thanks to digital technologies that make activities easier to capture, store and search. Other by-products of the computer age that go unnoticed include cell phones, most of which now contain cameras, which may one day be linked to face recognition technology. On top of that, people are increasingly choosing to communicate with each other through digital interfaces, which leave a digital trace. Nothing is private As a consequence, we can now look very closely at things that were previously unobservable.

Speaking Code: Coding as Aesthetic and Political Expression
by Geoff Cox and Alex McLean
Published 9 Nov 2012

London: Pluto Press, 2008. Bogost, Ian. Persuasive Games: The Expressive Power of Videogames. Cambridge, MA: MIT Press, 2007. Bogost, Ian. Unit Operations: An Approach to Videogame Criticism. Cambridge, MA: MIT Press, 2006. Bolter, Jay David. Turing’s Man: Western Culture in the Computer Age. Chapel Hill: University of North Carolina Press, 1984. Bolter, Jay, and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: MIT Press, 1999. Borrelli, Loretta. “The Suicide Irony: Seppukoo and Web 2.0 Suicide Machine.” Digimag 52 (March 2010). Available at http://www.digicult.it/digimag/article.asp?

pages: 202 words: 62,901

The People's Republic of Walmart: How the World's Biggest Corporations Are Laying the Foundation for Socialism
by Leigh Phillips and Michal Rozworski
Published 5 Mar 2019

Today, this may seem obvious (and its very obviousness is likely a product of how influential cybernetic notions have become in our culture; this is where the word “feedback” comes from), but at the time, when linear, “if this, then that” control systems dominated, this was a revelation. As Richard Barbrook recounts in his 2007 history of the dawn of the computer age, Imaginary Futures, despite the military engineering origins of the field, Wiener would go on to be radicalized by the Cold War and the arms race, not only declaring that scientists had a responsibility to refuse to participate in military research, but asserting the need for a socialist interpretation of cybernetics.

Debtor Nation: The History of America in Red Ink (Politics and Society in Modern America)
by Louis Hyman
Published 3 Jan 2011

Dixon saw this legislation as important because “the rapid growth of interconnected credit bureaus tied in with computer centers and telephone lines constitutes an agglomerate growth pattern which will likely parallel, in ultimate significance, the history of the railroad and telephone systems.”157 Credit reporting would form the infrastructure of the computer-age economy, as the railroad and telephone had made possible the industrial-age economy. To Dixon and the authors of the bill, the fairness of the Fair Credit Reporting Act (FCRA) was in preserving the accuracy of information. Dixon saw “the two words as synonymous in this bill. Fair and accurate.”158 These two issues, privacy and accuracy, drove the debates surrounding the FCRA, and how to resolve them would shape the future of the credit reporting industry, which, outside of legal questions, was under other pressures as well.

As Retail Credit Company used its large capital to become Equifax in the mid-1970s, it changed more than its name—it changed the way it did business. The older, inaccurate, expensive investigative reports gave way to the new methods. The profitability of the new credit methods led to Credit Data’s acquisition by the large conglomerate TRW in the early 1970s.186 Following the efficiencies of the computer age, TRW abandoned investigative reports on consumers and had no information about habits, moral character, driving record, or health in their records, just financial data on outstanding debts, income, and payment histories.187 Rather than relying on investigators, TRW relied on accounting books. The information was cheaper, more reliable, and easier to quantify and to store on a computer’s magnetic tape.

pages: 578 words: 168,350

Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms, Cities, Economies, and Companies
by Geoffrey West
Published 15 May 2017

Indeed, the litany of such discoveries both large and small is testament to the extraordinary ingenuity of the collective human mind. Unfortunately, however, there is another serious catch. Theory dictates that such discoveries must occur at an increasingly accelerating pace; the time between successive innovations must systematically and inextricably get shorter and shorter. For instance, the time between the “Computer Age” and the “Information and Digital Age” was perhaps twenty years, in contrast to the thousands of years between the Stone, Bronze, and Iron ages. If we therefore insist on continuous open-ended growth, not only does the pace of life inevitably quicken, but we must innovate at a faster and faster rate.

This quintessential modern human dynamic has led to the speeding up of the pace of life and of the rate at which we have to make major innovations in order to combat the imminent threat of what finite time singularities portend. The image of an accelerating Sisyphus haunts us. The time between the “Computer Age” and the “Information and Digital Age” was no more than about thirty years—to be compared with the thousands of years between the Stone, Bronze, and Iron ages. The clock by which we measure time on our watches and digital devices is very misleading; it is determined by the daily rotation of the Earth around its axis and its annual rotation around the sun.

pages: 1,243 words: 167,097

One Day in August: Ian Fleming, Enigma, and the Deadly Raid on Dieppe
by David O’keefe
Published 5 Nov 2020

Kahn, David. The Codebreakers: The Comprehensive History of Secret Communication from Ancient Times to the Internet. New York: Simon & Schuster, 1996. ——. Seizing the Enigma: The Race to Break the German U-Boat Codes: 1939–1943. Barnsley, UK: Pen and Sword, 2012. Montaqim, Abdul. Pioneers of the Computer Age: From Charles Babbage to Steve Jobs. Monsoon Media, 2012. Newton, David E. Alan Turing: A Study in Light and Shadow. Bloomington, IN: Xlibris, 2003. Paterson, Michael. Voices of the Codebreakers. London: David and Charles, 2007. Sebag-Montefiore, Hugh. Enigma. London: Weidenfeld & Nicolson, 2000.

The author would like to thank Joel Silver for his kind help in sorting through the massive collection. 18. Charles Babbage, The Writings of Charles Babbage (Alvin, TX: Halcyon Press, 2009), Kindle edition; Bruce Collier and James MacLachlan, Charles Babbage and the Engines of Perfection (New York: Oxford University Press, 1999); Abdul Montaqim, Pioneers of the Computer Age: From Charles Babbage to Steve Jobs (London: Monsoon Media, 2012). The vast collection at Indiana University includes Babbage’s ‘Reflexions on the decline of science in England, and some of its causes’ from 1830 and the work of Dionysius Lardner, who in 1834 penned a review of seven papers by or about Babbage’s work including the Second ‘Difference Engine’ and one on a projected analytical machine. 19.

pages: 266 words: 67,272

Fun Inc.
by Tom Chatfield
Published 13 Dec 2011

And yet, as the next chapter explores, their own brief history represents an evolution of incredible rapidity and scope: one that has from the beginning lain at the cutting edge of the computer revolution, and that is now beginning to remould our actions everywhere from the classroom to the boardroom to the arenas of twenty-first century warfare. CHAPTER 2 Technology and magic ‘Any sufficiently advanced technology is indistinguishable from magic’ wrote the science fiction novelist Arthur C Clarke in 1973, giving the computer age one of its most memorable maxims. Had Clarke, who died in 2008, lived just a year longer, he would have been able to see a piece of technology being demonstrated at a 2009 Expo in Los Angeles that looked, to many in the audience, very close to magic indeed. The machine, perched on a black conical stand, looked like nothing so much as an oversized television remote control.

Kindle Fire: The Missing Manual
by Peter Meyers
Published 9 Feb 2012

Visit the Dojo to select new styles, more of which become available the more you play. Quizzes and Brain Teasers Rest your fingers. Time to exercise your mind. Video games meet education in these brain-building apps. The Secret of Grisly Manor ($0.99). This app is one of the best examples of what some predicted would happen to novels in the Computer Age: They’d turn into audience-piloted explorations of richly illustrated worlds. Behind each door, and within each room, users could inspect objects, chew on clues (“There’s something hidden underneath the grate”), and decide how to navigate the storyline. Needless to say, Stephen King still has a job.

pages: 230 words: 71,320

Outliers
by Malcolm Gladwell
Published 29 May 2017

been the massive, expensive mainframes of the sort sitting in the white expanse of the Michigan Computer Center. For years, every hacker and electronics whiz had dreamt of the day when a computer would come along that was small and inexpensive enough for an ordinary person to use and own. That day had finally arrived. If January 1975 was the dawn of the personal computer age, then who would be in the best position to take advantage of it? The same principles apply here that applied to the era of John Rockefeller and Andrew Carnegie. “If you're too old in nineteen seventy-five, then you'd already have a job at IBM out of college, and once people started at IBM, they had a real hard time making the transition to the new world,” says Nathan Myhrvold, who was a top executive at Microsoft for many years.

pages: 235 words: 62,862

Utopia for Realists: The Case for a Universal Basic Income, Open Borders, and a 15-Hour Workweek
by Rutger Bregman
Published 13 Sep 2014

Whereas in 1800, water power still supplied England with three times the amount of energy as steam, 70 years later English steam engines were generating the power equivalent of 40 million grown men.24 Machine power was replacing muscle power on a massive scale. Now, two centuries later, our brains are next. And it’s high time, too. “You can see the computer age everywhere but in the productivity statistics,” the economist Bob Solow said in 1987. Computers could already do some pretty neat things, but their economic impact was minimal. Like the steam engine, the computer needed time to, well, gather steam. Or compare it to electricity: All the major technological innovations happened in the 1870s, but it wasn’t until around 1920 that most factories actually switched to electric power.25 Fast forward to today, and chips are doing things that even ten years ago were still deemed impossible.

pages: 391 words: 71,600

Hit Refresh: The Quest to Rediscover Microsoft's Soul and Imagine a Better Future for Everyone
by Satya Nadella , Greg Shaw and Jill Tracie Nichols
Published 25 Sep 2017

In recent decades, the world has invested hundreds of billions of dollars in technology infrastructure—PCs, cell phones, tablets, printers, robots, smart devices of many kinds, and a vast networking system to link them all. The aim has been to increase productivity and efficiency. Yet what, exactly, do we have to show for it? Nobel Prize–winning economist Robert Solow once quipped, “You can see the computer age everywhere but in the productivity statistics.” However, from the mid-1990s to 2004, the PC Revolution did help to reignite once-stagnant productivity growth. But other than this too brief window, worldwide per capita GDP growth—a proxy for economic productivity—has been disappointing, just a little over 1 percent per year.

pages: 281 words: 71,242

World Without Mind: The Existential Threat of Big Tech
by Franklin Foer
Published 31 Aug 2017

He imagined a test of the computer’s intelligence in which a person would send written questions to a human and a machine in another room. Receiving two sets of answers, the interrogator would have to guess which answers came from the human. Turing predicted that within fifty years the machine would routinely fool the questioner. • • • THIS PREDICTION SET THE TERMS for the computer age. Ever since, engineers have futilely attempted to build machines capable of passing Turing’s test. For many of those seeking to invent AI, their job is just a heap of mathematics, a thrilling intellectual challenge. But for a significant chunk of others, it’s a theological pursuit. They are at the center of a transformative project that will culminate in the dawning of a new age.

pages: 239 words: 56,531

The Secret War Between Downloading and Uploading: Tales of the Computer as Culture Machine
by Peter Lunenfeld
Published 31 Mar 2011

They will have enough power to be entirely shaped by software.” 18 . Kay was especially impressed by the ways in which MIT mathematician Seymour Papert used Piaget’s theories when he developed the LOGO programming language. 19 . See Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: HarperCollins, 1999); Douglas K. Smith and Robert C. Alexander, Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer (New York: William Morrow, 1988). 20 . I take this phrase from Robert X. Cringely’s documentary for the Public Broadcasting System, The Triumph of the Nerds: The Rise of Accidental Empires (1996), which drew from his earlier book Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can’t Get a Date (New York: Harper Business, 1993). 21 .

pages: 233 words: 66,446

Bitcoin: The Future of Money?
by Dominic Frisby
Published 1 Nov 2014

When such communication takes place electronically, and confidentiality is desired – perhaps it is a communication between you and your bank, and you don’t want internet service providers, crooks or the NSA to know what’s being said – some sort of encryption software is used. As you can imagine, the role of encryption in the computer age is enormous. The science of encoding and decoding data to maintain privacy is the science of cryptography. And remember, when Satoshi Nakamoto first announced Bitcoin, he did it on a mailing list only read by people with an interest in cryptography. The primitive tropical island that would become a blueprint for Bitcoin The tiny tropical island of Yap lies in the western Pacific Ocean, about a thousand miles to the east of the Philippines.

pages: 212 words: 68,754

Thinking in Numbers
by Daniel Tammet
Published 15 Aug 2012

Near the end of the line, I gathered my things – my bearings too – and stepped out. The platform was covered in litter and broken glass, but for an instant, at least, it felt unambiguously good to be back. Time is more than an attitude or a frame of mind. It is about more than seeing the hourglass as half empty or half full. More than ever in this age, let us call it the computer age, a lifetime has become a discrete and eminently measurable quality. To date, to believe the surveys in newspapers, I have spent some one hundred thousand minutes standing in a queue, and five hundred hours making tea. I have spent a year’s worth of waking days on the hunt for lost things. This year, I knew, contained my twelve thousand and twelfth day and night.

pages: 272 words: 64,626

Eat People: And Other Unapologetic Rules for Game-Changing Entrepreneurs
by Andy Kessler
Published 1 Feb 2011

So what I ended up doing right before the final, I guess a few days before, I went to the course Web site, downloaded all the images, made a new Web site, where there was a page for each image, right, where the image was there and there was a box to add comments, and then I sent out a link to this site to the class list, and said, ‘Hey, guys, I built a study tool. Everyone just can go use this to go comment and see what everyone else was commenting on these photos.’” Cheating in the computer age. Or was it? “So within an hour or two, a bunch of people in the class went and filled out all the information about the photos. I just went back and kind of absorbed it all. I got an A in the class. I think generally, I heard something afterwards, that the grades in that class on the final were way higher than they have ever been.”

pages: 245 words: 12,162

In Pursuit of the Traveling Salesman: Mathematics at the Limits of Computation
by William J. Cook
Published 1 Jan 2011

It is not that we have a desperate thirst for the details of particular optimal tours, but rather a desperate need to know that the TSP can be pushed back just a bit further. The salesman may defeat us in the end, but not without a good fight. From 49 to 85,900 The heroes of the field are Dantzig, Fulkerson, and Johnson. Despite the dawning of the computer age and a steady onslaught of new researchers tackling the TSP, the 49-city example that Dantzig et al. solved by hand stood as an unapproachable record for seventeen years. Algorithms were developed, computer codes written, and research reports published, but year after year their record held its ground.

The Intelligent Asset Allocator: How to Build Your Portfolio to Maximize Returns and Minimize Risk
by William J. Bernstein
Published 12 Oct 2000

Why is this so important? As already discussed, the greatest diversification benefit is obtained from uncorrelated assets.

Math Details: How to Calculate a Correlation Coefficient

In this book’s previous versions, I included a section on the manual calculation of the correlation coefficient. In the personal computer age, this is an exercise in masochism. The easiest way to do this is with a spreadsheet. Let’s assume that you have 36 monthly returns for two assets, A and B. Enter the returns in columns A and B, next to each other, spanning rows 1 to 36 for each pair of values. In Excel, enter in a separate cell the formula = CORREL(A1:A36, B1:B36). In Quattro Pro, the formula would be @CORREL(A1..A36, B1..B36). Both of these packages also contain a tool that will calculate a “correlation grid” of all of the correlations of an array of data for more than two assets. Those of you who would like an explanation of the steps involved in calculating a correlation coefficient are referred to a standard statistics text.

The above analysis suggests that there is not much benefit from mixing domestic small and large stocks and that there is great benefit from mixing REITs and Japanese small stocks.
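The spreadsheet recipe above can be mirrored in a few lines of code. A minimal sketch (not from the book; the function name and sample returns are illustrative) of the same Pearson correlation coefficient that Excel's CORREL returns:

```python
import statistics

def correlation(xs, ys):
    """Pearson correlation coefficient of two return series --
    the quantity CORREL(A1:A36, B1:B36) computes for two columns."""
    n = len(xs)
    mean_x, mean_y = statistics.fmean(xs), statistics.fmean(ys)
    # Sample covariance, divided by the product of sample standard deviations.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / (n - 1)
    return cov / (statistics.stdev(xs) * statistics.stdev(ys))

# Illustrative monthly returns for two assets (made-up numbers):
asset_a = [0.01, 0.02, -0.01, 0.03]
asset_b = [0.02, 0.04, -0.02, 0.06]   # exactly twice asset_a, so r = 1.0
```

With 36 monthly returns per asset, `correlation(asset_a, asset_b)` would match the spreadsheet output to floating-point precision.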

pages: 244 words: 66,599

Insanely Great: The Life and Times of Macintosh, the Computer That Changed Everything
by Steven Levy
Published 2 Feb 1994

All over the Valley, people were whispering about how a small group of geniuses was devising something along the lines of Apple's impressive but prohibitively expensive Lisa computer. Introduced in January 1983, Lisa had been acclaimed as offering breakthrough technology, but few could afford it. Hopes abounded, however, that this new computer would break through to the masses, single-handedly launching the computer age into the stratosphere. Those depressed by the ease with which IBM had rocketed ahead of Apple looked to this new machine as the magic bullet that could stop Big Blue in its tracks. Very little in the way of specifics had leaked out of Apple, but it was common knowledge that the shipping date had slipped more than once.

pages: 232 words: 71,237

Kill It With Fire: Manage Aging Computer Systems
by Marianne Bellotti
Published 17 Mar 2021

The best codes now were ones that were a bit more complex, had a fixed length, and ultimately stored more data. A few different systems were developed. The first one to stick was developed by Emile Baudot in 1870. The so-called Baudot code, aka International Telegraph Alphabet No. 1, was a 5-bit binary system. Fast-forward to the early computer age when people were developing massive room-sized machines that also were using binary systems. They needed a way to input data and instructions, but they had no visual interface. Computers wouldn’t be developed to work with monitors until 1964 when Bell Labs incorporated the first primitive visual interface into the Multics time-sharing system.
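The arithmetic behind a fixed-length 5-bit code is easy to check in code. A sketch (the symbol set below is illustrative only, not Baudot's actual table):

```python
# A 5-bit fixed-length code can distinguish at most 2**5 = 32 symbols.
symbols = "ABCDEFGHIJKLMNOPQRSTUVWXYZ .,?!"   # 31 symbols, illustrative only

# Assign each symbol a distinct 5-bit pattern.
codebook = {ch: format(i, "05b") for i, ch in enumerate(symbols)}

def encode(text):
    """Encode text as space-separated 5-bit groups."""
    return " ".join(codebook[ch] for ch in text)
```

Because every code word is the same length, a receiver needs no separator logic beyond counting bits in fives, which is what made fixed-length codes attractive for machine input.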

In the Age of the Smart Machine
by Shoshana Zuboff
Published 14 Apr 1988

Whenever one is dealing with a multiplicity of individuals on whom a task or a particular form of behavior must be imposed, the panoptic schema may be used."

THE PANOPTIC POWER OF INFORMATION TECHNOLOGY

Information systems that translate, record, and display human behavior can provide the computer age version of universal transparency with a degree of illumination that would have exceeded even Bentham's most outlandish fantasies. Such systems can become information panopticons that, freed from the constraints of space and time, do not depend upon the physical arrangement of buildings or the laborious record keeping of industrial administration.

Ramchandran Jaikumar, "Postindustrial Manufacturing," Harvard Business Review (November-December 1986): 69-76. 8. David F. Noble, Forces of Production: A Social History of Industrial Automation (New York: Oxford University Press, 1986); see particularly 144-92. 9. Harley Shaiken, Work Transformed: Automation and Labor in the Computer Age (New York: Holt, Rinehart & Winston, 1985), 264. 10. Robert Howard, Brave New Workplace (New York: Viking, 1985); see particularly 15-35. 11. The psychiatrist R. D. Laing used the term "knots" to describe the complexities of interpersonal communication and understanding. See his Knots (New York: Pantheon Books, 1970).

Microchip: An Idea, Its Genesis, and the Revolution It Created
by Jeffrey Zygmont
Published 15 Mar 2003

When the Yankees play, Sue wants to watch the game clear through, from opening pitch to final out, without any between-inning cuts to catch up on the contests shown on other channels. So John retreats to a separate TV, where he is free to surf around all of Major League Baseball. Now, in an era that is mislabeled the Computer Age, cable television may not appear to occupy the technical vanguard. But consider that the ball games that entertain Sue and John on so many summer evenings first arrive at the couple's cable company as dense streams of digitally encoded signals beamed earthward by a satellite. Forget the fact that so much of the inner workings of the satellite itself consist of unimaginably compressed and compact electronic circuits that contain millions and millions of parts.

Longevity: To the Limits and Beyond (Research and Perspectives in Longevity)
by Jean-Marie Robine , James W. Vaupel , Bernard Jeune and Michel Allard
Published 2 Jan 1997

As shown by Coale and Kisker (1986, p. 398), the ratios of those aged 95 years or over to those aged 70 or over in the 23 countries with accurate data quality were all less than six per thousand, whereas the ratios for the 28 countries with poor data clearly showed the exaggeration of very old persons aged 95 or over, extending from 1% to 10%. This ratio for male and female Han Chinese in 1990 is 0.76 per thousand and 2.18 per thousand, respectively, which is almost exactly the same as their Swedish counterparts in the period from 1985-1994. Coale and Kisker calculated

2 It is important to ask for the date of birth and to compute age by subtracting it from the date of the census or survey (if respondents supply the Chinese calendar, conversion to the Western calendar is needed). If the questionnaire asks the individual's age, the Chinese system of reckoning nominal age makes the response ambiguous, because a person may be counted as one year old on the day of birth and one year older with each new year according to the Chinese tradition.
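The footnote's recommendation — compute age by subtracting the birth date from the census date rather than asking for a nominal age — can be sketched in a few lines (the function name and example dates are mine, not from the text):

```python
from datetime import date

def completed_age(birth, census):
    """Age in completed years at the census date (Western reckoning)."""
    years = census.year - birth.year
    # Subtract one if the birthday has not yet occurred in the census year.
    if (census.month, census.day) < (birth.month, birth.day):
        years -= 1
    return years
```

For example, `completed_age(date(1895, 8, 1), date(1990, 7, 1))` gives 94, whereas the traditional Chinese nominal reckoning described above could report 95 or 96 for the same person — exactly the ambiguity the footnote warns about.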

pages: 297 words: 77,362

The Nature of Technology
by W. Brian Arthur
Published 6 Aug 2009

The Sociology of Invention. Follett, Chicago. 1935. Grübler, Arnulf. Technology and Global Change. Cambridge University Press, Cambridge, UK. 1998. Heidegger, Martin. The Question Concerning Technology. Harper and Row, New York. 1977. Hiltzik, Michael A. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. HarperBusiness, New York. 1999. Hughes, Thomas P. Networks of Power: Electrification in Western Society, 1880–1930. Johns Hopkins University Press, Baltimore. 1983. Hughes, Thomas P. Rescuing Prometheus. Pantheon Books, New York. 1998. Jewkes, John, David Sawers, and Richard Stillerman. The Sources of Invention.

pages: 253 words: 80,074

The Man Who Invented the Computer
by Jane Smiley
Published 18 Oct 2010

In his recent volume of essays, historian John Lukacs catalogs the ways in which, seventy years later, World War II is still shaping the world we live in, even though all the power relationships and ideologies then in play, among the Allies and the Soviet Union and Hitler’s Germany, have shifted utterly. In the index of Lukacs’s book, no mention is made of the computer. But, as we will see, the Second World War was the sine qua non of the invention of the computer and the transformation of the nature of information and the nature of human thought that the computer age has brought about. However, we begin with another war, a small war in a place very far removed from Rock Island, Illinois. Chapter One John Vincent Atanasoff’s father, Ivan, was born in 1876, in the midst of a period of climaxing political unrest. His parents were landed peasants in the Bulgarian village of Boyadzhik (about eighty miles from the Black Sea and perhaps halfway between Istanbul and Sofia).

pages: 265 words: 79,896

Red Rover: Inside the Story of Robotic Space Exploration, From Genesis to the Mars Rover Curiosity
by Roger Wiens
Published 12 Mar 2013

During the concept phase, our support staff at JPL and Lockheed Martin consisted of the minimum number of people—basically their “dreamworks” proposal hotshots who could extract the critical information from the various experts in propulsion, thermal, navigation, and other specialties. The hotshots sent the information on to us. Actually pulling the information together into a coherent proposal was going to fall to Don and me. The word-processing era was just coming of age. Having been born closer to the computer age than Don, I took charge of publishing the volumes, while he reviewed everything and coordinated inputs. We had graduated from the TRS-80 computer to a “386” that was connected to a printer. The most recent advance was an e-mail hookup. E-mail at this point consisted of simple messages; attachments were as yet unheard of.

pages: 268 words: 75,850

The Formula: How Algorithms Solve All Our Problems-And Create More
by Luke Dormehl
Published 4 Nov 2014

“My belief is that this will one day become the norm,” Boyce says of the Quantified Self. “It will become a commodity, with its own sense of social currency.” Shopping Is Creating One of the chapters in Douglas Coupland’s debut novel Generation X, written at the birth of the networked computer age, is titled “Shopping Is Not Creating.”9 It is a wonderfully pithy observation about 1960s activism sold off as 1990s commercialism, from an author whose fiction books Microserfs and JPod perfectly lampoon techno-optimism at the turn of the millennium. It is also no longer true. Every time a person shops online (or in a supermarket using a loyalty card) their identity is slightly altered, being created and curated in such a way that is almost imperceptible.

Raw Data Is an Oxymoron
by Lisa Gitelman
Published 25 Jan 2013

At the Smithsonian, he has curated a number of exhibits concerning the interplay of computing and aerospace technology. He is the author of several books on the history of computing, including Reckoners: The Prehistory of The Digital Computer (Greenwood Press, 1983), Beyond the Limits: Flight Enters the Computer Age (MIT Press, 1989), A History of Modern Computing, 2nd ed. (MIT Press, 2003), and Internet Alley: High Technology in Tysons Corner (MIT Press, 2008). Ann Fabian teaches American history at Rutgers University. Her most recent book is The Skull Collectors: Race, Science, and America’s Unburied Dead (University of Chicago Press, 2010).

pages: 269 words: 74,955

The Crash Detectives: Investigating the World's Most Mysterious Air Disasters
by Christine Negroni
Published 26 Sep 2016

The riveting story of how experience and teamwork saved the day follows in part 5 of this book. The lesson here comes from Pearson, who said he and others learned that day that they were unprepared for the monumental leap in technology—and this from a man who had literally flown into the jet age. “Transitioning from the noncomputer age to the computer age was more difficult than transitioning from propeller planes to jets, and it wasn’t because they flew twice as high and twice as fast. It was all the big unknowns,” he said. After years of accidents attributable to pilot error, automating some functions was intended to make flying more precise, more efficient, and of course safer.

pages: 242 words: 73,728

Give People Money
by Annie Lowrey
Published 10 Jul 2018

Yet, as the economist Chad Syverson has noted, for roughly a quarter century following its introduction, productivity growth was relatively slow. The same is true for the first information technology era, when computers started to become ubiquitous in businesses and homes. As the economist Robert Solow—hence the Solow residual—quipped in 1987, “You can see the computer age everywhere but in the productivity statistics.” In most cases, productivity did speed up once innovators invented complementary technologies and businesses had a long while to adjust—suggesting that the innovation gains and job losses of our new machine age might be just around the corner. If so, mass unemployment might be a result—and a UBI might be a necessary salve.

pages: 333 words: 76,990

The Long Good Buy: Analysing Cycles in Markets
by Peter Oppenheimer
Published 3 May 2020

In the same way, a transfer of transportation away from the internal combustion engine to electrification may be technically possible, but will require an integrated power supply system and refuelling points before it can be fully adopted. Concerns about the lack of productivity growth and, therefore, the misvaluation of companies associated with technology, were widespread in the 1980s. In 1987, Nobel Laureate Robert Solow argued that ‘you can see the computer age everywhere except in the productivity statistics’.12 These concerns faded when many economies saw a dramatic improvement in productivity in the 1990s. But the weakness in productivity growth in many economies since the Great Recession and the financial crisis has once more stimulated this debate.

pages: 305 words: 75,697

Cogs and Monsters: What Economics Is, and What It Should Be
by Diane Coyle
Published 11 Oct 2021

Paul David (1990) provided one well-known historical account of these characteristics of a GPT, comparing the spread of computer technology in the 1980s to the electric dynamo in the early twentieth century, by way of explaining the ‘productivity paradox’ Robert Solow (1987, 36) had complained about: ‘You can see the computer age everywhere but in the productivity statistics.’ While the ultimate impacts were therefore substantial, the impact took a long time to show through in GDP and productivity figures. Although some economists question whether digital technologies are in the same league as these past GPTs in terms of their broad impact (Gordon 2016; Bloom et al. 2020), my view is that digital will be as transformational as earlier GPTs: eventually talking of the digital economy will sound as strange as talking of the electricity economy.

pages: 256 words: 73,068

12 Bytes: How We Got Here. Where We Might Go Next
by Jeanette Winterson
Published 15 Mar 2021

(In Britain it was the miners looking for a 35% pay increase that triggered the 3-day week. Margaret Thatcher would smash them for that when she was in power in 1985. Looking back, it seems now that the 1970s were really the end-time years of the West’s Industrial Revolution. We were waiting for the Computer Age that wasn’t quite ready. Desktop computers didn’t appear till the mid-1970s, and they weren’t built by companies, but by guys in garages. Acceleration also leads to exhaustion – because humans aren’t a version of Moore’s Law, where the number of transistors on a microchip doubles every 2 years, increasing speed and lowering price.

pages: 345 words: 75,660

Prediction Machines: The Simple Economics of Artificial Intelligence
by Ajay Agrawal , Joshua Gans and Avi Goldfarb
Published 16 Apr 2018

For some, the answer was easy: “Find where we do lots of calculations and substitute computers for humans; they’re better, faster, and cheaper.” For other businesses, it was less obvious. Nonetheless, they experimented. But the fruits of those experiments took time to materialize. Robert Solow, a Nobel laureate economist, lamented, “You can see the computer age everywhere but in the productivity statistics.”1 From this challenge came an interesting business movement called “reengineering.” In 1993, Michael Hammer and James Champy, in their book Reengineering the Corporation, argued that to use the new general-purpose technology—computers—businesses needed to step back from their processes and outline the objective they wanted to achieve.

pages: 252 words: 79,452

To Be a Machine: Adventures Among Cyborgs, Utopians, Hackers, and the Futurists Solving the Modest Problem of Death
by Mark O'Connell
Published 28 Feb 2017

With these invocations, he moves his arms downward, then outward to either side, before clasping his hands to his chest. He turns about the room, bestowing a gesture of esoteric benediction on the four points of the compass, speaking in each of these positions the hallowed name of a prophet of the computer age: Alan Turing, John von Neumann, Charles Babbage, Ada Lovelace. Then he stands perfectly still, this priestly young man, arms outspread in a cruciform posture. “Around me shines the bits,” he says, “and in me is the bytes. The data, the code, the communications. Forever, amen.” This young man, I learned, was a Swedish academic named Anders Sandberg.

pages: 280 words: 74,559

Fully Automated Luxury Communism
by Aaron Bastani
Published 10 Jun 2019

But besides those older judgements regarding the often zealous manner in which GDP was used, by the late 1980s another criticism began to emerge. Now, some said, it was no longer capable of even measuring economic growth properly. This was most famously expressed by the economist Robert Solow when he claimed in 1987 that ‘you can see the computer age everywhere but the productivity statistics.’ That conclusion was a response to the ‘productivity paradox’ which so troubled economists at the time – namely, how investment in information technology over the 1980s had a seemingly negligible impact on productivity measures, which actually slowed over the decade.

pages: 267 words: 71,941

How to Predict the Unpredictable
by William Poundstone

He was welcome to do that because he had published work of such phenomenal value to AT&T that it would have been petty for anyone to complain. Shannon was the godfather of our digital universe. His MIT master’s thesis described how symbolic logic could be encoded in electrical circuits, and how those circuits might compute using binary 0s and 1s rather than decimal digits. This was one of the founding documents of the computer age. Shannon spent a fellowship at the Institute for Advanced Study, Princeton. His first wife, Norma, poured tea for Albert Einstein one time, who “told me I was married to a brilliant, brilliant man.” That was before Shannon published the work for which he’s most renowned, “A Mathematical Theory of Communication.”

pages: 280 words: 76,638

Rebel Ideas: The Power of Diverse Thinking
by Matthew Syed
Published 9 Sep 2019

Recombination is about cross-pollination, reaching across the problem space, bringing together ideas that have never been connected before. We might call these ‘rebel combinations’: merging the old with the new, the alien and the familiar, the outside and the inside, the yin and the yang. This trend is not slowing up but accelerating in the computer age, with its vast networks. Think of Waze. This is classically recombinant, combining a location sensor, data transmission device, GPS system and social network. Or take Waymo, the self-driving car technology company, which brings together the internal combustion engine, fast computation, a new generation of sensors, extensive map and street information, and many other technologies.14 Indeed, almost all tech innovations connect disparate ideas, minds, concepts, technologies, data-sets and more.

pages: 272 words: 83,378

Digital Barbarism: A Writer's Manifesto
by Mark Helprin
Published 19 Apr 2009

And the third is screening: preventing any and all means of attack from penetrating an assiduously defended perimeter. The American tendency has been to focus on the second approach, because it is cheaper and easier than the other two. It should not be surprising that we deal with mortal peril by turning to systems analysis born of the computer age and entirely reliant upon probabilities rather than upon hard-won certainties. This has become our way of life, and its advocates, drunk on the bureaucratic elixir of information-getting, believe in it as if it were religion. Which they must, for in light of its fundamental ineffectiveness continual support requires nothing less than blind faith.

pages: 287 words: 86,919

Protocol: how control exists after decentralization
by Alexander R. Galloway
Published 1 Apr 2004

There are many different types of hardware: controllers (keyboards, joysticks), virtualization apparatuses (computer monitors, displays, virtual reality hardware), the interface itself (i.e., the confluence of the controller and the virtualization apparatus), the motherboard, and physical networks both intra (a computer’s own guts) and inter (an Ethernet LAN, the Internet). However, the niceties of hardware design are less important than the immaterial software existing within it. For, as Alan Turing demonstrated at the dawn of the computer age, the important characteristic of a computer is that it can mimic any machine, any piece of hardware, provided that the functionality of that hardware can be broken down into logical processes. Thus, the key to protocol’s formal relations is in the realm of the immaterial software. Record The first term in Net form is the record.
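Turing's point — that one machine can mimic any other once that machine's behavior is reduced to logical processes — can be made concrete with a toy simulator. A minimal sketch (the rule format and the bit-flipping example are mine, not from the text):

```python
def run_turing(tape, rules, state="q0", halt="halt", max_steps=10_000):
    """Simulate a one-tape Turing machine.
    `rules` maps (state, symbol) -> (new_state, write_symbol, move)."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, "_")          # "_" is the blank symbol
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# One concrete "machine", expressed purely as data: flip every bit, halt on blank.
flip = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
    ("q0", "_"): ("halt", "_", "R"),
}
```

Swapping in a different `rules` dictionary turns the same interpreter into a different machine — the functionality lives in the immaterial software, not the hardware, which is exactly the property the passage attributes to Turing.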

pages: 246 words: 81,625

On Intelligence
by Jeff Hawkins and Sandra Blakeslee
Published 1 Jan 2004

From the dawn of the industrial revolution, people have viewed the brain as some sort of machine. They knew there weren't gears and cogs in the head, but it was the best metaphor they had. Somehow information entered the brain and the brain-machine determined how the body should react. During the computer age, the brain has been viewed as a particular type of machine, the programmable computer. And as we saw in chapter 1, AI researchers have stuck with this view, arguing that their lack of progress is only due to how small and slow computers remain compared to the human brain. Today's computers may be equivalent only to a cockroach brain, they say, but when we make bigger and faster computers they will be as intelligent as humans.

pages: 282 words: 80,907

Who Gets What — and Why: The New Economics of Matchmaking and Market Design
by Alvin E. Roth
Published 1 Jun 2015

Apple chose a “closed” operating system that allowed it to control which apps could be sold to iPhone users. Google, which came later to the game, opted for an “open” system, publishing the code so that any developer could build for it. These choices echoed similarly opposing strategic decisions made by Apple and Microsoft at the dawn of the personal computer age. Anybody could make software for the PC platform, but only Apple (or those developers it allowed to do so) could make software for its personal computer, the Mac. These choices allowed the market for PC software to grow thick much more quickly than the market for Mac software. But Apple’s decision to keep both its hardware and software on a proprietary standard eventually allowed it to reap huge profits.

pages: 791 words: 85,159

Social Life of Information
by John Seely Brown and Paul Duguid
Published 2 Feb 2000

New York: Columbia University Press. Hesse, Carla. 1997. "Humanities and the Library in the Digital Age." In What's Happened to the Humanities?, edited by Alvin Kernan, 107–21. Princeton, NJ: Princeton University Press. Hiltzik, Michael A. 1999. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: Harper Business. Hobsbawm, Eric J. 1977. The Age of Revolution, 1789–1848. London: Abacus. Hoggdon, Paul N. 1997. "The Role of Intelligent Agent Software in the Future of Direct Response Marketing." Direct Marketing 59 (9): 10–18. Hornett, Andrea. In preparation. "Cyber Wars: Organizational Conflicts of a Virtual Team."

pages: 260 words: 80,230

Everything That Makes Us Human: Case Notes of a Children's Brain Surgeon
by Jay Jayamohan
Published 20 Feb 2020

It’s how registrars learn. It’s exactly as the treating hospital explained. One minute the baby’s fine, the next minute she’s on the floor. Scans show a clot near the top of the brain. I’m in the theatre running an eye once again over the scans. In the old days they’d all be pinned up on the walls, but in the computer age I get to look at them one at a time on screen then whizz forward to the next. It’s cheaper than printing, I have no doubt, but far from as efficacious. Who has time to be flicking between images on a laptop? We begin to prepare to do the blood clot/AVM treatment. When the anaesthetist wheels the patient in, I do another check of the room.

pages: 283 words: 85,906

The Clock Mirage: Our Myth of Measured Time
by Joseph Mazur
Published 20 Apr 2020

By the mid-eighteenth century, clockmakers were able to miniaturize clock mechanisms so much that fairly accurate mantelpiece clocks were possible, and reasonably accurate pocket watches were made available to the few wealthy folks who could afford them—class symbols in the West. Clock design improved with a need for accuracy, and commercial demands of more and more accuracy continued with every civilization’s advance of time, from the eras of global exploration and shipping to the Industrial Revolution to the computer age. It was a slow advance from one timepiece design to another. Pace kept up with need. By the end of the nineteenth century almost every household in economically advanced western countries had at least one clock. Time was in control of everything one did; increasingly available precision was establishing a new order of behavioral regimens while dictating utilitarian routines and habits.

pages: 345 words: 84,847

The Runaway Species: How Human Creativity Remakes the World
by David Eagleman and Anthony Brandt
Published 30 Sep 2017

Music outside the Lines: Ideas for Composing in K-12 Music Classrooms. Oxford: Oxford University Press, 2012. Hilmes, Michele. Hollywood and Broadcasting: From Radio to Cable. Urbana: University of Illinois Press, 1990. Hiltzik, Michael A. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperCollins, 2000. Hofstadter, Douglas R., and Emmanuel Sander. Surfaces and Essences: Analogy as the Fuel and Fire of Thinking. New York: Basic Books, 2013. Holt, Rackham. George Washington Carver: An American Biography. Garden City, NY: Doubleday, 1943. Horgan, John, and Jack Lorenzo.

pages: 301 words: 85,126

AIQ: How People and Machines Are Smarter Together
by Nick Polson and James Scott
Published 14 May 2018

And if you deviate from the grammar in even the tiniest of ways, like by misspelling a word or forgetting a semicolon, then the machine basically just gives you the middle finger, or as we like to write it, 00100. For decades, these were the only terms under which people and computers could have a successful conversation. As you’ll learn in this chapter, they’re a huge improvement over the way things were at the start of the computer age, when people were forced to talk to computers in their native “binary” language of 0s and 1s. But these terms hardly let us use our full powers of language to get our message across. Of course, we can also get computers to do a few trifling things by pointing, clicking, swiping, etc. But that’s just so crude.

pages: 282 words: 81,873

Live Work Work Work Die: A Journey Into the Savage Heart of Silicon Valley
by Corey Pein
Published 23 Apr 2018

Andreessen’s buzzword simply means taking a regulated service provided by traditional banks, such as making cash loans to individuals, and offering the same service online, where legal precedent did not apply and where regulators were often poorly prepared to police. Startups can offer deceptively low price points for consumers and higher profits for investors precisely because they do not follow the same rules as offline competitors—rules that were designed for the protection of investors and consumers alike. Prior to the computer age, “unbundled” lending went by many names: usury, gouging, loansharking. But old-fashioned brick-and-mortar loan sharks didn’t enjoy advantages like ample VC funding and catchy dot-com domains. All that was old is new again, and with Wall Street in great disrepute, the tech-focused venture capital firms can pass off “peer-to-peer” lending and “microloan” schemes with outrageous fees and effective interest rates as a humane alternative to traditional banking.

pages: 295 words: 87,204

The Capitalist Manifesto
by Johan Norberg
Published 14 Jun 2023

The social networks that ruled were Sixdegrees, AIM, Friendster and most importantly MySpace, which was so hot that Google saw it as a breakthrough to get a three-year advertising agreement with the network in 2006, which was signed at a glamorous party on Pebble Beach with such guests as Bono and Tony Blair. Finally, Google got to hang out with the big boys. Apple, on the other hand, was a veteran of the personal computer age, but after a long crisis it had become a symbol of the fact that an early dominant position does not mean much in a fast-moving market. However, Steve Jobs had recently returned to the company, and the launch of the iPod at the end of 2001 gave Apple new hope. In 2003, the company was finally able to enjoy a modest annual profit.

pages: 288 words: 86,995

Rule of the Robots: How Artificial Intelligence Will Transform Everything
by Martin Ford
Published 13 Sep 2021

Turing’s most important accomplishment came in 1936, just two years after he graduated from the University of Cambridge, when he laid out the mathematical principles for what is today called a “universal Turing machine”—essentially the conceptual blueprint for every real-world computer that has ever been built. Turing clearly understood at the very inception of the computer age that machine intelligence was a logical and perhaps inevitable extension of electronic computation. The phrase “artificial intelligence” was coined by John McCarthy, who was then a young mathematics professor at Dartmouth College. In the summer of 1956, McCarthy helped arrange the Dartmouth Summer Research Project on Artificial Intelligence at the college’s New Hampshire campus.

Shady Characters: The Secret Life of Punctuation, Symbols, and Other Typographical Marks
by Keith Houston
Published 23 Sep 2013

Knuth’s TeX, which contained 4,500 “positive” and “negative” hyphenation patterns (for example, b-s, -cia, con-s, and -ment are potential hyphenation points but b-ly, -cing, io-n, and i-tin are not) extracted from Merriam-Webster’s Pocket Dictionary, along with a list of fourteen exceptional words, found 89 percent of all hyphenation points and placed almost none incorrectly.59 Reliable hyphenation had finally joined justification in the computer age. Computers enabled another innovation that even the most diligent of hand compositors would have been hard pressed to match. Whereas a Linotype operator focused exclusively on the single line at hand, and a hand compositor could jockey characters back and forth on the composing stick to adjust H&J over a few lines at a time, with the processing power of a computer behind it, TeX could try every possible paragraph layout* in the search for the best overall combination of word spacing, letter spacing, and hyphenation.61 In order to compute the optimal paragraph layout, each decision made in setting a given text was assigned a “badness” score: marginal hyphens were bad, successive marginal hyphens worse; word and letter spaces were allowed to vary within a small range of values but excessive adjustment carried a penalty.
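The whole-paragraph badness search described here can be sketched as a tiny dynamic program. This is an illustrative toy, not Knuth and Plass's actual algorithm: real TeX scores stretchable glue and hyphenation penalties, while this sketch only cubes each line's leftover space, and all names in it are invented for the example.

```python
# Toy paragraph breaker in the spirit of TeX's "badness" search:
# pick the line breaks that minimize the paragraph's total badness,
# where a line's badness grows with the cube of its leftover space.
# (Real TeX also scores stretchable spaces and hyphen penalties.)

def badness(words, width):
    length = sum(len(w) for w in words) + len(words) - 1  # letters + spaces
    if length > width:
        return float("inf")  # overfull lines are forbidden outright
    return (width - length) ** 3

def break_paragraph(words, width):
    n = len(words)
    best = [0.0] + [float("inf")] * n  # best[i]: min badness of words[:i]
    prev = [0] * (n + 1)               # prev[i]: start of the last line
    for i in range(1, n + 1):
        for j in range(i):
            line = badness(words[j:i], width)
            if i == n and line < float("inf"):
                line = 0.0             # the last line may be short for free
            if best[j] + line < best[i]:
                best[i], prev[i] = best[j] + line, j
    lines, i = [], n                   # walk the chosen breaks backwards
    while i > 0:
        lines.append(" ".join(words[prev[i]:i]))
        i = prev[i]
    return lines[::-1]

text = "in olden times when wishing still helped one there lived a king".split()
for line in break_paragraph(text, 16):
    print(line)
```

Every line the program emits fits the measure, and leftover space is balanced across the whole paragraph rather than dumped onto one ragged line, which is exactly the advantage over the line-at-a-time Linotype operator that the passage describes.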

pages: 708 words: 223,211

The Friendly Orange Glow: The Untold Story of the PLATO System and the Dawn of Cyberculture
by Brian Dear
Published 14 Jun 2017

“Computer-Based Instruction in Nutrition.” NACTA Journal 18(4), 1974, 71–74. Paceley, C. “Dan Alpert Continues Shaping Technology and Education.” Physics Illinois News, Number 1, 2006, UIUC. ———. “The Innovation of PLATO Homework.” Physics Illinois News 1, 2006, 5, UIUC. Papert, S. “Computers and Learning,” in The Computer Age: A Twenty-Year View, M. L. Dertouzos and J. Moses, eds. Cambridge: MIT Press, 1979, 73–86. ———. The Children’s Machine: Rethinking School in the Age of the Computer. New York: Basic Books, 1993. ———. “Computer as Mudpie,” in Intelligent Schoolhouse: Readings on Computers and Learning, D. Peterson, ed.

Retrieved 2014-09-16 from http://www.dougengelbart.org/firsts/dougs-1968-demo.html. Engelbart, D., and B. English. “A Research Center for Augmenting Human Intellect,” in Proceedings of the 1968 Fall Joint Computer Conference, Vol. 33. San Francisco, December 9, 1968, 395–410. Hiltzik, M. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. San Francisco: HarperBusiness, 1999. Isaacson, W. Steve Jobs. New York: Simon & Schuster, 2011. Kay, A. “The Early History of Smalltalk,” HOPL-II: The Second ACM SIGPLAN Conference on History of Programming Languages, Association for Computing Machinery, 1993, 69–95. ———. “The Reactive Engine.”

pages: 286 words: 90,530

Richard Dawkins: How a Scientist Changed the Way We Think
by Alan Grafen; Mark Ridley
Published 1 Jan 2006

Back in the 1960s and early 1970s, when PDP 8 and PDP 11 computers the size of a room had far less capacity than does a hand calculator today, and when you had to store your programs and data on reams of paper tape or enormous stacks of punch cards, Richard was right at the forefront of their use in recording and analysing behavioural data. He dragged us fellow members of the Animal Behaviour Group at Oxford into the computer age, teaching us how to write programs in machine code. He also invented the ‘Dawkins Organ’, an early event recorder that enabled one to record behavioural data as tones on a continuously running magnetic tape, to be subsequently decoded by one of those room-sized PDP 11s. I was one of the lucky first beneficiaries of this step change in data processing technology when I worked on Great Blue Herons at the University of British Columbia in the early 1970s.

pages: 273 words: 93,419

Let them eat junk: how capitalism creates hunger and obesity
by Robert Albritton
Published 31 Mar 2009

Consumers’ needs, wants and desires are almost totally socially constructed in their historical specificity. They are constructed by the actual array of commodities available, by their price, by the socioeconomic status of the consumer and by marketing or sociocultural practices that shape desires. American consumers in 1870 would not place an automobile high on their preference schedule, nor in the computer age would they likely pine for a mechanical typewriter. People might want to take public transportation to work if offered the option, but lacking adequate public transportation, they may be forced to struggle with gridlock every day in their personal car. The desire for a leopardskin coat is not likely to be high on the want list of poor women.

pages: 322 words: 88,197

Wonderland: How Play Made the Modern World
by Steven Johnson
Published 15 Nov 2016

The innovations that music inspired turned out to unlock other doors in the adjacent possible, in fields seemingly unrelated to music, the way the “Instrument Which Plays by Itself” carved out a pathway that led to textile design and computer software. Seeking out new sounds led us to create new tools—which invariably suggested new uses for those tools.

Legendary violin maker Stradivari’s workshop

Consider one of the most essential and commonly used inventions of the computer age: the QWERTY keyboard. Many of us today spend a significant portion of our waking hours pressing keys with our fingertips to generate a sequence of symbols on a screen or page: typing up numbers in a spreadsheet, writing e-mails, or tapping out texts on virtual keyboards displayed on smartphone screens.

pages: 313 words: 91,098

The Knowledge Illusion
by Steven Sloman
Published 10 Feb 2017

Some people have always looked upon science and technology with distrust and apprehension, and despite the astonishing scientific progress in the last century, antiscientific thought is still strong today. At the extreme are self-identified “neo-Luddites,” like the participants in the Second Luddite Congress of 1996, a meeting organized around opposition to the “increasingly bizarre and frightening technologies of the Computer Age.” But you don’t have to look hard to find numerous mainstream examples, examples that represent serious danger to our future well-being. A reasonable skepticism toward science and technology is probably healthy for society, but antiscientific thinking can be dangerous when it goes too far. Perhaps the most important issue of our day is climate change, a debate suffused with antiscientific rhetoric.

pages: 346 words: 89,180

Capitalism Without Capital: The Rise of the Intangible Economy
by Jonathan Haskel and Stian Westlake
Published 7 Nov 2017

Since the mid-1970s productivity growth in developed countries had been disappointingly low. This was despite the advent of widely hyped new computer technologies that were supposedly going to transform business for the better. Robert Solow, who contributed more to the study of economic growth than most, famously pointed out in 1987 that the impact of the Computer Age could be seen everywhere but in the productivity statistics (Solow 1987). Goaded by these criticisms, statistical agencies, led by the US Bureau of Economic Analysis (BEA), began to examine their treatment of information and information technology more closely. They introduced two types of innovation.

pages: 309 words: 91,581

The Great Divergence: America's Growing Inequality Crisis and What We Can Do About It
by Timothy Noah
Published 23 Apr 2012

Yes, although the extent to which it’s occurring depends on how you pose the question. Certainly a great many of the better-paying jobs that the middle class previously depended on are gone forever. There’s more than one explanation as to why that occurred, but right now let’s consider the disruptions brought about by technological change. Our story begins at the dawn of the computer age in the 1950s, when long-standing worries that automation would create mass unemployment entered an acute phase. Economic theory dating back to the nineteenth century said that technological advances wouldn’t reduce net employment because the number of jobs wasn’t fixed; a new machine might eliminate jobs in one part of the economy, but it would also create jobs in another part.6 For example, someone had to be employed to make these new machines.

pages: 340 words: 96,149

@War: The Rise of the Military-Internet Complex
by Shane Harris
Published 14 Sep 2014

So they brought in a hacker. He was an ex–military officer and a veteran of the military’s clandestine cyber campaigns. He’d cut his teeth in some of the army’s earliest information-warfare operations in the mid-1990s, the kind designed to get inside an enemy’s head more than his databases. These were computer-age variants of classic propaganda campaigns; they required military hackers to know how to penetrate an enemy’s communications systems and transmit messages that looked as if they came from a trusted source. Later the former officer’s work evolved into going after insurgents and terrorists on the battlefields of Iraq, tracking them down via their cell phones and Internet messages.

pages: 355 words: 92,571

Capitalism: Money, Morals and Markets
by John Plender
Published 27 Jul 2015

Thomas Edison, inventor of the light bulb and founder of the company that turned into the modern General Electric, only managed three months at school, where his teacher referred to him as ‘addled’ – a misjudgement that might rank in history with Emperor Joseph II of Austria telling Mozart that The Marriage of Figaro had too many notes. Edison was taught at home by his formidable mother, evidently to great effect. Alexander Graham Bell, inventor of the telephone, left school in Edinburgh at fifteen, having achieved poor grades and been notable for frequent bunking off. More recently, in the computer age, Bill Gates, founder of Microsoft, famously dropped out of Harvard University, while Michael Dell, who started his personal computer business in a dormitory room at the University of Texas at Austin, never completed his degree course. Steve Jobs, co-founding genius of Apple, dropped out of Reed College in Portland, Oregon.

pages: 382 words: 92,138

The Entrepreneurial State: Debunking Public vs. Private Sector Myths
by Mariana Mazzucato
Published 1 Jan 2011

When so many ‘life science’ companies are focusing on their stock price rather than on increasing their side of the R in R&D, simply subsidising their research will only worsen the problem rather than create the type of learning that Rodrik (2004) rightly calls for. 1 From now on ‘pharma’ will refer to pharmaceutical companies, and Big Pharma the top international pharma companies. Chapter 2 TECHNOLOGY, INNOVATION AND GROWTH You can see the computer age everywhere but in the productivity statistics. Solow (1987, 36) In a special report on the world economy, the Economist (2010a) stated: A smart innovation agenda, in short, would be quite different from the one that most rich governments seem to favor. It would be more about freeing markets and less about picking winners; more about creating the right conditions for bright ideas to emerge and less about promises like green jobs.

pages: 323 words: 90,868

The Wealth of Humans: Work, Power, and Status in the Twenty-First Century
by Ryan Avent
Published 20 Sep 2016

This slice of history played out during a period that economist Tyler Cowen, of George Mason University, has labelled the ‘Great Stagnation’.8 A half-century of extraordinary gains in computing power somehow did not return humanity to the days of dizzying economic and social change of the nineteenth century. In 1987 the Nobel Prize-winning economist Robert Solow mused, in a piece pooh-poohing the prospect of a looming technological transformation, that the evidence for the revolutionary power of computers simply wasn’t there. ‘You can see the computer age everywhere but in the productivity statistics’, he reckoned, and he had a point.9 Productivity perked up in the 1990s but wheezed out again in the 2000s. And that, some seemed to conclude, was all there was. In the 2000s Robert Gordon began posing a thought experiment to his audiences: would they, he wondered, prefer a world with all the available technology up to 2000, or one with all available technology up to the present day except for indoor plumbing?

pages: 327 words: 90,542

The Age of Stagnation: Why Perpetual Growth Is Unattainable and the Global Economy Is in Peril
by Satyajit Das
Published 9 Feb 2016

John Kenneth Galbraith, The Great Crash, 1929, Penguin, 1975. ——, The Age of Uncertainty, Houghton Mifflin, 1977. ——, The Affluent Society, Mariner, 1998. Jon Gertner, The Idea Factory: Bell Labs and the Great Age of American Innovation, Penguin, 2012. Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age, Harper Business, 1999. Tony Judt, Postwar: A History of Europe Since 1945, Vintage, 2005. Charles P. Kindleberger, Manias, Panics and Crashes: A History of Financial Crisis, Basic Books, 1978. Luuk van Middelaar, The Passage to Europe: How a Continent Became a Union, Yale University Press, 2013.

pages: 336 words: 93,672

The Future of the Brain: Essays by the World's Leading Neuroscientists
by Gary Marcus and Jeremy Freeman
Published 1 Nov 2014

Proof that we are not there yet—that we still haven’t “solved” the brain—comes from the fact that we are still apparently quite far from being able to build one. If we really understood the principles behind thought, we could build a machine capable of humanlike thinking. But so far we can’t. At the dawn of the computer age over half a century ago, expectations were high that computers would soon perform many of the same cognitive functions that humans do. Herbert Simon, one of the fathers of artificial intelligence (AI), predicted in 1965 that “machines will be capable, within twenty years, of doing any work a man can do.”

Microserfs
by Douglas Coupland
Published 14 Feb 1995

Microsoft would have been heaven if my system had been operative and in place."

SATURDAY
New Year's Day, 1994

Abe left for SFO Airport and then we all went for a drive in the Carp - Karla, Ethan, Todd, Bug, and I. We drove past the home of Thomas Watson Jr., 99 Notre Dame Avenue, San Jose, California. Watson steered IBM into the computer age - and was made prez of the company in 1952. In 1953 he developed the first commercial storage device for computers. He died on a New Year's Eve.

* * *

On the radio we heard that Bill got married, on Lanai in Hawaii, and we all screamed so loudly that the Carp nearly went off the road. And apparently Alice Cooper was there.

pages: 339 words: 92,785

I, Warbot: The Dawn of Artificially Intelligent Conflict
by Kenneth Payne
Published 16 Jun 2021

Much of the research in AI has been driven (and funded) by organisations that are interested in using the product of the research for military purposes. It still is. On one view, the link between war and intelligence goes far deeper: the connection between intelligence and fighting stretches right back through evolutionary history.1 Violence shaped brains, for humans and other species alike. Now, in the computer age, the link persists. Computers themselves were developed in wartime, and those charged with national security were quick to grasp the military implications of machines that could act intelligently. Code-breaking, translation, imagery analysis, to say nothing of autonomous weapons—all were immediately seized upon as ripe for research.

words: 49,604

The Weightless World: Strategies for Managing the Digital Economy
by Diane Coyle
Published 29 Oct 1998

Impatient shoppers spend minutes waiting for an under-trained sales clerk to figure out how to enter a purchase on the terminal, which will control the inventory, and for their credit card to be validated. Economists have dubbed this the productivity puzzle. Nobel Laureate Robert Solow famously joked: ‘You can see the computer age everywhere but in the productivity statistics’.7 So why have computers not generated extra growth in output? There are at least three answers: under-measurement of the output of industries using information technology; over-estimation of the importance of computers relative to all other types of capital equipment; and over-optimism about how quickly new technologies spread.

pages: 294 words: 96,661

The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity
by Byron Reese
Published 23 Apr 2018

We discovered a magical part of the brain that defies all laws of physics, and which therefore requires us to throw out all the science we have based on that physics for the last four hundred years.” No, one by one, the inner workings of the brain are revealed. And yes, the brain is a fantastic organ, but there is nothing magical about it. It is just another device. Since the beginning of the computer age, people have come up with lists of things that computers will supposedly never be able to do. One by one, computers have done them. And even if there were some magical part of the brain (which there isn’t), there would be no reason to assume that it is the mechanism by which we are intelligent.

pages: 299 words: 88,375

Gray Day: My Undercover Mission to Expose America's First Cyber Spy
by Eric O'Neill
Published 1 Mar 2019

Today, the FBI’s internal counterintelligence and computer security programs hunt spies from within as voraciously as the men and women of the bureau investigate external threats. The counterintelligence walls that Hanssen dismantled over twenty-two years of espionage are now bulwarks of defense that leave few places for future moles to hide. Most important, the FBI leapt into the computer age. Programs to detect improper computer usage and to enforce “need to know” when accessing information prevent the abuse of databases that thrilled Hanssen. The FBI’s systems now audit access to critical cases in real time and sound an alarm when someone dips fingers where they don’t belong. The security exploits and self-name searching in the ACS that Hanssen routinely abused are things of the past.

pages: 350 words: 90,898

A World Without Email: Reimagining Work in an Age of Communication Overload
by Cal Newport
Published 2 Mar 2021

Sassone crunches the numbers and argues that the organizations he studied could immediately reduce their staffing costs by 15 percent by hiring more support staff, allowing their professionals to become more productive. To Sassone, this analysis provides a compelling answer to the stagnating productivity in the early personal computer age. “Indeed, in many instances firms have used technology to decrease, rather than to increase, intellectual specialization,” he writes. In the intervening decades, the non-specialization issues reported by Sassone have become even worse. Knowledge workers with highly trained skills, and the ability to produce high-value output with their brains, spend much of their time wrangling with computer systems, scheduling meetings, filling out forms, fighting with word processors, struggling with PowerPoint, and of course, above all, sending and receiving digital messages from everyone about everything at all times.

pages: 332 words: 93,672

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy
by George Gilder
Published 16 Jul 2018

Within a decade, so the attendees prophesied, “scientists will be able to create new species and carry out the equivalent of 10 billion years of evolution in one year.” More than four decades later, the hopes and fears of the 1975 Asilomar conference are nowhere near to coming true. The roots of nearly a half-century of frustration reach back to the meeting in Königsberg in 1930, where von Neumann met Gödel and launched the computer age by showing that determinist mathematics could not produce creative consciousness. Von Neumann stepped forward to become the oracle of the age we are now consummating. Reflecting on the 1975 conference, the eminent chemist-biologist Michael Denton concludes, “The actual achievements of genetic engineering are rather more mundane . . . , a relatively trivial tinkering rather than genuine engineering, analogous to tuning a car engine rather than redesigning it, an exploitation of the already existing potential for variation which is built into all living systems. . . . ” Thousands of transgenic plants have been developed with results “far from the creation or radical reconstruction of a living organism.”14 All that the first Asilomar conference managed to achieve was triggering an obtuse paranoia about “genetically modified organisms” that hinders agricultural progress around the world.

pages: 293 words: 91,110

The Chip: How Two Americans Invented the Microchip and Launched a Revolution
by T. R. Reid
Published 18 Dec 2007

There I found the following: “In 1958 Kilby conceived the Monolithic Idea, that is, the concept of building an entire circuit out of germanium or silicon. . . . About this same time, Noyce . . . also had the monolithic-circuit idea.” This hit me like a bolt of lightning. For the first time I realized the obvious: This miraculous chip was a man-made miracle. All the marvels of the computer age, all the “electronic brains” and “artificial intelligence,” were simply products of the most powerful intelligence of all—the human brain. Beyond that, I was quite taken with the concept that an idea could have a name—“the monolithic idea”—and that two living Americans had conceived an idea that would change the world.

pages: 281 words: 95,852

The Googlization of Everything:
by Siva Vaidhyanathan
Published 1 Jan 2010

Solove could not have predicted the revelation in 2005 that the NSA was monitoring American phone calls through an illegal secret program that relied on the cooperation of the major telecommunication companies. 21. James Rule, Privacy in Peril (Oxford: Oxford University Press, 2007). 22. James Rule, Private Lives and Public Surveillance: Social Control in the Computer Age (New York: Schocken Books, 1974). NOT ES TO PAGES 97–103 237 23. Ibid. 24. In May 2008, Google announced it would deploy special tricycles to extend Street View to roads and alleys in which cars would have trouble navigating. The tricycle experiment began in Italy but was soon used throughout Europe.

pages: 309 words: 101,190

Climbing Mount Improbable
by Richard Dawkins and Lalla Ward
Published 1 Jan 1996

Raup, in turn, was inspired by the celebrated D’Arcy Wentworth Thompson, of the ancient and distinguished Scottish University of St Andrews, whose book, On Growth and Form (first published in 1919), has been a persistent, if not quite mainstream, influence on zoologists for most of the twentieth century. It is one of the minor tragedies of biology that D’Arcy Thompson died just before the computer age, for almost every page of his great book cries out for a computer. Raup wrote a program to generate shell form, and I have written a similar program to illustrate this chapter although—as might be expected—I incorporated it in a Blind Watchmaker-style artificial selection program. The shells of snails and other molluscs, and also the shells of creatures called brachiopods which have no connection with molluscs but superficially resemble them, all grow in the same kind of way, which is different from the way we grow.
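Shell-generating programs of the kind Raup and Dawkins wrote sweep a generating curve along a logarithmic spiral. Here is a minimal sketch of that core idea, assuming Raup's conventional parameter names W (whorl expansion per turn), D (starting distance of the curve from the coiling axis), and T (translation down the axis); the function name and the deliberately simplified geometry are my own, not taken from either program.

```python
import math

# Minimal sketch of a Raup-style shell model: points along the spiral
# midline of a shell whose radius expands W-fold per whorl and which
# translates down the coiling axis as it grows.

def shell_path(W, D, T, turns=3, steps_per_turn=60):
    """Return (x, y, z) points along the shell's spiral midline."""
    points = []
    for i in range(turns * steps_per_turn + 1):
        theta = 2 * math.pi * i / steps_per_turn
        r = D * W ** (theta / (2 * math.pi))  # logarithmic spiral radius
        z = T * r                             # translation along the axis
        points.append((r * math.cos(theta), r * math.sin(theta), z))
    return points

# With W=2 the midline doubles its distance from the axis every turn;
# T=0 would give a flat ammonite-like coil, larger T a turret shell.
pts = shell_path(W=2.0, D=1.0, T=0.5, turns=2)
```

Wrapping a generator like this in a Blind Watchmaker-style loop, mutating (W, D, T) and letting a human eye pick each generation's survivor, is the artificial-selection scheme the passage describes.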

Future Files: A Brief History of the Next 50 Years
by Richard Watson
Published 1 Jan 2008

Do you remember the predictions about the paperless office and the leisure society? Between 1999 and 2002 global use of paper increased by 22% and we now seem to have less spare time than ever. We are also sleeping less than we used to, down from 9 hours per day in 1900 to 6.9 hours today. Indeed, the benefits of the computer age can be seen everywhere except in the productivity statistics, because we are inventing new ways of making ourselves busy.

Comfortably numb

This obsession with “busyness” can be seen in the way the work ethic has invaded childhood. Children must be kept busy at all times. As a result, they are becoming overscheduled and we are creating a cohort that cannot think for itself, a generation of passive, risk-averse citizens and comfortably numb consumers with almost no imagination or self-reliance. The Japanese word benriya loosely translates as convenience-doers.

pages: 345 words: 100,135

Snakes in Suits: When Psychopaths Go to Work
by Dr. Paul Babiak and Dr. Robert Hare
Published 7 May 2007

Advances in computerization, in particular, have accelerated the rate of technological change affecting organizations and have led to dramatic social changes among the workforce as well. Some of this change has had a positive effect. The Internet has opened a whole new world of exploration and study. Commerce in the computer age has advanced to the point where people can shop or do their banking at home at any time of night or day, and small entrepreneurial companies have grown in number as markets opened up that were once thought out of reach. Education—on just about everything—is now available to a greater number of individuals around the globe.

pages: 317 words: 101,074

The Road Ahead
by Bill Gates , Nathan Myhrvold and Peter Rinearson
Published 15 Nov 1995

The computer chips we use today are integrated circuits containing the equivalent of millions of transistors packed onto less than a square inch of silicon. In a 1977 Scientific American article, Bob Noyce, one of the founders of Intel, compared the $300 microprocessor to ENIAC, the moth-infested mastodon from the dawn of the computer age. The wee microprocessor was not only more powerful, but as Noyce noted, "It is twenty times faster, has a larger memory, is thousands of times more reliable, consumes the power of a lightbulb rather than that of a locomotive, occupies 1/30,000 the volume and costs 1/10,000 as much. It is available by mail order or at your local hobby shop."

1946: A view inside a part of the ENIAC computer

Of course, the 1977 microprocessor seems like a toy now.

pages: 360 words: 100,991

Heart of the Machine: Our Future in a World of Artificial Emotional Intelligence
by Richard Yonck
Published 7 Mar 2017

This isn’t a problem that’s limited to just human beings. Many shortcomings of artificial intelligence correspond to an inability to know where to direct attention and focus. As this chapter will explore, the absence of an emotion-like function may be a key factor behind this. From the very beginning of the computer age, scientists and researchers have sought to create artificial intelligences, programs by which computers are able to perform some or all of the cognitive functions that people do. Early on it was thought this lofty goal would soon be in our grasp. After all, machines had already proven they were capable of churning through vast numerical tasks far more rapidly than any person ever could.

pages: 372 words: 101,174

How to Create a Mind: The Secret of Human Thought Revealed
by Ray Kurzweil
Published 13 Nov 2012

This unfinished document nonetheless remains a brilliant and prophetic foreshadowing of what I regard as humanity’s most daunting and important project. It was published posthumously as The Computer and the Brain in 1958. It is fitting that the final work of one of the most brilliant mathematicians of the last century and one of the pioneers of the computer age was an examination of intelligence itself. This project was the earliest serious inquiry into the human brain from the perspective of a mathematician and computer scientist. Prior to von Neumann, the fields of computer science and neuroscience were two islands with no bridge between them. Von Neumann starts his discussion by articulating the similarities and differences between the computer and the human brain.

pages: 352 words: 96,532

Where Wizards Stay Up Late: The Origins of the Internet
by Katie Hafner and Matthew Lyon
Published 1 Jan 1996

New York: John Wiley & Sons, 1974–1976. Langdon-Davies, John. NPL: Jubilee Book of the National Physical Laboratory. London: His Majesty’s Stationery Office, 1951. Lebow, Irwin. Information Highways & Byways: From the Telegraph to the 21st Century. New York: IEEE Press, 1995. Licklider, J. C. R. “Computers and Government.” In The Computer Age: A Twenty-Year View, edited by Michael L. Dertouzos and Joel Moses. MIT Bicentennial Series. Cambridge: MIT Press, 1979. ———. Libraries of the Future. Cambridge: MIT Press, 1965. Padlipsky, M. A. The Elements of Networking Style and Other Essays & Animadversions of the Art of Intercomputer Networking.

pages: 383 words: 98,179

Last Trains: Dr Beeching and the Death of Rural England
by Charles Loft
Published 27 Mar 2013

The key question here is, if the Freshwater branch closes will those using it who have started their journey in London continue to use the railway to reach Newport (in which case none of the contributory revenue is lost and the case for closure is strengthened), will they make the entire journey by road (in which case all the contributory revenue is lost and the economics of the other two services might be adversely affected to the point where the closure makes no sense) or will they go to Margate by train instead (in which case, the precise balance of lost and new revenue is virtually impossible to gauge)? When one considers that a resort might earn contributory revenue on a wide variety of routes, the complexity of the calculations involved in the pre-computer age becomes clear. The fact that figures for contributory revenue were gross revenue, and therefore took no account of the profitability of the services on which they were earned, adds another complication. The Isle of Wight lines might generate a large amount of additional traffic on the London–Portsmouth main line during the summer; but if that traffic required the provision of extra signalling, coaches, locomotives and staff that were only used on a few summer weekends, it was not necessarily profitable.

pages: 417 words: 97,577

The Myth of Capitalism: Monopolies and the Death of Competition
by Jonathan Tepper
Published 20 Nov 2018

Audretsch, “Testing the Schumpeterian Hypothesis,” Eastern Economic Journal XIV, no. 2 (1988). 51. https://www.marketwatch.com/story/americas-most-successful-companies-are-killing-the-economy-2017-05-24. 52. https://www.bloomberg.com/news/articles/2017-10-12/google-has-made-a-mess-of-robotics. 53. http://blog.luxresearchinc.com/blog/2016/03/the-downfall-of-google-robotics/. 54. Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (HarperBusiness, 1999). 55. Barry C. Lynn, Cornered: The New Monopoly Capitalism and the Economics of Destruction (Hoboken, NJ: Wiley, 2010). 56. https://qz.com/801706/innovation-guru-clayton-christensens-new-theory-will-help-protect-you-from-disruption/. 57. Frederic M. Scherer, “Technological Innovation and Monopolization” (October 2007).

pages: 348 words: 97,277

The Truth Machine: The Blockchain and the Future of Everything
by Paul Vigna and Michael J. Casey
Published 27 Feb 2018

Grigg worked in the field of cryptography, a science that dates way back to ancient times, when coded language to share “ciphers,” or secrets, first arose. Ever since Alan Turing’s calculating machine cracked the German military’s Enigma code, cryptography has underpinned much of what we’ve done in the computing age. Without it we wouldn’t be able to share private information across the Internet—such as our transactions within a bank’s Web site—without revealing it to unwanted prying eyes. As our computing capacity has exponentially grown, so too has the capacity of cryptography to impact our lives. For his part, Grigg believed it would lead to a programmable record-keeping system that would make fraud virtually impossible.

pages: 418 words: 102,597

Being You: A New Science of Consciousness
by Anil Seth
Published 29 Aug 2021

Perception of the outside world is obviously useful for guiding action, but why should our internal physiological condition be built into our conscious lives from the ground up? Answering this question takes us back in history once again, but this time only to the mid twentieth century and to the neglected amalgam of computer science, artificial intelligence, engineering, and biology known as cybernetics. — In the 1950s, at the dawn of the computer age, the emerging disciplines of cybernetics and artificial intelligence (AI) were equally promising and in many ways inseparable. Cybernetics – from the Greek kybernetes, meaning ‘steersman’ or ‘governor’ – was described by one of its founders, the mathematician Norbert Wiener, as ‘the scientific study of control and communication in the animal and the machine’.

pages: 329 words: 101,233

We Are Electric: Inside the 200-Year Hunt for Our Body's Bioelectric Code, and What the Future Holds
by Sally Adee
Published 27 Feb 2023

In 1941, just as Adrian was writing up his letter recommending Berger to the Nobel committee, the latter, mired in despair and depression, took his own life. After seventeen years of progress in EEG technology, the field stalled for another four decades. During this time, we decided that we’d rather send electricity into the brain than decipher the codes hiding in the natural kind. How we decided the brain was a computer At the dawn of the computer age—as engineers began to assemble the very first room-sized computing machines—those computers were also being built (and conceived of) as a kind of brain. In 1944, the electronics manufacturer Western Electric, in a glossy Life magazine ad for its new anti-aircraft guidance system, declared that “this electrical brain—the Computer—thinks of everything.”

The Jasons: The Secret History of Science's Postwar Elite
by Ann Finkbeiner
Published 26 Mar 2007

ARPA told Jason to hire young, smart scientists; to solve technical problems; to point out science that academics weren’t developing but that the military might use; to analyze but not to experiment. It anticipated that “minimum expenditures will be made for computers [and] assistants.” The prohibition on computers was a relic of the time, the beginning of the computer age when, said Hal Lewis, a primeval Jason, “we lost students to computers—they got mesmerized and forgot to do physics. You didn’t want this to be turning into a computer buffs’ organization.” The rationale for the prohibition on assistants, Lewis said, “was that Jason should be the work of the absolutely top physicists and one didn’t want to use their names when graduate students really did the work.”

pages: 387 words: 110,820

Cheap: The High Cost of Discount Culture
by Ellen Ruppel Shell
Published 2 Jul 2009

The supplier sent out the items the next day in containers tagged for scanning at the buyer’s distribution center, where they were unloaded and routed to a truck for delivery to the designated store. To maintain efficiencies across their supply chains, discounters required that suppliers tag their products at the factory or warehouse, thereby pulling manufacturers and wholesalers—some kicking and screaming—into the stark new computer age. This just-in-time model reduced the problem of languishing inventory. It also meant that manufacturers had to play by the retailers’ rules, limiting their production to items that discounters could sell at low prices and in vast volumes. Options for both manufacturers and consumers narrowed: Manufacturers had much less discretion in what they could produce or how they could produce it, and consumers, although treated to what seemed like an ever-expanding variety of merchandise, were in fact being offered less variety and more variations on a theme.

pages: 289 words: 112,697

The new village green: living light, living local, living large
by Stephen Morris
Published 1 Sep 2007

Your Money or Your Life: Transforming Your Relationship with Money and Achieving Financial Independence. Joe Dominguez & Vicki Robin. Penguin Books, 1992. 254 chapter 8: The Good Life Colophon by Michael Potts Traditionally, this is where the book designer would name the type fonts (Garamond Book, Times New Roman, and Kabel), and, in this computer age, the programs employed (Quark, Firefox, and PhotoPaint). Then the printer might tell about the printing process. (See this book’s last page.) This being a non-traditional book, I have saved myself a few pages to share the experience of designing this book. New Village Green is a hopeful plunge into a salty ocean of ideas.

pages: 502 words: 107,657

Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die
by Eric Siegel
Published 19 Feb 2013

Mining the Web to Play ‘Who Wants to be a Millionaire?’” UAI ’03, Proceedings of the 19th Conference in Uncertainty in Artificial Intelligence, Acapulco, Mexico, August 7–10, 2003, pages 337–345. http://arxiv.org/ftp/arxiv/papers/1212/1212.2477.pdf. Regarding “Time flies like an arrow”: Gilbert Burck, The Computer Age and Its Potential for Management (Harper & Row, 1965). My (the author’s) PhD research pertained to the “have a car/baby” example (temporal meaning in verbs): Eric V. Siegel and Kathleen R. McKeown, “Learning Methods to Combine Linguistic Indicators: Improving Aspectual Classification and Revealing Linguistic Insights,” Computational Linguistics 26, issue 4 (December 2000). doi:10.1162/089120100750105957, http://dl.acm.org/citation.cfm?

pages: 371 words: 108,317

The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future
by Kevin Kelly
Published 6 Jun 2016

There were online bulletin boards, experimental teleconferences, and this place called the internet. The portal through the phone line opened up something both vast and at the same time human scaled. It felt organic and fabulous. It connected people and machines in a personal way. I could feel my life jumping up to another level. Looking back, I think the computer age did not really start until this moment, when computers merged with the telephone. Stand-alone computers were inadequate. All the enduring consequences of computation did not start until the early 1980s, that moment when computers married phones and melded into a robust hybrid. In the three decades since then, this technological convergence between communication and computation has spread, sped up, blossomed, and evolved.

pages: 375 words: 106,536

Lost at Sea
by Jon Ronson
Published 1 Oct 2012

And this is an industry that’s self-regulating. Why is that?” • • • LATER I HEAR the story of why it takes three days for an electronic transfer to clear. Transfers used to really take three days to clear, in the days they were delivered by carrier pigeon, or whatever. But now, in this computer age, they take an instant to clear, but they keep the three-day rule going so they can accrue three days of interest. The banks make tens of millions from these wheezes. • • • IN OCTOBER 2003 Matthew Barrett, the CEO of Barclays, was called before the Treasury Select Committee. He was asked about the small print.

pages: 319 words: 106,772

Irrational Exuberance: With a New Preface by the Author
by Robert J. Shiller
Published 15 Feb 2000

There was an electrograph, a machine that transmitted pictures by wire (forerunner of the fax machine), and a tel-autograph, a machine that enabled one to transmit one’s signature over long distances (forerunner of credit card signature-verification devices). The exposition even offered a simulated trip to the moon on the airship Luna: the visitor could take a stroll through the streets and shops of the moon before returning to earth. In a sense, the high-tech age, the computer age, and the space age seemed just around the corner in 1901, though the concepts were expressed in different words than we would use today. People were upbeat, and in later years the first decade of the twentieth century came to be called the Age of Optimism, the Age of Confidence, or the Cocksure Era.

pages: 389 words: 109,207

Fortune's Formula: The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street
by William Poundstone
Published 18 Sep 2006

Ed told Vivian that Las Vegas was a great place for a bargain-priced vacation. He wanted to get a look at the casino roulette wheels in action. Just before this trip, a friend gave Thorp an article from the Journal of the American Statistical Association. It was an analysis of the game of blackjack. Until the computer age, it was impractical to calculate the exact probabilities in blackjack and many other card games. There are an astronomical number of possible arrangements of a deck of fifty-two cards. Unlike in the case of roulette, the blackjack player has decisions to make. The odds in blackjack therefore depend on what strategy the player uses.
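That “astronomical number” is concrete enough to compute; a quick check (the figure is standard combinatorics, not taken from the book):

```python
import math

# Number of distinct orderings of a standard 52-card deck: 52 factorial.
arrangements = math.factorial(52)
print(f"{arrangements:.3e}")  # ≈ 8.066e67 — a 68-digit number
```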

pages: 319 words: 105,949

Skyfaring: A Journey With a Pilot
by Mark Vanhoenacker
Published 1 Jun 2015

If I like a song or two about leaving New York, my preferred aerial song of the city would be one of arrival from far out at sea. The city looks as if a huge vase of pixels had been tipped over Manhattan, stacking and tumbling outward, flattening into the suburbs and gradually disappearing into the dark forests of the continent’s interior, as if in some computer-age myth of its foundation. The city’s bays and rivers glow in this reflected, electric gold; while further out the waters are themselves scattered with the constellated lights of vessels, as if an autumn storm had blown particles of light from the land where they first fell, onto the pitch-dark waters of the city’s maritime approaches.

pages: 394 words: 108,215

What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry
by John Markoff
Published 1 Jan 2005

Kay, “The Early History of Smalltalk,” ACM SIGPLAN Notices 28:3 (March 1993): 13. 3.Ibid. 4.Ibid. 5.Ambitious distributed computing projects like Microsoft’s .Net and IBM’s Websphere indicate the persistence of this goal. 6.Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: Harper Business, 1999), p. 164. 7.Author interview with Robert Taylor, Woodside, Calif., June 17, 2003. 8.Hiltzik, Dealers of Lightning, pp. 168–69. 9.Author interview, Adele Goldberg, San Francisco, Calif., July 15, 2001. 10.Author interview, Larry Tesler, Menlo Park, Calif., August 27, 2001. 8 | Borrowing Fire from the Gods 1.Fred Moore, letter to Dick Raymond and Point Agents, February 28, 1972, personal papers, courtesy of Irene Moore. 2.Fred Moore, personal journal, March 24, 1972. 3.Author interview, Dennis Allison, Palo Alto, Calif., July 28, 2001. 4.Gregory Yob, “Hunt the Wumpus,” in The Best of Creative Computing, vol. 1, ed.

pages: 379 words: 108,129

An Optimist's Tour of the Future
by Mark Stevenson
Published 4 Dec 2010

Written out as A’s, T’s, C’s and G’s on paper, it’d take up roughly the equivalent of two hundred volumes the size of the 1,000-page Manhattan phone book, and take nine and a half years to read out loud (assuming you read ten letters a second and never slept, ate or went to the toilet while you were doing it). Which in our computer age isn’t actually that big a deal. It’s roughly three gigabytes of data – just over twice the amount in an iTunes movie download of the genetic comedy Twins starring Danny DeVito and Arnold Schwarzenegger (whose scriptwriters, by the way, must have glanced over the science and said, ‘well, we won’t be needing that!’).
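The excerpt’s round numbers are easy to verify; a back-of-the-envelope check (three billion letters, ten letters a second, and one byte per letter are the assumptions used here):

```python
# Back-of-the-envelope check of the genome figures quoted above,
# using the excerpt's round numbers (assumptions, not measured values).
letters = 3_000_000_000               # ~3 billion bases in the human genome
rate = 10                             # letters read aloud per second
seconds = letters / rate
years = seconds / (365.25 * 24 * 3600)
gigabytes = letters / 1e9             # one byte per letter
print(f"{years:.1f} years, {gigabytes:.0f} GB")  # 9.5 years, 3 GB
```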

pages: 382 words: 105,819

Zucked: Waking Up to the Facebook Catastrophe
by Roger McNamee
Published 1 Jan 2019

Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing, by Thierry Bardini (Palo Alto: Stanford University Press, 2000), tells the story of the genius who created the mouse, visualized a networked world of PCs, and gave the Mother of All Demos. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age, by Michael A. Hiltzik (New York: HarperBusiness, 1999), takes the reader inside the research center in Palo Alto where Steve Jobs saw the future. Troublemakers: Silicon Valley’s Coming of Age, by Leslie Berlin (New York: Simon & Schuster, 2017), tells the story of the men and women, some well known, others obscure, who helped to build Silicon Valley.

pages: 392 words: 108,745

Talk to Me: How Voice Computing Will Transform the Way We Live, Work, and Think
by James Vlahos
Published 1 Mar 2019

By World War II what had once been a far-fetched pursuit now seemed achievable. While it would take decades of additional work to produce more natural and realistic sounds, machines were undoubtedly gaining their voices. It took the advent of a whole new type of technology, however, for them to acquire something even more important: brains. Only in the computer age have talking objects become capable of anything more than the playing back of recorded messages. Of course, that a wondrous new invention—the electronic digital computer—would be adept at mathematical calculation was obvious from the jump. The 1936 paper by Alan Turing that first laid out a vision for such devices was titled, “On Computable Numbers.”

pages: 383 words: 105,021

Dark Territory: The Secret History of Cyber War
by Fred Kaplan
Published 1 Mar 2016

Two years earlier, Roberts had designed a communications link, over a 1200-baud phone line, between a computer at MIT’s Lincoln Lab, where he was working at the time, and a colleague’s computer in Santa Monica. It was the first time anyone had pulled off the feat: he was, in effect, the Alexander Graham Bell of the computer age. Yet Roberts hadn’t thought about the security of this hookup. In fact, Ware’s paper annoyed him. He begged Lukasik not to saddle his team with a security requirement: it would be like telling the Wright brothers that their first airplane at Kitty Hawk had to fly fifty miles while carrying twenty passengers.

pages: 380 words: 109,724

Don't Be Evil: How Big Tech Betrayed Its Founding Principles--And All of US
by Rana Foroohar
Published 5 Nov 2019

The reaction is less intellectual than visceral. Life feels off. People feel stressed, behind, out of sorts, disconnected, lost. It’s not just the whacked-out politics of the current presidential administration, not just the political polarization, not just job anxieties, not just the upheaval of industrialism giving way to the computer age. It’s both more than that, and less. “I think if you zoom out, this is like 1946 in a sense that we’ve just invented this new, powerful, and very dangerous new technology,” says Harris. “We’ve developed a system of manipulating our own [social] system that is more powerful than the ability of even our own mind to track it.”26 On that score, it’s impossible not to be reminded of Mary Shelley, who wrote her famous Frankenstein in 1818 at a turbulent time—in many ways not unlike our own—that was causing romantics like her to declare a “crisis of feeling” as others before them had declared a crisis of faith.

pages: 419 words: 109,241

A World Without Work: Technology, Automation, and How We Should Respond
by Daniel Susskind
Published 14 Jan 2020

Please use the search function on your e-reading device to search for terms of interest. For your reference, the terms that appear in the print index are listed below. abandonment ability bias Acemoglu, Daron adaptive learning systems admissions policies, conditional basic income and affective capabilities affective computing Age of Labor ALM hypothesis and optimism and overview of before and during twentieth century in twenty-first century Agesilaus AGI. See artificial general intelligence agoge agriculture Airbnb airborne fulfillment centers Alaska Permanent Fund Alexa algorithms alienation al-Khwarizmi, Abdallah Muhammad ibn Musa ALM (Autor-Levy-Murnane) hypothesis AlphaGo AlphaGo Zero AlphaZero Altman, Sam Amara, Roy Amazon artificial intelligence and changing-pie effect and competition and concerns about driverless vehicles and market share of network effects and profit and Andreessen, Marc ANI.

pages: 374 words: 111,284

The AI Economy: Work, Wealth and Welfare in the Robot Age
by Roger Bootle
Published 4 Sep 2019

In particular, they don’t conform to the traditional pattern of technological advance that has underpinned economic progress over the last two centuries, namely the replacement of human labor by machines. You would have thought that computerization most definitely did conform to this paradigm. Yet the Nobel Prize-winning economist Robert Solow famously remarked in 1987: “you can see the computer age everywhere but in the productivity statistics.” 29 (Mind you, the pickup in US productivity in the late 1990s suggests that the gains from computers were real but, as with many other advances, delayed.) The American entrepreneur and venture capitalist, Peter Thiel, has put recent technological disappointment more pithily.

pages: 382 words: 105,166

The Reckoning: Financial Accountability and the Rise and Fall of Nations
by Jacob Soll
Published 28 Apr 2014

The list of acronyms in the preceding paragraphs attests to the depersonalized character of twentieth-century accounting. It now was a subject only for the expertly informed, inscrutable even to the best educated citizen. Accountants now became synonymous not only with professional success but also with the dehumanizing large-scale number crunching of the mainframe computer age. Like postwar economic growth, the golden age of accounting did not last, and the role of the accountant as a leading, even gentlemanly, social figure and neutral arbiter of business and regulation soon enough began to erode. Competition between the auditing firms became fierce in the mid-1950s.

pages: 374 words: 110,238

Fall: The Mysterious Life and Death of Robert Maxwell, Britain's Most Notorious Media Baron
by John Preston
Published 9 Feb 2021

Undeterred, he kept on repeating his claims that he was the most successful encyclopedia salesman the world had ever known. While some may have had their doubts, others were only too happy to cling to his coat-tails hoping to share in the spoils. Saul Steinberg was an American businessman who had made a fortune out of leasing IBM computers. Aged twenty-nine, he owned a twenty-nine-room house on Long Island and was reputed to have made more money more quickly than anyone else in America. Steinberg had heard a lot about Maxwell, and liked the sound of him. He may have been a brute, but he appeared to be a very successful brute. In the spring of 1969, Steinberg approached Maxwell with an offer.

pages: 356 words: 106,161

The Glass Half-Empty: Debunking the Myth of Progress in the Twenty-First Century
by Rodrigo Aguilera
Published 10 Mar 2020

But despite the fact that many of these items eventually become part of the basic basket of goods, their ultimate contribution to productivity is dubious at best; productivity growth in many Western countries has slowed down or even turned negative in the post-crisis years.17 When economist Robert Solow famously griped in 1987 that “you can see the computer age everywhere except in the productivity statistics”, little did he imagine how much money people would be spending two decades later to have a mini-computer in their pockets whose main use was to flag a taxi or play Candy Crush. Figure 3.3: The winners and losers of globalization Notes: This is the elephant chart showing the growth in incomes across every ventile of world population, plus the top 1 percent.

pages: 405 words: 105,395

Empire of the Sum: The Rise and Reign of the Pocket Calculator
by Keith Houston
Published 22 Aug 2023

Underwood was in a more parlous state than anyone had been willing to admit, leading Pero to declare that both factories and working practices would have to be demolished and rebuilt to Olivetti’s higher standards.35 Olivetti needed something to recapture the old magic. It had always been good at making mechanical devices, such as its typewriters and hand-cranked calculators; now, with the Elea 9003, it was also a player in the highest echelons of the computer age.36 As Roberto and Tchou discussed what they should do next, the answer seemed obvious: conquer the middle ground. What if a calculator could punch up, or a computer could punch down? Most companies that could afford computers kept a tight rein on them because of their extravagant costs, but the Olivetti pair imagined a device that occupied a desk, not a room, and that could be operated by the average office worker rather than the high priests of the computing department.37 One might almost call it a personal computer.

pages: 872 words: 259,208

A History of Modern Britain
by Andrew Marr
Published 2 Jul 2009

Yet most people had in the end done well under her, not just the Yuppies and Essex boys, but also her snidest middle-class critics. Britons were on average much wealthier than they had been at the end of the seventies. The country was enjoying bigger cars, a far wider range of holidays, better food, a wider choice of television channels, home videos, and the first slew of gadgets from the computer age. Yet this was not quite the Britain of today. More people smoked. The idea of smoke-free public areas, or smoking bans in offices and restaurants, was lampooned as a weird Californian innovation that would never come here. People seen talking to themselves with a wire dangling from one ear would have been considered worryingly disturbed.

After a spate of transport disasters there was a widespread feeling that large investment was needed in the country’s infrastructure. French and British engineers celebrated in 1990 when they met under the Channel. Mobile phone use was tiny by modern standards, mainly confined to commercial business travellers’ cars and a few much-mocked City slickers carrying objects the size of a brick. The computer age was further advanced. The Thatcher years had seen a glittering waterfall of new products and applications, most of them generated in California’s new ‘silicon valley’, a hotbed of computer inventiveness recognized by name as early as 1971. The revolutionary Apple II computer had been launched in 1977, followed by Tandys, Commodores and Ataris with their floppy disks and basic games.

pages: 490 words: 40,083

PostgreSQL: introduction and concepts
by Bruce Momjian
Published 15 Jan 2001

Figure 14.15 shows an example of CHECK constraints using a modified version of the friend table from Figure 3.2, page 13. This figure has many CHECK clauses:

state: Forces the column to be two characters long. CHAR() pads the field with spaces, so state must be trim()-ed of trailing spaces before length() is computed.
age: Forces the column to hold only positive values.
gender: Forces the column to hold either M or F.
last_met: Forces the column to include dates between January 1, 1950, and the current date.
table: Forces the table to accept only rows where firstname is not ED or lastname is not RIVERS. The effect is to prevent Ed Rivers from being entered into the table.
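The constraints described above can be sketched in miniature. The snippet below uses Python’s built-in SQLite rather than PostgreSQL, simplifies the date check to a fixed lower bound (SQLite rejects non-deterministic functions such as date('now') inside CHECK), and invents its own sample rows; it illustrates the idea, not the book’s exact Figure 14.15.

```python
import sqlite3

# A miniature version of the excerpt's friend-table constraints, in SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE friend (
        firstname TEXT,
        lastname  TEXT,
        state     TEXT    CHECK (length(state) = 2),
        age       INTEGER CHECK (age > 0),
        gender    TEXT    CHECK (gender IN ('M', 'F')),
        last_met  TEXT    CHECK (last_met >= '1950-01-01'),
        CHECK (NOT (firstname = 'ED' AND lastname = 'RIVERS'))
    )
""")

# A row that satisfies every clause is accepted.
conn.execute("INSERT INTO friend VALUES ('ANN','SMITH','PA',30,'F','1990-05-01')")

# Ed Rivers himself trips the table-level CHECK and is rejected.
rejected = False
try:
    conn.execute("INSERT INTO friend VALUES ('ED','RIVERS','NY',40,'M','1990-05-01')")
except sqlite3.IntegrityError:
    rejected = True
```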

pages: 463 words: 118,936

Darwin Among the Machines
by George Dyson
Published 28 Mar 2012

The scales are shifting both in distance and in time; the intelligence a large corporation once gathered for its annual report is now available to any small business using a personal computer to manage its day-to-day accounts. “We felt that the distinction between micro- and macro-economics, while appropriate in a non-computer age, was no longer necessary,”36 remarked economist Gerald Thompson, recalling his final collaboration with Oskar Morgenstern in 1975, two years before Morgenstern’s death. Money is a recursive function, defined, layer upon layer, in terms of itself. The era when you could peel away the layers to reveal a basis in precious metals ended long ago.

pages: 524 words: 120,182

Complexity: A Guided Tour
by Melanie Mitchell
Published 31 Mar 2009

More technically, it is used to describe a vast array of phenomena ranging from the fiber-optic transmissions that constitute signals from one computer to another on the Internet to the tiny molecules that neurons use to communicate with one another in the brain. The different examples of complex systems I described in chapter 1 are all centrally concerned with the communication and processing of information in various forms. Since the beginning of the computer age, computer scientists have thought of information transmission and computation as something that takes place not only in electronic circuits but also in living systems. In order to understand the information and computation in these systems, the first step, of course, is to have a precise definition of what is meant by the terms information and computation.

pages: 429 words: 114,726

The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise
by Nathan L. Ensmenger
Published 31 Jul 2010

As Michael Williams suggests in a recent volume edited by Raul Rojas and Ulf Hashagen called The First Computers (note the crucial use of the plural), any particular claim to the priority of invention must necessarily be heavily qualified: if you add enough adjectives you can always claim your own favorite.11 And indeed, the ENIAC has a strong claim to this title: not only was it digital, electronic, and programmable (and therefore looked a lot like a modern computer) but the ENIAC designers—John Mauchly and J. Presper Eckert—went on to form the first commercial computer company in the United States. The ENIAC and its commercial successor, the Universal Automatic Computer (UNIVAC), were widely publicized as the first of the “giant brains” that presaged the coming computer age. But even the ENIAC had its precursors and competitors. For example, in the 1930s, a physicist at Iowa State University, John Atanasoff, had worked on an electronic computing device and had even described it to Mauchly. Others were working on similar devices. During the Second World War in particular, a number of government and military agencies, both in the United States and abroad, had developed electronic computing devices, many of which also have a plausible claim to being if not the first computer, then at least a first computer.

pages: 289 words: 113,211

A Demon of Our Own Design: Markets, Hedge Funds, and the Perils of Financial Innovation
by Richard Bookstaber
Published 5 Apr 2007

Half a century after the limits of measurement and thus of physical knowledge were demonstrated by Heisenberg in the world of quantum mechanics, Lorenz piled on a result that showed how microscopic errors could propagate to have a stultifying impact in nonlinear dynamic systems. This limitation could come into the forefront only with the dawning of the computer age, because it is manifested in the subtle errors of computational accuracy. The essence of the butterfly effect is that small perturbations can have large repercussions in massive, random forces such as weather. Edward Lorenz was a professor of meteorology at MIT, and in 1961 he was testing and tweaking a model of weather dynamics on a rudimentary vacuum-tube computer.
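The kind of sensitivity Lorenz observed is easy to reproduce; the sketch below substitutes the one-line logistic map for his weather model (a stand-in chosen here for brevity, not something from the book): two runs seeded a hair’s breadth apart soon bear no resemblance to each other.

```python
# Two runs of the chaotic logistic map x -> 4x(1 - x), started 1e-10 apart.
# The microscopic gap roughly doubles each step until the trajectories
# are completely unrelated.
def trajectory(x, steps):
    xs = []
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
        xs.append(x)
    return xs

a = trajectory(0.4, 50)
b = trajectory(0.4 + 1e-10, 50)

early_gap = abs(a[0] - b[0])                            # still ~1e-10
late_gap = max(abs(p - q) for p, q in zip(a[40:], b[40:]))
print(early_gap, late_gap)                              # microscopic vs. order-one
```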

pages: 422 words: 113,830

Bad Money: Reckless Finance, Failed Politics, and the Global Crisis of American Capitalism
by Kevin Phillips
Published 31 Mar 2008

If Paulson wanted to keep the spotlight off the real culprits—the financial and mortgage giants with their leverage laboratories, multiple trillions worth of exotic mortgages, toxic CDOs, and Las Vegas-like credit swaps—then academician Bernanke at the Fed was a good partner. The narrow-gauge theory in which the economics Ph.D. had immersed himself basically ignored twenty-first-century mega-innovations and looked back seven decades to the crash of the 1930s and how that long-ago, pre-computer-age debacle might have been prevented. Too little of that outdated context applied more than seven decades later. Foremost among Bernanke’s economist heroes were the late Milton Friedman and the latter’s wife, Anna Schwartz. Some four decades earlier, they had coauthored a landmark volume entitled A Monetary History of the United States, which made them the preeminent theorists of how the Federal Reserve deserved principal blame for letting the circumstances of 1929-1933 deepen into depression—and also of how the Fed might have prevented that deflationary descent.

pages: 347 words: 112,727

Rust: The Longest War
by Jonathan Waldman
Published 10 Mar 2015

In early 1945 the US District Court for the Northern District of Ohio found Ball in violation of the Sherman Antitrust Act, and the Supreme Court, a few months later, affirmed the decision, which meant the end of the Ball brothers’ expansion. The brothers’ mustaches had gotten too big. The ruling made modernizing glass factories pointless; Ball’s only choice was to diversify. So Ball diversified, and how. The company got involved in every age: the plastics age, the computer age, the space age. Ball made display monitors, pressure cookers, Christmas ornaments, roofing, nursing bottles, prefab housing, battery shells, and a chemical for preserving vinyl LPs. In the early 1980s, Ball made about 12 billion pennies—or rather, 12 billion copper-plated zinc penny blanks for the US mints in San Francisco, Denver, Philadelphia, and West Point.

pages: 508 words: 120,339

Working Effectively With Legacy Code
by Michael Feathers
Published 14 Jul 2004

When I need to make changes in particularly tangled legacy code, I often spend time trying to figure out where I should write my tests. This involves thinking about the change I am going to make, seeing what it will affect, seeing what the affected things will affect, and so on. This type of reasoning is nothing new; people have been doing it since the dawn of the computer age. Programmers sit down and reason about their programs for many reasons. The funny thing is, we don’t talk about it much. We just assume that everyone knows how to do it and that doing it is “just part of being a programmer.” Unfortunately, that doesn’t help us much when we are confronted with terribly tangled code that goes far beyond our ability to reason easily about it.

The Fugitive Game: Online With Kevin Mitnick
by Jonathan Littman
Published 1 Jan 1996

"There's a service. It's called 'Protect My Friend Service,'" Mitnick chuckles. "You pay me a certain fee per month and I make sure nobody causes you problems." "Is this the Capone program?" "Yeah. It's a new program. It was developed throughout the years to protect stores and stuff, and now we're going into the computer age." Mitnick can't stop laughing. I can't either. "I think you really need this service!" Mitnick howls. "So what sort of services are provided?" Mitnick catches himself, holding back the laughter. "Don't print that shit because someone's actually going to believe it!" "There's nothing I can do, huh?"

pages: 395 words: 116,675

The Evolution of Everything: How New Ideas Emerge
by Matt Ridley

And if Moore’s Law has continued through technological change-overs, then there’s no reason to think it will not happen again. When chips eventually reach their miniaturisation limit, the plummeting cost will continue in another technology. Nor is Moore’s Law the only such regularity to emerge in the computer age. Kryder’s Law says that the cost per performance of hard disk computer storage is rising exponentially, at 40 per cent a year. Cooper’s Law finds that the number of possible simultaneous wireless communications has doubled every thirty months since 1895, when Marconi first broadcast. These are largely independent of Moore’s Law.
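
Each of the regularities named here is exponential, so each implies a doubling time and a cumulative growth factor. A quick back-of-envelope check in Python (the 40 per cent and 30-month rates come from the passage; the 2015 endpoint is an arbitrary choice of mine):

```python
import math

# Kryder's Law as stated: 40 per cent per year improvement in
# hard-disk cost-performance implies a doubling time of about 2 years.
kryder_doubling = math.log(2) / math.log(1.40)   # ≈ 2.06 years

# Cooper's Law as stated: a doubling every 30 months since 1895.
# Number of doublings between 1895 and, say, 2015:
doublings = (2015 - 1895) * 12 / 30              # 48 doublings
growth_factor = 2 ** doublings                    # ≈ 2.8e14

print(round(kryder_doubling, 2), doublings, f"{growth_factor:.1e}")
```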

pages: 352 words: 120,202

Tools for Thought: The History and Future of Mind-Expanding Technology
by Howard Rheingold
Published 14 May 2000

Military-sponsored research-and-development teams on both sides of the Atlantic continued to work on digital computers of their own. A few of these independent research efforts grew out of Ballistics work. Others were connected with the effort to build the first nuclear fission and fusion bombs. Over a hundred years had passed between Babbage and Turing. The computer age might have been delayed for decades longer if World War II had not provided top-notch engineering teams, virtually unlimited funds, and the will to apply scientific findings to real-world problems at the exact point in the history of mathematics when the theory of computation made computers possible.

pages: 447 words: 111,991

Exponential: How Accelerating Technology Is Leaving Us Behind and What to Do About It
by Azeem Azhar
Published 6 Sep 2021

This is what hominids did with flints. It is what pharaonic stonemasons did to the stone blocks of the pyramids. And it is what Michelangelo did when he chiselled a block of marble to create David. Today, we can do this at a grander scale and with greater precision, but the process is essentially the same. Even as the computer age heralded precise computerised machining, this was still a subtractive process: the early human’s hammering of flint on stone was replaced by a diamond cutter controlled by a computer. Of course, there are other methods of making stuff, such as using casts to mould metals or plastic. These have the advantage over chiselling in that they don’t waste any material.

pages: 415 words: 114,840

A Mind at Play: How Claude Shannon Invented the Information Age
by Jimmy Soni and Rob Goodman
Published 17 Jul 2017

This leap, writes Walter Isaacson, “became the basic concept underlying all digital computers.” It would be six more years before Turing and Shannon met in a wartime scientists’ cafeteria, each of their projects so classified that they could only speak of them in hints. They had barely begun to build. Yet in one year, “an annus mirabilis of the computer age,” they had laid the foundations. In particular, they had shown the possibilities of digital computing, of minute, discrete decisions arrayed one after the other. Less than a decade after Shannon’s paper, the great analog machine, the differential analyzer, was effectively obsolete, replaced by digital computers that could do its work literally a thousand times faster, answering questions in real time, driven by thousands of logic gates that each acted as “an all-or-none device.”

pages: 389 words: 119,487

21 Lessons for the 21st Century
by Yuval Noah Harari
Published 29 Aug 2018

For a general survey of methods, see: Jose David Fernández and Francisco Vico, ‘AI Methods in Algorithmic Composition: A Comprehensive Survey’, Journal of Artificial Intelligence Research 48 (2013), 513–82. 12 Eric Topol, The Patient Will See You Now: The Future of Medicine is in Your Hands (New York: Basic Books, 2015); Robert Wachter, The Digital Doctor: Hope, Hype and Harm at the Dawn of Medicine’s Computer Age (New York: McGraw-Hill Education, 2015); Simon Parkin, ‘The Artificially Intelligent Doctor Will Hear You Now’, MIT Technology Review 9 March 2016; James Gallagher, ‘Artificial intelligence “as good as cancer doctors”’, BBC, 26 January 2017. 13 Kate Brannen, ‘Air Force’s lack of drone pilots reaching “crisis” levels’, Foreign Policy, 15 January 2015. 14 Tyler Cowen, Average is Over: Powering America Beyond the Age of the Great Stagnation (New York: Dutton, 2013); Brad Bush, ‘How combined human and computer intelligence will redefine jobs’, TechCrunch, 1 November 2016. 15 Ulrich Raulff, Farewell to the Horse: The Final Century of Our Relationship (London: Allen Lane, 2017); Gregory Clark, A Farewell to Alms: A Brief Economic History of the World (Princeton: Princeton University Press, 2008), 286; Margo DeMello, Animals and Society: An Introduction to Human-Animal Studies (New York: Columbia University Press, 2012), 197; Clay McShane and Joel Tarr, ‘The Decline of the Urban Horse in American Cities’, Journal of Transport History 24:2 (2003), 177–98. 16 Lawrence F.

pages: 424 words: 114,905

Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again
by Eric Topol
Published 1 Jan 2019

The previous books, examined in the light of where we are now, reveal his prescient vision. In Deep Medicine, Eric tells us we are living in the Fourth Industrial Age, a revolution so profound that it may not be enough to compare it to the invention of steam power, the railroads, electricity, mass production, or even the computer age in the magnitude of change it will bring. This Fourth Industrial Age, revolving around artificial intelligence (AI), robotics, and Big Data, heralds a profound revolution that is already visible in the way we live and work, perhaps even in the way we think of ourselves as humans. It has great potential to help, but also to harm, to exaggerate the profound gap that already exists between those who have much and those who have less each passing year.

pages: 394 words: 117,982

The Perfect Weapon: War, Sabotage, and Fear in the Cyber Age
by David E. Sanger
Published 18 Jun 2018

While the Ukrainians did not have defenses as sophisticated as many American utility companies, a quaint oddity in Ukrainian systems ultimately saved them from an even greater disaster. It turned out that their electric grid, built by the Soviets, was so antiquated that it wasn’t entirely dependent on computers. “They still had the big, old metal switches that ran the power grid back in the pre-computer age,” Ozment explained, as if admiring the simplicity of an original Ford Model A engine. The investigators reported that Ukrainian engineers got into their trucks and went scrambling from one substation to another, looking for switches that they could throw to route around the computers and turn the lights back on.

pages: 387 words: 119,244

Making It Happen: Fred Goodwin, RBS and the Men Who Blew Up the British Economy
by Iain Martin
Published 11 Sep 2013

They were shepherded by Cameron McPhail, a Glasgow University economics graduate and part of Mathewson’s small team who had worked on the original reorganisation leading to the night of the long knives. He would report to Tony Schofield and to Mathewson. Prior to joining the Royal Bank, McPhail had been living in San Francisco, promoting Scotland as an investment location and imbibing the spirit of the coming computer age. Mathewson wanted modern thinking to imbue their deliberations. The Royal Bank should be reorganised from top to bottom, he told them, in ways that controlled costs of course. The management consultants McKinsey were hired to assist the effort. In terms of the bank’s staff, a premium was put on having non-bankers involved, outsiders who had worked in other industries and who would not be swayed by old loyalties and ingrained traditions.

pages: 463 words: 115,103

Head, Hand, Heart: Why Intelligence Is Over-Rewarded, Manual Workers Matter, and Caregivers Deserve More Respect
by David Goodhart
Published 7 Sep 2020

Here is Turner from his 2018 paper Capitalism in the Age of Robots: In a world of ever increasing automation possibilities, we only need a very small number of very clever IT literate people to write all the code we need for all the robots, all the apps, and all the computer games, and we need only a miniscule fraction of the global population to drive inexorable progress towards ever more profound artificial intelligence and super intelligence. Three decades or more since we first began to talk of living in a computer age, the total number of workers employed in the development and production of computer hardware, software and applications, is still only 4 per cent of the total workforce in the US, and the US Bureau of Labor Statistics predicts just 135,000 new jobs in software development over 2014 to 2024, versus 458,000 additional personal care aides, and 348,000 home health aides.

pages: 426 words: 117,775

The Charisma Machine: The Life, Death, and Legacy of One Laptop Per Child
by Morgan G. Ames
Published 19 Nov 2019

The resulting illusions included newfound freedoms and potentials for computer-based self-governance, the inversion of traditional social institutions (putting, of course, computer-savvy hackers at the top), the flattening of bureaucracies, the end of geographic inequity, and a reboot of history.86 Even though computerization has largely entrenched existing power structures, these ideals of the computer age live on, especially among technologists who have not had much contact with those actively excluded from the technology world. Not coincidentally, this era of Gibson’s law-flouting “console cowboys” and Levy’s youthful hacker pranks occurred during the childhoods or early adulthoods of many of those involved with One Laptop per Child, and some, such as Papert and Negroponte, write about being active participants in the early stages of this world.

pages: 353 words: 355

The Long Boom: A Vision for the Coming Age of Prosperity
by Peter Schwartz , Peter Leyden and Joel Hyatt
Published 18 Oct 2000

The Y2K crisis will also accelerate the shift from the problematic computer systems that are a legacy of the past to the newest generation of networked computer systems, which are a solid foundation going forward into the new century. These will be the new baseline for the twenty-first century computer age—and for the centuries that come next. The nature of crises in general is that they accelerate change. That's the silver lining in entering what by most accounts is a situation that nobody wants to go through. Crises tend to clear the decks of old leaders and old ideas. These people and ideas are often rightfully blamed for helping create the crisis in the first place.

Innovation and Its Enemies
by Calestous Juma
Published 20 Mar 2017

Thomond and Fiona Lettice, “Allocating Resources to Disruptive Innovation Projects: Challenging Mental Models and Overcoming Management Resistance,” International Journal of Technology Management 44, nos. 1–2 (2008): 140. 54. Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: HarperBusiness, 2000). 55. See, for example, Rachel Schurman and William A. Munro, Fighting for the Future of Food: Activists versus Agribusiness in the Struggle over Biotechnology (Minneapolis: University of Minnesota Press, 2010). 56. Kenneth A. Oye, Kevin Esvelt, Evan Appleton, Flaminia Catteruccia, George Church, Todd Kuiken, Shlomiya Bar-Yam Lightfoot, Julie McNamara, Andrea Smidler, and James P.

pages: 420 words: 124,202

The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention
by William Rosen
Published 31 May 2010

The figures in question are J. Bradford deLong’s slightly different estimates. 4 The nineteenth-century French infant Numbers from UN and CIA Factbook. 5 A skilled fourth-century weaver Kirkpatrick Sale, Rebels Against the Future: The Luddites and Their War on the Industrial Revolution—Lessons for the Computer Age (Reading, MA: Addison-Wesley, 1995). 6 But by 1900 In 1900, the average U.S. hourly wage was $0.22, and a loaf of bread cost about a nickel; in 2000, the average wage was $18.65, and a loaf of bread cost less than $1.79. 7 “[a]bout 1760, a wave of gadgets swept over England” T. S. Ashton, Industrial Revolution (Oxford: Oxford University Press, 1997). 8 “fizzled out” Joel Mokyr, “The Great Synergy: The European Enlightenment as a Factor in Modern Economic Growth,” April 2005, online article at http://faculty.weas.northwestern.edu/∼jmokyr/Dolfsma.pdf.

pages: 566 words: 122,184

Code: The Hidden Language of Computer Hardware and Software
by Charles Petzold
Published 28 Sep 1999

Of course, the word has the normal meaning, "a small portion, degree, or amount," and that normal meaning is perfect because a bit—one binary digit—is a very small quantity indeed. Sometimes when a new word is invented, it also assumes a new meaning. That's certainly true in this case. A bit has a meaning beyond the binary digits used by dolphins for counting. In the computer age, the bit has come to be regarded as the basic building block of information. Now that's a bold statement, and of course, bits aren't the only things that convey information. Letters and words and Morse code and Braille and decimal digits convey information as well. The thing about the bit is that it conveys very little information.

pages: 288 words: 16,556

Finance and the Good Society
by Robert J. Shiller
Published 1 Jan 2012

When the information was finally prepared, the tapes mounted on the computer, and the data processed, the authority of the idea that speculation makes for perfect market prices was much enhanced. That same “CRSP tape,” considerably updated, is still the major source for daily prices of individual stocks going back to 1926. By bringing financial analysis into the computer age, the efficient markets hypothesis gained the status of an icon, and as a result it led people to infer much—in fact too much—about the perfection of markets. The discovery that day-to-day fluctuations in stock prices are difficult to forecast should have come as no big surprise: if there were a simple trading strategy that consistently offered a profit of as little as a tenth of a percent a day, it would yield annual returns of over 30%.
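
The closing arithmetic compounds a tenth of a percent per day over a year; a small check (the day-count convention is my assumption, not stated in the excerpt):

```python
# A strategy earning 0.1% per day, compounded.
daily = 1.001

trading_days = daily ** 252 - 1    # ~252 trading days/year -> ≈ 28.6%
calendar_days = daily ** 365 - 1   # every calendar day     -> ≈ 44.0%

# On either convention the edge compounds to far more than typical
# annual equity returns, which is the point: it should not persist.
print(f"{trading_days:.1%}  {calendar_days:.1%}")
```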

pages: 510 words: 120,048

Who Owns the Future?
by Jaron Lanier
Published 6 May 2013

If you turned it one way and started reading, it was what Che would have been reading in the jungle if he had been a computer nerd. Flip it upside down and around and you had a hippie wow book with visions of crazy psychedelic computation. Ted often said that if this book had been published in a font large enough to read, he would have been one of the most famous figures of the computer age, and I agree with him. The main reason for Ted’s obscurity, however, is that Ted was just too far ahead of his time. Even the most advanced computer science labs were not in a position to express the full radical quality of change that digital technology would bring. For instance, I first visited Xerox PARC when some of the original luminaries were still gathered there.

pages: 456 words: 123,534

The Dawn of Innovation: The First American Industrial Revolution
by Charles R. Morris
Published 1 Jan 2012

It has placed the construction of machinery in the ranks of demonstrative science. The day will arrive when no school of mechanical drawing will be thought complete without teaching it.”33 Once again, however, Babbage was pointing directly to developments still far in the future. It was only with the onset of the postwar Computer Age that technologists began to execute their chip and other hardware designs in software so they could be tested and exercised without the expense of building physical components. And It Worked The tantalizing historical question for Babbage admirers was always whether his machines would actually have worked.

pages: 481 words: 125,946

What to Think About Machines That Think: Today's Leading Thinkers on the Age of Machine Intelligence
by John Brockman
Published 5 Oct 2015

Today, the engineers who design the artificial-intelligence-based programs and robots have a tremendous influence over how we use them. As computer systems are woven more deeply into the fabric of everyday life, the tension between intelligence augmentation and artificial intelligence becomes increasingly visible. At the dawn of the computing age, Wiener had a clear sense of the significance of the relationship between humans and smart machines. He saw the benefits of automation in eliminating human drudgery, but he also clearly saw the possibility of the subjugation of humanity. The intervening decades have only sharpened the dichotomy he first identified.

pages: 485 words: 126,597

Paper: A World History
by Mark Kurlansky
Published 3 Apr 2016

Lately, it has become popular to attribute the invention—as Walter Isaacson suggests in The Innovators and James Essinger affirms in Ada’s Algorithm—to Ada Lovelace, poet Lord Byron’s neglected and brilliant daughter. In the early nineteenth century, she wrote the first algorithm intended to be carried out by a machine. The computer age does seem to have its origins in the Industrial Revolution, and Ada Lovelace was probably the first to write about a machine that could be programmed to work on any problem. But her ideas were based on those of Charles Babbage, who had built a machine, the Difference Engine, that could make calculations.

pages: 627 words: 127,613

Transcending the Cold War: Summits, Statecraft, and the Dissolution of Bipolarity in Europe, 1970–1990
by Kristina Spohr and David Reynolds
Published 24 Aug 2016

The revenue from Soviet oil exports and the provision of détente-era credits from the West had helped in the short term but oil prices collapsed in the mid-1980s while the credits built up a huge debt burden across Eastern Europe. A return to economic autarky was, however, not an option because the Soviet bloc needed continued access to the West in order to keep up technologically, especially in the dawning computer age, and also to placate ordinary people with more sophisticated consumer goods. The socio-economic malaise was exacerbated by imperial overstretch. With roughly a quarter of GDP being channelled into the armed forces and the military-industrial complex, manpower and resources were not available for the rejuvenation of the civilian economy.

pages: 444 words: 124,631

Buy Now, Pay Later: The Extraordinary Story of Afterpay
by Jonathan Shapiro and James Eyers
Published 2 Aug 2021

Ten floors below him in the most coveted office building in Manhattan, Philippe Laffont’s Coatue Management ran US$16 billion of assets in his specialist technology hedge fund. While most hedge fund reception halls were decorated with expensive art, Coatue’s paid homage to tech nostalgia. A white cabinet displayed relics that were precursors to the dawning computer age: early Nintendo consoles, the first-ever Apple computer, and the Apple Newton, a clunky handheld device that was discontinued in 1998. Laffont had been an analyst around the same time as Coleman, but Robertson had passed him over for funds, instead backing Coleman, Lee Ainslie’s Maverick and Ole Andreas Halvorsen’s Viking Global.

Why Things Bite Back: Technology and the Revenge of Unintended Consequences
by Edward Tenner
Published 1 Sep 1997

Schor, The Overworked American: The Unexpected Decline of Leisure (New York: Basic Books, 1991), illustrate the sources of our ambiguity; Daniel Crevier, AI: The Tumultuous History of the Search for Intelligence (New York: Basic Books, 1993), blends sober realism with unquenchable optimism, as does Thomas K. Landauer, The Trouble with Computers (Cambridge, Mass.: MIT Press, 1995). And on the folklore of computer revenge effects, there is Karla Jennings, The Devouring Fungus: Tales of the Computer Age (New York: Norton, 1990). No history of the technology of sports exists. Probably the most influential interpretation of sports history is Allen Guttmann, From Ritual to Record: The Nature of Modern Sports (New York: Columbia University Press, 1978). Peter J. Brancazio, Sport Science: Physical Laws and Optimum Performance (New York: Simon & Schuster, 1984), is an excellent overview and reference, with some valuable points on the effects of technological change on sports.

pages: 538 words: 141,822

The Net Delusion: The Dark Side of Internet Freedom
by Evgeny Morozov
Published 16 Nov 2010

Who could have predicted that the development of “labor-saving devices” had the effect of increasing the burden of housework for most women? Similarly, the introduction of computers into the workforce failed to produce expected productivity gains (Tetris was, perhaps, part of some secret Soviet plot to halt the capitalist economy). The Nobel Prize-winning economist Robert Solow quipped that “one can see the computer age everywhere but not in the productivity statistics!” Part of the problem in predicting the exact economic and social effects of a technology lies in the uncertainty associated with the scale on which such a technology would be used. The first automobiles were heralded as technologies that could make cities cleaner by liberating them of horse manure.

pages: 416 words: 39,022

Asset and Risk Management: Risk Oriented Finance
by Louis Esch , Robert Kieffer and Thierry Lopez
Published 28 Nov 2005

We will, however, be using Rt in our subsequent reasoning.

Example. Let us calculate in Table 3.1 the quantities Rt and Rt* for a few values of Ct. The differences observed are small, and in addition, we have: ln(11 100 / 12 750) = 0.0039 + 0.0271 − 0.0794 − 0.0907 = −0.1391.

Table 3.1 Classic and logarithmic returns

Ct:  12 750, 12 800, 13 150, 12 150, 11 100
Rt:  0.0039, 0.0273, −0.0760, −0.0864
Rt*: 0.0039, 0.0271, −0.0794, −0.0907

(Footnote: an argument that no longer makes sense with the advent of the computer age. See, for example, the portfolio return shown below.)

3.1.1.2 Return on a portfolio. Let us consider a portfolio consisting of a number N of equities, and note nj, Cjt and Rjt, respectively the number of equities (j), the price for those equities at the end of period t and the dividend paid on the equity during that period.
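
Logarithmic returns are convenient precisely because they add across periods, which is what the −0.1391 sum illustrates. A minimal check in Python, using the five prices from Table 3.1:

```python
import math

# Prices Ct from Table 3.1 of the excerpt above.
prices = [12750, 12800, 13150, 12150, 11100]

# Classic (simple) return: Rt = (Ct - Ct-1) / Ct-1
classic = [(c1 - c0) / c0 for c0, c1 in zip(prices, prices[1:])]

# Logarithmic return: Rt* = ln(Ct / Ct-1)
log_ret = [math.log(c1 / c0) for c0, c1 in zip(prices, prices[1:])]

# For returns of this size the two differ only slightly, and the
# log returns telescope: their sum equals ln(last price / first price).
assert abs(sum(log_ret) - math.log(prices[-1] / prices[0])) < 1e-12
```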

pages: 436 words: 76

Culture and Prosperity: The Truth About Markets - Why Some Nations Are Rich but Most Remain Poor
by John Kay
Published 24 May 2004

"The Penn World Table (Mark 5): An Expanded Set of International Comparisons, 1950-1988." Quarterly Journal of Economics 106 (2) (May): 327-68. Hicks, J. R. 1939. Value and Capital: An Inquiry into Some Fundamental Principles of Economic Theory. Revised ed. Oxford: Clarendon Press, 1946. Hiltzik, M. A. 1999. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperBusiness. Hinsley, F. H., and A. Stripp, eds. 1993. Codebreakers: The Inside Story of Bletchley Park. Oxford: Oxford University Press. Hirsch, F. 1976. Social Limits to Growth. Cambridge, Mass.: Harvard University Press. Hirschman, A. O. 1970. Exit, Voice, and Loyalty: Responses to Decline in Firms, Organizations, and States.

pages: 431 words: 129,071

Selfie: How We Became So Self-Obsessed and What It's Doing to Us
by Will Storr
Published 14 Jun 2017

But this new collective mood, so alien to the individualist American core, didn’t come without costs. Nightmares wafted in from the disturbed national subconscious. These were the years of Red Terror and McCarthyism in which the paranoid idea spread that America might tip into Communism. There was a similar feeling of rising dread over the coming computer age. It was feared the future would be a ‘technocracy’ in which freedom and individuality would be crushed and the population dominated by machines of coercion, conformity and control. Computers were seen as a war technology that would be co-opted by the collective powers of government and corporation and used against us.

pages: 422 words: 131,666

Life Inc.: How the World Became a Corporation and How to Take It Back
by Douglas Rushkoff
Published 1 Jun 2009

World’s Fairs in 1939 and again in 1964 offered the experience of a future America where a benevolent corporation would address every need imaginable. AT&T, GM, and the U.S. Rubber Company sponsored utopian pavilions with names such as Pool of Industry and the Avenue of Transportation. Corporations would take us into the automobile age, the space age, and even the computer age. No matter the sponsor, the overarching message was the same: American-style corporatism would create a bright future for us all. The intelligentsia played along. Former socialist academics and Nazi expatriates alike were finding easy money in the form of research grants from both the military and industry if they recanted their prior socioeconomic theories and promoted the new corporatism.

pages: 377 words: 21,687

Digital Apollo: Human and Machine in Spaceflight
by David A. Mindell
Published 3 Apr 2008

Bibliography 315 Castells, Manuel. The Rise of the Network Society. Cambridge, Mass.: Blackwell Publishers, 1996. Cernan, Eugene, and Don Davis. The Last Man on the Moon: Astronaut Eugene Cernan and America’s Race in Space, 1st ed. New York: St. Martin’s Press, 1999. Ceruzzi, Paul. Beyond the Limits: Flight Enters the Computer Age. Cambridge, Mass.: MIT Press, 1989. Chaikin, Andrew. A Man on the Moon: The Voyages of the Apollo Astronauts. New York: Penguin Books, 1998. Chandler, Alfred Dupont. The Visible Hand: The Managerial Revolution in American Business. Cambridge, Mass.: Belknap Press, 1977. Cheatham, Donald C., and Floyd Bennett.

pages: 742 words: 137,937

The Future of the Professions: How Technology Will Transform the Work of Human Experts
by Richard Susskind and Daniel Susskind
Published 24 Aug 2015

While mathematicians call this ‘exponential growth’, professionals might simply think of it as explosive growth. This growth in processing power has already had profound effects. Michael Spence, a Nobel Laureate in economics, notes that Moore’s Law resulted in ‘roughly a 10-billion-times’ reduction in the cost of processing power in the first fifty years of the ‘computer age’ (which, he thinks, began roughly in 1950). Ray Kurzweil, in his books The Singularity is Near and How to Create a Mind, stresses that this will continue. According to Kurzweil, the ‘fundamental measures of information technology follow predictable and exponential trajectories’.19 In explaining exponential growth, he says that: the pace of change for our human-created technology is accelerating and its powers are expanding at an exponential pace.
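
Spence's "roughly 10-billion-times" figure is just Moore's Law compounded over the period. A quick check, assuming one doubling every 18 months across the 50 years from 1950 (the 18-month doubling period is my assumption, not Spence's):

```python
# Processing power doubling every 18 months for 50 years:
doublings = 50 * 12 / 18          # ≈ 33.3 doublings
factor = 2 ** doublings           # ≈ 1.1e10 -- "roughly 10 billion" times
print(round(doublings, 1), f"{factor:.1e}")
```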

pages: 466 words: 127,728

The Death of Money: The Coming Collapse of the International Monetary System
by James Rickards
Published 7 Apr 2014

Friedrich Hayek, in his classic 1945 essay “The Use of Knowledge in Society,” written almost two hundred years after Adam Smith’s work, makes the same argument but with a shift in emphasis. Whereas Smith focused on individuals, Hayek focused on information. This was a reflection of Hayek’s perspective on the threshold of the computer age, when models based on systems of equations were beginning to dominate economic science. Of course, Hayek was a champion of individual liberty. He understood that the information he wrote about would ultimately be created at the level of individual autonomous actors within a complex economic system.

pages: 493 words: 145,326

Fire and Steam: A New History of the Railways in Britain
by Christian Wolmar
Published 1 Mar 2009

Households were separated; homes were desecrated by the emissaries of the law.’31 While there is no small measure of Victorian hyperbole in this account, it demonstrates quite clearly that the middle classes invested far more in the railways than in goods such as cotton or corn which tended to be traded only by professional investors and speculators. There may have been a range of newspapers covering the railway industry, but there was nothing like the availability of information in today’s computer age. Therefore there was no immediate panic as the price of stock began to go down as a result of the higher interest rates, but the process was steady and irreversible. Against an index value of 100 in 1840, railway shares had risen to 149 at the peak when interest rates rose and then fell back to 95.5 by 1848 and to 70.4 two years later.

pages: 494 words: 142,285

The Future of Ideas: The Fate of the Commons in a Connected World
by Lawrence Lessig
Published 14 Jul 2001

For the property obsessed, or those who believe that progress comes only from strong and powerful property rights, pause on this point and read it again: The most important space for innovation in our time was built upon a platform that was free. As Alan Cox, second only to Linus Torvalds in the Linux chain, puts it in an essay in response to Microsoft's attack on open code values: [M]ost of the great leaps of the computer age have happened despite, rather than because of, [intellectual property rights (IPR)]. [B]efore the Internet the proprietary network protocols divided customers, locked them into providers and forced them to exchange much of their data by tape. The power of the network was not unlocked by IPR. It was unlocked by free and open innovation shared amongst all.23 Not strong, perfect control by proprietary vendors, but open and free protocols, as well as open and free software that ran on top of those protocols: these produced the Net.

pages: 409 words: 129,423

Mapping Mars: Science, Imagination and the Birth of a World
by Oliver Morton
Published 15 Feb 2003

The television producer Gene Roddenberry combined the two ideas into a “final frontier”: Star Trek’s conflation of physical exploration with the search for knowledge was profoundly influential, making space travel the dominant metaphor for the creation of scientific and technological novelty. Space enthusiasts have been using frontier imagery relentlessly ever since. As the historian Patricia Nelson Limerick has pointed out, this use has been consistently self-deluding and ill-informed, but that hasn’t put an end to it.* The computer age became yet another new frontier when William Gibson’s idea of “cyberspace” gave it a sense of dimensionality; its pioneering freedoms of expression were upheld by an Electronic Frontier Foundation that had frequent recourse to the rhetoric of the Wild West. I’m fairly sure Zubrin would laugh in scorn at the idea that a computer screen could be a frontier.

pages: 517 words: 139,477

Stocks for the Long Run 5/E: the Definitive Guide to Financial Market Returns & Long-Term Investment Strategies
by Jeremy Siegel
Published 7 Jan 2014

Investors are inevitably drawn to firms able to generate high earnings and revenue growth. But empirical data show this pursuit of growth often leads to subpar returns. To illustrate how growth does not necessarily translate into superior returns, imagine for a moment that you are an investor in 1950, at the dawn of the computer age. You have $1,000 to invest and are given the choice of two stocks: Standard Oil of New Jersey (now ExxonMobil) or a much smaller, promising new company called IBM. You will instruct the firm you choose to reinvest all dividends paid back into new shares, and you will put your investment under lock and key for the next 62 years, to be distributed at the end of 2012 to your great-grandchildren or to your favorite charity.
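
The mechanics of the thought experiment (every dividend immediately buying new shares) compound the share count as well as the price. An illustrative sketch with made-up numbers; none of these figures are Siegel's:

```python
def locked_box(years, price_growth, dividend_yield, invested=1000.0):
    """Accumulate with all dividends reinvested in new shares.

    Each year the dividend (yield * price per share) buys more shares
    at the current price, so the share count grows by the yield.
    """
    price, shares = 1.0, invested
    for _ in range(years):
        price *= 1 + price_growth          # price appreciation
        shares *= 1 + dividend_yield       # dividend buys more shares
    return shares * price

# Hypothetical numbers: a slower grower with a larger reinvested yield
# can finish ahead of a faster grower -- the "growth trap" in miniature.
steady = locked_box(62, price_growth=0.05, dividend_yield=0.05)
glamour = locked_box(62, price_growth=0.09, dividend_yield=0.01)
assert steady > glamour
```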

pages: 573 words: 142,376

Whole Earth: The Many Lives of Stewart Brand
by John Markoff
Published 22 Mar 2022

Brand missed the irony when summarizing the demise: “All this was good people doing work for good service to the customers, but so structured it threatened our fundamental service.”[8] * * * At the end of 1972, Brand had defiantly announced in his Rolling Stone article on Spacewar!: “Ready or not, computers are coming to the people.”[9] It would be another decade, however, before he would acquire his own computer. When it arrived, the machine was a Kaypro II, one of the last successful entries in the hobbyist period of the personal computer age. It was what was referred to as a “luggable” computer, about the size and weight of a heavy sewing machine. It was modeled after the earlier Osborne 1, which had revolutionized the neophyte industry by bundling a complete suite of business software for less than $2,000—word processing, database, spreadsheet, and, most significantly, a communications program that connected the machine to a device known as a modem.


pages: 474 words: 130,575

Surveillance Valley: The Rise of the Military-Digital Complex
by Yasha Levine
Published 6 Feb 2018

In this rebranded world, computers were the new communes: a digital frontier where the creation of a better world was still possible. In the parlance of today’s Silicon Valley, Brand “pivoted.” He transformed the Whole Earth Catalog into the Whole Earth Software Catalog and Whole Earth Review—magazines billed as “tools and ideas for the computer age.” He also launched the Global Business Network, a corporate consulting company that applied his counterculture public relations strategies to problems faced by clients such as Shell Oil, Morgan Stanley, Bechtel, and DARPA.31 He also organized an influential computer conference that brought together leading computer engineers and journalists.32 It was called, simply, “Hackers’ Conference” and was held in Marin County in 1984.

pages: 454 words: 139,350

Jihad vs. McWorld: Terrorism's Challenge to Democracy
by Benjamin Barber
Published 20 Apr 2010

Gore genuinely believes in the role of government as a regulator and equalizer, but after the elections of November 1994, there is little to suggest he will get much support in Congress or the nation. PART II. THE OLD WORLD OF JIHAD Chapter 10. Jihad vs. McWorld or Jihad via McWorld? 1. See David Gonzalez, “The Computer Age Bids Religious World to Enter,” The New York Times, July 24, 1994, Section 1, p. 1. 2. See Allan Bloom, The Closing of the American Mind (New York: Simon & Schuster, 1987). I have explored the ironies of Bloom’s complaint elsewhere in An Aristocracy of Everyone: The Politics of Education and the Future of America (New York: Ballantine Books, 1993), Chapter 5. 3.

The Book of Why: The New Science of Cause and Effect
by Judea Pearl and Dana Mackenzie
Published 1 Mar 2018

Tech. rep., Department of Computer Science, University of California, Los Angeles, CA. Submitted to Communications of the ACM. Accessed online at https://arxiv.org/abs/1707.04327. Domingos, P. (2015). The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World. Basic Books, New York, NY. Efron, B., and Hastie, T. (2016). Computer Age Statistical Inference. Cambridge University Press, New York, NY. Freedman, D., Pisani, R., and Purves, R. (2007). Statistics. 4th ed. W. W. Norton & Company, New York, NY. Hacking, I. (1990). The Taming of Chance (Ideas in Context). Cambridge University Press, Cambridge, UK. Hoover, K. (2008).

pages: 493 words: 136,235

Operation Chaos: The Vietnam Deserters Who Fought the CIA, the Brainwashers, and Themselves
by Matthew Sweet
Published 13 Feb 2018

Here, Marcus explained, brainwash candidates had been put under hypnosis, subjected to electric shocks, forced to eat their own excrement and endure sexual humiliation, and tortured until they whined like puppies. Once they had been reduced to mental pulp, the programming began. A literal kind of programming, following the rules of the computer age. Numbers linked to functions. Infinite loops of coded instructions, drilled into the subject by repetition, violence, the application of electrodes to bare skin. Finally, cyanide pills had been secreted inside their bodies, in order to eliminate the killers once they had fulfilled their programs.

pages: 1,079 words: 321,718

Surfaces and Essences
by Douglas Hofstadter and Emmanuel Sander
Published 10 Sep 2012

Much like the concept hump for Mica, the concept of a solid desk — a piece of furniture — was, for adults who grew up before the era of personal computers, a category with an old town, a downtown, and suburbs, like so many other categories. To make this vivid, we can cite a dictionary definition dating back to the pre-computer age. In particular, the following enormous and admirable vintage-1932 dictionary: FUNK & WAGNALLS New Standard Dictionary [Reg. U. S. Pat. Off.] OF THE English Language UPON ORIGINAL PLANS DESIGNED TO GIVE, IN COMPLETE AND ACCURATE STATEMENT, IN THE LIGHT OF THE MOST RECENT ADVANCES IN KNOWLEDGE, IN THE READIEST FORM FOR POPULAR USE, THE ORTHOGRAPHY, PRONUNCIATION, MEANING, AND ETYMOLOGY OF ALL THE WORDS, AND THE MEANING OF IDIOMATIC PHRASES, IN THE SPEECH AND LITERATURE OF THE ENGLISH- SPEAKING PEOPLES, TOGETHER WITH PROPER NAMES OF ALL KINDS, THE WHOLE ARRANGED IN ONE ALPHABETICAL ORDER PREPARED BY MORE THAN THREE HUNDRED AND EIGHTY SPECIALISTS AND OTHER SCHOLARS defined the word “desk” as follows: desk, n. 1.

Thus once we have two “rival” categories of desk, this allows us to put our finger on the essence of the original category by constructing a more abstract concept of desk. Much as acquiring a second language allows one to understand the nature of one’s native language more clearly, the emergence of computer-age desks has helped us gain a newer and deeper understanding of our old category desk. The advent of home computers changed the venerable old concept of desk, making it no longer associated with one indivisible concept. The emergence of three new types of desks — hard-desk, soft-desk, and general-desk — has allowed us to perceive more clearly an essence, hidden up till then, of the original old category.

pages: 514 words: 153,274

The Cobweb
by Neal Stephenson and J. Frederick George
Published 31 May 2005

about the authors Neal Stephenson is the author of THE SYSTEM OF THE WORLD, THE CONFUSION, QUICKSILVER, CRYPTONOMICON, THE DIAMOND AGE, SNOW CRASH, and other books and articles. J. Frederick George is a historian and writer living in Paris. Also by Neal Stephenson and J. Frederick George INTERFACE Praise for Interface also by Neal Stephenson and J. Frederick George “A Manchurian Candidate for the computer age.” —Seattle Weekly “Qualifies as the sleeper of the year, the rare kind of science-fiction thriller that evokes genuine laughter while simultaneously keeping the level of suspense cranked to the max.” —San Diego Tribune “Complex, entertaining, frequently funny.” —Publishers Weekly Now available wherever Bantam Books are sold.

pages: 650 words: 155,108

A Man and His Ship: America's Greatest Naval Architect and His Quest to Build the S.S. United States
by Steven Ujifusa
Published 9 Jul 2012

The Philadelphia Inquirer reported that he underwent “a serious operation for internal troubles,” and that his family was “not permitted to see or talk to him because the physicians deemed that it would be taxing the patient’s strength.”2 William Warren Gibbs recovered, but two years later the family had another scare when the family’s rented Haverford house caught fire. Motorists leapt out of their cars to help the Gibbs brothers drag out what they could. The household was eventually put back together.3 The Gibbs brothers managed to salvage their plans and get back to work. Like the garage inventors of the computer age, the brothers did not let surroundings distract them. Within a year or so, they were ready to present preliminary drawings for a duo of ocean liners. Judged by the standards Gibbs would set in the years to come, the designs were awkward and derivative, a hodgepodge of visual features from predecessors, and a four-stacked silhouette echoing British liners.

pages: 339 words: 57,031

From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism
by Fred Turner
Published 31 Aug 2006

“Gender Differences in Computer-Mediated Communication: Bringing Familiar Baggage to the New Frontier.” Paper presented at the American Library Association Annual Convention, Miami, FL, 1994. Hiltz, Starr Roxanne, and Murray Turoff. The Network Nation: Human Communication via Computer. Reading, MA: Addison-Wesley, 1978. Hiltzik, Michael A. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperBusiness, 1999. Hipschman, David. “Who Speaks for Cyberspace?” Posted September 14, 1995, to Web Review at http://www.gnn.org (site now discontinued); reposted December 5, 1995, to http://www.balikinos.de/dokfest/1995/reviewof.htm (accessed July 16, 2005). Hitt, Jack, and Paul Tough.

pages: 570 words: 158,139

Overbooked: The Exploding Business of Travel and Tourism
by Elizabeth Becker
Published 16 Apr 2013

The country that literally invented the Internet and modern computing finally had a barebones national travel website a decade behind the rest of the world, but at least the United States was back in the game. This new tourism website is underwritten by a new fee charged foreign tourists, not the American taxpayer. It took two years, but in May 2012, the Discover America website was in business. For the first time since the dawn of the computer age, foreigners could go to one website and find what the United States had to offer. No more guessing at things like the name of the national railroad (Amtrak). With the door open this far, the industry pushed even harder. They were too close to their goal of catching up from that lost decade. Now the association lobbied for the last piece of the puzzle: to improve the visa process by making it easier to apply, and to reduce the waiting time for approval.

pages: 554 words: 158,687

Profiting Without Producing: How Finance Exploits Us All
by Costas Lapavitsas
Published 14 Aug 2013

The weakness of productivity growth underscores the relatively poor results of GDP growth during the period, shown in figure 2. Looking more closely at figure 6, from the middle of the 1970s to the middle of the 1990s, productivity growth was broadly flat or declining, including in the US, the leading country in introducing the new technologies of the era.14 Robert Solow observed that ‘You can see the computer age everywhere but in the productivity statistics’, and his quip became the ‘Solow Paradox’ characteristic of the new era.15 After 1995, however, significant technological improvements in the microprocessor industry and faster productivity growth in general seemed to materialize for the US economy.

pages: 479 words: 144,453

Homo Deus: A Brief History of Tomorrow
by Yuval Noah Harari
Published 1 Mar 2015

If you cannot make up your mind, or if you make a mistake, the computer has passed the Turing Test, and we should treat it as if it really has a mind. However, that won’t really be a proof, of course. Acknowledging the existence of other minds is merely a social and legal convention. The Turing Test was invented in 1950 by the British mathematician Alan Turing, one of the fathers of the computer age. Turing was also a gay man in a period when homosexuality was illegal in Britain. In 1952 he was convicted of committing homosexual acts and forced to undergo chemical castration. Two years later he committed suicide. The Turing Test is simply a replication of a mundane test every gay man had to undergo in 1950 Britain: can you pass for a straight man?

pages: 573 words: 157,767

From Bacteria to Bach and Back: The Evolution of Minds
by Daniel C. Dennett
Published 7 Feb 2017

That is, innovations in memes that made them more effective reproducers in brains that were not yet well designed for dealing with them could provide the early “proof of concept” that would underwrite, in effect, the more expensive and time-consuming genetic adjustments in brain hardware that would improve the working conditions for both memes and their hosts. This basic pattern has been echoed hundreds of times since the dawn of the computer age, with software innovations leading the way and hardware redesigns following, once the software versions have been proven to work. If you compare today’s computer chips with their ancestors of fifty years ago, you will see many innovations that were first designed as software systems, as simulations of new computers running on existing hardware computers.

pages: 486 words: 150,849

Evil Geniuses: The Unmaking of America: A Recent History
by Kurt Andersen
Published 14 Sep 2020

What’s more, because “any worker who now performs his task by following specific instructions can, in principle, be replaced by a machine,” that was bound to happen and accelerate, so “labor will become less and less important. I do not see that the new industries can employ everybody who wants a job.” For a few years, Leontief had refrained from publicly sharing his dark analogy about the inevitable process he foresaw, but with the mass computer age arriving—10 percent of Americans owned a PC and 1 percent were on the Internet in 1983—he decided on candor. What happened to horses in America during his lifetime, he wrote, when we went from owning 26 million in 1915 to owning 3 million in 1960, would happen to human workers as we entered the twenty-first century.

pages: 661 words: 156,009

Your Computer Is on Fire
by Thomas S. Mullaney , Benjamin Peters , Mar Hicks and Kavita Philip
Published 9 Mar 2021

I am not aware of much sober scholarship on this particular transition from East Coast government to Silicon Valley private business, although much of the dated rhetoric that pits state against corporations can be found in popular accounts, such as Michael Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (New York: Harper Business, 1999), and L. Gordon Crovitz, “Who Really Invented the Internet?” Wall Street Journal (July 22, 2012), https://www.wsj.com/articles/SB10000872396390444464304577539063008406518. 28. Judy O’Neill, “Interview with Paul Baran,” Charles Babbage Institute, OH 182 (February 5, 1999), Menlo Park, CA, accessed September 15, 2017, http://www.gtnoise.net/classes/cs7001/fall_2008/readings/baran-int.pdf. 29.

pages: 467 words: 149,632

If Then: How Simulmatics Corporation Invented the Future
by Jill Lepore
Published 14 Sep 2020

Griese, “Rosser Reeves and the 1952 Eisenhower TV Spot Blitz,” Journal of Advertising 4 (1975): 30. “Stevenson Bids Eggheads Unite,” New York Herald Tribune, March 23, 1954. “National Affairs: The Third Brother,” Time, March 31, 1958. Ira Chinoy, “Battle of the Brains: Election-Night Forecasting at the Dawn of the Computer Age” (PhD diss., Johns Hopkins University, 2010). At first, IP was merely sharing research he had already conducted. Jack Shea to Thomas K. Finletter, February 28, 1956, encloses a “first draft of a study by de Sola Pool on impressions of Eisenhower and Stevenson through TV and radio media in the 1952 campaign.

Trade Your Way to Financial Freedom
by van K. Tharp
Published 1 Jan 1998

French economist George Anderla has measured changes in the rate of information flow with which we human beings must cope. He has concluded that information flow doubled in the 1,500 years between the time of Jesus and Leonardo da Vinci. It doubled again by the year 1750 (that is, in about 250 years). The next doubling took only about 150 years to the turn of the century. The onset of the computer age reduced the doubling time to about 5 years. And, with today’s computers offering electronic bulletin boards, DVDs, fiber optics, the Internet, and so on, the amount of information to which we are exposed currently doubles in about a year or less. Researchers estimate that humans, with what we currently use of our brain potential, can take in only 1 to 2 percent of the visual information available at any one time.

Turing's Cathedral
by George Dyson
Published 6 Mar 2012

“They fight against popular creationism, but at the same time they fight fanatically for their own creationism,” he noted in 1984.4 Alfvén divided his later years between La Jolla, California, where he held a position as professor of physics at UC–San Diego, and the Royal Institute of Technology in Stockholm, where he had been appointed to the School of Electrical Engineering in 1940, just in time to witness the arrival of the computer age firsthand. Sweden’s BESK (Binär Elektronisk Sekvens Kalkylator) was a first-generation copy of the IAS machine, becoming operational in 1953. It had faster memory and arithmetic, partly through clever Swedish engineering (including the use of 400 germanium diodes) and partly through reducing the memory of each Williams tube to 512 bits.

Digital Accounting: The Effects of the Internet and Erp on Accounting
by Ashutosh Deshmukh
Published 13 Dec 2005

Information-intensive industries will certainly benefit from the coming standardization. Chapter IV: Electronic Data Interchange. What is Electronic Data Interchange? Before the dawn of the computer age, intra- and inter-business activities, especially purchasing and selling of products and services, were paper-intensive. Paper documents such as purchase orders, invoices, shipping notices, and bills of lading needed to be prepared in multiple copies. These copies had to be approved, signed, preserved in files for a certain duration, forwarded to trading partners and processed in a myriad of ways.

pages: 555 words: 163,712

War of Shadows: Codebreakers, Spies, and the Secret Struggle to Drive the Nazis From the Middle East
by Gershom Gorenberg
Published 19 Jan 2021

THE REASON THAT Welchman’s book upset his heirs at the NSA and GCHQ, historian Nigel West argued, was that Welchman had stressed that in theory Enigma was “impregnable if it had been used properly.” German mistakes and failure to follow simple security rules had created the openings Bletchley Park exploited. His book, therefore, was not just about how an archaic cipher from before the computer age was cracked. It was a warning to contemporary enemies to tighten security.26 If they took note, their encryption could become unbreakable. Yet Welchman was partly mistaken, and so were the agencies frightened by his book. Welchman’s error was in underestimating what Rejewski had achieved before the war.

pages: 589 words: 167,680

The Red and the Blue: The 1990s and the Birth of Political Tribalism
by Steve Kornacki
Published 1 Oct 2018

Perot was a guest on CNN’s Larry King Live, which, in seven years on the air, had become the cable network’s signature show. On February 20, King’s guest for the full hour was Perot. Plenty of people already knew his name. He’d been dubbed America’s first populist billionaire, a folksy Texan who’d created a data processing company at the dawn of the computer age, sold it to General Motors for $2.4 billion, then gotten GM to cough up $750 million more to get him off its board. Perot had a deep sense of patriotism, a flair for showmanship, and bottomless disgust with government bureaucracy. During the Vietnam War, he paid for Christmas gift baskets for Americans being held captive, then flew to Hanoi and stood outside a prison demanding to deliver them personally.

pages: 533

Future Politics: Living Together in a World Transformed by Tech
by Jamie Susskind
Published 3 Sep 2018

‘Characterizing Quantum Supremacy in Near-Term Devices’. arXiv, 5 Apr. 2017 <https://arxiv.org/abs/1608.00263> (accessed 28 Nov. 2017). Bollen, Johan, Huina Mao, and Xiao-Jun Zeng. ‘Twitter Mood Predicts the Stock Market’. arXiv, 14 Oct. 2010 <https://arxiv.org/pdf/1010.3003.pdf> (accessed 1 Dec. 2017). Bolter, J. David, Turing’s Man: Western Culture in the Computer Age. London: Duckworth, 1984. Bolukbasi, Tolga, Kai-Wei Chang, James Zou, and Venkatesh Saligrama. ‘Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings’. arXiv, 21 Jul. 2016 <https://arxiv.org/pdf/1607.06520.pdf> (accessed 3 Dec. 2017). Bonchi, Francesco, Carlos Castillo, and Sara Hajian.

pages: 665 words: 159,350

Shape: The Hidden Geometry of Information, Biology, Strategy, Democracy, and Everything Else
by Jordan Ellenberg
Published 14 May 2021

That’s more than the number of . . . okay, actually, it’s more than the number of anything in the universe, and it is certainly not a number of things you’re going to comb through one by one and write little W’s, L’s, and D’s next to. In principle, yes; in reality, no. This phenomenon of computations we know exactly how to do, but don’t have time to do, is a somber minor-key motif that sounds through the whole history of computer-age mathematics. Back to prime factorization for a second. We’ve already seen that you can carry that out without much real thought. If you start with a number like 1,001, you just have to find a number that divides it up evenly, and if you can’t find one, 1,001 is prime. Does 2 work? No, 1,001 can’t be split in half. 3?
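The trial-division procedure Ellenberg describes, finding a number that divides 1,001 evenly or declaring it prime if none exists, can be sketched directly. The function names here are my own; only the method (check candidate divisors, stopping at the square root) comes from the passage.

```python
def smallest_factor(n):
    """Trial division: return the smallest divisor of n greater than 1.
    If that divisor is n itself, n is prime. Checking candidates only up
    to sqrt(n) suffices, since any composite has a factor that small."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # no divisor found: n is prime

def factorize(n):
    """Repeatedly strip off the smallest factor for the full factorization."""
    factors = []
    while n > 1:
        f = smallest_factor(n)
        factors.append(f)
        n //= f
    return factors

print(factorize(1001))  # → [7, 11, 13], so 1,001 is not prime
```

This works instantly for 1,001; the "know how, but not in time" problem arrives only when the number has hundreds of digits and trial division would outlast the universe.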

pages: 622 words: 169,014

Astounding: John W. Campbell, Isaac Asimov, Robert A. Heinlein, L. Ron Hubbard, and the Golden Age of Science Fiction
by Alec Nevala-Lee
Published 22 Oct 2018

Science fiction had set its stories in the future or in space because that was where the action was, and Astounding had begun to take itself seriously as prophecy only after its core assumptions were already in place, with its best guesses arising mostly by chance. With so much wild speculation, some of it was bound to be correct—even if the man at the helm had often steered in the wrong direction. Yet if the future—from atomic energy to the space race to the computer age, which would threaten the existence of the very magazines from which it had emerged—felt like science fiction, it was largely because the prophecy had fulfilled itself. It had inspired countless readers to enter the sciences, where they set themselves, consciously or not, to enacting its vision.

pages: 613 words: 181,605

Circle of Greed: The Spectacular Rise and Fall of the Lawyer Who Brought Corporate America to Its Knees
by Patrick Dillon and Carl M. Cannon
Published 2 Mar 2010

In the meantime new technologies being developed in California were making it ever so much easier to commit fraud, to manipulate financial markets, and to filch on a massive scale. Lerach and his partners would be waiting for these bandits too, using the same technologies against them. FOR BILL LERACH AS for the rest of the country, the Computer Age was ushered in with the faint beep, beep, beep sound of the Soviet satellite Sputnik that circled the globe in 1957 to general amazement and not a small amount of panic. Lerach’s father initially thought it a communist hoax, but he and his wife and sons joined the other families on Kennedy Avenue, including Gene Carney’s, to peer into the October sky looking for the thing.

The Rough Guide to Jamaica
by Thomas, Polly,Henzell, Laura.,Coates, Rob.,Vaitilingam, Adam.

Though none were rawer than Yellow, who added energetic stage performances and self-deprecating humour to the expletives, other DJs – fuelled by a positive response from their Jamaican audience – emulated his lewd approach, and sexually explicit lyrics – or “slackness” – began to proliferate. In 1985, Wayne Smith’s hit Under Me Sleng Teng – voiced for a King Jammy’s rhythm of the same name – heralded the start of the computer age in the dancehall. Studios switched from analogue to digital recording formats, and producers seized upon computerized rhythms as a quicker and cheaper way of putting out a record. The mixing board had become an instrument unto itself, with a new breed of producers like Bobby Digital, Donovan Germaine, Mikey Bennett, Dave Kelly, Jeremy Harding and Patrick Roberts becoming the Reids and Dodds of the 1990s and DJs becoming the island’s biggest stars.

The Man Who Knew Infinity: A Life of the Genius Ramanujan
by Robert Kanigel
Published 25 Apr 2016

Barbara first encountered the name of Ramanujan in late 1987, a time when magazines and newspapers in the United States, India, and Britain were full of articles marking the hundredth anniversary of his birth. Like Mahalanobis in King’s College Chapel, Barbara was smitten. First, with the sheer romance of his life—the story in it. But also with how today, years after his death and long into the computer age, some of his theorems were, as she put it later, being “snatched back from history.” “Ramanujan who?” I said when my agent, Vicky Bijur, told me of Barbara’s interest in a biography of him. Though skeptical, I did some preliminary research into his life, as recorded by his Indian biographers.

The Big Score
by Michael S. Malone
Published 20 Jul 2021

After the 360 there wasn’t much competition. It was the most successful computer ever introduced up to its time, and with its direct (and compatible) descendant, the 370 Series, it became the largest-selling computer family in history. It can be argued that the 360 Series changed the world. With it the computer age arrived. Computers now had a future. Now a company could acquire, say, the low-end Model 20 and, as it grew, graduate to increasingly powerful models, right up to the 360 Model 91, which was 500 times as powerful as the Model 20—yet the firm could still hang the same printers and tape drives on its newest 360 Model and run the same software.

pages: 829 words: 187,394

The Price of Time: The Real Story of Interest
by Edward Chancellor
Published 15 Aug 2022

The Northwestern economist lamented that the digital revolution ‘provided new opportunities for consumption on the job and in leisure hours rather than a continuation of the historical tradition of replacing human labor with machines’.18 Gordon’s concerns recall a famous comment made by MIT economist Bob Solow in 1987, that ‘you can see the computer age everywhere but in the productivity statistics.’ The Nobel laureate Solow spoke too soon. Not long afterwards US productivity growth picked up, assisted presumably by advances in information technology. As Gordon’s book The Rise and Fall of American Growth went to press in early 2016 (its publication facilitated by digital technologies), the internet continued to disrupt countless industries while the media fanned fears of an impending ‘second machine age’, in which robots replace human workers.

pages: 619 words: 177,548

Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity
by Daron Acemoglu and Simon Johnson
Published 15 May 2023

Moreover, much of the growth in both patenting and research spending is driven by new patents in electronics, communication, and software, the fields that were supposed to propel us forward. But look a little closer, and the fruits of the digital revolution are much harder to see. In 1987, Nobel Prize winner Robert Solow wrote: “You can see the computer age everywhere but in the productivity statistics,” pointing out the small gains from investments in digital technologies. Those more optimistic about computers told Solow that he had to be patient; productivity growth would soon be upon us. More than thirty-five years have passed, and we are still waiting.

Statistics in a Nutshell
by Sarah Boslaugh
Published 10 Nov 2012

Both are possible, but a more logical explanation is that people over the age of 65 are more likely not to be employed and more likely to have a diagnosis of arthritis. To test the hypothesis that age distribution is the reason for the observed differences in rate of arthritis diagnosis by employment status, we need to compute age-adjusted rates of arthritis by using a standard population. First, we need to calculate age-specific rates for employed and unemployed individuals, as in Figure 15-15.

Figure 15-15. Age-specific rates of arthritis diagnosis

Looking at the age distribution and age-specific rates for the employed versus unemployed populations, we see that for each age group, the rates of arthritis are somewhat higher in the unemployed group than in the employed group (the opposite pattern from that seen when data from all the age categories is combined).
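The direct age-standardization the passage describes is a weighted average: each group's age-specific rate is weighted by that group's share of a standard population. A minimal sketch, with made-up rates and weights (the actual figures from Figure 15-15 are not reproduced here):

```python
# Hypothetical age-specific arthritis rates per 100 people (illustrative
# values, not the book's data).
rates_employed   = {"18-44": 5.0, "45-64": 15.0, "65+": 35.0}
rates_unemployed = {"18-44": 7.0, "45-64": 18.0, "65+": 40.0}

# Hypothetical standard-population shares for the same age groups.
standard_weights = {"18-44": 0.50, "45-64": 0.35, "65+": 0.15}

def age_adjusted_rate(rates, weights):
    """Direct standardization: weight each age-specific rate by the
    standard population's share of that age group."""
    return sum(rates[group] * weights[group] for group in weights)

adj_employed = age_adjusted_rate(rates_employed, standard_weights)
adj_unemployed = age_adjusted_rate(rates_unemployed, standard_weights)
print(adj_employed, adj_unemployed)
```

Because both groups are weighted by the same standard population, the comparison is no longer confounded by the groups' different age distributions: here the unemployed group's adjusted rate stays higher, matching its higher rate within every age stratum.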

pages: 700 words: 201,953

The Social Life of Money
by Nigel Dodd
Published 14 May 2014

Above all, it would be a world in which people never feel compelled to hide their money in freezers. * * * 1 Breton 1969: 18. 2 The idea of a moneyless world usually breaks down over the prospect of finding people to barter with (the double coincidence of wants problem), but some commentators believe that this hitch is much less of a problem in the computer age. For example, David Birch argues that “we can resolve the long chain of intermediate coincidences, minimising each step by search, in a few milliseconds. In this way, it is possible to imagine trades taking place with Google replacing Bank of England notes,” see “Imagine there’s no money,” http://fw.ifslearning.ac.uk/Archive/2011/October/Comment/Imaginetheresnomoneydavidbirch.aspx, accessed May 10, 2013. 3 Simmel uses the phrase in relation to socialism: “For by declaring war upon this monetary system, socialism seeks to abolish the individual’s isolation in relation to the group” (Simmel 2004: 346). 4 Utopianism is by no means unambiguously positive, of course, and what seems utopian from one perspective can be dystopian from another.

pages: 781 words: 226,928

Commodore: A Company on the Edge
by Brian Bagnall
Published 13 Sep 2005

The machine had Japanese Katakana characters in place of many of the PETSCII characters and it booted up to black text on a pink background as opposed to the familiar blue colors. It sold for a price of 99,800 yen (approximately $400 US). Commodore users in Japan received support from a magazine called Vic! The Magazine for Computer Age.

Computing with the portable C64.

Although the C64 was unable to dominate the Japanese marketplace, it was responsible for keeping Japan from entering the North American market. In a broadcast of Computer Chronicles, Tramiel told his hosts, “As far as the Japanese are concerned, I was able to keep those people out of the US market and almost the world market for the past seven years. … What I’m trying to do is come out with the best product, the best quality, and the best price and by doing so, I keep those people out.

pages: 735 words: 214,791

IBM and the Holocaust
by Edwin Black
Published 30 Jun 2001

Changing perspective was perhaps the dominant reason why the relationship between IBM and the Holocaust has never been explored. When I first wrote The Transfer Agreement in 1984, no one wanted to focus on assets. Now everyone talks about the assets. The formative years for most Holocaust scholarship was before the computer age, and well before the Age of Information. Everyone now possesses an understanding of how technology can be utilized in the affairs of war and peace. We can now go back and look at the same documentation in a new light. Many of us have become enraptured by the Age of Computerization and the Age of Information.

pages: 846 words: 232,630

Darwin's Dangerous Idea: Evolution and the Meanings of Life
by Daniel C. Dennett
Published 15 Jan 1995

Of course, Darwin is the man who painstakingly uncovered a host of jaw-dropping complexities in the lives and bodies of barnacles, orchids, and earthworms, and described them with obvious relish. Had he had a prophetic dream back in 1859 about the wonders of DNA, he would no doubt have reveled in it, but I wonder if he could have recounted it with a straight face. Even to those of us accustomed to the "engineering miracles" of the computer age, the facts are hard to encompass. Not only molecule-sized copying machines, but proofreading enzymes that correct mistakes, all at blinding speed, on a scale that super-computers still cannot match. "Biological macromolecules have a storage capacity that exceeds that of the best present-day information stores by several orders of magnitude.

pages: 915 words: 232,883

Steve Jobs
by Walter Isaacson
Published 23 Oct 2011

There have been analogous situations in history, when an era is shaped by the relationship and rivalry of two orbiting superstars: Albert Einstein and Niels Bohr in twentieth-century physics, for example, or Thomas Jefferson and Alexander Hamilton in early American governance. For the first thirty years of the personal computer age, beginning in the late 1970s, the defining binary star system was composed of two high-energy college dropouts both born in 1955. Bill Gates and Steve Jobs, despite their similar ambitions at the confluence of technology and business, had very different personalities and backgrounds. Gates’s father was a prominent Seattle lawyer, his mother a civic leader on a variety of prestigious boards.

pages: 976 words: 235,576

The Meritocracy Trap: How America's Foundational Myth Feeds Inequality, Dismantles the Middle Class, and Devours the Elite
by Daniel Markovits
Published 14 Sep 2019

As the economist Robert Solow, whose work on economic growth won him the Nobel Prize, once wryly admitted, it is “somewhat embarrass[ing] . . . that what everyone feels to have been a technological revolution, a drastic change in our productive lives, has been accompanied everywhere . . . by a slowing-down of productivity growth, not by a step up. We can see the computer age everywhere but in the productivity statistics.” Robert Solow, “We’d Better Watch Out,” New York Times, July 12, 1987 (reviewing Stephen S. Cohen and John Zysman, Manufacturing Matters: The Myth of the Post-Industrial Economy). Conclusion: What Should We Do? “a thousand years of successful German history”: The phrase comes from Alexander Gauland, the head of the populist Alternativ für Deutschland.

pages: 869 words: 239,167

The Story of Work: A New History of Humankind
by Jan Lucassen
Published 26 Jul 2021

Himself the son of a pharmacist, he observes that ‘most pharmacists are employed only because the law says that there has to be a pharmacist present to dispense drugs’.30 Think also of the proliferation of the legal profession (especially private law experts in the Anglo-Saxon tradition),31 or of specialists in detecting, fighting and preventing cyber-crime – the other side of the shining medallion of digitization. Without striving for completeness, think also of the flexibilization of work that requires much greater coordination, the fight against burn-out and other work stress complaints. The same applies to the demands of éducation permanente in the computer age. If everyone must constantly upskill in order to stay up to date, then many more ‘upskillers’ are necessary, including specialists in digital support. Apparently, we need more inspectors, controllers and supervisors, more auditors of auditors of auditors and so on, ad infinitum, in the private and in the public sectors, both in the position of wage worker and in that of the hired-in freelancer.

Global Catastrophic Risks
by Nick Bostrom and Milan M. Cirkovic
Published 2 Jul 2008

Historical Rev., 104(5). http://www.historycooperative.org/journals/ahr/104.5/ah001582.html Russell, P. (1983). The Global Brain: Speculation on the Evolutionary Leap to Planetary Consciousness (Los Angeles: Tarcher). Sale, K. (2001). Rebels Against the Future: The Luddites and Their War on the Industrial Revolution: Lessons for the Computer Age (New York, NY: Basic Books). Schaefer, N.A. (2004). Y2K as an endtime sign: apocalypticism in America at the fin-de-millennium. J. Popular Cult., 38(1), 82–105. Schmidhuber, J. (2006). New millennium AI and the convergence of history. In Duch, W. and Mandziuk, J. (eds.), Challenges to Computational Intelligence (Berlin: Springer). http://arxiv.org/abs/cs.AI/0606081 Seidensticker, B. (2006).

pages: 913 words: 265,787

How the Mind Works
by Steven Pinker
Published 1 Jan 1997

Our brains are adapted to that long-vanished way of life, not to brand-new agricultural and industrial civilizations. They are not wired to cope with anonymous crowds, schooling, written language, government, police, courts, armies, modern medicine, formal social institutions, high technology, and other newcomers to the human experience. Since the modern mind is adapted to the Stone Age, not the computer age, there is no need to strain for adaptive explanations for everything we do. Our ancestral environment lacked the institutions that now entice us to nonadaptive choices, such as religious orders, adoption agencies, and pharmaceutical companies, so until very recently there was never a selection pressure to resist the enticements.

pages: 1,060 words: 265,296

The Wealth and Poverty of Nations: Why Some Are So Rich and Some So Poor
by David S. Landes
Published 14 Sep 1999

And France, the country, remained one of the world’s most beautiful places to visit, a work of natural and man-made art, a tourist paradise. By the 1990s France had one of the highest standards of living in the world, with income a quarter again as high as that of old rival Great Britain. The old staples had slumped; France did not learn to mass-produce the high-tech devices of the computer age. But wine and cheese and fabrics and fashions remained. One sticking point, source of weakness as well as strength: the French are proud. They have their way of doing things and, unlike the British, do not take easily to loss of power. This makes them poor learners of foreign ways. They have their own way.

pages: 764 words: 261,694

The Elements of Statistical Learning (Springer Series in Statistics)
by Trevor Hastie , Robert Tibshirani and Jerome Friedman
Published 25 Aug 2009

Estimating the error rate of a prediction rule: some improvements on cross-validation, Journal of the American Statistical Association 78: 316–331. Efron, B. (1986). How biased is the apparent error rate of a prediction rule?, Journal of the American Statistical Association 81: 461–470. Efron, B. and Tibshirani, R. (1991). Statistical analysis in the computer age, Science 253: 390–395. Efron, B. and Tibshirani, R. (1993). An Introduction to the Bootstrap, Chapman and Hall, London. Efron, B. and Tibshirani, R. (1996). Using specially designed exponential families for density estimation, Annals of Statistics 24(6): 2431–2461. Efron, B. and Tibshirani, R. (1997).

pages: 893 words: 282,706

The Great Shark Hunt: Strange Tales From a Strange Time
by Hunter S. Thompson
Published 6 Nov 2003

Yet neither Sturmthal nor Godwin would have balked for an instant at the prospect of climbing into the cockpit of the thing and pushing it as high and hard as it could possibly go. The Air Force has been trying for 20 years to croak the image of the wild-eyed, full-force, "aim it at the ground and see if it crashes" kind of test pilot, and they have finally succeeded. The vintage-'69 test pilot is a supercautious, super-trained, superintelligent monument to the Computer Age. He is a perfect specimen, on paper, and so confident of his natural edge on other kinds of men that you begin to wonder -- after spending a bit of time in the company of test pilots -- if perhaps we might not all be better off if the White House could be moved, tomorrow morning, to this dreary wasteland called Edwards Air Force Base.

pages: 956 words: 288,981

Ghost Wars: The Secret History of the CIA, Afghanistan, and Bin Laden, from the Soviet Invasion to September 10, 2011
by Steve Coll
Published 23 Feb 2004

Gushing oil revenue poured into every bureaucratic nook and cranny in the kingdom. Saudi Arabia’s five-year government budget from 1969 to 1974 was $9.2 billion. During the next five years it was $142 billion. Just a generation removed from nomadic poverty, the kingdom was on a forced march to the computer age. Turki wired up the General Intelligence Department offices inside the kingdom and in thirty-two embassies and consulates abroad. All the software, however, failed to detect the violent plot by the crazed Juhayman al-Utaybi to seize Mecca in November 1979. With its echoes of the Ikhwan revolt put down by Abdul Aziz, the Mecca uprising rattled all of the Saudi security agencies.

Rainbow Six
by Tom Clancy
Published 2 Jan 1998

Thanks, Jimmy." Werner lifted his phone and dialed the international number. "Mr. Tawney, please," he told the operator. "It's Gus Werner calling from FBI Headquarters in Washington." "Hello, Gus. That was very fast of you," Tawney said, half in his overcoat and hoping to get home. "The wonders of the computer age, Bill. I have a possible hit on this Serov guy. He flew from Heathrow to Chicago yesterday. The flight was about three hours after the fracas you had at Hereford. I have a rental car, a hotel bill, and a flight from Chicago to New York City after he got here." "Address?" "We're not that lucky.

pages: 931 words: 79,142

Concepts, Techniques, and Models of Computer Programming
by Peter Van-Roy and Seif Haridi
Published 15 Feb 2004

[34] Nicholas Carriero and David Gelernter. Linda in context. Communications of the ACM, 32(4):444–458, 1989. [35] Nicholas Carriero and David Gelernter. Coordination languages and their significance. Communications of the ACM, 35(2):96–107, February 1992. [36] Paul E. Ceruzzi. Beyond the Limits: Flight Enters the Computer Age. MIT Press, Cambridge, MA, 1989. [37] Emmanuel Chailloux, Pascal Manoury, and Bruno Pagano. Développement d’applications avec Objective Caml. O’Reilly & Associates, Paris, 2000. [38] Randy Chow and Theodore Johnson. Distributed Operating Systems and Algorithms. Addison-Wesley, San Francisco, CA, 1997.

pages: 1,009 words: 329,520

The Last Tycoons: The Secret History of Lazard Frères & Co.
by William D. Cohan
Published 25 Dec 2015

BETWEEN 1966 AND 1969, investment banking fees soared, mirroring the merger boom across Wall Street. The year 1970 would be very different. On Wall Street a full-fledged crisis was brewing, with brokerages becoming overwhelmed by an explosion in the volume of equities traded, without having the back-office capability to handle the increased paperwork. While the problem sounds mundane in the computer age, it was anything but boring for those involved. Even the most prescient firms suffered. The New York Stock Exchange quickly figured out it had a major problem. To get a handle on how to solve the crisis of failing firms and to salvage as many of them as possible, the exchange created the Surveillance Committee of the New York Stock Exchange, loosely referred to as the Crisis Committee.

pages: 1,263 words: 371,402

The Year's Best Science Fiction: Twenty-Sixth Annual Collection
by Gardner Dozois
Published 23 Jun 2009

I believe you, Daniel.” Daniel had had some experience reading the Phites’ body language directly, and to him Primo seemed reasonably calm. Perhaps when you were as old as he was, and had witnessed so much change, such a revelation was far less of a shock than it would have been to a human at the dawn of the computer age. “You created this world?” Primo asked him. “Yes.” “You shaped our history?” “In part,” Daniel said. “Many things have been down to chance, or to your own choices.” “Did you stop us having children?” Primo demanded. “Yes,” Daniel admitted. “Why?” “There is no room left in the computer.

The Master and His Emissary: The Divided Brain and the Making of the Western World
by Iain McGilchrist
Published 8 Oct 2012

From the analytic point of view, as Steiner says, one has constantly to attempt to ‘jump “outside” and beyond the speaker’s own shadow’.65 One must never also lose sight of the interconnected nature of things, so that Heidegger’s project is in this, too, opposed to Descartes, who limited himself to viewing objects singly: ‘if one tries to look at many objects at one glance, one sees none of them distinctly’.66 Heidegger reached naturally towards metaphor, in which more than one thing is kept implicitly (hiddenly) before the mind, since he valued, unusually for a philosopher, the ambiguity of poetic language. He lamented the awful Eindeutigkeit – literally the ‘one-meaningness’, or explicitness – to which in a computer age we tend: both Wittgenstein and Heidegger, according to Richard Rorty, ‘ended by trying to work out honourable terms on which philosophy might surrender to poetry’.67 Wittgenstein’s work became increasingly apophthegmatic: he repeatedly struggled with the idea that philosophy was not possible outside of poetry.68 And Heidegger ultimately found himself, in his last works, resorting to poetry to convey the complexity and depth of his meaning.

pages: 1,351 words: 385,579

The Better Angels of Our Nature: Why Violence Has Declined
by Steven Pinker
Published 24 Sep 2012

But not in their wildest dreams do they expect the messy data from history to be so well behaved. The data we are looking at come from a ragbag of deadly quarrels ranging from the greatest cataclysm in the history of humanity to a coup d’état in a banana republic, and from the dawn of the Industrial Revolution to the dawn of the computer age. The jaw drops when seeing this mélange of data fall onto a perfect diagonal. Piles of data in which the log of the frequency of a certain kind of entity is proportional to the log of the size of that entity, so that a plot on log-log paper looks like a straight line, are called power-law distributions.51 The name comes from the fact that when you put away the logarithms and go back to the original numbers, the probability of an entity showing up in the data is proportional to the size of that entity raised to some power (which translates visually to the slope of the line in the log-log plot), plus a constant.
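The power-law relationship Pinker describes can be sketched numerically. The following is an illustrative simulation, not his actual conflict data; the exponent `alpha = 2.5` and the binning scheme are assumptions chosen only for the demonstration. It draws samples from a power law p(x) ∝ x^(−alpha) and checks that log-frequency plotted against log-size falls on a straight line with slope ≈ −alpha:

```python
import math
import random

# Illustrative sketch (not Pinker's data): samples from a power law
# p(x) ∝ x^(-alpha) should line up on log-log axes with slope ≈ -alpha.
random.seed(0)
alpha = 2.5   # assumed exponent, for demonstration only
xmin = 1.0

def sample_power_law():
    """Inverse-CDF sampling for a continuous power law on [xmin, inf)."""
    u = random.random()
    return xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))

samples = [sample_power_law() for _ in range(100_000)]

# Logarithmic binning: count samples falling in [2^k, 2^(k+1))
counts = {}
for x in samples:
    k = int(math.log2(x))
    counts[k] = counts.get(k, 0) + 1

# Density per bin = count / bin width; keep only well-populated bins
pts = [(k * math.log(2), math.log(n / 2 ** k))
       for k, n in sorted(counts.items()) if n > 50]

# Least-squares slope of log-density against log-size
mx = sum(x for x, _ in pts) / len(pts)
my = sum(y for _, y in pts) / len(pts)
slope = (sum((x - mx) * (y - my) for x, y in pts)
         / sum((x - mx) ** 2 for x, _ in pts))
print(f"estimated exponent: {-slope:.2f}")  # should land near alpha = 2.5
```

On a log-log plot the binned points would form the straight diagonal the passage describes; the recovered slope is the power in "size raised to some power."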

Germany
by Andrea Schulte-Peevers
Published 17 Oct 2010

Established by the local founder of Nixdorf computers (since swallowed by bigger corporations), it displays calculating machines, typewriters, cash registers, punch-card systems, manual telephone exchanges, accounting machines and other time-tested gadgets, although the heart of the museum clearly belongs to the computer age. Most memorable is the full-scale replica of Eniac, a room-sized vacuum-tube computer developed for the US Army in the 1940s. These days, the data it held would fit onto a barely-there microchip. There are plenty of machines to touch, push and prod as well as computer games and a virtual-reality theatre.

pages: 1,993 words: 478,072

The Boundless Sea: A Human History of the Oceans
by David Abulafia
Published 2 Oct 2019

From the human perspective the late nineteenth century was a period of free movement, another possible indication of globalization, but early in the twentieth century this was challenged by restrictions imposed – not necessarily for economic reasons – by a number of governments, notably that of the United States, which forgot the words written on the Statue of Liberty.3 These arguments, cogent as they are, do depend on a particular definition of globalization, and the new globalization of the years around 2000, based on the astonishing technological achievements of the computer age, undoubtedly has a different character to the globalization visible around 1900. Yet the twentieth century saw a complete transformation in the character of ocean shipping, with the development of cruise lines at the start of the century, the loss of passenger services once jet traffic across the oceans had become safe, and, most importantly, the container revolution, which made it possible to send goods through ports without any need to unload them there, one consequence of which was the rise of new or revived ports such as Rotterdam and Felixstowe and the eclipse of old ones such as Liverpool and London.

pages: 1,799 words: 532,462

The Codebreakers: The Comprehensive History of Secret Communication From Ancient Times to the Internet
by David Kahn
Published 1 Feb 1963

Cryptanalysis still has room—indeed, may have more room than ever before—for flair, intuition, experience, individual brilliance. The computers at N.S.A. are—as they are wherever computers are used—the tools of their operators, not their replacements. They are robot cryptanalysts to a very limited degree. Thus, in the last half of the twentieth century, in the flowering of the computer age, cryptanalysis often comes down to exactly the same problem that four centuries earlier faced the West’s first great cryptanalyst, Giovanni Soro of Venice: Does x stand for a or o? The quality of the systems N.S.A. attacks varies greatly from country to country. Competence in cryptology, as in other fields of endeavor, seems to vary in direct proportion to the technological knowledge and the economic wealth of a country.

pages: 2,466 words: 668,761

Artificial Intelligence: A Modern Approach
by Stuart Russell and Peter Norvig
Published 14 Jul 2019

But various philosophers had raised similar issues long before AI was invented. Maurice Merleau-Ponty’s Phenomenology of Perception (1945) stressed the importance of the body and the subjective interpretation of reality afforded by our senses, and Martin Heidegger’s Being and Time (1927) asked what it means to actually be an agent. In the computer age, Alva Noe (2009) and Andy Clark (2015) propose that our brains form a rather minimal representation of the world, use the world itself on a just-in-time basis to maintain the illusion of a detailed internal model, and use props in the world (such as paper and pencil as well as computers) to increase the capabilities of the mind.