Von Neumann architecture

description: computer architecture

42 results

pages: 210 words: 62,771

Turing's Vision: The Birth of Computer Science
by Chris Bernhardt
Published 12 May 2016

Turing Machines · Examples of Turing Machines · Computable Functions and Calculations · Church-Turing Thesis · Computational Power · Machines That Don’t Halt
5. Other Systems for Computation: The Lambda Calculus · Tag Systems · One-Dimensional Cellular Automata
6. Encodings and the Universal Machine: A Method of Encoding Finite Automata · Universal Machines · Construction of Universal Machines · Modern Computers Are Universal Machines · Von Neumann Architecture · Random Access Machines · RAMs Can Be Emulated by Turing Machines · Other Universal Machines · What Happens When We Input 〈M〉 into M?
7. Undecidable Problems: Proof by Contradiction · Russell’s Barber · Finite Automata That Do Not Accept Their Encodings · Turing Machines That Do Not Accept Their Encodings · Does a Turing Machine Diverge on Its Encoding?

That we can simulate Turing machines on modern computers is not surprising. What is surprising is that we can design a Turing machine to simulate a modern computer, showing that Turing machines are equivalent in computing power to modern computers. We will sketch how this is done. The first step is to get a concrete description of the modern computer.

Von Neumann Architecture

Later we will talk more about John von Neumann, but it is important to know a few facts before we proceed. The First Draft of a Report on the EDVAC is probably the most important paper on the design of modern computers. It was written in 1945, as the first electronic computers were being built.

His machines were theoretical constructs meant to incorporate the basic computational steps of human computers. Von Neumann was interested in building a physical machine. His focus was not on the theory of computation, but on the design of an efficient machine for doing actual computations. The resulting design outlined in the report is often referred to as von Neumann architecture, and most modern computers are based on this architecture. Von Neumann’s design built on the ideas of many people. The First Draft, as its name suggests, was a draft of a paper, and it was only meant to be circulated to a small number of people. The fact that von Neumann was listed as the sole author and that other people’s work was not properly credited would not have been a problem if the readership had been restricted, as originally intended, to just a few colleagues, but the First Draft was widely circulated and became enormously influential in the design of all subsequent computers.

pages: 463 words: 118,936

Darwin Among the Machines
by George Dyson
Published 28 Mar 2012

Turing’s automatic computing engine, like Babbage’s analytical engine, was never built. Turing’s proposal “synthesized the concepts of a stored-program universal computer, a floating-point subroutine library, artificial intelligence, details such as a hardware bootstrap loader, and much else.”36 At a time when no such machines were in existence and the von Neumann architecture had only just been proposed, Turing produced a complete description of a million-cycle-per-second computer that foreshadowed the RISC (Reduced Instruction Set Computer) architecture that has now gained prominence after fifty years. The report was accompanied by circuit diagrams, a detailed physical and logical analysis of the internal storage system, sample programs, detailed (if bug-ridden) subroutines, and even an estimated (if unrealistic) cost of £11,200.

Progress reports were disseminated not only among the participating funding agencies and to a half-dozen groups that were duplicating the IAS design, but to any location where the potential of high-speed digital computers might fall on fertile ground. It is no accident that the vast majority of computers in circulation today follow the von Neumann architecture—characterized by a central processing unit operating in parallel on the multiple bits of one word of data at a time, a hierarchical memory ranging from fast but limited random-access memory to slow but unlimited media, such as floppy disks or tape, and a distinction between hardware and software that enabled robust computers (and a robust computer industry) to advance by a leapfrog process with each element evolving freely on its own.

“Quite often the likelihood of getting actual numerical results was very much larger if he was not in the computer room, because everybody got so nervous when he was there,” reported Martin Schwarzschild. “But when you were in real thinking trouble, you would go to von Neumann and nobody else.”43 Von Neumann’s reputation, after fifty years, has been injured less by his critics than by his own success. The astounding proliferation of the von Neumann architecture has obscured von Neumann’s contributions to massively parallel computing, distributed information processing, evolutionary computation, and neural nets. Because his deathbed notes for his canceled Silliman lectures at Yale were published posthumously (and for a popular audience) as The Computer and the Brain (1958), von Neumann’s work has been associated with the claims of those who were exaggerating the analogies between the digital computer and the brain.

When Computers Can Think: The Artificial Intelligence Singularity
by Anthony Berglas , William Black , Samantha Thalind , Max Scratchmann and Michelle Estes
Published 28 Feb 2015

Then in 1948 ENIAC was modified to have what is essentially a von Neumann architecture. This made it much easier to program. However, it also made the computer six times slower than it had been previously because it could now only execute one instruction at a time. Even on that ancient computer that ran thousands of times slower than modern computers, the trade-off was considered worthwhile. Being easy to program was and is generally far more important than being very efficient. Today there are variations of the basic von Neumann architecture. Graphics Processing Units (GPUs) contain hundreds of von Neumann subsystems that can compute at the same time and so render complex scenes in real time.

Reasoning about program logic · Automating program generation · High-level models · Learning first order concepts · Evolutionary algorithms · Artificial life · Evolutionary programming
Computer Hardware: Introduction · Transistors · Logic Elements · Programmable Logic Arrays · Von Neumann Architecture · PLAs vs von Neumann · Analog Computers · Neurons
Brains: Gross anatomy · Neocortex · Brain activity · Brain function and size · Brain simulation · Worms
13. Computational Neuroscience: Neurons · Neuron synapse · Integrate and fire (IF) neurons · Hebbian learning · Plasticity · …

Neurons are also relatively slow, with only roughly 200 firings per second, so they have to work concurrently to produce results in a timely manner. On the other hand, ordinary personal computers might contain 4 billion bytes of fast memory, and several thousand billion bytes of slower disk storage. Unlike a neuron, a byte of computer memory is passive, and a conventional “von Neumann” architecture can only process a few dozen bytes at any one time. That said, the computer can perform several billion operations per second, which is millions of times faster than neurons. Specialized hardware and advanced architectures can perform many operations simultaneously, but we also know from experience that it is difficult to write highly concurrent programs that utilize that hardware efficiently.
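
The trade the excerpt describes, slow but massively parallel neurons versus fast but largely serial silicon, can be made concrete with back-of-the-envelope arithmetic. Below is a minimal C sketch; the neuron count and the serial operation rate are illustrative assumptions, not figures from the excerpt.

    /* Back-of-the-envelope comparison of aggregate event rates.
       Assumed (not from the excerpt): ~86e9 neurons in a human brain,
       and a CPU retiring ~3e9 operations per second serially. */
    #include <stdio.h>

    int main(void) {
        double neurons       = 86e9;  /* assumed neuron count */
        double firings_per_s = 200;   /* firing rate cited in the excerpt */
        double cpu_ops_per_s = 3e9;   /* assumed serial instruction rate */

        double brain_events_per_s = neurons * firings_per_s; /* ~1.7e13 */
        printf("brain: %.1e events/s  cpu: %.1e ops/s  ratio: %.0fx\n",
               brain_events_per_s, cpu_ops_per_s,
               brain_events_per_s / cpu_ops_per_s);
        return 0;
    }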

pages: 118 words: 35,663

Smart Machines: IBM's Watson and the Era of Cognitive Computing (Columbia Business School Publishing)
by John E. Kelly Iii
Published 23 Sep 2013

A cognitive computer employing these systems will respond to inquiries more quickly than today’s computers; less data movement will be required and less energy will be used. Today’s von Neumann–style computing won’t go away when cognitive systems come online. New chip and computing technologies will extend its life far into the future. In many cases, the cognitive architecture and the von Neumann architecture will be employed side by side in hybrid systems. Traditional computing will become ever more capable while cognitive technologies will do things that were not possible before. Already, cloud, social networking, mobile, and new ways to interact with computing from tablets to glasses are fueling the desire for cognitive systems that will, for example, both harvest insights from social networks and enhance our experiences within them.

He concluded that while it was futile in the short term to try to invent a new technology for a cognitive machine, that didn’t mean the project should be abandoned. Instead, the team needed to refocus on CMOS chip technology and on digital circuitry rather than analog circuitry. They would create an entirely new non–von Neumann architecture in both silicon and software that would simulate the functions of neurons and synapses. Using that architecture, they would produce chips for sense-making tasks that would be vastly more efficient than today’s standard digital processors.7 Dharmendra called a meeting of all of the participants in the project.

Computing intelligence will be too costly to be practical. Scientists at IBM Research believe that to make computing sustainable in the era of big data, we will need a different kind of machine—the data-centric computer. Today’s computers are processor-centric. The microprocessor, which is the central processing unit in the von Neumann architecture, is where much of the action happens in computing. Working hand in hand with the operating system, the microprocessor sends out instructions to various components within the computer, requesting data from where it’s stored, including memory chips and disk drives. If the computer is part of a larger network, the processor fetches data from storage systems located out on the network.

pages: 253 words: 80,074

The Man Who Invented the Computer
by Jane Smiley
Published 18 Oct 2010

Partisans of von Neumann make the case that, as with everything else von Neumann did, he took the raw material of another man’s ideas and immediately transcended it, or, as Macrae says, “Johnny grabbed other people’s ideas, then by his clarity leapt five blocks ahead of them, and helped put them into practical effect.” The most important contribution of the “First Draft” to computer design was that it laid out what came to be known as “von Neumann architecture”—that is, that the computer could contain a set of instructions in its memory like the set of instructions that Turing’s human “computer” would have been given and would have to follow day after day forever. The instructions would be stored in the memory, which the electronic computer could readily access (not like a paper tape or a deck of punch cards).

Flowers promised the machine by August, but postwar repairs and improvements to the telephone system superseded the project, and by February 1947 the ACE was going nowhere because Turing could not persuade Womersley to commit himself to Turing’s ideas—for example, an engineering department was set up, but made no progress. Possibly, Womersley was the sort of administrator who thinks contradictory ideas constitute a backup plan, but in the end they constituted no plan at all because what had come to be called “von Neumann architecture”—the principles of computer design set out in the “First Draft”—was simply taking over by coming to seem tried and tested.3 Turing quit. In the autumn of 1947, he returned to Cambridge.
1. One reason that Zuse’s autobiography is interesting is that it gives Americans a perspective on life in Nazi Germany that we rarely get.

Eckert was still with Sperry Rand (he stayed with Sperry, and then Unisys, until 1989). Neither Mauchly nor Eckert had profited directly from the ENIAC patent, but they did get credit (and they did seek that credit) for inventing the computer. Eckert, in particular, was vocal about the inaccuracy of the phrase “von Neumann architecture”—he thought it should be called “Eckert architecture.” But the vagaries of patent law and the delay in awarding the Eckert and Mauchly patents seemed to be working for Sperry. If the patent had been awarded in 1947, it would have run out by 1964, before computers became big business. However, in 1960, the patent was still being challenged.

pages: 566 words: 122,184

Code: The Hidden Language of Computer Hardware and Software
by Charles Petzold
Published 28 Sep 1999

These instructions should be sequential in memory and addressed with a program counter but should also allow conditional jumps. This design came to be known as the stored-program concept. These design decisions were such an important evolutionary step that today we speak of von Neumann architecture. The computer that we built in the last chapter was a classic von Neumann machine. But with von Neumann architecture comes the von Neumann bottleneck. A von Neumann machine generally spends a significant amount of time just fetching instructions from memory in preparation for executing them. You'll recall that the final design of the Chapter 17 computer required that three-quarters of the time it spent on each instruction be involved in the instruction fetch.
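
Petzold's point about the bottleneck shows up in the shape of any von Neumann interpreter loop. Below is a minimal sketch, not the Chapter 17 machine itself: the opcodes and memory layout are invented, but the structure shows why every instruction starts with one or more trips to memory.

    /* Toy fetch-decode-execute loop over a single unified memory.
       Opcodes and layout are invented for illustration; note that
       each iteration begins with memory fetches before any useful
       work happens -- this is the von Neumann bottleneck. */
    #include <stdio.h>

    enum { HALT = 0, LOAD = 1, ADD = 2, STORE = 3 };

    int main(void) {
        /* Instructions and data share one memory array. */
        int mem[16] = { LOAD, 12, ADD, 13, STORE, 14, HALT,
                        0, 0, 0, 0, 0, 5, 7, 0, 0 };
        int pc = 0, acc = 0;

        for (;;) {
            int op = mem[pc++];           /* fetch opcode (memory access) */
            if (op == HALT) break;
            int addr = mem[pc++];         /* fetch operand (memory access) */
            switch (op) {                 /* decode and execute */
                case LOAD:  acc = mem[addr];  break;
                case ADD:   acc += mem[addr]; break;
                case STORE: mem[addr] = acc;  break;
            }
        }
        printf("mem[14] = %d\n", mem[14]); /* prints 12 (5 + 7) */
        return 0;
    }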

It makes it sound as if computer architecture were patterned after an office. The truth is that the distinction between memory and storage is an artificial one and exists solely because we don't have a single storage medium that is both fast and vast as well as nonvolatile. What we know today as "von Neumann architecture"—the dominant computer architecture for over 50 years—is a direct result of this technical deficiency. Here's another question that someone once asked me: "Why can't you run Macintosh programs under Windows?" My mouth opened to begin an answer when I realized that it involved many more technical issues than I'm sure my questioner was prepared to deal with in one sitting.

Because C has operations that parallel many common processor instructions, C is sometimes categorized as a high-level assembly language. More than any ALGOL-like language, C closely mimics common processor instruction sets. Yet all ALGOL-like languages—which really means most commonly used programming languages—were designed based on von Neumann architecture computers. Breaking out of the von Neumann mind-set when designing a computer language isn't easy, and getting other people to use such a language is even harder. One such non–von Neumann language is LISP (which stands for List Processing), which was designed by John McCarthy in the late 1950s and is useful for work in the field of artificial intelligence.

Turing's Cathedral
by George Dyson
Published 6 Mar 2012

The “First Draft of a Report on the EDVAC,” reproduced by mimeograph and released into limited distribution by the Moore School on June 30, 1945, outlined the design of a high-speed stored-program electronic digital computer, including the requisite formulation and interpretation of coded instructions—“which must be given to the device in absolutely exhaustive detail.”42 The functional elements of the computer were separated into a hierarchical memory, a control organ, a central arithmetic unit, and input/output channels, making distinctions still known as the “von Neumann architecture” today. A fast internal memory, coupled to a larger secondary memory, and linked in turn to an unlimited supply of punched cards or paper tape, gave the unbounded storage that Turing had prescribed. The impediment of a single channel between memory and processor is memorialized as the “von Neumann bottleneck,” although its namesake attempted, unsuccessfully, to nip this in the bud.

However, it is believed to be not far from an important central truth, that highly recursive, conditional and repetitive routines are used because they are notationally efficient (but not necessarily unique) as descriptions of underlying processes.”40 Bigelow questioned the persistence of the von Neumann architecture and challenged the central dogma of digital computing: that without programmers, computers cannot compute. He (and von Neumann) had speculated from the very beginning about “the possibility of causing various elementary pieces of information situated in the cells of a large array (say, of memory) to enter into a computation process without explicitly generating a coordinate address in ‘machine-space’ for selecting them out of the array.”41 Biology has been doing this all along.

pages: 339 words: 94,769

Possible Minds: Twenty-Five Ways of Looking at AI
by John Brockman
Published 19 Feb 2019

A second foundational piece of work was in a 1945 “First Draft” report on the design for a digital computer, wherein von Neumann advocated for a memory that could contain both instructions and data.* This is now known as a von Neumann architecture computer—as distinct from a Harvard architecture computer, where there are two separate memories, one for instructions and one for data. The vast majority of computer chips built in the era of Moore’s Law are based on the von Neumann architecture, including those powering our data centers, our laptops, and our smartphones. Von Neumann’s digital-computer architecture is conceptually the same generalization—from early digital computers constructed with electromagnetic relays at both Harvard University and Bletchley Park—that occurs in going from a special-purpose Turing Machine to a Universal Turing Machine.
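
The distinction the passage draws can be stated in a few lines of C. This is only a schematic sketch; the type names and memory sizes are invented for illustration.

    /* Schematic contrast between the two memory organizations. */
    typedef struct {
        unsigned char mem[65536];   /* von Neumann: one address space
                                       holds both instructions and data */
    } VonNeumannMachine;

    typedef struct {
        unsigned char imem[32768];  /* Harvard: instructions live here... */
        unsigned char dmem[32768];  /* ...and data in a separate memory */
    } HarvardMachine;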

In the fifty-year Moore’s Law–fueled race to produce software that could exploit the doubling of computer capability every two years, the typical care and certification of engineering disciplines were thrown by the wayside. Software engineering was fast and prone to failures. This rapid development of software without standards of correctness has opened up many routes to exploit von Neumann architecture’s storage of data and instructions in the same memory. One of the most common routes, known as “buffer overrun,” involves an input number (or long string of characters) that is bigger than the programmer expected and overflows into where the instructions are stored. By carefully designing an input number that is too big by far, someone using a piece of software can infect it with instructions not intended by the programmer, and thus change what it does.
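
A minimal C sketch of the kind of unchecked copy the passage describes (illustrative only; the buffer size and function names are invented): when the input is longer than the buffer, the excess bytes overwrite adjacent memory, which on a conventional von Neumann machine can include the saved return address that decides what code runs next.

    /* Classic buffer-overrun pattern (do not use in real code).
       strcpy performs no length check, so input longer than 16
       bytes spills past buf into neighboring stack memory. */
    #include <string.h>

    void vulnerable(const char *input) {
        char buf[16];           /* programmer expected short input */
        strcpy(buf, input);     /* unchecked copy: the overrun */
    }

    /* A bounds-checked alternative closes the hole: */
    void safer(const char *input) {
        char buf[16];
        strncpy(buf, input, sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';
    }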

pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence
by George Zarkadakis
Published 7 Mar 2016

Von Neumann was fascinated by the design of ENIAC, and wondered how the computer might be easily reprogrammed to perform a different set of operations – not involving artillery ballistics this time, but to predict the results of a hydrogen bomb explosion. Invited by the team that developed ENIAC to advise them, von Neumann produced a landmark report,7 which described a machine that could store both data and programs.8 The ‘von Neumann architecture’ – as it has since been known – demonstrated how computers could be reprogrammed easily. Until then computers had fixed programs, and had to be physically rewired in order to be reprogrammed. Von Neumann’s architecture allowed code in a computer to modify itself. One could thus write programs that write programs, an idea that makes possible the host of automated tools that computer engineers have nowadays at their disposal, such as assemblers and compilers.
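
Because instructions are stored as ordinary data, a program can emit another program, which is the idea behind the assemblers and compilers the passage mentions. A toy sketch in C follows; the output file name is invented.

    /* "Programs that write programs": code is just data, so this
       program writes the source of another program to disk. */
    #include <stdio.h>

    int main(void) {
        FILE *f = fopen("generated.c", "w");
        if (!f) return 1;
        fprintf(f, "#include <stdio.h>\n");
        fprintf(f, "int main(void) {\n");
        fprintf(f, "    printf(\"hello from generated code\\n\");\n");
        fprintf(f, "    return 0;\n");
        fprintf(f, "}\n");
        fclose(f);
        return 0;
    }

Compiling the output with any C compiler (for example, cc generated.c) then yields a runnable program that this program wrote.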

This is a very hard question to answer, since we do not yet have a way to collect credible evidence.26 Nevertheless, I personally would be inclined to bet that the spontaneous emergence of self-awareness in current technological cyberspace is highly improbable. Since the 1940s, we have been locked in a specific approach to computer technology that separates hardware from software, and which is mostly based on a specific hardware architecture called the ‘von Neumann architecture’, as we saw in the previous chapter. There could have been many other paths we could have taken in computer evolution (for instance advanced analogue computers), but we did not. The word ‘evolution’ is of great importance here. The pseudo-cybernetic assumption of the AI Singularity hypothesis essentially claims that an evolutionary kind of emergence of self-awareness is involved.

…, McAfee A. (2014), The Second Machine Age. New York: W.W. Norton & Co.
13. In Turing’s description the tape with the symbols (the ‘data’) is separate from the table of instructions (the ‘program’). In modern computers data and programs are stored in the same storage, a key insight that is part of the ‘von Neumann architecture’.
14. According to historians Robert Friedel and Paul Israel, at least twenty-two other inventors ‘discovered’ the incandescent lamp prior to Thomas Edison. However, it was Edison who developed the lamp into an effective source of electric lighting by selecting an effective incandescent material, achieving a higher vacuum and using a higher-resistance filament.
15. Konrad Zuse invented the world’s first programmable computer, the Z3, which became operational in May 1941.
14 From Bletchley Park to Google Campus
1. ‘Global Information Report 2013’, World Economic Forum (www.weforum.com).
2. This is a phrase from Greek philosopher Heraclitus (535–475 BC).

pages: 352 words: 120,202

Tools for Thought: The History and Future of Mind-Expanding Technology
by Howard Rheingold
Published 14 May 2000

He was one of history’s most brilliant physicists, logicians, and mathematicians, as well as the software genius who invented the first electronic digital computer. John von Neumann was the center of the group who created the "stored program" concept that made truly powerful computers possible, and he specified a template that is still used to design almost all computers--the "von Neumann architecture." When he died, the Secretaries of Defense, the Army, the Air Force, and the Navy, and the Joint Chiefs of Staff were all gathered around his bed, attentive to his last gasps of technical and policy advice. Norbert Wiener, raised to be a prodigy, graduated from Tufts at fourteen, earned his Ph.D. from Harvard at eighteen, and studied with Bertrand Russell at nineteen.

All such machines, the authors of the "Preliminary Report" declared, must have a unit where arithmetic and logical operations can be performed (the processing unit where actual calculation takes place, equivalent to Babbage's "mill"), a unit where instructions and data for the current problem can be stored (like Babbage's "store," a kind of temporary memory device), a unit that executes the instructions according to the specified sequential order (like the "read/write head" of Turing's theoretical machine), and a unit where the human operator can enter raw information or see the computed output (what we now call "input-output devices"). Any machine that adheres to these principles -- no matter what physical technology is used to implement these logical functions -- is an example of what has become known as "the von Neumann architecture." It doesn't matter whether you build such a machine out of gears and springs, vacuum tubes, or transistors, as long as its operations follow this logical sequence. This theoretical template was first implemented in the United States at the Institute for Advanced Study. Modified copies of the IAS machine were made for the Rand Corporation, an Air Force spinoff "think tank" that was responsible for keeping track of targets for the nation's new but fast-growing nuclear armory, and for the Los Alamos Laboratory.

“He reminds me of Moses parting the Red Sea,” is the way Alan Kay describes Engelbart’s gentle charisma. Of course, the original Moses never set foot in the Promised Land. And he never had the reputation of being an easy man to work with. In 1951, Engelbart quit his job at Ames and went to graduate school at the University of California at Berkeley, where one of the first von Neumann architecture computers was being built. That was when he began to notice that not only didn’t people know what he was talking about, but some presumably “objective” scientists were overly hostile. He started saying the wrong things to people who could affect his career, things that simply sounded strange to the other electrical engineers.

pages: 239 words: 70,206

Data-Ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else
by Steve Lohr
Published 10 Mar 2015

The big-data era is the next evolutionary upheaval in the landscape of computing. The things people want to do with data, like real-time analysis of data streams or continuously running machine-learning software, pose a threat to the traditional computer industry. Conventional computing—the Von Neumann architecture, named for mathematician and computer scientist John von Neumann—operates according to discrete steps of program, store, and process. Major companies and markets were built around those tiers of computing—software, disk drives, and microprocessors, respectively. Modern data computing, according to John Kelly, IBM’s senior vice president in charge of research, will “completely disrupt the industry as we know it, creating new platforms and players.”

pages: 476 words: 121,460

The Man From the Future: The Visionary Life of John Von Neumann
by Ananyo Bhattacharya
Published 6 Oct 2021

He congratulated von Neumann for providing the first ‘complete logical framework for the machine’ and contrasted the streamlined design with the ENIAC, which was ‘chuck full of gadgets that have as their only raison d’etre that they appealed to John Mauchly’.43 Computer designers now refer to the whole configuration as the ‘von Neumann architecture’, and nearly all computers in use today – smartphones, laptops, desktops – are built according to its precepts. The design’s fundamental drawback, now called the ‘von Neumann bottleneck’, is that instructions or data have to be found and fetched serially from memory – like standing in a line, and being able to pass messages only forwards or backwards.

How, then, do brains far more mundane than von Neumann’s accomplish incredible feats that defeat today’s most sophisticated computers like, for example, making up an amusing pun? The answer is that neurons do not fire one after the other, but do their work simultaneously: they are not serial, like von Neumann architecture computers, but parallel – massively so. It was a lasting insight. The artificial neural networks that power today’s best-performing artificial intelligence systems, like those of Google’s DeepMind, are also a kind of parallel processor: they seem to ‘learn’ in a somewhat similar way to the human brain – altering the various weights of each artificial neuron until they can perform a particular task.

pages: 294 words: 96,661

The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity
by Byron Reese
Published 23 Apr 2018

Who could have guessed that such a humble little device could do all that? Well, Turing could, of course. But no one else seems to have had that singular idea. Exit Turing. Enter John von Neumann, whom we call the father of modern computing. In 1945, he developed the von Neumann architecture for computers. While Turing machines are purely theoretical, designed to frame the question of what computers can do, the von Neumann architecture is about how to build actual computers. He suggested an internal processor and computer memory that holds both programs and data. In addition to the computer’s memory, there might also be external storage to hold data and information not currently needed.

pages: 339 words: 92,785

I, Warbot: The Dawn of Artificially Intelligent Conflict
by Kenneth Payne
Published 16 Jun 2021

He demonstrated that logical propositions, like AND/OR and IF/THEN, and binary numbers could be implemented by switches in a telephone circuit.13 Switches, binary code and Boolean logic soon became part of the basic architecture of modern digital computing. Another significant figure early in the history of computing was the charismatic American, John von Neumann.14 His contribution is rather less clear-cut; others can also lay claim to what has become known as the von Neumann architecture for modern computers. By this stage, spurred by the war and the problems of fire control, communications and code-breaking, computing was a large and rapidly expanding field. Still, von Neumann, a brilliant mathematician and consummate committee man, certainly played an outsized part in the story of the computer, advocating a design where the instructions for the computer were separate from the machinery of computing (rather like Turing’s logic processor and ticker tape).

In Turing’s example, the ‘software’ program, inscribed on the tape, did a huge amount of heavy lifting. The hardware itself was pretty basic. But the balance could shift, and the boundary wasn’t always clear cut, as with the idea of ‘firmware’, semi-permanent software that helps run the machinery, which in turn implements the software. And there was another enduring design feature in the von Neumann architecture—memory, on which to store the program and information that the computer would use in its calculations. In Turing’s machine, the tape itself served as the memory. In the real world, creating memory was an engineering challenge, initially solved using cathode tubes appropriated from television screens—discrete ‘bits’ of information would circulate in the tube, being read off the screen with a foil covering and being looped back into the tube—in other words, being ‘held’ in memory.

pages: 346 words: 97,890

The Road to Conscious Machines
by Michael Wooldridge
Published 2 Nov 2018

Across the Atlantic in Pennsylvania, a team led by John Mauchly and J. Presper Eckert developed a machine called ENIAC to compute artillery tables. With some tweaks by the brilliant Hungarian mathematician John von Neumann, ENIAC established the fundamental architecture of the modern computer (the architecture of conventional computers is called the Von Neumann architecture, in his honour). Over in post-war England, Fred Williams and Tom Kilburn built the Manchester Baby, which led directly to the world’s first commercial computer, the Ferranti Mark 1 – Turing himself joined the staff of Manchester University in 1948, and wrote some of the first programs to run on it.

The Deep Learning Revolution (The MIT Press)
by Terrence J. Sejnowski
Published 27 Sep 2018

But there was one problem: coming to equilibrium and collecting statistics became increasingly slow to simulate, and larger networks took much longer to reach equilibrium. In principle, it is possible to build a computer with a massively parallel architecture that is much faster than one with a traditional von Neumann architecture that makes one update at a time. Digital computers in the 1980s could perform only a million operations per second. Today’s computers perform billions of operations per second, and, by linking together many thousands of cores, high-performance computers are a million times faster than before—an unprecedented increase in technological power.

The race is on to design and build a new generation of chips to run learning algorithms, whether deep, reinforcement, or other, thousands of times faster and more efficiently than the way they are now simulated on general-purpose computers. The new very large-scale integration (VLSI) chips have parallel processing architectures, with memory onboard to alleviate the bottleneck between memory and the central processing unit (CPU) in the sequential von Neumann architectures that have dominated computing for the last fifty years. We are still in an exploratory phase with regard to hardware, and each type of special-purpose VLSI chip has different strengths and limitations. Massive amounts of computer power will be needed to run the large-scale networks that are being developed for AI applications, and there is tremendous potential for profit in building efficient hardware.

pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All
by Robert Elliott Smith
Published 26 Jun 2019

The others include devising a way to make ENIAC (arguably the world’s first real computer) programmable; making substantial contributions to quantum physics6 and equilibrium theories in economics; and inventing game theory, an area of mathematical research which shaped Cold War politics for a generation through his descriptions of a game-theoretic construct he called “mutually assured destruction.” Inspired by Turing’s papers on computation, von Neumann also came up with the modern conception of Babbage’s ‘store’ and ‘mill’ computer structure, in what is now called the ‘von Neumann architecture’, the architecture at the heart of almost all modern computers. Amongst this world-changing productivity, von Neumann also speculated about how computer programs, like genetic organisms, might be able to self-replicate. His ‘cellular automata’ theory closely parallels the actual replication methods of biological DNA, despite the fact that von Neumann’s work was done in advance of the actual structure of DNA being discovered by Watson and Crick in 1953.7 At the same time, statistician George Box suggested ‘evolutionary operations’8 as a methodology for optimizing industrial processes in the late 1950s, though he never implemented the procedure as a computer algorithm, and there are a number of other scientists who also struck close to the ideas that would eventually emerge as evolutionary computation.

pages: 420 words: 119,928

The Three-Body Problem (Remembrance of Earth's Past)
by Cixin Liu
Published 11 Nov 2014

In the future, any malfunctions will be dealt with the same way!” Von Neumann glanced at Newton, disgusted. They watched as a few riders dashed into the motherboard with their swords unsheathed. After they “repaired” the faulty component, the order to restart was given. This time, the operation went very smoothly. Twenty minutes later, Three Body’s Von Neumann architecture human-formation computer had begun full operations under the Qin 1.0 operating system. “Run solar orbit computation software ‘Three Body 1.0’!” Newton screamed at the top of his lungs. “Start the master computing module! Load the differential calculus module! Load the finite element analysis module!

Against the background of the three suns in syzygy, text appeared: Civilization Number 184 was destroyed by the stacked gravitational attractions of a tri-solar syzygy. This civilization had advanced to the Scientific Revolution and the Industrial Revolution. In this civilization, Newton established nonrelativistic classical mechanics. At the same time, due to the invention of calculus and the Von Neumann architecture computer, the foundation was set for the quantitative mathematical analysis of the motion of three bodies. After a long time, life and civilization will begin once more, and progress through the unpredictable world of Three Body. We invite you to log on again. * * * Just as Wang logged out of the game, a stranger called.

pages: 720 words: 197,129

The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution
by Walter Isaacson
Published 6 Oct 2014

To facilitate this, von Neumann came up with a variable-address program language that enabled an easy switch to substitute instructions while the program was running. The team at Penn proposed to the Army that a new and improved ENIAC be built along these lines. It would be binary rather than decimal, use mercury delay lines for memory, and include much, though not all, of what became known as “von Neumann architecture.” In the original proposal to the Army, this new machine was called the Electronic Discrete Variable Automatic Calculator. Increasingly, however, the team started referring to it as a computer, because it would do so much more than merely calculate. Not that it mattered. Everyone simply called it EDVAC.

Watson was a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.” But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history.

Dyson, Turing’s Cathedral, 1957. See also Aspray, John von Neumann and the Origins of Modern Computing.
62. Eckert oral history, Charles Babbage Institute. See also McCartney, ENIAC, 125, quoting Eckert: “We were clearly suckered by John von Neumann, who succeeded in some circles at getting my ideas called the ‘von Neumann architecture.’”
63. Jennings Bartik, Pioneer Programmer, 518.
64. Charles Duhigg and Steve Lohr, “The Patent, Used as a Sword,” New York Times, Oct. 7, 2012.
65. McCartney, ENIAC, 103.
66. C. Dianne Martin, “ENIAC: The Press Conference That Shook the World,” IEEE Technology and Society, Dec. 1995.

pages: 481 words: 125,946

What to Think About Machines That Think: Today's Leading Thinkers on the Age of Machine Intelligence
by John Brockman
Published 5 Oct 2015

What distinguishes natural from artificial intelligence is not what it is but only how it’s made. Of course, that little word only is doing some heavy lifting here. Brains use a highly parallel architecture and mobilize many noisy analog units (i.e., neurons) firing simultaneously, while most computers use von Neumann architecture, with serial operation of much faster digital units. These distinctions are blurring, however, from both ends. Neural-net architectures are built in silicon, and brains interact ever more seamlessly with external digital organs. Already I feel that my laptop is an extension of my self—in particular, it is a repository for both visual and narrative memory, a sensory portal into the outside world, and a big part of my mathematical digestive system.

This is far from obvious; we lack any data, either way. I personally think that consciousness is vastly more complex than is currently assumed by the “experts.” A human being is not merely x number of axons and synapses, and we have no reason to assume that we can count our flops-per-second in a plain Von Neumann architecture, reach a certain number, and suddenly out pops a thinking machine. If true consciousness can emerge, let’s be clear what that could entail. If the machine is truly aware, it will, by definition, develop a “personality.” It may be irascible, flirtatious, maybe the ultimate know-it-all, possibly incredibly full of itself.

pages: 211 words: 57,618

Quantum Computing for Everyone
by Chris Bernhardt
Published 19 Mar 2019

There is nothing special about this. Exactly the same argument can be used whatever value is chosen for this bit. The inability to clone a qubit has many important consequences. We want to be able to back up files and send copies of files to other people. Copying is ubiquitous. Our everyday computers are based on von Neumann architecture, which relies heavily on the ability to copy. When we run a program we are always copying bits from one place to another. In quantum computing this is not possible for general qubits. So, if programmable quantum computers are designed, they will not be based on our current architecture.
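The linearity argument the passage refers to can be written out explicitly; here is a minimal sketch in standard notation (not the book's own derivation).

```latex
% Suppose a single unitary U could copy an arbitrary qubit into a
% blank register: U(|psi>|0>) = |psi>|psi> for every |psi>.
% Applied to the basis states, that demands
\[
U\bigl(|0\rangle|0\rangle\bigr) = |0\rangle|0\rangle, \qquad
U\bigl(|1\rangle|0\rangle\bigr) = |1\rangle|1\rangle .
\]
% But U is linear, so for the superposition |+> = (|0> + |1>)/sqrt(2):
\[
U\bigl(|{+}\rangle|0\rangle\bigr)
  = \tfrac{1}{\sqrt{2}}\bigl(|0\rangle|0\rangle + |1\rangle|1\rangle\bigr),
\]
% whereas a faithful copy would have to be
\[
|{+}\rangle|{+}\rangle
  = \tfrac{1}{2}\bigl(|0\rangle|0\rangle + |0\rangle|1\rangle
                    + |1\rangle|0\rangle + |1\rangle|1\rangle\bigr).
\]
% The two states differ, so no such U exists: qubits cannot be cloned.
```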

pages: 894 words: 190,485

Write Great Code, Volume 1
by Randall Hyde
Published 6 Aug 2012

Writing great code requires a strong knowledge of the computer’s architecture.

6.1 The Basic System Components

The basic operational design of a computer system is called its architecture. John von Neumann, a pioneer in computer design, is given credit for the principal architecture in use today. For example, the 80x86 family uses the von Neumann architecture (VNA). A typical von Neumann system has three major components: the central processing unit (CPU), memory, and input/output (I/O), as shown in Figure 6-1.

[Figure 6-1: Typical von Neumann machine]

In VNA machines, like the 80x86, the CPU is where all the action takes place. All computations occur within the CPU.
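A minimal sketch of those components in code (the opcodes and memory layout are invented for illustration, not taken from the book): one array serves as the unified memory for both program and data, and a loop plays the role of the CPU's fetch-decode-execute cycle.

```c
/* Toy von Neumann machine: one memory array holds both the program
 * (cells 0..9) and the data (cells 10..12). The CPU is a loop that
 * fetches, decodes, and executes. Opcodes here are invented. */
#include <stdio.h>

enum { HALT = 0, LOAD = 1, ADD = 2, STORE = 3, PRINT = 4 };

int main(void) {
    int mem[32] = {
        LOAD,  10,   /* acc = mem[10]       */
        ADD,   11,   /* acc += mem[11]      */
        STORE, 12,   /* mem[12] = acc       */
        PRINT, 12,   /* print mem[12]       */
        HALT,  0,
        /* data region starts at cell 10 */
        40, 2, 0
    };

    int pc = 0;   /* program counter */
    int acc = 0;  /* accumulator     */

    for (;;) {
        int op  = mem[pc];      /* fetch: instructions come from the */
        int arg = mem[pc + 1];  /* same memory that holds the data   */
        pc += 2;

        switch (op) {           /* decode and execute */
        case LOAD:  acc = mem[arg];           break;
        case ADD:   acc += mem[arg];          break;
        case STORE: mem[arg] = acc;           break;
        case PRINT: printf("%d\n", mem[arg]); break;
        case HALT:  return 0;
        }
    }
}
```

The defining property of the architecture is visible in the code: a STORE could overwrite cells 0 through 9 just as easily as cell 12, because instructions and data share one address space.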

The extra pins needed on the processor to support two physically separate buses increase the cost of the processor and introduce many other engineering problems. However, microprocessor designers have discovered that they can obtain many benefits of the Harvard architecture with few of the disadvantages by using separate on-chip caches for data and instructions. Advanced CPUs use an internal Harvard architecture and an external von Neumann architecture. Figure 9-9 shows the structure of the 80x86 with separate data and instruction caches. Each path between the sections inside the CPU represents an independent bus, and data can flow on all paths concurrently. This means that the prefetch queue can be pulling instruction opcodes from the instruction cache while the execution unit is writing data to the data cache.
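For contrast, here is a Harvard-style version of the same toy machine (again invented, not the book's code): with instructions and data in physically separate memories, the two access paths are independent, and a store can never touch code.

```c
/* Toy Harvard machine: imem holds instructions, dmem holds data.
 * The fetch path (imem) and the data path (dmem) are separate, so
 * in hardware they could be accessed concurrently. Invented opcodes. */
#include <stdio.h>

enum { HALT, LOAD, ADD, STORE, PRINT };

int main(void) {
    int imem[] = { LOAD, 0, ADD, 1, STORE, 2, PRINT, 2, HALT, 0 };
    int dmem[] = { 40, 2, 0 };   /* data only; code is unreachable */
    int pc = 0, acc = 0;

    for (;;) {
        int op = imem[pc], arg = imem[pc + 1];  /* fetch from imem */
        pc += 2;
        switch (op) {
        case LOAD:  acc = dmem[arg];           break;
        case ADD:   acc += dmem[arg];          break;
        case STORE: dmem[arg] = acc;           break;  /* dmem only */
        case PRINT: printf("%d\n", dmem[arg]); break;
        case HALT:  return 0;
        }
    }
}
```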

pages: 968 words: 224,513

The Art of Assembly Language
by Randall Hyde
Published 8 Sep 2003

There hasn't been any real assembly language. Before we can progress any further and learn some real assembly language, a detour is necessary; unless you understand the basic structure of the Intel 80x86 CPU family, the machine instructions will make little sense. The Intel CPU family is generally classified as a Von Neumann Architecture Machine. Von Neumann computer systems contain three main building blocks: the central processing unit (CPU), memory, and input/output (I/O) devices. These three components are interconnected using the system bus (consisting of the address, data, and control buses). The block diagram in Figure 1-4 shows this relationship.

The first form is the one we've been using throughout this chapter, so there is little need to discuss it here. The second form, the register indirect call, calls the procedure whose address is held in the specified 32-bit register. The address of a procedure is the byte address of the first instruction to execute within that procedure. Remember, on a Von Neumann architecture machine (like the 80x86), the system stores machine instructions in memory along with other data. The CPU fetches the instruction opcode values from memory prior to executing them. When you execute the register indirect call instruction, the 80x86 first pushes the return address onto the stack and then begins fetching the next opcode byte (instruction) from the address specified by the register's value.
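The same idea can be sketched in C rather than assembly (my example, not the book's): because a procedure's entry point is just an address on a von Neumann machine, it can be stored in an ordinary variable and called indirectly. A compiler typically lowers the indirect call below to something like a `call` through a register.

```c
/* Code as data: a function's address is stored in a variable and
 * called indirectly. The function name "greet" is invented here. */
#include <stdio.h>

static void greet(void) { puts("hello from an indirect call"); }

int main(void) {
    void (*target)(void) = greet;  /* address of the first instruction */
    target();                      /* push return address, jump there  */
    return 0;
}
```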

pages: 332 words: 93,672

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy
by George Gilder
Published 16 Jul 2018

A fifty-seven-year-old, brown-haired engineer with a black hat and backpack and hiking boots, he is dressed Silicon-Valley-mountaineer style to take me on a high-altitude adventure in microchips and software, ideas and speculations, Google maps and Elon Musk “reality distortion fields” down Route 101 at five o’clock on a late-August Friday evening. It’s not quite Doctor Brown’s Back to the Future ride in a DeLorean, but it will suffice for some modest time-travel in the history of computing. Since writing his college thesis in the late 1970s, Dally has rebelled against the serial step-by-step computing regime known as the von Neumann architecture. After working on the “Cosmic Cube” under Chuck Seitz for his Ph.D. at Caltech (1983), Dally has led design of parallel machines at MIT (the J-machine and the M-machine), introduced massive parallelism to Cray supercomputers (the T-3D and 3E), and pioneered parallel graphics at Stanford (the Imagine project, a streaming parallel device incorporating programmable “shaders,” now ubiquitous in the industry’s graphic processors from Nvidia and others).

pages: 496 words: 174,084

Masterminds of Programming: Conversations With the Creators of Major Programming Languages
by Federico Biancuzzi and Shane Warden
Published 21 Mar 2009

To me, that all seems like wishful thinking. At the same time, I don’t know what will happen. There could be a quantum jump where, even though the computers that we know don’t actually change, a different kind of platform suddenly becomes much more prevalent and the rules are different. Perhaps a shift away from the von Neumann architecture?

Guido: I wasn’t even thinking of that, but that’s certainly also a possibility. I was more thinking of what if mobile phones become the ubiquitous computing device. Mobile phones are only a few years behind the curve of the power of regular laptops, which suggests that in a few years, mobile phones, apart from the puny keyboard and screen, will have enough computing power so that you don’t need a laptop anymore.

They were very innovative and spawned a lot of follow-on work over the years. Unfortunately, there were a few problems that I didn’t solve, and neither did anybody else. So here was a promising idea, but it just didn’t quite work in the long run. I pulled some of those ideas into UML, but data flow architecture doesn’t seem to replace von Neumann architecture in most cases. So I had my shot and didn’t quite make it. There are also cellular automata. I think over half of my fellow grad students tried to build a highly parallel computer on them. That has to be the right approach, because that’s how the universe is constructed. (Or maybe not. Modern physics is stranger than fiction.)

pages: 317 words: 101,074

The Road Ahead
by Bill Gates , Nathan Myhrvold and Peter Rinearson
Published 15 Nov 1995

John von Neumann, a brilliant Hungarian-born American, who is known for many things, including the development of game theory and his contributions to nuclear weaponry, is credited with the leading role in figuring out a way around this problem. He created the paradigm that all digital computers still follow. The "von Neumann architecture," as it is known today, is based on principles he articulated in 1945—including the principle that a computer could avoid cabling changes by storing instructions in its memory. As soon as this idea was put into practice, the modern computer was born. Today the brains of most computers are descendants of the microprocessor Paul Allen and I were so knocked out by in the seventies, and personal computers often are rated according to how many bits of information (one switch in the lighting example) their microprocessor can process at a time, or how many bytes (a cluster of eight bits) of memory or disk-based storage they have.

pages: 370 words: 94,968

The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive
by Brian Christian
Published 1 Mar 2011

Ray Kurzweil (in 2005’s The Singularity Is Near), among several other computer scientists, speaks of a utopian future where we shed our bodies and upload our minds into computers and live forever, virtual, immortal, disembodied. Heaven for hackers. To Ackley’s point, most work on computation has not traditionally been on dynamic systems, or interactive ones, or ones integrating data from the real world in real time. Indeed, theoretical models of the computer—the Turing machine, the von Neumann architecture—seem like reproductions of an idealized version of conscious, deliberate reasoning. As Ackley puts it, “The von Neumann machine is an image of one’s conscious mind where you tend to think: you’re doing long division, and you run this algorithm step-by-step. And that’s not how brains operate.

pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity
by Amy Webb
Published 5 Mar 2019

If you don’t have enough of either, the machine will start running hot, or you’ll get an error message, or it will simply shut down. It’s a problem known as the “von Neumann bottleneck.” No matter how fast the processor is capable of working, the program memory and data memory cause the von Neumann bottleneck, limiting the data transfer rate. Just about all of our current computers are based on the von Neumann architecture, and the problem is that existing processors can’t execute programs any faster than they’re able to retrieve instructions and data from memory. The bottleneck is a big problem for AI. Right now, when you talk to your Alexa or Google Home, your voice is being recorded, parsed, and then transmitted to the cloud for a response—given the physical distance between you and the various data centers involved, it’s mind-blowing that Alexa can talk back within a second or two.

pages: 377 words: 97,144

Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World
by James D. Miller
Published 14 Jun 2012

To understand why, let me tell you a bit about von Neumann. Although a fantastic scientist, a pathbreaking economist, and one of the best mathematicians of the twentieth century, von Neumann also possessed fierce practical skills. He was, arguably, the creator of the modern digital computer. The computer architecture he developed, now called “von Neumann architecture,” lies at the heart of most computers. Von Neumann’s brains took him to the centers of corporate power, and he did high-level consulting work for many private businesses, including Standard Oil, for which he helped to extract more resources from dried-out wells. Johnny (as his biographer often calls him in tribute to von Neumann’s unpretentious nature) was described as having “the invaluable faculty of being able to take the most difficult problem, separate it into its components, whereupon everything looked brilliantly simple. . . .” During World War II, von Neumann became the world’s leading expert on explosives and used this knowledge to help build better conventional bombs, thwart German sea mines, and determine the optimal altitude for airborne detonations. Johnny functioned as a human computer as part of the Manhattan Project’s efforts to create fission bombs. Whereas atomic weapons developers today use computers to decipher the many mathematical equations that challenge their trade, the Manhattan Project’s scientists had to rely on human intellect alone.

pages: 416 words: 112,268

Human Compatible: Artificial Intelligence and the Problem of Control
by Stuart Russell
Published 7 Oct 2019

It was all the more remarkable for the fact that, unlike monetary amounts, the utility values of various bets and prizes are not directly observable; instead, utilities are to be inferred from the preferences exhibited by an individual. It would be two centuries before the implications of the idea were fully worked out and it became broadly accepted by statisticians and economists. In the middle of the twentieth century, John von Neumann (a great mathematician after whom the standard “von Neumann architecture” for computers was named) and Oskar Morgenstern published an axiomatic basis for utility theory. What this means is the following: as long as the preferences exhibited by an individual satisfy certain basic axioms that any rational agent should satisfy, then necessarily the choices made by that individual can be described as maximizing the expected value of a utility function.
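As a compact illustration of what that theorem says (my notation, not the book's):

```latex
% A lottery L yields outcome x_i with probability p_i. If preferences
% over lotteries satisfy the von Neumann-Morgenstern axioms
% (completeness, transitivity, continuity, independence), then there
% exists a utility function u, unique up to positive affine rescaling,
% such that the agent prefers lottery L to lottery M exactly when
\[
U(L) \;=\; \sum_i p_i\, u(x_i)
\;\;>\;\;
\sum_j q_j\, u(y_j) \;=\; U(M).
\]
```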

The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal
by M. Mitchell Waldrop
Published 14 Apr 2001

Moreover, he had a point: it was only in the late 1970s, with the availability of reliable and inexpensive microchips, that computer scientists would begin serious experimentation with "parallel" computers that could carry out many operations simultaneously. To this day, the vast majority of computers in the world-including essentially all personal computers-are still based on the serial, step-by-step "von Neumann" architecture. Von Neumann mailed off his handwritten manuscript to Goldstine at the Moore School in late June 1945. He may well have felt rushed at that point, since the Trinity test of the plutonium bomb was less than three weeks away (it would take place on July 16). But in any case, he left numerous blank spaces for names, references, and other information that he planned to insert after his colleagues had had a chance to comment.

Once they were in, moreover, assign each of them a securely walled-off piece of the computer's memory where they could store data and programming code without anybody else's horning in. And finally, when the users needed some actual processing power, dole it out to them via an artful trick. You couldn't literally divide a computer's central processing unit, McCarthy knew; the standard von Neumann architecture allowed for only one such unit, which could carry out only one operation at a time. However, even the slowest electronic computer was very, very fast on any human time scale. So, McCarthy wondered, why not let the CPU skip from one user's memory area to the next user's in sequence, executing a few steps of each task as it went?
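A toy model of that trick (my sketch, not period code): one CPU visits each "user" in round-robin order, executes a small slice of that user's task out of that user's private state, and moves on. The struct and constants are invented for illustration.

```c
/* Round-robin time-slicing: one processor, many users, each with a
 * walled-off piece of state, each granted a few steps per visit. */
#include <stdio.h>

#define USERS 3
#define SLICE 2   /* steps granted per visit */

typedef struct {
    int counter;   /* this user's private work-in-progress */
    int remaining; /* steps left in this user's task        */
} Task;

int main(void) {
    Task tasks[USERS] = { {0, 5}, {0, 3}, {0, 7} };
    int active = USERS;

    while (active > 0) {
        for (int u = 0; u < USERS; u++) {   /* visit users in sequence */
            for (int s = 0; s < SLICE && tasks[u].remaining > 0; s++) {
                tasks[u].counter++;         /* one "step" of work */
                if (--tasks[u].remaining == 0) {
                    printf("user %d done after %d steps\n",
                           u, tasks[u].counter);
                    active--;
                }
            }
        }
    }
    return 0;
}
```

Because the CPU is so much faster than any human, each user perceives a continuously responsive machine even though, at any instant, only one task is actually running.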

pages: 492 words: 118,882

The Blockchain Alternative: Rethinking Macroeconomic Policy and Economic Theory
by Kariappa Bheemaiah
Published 26 Feb 2017

It was during the time of developing ENIAC that he met the renowned polymath, John von Neumann, and with his help went on to design a stored-program computer, the EDVAC (Electronic Discrete Variable Automatic Computer), the first binary computer (ENIAC was decimal). See Figure 4-11.

[Figure 4-11: General design of the Electronic Discrete Variable Automatic Computer. Source: ‘The von Neumann Architecture’, The Computing Universe, 2014]

From an abstract architecture perspective, von Neumann’s design is logically equivalent to Turing’s Universal Turing Machine. In fact, von Neumann had read Turing’s theoretical papers prior to designing his machine. Ultimately it was this simple design that was built upon by successive generations of computer scientists and led to the design of computers with multiple processors and the creation of parallel computing.

pages: 429 words: 114,726

The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise
by Nathan L. Ensmenger
Published 31 Jul 2010

In 1945–1946, von Neumann circulated an informal “First Draft of a Report on the EDVAC,” which described the EDVAC in terms of its logical structure, using notation borrowed from neurophysiology. Ignoring most of the physical details of the EDVAC design, such as its vacuum tube circuitry, von Neumann focused instead on the main functional units of the computer: its arithmetic unit, memory, and input and output. The “von Neumann architecture,” as it came to be known, served as the logical basis for almost all computers designed in subsequent decades. By abstracting the logical design of the digital computer from any particular physical implementation, von Neumann took a crucial first step in the development of a modern theory of computation. His was not the only contribution; in 1937, for example, Turing had described, for the purposes of demonstrating the limits of computation, what would become known as the Universal Turing Machine.

pages: 436 words: 127,642

When Einstein Walked With Gödel: Excursions to the Edge of Thought
by Jim Holt
Published 14 May 2018

It thus fell short of the modern computer, which stores its instructions in the form of coded numbers, or “software.” Von Neumann aspired to create a truly universal machine, one that (as Dyson aptly puts it) “broke the distinction between numbers that mean things and numbers that do things.” A report sketching the architecture for such a machine—still known as the von Neumann architecture—was drawn up and circulated toward the end of the war. Although the report contained design ideas from the ENIAC inventors, von Neumann was listed as the sole author, which occasioned some grumbling among the uncredited. And the report had another curious omission. It failed to mention the man who, as von Neumann well knew, had originally worked out the possibility of a universal computer: Alan Turing.

pages: 303 words: 67,891

Advances in Artificial General Intelligence: Concepts, Architectures and Algorithms: Proceedings of the Agi Workshop 2006
by Ben Goertzel and Pei Wang
Published 1 Jan 2007

Since the number of processing units is constant, as is the capacity of each unit, they will need to be shared among the concepts: the system as a whole produces new concepts from time to time, and their number will soon exceed the number of processing units. Consequently, the system still needs time-sharing and space-sharing; what is shared is simply not a single CPU and RAM, but many processing units. Some people blame the von Neumann architecture for the past failures of AI, but the argument is not convincing. It is true that the current computer architecture is not designed especially for AI, but it has not been proved that it cannot be used to implement a truly intelligent system. Special hardware is optional for NARS, since the system can be fully implemented on the current hardware/software platform, though special hardware will surely make it work better.

3.6 Evolution

Under the assumption of insufficient knowledge, all object-level knowledge in NARS can be modified by the system’s various learning mechanisms.

pages: 502 words: 132,062

Ways of Being: Beyond Human Intelligence
by James Bridle
Published 6 Apr 2022

In his proposal for the EDVAC, the first all-digital, stored-program computer, von Neumann specified a particular architecture: a single connection, or ‘bus’, between the memory and the central processor, meaning that the computer could not fetch data and execute commands at the same time. Today, just as almost all computers are based on Turing’s a-machine, almost all computers use the von Neumann architecture. But a problem results from this: the central processing unit (CPU) is constantly forced to wait on required information as it is moved into or out of memory, which can result in a serious drag on its processing speed. The original decision to build computers this way was made for reasons of simplicity, but it means that significant amounts of computer time, of software design and of electrical energy are expended on moving information around rather than doing anything with it.

pages: 528 words: 146,459

Computer: A History of the Information Machine
by Martin Campbell-Kelly and Nathan Ensmenger
Published 29 Jul 2013

Although the 101-page report was in draft form, with many references left incomplete, twenty-four copies were immediately distributed to people closely associated with Project PY. Von Neumann’s sole authorship of the report seemed unimportant at the time, but it later led to his being given sole credit for the invention of the modern computer. Today, computer scientists routinely speak of “the von Neumann architecture” in preference to the more prosaic “stored-program concept”; this has done an injustice to von Neumann’s co-inventors. Although von Neumann’s EDVAC Report was a masterly synthesis, it had the effect of driving the engineers and logicians further apart. For example, in the report von Neumann had pursued the biological metaphor by eliminating all the electronic circuits in favor of logical elements using the “neurons” of brain science.

pages: 500 words: 146,240

Gamers at Work: Stories Behind the Games People Play
by Morgan Ramsay and Peter Molyneux
Published 28 Jul 2011

I had been starting companies all of my life, so it just seemed like the natural thing to do. I really didn’t question any alternatives to starting the company and then licensing the hardware.

Ramsay: Was there a lot of focus on hardware then?

Bushnell: It was all hardware. As it turned out, the first video games didn’t have Von Neumann architectures at all. They had what we called “digital-state machines.” These machines were, essentially, clocked output signal generators that created waveforms that drove the television monitor. If you wanted to change anything, you had to change the hardware. There was no software at all. In fact, the very first game that executed a program was Asteroids in 1979.

pages: 489 words: 148,885

Accelerando
by Stross, Charles
Published 22 Jan 2005

She reaches over for Boris's pitcher of jellyfish juice, but frowns as she does so: "Aineko wasn't conscious back then, but later … when SETI@home finally received that message back, oh, however many years ago, Aineko remembered the lobsters. And cracked it wide open while all the CETI teams were still thinking in terms of von Neumann architectures and concept-oriented programming. The message was a semantic net designed to mesh perfectly with the lobster broadcast all those years ago, and provide a high-level interface to a communications network we're going to visit." She squeezes Boris's fingertips. "SETI@home logged these coordinates as the origin of the transmission, even though the public word was that the message came from a whole lot farther away – they didn't want to risk a panic if people knew there were aliens on our cosmic doorstep.

pages: 523 words: 154,042

Fancy Bear Goes Phishing: The Dark History of the Information Age, in Five Extraordinary Hacks
by Scott J. Shapiro

(The vacuum tubes were arranged in rings of ten, and only one tube was on at a time, representing one digit.) Von Neumann understood that binary symbols are easier to encode electronically. Open circuits would count as zeros, closed circuits as ones. EDVAC became the world’s first digital computer. Von Neumann is also credited with inventing the “stored program” computer, now known as the “von Neumann architecture.” For all its virtues, the ENIAC had one problem: code was hardwired into the machine. Whenever a user wanted to run a new program, a team of women, known as the programmers, manually changed ENIAC’s internal wiring to implement the code. A program might take two weeks just to load and test before it could run.

pages: 578 words: 168,350

Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms, Cities, Economies, and Companies
by Geoffrey West
Published 15 May 2017

The great John von Neumann, mathematician, physicist, computer scientist, and polymath, a man whose ideas and accomplishments have had a huge influence on your life, made the following remarkably prescient observation more than seventy years ago: “The ever accelerating progress of technology and changes in the mode of human life . . . gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.” Among von Neumann’s many accomplishments before he died at the relatively young age of fifty-three in 1957 are his seminal role in the early development of quantum mechanics, his invention of game theory, which is a major tool in economic modeling, and the conceptual design of modern computers universally referred to as the von Neumann architecture. So can we imagine making an innovation as powerful and influential as the invention of the Internet every fifteen, ten, or even five years? This is a classic reductio ad absurdum argument showing that regardless of how ingenious we are, how many marvelous gadgets and devices we invent, we simply won’t be able to overcome the threat of the ultimate singularity if we continue business as usual.

pages: 798 words: 240,182

The Transhumanist Reader
by Max More and Natasha Vita-More
Published 4 Mar 2013

Now, the rapidly increasing number of processing cores in general-purpose CPUs and GPU arrays is indicative of a drive toward parallel computation. Parallel computation is a more natural fit to neural computation. It is essential for the acquisition and analysis of data from the brain. Of course, compared with a sequential Von Neumann architecture, parallel computing platforms, and in particular neuromorphic platforms, are a much better target for the implementation of a whole brain emulation. An example of neuromorphic processor hardware is the chip developed at IBM as an outcome of research in the DARPA SyNAPSE program led by Dharmendra Modha.