seminal paper


description: an academic paper that has a significant and lasting influence on its field


pages: 496; words: 174,084

Masterminds of Programming: Conversations With the Creators of Major Programming Languages
by Federico Biancuzzi and Shane Warden
Published 21 Mar 2009

Symbols 169-176

breakthroughs needed in, Waiting for a Breakthrough; by mathematicians, Waiting for a Breakthrough, Experience; clean design, Programming by Example; compared to library design, Language and Design; debugging considerations for, Theory and Practice; designer’s preferences influencing, Theory and Practice; environment influencing, Language and Design; errors reduced by, Theory and Practice; for handheld devices, Power or Simplicity; for system programming, Power or Simplicity; formalisms for, mathematical, Language Design; formalisms for, usefulness of, Unix and Its Culture; implementation affecting, Language Design; implementation considerations for, Designing a New Language, Theory and Practice; implementation related to, Language and Design; improvements to process of, Designing a New Language; influencing program design, Language Design; inspiration for, Creativity, Refinement, and Patterns; personal approach for, Elementary Principles; prototypes for, Designing a New Language; scientific approach for, Designing a New Language, Theory and Practice, Growing a Language; starting with small core set of functionality, Growing a Language, Proofreading Languages; syntax choices for, Theory and Practice; teams for, democratic nature of, Feedback Loop; user considerations, Language Design; utility considerations, Language Design

A

A Note on Pattern Matching: Where do you find the match to an empty array (Falkoff), Elementary Principles
A Programming Language (Iverson), Paper and Pencil
abstraction: in functional programming, A Functional Team; in SQL, Feedback and Evolution
agents, Beyond Informatics
Aho, AWK: Aho-Corasick algorithm, Computer Science; automata theory, Computer Science; command-line tools, Unix and Its Culture; compilers course taught by, The Role of Documentation; data size handled by AWK, The Life of Algorithms; debugging, Language Design, The Role of Documentation; documentation leading to better software design, The Role of Documentation; domain, Unix and Its Culture; file concept applied to Internet, Unix and Its Culture; formalizing semantics of languages, Unix and Its Culture; graphical interfaces, Unix and Its Culture; hardware availability, The Life of Algorithms; hardware efficiency, Unix and Its Culture; improving programming skills, Language Design; knowledge required to use AWK, Language Design; large programs, The Life of Algorithms; lex, Language Design; portability of Unix, Unix and Its Culture; programming, The Role of Documentation; programming language design, Language Design; programming languages, Unix and Its Culture; purposes appropriate for use of AWK, The Life of Algorithms, Unix and Its Culture; research in computer science, Computer Science; role in AWK development, The Life of Algorithms; security and degree of formalism, Unix and Its Culture; teaching programming, Language Design; theory and practice as motivation, Unix and Its Culture; utility of programming language, Language Design; yacc, Language Design
Aho-Corasick algorithm, Computer Science
algebraic language, The Goals Behind BASIC
allocated memory, Compiler Design
API design, Expedients and Experience, Feedback Loop, C#
APL, APL: character set for, Paper and Pencil, Elementary Principles; collections in, Parallelism; design, Elementary Principles; design history of, Paper and Pencil, Elementary Principles; general arrays in, Elementary Principles; implementation on handheld devices, Paper and Pencil; learning, Paper and Pencil; lessons learned from design of, Legacy; namespaces, Parallelism; parallelism with, Parallelism–Legacy; regrets about, Legacy; resources used efficiently by, Paper and Pencil; syntax for based on algebraic notation, Paper and Pencil, Elementary Principles; simplicity/complexity of, Paper and Pencil, Elementary Principles; standardization of, Elementary Principles; teaching programming with, Elementary Principles
APL\360, Paper and Pencil
arbitrary precision integers, The Pythonic Way
architects, Be Ready for Change
aspect orientation, Learning and Teaching
Aspect-Oriented Software Development with Use Cases (Jacobson; Ng), Learning and Teaching
asymmetrical coroutines, The Power of Scripting
asynchronous operation, Hardware
audio applications, Language Design
automata theory, Computer Science
automatic code checking, Designing a Language
AWK, AWK, The Life of Algorithms: compared to SQL, Bits That Change the Universe; initial design ideas for, Bits That Change the Universe; large programs, good practices for, The Life of Algorithms; improvements for, Theory and Practice; longevity of, Theory and Practice; programming advice for, Designing a New Language; programming by example, Programming by Example; regrets about, Bits That Change the Universe
AWT, Power or Simplicity

B

backward compatibility, Formalism and Evolution: for potentially redesigned UML, Layers and Languages; with Java, Power or Simplicity; with JVM, Language and Design; with UML, Language Design
BASIC, BASIC: comments, Language and Programming Practice; compiler, one pass for, The Goals Behind BASIC; compiler, two passes for, The Goals Behind BASIC; design of, considerations for, The Goals Behind BASIC; design of, holding up over time, Language Design; encapsulation, The Goals Behind BASIC; hardware evolution influencing, The Goals Behind BASIC; large programs, The Goals Behind BASIC; lessons learned from design of, Language Design; libraries, Language Design; number handling, The Goals Behind BASIC; performance of, The Goals Behind BASIC; teaching programming using, The Goals Behind BASIC; True BASIC, The Goals Behind BASIC; variable declarations not required in, Compiler Design
bitmap fonts, Designed to Last
Booch, UML: backward compatibility with UML, Language Design; benefits of UML, UML; body of literature for programming, Training Developers; business rules, Creativity, Refinement, and Patterns; complexity and OOP, Creativity, Refinement, and Patterns; complexity of UML, UML; concurrency, Creativity, Refinement, and Patterns; constraints contributing to innovation, Creativity, Refinement, and Patterns; creativity and pragmatism, Language Design; design of UML, UML; implementation code, UML; language design, Creativity, Refinement, and Patterns; language design compared to programming, Language Design; language design influencing programs, Language Design; legacy software, Creativity, Refinement, and Patterns; OOP influencing correct design, Creativity, Refinement, and Patterns; percentage of UML used all the time, UML, Language Design; redesigning UML, UML; simplicity, Creativity, Refinement, and Patterns; standardization of UML, Language Design–Training Developers; training programmers, Training Developers–Creativity, Refinement, and Patterns
bottom-up design: with Forth, The Forth Language and Language Design; with Python, The Good Programmer
Boyce, SQL, A Seminal Paper
brown field development, Creativity, Refinement, and Patterns
bugs in language design, The Theory of Meaning
business rules, Creativity, Refinement, and Patterns

C

C: as system programming language, Power or Simplicity; longevity of, Legacy Culture; Objective-C as extension of, Engineering Objective-C; performance of, Power or Simplicity; signedness in, Waiting for a Breakthrough; size of code, Project Management and Legacy Software
C#, C#: as replacement for C++, Growing a Language; debugging, C#; design team for, C#; evolution of, C#; formal specifications for, C#; Java as inspiration for, Designing a Language; longevity of, Growing a Language; user feedback for, Language and Design, C#
C++: backward compatibility with C, Legacy Culture; C# as replacement for, C#; compared to Objective-C, Engineering Objective-C, Objective-C and Other Languages; compatibility requirements of, The Pythonic Way; complexity of, Engineering Objective-C; concurrency support in, OOP and Concurrency; evolution of, Growing a Language; future versions of, Future; lessons learned from design of, Future; multithreading in, Designing a Language; pointers in, problems with, Designing a Language; popularity of, Engineering Objective-C
C++ 2.0, Future
C++0x, OOP and Concurrency, Future
Calculus of Communicating Systems (CCS), The Soundness of Theorems
CCS (Calculus of Communicating Systems), The Soundness of Theorems
Celes, Lua
Chamberlin, SQL: complexity of SQL, Feedback and Evolution; concurrent data access in SQL, The Language; declarative nature of SQL, The Language; design history of SQL, A Seminal Paper; design principles of SQL, The Language; determinism, Feedback and Evolution; Excel compared with relational database systems, Feedback and Evolution; external visibility of, Feedback and Evolution; injection attacks on SQL, Feedback and Evolution; knowledge required to use SQL, Feedback and Evolution; languages, A Seminal Paper; popularity of SQL, Feedback and Evolution; scalability of SQL, Feedback and Evolution; standardization of SQL and XQuery, XQuery and XML; usability tests on SQL, Feedback and Evolution; user feedback on SQL, Feedback and Evolution; users of SQL, Feedback and Evolution; views in SQL, The Language; XML, XQuery and XML; XQuery, XQuery and XML
character set, Paper and Pencil, Elementary Principles
class system, The Haskell Language
classes, Project Management and Legacy Software
closure: in Lua, The Power of Scripting; in SQL, The Language
Codd, SQL, A Seminal Paper
code browsing, The Pythonic Way
code examples in programming manuals, Breeding Little Languages
code reuse, Compiler Design
collections: design implications of, Parallelism; large unstructured, Parallelism; operations on each element of, Elementary Principles
color, Designed to Last
colorForth, The Forth Language and Language Design
command line: AWK used with, Unix and Its Culture; compared to graphical interface, Designing a New Language; composing programs on, Unix and Its Culture; limitations of, Bits That Change the Universe; resurgence of, Legacy Culture; tools for, Unix and Its Culture
comments, Application Design, The Theory of Meaning: in BASIC, Compiler Design; in C#, C#; role of, Experience
communication: among interactive agents, Beyond Informatics; role in informatics, Beyond Informatics
compilers: quality of code in, Application Design; writing, The Forth Language and Language Design, Compiler Design, The Role of Documentation
completeness, The Language
complex algorithms, The Life of Algorithms
components, Objective-C and Other Languages, Components, Sand, and Bricks, Quality As an Economic Phenomenon, Growing a Language
computer science: current problems in, The Future of Computer Science; future of, Interfaces to Longevity; problems of, Beyond Informatics, Components, Sand, and Bricks; research in, Beyond Informatics; role of mathematics in, Elementary Principles, Computer Science, Experience, Beyond Informatics; whether it is a science, Experience, Quality As an Economic Phenomenon
computer science education: approaches for, Bits That Change the Universe, Spreading (Functional) Education; beginning programming, The Pythonic Way, Elementary Principles, The Goals Behind BASIC, Compiler Design; functional languages, Spreading (Functional) Education; multiple languages, Language and Programming Practice, Breeding Little Languages; teaching languages, The Goals Behind BASIC, Language and Design; teamwork in, Unix and Its Culture–The Role of Documentation; topics needed in, Education and Training, Be Ready for Change
concurrency, Creativity, Refinement, and Patterns: adding to language, Language and Design; analyzing concurrent systems, The Soundness of Theorems; approaches for, Hardware; challenges of, The Future of Computer Science; design affected by, Concurrency; framework handling, The Future of Computer Science; functional languages and, A Bit of Reusability; in C++, OOP and Concurrency; in C++0x, OOP and Concurrency; in Lua, The Power of Scripting; in Python, Multiple Pythons; in SQL, The Language; language design affected by, The Future of Computer Science; network distribution and, OOP and Concurrency; OOP and, OOP and Concurrency, Objective-C and Other Languages, A Bit of Reusability; pattern matching using, Computer Science; requirements for, Concurrency
conditionals, Application Design
consistency, Feedback and Evolution
constraints, Creativity, Refinement, and Patterns
cooperative multithreading, Hardware
Corasick, Computer Science
Cox, Objective-C: components, Objective-C and Other Languages, Components, Sand, and Bricks–Quality As an Economic Phenomenon; concurrency and OOP, Objective-C and Other Languages; configurability, Objective-C and Other Languages; educational background of, Education; encapsulation, Components, Sand, and Bricks; garbage collection, Objective-C and Other Languages; lessons learned from design of Objective-C, Objective-C and Other Languages, Components, Sand, and Bricks; lightweight threads, Components, Sand, and Bricks; multiple inheritance, Objective-C and Other Languages; namespaces not supported in Objective-C, Objective-C and Other Languages; Objective-C as extension of C and Smalltalk, Objective-C and Other Languages; Objective-C compared to C++, Objective-C and Other Languages; OOP increasing complexity of applications, Objective-C and Other Languages; quality of software, Quality As an Economic Phenomenon, Education; security of software, Components, Sand, and Bricks; single inheritance in Objective-C, Objective-C and Other Languages; superdistribution, Components, Sand, and Bricks, Education; trusting software, Components, Sand, and Bricks
CPAN, Community
creative arts, Training Developers
creativity: as role of programmer, Growing a Language; importance of, Learning and Teaching; in programming, Language Design; necessity of, Creativity, Refinement, and Patterns; opportunity to use, Knowledge; stimulating in programmers, Bits That Change the Universe; tension from, Language Design
customer vocabulary, The Forth Language and Language Design

D

Dahl, An Inspired Afternoon
data models, A Seminal Paper
data sizes, Programming by Example
debugging code: C#, C#; design considerations for, Designing a Language; ease of, Language Design, Designing a New Language; functional programming and, Trajectory of Functional Programming; language design considerations for, Theory and Practice, Growing a Language; Lua, Language Design; PostScript, Interfaces to Longevity; Python, Multiple Pythons
debugging languages, The Pythonic Way
declarations, Parallelism
Design by Contract, An Inspired Afternoon
design patterns, Creativity, Refinement, and Patterns
Design Patterns: Elements of Reusable Object-Oriented Software (Gamma; Helm; Johnson; Vlissides), Layers and Languages
determinism, Feedback and Evolution
Dijkstra, An Inspired Afternoon
documentation of programming language, Breeding Little Languages
documentation of programs: comments, Application Design, Experience, C#; content of, The Theory of Meaning; importance of, Programming by Example; leading to better software design, The Role of Documentation; programmers writing, C#
domain-driven design, The Forth Language and Language Design, Language Design, Unix and Its Culture, Concurrency
domain-specific languages (DSL), Growing a Language: disadvantages of, Language and Design, Growing a Language; existence of, C#; growth of, Breeding Little Languages; Lua used as, Language Design; moving to general-purpose, Language; programs as, Elementary Principles; UML redesigned as set of, UML
dynamic languages: benefits of, The Good Programmer; security and, The Good Programmer
dynamic typing, The Pythonic Way

E

ECMA standardization for C#, C#
economic model of software, Components, Sand, and Bricks, Quality As an Economic Phenomenon
Eiffel, Eiffel: adding features to, Managing Growth and Evolution; backward compatibility for, Managing Growth and Evolution; evolution of, Managing Growth and Evolution; extensibility of, Reusability and Genericity; forward compatibility for, Managing Growth and Evolution; history of, An Inspired Afternoon; information hiding, Reusability and Genericity; proofs in, Proofreading Languages; reusability of, Reusability and Genericity; streaming serialization, Proofreading Languages
embedded applications: Forth for, The Forth Language and Language Design
emergent systems, Layers and Languages
encapsulation, Components, Sand, and Bricks: advantages of, Language Design; in BASIC, The Goals Behind BASIC
engineering: links to informatics, Beyond Informatics; programming as, Learning and Teaching
error messages: in Lua, Language Design; quality of, Theory and Practice
errors: handling, Trajectory of Functional Programming; language design reducing number of, Theory and Practice; reduced by language design, Theory and Practice
Excel, Feedback and Evolution
extensibility, The Language

F

Falkoff, APL: collections, Parallelism; computer science, Elementary Principles; design of APL, longevity of, Paper and Pencil; language design, Elementary Principles; language design influencing program design, Elementary Principles; parallelism, Parallelism–Legacy; Perl influenced by APL, Legacy; pointers not used in APL, Parallelism; programmers, Paper and Pencil; relational database design influenced by APL, Parallelism; resources, Paper and Pencil; The Design of APL, Paper and Pencil
Figueiredo, Lua: comments, Experience; design of Lua, Experience; dialects of users, Language Design; environments changing design of Lua, Language Design; error messages in Lua, Language Design; hardware availability, Experience; limited resources, Language Design; local workarounds versus global fixes in code, Language Design; mathematics, Experience; mistakes in Lua, Experience; programming in Lua, The Power of Scripting; programming language design, Language Design; regrets about Lua, Experience; security capabilities of Lua, The Power of Scripting; success, Experience; teaching debugging, Experience; testing Lua, Language Design; VM for Lua, Language Design
file, Unix and Its Culture
file handling, Parallelism
first-class functions, The Power of Scripting
font scaling in PostScript, Designed to Last
for loop, Experience
formal semantics: benefits for language design, A Seminal Paper; not used for PostScript, Designed to Last; usefulness of, Formalism and Evolution
formal specifications: for C#, C#; for languages, Language Design; necessity of, Designing a Language
Forth, Forth, The Forth Language and Language Design: application design with, Application Design; asynchronous operation, Hardware; comparing to PostScript, Designed to Last; conditionals in, Application Design; design of, longevity of, The Forth Language and Language Design; error causes and detection, The Forth Language and Language Design, Application Design; for embedded applications, The Forth Language and Language Design; I/O capabilities of, Hardware; indirect-threaded code, The Forth Language and Language Design; loops in, Application Design; maintainability of, Application Design; minimalism in design of, The Forth Language and Language Design; porting, Hardware; programmers receptive to, The Forth Language and Language Design; programming in, Application Design; readability of, The Forth Language and Language Design; reusable concepts of meaning with, The Soundness of Theorems; simplicity of, The Forth Language and Language Design, Application Design; syntax of small words, The Forth Language and Language Design; word choice in, Application Design
fourth-generation computer language, The Forth Language and Language Design
frameworks, Knowledge
functional closures, The Haskell Language
functional programming, Trajectory of Functional Programming–The Haskell Language: abstraction in, Trajectory of Functional Programming; concurrency and, A Bit of Reusability; debugging in, Trajectory of Functional Programming; error handling in, Trajectory of Functional Programming; longevity of, Trajectory of Functional Programming; parallelism and, Trajectory of Functional Programming; popularity of, Trajectory of Functional Programming; Scala for, Concurrency; side effects, Trajectory of Functional Programming; usefulness of, Bits That Change the Universe
functions: first class, The Power of Scripting; higher-order, The Soundness of Theorems

G

garbage collection, Objective-C and Other Languages: in JVM, Designing a Language; in Lua, The Power of Scripting; in Objective-C, Objective-C and Other Languages; in Python, Multiple Pythons
general arrays, Elementary Principles
general-purpose languages, Growing a Language
generic programming as alternative to OOP, OOP and Concurrency
generic types, The Haskell Language
genericity, Reusability and Genericity
generics in Java, The Haskell Language
Geschke, PostScript: bugs in ROM, Designed to Last; computer science, Research and Education; concatenative language, Designed to Last; design team for PostScript, Designed to Last; hardware considerations, Designed to Last, Interfaces to Longevity; history of software and hardware evolution, Research and Education; Imaging Sciences Laboratory, Research and Education; kerning and ligatures in PostScript, Designed to Last; longevity of programming languages, Interfaces to Longevity; mathematical background, Designed to Last; popularity of languages, Interfaces to Longevity; PostScript as language instead of data format, Designed to Last; programmer skill, Designed to Last; two-dimensional constructs, Designed to Last; web use of PostScript, Standard Wishes
Gosling: adding to Java, Feedback Loop; array subscript checking in Java, Power or Simplicity; AWT, Power or Simplicity; backward compatibility with Java, Power or Simplicity; C stacks, Designing a Language; C# inspired by Java, Designing a Language; complexity, Power or Simplicity; complexity of Java, Power or Simplicity; computer science, Designing a Language; concurrency, Concurrency–Designing a Language; debugging, Designing a Language; documentation, Designing a Language; error prevention and containment in Java, A Matter of Taste, Designing a Language; formal specifications, Designing a Language; freeing source code to Java, Feedback Loop; garbage collection, Designing a Language; Java EE, Power or Simplicity; JIT, Power or Simplicity; JVM, satisfaction with, A Matter of Taste; language design affected by network issues, Power or Simplicity; language design influencing software design, Designing a Language; language designed for personal use of, Designing a Language; languages designed by, A Matter of Taste; Moore’s Law, Concurrency; performance, A Matter of Taste; pointers in C++, Designing a Language; programmers, Designing a Language; references in Java, Designing a Language; Scala, Concurrency, Designing a Language; simplicity and power, Power or Simplicity; system programming languages, Power or Simplicity; user feedback for Java, Designing a Language; virtual machine for Java, A Matter of Taste
GOTO statements, The Goals Behind BASIC
graphical interface: limitations of, Unix and Its Culture

H

half-toning for color, Designed to Last
Halloween problem, The Language
handheld devices, Power or Simplicity
hardware: availability of, The Life of Algorithms, Breeding Little Languages, Programming by Example; computational power of, Hardware; considerations for, Designed to Last; innovation driven by, Interfaces to Longevity; predicting future of, Engineering Objective-C; requirements for concurrency, Hardware; viewing as a resource or a limit, Hardware
Haskell, Haskell: class system for, The Haskell Language; competing implementations of, Formalism and Evolution; evolution of, Formalism and Evolution; influencing other languages, The Haskell Language; list comprehensions, The Haskell Language; team designing, A Functional Team–Trajectory of Functional Programming; type system for, The Haskell Language, The Theory of Meaning
Hejlsberg, C#: backward compatibility with JVM, Language and Design; comments in C#, C#; computer science, The Future of Computer Science; debugging, Growing a Language; domain-specific languages, Growing a Language, The Future of Computer Science; dynamic programming languages, The Future of Computer Science; higher-order functions, Language and Design, The Future of Computer Science; implementing and designing languages, Language and Design; language design, Growing a Language; leveraging existing components, Growing a Language; personal themes in language design, Language and Design; programmers, The Future of Computer Science; programming language design, Language and Design–Growing a Language; safety versus creative freedom, Growing a Language; simplicity in language design, Growing a Language; teaching languages, Language and Design
higher-order functions, The Future of Computer Science
higher-order functions in ML, The Soundness of Theorems
Hoare, An Inspired Afternoon
HOPL-III: The development of the Emerald programming language, OOP and Concurrency
HTML, Standard Wishes
Hudak: functional programming, Trajectory of Functional Programming–The Haskell Language; Haskell’s influence on other languages, The Haskell Language; language design influencing software design, The Haskell Language; teaching programming and computer science, Spreading (Functional) Education
Hughes: functional programming, Trajectory of Functional Programming–The Haskell Language
hybrid typing, The Pythonic Way

I

I/O, Hardware
Ierusalimschy, Lua: closures in Lua, The Power of Scripting; code sharing with Lua, Language Design; comments, Experience; computer science, Experience; concurrency with Lua, The Power of Scripting; debugging Lua, Language Design; extensibility of Lua, Language Design; feature set complete for Lua, Language Design; first-class functions in Lua, The Power of Scripting; fragmentation issues with Lua, Language Design; implementation of language affecting design of, Language Design; limitations of Lua, The Power of Scripting; limited resources, Language Design; number handling by Lua, The Power of Scripting; programmers, Experience; simplicity of Lua, Language Design; success, Experience; upgrading Lua during development, Language Design; user feedback on Lua, Language Design; VM for Lua, Language Design
implementation, An Inspired Afternoon
indirect-threaded code, The Forth Language and Language Design
informatics: definition of, Beyond Informatics
inheritance, Compiler Design
injection attacks, Feedback and Evolution
intelligent agents for programming, Knowledge
interface design, Expedients and Experience, Transformative Technologies
Internet as representation of agents, Beyond Informatics
Iverson, APL

J

Jacobson, UML: benefits of UML, UML; complexity of UML, UML; computer science, Learning and Teaching; designing UML, UML; DSLs, UML; Ericsson, Learning and Teaching; future possible changes to UML, UML; implementation code, UML; legacy software, The Role of the People; Object-Oriented Software Engineering, Learning and Teaching; programming, Learning and Teaching, Knowledge; programming approaches in different parts of the world, Learning and Teaching; programming knowledge linked to languages, UML; programming methods and processes, The Role of the People; SDL influencing improvements to UML, UML; simplicity, Knowledge; size of project determining usefulness of UML, UML; social engineering, The Role of the People; teams for programming, Learning and Teaching; use cases, Learning and Teaching
Java, Java: AWT and, Power or Simplicity; Java EE, Power or Simplicity
Javadoc tool, Designing a Language
JavaScript, Interfaces to Longevity, Standard Wishes
JIT, Power or Simplicity
Jones: formal semantics, Formalism and Evolution; functional programming, Trajectory of Functional Programming; teaching computer science, Spreading (Functional) Education
JVM: new languages built on, Designing a Language; popularity of, A Matter of Taste

K

kanji characters, Designed to Last
Kemeny, BASIC
Kernighan, AWK: backward compatibility versus innovation, Legacy Culture; C, Legacy Culture; C++, Legacy Culture; command line, Legacy Culture; domain-specific languages (DSL), Breeding Little Languages; hardware availability, Breeding Little Languages; implementation considerations for language design, Legacy Culture; language design style of, Language Design; large systems, Designing a New Language; learning programming languages, Computer Science; little languages, Legacy Culture; OOP, Designing a New Language; programmers, Breeding Little Languages; programming, first interest in, Breeding Little Languages; programming language manuals, Breeding Little Languages; programming languages, Designing a New Language; rewriting programs, Legacy Culture; success, Breeding Little Languages; Tcl/Tk, Transformative Technologies; teaching debugging, Breeding Little Languages; testing, Transformative Technologies; upgrading, Transformative Technologies; user considerations in programming, Breeding Little Languages; Visual Basic, Transformative Technologies; writing text, Breeding Little Languages
kerning, Designed to Last
knowledge transfer, Learning and Teaching, The Role of the People, Knowledge, Be Ready for Change
Kurtz, BASIC: algebraic language, The Goals Behind BASIC; comments in BASIC, Language and Programming Practice; compilers, Compiler Design; debugging code, Language Design; design of BASIC, The Goals Behind BASIC, Compiler Design; encapsulation, Language Design; language design influencing program design, Language and Programming Practice; learning programming, The Goals Behind BASIC; libraries, Language Design; mathematical formalism, Language Design; OOP, Language and Programming Practice; polymorphism, Compiler Design; productivity when programming, Work Goals; programming languages, Language and Programming Practice; simplicity of languages, The Goals Behind BASIC; single-pass compiler for BASIC, Compiler Design; success in programming, Work Goals; teaching programming, Compiler Design; True BASIC, The Goals Behind BASIC; users, Work Goals; visual and audio applications, Language Design; Visual Basic, Language Design; Visual Basic as object-oriented language, Language Design; words used in languages, Language Design

L

language toolkit, The Forth Language and Language Design
lazy evaluation, Trajectory of Functional Programming, The Haskell Language
LCF, The Soundness of Theorems: limits of, The Soundness of Theorems
legacy software, Bits That Change the Universe, Theory and Practice: approaches for, Project Management and Legacy Software, The Role of the People, Training Developers; preventing problems of, Project Management and Legacy Software, Components, Sand, and Bricks; problems of, Hardware
less is more philosophy, Expedients and Experience
levels of abstraction, Using UML
lex as transformative technologies, Transformative Technologies
lexical scoping, Language
libraries: as method for extending languages, Unix and Its Culture; design of, Unix and Its Culture
ligatures, Designed to Last
lightweight threads, Components, Sand, and Bricks
line numbers in BASIC, The Goals Behind BASIC, Language Design
Lisp: level of success of, Waiting for a Breakthrough
list comprehensions, The Haskell Language
little languages, Growing a Language
loops: alternatives to, Elementary Principles; in Forth, Application Design
Love, Objective-C: appropriate uses of Smalltalk, Engineering Objective-C; classes, Project Management and Legacy Software; distributed teams, Project Management and Legacy Software; hardware, Engineering Objective-C; languages, new, Growing a Language; legacy software, Project Management and Legacy Software; maintaining software, Project Management and Legacy Software; managers understanding of languages, Project Management and Legacy Software; Objective-C as extension of C and Smalltalk, Growing a Language; Objective-C compared to C++, Engineering Objective-C; programmers, advice for, Project Management and Legacy Software; programming, Engineering Objective-C; real-life experience, Education and Training; simplicity in design, Project Management and Legacy Software; success of a project, Project Management and Legacy Software; teaching complex technical concepts, Education and Training; uses of Objective-C, Engineering Objective-C
Lua, Lua, The Power of Scripting: feedback from users regarding, Language Design; platform independence of, Language Design; resources used by, Experience; testing features of, Language Design; VM, choice of ANSI C for, Language Design; VM, debugging affected by, Language Design; VM, register-based, Language Design

M

M language, Language Design, Creativity, Refinement, and Patterns
Make utility, Transformative Technologies
mathematical formalism: in language design, Language Design; pipes used for, Unix and Its Culture
mathematicians, Waiting for a Breakthrough, Experience
mathematics: importance of learning, Theory and Practice; role in computer science, Elementary Principles, Computer Science, Bits That Change the Universe, Experience, Beyond Informatics
metalanguages for models, The Soundness of Theorems Méthodes de Programmation (Meyer), An Inspired Afternoon Meyer, Eiffel analysis required before implementation, Proofreading Languages concurrency and OOP, An Inspired Afternoon Design by Contract, An Inspired Afternoon, An Inspired Afternoon genericity, Reusability and Genericity information hiding in Eiffel, Reusability and Genericity language design, Proofreading Languages languages influencing programs, An Inspired Afternoon mathematical versus linguistic perspective for programming, Proofreading Languages multilingual background of, Proofreading Languages objects, An Inspired Afternoon philosophies of programming, An Inspired Afternoon program provability, Proofreading Languages reusability, Reusability and Genericity seamless development, Proofreading Languages small versus large programs, Proofreading Languages specification and implementation, An Inspired Afternoon structured versus OO programming, Proofreading Languages microprocessors, Application Design millennium bug, The Theory of Meaning Milner, ML bugs, The Soundness of Theorems communication among agents, Beyond Informatics computer science, Beyond Informatics concurrent systems, The Soundness of Theorems defining as informatic scientist, Beyond Informatics informatics, Beyond Informatics language design, The Theory of Meaning language design influencing program design, The Soundness of Theorems languages specific to each programmer, The Theory of Meaning levels of models, The Soundness of Theorems logic expressed by ML, The Soundness of Theorems mathematics, Beyond Informatics paradigms, The Theory of Meaning programs, The Theory of Meaning purpose of ML, The Theory of Meaning structural problems in programs, The Theory of Meaning teaching theorems and provability, The Soundness of Theorems theory of meaning, Beyond Informatics ubiquitous systems, Beyond Informatics undecidability in lower levels of models, The Soundness of Theorems minimalism, The 
Forth Language and Language Design ML, ML formal specification of, The Theory of Meaning role of, The Soundness of Theorems type system for, The Theory of Meaning model-driven development, Proofreading Languages models for systems, The Soundness of Theorems, The Soundness of Theorems, The Soundness of Theorems, The Soundness of Theorems, The Soundness of Theorems, The Soundness of Theorems, Beyond Informatics Moore, Forth concurrency, Hardware elegant solutions, The Forth Language and Language Design indirect-threaded code in Forth, The Forth Language and Language Design language design, Application Design legacy software, Application Design operating systems, The Forth Language and Language Design parallel processing, The Forth Language and Language Design resuming programming after a hiatus, The Forth Language and Language Design stack, Hardware teamwork in programming, Application Design words, The Forth Language and Language Design, The Forth Language and Language Design, Application Design Moore’s Law, Concurrency multicore computers, Application Design multiple paradigms in Python, The Pythonic Way multithreading as precursor to parallel processing, The Forth Language and Language Design cooperative, Hardware Java frameworks for, Concurrency mathematical software and, Concurrency problems in C++ with, Designing a Language synchronization primitives for, Concurrency music, Education and Training, Growing a Language, Training Developers N namespaces in APL, Parallelism Objective-C not supporting, Objective-C and Other Languages National Instruments LabVIEW, Creativity, Refinement, and Patterns NetBeans, Designing a Language networked small computers, Application Design networks distribution of, OOP and Concurrency influencing software design, Quality As an Economic Phenomenon SOAs and, Components, Sand, and Bricks superdistribution and, Components, Sand, and Bricks Ng, Learning and Teaching number handling in BASIC, The Goals Behind BASIC in Lua, The Power of 
Scripting in Python, The Pythonic Way O object-oriented programming (OOP) concurrency and, OOP and Concurrency, A Bit of Reusability correct design influenced by, Creativity, Refinement, and Patterns generic programming as alternative to, OOP and Concurrency good design using, OOP and Concurrency limited applications of, Growing a Language objects handled outside of language, An Inspired Afternoon reusability and, A Bit of Reusability scalability of, A Bit of Reusability, Creativity, Refinement, and Patterns success of, A Matter of Taste usefulness of, Language and Programming Practice, Designing a New Language uses of, Proofreading Languages using well, A Matter of Taste with Visual Basic, Language and Programming Practice Objective-C, Objective-C single inheritance, Objective-C and Other Languages objects, Theory and Practice open source model, Quality As an Economic Phenomenon open source projects, Interfaces to Longevity open standards, Interfaces to Longevity operating systems, The Forth Language and Language Design, Hardware Oracle, A Seminal Paper orthogonality, Feedback and Evolution P parallel processing, The Forth Language and Language Design parallelism in APL, Elementary Principles–Legacy, Parallelism, Parallelism, Parallelism, Legacy uses of, Components, Sand, and Bricks parser for Lua, Language Design patch utility, Transformative Technologies pattern matching algorithms for, Computer Science evolution of, The Life of Algorithms pattern movement, Be Ready for Change, Layers and Languages patterns, Creativity, Refinement, and Patterns, Creativity, Refinement, and Patterns PEP (Python Enhancement Proposal), The Pythonic Way performance of BASIC, The Goals Behind BASIC practical implications of, A Matter of Taste Perl, Perl APL influencing, Parallelism community participation in, Community–Evolution and Revolution, Community, Community, Community, Evolution and Revolution context in, Language–Language, Language, Language, Language CPAN for, Community 
dual licensing, Community evolution of, Language, Language, Evolution and Revolution, Evolution and Revolution, Evolution and Revolution, Evolution and Revolution human language principles influencing, The Language of Revolutions, Language multiple ways of doing something, The Language of Revolutions purposes of, The Language of Revolutions scoping in, The Language of Revolutions syncretic design of, Language transition from text tool to complete language, The Language of Revolutions version 6, The Language of Revolutions, Evolution and Revolution, Evolution and Revolution Peters, The Pythonic Way, The Good Programmer physical processes, The Soundness of Theorems pi calculus, The Soundness of Theorems Pike, Breeding Little Languages pointers compiler handling, Compiler Design polyglot virtual machines, Language and Design polymorphism, Compiler Design postfix operators, The Forth Language and Language Design, The Forth Language and Language Design PostScript, PostScript as concatenative language, Designed to Last design decisions for, Designed to Last fonts, Designed to Last for Apple graphics imaging model, Designed to Last for NeXT graphics imaging model, Designed to Last formal semantics not used for, Designed to Last future evolution of, Designed to Last JavaScript interface, Interfaces to Longevity kerning in, Designed to Last print imaging models, Designed to Last purposes of, Designed to Last writing by hand, Designed to Last pragmatism and creativity, Language Design productivity of programmers language affecting, Growing a Language programmer quality affecting, Project Management and Legacy Software programming language affecting, Theory and Practice when working alone, Work Goals productivity of users, The Language, Feedback and Evolution, Feedback and Evolution programmers all levels of, The Pythonic Way good, The Pythonic Way, Application Design, Research and Education hiring, The Good Programmer improving skills of, Bits That Change the Universe 
knowledge of, Knowledge, Be Ready for Change paradigms influencing, The Theory of Meaning productivity of, Programming by Example recognizing good, Experience, Education and Training teams of Design by Contract helping, An Inspired Afternoon distributed, Project Management and Legacy Software education for, Designing a Language effectiveness of, Creativity, Refinement, and Patterns importance of, Application Design in classroom, Unix and Its Culture, The Role of Documentation, The Role of Documentation skills required for, Education and Training users as, Knowledge, Be Ready for Change programming analysis in preparation for, Compiler Design, Proofreading Languages approaches to, Learning and Teaching as engineering, Learning and Teaching by example, Theory and Practice, Programming by Example, Programming by Example compared to language design, Theory and Practice compared to mathematical theorems work, Bits That Change the Universe, Programming by Example compared to writing text, Breeding Little Languages components in, Objective-C and Other Languages, Components, Sand, and Bricks, Components, Sand, and Bricks, Components, Sand, and Bricks hardware availability affecting, Programming by Example linguistic perspective of, Proofreading Languages mathematical perspective of, Proofreading Languages nature of, Hardware resuming after a hiatus, The Role of Documentation users, Breeding Little Languages programming language design, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Legacy Culture programming languages adding features to, Language and Design evolution of, Future, The Pythonic Way, Language Design, Engineering Objective-C, Growing a Language, Growing a Language, Growing a Language, Growing a Language, Growing a Language experiments of, Language extensibility of, Expedients and Experience, Unix and 
Its Culture, Waiting for a Breakthrough, Growing a Language families of, The Theory of Meaning general-purpose, Designing a New Language, Waiting for a Breakthrough growth of, Feedback Loop implementation of, Experience interface for, Transformative Technologies linguistics as influence on, Language little making more general, Legacy Culture, Waiting for a Breakthrough, Growing a Language resurgence of, Legacy Culture longevity of, Unix and Its Culture new, Growing a Language number of in use, Growing a Language productivity affected by, Theory and Practice, Growing a Language safety of, Growing a Language size of, Unix and Its Culture strengths of, Designing a New Language teaching languages, Language and Design testing new features of, Designing a New Language theory of meaning for, Beyond Informatics usability of, Using UML validating, Beyond Informatics programs as domain-specific languages, Elementary Principles beauty or elegance of, Language Design complexity of, A Bit of Reusability, Creativity, Refinement, and Patterns computer’s ability to state meaning of, The Theory of Meaning legacy, Project Management and Legacy Software local workarounds versus global fixes, Bits That Change the Universe maintainability of, Bits That Change the Universe, Waiting for a Breakthrough, Education and Training, Project Management and Legacy Software performance of, Programming by Example problems in, Programming by Example revising heavily before shipping, Transformative Technologies rewriting, Waiting for a Breakthrough size of, Designing a New Language written in 1970s, Hardware protocols, Objective-C and Other Languages provability, The Soundness of Theorems, The Soundness of Theorems proving theorems, The Soundness of Theorems, The Theory of Meaning, The Theory of Meaning Python, Python adding features to, The Pythonic Way, The Pythonic Way, The Pythonic Way bottom-up versus top-down design, The Good Programmer concurrency with, Multiple Pythons design process using, 
The Good Programmer dynamic features of, The Good Programmer elegance philosophy for, The Pythonic Way, The Good Programmer experts using, The Pythonic Way garbage collection, Multiple Pythons lessons learned from design of, Expedients and Experience macros in, Multiple Pythons maintainability of, The Good Programmer multiple implementations of, Multiple Pythons–Multiple Pythons, Multiple Pythons, Multiple Pythons, Multiple Pythons multiple paradigms in, The Pythonic Way new versions of, The Pythonic Way novices using, The Pythonic Way prototyping uses of, The Good Programmer searching large code bases, Expedients and Experience security of, The Good Programmer simple parser used by, Multiple Pythons strict formatting in, Multiple Pythons type of programmers using, The Pythonic Way Python 3000, The Good Programmer Python Enhancement Proposal (PEP), The Pythonic Way Pythonic, The Pythonic Way Q Quill, Feedback and Evolution R RAD (rapid application development), Future readability, The Forth Language and Language Design refactoring, OOP and Concurrency Reisner, Feedback and Evolution relational databases, Parallelism research groups, Research and Education, Research and Education resilience, Feedback and Evolution resources limited, Experience reusability, Reusability and Genericity and OOP, A Bit of Reusability, A Bit of Reusability and SOA, A Bit of Reusability rule-based technology, Knowledge Rumbaugh, UML background of, Be Ready for Change benefits of UML, Using UML change, Symmetric Relationships communication facilitated by UML, Using UML computer science, Be Ready for Change concurrency, A Bit of Reusability implementation code, Using UML lessons learned by design of UML, Be Ready for Change pattern movement, Layers and Languages programming, Be Ready for Change programming knowledge linked to languages, Be Ready for Change purposes of UML, Using UML redesigning UML, Layers and Languages, Layers and Languages reusability and OOP, A Bit of Reusability 
security, Symmetric Relationships simplicity, Using UML simplifying UML, Using UML size of project determining usefulness of UML, Using UML SOA, A Bit of Reusability standardization of UML, Layers and Languages universal model/language, Using UML S Scala, Concurrency, Designing a Language SCOOP model, An Inspired Afternoon scoping, The Language of Revolutions SDL, UML, UML, UML seamless development, Proofreading Languages security of software formalisms of language affecting, Unix and Its Culture importance of, Symmetric Relationships language choice affecting, Theory and Practice multilevel integration affecting, Components, Sand, and Bricks with dynamic languages, The Good Programmer with Lua, The Power of Scripting with Python, The Good Programmer SEQUEL, A Seminal Paper service-oriented architecture (SOA), Components, Sand, and Bricks shared variables, Parallelism shell scripts, Unix and Its Culture simplicity advice for, Bits That Change the Universe of Forth, Application Design relationship to power, Power or Simplicity sketching tools, Expedients and Experience Smalltalk browser for, The Future of Computer Science incorporated in Objective-C, Growing a Language social engineering, The Role of the People Software and the Future of Programming Languages (Aho), Unix and Its Culture space insensitivity, The Goals Behind BASIC, Language Design specialization in programming, Layers and Languages specialization of labor, Components, Sand, and Bricks, Components, Sand, and Bricks, Education specifications distinct from implementation, An Inspired Afternoon SQL, SQL, A Seminal Paper–A Seminal Paper, A Seminal Paper, A Seminal Paper influencing future language design, The Language updates on indexes, The Language stack management, Application Design stack-based design, Designed to Last stack-based subroutine calls, The Forth Language and Language Design standardization of APL, Paper and Pencil of C#, C# of UML, Layers and Languages problems with, Standard Wishes 
static typing, The Pythonic Way statically checked interfaces, OOP and Concurrency Stroustrup academic pursuits of, Future C++0x FAQ, Future concurrency, OOP and Concurrency concurrency and network distribution, OOP and Concurrency creating a new language, Future industry connections of, Teaching lessons from design of C++, Future structured programming, Proofreading Languages Structured Programming (Dahl; Dijkstra; Hoare), An Inspired Afternoon superdistribution, Components, Sand, and Bricks, Education symmetric relationships, Symmetric Relationships–Symmetric Relationships, Symmetric Relationships, Symmetric Relationships System R project, A Seminal Paper systems wider not faster, Concurrency T tables, The Power of Scripting Tcl/Tk, Transformative Technologies teams of programming language designers, Bits That Change the Universe, A Functional Team, A Functional Team, A Functional Team, Feedback Loop, C#, UML, Designed to Last templates, OOP and Concurrency test cases, Learning and Teaching testing code, Experience Python, Multiple Pythons writing code to facilitate, Transformative Technologies The Design and Evolution of C++ (Stroustrup), Future The Design of APL (Falkoff; Iverson), Paper and Pencil The Elements of Programming Style (Kernighan), Breeding Little Languages The Formal Description of System 360 (Falkoff; Iverson; Sussenguth), Paper and Pencil The Practice of Programming (Kernighan; Pike), Breeding Little Languages theorems proving as purpose of ML, The Theory of Meaning with LCF and ML, The Soundness of Theorems with type system, The Theory of Meaning working on, Bits That Change the Universe, Programming by Example transformative technologies, Transformative Technologies–Transformative Technologies, Transformative Technologies, Transformative Technologies True BASIC, The Goals Behind BASIC, The Goals Behind BASIC, Language Design type checking, The Forth Language and Language Design type systems decidability of, The Soundness of Theorems in ML, The 
Theory of Meaning U ubiquitous systems, Beyond Informatics UML (Unified Modeling Language), UML, UML, UML backward compatibility with, Layers and Languages persuading people of benefits of, UML, UML, Using UML, UML purposes of, UML removing elements from, UML semantic definitions in, UML Unix, Unix and Its Culture use cases, Learning and Teaching user-created and built-in language elements, Elementary Principles users considering when programming, Language Design, Work Goals, Breeding Little Languages V van Rossum, Python dynamic typing, The Pythonic Way garbage collection in Python, Multiple Pythons interface or API design, Expedients and Experience learning Python, The Good Programmer macros in Python, Multiple Pythons programmers, The Pythonic Way recognizing good, The Good Programmer Pythonic, The Pythonic Way resuming programming, Expedients and Experience skills of, The Good Programmer static typing, The Pythonic Way testing Python code, Expedients and Experience visual applications, Language Design Visual Basic limitations of, Language Design usefulness of, Transformative Technologies visual programming languages, Creativity, Refinement, and Patterns W Wadler class system in Haskell, The Haskell Language language design influencing software design, The Haskell Language Wall, Perl complexity of languages, Language CPAN, Community languages compared to tools, Language languages moving from specialized to general-purpose, Language transition of Perl from text tool to complete language, The Language of Revolutions Warnock, PostScript font building for PostScript, Designed to Last web, Standard Wishes website resources C++ Standards Committee, Future Weinberger, AWK AWK compared to SQL, Bits That Change the Universe C, Waiting for a Breakthrough creativity in programmers, Bits That Change the Universe error messages, Theory and Practice extensible languages, Waiting for a Breakthrough functional programming, Bits That Change the Universe general-purpose 
languages, Waiting for a Breakthrough implementation affecting language design, Theory and Practice language design, Theory and Practice, Waiting for a Breakthrough, Waiting for a Breakthrough, Waiting for a Breakthrough, Waiting for a Breakthrough, Programming by Example large programs in AWK, Waiting for a Breakthrough learning new things on Internet, Bits That Change the Universe Lisp, Waiting for a Breakthrough little programs, Bits That Change the Universe mathematics, Bits That Change the Universe mistakes made by, Bits That Change the Universe objects compared to system components, Theory and Practice problems in software, Programming by Example programming, Bits That Change the Universe programming by example, Theory and Practice programming language design, Theory and Practice, Theory and Practice programs rewriting, Waiting for a Breakthrough security, Theory and Practice simplicity, Bits That Change the Universe success, Waiting for a Breakthrough teaching debugging, Bits That Change the Universe teaching programming, Theory and Practice whitespace insensitivity, Language Design WYSIWYG editors, Language Design X X Window system, Legacy Culture XML, XQuery and XML XQuery, XQuery and XML Y yacc as transformative technology, Legacy Culture Yahoo!

code reuse, Compiler Design collections design implications of, Parallelism large unstructured, Parallelism operations on each element of, Elementary Principles color, Designed to Last colorForth, The Forth Language and Language Design command line AWK used with, Unix and Its Culture compared to graphical interface, Designing a New Language composing programs on, Unix and Its Culture limitations of, Bits That Change the Universe resurgence of, Legacy Culture tools for, Unix and Its Culture comments, Application Design, The Theory of Meaning in BASIC, Compiler Design in C#, C# role of, Experience communication among interactive agents, Beyond Informatics role in informatics, Beyond Informatics compilers quality of code in, Application Design writing, The Forth Language and Language Design, Compiler Design, The Role of Documentation completeness, The Language complex algorithms, The Life of Algorithms components, Objective-C and Other Languages, Objective-C and Other Languages, Components, Sand, and Bricks, Components, Sand, and Bricks, Components, Sand, and Bricks, Components, Sand, and Bricks, Components, Sand, and Bricks, Quality As an Economic Phenomenon, Growing a Language computer science current problems in, The Future of Computer Science future of, Interfaces to Longevity problems of, Beyond Informatics, Components, Sand, and Bricks research in, Beyond Informatics role of mathematics in, Elementary Principles, Computer Science, Experience, Beyond Informatics whether it is a science, Experience, Quality As an Economic Phenomenon computer science education approaches for, Bits That Change the Universe, Spreading (Functional) Education beginning programming, The Pythonic Way, Elementary Principles, The Goals Behind BASIC, Compiler Design functional languages, Spreading (Functional) Education multiple languages, Language and Programming Practice, Breeding Little Languages teaching languages, The Goals Behind BASIC, The Goals Behind BASIC, Language and Design 
teamwork in, Unix and Its Culture–The Role of Documentation, The Role of Documentation, The Role of Documentation, The Role of Documentation topics needed in, Education and Training, Be Ready for Change concurrency, Creativity, Refinement, and Patterns adding to language, Language and Design analyzing concurrent systems, The Soundness of Theorems approaches for, Hardware challenges of, The Future of Computer Science, The Future of Computer Science design affected by, Concurrency framework handling, The Future of Computer Science functional languages and, A Bit of Reusability in C++, OOP and Concurrency in C++0x, OOP and Concurrency in Lua, The Power of Scripting in Python, Multiple Pythons in SQL, The Language language design affected by, The Future of Computer Science network distribution and, OOP and Concurrency OOP and, OOP and Concurrency, Objective-C and Other Languages, A Bit of Reusability pattern matching using, Computer Science requirements for, Concurrency conditionals, Application Design consistency, Feedback and Evolution constraints, Creativity, Refinement, and Patterns cooperative multithreading, Hardware Corasick, Computer Science Cox, Objective-C components, Objective-C and Other Languages, Components, Sand, and Bricks–Quality As an Economic Phenomenon, Components, Sand, and Bricks, Components, Sand, and Bricks, Components, Sand, and Bricks, Components, Sand, and Bricks, Quality As an Economic Phenomenon concurrency and OOP, Objective-C and Other Languages configurability, Objective-C and Other Languages educational background of, Education encapsulation, Components, Sand, and Bricks garbage collection, Objective-C and Other Languages lessons learned from design of Objective-C, Objective-C and Other Languages, Components, Sand, and Bricks lightweight threads, Components, Sand, and Bricks multiple inheritance, Objective-C and Other Languages namespaces not supported in Objective-C, Objective-C and Other Languages Objective-C as extension of C and 
Smalltalk, Objective-C and Other Languages, Objective-C and Other Languages Objective-C compared to C++, Objective-C and Other Languages OOP increasing complexity of applications, Objective-C and Other Languages quality of software, Quality As an Economic Phenomenon, Quality As an Economic Phenomenon, Quality As an Economic Phenomenon, Education security of software, Components, Sand, and Bricks single inheritance in Objective-C, Objective-C and Other Languages superdistribution, Components, Sand, and Bricks, Education trusting software, Components, Sand, and Bricks CPAN, Community, Community creative arts, Training Developers creativity as role of programmer, Growing a Language importance of, Learning and Teaching in programming, Language Design necessity of, Creativity, Refinement, and Patterns opportunity to use, Knowledge stimulating in programmers, Bits That Change the Universe tension from, Language Design customer vocabulary, The Forth Language and Language Design D Dahl, An Inspired Afternoon data models, A Seminal Paper data sizes, Programming by Example debugging code C#, C# design considerations for, Designing a Language ease of, Language Design, Designing a New Language functional programming and, Trajectory of Functional Programming language design considerations for, Theory and Practice, Growing a Language Lua, Language Design PostScript, Interfaces to Longevity Python, Multiple Pythons debugging languages, The Pythonic Way declarations, Parallelism Design by Contract, An Inspired Afternoon, An Inspired Afternoon design patterns, Creativity, Refinement, and Patterns, Creativity, Refinement, and Patterns Design Patterns: Elements of Reusable Object-Oriented Software (Gamma; Helm; Johnson; Vlissides), Layers and Languages determinism, Feedback and Evolution Dijkstra, An Inspired Afternoon documentation of programming language, Breeding Little Languages documentation of programs comments, Application Design, Experience, C# content of, The Theory of 
Meaning importance of, Programming by Example leading to better software design, The Role of Documentation programmers writing, C# domain-driven design, The Forth Language and Language Design, Language Design, Language Design, Unix and Its Culture, Concurrency domain-specific languages (DSL), Growing a Language disadvantages of, Language and Design, Growing a Language existence of, C# growth of, Breeding Little Languages Lua used as, Language Design moving to general-purpose, Language programs as, Elementary Principles UML redesigned as set of, UML, UML dynamic languages benefits of, The Good Programmer security and, The Good Programmer dynamic typing, The Pythonic Way E ECMA standardization for C#, C# economic model of software, Components, Sand, and Bricks, Components, Sand, and Bricks, Quality As an Economic Phenomenon, Quality As an Economic Phenomenon, Quality As an Economic Phenomenon, Quality As an Economic Phenomenon Eiffel, Eiffel adding features to, Managing Growth and Evolution backward compatibility for, Managing Growth and Evolution evolution of, Managing Growth and Evolution–Managing Growth and Evolution, Managing Growth and Evolution, Managing Growth and Evolution, Managing Growth and Evolution, Managing Growth and Evolution extensibility of, Reusability and Genericity forward compatibility for, Managing Growth and Evolution history of, An Inspired Afternoon–An Inspired Afternoon, An Inspired Afternoon, An Inspired Afternoon, An Inspired Afternoon, An Inspired Afternoon information hiding, Reusability and Genericity proofs in, Proofreading Languages reusability of, Reusability and Genericity streaming serialization, Proofreading Languages embedded applications Forth for, The Forth Language and Language Design emergent systems, Layers and Languages encapsulation, Components, Sand, and Bricks advantages of, Language Design in BASIC, The Goals Behind BASIC engineering links to informatics, Beyond Informatics programming as, Learning and Teaching error 
messages in Lua, Language Design quality of, Theory and Practice errors handling, Trajectory of Functional Programming language design reducing number of, Theory and Practice reduced by language design, Theory and Practice Excel, Feedback and Evolution extensibility, The Language F Falkoff, APL collections, Parallelism, Parallelism, Parallelism computer science, Elementary Principles design of APL longevity of, Paper and Pencil language design, Elementary Principles language design influencing program design, Elementary Principles parallelism, Parallelism–Legacy, Legacy Perl influenced by APL, Legacy pointers not used in APL, Parallelism programmers, Paper and Pencil relational database design influenced by APL, Parallelism resources, Paper and Pencil The Design of APL, Paper and Pencil Figueiredo, Lua comments, Experience design of Lua, Experience dialects of users, Language Design environments changing design of Lua, Language Design error messages in Lua, Language Design hardware availability, Experience limited resources, Language Design local workarounds versus global fixes in code, Language Design mathematics, Experience mistakes in Lua, Experience programming in Lua, The Power of Scripting programming language design, Language Design–Language Design, Language Design, Language Design, Language Design, Language Design, Language Design, Language Design regrets about Lua, Experience security capabilities of Lua, The Power of Scripting success, Experience teaching debugging, Experience testing Lua, Language Design VM for Lua, Language Design file, Unix and Its Culture file handling, Parallelism first-class functions, The Power of Scripting font scaling in PostScript, Designed to Last for loop, Experience formal semantics benefits for language design, A Seminal Paper not used for PostScript, Designed to Last usefulness of, Formalism and Evolution–Formalism and Evolution, Formalism and Evolution, Formalism and Evolution, Formalism and Evolution formal specifications 
for C#, C# for languages, Language Design necessity of, Designing a Language Forth, Forth, The Forth Language and Language Design application design with, Application Design–Application Design, Application Design, Application Design, Application Design, Application Design, Application Design asynchronous operation, Hardware comparing to PostScript, Designed to Last conditionals in, Application Design design of longevity of, The Forth Language and Language Design error causes and detection, The Forth Language and Language Design, Application Design for embedded applications, The Forth Language and Language Design I/O capabilities of, Hardware indirect-threaded code, The Forth Language and Language Design loops in, Application Design maintainability of, Application Design minimalism in design of, The Forth Language and Language Design porting, Hardware programmers receptive to, The Forth Language and Language Design programming in, Application Design readability of, The Forth Language and Language Design, The Forth Language and Language Design reusable concepts of meaning with, The Soundness of Theorems simplicity of, The Forth Language and Language Design, The Forth Language and Language Design, Application Design syntax of small words, The Forth Language and Language Design, The Forth Language and Language Design word choice in, Application Design fourth-generation computer language, The Forth Language and Language Design frameworks, Knowledge functional closures, The Haskell Language functional programming, Trajectory of Functional Programming–The Haskell Language, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming, The Haskell Language abstraction in, Trajectory of Functional Programming concurrency and, A Bit of Reusability debugging in, Trajectory of Functional Programming error handling 
in, Trajectory of Functional Programming longevity of, Trajectory of Functional Programming parallelism and, Trajectory of Functional Programming popularity of, Trajectory of Functional Programming Scala for, Concurrency side effects, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming usefulness of, Bits That Change the Universe functions first class, The Power of Scripting higher-order, The Soundness of Theorems G garbage collection, Objective-C and Other Languages in JVM, Designing a Language in Lua, The Power of Scripting in Objective-C, Objective-C and Other Languages in Python, Multiple Pythons general arrays, Elementary Principles general-purpose languages, Growing a Language generic programming as alternative to OOP, OOP and Concurrency generic types, The Haskell Language genericity, Reusability and Genericity generics in Java, The Haskell Language Geschke, PostScript bugs in ROM, Designed to Last computer science, Research and Education concatenative language, Designed to Last design team for PostScript, Designed to Last hardware considerations, Designed to Last, Interfaces to Longevity history of software and hardware evolution, Research and Education Imaging Sciences Laboratory, Research and Education kerning and ligatures in PostScript, Designed to Last longevity of programming languages, Interfaces to Longevity mathematical background, Designed to Last popularity of languages, Interfaces to Longevity PostScript as language instead of data format, Designed to Last programmer skill, Designed to Last two-dimensional constructs, Designed to Last web use of PostScript, Standard Wishes Gosling adding to Java, Feedback Loop array subscript checking in Java, Power or Simplicity AWT, Power or Simplicity backward compatibility with Java, Power or Simplicity C stacks, Designing a Language C# inspired by Java, Designing a Language complexity, Power or Simplicity complexity of Java, Power or Simplicity 
computer science, Designing a Language concurrency, Concurrency–Designing a Language, Designing a Language debugging, Designing a Language documentation, Designing a Language error prevention and containment in Java, A Matter of Taste, Designing a Language formal specifications, Designing a Language freeing source code to Java, Feedback Loop garbage collection, Designing a Language Java EE, Power or Simplicity JIT, Power or Simplicity JVM satisfaction with, A Matter of Taste language design affected by network issues, Power or Simplicity language design influencing software design, Designing a Language language designed for personal use of, Designing a Language languages designed by, A Matter of Taste Moore’s Law, Concurrency performance, A Matter of Taste pointers in C++, Designing a Language programmers, Designing a Language references in Java, Designing a Language Scala, Concurrency, Designing a Language simplicity and power, Power or Simplicity system programming languages, Power or Simplicity user feedback for Java, Designing a Language virtual machine for Java, A Matter of Taste GOTO statements, The Goals Behind BASIC, The Goals Behind BASIC graphical interface limitations of, Unix and Its Culture H half-toning for color, Designed to Last Halloween problem, The Language handheld devices, Power or Simplicity hardware availability of, The Life of Algorithms, Breeding Little Languages, Programming by Example computational power of, Hardware considerations for, Designed to Last innovation driven by, Interfaces to Longevity predicting future of, Engineering Objective-C, Engineering Objective-C requirements for concurrency, Hardware viewing as a resource or a limit, Hardware Haskell, Haskell class system for, The Haskell Language competing implementations of, Formalism and Evolution evolution of, Formalism and Evolution–Formalism and Evolution, Formalism and Evolution, Formalism and Evolution influencing other languages, The Haskell Language list comprehensions, 
The Haskell Language team designing, A Functional Team–Trajectory of Functional Programming, Trajectory of Functional Programming type system for, The Haskell Language, The Haskell Language, The Haskell Language, The Haskell Language, The Theory of Meaning Hejlsberg, C# backward compatibility with JVM, Language and Design comments in C#, C# computer science, The Future of Computer Science debugging, Growing a Language domain-specific languages, Growing a Language, The Future of Computer Science dynamic programming languages, The Future of Computer Science higher-order functions, Language and Design, The Future of Computer Science implementing and designing languages, Language and Design language design, Growing a Language leveraging existing components, Growing a Language personal themes in language design, Language and Design programmers, The Future of Computer Science programming language design, Language and Design–Growing a Language, Language and Design, Language and Design, Language and Design, Language and Design, Language and Design, Growing a Language, Growing a Language safety versus creative freedom, Growing a Language simplicity in language design, Growing a Language teaching languages, Language and Design higher-order functions, The Future of Computer Science higher-order functions in ML, The Soundness of Theorems Hoare, An Inspired Afternoon HOPL-III: The development of the Emerald programming language, OOP and Concurrency HTML, Standard Wishes Hudak functional programming, Trajectory of Functional Programming–The Haskell Language, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming, The Haskell Language Haskell’s influence on other languages, The Haskell Language language design influencing software design, The Haskell Language teaching programming and computer science, Spreading (Functional) Education Hughes 
functional programming, Trajectory of Functional Programming–The Haskell Language, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming, Trajectory of Functional Programming, The Haskell Language hybrid typing, The Pythonic Way I I/O, Hardware Ierusalimschy, Lua closures in Lua, The Power of Scripting code sharing with Lua, Language Design comments, Experience computer science, Experience concurrency with Lua, The Power of Scripting debugging Lua, Language Design extensibility of Lua, Language Design feature set complete for Lua, Language Design first-class functions in Lua, The Power of Scripting fragmentation issues with Lua, Language Design implementation of language affecting design of, Language Design limitations of Lua, The Power of Scripting limited resources, Language Design number handling by Lua, The Power of Scripting programmers, Experience simplicity of Lua, Language Design success, Experience upgrading Lua during development, Language Design user feedback on Lua, Language Design VM for Lua, Language Design implementation, An Inspired Afternoon indirect-threaded code, The Forth Language and Language Design informatics definition of, Beyond Informatics inheritance, Compiler Design injection attacks, Feedback and Evolution intelligent agents for programming, Knowledge interface design, Expedients and Experience, Transformative Technologies Internet as representation of agents, Beyond Informatics Iverson, APL J Jacobson, UML benefits of UML, UML, UML complexity of UML, UML, UML computer science, Learning and Teaching designing UML, UML DSLs, UML Ericsson, Learning and Teaching future possible changes to UML, UML implementation code, UML legacy software, The Role of the People Object-Oriented Software Engineering, Learning and Teaching programming, Learning and Teaching, Knowledge programming approaches in different parts of the world, Learning and Teaching programming knowledge linked to 
languages, UML programming methods and processes, The Role of the People SDL influencing improvements to UML, UML simplicity, Knowledge size of project determining usefulness of UML, UML social engineering, The Role of the People teams for programming, Learning and Teaching use cases, Learning and Teaching Java, Java AWT and, Power or Simplicity Java EE, Power or Simplicity Javadoc tool, Designing a Language JavaScript, Interfaces to Longevity, Standard Wishes JIT, Power or Simplicity Jones formal semantics, Formalism and Evolution functional programming, Trajectory of Functional Programming, Trajectory of Functional Programming teaching computer science, Spreading (Functional) Education JVM new languages built on, Designing a Language popularity of, A Matter of Taste K kanji characters, Designed to Last Kemeny, BASIC Kernighan, AWK backward compatibility versus innovation, Legacy Culture C, Legacy Culture C++, Legacy Culture command line, Legacy Culture domain-specific languages (DSL), Breeding Little Languages hardware availability, Breeding Little Languages implementation considerations for language design, Legacy Culture language design style of, Language Design large systems, Designing a New Language learning programming languages, Computer Science little languages, Legacy Culture OOP, Designing a New Language programmers, Breeding Little Languages programming first interest in, Breeding Little Languages programming language manuals, Breeding Little Languages programming languages, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language rewriting programs, Legacy Culture success, Breeding Little Languages Tcl/Tk, Transformative Technologies teaching debugging, Breeding Little Languages testing, Transformative Technologies upgrading, Transformative Technologies user considerations in programming, Breeding Little Languages Visual Basic, Transformative Technologies writing text, Breeding Little Languages kerning, 
Designed to Last knowledge transfer, Learning and Teaching, The Role of the People, Knowledge, Be Ready for Change, Be Ready for Change Kurtz, BASIC algebraic language, The Goals Behind BASIC comments in BASIC, Language and Programming Practice compilers, Compiler Design debugging code, Language Design design of BASIC, The Goals Behind BASIC, Compiler Design encapsulation, Language Design language design influencing program design, Language and Programming Practice learning programming, The Goals Behind BASIC libraries, Language Design, Language Design mathematical formalism, Language Design OOP, Language and Programming Practice polymorphism, Compiler Design productivity when programming, Work Goals programming languages, Language and Programming Practice simplicity of languages, The Goals Behind BASIC single-pass compiler for BASIC, Compiler Design success in programming, Work Goals teaching programming, Compiler Design True BASIC, The Goals Behind BASIC users, Work Goals, Work Goals visual and audio applications, Language Design Visual Basic, Language Design Visual Basic as object-oriented language, Language Design words used in languages, Language Design L language toolkit, The Forth Language and Language Design lazy evaluation, Trajectory of Functional Programming, The Haskell Language LCF, The Soundness of Theorems limits of, The Soundness of Theorems legacy software, Bits That Change the Universe, Theory and Practice approaches for, Project Management and Legacy Software, The Role of the People, Training Developers preventing problems of, Project Management and Legacy Software, Components, Sand, and Bricks problems of, Hardware less is more philosophy, Expedients and Experience levels of abstraction, Using UML lex as transformative technologies, Transformative Technologies lexical scoping, Language libraries as method for extending languages, Unix and Its Culture design of, Unix and Its Culture ligatures, Designed to Last lightweight threads, Components, 
Sand, and Bricks line numbers in BASIC, The Goals Behind BASIC, Language Design Lisp level of success of, Waiting for a Breakthrough list comprehensions, The Haskell Language little languages, Growing a Language loops alternatives to, Elementary Principles in Forth, Application Design Love, Objective-C appropriate uses of Smalltalk, Engineering Objective-C classes, Project Management and Legacy Software distributed teams, Project Management and Legacy Software hardware, Engineering Objective-C, Engineering Objective-C languages new, Growing a Language legacy software, Project Management and Legacy Software maintaining software, Project Management and Legacy Software managers understanding of languages, Project Management and Legacy Software Objective-C as extension of C and Smalltalk, Growing a Language Objective-C compared to C++, Engineering Objective-C programmers advice for, Project Management and Legacy Software programming, Engineering Objective-C real-life experience, Education and Training simplicity in design, Project Management and Legacy Software success of a project, Project Management and Legacy Software teaching complex technical concepts, Education and Training uses of Objective-C, Engineering Objective-C Lua, Lua, The Power of Scripting feedback from users regarding, Language Design platform independence of, Language Design resources used by, Experience testing features of, Language Design VM choice of ANSI C for, Language Design debugging affected by, Language Design register-based, Language Design M M language, Language Design, Creativity, Refinement, and Patterns Make utility, Transformative Technologies mathematical formalism in language design, Language Design pipes used for, Unix and Its Culture mathematicians, Waiting for a Breakthrough, Experience mathematics importance of learning, Theory and Practice role in computer science, Elementary Principles, Computer Science, Bits That Change the Universe, Experience, Beyond Informatics 
metalanguages for models, The Soundness of Theorems Méthodes de Programmation (Meyer), An Inspired Afternoon Meyer, Eiffel analysis required before implementation, Proofreading Languages concurrency and OOP, An Inspired Afternoon Design by Contract, An Inspired Afternoon, An Inspired Afternoon genericity, Reusability and Genericity information hiding in Eiffel, Reusability and Genericity language design, Proofreading Languages languages influencing programs, An Inspired Afternoon mathematical versus linguistic perspective for programming, Proofreading Languages multilingual background of, Proofreading Languages objects, An Inspired Afternoon philosophies of programming, An Inspired Afternoon program provability, Proofreading Languages reusability, Reusability and Genericity seamless development, Proofreading Languages small versus large programs, Proofreading Languages specification and implementation, An Inspired Afternoon structured versus OO programming, Proofreading Languages microprocessors, Application Design millennium bug, The Theory of Meaning Milner, ML bugs, The Soundness of Theorems communication among agents, Beyond Informatics computer science, Beyond Informatics concurrent systems, The Soundness of Theorems defining as informatic scientist, Beyond Informatics informatics, Beyond Informatics language design, The Theory of Meaning language design influencing program design, The Soundness of Theorems languages specific to each programmer, The Theory of Meaning levels of models, The Soundness of Theorems logic expressed by ML, The Soundness of Theorems mathematics, Beyond Informatics paradigms, The Theory of Meaning programs, The Theory of Meaning purpose of ML, The Theory of Meaning structural problems in programs, The Theory of Meaning teaching theorems and provability, The Soundness of Theorems theory of meaning, Beyond Informatics ubiquitous systems, Beyond Informatics undecidability in lower levels of models, The Soundness of Theorems minimalism, The
Forth Language and Language Design ML, ML formal specification of, The Theory of Meaning role of, The Soundness of Theorems type system for, The Theory of Meaning model-driven development, Proofreading Languages models for systems, The Soundness of Theorems, The Soundness of Theorems, The Soundness of Theorems, The Soundness of Theorems, The Soundness of Theorems, The Soundness of Theorems, Beyond Informatics Moore, Forth concurrency, Hardware elegant solutions, The Forth Language and Language Design indirect-threaded code in Forth, The Forth Language and Language Design language design, Application Design legacy software, Application Design operating systems, The Forth Language and Language Design parallel processing, The Forth Language and Language Design resuming programming after a hiatus, The Forth Language and Language Design stack, Hardware teamwork in programming, Application Design words, The Forth Language and Language Design, The Forth Language and Language Design, Application Design Moore’s Law, Concurrency multicore computers, Application Design multiple paradigms in Python, The Pythonic Way multithreading as precursor to parallel processing, The Forth Language and Language Design cooperative, Hardware Java frameworks for, Concurrency mathematical software and, Concurrency problems in C++ with, Designing a Language synchronization primitives for, Concurrency music, Education and Training, Growing a Language, Training Developers N namespaces in APL, Parallelism Objective-C not supporting, Objective-C and Other Languages National Instruments LabVIEW, Creativity, Refinement, and Patterns NetBeans, Designing a Language networked small computers, Application Design networks distribution of, OOP and Concurrency influencing software design, Quality As an Economic Phenomenon SOAs and, Components, Sand, and Bricks superdistribution and, Components, Sand, and Bricks Ng, Learning and Teaching number handling in BASIC, The Goals Behind BASIC in Lua, The Power of
Scripting in Python, The Pythonic Way O object-oriented programming (OOP) concurrency and, OOP and Concurrency, A Bit of Reusability correct design influenced by, Creativity, Refinement, and Patterns generic programming as alternative to, OOP and Concurrency good design using, OOP and Concurrency limited applications of, Growing a Language objects handled outside of language, An Inspired Afternoon reusability and, A Bit of Reusability scalability of, A Bit of Reusability, Creativity, Refinement, and Patterns success of, A Matter of Taste usefulness of, Language and Programming Practice, Designing a New Language uses of, Proofreading Languages using well, A Matter of Taste with Visual Basic, Language and Programming Practice Objective-C, Objective-C single inheritance, Objective-C and Other Languages objects, Theory and Practice open source model, Quality As an Economic Phenomenon open source projects, Interfaces to Longevity open standards, Interfaces to Longevity operating systems, The Forth Language and Language Design, Hardware Oracle, A Seminal Paper orthogonality, Feedback and Evolution P parallel processing, The Forth Language and Language Design parallelism in APL, Elementary Principles–Legacy, Parallelism, Parallelism, Parallelism, Legacy uses of, Components, Sand, and Bricks parser for Lua, Language Design patch utility, Transformative Technologies pattern matching algorithms for, Computer Science evolution of, The Life of Algorithms pattern movement, Be Ready for Change, Layers and Languages patterns, Creativity, Refinement, and Patterns, Creativity, Refinement, and Patterns PEP (Python Enhancement Proposal), The Pythonic Way performance of BASIC, The Goals Behind BASIC practical implications of, A Matter of Taste Perl, Perl APL influencing, Parallelism community participation in, Community–Evolution and Revolution, Community, Community, Community, Evolution and Revolution context in, Language–Language, Language, Language, Language CPAN for, Community 
dual licensing, Community evolution of, Language, Language, Evolution and Revolution, Evolution and Revolution, Evolution and Revolution, Evolution and Revolution human language principles influencing, The Language of Revolutions, Language multiple ways of doing something, The Language of Revolutions purposes of, The Language of Revolutions scoping in, The Language of Revolutions syncretic design of, Language transition from text tool to complete language, The Language of Revolutions version 6, The Language of Revolutions, Evolution and Revolution, Evolution and Revolution Peters, The Pythonic Way, The Good Programmer physical processes, The Soundness of Theorems pi calculus, The Soundness of Theorems Pike, Breeding Little Languages pointers compiler handling, Compiler Design polyglot virtual machines, Language and Design polymorphism, Compiler Design postfix operators, The Forth Language and Language Design, The Forth Language and Language Design PostScript, PostScript as concatenative language, Designed to Last design decisions for, Designed to Last fonts, Designed to Last for Apple graphics imaging model, Designed to Last for NeXT graphics imaging model, Designed to Last formal semantics not used for, Designed to Last future evolution of, Designed to Last JavaScript interface, Interfaces to Longevity kerning in, Designed to Last print imaging models, Designed to Last purposes of, Designed to Last writing by hand, Designed to Last pragmatism and creativity, Language Design productivity of programmers language affecting, Growing a Language programmer quality affecting, Project Management and Legacy Software programming language affecting, Theory and Practice when working alone, Work Goals productivity of users, The Language, Feedback and Evolution, Feedback and Evolution programmers all levels of, The Pythonic Way good, The Pythonic Way, Application Design, Research and Education hiring, The Good Programmer improving skills of, Bits That Change the Universe 
knowledge of, Knowledge, Be Ready for Change paradigms influencing, The Theory of Meaning productivity of, Programming by Example recognizing good, Experience, Education and Training teams of Design by Contract helping, An Inspired Afternoon distributed, Project Management and Legacy Software education for, Designing a Language effectiveness of, Creativity, Refinement, and Patterns importance of, Application Design in classroom, Unix and Its Culture, The Role of Documentation, The Role of Documentation skills required for, Education and Training users as, Knowledge, Be Ready for Change programming analysis in preparation for, Compiler Design, Proofreading Languages approaches to, Learning and Teaching as engineering, Learning and Teaching by example, Theory and Practice, Programming by Example, Programming by Example compared to language design, Theory and Practice compared to mathematical theorems work, Bits That Change the Universe, Programming by Example compared to writing text, Breeding Little Languages components in, Objective-C and Other Languages, Components, Sand, and Bricks, Components, Sand, and Bricks, Components, Sand, and Bricks hardware availability affecting, Programming by Example linguistic perspective of, Proofreading Languages mathematical perspective of, Proofreading Languages nature of, Hardware resuming after a hiatus, The Role of Documentation users, Breeding Little Languages programming language design, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Designing a New Language, Legacy Culture programming languages adding features to, Language and Design evolution of, Future, The Pythonic Way, Language Design, Engineering Objective-C, Growing a Language, Growing a Language, Growing a Language, Growing a Language, Growing a Language experiments of, Language extensibility of, Expedients and Experience, Unix and 
Its Culture, Waiting for a Breakthrough, Growing a Language families of, The Theory of Meaning general-purpose, Designing a New Language, Waiting for a Breakthrough growth of, Feedback Loop implementation of, Experience interface for, Transformative Technologies linguistics as influence on, Language little making more general, Legacy Culture, Waiting for a Breakthrough, Growing a Language resurgence of, Legacy Culture longevity of, Unix and Its Culture new, Growing a Language number of in use, Growing a Language productivity affected by, Theory and Practice, Growing a Language safety of, Growing a Language size of, Unix and Its Culture strengths of, Designing a New Language teaching languages, Language and Design testing new features of, Designing a New Language theory of meaning for, Beyond Informatics usability of, Using UML validating, Beyond Informatics programs as domain-specific languages, Elementary Principles beauty or elegance of, Language Design complexity of, A Bit of Reusability, Creativity, Refinement, and Patterns computer’s ability to state meaning of, The Theory of Meaning legacy, Project Management and Legacy Software local workarounds versus global fixes, Bits That Change the Universe maintainability of, Bits That Change the Universe, Waiting for a Breakthrough, Education and Training, Project Management and Legacy Software performance of, Programming by Example problems in, Programming by Example revising heavily before shipping, Transformative Technologies rewriting, Waiting for a Breakthrough size of, Designing a New Language written in 1970s, Hardware protocols, Objective-C and Other Languages provability, The Soundness of Theorems, The Soundness of Theorems proving theorems, The Soundness of Theorems, The Theory of Meaning, The Theory of Meaning Python, Python adding features to, The Pythonic Way, The Pythonic Way, The Pythonic Way bottom-up versus top-down design, The Good Programmer concurrency with, Multiple Pythons design process using, 
The Good Programmer dynamic features of, The Good Programmer elegance philosophy for, The Pythonic Way, The Good Programmer experts using, The Pythonic Way garbage collection, Multiple Pythons lessons learned from design of, Expedients and Experience macros in, Multiple Pythons maintainability of, The Good Programmer multiple implementations of, Multiple Pythons–Multiple Pythons, Multiple Pythons, Multiple Pythons, Multiple Pythons multiple paradigms in, The Pythonic Way new versions of, The Pythonic Way novices using, The Pythonic Way prototyping uses of, The Good Programmer searching large code bases, Expedients and Experience security of, The Good Programmer simple parser used by, Multiple Pythons strict formatting in, Multiple Pythons type of programmers using, The Pythonic Way Python 3000, The Good Programmer Python Enhancement Proposal (PEP), The Pythonic Way Pythonic, The Pythonic Way Q Quill, Feedback and Evolution R RAD (rapid application development), Future readability, The Forth Language and Language Design refactoring, OOP and Concurrency Reisner, Feedback and Evolution relational databases, Parallelism research groups, Research and Education, Research and Education resilience, Feedback and Evolution resources limited, Experience reusability, Reusability and Genericity and OOP, A Bit of Reusability, A Bit of Reusability and SOA, A Bit of Reusability rule-based technology, Knowledge Rumbaugh, UML background of, Be Ready for Change benefits of UML, Using UML change, Symmetric Relationships communication facilitated by UML, Using UML computer science, Be Ready for Change concurrency, A Bit of Reusability implementation code, Using UML lessons learned by design of UML, Be Ready for Change pattern movement, Layers and Languages programming, Be Ready for Change programming knowledge linked to languages, Be Ready for Change purposes of UML, Using UML redesigning UML, Layers and Languages, Layers and Languages reusability and OOP, A Bit of Reusability 
security, Symmetric Relationships simplicity, Using UML simplifying UML, Using UML size of project determining usefulness of UML, Using UML SOA, A Bit of Reusability standardization of UML, Layers and Languages universal model/language, Using UML S Scala, Concurrency, Designing a Language SCOOP model, An Inspired Afternoon scoping, The Language of Revolutions SDL, UML, UML, UML seamless development, Proofreading Languages security of software formalisms of language affecting, Unix and Its Culture importance of, Symmetric Relationships language choice affecting, Theory and Practice multilevel integration affecting, Components, Sand, and Bricks with dynamic languages, The Good Programmer with Lua, The Power of Scripting with Python, The Good Programmer SEQUEL, A Seminal Paper service-oriented architecture (SOA), Components, Sand, and Bricks shared variables, Parallelism shell scripts, Unix and Its Culture simplicity advice for, Bits That Change the Universe of Forth, Application Design relationship to power, Power or Simplicity sketching tools, Expedients and Experience Smalltalk browser for, The Future of Computer Science incorporated in Objective-C, Growing a Language social engineering, The Role of the People Software and the Future of Programming Languages (Aho), Unix and Its Culture space insensitivity, The Goals Behind BASIC, Language Design specialization in programming, Layers and Languages specialization of labor, Components, Sand, and Bricks, Components, Sand, and Bricks, Education specifications distinct from implementation, An Inspired Afternoon SQL, SQL, A Seminal Paper–A Seminal Paper, A Seminal Paper, A Seminal Paper influencing future language design, The Language updates on indexes, The Language stack management, Application Design stack-based design, Designed to Last stack-based subroutine calls, The Forth Language and Language Design standardization of APL, Paper and Pencil of C#, C# of UML, Layers and Languages problems with, Standard Wishes 
static typing, The Pythonic Way statically checked interfaces, OOP and Concurrency Stroustrup academic pursuits of, Future C++0x FAQ, Future concurrency, OOP and Concurrency concurrency and network distribution, OOP and Concurrency creating a new language, Future industry connections of, Teaching lessons from design of C++, Future structured programming, Proofreading Languages Structured Programming (Dahl; Dijkstra; Hoare), An Inspired Afternoon superdistribution, Components, Sand, and Bricks, Education symmetric relationships, Symmetric Relationships–Symmetric Relationships, Symmetric Relationships, Symmetric Relationships System R project, A Seminal Paper systems wider not faster, Concurrency T tables, The Power of Scripting Tcl/Tk, Transformative Technologies teams of programming language designers, Bits That Change the Universe, A Functional Team, A Functional Team, A Functional Team, Feedback Loop, C#, UML, Designed to Last templates, OOP and Concurrency test cases, Learning and Teaching testing code, Experience Python, Multiple Pythons writing code to facilitate, Transformative Technologies The Design and Evolution of C++ (Stroustrup), Future The Design of APL (Falkoff; Iverson), Paper and Pencil The Elements of Programming Style (Kernighan), Breeding Little Languages The Formal Description of System 360 (Falkoff; Iverson; Sussenguth), Paper and Pencil The Practice of Programming (Kernighan; Pike), Breeding Little Languages theorems proving as purpose of ML, The Theory of Meaning with LCF and ML, The Soundness of Theorems with type system, The Theory of Meaning working on, Bits That Change the Universe, Programming by Example transformative technologies, Transformative Technologies–Transformative Technologies, Transformative Technologies, Transformative Technologies True BASIC, The Goals Behind BASIC, The Goals Behind BASIC, Language Design type checking, The Forth Language and Language Design type systems decidability of, The Soundness of Theorems in ML, The 
Theory of Meaning U ubiquitous systems, Beyond Informatics UML (Unified Modeling Language), UML, UML, UML backward compatibility with, Layers and Languages persuading people of benefits of, UML, UML, Using UML, UML purposes of, UML removing elements from, UML semantic definitions in, UML Unix, Unix and Its Culture use cases, Learning and Teaching user-created and built-in language elements, Elementary Principles users considering when programming, Language Design, Work Goals, Breeding Little Languages V van Rossum, Python dynamic typing, The Pythonic Way garbage collection in Python, Multiple Pythons interface or API design, Expedients and Experience learning Python, The Good Programmer macros in Python, Multiple Pythons programmers, The Pythonic Way recognizing good, The Good Programmer Pythonic, The Pythonic Way resuming programming, Expedients and Experience skills of, The Good Programmer static typing, The Pythonic Way testing Python code, Expedients and Experience visual applications, Language Design Visual Basic limitations of, Language Design usefulness of, Transformative Technologies visual programming languages, Creativity, Refinement, and Patterns W Wadler class system in Haskell, The Haskell Language language design influencing software design, The Haskell Language Wall, Perl complexity of languages, Language CPAN, Community languages compared to tools, Language languages moving from specialized to general-purpose, Language transition of Perl from text tool to complete language, The Language of Revolutions Warnock, PostScript font building for PostScript, Designed to Last web, Standard Wishes website resources C++ Standards Committee, Future Weinberger, AWK AWK compared to SQL, Bits That Change the Universe C, Waiting for a Breakthrough creativity in programmers, Bits That Change the Universe error messages, Theory and Practice extensible languages, Waiting for a Breakthrough functional programming, Bits That Change the Universe general-purpose 
languages, Waiting for a Breakthrough implementation affecting language design, Theory and Practice language design, Theory and Practice, Waiting for a Breakthrough, Waiting for a Breakthrough, Waiting for a Breakthrough, Waiting for a Breakthrough, Programming by Example large programs in AWK, Waiting for a Breakthrough learning new things on Internet, Bits That Change the Universe Lisp, Waiting for a Breakthrough little programs, Bits That Change the Universe mathematics, Bits That Change the Universe mistakes made by, Bits That Change the Universe objects compared to system components, Theory and Practice problems in software, Programming by Example programming, Bits That Change the Universe programming by example, Theory and Practice programming language design, Theory and Practice, Theory and Practice programs rewriting, Waiting for a Breakthrough security, Theory and Practice simplicity, Bits That Change the Universe success, Waiting for a Breakthrough teaching debugging, Bits That Change the Universe teaching programming, Theory and Practice whitespace insensitivity, Language Design WYSIWYG editors, Language Design X X Window system, Legacy Culture XML, XQuery and XML XQuery, XQuery and XML Y yacc as transformative technology, Legacy Culture Yahoo!

That’s the fundamental idea behind the relational model, invented by E. F. (Ted) Codd. SQL is the most visible implementation of the relational model—a declarative language where you describe what you want, not how to do it. Donald Chamberlin and Raymond Boyce developed SQL based on Codd’s ideas.

A Seminal Paper

How was SQL designed? Don Chamberlin: In the early 1970s, integrated database systems were just beginning to be widely deployed. Trends in technology and economics were making it possible for the first time for businesses to view their data as a corporate resource to be shared among many applications.
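The "what, not how" point can be seen in a few lines of SQL, run here through Python's built-in SQLite bindings. The table and its data are invented purely for illustration:

```python
import sqlite3

# In-memory database with a small invented table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Ada", "Eng", 120.0), ("Grace", "Eng", 130.0), ("Edgar", "Research", 110.0)],
)

# Declarative: we state *what* we want (average salary per department),
# not *how* to scan, group, or aggregate the rows.
rows = conn.execute(
    "SELECT dept, AVG(salary) FROM employees GROUP BY dept ORDER BY dept"
).fetchall()
```

The query never mentions loops, indexes, or access paths; the engine chooses the "how" on its own.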

pages: 447 words: 104,258

Mathematics of the Financial Markets: Financial Instruments and Derivatives Modelling, Valuation and Risk Issues
by Alain Ruttiens
Published 24 Apr 2013

Figure 4.16 Effect of Fj on ri, proportional to sensitivity factor βij

In particular, at point B, Fj = 0 → ri = E(ri).

Choice of Factors Fj

There is no objective rule governing the choice of the factors Fj. Because of the principle of parsimony, their number m must remain small enough: the higher m, the higher the sum of estimation errors. In one of their seminal papers,7 Roll and Ross proposed: F1 = change in expected inflation; F2 = change in expected industrial production; F3 = unanticipated risk premium variation; F4 = unanticipated yield curve move, so that the βijFj terms of Eq. 4.5 would represent the impact of these factors over the period of time covered by the ex-post regression.
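Numerically, the factor relation behaves as follows. The β sensitivities and factor realizations below are invented for illustration, and the idiosyncratic term is taken as zero:

```python
# ri = E(ri) + sum_j beta_ij * F_j + eps_i   (eps_i taken as zero here)
expected_ri = 0.05                    # E(ri): expected return of asset i
betas = [1.2, 0.8, 0.5, 0.3]          # beta_ij: sensitivities to the four factors
factors = [0.01, -0.02, 0.0, 0.005]   # F_j: realized factor surprises

ri = expected_ri + sum(b * f for b, f in zip(betas, factors))

# At point B of Figure 4.16 every F_j is zero, so ri collapses to E(ri).
ri_at_B = expected_ri + sum(b * 0.0 for b in betas)
```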

William F. SHARPE, Investors and Markets – Portfolio Choices, Asset Prices, and Investment Advice, Princeton University Press, 2006, 232 p.

1. A detailed presentation of the market efficiency and its various forms (weak, semi-strong, strong) is beyond the scope of this book. See for example the seminal paper by E. FAMA, ‘Efficient capital markets: a review of theory and empirical work’, Journal of Finance, 25(1), 1970, pp. 383–417, and its sequel, E. FAMA, ‘Efficient capital markets: II’, Journal of Finance, 46(5), 1991, pp. 1575–1617. For a more recent state of the theory, see, for example, M. BEECHEY, D.

A way to select the best combination is by optimizing their ratio – see Chapter 14.
3. Also called idiosyncratic.
4. For further details about utility functions, see, for example, H. GERBER, G. PAFUMI, “Utility functions: from risk theory to finance”, North American Actuarial Journal, 2(3), 1998, pp. 74–100.
5. In his seminal paper, W.F. SHARPE, ‘Capital Asset prices – A theory of market equilibrium under conditions of risk’, Journal of Finance, vol. XIX, no. 3, September 1964, pp. 425–442.
6. Actually, Sharpe's theory covers a wider range than just stocks, that is, the set of all risky assets traded on markets. However, practically speaking, the financial community restricts the market portfolio to the subset of traded stocks.
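The equilibrium relation from Sharpe's paper, the CAPM, can be stated in one line of code. The rates and β below are invented inputs, and this is the standard textbook form of the model rather than anything quoted from the book:

```python
def capm_expected_return(rf, beta, rm):
    """Sharpe's CAPM pricing line: E(ri) = rf + beta_i * (E(rm) - rf)."""
    return rf + beta * (rm - rf)

# Invented inputs: 2% risk-free rate, 8% expected market return, beta of 1.5.
er = capm_expected_return(rf=0.02, beta=1.5, rm=0.08)
```

An asset with β = 1.5 amplifies the market's excess return, so equilibrium demands a higher expected return than the market's 8%.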

pages: 449 words: 123,459

The Infinity Puzzle
by Frank Close
Published 29 Nov 2011

For Salam and Ward especially, there was supreme irony: Unknown to them, at Imperial College that summer, three colleagues—Tom Kibble, Gerry Guralnik, and Dick Hagen—had found the missing link in attempts to unify the weak and electromagnetic interactions. Within three weeks of Salam and Ward’s manuscript being completed, the team of Guralnik, Hagen, and Kibble submitted their seminal paper on “Hidden Symmetry,” early in October 1964, explaining how gauge bosons could become massive while maintaining gauge invariance. This paved the way for the eventual solution to the Infinity Puzzle for the weak interaction. But it seems that in the summer of 1964, when Salam and Ward had stumbled on SU2 × U1 while down the corridor their colleagues had found how to escape the straitjacket of massless gauge bosons, no one at Imperial College put two and two together.

Anderson had now built on this with his proposal that the Goldstone Boson, by being absorbed within the photon, provides the “missing” longitudinal oscillation for a massive vector particle. Although Anderson had identified the way forward, he had not actually identified any flaws in Goldstone’s argument.34 The complete solution had to be found. There is some irony to the fact that the key to the answer was already in one of Nambu’s seminal papers. Foreshadowing even Anderson’s insight, in 1961 Nambu, and his collaborator Giovanni Jona-Lasinio, had remarked that in superconductivity there would have been Nambu-Goldstone Bosons “in the absence of Coulomb [electrostatic] interaction.”35 In effect, this recognizes that the Goldstone theorem applies only if there are no long-range forces, such as electromagnetic forces, present.

Regrettably, no copy is currently available of that letter.”67 As the GHK team, including Hagen, had not completed their paper until after the appearance of Higgs’s original papers, and as Higgs had built on this further during 1966, with a study of how the massive boson decays, Lee’s assessment at that time is perhaps not unreasonable. In any event, Hagen’s letter seems to have had little effect. As we shall see in the next chapter, the following year, Steven Weinberg produced his seminal paper, which uses these ideas to build what is now confirmed as a viable theory of the weak and electromagnetic forces.68 Weinberg’s paper, which cites Higgs prominently in pole position, eventually became the most highly quoted paper in theoretical particle physics. Within the community of particle physicists it is Higgs’s name that is freely associated with the “Boson that has been named after [him].”

pages: 130 words: 32,279

Beyond the 4% Rule: The Science of Retirement Portfolios That Last a Lifetime
by Abraham Okusanya
Published 5 Mar 2018

Retirees may sacrifice some lifestyle and legacy goals to secure their essential income

Upside | • Flexibility • Higher income and legacy if market turns out to be favourable | Essential income not subject to vagaries of the market
Retirement income product | • Diversified investment portfolios | • Annuity for essential spending • Investment-linked annuity • Diversified portfolios only used for discretionary spending
Risk profile | Medium to high | Low to medium
Flexibility to income adjustment | Medium to high | Low to medium
Maintenance | High | Low
Difficulty | Complex | Simple

Modern Portfolio Theory vs. Modern Retirement Theory

The empirical foundation for the probability-based school is the seminal paper published in the Journal of Financial Planning in 1994⁵ by engineer-turned-financial planner, William Bengen. A key aspect of Bengen’s work is the idea of optimal asset allocation for a retirement portfolio. This draws on the Modern Portfolio Theory, pioneered by Harry Markowitz in 1952. It explores how a portfolio of multiple assets maximises returns for a given level of risk.

Available at SSRN: https://ssrn.com/abstract=1969021 or http://dx.doi.org/10.2139/ssrn.1969021

CHAPTER 4 Safe withdrawal rate: how safe?

The key framework for managing sequence risk from a drawdown portfolio originated from Bill Bengen. Bill was an engineer who later became a financial adviser. His seminal paper in 1994 transformed the conversation around retirement income planning. The paper has been peer-reviewed and referenced by both academics and practitioners. Sadly, the SWR framework has been misinterpreted and misapplied far too often. If I didn’t know better, I’d say Bill would be cringing if he read some of the nonsense that’s been written.
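The mechanics Bengen tested can be sketched as a simple simulation: take an initial withdrawal of 4% of the starting portfolio, raise the withdrawal with inflation each year, and apply each year's market return to what remains. The function name and the return and inflation numbers below are invented for illustration, not Bengen's historical data:

```python
def simulate_swr(start_balance, initial_rate, returns, inflations):
    """Year-end balances under a Bengen-style rule: the first withdrawal is
    initial_rate * start_balance, and each later withdrawal rises with inflation."""
    withdrawal = start_balance * initial_rate
    balance = start_balance
    balances = []
    for r, infl in zip(returns, inflations):
        balance -= withdrawal        # withdraw at the start of the year
        balance *= 1 + r             # then apply that year's market return
        withdrawal *= 1 + infl       # next year's withdrawal keeps pace with prices
        balances.append(balance)
    return balances

# Hypothetical three-year path starting from a 1,000,000 portfolio at 4%.
path = simulate_swr(1_000_000, 0.04,
                    returns=[0.07, -0.10, 0.05],
                    inflations=[0.02, 0.03, 0.02])
```

Running many such paths against historical return sequences, and asking which initial rate never exhausts the portfolio over a 30-year retirement, is exactly the exercise that produced the "4% rule."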

pages: 252 words: 75,349

Spam Nation: The Inside Story of Organized Cybercrime-From Global Epidemic to Your Front Door
by Brian Krebs
Published 18 Nov 2014

Much as squeezing an inflated balloon doesn’t make the balloon any smaller but instead merely displaces the air into new bulges, anti-spam campaigns that succeed in shuttering one partnerka or a major component of that operation often result in the most successful affiliates simply shifting their spam traffic to competing partnerkas. As Dmitry Samosseiko, a security expert with SophosLabs Canada, noted in his seminal paper, “The Partnerka—What Is It, and Why Should You Care?”, all partnerkas are in strong competition with each other. “Allegiance is earned through more generous commission rates, shorter ‘hold’ periods, support for a wider range of payment systems, higher quality promotional material, better support, etc.,” Samosseiko wrote.

(The winner of that competition, a hacker nicknamed “Engel,” was the Russian man allegedly behind the “Festi” spam botnet, an extremely virulent and powerful spam-spewing machine, as detailed in Chapter 7. Incidentally, Engel and his botnet would eventually catapult Vrublevsky, and Engel himself, toward a dangerous collision with the law, as we’ll see in Chapter 12.) In their seminal paper, “PharmaLeaks: Understanding the Business of Online Pharmaceutical Affiliate Programs,” researchers at the University of California, San Diego (UCSD), the International Computer Science Institute, and George Mason University examined caches of data tracking the day-to-day finances of GlavMed, SpamIt, and Rx-Promotion, which collectively over a four-year period processed more than $170 million worth of orders from customers seeking cheaper, more accessible, and more discreetly available drugs.

The section that references a letter from the FDA to Vrublevsky’s alleged partner in Rx-Promotion refers to a letter dated October 8, 2010, and addressed to one “Jorge Smark” at the email address hellmanh@gmail.com. See www.fda.gov/ICECI/EnforcementActions/WarningLetters/ucm229010.htm.

Chapter 6: Partner(ka)s in (Dis)Organized Crime

The inspiration for this chapter came principally from the seminal paper on partnerka programs “The Partnerka—What Is It, and Why Should You Care,” by Dmitry Samosseiko of SophosLabs Canada. This chapter also relies heavily on data gathered by researchers at the University of California, San Diego, the International Computer Science Institute, and George Mason University.

pages: 174 words: 56,405

Machine Translation
by Thierry Poibeau
Published 14 Sep 2017

On the other hand, research in computational linguistics was blooming during the same period for speech as well as for written text: the 1960s and 1970s saw major developments in parsing (automatic syntactic analysis), semantics, and text understanding, for example, as suggested in the 1966 ALPAC report (see chapter 5). The 1990s saw the advent of a new approach based on statistics and very large bilingual corpora. This trend clearly derived from a series of seminal papers written by a research group working at IBM in the late 1980s and early 1990s. These papers had a considerable impact, along with the development of statistical and empirical approaches in natural language processing. The most popular translation systems nowadays (Google and Bing translation) are all based on a variant of this approach.

It is important that the data used for training be similar to the data used for testing in order for the system to produce satisfactory results. As one can imagine, the key point lies in the quality of the information accumulated during the training step, which essentially entails analyzing a very large aligned corpus at word and sentence levels. The seminal paper from IBM in 1993 described five alignment models, each of which is a modification of the previous model. Different Approaches for Lexical Alignment: The IBM Models As we have seen, the translation approach developed within IBM in the late 1980s was essentially based on translation choices carried out at word level.
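The first and simplest of those five models, IBM Model 1, fits in a short sketch: expectation-maximization re-estimates word-translation probabilities t(f|e) from sentence-aligned data. The toy corpus below is invented for illustration, and only Model 1 is shown; the paper's later models add alignment positions, fertility, and distortion on top of this:

```python
from collections import defaultdict
from itertools import product

# Toy parallel corpus: (foreign sentence, English sentence) pairs.
corpus = [(["das", "haus"], ["the", "house"]),
          (["das", "buch"], ["the", "book"]),
          (["ein", "buch"], ["a", "book"])]

f_vocab = {f for fs, _ in corpus for f in fs}
e_vocab = {e for _, es in corpus for e in es}

# Uniform initialization of t(f|e).
t = {(f, e): 1.0 / len(f_vocab) for f, e in product(f_vocab, e_vocab)}

for _ in range(30):                  # EM iterations
    count = defaultdict(float)       # expected counts c(f, e)
    total = defaultdict(float)       # expected counts c(e)
    for fs, es in corpus:
        for f in fs:
            z = sum(t[(f, e)] for e in es)   # normalize over this sentence pair
            for e in es:
                c = t[(f, e)] / z            # expected alignment probability
                count[(f, e)] += c
                total[e] += c
    for f, e in t:                   # M-step: renormalize the expected counts
        t[(f, e)] = count[(f, e)] / total[e] if total[e] else 0.0
```

Even on three sentence pairs, co-occurrence statistics disambiguate the words: "haus" ends up as the overwhelmingly likely translation of "house", and "das" of "the", with no dictionary supplied.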

pages: 204 words: 60,319

Finding Zero: A Mathematician's Odyssey to Uncover the Origins of Numbers
by Amir D. Aczel
Published 6 Jan 2015

Emptiness was the door from nonexistence to existence, in the same way that zero was the conduit from positive to negative numbers, one set being a perfect geometrical reflection of the other along the number line. But I now had to find the lost Eastern zero—if indeed it still existed. I knew that in 1931, George Cœdès was able to destroy Kaye’s argument in his seminal paper that employed this zero.7 In fact, Cœdès presented in his paper two newly discovered zeros: One from Palembang, Indonesia, dated to 684 CE, and the one-year-older inscription from the Khmer temple at Sambor on the Mekong. In the paper, the Sambor find was identified as inscription K-127. This K- notation, instituted by Cœdès, would become my main lead in searching for the artifact.

A reissue of a superb source of information on mathematical notations; it does not include the discoveries of the earliest numerals in Southeast Asia.

Cantor, Moritz. Vorlesungen über Geschichte der Mathematik. Vol. 1. Berlin, 1907.

Cœdès, George. “A propos de l’origine des chiffres arabes.” Bulletin of the School of Oriental Studies (University of London) 6, no. 2 (1931): 323–28. This is the seminal paper by Cœdès, which changed the entire chronology of the evolution of our number system by reporting and analyzing the discovery, by Cœdès himself, of a Cambodian zero two centuries older than the accepted knowledge at that time.

Cœdès, George. The Indianized States of Southeast Asia. Hilo: University of Hawaii Press, 1996.

pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence
by George Zarkadakis
Published 7 Mar 2016

But if we accept this proposition, we must ask ourselves who wrote our program? Are we trapped by the contemporary literary metaphor for life? Or is there something beyond the metaphor, a deeper insight into the nature and cause of being and becoming? Ever since British mathematician Alan Turing wrote his seminal paper on machines imitating humans, various camps in computer science, robotics and Artificial Intelligence have been demarcated by the dichotomy between materialism and idealism. We cannot possibly gain insight into Artificial Intelligence, and its potential to change our world and our civilisation, unless we understand the centrifugal ideas that dominate it.

They convert signals that exist in the physical world into binary representations of ‘0s’ and ‘1s’.24 In binary code ‘0’ denotes the absence of a signal and ‘1’ the presence of a signal. Every time you use your smartphone to take a picture, light captured by your phone’s camera is converted into binary digits and stored in the memory. Digital information is a long, long sequence of zeros and ones. Shannon’s breakthrough idea in his seminal paper ‘A Mathematical Theory of Communication’25 was to borrow the probabilistic mathematics of thermodynamics and apply them to the new field of telecommunications. Thermodynamics describes how molecules move as they heat up or cool down. The greater the heat, the more energetic the molecules become.
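The probabilistic measure at the core of Shannon's paper, entropy in bits, takes only a few lines to compute. The coin distributions below are illustrative:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2 p), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = entropy([0.5, 0.5])      # maximum uncertainty for two outcomes: 1 bit
biased_coin = entropy([0.9, 0.1])    # mostly predictable, so well under 1 bit
```

The fair coin needs a full bit per toss; the biased coin carries less information per toss, which is precisely why predictable signals compress well.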

The man who demonstrated the direct connection between neurons and computers was Professor Warren S. McCulloch (1898–1969), the American neurophysiologist who loved writing sonnets and laid the foundations of many contemporary brain theories. In 1943, he collaborated with Walter Pitts, a logician, on a seminal paper about the mathematics of neural cells.5 In this paper, McCulloch and Pitts tried to understand how the brain could produce highly complex patterns by using many basic cells – called neurons – that are connected together. To do so they borrowed ideas from Alan Turing. Turing’s influence has been tremendous in America, and his ideas for calculating machines (the so-called ‘Turing machines’) provided an excellent theoretical framework for McCulloch and Pitts.
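The neuron model itself is simple enough to state directly: a unit outputs 1 exactly when the weighted sum of its binary inputs reaches a threshold. The rendering below is the standard textbook formulation of a McCulloch-Pitts unit, not code from the 1943 paper:

```python
def mp_neuron(inputs, weights, threshold):
    """A McCulloch-Pitts unit: binary inputs; fires (1) iff the weighted
    sum of inputs reaches the threshold, otherwise stays silent (0)."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# Different thresholds turn the same cell into different logic gates.
def and_gate(a, b):
    return mp_neuron([a, b], [1, 1], threshold=2)

def or_gate(a, b):
    return mp_neuron([a, b], [1, 1], threshold=1)
```

That networks of such units can realize any logical function was the paper's key claim, and the bridge to Turing's machines.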

pages: 392 words: 114,189

The Ransomware Hunting Team: A Band of Misfits' Improbable Crusade to Save the World From Cybercrime
by Renee Dudley and Daniel Golden
Published 24 Oct 2022

Dubbed “sociobiology,” it applied tools associated with Darwinian analysis to animal and, more controversially, human behavior. Every week, about fifteen disciples would gather near the Harvard campus in the spacious home of Professor Irven DeVore, a baboon specialist. Often they debated, drank, and gambled into the early morning. “Basically, all of the seminal papers were written in DeVore’s living room at three a.m.,” one attendee said. Popp was a regular at these sessions. He was DeVore’s protégé, expected to be Harvard’s next baboon expert and lead the study of primates into the heyday of sociobiology. He and DeVore coauthored a paper on how male animals use aggressive behavior to maximize reproductive success.

With the exception of the AIDS Trojan, though, all these cryptographic methods remained purely defensive. They were designed to shield national security communications, financial records, and other valuable information from enemies and thieves, and to ensure that the data was authentic. Then, at a 1996 conference, two Columbia University researchers, Adam Young and Moti Yung, presented a seminal paper showing how to use hybrid encryption for extortion. Their idea was ingenious. Under their model, a hacker can infiltrate a computer and use a symmetric key to encrypt the victim’s files—anyone who knew the key could later decrypt the files. The symmetric key, however, is formidably protected: it is randomly generated and presumably difficult to crack; it is also safeguarded a second time when it’s encrypted by a public key embedded in the ransomware program.
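The structure of that scheme can be sketched with Python's standard library. Everything here is a toy: the XOR "cipher" derived from SHA-256 merely stands in for a real symmetric cipher, the public-key step is stubbed out, and all names are ours, but the shape (a random per-victim file key, itself locked under the attacker's public key) is the one Young and Yung described:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data against a SHA-256-derived keystream.
    Applying it twice with the same key restores the original bytes."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# 1. A fresh random symmetric key is generated per victim...
file_key = secrets.token_bytes(32)
ciphertext = keystream_xor(file_key, b"victim's files")

# 2. ...then that key would be locked under the extortionist's public key.
#    (Stubbed here; the real scheme uses actual public-key encryption.)
# locked_key = public_key_encrypt(attacker_public_key, file_key)

# Only someone holding file_key (recoverable solely with the attacker's
# private key) can invert the cipher and restore the plaintext.
recovered = keystream_xor(file_key, ciphertext)
```

The asymmetry is the point: the victim's machine only ever held the public key, so nothing left behind on disk suffices to undo the encryption.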

THE MAN WHO INVENTED RANSOMWARE “would end up being a CPA”: Wally Guenther, “Neighbors Express Surprise at Arrest,” Plain Dealer (Cleveland, OH), February 3, 1990. “the most important book”: Joseph L. Popp, Popular Evolution: Life-Lessons from Anthropology (Lake Jackson, TX: Man and Nature Press, 2000), xviii. “Basically, all of the seminal papers”: Author interview with Robert Sapolsky, June 12, 2000. “greater damage”: Joseph L. Popp and Irven DeVore, “Aggressive Competition and Social Dominance Theory: Synopsis,” in The Great Apes, ed. David A. Hamburg and Elizabeth R. McCown (Menlo Park, CA: Benjamin/Cummings, 1979), 323. “Life is merely an artifact”: Popp, Popular Evolution, 1–2.

pages: 229 words: 68,426

Everyware: The Dawning Age of Ubiquitous Computing
by Adam Greenfield
Published 14 Sep 2006

Almost twenty years ago, a researcher at the legendary Xerox Palo Alto Research Center wrote an article—a sketch, really—setting forth the outlines of what computing would look like in a post-PC world. The researcher's name was Mark Weiser, and his thoughts were summarized in a brief burst simply entitled "Ubiquitous Computing #1." In it, as in the series of seminal papers and articles that followed, Weiser developed the idea of an "invisible" computing, a computing that "does not live on a personal device of any sort, but is in the woodwork everywhere." What Weiser was describing would be nothing less than computing without computers. In his telling, desktop machines per se would largely disappear, as the tiny, cheap microprocessors that powered them faded into the built environment.

Certainly, Mark Weiser's contingent at PARC wanted to push computation into the environment because they hoped that doing so judiciously might ameliorate some less pleasant aspects of a user experience that constantly threatened to spin out of control. As Weiser and co-author John Seely Brown laid out in a seminal paper, "The Coming Age of Calm Technology," they wanted to design tools to "encalm as well as inform." Similar lines of argument can be adduced in the work of human-centered design proponents from Don Norman onward. Much of the Japanese work along ubiquitous lines, and in parallel endeavors such as robotics, is driven by the recognition that an aging population will require not merely less complicated interfaces, but outboard memory augmentation—and Japan is far from the only place with graying demographics.

pages: 242 words: 68,019

Why Information Grows: The Evolution of Order, From Atoms to Economies
by Cesar Hidalgo
Published 1 Jun 2015

In the words of Fukuyama: “Certain societies can save substantially on transaction costs because economic agents trust one another in their interactions and therefore can be more efficient than low trust societies, which require detailed contracts and enforcement mechanisms.”12 James Coleman, a sociologist well known for his work on social capital, has also emphasized the ability of trust to reduce transaction costs. In his seminal paper on social capital Coleman described the transactions between Jewish diamond merchants in New York, who have the tradition of letting other merchants inspect their diamonds in private before executing a transaction. He argues that trust and the social network of family and acquaintances that implicitly enforces this trust are essential to make these interactions feasible.

Surprisingly, the latter critique was made famous by a sociologist, Dennis Wrong, who criticized the oversocialized view of individuals advanced by his colleagues in the early 1960s; see his “The Oversocialized Conception of Man in Modern Sociology,” American Sociological Review 26, no. 2 (1961): 183–193. Here, however, I will use the description of both critiques presented by James Coleman in his seminal paper on social capital, “Social Capital in the Creation of Human Capital,” American Journal of Sociology 94 (1988): S95–S120. There are two broad intellectual streams in the description and explanation of social action. One, characteristic of the work of most sociologists, sees the actor as socialized and action as governed by social norms, rules, and obligations.

pages: 249 words: 66,383

House of Debt: How They (And You) Caused the Great Recession, and How We Can Prevent It From Happening Again
by Atif Mian and Amir Sufi
Published 11 May 2014

These episodes ended in spectacular busts, and it is tempting to call them bubbles after the fact. But what if the price booms were legitimate and based on economic prospects at the time? How can one prove the existence of bubbles without a doubt? In 1988 future Nobel laureate Vernon Smith and his coauthors, Gerry Suchanek and Arlington Williams, published a seminal paper on the existence of bubbles.3 The authors conducted an experiment where participants were each given an initial allotment of cash and stocks that they could trade with one another. The experiment had fifteen trading periods. At the end of each trading period, the owner of a stock received a dividend payment that could have one of four values with equal probability—0, 8, 28, and 60—for an expected value of 24 cents.
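The experiment's arithmetic is easy to reproduce. A small sketch using the dividend values and fifteen-period structure from the passage:

```python
# The four equally likely dividend outcomes and the resulting expected
# value, plus the fundamental value of a share at each trading period
# (expected dividend times the number of payments still to come).
dividends = [0, 8, 28, 60]                 # cents, equally likely each period
expected = sum(dividends) / len(dividends)
assert expected == 24                      # matches the 24-cent figure above

fundamentals = [expected * remaining for remaining in range(15, 0, -1)]
print(fundamentals[0], fundamentals[-1])   # 360.0 at the start, 24.0 in the last period
```

Any trade far above this declining schedule is what the experimenters counted as bubble pricing.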

Of the twenty-two experiments conducted, fourteen saw a stock market “characterized by a price bubble measured relative to dividend value.” The results bore an uncanny resemblance to the “excess volatility” phenomena first documented by Robert Shiller in 1981 for the U.S. stock market.4 In his seminal paper that led to the creation of the field of behavioral finance, Shiller showed that stock prices moved too much to be justified by the subsequent movement in their dividends. This phenomenon was later succinctly summarized by Jeffrey Pontiff in 1997 when he demonstrated that closed-end mutual funds were significantly more volatile than the market value of the underlying securities.5 Closed-end mutual funds hold stocks and bonds like regular “open-ended” mutual funds.

pages: 481 words: 120,693

Plutocrats: The Rise of the New Global Super-Rich and the Fall of Everyone Else
by Chrystia Freeland
Published 11 Oct 2012

He has found that in 1916 the richest 1 percent of Americans received only one-fifth of their income from paid work; in 2004, that figure had risen threefold, to 60 percent. “As a consequence, top executives (the ‘working rich’) have replaced top capital owners (the ‘rentiers’) at the top of the income hierarchy during the twentieth century,” Saez and Piketty write in their seminal paper on the subject. Michael Lindsay, a professor at Rice University who has interviewed more than five hundred American leaders as part of the multiyear Platinum Study of the background and behavior of the nation’s bosses, has reached the same conclusion. Speaking at a Columbia University conference on elites in the fall of 2010, Lindsay said that nowadays most of America’s business, nonprofit, and academic chiefs hadn’t inherited their money or come from privileged backgrounds.

That small group of wealthy capitalists laid the foundations for America’s astonishing economic ascent in the twentieth century. But as the American economy matured, control of its private businesses began to pass from the hands of the vigorous, scheming, and resolute founders of Marshall’s age to a new generation of stewards. That shift was documented in a seminal paper published in 1931 by Gardiner Means, a New England farm boy and steely-nerved World War I pilot who’d eventually made his way to economics and the Ivy League faculty. Means showed that of the two hundred largest U.S. companies at the end of 1929, 44 percent were controlled by managers rather than by their owners.

The terms of the deal were undisclosed Bernard Weinraub, “Disney Settles Bitter Suit with Former Studio Chief,” New York Times, July 8, 1999. Dick Tracy cost Disney $47 million to produce. See James B. Stewart, DisneyWar (Simon & Schuster, 2005), p. 111. “the stewards of a rich man” Smith, Wealth of Nations, Book V, Chapter I, Section 107. a seminal paper published in 1931 Gardiner C. Means, “The Separation of Ownership and Control in American Industry,” The Quarterly Journal of Economics, 1931. “the princes of industry” Adolf Augustus Berle and Gardiner Coit Means, The Modern Corporation and Private Property (Transaction Publishers, 1932), p. 4.

pages: 314 words: 122,534

The Missing Billionaires: A Guide to Better Financial Decisions
by Victor Haghani and James White
Published 27 Aug 2023

For 50 years, academic researchers and practitioners have built upon their work. It is not obvious to me what good it has done for individual investors, although it has helped banks create, and profit from, many complicated derivatives (full disclosure: a long time ago, I worked in equity derivatives at Goldman Sachs). You will learn about another seminal paper published four years earlier in 1969 by Robert Merton, about how much to put in the stock market and what fraction of your wealth to spend each year as you age. This paper has been largely forgotten outside academia, just like the 1956 paper by John Kelly on how to optimize the growth of wealth.

Notice that , the fraction of her wealth she would like to bequeath in the calibration question. 1. See Markowitz, 1952. 2. The discipline is often called lifetime portfolio choice and consumption. Many consider Robert Merton's Lifetime portfolio selection under uncertainty: The continuous‐time case, Review of Economics and Statistics, Vol. 51, No. 3 (Aug. 1969) to be the seminal paper. For a brief overview, see John Campbell, Strategic Asset Allocation: Portfolio Choice for Long‐Term Investors, NBER (2000). For a more in-depth treatment, see: John Campbell and Luis Viceira, Strategic Asset Allocation (2002), John Campbell, Financial Decisions and Markets (2017), Robert C. Merton, Continuous Time Finance (1992), John Cochrane, Asset Pricing (2005). 3.

Note that this difference in perspective is distinct from having a short‐term versus long‐term investment horizon. When we compare changes in earnings versus changes in stock prices, in both cases we are looking at a relatively long 10‐year horizon. The recognition that stock prices are much more volatile than long‐term earnings is not new. It was presented in 1980 by Robert Shiller in his seminal paper, “Do Stock Prices Move Too Much to be Justified by Subsequent Changes in Dividends?,” and it is also at the heart of the long‐debated “equity risk premium puzzle” that we discuss in Chapter 20.5 These days, most market observers believe that stock market volatility arises as much (or possibly more) from changes in how people value future earnings, as from changes in the expected future earnings themselves.6 It's easy to read too much into this historical analysis, but one interpretation is that equities are intrinsically a bit like the CSL annuity we've been discussing: they provide an indefinite stream of earnings, which naturally adjust, at least somewhat, to CPI inflation.

Deep Work: Rules for Focused Success in a Distracted World
by Cal Newport
Published 5 Jan 2016

This task of formalization began in earnest in the 1970s, when a branch of psychology, sometimes called performance psychology, began to systematically explore what separates experts (in many different fields) from everyone else. In the early 1990s, K. Anders Ericsson, a professor at Florida State University, pulled together these strands into a single coherent answer, consistent with the growing research literature, that he gave a punchy name: deliberate practice. Ericsson opens his seminal paper on the topic with a powerful claim: “We deny that these differences [between expert performers and normal adults] are immutable… Instead, we argue that the differences between expert performers and normal adults reflect a life-long period of deliberate effort to improve performance in a specific domain.”

In this sense, we should see the goal of this rule as taming shallow work’s footprint in your schedule, not eliminating it. Then there’s the issue of cognitive capacity. Deep work is exhausting because it pushes you toward the limit of your abilities. Performance psychologists have extensively studied how much such efforts can be sustained by an individual in a given day.* In their seminal paper on deliberate practice, Anders Ericsson and his collaborators survey these studies. They note that for someone new to such practice (citing, in particular, a child in the early stages of developing an expert-level skill), an hour a day is a reasonable limit. For those familiar with the rigors of such activities, the limit expands to something like four hours, but rarely more.

pages: 278 words: 70,416

Smartcuts: How Hackers, Innovators, and Icons Accelerate Success
by Shane Snow
Published 8 Sep 2014

In Bigger or Better, the parlay never stops. Players don’t wait an arbitrary period of time before moving on to the next trade, and they don’t mind if the result of a trade was only a slightly more desirable object, so long as the game keeps moving. “By itself, one small win may seem unimportant,” writes Dr. Karl Weick in a seminal paper for American Psychologist in 1984. “A series of wins at small but significant tasks, however, reveals a pattern that may attract allies, deter opponents, and lower resistance to subsequent proposals.” “Once a small win has been accomplished,” Weick continues, “forces are set in motion that favor another small win.”

For an excellent academic discussion about experimentation for entrenched businesses, see Stefan Thomke, “Unlocking Innovation through Business Experimentation,” European Business Review, http://www.europeanbusinessreview.com/?p=8420 (accessed February 17, 2014). 112 enjoy an unfair advantage over their competitors: The seminal paper on first-mover advantage was Marvin B. Lieberman and David B. Montgomery, “First-Mover Advantages,” Strategic Management Journal, no. 9 (1988): 41–58. Lieberman and Montgomery revisited and amended their claims ten years later in “First-Mover (Dis)Advantages: Retrospective and Link with the Resource-Based View,” Strategic Management Journal 19 (1998): 1111–25.

pages: 936 words: 252,313

Good Calories, Bad Calories: Challenging the Conventional Wisdom on Diet, Weight Control, and Disease
by Gary Taubes
Published 25 Sep 2007

But even White originally considered the disease “part and parcel of the process of growing old,” which is what he wrote in his 1929 textbook Heart Disease, while noting that “it also cripples and kills often in the prime of life and sometimes even in youth.” So the salient question is whether the increasing awareness of the disease beginning in the 1920s coincided with the budding of an epidemic or simply better technology for diagnosis. In 1912, the Chicago physician James Herrick published a seminal paper on the diagnosis of coronary heart disease—following up on the work of two Russian clinicians in Kiev—but only after Herrick used the newly invented electrocardiogram in 1918 to augment the diagnosis was his work taken seriously. This helped launch cardiology as a medical specialty, and it blossomed in the 1920s.

He eventually took a position with the College of Medical Evangelists in Los Angeles, which was affiliated with the Seventh-day Adventist Church, and he became a senior attending physician at Los Angeles County General Hospital. But these were not institutions that bestowed credibility. Meanwhile, Newburgh’s seminal paper establishing a perverted appetite as the definitive cause of obesity was published in 1942, and Newburgh rejected the lipophilia hypothesis with the alacrity with which he rejected any explanation that didn’t implicate gluttony as the primary cause. What made the disappearance of the lipophilia hypothesis so remarkable is that it could easily be tested in the laboratory, in animal models.

“…this conception deserves…”: Wilder and Wilbur 1938:310–11. 1955 German textbook chapter: Bahner 1955:1023–26. References from German literature: Rony 1940; Rynearson and Gastineau 1949. Footnote. Interview, Theodore Van Itallie. Bauer’s articles in English: Silver and Bauer 1931; Bauer 1940; Bauer 1941. Newburgh’s seminal paper: Newburgh 1942. “indubitable” and “is also probably present…”: Cahill 1978. “significantly more weight”: Lee and Schaffer 1934. For a similar experiment, see Marx et al. 1942. “These mice will make fat…”: Mayer 1968:48. Benedict reported this: discussed in Alonso and Maren 1955, which reported confirmation of the observation in a different strain of mice.

pages: 260 words: 84,847

P53: The Gene That Cracked the Cancer Code
by Sue Armstrong
Published 20 Nov 2014

WEINBERG, ROBERT Eminent US scientist involved since the early days of the molecular-biology revolution in uncovering the genetic basis of cancer. Best known for his discoveries of the first human oncogene (or cancer-promoting gene) and the first tumour suppressor. Weinberg has spent most of his working life at the Massachusetts Institute of Technology (MIT) and is the author, with Doug Hanahan, of a seminal paper, ‘The Hallmarks of Cancer’, which defines the key characteristics of all cancer cells. WYLLIE, ANDREW Trained as a pathologist, Wyllie was a PhD student at Aberdeen University in Scotland when ‘programmed cell death’, or cell suicide, emerged from rarefied fields into mainstream biology and was given the name ‘apoptosis’.

If a gene is ‘over-expressed’, it implies there is an over-abundance of protein in the cell. Gain of function: An expression used in reference to a genetic mutation that changes the gene product (e.g. protein) in such a way that it gains a new and abnormal function (see also loss of function). ‘Hallmarks of Cancer’: A seminal paper written by Robert Weinberg and Doug Hanahan in 2000 that describes the six characteristics common to all cancers, of whatever organ or origin. They revised the ‘Hallmarks’ in 2011, adding four more general principles. Large T antigen: The gene in the DNA of the monkey virus SV40 that is responsible for causing cancer in the cells of the host species it infects.

pages: 369 words: 153,018

Power, Sex, Suicide: Mitochondria and the Meaning of Life
by Nick Lane
Published 14 Oct 2005

Margulis was then married to the cosmologist Carl Sagan, and she took a similarly cosmic view of the evolution of life, considering not just the biology, but also the geological evidence of atmospheric evolution, and fossils of bacteria and early eukaryotes. She brought to the task a consummate discernment of microbial anatomy and chemistry, and applied systematic criteria to determine the likelihood of symbiosis. Even so, her work was rejected. Her seminal paper was turned down by 15 different journals before James Danielli, the far-seeing editor of the Journal of Theoretical Biology, finally accepted it. Once published, there were an unprecedented 800 reprint requests for the paper within a year. Her book, The Origin of Eukaryotic Cells, was rejected by Academic Press, despite having been written to contract, and was eventually published by Yale University Press in 1970.

Yet, for a long time, it looked as if surviving without a cell wall was a magic trick equivalent to pulling a rabbit out of a hat. Bacteria were believed to lack an internal cytoskeleton, and if that was the case, the eukaryotes must have evolved their complex skeleton in a single generation, or faced extinction. In fact this assumption turns out to be groundless. In two seminal papers, published in the journals Cell and Nature in 2001, Laura Jones and her colleagues at Oxford, and Fusinita van den Ent and her colleagues in Cambridge, showed that some bacteria do indeed have a cytoskeleton as well as a cell wall—they wear a belt and braces, as Henry Fonda put it in Once Upon a Time in the West (‘never trust a man who can’t even trust his own trousers’).

Each complex is millions of times the size of a carbon atom, but even so they are barely visible down the electron microscope. The individual complexes are composed of numerous proteins, coenzymes, and cytochromes. Sir Hans Krebs received the Nobel Prize in 1953 for elucidating the cycle, although many others contributed to a detailed understanding. Krebs’ seminal paper on the cycle in 1937 was rejected by Nature, a personal set-back that has since encouraged generations of disappointed biochemists. In addition to its central role in respiration the Krebs Cycle is also the cell’s starting point for making amino acids, fats, haems, and other important molecules.

pages: 293 words: 91,110

The Chip: How Two Americans Invented the Microchip and Launched a Revolution
by T. R. Reid
Published 18 Dec 2007

His master’s thesis, in 1937, demonstrated how computerized mathematical circuits should be designed; this youthful piece of work not only served as the cornerstone of computer architecture from then on, but also launched a new academic discipline known as switching theory. Ten years later, as a researcher at Bell Labs, Shannon got to thinking about efficient means of electronic communications (for example, how to send the largest number of telephone conversations through a single wire). He published another seminal paper, “A Mathematical Theory of Communication,” that launched an even more important new academic discipline known as information theory; today information theory is fundamental not only in electronics and computer science but also in linguistics, sociology, and numerous other fields. You could argue that Claude Shannon was the Alexander Graham Bell of the cellular phone, because mobile communications would be impossible without the basic formulas of information theory that Shannon devised.

.: Princeton University Press, 1972), which is strangely organized but has the immediacy that could be conveyed only by one who was present at the creation of the modern electronic computer. Andrew Hodges, Alan Turing: The Enigma (New York: Simon & Schuster, 1983), and Steve J. Heims, John von Neumann and Norbert Wiener (Cambridge, Mass.: MIT Press, 1980), are the first complete biographies. Von Neumann’s seminal paper “Preliminary Discussion of the Logical Design of an Electronic Computing Instrument” is reprinted in John Diebold, ed., The World of the Computer (New York: Random House, 1973). There are far more books than any one person could read on the inner workings of integrated circuits, microprocessors, calculators, and computers.

pages: 578 words: 168,350

Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms, Cities, Economies, and Companies
by Geoffrey West
Published 15 May 2017

Because the essence of any measurable quantity cannot depend on an arbitrary choice of units made by human beings, neither can the laws of physics. Consequently, all of these and indeed all of the laws of science must be expressible as relationships between scale-invariant dimensionless quantities, even though conventionally they are not typically written that way. This was the underlying message of Rayleigh’s seminal paper. His paper elegantly illustrates the technique with many well-chosen examples, including one that provides the scientific explanation for one of the great mysteries of life that all of us have pondered at some time, namely, why is the sky blue? Using an elegant argument based solely on relating purely dimensionless quantities, he shows that the intensity of light waves scattered by small particles must decrease with the fourth power of their wavelength.
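The fourth-power relationship behind that answer takes only a couple of lines to evaluate. A quick sketch; the wavelengths below are representative values for blue and red light, not figures from Rayleigh's paper:

```python
# Rayleigh scattering: scattered intensity varies as 1/wavelength^4, so
# shorter (bluer) wavelengths scatter far more strongly than longer
# (redder) ones. Wavelengths in nanometres.
blue, red = 450, 650
ratio = (red / blue) ** 4
print(round(ratio, 1))   # blue light scatters roughly 4.4x more than red
```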

As was shown in Figure 1, metabolic rate scales with body size in the simplest possible manner one could imagine when plotted logarithmically against mass, namely, as a straight line indicative of a simple power law scaling relationship. The scaling of metabolic rate has been known for more than eighty years. Although primitive versions of it were known before the end of the nineteenth century, its modern incarnation is credited to the distinguished physiologist Max Kleiber, who formalized it in a seminal paper published in an obscure Danish journal in 1932.5 I was quite excited when I first came across Kleiber’s law because I had presumed that the randomness and unique historical path dependency implicit in how each species had evolved would have resulted in a huge uncorrelated variability among them.
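How an exponent like Kleiber's is read off such a logarithmic plot can be sketched directly: if B = c·M^k, then log B = log c + k·log M, and the slope of a straight-line fit in log-log space recovers k. The data below are synthetic, generated with the 3/4 exponent rather than taken from Kleiber's measurements:

```python
import math

# Least-squares slope in log-log space recovers the power-law exponent.
masses = [0.02, 0.3, 4.0, 70.0, 4000.0]       # kg, roughly mouse to elephant
rates = [3.4 * m ** 0.75 for m in masses]     # watts, synthetic data

xs = [math.log(m) for m in masses]
ys = [math.log(b) for b in rates]
mean_x, mean_y = sum(xs) / len(xs), sum(ys) / len(ys)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
print(round(slope, 3))   # recovers the 3/4 exponent: 0.75
```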

Brown, “The Origin of Allometric Scaling Laws in Biology from Genomes to Ecosystems: Towards a Quantitative Unifying Theory of Biological Structure and Organization,” Journal of Experimental Biology 208 (2005): 1575–92; and G. B. West and J. H. Brown, “Life’s Universal Scaling Laws,” Physics Today 57 (2004): 36–42. The various technical papers devoted to specific elaborations and ramifications of this framework will be cited in the appropriate places in later chapters. 15. The seminal paper detailing these results is L. M. A. Bettencourt, et al., “Growth, Innovation, Scaling, and the Pace of Life in Cities,” Proceedings of the National Academy of Sciences USA 104 (2007): 7301–6. Subsequent papers dealing with specific subtopics will be cited in the appropriate places in later chapters.

pages: 361 words: 100,834

Mapmatics: How We Navigate the World Through Numbers
by Paulina Rowinska
Published 5 Jun 2024

Hamming set out to invent a way for the computer to detect and correct an error, which would allow it to continue with the computations. Computers translate anything we input into long sequences of zeros and ones called bits. This means that as long as the machine can detect an error, it can also correct it – change a zero into one, or one into zero. In 1950, Hamming published a seminal paper, ‘Error Detecting and Error Correcting Codes’, in which he described his method of error detection and error correction. But before we dig into his idea, let’s talk about my experience calling a bank when I first moved to the UK. ‘Could you spell your last name for me?’ a high-pitched voice with a thick northern accent asked.
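The core of Hamming's method can be shown with the small (7,4) variant, a sketch rather than his full treatment: three parity bits, each covering an overlapping subset of positions, together pinpoint the location of any single flipped bit, which the machine then flips back.

```python
# Toy Hamming (7,4) code: 4 data bits gain 3 parity bits, letting the
# receiver locate and correct any single corrupted bit. Positions
# follow Hamming's 1-indexed scheme, with parity bits at 1, 2, and 4.

def encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p4, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4   # covers positions 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def decode(c):
    """c: 7-bit codeword, possibly with one flipped bit -> corrected data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # recheck parity group 1
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # recheck parity group 2
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]   # recheck parity group 4
    syndrome = s1 + 2 * s2 + 4 * s4  # 0 = no error; else 1-indexed error position
    if syndrome:
        c = list(c)
        c[syndrome - 1] ^= 1         # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
sent = encode(word)
corrupted = list(sent)
corrupted[4] ^= 1                    # one bit flipped in transit
assert decode(corrupted) == word     # the error is found and fixed
```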

He argued that ‘this type of solution bears little relationship to mathematics’ and ‘the solution is based on reason alone, and its discovery does not depend on any mathematical principle’. Despite this sceptical initial response, just a few months later, Euler presented the solution to his colleagues at the Academy of Sciences in Saint Petersburg, and in 1741 wrote the seminal paper ‘Solutio problematis ad geometriam situs pertinentis’ (‘The solution of a problem relating to the geometry of position’). In the paper, Euler referred to a branch of geometry that ‘is concerned only with the determination of position and its properties; it does not involve distances, nor calculations made with them’.
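Euler's criterion translates directly into modern graph terms: a connected graph admits a walk crossing every edge exactly once only if at most two vertices touch an odd number of edges. A short sketch applying it to the seven Königsberg bridges (land masses labelled A to D here for convenience):

```python
from collections import Counter

# The seven bridges of Königsberg as an edge list between the four
# land masses. Count each vertex's degree and check Euler's condition.
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

degree = Counter()
for u, v in bridges:
    degree[u] += 1
    degree[v] += 1

odd = [v for v, d in degree.items() if d % 2]
print(len(odd))   # 4 odd-degree vertices -> no such walk exists, as Euler showed
```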

pages: 358 words: 106,729

Fault Lines: How Hidden Fractures Still Threaten the World Economy
by Raghuram Rajan
Published 24 May 2010

A single construction worker with a backhoe can shift far more mud than several workers with shovels and wheelbarrows. If, however, the only difference between the rich and the poor countries is physical capital, the obvious question, posed by the University of Chicago Nobel laureate Robert Lucas in a seminal paper in 1990, is, Why does more money not flow from rich countries to poor countries so as to enable the poor countries to buy the physical capital they need?2 After all, poor countries would gain enormously from a little more capital investment: in some parts of Africa, it is easier to get to a city a few hundred miles away by taking a flight to London or Paris and taking another flight back to the African destination than to try to go there directly.

Savings and Investment In the perfect world envisioned by economists, a country’s investments should not depend on its savings. After all, countries should be able to borrow as much as they need from international financial markets if their investment opportunities are good, and their own domestic savings should be irrelevant. So there should be a low correlation between a country’s investment and its savings. In a seminal paper in 1980, Martin Feldstein from Harvard University and Charles Horioka from Osaka University showed that this assumption was incorrect: there was a much higher positive correlation between a country’s investment and its savings than one might expect if capital flowed freely across countries.4 The interpretation of these findings was that countries, especially poor ones like Burundi and Ecuador, could not get as much foreign financing as they needed, so they had to cut their coats to fit the cloth.

pages: 350 words: 103,270

The Devil's Derivatives: The Untold Story of the Slick Traders and Hapless Regulators Who Almost Blew Up Wall Street . . . And Are Ready to Do It Again
by Nicholas Dunbar
Published 11 Jul 2011

The second type, market diversification, covers situations where the risk involves traded investments that can go down in price (and give poor returns) rather than either surviving or defaulting. The theory behind market diversification dates back to 1952, when Harry Markowitz, a PhD student at the University of Chicago, published a seminal paper, “Portfolio Selection,” which showed that if the prices of assets behaved independently, increasing the number of investments would always reduce the variance of returns. If you argued that variance (the degree to which investment returns fluctuated around their average) was a bad thing, then adding more investments—or diversifying your portfolio—was unquestionably a good thing.
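Markowitz's independence result can be checked numerically. A sketch with simulated, not real, asset returns: for independent assets of equal variance, the variance of an equal-weighted portfolio falls as 1/n.

```python
import random, statistics

# Equal-weighted portfolio of n independent, identically distributed
# assets: theory says Var = sigma^2 / n, so 16 assets should cut the
# variance roughly 16-fold versus holding one.
random.seed(42)

def portfolio_variance(n_assets, n_periods=20000):
    """Sample variance of an equal-weighted portfolio of n iid assets."""
    returns = []
    for _ in range(n_periods):
        # each asset: mean 5% return, 20% standard deviation, independent
        rs = [random.gauss(0.05, 0.20) for _ in range(n_assets)]
        returns.append(sum(rs) / n_assets)
    return statistics.variance(returns)

v1, v16 = portfolio_variance(1), portfolio_variance(16)
print(round(v1 / v16, 1))   # close to 16, up to sampling noise
```

In practice real asset prices do not behave independently, which is exactly why the diversification benefit in crises turned out smaller than these models suggested.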

What started out as an arcane twist of high finance—derivatives—has now corrupted the entire financial world, and has set a hellish trap for taxpayers and their representatives that offers no way out.

Appendix: A timeline of some significant historical events referred to in the book, and episodes involving the book’s key characters.

1973 Fischer Black, Myron Scholes, and Robert Merton publish seminal papers on option pricing
1974 Robert Merton publishes paper using option theory to link debt and equity
1986 Start of S&L crisis
1987 Oldrich Vasicek publishes working paper applying Merton’s work to credit portfolios; Federal Reserve protects Wall Street securities firms from October stock market crash by ensuring that banks lend
1988 Basel I bank capital accord agreed; Nick Sossidis and Stephen Partridge-Hicks set up Alpha Finance for Citibank
1994 VAR models protect commercial banks from market turmoil
1995 Barings Bank almost bankrupted by Nick Leeson’s rogue trading; Sossidis and Partridge-Hicks set up Sigma
1996 Basel Committee agrees to incorporate VAR-based trading book rules into bank capital accord; Citibank launches Centauri SIV; Moody’s binomial expansion technique CDO rating model published
1997 J.P.

pages: 465 words: 103,303

The Cancer Chronicles: Unlocking Medicine's Deepest Mystery
by George Johnson
Published 26 Aug 2013

That kind of clarity: The experiments by Avery, Hershey, and Chase, and the discovery of DNA’s double-helical structure, are described in Horace Freeland Judson’s The Eighth Day of Creation: Makers of the Revolution in Biology, expanded ed. (Cold Spring Harbor, NY: Cold Spring Harbor Laboratory Press, 1996). The seminal papers include Oswald T. Avery, Colin M. MacLeod, and Maclyn McCarty, “Studies on the Chemical Nature of the Substance Inducing Transformation of Pneumococcal Types,” The Journal of Experimental Medicine 79, no. 2 (February 1, 1944): 137–58 [http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2135445]; A. D.

See Bert Vogelstein et al., “Genetic Alterations During Colorectal-tumor Development,” New England Journal of Medicine 319, no. 9 (September 1, 1988): 525–32. [http://www.ncbi.nlm.nih.gov/pubmed/2841597] 3. “For decades now”: Hanahan and Weinberg, “The Hallmarks of Cancer” (italics added). 4. don’t necessarily have to occur through mutations: The seminal paper on epigenetics is Andrew P. Feinberg and Bert Vogelstein, “Hypomethylation Distinguishes Genes of Some Human Cancers from Their Normal Counterparts,” Nature 301, no. 5895 (January 6, 1983): 89–92. [http://www.nature.com/nature/journal/v301/n5895/abs/301089a0.html] For a historical overview see Andrew P.

pages: 432 words: 106,612

Trillions: How a Band of Wall Street Renegades Invented the Index Fund and Changed Finance Forever
by Robin Wigglesworth
Published 11 Oct 2021

Higher-beta stocks are more volatile, and should therefore offer greater returns than steadier, lower-beta securities. And thus beta became the lingua franca for the returns of the stock market as a whole, while “alpha” later emerged as the term for the extra returns generated by a skilled investor. Not only did this gain Sharpe his PhD, but it eventually evolved into a seminal paper on what he called the “capital asset pricing model” (CAPM), a formula that investors could use to calculate the value of financial securities. The broader, groundbreaking implication of CAPM was introducing the concept of risk-adjusted returns—one had to measure the performance of a stock or a fund manager versus the volatility of its returns—and indicated that the best overall investment for most investors is the entire market, as it reflects the optimal tradeoff between risks and returns.
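The CAPM formula itself does not appear in this excerpt; as standardly stated, it is E[R_i] = R_f + β_i(E[R_m] − R_f): an asset's expected return is the risk-free rate plus its beta times the market risk premium. A minimal sketch (the rates and betas below are hypothetical, chosen only for illustration):

```python
def capm_expected_return(risk_free, beta, market_return):
    """Expected return of an asset under the CAPM:
    E[R_i] = R_f + beta_i * (E[R_m] - R_f)."""
    return risk_free + beta * (market_return - risk_free)

# A higher-beta (more volatile) stock must offer a higher expected return
# than a steadier, lower-beta one, as the passage describes.
low_beta = capm_expected_return(risk_free=0.03, beta=0.8, market_return=0.08)
high_beta = capm_expected_return(risk_free=0.03, beta=1.5, market_return=0.08)
```

With a 3 percent risk-free rate and an 8 percent market return, the premium is 5 percent, so the beta-0.8 stock earns 7 percent and the beta-1.5 stock 10.5 percent.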

Today, even fund managers who manage to beat their benchmarks are no longer safe from the revolution unleashed by John McQuown, Jack Bogle, and Nate Most. Voluminous research since the initial, inspirational spate of work in the 1960s and 1970s has kept hammering the point home that active management is for the most part still a “Loser’s Game,” as Charles Ellis termed it back in 1975. The seminal paper in the field was published in 1991 by William Sharpe, whose theories underpinned the original creation of the index fund, and was bluntly titled “The Arithmetic of Active Management.”16 This expanded on Sharpe’s earlier work, and addressed the suggestion that the index investing trend that was starting to gain ground at the time was a mere “fad.”
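Sharpe's point in "The Arithmetic of Active Management" is pure accounting: passive investors hold the market portfolio, so active investors in aggregate must also earn the market return before costs, and strictly less after their higher costs. A toy illustration with hypothetical numbers (none of the figures come from the book):

```python
market_return = 0.08   # return on the whole market, gross of costs
passive_share = 0.3    # hypothetical fraction of assets held in index funds

# The market is just the passive and active holdings combined, so the
# asset-weighted average active return before costs must equal the market:
active_gross = (market_return - passive_share * market_return) / (1 - passive_share)

passive_cost, active_cost = 0.001, 0.010   # assumed annual fee levels
passive_net = market_return - passive_cost
active_net = active_gross - active_cost

# After costs, the average active dollar underperforms the average passive
# dollar by exactly the difference in costs -- Sharpe's "arithmetic."
```

Note that the conclusion holds for any `passive_share` between 0 and 1: `active_gross` always works out to the market return.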

pages: 321 words: 113,564

AI in Museums: Reflections, Perspectives and Applications
by Sonja Thiel and Johannes C. Bernhardt
Published 31 Dec 2023

Allen, to OpenAI’s DALL-E 2 and Stability AI’s Stable Diffusion, which has become the most popular model: it requires less computing power and was made openly available, so that it was used so frequently that it nearly gave rise to its own flood of images.8 Given these unquestionable advances in computer-assisted picture generation over all previous approaches, and especially over the simple programs from the 1960s, it may be tempting to take up Noll’s question of ‘man or machine’ and answer it now without hesitation in favour of the machine.
6 For GANs used in the field of AI-generated art, see the seminal paper Goodfellow/Pouget-Abadie/Mirza et al. 2014 and the overview by Maerten/Soydaner 2023, 14–17.
7 The seminal paper here is Ho/Jain/Abbeel 2022. See also Maerten/Soydaner 2023, 19–22, for an overview of different diffusion models.
8 Decisive advances in efficiency were achieved in Rombach/Blattmann/Lorenz et al. 2022.

Economic Origins of Dictatorship and Democracy
by Daron Acemoğlu and James A. Robinson
Published 28 Sep 2001

He connects this to their economic power with respect to democracy – democrats cannot hurt previous elites if they have sufficient economic strength, perhaps because taxing the elite leads to a collapse in the economy. Rogowski (1998) similarly emphasizes the impact of the ability of citizens to exit as leading to democracy – a case in which voice prevents exit. Finally, our work builds on the literature that emphasizes how political institutions can solve problems of commitment. The seminal paper is by North and Weingast (1989), and this has been a theme of a series of important papers by Weingast (1997, 1998). 7. Our Contribution The ideas presented in this book build on the framework we introduced in Acemoglu and Robinson (2000a,b; 2001, 2002). There, we placed the issue of regime transitions within a framework of redistributive conflict and developed the basic idea of democracy as a credible commitment by the elites to avoid revolution and derived some of the important comparative static results – for instance, the inverted U-shaped relationship between inequality and democratization.

In addition to a model in which political conflict is between the rich and the poor, we want to examine what happens when conflict is based on other political identities. We introduce such a model in Subsection 4.4. 4.1 The Median Voter Model of Redistributive Politics We consider a society consisting of an odd number n of citizens (the model we develop builds on the seminal papers of Romer 1975, Roberts 1977, and Meltzer and Richard 1981). Person i = 1, 2, …, n has income y^i. Let us order people from poorest to richest and think of the median person as the person with median income, denoted y^M. Then, given that we are indexing people according to their incomes, the person with the median income is exactly individual M = (n + 1)/2.
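The indexing above can be sketched directly: order citizens poorest to richest and pick the person in position (n + 1)/2. A minimal illustration (the incomes are made up):

```python
def median_voter(incomes):
    """Return (M, y^M): the 1-indexed position and income of the median voter.
    With n odd and citizens ordered poorest to richest, M = (n + 1) / 2."""
    ordered = sorted(incomes)
    n = len(ordered)
    assert n % 2 == 1, "the model assumes an odd number of citizens"
    m = (n + 1) // 2            # 1-indexed position of the median person
    return m, ordered[m - 1]

# Five citizens: once sorted to [10, 25, 30, 40, 90], the decisive
# (median) voter is individual M = 3, with income 30.
m, y_m = median_voter([10, 40, 25, 90, 30])
```

The point of the construction is that under majority rule over one-dimensional redistribution, the policy preferred by this individual beats any alternative.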

Change the identity of who has political power and promises become credible. We are not the first to emphasize the commitment value of institutions. Although this theme appears in many writings and is implicit in others (e.g., the literature on structure-induced equilibrium; see Shepsle 1979; Romer and Rosenthal 1978; and Shepsle and Weingast 1984), it is probably most clearly associated with the seminal paper by North and Weingast (1989). They argued that the establishment of the constitutional regime in Britain after the Glorious Revolution of 1688 provided commitment that the Crown would not repudiate its debt, thereby increasing its borrowing capacity. This led to fundamental changes in financial institutions and provided part of the preconditions for the Industrial Revolution.

pages: 133 words: 36,528

Peak Car: The Future of Travel
by David Metz
Published 21 Jan 2014

Sahlins (1974) and Kelly (1995) report behaviour in surviving hunter-gatherer societies. History of transport: Wolmar (2007) for railways in Britain; Lay (1992) for the world’s roads and the vehicles that use them. General development of travel in Britain and elsewhere over the past forty years: Metz (2008a, 2008b, 2010, 2012, 2013a, 2013b). Seminal papers on the constancy of travel time are Marchetti (1994) and Schafer and Victor (2000). Cessation of growth of car travel: Puentes and Tomer (2008), Millard-Ball and Schipper (2011), Goodwin (2012a,b), Le Vine and Jones (2012), Kuhnimhof et al. (2012), Puentes (2012), Gargett (2012), Dutzik and Baxandall (2013), Transport Reviews 33(3) 2013.

pages: 492 words: 118,882

The Blockchain Alternative: Rethinking Macroeconomic Policy and Economic Theory
by Kariappa Bheemaiah
Published 26 Feb 2017

Secondly, they were incapable of integrating variables that represented microeconomic changes such as the elastic substitution of goods, the elasticity of labour supply (especially as technology replaced physical labour, making economies more service oriented rather than manufacturing intensive), etc. Finally, they did not recognize that the decision-making rules of economic agents would vary systematically with changes in monetary policy. This final flaw is often referred to as the Lucas Critique. The Lucas Critique and the introduction of Rational Expectations (following a seminal paper by Muth in 1961) led to the demise of neo-Keynesian models. In their stead, DSGE models came into being. The first DSGE models, known as Real Business Cycle (RBC) models, were introduced in the early 1980s and were based on the concepts detailed by Finn E. Kydland and Edward C. Prescott in 1982.

Transactional cost theory (TCT) is the branch of economics that deals with the costs of transactions and the institutions that are developed to govern them. It studies the cost of economic links and the ways in which agents organize themselves to deal with economic interactions. Coase realized that economic transactions are costly. In his seminal paper, ‘The Nature of the Firm’, Coase noted that while economies involve plenty of planning, a large part of this planning is not coordinated by the price system and takes place within the boundaries of the firm (Hidalgo, 2015). As firms have hierarchies, most interactions within a firm are political.

Visual Thinking: The Hidden Gifts of People Who Think in Pictures, Patterns, and Abstractions
by Temple Grandin, Ph.d.
Published 11 Oct 2022

The six-shooter pistol invented by Samuel Colt had a revolving cylinder, whittled out of wood, that automatically rotated the next bullet into position and allowed the gun to be fired multiple times without reloading, something that changed the face of warfare. All four of these inventors were mechanically clever; none of them would have needed higher math for their creations. Visual problem-solving is the stock in trade of the clever engineer. It’s how mechanical information has been transmitted through the centuries. In a seminal paper on visual thinking published in Science, Eugene S. Ferguson, an engineer and historian of technology, presented the visual record of technical knowledge that mushroomed with the advent of the printing press. In compiling artists’ and engineers’ notebooks, technical workbooks and manuals from the fifteenth to the twentieth century (including Leonardo da Vinci’s thousands of pages of technical drawings), Ferguson traces the record of human ingenuity in the detailed drawings of every known device and mechanism.

See spatial visualizers
visual thinkers
  aha moments and, 207
  description of, 9–13, 25–26
  identification of, 7, 16–18, 106, 277
  pairing with verbal thinkers, 145–46
  recognizing talents of, 119, 238, 276, 277
  screened out, 5–6, 54–55, 96, 107
  studies on, 26–30, 33, 37–38
  traits/skills of, 45–47, 161–62, 198–200
  See also object visualizers; spatial visualizers
visual thinking
  advantages of, 42–47, 257
  associational thinking, 10, 45
  bottom-up thinking of, 43–45, 129
  description of, 232
  seminal paper on, 88–89
  skills related to, 35–38
  in verbal world, 13–16, 239
  visual analogies, 45–47
visual-verbal continuum, 16–21
visual vocabulary, 175, 215
Visualizer-Verbalizer Cognitive Style Questionnaire (VVCSQ), 31
Vividness of Visual Imagery Questionnaire (VVIQ), 39–40
vocational programs, 92, 95, 110–12
Voices Within, The (Fernyhough), 13
von Bayern, Auguste M.

pages: 150 words: 43,467

Maths on the Back of an Envelope: Clever Ways to (Roughly) Calculate Anything
by Rob Eastaway
Published 18 Sep 2019

The average cow produces somewhere between 200 and 500 litres of methane per day (that’s a huge figure, not one I felt qualified to estimate at all, so I looked it up – and even official sources vary hugely in the figure they quote). Cows aren’t the only creatures responsible for methane. Every living creature contributes methane as a natural part of its digestion or decomposition. That includes humans. In the seminal paper ‘Investigation of Normal Flatus Production in Healthy Volunteers’ by J. Tomlin, C. Lowis and N.W. Read (what do you mean, you haven’t read it?), the authors concluded that the average human on a diet that includes 200 g of baked beans, produces about 15 ml of methane per day. To put that in context, remember that the figure for cows is of the order of hundreds of litres per day.
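The back-of-envelope comparison implied here can be checked in one line: at the figures quoted in the text, a single cow emits as much methane as somewhere between roughly thirteen and thirty-three thousand humans.

```python
cow_low, cow_high = 200, 500   # litres of methane per cow per day (quoted range)
human = 0.015                  # litres per human per day (15 ml, Tomlin et al.)

# How many bean-eating humans does it take to match one cow?
humans_per_cow_low = cow_low / human     # ~13,000 at the low end
humans_per_cow_high = cow_high / human   # ~33,000 at the high end
```

Either way, the order-of-magnitude conclusion is robust to the huge uncertainty in the cow figure.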

pages: 386 words: 122,595

Naked Economics: Undressing the Dismal Science (Fully Revised and Updated)
by Charles Wheelan
Published 18 Apr 2010

Economists have come up with a theory of political behavior that fits better with what we actually observe. When it comes to interest group politics, it pays to be small. Gary Becker, the same University of Chicago Nobel Prize winner who figured so prominently in our thinking about human capital, wrote a seminal paper in the early 1980s that nicely encapsulated what had become known as the economics of regulation. Building on work that went all the way back to Milton Friedman’s doctoral dissertation, Becker theorized that, all else equal, small, well-organized groups are most successful in the political process.

Here is a remarkable figure: Only two of thirty countries classified by the World Bank as rich—Hong Kong and Singapore—lie between the Tropic of Cancer (which runs through Mexico across North Africa and through India) and the Tropic of Capricorn (which runs through Brazil and across the northern tip of South Africa and through Australia). Geography may be a windfall that we in the developed world take for granted. Development expert Jeffrey Sachs wrote a seminal paper in which he posited that climate can explain much of the world’s income distribution. He writes, “Given the varied political, economic, and social histories of regions around the world, it must be more than coincidence that almost all of the tropics remain underdeveloped at the start of the twenty-first century.”15 The United States and all of Europe lie outside the tropics; most of Central and South America, Africa, and Southeast Asia lie within.

pages: 824 words: 218,333

The Gene: An Intimate History
by Siddhartha Mukherjee
Published 16 May 2016

Pauling had revealed his model at a meeting at Caltech with the dramatic flair of a sorcerer pulling a molecular bunny out of a hat: the model had been hidden behind a curtain until the end of the talk, and then—presto!—it had been revealed to a stunned, applauding audience. Rumor had it that Pauling had now turned his attention from proteins to the structure of DNA. Five thousand miles away, in Cambridge, Watson and Crick could almost feel Pauling breathing down their necks. Pauling’s seminal paper on the protein helix was published in April 1951. Festooned with equations and numbers, it was intimidating to read, even for experts. But to Crick, who knew the mathematical formulas as intimately as anyone, Pauling had hidden his essential method behind the smoke-and-mirrors algebra. Crick told Watson that Pauling’s model was, in fact, the “product of common sense, not the result of complicated mathematical reasoning.”

But he was vastly more interested in DNA and soon abandoned all other projects to focus on DNA. Watson, Annotated and Illustrated Double Helix, 127. “A youthful arrogance”: Crick, What Mad Pursuit, 64. “The trouble is, you see, that there is”: Watson, Annotated and Illustrated Double Helix, 107. Pauling’s seminal paper: L. Pauling, R. B. Corey, and H. R. Branson, “The structure of proteins: Two hydrogen-bonded helical configurations of the polypeptide chain,” Proceedings of the National Academy of Sciences 37, no. 4 (1951): 205–11. “product of common sense”: Watson, Annotated and Illustrated Double Helix, 44.

“It’s the magnesium”: “Albert Lasker Award for Special Achievement in Medical Science: Sydney Brenner,” Lasker Foundation, http://www.laskerfoundation.org/awards/2000special.htm. Like DNA, these RNA molecules were built: Two other scientists, Elliot Volkin and Lazarus Astrachan, had proposed an RNA intermediate for genes in 1956. The two seminal papers published by the Brenner/Jacob group and the Watson/Gilbert group in 1961 are: F. Gros et al., “Unstable ribonucleic acid revealed by pulse labeling of Escherichia coli,” Nature 190 (May 13, 1961): 581–85; and S. Brenner, F. Jacob, and M. Meselson, “An unstable intermediate carrying information from genes to ribosomes for protein synthesis,” Nature 190 (May 13, 1961): 576–81.

pages: 247 words: 43,430

Think Complexity
by Allen B. Downey
Published 23 Feb 2012

The criteria are the following: The case study should be relevant to complexity. For an overview of possible topics, see http://en.wikipedia.org/wiki/Complexity and http://en.wikipedia.org/wiki/Complex_systems. Topics not already covered in the book are particularly welcome. A good case study might present a seminal paper, reimplement an important experiment, discuss the results, and explain their context. Original research is not necessary and might not be appropriate for this format, but you could extend existing results. A good case study should invite the reader to participate by including exercises, references to further reading, and topics for discussion.

pages: 182 words: 45,873

Hacking the Code of Life: How Gene Editing Will Rewrite Our Futures
by Nessa Carey
Published 7 Mar 2019

So, once you have gene edited plant cells successfully, it can often be fairly straightforward to propagate lots of identical plants. Plant scientists recognised very quickly that the new techniques for gene editing could revolutionise the efficiency, speed and ease of creating new plant varieties. The first gene-edited plants were created just one year after Doudna and Charpentier’s seminal paper, by a number of research groups.13,14,15 Since then, researchers have improved the techniques and extended them to a whole range of plant species. It might be tempting to wonder why we need to bother with gene editing for plants, given that we have been creating new varieties for millennia, simply by cross-pollinating ones that have features we like.

How I Became a Quant: Insights From 25 of Wall Street's Elite
by Richard R. Lindsey and Barry Schachter
Published 30 Jun 2007

I joined as the firm’s first “rocket scientist,” someone with extensive training in science and none in finance.2 When Richard Grinold hired me, he also invited me to sit in on the graduate seminar that he was teaching at Berkeley.3 My first months at the firm included in-depth work on a particular project, as well as a survey of seminal papers in academic finance. My project was to research an improved model for interest rate options. For example, the U.S. Treasury had long issued bonds with embedded options, allowing them to pay back investors once the bonds were within five years of maturing. Corporations issued similar bonds. If the bonds paid 12 percent interest, and rates had fallen to 8 percent, the Treasury could refinance at a lower rate.

He is a founding member of the board of Math for America, a nonprofit dedicated to improving the quality of mathematics teaching in the United States. He is also a member of the board of the Mathematical Finance program at the University of Chicago. Dr. Chriss has published extensively in quantitative finance – including “Optimal Execution of Portfolio Transactions,” a seminal paper on algorithmic trading, “Optimal Portfolios from Ordering Information,” and the book Black-Scholes and Beyond: Modern Option Pricing. Dr. Chriss holds a BS and PhD in mathematics from the University of Chicago and an MS in mathematics from the California Institute of Technology. Andrew Davidson is president and founder of Andrew Davidson & Co., Inc., a consulting firm specializing in the application of analytical tools to investment management.

pages: 475 words: 134,707

The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health--And How We Must Adapt
by Sinan Aral
Published 14 Sep 2020

In fact, Facebook wasn’t creating social connections so much as grafting itself onto social connections that already existed—between college friends, high school buddies, and work colleagues. The tight-knit relationships among Facebook users turbocharged its local network effect. In fact, Jeffrey Rohlfs anticipated Facebook’s exact go-to-market strategy in his seminal paper. He considered the launch strategies of new services that display local network effects and suggested giving away the services to carefully selected groups of people for a limited period of time. Since, as Rohlfs writes, “an individual’s demand may depend primarily on which of his few principal contacts are users…the success of this approach may also depend on how the initial user set is selected.”

Travers and Stanley Milgram, “An Experimental Study of the Small World Problem,” Sociometry 32 (1969); Duncan J. Watts, “Networks, Dynamics, and the Small World Phenomenon,” American Journal of Sociology 105, no. 2 (1999): 493–527. Facebook recruited users within: This “group-based targeting,” incidentally, is the same go-to-market strategy advocated by Jeffrey Rohlfs in his seminal paper on network effects published in 1974 and reiterated onstage by Sean Parker in conversation with Jimmy Fallon, in reference to Facebook’s go-to-market strategy, at the NextWork Conference in 2011 (see footnote in Chapter 5). Individuals with access to scarce, novel: Ronald Burt, Structural Holes: The Social Structure of Competition (Cambridge, Mass.: Harvard University Press, 1992); Ronald Burt, “Structural Holes and Good Ideas,” American Journal of Sociology 110 (2004): 349–99; A.

pages: 214 words: 14,382

Monadic Design Patterns for the Web
by L.G. Meredith

• for( fn( _, ..., _ ) <- d if true(c1, ..., cn) ) yield fn

Cardelli and Gordon’s ambient calculus takes this presentation one step further and adds a set of conditional rewrite rules to express the computational content of the model. It was Milner who first recognized this particular decomposition of language definitions in his seminal paper, Functions as Processes, where he reformulated the presentation of the π-calculus along these lines.

• for( _( fixpt ) <- d if ((f) => ((x) => f(x(x)))((x) => f(x(x))))(true) ) yield fixpt
• for( a <- d if ⟨(x) => ((Y f) x)⟩ a ) yield a

The first of these will return the expressions in “function” position applied to the actual parameters meeting the conditions ci, respectively.
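The condition in the fixpt comprehension above is the Y combinator, λf.(λx.f(x x))(λx.f(x x)). In a strict (call-by-value) language it must be eta-expanded into the Z combinator, or evaluation diverges. A minimal sketch of that form — in Python purely for illustration, since the excerpt's own notation is Scala-like:

```python
# Z combinator: the call-by-value form of Y = λf.(λx.f(x x))(λx.f(x x)).
# The inner lambdas are eta-expanded (lambda v: x(x)(v)) so that x(x)
# is only evaluated when the recursive call actually happens.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Tying the knot for factorial without any explicit recursion:
fact = Z(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
```

Here `fact` is a genuine fixed point of the functional passed to `Z`: calling `self` inside the body re-enters the same function.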

pages: 236 words: 50,763

The Golden Ticket: P, NP, and the Search for the Impossible
by Lance Fortnow
Published 30 Mar 2013

For a readable story of the four-color problem, see Robin Wilson, Four Colors Suffice: How the Map Problem Was Solved (Princeton, NJ: Princeton University Press, 2004). Chapter 4 The quotation from Cook is actually a paraphrase in modern terminology of the original quotation from his seminal paper. The original reads as follows: The theorems suggest that {tautologies} is a good candidate for an interesting set not in L*, and I feel it is worth spending considerable effort trying to prove this conjecture. Such a proof would be a major breakthrough in complexity theory. Steve Cook, “The Complexity of Theorem-Proving Procedures,” in Proceedings of the Third Annual ACM Symposium on Theory of Computing (New York: ACM), 151–58.

pages: 181 words: 52,147

The Driver in the Driverless Car: How Our Technology Choices Will Create the Future
by Vivek Wadhwa and Alex Salkever
Published 2 Apr 2017

Some researchers, such as Erik Brynjolfsson and Andrew McAfee of the Massachusetts Institute of Technology, see the automatons inevitably gobbling up more and more meaningful slices of our work.9 Oxford University researchers Carl Benedikt Frey and Michael A. Osborne caused a tremendous stir in September 2013, when they asserted in a seminal paper that A.I. would put 47 percent of current U.S. employment “at risk.”10 The paper, “The Future of Employment,” is a rigorous and detailed historical review of research on the effect of technology innovation upon labor markets and employment. In a recent research paper, McKinsey & Company found that “only about 5 percent of occupations could be fully automated by adapting current technology.

pages: 184 words: 53,625

Future Perfect: The Case for Progress in a Networked Age
by Steven Johnson
Published 14 Jul 2012

He anointed the message fragments with the slightly more Anglo name of “packets,” and the general approach “packet switching.” The metaphors stuck. Today, the vast majority of data circling around the globe comes in the form of message fragments that we still call packets. Years after both Baran and Davies had published their seminal papers, Davies jokingly said to Baran, “Well, you may have got there first, but I got the name.” In the late 1960s, packet switching became the foundation of ARPANET, the research network that laid the groundwork for the Internet. The ARPANET design relied on several radical principles that broke with existing computing paradigms.

Investment: A History
by Norton Reamer and Jesse Downing
Published 19 Feb 2016

Instead of considering market participants as hyperrational agents obeying arguably overly elegant utility functions, they are thought of as possessing biases, prejudices, and tendencies that have real and measurable effects on markets and financial transactions. Daniel Kahneman and Amos Tversky wrote a seminal paper in the field outlining what they call prospect theory, a description of individuals’ optimization outside of the classical expected utility framework. Their pioneering paper noted many of the known behaviors that represent aberrations from expected utility theory, including lottery problems (in which individuals tend to elect a lump-sum payment up front even if that is smaller than the expected value of receiving a larger amount or zero when a coin flip is involved) and probabilistic insurance (in which individuals have a disproportionate dislike for a form of insurance that would cover losses based on a coin flip, more than the math suggests they should).

Prospect theory contends that individuals’ choices are more centered on changes in utility or wealth rather than end values; it also suggests that most people exhibit loss aversion in which losses cause more harm to one’s welfare than the benefit from happiness one receives from gaining the same amount of reward.46 This theory may seem intellectually interesting, but how does it relate precisely to finance and investing? Since Kahneman and Tversky’s seminal paper, subsequent work has made many connections to markets, one of which is the “equity premium puzzle.” The equity premium puzzle was described first in a 1985 paper by Rajnish Mehra and Edward Prescott.47 The central “puzzle” is that while investors should be compensated more for holding riskier equities than holding the risk-free instrument (Treasury bills), the amount by which they are compensated seems extremely excessive historically.
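Loss aversion is commonly parameterised with Kahneman and Tversky's value function over changes in wealth: v(x) = x^α for gains and v(x) = −λ(−x)^β for losses, with λ > 1 making losses loom larger. The parameter values below (α = β = 0.88, λ = 2.25) are their later published estimates, not figures from this book; a sketch:

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: utility is defined over gains and
    losses relative to a reference point, and losses are weighted by the
    loss-aversion coefficient lam > 1."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# A $100 loss hurts more than a $100 gain pleases, because lam > 1:
pain_of_loss = abs(value(-100))
pleasure_of_gain = value(100)
```

This asymmetry is exactly the loss aversion the passage describes, and it is one ingredient in behavioral explanations of the equity premium puzzle discussed next.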

pages: 542 words: 145,022

In Pursuit of the Perfect Portfolio: The Stories, Voices, and Key Insights of the Pioneers Who Shaped the Way We Invest
by Andrew W. Lo and Stephen R. Foerster
Published 16 Aug 2021

In January 1962, Sharpe first presented his results at a University of Chicago seminar. Shortly afterward he submitted the paper, titled “Capital Asset Prices: A Theory of Market Equilibrium under Conditions of Risk,” to the prestigious Journal of Finance, at the time the top academic publication in the field and also where Markowitz had published his seminal paper. Sharpe received an initial negative report from an anonymous referee. Sharpe’s assumptions, the report commented, including the important assumption that all investors would make the same predictions about the expected returns and risks of securities, were so “preposterous” that all subsequent conclusions were “uninteresting.”32 Sharpe kept trying with the Journal of Finance.

We didn’t submit it to a finance journal because we thought this would have a broader application.”47 Shortly after the submission, Black received what is known as a desk reject, a decision by the editor of the journal to reject a submission outright rather than soliciting the views of “blind” referees. The letter indicated that their paper was too specialized for the journal and would be better suited for the Journal of Finance, where Markowitz and Sharpe had published their seminal papers. Black then sent the paper to another prestigious economics journal, the Review of Economics and Statistics, founded in 1919 and published by MIT Press, and again received a prompt rejection letter. Black suspected that at least part of the reason for the prompt rejections was because it was clear from Black’s return address that he wasn’t at an academic institution, and thus the paper wasn’t taken seriously.

pages: 225 words: 61,388

Dead Aid: Why Aid Is Not Working and How There Is a Better Way for Africa
by Dambisa Moyo
Published 17 Mar 2009

Aid success in good policy environments Faced with mounting evidence that aid has not worked, aid proponents have also argued that aid would work, and did work, when placed in good policy environments, i.e. countries with sound fiscal, monetary and trade policies. In other words, aid would do its best, when a country was in essentially good working order. This argument was formalized in a seminal paper published by World Bank economists Burnside and Dollar in 2000. (Quite why a country in working order would need aid, or not seek other better, more transparent forms of financing itself, remains a mystery.) Donors soon latched onto the Burnside–Dollar result and were quick to put the findings into practice.

pages: 200 words: 60,987

The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America
by Steven Johnson
Published 26 Dec 2008

The giants of the Carboniferous illuminate the enduring power of Priestley’s original mint experiment, the long flame of associations and insights that came out of that original spark. Priestley and Franklin’s hunch that plant life was central to the planet’s production of breathable air first approached scientific consensus in the late 1960s, after two physicists, Lloyd Berkner and Lauriston Marshall, proposed in a seminal paper that the vast majority of atmospheric oxygen originated in photosynthesis. The “natural” level of oxygen on Earth was less than 1 percent; the 20.7 percent levels we enjoy as respiring mammals was an artificial state, engineered by the evolutionary breakthrough that began with cyanobacteria billions of years ago.

pages: 203 words: 63,257

Neutrino Hunters: The Thrilling Chase for a Ghostly Particle to Unlock the Secrets of the Universe
by Ray Jayawardhana
Published 10 Dec 2013

Carl Anderson discovered the positron in cosmic rays, confirming Paul Dirac’s prediction of the existence of antimatter.
1933: Fermi formulated a theory of beta decay that incorporated the neutrino and foreshadowed the weak force.
1937: Ettore Majorana proposed that the neutrino could be its own antiparticle.
1939: Hans Bethe published his seminal paper on energy production in stars but failed to mention neutrinos.
1946: Bruno Pontecorvo proposed that neutrinos produced in nuclear reactors and in the Sun could be detected with chlorine-based experiments.
1955–56: Researchers using the Bevatron in California identified the antiproton and the antineutron.
1956: Frederick Reines and Clyde Cowan definitively detected (anti)neutrinos using a nuclear reactor as the source.
1956–57: T.

pages: 229 words: 61,482

The Gig Economy: The Complete Guide to Getting Better Work, Taking More Time Off, and Financing the Life You Want
by Diane Mulcahy
Published 8 Nov 2016

It’s difficult to rely solely on a compelling résumé or being heard above the persistent noise of social media. We’re better off cultivating our connections and networks of people who know us, like us, and can help point us toward good opportunities. A good network is both deep and broad. Mark Granovetter, a sociologist at Stanford, best described the benefits of both in his seminal paper “The Strength of Weak Ties.”1 Our deep connections come from what he calls strong ties. They are limited in number and are the people we know best and interact with most frequently, like spouses, close friends, and current colleagues. Strong ties are important emotionally and are essential as the backbone of any fulfilling life.

pages: 236 words: 66,081

Cognitive Surplus: Creativity and Generosity in a Connected Age
by Clay Shirky
Published 9 Jun 2010

The immediate effect of their actions was to reduce the amount of trash on a few market streets, but their longer-term value is not their output but their example. As they put it in a Responsible Citizens manifesto: “We wish to nurture in everyone a community spirit.” They were trying to make civic action contagious. This idea is less crazy than it sounds. In 1973 Mark Granovetter showed in a seminal paper, “The Strength of Weak Ties,” that people tend to find jobs through casual acquaintances rather than through close friends or family. Since then an increasing body of research has demonstrated the importance of social networks to our well-being. Nicholas Christakis and James Fowler, researchers at Harvard Medical School, have shown that social networks spread all kinds of behaviors: we are likelier to be obese if our friends are obese, or to exercise if they exercise, or even to be happy if they are happy.

pages: 259 words: 67,456

The Mythical Man-Month
by Brooks, Jr. Frederick P.
Published 1 Jan 1975

L., "On the design and development of program families," IEEE Trans. on Software Engineering, SE-2, 1 (March, 1976), pp. 1-9; Parnas, D. L., "Designing software for ease of extension and contraction," IEEE Trans. on Software Engineering, SE-5, 2 (March, 1979), pp. 128-138. D. Harel, "Biting the silver bullet," Computer (Jan., 1992), pp. 8-20. The seminal papers on information hiding are: Parnas, D. L., "Information distribution aspects of design methodology," Carnegie-Mellon, Dept. of Computer Science, Technical Report (Feb., 1971); Parnas, D. L., "A technique for software module specification with examples," Comm. ACM, 5, 5 (May, 1972), pp. 330-336; Parnas, D.

pages: 272 words: 71,487

Getting Better: Why Global Development Is Succeeding--And How We Can Improve the World Even More
by Charles Kenny
Published 31 Jan 2011

And the political economy of public service provision in countries worldwide means that richer people are more likely to have better access to and quality of service from supposedly “universal” services where they are available at all. In Africa, for example, the richest fifth of the population benefits from 30 percent of public health expenditure while the poorest fifth benefits from only 12 percent of such expenditure.3 Wealthier countries are, unsurprisingly, healthier, according to a seminal paper by Lant Pritchett and Lawrence Summers. High-income countries see average life expectancies twenty years longer than those in low-income countries. Fewer than 1 percent of children die before the age of five in rich countries compared to 12 percent of children in low-income countries—a comparative toll of 100,000 children dying each year in wealthy countries compared to 10 million in the developing world.

pages: 245 words: 64,288

Robots Will Steal Your Job, But That's OK: How to Survive the Economic Collapse and Be Happy
by Pistono, Federico
Published 14 Oct 2012

What I found was a very complicated and intricate world of happiness research, much more complex than I originally thought it would be. Richard Easterlin, economist and Professor of Economics at the University of Southern California, discussed the factors contributing to happiness in his 1974 seminal paper ‘Does Economic Growth Improve the Human Lot? Some Empirical Evidence’137. He found that the average reported level of happiness does not vary much with national income per person, at least for countries with income sufficient to meet basic needs. Similarly, although income per person rose steadily in the United States between 1946 and 1970, average reported happiness showed no long-term trend and declined between 1960 and 1970.

pages: 239 words: 56,531

The Secret War Between Downloading and Uploading: Tales of the Computer as Culture Machine
by Peter Lunenfeld
Published 31 Mar 2011

This trifle, inspired at least in part by the renown of Christopher’s uncle Lytton Strachey’s 1918 portrait of a generation, Eminent Victorians, is the product of a stored program computer, and as such may well be the first aesthetic object produced by the ancestors of the culture machine. The love letter generator’s intentional blurring of the boundary between human and nonhuman is directly related to one of the foundational memes of artificial intelligence: the still-provocative Turing Test. In “Computing Machinery and Intelligence,” a seminal paper from 1950, Turing created a thought experiment. He posited a person holding a textual conversation on any topic with an unseen correspondent. If the person believes he or she is communicating with another person, but is in reality conversing with a machine, then that machine has passed the Turing Test.

pages: 257 words: 66,480

Strange New Worlds: The Search for Alien Planets and Life Beyond Our Solar System
by Ray Jayawardhana
Published 3 Feb 2011

“The scientist at his or her purest is very similar to the artist,” he explained. “They have common goals; they’re both searching for the truth. The difference is that scientific truth is external truth whereas the truth that a writer or a painter sees is inner truth.” In 1977, Shu published a seminal paper on star formation, building on previous work by Yale University astronomer Richard Larson and others. In it, he proposed a simple, yet elegant, model showing that cloud cores collapse “inside out,” first forming a small central star onto which the rest of the material falls. Because the cloud is spinning, it actually flattens into a disk as it shrinks in size, sort of like how pizza dough makes a pie as it is spun in the air.

pages: 213 words: 68,363

Never Enough: The Neuroscience and Experience of Addiction
by Judith Grisel
Published 15 Feb 2019

Moreover, the b process can be elicited solely by environmental stimuli that promise the a process is coming—which is what happened with Pavlov’s dogs, who learned to salivate even when food was not present. Our experience (solid line) is the combined effect of the drug (a process) and the brain’s opponent response to the drug (b process). I don’t have any tattoos, but on my short list, if I decide to get one, is a figure like the one shown below, also copied from Solomon and Corbit’s seminal paper and illustrating the changes that occur in the b process as a result of adaptation. Note how the experience of the stimulus is dramatically altered, so that now there is hardly a bump in feeling state. In many ways, this figure is the theoretical heart of scientific understanding about addiction and the core of this book, depicting how the drug comes to function mainly to stave off withdrawal and craving in the face of the brain’s powerful ability to counteract perturbation.

pages: 212 words: 68,649

Wordslut: A Feminist Guide to Taking Back the English Language
by Amanda Montell
Published 27 May 2019

Certainly no one thinks that because the Spanish word for eye (ojo) is masculine and the word for chin (barbilla) is feminine, Spanish speakers perceive eyes as inherently macho body parts and chins as inherently ladylike ones. But toward the end of the twentieth century, linguist Suzanne Romaine determined that this relationship between grammatical and “natural” gender is not always so separate. In 1997 Romaine published a seminal* paper called “Gender, Grammar, and the Space in Between.” The same year of Princess Diana’s death and Mike Tyson’s bite fight, Romaine was blowing minds at the University of Oxford with the theory that in languages all over the world, there is some undeniable “leakage” going on between grammatical gender and how we perceive human gender in real life.

pages: 227 words: 63,186

An Elegant Puzzle: Systems of Engineering Management
by Will Larson
Published 19 May 2019

We describe several related efforts to measure and pay down technical debt found in Google’s BUILD files and associated dead code. We address debt found in dependency specifications, unbuildable targets, and unnecessary command line flags. These efforts often expose other forms of technical debt that must first be managed. “No Silver Bullet—Essence and Accident in Software Engineering” A seminal paper from the author of The Mythical Man-Month, “No Silver Bullet” expands on discussions of accidental versus essential complexity, and argues that there is no longer enough accidental complexity to allow individual reductions in that accidental complexity to significantly increase engineer productivity.

pages: 1,041 words: 317,136

American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer
by Kai Bird and Martin J. Sherwin
Published 18 Dec 2007

When one of his graduate students wrote him for help in raising money for a particular project, Oppie replied whimsically that such research, “like marriage and poetry, should be discouraged and should occur only despite such discouragement.” On February 14, 1930, Oppenheimer finished writing a seminal paper, “On the Theory of Electrons and Protons.” Drawing on Paul Dirac’s equation on the electron, Oppenheimer argued that there had to be a positively charged counterpart to the electron—and that this mysterious counterpart should have the same mass as the electron itself. It could not, as Dirac had suggested, be a proton.

The so-called “Lamb shift” correctly attributed the difference between the two energy levels to the process of self-interaction—whereby charged particles interact with electromagnetic fields. Lamb won a Nobel Prize in 1955, in part for his precise measurement of the Lamb shift, a key step in the development of quantum electrodynamics. During these years, Oppenheimer wrote important, even seminal, papers on cosmic rays, gamma rays, electrodynamics and electron-positron showers. In the field of nuclear physics, he and Melba Phillips calculated the yield of protons in deuteron reactions. Phillips, an Indiana farm girl, born in 1907, was Oppenheimer’s first doctoral student. Their calculations on proton yields became widely known as the “Oppenheimer-Phillips process.”

IN FEBRUARY 1944, a team of British scientists led by the German-born Rudolf E. Peierls arrived in Los Alamos. Oppenheimer had first met this brilliant but unassuming theoretical physicist in 1929, when both men were studying under Wolfgang Pauli. Peierls had emigrated from Germany to England in the early 1930s, and in 1940 he and Otto R. Frisch had written the seminal paper “On the Construction of a Superbomb,” which had persuaded both the British and American governments that a nuclear weapon was feasible. During the next several years, Peierls worked on all aspects of Tube Alloys, the British bomb program. In 1942 and again in September 1943, Prime Minister Winston Churchill sent Peierls to America to help expedite work on the bomb.

pages: 193 words: 19,478

Memory Machines: The Evolution of Hypertext
by Belinda Barnet
Published 14 Jul 2013

If the user edited this tree structure, then the structure of the corresponding printed document would change. The model was derived from a review of the literature in cognitive psychology, composition theory and the nascent field of human computer studies (which would later become Human Computer Interaction (HCI)). Bolter coauthored a seminal paper on WE in 1986 with Smith, Lansman, Stephen Weiss, David Beard and Gordon Ferguson, and contributed to its development. From the outset in the discussions this group were having, however, it became apparent to Smith that: Jay was more interested in the Macintosh hardware, and he was also interested in applying these ideas to literature and a literary context.

pages: 280 words: 76,638

Rebel Ideas: The Power of Diverse Thinking
by Matthew Syed
Published 9 Sep 2019

When they interacted effectively, they exceeded the capability of individual members.’ 26 Cass Sunstein and Reid Hastie, Wiser: Getting Beyond Groupthink to Make Groups Smarter (Harvard Business Review Press, 2014). 27 Adam Galinsky and Maurice Schweitzer, Friend and Foe: When to Cooperate, When to Compete, and How to Succeed at Both (Crown, 2015). 28 https://journals.aom.org/doi/10.5465/ambpp.2017.313 29 Adam Galinsky and Maurice Schweitzer, Friend and Foe. 30 Quoted in Joseph Henrich, The Secret of Our Success (Princeton University Press, 2015). 31 The seminal paper was written by Henrich and Gil-White, https://www.ncbi.nlm.nih.gov/pubmed/11384884 32 Conversation with the author. 33 https://static1.squarespace.com/static/56cf3dd4b6aa60904403973f/t/57be0776f7e0ab26d736060e/1472071543508/dominance-and-prestige-dual-strategies-for-navigating-social-hierarchies.pdf 34 Conversation with the author. 35 https://creighton.pure.elsevier.com/en/publications/psychological-safety-a-meta-analytic-review-and-extension 36 https://rework.withgoogle.com/blog/five-keys-to-a-successful-google-team/ 37 Conversation with the author. 38 Conversation with the author. 39 https://www.linkedin.com/pulse/beauty-amazons-6-pager-brad-porter 40 Conversation with the author. 41 Quoted in Adam Grant, Originals: How Non-Conformists Change the World (W.

pages: 240 words: 78,436

Open for Business Harnessing the Power of Platform Ecosystems
by Lauren Turner Claire , Laure Claire Reillier and Benoit Reillier
Published 14 Oct 2017

Visa and Mastercard may not have used the term multisided markets when they launched, but their operations – of connecting card users and merchants – clearly exhibited the economic characteristics of platform businesses.8 The concept of multisided markets started to be formalized by academics in 2000. Geoff Parker and Marshall Van Alstyne were among the first economists to look closely at platform business models while trying to understand how firms such as Microsoft could sustainably offer free software.9 Shortly after, Jean-Charles Rochet and Jean Tirole published a seminal paper on the economics of card platforms in 2002. Their research proposed a new economic model of the price relationships used on both sides of a multisided market to better coordinate demand.10 While the main focus area of the paper was credit cards, the analysis and key findings apply more widely.

pages: 270 words: 79,180

The Middleman Economy: How Brokers, Agents, Dealers, and Everyday Matchmakers Create Value and Profit
by Marina Krakovsky
Published 14 Sep 2015

See Alan Fiske, “The Four Elementary Forms of Sociality: Framework for a Unified Theory of Social Relations,” Psychological Review 99, no. 4 (1992): 689–723. 42.Thiers is using the term “tipping point” in Malcolm Gladwell’s sense—the moment at which something suddenly starts to spread like wildfire. Scholars of two-sided networks also talk about tipping (although not tipping points), but they mean something rather different: “the tendency of one system to pull away from its rivals in popularity once it has gained an initial edge,” according to a seminal paper on network effects. See Michael L. Katz and Carl Shapiro, “Systems Competition and Network Effects,” Journal of Economic Perspectives 8, no. 2 (Spring 1994): 106. This kind of tipping doesn’t always occur. For example, when users can simultaneously be on two or more platforms (multihoming), a single platform need not prevail—so while SitterCity may have reached a tipping point, the matchmaking between parents and babysitters hasn’t tipped toward any one platform. 43.This statement is a corollary of the so-called Metcalfe’s Law, which postulates that the value of a network is proportional to the square of the number of users of the network.

pages: 312 words: 83,998

Testosterone Rex: Myths of Sex, Science, and Society
by Cordelia Fine
Published 13 Jan 2017

And startlingly, the first set of contradictory data we’ll look at comes from Bateman’s own study. ALTHOUGH BATEMAN’S CONCLUSIONS tend to evoke images of the Playboy Mansion or well-stocked harems, it’s necessary for the time being to return to Bateman’s unsalubrious glass containers. It was only in our young century that, noticing that this (ahem) seminal paper had never been replicated, or apparently even subjected to close inspection, the contemporary evolutionary biologists Brian Snyder and Patricia Gowaty reexamined it. As they acknowledge, they returned to the study with many advantages that Bateman had lacked. These included modern computational aids, more sophisticated statistical methods and—perhaps I can dare to add?

pages: 294 words: 82,438

Simple Rules: How to Thrive in a Complex World
by Donald Sull and Kathleen M. Eisenhardt
Published 20 Apr 2015

Weaver had an uncanny knack for picking future all-stars. Eighteen scientists won Nobel Prizes for research related to molecular biology in the middle of the century, and Weaver had funded all but three of them. Weaver recognized the potential of computers long before most people even knew they existed. He wrote a seminal paper that laid out how computers could translate text from one language to another, sixty years before the creation of Google Translate and Babylon. While at the Rockefeller Foundation, Weaver also handpicked and financed a team that spent two decades developing high-yield varieties of wheat that were impervious to disease.

pages: 304 words: 80,965

What They Do With Your Money: How the Financial System Fails Us, and How to Fix It
by Stephen Davis , Jon Lukomnik and David Pitt-Watson
Published 30 Apr 2016

After all, no one is going to take advantage of the option to sell you a stock at less than what it’s worth. But if the stock drops, the portfolio manager is obligated to buy it at the agreed-upon price, which may now be higher than the market price. As veteran investor Andrew Weisman pointed out in a seminal paper more than a decade ago, a number of hedge fund strategies are based on the same idea, although the method is more complex. Often it involves buying one security and selling another, and betting that the relationship between the two remains steady, so as to “permit a trader to collect a premium for assuming the risks associated [with] low-probability events.”23 The problem, of course, is that “low probability” is not “no probability.”

pages: 283 words: 85,906

The Clock Mirage: Our Myth of Measured Time
by Joseph Mazur
Published 20 Apr 2020

In the early 1980s Jeffrey Hall and Michael Rosbash, along with Rosbash’s graduate student Paul Hardin at Brandeis, discovered such a circadian oscillator in the fruit fly, an insect that has timekeeping gene qualities associated with clock genes in humans. Hall and Rosbash won the 2017 Nobel Prize in Physiology or Medicine for their discoveries of the molecular processes that control circadian rhythms. In their seminal paper in the Proceedings of the National Academy of Sciences, they isolated the so-called Period gene (per), which cycles the amount of messenger RNA (mRNA) produced in a feedback loop, first forming and then terminating proteins made from per gene instructions.13 For clarity, let’s first briefly recall the mechanisms of mRNA and proteins.

pages: 304 words: 84,396

Bounce: Mozart, Federer, Picasso, Beckham, and the Science of Success
by Matthew Syed
Published 19 Apr 2010

THE HIDDEN LOGIC OF SUCCESS “I propose to show”: The quotes from Francis Galton are taken from Hereditary Genius: An Inquiry into Its Laws and Consequences (New York: D. Appleton, 1884). In 1991 Anders Ericsson: The study of violinists at the Music Academy of West Berlin is published in one of the most seminal papers in the study of expertise: K. Anders Ericsson, Ralf Th. Krampe, and Clemens Tesch-Romer, “The Role of Deliberate Practice in the Acquisition of Expert Performance,” Psychological Review 100, no. 3 (1993): 363–406. “There is absolutely no evidence of a ‘fast track’”: This view was based on a wide-ranging study of musical achievement: John A.

Green Economics: An Introduction to Theory, Policy and Practice
by Molly Scott Cato
Published 16 Dec 2008

Like all metaphysical concepts, when you try to pin it down it turns out to be just a word.15 A recent report from the Centre for Holistic Studies agrees, finding that much of the policy aimlessness of recent years is the result of relying on measurement rather than judgement.16 As Funtowicz and Ravetz pointed out in a seminal paper in the related field of ecological economics, in which they ask, ‘What is the value of a songbird?’, the most valuable things in life are, quite literally, priceless. Unfortunately, this can mean that they are therefore accorded no economic value and not protected.17 The call for alternative indicators comes down to a question of how we, as citizens concerned for the well-being of others and of the planet we share, would choose to measure the well-being of a nation.

pages: 280 words: 85,091

The Wisdom of Psychopaths: What Saints, Spies, and Serial Killers Can Teach Us About Success
by Kevin Dutton
Published 15 Oct 2012

But there’s an added ingredient, a kind of naive, childlike inquisitiveness, which is strongly reminiscent of the core “openness to experience” factor of the Big Five personality structure that we explored in chapter 2. And which psychopaths, if you recall, score very high on. “The first component [of mindfulness] involves the self-regulation of attention so that it is maintained on immediate experience,” explains psychiatrist Scott Bishop in one of the seminal papers on the subject back in 2004, “thereby allowing for increased recognition of mental events in the present moment. The second component involves adopting a particular orientation toward one’s experiences in the present moment, an orientation that is characterized by curiosity, openness, and acceptance.”

pages: 277 words: 81,718

Vassal State
by Angus Hanton
Published 25 Mar 2024

They will tune into that conversation and be able to ignore other chatter even when it is at louder volumes. But it turns out that when certain words or ideas are expressed in other conversations, listeners cannot stop themselves from hearing what is said. This occurs when their own name is mentioned, or if there is anything overtly sexual or taboo. In the title of his seminal paper, Cherry refers to ‘recognition of speech, with one and two ears’.3 He ran his experiments by putting headphones on his subjects with different soundtracks running into each ear. American politicians and officials must feel they are permanently at a noisy cocktail party, and most of the time they are listening with only one ear.

pages: 2,466 words: 668,761

Artificial Intelligence: A Modern Approach
by Stuart Russell and Peter Norvig
Published 14 Jul 2019

Korf and Zhang (2000) describe a divide-and-conquer approach, and Zhou and Hansen (2002) introduce memory-bounded A* graph search and a strategy for switching to breadth-first search to increase memory-efficiency (Zhou and Hansen, 2006). The idea that admissible heuristics can be derived by problem relaxation appears in the seminal paper by Held and Karp (1970), who used the minimum-spanning-tree heuristic to solve the TSP. (See Exercise 3.MSTR.) The automation of the relaxation process was implemented successfully by Prieditis (1993). There is a growing literature on the application of machine learning to discover heuristic functions (Samadi et al., 2008; Arfaee et al., 2010; Thayer et al., 2011; Lelis et al., 2012).

They are, however, extremely important in their own right because so many combinatorial problems in computer science can be reduced to checking the satisfiability of a propositional sentence. Any improvement in satisfiability algorithms has huge consequences for our ability to handle complexity in general. 7.6.1A complete backtracking algorithm The first algorithm we consider is often called the Davis–Putnam algorithm, after the seminal paper by Martin Davis and Hilary Putnam (1960). The algorithm is in fact the version described by Davis, Logemann, and Loveland (1962), so we will call it DPLL after the initials of all four authors. DPLL takes as input a sentence in conjunctive normal form—a set of clauses. Like BACKTRACKING-SEARCH and TT-ENTAILS?
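The passage describes DPLL's input as a set of clauses in conjunctive normal form. As a rough illustration of that procedure (our own sketch, not the book's code), here is a minimal DPLL in Ruby over DIMACS-style integer literals, with unit propagation and backtracking; real solvers add pure-literal elimination, clause learning, and careful data structures:

```ruby
# A formula is an array of clauses; a clause is an array of nonzero
# integers: n means "variable n is true", -n means "variable n is false".

# Simplify the clause set assuming `lit` is true: clauses containing it
# are satisfied and vanish; the opposite literal drops out of the rest.
def assign(clauses, lit)
  clauses.reject { |c| c.include?(lit) }.map { |c| c - [-lit] }
end

# Returns a (possibly partial) satisfying model as {var => bool}, or nil
# if the clauses are unsatisfiable; unmentioned variables are don't-cares.
def dpll(clauses, model = {})
  return model if clauses.empty?          # every clause satisfied
  return nil   if clauses.any?(&:empty?)  # an empty clause: contradiction
  # Unit propagation: a one-literal clause forces that literal's value.
  if (unit = clauses.find { |c| c.size == 1 })
    lit = unit.first
    return dpll(assign(clauses, lit), model.merge(lit.abs => lit.positive?))
  end
  # Branch on a variable: try true, then false.
  v = clauses.first.first.abs
  dpll(assign(clauses, v), model.merge(v => true)) ||
    dpll(assign(clauses, -v), model.merge(v => false))
end
```

For example, `dpll([[1, 2], [-1], [2, 3]])` propagates the unit clause `[-1]` and returns a model with variable 1 false and variable 2 true, while `dpll([[1], [-1]])` returns nil.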

Good sources for information on satisfiability, both theoretical and practical, include the Handbook of Satisfiability (Biere et al., 2009), Donald Knuth’s (2015) fascicle on satisfiability, and the regular International Conferences on Theory and Applications of Satisfiability Testing, known as SAT. The idea of building agents with propositional logic can be traced back to the seminal paper of McCulloch and Pitts (1943), which is well known for initiating the field of neural networks, but actually was concerned with the implementation of a Boolean circuit-based agent design in the brain. Stan Rosenschein (Rosenschein, 1985; Kaelbling and Rosenschein, 1990) developed ways to compile circuit-based agents from declarative descriptions of the task environment.

pages: 315 words: 93,628

Is God a Mathematician?
by Mario Livio
Published 6 Jan 2009

A concise but accurate summary of the claims of Lobachevsky and Bolyai for priority is given in Kline 1972. Some of Gauss’s correspondence on non-Euclidean geometry is presented in Ewald 1996. In a brilliant lecture delivered in Göttingen: An English translation of the lecture, as well as other seminal papers on non-Euclidean geometries, together with illuminating notes, can be found in Pesic 2007. Poincaré’s views were inspired: Poincaré 1891. in the first chapter of the Ars Magna: Cardano 1545. In another important book, Treatise of Algebra: Wallis 1685. A concise summary of Wallis’s biography and work can be found in Rouse Ball 1908.

pages: 313 words: 91,098

The Knowledge Illusion
by Steven Sloman
Published 10 Feb 2017

In 1985 he was appointed by the Royal Society of London, the oldest scientific society in the world, to lead a team to evaluate the current state of attitudes toward science and technology in Britain. The Royal Society was concerned about antiscientific sentiment in Britain, seeing it as a serious risk to societal well-being. The team’s results and recommendations were published in a seminal paper, now known as the Bodmer Report. Previous research had focused primarily on measuring attitudes directly, but Bodmer and his team argued passionately for a simple and intuitive idea: that opposition to science and technology is driven by lack of understanding. Hence, by promoting better understanding of science, society can promote more favorable attitudes and take better advantage of the benefits afforded by science and technology.

pages: 293 words: 88,490

The End of Theory: Financial Crises, the Failure of Economics, and the Sweep of Human Interaction
by Richard Bookstaber
Published 1 May 2017

Broadly speaking, the market impact is represented in their model and in other academic models as a smooth function, but in the case of an agent-based model, this need not be the case; there can be the sudden discontinuous drops common to phase transitions. 8. The analysis of market liquidity has generally focused on day-to-day liquidity during normal periods in equity markets. It has discussed the relationship between price impact and transactions volume. Much of this work and discussion derives from Kyle’s (1985) seminal paper, where he builds a dynamic model of trade with a sequential auction model that resembles a continuous market, where he uses three agents: a random-noise trader, a risk-neutral insider, and a competitive risk-neutral market maker. By doing so he is able to create a market model by which questions about liquidity and information could be tested.

pages: 713 words: 93,944

Seven Databases in Seven Weeks: A Guide to Modern Databases and the NoSQL Movement
by Eric Redmond , Jim Wilson and Jim R. Wilson
Published 7 May 2012

for room in 1...100
  # Create a unique room number as the key
  ro = Riak::RObject.new(bucket, (current_rooms_block + room))
  # Randomly grab a room style, and make up a capacity
  style = STYLES[rand(STYLES.length)]
  capacity = rand(8) + 1
  # Store the room information as a JSON value
  ro.content_type = "application/json"
  ro.data = {'style' => style, 'capacity' => capacity}
  ro.store
end
end

$ ruby hotel.rb

We’ve now populated a human hotel we’ll mapreduce against.

Introducing Mapreduce

One of Google’s greatest lasting contributions to computer science is the popularization of mapreduce as an algorithmic framework for executing jobs in parallel over several nodes. It is described in Google’s seminal paper[15] on the topic and has become a valuable tool for executing custom queries in the class of partition-tolerant datastores. Mapreduce breaks down problems into two parts. Part 1 is to convert a list of data into another type of list by way of a map function. Part 2 is to convert this second list to one or more scalar values by way of a reduce function.
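The two parts can be sketched in plain Ruby, run locally on an in-memory list rather than distributed across nodes (the room data and names below are our own toy example, not Riak's API):

```ruby
# A handful of rooms like those generated above.
rooms = [
  { 'style' => 'suite',  'capacity' => 4 },
  { 'style' => 'single', 'capacity' => 1 },
  { 'style' => 'suite',  'capacity' => 6 }
]

# Part 1 (map): convert the list of rooms into a list of [style, capacity] pairs.
pairs = rooms.map { |r| [r['style'], r['capacity']] }

# Part 2 (reduce): collapse each style's values down to a scalar (total capacity).
totals = pairs.group_by(&:first)
              .transform_values { |ps| ps.sum(&:last) }

puts totals.inspect  # total capacity per style, e.g. suite => 10
```

In a real mapreduce run the map phase executes on the node holding each record and only the much smaller pair list travels over the network to the reducers.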

pages: 292 words: 94,324

How Doctors Think
by Jerome Groopman
Published 15 Jan 2007

I got cavalier." As there are classic clinical maladies, there are classic cognitive errors. Alter's misdiagnosis resulted from such an error, the use of a heuristic called "availability." Amos Tversky and Daniel Kahneman, psychologists from the Hebrew University in Jerusalem, explored this shortcut in a seminal paper more than two decades ago. Kahneman won the Nobel Prize in economics in 2002 for work illuminating the way certain patterns of thinking cause irrational decisions in the marketplace; Tversky certainly would have shared the prize had he not died an untimely death in 1996. "Availability" means the tendency to judge the likelihood of an event by the ease with which relevant examples come to mind.

pages: 324 words: 90,253

When the Money Runs Out: The End of Western Affluence
by Stephen D. King
Published 17 Jun 2013

A lack of trust prevented a transaction from taking place. This was a classic example of market failure: all parties wanted a transaction to take place but a lack of trust meant that it was impossible to strike a deal. My experience is not so different from George Akerlof's market for lemons. In his seminal paper published in 1970,1 Akerlof investigated an obvious peculiarity associated with the value of second-hand cars. Why did the value of a brand new car immediately drop as soon as it was driven off the forecourt? The answer was simple: the seller, having owned the car, would know something about its idiosyncratic strengths and weaknesses that the would-be buyer would, inevitably, be clueless about.

pages: 335 words: 94,578

Spectrum Women: Walking to the Beat of Autism
by Barb Cook and Samantha Craft
Published 20 Aug 2018

Catriona beautifully describes the historical context of understanding gender, intersectionality, and feminism. I would like to describe the historical context of autism because I think it bears strongly on all of the issues that Catriona raises. The definition of autism has broadened enormously in the last 40 years since Lorna Wing’s seminal paper in 1981 describing Asperger syndrome in English for the first time. When I started working in this area in 1993, these broader definitions of autism were not in the international diagnostic texts for understanding difference, emerging first in the Diagnostic and Statistical Manual Fourth Edition (DSM-IV; American Psychiatric Association, 1994).

pages: 292 words: 94,660

The Loop: How Technology Is Creating a World Without Choices and How to Fight Back
by Jacob Ward
Published 25 Jan 2022

We instead need to account for past patterns of discrimination, the kind of horrific systemic abuses Jesus Hernandez has spent decades measuring, and in fact put our finger on the scale to compensate for it where we can. There is vital and rapid work being done on bias in AI. The researchers Joy Buolamwini and Timnit Gebru published a seminal paper in February 2018 revealing that the top three commercial facial-recognition systems misidentified white, male faces only 0.8 percent of the time, while the same systems misidentified dark-skinned women more than 20 percent of the time. And the stakes, they pointed out, are high: “While face recognition software by itself should not be trained to determine the fate of an individual in the criminal justice system, it is very likely that such software is used to identify suspects.”5 Inspired in part by that work, a study by the federal agency charged with establishing technical benchmarks on new technology, the National Institute of Standards and Technology (NIST), found that across 189 facial-recognition algorithms from 99 developers around the world, Asian and African American faces were misidentified far more often than white faces.

pages: 404 words: 92,713

The Art of Statistics: How to Learn From Data
by David Spiegelhalter
Published 2 Sep 2019

Barnard was a delightful man, a pure mathematician (and Communist) before the war, when like many others he adapted his skills for statistical war work. He later went on to develop the official British Standard for condoms (BS 3704). * This was a remarkable achievement, given the collective noun for statisticians has been said to be a ‘variance’. * He died with no knowledge whatsoever of his enduring legacy, and not only was his seminal paper published posthumously in 1763, but his name did not become associated with this approach until the twentieth century. * Some might even say I was indoctrinated. * Odds of 1 are sometimes known as ‘evens’, since the events are equally likely, or evenly balanced. * His exact words were, ‘Given the number of times on which an unknown event has happened and failed: Required the chance that the probability of its happening in a single trial lies somewhere between any two degrees of probability that can be named’, which is reasonably clear, except in modern terminology we would probably reverse his use of ‘chance’ and ‘probability’
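Bayes's question quoted above can be answered numerically in a few lines of Ruby (our own sketch, under the uniform prior his paper assumed; the function name is ours): after s successes and f failures, the posterior density for the chance p is proportional to p^s (1 − p)^f, and the probability that p lies between any two bounds is a ratio of two integrals, since the normalizing constant cancels.

```ruby
# Midpoint-rule integration of the unnormalized posterior p**s * (1-p)**f.
def posterior_between(s, f, a, b, steps = 100_000)
  integrate = lambda do |lo, hi|
    h = (hi - lo).fdiv(steps)
    (0...steps).sum { |i| x = lo + (i + 0.5) * h; (x**s) * ((1 - x)**f) * h }
  end
  # P(a < p < b | s successes, f failures), uniform prior.
  integrate.call(a, b) / integrate.call(0.0, 1.0)
end
```

For instance, after one success and no failures the posterior density is 2p, so `posterior_between(1, 0, 0.5, 1.0)` comes out at about 0.75: a single success already makes it three-to-one that the underlying chance exceeds one half.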

pages: 294 words: 96,661

The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity
by Byron Reese
Published 23 Apr 2018

And sure enough, Gmeindl and Chiu found consistent places in the brain where activity occurred immediately before the shift. So is that where so-called free will lives in the brain? If so, it sure looks like plain ol’ deterministic brain activity. There is nothing special going on at all. The brain is just doing its thing. However, it isn’t that simple. The plot, as they say, thickens. In a seminal paper published in 1999, the psychologists Dan Wegner and Thalia Wheatley proposed a revolutionary idea. Instead of the traditional order of the sequence—a person decides to do something and then it happens—they maintained that things in the brain actually run backward from that. First, the theory goes, you do something, then you tell yourself later that you decided to do it.

pages: 442 words: 94,734

The Art of Statistics: Learning From Data
by David Spiegelhalter
Published 14 Oct 2019

He later went on to develop the official British Standard for condoms (BS 3704).
9 This was a remarkable achievement, given the collective noun for statisticians has been said to be a ‘variance’.
CHAPTER 11: Learning from Experience the Bayesian Way
1 He died with no knowledge whatsoever of his enduring legacy, and not only was his seminal paper published posthumously in 1763, but his name did not become associated with this approach until the twentieth century.
2 Some might even say I was indoctrinated.
3 Odds of 1 are sometimes known as ‘evens’, since the events are equally likely, or evenly balanced.
4 His exact words were, ‘Given the number of times on which an unknown event has happened and failed: Required the chance that the probability of its happening in a single trial lies somewhere between any two degrees of probability that can be named’, which is reasonably clear, except in modern terminology we would probably reverse his use of ‘chance’ and ‘probability’.
5 Being a Presbyterian minister, he just called it a table.
6 Remember, it means that, in the long run, 95% of such intervals will contain the true value – but we can’t say anything about any particular interval.
7 But I still prefer the Bayesian approach.

pages: 1,082 words: 87,792

Python for Algorithmic Trading: From Idea to Cloud Deployment
by Yves Hilpisch
Published 8 Dec 2020

Computers now link together various stock exchanges, a practice which is helping to create a single global market for the trading of securities. The continuing improvements in technology will make it possible to execute trades globally by electronic trading systems. Interestingly, one of the oldest and most widely used algorithms is found in the dynamic hedging of options. With the publication of the seminal papers on the pricing of European options by Black and Scholes (1973) and Merton (1973), the algorithm known as delta hedging was available long before computerized and electronic trading even started. Delta hedging as a trading algorithm shows how to hedge away all market risks in a simplified, perfect, continuous model world.
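To make the idea concrete: under the Black-Scholes (1973) model, the delta of a European call has the closed form N(d1), so the hedge ratio can be computed directly. The following is a minimal sketch, not taken from Hilpisch's book; the function and parameter names are illustrative.

```python
from math import log, sqrt, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call_delta(S: float, K: float, T: float, r: float, sigma: float) -> float:
    """Black-Scholes delta of a European call, N(d1).

    S: spot price, K: strike, T: time to maturity in years,
    r: risk-free rate, sigma: volatility.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return norm_cdf(d1)

# An at-the-money call with zero rates and 20% volatility:
delta = bs_call_delta(S=100, K=100, T=1.0, r=0.0, sigma=0.2)
# the hedger would hold roughly 0.54 shares per call sold
```

Rebalancing this holding as S and T change is the dynamic part of the hedge; in the idealized continuous model it removes all market risk.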

pages: 326 words: 88,968

The Science and Technology of Growing Young: An Insider's Guide to the Breakthroughs That Will Dramatically Extend Our Lifespan . . . And What You Can Do Right Now
by Sergey Young
Published 23 Aug 2021

Even if figuring out the “cause” of aging is more trouble than it’s worth, gerontologists still need a way to identify the first principles of the problem of aging. THE HALLMARKS OF AGING In 2013, a group of European scientists led by biochemist and molecular biologist Carlos López-Otín published a seminal paper entitled “The 9 Hallmarks of Aging,” which tackled this problem and gave the longevity community a way to study aging without agreeing on its root cause. An entire book could be written about these hallmarks, but for our purposes it is only important to know that each of the hallmarks meets three essential criteria set by López-Otín’s research team: they present themselves during normal aging; they speed up aging when researchers experimentally aggravate them; and blocking them in some way tends to slow down aging and/or increase lifespan.

pages: 308 words: 94,447

The Sixth Extinction: An Unnatural History
by Elizabeth Kolbert
Published 11 Feb 2014

Only much later did they reach the Americas, and only many thousands of years after that did they make it to Madagascar and New Zealand. “When the chronology of extinction is critically set against the chronology of human migrations,” Paul Martin of the University of Arizona wrote in “Prehistoric Overkill,” his seminal paper on the subject, “man’s arrival emerges as the only reasonable answer” to the megafauna’s disappearance. In a similar vein, Jared Diamond has observed: “Personally, I can’t fathom why Australia’s giants should have survived innumerable droughts in their tens of millions of years of Australian history, and then have chosen to drop dead almost simultaneously (at least on a time scale of millions of years) precisely and just coincidentally when the first humans arrived.”

pages: 335 words: 89,924

A History of the World in Seven Cheap Things: A Guide to Capitalism, Nature, and the Future of the Planet
by Raj Patel and Jason W. Moore
Published 16 Oct 2017

But the record suggests that the majority of the extinctions on Madeira happened over the past two centuries—not under the initial colonial onslaught but later, as successive waves of foreign species and agrarian capitalism snuffed out millions of years of evolution.70 The trees, water, soil, fauna, and flora on Madeira and the sea around the island were treated as “free gifts,” transformed into a series of inputs or hindrances to production.71 In a seminal paper on overfishing, “Reefs since Columbus,” Jeremy Jackson notes how humans have extinguished life from the time that young Columbus arrived on Madeira.72 Humans under capitalism abuse the ecosystems of which we are part—and on which we depend. Capitalists are, for instance, happy to view the ocean as both storage facility for the seafood we have yet to catch and sinkhole for the detritus we produce on land.

pages: 315 words: 87,035

May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases—And What We Can Do About It
by Alex Edmans
Published 13 May 2024

He took every opportunity – tennis matches, parties and random meetings – to beg anyone remotely connected to the brokerage industry to give him the data. Finally, he struck gold, and a large brokerage gave him a database of 78,000 accounts, with client names removed. Importantly, these accounts were randomly chosen by the broker, so Terry had a representative sample, not a selected one. This dataset allowed Terry to write many seminal papers on investor behaviour, mostly with Brad Barber at the University of California at Davis. One influential study calculated the profit that frequent traders make.† Importantly, they studied all frequent traders in their sample, regardless of whether they struck it rich; they simply picked out every single hyperactive investor without pre-screening for their level of success, and calculated the average return across the group.

pages: 320 words: 90,115

The Warhol Economy
by Elizabeth Currid-Halkett
Published 15 Jan 2020

See, for example, Baumol 2002. 5. See Nalebuff and Ayres 2006 for a discussion of these types of relationships. 6. For an in-depth discussion of these relationships, see Marshall 1961; Storper 1997; Castells and Hall 1994; Scott 1993, 2000; and Saxenian 1994. 7. See Romer 1986 and 1990 for his most seminal papers on “increasing returns” to knowledge and New Growth Theory. Also see Warsh 2006. 8. Please see Freiberger and Swain 1999 for a fascinating account of the Homebrew Computer Club and the bohemian culture of early Silicon Valley. 9. Saxenian 1994. 10. Strumsky et al. 2005. 11. Marshall 1890. 12.

pages: 337 words: 89,075

Understanding Asset Allocation: An Intuitive Approach to Maximizing Your Portfolio
by Victor A. Canto
Published 2 Jan 2005

A final reason for choosing this specific sample period is the bulk of the asset data is only available from 1975 on. This is the longest sample period for which I could get the data in the classifications that match the current exchange-traded funds’ (ETFs) availability to satisfy the mutual-exclusion constraint I find essential. 3. Harry Markowitz’s seminal paper (1952) marks modern financial literature’s beginning. Subsequent publications by Jensen (1968), Lintner (1965 and 1969), Sharpe (1964), and Treynor (1962) led to modern financial risk metrics’ development. 4. S&P/BARRA Indexes, Research and Indices description, BARRA.com (2005). 5. Sharpe (1992). 6.

pages: 296 words: 87,299

Portfolios of the poor: how the world's poor live on $2 a day
by Daryl Collins , Jonathan Morduch and Stuart Rutherford
Published 15 Jan 2009

The general problem is framed in Morduch’s (1999) essay on the strengths and weaknesses of informal risk sharing. He asks: “does informal insurance patch the safety net?” And answers: “yes, but not very well.” The essay also describes the hidden costs—financial, economic, and emotional—often attached to informal risk sharing. See Townsend 1994 for the seminal paper on formal tests of village-level risk sharing, as well as Deaton 1992, 1997 for similar work in Côte d’Ivoire, Ghana, and Thailand, Morduch 2005 in India, Udry 1994 in Nigeria, Grimard 1997 in Côte d’Ivoire, Lund and Fafchamps 2003 in the Philippines, and Dubois 2000 in Pakistan. Morduch 2005 provides a critical overview of the work on South Asia, and Morduch 2006 provides an accessible introduction to the broader research program.

pages: 352 words: 96,532

Where Wizards Stay Up Late: The Origins of the Internet
by Katie Hafner and Matthew Lyon
Published 1 Jan 1996

“The political process,” he wrote, “would essentially be a giant teleconference, and a campaign would be a months-long series of communications among candidates, propagandists, commentators, political action groups, and voters. The key is the self-motivating exhilaration that accompanies truly effective interaction with information through a good console and a good network to a good computer.” Lick’s thoughts about the role computers could play in people’s lives hit a crescendo in 1960 with the publication of his seminal paper “Man-Computer Symbiosis.” In it he distilled many of his ideas into a central thesis: A close coupling between humans and “the electronic members of the partnership” would eventually result in cooperative decision making. Moreover, decisions would be made by humans, using computers, without what Lick called “inflexible dependence on predetermined programs.”

pages: 370 words: 97,138

Beyond: Our Future in Space
by Chris Impey
Published 12 Apr 2015

Robert Goddard is bundled against the cold of a New England winter in 1926 as he stands by the launching frame of his most notable invention. The liquid fuel of this rocket was gasoline and liquid oxygen, contained in the cylinder across from Goddard’s torso. Nevertheless, the world was not quite ready for rockets. Goddard’s seminal paper from 1919, “A Method of Reaching Extreme Altitudes,” was ridiculed by the press and fellow scientists. An unsigned editorial in the New York Times was particularly harsh, accusing him of ignorance of the laws of physics: “. . . Professor Goddard . . . does not know the relation of action and reaction, and of the need to have something better than a vacuum against which to react. . . .

pages: 282 words: 89,436

Einstein's Dice and Schrödinger's Cat: How Two Great Minds Battled Quantum Randomness to Create a Unified Theory of Physics
by Paul Halpern
Published 13 Apr 2015

It is like counting the number of arrangements of a set of pennies, each minted in a different year. If you distinguish them by their dates, they have many more unique configurations than if you treat them as identical. Therefore, quantum estimates of entropy are different from classical measures. Before Bose contributed his seminal paper on photons and Einstein extended his treatment to include ideal gases, many physicists were perplexed about which factors to include in expressing the entropy for quantum systems. A well-known equation for entropy contained a controversial correction term that no one, until Bose, could fully explain.

pages: 364 words: 102,926

What the F: What Swearing Reveals About Our Language, Our Brains, and Ourselves
by Benjamin K. Bergen
Published 12 Sep 2016

By all accounts, McCawley was a polymath (for instance, he had several degrees in math), a prodigy (who started as a student at the University of Chicago at sixteen), and an inveterate prankster. Under the pseudonym of Quang Phuc Dong, ostensibly of the South Hanoi Institute of Technology (or SHIT), he wrote several seminal papers in what he called “scatolinguistics.” The first, “English Sentences Without Overt Grammatical Subject,” deals with the grammar of Fuck you. McCawley died in 1999 and with him a lot of the fun of linguistics.
f Much of this discussion is inspired by Fillmore, C. J. (1985). We miss you, Chuck.
g One final thing that’s interesting about this case is that the profane word (fuck or hell) looks like a noun—it follows the, as nouns are wont to do—but it doesn’t behave like just any noun.

pages: 471 words: 97,152

Animal Spirits: How Human Psychology Drives the Economy, and Why It Matters for Global Capitalism
by George A. Akerlof and Robert J. Shiller
Published 1 Jan 2009

“Attitude of Waiting.” See also “A Twenty-Five Million Pool.” 2. Cooper and John (1988) have in particular emphasized the role of dual equilibria in macroeconomics. The view of confidence in this chapter goes beyond this interpretation. Our description of confidence corresponds to that presented in the seminal paper by Benabou (2008). To Benabou the notion of confidence corresponds to a psychological state in which people do not sufficiently utilize the information that is available to them. They are too trusting —a state of mind that leads to overinvestment. Blanchard (1993) takes a similar view on the nature of animal spirits.

pages: 367 words: 97,136

Beyond Diversification: What Every Investor Needs to Know About Asset Allocation
by Sebastien Page
Published 4 Nov 2020

His main point, I suppose, was that his judgment, given his experience and track record, was probably better than any mathematically derived estimate. One of the great misconceptions on portfolio theory is that it precludes the use of judgment and experience. It doesn’t. It’s right there in Markowitz’s 1952 seminal paper, for everyone to see, in the first paragraph: The process of selecting a portfolio may be divided into two stages. The first stage starts with observation and experience and ends with beliefs about the future performances of available securities. The second stage starts with the relevant beliefs about future performances and ends with the choice of a portfolio.
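Markowitz's "second stage" can be illustrated with a toy calculation (my sketch, not from Page's book): given beliefs about expected returns and covariances, a portfolio's expected return is the weighted sum of the asset means, and its variance is the quadratic form of the weights against the covariance matrix. All numbers below are hypothetical.

```python
def portfolio_stats(weights, means, cov):
    """Expected return and variance of a portfolio.

    weights: asset weights (summing to 1)
    means:   believed expected returns per asset
    cov:     believed covariance matrix of returns
    """
    n = len(weights)
    exp_ret = sum(w * m for w, m in zip(weights, means))
    # variance = w' * Sigma * w
    var = sum(weights[i] * cov[i][j] * weights[j]
              for i in range(n) for j in range(n))
    return exp_ret, var

# Illustrative beliefs for two assets:
means = [0.10, 0.05]
cov = [[0.04, 0.01],
       [0.01, 0.02]]
ret, var = portfolio_stats([0.6, 0.4], means, cov)
# ret = 0.08; var = 0.0224, i.e. a volatility of about 15%
```

The "first stage" — forming those beliefs from observation and experience — is exactly the part Markowitz says is left to judgment.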

pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity
by Amy Webb
Published 5 Mar 2019

And it would take another 100 years for someone to realize that Boolean logic and probability could help computers evolve from automating basic math to more complex thinking machines. There wasn’t a way to build a thinking machine—the processes, materials, and power weren’t yet available—and so the theory couldn’t be tested. The leap from theoretical thinking machines to computers that began to mimic human thought happened in the 1930s with the publication of two seminal papers: Claude Shannon’s “A Symbolic Analysis of Relay and Switching Circuits” and Alan Turing’s “On Computable Numbers, with an Application to the Entscheidungsproblem.” As an electrical engineering student at MIT, Shannon took an elective course in philosophy—an unusual diversion. Boole’s An Investigation of the Laws of Thought became the primary reference for Shannon’s thesis.

pages: 319 words: 100,984

The Moon: A History for the Future
by Oliver Morton
Published 1 May 2019

“We have as yet no direct evidence of radio waves passing between the surface of the earth and outer space,” he noted, “[but] given sufficient transmitting power, we might obtain the necessary evidence by exploring for echoes from the moon.” I do not know that the men of Project Diana knew of Clarke’s seminal paper, but their counterparts in the US Navy did, and so did some in the press. On February 3rd 1946 the Los Angeles Times ran a front-page story on the idea, noting of Clarke’s proposed Moonbounce test that “the US Army Signal Corps has just done this.” Project Diana thus showed both the feasibility of communication satellites and that the Moon could function in such a role.

pages: 323 words: 100,772

Prisoner's Dilemma: John Von Neumann, Game Theory, and the Puzzle of the Bomb
by William Poundstone
Published 2 Jan 1993

The translator, mathematician L. J. Savage, told Steve Heims: “He phoned me from someplace like Los Alamos, very angry. He wrote a criticism of these papers in English. The criticism was not angry. It was characteristic of him that the criticism was written with good manners.” All this granted, the seminal paper of game theory is without doubt von Neumann’s 1928 article, “Zur Theorie der Gesellschaftsspiele” (“Theory of Parlor Games”). In this he proved (as Borel had not) the famous “minimax theorem.” This important result immediately gave the field mathematical respectability. THEORY OF GAMES AND ECONOMIC BEHAVIOR Von Neumann wanted game theory to reach a larger audience than mathematicians.
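The minimax idea can be shown numerically (my sketch, not Poundstone's): in a zero-sum matrix game, the row player looks for the mixed strategy that maximizes her guaranteed payoff against the opponent's worst-case response. A brute-force grid search over a two-row game illustrates it; the function name and grid resolution are illustrative.

```python
def maximin(payoff, steps=100):
    """Brute-force search for the row player's maximin mixed
    strategy in a 2-row zero-sum game.

    payoff: 2xN matrix of the row player's payoffs.
    Returns (probability of playing row 0, guaranteed value).
    """
    best_p, best_val = 0.0, float("-inf")
    for i in range(steps + 1):
        p = i / steps  # probability of playing row 0
        # worst case over the column player's pure responses
        worst = min(p * payoff[0][j] + (1 - p) * payoff[1][j]
                    for j in range(len(payoff[0])))
        if worst > best_val:
            best_p, best_val = p, worst
    return best_p, best_val

# Matching pennies: the value is 0, secured by mixing 50/50
p, v = maximin([[1, -1], [-1, 1]])
```

Von Neumann's theorem guarantees that this maximin value equals the column player's minimax value, so neither side can do better than the game's value.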

pages: 340 words: 100,151

Secrets of Sand Hill Road: Venture Capital and How to Get It
by Scott Kupor
Published 3 Jun 2019

After all, we knew that, given the size of the opportunity, the market would attract multiple companies into the space. Martin Casado was the consummate founder for this business. He had spent his early career at the CIA building out the foundations for software-defined networking and then went to Stanford to earn his PhD in the same area. His doctoral thesis was the seminal paper on the topic. There couldn’t have been better founder-market fit. And so we invested in Nicira, which was ultimately acquired by VMware for $1.25 billion. And then after spending several years at VMware running the business division into which he was acquired, Martin joined Andreessen Horowitz as a general partner.

pages: 329 words: 101,233

We Are Electric: Inside the 200-Year Hunt for Our Body's Bioelectric Code, and What the Future Holds
by Sally Adee
Published 27 Feb 2023

Tseng had shown that all the persnickety chemical gradients, transcriptional networks, and force cues needed to orchestrate individual cells into complicated tissues could be harnessed with a comparatively simple set of electrical instructions. The genes were hardware, and they could be controlled by manipulating ion flows—the instructions from the software. Tseng and Levin soon published the seminal paper introducing their new idea: “Cracking the bioelectric code.”44 Subsequent research has yielded multi-limbed frogs and other evidence of bioelectricity’s role in regeneration. Among the most startling of these, it was possible to use bioelectric interventions to make planarians that had been chopped in half grow a second head instead of a tail.

pages: 456 words: 101,959

Unmasking Autism: Discovering the New Faces of Neurodiversity
by Devon Price
Published 4 Apr 2022

Stigma among parents of children with autism: A literature review. Asian Journal of Psychiatry, 45, 88–94. I have conducted a thorough literature review and found numerous studies on self-stigma reduction for people who are not actually Autistic, but merely related to someone Autistic, and the above review lists some of the most seminal papers. At the time of this writing I can find no papers on self-stigma reduction for the actual members of the stigmatized group—Autistic people ourselves. BACK TO NOTE REFERENCE 9 Corrigan, P. W., Kosyluk, K. A., & Rüsch, N. (2013). Reducing self-stigma by coming out proud. American Journal of Public Health, 103(5), 794–800.

pages: 326 words: 106,053

The Wisdom of Crowds
by James Surowiecki
Published 1 Jan 2004

Jeff Bezos drew the analogy between the Cambrian explosion and the Internet in a number of places, including an interview in Business Week (September 16, 1999), http://www.businessweek.com/ebiz/9909/916bezos.htm. Scott Page describes this experiment in “Return to the Toolbox,” unpublished paper (2002). Also see Scott Page and Lu Hong, “Problem Solving by Heterogeneous Agents,” Journal of Economic Theory 97 (2001): 123–63. The seminal paper is James G. March, “Exploration and Exploitation in Organizational Learning,” Organization Science 2 (1991): 71–87. The quotes are from pages 86 and 79. The study of chess players can be found in Herbert A. Simon and W. G. Chase, “Skill in Chess,” American Scientist 61 (1973): 394–403. The Chase quote is from James Shanteau, “Expert Judgment and Financial Decision Making,” paper prepared for Risky Business: Risk Behavior and Risk Management, edited by Bo Green (Stockholm: Stockholm University, 1995).

pages: 332 words: 109,213

The Scientist as Rebel
by Freeman Dyson
Published 1 Jan 2006

Many of the older scientists remained immune, but their influence waned as the new language became universal. After Feynman’s work on the diagrams was done, a year went by before it was published. He was willing and eager to share his ideas in conversation with anyone who would listen, but he found the job of writing a formal paper distasteful and postponed it as long as he could. His seminal paper, “Space-Time Approach to Quantum Electrodynamics,”5 might never have been written if he had not gone to Pittsburgh to stay for a few days with his friends Bert and Mulaika Corben. While he was in the Corbens’ house, they urged him to sit down and write the paper, and he made all kinds of excuses to avoid doing it.

pages: 356 words: 105,533

Dark Pools: The Rise of the Machine Traders and the Rigging of the U.S. Stock Market
by Scott Patterson
Published 11 Jun 2012

After graduating in 1995 with degrees in mathematics and cognitive science—the latter is the study of the mind as a machine that processes information—Bodek found work at Magnify, an Oak Park, Illinois, high-tech outfit run by Robert Grossman, a pioneer in techniques to mine giant databases for information. Bodek quickly proved his mettle at Magnify. With Grossman and several other researchers he helped write a seminal paper on predicting credit card fraud based on massive data sets. Using “machine learning,” a branch of artificial intelligence that deployed algorithms to crunch large blocks of data, the system could detect patterns of fraudulent transactions. One red flag might be a $1 credit card purchase at a gas station followed by a $10,000 splurge at a jewelry store (signaling that the thieves were testing the card before trying to make a big score).
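The "test purchase then splurge" signature described above can be caught even by a simple rule; the following is a hypothetical sketch for illustration only (the paper's actual approach used machine-learning models over massive data sets, not a hand-written threshold). Function name and thresholds are my own.

```python
def flag_test_then_splurge(amounts, test_max=2.0, splurge_min=5000.0):
    """Flag a card whose chronological transaction history contains a
    tiny 'test' charge immediately followed by a very large one, the
    classic stolen-card signature. Thresholds are illustrative.
    """
    return any(small <= test_max and big >= splurge_min
               for small, big in zip(amounts, amounts[1:]))

# A $1 gas-station test followed by a $10,000 jewelry purchase:
suspicious = flag_test_then_splurge([42.50, 1.00, 10_000.00])   # flagged
normal = flag_test_then_splurge([42.50, 63.10, 12.99])          # not flagged
```

A learned model generalizes this idea: instead of one hand-picked pattern, it infers many such discriminating patterns from labeled historical transactions.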

pages: 339 words: 109,331

The Clash of the Cultures
by John C. Bogle
Published 30 Jun 2012

Agency Costs and Managerial Behavior In 1976, another pair of wise academics—Harvard Business School’s Michael C. Jensen and University of Rochester’s William H. Meckling—added another brilliant insight on corporate behavior. “America has a principal/agent problem,” as The Economist explained their seminal paper. “Agents (i.e., managers) were feathering their own nests rather than the interests of their principals (shareholders).” In “Theory of the Firm: Managerial Behavior, Agency Costs, and Ownership Structure,” Jensen and Meckling set forth “a theory of (1) property rights, (2) agency, and (3) finance (as they relate to) the ownership structure of the firm.”

Smart Mobs: The Next Social Revolution
by Howard Rheingold
Published 24 Dec 2011

From the beginning, he saw a combination of languages, methodologies, and machines supporting new ways to think, communicate, collaborate, and learn. Much of the apparatus was social, and therefore nonmechanical. After failing to recruit support from computer science or computer manufacturers, Engelbart wrote his seminal paper, “A Conceptual Framework for the Augmentation of a Man’s Intellect,” in order to explain what he was talking about.83 Engelbart came to the attention of Licklider. ARPA sponsored a laboratory at the Stanford Research Institute (SRI), the “Augmentation Research Center,” where Engelbart and a group of hardware engineers, programmers, and psychologists who shared Engelbart’s dream started building the computer as we know it today.

pages: 369 words: 105,819

The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President
by Bandy X. Lee
Published 2 Oct 2017

All tyrants share several essential features: they are predominantly men with a specific character defect, narcissistic psychopathy (a.k.a. malignant narcissism). This defect manifests in a severely impaired or absent conscience and an insatiable drive for power and adulation that masks the conscience deficits. It forms the core of attraction between him and his followers, the essence of what is seen as his “charisma.” In his seminal paper on “Antisocial Personality Disorder and Pathological Narcissism in Prolonged Conflicts and Wars of the 21st Century” (2015), Frederick Burkle observes that narcissism augments and intensifies the pathological features of a psychopathic character structure, making those endowed with it especially dangerous, not in the least because of their ability to use manipulative charm and a pretense of human ideals to pursue their distinctly primitive goals.

pages: 416 words: 106,532

Cryptoassets: The Innovative Investor's Guide to Bitcoin and Beyond: The Innovative Investor's Guide to Bitcoin and Beyond
by Chris Burniske and Jack Tatar
Published 19 Oct 2017

While people accept that equities and bonds are the two major investment asset classes, and others will accept that money market funds, real estate, precious metals, and currencies are other commonly used asset classes,4 few bother to understand what is meant by an asset class in the first place. Robert Greer, vice president of Daiwa Securities, wrote “What Is an Asset Class, Anyway?”5 a seminal paper on the definition of an asset class in a 1997 issue of The Journal of Portfolio Management. According to Greer: An asset class is a set of assets that bear some fundamental economic similarities to each other, and that have characteristics that make them distinct from other assets that are not part of that class.

Capital Ideas Evolving
by Peter L. Bernstein
Published 3 May 2007

Merton, Merton Miller, Franco Modigliani, Myron Scholes, and William Sharpe—have won Nobel Prizes, and, if he had been alive when Scholes and Merton received theirs in 1997, Fischer Black would surely have been included. Jack Treynor, very much a part of the original story, should also have won a Nobel but missed out because he never published his seminal paper on the Capital Asset Pricing Model.* Working on this project has been a great adventure and a rare privilege. Peter L. Bernstein New York, New York March 2007
* In his fine book, An Engine, Not a Camera: How Financial Models Shape Markets (2006), MacKenzie has characterized the process as a “cascade,” in which each innovator drew directly on his predecessors (p. 389).
* On a personal note, I owe Jack Treynor an apology. On page 184 of Capital Ideas, I wrote that Treynor “left Harvard Business School in 1955 . . . ,” giving the impression that Jack left without graduating.

pages: 383 words: 105,021

Dark Territory: The Secret History of Cyber War
by Fred Kaplan
Published 1 Mar 2016

Still, Marsh had been away from day-to-day operations for twelve years, and this focus on “cyber” was entirely new to him. For advice and a reality check, Marsh called an old colleague who knew more about these issues than just about anybody—Willis Ware. Ware had kept up with every step of the Internet revolution since writing his seminal paper, nearly thirty years earlier, on the vulnerability of computer networks. He still worked at the RAND Corporation, and he was a member of the Air Force Scientific Advisory Board, which is where Marsh had come to know and trust him. Ware assured Marsh that Gorelick’s report was on the right track; that this was a serious issue and growing more so by the day, as the military and society grew more dependent on these networks; and that too few people were paying attention.

pages: 374 words: 111,284

The AI Economy: Work, Wealth and Welfare in the Robot Age
by Roger Bootle
Published 4 Sep 2019

It grew out of digital computing, which was explored and developed at Bletchley Park in England during the Second World War, famously enabling the Nazis’ Enigma code to be broken. That feat is closely associated with the name of Alan Turing. Turing was also responsible for AI’s early conceptual framework, publishing in 1950 the seminal paper “Computing Machinery and Intelligence.” The subject was subsequently developed mainly in the USA and the UK. But it waxed and waned in both esteem and achievement. Over the last decade, however, a number of key developments have come together to power AI forward: • Enormous growth in computer processing power

pages: 460 words: 107,454

Stakeholder Capitalism: A Global Economy That Works for Progress, People and Planet
by Klaus Schwab
Published 7 Jan 2021

When I visited US Senator Elizabeth Warren in Washington, DC, at the end of 2018, she was already contemplating a similar stance against the market leaders in many of America's industries, including technology, the pharmaceutical sector, and finance. Wu's colleague at Columbia Law School Lina Khan in 2016 wrote a seminal paper (while at Yale), taking a similar stance: “Amazon's Antitrust Paradox.”33 Economists such as Gabriel Zucman, Emmanuel Saez, Kenneth Rogoff, and Nobel Prize winners Paul Krugman and Joseph Stiglitz have also stated Big Tech has “too much power”34 or needs to be more strictly regulated. Leading journalists including Nicholas Thompson, the editor in chief of Wired, and Rana Foroohar, associate editor of the Financial Times, favor antitrust action against Big Tech too.

pages: 444 words: 111,837

Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe
by Paul Sen
Published 16 Mar 2021

The lesson was clear: muscle movement was driven by chemical energy released as one substance turned into another, again, in principle, no different to combustion. The third part of Helmholtz’s strategy drew its inspiration from the steam engine. Specifically, Helmholtz employed the same assumption about the impossibility of perpetual motion that Sadi Carnot had made in his seminal paper on steam engine efficiency. If the vitalists were right and animals could generate more heat than could be released by burning carbon with oxygen, then there must be some other source of heat within them that is not subject to physical laws. That, however, would imply that animals could create some heat without consuming any food or fuel.

pages: 454 words: 107,163

Break Through: Why We Can't Leave Saving the Planet to Environmentalists
by Michael Shellenberger and Ted Nordhaus
Published 10 Mar 2009

And in consistently defining the interests of others—whether they are corporate executives, labor unions, or Brazilian peasants—as outside the categories of the environment and nature, environmental and conservation leaders have failed to create a politics capable of dealing with ecological crises. 2. In 1943, the American psychologist Abraham Maslow wrote a seminal paper called “A Theory of Human Motivation.”8 In it he introduced the theory that humans have a “hierarchy of needs,” a deceptively simple concept that many of us can still remember seeing as a multicolor pyramid in our high school social studies classes. At the bottom of the pyramid there were the basic material needs: food, shelter, and security.

pages: 534 words: 118,459

Database Design and Relational Theory
by C.J. Date
Published 19 Apr 2012

Indeed, I think it’s noteworthy that Codd called his very first (1969) paper on the relational model “Derivability, Redundancy, and Consistency of Relations Stored in Large Data Banks” (boldface added; see Appendix C). And his second (1970) paper, “A Relational Model of Data for Large Shared Data Banks” (again, see Appendix C)—this is the one that’s usually regarded as the seminal paper in the field, though that characterization is a little unfair to its 1969 predecessor—was in two parts of almost equal length, the second of which was called “Redundancy and Consistency” (the first was called “Relational Model and Normal Form”). Codd thus clearly regarded his thoughts on redundancy as a major part of the contribution of his relational work: rightly so, in my opinion, since he did at least provide us with a framework in which we could begin to address the issue precisely and systematically.

pages: 419 words: 119,368

Espresso Tales
by Alexander McCall Smith
Published 1 Jan 2005

Irene sat up at the mention of the name. Nicholas Fairbairn. Why did Dr Fairbairn mention Nicholas Fairbairn? Was it because he was his brother, perhaps? Which meant that he must be the son of Ronald Fairbairn, no less – Ronald Fairbairn who had written Psychoanalytic Studies of the Personality, in which volume there appeared the seminal paper, “Endopsychic Structure Considered in Terms of Object-Relationships.” “Are you, by any chance … ?” she began. Dr Fairbairn hesitated. More guilt was coming to the surface, inexorably, bubbling up like the magma of a volcano. “No,” he said. “I’m not. I am nothing to do with Ronald Fairbairn, or his colourful son.

pages: 372 words: 111,573

10% Human: How Your Body's Microbes Hold the Key to Health and Happiness
by Alanna Collen
Published 4 May 2015

In reality, autism forms a spectrum of symptoms, from those with average or above-average intelligence – known as Asperger syndrome – to those with severe autism and significant learning disabilities like Andrew Bolte. In common to all with autistic spectrum disorder (ASD) are difficulties with social behaviour. It was this feature that prompted the American psychiatrist Leo Kanner to identify autism as a distinct syndrome in 1943. In his seminal paper on the topic, he described the case histories of eleven children who shared an ‘inability to relate themselves in the ordinary way to people and situations from the beginning of life’. Kanner borrowed the word autism, meaning ‘self-ism’, from the constellation of symptoms associated with schizophrenia.

pages: 309 words: 114,984

The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age
by Robert Wachter
Published 7 Apr 2015

Under his stewardship, the NEJM published the first reports of the disease that would later be called AIDS. A few years later, it published the first description of the discovery of HIV and, still later, the clinical trials that demonstrated the effectiveness of AZT in treating HIV infection. Relman’s NEJM published seminal papers on new ways to treat and prevent coronary artery disease, research that led to halving the American death toll from heart attacks between 1980 and 2005. It also published major papers that proved the effectiveness of lumpectomy for breast cancer and the vaccine to prevent hepatitis B. While this is extraordinary stuff, I’d want to chat with Relman mostly about his other passion: health policy.

pages: 423 words: 118,002

The Boom: How Fracking Ignited the American Energy Revolution and Changed the World
by Russell Gold
Published 7 Apr 2014

His argument was that the amount of oil in the world is finite and that as production increases, it will reach a peak and then begin to decline. Drawn on a graph, his forecast resembled a bell curve. In the late 1940s, he became interested in the question of how many years of oil supply could be pumped out of the earth and set out to figure it out. At the same time, he studied hydraulic fracturing and wrote a seminal paper on the new technology. The two interests were connected. If hydraulic fracturing could significantly increase the availability of oil and gas, it would make more oil available and push back the date of “Hubbert’s Peak.” But he was not impressed with Stanolind’s hydrafracs. In his famous 1956 paper outlining his ideas on peak oil, he noted that only about one-third of the oil in a reservoir was being recovered.

pages: 474 words: 120,801

The End of Power: From Boardrooms to Battlefields and Churches to States, Why Being in Charge Isn’t What It Used to Be
by Moises Naim
Published 5 Mar 2013

In 1937, Coase produced a conceptual breakthrough that explained why large organizations were not just rational according to a certain theory of profit-maximizing behavior but, indeed, often proved more efficient than the alternatives. It was no coincidence that, while still an undergraduate, in 1931–1932, Coase carried out the research for his seminal paper, “The Nature of the Firm,” in the United States. Earlier he had flirted with socialism, and he became intrigued by the similarities in organization between American and Soviet firms and, in particular, by the question of why large industry, where power was highly centralized, had emerged on both sides of the ideological divide.20 Coase’s explanation—which would help earn him the Nobel Prize in economics decades later—was both simple and revolutionary.

pages: 410 words: 114,005

Black Box Thinking: Why Most People Never Learn From Their Mistakes--But Some Do
by Matthew Syed
Published 3 Nov 2015

The attitudes and activities required to effectively detect and analyze failures are in short supply in most companies, and the need for context-specific learning strategies is underappreciated. Organizations need new and better ways to go beyond lessons that are superficial.”15 Wald’s analysis of bullet-riddled aircraft in World War II saved the lives of dozens of brave airmen. His seminal paper for the military was not declassified until July 1980, but can be found today via a simple search on Google. It is entitled: “A Method of Estimating Plane Vulnerability Based on Damage of Survivors.”16 It wasn’t until after the war that Wald learned of the murder of eight of his nine family members at the hands of the Nazis.

pages: 406 words: 115,719

The Case Against Sugar
by Gary Taubes
Published 27 Dec 2016

Through the mid-nineteenth century, diabetes remained a rare affliction, to be discussed in medical texts and journal articles but rarely seen by physicians in their practices. As late as 1797, the British army surgeon John Rollo could publish “An Account of Two Cases of the Diabetes Mellitus,” a seminal paper in the history of the disease, and report that he had seen these cases nineteen years apart despite, as Rollo wrote, spending the intervening years “observ[ing] an extensive range of disease in America, the West Indies, and in England.” If the mortality records from Philadelphia in the early nineteenth century are any indication, the city’s residents were as likely to die from diabetes, or at least to have diabetes attributed as the cause of their death, as they were to be murdered or to die from anthrax, hysteria, starvation, or lethargy.*1 In 1890, Robert Saundby, a former president of the Edinburgh Royal Medical Society, presented a series of lectures on diabetes to the Royal College of Physicians in London in which he estimated that less than one in every fifty thousand died from the disease.

pages: 384 words: 118,572

The Confidence Game: The Psychology of the Con and Why We Fall for It Every Time
by Maria Konnikova
Published 28 Jan 2016

She concluded, “His stupidity could cost him his life.” One of the reasons that the tale is so powerful is that, despite the motivated reasoning that we engage in, we never realize we’re doing it. We think we are being rational, even if we have no idea why we’re really deciding to act that way. In “Telling More Than We Can Know,” a seminal paper in the history of social and cognitive psychology, Richard Nisbett and Timothy Wilson showed that people’s decisions are often influenced by minute factors outside their awareness—but tell them as much, and they rebel. Instead, they will give you a list of well-reasoned justifications for why they acted as they did.

pages: 443 words: 116,832

The Hacker and the State: Cyber Attacks and the New Normal of Geopolitics
by Ben Buchanan
Published 25 Feb 2020

Error Locked San Bernardino Attacker’s iPhone,” New York Times, March 1, 2016. 3. US Department of Justice Office of the Inspector General, “A Special Inquiry Regarding the Accuracy of FBI Statements Concerning Its Capabilities to Exploit an iPhone Seized during the San Bernardino Terror Attack Investigation,” March 27, 2018, 8. 4. For the two seminal papers summarizing cryptographers’ views on the dangers of weakening encryption, see Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, et al., “Keys under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications,” Computer Science and Artificial Intelligence Laboratory Technical Report, Massachusetts Institute of Technology, July 6, 2015; and Hal Abelson et al., “The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption,” Columbia University Academic Commons, May 27, 1997. 5.

pages: 316 words: 117,228

The Code of Capital: How the Law Creates Wealth and Inequality
by Katharina Pistor
Published 27 May 2019

Hirschman, Exit, Voice, and Loyalty: Responses to Decline in Firms, Organizations, and States (Cambridge, MA: Harvard University Press, 1970). 43. See Dari-Matiacci et al., “The Emergence of the Corporate Form,” p. 221 with figures 10 and 11, p. 223. 44. Modigliani and Miller addressed this question in a seminal paper published in 1958. See Franco Modigliani and Merton H. Miller, “The Cost of Capital, Corporation Finance and the Theory of Investment,” American Economic Review 48, no. 3 (1958):261–297. 45. See Oliver Williamson, “Transaction-Cost Economics: The Governance of Contractual Relations,” Journal of Law and Economics 22, no. 2 (1979):233–261; Sanford J.

pages: 434 words: 117,327

Can It Happen Here?: Authoritarianism in America
by Cass R. Sunstein
Published 6 Mar 2018

The second problem is that individual choices are often subject to social influence, or what economists call peer effects, meaning that the choice that is most desirable, or simply the most available, to one person depends on what others are doing. Individuals are bound together by complex networks, which can propagate beliefs and behavior in ways that are fundamentally at odds with common sense. In a seminal paper, Mark Granovetter (1978) showed how surprising the dynamics of networked systems can be. Granovetter imagined a hypothetical crowd of agitators on the brink of violence, where each member has some “threshold” for acting out that depends on how many others are acting out. If one member has a threshold of zero, he will trigger without provocation.
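Granovetter's threshold dynamics can be sketched in a few lines. The threshold values below are illustrative assumptions for the sketch, not figures taken from the 1978 paper:

```python
def cascade(thresholds):
    """Granovetter-style cascade: an agent acts once the number of
    agents already acting reaches its threshold (a sketch that counts
    an agent's own action once it has joined the crowd)."""
    acting = 0
    while True:
        # count agents whose threshold is satisfied by the current crowd
        new_acting = sum(1 for t in thresholds if t <= acting)
        if new_acting == acting:
            return acting
        acting = new_acting

# Thresholds 0, 1, 2, ..., 99: a single zero-threshold instigator
# tips the entire crowd, one agent at a time.
print(cascade(list(range(100))))            # 100
# Raise the lone threshold-1 agent to 2: the cascade stops at one person.
print(cascade([0] + list(range(2, 101))))   # 1
```

The two runs differ by a single agent's threshold yet produce a riot of one hundred versus a lone agitator, which is exactly the "fundamentally at odds with common sense" sensitivity the excerpt describes.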

pages: 401 words: 115,959

Philanthrocapitalism
by Matthew Bishop , Michael Green and Bill Clinton
Published 29 Sep 2008

This “serves as a reminder to all our employees to consider the consequences of our actions,” Page told the Global Philanthropy Forum, which he and Brin hosted in 2007 at the Googleplex. Then he joked, “Perhaps it was a mistake—we should have said, ‘Be good.’ ” According to Google legend, when the pair first met in 1995, as computer science students at Stanford University, they were “not terribly fond of each other.” That soon changed, and they went on to write a seminal paper together, “The Anatomy of a Large-Scale Hypertextual Web Search Engine,” before founding Google, which was incorporated as a private company in September 1998. Less than six years later, the online search firm, with its mission to “organize the world’s information and make it universally accessible and useful,” sold its shares to the public, instantly making the American-born Page and the Russian-born Brin billionaires several times over.

Decoding Organization: Bletchley Park, Codebreaking and Organization Studies
by Christopher Grey
Published 22 Mar 2012

The relationship between organization studies and history has been somewhat limited and not always very satisfactory. Of course, there are exceptions to this, Whipp and Clark’s (1986) study of innovation in the car industry being a good example. Nevertheless, the very legitimacy of such an engagement has been questioned (Golden, 1992) and, until Alfred Kieser’s (1994) seminal paper, there was little discussion within the recent literature regarding the role and relationship of history to organization studies, unlike that which had occurred, for example and in particular, in sociology (e.g. Abrams, 1982). In that regard Kieser notes that Weber, in many ways the ‘founding father’ of organization studies, was as much historian as sociologist, believing contemporary institutions could be understood only by knowing how they developed in history.

pages: 424 words: 114,820

Neurodiversity at Work: Drive Innovation, Performance and Productivity With a Neurodiverse Workforce
by Amanda Kirby and Theo Smith
Published 2 Aug 2021

Two recent papers in the past few years have exemplified the need to consider how services are delivered and are really saying we need to move away from the strict categorization and medical model of psychiatry and paediatrics. The recent consensus paper on identification and treatment of ADHD and ASD5 and the seminal paper from Thapar, Cooper and Rutter6 both highlight the real clinical conundrum of ‘the complexity of clinical phenotypes and the importance of the social context’. A phenotype means the challenges or strengths that someone can observe in someone. They go on to argue ‘the importance of viewing neurodevelopmental disorders as traits but highlight that this is not the only approach to use’.

pages: 555 words: 119,733

Autotools
by John Calcote
Published 20 Jul 2010

There is a method to my madness: I've tried to use constructs that are portable to many flavors of the make utility. Now let's discuss the basics of make. If you're already pretty well versed in it, then you can skip the next section. Otherwise, give it a quick read, and we'll return our attention to the Jupiter project later in the chapter. * * * [16] Peter Miller's seminal paper, "Recursive Make Considered Harmful" (http://miller.emu.id.au/pmiller/books/rmch/), published over 10 years ago, discusses some of the problems recursive build systems can cause. I encourage you to read this paper and understand the issues Miller presents. While the issues are valid, the sheer simplicity of implementing and maintaining a recursive build system makes it, by far, the most widely used form of build system.

pages: 320 words: 33,385

Market Risk Analysis, Quantitative Methods in Finance
by Carol Alexander
Published 2 Jan 2007

But an arithmetic process can become negative, whilst the prices of tradable assets are never negative. Thus, to represent the dynamics of an asset price we very often use a geometric Brownian motion which is specified by the following SDE: dS_t / S_t = μ dt + σ dZ_t (I.3.143) The standard assumption, made in the seminal papers by Black and Scholes (1973) and Merton (1973), is that the parameters μ and σ (the drift and the volatility of the process) are constant. Although it has been known since Mandelbrot (1963) that the constant volatility assumption is not valid for financial asset prices, the Black–Scholes–Merton framework still remains a basic standard against which all other models are gauged.
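A minimal simulation sketch of this SDE, stepping the exact log-normal solution of geometric Brownian motion; the drift, volatility, and path-length values are illustrative assumptions, not parameters from the text:

```python
import math
import random

def gbm_path(s0, mu, sigma, dt, n_steps, seed=42):
    """Simulate geometric Brownian motion via its exact solution
    S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z)."""
    rng = random.Random(seed)
    prices = [s0]
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)  # standard normal increment of dZ_t
        prices.append(prices[-1] * math.exp(
            (mu - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z))
    return prices

# One year of daily steps with an assumed 5% drift and 20% volatility.
path = gbm_path(s0=100.0, mu=0.05, sigma=0.2, dt=1 / 252, n_steps=252)
# Each step multiplies by an exponential, so the price stays positive.
assert min(path) > 0
```

Because every step is a multiplication by exp(·), the simulated price can never go negative, which is precisely the property the excerpt cites for preferring a geometric over an arithmetic process for tradable assets.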

pages: 561 words: 120,899

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant From Two Centuries of Controversy
by Sharon Bertsch McGrayne
Published 16 May 2011

Cornfield maintained a running argument with Fisher through the 1950s. Cornfield was already thinking deeply about the standards of evidence needed before observational data could establish cause and effect. Finally, in 1959, he raked Fisher over the coals about smoking with a common-sense, nonmathematical paper that reads like a legal brief. In that seminal paper he and five coauthors systematically addressed every one of Fisher’s alternative explanations for the link between cigarette smoking and lung cancer. They hurled one counterargument after another at Fisher’s hypothetical genetic factor. If cigarette smokers were nine times more likely than nonsmokers to get lung cancer, Fisher’s latent genetic factor must be even larger—though nothing approaching that had ever been seen.

pages: 476 words: 120,892

Life on the Edge: The Coming of Age of Quantum Biology
by Johnjoe McFadden and Jim Al-Khalili
Published 14 Oct 2014

The fuzziness in the position of any particle allows it to leak through an energy barrier. We saw in chapter 3 how enzymes utilize quantum tunneling of electrons and protons by bringing molecules close enough together for tunneling to take place. A decade after Watson and Crick published their seminal paper, the Swedish physicist Per-Olov Löwdin, whom we met earlier in this chapter, proposed that quantum tunneling could provide an alternative way for protons to move across hydrogen bonds to generate the tautomeric, mutagenic, forms of nucleotides. It is important to emphasize that DNA mutations are caused by a variety of different mechanisms, including damage caused by chemicals, ultraviolet light, radioactive decay particles, even cosmic rays.

pages: 467 words: 116,094

I Think You'll Find It's a Bit More Complicated Than That
by Ben Goldacre
Published 22 Oct 2014

However, quantifiable indices of health status, social functioning, criminal behaviour, total opiate consumption, needle-sharing and so on are all viable and uncontroversial outcome measures, and should be comprehensively investigated for methadone and heroin. Furthermore, no indications have been found that prescribing heroin would inflict harm of a kind that might make such trials unacceptable. Perneger et al. (1998) have noted that although the Swiss trial was small, it was similar to the initial evaluations of methadone, such as the seminal paper by Dole and Nyswander (1965), which led to its widespread use in the treatment of drug addiction. It seems likely that a contributory factor was the medical profession’s emotional and moral attitudes towards drug users. However noble our intentions when we approach a clinical or social problem, we may often be confounded by extraneous factors and preconceptions, and fail in our objectivity.

pages: 478 words: 126,416

Other People's Money: Masters of the Universe or Servants of the People?
by John Kay
Published 2 Sep 2015

This was the beginning of the development of markets in derivative securities. It is not a coincidence that the University of Chicago was then and is today a leading centre of the study of financial economics. In the following year two members of its faculty – Fischer Black and Myron Scholes – would publish a seminal paper on the valuation of derivatives.5 Much of the growth of the financial sector in the three decades that followed would be the direct and indirect consequence of the growth of derivative markets. Futures were not the only kind of derivative. An option gave you the right, but not the obligation, to buy or sell – you could use an option to insure yourself against a rise, or a fall, in price.

The Future of Technology
by Tom Standage
Published 31 Aug 2005

Passwords should be at least six and ideally eight characters long, and contain a mixture of numbers, letters and punctuation marks. Dictionary words and personal information should not be used as passwords. Users should have a different password on each system, and they should never reveal their passwords to anyone, including systems managers. Yet a seminal paper published as long ago as 1979 by Ken Thompson and Robert Morris found that nearly a fifth of users chose passwords consisting of no more than three characters, and that a third used dictionary words. (Robert Morris, the chief scientist at America’s National Computer Security Centre, was subsequently upstaged by his son, also called Robert, who released the first internet worm in 1988 and crashed thousands of computers.

Entangled Life: How Fungi Make Our Worlds, Change Our Minds & Shape Our Futures
by Merlin Sheldrake
Published 11 May 2020

This sort of relationship-building enacts one of the oldest evolutionary maxims. If the word cyborg—short for “cybernetic organism”—describes the fusion between a living organism and a piece of technology, then we, like all other life-forms, are symborgs, or symbiotic organisms. The authors of a seminal paper on the symbiotic view of life take a clear stance on this point. “There have never been individuals,” they declare. “We are all lichens.” * * * — DRIFTING AROUND ON the Caper, we spend a lot of time looking at sea charts. On these maps, the familiar role of sea and land are reversed.

pages: 453 words: 122,586

Samuelson Friedman: The Battle Over the Free Market
by Nicholas Wapshott
Published 2 Aug 2021

“Samuelson reshaped academic thinking about nearly every economic subject,” declared the New York Times, which noted that “a historian could well tell the story of 20th-century public debate over economic policy in America through the jousting between Mr. Samuelson and Milton Friedman.”36 The paper’s principal economics commentator, Paul Krugman, wrote, “Most economists would love to have written even one seminal paper—a paper that fundamentally changes the way people think about some issue. Samuelson wrote dozens.”37 The conservative Wall Street Journal acknowledged Samuelson as a “Titan of economics,” while The Economist pronounced him “the last of the great general economists.”38 The Daily Telegraph, London, wrote that “Samuelson made such diverse contributions to his field—ranging from welfare economics, theories of consumption, prices, capital accumulation, economic growth, public goods, finance and international trade—that it is hard to think of a debate to which he did not make a trenchant contribution,” and that his textbook “effectively gave the world a common language with which the complexities of world markets can be discussed and understood.”

User Friendly: How the Hidden Rules of Design Are Changing the Way We Live, Work & Play
by Cliff Kuang and Robert Fabricant
Published 7 Nov 2019

This issue of real-world performance versus lab experiments hovered over the battlefield, a killer beyond reckoning. It was made worse by the fact that the war was being fought on a “sensory margin” exponentially more fine than that of just ten years before. S. S. Stevens, a psychologist at Harvard, was the one who reported that story of an airman lost at sea. He was horrified. As he put it in his seminal paper “Machines Cannot Fight Alone”: The battle hangs on the power of the eyes or the ears to make a fine discrimination, to estimate a distance, to see or hear a signal which is just at the edge of human capacity. Radars don’t see, radios don’t hear, sonars don’t detect, guns don’t point without someone making a fine sensory judgment, and the paradox of it is that the faster the engineers and the inventors served up their “automatic” gadgets to eliminate the human factor the tighter the squeeze became on the powers of the operator—the man who must see and hear and judge and act with that margin of superiority which gives his outfit the jump on the enemy.5 Stevens notes that men would push this faulty equipment to its limits.

pages: 442 words: 127,300

Why We Sleep: Unlocking the Power of Sleep and Dreams
by Matthew Walker
Published 2 Oct 2017

Lewis Terman, famous for helping construct the IQ test, dedicated his research career to the betterment of children’s education. Starting in the 1920s, Terman charted all manner of factors that promoted a child’s intellectual success. One such factor he discovered was sufficient sleep. Published in his seminal papers and book Genetic Studies of Genius, Terman found that no matter what the age, the longer a child slept, the more intellectually gifted they were. He further found that sleep time was most strongly connected to a reasonable (i.e., a later) school start time: one that was in harmony with the innate biological rhythms of these young, still-maturing brains.

pages: 437 words: 126,860

Case for Mars
by Robert Zubrin
Published 27 Jun 2011

If we consider only the upper Amazonian territories as viable candidates, and spread their formation equally in time across the 500 million years of that era, we find that 10 percent, or 0.5 million square kilometers, is probably less than 50 million years old; 1 percent, or 50,000 square kilometers, is probably less than 5 million years old; and 0.1 percent, or 5,000 square kilometers, has probably been active within the past 500,000 years. You don’t have to extract geothermal power from a region that is actually volcanically active now. The ground stays hot a long time after activity has subsided. In his seminal paper on Mars geothermal power, Fogg presented calculations of the temperature profiles of Martian land as a function of the time since the region was active. His results are summarized in Table 7.2. As a point of reference, the current state of the art of terrestrial drilling technology is to be able to drill down to about 10 kilometers.

pages: 424 words: 122,350

Feral: Rewilding the Land, the Sea, and Human Life
by George Monbiot
Published 13 May 2013

Armies of conservation volunteers are employed to prevent natural processes from occurring. Land is intensively grazed to ensure that the plants do not recover from intensive grazing. Woods are coppiced (the trees are felled at ground level, encouraging them to resprout from that point) to sustain the past impacts of coppicing. In their seminal paper challenging the conservation movement, the biologists Clive Hambler and Martin Speight point out that while coppicing might favour butterfly species which can live in many habitats, it harms woodland beetles and moths that can live nowhere else.30 They noted that of the 150 woodland insects that are listed as threatened in Britain, just three (2 per cent) are threatened by a reduction in coppicing, while 65 per cent are threatened by the removal of old and dead wood.

pages: 451 words: 125,201

What We Owe the Future: A Million-Year View
by William MacAskill
Published 31 Aug 2022

In 1820, an estimated 83.9 percent of the world population lived on a daily income that, adjusted for inflation and price differences between countries, bought less than one dollar did in the US in 1985 (Bourguignon and Morrisson 2002, Table 1, 731, 733). In 2002, when Bourguignon and Morrisson published their seminal paper on the history of the world income distribution, this was the World Bank’s international poverty line, typically used to define extreme poverty. The World Bank has since updated the international poverty line to a daily income corresponding to what $1.90 would have bought in the US in 2011. Using this new definition, World Bank data indicates that the share of the global population living in extreme poverty has been less than 10 percent since 2016; the COVID-19 pandemic tragically broke the long-standing trend of that percentage declining year after year, but it did not quite push it over 10 percent again (World Bank 2020).

pages: 484 words: 136,735

Capitalism 4.0: The Birth of a New Economy in the Aftermath of Crisis
by Anatole Kaletsky
Published 22 Jun 2010

But while Public Choice theory is often regarded as a laissez-faire ideology characteristic of the Thatcher-Reagan period, some of the most important contributors to this skeptical view of regulation4 were progressive advocates of strong and effective governments who were trying to develop a theory on how to improve, not jettison, public choice. Even Buchanan, although a conservative in his general political outlook, maintained that he was neither for nor against government. In one of his seminal papers, he explained how the skeptical framework of Public Choice “almost literally forces the critic to be pragmatic in any comparison of proposed institutional structures.”5 This is a perfect way to characterize the attitude to government and markets in Capitalism 4.0. The skeptical Public Choice insights about the failures of government in the 1970s and 1980s are likely to produce new conclusions under Capitalism 4.0.

Designing Interfaces
by Jenifer Tidwell
Published 15 Dec 2010

Figure 7-25. San Francisco Crimespotting

In other libraries

http://patternbrowser.org/code/pattern/pattern_anzeigen.php?4,231,17,0,0,252
http://www.infovis-wiki.net/index.php?title=Dynamic_query

Both the name and the concept for Dynamic Queries originated in the early 1990s with several seminal papers by Christopher Ahlberg, Christopher Williamson, and Ben Shneiderman. You can find some of these papers online, including the following:

http://hcil.cs.umd.edu/trs/91-11/91-11.html
http://hcil.cs.umd.edu/trs/93-01/93-01.html

Data Brushing

Figure 7-26. BBN Cornerstone

What

Let the user select data items in one view; show the same data selected simultaneously in another view.

pages: 434 words: 135,226

The Music of the Primes
by Marcus Du Sautoy
Published 26 Apr 2004

Even though everyone is using the same key to encode their data – to lock the door and secure their secret – no one can read anyone else’s encoded message. In fact, once data is encoded, customers are unable to read it, even if it is their own. Only the company running the website has key B, to unlock the door and read those credit card numbers. Public-key cryptography was first openly proposed in 1976 in a seminal paper by two mathematicians based at Stanford University in California, Whit Diffie and Martin Hellman. The duo sparked a counter-culture in the cryptographic world that would challenge the governmental agencies’ monopoly on cryptography. Diffie in particular was the archetypal anti-establishment, long-haired child of the 1960s.

From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry
by Martin Campbell-Kelly
Published 15 Jan 2003

Steve Lohr, the author of a fine history of computer programming, has written: “Gates, most of all, is someone with a deep understanding of software, and the foremost applied economist of the past half-century.”71 Describing the development of increasing-returns economics, Lohr continues: “One of the seminal papers in this new branch of economics research was written by Michael Katz and Carl Shapiro, ‘Network Externalities, Competition and Compatibility,’ published in 1985. While they were working on their paper, Shapiro recalled Katz saying there was a guy who had been at Harvard when Katz was an undergraduate, who was doing precisely what they were writing about at a software company outside Seattle.

pages: 418 words: 128,965

The Master Switch: The Rise and Fall of Information Empires
by Tim Wu
Published 2 Nov 2010

Be liberal in what you accept from others.14 It may seem strange that such a philosophical, perhaps even spiritual, principle should be embedded in the articulation of the Internet, but then network design, like all design, can be understood as ideology embodied, and the Internet clearly bore the stamp of the opposition to bigness characteristic of the era. Not long thereafter, three professors of computer science, David Reed, David Clark, and Jerome Saltzer, would try to explain what made the Internet so distinctive and powerful. In a seminal paper of 1984, “End-to-End Arguments in System Design,” they argued for the enormous potential inherent in decentralizing decisional authority—giving it to the network users (the “ends”).15 The network itself (the “middle”) should, they insisted, be as nonspecialized as possible, so as to serve the “ends” in any ways they could imagine.* What were such notions if not the computer science version of what Hayek and Jacobs, Kohr and Schumacher, had been arguing?

pages: 461 words: 128,421

The Myth of the Rational Market: A History of Risk, Reward, and Delusion on Wall Street
by Justin Fox
Published 29 May 2009

Finance was just a side interest for him, but he devised the first mathematical proof of the efficient market hypothesis and came close to solving the option-pricing puzzle. Recipient of the second Nobel prize in economics, in 1970. Leonard “Jimmy” Savage Statistics professor whose axioms for assessing data under uncertainty informed the work of his Chicago student Harry Markowitz and helped define rationality for decades. Also coauthor of a seminal paper on expected utility with Milton Friedman, and rediscoverer of the work of French market theory pioneer Louis Bachelier. Myron Scholes Classmate and friend of Michael Jensen and Richard Roll at Chicago. Devised the Black-Scholes option pricing model along with Fischer Black while teaching at MIT.

pages: 582 words: 136,780

Krakatoa: The Day the World Exploded
by Simon Winchester
Published 1 Jan 2003

The Dutchman was promptly invited to the United States by a young Princeton scientist named Harry Hess; and, together with two other young men who were to go on to become rising stars in the new field, Maurice Ewing and Teddy (later to be Sir Edward) Bullard, they took off in a boat called the Barracuda to see if the Javan anomalies could be found above the submarine trenches known to exist in the Caribbean. They did, spectacularly so. The four excitedly discussed why this might be – with Harry Hess and Vening Meinesz openly speculating that they were caused by some mysterious force dragging the rocks of the seabed downwards and (as it were) dragging the gravity down with them. Hess wrote a seminal paper in 1939: Recently an important new concept concerning the origins of the negative strip of gravity anomalies… has been set forward… It is based on model experiments in which… by means of horizontal rotating cylinders, convection currents were set up in a fluid layer beneath the ‘crust’ and a convection cell was formed.

pages: 607 words: 133,452

Against Intellectual Monopoly
by Michele Boldrin and David K. Levine
Published 6 Jul 2008

These ideas build on the earlier ideas of Allyn Young (1928), and especially the work of Kenneth Arrow (1962), further developed by Karl Shell (1966, 1967). To give credit where it belongs, we should point out that Arrow’s original argument was meant to lead to the conclusion that R&D, because it produced a public good (nonrivalrous knowledge), ought to be financed by public expenditure. There is nothing in Arrow’s seminal paper, nor in his subsequent writings on the topic, that suggests he had in mind intellectual monopoly as a solution to the allocational inefficiency that he – in our view, incorrectly – detected in the production of knowledge. There is also an extensive microeconomics literature on patents that generally begins with the assumption that innovation will not take place without a patent and inquires into the optimal length and breadth of patent protection.

AI 2041: Ten Visions for Our Future
by Kai-Fu Lee and Qiufan Chen
Published 13 Sep 2021

Now you can see why happiness-inducing AI is extremely hard! Let’s dig into the four problems and possible solutions. WHAT IS HAPPINESS IN THE ERA OF AI? Setting aside AI for the moment, let’s ask the most basic question: What does happiness mean anyway? In 1943, Abraham Maslow published his seminal paper “A Theory of Human Motivation,” which described what is now known as “Maslow’s hierarchy of needs.” This theory is usually illustrated as a pyramid, shown below. This pyramid describes human needs from the most basic to the most advanced level. Each lower-level need must be fulfilled in order to move toward a higher-level need.

Evil Genes: Why Rome Fell, Hitler Rose, Enron Failed, and My Sister Stole My Mother's Boyfriend
by Barbara Oakley Phd
Published 20 Oct 2008

Kindhearted naivete puts one at risk of being taken advantage of by a psychopath, which is perhaps why sweet-tempered Laci Peterson, brutally murdered by her husband while eight months pregnant with their first child, had previously dated a man who eventually received a fifteen-year prison sentence for shooting another girlfriend in the back.)3 Just as the cuckoo has found an evolutionary niche laying its eggs in the nests of other birds (taking advantage of their nurturing instincts), psychopaths and Machiavellians have found their evolutionary niche in taking advantage of the natural altruism of other humans.a.4 Such variation in human emotional outlook is bred into our very genes. Research has progressed since Mealey wrote her seminal paper in 1995. But the essential idea she reviewed and synthesized is unchanged—that is, congenitally deceptive individuals—cheaters—can thrive and reproduce in society. How much these cheaters succeed depends on how many of them there are. If their numbers are tiny, they can easily find victims to dupe, and so they thrive.

The Book of Why: The New Science of Cause and Effect
by Judea Pearl and Dana Mackenzie
Published 1 Mar 2018

Because Wright lived such a long life, he had the rare privilege of seeing a biography (Provine, 1986) come out while he was still alive. Provine’s biography is still the best place to learn about Wright’s career, and we particularly recommend Chapter 5 on path analysis. Crow’s two biographical sketches (Crow, 1982, 1990) also provide a very useful biographical perspective. Wright (1920) is the seminal paper on path diagrams; Wright (1921) is a fuller exposition and the source for the guinea pig birth-weight example. Wright (1983) is Wright’s response to Karlin’s critique, written when he was over ninety years old. The fate of path analysis in economics and social science is narrated in Chapter 5 of Pearl (2000) and in Bollen and Pearl (2013).

Applied Cryptography: Protocols, Algorithms, and Source Code in C
by Bruce Schneier
Published 10 Nov 1993

This means that, if Satisfiability is solvable in polynomial time, then P = NP. Conversely, if any problem in NP can be proven not to have a deterministic polynomial-time algorithm, the proof will show that Satisfiability does not have a deterministic polynomial-time algorithm either. No problem is harder than Satisfiability in NP. Since Cook’s seminal paper was published, a huge number of problems have been shown to be equivalent to Satisfiability; hundreds are listed in [600], and some examples follow. By equivalent, I mean that these problems are also NP-complete; they are in NP and also as hard as any problem in NP. If their solvability in deterministic polynomial time were resolved, the P versus NP question would be solved.
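The asymmetry Schneier describes, where a candidate assignment is easy to check but the only known general strategy is to try them all, can be made concrete with a minimal sketch (not from the book): a brute-force satisfiability checker whose running time grows as 2^n in the number of variables.

```python
from itertools import product

def satisfiable(clauses, n_vars):
    """Brute-force check of a CNF formula.

    clauses: list of clauses; each clause is a list of nonzero ints,
    where k means variable k and -k means its negation (DIMACS style).
    Checking one assignment is fast; trying all of them takes 2**n_vars
    steps, which is the exponential blowup at the heart of P vs. NP.
    """
    for assignment in product([False, True], repeat=n_vars):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

# (x1 OR x2) AND (NOT x1 OR x2) AND (NOT x2 OR x3)
print(satisfiable([[1, 2], [-1, 2], [-2, 3]], 3))  # True
# (x1) AND (NOT x1)
print(satisfiable([[1], [-1]], 1))                 # False
```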

Their contribution to cryptography was the notion that keys could come in pairs—an encryption key and a decryption key—and that it could be infeasible to generate one key from the other (see Section 2.5). Diffie and Hellman first presented this concept at the 1976 National Computer Conference [495]; a few months later, their seminal paper “New Directions in Cryptography” was published [496]. (Due to a glacial publishing process, Merkle’s first contribution to the field didn’t appear until 1978 [1064].) Since 1976, numerous public-key cryptography algorithms have been proposed. Many of these are insecure. Of those still considered secure, many are impractical.
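The key-pair idea Schneier summarizes can be sketched as a toy Diffie-Hellman exchange (an illustration only, not the book's code; the prime here is deliberately tiny, while real deployments use primes of 2048 bits or more):

```python
import secrets

# Toy Diffie-Hellman key exchange. Public parameters:
p = 2**32 - 5   # a small prime modulus (illustrative only)
g = 5           # a public generator

a = secrets.randbelow(p - 2) + 2   # Alice's private key, never transmitted
b = secrets.randbelow(p - 2) + 2   # Bob's private key, never transmitted

A = pow(g, a, p)   # Alice sends this value in the clear
B = pow(g, b, p)   # Bob sends this value in the clear

# Each side combines its own secret with the other's public value.
# An eavesdropper sees only p, g, A, and B; recovering a from A means
# solving a discrete logarithm, which is believed to be infeasible at
# realistic key sizes.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob  # both arrive at the same secret
```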

Multitool Linux: Practical Uses for Open Source Software
by Michael Schwarz , Jeremy Anderson and Peter Curtis
Published 7 May 2002

Eric Raymond and Bruce Perens Bruce Perens created the original Open Source Definition (OSD) as the Debian Free Software Guidelines (DFSG). Debian is a Linux distribution that consists entirely of software that meets these guidelines. Some Linux distributions include closed, commercial, or marginally open software. Debian never has and never will. Eric S. Raymond wrote a number of seminal papers on the phenomenon of what he and Bruce call open source development but what I think may be more generally called Internet-distributed software development. Eric tends to come to fairly far-reaching conclusions from a limited number of data points, but it doesn't change the fact that The Cathedral and the Bazaar has become what may be the single most influential essay on the whole phenomenon of free software.

pages: 516 words: 159,734

War Without Mercy: PACIFIC WAR
by John Dower
Published 11 Apr 1986

Gorer was close to the American academics involved in the development of “culture and personality” studies, and briefly associated with the analysis of Japanese behavior being conducted under the Foreign Morale Analysis Division of the Office of War Information. In addition, his theories were quickly picked up in the popular press. His seminal paper, entitled “Themes in Japanese Culture” in its original presentation, was recapitulated in Time under the heading “Why Are Japs Japs?”6 Gorer began his analysis by observing that on the surface Japan appeared to be “the most paradoxical culture of which we have any record,” illustrating this with a familiar catalog of contradictions.

pages: 518 words: 147,036

The Fissured Workplace
by David Weil
Published 17 Feb 2014

In so doing, “a predictable salary progress schedule not only should help to reduce uncertainty about future pay but also should prevent the development of false expectations. In addition it should minimize dysfunctional competition between individuals for favored treatment.” Quoted in Foulkes (1980, 186). 25. Fehr, Goette, and Zehnder (2009, 378). The literature on loss aversion and “framing” in psychology is extensive. The seminal papers are Tversky and Kahneman (1974) and Kahneman and Tversky (1984). Kahneman (2011) provides an overview of the extensive research in the field in the decades following those landmark works. 26. Slichter, Healy, and Livernash (1960) explained the common practice of uniform pay increases with job grades with minimal performance evaluation in union and nonunion facilities as an outgrowth of union avoidance and the constant problems of defending merit-based evaluations in the minds of workers.

pages: 562 words: 153,825

Dark Mirror: Edward Snowden and the Surveillance State
by Barton Gellman
Published 20 May 2020

See also Eric Hughes, “A Cypherpunk’s Manifesto” (1993), www.activism.net/cypherpunk/manifesto.html; and John Perry Barlow, “A Declaration of the Independence of Cyberspace,” Electronic Frontier Foundation, February 8, 1996, www.eff.org/cyberspace-independence. invented “onion routing”: Among the seminal papers by Naval Research Laboratory employees was David Goldschlag, Michael Reed, and Paul Syverson, “Onion Routing for Anonymous and Private Internet Connections,” Communications of the Association for Computing Machinery, January 28, 1999, www.onion-router.net/Publications/CACM-1999.pdf. Onion routing relays an internet connection through a series of hops, each of them encrypted, ensuring that no single network operator can see both the origin and the destination.
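The layered relaying Gellman describes can be illustrated with a toy sketch (the XOR cipher below is a stand-in for real per-hop encryption; nothing here is from the onion-router codebase):

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR with a repeating key. XOR is
    # self-inverse, so applying the same key twice recovers the bytes.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def wrap(message: bytes, relay_keys: list) -> bytes:
    # Build the onion: apply the last relay's layer first, so the
    # first relay's layer ends up outermost.
    for key in reversed(relay_keys):
        message = xor_cipher(message, key)
    return message

def route(onion: bytes, relay_keys: list) -> bytes:
    # Each relay peels exactly one layer; only after the final hop
    # does the plaintext emerge.
    for key in relay_keys:
        onion = xor_cipher(onion, key)
    return onion

keys = [b"relay-one", b"relay-two", b"relay-three"]
onion = wrap(b"meet at dawn", keys)
assert onion != b"meet at dawn"               # the entry relay sees ciphertext
assert route(onion, keys) == b"meet at dawn"  # the exit recovers the message
```

Real onion routing also wraps the next-hop address inside each layer, which is what keeps any single relay from learning both the origin and the destination.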

pages: 513 words: 152,381

The Precipice: Existential Risk and the Future of Humanity
by Toby Ord
Published 24 Mar 2020

Jonathan Schell (1982). The Fate of the Earth. The first deep exploration of the badness of extinction, and the central importance of ensuring humanity’s survival. Filled with sharp philosophical insight. Carl Sagan (1983). “Nuclear War and Climatic Catastrophe: Some Policy Implications.” A seminal paper, introducing the new-found mechanism of nuclear winter and exploring the ethical implications of human extinction. Derek Parfit (1984). Reasons and Persons. Among the most famous works in philosophy in the twentieth century, it made major contributions to the ethics of future generations and its concluding chapter highlighted how and why the risk of human extinction may be one of the most important moral problems of our time.

pages: 482 words: 149,351

The Finance Curse: How Global Finance Is Making Us All Poorer
by Nicholas Shaxson
Published 10 Oct 2018

Though LBO had become almost a term of abuse, the titans were determined to try again, so they rebranded the sector with a fancy new name: private equity.4 At the same time a new intellectual saviour appeared, in the form of a Harvard Business School professor called Michael Jensen, who had trained at the University of Chicago and had some exciting new ideas about business strategy. Normal public companies – your BPs or your Tescos, say – are owned by a diverse group of shareholders, but run by a different group, their managers. These two groups’ interests weren’t necessarily aligned, Jensen argued in a couple of seminal papers in the Harvard Business Review in 1989 and 1990. Managers didn’t have strong enough incentives to look after shareholders’ money, and this led to what he called ‘widespread waste and inefficiency’. He argued, first of all, that corporate America needed a new breed of superstar owner-managers in a financial ‘market for corporate control’ that would boost efficiency across the system.

Mastering Blockchain, Second Edition
by Imran Bashir
Published 28 Mar 2018

This standard allows faster settlement of transactions and direct peer-to-peer transaction routing. It aims to address regulatory, security, and privacy requirements in blockchain technologies. OS1 also provides a framework for smart contract development and allows the participant to meet AML and KYC requirements easily. Smart contract standardization efforts have also started with a seminal paper authored by Lee and others, which formally defines the smart contract templates and presents a vision for future research and necessities in smart contract related research and development. This paper is available at https://arxiv.org/abs/1608.00771v2. Moreover, some discussion on this topic has been carried out in Chapter 18, Scalability and Other Challenges and Chapter 9, Smart Contracts.

Lifespan: Why We Age—and Why We Don't Have To
by David A. Sinclair and Matthew D. Laplante
Published 9 Sep 2019

If you are skeptical that a yeast cell can tell us anything about cancer, Alzheimer’s disease, rare diseases, or aging, consider that there have been five Nobel Prizes in Physiology or Medicine awarded for genetic studies in yeast, including the 2009 prize for discovering how cells counteract telomere shortening, one of the hallmarks of aging.5 The work Mortimer and Johnston did—and, in particular, a seminal paper in 1959 that demonstrated that mother and daughter yeast cells can have vastly different lifespans—would set the stage for a world-shattering change in the way we view the limits of life. And by the time of Mortimer’s death in 2007, there were some 10,000 researchers studying yeast around the globe.

What We Cannot Know: Explorations at the Edge of Knowledge
by Marcus Du Sautoy
Published 18 May 2016

Indeed, what distinguishes a Stradivarius from a factory-made cello is partly the perfection of shape which leads to a more beautiful sound. One of the intriguing problems that challenged mathematicians for some time was whether you could deduce the shape of the box from the frequencies of the waves that vibrate inside it. In a seminal paper, Mark Kac posed the question: ‘Can you hear the shape of a drum?’ For example, only a square has the particular set of frequencies that are produced by this shape. But in 1992 mathematicians Carolyn Gordon, David Webb and Scott Wolpert constructed two strange shapes whose resonant frequencies were identical although the underlying shapes differed.

Spies, Lies, and Algorithms: The History and Future of American Intelligence
by Amy B. Zegart
Published 6 Nov 2021

One study found that 95 out of 100 doctors estimated the probability to be between 70 and 80 percent.109 The correct answer is 7.8 percent. That’s not a typo. In this hypothetical, the probability that a woman who has tested positive for breast cancer in a routine mammography actually has breast cancer is less than 8 percent. In their seminal paper, Gigerenzer and Hoffrage find that the way in which probability problems like the breast cancer diagnosis are usually presented makes it harder to understand base rates and get the right answer. Standard probability looks like this: It’s pretty basic math but still not intuitive to most people.
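Zegart's 7.8 percent follows from Bayes' rule. A short sketch using the standard figures from the Gigerenzer and Hoffrage literature (prevalence 1 percent, sensitivity 80 percent, false-positive rate 9.6 percent; these numbers are assumed here, not quoted from the book):

```python
# Standard mammography figures from the base-rate literature (assumed):
prevalence = 0.01        # P(cancer)
sensitivity = 0.80       # P(positive test | cancer)
false_positive = 0.096   # P(positive test | no cancer)

# Bayes' rule:
#   P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
p_positive = (sensitivity * prevalence
              + false_positive * (1 - prevalence))
p_cancer_given_positive = sensitivity * prevalence / p_positive

print(round(100 * p_cancer_given_positive, 1))  # 7.8
```

Gigerenzer and Hoffrage's point is that the same result is far more intuitive in natural frequencies: of 1,000 women, 10 have cancer and 8 of them test positive, while about 95 of the 990 healthy women also test positive, so only 8 of roughly 103 positives actually have cancer.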

pages: 547 words: 172,226

Why Nations Fail: The Origins of Power, Prosperity, and Poverty
by Daron Acemoglu and James Robinson
Published 20 Mar 2012

The Charter of Maryland, the Fundamental Constitutions of Carolina, and other colonial constitutions have been put on the Internet by Yale University’s Avalon Project, at avalon.law.yale.edu/17th_century. Bakewell (2009), chap. 14, discusses the independence of Mexico and the constitution. See Stevens (1991) and Knight (2011) on postindependence political instability and presidents. Coatsworth (1978) is the seminal paper on the evidence on economic decline in Mexico after independence. Haber (2010) presents the comparison of the development of banking in Mexico and the United States. Sokoloff (1988) and Sokoloff and Khan (1990) provide evidence on the social background of innovators in the United States who filed patents.

pages: 551 words: 174,280

The Beginning of Infinity: Explanations That Transform the World
by David Deutsch
Published 30 Jun 2011

Nevertheless, rather like empiricism, which it resembles, the idea of the Turing test has played a valuable role. It has provided a focus for explaining the significance of universality and for criticizing the ancient, anthropocentric assumptions that would rule out the possibility of AI. Turing himself systematically refuted all the classic objections in that seminal paper (and some absurd ones for good measure). But his test is rooted in the empiricist mistake of seeking a purely behavioural criterion: it requires the judge to come to a conclusion without any explanation of how the candidate AI is supposed to work. But, in reality, judging whether something is a genuine AI will always depend on explanations of how it works.

pages: 692 words: 167,950

The Ripple Effect: The Fate of Fresh Water in the Twenty-First Century
by Alex Prud'Homme
Published 6 Jun 2011

This was also the time of significant groundwater depletion, such as the draining of “fossil water” from the Ogallala Aquifer by high-capacity pumps and center-pivot irrigation systems, while point-source pollution and the environmental impact of large dams emerged as national issues. In 1971, Wolman published a seminal paper in Science entitled “The Nation’s Rivers,” in which he pointed to how little we knew about the degradation and improvement of rivers and underscored the need for long-term data collection on which to build informed decisions. “Our science has followed [Wolman’s] pattern and needs to continue” to do so, Hirsch said, to applause from the crowd.

pages: 741 words: 164,057

Editing Humanity: The CRISPR Revolution and the New Era of Genome Editing
by Kevin Davies
Published 5 Oct 2020

But a priori, Jínek reckoned there was “nothing fundamentally different or any impediment to getting it to work in mammalian cells.” On October 3, Doudna’s inbox chimed with a message that supported Jínek’s belief. The sender was Jin-Soo Kim, a leading molecular biologist in South Korea. His lab had been pursuing CRISPR editing since Doudna and Charpentier’s “seminal paper” and was preparing to submit a new report on “Genome editing in mammalian cells.” Kim generously asked if Doudna (and Charpentier) would be interested in publishing together. “I do not wish to scoop you because your Science paper prompted us to start this project,” Kim wrote. But he wasn’t interested in getting scooped, either.22 Six weeks later, Church likewise emailed Doudna and Charpentier “a quick note to say how inspiring and helpful” he had found their CRISPR paper.

pages: 626 words: 167,836

The Technology Trap: Capital, Labor, and Power in the Age of Automation
by Carl Benedikt Frey
Published 17 Jun 2019

As we all know, in a six-game match played in 1996, the chess master Garry Kasparov prevailed against IBM’s Deep Blue by three wins but lost in a historic rematch a year later. Relative to chess, the complexity of Go is striking. Go is played on a board that is nineteen by nineteen squares, whereas chess uses a board that is eight by eight squares. As the mathematician Claude Shannon demonstrated in 1950, in his seminal paper on how to program a machine to play chess, a lower-bound estimate of the number of possible moves in chess is greater than the number of atoms in the observable universe, and the number of possible moves in Go is more than twice that number.2 Indeed, even if every atom in the universe was its own universe and had inside it the number of atoms in our universe, there would still be fewer atoms than the number of possible legal moves in Go.
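Frey's comparison can be checked with back-of-the-envelope magnitudes. The estimates below are the standard ones (roughly 10^120 chess variations from Shannon's paper, about 10^80 atoms in the observable universe, and on the order of 10^170 legal Go positions); they are assumptions for illustration, not figures quoted from the book:

```python
# Back-of-the-envelope magnitudes (standard estimates, assumed here):
chess_games = 10**120    # Shannon's lower bound on chess variations
atoms = 10**80           # rough count of atoms in the observable universe
go_positions = 10**170   # order of legal positions on a 19x19 Go board

print(chess_games > atoms)      # True: chess already dwarfs the atom count
print(go_positions > atoms**2)  # True: Go exceeds even a "universe of
                                # universes" worth of atoms (10**160)
```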

pages: 625 words: 167,349

The Alignment Problem: Machine Learning and Human Values
by Brian Christian
Published 5 Oct 2020

Yarin Gal, “Modern Deep Learning Through Bayesian Eyes” (lecture), Microsoft Research, December 11, 2015, https://www.microsoft.com/en-us/research/video/modern-deep-learning-through-bayesian-eyes/. 15. Zoubin Ghahramani, “Probabilistic Machine Learning: From Theory to Industrial Impact” (lecture), October 5, 2018, PROBPROG 2018: The International Conference on Probabilistic Programming, https://youtu.be/crvNIGyqGSU. 16. For seminal papers relating to Bayesian neural networks, see Denker et al., “Large Automatic Learning, Rule Extraction, and Generalization”; Denker and LeCun, “Transforming Neural-Net Output Levels to Probability Distributions”; MacKay, “A Practical Bayesian Framework for Backpropagation Networks”; Hinton and Van Camp, “Keeping Neural Networks Simple by Minimizing the Description Length of the Weights”; Neal, “Bayesian Learning for Neural Networks”; and Barber and Bishop, “Ensemble Learning in Bayesian Neural Networks.”

pages: 700 words: 160,604

The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race
by Walter Isaacson
Published 9 Mar 2021

We are not in the Brave New World yet, but we are well along the road.”5 Even though human gene–editing technologies had not yet been devised, the battle lines had thus been defined. It became the mission of many of the scientists to find a middle ground rather than let the issue become politically polarized. Asilomar In the summer of 1972, Paul Berg, who had just published his seminal paper on how to make recombinant DNA, went to the ancient clifftop village of Erice on the coast of Sicily to lead a seminar on the new biotechnologies. The graduate students who attended were shocked by what he described, and they peppered him with questions about the ethical dangers of genetic engineering, especially the modification of humans.

pages: 667 words: 186,968

The Great Influenza: The Story of the Deadliest Pandemic in History
by John M. Barry
Published 9 Feb 2004

In England in 1933, during a minor outbreak of human influenza, Andrewes, Patrick Laidlaw, and Wilson Smith, largely following Shope’s methodology, filtered fresh human material and transmitted influenza to ferrets. They found the human pathogen. It was a filter-passing organism, a virus, like Shope’s swine influenza. Had Lewis lived, he would have coauthored the papers with Shope, and even added breadth and experience to them. He would have helped produce another of the seminal papers in virology. His reputation would have been secure. Shope was not perfect. For all his later accomplishments in influenza and in other areas, some of his ideas, including some of those pertaining to influenza, were mistaken. Lewis, if energized and once again painstaking, might have prevented those errors.

The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley
by Leslie Berlin
Published 9 Jun 2005

Balls tunneling through the wall: Professor Stig Lundqvist of the Royal Academy of Sciences used this analogy in his speech presenting the 1973 Nobel Prize to Leo Esaki, Ivar Giaever, and Brian David Josephson. Boss showed no interest, powerful demotivator: Noyce, “Innovation: The Fruit of Success,” Technology Review, Feb. 1978: 24–27. Esaki’s seminal paper: Leo Esaki, “New Phenomenon in Narrow Germanium P-N Junctions,” Physical Review, 1958, 109: 603. Esaki conducted his research in 1957, at roughly the same time Noyce noted his ideas. On the response to this paper: Leo Esaki, “The Global Reach of Japanese Science,” http://www.jspsusa.org/FORUM1996/esaki.html, accessed 1 Nov. 2004.

pages: 733 words: 179,391

Adaptive Markets: Financial Evolution at the Speed of Thought
by Andrew W. Lo
Published 3 Apr 2017

Even after all this discussion, only a few students raise their hands. When I ask those who didn’t raise their hands why not, they sheepishly admit that they simply don’t feel comfortable doing so. This is exactly the point of the exercise, which is now called the Ellsberg Paradox, after the example in Ellsberg’s seminal paper. Thinking isn’t the same as feeling. You can think the two games have equal odds, but you just don’t feel the same about them. People have no problem taking risks in their day-to-day activities, but when there’s any uncertainty about those risks, they immediately become more cautious and conservative.

pages: 687 words: 189,243

A Culture of Growth: The Origins of the Modern Economy
by Joel Mokyr
Published 8 Jan 2016

How do they trade off the number of children against the resources they spend on the education of each one? Investment in human capital is still widely regarded to be of central importance to all economic development. Education and economic development are both regarded as desirable phenomena. What could be a more reassuring idea than that they were closely associated? The seminal paper on the matter (Nelson and Phelps, 1966) was published almost a half century ago. It postulated that both technological advance and technological catch-up depend strongly on the level of human capital.7 In his presidential address, Richard A. Easterlin (1981) posed the basic question: Why isn’t the whole world developed?

pages: 757 words: 193,541

The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services, Volume 2
by Thomas A. Limoncelli , Strata R. Chalup and Christina J. Hogan
Published 27 Aug 2014

Appendix B will make the case that cloud or distributed computing was the inevitable result of the economics of hardware. DevOps is the inevitable result of needing to do efficient operations in such an environment. If hardware and software are sufficiently fault tolerant, the remaining problems are human. The seminal paper “Why Do Internet Services Fail, and What Can Be Done about It?” by Oppenheimer et al. (2003) raised awareness that if web services are to be a success in the future, operational aspects must improve: We find that (1) operator error is the largest single cause of failures in two of the three services, (2) operator errors often take a long time to repair, (3) configuration errors are the largest category of operator errors, (4) failures in custom-written front-end software are significant, and (5) more extensive online testing and more thoroughly exposing and detecting component failures would reduce failure rates in at least one service.

pages: 586 words: 186,548

Architects of Intelligence
by Martin Ford
Published 16 Nov 2018

We released the entire 15 million images to the world and started to run international competitions for researchers to work on the ImageNet problems: not on the tiny small-scale problems but on the problems that mattered to humans and applications. Fast-forward to 2012, and I think we see the turning point in object recognition for a lot of people. The winner of the 2012 ImageNet competition created a convergence of ImageNet, GPU computing power, and convolutional neural networks as an algorithm. Geoffrey Hinton wrote a seminal paper that, for me, was Phase One in achieving the holy grail of object recognition. MARTIN FORD: Did you continue this project? FEI-FEI LI: For the next two years, I worked on taking object recognition a step further. If we again look at human development, babies start by babbling, a few words, and then they start making sentences.

pages: 619 words: 177,548

Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity
by Daron Acemoglu and Simon Johnson
Published 15 May 2023

See, for example, Bostrom (2017), Christian (2020), Stuart Russell (2019), and Ford (2021) on advances in artificial intelligence, and Kurzweil (2005) and Diamandis and Kotler (2014) on the economic abundance that this would create. Our discussion of routine and nonroutine tasks builds on Autor, Levy, and Murnane’s (2003) seminal paper and Autor’s (2014) discussion of limits to automation. Our interpretation that current AI still mostly focuses on routine tasks is based on the evidence in Acemoglu, Autor, Hazell, and Restrepo (2022). Frey and Osborne’s famous (2013) study also supports the notion that AI is primarily about automation; they estimate that close to 50 percent of US jobs can be automated by AI within the next several decades.

pages: 789 words: 207,744

The Patterning Instinct: A Cultural History of Humanity's Search for Meaning
by Jeremy Lent
Published 22 May 2017

The Turning Point: Science, Society, and the Rising Culture. New York: Bantam Books, 1988. A deep exploration of the modern mechanistic worldview and the possibilities for alternative ways of thinking. White, Lynn. “The Historical Roots of Our Ecological Crisis.” Science 155, no. 3767 (1967): 1203–07. A seminal paper that catalyzed a greater understanding of the ecological implications of traditional Christian cosmology. Chapter 16. Great Rats: The Story of Power and Exploitation Ponting, Clive. A New Green History of the World: The Environment and the Collapse of Great Civilizations. New York: Penguin, 2007.

pages: 562 words: 201,502

Elon Musk
by Walter Isaacson
Published 11 Sep 2023

That would transform not only our economy, he said, but the way we live. 65 Neuralink 2017–2020 A monkey playing Pong using only his brainwaves Human-computer interfaces Some of the most important technology leaps in the digital age involved advances in the way that humans and machines communicate with each other, known as “human-computer interfaces.” The psychologist and engineer J. C. R. Licklider, who worked on air-defense systems that tracked planes on a monitor, wrote a seminal paper in 1960 titled “Man-Computer Symbiosis,” showing how video displays could “get a computer and a person thinking together.” He added, “The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly.” MIT hackers used these video displays to create a game called Spacewar, which helped spawn commercial games that, in order to be easy enough for a stoned college student to play, had interfaces that were so intuitive they required almost no instructions.

pages: 998 words: 211,235

A Beautiful Mind
by Sylvia Nasar
Published 11 Jun 1998

Nobody who knew any facts was willing to go on the record or even talk to me. Martha Legg, Nash’s sister, finally broke the silence about the nature of the illness that had wrecked his life. Lloyd Shapley, another pioneer of game theory, described Nash as a graduate student in the late 1940s, when he wrote his seminal papers on game theory: “He was immature, he was obnoxious, he was a brat. What redeemed him was a keen, logical, beautiful mind.” So now you know to whom I owe the title of the biography. Because Nash’s story is so familiar, I’d like to share some of the less familiar parts, including how the book came to be and some of the things that happened after the book and movie broke off.

pages: 695 words: 219,110

The Fabric of the Cosmos
by Brian Greene
Published 1 Jan 2003

Decoherence is a widespread phenomenon that forms a bridge between the quantum physics of the small and the classical physics of the not-so-small by suppressing quantum interference—that is, by diminishing sharply the core difference between quantum and classical probabilities. The importance of decoherence was realized way back in the early days of quantum theory, but its modern incarnation dates from a seminal paper by the German physicist Dieter Zeh in 1970, and has since been developed by many researchers, including Erich Joos, also from Germany, and Wojciech Zurek, of the Los Alamos National Laboratory in New Mexico. Here’s the idea. When Schrödinger’s equation is applied in a simple situation such as single, isolated photons passing through a screen with two slits, it gives rise to the famous interference pattern.

pages: 843 words: 223,858

The Rise of the Network Society
by Manuel Castells
Published 31 Aug 1996

In the words of Croteau and Haynes, “there are three basic ways in which media audiences have been seen as active: through individual interpretation of media products, through collective interpretation of media, and through collective political action.”36 And they go on to provide a wealth of data and illustrations to support their claim of the relative autonomy of the audience vis-à-vis messages received from the media. Indeed, this is a well-established tradition in media studies. Thus, Umberto Eco provided an insightful perspective to interpret media effects in his 1977 seminal paper titled “Does the Audience have Bad Effects on Television?” As Eco wrote: There exist, depending on sociocultural circumstances, a variety of codes, or rather of rules of competence and interpretation. The message has a signifying form that can be filled with different meanings… So the suspicion grew that the sender organized the televisual image on the basis of his own codes, which coincided with those of the dominant ideology, while the addressees filled it with “aberrant” meanings according to their particular cultural codes.37 The consequence of this analysis is that: “One thing we do know is that there doesn’t exist a Mass Culture in the sense imagined by the apocalyptic critics of mass communications because this model competes with others (constituted by historical vestiges, class culture, aspects of high culture transmitted through education etc.).”38 While historians and empirical researchers of the media would find this statement pure common sense, in fact, taking it seriously, as I do, it decisively undermines a fundamental aspect of critical social theory from Marcuse to Habermas.

Engineering Security
by Peter Gutmann

Another system that’s been proposed is to have users memorise specific points in an image, which are known to experimental psychologists working in the field of visual attention as salient points and which, as with face selections, are fairly predictable and amenable to computer processing (this is a vast and complex field with far too much material to cover here, but one of the seminal papers by neuroinformatics professor Christoph Koch, going back more than ten years, provides a good overview of initial work in the area [20]). There’s extensive ongoing work towards automatic classification of these items for purposes such as feature extraction in image-based search, as well as less academic pursuits like automatic target identification and tracking in the military, which is a much harder task since it has to be done in real time and the salient points are trying very hard to be as non-salient as possible.

In addition, since many of them concern flaws in deployed products it could lead to a situation where if I give you read access to the identities of contributors, their employers might apply execute access to them. Because of this I’ve only given references for openly published sources or when someone is explicitly quoted in the text. PKI Certificates The pre-history of PKI goes back to Diffie and Hellman’s seminal paper on public-key cryptography, which proposed a key directory called a Public File that users can consult to find other users’ public keys [9]. The Public File protects its communications by signing them, and would today be called a trusted directory [10]. A signature on a public key was thus a one-time assertion by the public file that “this key is valid right now for this person”.
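The Public File's one-time assertion, "this key is valid right now for this person", amounts to the directory signing a binding of user, key, and time. A toy sketch of that assertion format follows; it uses an HMAC as a stand-in for the directory's real public-key signature (so it is illustrative only, and all names and keys here are invented):

```python
import hashlib
import hmac

# Stand-in for the directory's signing key. A real Public File would use a
# public-key signature so that anyone could verify without holding a secret.
DIRECTORY_SECRET = b"toy-directory-signing-key"

def sign_key_assertion(user, public_key, timestamp):
    """Produce the directory's one-time assertion binding user, key, and time:
    'this key is valid right now for this person'."""
    message = f"{user}|{public_key}|{timestamp}".encode()
    tag = hmac.new(DIRECTORY_SECRET, message, hashlib.sha256).hexdigest()
    return message, tag

def verify_key_assertion(message, tag):
    """Check that the assertion really came from the directory and is untampered."""
    expected = hmac.new(DIRECTORY_SECRET, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

Including the timestamp in the signed message is what makes the assertion "right now" rather than indefinitely valid, which is the property later PKI designs had to recover with expiry dates and revocation.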

pages: 761 words: 231,902

The Singularity Is Near: When Humans Transcend Biology
by Ray Kurzweil
Published 14 Jul 2005

Huxley, "A Quantitative Description of Membrane Current and Its Application to Conduction and Excitation in Nerve," Journal of Physiology 117 (1952): 500–544. 23. W. S. McCulloch and W. Pitts, "A Logical Calculus of the Ideas Immanent in Nervous Activity," Bulletin of Mathematical Biophysics 5 (1943): 115-33. This seminal paper is a difficult one to understand. For a clear introduction and explanation, see "A Computer Model of the Neuron," the Mind Project, Illinois State University, http://www.mind.ilstu.edu/curriculum/perception/mpneuron1.html. 24. See note 172 in chapter 5 for an algorithmic description of neural nets. 25.
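The McCulloch–Pitts model that note 23 calls difficult is easier to grasp in code than in the original notation: a unit with binary inputs fires exactly when the weighted sum of those inputs reaches a fixed threshold. A minimal sketch (the weights and gates below are our own illustrations, not taken from the paper):

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fire (1) iff the weighted sum of binary
    inputs reaches the threshold, otherwise stay silent (0)."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Logic gates fall out of the threshold choice alone:
def and_gate(a, b):
    return mp_neuron([a, b], [1, 1], threshold=2)  # both inputs must be on

def or_gate(a, b):
    return mp_neuron([a, b], [1, 1], threshold=1)  # one active input suffices
```

McCulloch and Pitts's point was that networks of such threshold units can realize any logical expression, which is why the paper is treated as a founding document of neural-network research.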

pages: 1,088 words: 228,743

Expected Returns: An Investor's Guide to Harvesting Market Rewards
by Antti Ilmanen
Published 4 Apr 2011

For understanding the determinants of expected returns, I find empirically oriented models, such as that of Campbell–Sunderam–Viceira (2010), more relevant than either pure term structure models or macro-finance models (see Box 9.1). Box 9.1. Macro-finance models—a recent academic focus Unlike models that rely purely on term structure data, macro-finance models link bond yield fluctuations to macro factors in the context of a “no-arbitrage” term structure model. The seminal paper by Ang–Piazzesi (2003) uses the basic insight of the Taylor rule that a reasonable central bank policy makes short-term rates functions of real activity and inflation [2]. While the growing macro-finance literature offers promise, its practical relevance remains limited to date, perhaps due to challenges with model specification and estimation errors.
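For reference, the Taylor-rule insight that Ang–Piazzesi build on can be written out in its standard textbook form (Taylor's original 1993 coefficients; shown here as general background, not as the paper's own specification):

```latex
i_t = r^* + \pi_t + 0.5\,(\pi_t - \pi^*) + 0.5\,y_t
```

where $i_t$ is the short-term policy rate, $r^*$ the equilibrium real rate, $\pi_t$ realized inflation, $\pi^*$ the inflation target, and $y_t$ the output gap. Macro-finance models embed short-rate rules of this kind inside a no-arbitrage term structure, so that bond yields inherit the macro factors driving the policy rate.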

pages: 809 words: 237,921

The Narrow Corridor: States, Societies, and the Fate of Liberty
by Daron Acemoglu and James A. Robinson
Published 23 Sep 2019

The description of bodies under the bridge is from Cunliffe-Jones (2010, 23). For Lagos disappearing under rubbish, see http://news.bbc.co.uk/2/hi/africa/281895.stm. The quotes from Philip Pettit come from Pettit (1999, 4–5), and see also the development of his ideas in Pettit (2014). The seminal paper on the violence of hunter-gatherer societies is Ember (1978); we refer here to the work of Keeley (1996) and Pinker (2011); see specifically the data in Pinker’s Figure 2-3 (53). On the homicide rates of the Gebusi, see Knauft (1987). All quotes from Hobbes are directly from Hobbes (1996, Chapters 13, 17–19: “continual feare,” 89; “from hence it comes to passe,” 87; “In such condition,” 89; “men live without” and “to submit their Wills,” 120).

pages: 848 words: 227,015

On the Edge: The Art of Risking Everything
by Nate Silver
Published 12 Aug 2024

Electrocardiograph refers to the first practical version of the technology. The hamburger is a notoriously disputed example; 1904 is the date when the hamburger became famous at the St. Louis World’s Fair, but other sources date its origins to 1885 or 1900. mRNA vaccines are another ambiguous case. But the seminal paper credited to Katalin Karikó and Drew Weissman in their award of the Nobel Prize dates to 2005. Cloud computing refers to commercial application with Amazon Web Services. Human genome project refers to the completion of the project in 2003. And Tesla’s 2008 date refers to the first commercial sales.

Data Mining: Concepts and Techniques: Concepts and Techniques
by Jiawei Han , Micheline Kamber and Jian Pei
Published 21 Jun 2011

This complete set of answers to the exercises in the book is available only to instructors from the publisher's web site. ■ Course syllabi and lecture plans. These are given for undergraduate and graduate versions of introductory and advanced courses on data mining, which use the text and slides. ■ Supplemental reading lists with hyperlinks. Seminal papers for supplemental reading are organized per chapter. ■ Links to data mining data sets and software. We provide a set of links to data mining data sets and sites that contain interesting data mining software packages, such as IlliMine from the University of Illinois at Urbana-Champaign http://illimine.cs.uiuc.edu

pages: 1,205 words: 308,891

Bourgeois Dignity: Why Economics Can't Explain the Modern World
by Deirdre N. McCloskey
Published 15 Nov 2011

In 2015 it was claimed by students of the matter that the Plague was spread from Central Asia not by flea-bearing rats but by, of all things, (flea-bearing) gerbils. 16. Alfani 2013, for example on p. 427. 17. Ross Emmett emphasizes Malthus’s notions here, Emmett n.d., p. 3. 18. Sahlins 1972 (2004). 19. Gaus 2013, p. 13. 20. Olson 1993 is the seminal paper. Thus Scott 2009, and for a West African example, from my beloved colleague the late James Searing, Searing 2002. 21. Mayshar, Moav, and Neeman 2011. 22. Weatherford 2004; Perdue 2005, Hellie 2003, McNeill 1964, Lattimore 1940. Chapter 3 1. Gerschenkron 1971. 2. Nordhaus 2004. 3.

pages: 1,373 words: 300,577

The Quest: Energy, Security, and the Remaking of the Modern World
by Daniel Yergin
Published 14 May 2011

Geological Survey, where he was in a permanent battle with some of his colleagues. “He was the most difficult person I ever worked with,” said Peter Rose, his boss at the USGS. Yet Hubbert also became recognized as one of the leading figures in the field and made a variety of major contributions, including a seminal paper in 1957, “The Mechanics of Hydraulic Fracturing.” One of his fundamental objectives was to move geology from what he called its “natural-history phase” to “physical science phase,” firmly based in physics, chemistry, and in particular, in rigorous mathematics. “King Hubbert, mathematician that he is,” said the chief geophysicist of one of the oil companies, “based his look ahead on facts, logically and analytically analyzed.”

pages: 1,073 words: 314,528

Strategy: A History
by Lawrence Freedman
Published 31 Oct 2013

Three maxims flowed from his analysis: “Who rules East Europe controls the heartland; Who rules the heartland commands the World-Island; Who rules the World-Island commands the World.”32 The importance of distance, which Mackinder saw being transformed by railways and motorized transport, was eventually affected even more by the ability of aircraft to fly over both land and sea. Surprisingly, Mackinder paid little attention to the possibilities of air power though it was only a few weeks before he gave his seminal paper in 1904 that the Wright brothers made their historic first flight. There was much that Mackinder shared with Mahan. International relations were understood in terms of relentless competition among naturally expansive great powers. What Mackinder introduced was a way of thinking about the geographical dimension that showed how the land and sea could be understood as part of the same world system, and as a source of continuity even as political and technological change affected its relevance.

pages: 1,535 words: 337,071

Networks, Crowds, and Markets: Reasoning About a Highly Connected World
by David Easley and Jon Kleinberg
Published 15 Nov 2010

[Figure 13.3 depicts a citation network whose nodes include Kossinets-Watts 2006, Burt 2004, Burt 2000, Coleman 1988, Granovetter 1985, Feld 1981, Granovetter 1973, Travers-Milgram 1969, Davis 1963, Milgram 1967, Cartwright-Harary 1956, Lazarsfeld-Merton 1954, and Rapoport 1953.] Figure 13.3: The network of citations among a set of research papers forms a directed graph that, like the Web, is a kind of information network. In contrast to the Web, however, the passage of time is much more evident in citation networks, since their links tend to point strictly backward in time. …key ideas in the first part of this book. (At the bottom of this figure are seminal papers on — from left to right — triadic closure, the small-world phenomenon, structural balance, and homophily.) We can see how work in this field — as in any academic discipline — builds on earlier work, with the dependence represented by a citation structure. We can also see how this citation structure naturally forms a directed graph, with nodes representing books and articles, and directed edges representing citations from one work to another.

pages: 1,758 words: 342,766

Code Complete (Developer Best Practices)
by Steve McConnell
Published 8 Jun 2004

As Java guru Joshua Bloch says, "Design and document for inheritance, or prohibit it." If a class isn't designed to be inherited from, make its members non-virtual in C++, final in Java, or non-overridable in Microsoft Visual Basic so that you can't inherit from it. Adhere to the Liskov Substitution Principle (LSP). In one of object-oriented programming's seminal papers, Barbara Liskov argued that you shouldn't inherit from a base class unless the derived class truly "is a" more specific version of the base class (Liskov 1988). Andy Hunt and Dave Thomas summarize LSP like this: "Subclasses must be usable through the base class interface without the need for the user to know the difference" (Hunt and Thomas 2000).
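The substitution requirement can be made concrete with the classic rectangle/square sketch (our own illustrative example, in Python rather than the C++/Java/Visual Basic the passage names): code written against the base class misbehaves when handed the subclass.

```python
class Rectangle:
    def __init__(self, width, height):
        self.width, self.height = width, height

    def set_width(self, width):
        self.width = width

    def area(self):
        return self.width * self.height

class Square(Rectangle):
    """Mathematically a square 'is a' rectangle, but behaviorally it is not:
    setting the width silently changes the height too, violating LSP."""
    def set_width(self, width):
        self.width = self.height = width

def stretch(rect):
    # Written against the Rectangle interface: doubling the width
    # should leave the height alone and double the area.
    rect.set_width(rect.width * 2)
    return rect.area()
```

Here stretch(Rectangle(2, 3)) returns 12, as a caller expects, but stretch(Square(3, 3)) returns 36 rather than the 18 a Rectangle user would predict. By Liskov's criterion, Square should therefore not inherit from Rectangle, or set_width should not be overridable, which is exactly Bloch's "design and document for inheritance, or prohibit it."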

pages: 1,213 words: 376,284

Empire of Things: How We Became a World of Consumers, From the Fifteenth Century to the Twenty-First
by Frank Trentmann
Published 1 Dec 2015

In the last generation, a vast literature has challenged the conventional economic yardstick of gross domestic product (GDP) and outlined a number of alternative measures of well-being. We cannot do justice to this extensive scholarship,39 only hint at implications for the uses of time. The initial focus of research was on the tenuous link between income and happiness in affluent societies. In a seminal paper in 1974, Richard Easterlin noted that, while rich Americans were happier than their poor neighbours, Americans as a whole were no happier in 1970 than in 1946, in spite of being 60 per cent richer (in real income). Once basic human needs were met – around $15,000 a year in 1974 money – additional income ceased to buy more happiness.40 Other scholars since have been more optimistic and point out that richer countries do tend to be happier than poorer ones.

pages: 1,293 words: 357,735

The Coming Plague: Newly Emerging Diseases in a World Out of Balance
by Laurie Garrett
Published 31 Oct 1994

.; Menlo Park, CA: Benjamin/Cummings Publishing Co., 1987). 3 For an excellent review of McClintock’s work and its subsequent impact on molecular biology, see N. V. Federoff, “Maize Transposable Elements.” Chapter 14 in D. E. Berg and M. M. Howe, eds., Mobile DNA (Washington, D.C.: American Society for Microbiology, 1989). One of McClintock’s seminal papers is B. McClintock, “The Origin and Behavior of Mutable Loci in Maize,” Proceedings of the National Academy of Sciences 36 (1950): 344–55. 4 James Watson has written four editions of his grand guide to molecular biology, each of which, since the first in 1965, has been considerably larger than its predecessor, reflecting the explosion of scientific discovery.

pages: 1,437 words: 384,709

The Making of the Atomic Bomb
by Richard Rhodes
Published 17 Sep 2012

Hahn mailed the paper and then felt the whole thing to be so improbable “that I wished I could get the document back out of the mail box”; or Paul Rosbaud came around to the KWI the same evening to pick it up.974 Both stories survive Hahn’s later recollection. Since Rosbaud knew the paper’s importance and dated its receipt December 22, 1938, he probably picked it up. But Hahn also visited the mailbox that night, to send a carbon copy of the seminal paper to Lise Meitner in Stockholm. His misgivings at publishing without her—or some dawning glimmer of the fateful consequences that might follow his discovery—may have accounted for his remembered apprehension. * * * The Swedish village of Kungälv—the name means King’s River—is located some ten miles above the dominant western harbor city of Goteborg and six miles inland from the Kattegat coast.975 The river, now called North River, descends from Lake Vanern, the largest freshwater lake in Western Europe; at Kungälv it has cut a sheer granite southward-facing bluff, the precipice of Fontin, 335 feet high.

God Created the Integers: The Mathematical Breakthroughs That Changed History
by Stephen Hawking
Published 28 Mar 2007

By defining the integral in this manner, Lebesgue had reduced the theory of the integral to the theory of measure. If we consider the problem from a geometrical perspective, we see that Lebesgue had reduced the problem of determining the area of a two-dimensional object to the problem of determining the measure of a set of points embedded in the one-dimensional real line. In Lebesgue’s seminal paper, a large portion of which is presented here, he first addresses the topic of measuring a set of points embedded in the one-dimensional real line and then progresses to his theory of the integral. Lebesgue begins by enunciating properties a measure must have to satisfy our intuitions. • The measure of the interval [a, b] (i.e., the set of real numbers x such that a ≤ x ≤ b) is simply the value b − a