REST API Design Rulebook
by
Mark Masse
Published 19 Oct 2011
Hypertext: Text-based documents containing embedded links to related documents, which creates a navigable mesh of information.
HyperText Markup Language (HTML): Created by Tim Berners-Lee to represent the state of a web resource’s information and relationships.
HyperText Transfer Protocol (HTTP): Originally developed by Tim Berners-Lee, a message-based protocol that computers use to communicate over the Internet.
Hypertext Transfer Protocol version 1.1 (HTTP/1.1): Roy Fielding, Tim Berners-Lee, and others contributed to the standardization of this most recent version of the communication protocol.
JavaScript: A powerful scripting language that is commonly used by web developers.
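Since the glossary describes HTTP as message based, a minimal request/response exchange makes the definition concrete. This is a sketch, not taken from the book; the host name is a placeholder, and the Host header shown is the one HTTP/1.1 requires:

    GET /index.html HTTP/1.1
    Host: www.example.org

    HTTP/1.1 200 OK
    Content-Type: text/html

    <html>...</html>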
…
Acknowledgments. I could not have written this book without the help of the folks mentioned here. Tim Berners-Lee: As a member of the World Wide Web generation, I have spent my entire career as a software engineer working in, and adding to, the Web. I am eternally grateful to Tim Berners-Lee for his “WorldWideWeb” project. A triumph; huge success. Roy Fielding: Roy Fielding’s pioneering Ph.D. dissertation was the primary inspiration for this book. If you want to learn all about REST from its original author, I highly recommend that you read Fielding’s dissertation.[2] Leonard Richardson: In an effort to distinguish between RESTful and other Web API designs, Leonard Richardson proposed[3] what has come to be known as the “Richardson Maturity Model.”
…
Web Standards. Fielding worked alongside Tim Berners-Lee and others to increase the Web’s scalability. To standardize their designs, they wrote a specification for the new version of the Hypertext Transfer Protocol, HTTP/1.1.[12] They also formalized the syntax of Uniform Resource Identifiers (URI) in RFC 3986.[13] Adoption of these standards quickly spread across the Web and paved the way for its continued growth. * * * [12] Fielding, Roy T., Tim Berners-Lee, et al. HTTP/1.1, RFC 2616, RFC Editor, 1999 (http://www.rfc-editor.org/rfc/rfc2616.txt)
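For reference, RFC 3986 itself illustrates the generic URI syntax it formalizes with the following worked example (reproduced from the RFC, not from this book):

    foo://example.com:8042/over/there?name=ferret#nose
    \_/   \______________/\_________/ \_________/ \__/
     |           |            |            |        |
  scheme     authority       path        query   fragment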
The Internet Is Not the Answer
by
Andrew Keen
Published 5 Jan 2015
And it crowned Marc Andreessen, who was featured sitting shoeless on the cover of Time magazine in February 1996, as the young disruptive hero of the Internet revolution. But the Netscape Moment marked the death of something, too. Discussing Tim Berners-Lee’s decision to give away his technology for free, Jim Clark—who disliked venture capitalists, believing them to be vultures that “make hyenas look good”22—suggests that “any entrepreneur might wonder about his (Berners-Lee’s) sanity while admiring his soul.”23 What the Internet lost in the early nineties, with the passing of its mantle from researchers like Tim Berners-Lee to businessmen like Jim Clark, can be simply summarized. As Wall Street moved west, the Internet lost a sense of common purpose, a general decency, perhaps even its soul.
…
Marshall McLuhan put it slightly differently, but with even more relevance to our networked age. Riffing off Churchill’s 1944 speech, the Canadian media visionary said that “we shape our tools and thereafter our tools shape us.”24 McLuhan died in 1980, nine years before a young English physicist named Tim Berners-Lee invented the World Wide Web. But McLuhan correctly predicted that electronic communication tools would change things as profoundly as Johannes Gutenberg’s printing press revolutionized the fifteenth-century world. These electronic tools, McLuhan predicted, will replace the top-down, linear technology of industrial society with a distributed electronic network shaped by continuous feedback loops of information.
…
Like Bush, who believed that the trails on his Memex “do not fade,”41 the highly eccentric Nelson saw himself as a “rebel against forgetting.”42 His lifelong quest to create hypertext, which he code-named Xanadu, was indeed a kind of rebellion against forgetfulness. In Nelson’s Xanadu system, there was no “concept of deletion.” Everything would be remembered. In 1980, twenty years after Nelson’s invention of the hypertext idea, a much less eccentric genius, Tim Berners-Lee, arrived as a consultant at the European Particle Physics Laboratory (CERN) in Geneva. Like Nelson, Berners-Lee, who had earned a degree in physics from Oxford University’s Queen’s College in 1976, was concerned with guarding against his own personal forgetfulness. The problem, Berners-Lee wrote in his autobiography, Weaving the Web, was remembering “the connections among the various people, computers, and projects at the lab.”43 This interest in memory inspired Berners-Lee to build what he called his first website program, Enquire.
This Is for Everyone: The Captivating Memoir From the Inventor of the World Wide Web
by
Tim Berners-Lee
Published 8 Sep 2025
…
Praise for This Is for Everyone ‘As a company running computer networks before the dawn of the internet age, Bloomberg was an early beneficiary of the towering wave of change that Tim Berners-Lee ushered in with the World Wide Web. His book offers a fascinating look at the origin and evolution of a world-transforming invention and how we can harness its potential as a force for good’ Michael Bloomberg, founder of Bloomberg and Bloomberg Philanthropies and mayor of New York 2002–2013 ‘Tim Berners-Lee’s invention of the World Wide Web is a landmark event of the last fifty years – and his tireless work to keep the web accessible to everyone is a service to humanity.
The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution
by
Walter Isaacson
Published 6 Oct 2014
Newt Gingrich, speech to the American Political Science Association, Sept. 1, 2000. CHAPTER ELEVEN: THE WEB 1. Tim Berners-Lee, Weaving the Web (HarperCollins, 1999), 4. See also Mark Fischetti, “The Mind Behind the Web,” Scientific American, Mar. 12, 2009. 2. Author’s interview with Tim Berners-Lee. 3. Author’s interview with Tim Berners-Lee. 4. Author’s interview with Tim Berners-Lee. 5. Author’s interview with Tim Berners-Lee. 6. Tim Berners-Lee interview, Academy of Achievement, June 22, 2007. 7. Author’s interview with Tim Berners-Lee. 8. Author’s interview with Tim Berners-Lee. 9. Enquire Within Upon Everything (1894), http://www.gutenberg.org/files/10766/10766-h/10766-h.htm. 10.
…
Berners-Lee, Weaving the Web, 1. 11. Author’s interview with Tim Berners-Lee. 12. Tim Berners-Lee interview, Academy of Achievement, June 22, 2007. 13. Berners-Lee, Weaving the Web, 10. 14. Berners-Lee, Weaving the Web, 4. 15. Berners-Lee, Weaving the Web, 14. 16. Author’s interview with Tim Berners-Lee. 17. Tim Berners-Lee interview, Academy of Achievement, June 22, 2007. 18. Berners-Lee, Weaving the Web, 15. 19. John Naish, “The NS Profile: Tim Berners-Lee,” New Statesman, Aug. 15, 2011. 20. Berners-Lee, Weaving the Web, 16, 18. 21. Berners-Lee, Weaving the Web, 61. 22. Tim Berners-Lee, “Information Management: A Proposal,” CERN, Mar. 1989, http://www.w3.org/History/1989/proposal.html. 23.
…
It made the carefully packaged online services obsolete, and it fulfilled—indeed far surpassed—the utopian dreams of Bush, Licklider, and Engelbart. More than most innovations of the digital age it was invented primarily by one man, who gave it a name that managed to be, as he was personally, both expansive and simple: the World Wide Web. TIM BERNERS-LEE As a kid growing up on the edge of London in the 1960s, Tim Berners-Lee came to a fundamental insight about computers: they were very good at crunching step by step through programs, but they were not very good at making random associations and clever links, the way that an imaginative human could. This is not something that most kids ponder, but both of Berners-Lee’s parents were computer scientists.
Learning SPARQL
by
Bob Ducharme
Published 15 Jul 2011
This section gives you some background about the kinds of things you might see in a file of RDF data. There’s no need to learn all the details, but sometimes it’s handy to know which serialization is which. We’ll look at how several formats represent the following three facts: The book with ISBN 006251587X has the creator Tim Berners-Lee. The book with ISBN 006251587X has the title “Weaving the Web”. Tim Berners-Lee’s title is “Director”. The examples use the URI http://www.w3.org/People/Berners-Lee/card#i to represent Berners-Lee, because that’s the URI he uses to represent himself in his FOAF file. The examples use the URN urn:isbn:006251587X to represent the book.
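As one concrete reference point before the format-by-format walkthrough, here is how those three facts could be written in the Turtle serialization. The Dublin Core properties are an illustrative assumption, not necessarily the exact vocabulary the book’s examples use:

    @prefix dc: <http://purl.org/dc/elements/1.1/> .

    # The book's creator and title, plus Berners-Lee's own job title.
    <urn:isbn:006251587X> dc:creator <http://www.w3.org/People/Berners-Lee/card#i> ;
                          dc:title   "Weaving the Web" .
    <http://www.w3.org/People/Berners-Lee/card#i> dc:title "Director" .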
…
Because of this, it’s an RDF best practice to assign rdfs:label values to resources so that human readers can more easily see what they represent. For example, in Tim Berners-Lee’s FOAF file, he uses the URI http://www.w3.org/People/Berners-Lee/card#i to represent himself, but his FOAF file also includes the following triple:

    # filename: ex038.ttl
    <http://www.w3.org/People/Berners-Lee/card#i>
        <http://www.w3.org/2000/01/rdf-schema#label> "Tim Berners-Lee" .

Using multiple rdfs:label values, each with its own language tag, is a common practice. The DBpedia collection of RDF extracted from Wikipedia infoboxes has fifteen rdfs:label values for the resource http://dbpedia.org/resource/Switzerland.
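A short SPARQL sketch shows how such labels are typically retrieved; querying DBpedia for them and filtering to English are illustrative choices, not steps taken from the book:

    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    # Fetch the English-tagged label for Switzerland.
    SELECT ?label
    WHERE {
      <http://dbpedia.org/resource/Switzerland> rdfs:label ?label .
      FILTER (lang(?label) = "en")
    }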
…
More and more people are using the query language SPARQL (pronounced “sparkle”) to pull data from a growing collection of public and private data. Whether this data is part of a semantic web project or an integration of two inventory databases on different platforms behind the same firewall, SPARQL is making it easier to access it. In the words of W3C Director and Web inventor Tim Berners-Lee, “Trying to use the Semantic Web without SPARQL is like trying to use a relational database without SQL.” SPARQL was not designed to query relational data, but to query data conforming to the RDF data model. RDF-based data formats have not yet achieved the mainstream status that XML and relational databases have, but an increasing number of IT professionals are discovering that tools using the RDF data model let them expose diverse sets of data (including, as we’ll see, relational databases) with a common, standardized interface.
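To make that “common, standardized interface” concrete, here is a minimal query sketch against the three book facts described earlier; as above, the Dublin Core property names are an assumption for illustration:

    PREFIX dc: <http://purl.org/dc/elements/1.1/>
    # Ask for the title and creator of the book with ISBN 006251587X.
    SELECT ?title ?creator
    WHERE {
      <urn:isbn:006251587X> dc:title   ?title ;
                            dc:creator ?creator .
    }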
Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy
by
Jonathan Taplin
Published 17 Apr 2017
Strikingly, the Internet was created with government funding and built on the principles of decentralization—principles we need to find our way back to if we are to overcome the power of corporate monopolies in the digital age. Since 2010 I have run the Annenberg Innovation Lab at the University of Southern California, where I have been lucky enough to work with many of the pioneers of the Internet, including Tim Berners-Lee, Vint Cerf, and John Seely Brown. I was also the founder of one of the first streaming-video-on-demand companies, Intertainer, which deployed high-quality video over the Internet ten years before YouTube went online. I am a committed believer in the power of technology. I have used Internet tools such as my blog on Medium to work out some of the ideas in this book.
…
Every piece of code—HTML, TCP/IP—was donated to the ARPANET project royalty-free. Of course DARPA had its own reasons for funding Doug Engelbart’s research, deeply interwoven with Cold War paranoia and post–nuclear attack “survivability,” but that was irrelevant to the purpose and the idealism of Engelbart, Brand, Vint Cerf, Tim Berners-Lee, and a host of other geniuses who made the Internet. But ultimately the connection with the military led to the undoing of Engelbart’s NLS vision. By 1969 the antiwar demonstrations outside the SRI building were a daily occurrence. Inside, the research team, which was growing quickly—thanks to increasing DARPA investment after the successful San Francisco demo—began to break into two factions: computer geeks and countercultural humanists.
…
In 1985, after the debut of the Macintosh, Microsoft quickly introduced Windows, an operating system that totally mimicked the Macintosh. Whatever advantage Apple had was quickly extinguished, and Steve Jobs was forced out of the company. Jobs immediately set out for revenge on his old company by building a new computer called NeXT. Not long after that, a twenty-nine-year-old English engineer, Tim Berners-Lee, took up a position at the Conseil Européen pour la Recherche Nucléaire (CERN). The Internet at this point was purely an academic research network linking physicists around the world and allowing them to share research documents, and CERN was the largest European node of the network. Finding documents was getting increasingly dicey as the network got more popular, so Berners-Lee began to work on the concept of hypertext as a way for researchers to link directly to other documents in their references.
How to Fix the Future: Staying Human in the Digital Age
by
Andrew Keen
Published 1 Mar 2018
To borrow some language from the Berlin venture firm BlueYard Capital, Borthwick wants to “encode” the “value” of openness into the architecture of the internet. It’s a kind of network neutrality for the AI age. And his model for this is the World Wide Web, the open platform so generously donated to the tech community by Tim Berners-Lee in 1989, on which innovative first-generation internet companies like Skype, Amazon, and Borthwick’s own Ada Web flourished. And that’s why, Borthwick tells me, he is advising the nonprofit Knight Foundation on its “Ethics and Governance of Artificial Intelligence Fund”—a $27 million fund announced in 2017 that is dedicated to researching artificial intelligence for the public interest.1 But as Borthwick acknowledges, for every public-spirited Berners-Lee or Knight Foundation, there is a private corporation seeking to dominate the market through its complete control of the technology stack.
…
The internet might have been described as the “people’s platform,”6 these critics say, but in fact it has a people problem. Jaron Lanier, the inventor of virtual reality and Silicon Valley’s most poignant thinker, even admits to a nostalgia for that halcyon time in the last century when technology did, indeed, put people first. “I miss the future,” Lanier confesses.7 He’s far from alone. Even Tim Berners-Lee, the inventor of the World Wide Web, is nostalgic for the open, decentralized technological future he imagined he’d fathered in 1989. And so, at the 2016 “Decentralized Web Summit” in San Francisco, an event conducted in the same idealistic spirit as BlueYard Capital’s “Encrypted and Decentralized” conference in Berlin, Berners-Lee spoke passionately about the state of the internet, particularly the emergence of vast digital monopolies and the pervasive culture of online surveillance.
…
In every minute of every day of 2016, we made 2.4 million Google searches, watched 2.78 million videos, entered 701,389 Facebook log-ins, added 36,194 new posts to Instagram, and exchanged 2.8 million messages on WhatsApp.10 All this personal data has become the most valuable commodity of the networked age, the so-called new oil11 of our networked economy—as everyone from European politicians to Silicon Valley venture capitalists to the CEO of IBM has put it—endowing Big Tech with the wealth of the gods. Tim Berners-Lee shares Kahle’s disappointment with recent digital history. “The Internet was designed to be decentralized so everybody could participate,” Berners-Lee told the summit attendees about a digital architecture he helped design. Instead, he said, “personal data has been locked up” in what he called “silos”—centralized big data companies like Google, Amazon, Facebook, and LinkedIn.12 And so “the problem,” he warned, “is the dominance of one search engine, one big social network, one Twitter for microblogging.”
Protocol: how control exists after decentralization
by
Alexander R. Galloway
Published 1 Apr 2004
After some of this material was deemed politically questionable by the Federal Bureau of Investigation, the whole server was yanked off the Internet by the telecommunications company that happened to be immediately upstream from the provider. The Thing had no recourse but to comply with this hierarchical system of control. The inventor of the World Wide Web, Tim Berners-Lee, describes the DNS system as the “one centralized Achilles’ heel by which [the Web] can all be brought down or controlled.”12 If hypothetically some controlling authority wished to ban China from the Internet (e.g., during an outbreak of hostilities), they could do so very easily through a simple modification of the information contained in the root servers at the top of the inverted tree.
…
The protocol known as Hypertext Transfer Protocol (HTTP) encapsulates this HTML object and allows it to be served by an Internet host. However, both client and host must abide by the TCP protocol to ensure that the HTTP object arrives in one piece. Finally, TCP is itself nested within the Internet Protocol, a protocol that is in charge of actually moving data packets from one machine to another. Ultimately the entire bundle (the primary data object encapsulated within each successive protocol) is transported according to the rules of the only “privileged” protocol, that of the physical media itself (fiber-optic cables, telephone lines, air waves, etc.). 12. Tim Berners-Lee, Weaving the Web (New York: HarperCollins, 1999), p. 126. 13. Paul Garrin, “DNS: Long Winded and Short Sighted,” Nettime, October 19, 1998.
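A schematic of the nesting Galloway describes, from the outermost carrier inward (a sketch of the layering, not a byte-accurate packet layout):

    physical medium (fiber-optic cable, telephone line, air waves)
      IP packet
        TCP segment
          HTTP message
            HTML document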
…
These layers are nested, meaning that the application layer is encapsulated within the transport layer, which is encapsulated within the Internet layer, and so on. This diagram, minus its “layer” captions, appears in RFC 791. The four layers are part of a larger, seven-layer model called the OSI (Open Systems Interconnection) Reference Model developed by the International Organization for Standardization (ISO). Tim Berners-Lee, inventor of the Web, uses a slightly different four-layer model consisting of “the transmission medium, the computer hardware, the software, and the content.” 13. Braden, “Requirements for Internet Hosts,” pp. 6–7. 14. Jonathan Postel, “Transmission Control Protocol,” RFC 793, September 1981, p. 7.
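For reference, the four layers of that Internet model with representative protocols at each; the pairings are standard textbook examples rather than Galloway’s own list:

    application layer  -  HTTP, FTP, Telnet
    transport layer    -  TCP, UDP
    Internet layer     -  IP
    link layer         -  Ethernet and the physical media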
Possiplex
by
Ted Nelson
Published 2 Jan 2010
I am not saying I will never work with HTML under any circumstance, only that I will investigate every other possibility in Heaven and Earth first. What hurts most is when people like my old English teacher, Warren Allen Smith, credit me in print with having created HTML. No thank you. What would Tim Berners-Lee have said? (1999) BACKGROUND. Around 1996, Marlene and I got an invitation from Tim Berners-Lee to dine with him in suburban Boston. We had a nice Thai dinner and then went to his home, where we argued till four in the morning. (His wife admired my wristwatch, so I gave it to her.) I don’t know how much TBL and I communicated that night; now I suspect he didn’t understand a word I said, except that I don’t buy into his paradigm.
…
He had the dream of a utopian[11] society in which all information could be shared among people who communicated as equals.[12] He struggled for years to find funding for his project, but success eluded him.” -- Tim Berners-Lee with Mark Fischetti, Weaving the Web. Harper/San Francisco, 1999, p.5. In some way this is the best summary of my work (except for certain details). While I greatly appreciate his intention, I have added corrective footnotes. Thanks, Tim, for a kind and gracious summary within your frame of reference, and I hope someday we can reach a deeper understanding and a shared vision. FOOTNOTES CORRECTING TIM BERNERS-LEE’S KIND REMARKS: 1. No one has ever paid me to be a visionary. 2. I don't believe I used the term "literary machines" until 1981, when I made it the title of a book.
…
And I have entirely left out the chapter on my experiences at Brown University, pending two formal requests I made for an ethics hearing on how I was treated there. These matters are now closed, but the current edition is integral without them; on consideration, I shall leave it that way. This is tricky writing. I am trying to maintain good relations with people like Larry Tesler, Alan Kay, Tim Berners-Lee, and certain of my former colleagues, even though I disagree with what they have done and deplore some of the consequences. Before we start, my especial thanks to my sweetpartner Marlene Mallicoat, my eternal friend and former wife Deborah Stone, my son Erik, and other friends, well-wishers and patrons,* without whom I would not have survived. * My patrons-- those who have given me financial aid, special aid or cut slack in my adult career-- include (in rough chronological order) Jean and Theodor Holm, Norman W.
Memory Machines: The Evolution of Hypertext
by
Belinda Barnet
Published 14 Jul 2013
There is no question of whether it is of benefit. It just does it all wrong, that’s all. (Nelson 1999a) Nelson’s concept of hypertext influenced Tim Berners-Lee, who appropriated Nelson’s term to describe his language (HTML, or Hypertext Markup Language, which describes how graphics and text should be displayed when they are delivered across the Web). But as Nelson stated, ‘HTML is like one-tenth of what I could do. It is a parody. I like and respect Tim Berners-Lee [but] he fulfilled his objective. He didn’t fulfil mine’ (Nelson 1999a). Although I don’t have the space here to go into the evolution of HTML, I’ll note that Berners-Lee shared Nelson’s ‘deep connectionist’ philosophy, and his desire to organize information associatively.
…
He doesn’t feel that HES had a long shadow – if only it were that influential. I do not believe that if you were to talk to the people who designed the browser, Marc Andreessen, and Tim Berners-Lee, who designed the HTTP protocol, and the early notions of the World Wide Web, that they would say, ‘Yeah, we read those early papers and we were deeply influenced by them.’ (van Dam 2011) That HES engendered the Web, or inspired its design, is debatable. For his part, Tim Berners-Lee claims in his autobiography that he had seen Dynatext, a later commercial electronic writing technology that van Dam helped launch after HES (see DeRose 1999), but that he didn’t transfer this design to HTML (Berners-Lee 1999, 27).
…
Andries van Dam kept the hypertext idea alive at Brown (and, for a while, at Apple), so that it could be picked up by a second wave of experimenters, including Jay Bolter, Michael Joyce and Mark Bernstein. The same ACM conference that heard Nelson cry ‘Wrong!’ also featured a poster session by one Tim Berners-Lee on something called the World Wide Web, downgraded from full paper status (so legend goes) because of doubts about the system’s scalability. Wrong indeed, and to a classical historian this might look like the kind of misjudgment that separates was from might have been. Yet in a cultural network, as in a game, we are perhaps better equipped to survive our errors.
The Death of Truth: Notes on Falsehood in the Age of Trump
by
Michiko Kakutani
Published 17 Jul 2018
“With Google personalized”: Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011), 3. “an endless you-loop”: Ibid., 16. “If algorithms are going to curate”: Eli Pariser, “Beware Online ‘Filter Bubbles,’ ” TED2011, ted.com. 7. ATTENTION DEFICIT “When you want to know”: William Gibson, Zero History (New York: Putnam, 2010), 212. Tim Berners-Lee: “History of the Web: Sir Tim Berners-Lee,” World Wide Web Foundation. “The rise of the web”: Jaron Lanier, You Are Not a Gadget (New York: Alfred A. Knopf, 2010), loc. 332–33, Kindle. “We don’t see the forest”: Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (New York: W. W. Norton, 2010), 91.
…
“If algorithms are going to curate the world for us,” Pariser warned in a 2011 TED talk, “if they’re going to decide what we get to see and what we don’t get to see, then we need to make sure that they’re not just keyed to relevance but that they also show us things that are uncomfortable or challenging or important, other points of view.” 7 ATTENTION DEFICIT When you want to know how things really work, study them when they’re coming apart. —WILLIAM GIBSON, ZERO HISTORY When it comes to spreading fake news and undermining belief in objectivity, technology has proven a highly flammable accelerant. Increasingly we have become aware of the dark side of what was imagined as a transformative catalyst for innovation. Tim Berners-Lee, who drew up a proposal in 1989 for what would become the World Wide Web, envisioned a universal information system, connecting people across boundaries of language and location and sharing information that would lead to unprecedented creativity and problem solving. A sort of benevolent version of Borges’s infinite library, where everything existed but, in this case, could also be retrieved and put to practical and imaginative use.
…
They worried that the magical tools they had helped create were becoming Frankensteinian monsters. Pierre Omidyar, founder of eBay, wrote that “the monetization and manipulation of information is swiftly tearing us apart,” and commissioned a white paper on the effect that social media was having on accountability and trust and our democracy. “The system is failing,” Tim Berners-Lee declared. He was still an optimist, he said, “but an optimist standing at the top of the hill with a nasty storm blowing in my face, hanging on to a fence.” In an impassioned essay, Roger McNamee, an early investor in Facebook, argued that the Russians’ manipulation of Facebook, Twitter, Google, and other platforms to try to shift the outcomes of the 2016 U.S. election and the Brexit referendum was just the tip of a huge iceberg: unless fundamental changes were made, he warned, those platforms were going to be manipulated again, and “the level of political discourse, already in the gutter, was going to get even worse.”
Intertwingled: The Work and Influence of Ted Nelson (History of Computing)
by
Douglas R. Dechow
Published 2 Jul 2015
I invented the highlighted textual link in 1984, while working with grad student Dan Ostroff, as part of our development of an electronic encyclopedia for the emerging U.S. Holocaust Memorial Museum. We ran empirical studies of different highlighting schemes and tested user capacity to navigate as well as ability to comprehend the paragraphs of text. We called the highlighted textual links “embedded menus,” but Tim Berners-Lee referred to them with the more compelling term “hot spots” in citing our work in his spring 1989 manifesto for the web. Ted Nelson: a pioneering visionary of universal hypertext systems, including their social and legal structures; keynote speaker at the Hypertext ’87 Workshop (see Fig. 2.1). [Fig. 2.1: Example image of Ted Nelson in the Hyperties system [1].]
…
Xanadu was all about making non-sequential, non-hierarchical media a reality, a human common practice. As Ted put it himself in his book Dream Machines: Of course, if hypermedia aren’t the greatest thing since the printing press, this whole project falls flat on its face. But it is hard for me to conceive that they will not be. PS: Then Tim Berners-Lee packaged the Internet for the masses, with Andreessen tossing in graphics. Years earlier Ted Nelson had intended to stretch the Internet’s boundaries, as well as making it universally accessible. Sadly, HTML allowed Berners-Lee/Andreessen’s web to spread like wildfire. Graphics and still images only enhanced websites’ magazine feel.
…
They got it all wrong, but it can still be fixed. The courts are going to stomp in… The crackdown is coming and it’s going to be so nasty, and they don’t get it. I’m just trying to create the rational system the web should have been in the first place and would have been if we hadn’t screwed up politically. Tim Berners-Lee fashioned a way of pointing at conventional files and conventional directories via path names, visible to the user, over the Net. To me the notion of files and hierarchical directories is an unfortunate tradition that messes up the very nature of content. Marc Andreessen added Technicolor, all the special effects garbage he could cram in, glorifying, fetishizing these hierarchical directories which are now called websites and are located at URLs.
The Invisible Web: Uncovering Information Sources Search Engines Can't See
by
Gary Price
,
Chris Sherman
and
Danny Sullivan
Published 2 Jan 2003
“Once a bit of information in that space was labeled with an address, I could tell my computer to get it. By being able to reference anything with equal ease, a computer could represent associations between things that might seem unrelated but somehow did, in fact, share a relationship. A Web of information would form.” — Tim Berners-Lee, Weaving the Web The Web was created in 1990 by Tim Berners-Lee, who at the time was a contract programmer at the European Organization for Nuclear Research (CERN) high-energy physics laboratory in Geneva, Switzerland. The Web was a side project Berners-Lee took on to help him keep track of the mind-boggling diversity of people, computers, research equipment, and other resources that are de rigueur at a massive research institution like CERN.
…
We’ve selected resources for this section from a broad range of categories that illustrate the high quality of information available on the Invisible Web. In Chapter 1, The Internet and the Visible Web, we trace the development of the Internet and many of the early tools used to locate and share information via the Net. We show how the limitations of these relatively primitive tools ultimately spurred the popular acceptance of the Web. As Tim Berners-Lee, creator of the Web, has written, “To understand the Web in the broadest and deepest sense, to fully partake of the vision that I and my colleagues share, one must understand how the Web came to be.” This historical background, while fascinating in its own right, lays the foundation for understanding why the Invisible Web could arise in the first place.
…
Neither publisher nor authors will be held liable for any results, or lack thereof, obtained by the use of this site or any of its links; for any third-party charges; or for any hardware, software, or other problems that may occur as a result of using it. www.invisible-web.net is subject to change or discontinuation without notice at the discretion of the publisher or authors. CHAPTER 1 The Internet and the Visible Web To understand the Web in the broadest and deepest sense, to fully partake of the vision that I and my colleagues share, one must understand how the Web came to be. —Tim Berners-Lee, Weaving the Web Most people tend to use the words “Internet” and “Web” interchangeably, but they’re not synonyms. The Internet is a networking protocol (set of rules) that allows computers of all types to connect to and communicate with other computers on the Internet. The Internet’s origins trace back to a project sponsored by the U.S.
The Future of Ideas: The Fate of the Commons in a Connected World
by
Lawrence Lessig
Published 14 Jul 2001
Whether or not the framers of the network understood what would grow from what they built, they built it with a certain philosophy in mind. The network itself would not control how it would grow. Applications would. That was the key to end-to-end design. As the inventor of the World Wide Web, Tim Berners-Lee, describes it: Philosophically, if the Web was to be a universal resource, it had to be able to grow in an unlimited way. Technically, if there was any centralized point of control, it would rapidly become a bottleneck that restricted the Web's growth, and the Web would never scale up. Its being “out of control” was very important.38 NETWORK ARCHITECTS Saltzer, Clark, and Reed were not the only people to notice the value of an end-to-end design.
…
If you're free from geekhood, you are likely not to distinguish the WWW from the Internet. But in fact, they are quite distinct. The World Wide Web is a set of protocols for displaying hyper-linked documents linked across the Internet. These protocols were developed in the late 1980s by researchers at the European particle physics lab CERN—in particular by Tim Berners-Lee. These protocols specify how a “Web server” serves content on the WWW. They also specify how “browsers”—such as Netscape Navigator or Microsoft's Internet Explorer—retrieve content on the World Wide Web. But these protocols themselves simply run on top of the protocols that define the Internet.
…
These Internet protocols, referred to as TCP/IP, are the foundation upon which the protocols that make the World Wide Web function—HTTP (hypertext transfer protocol) and HTML (hypertext markup language)—run.48 The emergence of the World Wide Web is a perfect illustration of how innovation works on the Internet and of how important a neutral network is to that innovation. Tim Berners-Lee came up with the idea of the World Wide Web after increasing frustration over the fact that computers at CERN couldn't easily talk to each other. Documents built on one system were not easily shared with other systems; content stored on individual computers was not easily published to the networks generally.
The Tangled Web: A Guide to Securing Modern Web Applications
by
Michal Zalewski
Published 26 Nov 2011
They followed their work on HTML with the development of HyperText Transfer Protocol (HTTP), an extremely basic, dedicated scheme for accessing HTML resources using the existing concepts of Internet Protocol (IP) addresses, domain names, and file paths. The culmination of their work, sometime between 1991 and 1993, was Tim Berners-Lee’s World Wide Web (Figure 1-1), a rudimentary browser that parsed HTML and allowed users to render the resulting data on the screen, and then navigate from one page to another with a mouse click. [Figure 1-1: Tim Berners-Lee’s World Wide Web] To many people, the design of HTTP and HTML must have seemed a significant regression from the loftier goals of competing projects. After all, many of the earlier efforts boasted database integration, security and digital rights management, or cooperative editing and publishing; in fact, even Berners-Lee’s own project, ENQUIRE, appeared more ambitious than his current work.
…
Adventurous companies and universities pursued pioneering projects such as ENQUIRE, NLS, and Xanadu, but most failed to make a lasting impact. Some common complaints about the various projects revolved around their limited practical usability, excess complexity, and poor scalability. By the end of the decade, two researchers, Tim Berners-Lee and Dan Connolly, had begun working on a new approach to the cross-domain reference challenge—one that focused on simplicity. They kicked off the project by drafting HyperText Markup Language (HTML), a bare-bones descendant of SGML, designed specifically for annotating documents with hyperlinks and basic formatting.
…
Unfortunately, each such change further reinforced bad web design practices[8] and forced the remaining vendors to catch up with the mess to stay afloat. Certainly, the absence of sufficiently detailed, up-to-date standards did not help to curb the spread of this disease. In 1994, in order to mitigate the spread of engineering anarchy and govern the expansion of HTML, Tim Berners-Lee and a handful of corporate sponsors created the World Wide Web Consortium (W3C). Unfortunately for this organization, for a long while it could only watch helplessly as the format was randomly extended and tweaked. Initial W3C work on HTML 2.0 and HTML 3.2 merely tried to catch up with the status quo, resulting in half-baked specs that were largely out-of-date by the time they were released to the public.
The Idealist: Aaron Swartz and the Rise of Free Culture on the Internet
by
Justin Peters
Published 11 Feb 2013
The man attempted to conceal his face behind a bicycle helmet, but he was clearly Aaron Swartz. This discovery raised far more questions than it answered. Aaron Swartz was famous. He was neither a malicious hacker nor a vandalistic “script kiddie,” but rather a well-known programmer and political activist. He was friends with the Internet icons Lawrence Lessig and Tim Berners-Lee. He was a research affiliate at Harvard. His blog was internationally popular. So why was he skulking around an MIT basement siphoning obscure research papers like some tenure-track cat burglar? What were his plans for the nearly 5 million JSTOR documents he had acquired? What in the world was he thinking?
…
“And the thing is that most people sort of outgrow that, and I don’t know if he ever did.”23 Throughout Swartz’s life, simple stimuli routinely elicited complex reactions, and minor aggravations were routinely magnified into moral crises. A pathologically picky eater, Swartz preferred bland, achromatic foods: dry Cheerios, white rice, Pizza Hut’s Personal Pan cheese pizzas. (“This reached its extremes at a World Wide Web conference where all the food was white, even the plate it was on,” Swartz wrote in 2005. “Tim Berners-Lee later pulled my mother aside to share his concerns about this diet.”)24 He suffered from ulcerative colitis, which partially explains his limited palate. But he also told friends he was a “supertaster,” extraordinarily sensitive to flavor, as if his taste buds were constantly moving from a dark room into bright light.
…
“I know I will never be happy without trying to see if I can change the world for the better in a major manner,” he wrote in 1990.59 That January, Hart traveled to the American Library Association’s midwinter meeting to proselytize for e-books and Project Gutenberg. There, he vowed, “There will be 10,000 Machine-Readable-Texts available by Dec. 31, 2000, even if I had to make them all myself.”60 * * * IN 1990, a British computer scientist named Tim Berners-Lee wrote an article for a house newsletter at CERN, a particle-physics laboratory in Switzerland. Berners-Lee programmed software at CERN, and, like many idealistic coders before him, he had become enamored of the Gospel of Richard Stallman. “A source of much debate over recent years has been whether to write software in-house or buy it from commercial suppliers.
The Great Wave: The Era of Radical Disruption and the Rise of the Outsider
by
Michiko Kakutani
Published 20 Feb 2024
“has evolved into an engine of inequity”: Tim Berners-Lee, “One Small Step for the Web…,” Medium, Sept. 29, 2018, medium.com/@timberners_lee/one-small-step-for-the-web-87f92217d085.
…
Baran’s work—along with that of the British computer scientist Donald Davies—would lead to the development of “packet switching” (a secure method of splitting and sending data that is reassembled at its destination), which, in the late 1960s, would become a basis for sending messages between computers and the development of the ARPANET, which connected academic, military, and research institutions and would eventually evolve into the internet. It remained difficult for scientists using the internet to share information until 1989, when Tim Berners-Lee—then a software engineer at CERN, the particle physics lab in Geneva—came up with the idea of using the emerging technology of hypertext to create the World Wide Web. In 1991, people outside CERN were invited to join this new web community, which embraced an array of egalitarian protocols including bottom-up design (“instead of code being written and controlled by a small group of experts,” it would be “developed in full view of everyone, encouraging maximum participation and experimentation”); net neutrality (meaning that internet service providers should treat all information equally); and decentralization (meaning “no permission is needed from a central authority to post anything on the web” and there is no “kill switch”—no way a central authority could control or monitor everything).
…
In 2014, Gavin Wood: Gavin Wood, “What Is Web 3? Here’s How Future Polkadot Founder Gavin Wood Explained It in 2014,” Yahoo, Jan. 4, 2022, yahoo.com/video/3-future-polkadot-founder-gavin-155942673.html.
Cataloging the World: Paul Otlet and the Birth of the Information Age
by
Alex Wright
Published 6 Jun 2014
The conventional history of the Internet traces its roots through an Anglo-American lineage of early computer scientists like Charles Babbage, Ada Lovelace, and Alan Turing; networking visionaries like Vinton G. Cerf and Robert E. Kahn; as well as hypertext seers like Vannevar Bush, J. C. R. Licklider, Douglas Engelbart, Ted Nelson, and of course Tim Berners-Lee and Robert Cailliau, who in 1991 released their first version of the World Wide Web. The dominant influence of the modern computer industry has placed computer science at the center of this story. Nonetheless Otlet’s work, grounded in an age before microchips and semiconductors, opened the door to an alternative stream of thought, one undergirding our present-day information age even though it has little to do with the history of digital computing.
…
From the Treatise on Documentation (Traité de documentation), 1934.

11. The Intergalactic Network

Ever since Al Gore’s famously misquoted contention during the 2000 presidential election, the question of who invented the Internet has been hotly and endlessly debated. As Gore himself was quick to point out, credit cannot and should not go to any one individual. Tim Berners-Lee, for example, did not invent the Internet. Nor did Vannevar Bush, H. G. Wells, or Paul Otlet. Most wisdom on the subject has now settled on a far-flung group of researchers funded by the U.S. Department of Defense during the Cold War. In response to the Soviet Union’s 1957 launching of the Sputnik I satellite, President Eisenhower established a scientific-military organization whose goal was to develop strategic technologies: the Advanced Research Projects Agency (ARPA).
…
Most of the traffic flowing across the network remained scientific and technical—scholarly papers, research data, and related discussions. In 1991, the National Science Foundation decided to allow commercial traffic—a turning point that would ultimately transform the Internet into something very much like the global network that Licklider had envisioned. That same year, Tim Berners-Lee and his partner, Robert Cailliau (a notable Belgian information scientist), released the first public version of the World Wide Web, while working at the CERN particle physics accelerator laboratory in Switzerland. That system—the one many of us now use every day—consisted of a global network of wired and wireless devices that would allow anyone, anywhere in the world, to retrieve and display content on a screen capable of projecting text, images, and other audiovisual material.
The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia
by
Andrew Lih
Published 5 Jul 2010
The Internet was powerful in the hands of computer experts, but for pretty much everyone else, it was an unapproachable jumble of codes and procedures. That all changed in 1990. Tim Berners-Lee, a scientist working at the CERN research lab in Switzerland, was looking for a way for scientists to more easily share documents and collaborate over the Internet. Even though he was not afraid of the Internet’s technical side, he knew other scientists and researchers were. He wanted to make a system that was graphical in order to easily share documents. Tim Berners-Lee used a computer that was the Ferrari of the techno-elite back then. And even though the NeXT computer is a faint memory today, like HyperCard, its impact went far beyond the units shipped.
…
Other URLs had protocols such as “ftp” for File Transfer Protocol, or the more obscure “gopher” or “wais” protocols. Today, Berners-Lee’s “http” for the World Wide Web dominates for all types of data. It seems like a simple concept, but the breakthrough allowed any information source to be pinpointed on the Internet using just one line. While the first Web browser from Tim Berners-Lee gained notoriety, there was a problem. The sexy features of the NeXT were not cheap. They offered only one model, and few folks could afford a $6,500 NeXT cube. Even NeXT’s follow-on budget version, the NeXT “slab,” was $4,995. It was hardly a computer for the masses. A Web Browser If we look under the hood of a Web browser, we see that it’s a pretty simple piece of software—it transfers a Web page from a computer on the Internet, known as a server; reads through the contents for images, sound, or other components; and downloads each of those elements.
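Lih’s “under the hood” account maps to surprisingly little code. A rough sketch of that fetch-parse-fetch loop using only Python’s standard library (the URL is a placeholder, and a real browser of course also renders, caches, and executes scripts):

    # Minimal sketch of a browser's fetch layer: get the page, scan the
    # HTML for embedded components (here, just images), download each one.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class ImageCollector(HTMLParser):
        """Collect the src attribute of every <img> tag."""
        def __init__(self):
            super().__init__()
            self.sources = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                src = dict(attrs).get("src")
                if src:
                    self.sources.append(src)

    page_url = "http://example.com/"  # placeholder address
    html = urlopen(page_url).read().decode("utf-8", errors="replace")

    collector = ImageCollector()
    collector.feed(html)

    for src in collector.sources:  # fetch each component, as a browser would
        data = urlopen(urljoin(page_url, src)).read()
        print(src, len(data), "bytes")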
…
It’s somewhat human readable, with special “markup” used for features on the page—italic, bold, images, and other formatting. But it wasn’t meant for human consumption and can become quite cumbersome to read and edit. The Web became successful because it was an open standard—no one company controlled the specification for it, and it was maintained by a consortium led by Tim Berners-Lee. Since the Web was not tied to one computer company or encumbered by patents, HTML could be generated and displayed by anyone who had the interest and skill to write a program to translate the codes to the computer screen. Most everything in HyperCard mapped quite well to HTML—text, italics, bold, images, and sounds.
WikiLeaks and the Age of Transparency
by
Micah L. Sifry
Published 19 Feb 2011
Tom Watson, “Power of Information: New Taskforce,” March 31, 2008, http://webarchive.nationalarchives.gov.uk/20100413152047/www.cabinetoffice.gov.uk/about_the_cabinet_office/speeches/watson/080331watson.aspx.
Micah L. Sifry, “Gov 2.0 Summit: Tom Steinberg on .gov Sites as Public Goods,” techPresident, September 9, 2009, http://techpresident.com/blog-entry/gov-20-summit-tom-steinberg-gov-sites-public-goods.
Tom Young, “Sir Tim Berners-Lee to help open up government information,” Computing, June 10, 2009, www.computing.co.uk/ctg/news/1840530/sir-tim-berners-lee-help-government-information.
James Crabtree and Tom Chatfield, “Mash the State,” Prospect magazine, February 2010.
Simon Rogers, “Government spending over £25,000: Download the data and help analyse it with our new app,” The Guardian, November 19, 2010, www.guardian.co.uk/news/datablog/2010/nov/19/government-spending-data.
…
The one change that did make it through the British political system these last few years mirrors the positive moves fostered by Obama’s open government initiative: the creation of Data.gov.uk as a central repository for releasing and promoting the use of open data. In response to the expenses scandal, then–Prime Minister Gordon Brown announced that he had appointed Sir Tim Berners-Lee, the father of the World Wide Web, to “help us drive the opening up of access to government data in the web.”27 It didn’t hurt that Berners-Lee brought real star power to the issue. As James Crabtree and Tom Chatfield reported in an in-depth cover story cheekily titled “Mash the State” for Prospect magazine, cabinet ministers were more interested in meeting him than the other way around.
…
P2P DNS wants to make it impossible for any government to blacklist sites off of the web’s domain name system, which is how computers know where to find each other’s files.30 In the long term, these initiatives could enable something like Rushkoff’s vision. But for now, unpopular speech online will probably exist in a twilight zone, semi-free, sometimes capable of threatening powerful institutions and other times subject to their whims. Recently, Sir Tim Berners-Lee, a leading advocate for open government and open data as noted above, was asked his opinion of WikiLeaks and the publication of the Iraq and Afghanistan war logs, as well as Cablegate. He said: What happened recently on WikiLeaks was that somebody stole information, somebody had privileged access to information, betrayed the trust put in them in their job, and took information which should not have been, according to their employer, released, and they released it. . . .
12 Bytes: How We Got Here. Where We Might Go Next
by
Jeanette Winterson
Published 15 Mar 2021
Computer. (Those 5,000 extra Bach chorales done over a lunchtime sandwich. Brian Eno’s forever app.) What will this mean for humans? For creativity? Or do I mean, what is meaning? For humans? For creativity? We shall have to reimagine those terms: Humans. Creativity. Meaning. * * * Here’s Tim Berners-Lee, father of the World Wide Web: What matters is in the connections. It isn’t the letters. It’s the way they’re strung together into words. It isn’t the words, it’s the way they’re strung together into phrases. It isn’t the phrases, it’s the way they’re strung together in the document … In an extreme view the world can be seen as only connections, nothing else.
…
* * * Back in late-1960s America, soon after the Summer of Love, the Advanced Research Projects Agency Network (ARPANET) adopted a British packet-switching system to transmit limited data between research institutions. At the same time the TCP/IP protocol suite was established. The more familiar term, INTERNET – really just inter-networking – came into use in the 1970s, to describe a collection of networks linked by a common protocol. It was Tim Berners-Lee, an Englishman working at the physics lab CERN, in Switzerland, who developed HTML. HTML (hypertext mark-up language) allowed hypertext documents to link into an information system accessible from any node (computer) in the network. In 1990 the World Wide Web as we know it came into existence.
…
Samsara is how Buddhists describe the incessant motion of life, which for them means that nothing is worth clinging to – objects, people, even our cherished ideas. Especially our cherished ideas. This isn’t a dismissive or disconnected approach to life. Connection is vital. Attachment is not. * * * Connectivity. It’s the buzzword of our era, isn’t it? Surely that is because we are beginning to realise what connectivity is. It is a vast web – Tim Berners-Lee knew that immediately – and he didn’t need an advertising firm to name it for him. Connectivity won’t – ultimately – depend on hardware. The aim of Google’s ambient computing, and, eventually, neural implants, is to connect us seamlessly without hardware. Without a device, a ‘thing’. Our liveliest connections with others, or with a piece of art, or an experience, are invisible (no hardware), yet they are the strongest and most profound parts of life.
The System: Who Owns the Internet, and How It Owns Us
by
James Ball
Published 19 Aug 2020
Shortly afterwards, the world’s biggest social media companies were credited with boosting Arab Spring protests against corrupt and dictatorial governments. Such was the mood towards the internet that the exultant opening ceremony to the 2012 London Olympics culminated in a seventeen-minute dance sequence celebrating Tim Berners-Lee for creating the World Wide Web and giving it to the world for free. Berners-Lee himself featured, tweeting the words ‘This is for everyone’ as they simultaneously appeared in lights across the stadium. The internet was almost indisputably a thing to be celebrated. That’s not the spirit of the world at the start of a new decade.
…
The separation of the US government’s formal military network from ARPANET in the early 1980s increased the scope for expansion, as did the formal adoption of TCP/IP and other public standards. The internet grew steadily through the 1980s, but a new innovation at the end of that decade set the ground for much more. This is the bit of the internet’s story that is familiar to many of us: that of the British technologist Tim Berners-Lee, then a scientist at the European CERN institute, who came up with a document, submitted to his supervisor on 12 March 1989: ‘Information Management: A Proposal’.15 The proposal, which was marked by Berners-Lee’s supervisor as ‘vague, but exciting’, did not immediately set the world on fire, but in practice united many of the elements of ‘the mother of all demos’ with the architecture of the internet.
…
The paper became the basis for what we now know as the World Wide Web – the internet as seen through your web browser – with its web addresses (formally known as URLs or URIs), HTML (the language used to format and style web pages) and HTTP (the protocol used to receive information on the web). In practical terms, Tim Berners-Lee’s discoveries set the ground for the internet to become a consumer product, and they did it just as the network was ready to consider connecting the networks and services of commercial entities to the internet’s architecture. The consumer internet effectively began here – but it was never going to become like the highly controlled telephone networks (of which more next chapter), because the culture had two decades to establish itself.
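To see the pieces Ball names in one place, Python’s standard urllib.parse will split a web address into the protocol, the host to contact, and the resource being requested (the URL here is just an example):

    # Decomposing a URL into the parts the paragraph describes.
    from urllib.parse import urlparse

    parts = urlparse("https://www.w3.org/People/Berners-Lee/?q=web#history")
    print(parts.scheme)    # 'https'      -> the protocol (HTTP over TLS)
    print(parts.netloc)    # 'www.w3.org' -> the server to contact
    print(parts.path)      # '/People/Berners-Lee/' -> the document wanted
    print(parts.query)     # 'q=web'      -> extra parameters
    print(parts.fragment)  # 'history'    -> a position within the page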
Designing for the Social Web
by
Joshua Porter
Published 18 May 2008
We clump into families, associations, and companies. We develop trust across the miles and distrust around the corner. What we believe, endorse, agree with, and depend on is representable and, increasingly, represented on the Web. We all have to ensure that the society we build with the Web is of the sort we intend.” — Tim Berners-Lee, Weaving the Web1

1 http://www.w3.org/People/Berners-Lee/Weaving/

The Amazon Effect

If you’ve ever watched someone shop at Amazon.com, you may have witnessed the Amazon Effect. I first saw the Amazon Effect during a usability study several years ago. I was observing a person shopping for a digital camera recommended to her by a friend.
…
Much of the early social psychology research done on online properties was focused on the WELL. Usenet, a system similar to BBSs, also found tremendous popularity in the 1980s as people posted articles and news to categories (called newsgroups). All of these social technologies predate the World Wide Web, which was invented by Sir Tim Berners-Lee in 1989.11 The web is incomparable. Now, nearly two decades after its invention, the world has completely and permanently changed. It’s hard to imagine what life must have been like before we had web sites and applications. Starting with the social software precursors mentioned above, the web has evolved toward more mature social software.
…
What follows is a very abridged history of the web from a social software point of view. This is important because our audiences, except the youngest ones, have lived through and experienced this history and it shapes their expectations.

9 For more insight into the reasons why people use MySpace, read Danah Boyd’s: Identity Production in a Networked Culture: Why Youth Heart MySpace http://www.danah.org/papers/AAAS2006.html
10 http://en.wikipedia.org/wiki/Email
11 Super cool link: Tim Berners-Lee announcing the World Wide Web on Usenet: http://groups.google.com/group/alt.hypertext/msg/395f282a67a1916c
Free Speech: Ten Principles for a Connected World
by
Timothy Garton Ash
Published 23 May 2016
Jeff Howard, ‘Article 19: Freedom of Expression Anchored in International Law’, Free Speech Debate, http://freespeechdebate.com/en/discuss/article-19-freedom-of-expression-anchored-in-international-law/
5. on human capabilities, see Sen 1985 and Sen 1999
6. see ‘Tim Berners-Lee on “Stretch Friends” & Open Data’, Free Speech Debate, http://freespeechdebate.com/en/media/tim-berners-lee-on-stretch-friends-open-data/
7. the poll was conducted in 2009/10. See http://perma.cc/7XP7-VCSJ
8. Pinter 1988, 5
9. Green 1991, 219
10. for the Girona Manifesto, see PEN International, ‘Girona Manifesto on Linguistic Rights’, http://perma.cc/SS2B-QY7W.
…
, nor the casual argot of an American cartoon character, but the result of the Stanford computer crashing before it received the final g of ‘Log’. A December 1969 map of what would eventually develop into the internet shows four computers.18 The Oxford English Dictionary dates the word ‘internet’ to 1974.19 In August 1981 there were just 213 internet hosts.20 The idea of the World Wide Web was proposed by Tim Berners-Lee in 1989, and he created the first ever website at the end of 1990.21 Then it was fast forward. As Figure 4 shows, what is known as ‘Moore’s Law’—predicting a regular doubling of the number of transistors you can fit on a microchip, and hence an exponential growth in computing power—has held roughly true for 50 years, since the chipmaker Gordon Moore first made that prediction in 1965, although it appears the rate of growth is now finally slowing.22

[Figure 4.]
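The arithmetic behind that sentence is easy to check. A back-of-the-envelope sketch, assuming one doubling every two years (Moore’s 1975 revision of his original annual estimate) and taking the Intel 4004 of 1971 as a rough starting point:

    # Fifty years of Moore's Law at one doubling per two years.
    transistors = 2_300          # roughly the Intel 4004, 1971
    for _ in range(25):          # 25 two-year doublings = 50 years
        transistors *= 2
    print(f"{transistors:,}")    # 77,175,193,600: tens of billions,
                                 # the scale of chips around 2021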
…
New words must be coined to describe the number of bytes—the basic unit of digital memory, usually consisting of an ‘octet’ string of eight 1s and 0s—of information stored online: from the megabytes (MB, or 1,000² bytes) and gigabytes (GB, or 1,000³ bytes) we have on our personal computers, all the way to the exabyte, zettabyte and yottabyte, or 1,000,000,000,000,000,000,000,000 individual bytes.23 According to an estimate by Cisco, it would take you about 6 million years to watch all the videos crossing global networks in a single month.24 As of 2015, there are already somewhere around 3 billion internet users, depending exactly how you define internet and user, and that number is growing rapidly.25 The fastest growth will come in the non-Western world, in wireless rather than wired and especially on mobile devices. There are perhaps 2 billion smartphones across the world and that is projected to reach 4 billion by 2020.26 Some 85 percent of the world’s population is within reach of a mobile phone tower which has the capacity to relay data. Tim Berners-Lee and Mark Zuckerberg have been among those campaigning to achieve internet access for all.27 Billions of people are still excluded from this unprecedented network of communication. As Map 1 shows, internet access is very unevenly distributed across the globe.

[Map 1. Unequal internet use worldwide. Country sizes are proportionate to absolute numbers of users.]
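The decimal units in that passage each scale by a factor of 1,000, which a few lines make concrete (binary units such as the kibibyte scale by 1,024 instead):

    # SI byte units, from kilobyte up to the yottabyte the text spells out.
    units = ["kilo", "mega", "giga", "tera", "peta", "exa", "zetta", "yotta"]
    for power, name in enumerate(units, start=1):
        print(f"1 {name}byte = 1,000^{power} = {1000 ** power:,} bytes")
    # Final line: 1 yottabyte = 1,000^8
    #           = 1,000,000,000,000,000,000,000,000 bytes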
Artificial Unintelligence: How Computers Misunderstand the World
by
Meredith Broussard
Published 19 Apr 2018
“Test Prep Is More Expensive—for Asian Students.” Atlantic, September 3, 2015. https://www.theatlantic.com/education/archive/2015/09/princeton-review-expensive-asian-students/403510/.
Arthur, Charles. “Analysing Data Is the Future for Journalists, Says Tim Berners-Lee.” Guardian (US edition), November 22, 2010. https://www.theguardian.com/media/2010/nov/22/data-analysis-tim-berners-lee.
Barlow, John Perry. “A Declaration of the Independence of Cyberspace.” Electronic Frontier Foundation, February 8, 1996. https://www.eff.org/cyberspace-independence.
Been, Eric Allen. “Jaron Lanier Wants to Build a New Middle Class on Micropayments.”
…
I even built a computerized system to automatically water my garden. Recently, however, I’ve become skeptical of claims that technology will save the world. For my entire adult life, I’ve been hearing promises about what technology can do to change the world for the better. I began studying computer science at Harvard in September 1991, months after Tim Berners-Lee launched the world’s first website at CERN, the particle physics lab run by the European Organization for Nuclear Research. In my sophomore year, my roommate bought a NeXT cube, the same square black computer that Berners-Lee used as a web server at CERN. It was fun. My roommate got a high-speed connection in our dormitory suite, and we used his $5,000 computer to check our email.
…
Hamilton’s 2016 book Democracy’s Detectives outlined how crucial data-driven investigative journalism is for the public good—and how much this public service can cost. High-impact investigative data journalism stories cost hundreds of thousands of dollars to produce. “Stories can cost thousands of dollars to produce but deliver millions in benefits spread across a community,” Hamilton writes.18 In 2010, Tim Berners-Lee gave the new field the computational stamp of approval when he said: “Journalists need to be data-savvy. It used to be that you would get stories by chatting to people in bars, and it still might be that you’ll do it that way some times. But now it’s also going to be about poring over data and equipping yourself with the tools to analyse it and picking out what’s interesting.
The Boy Who Could Change the World: The Writings of Aaron Swartz
by
Aaron Swartz
and
Lawrence Lessig
Published 5 Jan 2016
The project, spearheaded by Tim Berners-Lee, inventor of the web, proposed to extend the working model of the web to more structured data, so that instead of simply publishing text web pages, users could publish their own databases, which could be aggregated by search engines like Google into major resources. The Semantic Web project has received an enormous amount of criticism, much (in my view) rooted in misunderstandings, but much legitimate as well. In the news today is just the most recent example, in which famed computer scientist turned Google executive Peter Norvig challenged Tim Berners-Lee on the subject at a conference.
…
you can ask all sorts of real questions: “What bands that my friends like are playing around here in the next week?” It can then look at who your stated friends are, see what bands they claim to like, get their schedules, find where you are, and see if any of them match, assuming all this data is available in RDF. It’s a very cool idea, but like the original web, it has this chicken-and-egg problem. When Tim Berners-Lee first came up with the web, he could only show people the handful of pages he’d written, so it didn’t seem all that interesting, and it was difficult to convince people to provide information in this crazy form for free if no one was going to read it. In the same way, there’s not much information out there in RDF now, and, because of that, there aren’t a lot of people working on reading it.
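Swartz’s band query can be made concrete with a toy graph. A sketch using the third-party rdflib library (pip install rdflib); the people, bands, and the ex:likes/ex:playsIn terms are invented for illustration, while foaf: is the real Friend-of-a-Friend vocabulary used for exactly this kind of data:

    # Toy Semantic Web example: a few RDF facts in Turtle, queried with SPARQL.
    from rdflib import Graph

    facts = """
    @prefix foaf: <http://xmlns.com/foaf/0.1/> .
    @prefix ex:   <http://example.org/> .

    ex:me    foaf:knows ex:alice .
    ex:alice foaf:name  "Alice" ;
             ex:likes   ex:band1 .
    ex:band1 foaf:name  "The Hypertexts" ;
             ex:playsIn "Cambridge" .
    """
    g = Graph()
    g.parse(data=facts, format="turtle")

    # "What bands do my friends like, and where are they playing?"
    query = """
    PREFIX foaf: <http://xmlns.com/foaf/0.1/>
    PREFIX ex:   <http://example.org/>
    SELECT ?band ?city WHERE {
        ex:me foaf:knows ?friend .
        ?friend ex:likes ?b .
        ?b foaf:name ?band ;
           ex:playsIn ?city .
    }
    """
    for band, city in g.query(query):
        print(band, "is playing in", city)  # The Hypertexts is playing in Cambridge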
…
I’m not sure what to say to people who want to protect their privacy except be careful when you give out private information and think about where it could end up. When did you find your way toward the web? I’ve been using the web since the days of Mosaic, since I was a little kid. I still wish I had been there since Tim Berners-Lee’s World Wide Web, but I guess I was only 4 then, so it’s not all that unreasonable that I wasn’t. So that was probably ’94/’95. I think I wrote my first web page a couple years after that (’97/’98) and probably started programming database-backed websites around 1999. Actually, it must have been a little earlier, since my first db-backed website won an award in 1999.
HTML5 Cookbook
by
Christopher Schmitt
and
Kyle Simpson
Published 13 Sep 2011
Solution

Use the itemscope and itemprop attributes, along with descriptive property names, to label your content:

    <p itemscope>
      <span itemprop="inventor">Tim Berners-Lee</span> created the
      <span itemprop="invention">World Wide Web</span>.
    </p>

Discussion

The itemscope attribute is used to identify the scope of the microdata item—an item being a set of name/value pairs. The itemprop attribute values define the property names and their associated values—in this case, the contents of the span tags. Thus, this example yields the following name/value pairs:

Inventor: Tim Berners-Lee
Invention: World Wide Web

This is a very basic example. In the next recipe, we’ll look at an example that implements a standardized vocabulary.
…
Using Microdata and Schema.org

Problem

You want to convey additional meaning about your content—for example, that the content identifies a person—so that popular search engines can extrapolate this data.

Solution

In addition to using the itemscope and itemprop attributes, specify an itemtype and apply the appropriate property names from the Schema.org vocabulary:

    <section itemscope itemtype="http://schema.org/Person">
      <h1 itemprop="name">Tim Berners-Lee</h1>
      <img itemprop="image" src="http://www.w3.org/Press/Stock/Berners-Lee/2001-europaeum-eighth.jpg">
      <p>
        <span itemprop="jobTitle">Director</span>,
        <span itemprop="affiliation" itemscope itemtype="http://schema.org/Organization">
          <span itemprop="name">World Wide Web Consortium</span>
        </span>
      </p>
      <p itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
        <span itemprop="addressLocality">Cambridge</span>,
        <span itemprop="addressRegion">MA</span>
      </p>
      <a itemprop="url" href="http://www.w3.org/People/Berners-Lee/">Website at W3C</a>
    </section>

Discussion

The start of this microdata item is again indicated by the use of itemscope on the section element, but also added to this element is the itemtype attribute.
…
Use itemtype with a URL in order to identify the item data type. In this case, we’re using the Schema.org structure to identify a person. As in the previous recipe, the itemprop attribute is applied with property names to give meaning to the content in the markup. By looking at the properties and pairing them with the content, we can tell that “Tim Berners-Lee” is a person’s name and that this person’s job title is “Director.” The use of itemprop for both the image and URL properties works a bit differently: the corresponding values in these cases are the src and href attribute values, respectively. If you’ve worked with microformats in the past, this concept won’t be new to you.
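To make the name/value pairing mechanical, here is a rough extractor built on Python’s standard html.parser. It is deliberately not a conforming microdata parser (nested itemscopes and many edge cases are ignored); it only surfaces itemprop values, including the src/href cases the discussion mentions:

    # Crude microdata sketch: pair each itemprop name with its value,
    # taken from element text, or from src/href for img and a elements.
    from html.parser import HTMLParser

    class ItempropCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.pairs = []        # (property name, value)
            self._pending = None   # itemprop waiting for its text content

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            prop = attrs.get("itemprop")
            if not prop:
                return
            if tag == "img":
                self.pairs.append((prop, attrs.get("src")))
            elif tag == "a":
                self.pairs.append((prop, attrs.get("href")))
            else:
                self._pending = prop

        def handle_data(self, data):
            if self._pending and data.strip():
                self.pairs.append((self._pending, data.strip()))
                self._pending = None

    html = ('<p itemscope><span itemprop="inventor">Tim Berners-Lee</span>'
            ' created the <span itemprop="invention">World Wide Web</span>.</p>')
    collector = ItempropCollector()
    collector.feed(html)
    print(collector.pairs)  # [('inventor', 'Tim Berners-Lee'),
                            #  ('invention', 'World Wide Web')]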
Rebel Code: Linux and the Open Source Revolution
by
Glyn Moody
Published 14 Jul 2002
Because Unix had also failed to embrace fully the new wave of graphical interfaces, its solutions in this area were crude when compared with the Apple Macintosh or Microsoft Windows. Windows NT, by contrast, was designed to marry the power of a VMS-like operating system with the elegance and usability of Windows 3.1. But plenty was happening outside Microsoft’s immediate sphere of interest. In 1991, Tim Berners-Lee, a British physicist at CERN, the European Centre for Nuclear Research, released for public use a hypertext system that he had been developing over the past two years. The system, which he called the World Wide Web, ran across the Internet, then still small-scale and used mostly by academics.
…
Just as few people today are aware that their e-mail is sent and arrives so efficiently thanks largely to free software programs like Sendmail and BIND, it is often forgotten that the other great Net success, the World Wide Web, was made freely available from the beginning. Indeed, as the Web’s creator, Tim Berners-Lee, relates in his book, Weaving the Web, he even switched from an initial idea of using Richard Stallman’s GNU GPL to putting all the Web technology in the public domain in 1993 to ensure that it had the widest possible deployment. One of the first to take up Berners-Lee’s new technology was the National Center for Supercomputing Applications (NCSA), part of the University of Illinois at Urbana-Champaign, where Rob McCool wrote a Web server called HTTPd.
…
“VH was a very stripped down, simple, lightweight hypertext system, so I thought of it sort of as a Volkswagen to Xanadu’s Cadillac. No chrome, no tail fins, but it got the job done. So I called it VolksHypertext.” Xanadu wasn’t the only hypertext system that was being developed in this period. “It was late ’91, I think, I got mail from Tim Berners-Lee—who nobody had heard of at the time—saying, ‘I hear you’ve been doing some interesting things with hypertext; shall we collaborate?’ And so I sent him back some mail, basically saying sure, standards are good. And I didn’t hear back from him. I still don’t know why.” Happily for Raymond, he has many other claims to fame other than almost co-inventing the World Wide Web.
Consent of the Networked: The Worldwide Struggle for Internet Freedom
by
Rebecca MacKinnon
Published 31 Jan 2012
See Aymeric Pichevin, “HADOPI Study Says France’s Three-Strike Law Having Positive Impact on Music Piracy,” Billboard.biz, May 16, 2011, www.billboard.biz/bbbiz/industry/digital-and-mobile/hadopi-study-says-france-s-three-strike-1005185142.story (both accessed June 27, 2011).
108 the company, which was contracted to serve as a clearinghouse for user information that other companies collected to track infringement, was hacked: Nate Anderson, “France Halts ‘Three Strikes’ IP Address Collection After Data Leak,” Ars Technica, May 17, 2011, http://arstechnica.com/tech-policy/news/2011/05/france-halts-three-strikes-ip-address-collection-after-data-leak.ars (accessed June 27, 2011).
108 Broadband analyst Mark Jackson warned: Matthew Richardson, “Illegal File Sharing Cannot Be Stopped, Says Broadband Commentator,” SimplifyDigital, December 21, 2010, www.simplifydigital.co.uk/news/articles/2010/12/illegal-file-sharing-cannot-be-stopped-says-broadband-commentator (accessed June 27, 2011).
109 “lobbynomics”: Ian Hargreaves, “Digital Opportunity: A Review of Intellectual Property and Growth,” May 2011, www.ipo.gov.uk/ipreview.htm.
109 the recorded music industry, led by the Recording Industry Association of America, spent a combined $17.5 million in congressional lobbying in 2009 alone: Bruce Gain, “Special Report: Music Industry’s Lavish Lobby Campaign for Digital Rights,” Intellectual Property Watch, January 6, 2011, www.ip-watch.org/weblog/2011/01/06/special-report-music-industrys-lavish-lobby-campaign-for-digital-rights (accessed June 27, 2011).
110 policy outcomes strongly reflect the preferences of the most affluent: Martin Gilens, “Inequality and Democratic Responsiveness,” Public Opinion Quarterly 69, no. 5 (Special Issue 2005): 778–796, http://poq.oxfordjournals.org/content/69/5/778.full.pdf.
110 Until this fundamental flaw in American democracy is addressed: To that end, Lessig is leading two new, related projects: Fix Congress First (www.fixcongressfirst.org) and Root Strikers (www.rootstrikers.org).
111 Tim Berners-Lee . . . suggested that it is time for humanity to upgrade the principles first articulated eight hundred years ago in the Magna Carta: Tim Berners-Lee, “Long Live the Web: A Call for Continued Open Standards and Neutrality,” Scientific American, November 22, 2010, www.scientificamerican.com/article.cfm?id=long-live-the-web.

CHAPTER 8: CORPORATE CENSORSHIP

115 On Apple’s special store for the Chinese market, apps related to the Dalai Lama are censored: Owen Fletcher, “Apple Censors Dalai Lama iPhone Apps in China,” IDG News, December 29, 2009, www.pcworld.com/article/185604/apple_censors_dalai_lama_iphone_apps_in_china.html (accessed June 27, 2011).
116 When challenged, company executives said: See Anita Ramasastry, “Should Cellphone Companies Be Able to Censor the Messages We Send?
…
All websites, no matter where they are created or what kind of computer system is used to host them, are readable from anywhere thanks to a common computer language called the hypertext mark-up language (HTML). The web is how most people on the planet today use the Internet. We can thank the Englishman Sir Tim Berners-Lee (eventually knighted for his invention), who in 1990 while at the particle physics lab at the European Organization for Nuclear Research in Switzerland wrote a simple computer program called the WorldWideWeb to make it easier for researchers in his lab to locate and share each other’s data. Sir Berners-Lee did not try to patent or charge for the use of his HTML language and the web-addressing system he created; instead he released them into the public domain.
…
But if democratically elected leaders adopt the policies pushed by lobbyists and make Internet and telecommunications companies vet and track their users to stop all “infringing” activity, not only can dictators breathe a sigh of relief, but so can incumbent politicians everywhere who would rather not have to face Internet-organized grassroots citizens’ movements. In late 2010 Tim Berners-Lee, inventor of the World Wide Web, suggested that it is time for humanity to upgrade the principles first articulated eight hundred years ago in the Magna Carta: “No person or organization shall be deprived of the ability to connect to others without due process of law and the presumption of innocence,” he wrote in an impassioned essay.
The Business Blockchain: Promise, Practice, and Application of the Next Internet Technology
by
William Mougayar
Published 25 Apr 2016
I could teach you how to drive one, but cannot predict where you will take it. Only you know your particular business or situation, and only you will be able to figure out where blockchains fit, after you have learned what they can do. Of course, we will first go together on road tests and racing tracks to give you some ideas. VISITING SATOSHI’S PAPER When Tim Berners-Lee created the first World Wide Web page in 1990, he wrote: “When we link information in the Web, we enable ourselves to discover facts, create ideas, buy and sell things, and forge new relationships at a speed and scale that was unimaginable in the analogue era.” In that short statement, Berners-Lee predicted search, publishing, e-commerce, e-mail, and social media, all at once, by a single stroke.
…
At the dawn of the Internet life in 1994, Kevin Kelly wrote in his book, Out of Control, three important comments to remember: The network is the icon of the 21st century. The net icon has no center—it is a bunch of dots connected to other dots. A decentralized, redundant organization can flex without distorting its function, and thus it can adapt. No wonder Tim Berners-Lee, the inventor of the Web, started an initiative, Web We Want,2 to reclaim some of the original goals of the Web. Notes Berners-Lee and the website’s community: We are concerned about the growing number of threats to the very existence of the open Web, such as censorship, surveillance, and concentrations of power.
…
[Back-matter residue: alphabetical index entries (C through W), including “Tim Berners-Lee,” “token,” “transactions,” “trust,” and “World Wide Web,” followed by an “Additional Resources” page promoting the author’s executive presentations.]
The Filter Bubble: What the Internet Is Hiding From You
by
Eli Pariser
Published 11 May 2011
From megacities to nanotech, we’re creating a global society whose complexity has passed the limits of individual comprehension. The problems we’ll face in the next twenty years—energy shortages, terrorism, climate change, and disease—are enormous in scope. They’re problems that we can only solve together. Early Internet enthusiasts like Web creator Tim Berners-Lee hoped it would be a new platform for tackling those problems. I believe it still can be—and as you read on, I’ll explain how. But first we need to pull back the curtain—to understand the forces that are taking the Internet in its current, personalized direction. We need to lay bare the bugs in the code—and the coders—that brought personalization to us.
…
—Christopher Alexander et al., A Pattern Language In theory, there’s never been a structure more capable of allowing all of us to shoulder the responsibility for understanding and managing our world than the Internet. But in practice, the Internet is headed in a different direction. Sir Tim Berners-Lee, the creator of the World Wide Web, captured the gravity of this threat in a recent call to arms in the pages of Scientific American titled “Long Live the Web.” “The Web as we know it,” he wrote, “is being threatened.... Some of its most successful inhabitants have begun to chip away at its principles.
…
But the Internet isn’t doomed, for a simple reason: This new medium is nothing if not plastic. Its great strength, in fact, is its capacity for change. Through a combination of individual action, corporate responsibility, and governmental regulation, it’s still possible to shift course. “We create the Web,” Sir Tim Berners-Lee wrote. “We choose what properties we want it to have and not have. It is by no means finished (and it’s certainly not dead).” It’s still possible to build information systems that introduce us to new ideas, that push us in new ways. It’s still possible to create media that show us what we don’t know, rather than reflecting what we do.
Googled: The End of the World as We Know It
by
Ken Auletta
Published 1 Jan 2009
His professed motto is, “Often wrong, never in doubt.” A self-made multimillionaire at age thirty-eight, Andreessen has often been right. As a computer science major at the University of Illinois at Urbana-Champaign, he worked at the university’s National Center for Supercomputing Applications. Inspired by Tim Berners-Lee’s vision of open standards for the Internet, in 1992 he and a coworker, Eric Bina, created an easy-to-use browser called Mosaic. The browser worked on a variety of computers, facilitating the hypertext links that allow Web surfing and Google search, helping users to effortlessly hop from site to site.
…
Phorm, an American company with offices around the world, proposed to go one step further, approaching telephone and broadband Internet service providers with software that tracks each consumer’s online activities, so that a nameless portrait of each consumer can be created. In return for supplying the data, the telephone and cable companies can open a new revenue spigot. By late 2007, Phorm had done three deals in England that yielded data on two-thirds of Britain’s broadband households. Publicity about Phorm aroused the ire of Tim Berners-Lee, a senior researcher at MIT and the inventor of the World Wide Web. Because Berners-Lee refused to patent his invention, to cash in financially, or to become a talk-show celebrity, his opinion carries heft. In a rare interview with the BBC, Berners-Lee expressed outrage: “I want to know if I look up a whole lot of books about some form of cancer that that’s not going to get to my insurance company and I’m going to find my insurance premium is going to go up by 5 percent because they’ve figured I’m looking at those books.”
…
Craig Newmark, the founder of craigslist, extols “nerd values,” by which he means that he is intent on keeping his listings free for most of his users and refuses to enrich himself by selling his company or taking a large salary. Social idealism has been a core value in the culture of the Internet, from the insistence of Tim Berners-Lee, who believed that the Web should be open and that he would not patent it or enrich himself; to the open-source movement; to Wikipedia, which follows a democratic faith in “the wisdom of crowds” and has adopted a nonprofit model. Before one dismisses these approaches as the gauzy thinking of left-wing populists, consider how often traditional companies now promote their own “corporate social responsibility”—in part to ecumenically emulate Andrew Carnegie, in part to bathe in the favorable publicity, in part to profit from some of these endeavors, and in part as a reaction against almost daily ethical business lapses.
HTML5 for Web Designers
by
Jeremy Keith
Published 2 Jan 2010
HTML5 is the latest iteration of this lingua franca. While it is the most ambitious change to our common tongue, this isn’t the first time that HTML has been updated. The language has been evolving from the start. As with the web itself, the HyperText Markup Language was the brainchild of Sir Tim Berners-Lee. In 1991 he wrote a document called “HTML Tags” in which he proposed fewer than two dozen elements that could be used for writing web pages. Sir Tim didn’t come up with the idea of using tags consisting of words between angle brackets; those kinds of tags already existed in the SGML (Standard Generalized Markup Language) format.
…
Over time, they were merged into a single specification called simply HTML5. Reunification While HTML5 was being developed at the WHATWG, the W3C continued working on XHTML 2. It would be inaccurate to say that it was going nowhere fast. It was going nowhere very, very slowly. In October 2006, Sir Tim Berners-Lee wrote a blog post in which he admitted that the attempt to move the web from HTML to XML just wasn’t working. A few months later, the W3C issued a new charter for an HTML Working Group. Rather than start from scratch, they wisely decided that the work of the WHATWG should be used as the basis for any future version of HTML.
Lurking: How a Person Became a User
by
Joanne McNeil
Published 25 Feb 2020
Then I was ready at the landing page to dive in and hide. * * * The first internet message was sent over a packet-switching network in 1969. Two decades later, the launch of the World Wide Web added another gust of excitement. This development was more accessible and customizable than previous online functions. Tim Berners-Lee humbly announced his new “hypertext browser/editor” in several posts to Usenet newsgroups in 1991. “This project is experimental and of course comes without any warranty whatsoever. However, it could start a revolution in information access,” he offered. The web is now core to the online experience, and many users mistake it for the internet itself, but the web is websites or web pages that a user accesses with a browser (like Chrome or Firefox, or Netscape or Mosaic before).
…
When tech executives appoint themselves as the stewards of the industry cleanup, they carry on with the same spirit of contempt for outsiders—and users—that unleashed the problems in the first place. Yet failure to anticipate the consequences of the internet is not the same as accepting, loving, or yielding to the internet in its current state. None other than Tim Berners-Lee—he of the modest 1991 Usenet announcement about a project that “comes without any warranty whatsoever”—has expressed disappointment about what has become of it. The World Wide Web was built following the ethic of a decentralized internet; concentration of power, which Amazon, Google, and Facebook have shored up, once seemed impossible to him.
…
The William Gibson quote comes from an interview in The Paris Review (David Wallace-Wells, “William Gibson, The Art of Fiction No. 211,” 2011). “First” as a date and concept is often a fuzzy distinction. The World Wide Web is thirty years old, a date that is basically correct the year I write this (2019), the year this book will hit shelves (2020), and the year after that (2021); Tim Berners-Lee invented it in 1989, wrote the first browser in 1990, and released it to the public, per his newsgroup posts, in 1991. Among the first commercial online services, The Source launched in July 1979, according to the 1995 Washington Technology obit for its founder, William F. von Meister, which also reports that Isaac Asimov was in attendance at its launch at the Plaza Hotel event, where he said, “This is the beginning of the information age.”
The Code: Silicon Valley and the Remaking of America
by
Margaret O'Mara
Published 8 Jul 2019
“It is vast, unmapped, culturally and legally ambiguous, verbally terse . . . hard to get around in, and up for grabs.”6 Operation Sun Devil faded into the desert sunset. But the questions it raised about the laws of cyberspace became ever more urgent. For the same May that the feds made that predawn raid on a San Jose subdivision, a British computer scientist named Tim Berners-Lee began to circulate a modest proposal to adapt Ted Nelson’s thirty-year-old notion of “hypertext” to organize the sprawling surge of information on the Internet. He called it the World Wide Web.7 INTERNETTING The Internet was more than thirty years old by the start of the 1990s, and it still had the academic and proudly noncommercial spirit it started with in 1969.
…
Tenenbaum believed the NSF would budge from its noncommercial stance eventually.10 Significantly, the innovation that changed everything was not a product of DARPA, NSF, or one of their American academic grantees. It emerged from outside the U.S. altogether, from the mind of a British scientist employed by the European Organization for Nuclear Research, or CERN, in Geneva. Nonetheless, American hacker-and-homebrew culture provided soul and inspiration for Tim Berners-Lee. He wanted information to be organized, but he also wanted it to flow freely and transparently. Working on a NeXT workstation (just like any self-respecting member of his scientific tribe), Berners-Lee and his CERN team created many of the building blocks of the online future. Hypertext markup language, or HTML, provided a common tongue for all the information now riding atop the Internet, both textual and visual.
…
It was clear by the start of the 1990s that a new wave was brewing in tech, and that government action now would make or break the Internet’s ability to realize its promise later. Bipartisan support enabled the Gore-sponsored High Performance Computing Act to become law in December 1991, only five months after Tim Berners-Lee released his Internet browser. President Bush endorsed it, and so did House Minority Whip Newt Gingrich. The Act ushered in an era of better standards and faster, higher-capacity connections, while still keeping the decentralized, democratic structure of the Internet intact. Soon after, as Marty Tenenbaum had hoped, the NSF started to pull down the walls of its online garden, changing the terms of its “acceptable use policy” to permit business transactions online.16 Mitch Kapor ran with a crowd that made no bones about its antipathy for government.
WTF?: What's the Future and Why It's Up to Us
by
Tim O'Reilly
Published 9 Oct 2017
The first website went live on August 6, 1991. It contained a simple description of Tim Berners-Lee’s hypertext project, complete with source code for a web server and a web browser. The site could be accessed by Telnet, a remote log-in program, and using that, you could download the source code for a web server and set up your own site. By the time Dale Dougherty and I had lunch with Tim in Boston a year later, there were perhaps a hundred websites. Yet by the time Google launched in September 1998, there were millions. Because the World Wide Web had been put into the public domain, Tim Berners-Lee didn’t have to do all the work himself.
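How little machinery that first site needed is visible at the protocol level: retrieving a page is one TCP connection and a GET line. A sketch that sends a minimal HTTP/1.0 request over a raw Python socket; info.cern.ch serves CERN’s restored copy of the first website, though any web server would answer similarly:

    # Fetch a page the hard way: open a TCP connection, ask for a document.
    import socket

    host = "info.cern.ch"
    with socket.create_connection((host, 80)) as conn:
        conn.sendall(b"GET /hypertext/WWW/TheProject.html HTTP/1.0\r\n"
                     b"Host: " + host.encode() + b"\r\n\r\n")
        response = b""
        while chunk := conn.recv(4096):
            response += chunk

    # Status line and headers arrive first, then the HTML itself.
    print(response.decode("latin-1")[:300])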
…
but are already well on their way to being the stuff of daily life. The Linux operating system was a unicorn. It seemed downright impossible that a decentralized community of programmers could build a world-class operating system and give it away for free. Now billions of people rely on it. The World Wide Web was a unicorn, even though it didn’t make Tim Berners-Lee a billionaire. I remember showing the World Wide Web at a technology conference in 1993, clicking on a link, and saying, “That picture just came over the Internet all the way from the University of Hawaii.” People didn’t believe it. They thought we were making it up. Now everyone expects that you can click on a link to find out anything at any time.
…
Dale Dougherty, one of my earliest employees, who had played a key role in transforming O’Reilly & Associates (later renamed O’Reilly Media) from a technical writing consulting company into a technology book publishing company in the late 1980s, and whom I’d come to consider a cofounder, had gone on to explore online publishing. He created our first ebook project in 1987, and in trying to develop a platform for ebook publishing that would be open and available to all publishers, had discovered the nascent World Wide Web. Dale had brought the web to my attention, introducing me to Tim Berners-Lee in the summer of 1992. We quickly became convinced that the web was a truly important technology that we had to cover in our forthcoming book about the Internet, which was just then opening up for commercial use. Ed Krol, the author, didn’t yet know much about the web, so Mike Loukides, his editor at O’Reilly, wrote the chapter and we added it to the book just before its publication in October 1992.
Peer-to-Peer
by
Andy Oram
Published 26 Feb 2001
Beyond the potential efficiency of such networks, peer-to-peer systems can help people share ideas and viewpoints more easily, ultimately helping the formation of online communities. The writable Web The Web started out as a participatory groupware system. It was originally designed by Tim Berners-Lee as a way for high-energy physicists to share their research data and conclusions. Only later was it recast into a publishing medium, in which sites seek to produce content that attracts millions of passive consumers. To this day, there is a strong peer-to-peer element at the very heart of the Web’s architecture: the hyperlink.
…
Initially, these resources were competing services such as FTP, Gopher, and WAIS. But eventually, through CGI, the Web became an interface to virtually any information resource that anyone wanted to make available. Mailto and news links even provide gateways to mail and Usenet. There’s still a fundamental flaw in the Web as it has been deployed, though. Tim Berners-Lee created both a web server and a web browser, but he didn’t join them at the hip the way Napster did. And as the Buddhist Dhammapada says, “If the gap between heaven and earth is as wide as a barleycorn, it is as wide as all heaven and earth.” Before long, the asymmetry between clients and servers had grown wide enough to drive a truck through.
…
Back in its pioneering days, the Web was idealized as a revolutionary peer platform that would enable anyone on the Internet to become a publisher and editor. It empowered individuals to publish their unique collections of knowledge so that they were accessible by anyone. The vision was of a worldwide conversation where everyone could be both a voice and a resource. Here are a few quotes from Tim Berners-Lee to pique your interest: The World Wide Web was designed originally as an interactive world of shared information through which people could communicate with each other and with machines (http://www.w3.org/People/Berners-Lee/1996/ppf.html). I had (and still have) a dream that the web could be less of a television channel and more of an interactive sea of shared knowledge.
Reinventing Discovery: The New Era of Networked Science
by
Michael Nielsen
Published 2 Oct 2011
[12] Yochai Benkler. Coase’s penguin, or, Linux and The Nature of the Firm. The Yale Law Journal, 112:369–446, 2002.
[13] Yochai Benkler. The Wealth of Networks. New Haven: Yale University Press, 2006.
[14] Tim Berners-Lee. Weaving the Web. New York: Harper Business, 2000.
[15] Tim Berners-Lee and James Hendler. Publishing on the semantic web. Nature, 410:1023–1024, April 26, 2001.
[16] Tim Berners-Lee, James Hendler, and Ora Lassila. The semantic web. Scientific American, May 17, 2001.
[17] Mario Biagioli. Galileo’s Instruments of credit: Telescopes, images, secrecy. Chicago: University of Chicago Press, 2006.
…
Networked science, in general: The potential of computers and the network to change the way science is done has been discussed by many people, and over a long period of time. Such discussion can be found in many of the works described above, in particular the work of Vannevar Bush [31] and Douglas Engelbart [63]. Other notable works include those of Eric Drexler [57], Jon Udell [227], Christine Borgman [23], and Jim Gray [83]. See also Tim Berners-Lee’s original proposal for the world wide web, reprinted in [14]. A stimulating and enjoyable fictional depiction of networked science is Vernor Vinge’s Rainbows End [231]. Data-driven science: One of the first people to understand and clearly articulate the value of data-driven science was Jim Gray, of Microsoft Research.
World Without Mind: The Existential Threat of Big Tech
by
Franklin Foer
Published 31 Aug 2017
• • • EVERY SIGNIFICANT technological development since has come wrapped in McLuhan’s aspiration: the desire for machines to usher in a new era of cooperation. That’s what J.C.R. Licklider meant when he explained how his invention of the Internet would erase social isolation: “Life will be happier for the on-line individual.” And how Tim Berners-Lee described the possibilities of the World Wide Web he created: “Hope in life comes from the interconnections among all the people in the world.” The dream of stitching the world into a global village has been embodied in the nomenclature of modern technology—the net is interconnected, the Web is worldwide, media is social.
…
His gait was uneven; at times, he struggled to breathe. When he felt good, Carl Page, Sr., was a bundle of magical enthusiasms. He would scurry down the corridors of the computer science department, summoning colleagues to his office to announce one of his many big ideas. He could be an enchanted seer. In the eighties, years before Tim Berners-Lee’s invention of the Web, he would riff about the potential of hyperlinks. The students at Michigan State found Carl’s passions to be both inspiring and a bit overwhelming. His faith in their skills occasionally would stretch beyond the reality of their expertise. There was the time, for instance, he assigned kids to write code that would enable a robot to plug itself into electrical outlets.
…
“Today, after more than a century”: Marshall McLuhan, Understanding Media (McGraw-Hill, 1964), 3.
“desert of classified data”: Eric McLuhan and Frank Zingrone, eds., Essential McLuhan (Basic Books, 1995), 92.
“Today computers hold out the promise”: McLuhan, 80.
“Life will be happier for the on-line individual”: Isaacson, 261.
“Hope in life comes from the interconnections”: Tim Berners-Lee, Weaving the Web (HarperCollins, 1999), 209.
“Money is not the greatest of motivators”: Linus Torvalds, Just for Fun (HarperCollins, 2001), 227.
“Competition means strife”: Tim Wu, The Master Switch (Alfred A. Knopf, 2010), 8.
“America’s most famous financier”: Ron Chernow, The House of Morgan (Atlantic Monthly Press, 1990), 54.
The Secret War Between Downloading and Uploading: Tales of the Computer as Culture Machine
by
Peter Lunenfeld
Published 31 Mar 2011
In other words, who would turn this installed base of atomized personal computers into a communication system? Who would link them together to share their entertainments and experiences, and even pool resources to tackle those wicked problems? The Hosts, of course.

The Hosts: Tim Berners-Lee and Linus Torvalds

You affect the world by what you browse. —Tim Berners-Lee

Software is like sex: it’s better when it’s free. —Linus Torvalds

After the triumph of Apple, the Macintosh, Pixar, iTunes, the iPhone and the iPad, it is hard to remember that Jobs also had his share of failures. Buried in his corporate biography is NeXT Computer, the short-lived company that he founded after leaving Apple in the mid-1980s.
…
In reaction to the buttoned-down, all-business attitudes of the Plutocrats, the Aquarians of the 1960s and 1970s—people like Douglas Engelbart and Alan Kay—expand on the more open-ended ideas of the Patriarchs, and develop the paradigm of visual, personalized, networked computing. In the 1980s and 1990s, the Hustlers—Microsoft’s Bill Gates and Apple’s Steve Jobs—commodify this personalized vision, putting a distinctive, “new economy” stamp on computing. Building on the installed base of all these users as the new millennium looms, the Hosts—World Wide Web inventor Tim Berners-Lee and open-source guru Linus Torvalds—link these disparate personal machines into a huge web, concentrating on communication as much as technology, pushing participation to the next level. The sixth generation, that of the Searchers—named after but hardly limited to Larry Page and Sergey Brin of Google, the search algorithm that became a company and then a verb—aggregated so much information and so many experiences that they rendered simulation and participation ubiquitous.
Internet for the People: The Fight for Our Digital Future
by
Ben Tarnoff
Published 13 Jun 2022
But over the course of the 1990s, many newcomers would come to know the internet primarily through the web, to the point where people had trouble distinguishing between the two. The first website went up in 1990; the browser that would popularize the web, Mosaic, appeared three years later. As usual, public money played a leading role. Tim Berners-Lee, the creator of the World Wide Web, worked as a scientist at CERN, the European research organization backed by nearly two dozen member states, while Mosaic was developed at the University of Illinois’s National Center for Supercomputing Applications, which had been created by the NSF in the 1980s.
…
It was somewhere to talk about politics or science fiction or the best way to implement a protocol. Other people were the main attraction, and this quality would endure, and inspire new applications, as the internet expanded. Even the World Wide Web was made with community in mind. “I designed it for a social effect—to help people work together,” its creator, Tim Berners-Lee, would later write. Community is what Omidyar liked best about the internet, and what he feared the dot-com gold rush would kill. He wasn’t alone in this: one could find dissidents polemicizing against the forces of commercialization on radical mailing lists like Nettime. But Omidyar was no anti-capitalist.
…
Ben Fowkes (London: Penguin Books, 1990 [1976]), 1019–38. 78, This is a useful lens … For a slightly different discussion of formal subsumption and real subsumption within the context of the internet, see Gavin Mueller, Breaking Things at Work: The Luddites Are Right about Why You Hate Your Job (London: Verso, 2021), 108–9. 79, In this, the internet followed … Email: Abbate, Inventing the Internet, 106–10; Hafner and Lyon, Where the Wizards Stay Up Late, 187–218. Three-quarters of all network traffic: Ibid., 194. Early online communities: Ryan, A History of the Internet, 74–87. 80, Email was more than just … “I designed it …”: Tim Berners-Lee, Weaving the Web (New York: HarperBusiness, 2000 [1999]), 123. 80, Community is what Omidyar … Nettime: Evan Malmgren, “Specter in the Machine,” Logic, December 20, 2020. Omidyar’s disdain for cheesy web commercialization: Cohen, The Perfect Store, 7. 80, eBay, by contrast … Importance of eBay as “community”: White, Buy It Now, 26–39. 81, This wasn’t an entirely … Bill Gates, with Nathan Myhrvold and Peter Rinearson, The Road Ahead (New York: Viking, 1995), 6. 81, Combining the community with the market … Encouraged to perform unpaid activities: As White observes in Buy It Now, 31, “eBay’s community discourse gets members to work for free.”
Don't Be Evil: How Big Tech Betrayed Its Founding Principles—and All of Us
by
Rana Foroohar
Published 5 Nov 2019
Still, the World Wide Web consisted of billions of items, with more pouring in every second. How could they possibly manage to organize it in such a way that would allow people to find that one specific straw they needed? Let’s say you wanted information on Tim Berners-Lee. The reigning approach of AltaVista, then the leading search engine, assumed that the document you’d most want would be the one with the most mentions of Tim Berners-Lee. Page and Brin thought that was silly. Just because the words appeared many times didn’t mean it would necessarily offer the best, most useful information on the subject. But what would? Here, Larry Page relied on an insight from his parents’ background in academia, where the most desirable papers on a topic were never the ones that just repeated a term or name endlessly, but the ones that other papers cited most frequently.
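To make the contrast concrete, here is a toy sketch in Python (invented data, not Google's actual code) comparing the two ranking ideas: term frequency, the AltaVista approach, versus inbound citations, the insight behind Page and Brin's ranking.

    # Toy illustration of the two ranking ideas; all data is invented.
    pages = {
        "page_a": {"mentions": 50, "cited_by": []},
        "page_b": {"mentions": 3,  "cited_by": ["page_a", "page_c", "page_d"]},
        "page_c": {"mentions": 10, "cited_by": ["page_d"]},
        "page_d": {"mentions": 1,  "cited_by": []},
    }

    # AltaVista-style ranking: the page with the most mentions of the term wins.
    print(max(pages, key=lambda p: pages[p]["mentions"]))       # page_a

    # Citation-style ranking (the Page/Brin insight): the page that the most
    # other pages point to wins, however rarely it repeats the term itself.
    print(max(pages, key=lambda p: len(pages[p]["cited_by"])))  # page_b

Under the second measure, a page that merely repeats “Tim Berners-Lee” fifty times loses to the page that everyone else links to.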
…
With so much new content coming onto the Web every day—all the articles, photographs, and songs people were posting on the seemingly infinite number of new websites cropping up—they were focused on developing a way to quickly sort through it all. They understood that when the ingenious British engineer Tim Berners-Lee invented the Web back in 1989, his genius was the ability to see that all the things living in cyberspace were connected to other things. It was a welter, sure, but a welter that could be organized like any other. To him, the Web was like the Library of Congress, with each book bearing a catalog number.
…
As one parent—a former Facebook employee—put it to a reporter at The New York Times, “I am convinced that the devil lives in our phones.”25 Utter hypocrisy? Or true repentance? Maybe a little of both. But it’s true that a growing number of technologists are finally waking up to the sheer force of the destruction they’ve unwittingly unleashed—and working to atone for the sins of their past. Witness techies-turned-activists like Tim Berners-Lee, who created the World Wide Web and is now trying to wrest it out of Big Tech’s all-too-powerful hands. Or James Williams, a Googler-turned-philosopher who left the Valley for Oxford to research the ethics of persuasive technology. Or Jaron Lanier, the pioneer of virtual reality, whose recent book, Ten Arguments for Deleting Your Social Media Accounts Right Now, argues that social media is creating a culture of victims and diminishing diversity of thought in a way that will undermine not only our economy and democracy, but free thought itself.
System Error: Where Big Tech Went Wrong and How We Can Reboot
by
Rob Reich
,
Mehran Sahami
and
Jeremy M. Weinstein
Published 6 Sep 2021
Whereas the first wave of Silicon Valley tech companies focused on hardware—semiconductors, microprocessors, and personal computers—with companies such as Fairchild, Intel, and Apple, it was only a matter of time before software—ethereal bits, not physical atoms—would become the dominant force in Silicon Valley’s growth. Fast forward to 1989, half a world away from California. The mild-mannered British scientist Tim Berners-Lee, working at the CERN laboratory in Geneva, Switzerland, proposed the creation of the World Wide Web as a means of sharing research data between labs worldwide. In his acceptance speech after winning the 2016 A. M. Turing Award—regarded as the Nobel Prize of computer science—Berners-Lee recounted that the idea for the Web had initially received little fanfare and he had been heartened that his supervisor at the time hadn’t canceled the project altogether.
…
But privacy as a value is absolutely central to democratic government. If the state violates your privacy, he noted, you can’t easily switch governments. “It scares me to death when the NSA or the IRS know things about my personal life and how I vote.” None other than the inventor of the World Wide Web, Tim Berners-Lee, echoes McNealy’s sentiment. But he also holds companies partly to blame for their willing cooperation in building surveillance technologies for the government. He expresses regret about how far the Web has strayed from his initial vision. After the revelations of Edward Snowden, in a 2017 open letter marking the twenty-eighth anniversary of the web’s launch, Berners-Lee wrote: Through collaboration with—or coercion of—companies, governments are also increasingly watching our every move online, and passing extreme laws that trample on our rights to privacy.
…
In the same way that we attach our preferences to a standardized telephone number or address, we should be able to do the same with a neutral online identifier that carries our privacy preferences. A number of entrepreneurs are racing to figure out how to do this. For example, a former Google executive, Richard Whitt, is designing a “digital trustmediary” that will sit between users and platforms to control what data are shared at any moment. And the internet pioneer Tim Berners-Lee has a new start-up called Inrupt that organizes users’ data into “data pods” that grant applications access to their data selectively and in line with the users’ preferences. Third, we need a credible, legitimate government agency that has the power to preserve and protect the privacy rights defined by new legislation.
Exponential: How Accelerating Technology Is Leaving Us Behind and What to Do About It
by
Azeem Azhar
Published 6 Sep 2021
The email protocol – the set of rules describing what an email is, and how sending and receiving computers should process it – is described in two documents: RFC (Request for Comments) 821 and RFC 822. Both date from 1982, and were written by the late Jon Postel of the University of Southern California and David Crocker, then of the University of Delaware, respectively. The web protocol became established as a de facto standard within a couple of years of its development by Tim Berners-Lee in 1989. These internet standards mean we can send emails from one person to another without worrying about the compatibility of our email systems. They are, in short, interoperable. And we all benefit hugely from that interoperability. Standardisation of this kind makes the world more efficient – and allows different innovations to combine.
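As a concrete illustration of that interoperability, here is a minimal sketch using Python's standard email library, which still emits the plain-text "Header: value" format descended from RFC 822; the addresses and content are placeholders.

    # Minimal sketch: composing and re-parsing an RFC 822-style message
    # with the standard library. Addresses and content are placeholders.
    from email import message_from_string
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "alice@example.org"
    msg["To"] = "bob@example.net"
    msg["Subject"] = "Interoperability"
    msg.set_content("Any standards-compliant mail system can parse this.")

    # The wire format is plain text, so a completely different
    # implementation can read back exactly what was sent.
    parsed = message_from_string(msg.as_string())
    print(parsed["Subject"])  # -> Interoperability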
…
And because this is where the readers gather, Wikipedia becomes the space where contributors want to write. It is a positive externality: every additional contributor brings in more readers, which in turn attracts more contributors. In fact, the World Wide Web itself benefited from the network effect. When Tim Berners-Lee first developed the web in 1989, there were several contenders for methods of storing information on the internet, like Gopher and WAIS. There were also commercial products like GEnie, CompuServe and Delphi. In 1994, Microsoft even targeted the market with its own offering, Microsoft Network.
…
In times of rapid change, a safety net becomes critical – lest people lose their jobs and find themselves unable to survive. And the more entrepreneurial and volatile the economy, the more essential such a safety net becomes. Many academics and technologists, from the French superstar economist Thomas Piketty to the founder of the Web, Tim Berners-Lee, argue for universal basic income (UBI) to solve this very problem. Under a UBI system, a government gives every citizen a regular sum of cash, no strings attached. Expensive as this might sound, it’s certainly a quick route to economic security for large numbers of people who might otherwise be at the mercy of a cruel labour market.
Too Big to Know: Rethinking Knowledge Now That the Facts Aren't the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room
by
David Weinberger
Published 14 Jul 2011
For example, a local library site, Google Books, and Amazon all might have information online about the book On the Origin of Species: bibliographic information, ratings, reviews, analyses of word frequencies, and more. But because each of these has a different way of identifying the book, there’s no easy way to write a program that will reliably pull all that information together. If each of these sites followed the conventions specified by the Semantic Web—initiated by Sir Tim Berners-Lee, the inventor of the World Wide Web, around the turn of the millennium—computer programs could far more easily know that these sites were referring to the same book. In fact, the Semantic Web would make it possible to share far more complex information from across multiple sites. Agreeing on how to encode metadata makes the Net capable of expressing more knowledge than was put into it.
…
But writing an ontology of financial markets would require agreeing on exactly what the required definitional elements of a “trade,” “bond,” “regulation,” and “report” are—as well as on every detail and every connection with other domains, such as law, economics, and politics. So, some supporters of the Semantic Web (including Tim Berners-Lee[8]) decided that making data accessible in standardized but imperfect form—as what is called “Linked Data”—would bring enormous benefits far faster than waiting for agreement about overarching ontologies. So, if you have a store of information about, say, chemical elements, you can make it available on the Web as a series of basic assertions that are called “triples” because they have the form of two objects joined by a relation: “Mercury is an element.”
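To make the idea concrete, here is a minimal sketch in Python representing such assertions as (subject, predicate, object) tuples; the vocabulary is invented for illustration rather than drawn from any published dataset.

    # Minimal sketch of Linked Data "triples": each assertion is a
    # (subject, predicate, object) tuple. The vocabulary is invented.
    triples = [
        ("Mercury", "is_a", "element"),
        ("Mercury", "symbol", "Hg"),
        ("Mercury", "atomic_number", 80),
    ]

    # Because every statement has the same three-part shape, data published
    # by different sites can be merged and queried without first agreeing
    # on an overarching ontology.
    def query(subject, predicate):
        return [o for s, p, o in triples if s == subject and p == predicate]

    print(query("Mercury", "symbol"))  # -> ['Hg']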
…
id=kFU-AAAAYAAJ. 6 See Ethan Zuckerman’s excellent post “Shortcuts in the Social Graph,” October 14, 2010, http://www.ethanzuckerman.com/blog/2010/10/14/shortcuts-in-the-social-graph/. 7 During the 2008 presidential campaign, Sarah Palin was accused of pressuring a local librarian to censor some books. See Rindi White, “Palin Pressured Wasilla Librarian,” Anchorage Daily News, September 4, 2008, http://www.adn.com/2008/09/03/515512/palin-pressured-wasilla-librarian.html. 8 Tim Berners-Lee, “Linked Data,” July 27, 2006, http://www.w3.org/DesignIssues/LinkedData.html. 9 This was the price quoted at Fisher Scientific on June 11, 2011. See http://www.fishersci.com/ecomm/servlet/itemdetail?catalogId=29104&productId=3426224&distype=0&highlightProductsItemsFlag=Y&fromSearch=1&storeId=10652&langId=-1. 10 See http://www.dublincore.org. 11 Jillian C.
Factfulness: Ten Reasons We're Wrong About the World – and Why Things Are Better Than You Think
by
Hans Rosling
,
Ola Rosling
and
Anna Rosling Rönnlund
Published 2 Apr 2018
In 2010, the World Bank decided to release all of its data for free (and thanked us for insisting). We presented at the ceremony for their new Open Data platform in May 2010, and since then the World Bank has become the main access point for reliable global statistics; see gapm.io/x6. This was all possible thanks to Tim Berners-Lee and other early visionaries of the free internet. Sometime after he had invented the World Wide Web, Tim Berners-Lee contacted us, asking to borrow a slide show that showed how a web of linked data sources could flourish (using an image of pretty flowers). We share all of our content for free, so of course we said yes. Tim used this “flower-powerpoint” in his 2009 TED talk—see gapm.io/x6—to help people see the beauty of “The Next Web,” and he uses Gapminder as an example of what happens when data from multiple sources come together; see Berners-Lee (2009).
…
Journal of Development Economics 104 (2013): 184–98. BBC. Producer Farhana Haider. “How the Danish Jews escaped the Holocaust.” Witness, BBC, Magazine, October 14, 2015. gapm.io/xbbcesc17. Berners-Lee, Tim. “The next web.” Filmed February 2009 in Long Beach, CA. TED video, 16:23. gapm.io/x-tim-b-l-ted. https://www.ted.com/talks/tim_berners_lee_on_the_next_web. Betts, Alexander, and Paul Collier. Refuge: Rethinking Refugee Policy in a Changing World. New York: Oxford University Press, 2017. Biraben, Jean-Noel. “An Essay Concerning Mankind’s Evolution.” Population. Selected Papers. Table 2. December 1980. As cited in US Census Bureau. gapm.io/xuscbbir.
Amateurs!: How We Built Internet Culture and Why It Matters
by
Joanna Walsh
Published 22 Sep 2025
We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open-source software.1 Web 2.0 could, equally, be seen as a rebranding exercise for the net following the dot-com crash of 2001. Web inventor Tim Berners-Lee described the term, coined by Darcy DiNucci in 1999, as ‘a piece of jargon . . . If Web 2.0 for you is blogs and wikis, then that is people to people. But that was what the Web was supposed to be all along.’2 What made Web 2.0 different was not only the input of amateurs, but new opportunities for their exploitation.
…
As an alternative, Bernard Stiegler envisaged an economy of the amateur: ‘Against a consumer economy that depletes the desires of consumers, an economy of contribution (one that makes possible cultural and cognitive technologies) would be psychically and collectively individuated by amateurs.’35 This is not a net dream. The net was born of an ‘economy of contribution’. In 1989, Tim Berners-Lee developed the World Wide Web in his spare time while working at CERN. In 1991 Linus Torvalds, then a student at the University of Helsinki, posted to the comp.os.minix newsgroup: ‘I’m doing a (free) operating system (just a hobby, won’t be big and professional).’ Linux was developed as open-source code that relied on what came to be known as commons-based peer production, and it is far from the only volunteer-developed web technology.
…
2012 Sianne Ngai, Our Aesthetic Categories
2012 Jayson Musson, Art Thoughtz by Hennessy Youngman, YouTube
2013 Atsuko Sato, Doge (the meme)
2013 Nedroid, The Internet
2014 Amalia Ulman, Excellences and Perfections
2014 McCoy, Dash, Quantum (first NFT)
2015 BookCorpus
2015 Anon, Cursed Images (Tumblr)
2015 Anne Boyer, Garments Against Women
2015 Laura Bennett, ‘First-Person Industrial Complex’
2016 Zhang Yiming, TikTok
2016 Microsoft, Tay (Twitterbot)
2017 Nick Land, ‘A Quick and Dirty Introduction to Accelerationism’
2017 Jia Tolentino, ‘The Personal Essay Boom Is Over’
2018 Aesthetics Wiki
2018 Barbara Hammer, The Art of Dying or (Palliative Art Making in the Age of Anxiety)
2019 Achille Mbembe, Necropolitics
2020/02 Denis Shiryaev, L’arrivée d’un train à La Ciotat
2020 Timnit Gebru, co-leader of Google’s Ethical Artificial Intelligence Team, sacked
2020 Shawn Presser, Books3
2020 Nadeem, Bethke, Reddy, StereoSet
2020 Venvonis, Vowlenu
2021 Harney, Moten, All Incomplete
2021 OpenAI, Dall-E
2021 NFT boom
2022 Kane Parsons, The Backrooms
2022 @pharmapsychotic, Clip Interrogator
2022 Elon Musk buys Twitter
2022 OpenAI, Outpainting
2022 Residents of Des Moines, Iowa, sue OpenAI over water use
2023 Reddit mod strike
2023 University of Chicago, Nightshade and Glaze
2023 Hito Steyerl, Mean Images
2023 Cory Doctorow, The Internet Con
2024 Cord Jefferson, American Fiction
2024 Claire Bishop, Disordered Attention
2024 Legacy Russell, Black Meme
2024 Reddit IPO

Notes

2004: Amateurs
1. Lev Grossman, ‘You – Yes, You – Are TIME’s Person of the Year’, Time, 25 December 2006.
2. developerWorks Interviews: Tim Berners-Lee, 28 July 2006.
3. Ibid.
4. Henri Lefebvre, Everyday Life in the Modern World (Allen Lane/Penguin, 1971), p. 204.
5. William Wordsworth, Samuel Taylor Coleridge, Lyrical Ballads: 1798 and 1800 (Broadview Press, 2008), p. 175.
6. Jacques Lacan, On Feminine Sexuality: The Limits of Love and Knowledge: Encore: The Seminar of Jacques Lacan, Book XX (Norton, 1999), p. 3.
7. Henri Lefebvre, Critique of Everyday Life (Verso, 2014), p. 51.
8. Ibid., p. 61.
9. Jacques Rancière, The Politics of Aesthetics, trans.
Machine, Platform, Crowd: Harnessing Our Digital Future
by
Andrew McAfee
and
Erik Brynjolfsson
Published 26 Jun 2017
Wu, for example, found that adopting companies experienced significant improvements in labor productivity, inventory turnover, and asset utilization once they started using their new enterprise systems. The advent of the World Wide Web extended the reach and power of enterprise systems to individual consumers via their computers (and later their tablets and phones). The web was born in 1989 when Tim Berners-Lee developed a set of protocols that allowed pieces of online content like text and pictures to link to each other, putting into practice the visions of hypertext first described by science and engineering polymath Vannevar Bush in 1945 (theoretically using microfilm) and computer visionary Ted Nelson, whose Project Xanadu never quite took off.
…
These examples highlight an important feature of platforms: they can often be built on top of each other. The World Wide Web, for example, is a multimedia, easy-to-navigate platform built on top of the original Internet information transfer protocols. Those protocols have been around for decades, but before Sir Tim Berners-Lee invented the web,†† the Internet was primarily a platform for geeks. One platform (the Internet) was a foundation or building block for another (the web). As we wrote in our previous book, The Second Machine Age, this building-block feature is valuable because it enables combinatorial innovation—the work of coming up with something new and valuable not by starting from scratch, but instead by putting together in new ways things that were already there (perhaps with a few genuinely novel ingredients).
…
The solution to this problem came, surprisingly enough, from the content itself. Larry Page and Sergey Brin, while still students in Stanford’s computer science department, realized that many pieces of web content, if not most, pointed to other pieces by linking to them; after all, that’s why Tim Berners-Lee had named it the “web.” They surmised that these links could be used to build an index of all the content out there—one where the “best” page on a given topic was the one that had the most other pages linking to it. In a way, this is how academic reputations are built: by noting which papers have the most citations from other papers.
Future Politics: Living Together in a World Transformed by Tech
by
Jamie Susskind
Published 3 Sep 2018
Compare the following statements: ‘The philosophers have only interpreted the world in various ways; the point is to change it.’ ‘We are not analysing a world, we are building it.’ The first is from Karl Marx’s 1845 Theses on Feuerbach.6 It served as a rallying cry for political revolutionaries for more than a century after its publication. The second is from Tim Berners-Lee, the mild-mannered inventor of the World Wide Web.7 Marx and Berners-Lee could scarcely be more different in their politics, temperament, or choice of facial hair. But what they have in common—in addition to having changed the course of human history—is a belief in the distinction between making change and merely thinking about it or studying it.
…
This was the first scientific instance of ‘mind-to-mind’ communication, also known as telepathy.40 You can already buy basic brainwave-reading devices, such as the Muse headband, which aims to aid meditation by providing real-time feedback on brain activity.41 Companies such as NeuroSky sell headsets that allow you to operate apps and play games on your smartphone using only thoughts. The US army has (apparently not very well) flown a helicopter using this kind of technology.42 Brain–computer interfaces have been the subject of a good deal of attention in Silicon Valley.43 Overall, increasingly connective technology appears set to deliver the vision of Tim Berners-Lee, inventor of the world wide web, of ‘anything being potentially connected with anything’.44

Sensitive

In the future, we can expect a dramatic rise in the number of sensors in the world around us, together with a vast improvement in what they are able to detect. This is increasingly sensitive technology.
…
But why not consciously engineer systems with justice in mind—whether equal treatment, equality of opportunity, or whatever other principle might be applicable to that particular application? Code could offer exciting new prospects for justice, not merely another threat to worry about. We need a generation of ‘philosophical engineers’ of the kind imagined by Tim Berners-Lee, and this cohort must be more diverse than today’s. It can’t be right that algorithms of justice are entrusted to an engineering community that is overwhelmingly made up of men.27 African-Americans receive about 10 per cent of computer science degrees and make up about 14 per cent of the overall workforce, but make up less than 3 per cent of computing occupations in Silicon Valley.28 At the very least, a workforce more representative of the public might mean greater awareness of the social implications of any given application of code.
GDP: A Brief but Affectionate History
by
Diane Coyle
Published 23 Feb 2014
Separately, the communications protocols between computers were developed in the United States in the 1970s, by DARPA (the Defense Advanced Research Projects Agency) among other groups. Computer-to-computer communication spread through the academic world first, from the mid-1970s onward, but Internet use required quite a high level of special knowledge through the 1980s. Tim Berners-Lee made the Internet accessible to all with his invention of the World Wide Web, working at the CERN laboratory in Geneva, which had the first website online in 1991. “This is for everyone,” as he put it.3 The Web started to get ordinary users online in the mid-1990s, and twenty years later being online is almost ubiquitous in developed countries and spreading rapidly in developing countries.
…
“The Dynamo and the Computer: An Historical Perspective on a Modern Productivity Problem,” American Economic Review 80, no. 2 (1990): 355–361. Available at http://elsa.berkeley.edu/~bhhall/e124/David90_dynamo.pdf. Accessed 23 January 2013. 2. Angus Maddison, The World Economy: A Millennial Perspective (Paris: Organization for Economic Cooperation and Development, 1999). 3. Tim Berners-Lee’s message at the opening ceremony of the 2012 London Olympic Games. 4. Robert Solow, “We’d Better Watch Out,” New York Times Book Review, 12 July 1987, 36. 5. Bill Lewis et al., “US Productivity Growth, 1995–2000,” McKinsey Global Institute, October 2001, http://www.mckinsey.com/insights/americas/us_productivity_growth_1995-2000. 6.
Open Standards and the Digital Age: History, Ideology, and Networks (Cambridge Studies in the Emergence of Global Enterprise)
by
Andrew L. Russell
Published 27 Apr 2014
Charles Babbage Institute, University of Minnesota, Minneapolis. 86 Derek Barber, “Meeting in Toronto, August 1976,” (n.d., 1976), INWG 128, McKenzie Collection; Louis Pouzin, “Cyclades ou Comment Perdre un Marche,” Recherche 328 (Fevrier 2000), 32–33; Stéphane Foucart, “Louis Pouzin, l’homme qui n’a pas inventé Internet,” Le Monde, 5 août 2006; Alain Beltran and Pascal Griset, “Le Projet Cyclades Sacrifié,” Codesource 11 (2007): 1; Nora and Minc, The Computerization of Society; Valérie Schafer, “Appropriating Packet Switching Networks, Making Cyclades Network”; Schafer, La France en Reseaux; Després interview, Charles Babbage Institute; Pouzin interview, Charles Babbage Institute. 87 Vint Cerf, quoted in Ian Peter, “Separating TCP and IP,” September 30, 2004, available from http://mailman.postel.org/pipermail/internet-history/2004-September/000431.html (accessed August 24, 2011). A dénouement of sorts occurred in 2013, when Pouzin shared the first Queen Elizabeth Prize for Engineering with Cerf, Kahn, and World Wide Web pioneers Sir Tim Berners-Lee and Marc Andreessen. Royal Academy of Engineering, “Queen Elizabeth Prize for Engineering,” June 25, 2013, http://qeprize.org/ (accessed June 30, 2013). 88 Pouzin, INWG 106; Kuo, “Political and Economic Issues for Internetwork Connections”; McKenzie, “INWG and the Conception of the Internet”; John Day, Patterns in Network Architecture: A Return to Fundamentals (Upper Saddle River, NJ: Prentice Hall-PTR, 2007). 89 As historian Edwin Layton noted and the examples in Chapters 2 and 3 demonstrate, “It is considered bad form to publicize the inner workings of engineering societies.”
…
In 1991, the National Science Foundation, which had operated the Internet backbone since 1986, lifted its restriction on commercial activity over the network – thus paving the way for the Internet to serve as a new commercial infrastructure. Another turning point for the popularity of the Internet occurred in 1991, when Tim Berners-Lee released the World Wide Web, an application that used the Internet to allow users to browse hypertext documents.70 In the meantime, OSI failed to live up to the high expectations it had engendered. In 1994, the National Institute of Standards and Technology abandoned its GOSIP program in favor of the TCP/IP Internet.
…
“Meritocracy” served effectively as rhetorical cover for what was always designed to be a structure in which old-timers and insiders could maintain control. The former “Internet Architect” David Clark warned the Internet community in 1992 that it might become a “standards elephant,” just like OSI, if it failed to reconcile competing demands of speed and consensus. If Tim Berners-Lee’s experience with the Internet standards process was any indication, Clark was right to be concerned. Berners-Lee, the inventor of the World Wide Web, its hypertext markup language (html), and the hypertext transfer protocol (http), submitted his ideas as proposals for Internet standards in 1992 and 1993.
A More Beautiful Question: The Power of Inquiry to Spark Breakthrough Ideas
by
Warren Berger
Published 4 Mar 2014
Vijay Govindarajan and Srikanth Srinivas, “What’s the Connection Between Counting Squares and Innovation?,” Harvard Business Review, April 1, 2013. 20 “shifting our focus from objects or . . . ” Sutton, Weird Ideas That Work. 21 Why can’t computers do more than compute? . . . Tim Berners-Lee details from Peter J. Denning, “Innovating the future: from ideas to adoption: futurists and innovators can teach each other lessons to help their ideas succeed,” The Futurist, January–February 2012; Mary Bellis, “Tim Berners-Lee,” About.com, http://inventors.about.com/od/bstartinventors/p/TimBernersLee.htm; and Academy of Achievement website http://www.achievement.org/autodoc/page/ber1bio-1. 22 “How were we going to pay” . . .
…
21 In the 1950s it wasn’t clear how computers could be used outside of mathematics. Conway Berners-Lee, a British mathematician who worked on the early commercial electronic computers, was fascinated by the question, Could computers be used to link information rather than simply compute numbers? The question was later refined by his son, software engineer Tim Berners-Lee. Overwhelmed by massive amounts of research data, Berners-Lee wondered if there were a way to combine the nascent Internet with linked hypertext documents to better find and share information. In 1989, he proposed the global hypertext project to be known as the World Wide Web. His prototype included the now familiar architecture of web browsers, HTML, HTTP, and URLs.
Wait: The Art and Science of Delay
by
Frank Partnoy
Published 15 Jan 2012
One warm evening, Isaac Newton is sitting under an apple tree in his garden when an apple falls and bonks him on the head; he instantly discovers gravity.9 Thomas Edison is staying up all night at Menlo Park, frantically experimenting, when suddenly he creates a new lightbulb that glows continuously for thirteen-and-a-half hours.10 Tim Berners-Lee is helping some scientists share data when out of the blue an idea hits him and he invents the World Wide Web.11 But these stories are rarely accurate. Newton had been working on the problem of gravity for years, and neither he nor his biographer said an apple hit him on the head. The first incandescent bulb was invented seventy-five years before Edison’s innovation, which was not the bulb itself, but a filament made of bamboo. Tim Berners-Lee mocks the notion that he suddenly invented the Web, as if he “just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and—ta-da!
…
Thomas Edison Center at Menlo Park, “Young Edison,” http://www.menloparkmuseum.org/thomas-edison-and-menlo-park (excerpted from Westfield Architects and Preservation Consultants, Preservation Master Plan, Edison Memorial Tower, Museum, and Site (2007). 11. “Answers for Young People,” http://www.w3.org/People/Berners-Lee/Kids.html; Tim Berners-Lee, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web (HarperOne, 1999). 12. “Answers for Young People.” Newton and Edison also rejected the notion that their discoveries were eureka stories. Remember these quotes? From Newton: “If I have seen a little further it is by standing on the shoulders of Giants.”
The Pirate's Dilemma: How Youth Culture Is Reinventing Capitalism
by
Matt Mason
But while pirates took over the media and became a geopolitical force to be reckoned with, the powers that be were already hatching a plan to defeat them.

Fighting the Net

“Net neutrality” is why the Internet is a level playing field. This is the principle that everyone using the Internet has an equal amount of access to everyone else. As Tim Berners-Lee, the inventor of the Web, defines it: “If I pay to connect to the Net with a certain quality of service, and you pay to connect with that or greater quality of service, then we can communicate at that level.” Telephone and telegraph networks were both successful because they were network-neutral, and this neutrality is why the Web has become such a world-changing force, both economically and socially.
…
The idea behind open-source software is to let others copy, share, change, and redistribute your software, as long as they agree to do the same with the new software they create in the process. This way, it can spread and progress as quickly as house music. The Internet was founded on free software such as USENET and UNIX, which is why no one can own it but everyone can use it. USENET is a public good free for the rest of us to build on. It was the early 1990s when Tim Berners-Lee, a British researcher working at the Swiss particle physics center CERN, designed the Web on top of such open-source software as a social experiment rather than a technical one. Free software was officially rebranded as open-source software in 1998, after Netscape released the source code of its browser (that project became Mozilla, which went on to create the hugely popular open-source browser Firefox).
…
The Pirate Party, “Declaration of Principles 3.0.” http://docs.piratpartiet.se/Principles%203.0.pdf. The League of Noble Peers, Steal This Film, August 21, 2006. www.stealthisfilm.com. The Pirate Bay, “Buy Sealand? Is it possible?” Buysealand.com, January 9, 2007. http://buysealand.com/?cat=1. Tim Berners-Lee, “Net Neutrality,” Dig.csail.mit.edu, June 21, 2006. http://dig.csail.mit.edu/breadcrumbs/node/144. Cory Doctorow, “Big Cable's ridiculous Net Neutrality smear video,” Boing Boing, October 27, 2006. www.boingboing.net/2006/10/27/big_cables_ridiculou.html. Anders Bylund, “Mark Cuban on the tiered Internet,” Arstechnica.com, February 8, 2006. http://arstechnica.com/articles/culture/cuban.ars.
The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity
by
Amy Webb
Published 5 Mar 2019
And also David Teather, “90-Second Nightmare of Shuttle Crew,” Guardian, February 6, 2003, https://www.theguardian.com/world/2003/feb/06/columbia.science. 3. Katrina Brooker, “I Was Devastated: Tim Berners-Lee, the Man Who Created the World Wide Web, Has Some Regrets,” Vanity Fair, July 1, 2018, https://www.vanityfair.com/news/2018/07/the-man-who-created-the-world-wide-web-has-some-regrets. 4. Tim Berners-Lee, “The Web Is Under Threat. Join Us and Fight for It,” World Wide Web Foundation (blog), March 12, 2018, https://webfoundation.org/2018/03/web-birthday-29/. 5. “Subscriber share held by smartphone operating systems in the United States from 2012 to 2018,” Statista, https://www.statista.com/statistics/266572/market-share-held-by-smartphone-platforms-in-the-united-states/. 6.
…
The internet began as a concept—a way to improve communication and work that would ultimately benefit society. Our modern-day web evolved from a 20-year collaboration between many different researchers: in the earliest days as a packet-switching network developed by the Department of Defense and then as a wider academic network for researchers to share their work. Tim Berners-Lee, a software engineer based at CERN, wrote a proposal that expanded the network using a new set of technologies and protocols that would allow others to contribute: the uniform resource locator (URL), hypertext markup language (HTML), and hypertext transfer protocol (HTTP). The World Wide Web began to grow as more people used it; because it was decentralized, it was open to anyone who had access to a computer, and new users didn’t prevent existing users from creating new pages.
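A minimal sketch of those three technologies working together, using only Python's standard library; the host here is a placeholder, not a reference to any site discussed in the text.

    # Minimal sketch of URL + HTTP + HTML in concert. The host is a placeholder.
    import http.client

    # The URL http://example.org/ names the resource; HTTP fetches it.
    conn = http.client.HTTPConnection("example.org")
    conn.request("GET", "/")
    response = conn.getresponse()

    # What comes back is HTML: markup any browser can render, containing
    # hyperlinks that let one page point at pages anywhere else on the web.
    html = response.read().decode("utf-8", errors="replace")
    print(response.status, html[:80])
    conn.close()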
One Click: Jeff Bezos and the Rise of Amazon.com
by
Richard L. Brandt
Published 27 Oct 2011
It was a piece of technology that seemed to be coming together with elements that just might create opportunities for a company that used computer networks to conduct stock trades. Until recently, the Internet had been mostly used as a network that allowed universities, research labs, and government institutions to exchange information. One key step toward change was taken in 1990, when Tim Berners-Lee created the first web browser, called WorldWideWeb. Another came in 1991, when the Internet was opened up to commercial use for the first time. It took several more years before these changes caught on and spread to popular awareness. In 1993, a government-funded group at the University of Illinois at Urbana-Champaign created a new generation of Web browser called Mosaic, a wonderful, graphics-based browser.
…
The graphical browser was what made the Internet popular, by turning it into a simple point-and-click navigation system. In the early days of the Internet, nobody could be sure which Web browser would be most popular. University students from around the world were cranking out graphical browsers to take advantage of the World Wide Web communications standards developed by Tim Berners-Lee at the European Organization for Nuclear Research in the suburbs of Geneva. Some of the earliest graphical browsers from the early 1990s are now largely forgotten or overlooked: Erwise, developed at the Helsinki University of Technology; ViolaWWW, from the University of California, Berkeley; and Lynx, created at the University of Kansas.
Capitalism 3.0: A Guide to Reclaiming the Commons
by
Peter Barnes
Published 29 Sep 2006
A food-buying club? Make it happen! Whether your interests relate to a river, a form of culture, or the planet, get involved. Adopt a commons. Learn everything about it. Fall in love with it. See who’s in charge. Then join or build an organization to revive it. If you want a role model, consider Tim Berners-Lee, the inventor and promoter of the World Wide Web. Berners-Lee was a programmer at CERN, the European high-energy physics lab, when he had an idea to simplify the Internet through hypertext. Readers of an Internet page would simply click on a hypertext link and be transported automatically to another page, anywhere in the world.
…
sec= programs&pg=spectrum_direct&bg=blk&continue=yes&X_TRANTYPE= download. 149 contract and converge: For information on contract and converge, see the website of the London-based Global Commons Institute at www.gci.org.uk/main.html. 150 a global atmosphere trust: The 1919 Treaty of Versailles, drawn up at the close of World War I, carved up the Ottoman and Austro-Hungarian empires, set up the League of Nations, and imposed stiff reparations on Germany. Some believe it paved the way to World War II. Chapter 10: What You Can Do 159 “I wanted to see the Web proliferate . . .”: Tim Berners-Lee, Weaving the Web (San Francisco: HarperSanFrancisco, 1999), p. 84. 164 Alaska Permanent Fund: In 1999, Alaska’s budget was in the red, and rather than raise taxes or cut expenditures, legislators tried to raid the Permanent Fund. When, however, voters in a referendum rejected their plan by 84 to 16 percent, the politicians gave up.
The Second Curve: Thoughts on Reinventing Society
by
Charles Handy
Published 12 Mar 2015
Five hundred years later it has happened again. The internet started life as a device to enable the US Department of Defense to improve their internal communications, a local Second Curve. The World Wide Web was first used in a sceptical CERN in Switzerland to organise the internal telephone directory, although its founder, Tim Berners-Lee, the modern Gutenberg, always had far grander ambitions for it, even when his colleagues told him that with a name like that it would never catch on. For Berners-Lee it was a way to connect the world, to give everyone everywhere a freedom to share and to choose. It was essential to his vision that it should be free, his gift to a disbelieving world.
…
But good sense prevailed and in 1839 the postal system was altered and Hill put in charge. Within a few years the penny post idea had been copied around the world. Rowland Hill had reconnected the world, not just the nation. It was a Second Curve, brought about by just one individual. In a sense, Tim Berners-Lee, another Englishman, was Rowland Hill reborn 150 years later, for the internet and its offspring, the social media, have connected the world in a miraculous and wonderful way which we now take for granted. But being connected does not necessarily mean being close. Having 20,000 followers on Twitter does not equal having 20,000 contacts, let alone 20,000 friends.
Is the Internet Changing the Way You Think?: The Net's Impact on Our Minds and Future
by
John Brockman
Published 18 Jan 2011
“Pearls before swine” overestimates the average chat room conversation, but it is the pearls of hardware and software that inspire me: the Internet itself and the World Wide Web, succinctly defined by Wikipedia as “a system of interlinked hypertext documents accessed via the Internet.” The Web is a work of genius, one of the highest achievements of the human species, whose most remarkable quality is that it was constructed not by one individual genius such as Tim Berners-Lee or Steve Wozniak or Alan Kay, nor by a top-down company such as Sony or IBM, but by an anarchistic confederation of largely anonymous units located (irrelevantly) all over the world. It is Project MAC writ large. Suprahumanly large. Moreover, there is not one massive central computer with lots of satellites, as in Project MAC, but a distributed network of computers of different sizes, speeds, and manufacturers—a network that nobody, literally nobody, ever designed or put together but which grew, haphazardly, organically, in a way that is not just biological but specifically ecological.
…
It is said that Twitter played an important part in the unrest surrounding the election in Iran in 2009, and news from that faith pit encouraged the view that the trend will be toward a net positive effect of the Internet on political liberty. We can at least hope that the faster, more ubiquitous, and above all cheaper Internet of the future may hasten the long-awaited downfall of ayatollahs, mullahs, popes, televangelists, and all who wield power through the control (whether cynical or sincere) of gullible minds. Perhaps Tim Berners-Lee will one day earn the Nobel Peace Prize. Let Us Calculate Frank Wilczek Physicist, MIT; 2004 Nobel laureate in physics; author, The Lightness of Being: Mass, Ether, and the Unification of Forces Apology: The question “How is the Internet changing the way you think?” is a difficult one for me to answer in an interesting way.
…
I am answering this year’s Edge question against the deadline, as the answer slips as defiantly as time. The Internet has not only changed the way I think but prompted me to think about those changes over time, weighted by the unevenness of technology take-up and accessibility to the Net. I encountered the Web as a researcher at Oxford in the mid-1990s. I learned later that I was at Tim Berners-Lee’s former college, but I was pretty blasé about being easily online. I saw the Internet more as a resource for messaging, a faster route than the bike-delivered pigeon post. I didn’t see it as a tool for digging, and I remained resolutely buried in books. But when I visited nonacademic friends and asked if I could check e-mails on their dial-ups, I began to equate the Net with privilege, via phone bill anxiety.
Zucked: Waking Up to the Facebook Catastrophe
by
Roger McNamee
Published 1 Jan 2019
In the early nineties, consumer-centric PCs optimized for video games came to market. The virtuous cycle of Moore’s Law for computers and Metcalfe’s Law for networks reached a new level in the late eighties, but the open internet did not take off right away. It required enhancements. The English researcher Tim Berners-Lee delivered the goods when he invented the World Wide Web in 1989 and the first web browser in 1991, but even those innovations were not enough to push the internet into the mainstream. That happened when a computer science student by the name of Marc Andreessen created the Mosaic browser in 1993.
…
It certainly caught the eye of many journalists, which helped our cause. One thing had not changed from when I first reached out to Zuck and Sheryl in 2016: Facebook was not open to criticism, much less taking it to heart. Ignore the messenger was their first instinct; if that failed, presumably they would bring in the heavy artillery. Later that day, Tim Berners-Lee tweeted the Washington Monthly essay to his followers. Berners-Lee is one of my heroes. His endorsement of the essay meant the world to me. Suddenly, the essay was everywhere, and requests came in from all sorts of media: CNBC, Tucker Carlson on Fox, CBS Morning News, NBC Nightly News, the Today show, MSNBC, Frontline, CNN, 60 Minutes, Bloomberg Technology, BBC Radio, and Bloomberg Radio.
…
Robert Lustig, Scott Galloway, Chris Hughes, Laura Rosenberger, Karen Kornbluh, Sally Hubbard, T Bone Burnett, Callie Khouri, Daniel Jones, Glenn Simpson, Justin Hendrix, Ryan Goodman, Siva Vaidhyanathan, B. J. Fogg, and Rob and Michele Reiner. My deepest appreciation to Marc Benioff for supporting the cause early on. Thank you to Tim Berners-Lee for sharing my essay from Washington Monthly. Huge thanks to Gail Barnes for being my eyes and ears on social media. Thank you to Alex Knight, Bobby Goodlatte, David Cardinal, Charles Grinstead, Jon Luini, Michael Tchao, Bill Joy, Bill Atkinson, Garrett Gruener, and Andrew Shapiro for ideas, encouragement, and thought-provoking questions.
Americana: A 400-Year History of American Capitalism
by
Bhu Srinivasan
Published 25 Sep 2017
the greatest heist: Nina Munk, Fools Rush In: Steve Case, Jerry Levin, and the Unmaking of AOL Time Warner (New York: Harper Business, 2004), 156. “Just One More Bubble”: “If You Can Make It in Silicon Valley, You Can Make It . . . in Silicon Valley Again,” New York Times, June 5, 2005. Chapter 35: Mobile from George Lucas: Walter Isaacson, Steve Jobs (New York: Simon & Schuster, 2011), 240. Tim Berners-Lee attributed: Tim Berners-Lee, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web (San Francisco: Harper, 1999), 22–23. making Jobs a billionaire: John Markoff, “Apple Computer Co-Founder Strikes Gold with New Stock,” New York Times, November 30, 1995. Apple agreed to buy: Isaacson, Steve Jobs, 301.
…
By the late 1980s, the utility of the Internet was fairly established, with academia and the military its primary users. What launched the consumer Internet, however, was a visual method of organizing and accessing all of the information on the network. Starting in 1980, a young Englishman named Tim Berners-Lee with a physics degree from Oxford was on a software consulting job at the European Council for Nuclear Research in Switzerland. CERN was financed by a consortium of European countries to conduct experimental research in physics. With thousands of affiliated researchers carrying out experiments at the facility, Berners-Lee saw firsthand the volume of papers and results at the laboratory.
…
In early 1986 he bought an animation studio for $10 million from George Lucas, the creator of Star Wars, and then invested tens of millions of dollars of his rapidly disappearing fortune from Apple into it. He founded another computer maker called NeXT to spite Apple. While NeXT didn’t live up to Jobs’s ambitions, it had its moments—Tim Berners-Lee attributed his development of HTTP, HTML, and the World Wide Web to his use of NeXT’s features. Jobs’s animation studio, Pixar, created the hit movie Toy Story using computer animation rather than the hand-drawing techniques pioneered by Walt Disney. After Toy Story Pixar had a public offering in 1995, making Jobs a billionaire for the first time in his life, this second fortune vastly outstripping his first.
Mining the Social Web: Finding Needles in the Social Haystack
by
Matthew A. Russell
Published 15 Jan 2011
Industry leaders such as Google and Facebook have begun to increasingly push graph-centric terminology rather than web-centric terminology as they simultaneously promote graph-based APIs. In fact, Tim Berners-Lee has suggested that perhaps he should have used the term Giant Global Graph (GGG) instead of World Wide Web (WWW), because the terms “web” and “graph” can be so freely interchanged in the context of defining a topology for the Internet. Whether the fullness of Tim Berners-Lee’s original vision will ever be realized remains to be seen, but the Web as we know it is getting richer and richer with social data all the time. When we look back years from now, it may well seem obvious that the second- and third-level effects created by an inherently social web were necessary enablers for the realization of a truly semantic web.
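How freely “web” and “graph” interchange can be seen in a few lines of Python: pages become nodes and hyperlinks become directed edges (the page names below are invented for illustration).

    # Minimal sketch: a few invented pages and their hyperlinks, stored as
    # an adjacency dict - simultaneously a "web" and a directed graph.
    links = {
        "home.html":  ["about.html", "blog.html"],
        "about.html": ["home.html"],
        "blog.html":  ["home.html", "about.html"],
    }

    # "Web" view: follow links outward from a page.
    print(links["home.html"])

    # "Graph" view: compute each node's in-degree, i.e. how many pages link in.
    in_degree = {page: 0 for page in links}
    for targets in links.values():
        for t in targets:
            in_degree[t] = in_degree.get(t, 0) + 1
    print(in_degree)  # home.html and about.html each have 2 inbound links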
…
I designed it for a social effect—to help people work together—and not as a technical toy. The ultimate goal of the Web is to support and improve our weblike existence in the world. We clump into families, associations, and companies. We develop trust across the miles and distrust around the corner.

—Tim Berners-Lee, Weaving the Web (Harper)

To Read This Book?

If you have a basic programming background and are interested in insight surrounding the opportunities that arise from mining and analyzing data from the social web, you’ve come to the right place. We’ll begin getting our hands dirty after just a few more pages of frontmatter.
Designing Social Interfaces
by
Christian Crumlish
and
Erin Malone
Published 30 Sep 2009
I have always imagined the information space as something to which everyone has immediate and intuitive access, and not just to browse, but to create. Furthermore, the dream of people-to-people communication through shared knowledge must be possible for groups of all sizes, interacting electronically with as much ease as they do now in person.

—Tim Berners-Lee, Weaving the Web, p. 157, 1999

A Little Social Backstory…

Social design for interactive digital spaces has been around since the earliest bulletin board systems. The most famous was The Well (1985), described by Wired magazine in 1997* as “the world’s most influential online community,” and it predated the World Wide Web and browser interfaces by several years
…
v=MpIOClX1jPE

A Timeless Way of Building, by Christopher Alexander, Oxford University Press, 1979
The Virtual Community: Homesteading on the Electronic Frontier, by Howard Rheingold, The MIT Press, 2000
The Well: A Story of Love, Death and Real Life in the Seminal Online Community, by Katie Hafner, Carroll & Graf Publishers, 2001

Chapter 2: Social to the Core

The Web is more a social creation than a technical one. I designed it for a social effect—to help people work together—and not as a technical toy.

—Tim Berners-Lee, Weaving the Web (1999)

In A Timeless Way of Building, Christopher Alexander explains the purpose of pattern languages in part by saying that they are about imbuing built spaces with “the quality without a name.” There is something, often something ineffable, about some architectural spaces that makes them inviting, warm, humane, comfortable, healthy, and alive.
…
Up popped a text box containing plain text and a small amount of HTML, the code that tells a browser how to display a given page. Inside the box I saw the words that had been on the page. I made a small change, clicked another button that said, “Save this page” and voilà, the page was saved with the changes.... Dave was a leader in a move that brought back to life the promise, too long unmet, that Tim Berners-Lee, inventor of the Web, had intended from the start. Berners-Lee envisioned a read/write Web. But what had emerged in the 1990s was an essentially read-only Web on which you needed an account with an ISP to host your web site, special tools, and/or HTML expertise to create a decent site. What Dave and the other early blog pioneers did was a breakthrough.
Blockchain Basics: A Non-Technical Introduction in 25 Steps
by
Daniel Drescher
Published 16 Mar 2017
That list of properties reads like a short description of the blockchain. However, back in 1994, the blockchain did not exist! Actually, the system described by these points was the Internet, or at least Tim Berners-Lee’s vision of the Internet. As a result of technical progress, the emergence of Internet commerce, and the rise of Internet giants, the Internet nowadays may not have very much in common with Tim Berners-Lee’s vision of the Internet as formulated back in 1994. The fact that technology evolves and as a result diverges from the vision of its inventors should be kept in mind when considering the future of the blockchain.
Digital Disconnect: How Capitalism Is Turning the Internet Against Democracy
by
Robert W. McChesney
Published 5 Mar 2013
As commercial players took an increasing interest in the Internet, there was, as James Curran has documented, a “revolt of the nerds”—led by people like Richard Stallman and Linus Torvalds—which launched the open-software movement in the 1980s.38 Much of the noncommercial institutional presence on the Internet today can be attributed to this movement and its progeny. When Tim Berners-Lee created the World Wide Web in 1990, he said it would have been “unthinkable” to patent it or ask for fees. The point of the Internet was “sharing for the common good.” That was about to change. As the market exploded in the 1990s, patents became the rage. The use of patents to create unnecessary and dangerous monopolies rather than as incentives for research, as Berners-Lee put it, became a “very serious problem.”
…
It controls 69 percent of the code division multiple access (CDMA) chipset market and 77 percent of the wireless chipsets in Android devices. Along with Broadcom, Qualcomm controls half of the key wi-fi chipset markets.20 A related factor that encourages monopoly is the widespread use of patents, to such an extent that Tim Berners-Lee would probably now regard the late 1990s, which he once deplored, as a golden age of openness. The U.S. Patent Office awarded 248,000 patents in 2011, 35 percent more than a decade earlier.21 Patents are similar to copyright: by offering government protection for a temporary monopoly license, they have the necessary function of rewarding and therefore encouraging innovation; like copyright, patents have exploded in prominence in the digital era.22 Bloomberg Businessweek describes the escalation in patents—Microsoft, for example, took out over 2,500 in 2010, compared to just a few hundred in 2002—as a “high-tech arms race.”
…
Postal Policy in the Era of Competition,” International Communication Association Conference, San Francisco, May 2007, 57–65. 36. Nowak, Sex, Bombs and Burgers, 206. 37. Richard Adler, Updating Rules of the Digital Road: Privacy, Security, Intellectual Property (Washington, DC: Aspen Institute, 2012), 4. 38. Curran, “Rethinking Internet History,” 45. 39. Tim Berners-Lee, Weaving the Web (New York: HarperCollins, 1999), 197–98. 40. Project Censored, http://www.projectcensored.org/top-stories/articles/category/top-stories/top-25-of-1996/page/3. 41. This story is brilliantly told in Fred Turner, From Counterculture to Cyberculture (Chicago: University of Chicago Press, 2006). 42.
Orwell Versus the Terrorists: A Digital Short
by
Jamie Bartlett
Published 12 Feb 2015
Unlike the cloistered Arpanet, Usenet and BBS, the forerunners of the chat room and forum, were available to anyone with a modem and a home computer. Although small, slow and primitive by today’s standards, they were attracting thousands of people intrigued by a new virtual world. By the mid-nineties and the emergence of Tim Berners-Lee’s World Wide Web, the internet was fully transformed: from a niche underground haunt frequented by computer hobbyists and academics, to a popular hangout accessed by millions of excited neophytes.fn2 According to John Naughton, Professor of the Public Understanding of Technology at the Open University, cyberspace at this time was more than just a network of computers.
Blindside: How to Anticipate Forcing Events and Wild Cards in Global Politics
by
Francis Fukuyama
Published 27 Aug 2007
It was the small but fervent community of electronics hobbyists who started using the chips in “personal minicomputers”—and kicked off the personal computer revolution.8 Then in the late 1980s, to take another famous example, the Internet had already begun to spread through academia like the proverbial wildfire—but mainly as a mechanism for e-mail, file sharing, and remote login to mini- and supercomputers. It was Tim Berners-Lee, a physicist and Internet user at the European Center for Particle Physics (CERN), who came up with a way of systematically displaying files in a visual form, and using hyperlinks to jump between them—a system he dubbed the World Wide Web.9 And more recently, of course, the web itself has paved the way for eBay, peer-to-peer file sharing, blogging, and a host of other user-driven innovations that no one had anticipated.
…
William Aspray, John von Neumann and the Origins of Modern Computing (MIT Press, 1990); William Aspray, “John von Neumann’s Contributions to Computing and Computer Science,” Annals of the History of Computing 11, no. 3: 189–95 (1989). 8. Paul E. Ceruzzi, A History of Modern Computing (MIT Press, 1998), chap. 7; Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (New York: Basic Books, 1996), chap. 10. 9. Tim Berners-Lee, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web by Its Inventor (San Francisco: Harper, 1999). 10. M. Mitchell Waldrop, The Dream Machine: J. C. Licklider and the Revolution That Made Computing Personal (New York: Viking, 2001), chap. 2. 11. It is not even very hard.
The Snowden Files: The Inside Story of the World's Most Wanted Man
by
Luke Harding
Published 7 Feb 2014
But further publication could help paedophiles and endanger MI5 agents. The editor said the Guardian’s surveillance revelations were dominating the news agenda in the US and had sparked a huge debate. Everyone was concerned, from Al Gore to Glenn Beck; from Mitt Romney to the American Civil Liberties Union. Tim Berners-Lee, the inventor of the world wide web, and Jim Sensenbrenner, the congressman who drew up the Patriot Act, were also supportive. Even President Obama had said he welcomed the debate. ‘We are hoping you will take the same view as Obama. It’s a good debate,’ said Rusbridger. Heywood responded: ‘You have had your debate.
…
There were no questions about GCHQ’s reported role in tapping British traffic between Google’s own data servers. There was nothing on the bugging of Chancellor Merkel’s phone, or spying on friendly world leaders. Nothing either on the reliance on corporate telecoms partners who offered help ‘well beyond’ what they were compelled to do. The previous week Sir Tim Berners-Lee – the man who invented the world wide web – had described the UK–USA’s secret efforts to weaken internet encryption as ‘appalling and foolish’. Nobody asked about this either. It was left to Rusbridger to point out the obvious to his critics. Snowden – luckily – had entrusted his files to journalists.
The End of Absence: Reclaiming What We've Lost in a World of Constant Connection
by
Michael Harris
Published 6 Aug 2014
And just the other day, when I took that book down from its dusty post on my shelf, the same pressed flowers fell out of its pages (after a quarter century of stillness) and dropped onto my bare toes. There was a deep sense memory, then, that returned me to that hushed state of mind on the lost green hill, a state that I have so rarely known since. And to think: That same year, a British computer scientist at CERN called Tim Berners-Lee was writing the code for the World Wide Web. I’m writing these words on the quarter-century anniversary of his invention. That memory of a quieter yesteryear is dearly useful. Awake—or at least partly so—to the tremendous influence of today’s tech-littered landscape, I have the choice to say yes and no to the wondrous utility of these machines, their promise and power.
…
As Elizabeth Eisenstein points out, indexes, tables of contents, and the like were not natural to printed books, but evolved over time. Even alphabetical order (a to z) is of course an invention and a direct result of the printing revolution. (See Eisenstein’s The Printing Press as an Agent of Change, 71.) 17. Tim Berners-Lee indulges in this recall-as-memory fallacy in a coauthored 2012 paper titled “Defend the Web,” where he tells the story of Deacon Patrick Jones, who found succor in “memory aids” after a traumatic brain injury. “His very memory is extended into the Internet,” the authors enthuse; but this is clearly not so. 18.
Working in Public: The Making and Maintenance of Open Source Software
by
Nadia Eghbal
Published 3 Aug 2020
We careened toward the new millennium, flushed with the global triumph of Western liberal democracy. And so, in the late twentieth century, when murmurs of the internet began to hum to an audible roar, they carried with them all the qualities of our harmonious, never-gonna-give-you-up infatuation with the unconstrained spread of knowledge. Tim Berners-Lee, the inventor of the World Wide Web, envisioned “a pool of information . . . which could grow and evolve” in his original proposal to CERN, the European particle physics laboratory that incubated his project.1 “For this to be possible, the method of storage must not place its own restraints on the information,” he wrote.
…
I wanted this book to showcase the real stories of developers, and I greatly benefited from being able to dig through the vast repository of information out there. Thank you also to everyone who replied to my tweets, newsletters, and blog posts throughout the writing process: you gave me a sounding board to test and refine many of the ideas that eventually made their way into this book. INTRODUCTION 1 Tim Berners-Lee, “Information Management: A Proposal,” The Original Proposal of the WWW, HTMLized, May 1990, https://www.w3.org/History/1989/proposal.html. 2 Nadia Eghbal, “Roads and Bridges: The Unseen Labor Behind Our Digital Infrastructure,” Ford Foundation, July 14, 2016, https://www.fordfoundation.org/about/library/reports-and-studies/roads-and-bridges-the-unseen-labor-behind-our-digital-infrastructure. 3 Gustavo Pinto, Igor Steinmacher, and Marco Aurélio Gerosa, “More Common Than You Think: An In-Depth Study of Casual Contributors,” in 2016 IEEE 23rd International Conference on Software Analysis, Evolution, and Reengineering (SANER) (Suita, Japan: IEEE, March 2016): 518–528, https://doi.org/10.1109/saner.2016.68. 4 “Twitter Bootstrap Usage Statistics,” Built With, accessed March 31, 2020, https://trends.builtwith.com/docinfo/Twitter-Bootstrap. 5 “Contributors: Commits,” Bootstrap Insights, GitHub, accessed March 31, 2020, https://github.com/twbs/bootstrap/graphs/contributors. 6 Nadia Eghbal, “User Support System Analysis,” Nayafia Code, GitHub, September 27, 2018, https://github.com/nayafia/user-support/blob/master/top-100-by-issue-volume.csv. 7 Suvodeep Majumder, Joymallya Chakraborty, Amritanshu Agrawal, and Tim Menzies, “Why Software Projects Need Heroes (Lessons Learned from 1000+ Projects),” ArXiv, April 22, 2019, https://arxiv.org/pdf/1904.09954.pdf. 8 Npm, Inc., “This Year in JavaScript: 2018 in Review and Npm’s Predictions for 2019,” Npm (blog), Medium, December 6, 2018, https://medium.com/npm-inc/this-year-in-javascript-2018-in-review-and-npms-predictions-for-2019-3a3d7e5298ef. 9 Steve Weber, The Success of Open Source (Cambridge, MA: Harvard University Press, 2005), Loc 867. 10 “The State of the Octoverse,” GitHub, 2019, https://octoverse.github.com/. 11 DinoInNameOnly, “Most of What You Read on the Internet Is Written by Insane People,” R/slatestarcodex, Reddit, October 27, 2018, https://www.reddit.com/r/slatestarcodex/comments/9rvroo/most_of_what_you_read_on_the_internet_is_written/. 12 CBS News, “Meet the Man behind a Third of What’s on Wikipedia,” CBS News, CBS Interactive, January 26, 2019, https://www.cbsnews.com/news/meet-the-man-behind-a-third-of-whats-on-wikipedia/. 13 Kristen Roupenian, “What It Felt Like When ‘Cat Person’ Went Viral,” The New Yorker, January 9, 2019, https://www.newyorker.com/books/page-turner/what-it-felt-like-when-cat-person-went-viral. 01 14 “The State of the Octoverse,” GitHub, 2019, https://octoverse.github.com/. 15 Free Software Foundation, “What Is Free Software?
Mastering Structured Data on the Semantic Web: From HTML5 Microdata to Linked Open Data
by
Leslie Sikos
Published 10 Jul 2015
By linking to datasets already on the LOD Cloud Diagram (such as pointing to a definition on DBpedia), your dataset will become part of the LOD Cloud. (Figure 3-9: Two RDF graphs sharing the same URI merge.) The Giant Global Graph, the other name for the Web of Data coined by Tim Berners-Lee, is the supergraph of all the automatically merged LOD graphs [8] (see the code sketch at the end of this excerpt). Registering Your Dataset To be considered for inclusion in the LOD Cloud Diagram, your dataset must be registered on http://datahub.io. To be able to register, you need an affiliation with a company or research institute already on Datahub, or, if it is not yet registered, you have to request a new company registration.
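The merge in Figure 3-9 is straightforward to reproduce. Below is a minimal sketch with Python's rdflib library, assuming rdflib's documented graph-union behavior; the resource URI and triples are invented for illustration.

```python
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import FOAF

person = URIRef("http://example.org/people/alice")  # the shared URI

g1 = Graph()
g1.add((person, FOAF.name, Literal("Alice")))

g2 = Graph()
g2.add((person, FOAF.mbox, URIRef("mailto:alice@example.org")))

# Taking the union of the two graphs merges the node identified by the
# shared URI: it now carries the triples contributed by both sources.
merged = g1 + g2
for subject, predicate, obj in merged:
    print(subject, predicate, obj)
```

Because both graphs use the same URI for the resource, the union contains a single node carrying both the name and the mailbox statements; this automatic merging is what makes the Giant Global Graph possible.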
…
The first one, called the Google Knowledge Panel, is displayed on the right-hand side of the Search Engine Result Pages, next to the organic search results. Searching for persons or brand names typically results in a Knowledge Panel, as shown in Figure 8-2. (Figure 8-2: Facts about Tim Berners-Lee shown by the Google Knowledge Panel.) If the object has a social media presence on Facebook, YouTube, Twitter, Instagram, Google+, etc., links to those pages will also be displayed. The most related links are shown under “People also search for,” which can be extended by clicking the “View more” link.
Forge Your Future with Open Source
by
VM (Vicky) Brasseur
Make sure to review the CONTRIBUTING file to verify the correct way to ask questions for that project. Drop the attitude. Even if you’ve contributed to other projects before, even if you’ve been in software development for thirty years, even if you’re Linus Torvalds[92], Larry Wall[93], and Tim Berners-Lee[94] all rolled into one: Get over yourself. It’s entirely possible to phrase a question that is confident and competent yet does not make you sound like an arrogant twit. For instance, when asking about a programming optimization, rather than pointing out that you have a PhD in computer science and have been programming in that language for twelve years (information which is irrelevant to the suggestion at hand), simply state the problem you’ve noticed, how the code could be improved, and ask whether anyone would have a problem with you making the change.
…
Now that you know how to communicate with others in a FOSS community, it’s time to get to know them, and what better way to do that than getting together in person? Footnotes [92] https://en.wikipedia.org/wiki/Linus_Torvalds; The inventor of Linux. [93] https://en.wikipedia.org/wiki/Larry_Wall; The inventor of rn, patch, Perl, and Perl 6. [94] https://en.wikipedia.org/wiki/Tim_Berners-Lee; The inventor of the World Wide Web. [95] https://en.wikipedia.org/wiki/Pastebin [96] https://www.xkcd.com/386/ [97] https://opensource.com/life/16/6/irc Copyright © 2018, The Pragmatic Bookshelf. Chapter 8 It’s About the People By now you’ve noticed a large part of the book is dedicated to methods and tips for interacting with others.
99%: Mass Impoverishment and How We Can End It
by
Mark Thomas
Published 7 Aug 2019
Technology has been advancing quickly Technology has transformed our lives over the last thirty-five years. In 1980, few homes contained a computer – the first mass-market computer was the Commodore PET, launched in 1977. Mobile telephones had not yet reached the market – it was not until 1983 that the first mobile phones went on sale in the US (at almost US$4,000 each). Sir Tim Berners-Lee invented the World Wide Web only thirty years ago. Now, almost every home has at least one computer; almost every adult has a mobile phone; and we all use the Internet and mobile networks for entertainment, for shopping and for social interaction. In the UK, adults now spend more than eight hours a day interacting with technology.2 The world of business has been transformed.
…
All of these elements exist only because the early, high-risk stages of their development were funded by the public sector. The touchscreen, for example, was developed by CERN and perfected by Westerman and Elias originally at the University of Delaware; the Internet was developed at DARPA and the World Wide Web at CERN by Tim Berners-Lee; global positioning satellites (GPS) were another military technology; and SIRI was originally developed by the Stanford Research Institute on behalf of DARPA. Without government-backed R&D, we would not have the iPhone. The world is struggling to meet its commitments to reduce greenhouse gas emissions.
Hacking Politics: How Geeks, Progressives, the Tea Party, Gamers, Anarchists and Suits Teamed Up to Defeat SOPA and Save the Internet
by
David Moon
,
Patrick Ruffini
,
David Segal
,
Aaron Swartz
,
Lawrence Lessig
,
Cory Doctorow
,
Zoe Lofgren
,
Jamie Laurie
,
Ron Paul
,
Mike Masnick
,
Kim Dotcom
,
Tiffiniy Cheng
,
Alexis Ohanian
,
Nicole Powers
and
Josh Levy
Published 30 Apr 2013
The phone company that provides the transmission line isn’t allowed to decide what you say or who you can talk to when you use its network. This de facto “Network Neutrality” forms the basis for the Internet’s historical openness. Sir Tim Berners-Lee could have adopted proprietary technologies to build his vision of a web of interconnected documents. Instead, he opted for openness when inventing the software that became the Web. Sir Tim Berners-Lee, the inventor of the World Wide Web, was one of the key Internet figures to speak out against SOPA/PIPA. Twitter is one of the ways he was able to help spread the word about the threat, including by re-tweeting messages from contributors to Hacking Politics.
…
Former President Bill Clinton, that champion of honesty, has even suggested the creation of an entirely new cabinet department devoted to “fact checking” the Internet! These proposals are done in the name of preventing the spread of factual errors, misinformation, and “conspiracy theories.” Dave Dayen (reporter for Firedoglake) At the exact same time Senate Democrats voted down net neutrality repeal, many of them were scheming to bring so-called anti-piracy legislation to the floor.
Fire in the Valley: The Birth and Death of the Personal Computer
by
Michael Swaine
and
Paul Freiberger
Published 19 Oct 2014
Our primary debt is to the people who lived this story and graciously granted us entry into what is in fact their personal history—through hundreds of hours of interviews and generous access to documents, records, letters, diaries, time lines, telexes, and photographs. Among others, we are grateful to the following individuals: Scott Adams, Todd Agulnick, David Ahl, Alice Ahlgren, Bob Albrecht, Paul Allen, Dennis Allison, Bill Anderson, Bill Baker, Steve Ballmer, Rob Barnaby, John Barry, Allen Baum, John Bell, Tim Berners-Lee, Tim Berry, Ray Borrill, Stewart Brand, Dan Bricklin, Keith Britton, David Bunnell, Nolan Bushnell, Maggie Canon, David Carlick, Douglas Carlston, Mark Chamberlain, Hal Chamberlin, Roger Chapman, Alan Cooper, Sue Cooper, Ben Cooper, John Craig, Andy Cunningham, Eddie Curry, Steve Dompier, John Draper, John Dvorak, Doug Engelbart, Chris Espinosa, Gordon Eubanks, Ed Faber, Federico Faggin, Lee Felsenstein, Bill Fernandez, Todd Fischer, Richard Frank, Bob Frankston, Paul Franson, Nancy Freitas, Don French, Gordon French, Howard Fulmer, Dan Fylstra, Mark Garetz, Harry Garland, Jean-Louis Gassee, Bill Gates, Bill Godbout, John Goodenough, Chuck Grant, Wayne Green, Dick Heiser, Carl Helmers, Kent Hensheid, Andy Hertzfeld, Ted Hoff, Thom Hogan, Rod Holt, Randy Hyde, Peter Jennings, Steve Jobs, Bill Joy, Philippe Kahn, Mitch Kapor, Vinod Khosla, Guy Kawasaki, Gary Kildall, Joe Killian, Dan Kottke, Barbara Krause, Tom Lafleur, Jaron Lanier, Phil Lemons, Phil Levine, Andrea Lewis, Bill Lohse, Mel Loveland, Scott Mace, Regis McKenna, Marla Markman, Mike Markkula, Bob Marsh, Patty McCracken, Dorothy McEwen, Patrick McGovern, Scott McNealy, Roger Melen, Seymour Merrin, Edward Metro, Vanessa Mickan, Jill Miller, Dick Miller, Michael Miller, Fred Moore, Gordon Moore, Lyall Morrill, George Morrow, Jeanne Morrow, Theodor Holm Nelson, Robert Noyce, Tom and Molly O’Neill, Terry Opdendyk, Adam Osborne, Chuck Peddle, Harvard Pennington, Joel Pitt, Fred “Chip” Poode, Frank and Susan Raab, Jeff Raikes, Janet Ramusack, Jef Raskin, Ed Roberts, Roy Robinson, Tom Rolander, Phil Roybal, Seymour Rubinstein, Sue Runfola, Chris Rutkowski, Paul Saffo, Art Salsberg, Wendell Sanders, Ed Sawicki, Joel Schwartz, John Sculley, Jon Shirley, John Shoch, Richard Shoup, Michael Shrayer, Bill Siler, Les Solomon, Deborah Stapleton, Alan Stein, Barney Stone, Don Tarbell, George Tate, Paul Terrell, Larry Tesler, Glenn Theodore, John Torode, Jack Tramiel, Bruce Van Natta, Jim Warren, Larry Weiss, Randy Wigginton, Margaret Wozniak, Steve Wozniak, Larry Yaeger, Greg Yob, and Pierluigi Zappacosta.
…
The Internet finally brought hypertext to life. With DARPA's programmers having developed a method for passing data around the Internet, and the personal-computer revolution having put the means of accessing it into the hands of ordinary people, the pieces of the puzzle were all on the table. Tim Berners-Lee, a researcher at CERN, a high-energy physics laboratory on the French-Swiss border, created the World Wide Web in 1989 by writing the first web server, a program for putting hypertext information online, and the first web browser, a program for accessing that information. The information was displayed in manageable chunks called pages.
…
For so large a company as Microsoft to be able to change directions so quickly was impressive. The biggest fish in the pond and maneuverable, too: Microsoft in the mid-1990s dominated the personal-computer industry and seemed utterly invincible. Until that bright young hacker came along. Creating Cyberspace One of the places where Tim Berners-Lee’s achievement in creating the Web was fully appreciated was at the National Center for Supercomputing Applications (NCSA) at the University of Illinois’s Urbana-Champaign campus. NCSA had a large budget, a lot of hot technology, and a large staff with “frankly, not enough to do,” according to one of the young programmers privileged to work there.
We-Think: Mass Innovation, Not Mass Production
by
Charles Leadbeater
Published 9 Dec 2010
Whether we miss this opportunity to create more equitable, collaborative and participative ways to organise ourselves will be one of the big stories of the next decade. We Think is an attempt to chart how the new Levellers might avoid the crushing defeat inflicted upon the original Levellers. Vital to that will be the scale of our ambition to realise the shared potential the web offers. Web creator Tim Berners-Lee, an English radical following in Winstanley’s footsteps, says: ‘The danger is not that we ask too much of the Internet but too little, that we turn it into just another piece of kit when it could be so much more significant than that, a new platform for how we could organise ourselves, to find knowledge together, to work out what is true and to decide together what we should do about it.’
…
Web 2.0 differs from earlier, more static versions of the web, though, in that it encourages this community to have a conversation. The readers can post replies and talk among themselves. It is not one-way traffic. Arseblog is more than a niche publishing venture; it’s the focal point for a conversation among a community of fans. Tim Berners-Lee probably was not anticipating Arseblog when he wrote the original software for the web at CERN, the European centre for nuclear physics research. Yet in some respects, it is a realisation of his original vision that the web could be a platform for collaboration, not just a new way to publish information.
The People's Platform: Taking Back Power and Culture in the Digital Age
by
Astra Taylor
Published 4 Mar 2014
“If communism vs. capitalism was the struggle of the twentieth century,” law professor and open culture activist Lawrence Lessig writes, “then control vs. freedom will be the debate of the twenty-first century.”19 No doubt, there is much to be said for open systems, as many have shown elsewhere.20 The heart of the Internet is arguably the end-to-end principle (the idea that the network should be kept as flexible, unrestricted, and open to a variety of potential uses as possible). From this principle to the freely shared technical protocols and code that Tim Berners-Lee used to create the World Wide Web, we have open standards to thank for the astonishing growth of the online public sphere and the fact that anyone can participate without seeking permission first.21 Open standards, in general, foster a kind of productive chaos, encouraging innovation and invention, experimentation and engagement.
…
Related arguments about the limitations of the framework of left versus right and state versus market are made by Steven Johnson in Future Perfect: The Case for Progress in a Networked Age (New York: Riverhead Books, 2012) and his op-ed “Peer Power, from Potholes to Patents,” Wall Street Journal, September 21, 2012, as well as by Yochai Benkler in The Penguin and the Leviathan: The Triumph of Cooperation over Self-Interest (New York: Crown Business, 2011). 20. Tim Wu makes a compelling case for open systems in The Master Switch: The Rise and Fall of Information Empires (New York: Knopf, 2010). 21. Tim Berners-Lee, “Long Live the Web: A Call for Continued Open Standards and Neutrality,” Scientific American, November 22, 2012. Though I use the terms interchangeably, as most people do, the World Wide Web is technically an application that runs on the Internet. The Internet was invented before Web pages were.
New Dark Age: Technology and the End of the Future
by
James Bridle
Published 18 Jun 2018
Despite such deep realities, there’s a wonderful thing that happens when you hear someone tell a story that just makes sense: a sense of who they are, and where they came from; the sense that something they did makes sense, has history and progress behind it, that it had to happen this way – and that it had to happen to them, because of the story itself. Tim Berners-Lee, the inventor of the World Wide Web, gave a talk in a tent in Wales in 2010 entitled ‘How the World Wide Web Just Happened’.2 It’s a joyful thing, an exegesis on computation itself as well as a humble hero story. TBL’s parents, Conway Berners-Lee and Mary Lee Woods, were computer scientists; they met and married while working on the Ferranti Mark 1, the first commercially available, general-purpose electronic computer, in Manchester in the 1950s.
…
Direct Effects of Low-to-Moderate CO2 Concentrations on Human Decision-Making Performance’, Environmental Health Perspectives 120:12 (December 2012), 1671–7. 4 Calculation 1. William Gibson, interviewed by David Wallace-Wells, ‘William Gibson, The Art of Fiction No. 211’, Paris Review 197 (Summer 2011). 2. Tim Berners-Lee, ‘How the World Wide Web just happened’, Do Lectures, 2010, thedolectures.com. 3. ‘Cramming more components onto integrated circuits’, Electronics 38:8 (April 19, 1965). 4. ‘Moore’s Law at 40’, Economist, March 23, 2005, economist.com. 5. Chris Anderson, ‘End of Theory’, Wired Magazine, June 23, 2008. 6. Jack W.
Breaking News: The Remaking of Journalism and Why It Matters Now
by
Alan Rusbridger
Published 14 Oct 2018
* It was easy, in 1993, to be only dimly aware of what the internet did. The kids in the basement might have a PC capable of accessing the web, but most of us had only read about it. Writing 15 years later in the Observer,2 the critic John Naughton compared the disruption wrought by the begetter of the world wide web, Sir Tim Berners-Lee, with the seismic disruption five centuries earlier caused by the invention of movable type. Just as Gutenberg had no conception of his invention’s eventual influence on religion, science, systems of ideas and democracy, so – in 2008 – ‘it will be decades before we have any real understanding of what Berners-Lee hath wrought’.
…
Cryptologists, business leaders and privacy experts had been appalled to learn about what the NSA and GCHQ had been up to. President Obama’s own review panel of security experts echoed his concern: ‘the US Government should . . . not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software.’ Did Sir Tim Berners-Lee’s views on the web count as much as the spies’? • The risk to the digital economy. US and UK digital entrepreneurs were gravely concerned at a potential backlash against western tech companies which, it was estimated, could cost them tens of billions of dollars over the next few years. Just before Christmas 2013 the leaders of all the major West Coast tech companies expressed their alarm to President Obama.
…
Jim Sensenbrenner54 wrote that the Act was never intended to hand the US government the kind of powers Snowden revealed it had awarded itself. Silicon Valley chief executives, including Apple’s Tim Cook, visited the Guardian to discuss the issue. There was also a memorable Guardian private dinner in Davos attended by Eric Schmidt from Google; inventor of the world wide web, Tim Berners-Lee; the boss of Vodafone, Vittorio Colao; former Swedish prime minister Carl Bildt; the founder of LinkedIn, Reid Hoffman; and Fadi Chehadé, CEO of the body charged with co-ordinating the internet. If you were a serious technologist or entrepreneur, you got what Snowden was talking about. It was real.
Data Mining the Web: Uncovering Patterns in Web Content, Structure, and Usage
by
Zdravko Markov
and
Daniel T. Larose
Published 5 Apr 2007
CHAPTER 1 INFORMATION RETRIEVAL AND WEB SEARCH: WEB CHALLENGES As originally proposed by Tim Berners-Lee [1], the Web was intended to improve the management of general information about accelerators and experiments at CERN. His suggestion was to organize the information used at that institution in a graph-like structure where the nodes are documents describing objects, such as notes, articles, departments, or persons, and the links are relations among them, such as “depends on,” “is part of,” “refers to,” or “uses.”
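Read as a data structure, this proposal describes a directed graph with typed links. Here is a minimal Python sketch of that organization; the node names and relations below are invented for illustration, not drawn from the proposal itself.

```python
# Nodes are documents describing objects; links are typed relations among them.
links = [
    ("note:accelerator-upgrade", "refers to",  "article:beam-physics"),
    ("article:beam-physics",     "is part of", "dept:online-computing"),
    ("dept:online-computing",    "uses",       "note:accelerator-upgrade"),
    ("person:researcher",        "depends on", "dept:online-computing"),
]

def follow(node, relation=None):
    """Return the nodes linked from `node`, optionally filtered by relation type."""
    return [dst for src, rel, dst in links
            if src == node and relation in (None, rel)]

print(follow("article:beam-physics"))           # ['dept:online-computing']
print(follow("dept:online-computing", "uses"))  # ['note:accelerator-upgrade']
```

Keeping a relation type on every link, rather than using untyped hyperlinks, is what lets such a graph carry statements like "note refers to article", the same idea that the Semantic Web's subject-predicate-object triples later formalize.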
…
Then the estimated value of the resemblance between documents d1 and d2 is re(d1, d2) = |L(d1) ∩ L(d2)| / |L(d1) ∪ L(d2)|, where L(d) is a smaller set of shingles called a sketch of document d. By choosing a proper value for p, the storage for L(d), and consequently the storage needed for precomputing resemblance for pairs of documents, can be reduced. Of course, this comes at the expense of less accurate estimation of resemblance (see the code sketch after the references). REFERENCES 1. Tim Berners-Lee, Information Management: A Proposal, CERN, Geneva, Switzerland, 1989–1990, http://www.w3.org/History/1989/proposal.html. 2. Tim Mayer, Our blog is growing up—and so has our index, Yahoo! Search Blog, Aug. 2005, http://www.ysearchblog.com/archives/000172.html. 3. C. Buckley, Implementation of the SMART information retrieval system, Technical Report 85-686, Cornell University, Ithaca, NY, 1985. 4.
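A runnable Python sketch of this estimator, under assumptions the excerpt leaves open: word-level shingles, SHA-1 as the hash function, and a mod-p rule for selecting the sketch L(d); the two example documents are invented.

```python
import hashlib

def shingles(text, w=4):
    # All contiguous w-word windows of the document.
    words = text.split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def sketch(shingle_set, p=4):
    # The sketch L(d): keep only shingles whose hash is 0 mod p.
    # Larger p means a smaller sketch (less storage) but a noisier estimate.
    def h(s):
        return int(hashlib.sha1(s.encode("utf-8")).hexdigest(), 16)
    return {s for s in shingle_set if h(s) % p == 0}

def resemblance(a, b):
    # Jaccard resemblance |a ∩ b| / |a ∪ b| of two shingle (or sketch) sets.
    return len(a & b) / len(a | b) if (a or b) else 0.0

d1 = "the web was intended to improve the management of general information about experiments"
d2 = "the web was intended to improve the management of shared information about experiments"

exact = resemblance(shingles(d1), shingles(d2))
estimate = resemblance(sketch(shingles(d1), p=2), sketch(shingles(d2), p=2))
print(f"exact resemblance: {exact:.2f}, sketch estimate: {estimate:.2f}")
```

The estimate approaches the exact resemblance as p shrinks (bigger sketches) and drifts from it as p grows, which is exactly the storage-versus-accuracy trade-off described above.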
Writing on the Wall: Social Media - the First 2,000 Years
by
Tom Standage
Published 14 Oct 2013
But access to the growing volume of information available online, along with the ability to commune with like-minded people, proved sufficiently compelling that some people, at least, were prepared to put up with the complexity of getting it all to work. And then, in 1993, it all suddenly got much easier, with the emergence into the mainstream of the World Wide Web. WEAVING THE WEB When Tim Berners-Lee, a British scientist, wrote a program called WorldWideWeb in 1990, his aim was merely to make it easier for his fellow physicists to communicate with each other. Berners-Lee was working at CERN, the European particle-physics laboratory near Geneva, Switzerland, where a wide variety of computer systems were in use.
…
But it did not seem quite that straightforward when I started, and I am very grateful to all those who helped me develop my thesis, provided leads and suggestions, or shared their opinions in interviews. In particular, I would like to thank Craig Newmark, An Xiao Mina, Jay Rosen, Henry Jenkins, Vint Cerf, Tim Berners-Lee, Wael Ghonim, Matt Locke, and Andrew Lintott. John Micklethwait, Emma Duncan, Ann Wroe, Oliver Morton, Rob Gifford, and Gady Epstein, all colleagues at the Economist, provided help of various kinds along the way. I am also grateful to George Gibson, Jackie Johnson, and Katinka Matson for their continued support and encouragement throughout the writing process.
The Entrepreneurial State: Debunking Public vs. Private Sector Myths
by
Mariana Mazzucato
Published 1 Jan 2011
From the 1970s through the 1990s, DARPA funded the necessary communication protocol (TCP/IP), operating system (UNIX) and email programs needed for the communication system, while the National Science Foundation (NSF) initiated the development of the first high-speed digital networks in the US (Kenney 2003). Meanwhile, in the late 1980s, British scientist Tim Berners-Lee was developing the Hypertext Markup Language (HTML), uniform resource locators (URLs) and the Hypertext Transfer Protocol (HTTP) (Wright 1997). Berners-Lee, with the help of another computer scientist named Robert Cailliau, implemented the first successful HTTP communication between client and server on the computers installed at CERN.
…
Wood, R. 2012. ‘Fallen Solyndra Won Bankruptcy Battle but Faces Tax War’. Forbes, 11 June. Available online at http://www.forbes.com/sites/robertwood/2012/11/06/fallen-solyndra-won-bankruptcy-battle-but-faces-tax-war/ (accessed 29 January 2013). Wright, R. 1997. ‘The Man Who Invented the Web: Tim Berners-Lee Started a Revolution, But It Didn’t Go Exactly as Planned’. TIME 149, no. 20 (19 May). Yasuda, T. 2005. ‘Firm Growth, Size, Age and Behavior in Japanese Manufacturing’. Small Business Economics 24, no. 1: 1–15. Zenghelis, D. 2011. ‘A Macroeconomic Plan for a Green Recovery’. Centre for Climate Change Economics and Policy, Grantham Research Institute on Climate Change and the Environment policy paper, January.
Broad Band: The Untold Story of the Women Who Made the Internet
by
Claire L. Evans
Published 6 Mar 2018
Wendy Hall came from England to demonstrate the latest build of Microcosm. The conference floor, a hotel reception area lined with rows of tables, was clustered with representatives from dozens of hypertext projects with names like AnswerBook and LinkWorks. Several tables down from Wendy Hall sat another British computer scientist, Tim Berners-Lee. He’d had his conference paper rejected, but he’d come to San Antonio anyway, to show off a new system to the hypertext crowd. He’d brought Robert Cailliau, a colleague from CERN, the European Organization for Nuclear Research. The pair was demonstrating a distributed hypertext system Berners-Lee had built to make sharing data on networked computers across their massive Swiss campus a little easier.
…
“People used to say,” she remembers, and laughs, “‘I think what you’re doing is wonderful, but this Web thing is free, so we’re gonna try that first.’” Fortunately, Wendy never abandoned university life. Running an expanding department at Southampton, she remained in contact with the growing Web development community, and after working closely with Tim Berners-Lee to develop the Microcosm Web viewer and the Distributed Link Service, she became a sustained presence on the early Web scene. In 1994, she helped to organize the first Web conference but still wasn’t confident that the Web was the end-all solution. In a 1997 lecture at Southampton, she minced no words.
Likewar: The Weaponization of Social Media
by
Peter Warren Singer
and
Emerson T. Brooking
Published 15 Mar 2018
“It was the first, and being first, was best, / But now we lay it down to ever rest. / . . . / Of faithful service, duty done, I weep. / Lay down thy packet, now, o friend, and sleep.” While the internet and the military were ostensibly dividing, other worlds were on the brink of colliding. Back in 1980, the British physicist Tim Berners-Lee had developed a prototype of something called “hypertext.” This was a long-theorized system of “hyperlinks” that could bind digital information together in unprecedented ways. Called ENQUIRE, the system was a massive database where items were indexed based on their relationships to each other.
…
For all the internet’s creative chaos, it has come to be ruled by a handful of digital kings. The outcome is an internet that is simultaneously familiar but unrecognizable to its founders, with deep ramifications for not just the web’s future, but for the future of politics and war as well. As Tim Berners-Lee has written, “The web that many connected to years ago is not what new users will find today. What was once a rich selection of blogs and websites has been compressed under the powerful weight of a few dominant platforms. This concentration of power creates a new set of gatekeepers, allowing a handful of platforms to control which ideas and opinions are seen and shared . . .
…
,” Business Insider, February 19, 2014, http://www.businessinsider.com/facebook-is-buying-whatsapp-2014-2. 50 Thailand and the Philippines: Adam Minter, “Emerging Markets Can’t Quit Facebook,” Bloomberg, April 19, 2018, https://www.bloomberg.com/view/articles/2018-04-19/emerging-markets-can-t-quit-facebook. 50 “The web that many”: Tim Berners-Lee, “The Web Is Under Threat. Join Us and Fight for It,” Web Foundation, March 12, 2018, https://webfoundation.org/2018/03/web-birthday-29/. 51 nearly a billion users: Emma Lee, “WeChat Nears 1 Billion Users,” TechNode, August 17, 2017, https://technode.com/2017/08/17/wechat-nears-1-billion-users/. 51 On WeChat: Jonah M.
Globish: How the English Language Became the World's Language
by
Robert McCrum
Published 24 May 2010
By creating institutions such as the British Museum and Kew Gardens (both founded in 1759), Britain planted the ambition for a public realm in which intellectual and cultural life would be open and accessible to all. This principle would eventually give birth to the BBC and the Open University, and would possibly inspire the World Wide Web, pioneered by Tim Berners-Lee. The significance of 1759 is perhaps reinforced, for the superstitious, by the reappearance of Halley’s comet blazing across northern skies in March 1759. To the English, this celestial fire had once been a portent of catastrophe; on this occasion, a year of battles across the known world saw the wheel of historical fortune turn in Britain’s favour.
…
In the 1990s a deep, ancient enmity became anaesthetised by Globish consumerism – Gap jeans, Benetton fashion and Dell computers. Above all, it was the personal computer, like the printing press of the 1460s and 1470s, that would create the new environment in which Globish, the dialect of Generation Y, could flourish. Tim Berners-Lee, who invented the World Wide Web, called it ‘an abstract (imaginary) space of information’. On the Net, he said, you find computers. ‘On the Web, you find information. The Web could not be without the Net.’ In 1995 the potential of the Web was transformed by the launch of Netscape, the first commercial browser.
Where Wizards Stay Up Late: The Origins of the Internet
by
Katie Hafner
and
Matthew Lyon
Published 1 Jan 1996
More people by the day were logging on to conduct business or find entertainment on the Net. Analysts pronounced the Internet the next great marketing opportunity. The takeoff was just beginning. In 1990, the World Wide Web, a multimedia branch of the Internet, had been created by researchers at CERN, the European Laboratory for Particle Physics near Geneva. Using Tim Berners-Lee’s HTTP protocol, computer scientists around the world began making the Internet easier to navigate with point-and-click programs. These browsers were modeled after Berners-Lee’s original, and usually based on the CERN code library. One browser in particular, called Mosaic, created in 1993 by a couple of students at the University of Illinois, would help popularize the Web and therefore the Net as no software tool had yet done.
…
The Internet’s roots most certainly lay with the ARPANET. The group around the telephone grew uncomfortable. “How about women?” asked the reporter, perhaps to break the silence. “Are there any female pioneers?” More silence. The weekend was as noteworthy for who wasn’t present as for who was. Tim Berners-Lee, the inventor of the World Wide Web, had just moved to Boston from Geneva to join MIT’s Laboratory for Computer Science. He wasn’t invited, nor was Marc Andreessen, the co-programmer of Mosaic, who had just left Illinois to develop a commercial version of his Web browser. Granted, they hadn’t played roles in the birth of either the ARPANET or the Internet (Andreessen wasn’t even born until 1971, after the first ARPANET nodes were installed) and couldn’t technically be counted as founders.
Beyond: Our Future in Space
by
Chris Impey
Published 12 Apr 2015
The US Congress passed a law that allowed the NSF to support access to networks that weren’t used exclusively for research and education. This created angst as researchers worried that the new Internet might not be responsive to their needs. The online world had always been a geeky place of text and equations, but in 1989 CERN researcher Tim Berners-Lee released his hypertext concept for public use. In 1993, a team led by Marc Andreessen at the University of Illinois increased the visual appeal of the Internet by releasing the first web browser, called Mosaic. Encryption was added soon afterward to make transactions more secure. In 1995, the NSF dropped all restrictions on Internet commerce and let private companies take over the high-speed “backbone.”
…
He has done performance art converting colors into music, and his art focuses on the relationship between color and sound; he’s also had experimental theater and dance performances. Using his eyeborg, he has created live sonic “portraits” of celebrities, including Leonardo DiCaprio, Al Gore, Tim Berners-Lee, James Cameron, Woody Allen, and Prince Charles. A 2013 Huffington Post article, “Hacking Our Senses,” features his 2012 TED Global Talk, “I Listen to Color,” and quotes him saying, “I don’t feel that I’m using technology, I don’t feel that I’m wearing technology, I feel that I am technology.”
Reinventing the Bazaar: A Natural History of Markets
by
John McMillan
Published 1 Jan 2002
As the information economy has grown, so have debates about whether information should be owned. To some, intellectual property is a relic of an earlier age. They see a struggle for cyberspace, as copyrights and patents fence off what should be free. A headline in the satirical magazine The Onion put their worries in a nutshell: “Microsoft Patents Ones, Zeroes.” Tim Berners-Lee, the inventor of the world wide web (which he did not patent), has called for software developers to fight the patent system. He and other computer engineers believe software should be nonproprietary, so people could give it to each other and build on each other’s work, as indeed happened in the early days of computers.
…
They share the same chief characteristic: a decentralized structure and a resulting freewheeling nature. The “strongest feature” of the internet, according to John Quarterman, the manager of an internet firm, is that “no single entity is in control, and its pieces run themselves, cooperating to form the network of networks that is the internet.” The world wide web, according to its inventor, Tim Berners-Lee, took off “by the grassroots effort of thousands.”13 The internet has no central planning bureau, no equivalent of Gosplan, the Soviet Union’s planning agency. Instead of a strong central authority, many people control small parts of it. The multiple networks that combine to form the internet are operated by thousands of service providers and hundreds of telecommunication companies.
The Corruption of Capitalism: Why Rentiers Thrive and Work Does Not Pay
by
Guy Standing
Published 13 Jul 2016
James Watt’s patent to protect his invention of the steam engine prevented further development of the technology until after his patent expired.9 Had César Milstein applied for a patent for his creation of monoclonal antibodies, many advances in cancer treatment would have been delayed.10 The decision of Tim Berners-Lee and CERN (the European Organization for Nuclear Research), where he worked, not to patent his 1989 invention of the World Wide Web paved the way for an explosion in information and communication technologies. This could not have happened had use of the invention been restricted. While growth in patent-intensive industries has outstripped growth in sectors with few patents or other intellectual property, there is no consensus on whether patent rights result in more innovation.
…
Transport for London now plans to reinstate one of the arches as well as the murals. The other three arches will be reconstructed and go on public display at Edinburgh University. But the episode showed the vigilance required to save the cultural commons. INTELLECTUAL COMMONS Ideas and information should be part of the intellectual commons, available to all. Tim Berners-Lee, inventor of the World Wide Web, wanted the web to be part of that. But, although it is nominally free of charge, it has become a powerful means of exploitation and commodification, notably through the spread of intellectual property rights. Turning ideas into intellectual property rights results in contrived scarcity by limiting access to and use of information.
The Myth of Capitalism: Monopolies and the Death of Competition
by
Jonathan Tepper
Published 20 Nov 2018
Australia's AdNews discovered that Facebook claims to reach 1.7 million more 16- to 39-year-olds in Australia than exist in the country, according to its census bureau. There is a similar situation in the United States, where Facebook claims a potential reach of 41 million 18- to 24-year-olds, 60 million 25- to 34-year-olds, and 61 million 35- to 49-year-olds. All of these numbers exceed US Census figures.44 It is not just news that is dying. Tim Berners-Lee, the creator of the web, thinks the internet itself is dying.45 In 2014 the web took a very dark turn. Beforehand, traffic to websites came from all sorts of places, and the web was a lively ecosystem. But starting in 2014, over half of all traffic started coming from Facebook and Google. Today, over 70% of traffic is dominated by the two sources.46 For websites like the comedy hub Funny or Die, Facebook ended up capturing all the economics of their content.
…
Chapter 5: Silicon Valley Throws Some Shade 1. https://www.theregister.co.uk/2010/12/01/google_eu_investigation_comment/. 2. http://europa.eu/rapid/press-release_IP-17-1784_en.htm. 3. https://www.wired.com/story/yelp-claims-google-broke-promise-to-antitrust-regulators/. 4. https://www.nytimes.com/2018/02/20/magazine/the-case-against-google.html. 5. http://theweek.com/articles/693488/google-monopoly--crushing-internet. 6. https://theoutline.com/post/1399/how-google-ate-celebritynetworth-com. 7. https://www.idc.com/promo/smartphone-market-share/os. 8. https://www.netmarketshare.com/browser-market-share.aspx. 9. https://www.wsj.com/articles/how-google-swayed-efforts-to-block-annoying-online-ads-1518623663. 10. https://www.rollingstone.com/politics/features/taibbi-facebook-can-we-be-saved-social-media-giant-w518655. 11. https://www.forbes.com/sites/stevendennis/2017/06/19/should-we-care-whether-amazon-is-systematically-destroying-retail/#62085ff66b1f. 12. https://www.axios.com/regulators-ftc-facebook-google-doj-advertising-5ea0f001-eca8-4f07-b7d0-6ed22782800f.html. 13. https://lpeblog.org/2017/12/06/from-territorial-to-functional-sovereignty-the-case-of-amazon/. 14. http://www.nationalreview.com/article/450476/silicon-valleys-anti-conservative-bias-solution-treat-major-tech-companies-utilities. 15. https://www.wired.com/story/heres-what-facebook-wont-let-you-post/. 16. https://www.nytimes.com/2016/11/22/technology/facebook-censorship-tool-china.html. 17. https://www.washingtonpost.com/news/volokh-conspiracy/wp/2014/12/21/facebook-should-stop-cooperating-with-russian-government-censorship/. 18. http://iasc-culture.org/THR/THR_article_2017_Fall_Pasquale.php. 19. https://www.bloomberg.com/news/articles/2018-01-02/google-s-dutch-sandwich-shielded-16-billion-euros-from-tax. 20. https://moderndiplomacy.eu/2018/05/17/the-google-tax/. 21. https://www.theatlantic.com/business/archive/2016/04/corporate-tax-avoidance/478293/. 22. https://www.theguardian.com/commentisfree/2017/nov/08/tax-havens-dodging-theft-multinationals-avoiding-tax. 23. https://cyber.harvard.edu/interactive/events/conferences/2008/09/msvdoj/smith. 24. https://www.nytimes.com/2018/02/20/magazine/the-case-against-google.html. 25. https://www.theringer.com/tech/2018/5/18/17362452/microsoft-antitrust-lawsuit-netscape-internet-explorer-20-years. 26. https://www.wsj.com/articles/inside-the-u-s-antitrust-probe-of-google-1426793274. 27. https://theintercept.com/2016/04/22/googles-remarkably-close-relationship-with-the-obama-white-house-in-two-charts/. 28. https://www.recode.net/2018/1/23/16919424/apple-amazon-facebook-google-uber-trump-white-house-lobbying-immigration-russia. 29. http://www.stateofdigital.com/eric-schmidt-at-google-hearings-close-to-monopoly-but-weve-not-cooked-anything/. 30. https://laweconcenter.org/wp-content/uploads/2018/05/manne-the_real_reaon_foundem_foundered_2018-05-02-1.pdf. 31. https://www.salon.com/2015/11/24/googles_insidious_shadow_lobbying_how_the_internet_giant_is_bankrolling_friendly_academics_and_skirting_federal_investigations/. 32. https://www.nytimes.com/2017/08/30/us/politics/eric-schmidt-google-new-america.html. 33. https://qz.com/1206184/bill-gates-warns-silicon-valley-not-to-be-the-new-microsoft/. 34. https://www.npr.org/sections/thetwo-way/2017/06/27/534524024/google-hit-with-2-7-billion-fine-by-european-antitrust-monitor. 35. http://ec.europa.eu/competition/antitrust/cases/dec_docs/39740/39740_14996_3.pdf. 36. https://www.newyorker.com/magazine/2017/08/28/who-owns-the-internet. 37. 
https://www.cjr.org/special_report/facebook-media-buzzfeed.php. 38. https://www.theguardian.com/technology/2017/oct/23/facebook-non-promoted-posts-news-feed-new-trial-publishers. 39. https://www.socialmediatoday.com/social-networks/complete-list-facebooks-misreported-metrics-and-what-they-mean. 40. https://www.socialmediatoday.com/social-networks/complete-list-facebooks-misreported-metrics-and-what-they-mean. 41. https://nypost.com/2016/11/03/facebook-sued-over-its-fraudulent-ad-metrics/. 42. https://www.broadcastingcable.com/news/facebook-s-video-move-may-aid-nielsen-comscore-168497. 43. http://adcontrarian.blogspot.com/2013/06/the-75-billion-ad-swindle.html. 44. https://www.theguardian.com/technology/2017/sep/07/facebook-claims-it-can-reach-more-people-than-actually-exist-in-uk-us-and-other-countries. 45. https://www.theguardian.com/technology/2017/mar/11/tim-berners-lee-web-inventor-save-internet. 46. https://staltz.com/the-web-began-dying-in-2014-heres-how.html. 47. http://www.vulture.com/2018/02/how-facebook-is-killing-comedy.html. 48. https://medium.com/humane-tech/tech-and-the-fake-market-tactic-8bd386e3d382. 49. https://staltz.com/the-web-began-dying-in-2014-heres-how.html. 50. https://www.rollingstone.com/politics/features/taibbi-facebook-can-we-be-saved-social-media-giant-w518655. 51. https://www.theatlantic.com/technology/archive/2018/04/amazon-may-have-a-counterfeit-problem/558482/. 52. https://www.theatlantic.com/technology/archive/2018/04/amazon-may-have-a-counterfeit-problem/558482/. 53. https://www.forbes.com/sites/stevendennis/2017/06/19/should-we-care-whether-amazon-is-systematically-destroying-retail/#62085ff66b1f. 54. https://www.yalelawjournal.org/note/amazons-antitrust-paradox. 55. https://www.theguardian.com/technology/2015/jun/23/amazon-marketplace-third-party-seller-faustian-pact. 56. https://www.forbes.com/sites/retailwire/2014/10/30/is-amazon-undercutting-third-party-sellers-using-their-own-data/#700a08a953d8. 57. https://www.propublica.org/article/amazon-says-it-puts-customers-first-but-its-pricing-algorithm-doesnt 58. https://rainforests.mongabay.com/0202.htm. 59. https://www.rand.org/pubs/research_briefs/RB77/index1.html. 60. https://www.ncbi.nlm.nih.gov/books/NBK236347/. 61. https://www.vox.com/new-money/2017/7/11/15929014/end-of-the-internet-startup. 62.
Bezonomics: How Amazon Is Changing Our Lives and What the World's Best Companies Are Learning From It
by
Brian Dumaine
Published 11 May 2020
In 1961, a California start-up named Fairchild Semiconductor started selling the first microchip, an invention that allowed the miniaturization of electronics and led to corporations using computers to expand globally at a level never before seen. That breakthrough eventually put armies of accountants, middle managers, and telephone operators out of work. Tim Berners-Lee, a computer scientist at CERN, a Swiss research organization, created the HTTP Internet standard in 1989, which facilitated communication on the web between servers and clients. In the following years, more and more companies adopted the web as a business model. That gave us laptops, smartphones, search engines, online shopping, and social media.
…
When Henry Ford proved: “Celebrating the Moving Assembly Line in Pictures,” Ford Media Center, September 12, 2013, https://media.ford.com/content/fordmedia/fna/us/en/features/celebrating-the-moving-assembly-line-in-pictures.html. In 1961, a California start-up: David Laws, “Fairchild Semiconductor: The 60th Anniversary of a Silicon Valley Legend,” Computer History Museum, September 19, 2017. Tim Berners-Lee, a computer scientist: “World Wide Web,” Encyclopaedia Britannica, https://www.britannica.com/topic/World-Wide-Web. The consulting firm McKinsey: James Manyika et al., “Jobs Lost, Jobs Gained: What the Future of Work Will Mean for Jobs, Skills and Wages,” McKinsey Global Institute, November 2017.
The Truth Machine: The Blockchain and the Future of Everything
by
Paul Vigna
and
Michael J. Casey
Published 27 Feb 2018
There’s Solid, which stands for Social Linked Data, a new protocol for data storage that puts data back in the hands of the people to whom it belongs. The core idea is that we will store our data in Pods (Personal Online Data Stores) and distribute it to applications via permissions we control. Solid is the brainchild of none other than Tim Berners-Lee, the computer scientist who created HTTP and gave us the World Wide Web. Another one that gets a lot of people excited is the Interplanetary File System, designed by Juan Benet. The principle behind it is similar to that of the popular file-sharing system BitTorrent, which unlike Napster has defied music- and movie-studio efforts to have it shut down on piracy grounds.
…
And in places like Estonia, a country that has turned itself into a veritable living lab for civic tech, the government is warming to the idea of blockchains as a more reliable notarization service, ensuring that trusted documents can much more easily be submitted for applications for services. All manner of government records could soon be transferred into this immutable environment. And the more that access to that data can be put under the control of citizens themselves, rather than locked in the siloed departments that Tim Berners-Lee complains about, the closer we’ll come to the great information-processing power of a longed-for open-data age. Yet, despite these strides, the readiness of the regulatory machinery to address the changes that are coming is woeful. One problem is that before we get lawmakers and regulators to understand blockchains, we need them to focus on everything else that’s going on in the digital transformation of our age: the other paradigm shifts that AI, virtual reality, 3D printing, the Internet of Things, and network analytics are bringing to the economy.
New Power: How Power Works in Our Hyperconnected World--And How to Make It Work for You
by
Jeremy Heimans
and
Henry Timms
Published 2 Apr 2018
Participants and super-participants in the platform, and the wider public, could have legitimate debates about whether, how, and how much a platform should “tip the scales” in this way. NEW POWER, NEW PLATFORMS, NO PLATFORMS? A decade ago, none other than the father of the World Wide Web, Tim Berners-Lee, saw the dangers of participation farms on the horizon. In 2008, almost twenty years after laying out his original vision, he called for the building of “decentralized social networks” that would reclaim his beloved web from increasingly centralizing sites like Facebook. He saw a big prize in a more fluid and pluralistic world of platforms in which “online social networking will be more immune to censorship, monopoly, regulation, and other exercise of central authority.”
…
“A community-owned Twitter”: “Notice of Annual Meeting of Stockholders,” May 22, 2017. www.twitter.com. “Think more artist respect”: Stocksy United, “Raising the Bar—and the Industry’s Expectations—of Stock Photography and Cinematography,” July 2017. www.stocksy.com. “online social networking will be more immune”: Ching-man Au Yeung, Ilaria Liccardi, Kanghao Lu, Oshani Seneviratne, and Tim Berners-Lee, “Decentralization: The Future of Online Social Networking,” in W3C Workshop on the Future of Social Networking Position Papers, 2009. Berners-Lee’s Solid project: “What Is Solid?” July 2017. https://solid.mit.edu. You might know the Blockchain: BlockGeeks, “What Is Blockchain Technology?
We Are Data: Algorithms and the Making of Our Digital Selves
by
John Cheney-Lippold
Published 1 May 2017
But beyond the fundamentals of capitalist exploitation, the baseline premise of data being ownable has a provocative ring to it. If we could mobilize and change Facebook’s terms of service to allow us true ownership over our Facebook profiles and data, then we could wrestle some control back. In this way, ownership is interpreted as a liberatory, privacy-enhancing move. Tim Berners-Lee, inventor of the World Wide Web and leading voice against the commercialization of the Internet, believes our data belongs to you or me or whoever births it.11 In a way, our datafied lives are mere by-products of our mundane entrepreneurial spirits, a tradable commodity whose value and use we, and we alone, decide.
…
Margaret Jean Radin, “Regulation by Contract, Regulation by Machine,” Journal of Institutional and Theoretical Economics 160, no. 1 (2014): 142–156; Hector Postigo, “Emerging Sources of Labor on the Internet: The Case of America Online Volunteers,” International Review of Social History 48 (2003): 205–223. 9. Ethan Zuckerman, “The Internet’s Original Sin,” Atlantic, August 14, 2014, www.theatlantic.com. 10. Christian Fuchs, “The Political Economy of Privacy on Facebook,” Television & New Media 13, no. 2 (2012): 139–159. 11. Alex Hern, “Sir Tim Berners-Lee Speaks Out on Data Ownership,” Guardian, October 8, 2014, www.theguardian.com. 12. Jaron Lanier, Who Owns the Future? (New York: Simon and Schuster, 2014). 13. Tiziana Terranova, Network Culture: Politics for the Information Age (London: Pluto, 2004); Trebor Scholz, ed., Digital Labor: The Internet as Playground and Factory (New York: Routledge, 2012). 14.
AngularJS Essentials
by
Rodrigo Branas
Published 20 Aug 2014
Getting Started with AngularJS HyperText Markup Language (HTML) was created in 1990 by Tim Berners-Lee—a famous physicist and computer scientist—while he was working at CERN, the European Organization for Nuclear Research. He was motivated by the need for a better way to share information among the researchers of the institution. To support that, he also created the HyperText Transfer Protocol (HTTP) and its first server, giving rise to the World Wide Web (WWW).
Roads and Bridges
by
Nadia Eghbal
Originally a working group within the United States government in 1986, the IETF became an independent, international organization in 1993. [45] The IETF itself is run by volunteers, and there are no membership requirements: anyone from the public may join simply by declaring him- or herself a member. The World Wide Web Consortium (W3C) helps set standards for the World Wide Web. It was founded by Tim Berners-Lee in 1994. The W3C tends to focus more narrowly on web pages and documents (they are, for example, the reason why web pages use HTML for basic formatting). They maintain the standards around the markup language HTML and the stylesheet language CSS, two basic components of any web page.
Free Ride
by
Robert Levine
Published 25 Oct 2011
(The most successful PC game of recent years, World of Warcraft, is a closed system of its own; it charges a subscription fee.) And it’s why apps sell much better for Apple’s iPhone platform than for Google’s Android operating system.10 The online world needs to support both. To some in the technology business, this sort of thinking is nothing short of sacrilege. In the December 2010 issue of Scientific American, Tim Berners-Lee, the computer scientist credited with developing the World Wide Web, wrote “a call for continued open standards.”11 He suggested that Facebook shouldn’t keep its data behind walls and said that “the tendency for magazines, for example, to produce smartphone ‘apps’ rather than Web apps is disturbing, because that material is off the Web.
…
In the U.K., the Guardian used a similar strategy to sift through 458,832 pages of documents about politicians’ expenses. “Investigate Your MP’s Expenses,” Guardian, June 2009. 10. Gartner estimates that nine out of ten apps sold in 2010 were for Apple. Forecast: Mobile Application Stores, Worldwide, 2008–2014 (Gartner, January 26, 2011). 11. Tim Berners-Lee, “Long Live the Web: A Call for Continued Open Standards and Neutrality,” Scientific American, December 2010. 12. David Goldman, “Android and Qualcomm Are the New Wintel,” CNNMoney.com, November 12, 2010. The article references a study by the consultancy PRTM—which coined the term “Quadroid”—that says three-quarters of new Android phones use Qualcomm chips.
The Industries of the Future
by
Alec Ross
Published 2 Feb 2016
In my view, the best case for Bitcoin is not as a currency but as a protocol, relying on the new possibilities offered by the blockchain. In the same way that HTML became the protocol markup language for the World Wide Web, the blockchain may have the technological ingenuity to become the protocol for trusted transactions. The Web was essentially made by HTML. The great innovation of Tim Berners-Lee, the Web’s creator, was that he made the Internet something visible, accessible, and easily navigable—and that allowed other innovations to be layered on top of the platform. The blockchain makes trusted transactions the basis—the protocol—on which much else can be built. The blockchain could provide a much lower-cost solution for transactions that require a third-party intermediary as a guarantor, such as legal documents, brokerage fees, and ticket purchases.
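The structure behind that claim is small enough to write down. Here is a stripped-down sketch, ignoring the consensus, signatures, and networking a real blockchain needs: each block commits to the hash of its predecessor, so rewriting any past transaction breaks every link that follows it.

    import hashlib
    import json

    def block_hash(block: dict) -> str:
        # Hash a canonical serialization of the block's contents.
        data = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(data).hexdigest()

    def add_block(chain: list, transactions: list) -> None:
        # Each new block names its predecessor's hash: that is the chain.
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"prev_hash": prev, "transactions": transactions})

    chain: list = []
    add_block(chain, ["Alice pays Bob 5"])
    add_block(chain, ["Bob pays Carol 2"])

    # Rewrite history and the next block's stored hash no longer matches,
    # so tampering is evident to anyone who checks the links.
    chain[0]["transactions"] = ["Alice pays Mallory 500"]
    assert chain[1]["prev_hash"] != block_hash(chain[0])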
…
As Andreessen further describes: Brian Fung, “Marc Andreessen: In 20 Years, We’ll Talk about Bitcoin Like We Talk about the Internet Today,” Washington Post, May 21, 2014, http://www.washingtonpost.com/blogs/the-switch/wp/2014/05/21/marc-andreessen-in-20-years-well-talk-about-bitcoin-like-we-talk-about-the-internet-today/. The great innovation of Tim Berners-Lee: “Inventor of the Week Archive: The World Wide Web,” MIT, http://web.mit.edu/invent/iow/berners-lee.html. My hunch is that: Joichi Ito, “Why Bitcoin Is and Isn’t like the Internet,” LinkedIn Pulse, January 18, 2015, https://www.linkedin.com/pulse/why-bitcoin-isnt-like-internet-joichi-ito. As Marc Andreessen has described: Andreessen, “Why Bitcoin Matters.”
Remix: Making Art and Commerce Thrive in the Hybrid Economy
by
Lawrence Lessig
Published 2 Jan 2009
Other such scum quickly followed. Usenet became less and less a place where conversation could happen, and more and more a ghetto for gambling ads and other such scams (see also your e-mail in-box).5 Just about the time that Usenet was fading, the World Wide Web was rising. The Web’s inventor, Tim Berners-Lee, was keen that the Web be a RW medium—what Benkler calls “the writable Web.”6 He pushed people developing tools to implement Web protocols to design their tools in a way that would encourage both reading and writing.7 At first, this effort failed. The real drive for the Web, its developers thought, would be businesses and other organizations that would want to publish content to the world.
…
In short, it will have taken just about all its assets and freed them to the community. And from these gifts, it expects to inspire a creativity that will make the platform extraordinarily valuable. It’s been about seventeen years since the World Wide Web became more than a dream of Tim Berners-Lee. As it has seeped into our veins, it has changed how we interact. More of us do things for other people, even if we do it because it is fun. More businesses find ways to do things for us, because doing so is more profitable for them. And more are experimenting with ways to build value by working with a community—commercial economies leveraging sharing economies to produce hybrids.
An Optimist's Tour of the Future
by
Mark Stevenson
Published 4 Dec 2010
At this juncture it’s worth clearing up a common point of confusion – ‘the Internet’ and ‘the Web’ are not the same thing, despite often being referred to interchangeably. The Internet is deep plumbing. You won’t really see it, any more than you see the sewer when you go to the bathroom. The Web, invented in 1989 by Tim Berners-Lee, sits on top of that plumbing to provide us with a service – a way to take packets of data and present them in a visual and interlinked way called ‘web pages.’ (The Web is not the only service sitting on top of the Internet; email is another.) The two terms, however, have become popularly interchangeable because it was the Web that suddenly made the Internet useful to a much larger audience.* Beyond personal uses of the Internet/Web combo (being able to shop, get travel information, or find pictures of cats that look like Hitler) there are collective benefits too.
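The plumbing metaphor maps directly onto code. In this sketch, which assumes the machine running it can reach example.com on port 80, the socket is the Internet’s plumbing and the GET line spoken over it is the Web:

    import socket

    # The plumbing: the Internet moves raw bytes between two machines.
    conn = socket.create_connection(("example.com", 80))

    # The service on top: HTTP, the language of the Web, spoken over that pipe.
    conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")

    response = b""
    while chunk := conn.recv(4096):
        response += chunk
    conn.close()

    print(response.split(b"\r\n")[0])  # e.g. b'HTTP/1.1 200 OK'

Send different bytes over the same socket and you have a different service entirely, which is all email, file transfer, and the Web really are: different languages spoken through the same pipes.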
…
‘The good side of it is that we encounter people we never would have encountered, we have an opportunity to rub ideas together we might never have had the chance to explore – and I think that’s incredibly powerful,’ says Vint. Perhaps this is why staff at the Italian edition of Wired magazine nominated the Internet/Web for the 2010 Nobel Peace Prize, proposing Vint Cerf, Bob Kahn and Tim Berners-Lee as the recipients should it win. But connecting people is only a fraction of the Internet’s story. As Cerf has written: In the next decade, around 70% of the human population will have fixed or mobile access to the Internet at increasingly high speeds. We can reliably expect that mobile devices will become a major component of the Internet, as will appliances and sensors of all kinds.
Conscious Capitalism, With a New Preface by the Authors: Liberating the Heroic Spirit of Business
by
John Mackey
,
Rajendra Sisodia
and
Bill George
Published 7 Jan 2014
Capitalism and democracy decidedly won that epic battle, and the debates that remained were about the types of democracy and the degree of economic freedom that worked best. The Birth of the Web Working in Switzerland at CERN (the European Organization for Nuclear Research), British physicist Tim Berners-Lee invented the World Wide Web in 1989.2 His creation has rapidly transformed the world in myriad ways. You could argue that Berners-Lee did more to transform the world than any single individual in the past hundred years, including Churchill, Roosevelt, Gandhi, and Einstein. His invention is at least as dramatically culture-changing as Gutenberg’s printing press was over five hundred years ago.
…
Wicks, Managing for Stakeholders: Survival, Reputation, and Success (New Haven: Yale University Press, 2007). 20. Marc Gafni, interview with authors, March 15, 2012. Chapter Two 1. Jonathan Plucker, ed., “The Flynn Effect,” in Human Intelligence: Historical Influences, Current Controversies, Teaching Resources, Indiana University, 2002, www.indiana.edu/~intell/flynneffect.shtml. 2. Tim Berners-Lee, “Homepage,” n.d., www.w3.org/People/Berners-Lee/. 3. The downside of this, of course, is accuracy. Anyone can publish anything on the Web, and some will believe it without question. 4. Mary Lennighan, “Number of Phones Exceeds Population of World,” Total Telecom, May 2011, www.totaltele.com/view.aspx?
Smart Mobs: The Next Social Revolution
by
Howard Rheingold
Published 24 Dec 2011
By adhering to one of the principles Ostrom had recognized—in complex social systems, the levels of governance should nest within each other—Internet architects hit upon the “end-to-end” principle that allows individual innovators, not the controllers of the network, to decide what to build on the Internet’s capabilities.62 When Tim Berners-Lee created World Wide Web software at a physics laboratory in Geneva, he didn’t have to get permission to change the way the Internet works, because the computers that are connected (the “fringes”), not a central network, are where the Internet changes. Berners-Lee simply wrote a program that worked with the Internet’s protocols and evangelized a group of colleagues to start creating Web sites; the Web spread by infection, not fiat.63 In 1993, Marc Andreessen and other programmers at the U.S.
…
See also: <http://www.tuxedo.org/~esr/writings/homesteading/ >(29 January 2002). 60. BIND (Berkeley Internet Name Domain), <http://www.isc.org/products/BIND/> (16 January 2002). 61. Network Working Group, ed. B. Carpenter, “Architectural Principles of the Internet,” June 1996, <http://www.ietf.org/rfc/rfc1958.txt> (26 November 2001). 62. Ibid. 63. Tim Berners-Lee, “Information Management: A Proposal,” 1989, < http://www.w3.org/History/1989/proposal.html> (29 January 2002). 64. Andy Server, “It Was My Party—and I Can Cry If I Want To,” Business 2.0, March 2001, <http://www.business2.com/articles/mag/0,1640,9662,FF.html> (12 August 2001). 65. Daniel Dern, “A Real Brief History of Usenet,” BYTE Magazine, 4 September 1999. 66.
Silence on the Wire: A Field Guide to Passive Reconnaissance and Indirect Attacks
by
Michal Zalewski
Published 4 Apr 2005
These attempts were not particularly successful though, largely because the computing power needed to make the technology appeal to users was still years in the future. The right time came in the late 1980s. After the microcomputer boom, and shortly before the frontal assault of the PC platform, a number of humble proposals made the rounds at the Conseil Européen pour la Recherche Nucléaire[29] (CERN) concerning the possibilities of hyperlinking. Tim Berners-Lee, one of the CERN researchers, is by all accounts the one to officially blame for spawning HyperText Markup Language (HTML), a set of controls for embedding metadata, links, and media resources in text documents. (Truth be told, HTML, the core of the Web as we know it, is hardly an entirely new design and borrows some ideas from SGML, the ISO 8879 Standard Generalized Markup Language of 1986.)
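A couple of those controls are worth seeing in action. The sketch below uses Python’s standard-library parser on a toy document (the page content is invented, though info.cern.ch really was the address of the first website); it extracts the link and image references an early browser would have had to resolve:

    from html.parser import HTMLParser

    # A toy page: hypertext with one link and one embedded image.
    page = """
    <html><body>
      <h1>A humble proposal</h1>
      <p>See the <a href="http://info.cern.ch/">first website</a>.</p>
      <img src="diagram.gif">
    </body></html>
    """

    class ReferenceExtractor(HTMLParser):
        # Pull out the resources the document links to or embeds.
        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if (tag, name) in (("a", "href"), ("img", "src")):
                    print(tag + ": " + value)

    ReferenceExtractor().feed(page)
    # Prints:
    # a: http://info.cern.ch/
    # img: diagram.gif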
…
Braden (editor), “RFC1122: Requirements for Internet Hosts—Communication Layers,” Network Working Group (1989). [96] Salvatore Sanfilippo, “New TCP Scan Method,” Bugtraq, http://seclists.org/bugtraq/1998/Dec/0082.html (1998). [97] World Wide Web Consortium, http://www.w3c.org/History.html. [98] Vannevar Bush, “As We May Think,” Atlantic Monthly 176, no. 1 (1945): 101-08. [99] Tim Berners-Lee, “Basic HTTP,” http://www.w3c.org/Protocols/HTTP/HTTP2.html. [100] R. Fielding, J. Gettys, J. Mogul, H. Frystyk, L. Masinter, P. Leach, T. Berners-Lee, “RFC2616: HyperText Transfer Protocol—HTTP/1.1.” Network Working Group (1999). [101] Various sources, references quoted after http://usability.gov/guide-lines/softhard.html: Anna Bouch, Allan Kuchinsky, Nina Bhatti, “Quality Is in the Eye of the Beholder: Meeting Users’ Requirements for Internet Quality of Service,” CHI (2000); Martin, Corl, “System Response Time Effects on User Productivity,” Behaviour and Information Technology, vol 5, no. 1, 3-13 (1986); Jakob Nielsen, “Top Ten Mistakes in Web Design,” http://www.useit.com/alertbox/9605.html (1996); Nielsen, “The Need for Speed,” http://www.useit.com/alertbox/9703a.html (1997); Nielsen, “Changes in Web Usability Since 1994,” http://www.useit.com/alertbox/9712a.html (1997); Nielsen, “The Top Ten New Mistakes of Web Design,” http://www.useit.com/alertbox/990530.html (1999)
Searches: Selfhood in the Digital Age
by
Vauhini Vara
Published 8 Apr 2025
In the formative years of the internet, in the 1980s and early 1990s, old-school internet users who hung out on message boards would get irritated every September when freshmen showed up at universities, received their campus-based internet accounts for the first time, and flooded the message boards. “They would use them to, among other things, download naughty images,” Jay Furr, an early internet user, told me. In 1989, the British computer scientist Tim Berners-Lee, working at a European institution called CERN, had invented the World Wide Web, a global information system that involved using browsers to open hyperlinked documents. Then, in April 1993, CERN decided to make its World Wide Web source code freely available. Suddenly internet traffic swelled, with companies like America Online coming along to capitalize on it.
…
It was taken by a male computer scientist at CERN named Silvano de Gennaro who wrote and performed with the group. Backstage at a gig one day, he told me, he decided to snap a photo of the women for their photo album. They posed, booties and boobs out, smiles obliging. Later, when de Gennaro was editing the image for the Cernettes’ CD cover, Tim Berners-Lee, the man credited with inventing the World Wide Web, came into his office, glanced at the photo, and suggested it be posted on a section of CERN’s website devoted to its social activities. “Sex sells!” Jean-François Groff, a programmer who helped get the photo online, later told Vice. “It’s media.
Valley of Genius: The Uncensored History of Silicon Valley (As Told by the Hackers, Founders, and Freaks Who Made It Boom)
by
Adam Fisher
Published 9 Jul 2018
They put together three: the Ethernet—which they invented—with the ARPANET, with the SRI packet-radio net. That was in about 1975 or 1976. We had an internet without a world wide web or a browser. Those things we did not invent. The first version of the world wide web was hacked together in 1990 by Tim Berners-Lee, an English computer scientist working at a European physics lab. The embryonic web was a geeky but efficient way to link a couple thousand physicists to a tiny number of supercomputers. Then Marc Andreessen, an American student working at NCSA, the National Center for Supercomputing Applications at the University of Illinois, built NCSA Mosaic—the first decent web browser.
…
To go from one site to another you had to hang up the phone and make another call and listen to this crazy sound that the modem would make and then hopefully you would get a connection. It took so much work to move from one place to another. You really had to be committed to find anything useful on it. Jim Clark: The web’s original genesis was with Tim Berners-Lee, as a means for physicists to share publications and to pass around written documents of that sort. HTML and all of the Hypertext Transfer Protocol that is used is basically a format for passing around these documents, for shipping them electronically to other people. Aleks Totić: Everything was academic on the internet.
…
Brian Behlendorf: At Wired in the summer of ’93 I was setting up a website, but I was also fixing bugs, I was adding features, and sending them upstream over this e-mail list to the kids at the University of Illinois building, this along with everyone else, the other users, and we were trading these fixes like baseball cards. This was kind of like the Brownian motion of how technology improvement took place in the early days of the web. Louis Rossetto: It was only in the second issue that we had a small news item in the front talking about Tim Berners-Lee in Geneva. That was the first mention of the web in our magazine. But not the last. Then it became something we were paying attention to. Howard Rheingold: And so Louis thought, Okay—let’s create a web-based cultural publication that we’ll make money on. One of Louis’s admirable traits was that he thought really, really, really big.
Financing Basic Income: Addressing the Cost Objection
by
Richard Pereira
Published 5 Jul 2017
Since this is a public investment, surely Internet service providers should not be granted ongoing use of it for free when they charge users for access. Fitzgerald proposes a 10% resource rent on the $64.5 billion existing asset base, providing $6.45 billion in revenue annually from the industry, including NBN and Internet service providers such as Bigpond, Optus and iiNet. Sir Tim Berners-Lee created the World Wide Web, including the URL, HTTP and HTML protocols, in his spare time while working at the Conseil Européen pour la Recherche Nucléaire (CERN) in Geneva, but he required CERN to provide it as an open-source commons for everyone, so it would not be appropriate to charge for access. Banking Licenses The publicly granted privilege of banks to create money through bank loans may be the most valuable public asset given away by government.
In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence
by
George Zarkadakis
Published 7 Mar 2016
Email was thus the Internet’s first ‘killer app’. By the early 1990s, modems made email widely available. Computers began increasingly to connect to the Internet. The ocean was transforming into a new continent where information became a commodity. The invention of the World Wide Web (‘Web’ for short) by English computer scientist Sir Tim Berners-Lee provided a way for computers to share information. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser, the first web server and the first web pages.16 The browser is one of the Internet’s most widely used applications. It’s what allows us to search and navigate through vast amounts of information.
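Those components, server, browser, and pages, are now so standard that modern scripting languages ship them. As a rough illustration rather than anything resembling Berners-Lee’s original software, a few lines of stock Python stand up a working web server for whatever files sit in the current directory:

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    # Serve the current directory over HTTP on port 8000; any browser
    # pointed at http://localhost:8000/ plays the client half.
    HTTPServer(("localhost", 8000), SimpleHTTPRequestHandler).serve_forever()

Run it, open http://localhost:8000/ in a browser, and all three 1990 ingredients are present: a server, a client, and pages.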
…
AD 50: Hero of Alexandria designs first mechanical automata.
1275: Ramon Lull invents Ars Magna, a logical machine.
1637: Descartes declares cogito ergo sum (‘I think therefore I am’).
1642: Blaise Pascal invents the Pascaline, a mechanical calculator.
1726: Jonathan Swift publishes Gulliver’s Travels, which includes the description of a machine that can write any book.
1801: Joseph Marie Jacquard invents a textiles loom that uses punched cards.
1811: Luddite movement in Great Britain against the automation of manual jobs.
1818: Mary Shelley publishes Frankenstein.
1835: Joseph Henry invents the electronic relay that allows electrical automation and switching.
1842: Charles Babbage lectures at the University of Turin, where he describes the Analytical Engine.
1843: Ada Lovelace writes the first computer program.
1847: George Boole invents symbolic and binary logic.
1876: Alexander Graham Bell invents the telephone.
1879: Thomas Edison invents the light bulb.
1879: Gottlob Frege invents predicate logic and calculus.
1910: Bertrand Russell and Alfred North Whitehead publish Principia Mathematica.
1917: Karel Capek coins the term ‘robot’ in his play R.U.R.
1921: Ludwig Wittgenstein publishes Tractatus Logico-Philosophicus.
1931: Kurt Gödel publishes The Incompleteness Theorem.
1937: Alan Turing invents the ‘Turing machine’.
1938: Claude Shannon demonstrates that symbolic logic can be implemented using electronic relays.
1941: Konrad Zuse constructs Z3, the first Turing-complete computer.
1942: Alan Turing and Claude Shannon work together at Bell Labs.
1943: Warren McCulloch and Walter Pitts demonstrate the equivalence between electronics and neurons.
1943: IBM funds the construction of Harvard Mark I, the first program-controlled calculator.
1943: Charles Wynn-Williams and others create the computer Colossus at Bletchley Park.
1945: John von Neumann suggests a computer architecture whereby programs are stored in the memory.
1946: ENIAC, the first electronic general-purpose computer, is built.
1947: Invention of the transistor at Bell Labs.
1948: Norbert Wiener publishes Cybernetics.
1950: Alan Turing proposes the ‘Turing Test’.
1950: Isaac Asimov publishes I, Robot.
1952: Herman Carr produces the first one-dimensional MRI image.
1953: Claude Shannon hires Marvin Minsky and John McCarthy at Bell Labs.
1953: Ludwig Wittgenstein’s Philosophical Investigations published in German (two years after his death).
1954: Alan Turing commits suicide with a cyanide-laced apple.
1956: The Dartmouth conference; the term ‘Artificial Intelligence’ is coined by John McCarthy.
1957: Allen Newell and Herbert Simon build the ‘General Problem Solver’.
1958: John McCarthy creates the LISP programming language.
1959: John McCarthy and Marvin Minsky establish the AI lab at MIT.
1963: The US government awards $2.2 million to the AI lab at MIT for machine-aided cognition.
1965: Hubert Dreyfus argues against the possibility of Artificial Intelligence.
1969: Stanley Kubrick introduces HAL in the film 2001: A Space Odyssey.
1971: Leon Chua envisions the memristor.
1972: Alain Colmerauer develops the Prolog programming language.
1973: The Lighthill report influences the British government to abandon research in AI.
1976: Hans Moravec builds the ‘Stanford Cart’, the first autonomous vehicle.
Early 1980s: The Internet is invented.
1982: The 5th Generation Computer Systems Project is launched by Japan.
1982: The film Blade Runner is released, directed by Ridley Scott, based on a novel by Philip K. Dick.
1989: Tim Berners-Lee invents the World Wide Web.
1990: Seiji Ogawa presents the first fMRI machine.
1993: Rodney Brooks and others start the MIT Cog Project, an attempt to build a humanoid robot child in five years.
1997: Deep Blue defeats Garry Kasparov at chess.
2000: Cynthia Breazeal at MIT describes Kismet, a robot with a face that simulates expressions.
2004: DARPA launches the Grand Challenge for autonomous vehicles.
2009: Google builds the self-driving car.
2011: IBM’s Watson wins the TV game show Jeopardy!.
Nerds on Wall Street: Math, Machines and Wired Markets
by
David J. Leinweber
Published 31 Dec 2008
Emanuel Derman, author of My Life as a Quant: Reflections on Physics and Finance ( John Wiley & Sons, 2004), made this point in the first line of his review for the Wall Street Journal: “By my reckoning, several of the 25 memoirists in How I Became a Quant are not true quants, and they are honest (or proud) enough to admit it.”2 xv xvi Introduction I am, no doubt, high on the list of poseurs, and I will be the first to admit it. Information technology applications in financial markets aren’t physics and closed-form solutions; they fit more in the zone of engineers and experimental guys, but they’ve been around forever. At the top of the heap we find Thomas Edison and Tim Berners-Lee, inventor of the World Wide Web. At the low end, they include more than a few potentially dangerous tinkerers like this guy: WANTED: Somebody to go back in time with me. This is not a joke. P.O. Box 322, Oakview, CA 93022. You’ll get paid after we get back. Must bring your own weapons. Safety not guaranteed.
…
We often think of the Web and the Internet as the same thing. However, the Internet is more than the Web; it is the technology that has expanded and transformed markets by making computer communication as simple as turning on the lights. Credit for the development of the World Wide Web goes to Tim Berners-Lee, then at CERN (European Organization for Nuclear Research) and now at the World Wide Web Consortium at MIT (www.w3c.org), which is the mother lode for information on new Web technologies. The original Internet started out as the ARPANET, developed by DARPA (www.darpa.mil). Many readers will recall the complexity and flashing red lights of the modems, and the effort to master the cryptic rites to connect to other machines.
Smarter Than You Think: How Technology Is Changing Our Minds for the Better
by
Clive Thompson
Published 11 Sep 2013
What we need now, as MacKinnon and other thinkers have argued, is a new Magna Carta for the digital age—one that requires corporate providers of online speech to respect the rights of those who speak on their platforms. “No person or organization shall be deprived of the ability to connect to others without due process of law and the presumption of innocence,” is the prime rule suggested by Tim Berners-Lee, the inventor of the Web. More countries worldwide (the United States included) could follow European Union officials and push for regulations requiring high-tech services to give users more control over their data—or deleting it upon request. It might seem utopian to imagine this sort of regulation stitched together across nations.
…
storyid=5881&mod=1&pg=1&sectionId=21; “In India the Enemies of Free Speech Find a ‘Symbolic’ Means to Attack Cartoonist Aseem Trivedi,” Cartoonists Rights Network International, February 9, 2013, accessed March 26, 2013, www.cartoonistsrights.org/recent_developments_article.php?id=28; Cartoons Against Corruption, accessed March 26, 2013, cartoonsagainstcorruption.blogspot.com/. a new Magna Carta for the digital age: Tim Berners-Lee, “Long Live the Web: A Call for Continued Open Standards and Neutrality,” Scientific American, November 22, 2010, accessed March 26, 2013, www.scientificamerican.com/article.cfm?id=long-live-the-web. push for regulations: Venables, “The EU’s ‘Right to Be Forgotten.’” When student activists pressured apparel companies . . .
The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism
by
Nick Couldry
and
Ulises A. Mejias
Published 19 Aug 2019
As Rob Kitchin and Martin Dodge put it, “Software . . . alters the conditions through which society, space and time, and thus spatiality, are produced.”83 As a result, the internet, understood as an infrastructure of connection, has reconstituted social space in a fundamental way. In one sense, this was always the vision of IT pioneers. Tim Berners-Lee insisted in 1999 that “hope in life comes from the interconnections among all the people in the world.”84 But such connection was only the start of a larger transformation. Combine vastly extended interconnection with that basic feature of computers (they capture data), and the result is the beginning of a radical new possibility: that each computer’s archive of its own history becomes available to be tracked and influenced by any other computer anywhere in the internet’s vast universe of networks.
…
We do not therefore deny that property-based arguments (for example, legal fictions of ownership) can be developed that might support privacy and autonomy; for an interesting US argument along these lines, see Fairfield, Owned. There are also proposals for developing, even within the big-tech industry, some form of personal data portability; see the Data Transfer Project sponsored by Google, Microsoft, and Twitter, at https://datatransferproject.dev and http://dataportability.org. See also Tim Berners-Lee’s proposal for a new web infrastructure that would support data portability: https://solid.inrupt.com. But for caution against this type of argument, see Gurumurthy and Vasudevan, “Societal Need.” 18. We borrow here the notion of “settling” on a course of action from Steward, Metaphysics for Freedom. 19.
Dawn of the New Everything: Encounters With Reality and Virtual Reality
by
Jaron Lanier
Published 21 Nov 2017
For example, if one person could download a file, the other person, from whom the file was downloaded, could be notified of who was doing the downloading.1 Therefore, everything that was downloaded was contextualized, artists could be paid, scammers could be identified, and so on. Previous designs were centered on people, not data. There was never a need to copy information because one could always go back to the source, associated with a person. Indeed, copying was considered a crime against efficiency. Tim Berners-Lee chose to offer a different approach with the World Wide Web, one that was much easier to adopt in the short term, though we’ve paid dearly in the long term. To get started, one simply linked to online information, and the link went in only one direction. No one could tell if information had been copied.
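The design difference is concrete enough to put in a few lines. In this sketch (the page names are invented), the Web’s one-way scheme stores only outgoing links; finding out who links to a page means crawling everything and inverting the graph, precisely the global mapping work that was left for others to do later.

    # One-way links, as the Web stores them: each page knows only its targets.
    links = {
        "alice.html": ["news.html", "bob.html"],
        "bob.html": ["news.html"],
    }

    # The reverse direction was never stored. Recovering it requires
    # visiting every page and inverting the whole graph.
    backlinks = {}
    for source, targets in links.items():
        for target in targets:
            backlinks.setdefault(target, []).append(source)

    print(backlinks["news.html"])  # ['alice.html', 'bob.html']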
…
It must be hard for people who have grown up with the Web to appreciate that feeling. Much later on, companies like Google and Facebook would make hundreds of billions of dollars for the service of partially mapping what should have been mapped from the start. This is in no way a criticism of Tim Berners-Lee. I continue to admire and respect him. He didn’t have a plan for world domination; only a plan to support physicists at a lab. Despite the feeling of guilt, the rise of the Web also felt miraculous. I used to wax rhapsodic about it in my lectures. The first time in history that millions of people had cooperated to do something not because of coercion, profit motive, or any influence other than the sense that the project was worthy.
Who Owns the Future?
by
Jaron Lanier
Published 6 May 2013
The early computers built at PARC looked remarkably like modern PCs and Macs, and the concept prototypes and sketches foresaw modern phones and tablets. Xerox became notorious for having funded the lab that defined the core of the modern feeling of computation, and yet famously failed to capitalize on it. Much later, when Tim Berners-Lee’s design for HTML first appeared, computer scientists who were familiar with the field Ted had pioneered—hypertext and networked media—offered the reaction you’d expect: “Wait, it only has one-way linking. That’s not adequate. It’s throwing away all the best information about network structure.”
…
But Ted’s preferred prefix was hyper-, which, he once told me, when I must have still been a teenager, also captured something of the frenetic edge that digital obsessions seem to bring into human character. So Ted coined terms like hypermedia and hypertext. Much later, in the early 1990s, the Web would be born when Tim Berners-Lee proposed HTML, the foundational protocol for Web pages. The letters ML stand for “markup language,” but the HT stands for Ted’s coinage, hypertext. Ted is the only person alive who invented a new humor to add to my scheme of humors.* Ted’s humor suggests an unlimited, but still human-centered future based on improving technologies.
The Wires of War: Technology and the Global Struggle for Power
by
Jacob Helberg
Published 11 Oct 2021
In time, this law would become the basis for much of the government’s subsequent attempts to track terrorism and espionage.25 Even as these primitive forms of fraud and abuse proliferated, the Internet grew. Between 1987 and 1989, the number of users surged from 28,000 to 160,000.26 In 1989, British computer scientist Tim Berners-Lee designed a system of “hypertext” that linked information digitally—a direct forerunner to those ten blue links I spent so much time on at Google. This hypertext was organized by “hypertext markup language” (HTML) and transmitted according to a “hypertext transfer protocol” (HTTP) to Internet nodes identified by a “uniform resource locator” (URL).
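The three acronyms divide the work mechanically, as a short standard-library sketch shows (example.com is a placeholder): the URL names which protocol to speak, which node to contact, and which document to request, and the HTTP request line follows directly from it.

    from urllib.parse import urlparse

    url = "http://example.com/papers/proposal.html"
    parts = urlparse(url)

    print(parts.scheme)  # 'http' -> which protocol to speak
    print(parts.netloc)  # 'example.com' -> which node to contact
    print(parts.path)    # '/papers/proposal.html' -> which document to request

    # The request line a browser derives from that URL:
    print("GET " + parts.path + " HTTP/1.1\r\nHost: " + parts.netloc)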
…
These humiliations are hardly a distant memory; when Trump and President Xi Jinping first met in April 2017, Xi treated Trump to a lecture on this unhappy history.85 Later that year, Xi declared that China was entering a “new era” and “must take center stage in the world.”86 In Beijing’s view, that means asserting full control over Hong Kong, regaining control over “breakaway” territories like Taiwan, expanding Chinese influence throughout the Asia-Pacific, and challenging the United States for global supremacy. “The signs that China is gearing up to contest America’s global leadership are unmistakable, and they are ubiquitous,” write Hal Brands and Jake Sullivan.87 And in Vint Cerf and Tim Berners-Lee’s ingenious creation, the Chinese Communist Party saw a new way to advance their old ambitions. * * * Two years before the Tiananmen massacre, the first email was sent from China. It traveled 4,500 miles from Beijing to Berlin. In hindsight, the message set a rather ominous tone: “Across the Great Wall we can reach every corner in the world.”88 Chinese use of the Internet subsequently skyrocketed.
Greater: Britain After the Storm
by
Penny Mordaunt
and
Chris Lewis
Published 19 May 2021
Our inventions and the impact they have had on the world, ‘from inventing the World Wide Web to bringing about new industrial and manufacturing processes that have raised the living standards of literally billions of people around the world’. Others focused on consistency and longevity: For centuries, we’ve been at the heart of technological innovation. Through figures like Charles Babbage, Ada Lovelace and, more recently, Tim Berners-Lee, we’ve pioneered computing, algorithms and cyberspace. These are breakthroughs that have changed the world as we know it beyond recognition, and they will continue to ensure that the UK shapes the direction of the world. Our engineering skills were mentioned frequently, as were Britain’s ‘creativity and ingenuity’.
…
WELL-BEING AND HEALTH For our panel, the NHS was the pragmatic manifestation of British values. Its creation was a rare moment, outside of war, when there was a true national mission. The celebration of it alongside the countryside, the industrial revolution, James Bond, HM The Queen, Rowan Atkinson, Tim Berners-Lee and the Spice Girls during Danny Boyle’s 2012 Olympic opening ceremony confirms this. Good health and good healthcare are consistently the prime care and concern of the British public. There are indices that show the NHS and wider UK healthcare services are the best in the world. There are also ones that will tell you its outcomes are poor.
Software Design for Flexibility
by
Chris Hanson
and
Gerald Sussman
Published 17 Feb 2021
Guttag; “Abstract data types and the development of data structures,” Communications of the ACM, 20(6) (1977): 397–404. [51] Chris Hanson; MIT/GNU Scheme Reference Manual. https://www.gnu.org/software/mit-scheme/ [52] Chris Hanson; SOS software: Scheme Object System, 1993. [53] Chris Hanson, Tim Berners-Lee, Lalana Kagal, Gerald Jay Sussman, and Daniel Weitzner; “Data-Purpose Algebra: Modeling Data Usage Policies,” in Eighth IEEE International Workshop on Policies for Distributed Systems and Networks (POLICY’07), June 2007. [54] Hyman Hartman and Temple F. Smith; “The Evolution of the Ribosome and the Genetic Code,” in Life, 4 (2014): 227–249
…
Waltz; Generating Semantic Descriptions From Drawings of Scenes With Shadows, PhD thesis, MIT, also Artificial Intelligence Laboratory Technical Report 271, November 1972. http://hdl.handle.net/1721.1/6911 [126] Stephen A. Ward and Robert H. Halstead Jr.; Computation Structures. Cambridge, MA: MIT Press, 1990. [127] Stephen Webb; Measuring the Universe: The Cosmological Distance Ladder, Springer-Praxis Series in Astronomy and Astrophysics. Berlin: Springer, 1999. [128] Daniel J. Weitzner, Hal Abelson, Tim Berners-Lee, Chris Hanson, Jim Hendler, Lalana Kagal, Deborah McGuinness, Gerald Jay Sussman, and K. Krasnow Waterman; Transparent Accountable Data Mining: New Strategies for Privacy Protection, MIT CSAIL Technical Report MIT-CSAIL-TR-2006-007, January 2006. [129] Robert Edwin Wengert; “A simple automatic derivative evaluation program,” in Communications of the ACM, 7(8) (1964): 463–464
Full Stack Web Development With Backbone.js
by
Patrick Mulder
Published 18 Jun 2014
About the Author

Before discovering software development for web applications with Java and Ruby in 2008, Patrick Mulder mainly worked as a software engineer on measurement equipment and electronic devices. Web development allowed him to learn about networks and linking documents, but working with measurement equipment gave him an appreciation for the many forms data can have. Not for nothing, Tim Berners-Lee invented large parts of the WWW while working at CERN, a European research organization for particle physics. Yet, after programming with C, C++, Python, Ruby, and Java, learning Backbone.js proved difficult, as Patrick did not have much experience with the “nonblocking” behavior of JavaScript when he started working with Backbone.
The Choice Factory: 25 Behavioural Biases That Influence What We Buy
by
Richard Shotton
Published 12 Feb 2018
It didn’t matter to any sizeable degree why someone had elected to join the priesthood. The situation, not the person, determined the behaviour. Do the findings still apply nearly 50 years later? A lot has changed since then. In 1973 a pint of beer cost 14p, Smash Martians were advertising instant mash and Tim Berners-Lee, inventor of the web, was still at school. But despite these differences our underlying motivations remain. As Bill Bernbach, the legendary creative, said: It took millions of years for man’s instincts to develop. It will take millions more for them to even vary. It is fashionable to talk about changing man.
Blockchain Revolution: How the Technology Behind Bitcoin Is Changing Money, Business, and the World
by
Don Tapscott
and
Alex Tapscott
Published 9 May 2016
In alphabetical order: Jeremy Allaire, Founder, Chairman, and CEO, Circle Marc Andreessen, Cofounder, Andreessen Horowitz Gavin Andresen, Chief Scientist, Bitcoin Foundation Dino Angaritis, CEO, Smartwallet Andreas Antonopoulos, Author, Mastering Bitcoin Federico Ast, CrowdJury Susan Athey, Economics of Technology Professor, Stanford Graduate School of Business Adam Back, Cofounder and President, Blockstream Bill Barhydt, CEO, Abra Christopher Bavitz, Managing Director, Cyberlaw Clinic, Harvard Law School Geoff Beattie, Chairman, Relay Ventures Steve Beauregard, CEO and Founder, GoCoin Mariano Belinky, Managing Partner, Santander InnoVentures Yochai Benkler, Berkman Professor of Entrepreneurial Studies, Harvard Law School Jake Benson, CEO and Founder, LibraTax Tim Berners-Lee, Inventor, World Wide Web Doug Black, Senator, Canadian Senate, Government of Canada Perriane Boring, Founder and President, Chamber of Digital Commerce David Bray, 2015 Eisenhower Fellow and Harvard Visiting Executive in Residence Jerry Brito, Executive Director, Coin Center Paul Brody, Americas Strategy Leader, Technology Group, EY (formerly IoT at IBM) Richard G.
…
These organizations were hierarchical by design, because hierarchies were the dominant paradigm during the first half of a war-torn century. But these industrial-scale solutions are ill suited to the challenges of the digital era. The rise of the Internet marked a significant departure from the traditional culture of governance. In 1992, most Internet traffic was e-mail. The graphical browser that enabled Tim Berners-Lee’s extraordinary World Wide Web was two years away. Most people weren’t connected and didn’t understand the technology. Many of the important institutions that would come to steward this important global resource were either embryonic or nonexistent. Barely four years old was the Internet Engineering Task Force, an international community that handles many aspects of Internet governance.
The One Device: The Secret History of the iPhone
by
Brian Merchant
Published 19 Jun 2017
Ground zero for the web is, well, a pretty unremarkable office space. Apart from a commemorative plaque, it looks exactly the way you’d expect an office at a research center to look: functional, kind of drab. The future isn’t made in crystal palaces, folks. But it was developed here, in the 1980s, when Tim Berners-Lee built what he’d taken to calling the World Wide Web. While trying to streamline the sharing of data between CERN’s myriad physicists, he devised a system that linked pages of information together with hypertext. That story is firmly planted in the annals of technology. Bent Stumpe’s much lesser known step in the evolution of modern computing unfolded a stone’s throw away, in a wooden hut within shouting distance of Berners-Lee’s nook.
…
I plumbed early interviews with his alma mater’s newspaper, the New York Times, and the News Journal, which are where any quotes attributed to him originate. Jeff White, the erstwhile FingerWorks CEO, gave an interview to Technical.ly/Philly, which quotes are drawn from. As a nontouch aside, it’s also worth noting that Tim Berners-Lee built the World Wide Web using a NeXT Cube—the computer made by the company Steve Jobs founded after getting fired from Apple. 5. Lion Batteries SQM organized the tour of their facility in Atacama and allowed us to stay on-site so that we could visit both Salar de Atacama, where the lithium is harvested, and Salar de Carmen, where it is refined and prepared for distribution (I paid for the travel and the rest of the lodgings).
The Future of the Professions: How Technology Will Transform the Work of Human Experts
by
Richard Susskind
and
Daniel Susskind
Published 24 Aug 2015
David Card and Orley Ashenfelter (2011), 1043–171. 28 The spirit of their anxieties is shared with the original nineteenth-century ‘Luddites’ (whose name derives from their declared support for Ned Ludd, an East Midlands weaver who smashed a set of framing machines in anger and in fear in the early tremors of the Industrial Revolution). The Luddites viewed James Hargreaves’s spinning jenny in the nineteenth century with the same anxious suspicion with which today’s pessimists view Tim Berners-Lee’s World Wide Web in the twenty-first century. See Eric Hobsbawm and George Rudé, Captain Swing (2001). 29 David Autor, ‘Polanyi’s Paradox and the Shape of Employment Growth’, NBER Working Paper 20485, National Bureau of Economic Research (2014). 30 Erik Brynjolfsson and Andrew McAfee, The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies (2014).
…
Searle, John, ‘Watson Doesn’t Know it Won on “Jeopardy!”’, Wall Street Journal, 23 Feb. 2011 <http://www.wsj.com> (accessed 28 March 2015). Seidman, Dov, How (Hoboken, NJ: Wiley, 2007). Sennett, Richard, The Craftsman (London: Penguin Books, 2009). Sennett, Richard, Together (London: Allen Lane, 2012). Shadbolt, Nigel, Wendy Hall, and Tim Berners-Lee, ‘The Semantic Web Revisited’, IEEE Intelligent Systems, 21: 3 (2006), 96–101. Shanteau, James, ‘Cognitive Heuristics and Biases in Behavioral Auditing: Review, Comments, and Observations’, Accounting, Organizations, and Society, 14: 1 (1989), 165–77. Shapiro, Carl, and Hal Varian, Information Rules (Boston: Harvard Business School Press, 1999).
Making Globalization Work
by
Joseph E. Stiglitz
Published 16 Sep 2006
With more than 120,000 patent applications every year, it is virtually impossible for any researcher to know every idea that has been patented or for which there is a patent pending.27 Inherent ambiguities—for instance, in the breadth of the patent (that is, whether, to use our earlier example, Selden’s patent did indeed include all cars)—make a difficult task impossible. The result is that even the person usually given most credit for inventing the World Wide Web, Tim Berners-Lee, has concluded that, at least in his field, patents stifle innovation. They present, he says, a great stumbling block for Web development. Developers are stalling their efforts in a given direction when they hear rumors that some company may have a patent that may involve the technology.28 Over the past hundred years, the laws have changed enormously and differ across countries.
…
Myriad eventually developed a screening technology, and asks $3,000 for a complete screen; it refuses to let other firms perform the screen. The province of Ontario is ignoring this, allowing its citizens to be screened for free. 27. The global number has been soaring—up by 14 percent in just three years. The Intellectual Property Statistics Database is available at www.wipo.int/ipstatsdb/en/stats.jsp. 28. Tim Berners-Lee, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web (New York: HarperCollins, 2000). 29. The provision on data exclusivity—designed to limit the use of information—that the United States has been insisting upon in recent bilateral trade agreements clearly goes completely against the spirit of this traditional requirement. 30. There is a curious tension in the position of some of the most ardent free market advocates of intellectual property rights: while the liberalization/privatization agenda that they support in general entails minimizing the role of the government, this new set of reforms calls for a more active government and a new and restrictive set of regulations on the use of knowledge. 31. Their investment in lobbying has yielded high returns.
Science in the Soul: Selected Writings of a Passionate Rationalist
by
Richard Dawkins
Published 15 Mar 2017
‘Pearls before swine’ overestimates the average chat-room conversation, but it is the pearls of hardware and software that inspire me: the internet itself and the World Wide Web, succinctly defined by Wikipedia as ‘a system of interlinked hypertext documents contained on the internet’. The Web is a work of genius, one of the highest achievements of the human species, whose most remarkable quality is that it was constructed not by one individual genius like Tim Berners-Lee or Steve Wozniak or Alan Kay, nor by a top-down company like Sony or IBM, but by an anarchistic confederation of largely anonymous units located (irrelevantly) all over the world. It is Project MAC writ large. Suprahumanly large. Moreover, there is not one massive central computer with lots of satellites, as in Project MAC, but a distributed network of computers of different sizes, speeds and manufacturers, a network that nobody, literally nobody, ever designed or put together, but which grew, haphazardly, organically, in a way that is not just biological but specifically ecological.
…
Whether, on balance, the internet benefits the oppressed more than the oppressor is controversial, and at present may vary from region to region. We can at least hope that the faster, more ubiquitous and above all cheaper internet of the future may hasten the long-awaited downfall of ayatollahs, mullahs, popes, televangelists, and all who wield power through the control (whether cynical or sincere) of gullible minds. Perhaps Tim Berners-Lee will one day earn the Nobel Peace Prize. AFTERWORD Reading this again at the end of 2016 I find its generally optimistic tone a little jarring. There is alarmingly convincing evidence that the year’s momentous US presidential election (it remains to be seen quite how momentous it will prove to be, not just for America but the world) was swayed by a systematically orchestrated campaign of fake news defaming one of the candidates.
England: Seven Myths That Changed a Country – and How to Set Them Straight
by
Tom Baldwin
and
Marc Stears
Published 24 Apr 2024
The Jurors is formed of twelve bronze chairs designed to challenge people ‘to consider the ongoing significance and influences of Magna Carta’.25 It was unveiled by Prince William despite its Guyanese British creator, Hew Locke, saying his ‘feelings about the Royal Family are ambivalent’.26 On the front and back of each chair are symbols depicting the struggle against oppression not just in England but around the world. One has a portrait of suffragette Lillie Lenton in Holloway prison and the poem Oscar Wilde wrote while incarcerated in Reading gaol for homosexuality. There is a tribute to blind trade unionists and a commentary by the scientist Sir Tim Berners-Lee, who has called for a ‘Magna Carta of the internet’. It has nods to the eighteenth-century African American poet Phillis Wheatley, and to Mary Prince, a Black woman who presented an anti-slavery petition to Parliament in 1828. There is an image of the portable spinning wheel for cotton designed by Mahatma Gandhi in resistance to the laws of the British Empire, as well as the loudhailer used by the first openly gay man to be elected to public office in California, Harvey Milk.
…
It included elements of the stories we have explored in this book: the pastoral ideal of Arthurian Glastonbury Tor; games of cricket; the industrial revolution; suffragists demanding the vote; soldiers dying in world wars; Caribbean immigrants disembarking from the Empire Windrush; the BBC weather forecast; children bouncing on NHS hospital beds; and Tim Berners-Lee inventing the World Wide Web. It featured national cultural icons − everyone from Paul McCartney, the Sex Pistols and David Beckham to Mary Poppins hitting Lord Voldemort with her umbrella. Most spectacularly of all, it had James Bond parachuting with a body-double of the queen into the Olympic stadium.
Clock of the Long Now
by
Stewart Brand
Published 1 Jan 1999
I favor savage selection, but everyone making their own. That way you get a myriad of perspectives instead of one and instead of an undifferentiated heap.” This pretty well describes how the World Wide Web is organizing itself. People complain about overwhelming masses of information on the Web, but one of its inventors, Tim Berners-Lee, comments, “To be overloaded by the existence of so much on the Web is like being overloaded by the mass of a beautiful countryside. You don’t have to visit it, but it’s nice to know it’s there. Especially the variety and freedom.” The Internet may be showing the way to live with an infinite amount of past in infinite detail, and still encourage freedom to innovate without the need of violent revolution.
The Internet of Money
by
Andreas M. Antonopoulos
Published 28 Aug 2016
These attachments were 10 times the size of the text because people started sending bigger things, like drawings and pictures and of course, once again, sex. So, we could scale for email but not for email attachments. Everybody was up in an uproar: “We’re never going to be able to scale for email attachments. The internet will surely melt down!” Then we solved it. Until some British guy, Sir Tim Berners-Lee (who then was just Tim), invented the web. Now, you could put the pictures into frames.

11.1.4. The Web Will Destroy the Internet

It was about 1992 when I downloaded and ran the first web browser, NCSA Mosaic, at my university lab. We gathered together three or four friends. We worked for hours to get NCSA Mosaic downloaded and compiled and installed.
Start It Up: Why Running Your Own Business Is Easier Than You Think
by
Luke Johnson
Published 31 Aug 2011
Patent trolls abound – those who file ‘paper patents’ or ‘submarine patents’ that they never intend to exploit, but merely use as tools to sue unwitting infringers. Both Research in Motion, maker of the BlackBerry, and even Microsoft have suffered from this harmful toll on endeavour. Inventors I have met are fundamentally motivated by a desire to see their creations become appreciated and recognized, rather than an urge to accumulate wealth. Tim Berners-Lee, the man responsible more than any other for the initiation of the world wide web, is a classic example of this attitude. He is a modest academic who has, I am sure, resisted countless overtures to make huge fortunes from the web, in order to carry on his role as one of its custodians. Some inventors are almost dismissive of accountants and bankers – they say the money men do not understand the way creative minds work.
Intertwingled: Information Changes Everything
by
Peter Morville
Published 14 May 2014
In 1934, Paul Otlet envisioned a scholar’s workstation that turned millions of 3 x 5 index cards into a web of knowledge by using a new kind of relationship known as the “Link.”lxiv In 1945, Vannevar Bush imagined the memex, a machine that enabled its users to share an associative “web of trails.”lxv In the early 60s, Ted Nelson coined “hypertext” and set out to build Xanadu, a non-sequential writing system with visible, clickable, unbreakable, bi-directional hyperlinks.lxvi

Figure 3-1. Ted Nelson’s Xanalogical Structure.

In 1968, Doug Engelbart “real-ized” these dreams by showing hypertext (and most elements of modern computing) in “the mother of all demos.”lxvii Through the 70s and 80s, dozens of protocols and networks were made and merged, and in 1991, Tim Berners-Lee launched the World Wide Web as a public service on the Internet. The rest, as everyone knows, is history. It’s hard to argue with the success of the Internet since, and yet it’s worth reflecting upon what was lost in the translation from idea to implementation. Ted Nelson did just that in 2013 in a tearful eulogy for his old friend, Doug Engelbart.
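Nelson’s “unbreakable, bi-directional” links are easiest to grasp in code. The toy Python sketch below is ours, not Xanadu’s actual design (the class and function names are invented for illustration): a link registers itself at both endpoints, so the target always knows who cites it, which a one-way web link does not.

```python
# A toy sketch of a bi-directional link (illustrative only, not
# Xanadu's real data model): the link is recorded at both ends.
class Doc:
    def __init__(self, name):
        self.name = name
        self.links_out = set()   # documents this one points to
        self.links_in = set()    # documents that point back here

def link(a, b):
    """Create a bi-directional link: both endpoints know about it."""
    a.links_out.add(b.name)
    b.links_in.add(a.name)

memex = Doc("memex-paper")
xanadu = Doc("xanadu-notes")
link(xanadu, memex)

print(memex.links_in)   # {'xanadu-notes'}: the target can see its citers,
                        # something a one-way HTML <a href> cannot do
```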
Networks, Crowds, and Markets: Reasoning About a Highly Connected World
by
David Easley
and
Jon Kleinberg
Published 15 Nov 2010
In Proc. 16th ACM-SIAM Symposium on Discrete Algorithms, pages 301–310, 2005.
[52] Kenneth Berman. Vulnerability of scheduled networks and a generalization of Menger’s theorem. Networks, 28:125–134, 1996.
[53] Tim Berners-Lee, Robert Cailliau, Ari Luotonen, Henrik Frystyk Nielsen, and Arthur Secret. The World-Wide Web. Communications of the ACM, 37(8):76–82, 1994.
[54] Tim Berners-Lee and Mark Fischetti. Weaving the Web. Harper Collins, 1999.
[55] Krishna Bharat, Bay-Wei Chang, Monika Rauch Henzinger, and Matthias Ruhl. Who links to whom: Mining linkage between web sites. In Proc. IEEE International Conference on Data Mining, pages 51–58, 2001.
…
But since the Web is so enmeshed in the broader information infrastructure of the world (including the Internet, wireless communication systems, and the global media industry), it’s actually useful to think a bit about what the Web is and how it came about, starting from first principles. At a basic level, the Web is an application developed to let people share information over the Internet; it was created by Tim Berners-Lee during the period 1989–1991 [53, 54]. Although it is a simplification, we can view the original conception and design of the Web as involving two central features. First, it provided a way for you to make documents easily available to anyone on the Internet, in the form of Web pages that you could create and store on a publicly accessible part of your computer.
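Those two features can be sketched with Python’s standard http.server module. A minimal illustration under our own assumptions (the file names and port are invented; this is not an example from the book):

```python
# Publishing two linked documents from a publicly accessible directory.
from http.server import HTTPServer, SimpleHTTPRequestHandler
from pathlib import Path

# A page that both publishes information and links to a second page.
Path("index.html").write_text(
    '<html><body><p>My research notes.</p>'
    '<a href="page2.html">Related notes</a></body></html>'
)
Path("page2.html").write_text("<html><body><p>More notes.</p></body></html>")

# Serving the current directory makes both pages reachable by anyone
# who can open a connection to this machine on port 8000.
HTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()
```

Anyone who can reach this machine on port 8000 can now fetch either page, which is the essence of making documents “easily available to anyone on the Internet.”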
…
The fact that Vannevar Bush’s vision was so accurate is not in any sense coincidental; Bush occupied a prominent position in the U.S. government’s scientific funding establishment, and his ideas about future directions had considerable reach. Indeed, the creators of early hypertext systems explicitly invoked Bush’s ideas, as did Tim Berners-Lee when he set out to develop the Web. The Web and its Evolution. This brings us back to the 1990s, the first decade of the Web, in which it grew rapidly from a modest research project to a vast new medium with global reach. In the early phase of this period, the simple picture in Figure 13.2 captured the
Barefoot Into Cyberspace: Adventures in Search of Techno-Utopia
by
Becky Hogge
,
Damien Morris
and
Christopher Scally
Published 26 Jul 2011
And the “open source” method of working that the pioneers of free software adopted has been used as a prism through which to view the properties of the web for at least a decade. If Bill Gates had been completely in charge, there would be no world wide web. Sure, Gates saw that computers could play some role in home entertainment. About the time Tim Berners-Lee was circulating his ideas about the possibilities of linking documents together using something called hypertext, Gates was proposing set-top computers in every home, able to deliver more entertainment programming than even satellite TV. Gates initially missed the Web, because he couldn’t see the people who bought his products as anything more than consumers.
Information Doesn't Want to Be Free: Laws for the Internet Age
by
Cory Doctorow
,
Amanda Palmer
and
Neil Gaiman
Published 18 Nov 2014
Even the most dedicated, topical forums on highly technical subjects—cancer therapy, high-energy physics—inevitably coexist with nearby forums for “chatter” about pop culture, personal thoughts, and idle chitchat. Indeed, this is practically the Internet’s origin story: the U.S. government created a military and scientific network for information sharing, and its users promptly started a Star Trek discussion forum. Tim Berners-Lee created the World Wide Web for sharing high-energy physics papers, and its users promptly started posting pictures of their cats, their failed cake-baking adventures, and the titanic snowfall that had just been dumped outside their lab windows. And then they, too, started arguing about Star Trek.
Computer: A History of the Information Machine
by
Martin Campbell-Kelly
and
Nathan Ensmenger
Published 29 Jul 2013
Hypertext was, in fact, a lively computer research topic throughout the 1980s, but what made it so potent for the Internet—ultimately giving rise to the World Wide Web—was that it would make it unnecessary to locate documents in centralized directories. Instead, links would be stored in the documents themselves, and they would instantly whisk the reader to related documents. It was all very much as Vannevar Bush had envisioned the memex. The World Wide Web was invented by Tim Berners-Lee. Its origins dated back to Berners-Lee’s early interest in hypertext in 1980, long before the Internet was widely known. Berners-Lee was born in London in 1955, the son of two mathematicians (who were themselves pioneers of early British computer programming). After graduating in physics from Oxford University in 1976, he worked as a software engineer in the UK before obtaining a six-month consulting post at CERN, the international nuclear physics research laboratory in Geneva.
…
Mitchell Waldrop has written a fine biography of J.C.R. Licklider, IPTO’s founder and the seminal figure of personal computing: The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal (2001). The context and evolution of the World Wide Web has been described by its inventor Tim Berners-Lee in Weaving the Web (1999), and by his colleagues at CERN James Gillies and Robert Cailliau in How the Web Was Born (2000). The “browser wars” are recounted in detail in Michael Cusumano and David Yoffie’s Competing on Internet Time (1998). The early days are chronicled in Robert Reid’s Architects of the Web: 1,000 Days That Built the Future of Business (1997), while the crash is recounted in John Cassidy’s Dot.con (2003).
The Rational Optimist: How Prosperity Evolves
by
Matt Ridley
Published 17 May 2010
The bread you eat was first cross-bred by a Neolithic Mesopotamian and baked in a way that was first invented by a Mesolithic hunter-gatherer. Their knowledge is enduringly embodied in machines, recipes and programmes from which you benefit. Unlike Louis, you number among your servants John Logie Baird, Alexander Graham Bell, Sir Tim Berners-Lee, Thomas Crapper, Jonas Salk and myriad assorted other inventors. For you get the benefit of their labours, too, whether they are dead or alive. The point of all this cooperation is to make (Adam Smith again) ‘a smaller quantity of labour produce a greater quantity of work’. It is a curious fact that in return for this cornucopia of service, you produce only one thing.
…
When a surfer named Larry Stanley first modified his surfboard to make jumping possible without parting company from the board, he never dreamed of selling the idea, but he told everybody how to do it including the manufacturers of boards and now his innovations can be bought in the form of new surfboards. The greatest lead-user innovation of all was probably the World Wide Web, devised by Sir Tim Berners-Lee in 1991 to solve the problem of sharing particle physics data between computers. Incidentally, nobody has yet suggested that research in software and surfboards must be government-funded because innovation in them would not happen without subsidy. In other words, we may soon be living in a post-capitalist, post-corporate world, where individuals are free to come together in temporary aggregations to share, collaborate and innovate, where websites enable people to find employers, employees, customers and clients anywhere in the world.
The Zero Marginal Cost Society: The Internet of Things, the Collaborative Commons, and the Eclipse of Capitalism
by
Jeremy Rifkin
Published 31 Mar 2014
Some of the best-known social media sites on the Web are revving up to find new ways to enclose, commercialize, and monopolize the new communications medium. And their bite is potentially far bigger than the companies managing the pipes. In a November 2010 article in Scientific American, Tim Berners-Lee, inventor of the World Wide Web, issued a damning missive on the twentieth anniversary of the day the Web first went live. He was concerned about what was happening to the Internet. Berners-Lee’s invention was simple in design and acute in impact. The Web allows anyone, anytime, anywhere to share information with anyone else without having to ask for permission or pay a royalty fee.
…
Kevin O’Brien, “Limiting Data Use in Germany,” New York Times, May 12, 2013, http://www.nytimes.com/2013/05/13/technology/deutsche-telekom-data-use-and-net-neutrality.html.
13. Ibid.
14. “Open Internet,” Federal Communications Commission, http://www.fcc.gov/openinternet#rules.
15. Brett Frischmann, Infrastructure: The Social Value of Shared Resources (New York: Oxford University Press, 2013), 349.
16. Tim Berners-Lee, “Long Live the Web: A Call for Continued Open Standards and Neutrality,” Scientific American, November 22, 2010, http://www.scientificamerican.com/article.cfm?id=long-live-the-web&print=true.
17. Ibid.
18. Ibid.
19. Ibid.
20. Matt Beswick, “Google Search Queries by the Numbers,” STAT, July 27, 2012, http://getstat.com/blog/google-search-queries-the-numbers/.
21.
Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World
by
Bruce Schneier
Published 2 Mar 2015
The EU has been trying to pass an updated and stricter data protection regulation, but faces furious lobbying from US Internet companies that don’t want to stop data collection. This newfound backbone to stand up to the NSA is more about managing user perceptions than about solving privacy problems. This is why we need strong regulations on corporations as well. A NEW MAGNA CARTA Tim Berners-Lee, the inventor of the World Wide Web, has called for a new Magna Carta—one that restricts the actions of both governments and corporations, and that imposes responsibilities on information-age corporations rather than just rights. The historical analogy is actually not that great, but the general idea is worth exploring.
…
James Fontanella-Khan (26 Jun 2013), “Brussels: Astroturfing takes root,” Financial Times, http://www.ft.com/cms/s/0/74271926-dd9f-11e2-a756-00144feab7de.html.
David Meyer (12 Mar 2014), “Web firms face a strict new set of privacy rules in Europe: Here’s what to expect,” Gigaom, http://gigaom.com/2014/03/12/web-firms-face-a-strict-new-set-of-privacy-rules-in-europe-heres-what-to-expect.
a new Magna Carta: Tim Berners-Lee (Dec 2010), “Long live the Web,” Scientific American, http://www.cs.virginia.edu/~robins/Long_Live_the_Web.pdf.
that imposes responsibilities: Jemima Kiss (11 Mar 2014), “An online Magna Carta: Berners-Lee calls for bill of rights for web,” Guardian, http://www.theguardian.com/technology/2014/mar/12/online-magna-carta-berners-lee-web.
We Are the Nerds: The Birth and Tumultuous Life of Reddit, the Internet's Culture Laboratory
by
Christine Lagorio-Chafkin
Published 1 Oct 2018
Eulogies by academics and friends at Swartz’s funeral and at a massive memorial service hosted by the Internet Archive in San Francisco on January 24 channeled some of Swartz’s own energies. They portrayed him as a provocative, brilliant young man who sought nothing more than to make this world a better place. They called for action at turns, and wallowed in sadness at others. Tim Berners-Lee, the computer scientist best known for creating the World Wide Web, posted on Twitter, “World wanderers, we have lost a wise elder. Hackers for right, we are one down. Parents all, we have lost a child. Let us weep.” Neither Huffman nor Ohanian attended either the funeral or the memorial. They mourned privately: Ohanian at the loss of another human passionate about ideas dear to his heart, Huffman at a man who’d been his best friend and closest confidant for the greater part of a very formative year.
…
“It genuinely opened his eyes”: Noam Scheiber, “The Inside Story of Why Aaron Swartz Broke Into MIT and JSTOR,” New Republic, February 13, 2013.
It was titled the Guerilla Open Access Manifesto: Aaron Swartz, “Guerilla Open Access Manifesto,” July 2008.
“get bossed around”: Aaron Swartz, “Aaron’s Patented Demotivational Seminar,” Raw Thought, March 27, 2007.
“What was so striking about Aaron”: “Sir Tim Berners-Lee pays tribute to Aaron Swartz,” Telegraph, January 14, 2013.
Brewster Kahle of the Internet Archive: Brewster Kahle, speaking at a memorial to Aaron Swartz, January 24, 2013.
Malamud emailed him back: Carl Malamud archives, Aaron Swartz email message 299, https://public.resource.org/aaron/pub/msg00299.html.
Fancy Bear Goes Phishing: The Dark History of the Information Age, in Five Extraordinary Hacks
by
Scott J. Shapiro
Sinofsky also happened to be Bill Gates’s technical assistant, responsible for alerting Microsoft’s CEO about cutting-edge technology. He fired off an email to his boss with the subject line “Cornell is WIRED!” It may be difficult to believe that in 1994, the web would be news to Bill Gates. The World Wide Web is a set of protocols developed by Tim Berners-Lee in 1989 to allow computers to share web pages over the internet. Web servers send web pages to web browsers. A web browser is a program that requests, receives, and displays web pages over the internet, much like an email program sends, receives, and displays emails over the internet. Though the web had existed for only five years, it was experiencing explosive growth.
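The browser’s half of that request-and-receive exchange is only a few lines of Python. A minimal sketch under our own assumptions (example.com is a placeholder host; any reachable web server would do):

```python
# The browser side of the exchange: open a connection, request a page,
# receive it back.
import http.client

conn = http.client.HTTPConnection("example.com", 80)
conn.request("GET", "/")                   # the browser asks for a page
response = conn.getresponse()              # the server sends it back
print(response.status, response.reason)    # e.g. 200 OK
print(response.read(200).decode())         # first bytes of the HTML
conn.close()
```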
…
: Bill Steele, “Gates Sees a Software-Driven Future Led by Computer Science,” Cornell Chronicle, March 4, 2004, news.cornell.edu/stories/2004/03/gates-sees-software-driven-future-led-computer-science. Gates contributed a new computer-science building to replace Upson Hall.
The World Wide Web: Tim Berners-Lee, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web (New York: Harper Business, 2000).
experiencing explosive growth: “Share of the Population Using the Internet, 1990 to 1995,” Our World in Data, accessed June 2021, https://ourworldindata.org/grapher/share-of-individuals-using-the-internet?
The Stack: On Software and Sovereignty
by
Benjamin H. Bratton
Published 19 Feb 2016
Jacob Appelbaum, Laura Poitras, and Trevor Paglen, “Art as Evidence” (panel at Transmediale, Haus der Kulturen der Welt, Berlin, 2014), http://www.transmediale.de/content/presentation-by-jacob-applebaum-at-transmediale-2014-keynote-art-as-evidence.
22. See https://www.youtube.com/watch?v=ZJ6BuHL0EiQ. Thanks to Serene Han, programmer at Google Ideas, for helping me understand the uProxy technology.
23. See https://webwewant.org/, http://www.theguardian.com/technology/2014/sep/28/tim-berners-lee-internet-bill-of-rights-greater-privacy, and Agence France-Presse, “Tim Berners-Lee Calls for Internet Bill of Rights to Ensure Greater Privacy,” 2014.
24. See http://www.offnow.org/11361, a petition to cut off the water supply to the NSA's Utah data center.
25. Eliezer Yudkowsky, “Artificial Intelligence as a Positive and Negative Factor in Global Risk,” 2008, https://intelligence.org/files/AIPosNegFactor.pdf.
26.
…
An awkward-at-best vision of this at the scale of an entire nation might look like UK Prime Minister David Cameron's Big Society initiative, which sought to “integrate the free market with a theory of social solidarity based on hierarchy and voluntarism,” or in other words, offload state services onto local charities wrapped in words like “social media” and “decentralization.” See http://en.wikipedia.org/wiki/Big_Society.
38. Or even agents well beyond that government's borders, as today much of the US federal government data entry is done in India.
39. See the US data.gov and the UK's data.gov.uk (overseen by World Wide Web inventor Sir Tim Berners-Lee) as very curtailed and limited examples of this federal information transparency.
40. This refers to Robert Nozick's notorious economic libertarian manifesto, Anarchy, State and Utopia (New York: Basic Books, 1974), where he introduces the conceptual ideal of the minimal state. See also Metahaven's “ultraminimal state” and its Facestate project: Andrea Hyde, “Metahaven's Facestate,” Walker Art Center, December 13, 2011, http://www.walkerart.org/magazine/2011/metahavens-facestate.
41.
The Network Imperative: How to Survive and Grow in the Age of Digital Business Models
by
Barry Libert
and
Megan Beck
Published 6 Jun 2016
It’s a wonderful opportunity to improve operations, decrease waste, and better serve our customers and the world. It’s time to figure out what opportunities are most exciting for you and your organization, and start the big data adventure. PRINCIPLE 9 BOARDS From Governance to Representation We need diversity of thought in the world to face the new challenges. —Tim Berners-Lee, inventor of the World Wide Web IT’S NOT EASY TO MANAGE A COMPANY YOU DON’T UNDERSTAND. In 2000, Kellogg Company, known for popular brands such as Froot Loops, Pop-Tarts, Frosted Flakes, and Pringles, purchased a small food company called Kashi. Kashi was a start-up that played in a similar part of the food market as Kellogg—with cereals, snack bars, crackers, and prepared foods—but Kashi had a mission focused on nutritious, plant-based foods and sustainable, ethical farming practices.
Digital Transformation at Scale: Why the Strategy Is Delivery
by
Andrew Greenway,Ben Terrett,Mike Bracken,Tom Loosemore
Published 18 Jun 2018
Few people wanted to do their business with the government online, put off by the poor design and incomprehensible jargon. It was still easier and quicker for them to pick up the phone or post a form. The UK scored 10th place on the United Nations’ e-Government ranking7; not exactly terrible, but hardly a source of pride for a country that counts the father of the web, Sir Tim Berners-Lee, among its citizens. In the midst of this, the country was emerging from the biggest recession since before World War II,8 led by its first coalition government in 65 years. The UK’s situation at the time was, and is, fairly typical. The US federal government, for purposes of comparison, invests more than $80 billion a year in IT – a figure exceeding the projected gross domestic product (GDP) of nearly two-thirds of the world’s nations.9 According to the Government Accountability Office, ‘these investments frequently fail, incur cost overruns and schedule slippages, or contribute little to mission-related outcomes’.
The Data Journalism Handbook
by
Jonathan Gray
,
Lucy Chambers
and
Liliana Bounegru
Published 9 May 2012
It used to be that you would get stories by chatting to people in bars, and it still might be that you’ll do it that way sometimes. But now it’s also going to be about poring over data and equipping yourself with the tools to analyze it and pick out what’s interesting. And keeping it in perspective, helping people out by really seeing where it all fits together, and what’s going on in the country.
— Tim Berners-Lee, founder of the World Wide Web

Number-Crunching Meets Word-Smithing

Data journalism is bridging the gap between stat technicians and wordsmiths. Locating outliers and identifying trends that are not just statistically significant, but relevant to de-compiling the inherently complex world of today
Bitcoin: The Future of Money?
by
Dominic Frisby
Published 1 Nov 2014
The database provides the mathematical evidence – the so-called ‘crypto-proof’ – on which Bitcoin is based. It is called the ‘block chain’.

The evolution of digital cash – and the monies that failed

In 1955 and 1956, a generation of computer geniuses was born. This game-changing cohort includes Tim Berners-Lee, Bill Gates, Steve Jobs – and a little-known mathematician by the name of David Chaum. Chaum was one of the pioneers of early cryptography and the grandfather of digital cash. He first proposed the idea of digital cash in 1982,15 then developed his thinking in 1985 with the fantastically titled paper, Security Without Identification: Transaction Systems to Make Big Brother Obsolete.
How We Got Here: A Slightly Irreverent History of Technology and Markets
by
Andy Kessler
Published 13 Jun 2005
New companies like UUNET and America OnLine would use cisco routers in the middle of their networks to move packets around, as well as at the edge of their networks to connect to banks of dialup modems so users could call in and connect. In 1991, a physicist at the Particle Physics Institute CERN in Geneva, Tim Berners-Lee, was tired of the hassle involved in sharing research amongst scientists. Leveraging the hypertext Doug Engelbart demonstrated 23 years earlier, he wrote some code creating links across computer networks, and called it the World Wide Web. He didn’t patent it, didn’t start a company, and didn’t get royalties.
You Are Not a Gadget
by
Jaron Lanier
Published 12 Jan 2010
In the early 1990s, there were perhaps dozens of credible efforts to come up with a design for presenting networked digital information in a way that would attract more popular use. Companies like General Magic and Xanadu developed alternative designs with fundamentally different qualities that never got out the door. A single person, Tim Berners-Lee, came to invent the particular design of today’s web. The web as it was introduced was minimalist, in that it assumed just about as little as possible about what a web page would be like. It was also open, in that no page was preferred by the architecture over another, and all pages were accessible to all.
Built for Growth: How Builder Personality Shapes Your Business, Your Team, and Your Ability to Win
by
Chris Kuenne
and
John Danner
Published 5 Jun 2017
Blakely said, “This discovery allowed me to find my purpose, which was to help women.”2 The Explorer is attracted first to the problem itself—ideally a thorny one worthy of his or her time and talents. Often this happens by chance, as it did when Tom Leighton was working at MIT down the hall from Tim Berners-Lee, the inventor of the World Wide Web. One day, Berners-Lee told Leighton he believed traffic congestion and volume could significantly limit the potential and growth of the web itself. That casual conversation got Leighton and his graduate assistant, Danny Lewin, intrigued with how such a massive problem might be tackled.
Content Everywhere: Strategy and Structure for Future-Ready Content
by
Sara Wachter-Boettcher
Published 28 Nov 2012
A completely semantic Web is a lofty goal—one not without its detractors, I might note—and our path toward it is still meandering at best. But a more semantic Web seems closer than ever with the recent advent of linked data, which is made possible through structured content and markup. Coined by Tim Berners-Lee—yes, the guy who invented the World Wide Web—in 2006, linked data means exactly what it sounds like: bits of information that are linked to other, equivalent sets of data elsewhere on the Internet (often referred to as “in the cloud”), as illustrated in Figure 6.1. The idea is that, as opposed to HTML links, which link one document (e.g., a page) to another, linked data connects the things those pages are about by connecting the actual data behind those two pages instead.
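As a concrete sketch of linked data (ours, built with the rdflib library and invented URIs, not an example from the chapter), the owl:sameAs triple below links the thing our data describes to the equivalent data behind another site’s page:

```python
# A minimal linked-data sketch: the link joins the *things* two pages
# describe, not the pages themselves. All URIs here are illustrative.
from rdflib import Graph, URIRef, Literal
from rdflib.namespace import FOAF, OWL

g = Graph()
author = URIRef("http://example.org/people/ada")      # the thing itself
same_author = URIRef("http://other-site.org/id/ada")  # same thing, elsewhere

g.add((author, FOAF.name, Literal("Ada Lovelace")))
# owl:sameAs asserts that the two identifiers denote one real-world
# entity, linking our data to the equivalent data "in the cloud".
g.add((author, OWL.sameAs, same_author))

print(g.serialize(format="turtle"))
```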
Confessions of a Crypto Millionaire: My Unlikely Escape From Corporate America
by
Dan Conway
Published 8 Sep 2019
I updated my LinkedIn profile, changed my Twitter page, revised Medium, and posted my new career status on Facebook. While I was free from Acme, I was still in the grips of the centralized behemoths that controlled the modern Internet. These massive gated communities have taken a cut of everything, sold our data, and made all of the rules. It wasn’t supposed to be this way. In the 1970s and 1980s, Tim Berners-Lee, the inventor of the World Wide Web, was driven by the idea that the Internet wouldn’t be owned by anyone, that it would be a free and rollicking information superhighway. He is now working on new blockchain-heavy standards for what is collectively called Web 3.0. The goal is to move the Internet away from the current power brokers and give freedom and control back to the people.
This Is Not Normal: The Collapse of Liberal Britain
by
William Davies
Published 28 Sep 2020
Just as ‘sharing economy’ platforms such as Uber and Airbnb have recently been thwarted by legal rulings (Uber being compelled to recognise drivers as employees, Airbnb being banned altogether by some municipal authorities), privacy and human rights law represents a potential obstacle to the extension of data analytics. What is less clear is how the benefits of digital analytics might ever be offered to the public, in the way that many statistical data sets are. Bodies such as the Open Data Institute, founded by Tim Berners-Lee among others, campaign to make data publicly available, but have little leverage over the corporations where so much of our data now accumulates. Statistics began life as a tool through which the state could view society, but gradually developed into something that academics, civic reformers and businesses had a stake in.
Money: The True Story of a Made-Up Thing
by
Jacob Goldstein
Published 14 Aug 2020
Alan Greenspan, the Fed chairman who loved to warn against excessive regulation of everything, warned against excessive regulation of digital cash. “I am especially concerned that we not attempt to impede unduly our newest innovation, electronic money,” he said. (Much later, after the financial crisis, Greenspan would say that he had warned against regulation too much.) In 1994, Tim Berners-Lee, who invented the Web, invited Chaum to open the “First International Conference on the World Wide Web” in Geneva. By the end of 1995, DigiCash was working with banks in the United States, Switzerland, Germany, Australia, and Japan. The technology was in place. Massive financial institutions were behind it.
Walled Culture: How Big Content Uses Technology and the Law to Lock Down Culture and Keep Creators Poor
by
Glyn Moody
Published 26 Sep 2022
Pattern Breakers: Why Some Start-Ups Change the Future
by
Mike Maples
and
Peter Ziebelman
Published 8 Jul 2024
NCSA stood as a crucial node in a network of minds and machines igniting the early wildfire called the internet. Andreessen and his colleague Eric Bina were living at technology’s front lines, tinkering at its boundaries. At the same time, the US government was starting to open the internet for commercial use. Tim Berners-Lee had invented the World Wide Web and recently turned it loose in the public domain. Marc and Eric were living in the future. Despite the fact that they had access to technology most could only dream about, the software that could unlock its potential was primitive at best. Seeing firsthand what was lacking, Marc and Eric set out to build the software they thought should exist.
In the Plex: How Google Thinks, Works, and Shapes Our Lives
by
Steven Levy
Published 12 Apr 2011
“It was science fiction more than computer science,” recalls Winograd. But an outlandish mind was a valuable asset, and there was definitely a place in the current science to channel wild creativity. In 1995, that place was the World Wide Web. It had sprung from the restless brain of a then-obscure British engineer named Tim Berners-Lee, who was working as a technician at the CERN physics research lab in Switzerland. Berners-Lee could sum up his vision in a sentence: “Suppose all the information stored on computers everywhere were linked … there would be a single global information space.” The web’s pedigree could be traced back to a 1945 paper by the American scientist Vannevar Bush.
…
Michael Brin also talked about his son in Tom Howell, “Raising an Internet Giant,” University of Maryland Diamondback; and Adam Tanner, “Google Co-founder Lives Modestly, Émigré Dad Says,” USA Today, April 6, 2004; and Mark Malseed, “The Story of Sergey Brin,” Moment, February 2007. Malseed expanded on his research in The Google Story.
15 “Suppose all the information”: Tim Berners-Lee, Weaving the Web (New York: HarperBusiness, 2000), p. 4.
15 The web’s pedigree: I give a detailed account of the work of Bush, Engelbart, and Atkinson in Insanely Great: The Story of Macintosh, the Computer That Changed Everything (New York: Penguin, 1994), and discuss Nelson’s work in Hackers: Heroes of the Computer Revolution (New York: Doubleday, 1984).
16 personalized movie ratings: Sergey Brin, résumé at http://infolab.stanford.edu/~sergey/.
17 “Why don’t we use the links”: Page and Brin spoke to me in 2002 about developing the early search engine, a subject we also discussed in conversations in 1999, 2001, and 2004.
17 “The early versions of hypertext”: Battelle, The Search, p. 72.
20 “For thirty years”: Carolyn Crouch et al., “In Memoriam: Gerald Salton, March 8, 1927–August 28, 1995,” Journal of the American Society for Information Science 47(2), 108; “Salton Dies; Was Leader in Information Retrieval Field,” Computing Research Association website.
20 the web was winning: I looked at the state of web search in “Search for Tomorrow,” Newsweek, October 28, 1996.
21 “The idea behind PageRank”: John Ince, “The Lost Google Tapes,” a series of interviews with Google.
Lonely Planet Switzerland
by
Lonely Planet
In 1523, the city adopts his reform proposals.
1590–1600 Some 300 women in Vaud are captured, tortured and burned alive on charges of witchcraft, even as Protestants in other Swiss cantons strive to end witch hunts.
1847 ‘Hare shoot’ civil war between Protestants and Catholics lasts just 26 days, leaving 86 dead and 500 wounded, and paving the way for the 1848 federal constitution.
1863 After witnessing slaughter and untended wounded at the Battle of Solferino in 1859 in northern Italy, businessman and pacifist Henri Dunant co-founds the International Red Cross in Geneva.
1918 With a sixth of the population living below the poverty line and 20,000 dead of a flu epidemic, workers strike; the 48-hour week is among the long-term results.
1940 General Guisan’s army warns off WWII invaders; 430,000 troops are placed on borders but most are put in Alpine fortresses to carry out partisan war in case of German invasion.
1979 Five years after a first vote in favour in 1974, the Jura (majority French-speaking Catholics), absorbed by Bern in 1815, leaves Bern (German-speaking Protestants), becoming an independent canton.
1990 The internet is ‘born’ at Geneva’s CERN, where Tim Berners-Lee develops HTML, the language used to prepare pages for the World Wide Web and link text to graphics.
2001 National airline Swissair collapses, a gun massacre in Zug parliament kills 14 politicians, 21 people perish in a canyoning accident and 11 die in a fire in the St Gotthard Tunnel.
2008 The world financial crisis affects Switzerland’s two biggest banks, UBS and Credit Suisse.
…
He gained his doctorate in 1905 and subsequently became a professor in Zürich, remaining in Switzerland until 1914, when he moved to Berlin. Bern’s Einstein-Haus museum tells the full story. The internet, meanwhile, was born in Geneva at the European Organization for Nuclear Research, better known as CERN, on Christmas Day 1990. The genius behind the global information-sharing tool was Oxford graduate Tim Berners-Lee, a software consultant for CERN who started out creating a program for the research centre to help its hundreds of scientists share their experiments, data and discoveries. Two years on it had become a dramatically larger and more powerful beast than anyone could have imagined. Equally dramatic, large and powerful is CERN’s Large Hadron Collider, where Geneva scientists play God with Big Bang experiments.
I Live in the Future & Here's How It Works: Why Your World, Work, and Brain Are Being Creatively Disrupted
by
Nick Bilton
Published 13 Sep 2010
Lee, Gina Blaber, Brady Forrest, Kenyatta Cheese, Matt Buchanan, Andrea Sheehan, Scott Beale, Ori, Mor Naaman, Kim Naci, Mike Sharon, Jason Brush, Derek Gottfrid & Nick Thuesen, Jeff Koyen, Peter Ng, Bruce Headlam, Rex Sorgatz, Chad and Summer, Jennifer Magnolfi, Kio Stark, Nick Kristoff, John & Deirdre, Bob and Jamie, Ryan B., Marc and Tiff, Max and Roisin, Andrei K., Kevin E., Morgan, Leanne Citrone, Michael Citrone, Wuca & Pillow, Terry Bilton, Sandra and David Reston, Eboo Bilton and Weter, Betty and Len Bilton, Stephen, Amanda, Ben and Posh Jacobs, Daniel Jacobs, Ivan & Elsa Marin, Nathalie Marin, Chris Marin, Andy, Carm, George Jr., George Sr., Sonia, Joe, Chela, Tony, Jim, Andrea, Stephanie, Jessica, Lindsay, Diego and Yvonne, Cesar and Beatriz Southside, Sam H., Ariel Kaminer, Vint Cerf, Larry and Sergey, Tim Berners-Lee, Steve Jobs, and Bill Gates. Smallest, But Not Least Pixel, Hip Hop, & Magnolia. Kthxbye! notes and sources The following sources represent a portion of the research and interviews used for this book. Additional links, reference papers, and interview quotes can be found online at nickbilton.com.
Democratizing innovation
by
Eric von Hippel
Published 1 Apr 2005
Lead users that generate innovations of interest to manufacturers can reside, as we have seen, at the leading edges of target markets, and also in advanced analog markets. The innovations that some lead users develop are certainly disruptive from the viewpoint of some manufacturers—but the lead users are unlikely to care about this. After all, they are developing products to serve their own needs. Tim Berners-Lee, for example, developed the World Wide Web as a lead user working at CERN—a user of that software. The World Wide Web was certainly disruptive to the business models of many firms, but this was not Berners-Lee’s concern. Lead users typically have no reason to lead, mislead, or even contact manufacturers that might eventually benefit from or be disrupted by their innovations.
The Inner Lives of Markets: How People Shape Them—And They Shape Us
by
Tim Sullivan
Published 6 Jun 2016
E-Commerce Comes of Age

Skoll came to Silicon Valley just as Omidyar and others were trying to figure out how to transform the World Wide Web into something that could serve as a platform for transparent market exchange. Part of the challenge was that it was created for an entirely different purpose. The web began as an information management system, proposed in 1989 by computer scientist Tim Berners-Lee to handle the ever-expanding and interconnected data created by nuclear researchers at the CERN laboratories, where he worked as a software engineer. There was no thought of buyers, sellers, or markets. Berners-Lee conceived of knowledge as an interconnected network, which made it possible for the web to eventually grow into communities of interlinked buyers and sellers.
Delete: The Virtue of Forgetting in the Digital Age
by
Viktor Mayer-Schönberger
Published 1 Jan 2009
As long as a signal can be digitized, it can be processed using standard digital information devices like personal computers, stored in off-the-shelf storage products like hard disks, and transmitted across the world using the Internet. The evolution of the Internet itself is an excellent case in point. Initially, the Internet was used to share a computer’s information processing power among multiple (distant) users. Later, file transfer and e-mail were added as new services using the same network. And when Tim Berners-Lee invented the WorldWideWeb, it rested on existing Internet infrastructure, just like later streaming audio and video as well as Internet telephony (called voice-over-IP). Of course, over time the Internet’s plumbing has been updated and revised, but its basic building principles have remained stable.
Cogs and Monsters: What Economics Is, and What It Should Be
by
Diane Coyle
Published 11 Oct 2021
The answer will need to take account of the distinctive feature of the economy described in Chapter Five—its non-convexities—and grapple (again) with the centrality of information. Policy in the Digital Age Economy Digital has been transformative. The scope of the changes the world has seen ranges from the automation of manufacturing from the 1980s on, and the waves of outsourcing and offshoring, to Tim Berners-Lee’s 1989 invention of the Web, to the 2007 confluence of smartphones, 3G/4G, and algorithms that have us all online, everywhere, always. Global production chains, e-commerce, social media, digital platforms, are all made possible by the technological and business innovations. And there is more to come as AI advances, and merges with other areas of innovation such as genomics, additive manufacturing, green energy and transport transition, or advanced materials.
The Smartphone Society
by
Nicole Aschoff
Moreover, in the early nineties, when the internet finally looked like it could be profitable, after decades of taxpayer-funded research and development, the US government quietly gave it away to a handful of private telecom providers—a move some deem the largest financial giveaway in American history.7 Nonetheless, a little bit of the internet fairy dust was real. The World Wide Web’s creator was Tim Berners-Lee, a man who saw the internet as a space where knowledge and information could be collectively created, owned, and shared—a digital commons. Anyone could create a webpage to share musings and ideas with the world, and thanks to net neutrality, everyone could see these sites if they were connected to the web.
Dual Transformation: How to Reposition Today's Business While Creating the Future
by
Scott D. Anthony
and
Mark W. Johnson
Published 27 Mar 2017
An even more important event took place later that same year when Marc Andreessen and his team introduced a beta version of the Netscape browser. Since the late 1960s, academics and defense officials had been experimenting with using a distributed network of computer connections to communicate and collaborate. The Netscape browser—coupled with Tim Berners-Lee’s invention of HyperText Markup Language (HTML) and uniform resource locators (URLs), along with a range of complementary innovations—allowed even the layperson to ride the so-called information superhighway. The disruptive effects of this internet-enabling technology reshaped the media business.
Rebel Ideas: The Power of Diverse Thinking
by
Matthew Syed
Published 9 Sep 2019
People are connected not just socially but digitally. The Internet has created a hyperspace that spans the globe and can be triggered instantly. We have unprecedented access to diverse opinions, beliefs, ideas and technologies, all at the click of a mouse. This was, of course, the original vision of the Internet by Tim Berners-Lee: a place where a scientific community could share research and ideas. And this has driven all manner of recombinant innovations. The Internet has been a positive in many profound ways. But high diversity in the overall network has the potential to create paradoxical effects in local networks.
The Dark Cloud: How the Digital World Is Costing the Earth
by
Guillaume Pitron
Published 14 Jun 2023
See ‘L’Internet pendant le confinement’ [the Internet during lockdown], framablog.org, 21 March 2020.
‘Why the world is short of computer chips, and why it matters’, Bloomberg, 17 February 2021.
5 Better known by its abbreviation ‘www’. The term was coined by the father of the Web, British physicist Tim Berners-Lee. For further reading on predictions of the Internet’s expansion to all of humanity, read ‘Humans on the Internet will triple from 2015 to 2022 and hit 6 billion’, Cybercrime Magazine, 18 July 2019.
6 ‘10 hot summer trends 2030’, Ericsson ConsumerLab, December 2019.
7 ‘Giant cell blob can learn and teach, study shows’, Science News, 21 December 2016.
8 Interview with Inès Leonarduzzi, director of Digital For The Planet, 2019.
9 Interviews with Françoise Berthoud, IT research engineer, 2019 and 2020.
10 ‘Lean ICT: Towards Digital Sobriety’, report of the working group directed by Hugues Ferreboeuf for the think tank The Shift Project, March 2019.
11 Interview with Jaan Tallinn, founder of Skype and the Future of Life Institute, 2020.
12 Google, Apple, Facebook, Amazon and Microsoft: the five most powerful US companies of the digital economy.
13 To borrow the term used by Agnès Crepet, Head of Software Longevity & IT at Fairphone.
14 fridaysforfuture.org
15 The start-up in question is We Don’t Have Time: wedonthavetime.org
16 ‘What’s Behind Climate Change Activist Greta Thunberg’s Remarkable Rise to Fame?’
The Signal and the Noise: Why So Many Predictions Fail-But Some Don't
by
Nate Silver
Published 31 Aug 2012
The same sciences that uncover the laws of nature are making the organization of society more complex. Technology is completely changing the way we relate to one another. Because of the Internet, “the whole context, all the equations, all the dynamics of the propagation of information change,” I was told by Tim Berners-Lee, who invented the World Wide Web in 1990.4 The volume of information is increasing exponentially. But relatively little of this information is useful—the signal-to-noise ratio may be waning. We need better ways of distinguishing the two. This book is less about what we know than about the difference between what we know and what we think we know.
…
This book is fairly scrupulous about citing the origin of its ideas, but some people I interviewed were more influential in determining its direction than might be inferred by the number of times that they appear in the text. This list includes Daniel Kahneman, Vasik Rajlich, Dr. Alexander “Sandy” McDonald, Roger Pielke Jr., John Rundle, Thomas Jordan, Irene Eckstrand, Phil Gordon, Chris Volinsky, Robert Bell, Tim Berners-Lee, Lisa Randall, Jay Rosen, Simon Jackman, Diane Lauderdale, Jeffrey Sachs, Howard Lederer, Rodney Brooks, Henry Abbott, and Bruce Bueno de Mesquita among others. I hope to return all these favors someday. I will start by buying the first beer for anybody on this list, and the first three for anybody who should have been, but isn’t.
Energy and Civilization: A History
by
Vaclav Smil
Published 11 May 2017
In 1972 Ray Tomlinson of BBN Technologies designed programs for sending messages to other computers and chose the @ sign as the locator symbol for email addresses (Tomlinson 2002). In 1983 ARPANET converted to a protocol that made it possible to communicate across a system of networks, and by 1989, when it ended its operation, it had more than 100,000 hosts. A year later Tim Berners-Lee created the hypertext-based World Wide Web at Geneva’s CERN in order to organize online scientific information (Abbate 1999). The early web was not easy to navigate, but that changed rapidly with the introduction of efficient browsers, starting with Netscape in 1993. The first major electronic advance in telephony was the possibility of inexpensive intercontinental calls, thanks to automatic dialing via geostationary satellites.
…
Viking spacecraft lands on Mars
1977 Human-powered flight of Gossamer Condor
1979 OPEC’s second round of crude oil price increases (until 1981)
1980s Ownership of personal computers takes off; more efficient appliances and cars; concerns about global environmental change; genetic engineering takes off
1982 CD player (Philips, Sony)
1983 French TGV starts operation (Paris–Lyon)
1985 Antarctic ozone hole identified
1986 Chornobyl nuclear reactor disaster
1989 World Wide Web introduced (Tim Berners-Lee)
1990 Global population surpasses 5 billion
1994 Netscape launched
1999 Mass adoption of smart phones begins
2000s Widespread installation of wind turbines and PV cells
2000 German Energiewende begins
2003 Three Gorges Dam completed (Yangzi river, China)
2007 Hydraulic fracturing takes off in the United States
2009 China becomes world’s largest consumer of energy
2011 Tsunami and mismanagement cause Fukushima nuclear disaster; global population reaches 7 billion
2014 United States is once again the world’s largest producer of natural gas
2015 Average concentration of atmospheric CO2 reaches 400 ppm

Power in History

Power ratings: from a candle to global civilization (actions, prime movers, converters – power in W)
Small wax candle burning (800 BCE) – 5
Egyptian boy turning Archimedean screw (500 BCE) – 25
Small U.S. windmill rotating (1880) – 30
Chinese woman cranking a winnowing machine (100 BCE) – 50
Steadily working French glass polishers (1700) – 75
Strong man treading rapidly a wooden wheel (1400) – 200
Donkey turning a Roman hourglass mill (100 BCE) – 300
Weak pair of Chinese oxen plowing (1900) – 600
Good English horse turning a whim (1770) – 750
Dutch treadwheel powered by eight men (1500) – 800
Very strong American horse pulling a wagon (1890) – 1,000
Long-distance runner at the Olympic Games (600 BCE) – 1,400
Roman vertical waterwheel turning a millstone (100 CE) – 1,800
Newcomen’s atmospheric engine pumping water (1712) – 3,750
Engine of Ransom Olds’s Curved Dash automobile (1904) – 5,200
Greek penteconter with 50 oarsmen at full speed (600 BCE) – 6,000
Large German post windmill crushing oilseeds (1500) – 6,500
Roman messenger horse galloping (200 CE) – 7,200
Large Dutch windmill draining a polder (1750) – 12,000
Engine of Ford Model T at full speed (1908) – 14,900
Greek trireme with 170 oarsmen at full speed (500 BCE) – 20,000
Watt’s steam engine winding coal (1795) – 20,000
Team of 40 horses pulling a California combine (1885) – 28,000
Cascade of 16 Roman water mills at Barbegal (350 CE) – 30,000
Benoît Fourneyron’s first water turbine (1832) – 38,000
Water pumps for Versailles at Marly (1685) – 60,000
Engine of Honda Civic GL (1985) – 63,000
Charles Parsons’s steam turbine (1888) – 75,000
Steam engine at Edison’s Pearl Street Station (1882) – 93,200
Watt’s largest steam engine (1800) – 100,000
Electricity use by a U.S. supermarket (1980) – 200,000
Diesel engine of a German submarine (1916) – 400,000
Lady Isabella, the world’s largest waterwheel (1854) – 427,000
Large steam locomotive at full speed (1890) – 850,000
Parsons’s steam turbine at Elberfeld Station (1900) – 1,000,000
Shaw’s water works at Greenock, Scotland (1840) – 1,500,000
Large wind turbine (2015) – 4,000,000
Rocket engine launching V-2 missile (1944) – 6,200,000
Gas turbine powering a pipeline compressor (1970) – 10,000,000
Japanese merchant ship’s diesel engine (1960) – 30,000,000
Four jet engines of Boeing 747 (1969) – 60,000,000
Calder Hall nuclear reactor (1956) – 202,000,000
Turbogenerator at Chooz nuclear power plant (1990) – 1,457,000,000
Rocket engines launching Saturn C 5 (1969) – 2,600,000,000
Kashiwazaki-kariwa nuclear power station (1997) – 8,212,000,000
Japan’s primary energy consumption (2015) – 63,200,000,000
U.S. coal and biomass energy consumption (1850) – 79,000,000,000
U.S. commercial energy consumption (2010) – 3,050,000,000,000
Global commercial energy consumption (2015) – 17,530,000,000,000

Maximum power of prime movers in field work, 1700–2015 (year, actions/prime movers – power in W)
1700 Chinese peasant hoeing a cabbage field – 50
1750 Italian peasant harrowing with an old weak ox – 200
1800 English farmer plowing with two small horses – 1,000
1870 North Dakota farmer plowing with six powerful horses – 4,000
1900 California farmer using 32 horses to pull a combine – 22,000
1950 French farmer harvesting with a small tractor – 50,000
2015 Manitoba farmer plowing with a large Diesel tractor – 298,000

Maximum power of prime movers in land transportation, 1700–2015 (year, prime movers – power in W)
1700 Two oxen pulling a cart – 700
1750 Four horses pulling a coach – 2,500
1850 English steam locomotive – 200,000
1900 The fastest American steam locomotive – 1,000,000
1950 Powerful German diesel locomotive – 2,000,000
2006 French TGV train by Alstom – 9,600,000
2015 N700 series high-velocity shinkansen train – 17,080,000

Average annual consumption (GJ/capita) of primary energy
Note: All rates are rounded to the nearest 5 and include all phytomass (traditional and modern biofuels), fossil fuels, and primary electricity.
Business Metadata: Capturing Enterprise Knowledge
by
William H. Inmon
,
Bonnie K. O'Neil
and
Lowell Fryman
Published 15 Feb 2008
Semantics and Business Metadata

11.1 Introduction
11.2 The Vision of the Semantic Web
11.3 The Importance of Semantics
11.4 Attempts to Capture Semantics: Semantic Frameworks
11.5 Semantics as Business Metadata
11.6 Semantics in Practice
11.7 Summary
11.8 References

11.1 Introduction
Semantics, a subject that has great depth and breadth, can only be viewed here in very broad overview, focusing specifically on semantics as a type of business metadata. After a brief survey of semantics and semantic technology, we will cover the relationship of semantics and business metadata.

11.2 The Vision of the Semantic Web
Tim Berners-Lee envisioned the idea of the “semantic web,” wherein intelligent agents would be truly intelligent. In his vision the computer would know exactly what “booking a restaurant reservation” meant, as well as all the underlying tasks associated with it.
The Facebook era: tapping online social networks to build better products, reach new audiences, and sell more stuff
by
Clara Shih
Published 30 Apr 2009
PCs provided the advantage JPMorgan’s relatively less-established investment banking business (the original investment banking business had been spun off in 1933 to form Morgan Stanley) needed to catch up and compete with Morgan Stanley and Goldman Sachs. The company was acquired by Chase Manhattan Corporation in 2000 and merged with Bank One Corporation in 2004. The World Wide Web The ‘90s were defined largely by the advent of the World Wide Web, developed by Tim Berners-Lee working with Robert Cailliau at CERN. E-mail, instant messaging, and Web conferencing applications dramatically improved communication capacity for businesses while drastically reducing costs. Web sites, online news, and search engines like Infoseek, Lycos, Yahoo!, Excite, and Google began providing affordable, real-time information for workers as well as a new medium for reaching customers. eBay, PayPal, and commerce sites like Amazon.com proved the feasibility and popularity of self-service transactions.
Pax Technica: How the Internet of Things May Set Us Free or Lock Us Up
by
Philip N. Howard
Published 27 Apr 2015
In 1991 a group of hard-line Communist leaders tested Mikhail Gorbachev’s reforms in the Soviet Union. Dedicated citizens wouldn’t give up their cause and kept up their acts of civil disobedience. Boris Yeltsin made an impassioned plea from atop a tank in front of Russia’s parliament buildings, and the hard-liners lost. Yet that was also the year that Tim Berners-Lee published the first text on a webpage and demonstrated how large amounts of content could be made widely available over digital networks. Within only a few years, idealistic new social movements like the Zapatistas were using the internet to advertise their struggle and build international audiences.
The Dark Net
by
Jamie Bartlett
Published 20 Aug 2014
Unlike the cloistered Arpanet, Usenet and BBS, the forerunners of the chat room and forum, were available to anyone with a modem and a home computer. Although small, slow and primitive by today’s standards, they were attracting thousands of people intrigued by a new virtual world. By the mid-nineties and the emergence of Tim Berners-Lee’s World Wide Web, the internet was fully transformed: from a niche underground haunt frequented by computer hobbyists and academics, to a popular hangout accessed by millions of excited neophytes.fn2 According to John Naughton, Professor of the Public Understanding of Technology at the Open University, cyberspace at this time was more than just a network of computers.
The Great Fragmentation: And Why the Future of All Business Is Small
by
Steve Sammartino
Published 25 Jun 2014
It’s this idea that creates gifts to humanity in the technology age. It’s why resources such as Wikipedia, Linux or even the blogosphere exist. It’s about people who have undertaken projects to create value for others, often to the point where they set up platforms that others can make fortunes from. The gift Sir Tim Berners-Lee gave the world with the world wide web itself has been the platform for some of the quickest and biggest fortunes ever created on planet earth. It’s the real human needs of connection, collaboration and community that drive us; the need to feel valued, appreciated and wanted. And while many people mistake financial achievement as a means of filling that void, the smart money is on embarking on projects that aim for human fulfilment.
Big Data: A Revolution That Will Transform How We Live, Work, and Think
by
Viktor Mayer-Schonberger
and
Kenneth Cukier
Published 5 Mar 2013
Even in reticent Britain, where a lot of government information has been locked up by Crown Copyright and has been difficult and costly to license for use (such as postal codes for e-commerce companies), there has been substantial progress. The UK government has issued rules to encourage open information and supported the creation of an Open Data Institute, co-directed by Tim Berners-Lee, the inventor of the World Wide Web, to promote novel uses of open data and ways to free it from the state’s grip. The European Union has also announced open-data initiatives that could soon become continent-wide. Countries elsewhere, such as Australia, Brazil, Chile, and Kenya, have issued and implemented open-data strategies.
A Pelican Introduction: Basic Income
by
Guy Standing
Published 3 May 2017
He told Bloomberg, ‘I’m fairly confident that at some point in the future, as technology continues to eliminate traditional jobs and massive new wealth gets created, we’re going to see some version of this [basic income] at a national scale.’21 In another interview, he put that point at ‘no fewer than 10 years’ and ‘no more than 100’.22 However, the immediate problem is one of income distribution rather than a sudden disappearance of work for humans to do. Indeed, this could be the first technological revolution that is generating more work, even though it is disrupting and replacing paid labour.23 But it is contributing to the growing inequality of income. Tim Berners-Lee, inventor of the World Wide Web, says he supports basic income as a tool for correcting massive inequality brought about by technology.24 So does Stephen Hawking, the acclaimed physicist and cosmologist.25 Even senior economists in the International Monetary Fund have concluded that rising technology-induced inequality means that ‘the advantages of a basic income financed out of capital taxation become obvious’.26 Basic income would be a way in which all would benefit from economic gains resulting from technological advance.
Blitzscaling: The Lightning-Fast Path to Building Massively Valuable Companies
by
Reid Hoffman
and
Chris Yeh
Published 14 Apr 2018
Trade-oriented principalities like the Republic of Venice provided a welcoming ecosystem for merchants, complete with currency and the rule of law, as well as taxes to harvest the value of the platform. Technology platforms like Microsoft Windows demonstrated the power of being the chosen platform on which businesses were built back when the World Wide Web was still a glimmer in Tim Berners-Lee’s eye (Sir Berners-Lee wrote his proposal for a global hypertext system in 1989). Yet despite the proven value of platforms in the pre-Internet era, the Networked Age has made them vastly more powerful and valuable. Rather than being limited like the Republic of Venice to a specific geography, today’s software-based platforms can achieve global distribution almost immediately.
Restarting the Future: How to Fix the Intangible Economy
by
Jonathan Haskel
and
Stian Westlake
Published 4 Apr 2022
The UK government made a significant push to make a wide range of previously closed data sets freely available; it backed up this push by funding the establishment of the Open Data Institute, an independent, publicly funded body that materially increased state capacity in the field of open data (providing, for example, guidance and technical support on how to make data open). Its founders were Tim Berners-Lee, inventor of the World Wide Web, and Nigel Shadbolt, a computer scientist. The movement would have made much less headway if it had not received political support, in particular from Francis Maude, a senior UK government minister. Part of this support was linked to the creation of a useful, simplifying political narrative: the idea that open data would allow “armies of armchair auditors” to police government spending and effectiveness, an idea that had considerable political legitimacy with small-state Conservatives, especially at a time of public spending cuts.
Other Pandemic: How QAnon Contaminated the World
by
James Ball
Published 19 Jul 2023
In that sense, a meme is as meaningful an entity as a gene – and by Dawkins’ reasoning, the meme is set to outperform the gene by quite some margin. And it’s our minds – our brains in Dawkins’ 1976 book, but perhaps the online spaces we have built in our modern reality – that are to memes what primordial soup was to genes. The Selfish Gene was written in 1976, though it was somewhat revised and updated in 1989 – the same year Tim Berners-Lee first trialled what became the World Wide Web. All of Dawkins’ thoughts and analysis on memes predated the internet as a popular domestic phenomenon – in 1976 there were fewer than 100 devices worldwide connected to ARPANET (the network that became the internet), and by 1989 it was still barely over 100,000.9 If memes were arguably taking over from genes in the 1970s, what on earth does that mean in the 2020s?
Superbloom: How Technologies of Connection Tear Us Apart
by
Nicholas Carr
Published 28 Jan 2025
This story fit the net’s technical characteristics, and it reflected the public’s sunny view of new communication systems. It also resonated with the times. In the wake of the breakup of the Soviet Union—the Berlin Wall had come down only eight years earlier—the continued spread of democracy seemed inevitable. In his widely discussed article “The End of History?,” published in 1989, the same year Tim Berners-Lee invented the World Wide Web, Francis Fukuyama heralded the arrival of “the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.”2 The web seemed the perfect medium for the dawning age of universal democracy. The technology’s effect, wrote the prominent Economist editor Frances Cairncross in her 1997 book, The Death of Distance, “will be to increase understanding, foster tolerance, and ultimately promote worldwide peace.”3 Her words echoed, almost note for note, the utopian rhetoric that had accompanied new communication systems at the century’s start.
Hands-On RESTful API Design Patterns and Best Practices
by
Harihara Subramanian
Published 31 Jan 2019
If the API modifies the meaning of its response, then the client needs to be aware of it and act on those new responses accordingly. Well-designed APIs exhibit loose coupling and well-composed functionalities across service boundaries to maximize scalability factors.

Leverage web architecture

Since its invention by Sir Tim Berners-Lee in 1989, the fundamentals of the web have remained the foundation of all web architecture even today. As you all know, HTTP is the lifeline of the web architecture, and it powers every single client request, server response, and transfer of a document/content over all of the web. So, it is imperative that REST APIs embrace its power by building interfaces that can be consumed by any device or operating system.
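To make that last point concrete, here is a minimal sketch of a client consuming a REST API over plain HTTP, using Ruby's standard net/http library. The endpoint api.example.com and the /v1/players resource are hypothetical; any HTTP-capable device or operating system could issue the same request.

    require 'net/http'
    require 'json'

    # Hypothetical REST resource identified by a URI.
    uri = URI('https://api.example.com/v1/players/1234')

    request = Net::HTTP::Get.new(uri)
    request['Accept'] = 'application/json'  # ask for a JSON representation

    response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
      http.request(request)
    end

    puts response.code                 # HTTP status, e.g. "200"
    player = JSON.parse(response.body) # representation transferred over HTTP
    puts player

Nothing here is specific to one client platform: the same GET request, headers, and status codes work from a phone, a browser, or a server-side process, which is exactly the portability the excerpt describes.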
Nuts and Bolts: Seven Small Inventions That Changed the World (In a Big Way)
by
Roma Agrawal
Published 2 Mar 2023
Without it, Guglielmo Marconi couldn’t have invented the long-distance radio system for which he is so well known, a deserved level of recognition that has eluded Bose, because without his work in using electromagnetic waves to move information, we wouldn’t today have our phones that can send and receive data. Another world-altering communications technology, the world wide web, which we connect into to use the internet, was invented in the offices of CERN, the European Council for Nuclear Research. Scientists from all over the world work for this organisation, and Sir Tim Berners-Lee wanted to create a more effective way to share data, so it could be easily and quickly accessed by the full team. I personally have been fascinated by CERN since I was a teenage physics nerd, but for a slightly different reason: the Large Hadron Collider, or LHC. The LHC is the world’s largest particle accelerator.
The Wealth of Networks: How Social Production Transforms Markets and Freedom
by
Yochai Benkler
Published 14 May 2006
Individual computer engineers contributed irrespective of formal status or organizational affiliation, and the organization ran on the principle that Dave Clark termed "rough consensus and running code." The World Wide Web protocols and authoring conventions HTTP and HTML were created, and over the course of their lives shepherded, by Tim Berners-Lee, who has chosen to dedicate his efforts to making the Web a public good rather than cashing in on his innovation. The sheer technical necessity of these basic protocols and the cultural stature of their achievement within the engineering community have given these open processes and their commons-like institutional structure a strong gravitational pull on the design of other components of the logical layer, at least insofar as it relates to the communication side of the Internet. This basic open model has been in constant tension with the proprietary models that have come to use and focus on the Internet in the past decade.
…
While not nearly as open as a genuinely open-source platform, Windows is also a far cry from a completely controlled platform, whose owner seeks to control all applications that are permitted to be developed for, and all uses that can be made of, its platform. Third, while IE controls much of the browser market share, Microsoft has not succeeded in dominating the standards for Web authoring. Web browser standard setting happens on the turf of the mythic creator of the Web, Tim Berners-Lee, who chairs the W3C, a nonprofit organization that sets the standard ways in which Web pages are authored so that they have a predictable appearance on the browser's screen. Microsoft has, over the years, introduced various proprietary extensions that are not part of the Web standard, and has persuaded many Web authors to optimize their Web sites to IE.
The Rise of the Network Society
by
Manuel Castells
Published 31 Aug 1996
But it diffused on a large scale 20 years later, because of several factors: regulatory changes; greater bandwidth in telecommunications; diffusion of personal computers; user-friendly software programs that made it easy to upload, access, and communicate content (beginning with the World Wide Web server and browser designed by Tim Berners-Lee in 1990); and the rapidly growing social demand for the networking of everything, arising from both the needs of the business world and the public’s desire to build its own communication networks. As a result, the number of Internet users on the planet grew from under 40 million in 1995 to about 1.5 billion in 2009.
…
A new technological leap allowed the diffusion of the Internet into the mainstream of society: the design of a new application, the world wide web, organizing the Internet sites’ content by information rather than by location, then providing users with an easy search system to locate the desired information. The invention of the world wide web took place in Europe, in 1990, at the Centre Européen pour Recherche Nucleaire (CERN) in Geneva, one of the leading physics research centers in the world. It was invented by a group of researchers at CERN led by Tim Berners-Lee and Robert Cailliau. They built their research not on the ARPANET tradition, but on the contribution of the hackers’ culture of the 1970s. In particular, they partly relied on the work of Ted Nelson who, in 1974, in his pamphlet “Computer Lib,” called upon people to seize and use computer power for their own benefit.
Television disrupted: the transition from network to networked TV
by
Shelly Palmer
Published 14 Apr 2006
That network, now called the Internet (or simply, the Net), is the transport system that packets of data travel over. Your e-mail, music and video files all live on individual storage devices (like the hard drive in your computer) and get from place to place over the public Internet. This is not to be confused with the World Wide Web. In 1989, Tim Berners-Lee and a group of other big brains at the European Laboratory for Particle Physics (usually referred to as CERN) proposed the protocol that we now know as HTTP (Hypertext Transfer Protocol), and in 1991 the first World Wide Web pages, or Web sites, were put online using HTML (Hypertext Markup Language), which is still an extremely popular language for the creation of Web pages.
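For a sense of how simple those first pages were, here is a minimal, hypothetical HTML document of the kind the early web was built from: a handful of markup tags and a single hyperlink (the address shown is info.cern.ch, home of the first website).

    <html>
      <head>
        <title>A Very Early Web Page</title>
      </head>
      <body>
        <h1>Hello, Web</h1>
        <p>See the first website at
           <a href="http://info.cern.ch/">info.cern.ch</a>.</p>
      </body>
    </html>

The anchor tag is the whole trick: it turns a word on one page into a pointer to a document anywhere else on the network.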
The Perfect Thing: How the iPod Shuffles Commerce, Culture, and Coolness
by
Steven Levy
Published 23 Oct 2006
Apple Computer's online emporium lays a plausible claim for itself to be the savior of a music industry that feared that all its revenues would be drained by pirates. In a sense, though, the iTunes store was inevitable, the culmination of a story that began in 1988, when the music world changed forever and didn't know it. Those late-middle 1980s seem fuzzy now and somewhat quaint. The World Wide Web wasn't yet a glimmer in the eye of Tim Berners-Lee. Steve Jobs was trying to sell NeXT computers to educational institutions. The Sony Walkman was still the hottest thing going in personal music. But plenty of computer scientists knew that ultimately computers would be taking center stage in both audio and video. The digitization of everything had begun, and it was time to convert everything analog to the new regime of bits.
The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences
by
Rob Kitchin
Published 25 Aug 2014
As well as the transfer of data and files, e-mail had been established, as had bulletin boards (Kitchin 1998). During the 1980s the infrastructure grew, with new institutional and corporate players widening participation, along with the development of intranets (private networks). In 1990, the World Wide Web was invented by Tim Berners-Lee at CERN, Geneva, producing a much more user-friendly way of accessing and using the Internet. Throughout the 1990s and 2000s, new networking technologies were developed, such as near-field and proximate communication with Bluetooth, local WiFi coverage, and national GSM/3G networks. According to George Gilder’s (2000) ‘law of telecosm’, the world’s supply of bandwidth (its capacity to transfer data) doubles roughly every six months, with much of the additional capacity provided through wireless networks.
Emergence
by
Steven Johnson
Couldn’t individual brains connect with one another, this time via the digital language of the Web, and form something greater than the sum of their parts—what the trendy philosopher/priest Teilhard de Chardin called the noosphere? Wright’s not exactly convinced that the answer is yes, but he’s willing to go on the record that the question is, as he puts it, “noncrazy”: Today’s talk of a giant global brain is cheap. But there’s a difference. These days, most people who talk this way are speaking loosely. Tim Berners-Lee, who invented the World Wide Web, has noted parallels between the Web and the structure of the brain, but he insists that “global brain” is mere metaphor. Teilhard de Chardin, in contrast, seems to have been speaking literally: humankind was coming to constitute an actual brain—like the one in your head, except bigger.
Drugs 2.0: The Web Revolution That's Changing How the World Gets High
by
Mike Power
Published 1 May 2013
To put it simply, Usenet may already have peaked. He needn’t have worried, though. The net drug scene was about to mutate once more, and technology was the driver. During the Usenet era, the net itself had been changing – morphing into the world wide web, the global graphic interface to the new world of data invented and named by Tim Berners-Lee at the European Organisation for Nuclear Research (CERN) in 1991. With great modesty and foresight, the physicist demonstrated and distributed, for free, a technology that would help his fellow particle physicists at CERN share their findings on the fundamental nature of reality. Berners-Lee’s genius was to write HTML, or hypertext markup language, which allowed the linking of one document to any other that was hosted on the new networks.
Here Comes Everybody: The Power of Organizing Without Organizations
by
Clay Shirky
Published 28 Feb 2008
This fact has many ramifications, but two of the most important ones are vanishingly cheap many-to-many communications, and the flexibility that allows people to design and try new communications tools without having to ask anyone for permission. The most important of these experiments has been the Web. Begun as a research effort in the early 1990s by Sir Tim Berners-Lee (knighted, in fact, for that invention), the Web became a core part of modern life as quickly as it did precisely because it is such a flexible environment for letting people try new things. The communications tools broadly adopted in the last decade are the first to fit human social networks well, and because they are easily modifiable, they can be made to fit better over time.
The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies
by
Erik Brynjolfsson
and
Andrew McAfee
Published 20 Jan 2014
What’s more, for most of the subsequent century, additional complementary innovations, from lean manufacturing and steel minimills to Total Quality Management and Six Sigma principles, continued to boost manufacturing productivity. As with earlier GPTs, significant organizational innovation is required to capture the full benefit of second machine age technologies. Tim Berners-Lee’s invention of the World Wide Web in 1989, to take an obvious example, initially benefited only a small group of particle physicists. But due in part to the power of digitization and networks to speed the diffusion of ideas, complementary innovations are happening faster than they did in the first machine age.
Peers Inc: How People and Platforms Are Inventing the Collaborative Economy and Reinventing Capitalism
by
Robin Chase
Published 14 May 2015
Often government financing does come with strings attached. In many states, federal financing of highways requires that the state mandate use of seatbelts or specific speed limits. Sometimes these government rules feel appropriate; sometimes they feel unnecessarily restrictive. Every time I see Vint Cerf (one of the Internet’s founding fathers) and Tim Berners-Lee (creator of the World Wide Web, the visible part of the Internet), I am struck by their personal humility and life choices. Instead of figuring out how to cash out on the government-funded research that led to their inventions, they tirelessly work to ensure that these public goods remain public.
Networks of Outrage and Hope: Social Movements in the Internet Age
by
Manuel Castells
Published 19 Aug 2012
It emerged from the culture of freedom prevailing in the university campuses in the 1970s (Markoff 2006). It was based on open source protocols from its inception, the TCP/IP protocols developed by Vint Cerf and Robert Kahn. It became user friendly on a large scale thanks to the World Wide Web, another open source program created by Tim Berners-Lee. In continuity with this emphasis on autonomy building, the deepest social transformation of the Internet came in the first decade of the twenty-first century, from the shift from individual and corporate interaction on the Internet (the use of email, for instance), to the autonomous construction of social networks controlled and guided by their users.
Ten Billion Tomorrows: How Science Fiction Technology Became Reality and Shapes the Future
by
Brian Clegg
Published 8 Dec 2015
Though the ARPANET, the antiquated predecessor of the Internet, was already in place, it was at the time just a means to remotely log in to other computers in universities and military institutions. The true Internet was arguably not started until 1982 (alongside that Omni article, with reality running parallel to fiction), while Tim Berners-Lee did not start work on the World Wide Web at CERN until 1990. By coincidence, 1976 was also the year when a true computer-based virtual world came to life. It was then that American computer engineer Will Crowther, who was working on ARPANET at the time, had an idea that would capture the hearts and minds of computer enthusiasts—me included.
Alpha Girls: The Women Upstarts Who Took on Silicon Valley's Male Culture and Made the Deals of a Lifetime
by
Julian Guthrie
Published 15 Nov 2019
She noted that she was seeing more and more companies bypass creating websites to go straight to mobile apps. After the talk, Theresia was approached by a man who asked for examples of companies doing mobile apps rather than websites. As she looked down at the man’s nametag, she was stunned to see it was Tim Berners-Lee. The inventor of the World Wide Web was asking her about Internet trends. Not long afterward she was interviewed by Willow Bay for a new hour-long Bloomberg TV show called Women to Watch, along with Jessica Herrin, CEO of Stella & Dot; Carolyn Everson, VP of global marketing for Facebook; and Selina Tobaccowala, VP of product and engineering at SurveyMonkey.
Everything for Everyone: The Radical Tradition That Is Shaping the Next Economy
by
Nathan Schneider
Published 10 Sep 2018
I can keep my calendars, contacts, and email with May First, but a lot of a day’s work for me still involves feeding data into Facebook Groups, Google Docs, and Slack channels, all for the sake of collaborating. There needs to be another way. This is a kind of holy grail for today’s guerrilla hackers—an antidote to the surveillance addiction that plagues so much of our online lives. Tim Berners-Lee, who invented the World Wide Web, is working on this problem with researchers at MIT, as are countless blockchain startups. But cooperatives could be especially well suited to offering data a trustworthy home, one free from acquisitive investor-owners—a collaborative cloud that is truly ours.
The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity
by
Byron Reese
Published 23 Apr 2018
We can’t turn off our Internet; we can’t turn off our smartphones; we can’t turn off our computers. You used to ask a smart person a question. Now who do you ask? It starts with g-o, and it’s not God.” In the 1960s and 1970s, we were building enough computers that it made sense to connect them to make one giant network. We call that the Internet. In 1989, Tim Berners-Lee created a protocol called HTTP to access a document on a server from a remote computer. We call this the World Wide Web. Today we are in the process of connecting not just computers to the Internet, but every device that is driven by data. Thirty billion or so devices are connected now, and that number is expected to rise to half a trillion by 2030.
Black Code: Inside the Battle for Cyberspace
by
Ronald J. Deibert
Published 13 May 2013
Weber, “The Council of Europe’s Convention on Cybercrime,” Berkeley Technology Law Journal 18, no.1 (2003). 10 would require ISPs and other telecommunication companies to store: The proposed Communications Data Bill has been profiled in “UK’s Data Communication Bill Faces Tough Criticism,” BBC, June 14, 2012, http://www.bbc.com/news/technology-18439226; “Jimmy Wales, Tim Berners-Lee Slam UK’s Internet Snooping Plans,” ZDNet, September 6, 2012, http://www.zdnet.com/uk/jimmy-wales-timberners-lee-slam-uks-internet-snooping-plans-7000003829; “UK’s Web Monitoring Draft Bill Revealed: What You Need to Know,” ZDNet, June 14, 2012, http://www.zdnet.com/blog/london/uks-web-monitoring-draft-bill-revealed-what-you-need-to-know/5183; and Mark Townsend, “Security Services to Get More Access to Monitor Emails and Social Media,” Guardian, July 28, 2012, http://www.guardian.co.uk/technology/2012/jul/28/isecurity-services-emails-social-media. 11 From documents released under federal access to information laws: See Christopher Parsons, “Canadian Social Media Surveillance: Today and Tomorrow,” Technology, Thoughts, and Trinkets, May 28, 2012, http://www.christopher-parsons.com/blog/technology/canadian-social-media-surveillance-today-and-tomorrow/. 8: MEET KOOBFACE: A CYBER CRIME SNAPSHOT 1 Meet Koobface: A Cyber Crime Snapshot: Between April and November 2010, the Information Warfare Monitor, led by Nart Villeneuve, conducted an investigation into the operations and monetization strategies of the Koobface botnet.
All Day Long: A Portrait of Britain at Work
by
Joanna Biggs
Published 8 Apr 2015
She’s been looking to buy property for two years, giving up buying a new handbag every quarter to save for the deposit, but she has placed an offer on eleven houses, and each has fallen through. She earns around £50,000 a year at Deutsche Bank, and corporate sponsorship covers her expenses for Stemettes. Her work with Stemettes has taken her to Buckingham Palace and Downing Street and lunch with Tim Berners-Lee: ‘The archetypal geek. Fantastic … He’s like Jack Dee, almost, but not as miserable.’ She started Stemettes because there were only three women on her course at Oxford (and two in the year below and one in the year below that), but sees that being an untypical geek has worked in her favour: ‘As much as it might have held me back as in people see me and think a certain thing, because of everything I’ve got it’s almost like that completely … it’s almost like I’m not a black woman.
Sex Power Money
by
Sara Pascoe
Published 26 Aug 2019
William Gibson says, ‘George Orwell has always been jealous of me.’ Ident of a real mouse running across a mouse mat while a computer mouse watches. ‘1985,’ shouts the presenter from a clifftop, ‘and the first domain name is registered.’ The presenter is standing holding hands with a hologram of Tim Berners-Lee. ‘1990, and this guy –’ the hologram dances and waves – ‘invented HTML.’ ‘1991’ flashes up on the screen in pink and the presenter lies on a waterbed. She turns towards the camera. ‘This is the year CERN introduced the public to the World Wide Web.’ The Public is seated in a restaurant opposite the World Wide Web.
Power, for All: How It Really Works and Why It's Everyone's Business
by
Julie Battilana
and
Tiziana Casciaro
Published 30 Aug 2021
They have also brought us closer to becoming “like lords and possessors of Nature,” in the words of French scientist and philosopher René Descartes.10 For Descartes, science and technology were the gateway for people to understand, interpret, and analyze nature, and thereby gain a measure of control over Mother Nature herself.11 We have become so astoundingly powerful that we have even developed plans to prevent massive asteroids from hitting Earth and annihilating us.12 The digital revolution that reached warp speed at the turn of the twenty-first century has increased our power at a staggering pace.II In 1989, at one of the world’s largest physics laboratories, Tim Berners-Lee and Robert Cailliau invented a new network for sharing and searching information. They called their invention the World Wide Web. As they discussed what to do with it, they had a conversation that would have a tremendous impact on the way we live, work, and play. They were debating whether to patent their discovery, which would prevent its replication, use, or improvement by others.
Lost in Math: How Beauty Leads Physics Astray
by
Sabine Hossenfelder
Published 11 Jun 2018
“Look,” they say, “it’s always worked.” And then they preach the gospel of innovation by serendipity. It doesn’t matter what we do, the gospel goes; you can’t foresee breakthroughs, anyway. We’re all smart people, so just let us do our thing and be amazed by the unpredictable spin-offs that will come about. Haven’t you heard that Tim Berners-Lee invented the World Wide Web to help particle physicists share data? “Anything goes” is a nice idea, but if you believe smart people work best when freely following their interests, then you should make sure they can freely follow their interests. And doing nothing isn’t enough. I’ve seen it in my own research area, and this book tells the story.
The Internet Trap: How the Digital Economy Builds Monopolies and Undermines Democracy
by
Matthew Hindman
Published 24 Sep 2018
Links within the hypertext could point to other relevant passages, definitions, graphics, tables, or even other documents on the same computer. The second technology was the internet, which by the 1980s had become ubiquitous in universities and research labs. The internet had been created as a peer-to-peer network, in which there were no central hubs: each computer could send and receive data with any other computer. Tim Berners-Lee, the creator of the web, saw that hypertext could piggyback on top of the internet. Instead of being sandboxed within a single computer, hypertext could link documents on computers continents apart. Berners-Lee called the project the World Wide Web to emphasize that “any node can be linked to any other,” and to reflect “the distributed nature of the people and computers that the system could link.”3 In the opening sentence of the www’s project overview, one of the very first pages online, Berners-Lee declared, “There is no top to the Web.”
The Myth of Artificial Intelligence: Why Computers Can't Think the Way We Do
by
Erik J. Larson
Published 5 Apr 2021
RDF helped knowledge bases become, in essence, computational encyclopedias (as the late AI researcher John Haugeland once put it) with larger projects’ knowledge bases having thousands of triples. AI researchers hoped that the ease of use would encourage even non-experts to make triples—a dream articulated by Tim Berners-Lee, the creator of HTML. Berners-Lee called it the Semantic Web, because with web pages converted into machine-readable RDF statements, computers would know what everything meant. The web would be intelligently readable by computers. AI researchers touted knowledge bases as the end of brittle systems using only statistics—because, after all, statistics aren’t sufficient for understanding.
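For flavor, here is roughly what such triples look like when written out, as subject, predicate, and object, each identified by a URI; the example.org identifiers and the two facts are hypothetical, not drawn from any particular knowledge base.

    <http://example.org/Fido> <http://example.org/isA> <http://example.org/Dog> .
    <http://example.org/Dog> <http://example.org/subClassOf> <http://example.org/Animal> .

A knowledge base in this mold is, in essence, millions of such machine-readable sentences, which is what made the "computational encyclopedia" description apt.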
Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy
by
George Gilder
Published 16 Jul 2018
Above layer four is layer five—the all-important session layer—which governs a particular two-way communication from beginning to end, whether a video stream, a Skype call, a Session Initiation Protocol conference, a messaging exchange, an email post, or even—and this would prove fateful—a transaction. Layers six and seven are the schemes for presentations and applications—user interfaces, windows, formats, operating systems, and so on. These are summed up in the ingenious schemes of hyperlinks (click on a word and go to a new page) and universal resource locator (URL) addresses. Tim Berners-Lee at CERN in Geneva invented them in 1989 as part of his World Wide Web. Berners-Lee wanted to make all data linkable into one Web, a skein of tools that made it easy to set up a Web page of “shared creative collaborative space where everyone could play together.” As 70 percent of all links came to be handled through Google and Facebook, Berners-Lee feared that his Web was dying.
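As a sketch of what a URL actually encodes, Ruby's standard uri library can split a hypothetical address into the parts that the scheme of hyperlinks relies on:

    require 'uri'

    # A hypothetical address, decomposed into its locator parts.
    url = URI.parse('https://www.example.com/articles/web.html?lang=en#history')

    url.scheme    # => "https"              (which protocol to speak)
    url.host      # => "www.example.com"    (which server to contact)
    url.path      # => "/articles/web.html" (which document to request)
    url.query     # => "lang=en"            (extra parameters)
    url.fragment  # => "history"            (position within the page)

Every hyperlink on the Web bottoms out in an address of this shape, which is why a single naming convention could stitch documents on different continents into one Web.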
Britain Etc
by
Mark Easton
Published 1 Mar 2012
We failed to adapt when the rule book was revised in the twentieth century. But we do have a chance to redeem ourselves in the twenty-first century. The rules are changing again. Just as James Watt was critical in developing the technology for Britain’s success in the industrial revolution, another Briton, Tim Berners-Lee, is credited with the invention that is transforming the global economy today. The World Wide Web has powered a new period of globalisation. In the nineteenth century, it was about access to and the effective use of industrial machines. In the twenty-first century, it is about access to and the effective use of knowledge.
Ruby by example: concepts and code
by
Kevin C. Baird
Published 1 Jun 2007
The bottom line is, XML-based markup is everywhere. Luckily, Ruby can understand, output, and manipulate XML (and HTML).

#30 Cleaning Up HTML (html_tidy.rb)

Let’s start with HTML. This markup language has had several numbered releases, similar to different versions of software, and it’s come a long way since Tim Berners-Lee made the first web page at CERN in the early ’90s. Recent versions of HTML are subsets of XML and are called XHTML as a result. However, the earlier versions of HTML were not as disciplined; they allowed very liberal interpretations of HTML. Especially when people were first learning how to use HTML, they would often throw together pages that were not very well designed, either aesthetically or technically.
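The book's html_tidy.rb program is not reproduced here, but the underlying point, that disciplined XHTML is simply XML, can be sketched with Ruby's standard REXML parser and a hypothetical page:

    require 'rexml/document'

    # Because XHTML is a subset of XML, a well-formed page
    # parses with an ordinary XML parser.
    xhtml = <<~PAGE
      <html>
        <head><title>Tidy Page</title></head>
        <body><p>Disciplined markup: every tag closed and properly nested.</p></body>
      </html>
    PAGE

    doc = REXML::Document.new(xhtml)
    puts doc.elements['html/head/title'].text  # => "Tidy Page"

Feed the same parser an old-style page with unclosed tags and it raises a parse error, which is exactly why "tidying" liberal HTML into well-formed XHTML matters before machine processing.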
Beautiful Visualization
by
Julie Steele
Published 20 Apr 2010
“Distinctae per locos schedulae non agglutinatae” – Das Census-Datenmodell und seine Vorgänger. Pegasus 10: 223–260.
Bertin, Jacques. 1981. Graphics and Graphic Information Processing. Berlin: de Gruyter.
Bertin, Jacques. 2001. “Matrix theory of graphics.” Information Design Journal 10, no. 1: 5–19. doi: 10.1075/idj.10.1.04ber.
Bizer, Christian, Tom Heath, and Tim Berners-Lee. 2009. “Linked data—The story so far.” International Journal on Semantic Web & Information Systems 5, no. 3: 1–22.
Broder, Andrei, Ravi Kumar, Farzin Maghoul, Prabhakar Raghavan, Sridhar Rajagopalan, Raymie Stata, Andrew Tomkins, and Janet Wiener. 2000. “Graph structure in the Web.” Computer Networks 33, no. 1–6: 309–319.
How to Create a Mind: The Secret of Human Thought Revealed
by
Ray Kurzweil
Published 13 Nov 2012
That is, we cannot do so biologically, but that is exactly what we will do technologically. A Strategy for Creating a Mind There are billions of neurons in our brains, but what are neurons? Just cells. The brain has no knowledge until connections are made between neurons. All that we know, all that we are, comes from the way our neurons are connected. —Tim Berners-Lee Let’s use the observations I have discussed above to begin building a brain. We will start by building a pattern recognizer that meets the necessary attributes. Next we’ll make as many copies of the recognizer as we have memory and computational resources to support. Each recognizer computes the probability that its pattern has been recognized.
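As a toy sketch of that strategy (an illustration only, not Kurzweil's actual design): store one pattern per recognizer, make many copies, let every copy score the input, and treat the best score as the recognized pattern.

    # Toy pattern recognizer: each instance holds one pattern and reports
    # a crude "probability" that the input matches it.
    class Recognizer
      attr_reader :name

      def initialize(name, pattern)
        @name = name
        @pattern = pattern
      end

      # Fraction of positions where the input agrees with the stored pattern.
      def probability(input)
        hits = @pattern.chars.zip(input.chars).count { |p, i| p == i }
        hits.to_f / @pattern.length
      end
    end

    # Many copies, one pattern each; recognition is a competition among them.
    recognizers = %w[APPLE AMPLE APPLY].map { |w| Recognizer.new(w, w) }

    input = 'APPLZ'
    best = recognizers.max_by { |r| r.probability(input) }
    puts "#{best.name}: #{best.probability(input)}"  # => "APPLE: 0.8"

The real proposal is hierarchical and probabilistic in far richer ways, but the shape is the same: a large population of identical recognizers, each reporting how strongly its pattern is present.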
Loneliness: Human Nature and the Need for Social Connection
by
John T. Cacioppo
Published 9 Aug 2009
The wisdom of Herodotus and Hegel is available to everyone. Even corporate predators who spend decades pillaging and plundering often see the light and, in the end, set up huge foundations to do something useful with their wealth. Mother Teresa devoted her life to helping the poor of Calcutta, but not with an eye on the Nobel Prize. Sir Tim Berners-Lee invented the basic structure of the World Wide Web as a means of bringing humanity together, with no thought of commercial exploitation. And yet the human record of beneficial advancement continues to be marred by “winner take all” and “my way or the highway” thinking, including tribalism, intolerance, bloodshed, and cruelty.
Uncharted: How to Map the Future
by
Margaret Heffernan
Published 20 Feb 2020
Because CERN grew into an international organisation committed to the open sharing of its data, it attracted pioneering computer scientists who built the information systems to do this work. As the amount of data expanded, so did the complex requirements of the network designed to support it. By the 1980s, CERN had email and scientists were able to send files – they just needed a PhD in computing to understand how to do so. By 1989, CERN was the largest internet node in Europe. Tim Berners-Lee, then a fellow at CERN, identified the need for a standard language and protocols, but at first his ideas attracted little interest. Berners-Lee and his collaborator, Robert Cailliau, realised that they needed a catchy title to capture the scale of the project’s potential. He’d wanted to call it ‘Mine of Information’, but in French the acronym ‘MOI’ sounded rather self-centred.
Radical Markets: Uprooting Capitalism and Democracy for a Just Society
by
Eric Posner
and
E. Weyl
Published 14 May 2018
Yet, what eventually became the mainstream Internet did not start as a commercial or economic project. Instead, it was a collaborative platform within government, military, and academic circles where participants were assumed to be interested in collaboration for reasons external to commercial motivations. The World Wide Web interface of hyperlinks developed by Tim Berners-Lee and others therefore placed emphasis on lowering barriers to participation rather than on providing incentives and rewards for labor. “Information wants to be free” became a slogan for entrepreneurs and a rallying cry for activists. It especially appealed to a Silicon Valley mentality that grew from the counterculture of the 1960s.8 During the 1990s, venture capital poured in to commercialize the booming Internet before online services had established how they would monetize their offerings.
RDF Database Systems: Triples Storage and SPARQL Query Processing
by
Olivier Cure
and
Guillaume Blin
Published 10 Dec 2014
Nevertheless, the latter is usually preferred when one refers to a resource with a globally well-known identifier or location. In the following sections, we present three additional serializations for RDF documents that do not suffer the verbosity of the XML syntax. They are frequently used for readability reasons but are also handled by most of the RDF tools we will see later in this chapter (e.g., parser, serializer).

3.2.2 N-triples

N-triples is the simplest form of textual representation of RDF data, but it’s also the most difficult to use in a print version because it does not allow URI abbreviation. The triples are given in subject, predicate, and object order as three complete URIs separated by spaces and encompassed by angle brackets (< and >). Each statement is given on a single line ended by a period (.). The following code illustrates the statements regarding the category Science.

3.2.3 N3

Notation 3, or N3 for short, was proposed by Tim Berners-Lee as a compromise between the simplicity of N-triples and the expressiveness of RDF/XML. The notation is very similar to N-triples. The main syntactic differences are:
• The surrounding brackets have been removed.
• URIs can be abbreviated by prefix names.
• There is no restriction on the number of separating spaces.
• Shortcut syntax for statements sharing a predicate and/or an object has been introduced:
  • "subject stuff ; morestuff ." stands for "subject stuff. subject morestuff ."
  • "subject predicate stuff, morestuff ." stands for "subject predicate stuff. subject predicate morestuff ."
• Blank nodes can be replaced by declared existential variables (see http://www.w3.org/DesignIssues/Notation3 for more details).
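To illustrate the shortcuts just listed, here are some statements about a category Science in both serializations; the facts and the example.org namespace are hypothetical, while dct: and rdfs: are the standard Dublin Core and RDF Schema vocabularies. First as three full N-triples:

    <http://example.org/Science> <http://purl.org/dc/terms/subject> <http://example.org/Physics> .
    <http://example.org/Science> <http://purl.org/dc/terms/subject> <http://example.org/Chemistry> .
    <http://example.org/Science> <http://www.w3.org/2000/01/rdf-schema#label> "Science" .

And the same statements in N3, with prefix abbreviations plus the semicolon and comma shortcuts:

    @prefix ex: <http://example.org/> .
    @prefix dct: <http://purl.org/dc/terms/> .
    @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

    ex:Science dct:subject ex:Physics, ex:Chemistry ;
               rdfs:label "Science" .

The comma repeats the shared subject and predicate; the semicolon repeats only the shared subject, which is why the N3 version is both shorter and easier to read.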
The Cultural Logic of Computation
by
David Golumbia
Published 31 Mar 2009
At the same time, perhaps the most ardent rationalist in contemporary philosophy, Jerry Fodor, also sees deep problems in the category models that underlie what Lakoff calls “classical” views of categorization; this has been a hallmark of Fodor’s career, from Fodor (1970) to Fodor (1998).
8. For the fully articulated vision of the Semantic Web and the extensive use of structured semantic markup in it, see, e.g., Antoniou and van Harmelen (2004), Berners-Lee, Hendler, and Lassila (2001), and Fensel, Hendler, Lieberman, and Wahlster (2003). Tim Berners-Lee, the inventor of HTML, explains his belief in the necessity of augmenting the web’s semantic processing capabilities in more detail in Berners-Lee (1999).
9. For prototype theory see Lakoff (1987).
10. For discussions of language ideologies and their relationship to ideology more generally see Kress and Hodge (1979), the essays in Kroskrity (1998), and Schieffelin, Woolard, and Kroskrity (1998).
11.
The Infinite Machine: How an Army of Crypto-Hackers Is Building the Next Internet With Ethereum
by
Camila Russo
Published 13 Jul 2020
Systems using this architecture are resilient against censorship, attacks, and manipulation. Just like the mythological Hydra, there’s no one head that you can cut off to kill it, and it gets stronger after each attack. The original vision for the World Wide Web, as imagined by its creator, Tim Berners-Lee, was meant to be closer to a P2P network than how it works today, that is, behind a series of firewalls and fed to us through Google, Facebook, and maybe a handful of other mega corporations. Berners-Lee has publicly lamented the current state of the web. The original vision is what inspired and drove cypherpunks.
Who Owns This Sentence?: A History of Copyrights and Wrongs
by
David Bellos
and
Alexandre Montagu
Published 23 Jan 2024
The campaign succeeded: it resulted in the Warning from the FBI that used to figure as the first frame of movies that could be bought or rented in cassette or D.V.D. format until those media themselves disappeared. However, the greatest perceived threat to financial stakeholders in the culture industry in the later years of the twentieth century came from Tim Berners-Lee’s invention of the World Wide Web. Initially designed as a tool to allow scientists to share their work, it quickly became a means for distributing perfect copies of text, and then sound and image files, at a marginal cost that was, for the first time, zero to the copier, if not to the planet. Publishers feared their entire business would be destroyed, and music companies were in an even greater panic.
The Art of SEO
by
Eric Enge
,
Stephan Spencer
,
Jessie Stricchiola
and
Rand Fishkin
Published 7 Mar 2012
The “wisdom of the crowds” is becoming a factor in rankings, as discussed in Chapter 8. Mike Grehan talks about this in his paper, “New Signals to Search Engines” (http://www.acronym.com/new-signals-to-search-engines.html). He summarizes the state of web search as follows: We’re essentially trying to force elephants into browsers that do not want them. The browser that Sir Tim Berners-Lee invented, along with HTML and the HTTP protocol, was intended to render text and graphics on a page delivered to your computer via a dial-up modem, not to watch movies like we do today. Search engine crawlers were developed to capture text from HTML pages and analyze links between pages, but with so much information outside the crawl, is it the right method for an always-on, ever-demanding audience of self producers?
…
The source HTML code of any web page can be viewed in a web browser such as Firefox (right-click and select “View Page Source”) or Internet Explorer (select View→Source). HTML is not a programming language and therefore is quite static in nature. It is considered to be a subset of SGML (Standard Generalized Markup Language). Tim Berners-Lee first described HTML in 1991, in a publicly available document called “HTML Tags.” The Internet Engineering Task Force (IETF) published the first draft proposal in 1993. HTML became an international standard (ISO/IEC 15445:2000), and its specifications are maintained by the World Wide Web Consortium (W3C).
Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture
by
David Kushner
Published 2 Jan 2003
Though a global network of computers had been around since the 1970s – when the U.S. government’s Defense Advanced Research Projects Agency, or DARPA, linked networks of computers (the DARPAnet and, later, the Internet) together around the world – it was just starting to seep into the mainstream. This evolution began in 1989, when a computer researcher in Europe named Tim Berners-Lee wrote a program that linked information on the Internet into what was called the World Wide Web. Four years later, in 1993, two University of Illinois hackers named Marc Andreessen and Eric Bina created and released Mosaic: a free “browser” program that transformed the Web’s unseemly data into more easily digestible, magazinelike pages of graphics and text.
Augmented: Life in the Smart Lane
by
Brett King
Published 5 May 2016
“Last year, 60 per cent of [our] traffic came from desktop. Today, 70 per cent comes from mobile.”23 Michelle Phan The Internet is at the heart of some of the most innovative business models we’ve seen emerge in the last 50 years. It wasn’t that long ago that the very first website was published by Tim Berners-Lee, on 6th August 1991. The page simply explained the World Wide Web project and gave information on how users could set up a web server and create their own web pages.24 What we know as the commercial Internet is generally recognised as having launched three years later in 1994, with the likes of Yahoo, Lycos, the Economist, First Virtual (Bank), LawInfo, Pizza Hut, The Simpsons Archive (the very first fan website), Whitehouse.gov, WebCrawler, Wired magazine (hotwired.com back then) and others.
#Republic: Divided Democracy in the Age of Social Media
by
Cass R. Sunstein
Published 7 Mar 2017
A key innovation came one year later, when researchers at the European Organization for Nuclear Research (CERN) near Geneva, Switzerland, created the World Wide Web, a multimedia branch of the Internet. CERN researchers attempted to interest private companies in building the World Wide Web, but they declined (“too complicated”), and Tim Berners-Lee, the lead researcher and web inventor, had to build it on his own. Hard as it now is to believe, the Internet started to commercialize only in 1992, as a result of the enactment of new legislation removing restrictions on commercial activity. It was around that time that direct government funding was largely withdrawn, but indirect funding and support continues.
Cryptoassets: The Innovative Investor's Guide to Bitcoin and Beyond: The Innovative Investor's Guide to Bitcoin and Beyond
by
Chris Burniske
and
Jack Tatar
Published 19 Oct 2017
Thus far, the World Wide Web has been the greatest meta-application to leverage the underlying fiber of the Internet. The indexed web contains at least 4.73 billion pages, nearing the point where there will be one page for every human.1 The beginning of the Internet is commonly associated with the 1990s, with Tim Berners-Lee stumbling upon the idea of the World Wide Web while trying to create an information management system for CERN, and Marc Andreessen developing the first widely used web browser, which ultimately became Netscape. Although the accomplishments of Berners-Lee and Andreessen were linchpins to mainstream adoption, the web and the ability to browse it were the first killer apps built on top of the Internet, not to be conflated with the creation of the Internet itself.
The AI Economy: Work, Wealth and Welfare in the Robot Age
by
Roger Bootle
Published 4 Sep 2019
As Nigel Shadbolt and Roger Hampson put it: “… the problem is not that machines might wrest control of our lives from the elites. The problem is that most of us might never be able to wrest control of the machines from the people who occupy the command posts.”36 This would be ironic because the World Wide Web and much subsequent digital development originated in a spirit of extreme libertarianism. Sir Tim Berners-Lee, the inventor of the World Wide Web, did not establish any patents or property rights in his creation because he wanted it to be free. The wired world is supposed to be a levelling force in society and its effect is supposedly anti-authoritarian and anti-hierarchical. At least in the West the internet does not belong to governments.
Boom and Bust: A Global History of Financial Bubbles
by
William Quinn
and
John D. Turner
Published 5 Aug 2020
The key to unlocking this potential, and the catalyst for the bubble, was the creation of a global network of information exchange: the now-ubiquitous Internet. Although governments and universities had been developing networks of computers since the 1960s, the Internet as we now know it originated in 1989. Tim Berners-Lee, a scientist at the European Council for Nuclear Research, suggested that the organisation would find it easier to keep track of its projects if it had a system which structured information in an easily accessible way. Berners-Lee conceived of a decentralised system of documents, to which any member could upload data.
Democracy for Sale: Dark Money and Dirty Politics
by
Peter Geoghegan
Published 2 Jan 2020
Yet while dismantling the modern-day equivalent of the railroad monopolies might help, it’s not clear how such a move would stem the flow of disinformation, especially on the closed groups and anonymous forums from which armies of online trolls are feeding a populist surge across the world. There have been other proposals. GDPR laws could be used to check the worst abuses of a sprawling online advertising technology industry that shares huge amounts of personal data with companies across the Internet. Tim Berners-Lee, the founder of the World Wide Web, has warned of a “digital dystopia” if disinformation and invasion of privacy are not stopped. In 2019, dozens of tech companies including Facebook and Google signed Berners-Lee’s ‘Contract for the Web’ calling for a safe, free and open Internet. But self-regulation will only ever go so far.
Growth: A Reckoning
by
Daniel Susskind
Published 16 Apr 2024
By his early twenties he was an iconic programmer, entrepreneur (creator of RSS and cofounder of Reddit), and a philosopher-activist with a cult following. His colourful life, coupled with his unusual sense of purpose so early on, was celebrated by many great minds of our time. The creator of the World Wide Web, Tim Berners-Lee, hailed him as a ‘fighter’ and ‘a shining force for good’. The Harvard law professor Larry Lessig described him as ‘an icon, an ideal’ and ‘my mentor’, though Lessig was twenty-five years his senior.35 At twenty-six, Swartz took his own life while under the pressure of a federal investigation.
AI in Museums: Reflections, Perspectives and Applications
by
Sonja Thiel
and
Johannes C. Bernhardt
Published 31 Dec 2023
The tools which I will discuss in turn are: 1. the Data Ethics Canvas, 2. Consequence Scanning, and 3. the Museums + AI Toolkit. Figure 1: Data Ethics Canvas. Source: Open Data Institute. The Data Ethics Canvas was developed by the Open Data Institute, a non-profit institute focussed on data and society established in London in 2012 by Tim Berners-Lee and Sir Nigel Shadbolt (fig. 1).2 The Data Ethics Canvas is designed to provide a framework that can be applied to any context, whatever the project’s size or scope. It is particularly useful for museum practitioners because it provides a framework and structured pathways for thinking about the broader ethical context of any data-led innovation, and therefore has benefits beyond AI-specific work, since it is applicable to all data-led projects. [2] https://www.theodi.org/article/the-data-ethics-canvas-2021/
Culture & Empire: Digital Revolution
by
Pieter Hintjens
Published 11 Mar 2013
The Internet's dark side as we know and love it -- spam, viruses, porn sites, download sites, credit card fraud, identity theft, malware -- blessed us with a brief preview in 1988, when the first worm flattened the academic Internet. We had to wait until 1990, when commercial restrictions on Internet use were lifted; and then 1991, when Tim Berners-Lee invented the web at CERN, in Geneva; and finally 1993, when Al Gore found funding for the development of a graphical web browser named Mosaic. Suddenly, any fool with a PC and a modem could get online, and The Real Internet was born. It still took Microsoft more than two years to catch on. Rather than recognize the new Internet, it stubbornly rolled out its own "Microsoft Network" that hardly talked to the Internet at all.
The End of Power: From Boardrooms to Battlefields and Churches to States, Why Being in Charge Isn’t What It Used to Be
by
Moises Naim
Published 5 Mar 2013
As General William Odom, Ronald Reagan’s National Security Agency director, observed: “By creating a security umbrella over Europe and Asia, Americans lowered the business transaction costs in all these regions: North America, Western Europe and Northeast Asia all got richer as a result.”3 Now those lower transaction costs could be extended, and with them also the promise of greater economic freedom. Slightly more than a year after thousands of Germans took sledgehammers to the Berlin Wall, in December 1990, Tim Berners-Lee, a British computer scientist at the European Organization for Nuclear Research on the Franco-Swiss border, sent the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet, thereby creating the World Wide Web. That invention, in turn, sparked a global communications revolution that has left no part of our lives untouched.
The Blockchain Alternative: Rethinking Macroeconomic Policy and Economic Theory
by
Kariappa Bheemaiah
Published 26 Feb 2017
Based on the fundamental principles of packet-switching and a protocol for decentralized communication, other protocols like HTTP, SMTP and VoIP were developed for specific communication purposes. As protocols evolved, they went on to create a digital, decentralized, and distributed environment that was fertile for innovation. Tim Berners-Lee’s invention of the World Wide Web, built on top of the Internet, changed the way we work and live. As email (based on SMTP) became the preferred medium of communication, malicious agents created other inventions such as spam email and malware. Thankfully, Adam Back came to the rescue.
The Evolution of Everything: How New Ideas Emerge
by
Matt Ridley
Well, then, perhaps we should forget about who was funding the work, and at least give credit to the individuals without whom the internet would never have happened. Paul Baran was first with the notion of packet switching, Vint Cerf (with Robert Kahn) devised the TCP/IP protocols that proved crucial to allowing different computers to communicate over the internet, and Sir Tim Berners-Lee developed the World Wide Web. Yet there is a problem here, too. Can anybody really think that these things – or their equivalents – would not have come into existence in the 1990s if these undoubtedly brilliant men had never been born? Given all we know about the ubiquitous phenomenon of simultaneous invention, and the inevitability of the next step in innovation once a technology is ripe (see Chapter 7), it is inconceivable that the twentieth century would have ended without a general, open means of connecting computers to each other so that people could see what was on other nodes than their own hard drive.
The Great Firewall of China
by
James Griffiths;
Published 15 Jan 2018
* The Soviet Union took its first steps onto the internet in 1990, a year before conservatives in the Communist Party attempted to remove reformist General Secretary Mikhail Gorbachev in a coup.22 At the start of the decade, the country was lagging far behind the West, which had imposed tight import restrictions on computer equipment, leaving Soviet engineers with only poor imitations of the latest hardware and a rapidly widening technology gap.23 Just as CERN, the European nuclear research organisation, became a key hub for Western technology pioneers, eventually resulting in Tim Berners-Lee inventing the world wide web, Soviet researchers at the Kurchatov Institute of Atomic Energy in Moscow were instrumental in linking the Union to the nascent global internet.24 In the early 1980s, the Institute had acquired a copy of the Unix operating system, and set a team of programmers to adapting it for Russian-speaking users.
Binge Times: Inside Hollywood's Furious Billion-Dollar Battle to Take Down Netflix
by
Dade Hayes
and
Dawn Chmielewski
Published 18 Apr 2022
A lone piece of furniture dominated the room: a table holding a VHS tape player and a high-end Silicon Graphics machine connected via a T-1 dedicated telephone line to the internet’s multicast backbone (or MBone), which was used to transmit real-time video and audio. The streaming experiment would easily outdo the most daring office activity at the time, according to one Sun Microsystems engineer: watching someone brew coffee. This demonstration would require all the computing firepower the group could muster. Only four years earlier, British scientist Tim Berners-Lee had conceived of a way for scientists in universities and research institutions to share information via a network of computers known as the World Wide Web. Blair had struggled to get distribution for his film, which he wrote and directed. It centered on a maker of weapons guidance systems named Jacob Maker, who falls under the control of his bees.
Hacking Capitalism
by
Söderberg, Johan; Söderberg, Johan;
The name dispute has also political ramifications since many within the computer underground and in the industry would like to keep the outspoken Stallman and the Free Software Foundation at an arms length. 24. Peter Wayner, Free For All—How Linux and the Free Software Movement Undercut the High-Tech Titans (New York: HarperBusiness, 2000). 25. http://news.netcraft.com/archives/2006/01/05/january_2006_web_server_survey.html, (accessed 2007-02-08). 26. Tim Berners-Lee & Mark Fischetti, Weaving the Web—The Past, Present and Future of the World Wide Web (London: Texere, 2000). 27. This fact is happily admitted to by free-software entrepreneur Robert Young: “Quietly, since Red Hat’s founding in the 1993, we had focused on an approach to software development that enabled us to tap into a worldwide software development team bigger than even the biggest industry giant could afford” Robert Young and Wendy Rohm, Under the Radar—How Red Hat Changed the Software Business and Took Microsoft by Surprise (Scottsdale, AZ: Coriolis, 1999), 9; hereafter cited in text. 28.
Words That Work: It's Not What You Say, It's What People Hear
by
Dr. Frank Luntz
Published 2 Jan 2007
Web—This word came into Old English from the German, and originally meant “woven fabric.” Sir Walter Scott wrote of “what a tangled web we weave, when first we practice to deceive” (a line often erroneously attributed to William Shakespeare). Contrary to popular belief, the World Wide Web, “www,” is not synonymous with the Internet. The Web was given its name by Tim Berners-Lee, and was opened to the public on August 6, 1991.19 The World Wide Web is actually not the interconnected network of computers—that network is the Internet. The World Wide Web is the system for accessing this information. Think of it as a network of addresses for information, or a worldwide, cross-referenced filing cabinet.
Flowers of Fire: The Inside Story of South Korea's Feminist Movement and What It Means for Women's Rights Worldwide
by
Hawon Jung
Published 21 Mar 2023
Thus, what started on an online forum to discuss MERS wound up giving birth to the most well-known—and the most controversial—feminist website in the country. In its early days, the internet was painted as an open and equal free-speech paradise for everyone. But by 2015, that sort of innocent optimism had largely waned, replaced by the realization that the voices of social minorities were often drowned out in a sea of abuse and harassment. Even Tim Berners-Lee, creator of the World Wide Web, has noted a “dangerous trend” of abuse in cyberspace that “silences women and deprives the world of their opinions and ideas.”33 South Korea is a hyper-connected country where nearly all households have internet access, 90 percent of the population use social media, and the average citizen spends over ten hours a day online,34 stats that all far exceed the global average.
Innovation and Its Enemies
by
Calestous Juma
Published 20 Mar 2017
The same year, the inaugural Queen Elizabeth Prize for Engineering was awarded to pioneers who developed the Internet and the Web. The £1 million prize was launched to reward and celebrate those responsible for groundbreaking innovations of global benefit to humanity. Robert Kahn, Vinton Cerf, and Louis Pouzin pioneered the development of protocols that constituted the fundamental architecture of the Internet, while Tim Berners-Lee created the World Wide Web and greatly stretched the use of the Internet beyond file transfer and email. Marc Andreessen, while a student, in collaboration with colleagues, wrote the Mosaic browser, which was extensively distributed and popularized access to the World Wide Web. These pioneering engineering achievements revolutionized the way humans communicate.
Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software
by
Scott Rosenberg
Published 2 Jan 2006
You could model just about anything in a simple three-part format that looked something like the subject-verb-object arrangement of a simple English sentence: <this> <has-relationship-with> <that> Then they discovered that the answer they’d come up with had already been outlined and at least partially implemented by researchers led by Tim Berners-Lee, the scientist who had invented the World Wide Web a dozen years before. Berners-Lee had a dream he called the Semantic Web, an upgraded version of the existing Web that relied on smarter and more complex representations of data. The Semantic Web would be built on a technical foundation called RDF, for Resource Description Framework.
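To make the three-part format concrete, here is a minimal sketch in JavaScript (the triples and the query helper are illustrative inventions, not code from the project described or from the RDF specification): statements are stored as plain [subject, predicate, object] arrays, and a query returns every object a subject relates to through a given predicate.

    // A hypothetical triple store: each statement is [subject, predicate, object].
    const triples = [
      ["web", "invented-by", "Tim Berners-Lee"],
      ["semantic-web", "builds-on", "web"],
      ["semantic-web", "uses", "RDF"],
    ];

    // Return every <that> for which <subject> <predicate> <that> holds.
    function query(subject, predicate) {
      return triples
        .filter(([s, p]) => s === subject && p === predicate)
        .map(([, , object]) => object);
    }

    console.log(query("semantic-web", "uses")); // ["RDF"]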
Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots
by
John Markoff
Published 24 Aug 2015
Gruber’s ontology work was an obvious match for Tenenbaum’s commerce system because it was a system that required using a common language to connect disparate parts. Partly as a result of their collaboration, Gruber was one of the first Silicon Valley technologists to immerse himself in the World Wide Web. Developed by Tim Berners-Lee in the heart of the particle physics community in Switzerland, the Web was rapidly adopted by computer scientists. It became known to a much wider audience when it was described in the New York Times in December of 1993.1 The Internet allowed Gruber to create a small group that blossomed into a living cyber-community expressed in the exchange of electronic mail.
Cybersecurity: What Everyone Needs to Know
by
P. W. Singer
and
Allan Friedman
Published 3 Jan 2014
In 1990, a researcher at the European research center CERN in Switzerland took a relatively obscure form of presenting information in a set of linked computer documents and built a new networking interface for it. With this HyperText Transfer Protocol (HTTP), and an accompanying system to identify the linked documents (URLs), Tim Berners-Lee “invented” the World Wide Web as we now look at it. Amusingly, when Berners-Lee tried to present it at an academic conference, his breakthrough wasn’t considered worthy enough even to make a formal panel. Instead, he was relegated to showing a poster on it in a hallway. A few years later, researchers at the University of Illinois introduced the Mosaic web browser, which simplified web design and introduced the new practice of “web surfing” to the general public.
Geek Heresy: Rescuing Social Change From the Cult of Technology
by
Kentaro Toyama
Published 25 May 2015
Smith suggested that these technologies bring people together, trigger revolutions, and make “all world knowledge . . . available online for free.” If she is right, everyone everywhere will soon have plenty of opportunity: Talent is universal, and opportunity is the Internet. The world’s leading technologists thoroughly agree, and they’re competing to speed things up. In 2009, Sir Tim Berners-Lee, the inventor of the key protocols that drive the Web, founded the World Wide Web Foundation to spread the Web as “a global public good and a basic right.” Its tagline: “Connecting People. Empowering Humanity.”4 A couple years later, Smith’s colleagues at Google began working to deliver WiFi through solar-powered balloons.
The Establishment: And How They Get Away With It
by
Owen Jones
Published 3 Sep 2014
As the Council boasts, it ‘revolutionised biomedical research and sparked an international multi-billion pound biotechnology industry’, as well as drugs for diseases ranging from cancer to asthma.1 The Internet, meanwhile, had its origins in US government research, while the World Wide Web was created by British engineer Tim Berners-Lee at the publicly funded European research organization CERN. Google’s search engine would have been impossible without the algorithm that lies at its heart – an algorithm generously provided by the US National Science Foundation. Apple’s iPhone brings together a diverse range of state-funded innovations, ranging from touchscreen displays to microelectronics to the global positioning system (GPS).
Humankind: A Hopeful History
by
Rutger Bregman
Published 1 Jun 2020
Seems patronizing.3 Interviewer: What’s your speck on the horizon, Jos – that distant goal that inspires you and your team? Jos: I don’t have distant goals. Not all that inspired by specks.4 Unlikely as it may sound, this is also a man who’s received the prestigious Albert Medal from the Royal Society of Arts in London, ranking him with the likes of Tim Berners-Lee, brain behind the World Wide Web; Francis Crick, who unravelled the structure of DNA; and the brilliant physicist Stephen Hawking. In November 2014, it was Jos de Blok from small-town Holland who received the honour and the cream of British academia turned out to attend his keynote speech. In broken English, De Blok confessed that at first he thought it was a joke.
When Einstein Walked With Gödel: Excursions to the Edge of Thought
by
Jim Holt
Published 14 May 2018
Soon afterward, he says, he lost the ability to edit or revise on paper. Around 1990, he acquired a modem and an AOL subscription, which entitled him to spend five hours a week online sending e-mail, visiting chat rooms, and reading old newspaper articles. It was around this time that the programmer Tim Berners-Lee wrote the code for the World Wide Web, which, in due course, Carr would be restlessly exploring with the aid of his new Netscape browser. “You know the rest of the story because it’s probably your story too,” he tells us. “Ever-faster chips. Ever-quicker modems. DVDs and DVD burners. Gigabyte-sized hard drives.
Finding the Mother Tree: Discovering the Wisdom of the Forest
by
Suzanne Simard
Published 3 May 2021
M., Harniman, S. M. K., et al. 1997. Ectomycorrhizal diversity on Betula papyrifera and Pseudotsuga menziesii seedlings grown in the greenhouse or outplanted in single-species and mixed plots in southern British Columbia. Canadian Journal of Forest Research 27: 1872–89. McPherson, S. S. 2009. Tim Berners-Lee: Inventor of the World Wide Web. Minneapolis: Twenty-First Century Books. Read, D. J., Francis, R., and Finlay, R. D. 1985. Mycorrhizal mycelia and nutrient cycling in plant communities. In Ecological Interactions in Soil, ed. A. H. Fitter, D. Atkinson, D. J. Read, and M. B. Usher. Oxford: Blackwell Scientific, 193–217.
The Long History of the Future: Why Tomorrow's Technology Still Isn't Here
by
Nicole Kobie
Published 3 Jul 2024
More powerful chips meant smaller computers, but also bigger ones, and the Cray 1 supercomputer was installed at Los Alamos National Laboratory in 1976. In 1984, computers started to look more like their modern descendants. Apple introduced the Macintosh, with a mouse and a display – it’s hard to imagine computers without these now – and a year later, Microsoft unveiled the first version of Windows. In 1989, Tim Berners-Lee outlined a plan for a system that became the web. And in 2007, the iPhone arrived. Now, chips are tiny, everything is connected and computing is ubiquitous. That’s the short version of computer history. But who paid for it all? Those first computers were war machines, and there’s no question that much of what we now see as computing and technology was initially funded by the military, including Fairchild Semiconductor.
The Optimist: Sam Altman, OpenAI, and the Race to Invent the Future
by
Keach Hagey
Published 19 May 2025
While much of what they scraped was protected by copyright—most digital publishers will say “all rights reserved” at the bottom of their pages, whether they have a paywall or not—the practice was embraced by the academic community because it was for research, not commerce, and because the websites (at least theoretically) did not have paywalls. Why would publishers have made their content free, the thinking went, if they didn’t want people to take it? “If you go back to Tim Berners-Lee, when he invented the web in 1989, the goal was to put all the information on the internet so that people could access it, and Common Crawl is a sampling of that,” said Rich Skrenta, the executive director of the Common Crawl Foundation. “Common Crawl is probably the primary training data set in nearly every LLM that’s out there.”
Capitalism 4.0: The Birth of a New Economy in the Aftermath of Crisis
by
Anatole Kaletsky
Published 22 Jun 2010
In theory, China’s gradual transformation into one of the most fiercely competitive and profit-oriented systems of private enterprise the world had ever seen began with Deng Xiaoping’s introduction of “Socialism with Chinese characteristics” in 1978.7 However, these reforms only began to deliver impressive results about a decade later, in the late 1980s, turning China into a serious commercial power, transforming the global trading system, and shifting the center of gravity of the world economy toward Asia. Three, a technological revolution accelerated in the late 1980s and did for human memory and intelligence what the steam engine and electricity did in the nineteenth century for muscle power. In March 1989, Tim Berners-Lee, a British physicist working at the CERN laboratory in Geneva, wrote a proposal for a “world wide web” of documents written in a standardized “hypertext language” that would reside on computers dotted around the world and communicating through phone lines with what he called “browsers.” Berners-Lee predicted that his world wide web would quickly allow “the creation of new links and new material,” making “authorship universal” for computer users everywhere.
Iron Sunrise
by
Stross, Charles
Published 28 Oct 2004
“I’ve got to make a couple of calls. This should be fun…” The UN headquarters campus hadn’t changed visibly in Rachel’s absence — the same neoclassical glass-and-steel skyscraper, looming over old Geneva’s stone arteries and quaint domes, the same big statues of founders Otto von Bismarck and Tim Berners-Lee sitting out front in the plaza. Rachel headed into the lobby, looking around tensely. There was a civil cop standing by the ornate reception throne, talking to the human greeter there. Rachel nodded in their direction then moved on toward the antique elevator bank, feeling reassured. I wonder how George is doing?
Against Intellectual Monopoly
by
Michele Boldrin
and
David K. Levine
Published 6 Jul 2008
The announcement and reactions have been widely covered in the press. 2. Bill Gates, “Microsoft Challenges and Strategy,” memo, May 16, 1991. 3. Extensive discussion of the role of copyright and patents in the software market can be found in Bessen and Hunt (2003). The first browser and Web server were written by Tim Berners-Lee, of the European Organization for Nuclear Research (known widely as CERN), who was also instrumental in persuading his superiors at CERN to keep the code and protocols free and open. NCSA Mosaic was the first popular browser and provided the source code for both Netscape and Internet Explorer.
Crypto: How the Code Rebels Beat the Government Saving Privacy in the Digital Age
by
Steven Levy
Published 15 Jan 2002
In 1993, two students at the University of Illinois had engaged in a coffeehouse conversation that would not only change the course of the twenty-two-year-old international network called the Internet but would profoundly affect the adoption of crypto. One of them, a chunky undergrad named Marc Andreessen, had recently been learning about a new system on the Internet brashly named the World Wide Web by its inventor, Tim Berners-Lee, a British computer scientist working in Switzerland. The Web was an ingenious way to publish and get access to information on the Net, but only a few in the technical community had adopted the system. Andreessen saw a wider potential. If someone created a slick “browser” to surf through the information space created by a multitude of people who shared text, pictures, and sounds on the Web, he said to his colleague Eric Bina, the Internet itself would be easier to use and a better way to get information.
Eloquent JavaScript: A Modern Introduction to Programming
by
Marijn Haverbeke
Published 15 Nov 2018
Also remember that scopes do not derive from Object.prototype, so if you want to call hasOwnProperty on them, you have to use this clumsy expression: Object.prototype.hasOwnProperty.call(scope, name); PART II BROWSER “The dream behind the Web is of a common information space in which we communicate by sharing information. Its universality is essential: the fact that a hypertext link can point to anything, be it personal, local or global, be it draft or highly polished.” — Tim Berners-Lee, The World Wide Web: A very short personal history 13 JAVASCRIPT AND THE BROWSER The next chapters of this book will talk about web browsers. Without web browsers, there would be no JavaScript. Or even if there were, no one would ever have paid any attention to it. Web technology has been decentralized from the start, not just technically but also in the way it evolved.
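The “clumsy expression” above is easy to demonstrate. A minimal sketch, assuming a modern JavaScript runtime (the variable names are illustrative): an object created with a null prototype has no hasOwnProperty method of its own, so the method must be borrowed from Object.prototype and applied with call.

    // A scope object with no prototype chain behind it.
    const scope = Object.create(null);
    scope.x = 10;

    // scope.hasOwnProperty("x") would throw a TypeError here,
    // because nothing in scope's (empty) prototype chain defines it.
    console.log(Object.prototype.hasOwnProperty.call(scope, "x")); // true
    console.log(Object.prototype.hasOwnProperty.call(scope, "y")); // false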
Open: The Story of Human Progress
by
Johan Norberg
Published 14 Sep 2020
Since the pioneers primarily wanted to share processor time on mainframe computers, they could have made the decision to block other applications. But they had sufficient understanding of the limitations of their own imagination to make the platform open and unspecified so that others could later use it as they saw fit, for e-mail, the world wide web and other applications. At the same time, the British computer consultant Tim Berners-Lee worked on a way to keep track of all his colleagues and their projects, and got the idea of a system organized through hyperlinks. He eventually developed this into an idea of a digital platform that could gather all information on the internet, based on SGML, a markup language descended from one developed at IBM in the 1960s.
Ways of Being: Beyond Human Intelligence
by
James Bridle
Published 6 Apr 2022
Together, the pair came up with a resonant new phrase to describe the mycorrhizal networks, which became the headline on the cover of that issue of Nature: ‘The Wood Wide Web’.13 Back in the 1960s, when the nascent internet started to thread its filaments across the planet, it did so primarily through university departments. It was the development of hypertext and the invention of the World Wide Web by Tim Berners-Lee at CERN in 1989 (specifically to facilitate the sharing of academic documents) which kick-started its wider adoption and understanding. But the gift of the Web wasn’t only informational: by its very existence it gave us new tools to identify and understand networks themselves. Before the Web’s arrival, scientists lacked the tools needed to understand how networks functioned in the real world.
Apple in China: The Capture of the World's Greatest Company
by
Patrick McGee
Published 13 May 2025
He auctioned off NeXT’s assets, and four of his five cofounders left the company. Jobs had failed; he was no longer the wunderkind featured on magazine covers. But NeXT wasn’t entirely dead. Its proprietary operating system, NeXTStep, might not have wide adoption, but it was a sophisticated, brilliant system that Tim Berners-Lee used to invent the World Wide Web. Jobs and his team changed focus to evolve the software into OPENSTEP, an operating system that could work on multiple platforms. Meanwhile, the industry had largely failed to notice that Steve was the CEO of yet another company. In 1986, he’d acquired a digital graphics group from Star Wars maker George Lucas for $5 million.
An Empire of Wealth: Rise of American Economy Power 1607-2000
by
John Steele Gordon
Published 12 Oct 2009
In 1983, by which time there were 563 computers on the net, the University of Wisconsin developed the Domain Name System, which made it much easier for one computer to find another on the net. By 1990 there were more than three hundred thousand computers on the Internet and the number was growing explosively, doubling every year. But it was still mostly a network connecting government agencies, universities, and corporate research institutions. Then, in 1991, Tim Berners-Lee, an Englishman working for CERN, the European nuclear research consortium, released without copyright the first Web browser, a program that allows people to easily find and link to different sites set up for the purpose. The World Wide Web (WWW) was born. Individuals and corporations quickly saw the potential of this new means to communicate as well as advertise and sell their products.
From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism
by
Fred Turner
Published 31 Aug 2006
Finally, in April 1995, the NSF relinquished control of the Internet backbone, facilitating the interlinking of commercial, alternative, and government-sponsored networks and the mixing of for-profit and not-for-profit uses across the system.12 By that time, another phenomenon had appeared on the Internet: the World Wide Web. Created in 1990 by Tim Berners-Lee and his colleagues at the Conseil Européen pour la Recherche Nucléaire (CERN), the Web took advantage of the Internet’s information transfer protocols to create a new system of information exchange. Thanks to hyperlinks embedded in documents, as well as the new Uniform Resource Locator (URL) system, users could navigate information in new and complex ways.
The Future of the Internet: And How to Stop It
by
Jonathan Zittrain
Published 27 May 2009
RECURSION FROM TECHNOLOGY TO CONTENT TO SOCIETY The emergence of a vibrant public Internet and a powerful PC affects many traditional forms of creative and artistic expression because the accessibility of the PC and the Internet to coding by a variety of technically capable groups has translated to a number of platforms for use by artistically capable groups. For example, thanks to the hypertext standards first developed by researcher Tim Berners-Lee,76 the Web came into existence, and because Berners-Lee’s html hypertext markup language was easy to master, people without much technical know-how could build Web sites showcasing their creative work,77 or Web sites that were themselves creative works. Once html took off, others wrote html processors and converters so that one did not even have to know the basics of html to produce and edit Web pages.78 Similarly, simple but powerful software written by amateurs and running on Internet servers has enabled amateur journalists and writers to prepare and customize chronological accounts of their work—“blogs”79—and the pervasiveness of online search software has made these blogs accessible to millions of people who do not seek them out by name but rather by a topical search of a subject covered by a blog entry.80 The blog’s underlying software may be changeable itself, as Wordpress is, for example, and therefore generative at the technical layer.
European Spring: Why Our Economies and Politics Are in a Mess - and How to Put Them Right
by
Philippe Legrain
Published 22 Apr 2014
But individual European countries are far ahead: Switzerland managed 119.7 per million and Sweden 87.8. Britain had a paltry 27.9 and Italy only 13.2.596 Yet many truly great breakthroughs aren’t patented at all: think of Crick and Watson’s discovery of the molecular structure of DNA, the basis of much of biotechnology, or Tim Berners-Lee’s invention of the World Wide Web. Perhaps the best long-term yardstick of an advanced economy’s dynamism – its ability to generate new ideas and deploy them across the economy – is labour productivity growth. Even so, it is subject to all the same measurement problems as GDP (see Chapter 9).
Why We Can't Afford the Rich
by
Andrew Sayer
Published 6 Nov 2014
These trade pacts seek to augment the intellectual property of companies, prolonging patents beyond 20 years, enabling them to extract more rent for longer from ‘their’ products. Internet service providers will be required to filter and block content – thereby giving companies control over users’ use of ‘their’ products, stopping sharing or reverse engineering and adaptation. Tim Berners-Lee, the inventor of the World Wide Web, may have said ‘this is for everyone’, but some major companies want to privatise and control it for their own interests. Julian Assange of Wikileaks comments: If instituted, the TPP’s intellectual property regime would trample over individual rights and free expression, as well as ride roughshod over the intellectual and creative commons.
I'm Feeling Lucky: The Confessions of Google Employee Number 59
by
Douglas Edwards
Published 11 Jul 2011
.* This was almost single-handedly the work of Michael Schmitt, an engineer who took it upon himself to conduct a global search to track down tapes and CD backups of the earliest Usenet posts and the hardware that could read them. He recovered for posterity the first Usenet mention of Microsoft, Tim Berners-Lee's first posted reference to the "World Wide Web," and Marc Andreessen's public disclosure of the Mosaic web browser that would become Netscape. "If there were justice in the world," wrote a formerly disappointed Usenetter, "you guys would be rich and Bill Gates would be standing in line waiting for watery soup."
The Men Who United the States: America's Explorers, Inventors, Eccentrics and Mavericks, and the Creation of One Nation, Indivisible
by
Simon Winchester
Published 14 Oct 2013
It transmits this information through the physical system of the Internet that was devised at the Pentagon by Roberts and Licklider, using the protocols first made at the Pentagon by Cerf and Kahn. But the Web does not have many creators—just one. And neither he nor its early users were Americans. Tim Berners-Lee is British, and his first customers were scientists working at the nuclear research center CERN, in Switzerland. He told them about his invention in a memo sent electronically in August 1991. The web, as he called it, “aims to allow all links to be made to any information anywhere. [It] was started to allow high energy physicists to share data, news, and documentation.
Connectography: Mapping the Future of Global Civilization
by
Parag Khanna
Published 18 Apr 2016
The Dark Web of anonymous Tor-encrypted networks and Bitcoin transactions, the Deep Web of unindexed pages, corporate intranets, and other publicly unsearchable databases make up the vast majority of the Internet’s content. Though the Internet has no central authority, it is moving from its halcyon days as an ungoverned stateless commons with only technical supervision into a geopolitical arena of intense complexity. The Web’s founding father, Sir Tim Berners-Lee, has warned against strategic manipulation and advocated a cyber Magna Carta that guarantees the Internet remain a neutral utility. But it is too late: The Internet already shows signs of both digital sovereignty and feudalism, with rivalries not mapping neatly onto political geography. As the U.S.
More: The 10,000-Year Rise of the World Economy
by
Philip Coggan
Published 6 Feb 2020
These articles were turned into print by the use of “hot metal” that was pressed into the page, a technology that was a century old.13 However, computers would not have had a huge impact on society if they had merely replaced the electric typewriter. Here again, the government played a vital role. The US defence department’s Advanced Research Projects Agency (ARPA) was created in 1958 and succeeded in connecting remote computers in 1969. By 1985, 2,000 computers were connected to the network, dubbed the ARPANET.14 Tim Berners-Lee, a software engineer at CERN, the particle physics laboratory in Geneva, developed a protocol, the Hypertext Transfer Protocol, or HTTP, which allowed computers to talk to each other, and a language, HTML, in which documents and the links between them could be written, enabling users to send links to individual documents. Others adopted the system, which was known as the World Wide Web.
Adventures in the Anthropocene: A Journey to the Heart of the Planet We Made
by
Gaia Vince
Published 19 Oct 2014
The computer and Internet facilities here would be unusual in a school in London – here, they are astonishing. In the Anthropocene, the world no longer needs to end at the village perimeter. Just as social development goals now include a right to electricity, it is no longer acceptable for people to be denied access to Tim Berners-Lee’s brilliant toy. Through it, we are no longer a few individuals collaborating with a few more. We are a bigger more beautiful creature: the organism of humanity, ‘Homo omnis’. We can communicate not just with remotely located people, but with everybody simultaneously – we’re even attempting to speak to aliens located elsewhere in the universe.
How to Be a Liberal: The Story of Liberalism and the Fight for Its Life
by
Ian Dunt
Published 15 Oct 2020
In the late 20th Century, a new form of technology developed which intensified group dynamics and threatened to drive them out of control. It began in the CERN research facility in Switzerland in 1989, where physicists were trying to work out a way to share information between them about their experiments. A British computer scientist called Tim Berners-Lee had an idea, which he called Mesh. A new kind of technology called hypertext allowed him to link documents together, so that a highlighted word could be selected by a user and take them to another document containing further information. The documents could then be stored on several servers, which were interconnected, so they were readily available for everyone to read.
Grand Transitions: How the Modern World Was Made
by
Vaclav Smil
Published 2 Mar 2021
When Gutenberg used his invention of movable type to print the first pages of his Bible in 1454, he could not foresee that during the remaining 45 years of the 15th-century European printers would publish more than 11,000 new editions, that the number of new titles would rise by an order of magnitude during the second half of the 16th century (to about 135,000), and that they would reach nearly 650,000 during the second half of the 18th century (Buringh and van Zanden 2009). Book printing, not textiles or tools, was the first mass-scale industry of the early modern era whose impact (is the Enlightenment conceivable without it?) was not obvious at the time of its origins. Similarly, when Tim Berners-Lee released his first World Wide Web browser in March 1991 it was not at all obvious that a near-monopoly in searching would arise. Mosaic, the first popular Internet browser, became available in 1993 and during the next five years, before Google’s launch, there was a mini-universe of search engines, from AltaVista and Ask Jeeves to WebCrawler and Yahoo!
The Relentless Revolution: A History of Capitalism
by
Joyce Appleby
Published 22 Dec 2009
What started out under government sponsorship became the twenty-first century’s biggest commercial success story. The ARPANET went into operation in 1969, with the commercial Telenet network following in 1975. The desire to connect with other computers grew exponentially as people acquired PCs. A little more technological tweaking perfected the Internet.26 Meanwhile in the late 1980s, Tim Berners-Lee and Robert Cailliau at the European Organization for Nuclear Research in Geneva came up with a system that went beyond connecting computers, arranging for the transfer of information over the Internet by using hypertext. Their World Wide Web did indeed go worldwide as computer users discovered the wonders of the Web.
Beautiful Architecture: Leading Thinkers Reveal the Hidden Beauty in Software Design
by
Diomidis Spinellis
and
Georgios Gousios
Published 30 Dec 2008
A development team of up to 3,000 people has evolved and enhanced the system over the past 25 years. Based on success, persistence, and influence, the 5ESS architecture is a fine addition to our gallery. Another system to consider for inclusion in our Gallery of Beautiful Architectures is the architecture of the World Wide Web (WWW), created by Tim Berners-Lee at CERN, and described in Bass, Clements, and Kazman (2003). The WWW has certainly been commercially successful, transforming the way that people use the Internet. The architecture has remained intact, even as new applications are created and new capabilities introduced. The overall simplicity of the architecture contributes to its Conceptual Integrity, but decisions such as using a single library for both clients and servers and creating a layered architecture to separate concerns have ensured that the integrity of the architecture remains intact.
The Price of Inequality: How Today's Divided Society Endangers Our Future
by
Joseph E. Stiglitz
Published 10 Jun 2012
Some might claim, for instance, that Steve Jobs or the innovators of search engines or social media were, in their way, geniuses. Jobs was number 110 on the Forbes list of the world’s wealthiest billionaires before his death, and Mark Zuckerberg was 52. But many of these “geniuses” built their business empires on the shoulders of giants, such as Tim Berners-Lee, the inventor of the World Wide Web, who has never appeared on the Forbes list. Berners-Lee could have become a billionaire but chose not to—he made his idea available freely, which greatly speeded up the development of the Internet.18 A closer look at the successes of those at the top of the wealth distribution shows that more than a small part of their genius resides in devising better ways of exploiting market power and other market imperfections—and, in many cases, finding better ways of ensuring that politics works for them rather than for society more generally.
The Moral Animal: Evolutionary Psychology and Everyday Life
by
Robert Wright
Published 1 Jan 1994
† It’s amazing how fast a viewpoint can move from radical to trite. Today, with fascism seeming like an ancient relic, and the Internet looking strikingly neural, talk of a giant global brain is cheap. But there’s a difference. These days, most people who talk this way are speaking loosely. Tim Berners-Lee, who invented the World Wide Web, has noted parallels between the Web and the structure of the brain, but he insists that “global brain” is a mere metaphor. Teilhard de Chardin, in contrast, seems to have been speaking literally: humankind was coming to constitute an actual brain—like the one in your head, except bigger.
Nonzero: The Logic of Human Destiny
by
Robert Wright
Published 28 Dec 2010
† It’s amazing how fast a viewpoint can move from radical to trite. Today, with fascism seeming like an ancient relic, and the Internet looking strikingly neural, talk of a giant global brain is cheap. But there’s a difference. These days, most people who talk this way are speaking loosely. Tim Berners-Lee, who invented the World Wide Web, has noted parallels between the Web and the structure of the brain, but he insists that “global brain” is a mere metaphor. Teilhard de Chardin, in contrast, seems to have been speaking literally: humankind was coming to constitute an actual brain—like the one in your head, except bigger.
The Art of UNIX Programming
by
Eric S. Raymond
Published 22 Sep 2003
Early distributed-hypertext projects such as NLS and Xanadu were severely constrained by the MIT-philosophy assumption that dangling links were an unacceptable breakdown in the user interface; this constrained the systems to either browsing only a controlled, closed set of documents (such as on a single CD-ROM) or implementing various increasingly elaborate replication, caching, and indexing methods in an attempt to prevent documents from randomly disappearing. Tim Berners-Lee cut through this Gordian knot by punting the problem in classic New Jersey style. The simplicity of implementation he bought by allowing “404: Not Found” as a response was what made the World Wide Web lightweight enough to propagate and succeed. Gabriel himself, while sticking with the observation that ‘worse’ is more infectious and tends to win in the end, has publicly changed his mind several times about the underlying complexity-related question of whether or not this is actually a good thing.
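Raymond’s point about “404: Not Found” can be shown from the client’s side. A minimal sketch, assuming a fetch-capable JavaScript runtime such as a browser or Node 18+ (the URL is hypothetical): a dangling link arrives as an ordinary response to inspect, not a system-level failure.

    // Follow a hyperlink; a dangling link simply yields status 404.
    async function follow(url) {
      const response = await fetch(url);
      if (response.status === 404) {
        return null; // the link dangles; report it and move on
      }
      return response.text();
    }

    follow("https://example.com/maybe-gone")
      .then((body) => console.log(body === null ? "404: Not Found" : body));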
Architects of Intelligence
by
Martin Ford
Published 16 Nov 2018
CSAIL today has 115 faculty members, and each of these faculty members has a big dream about computing, which is such an important part of our ethos here. Some of our faculty members want to make computing better through algorithms, systems or networks, while others want to make life better for humanity with computing. For example, Shafi Goldwasser wants to make sure that we can have private conversations over the internet; and Tim Berners-Lee wants to create a bill of rights, a Magna Carta of the World Wide Web. We have researchers who want to make sure that if we get sick, the treatments that are available to us are personalized and customized to be as effective as they can be. We have researchers who want to advance what machines can do: Leslie Kaelbling wants to make Lieutenant-Commander Data, and Russ Tedrake wants to make robots that can fly.
Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity
by
Daron Acemoglu
and
Simon Johnson
Published 15 May 2023
People doing this work then come up with previously untried schemes that are confronted with evidence and reasoning, and are subsequently further refined. All these human tasks can be helped by accurate filtering and the provision of useful information. The World Wide Web, often associated with the British computer scientist Tim Berners-Lee, is a quintessential example of this type of aid to human cognition. By the late 1980s, the internet, the global network of computers communicating with one another, had been around for about two decades, but there was no easy way of accessing the trove of information that existed in this network.
Clojure Programming
by
Chas Emerick
,
Brian Carper
and
Christophe Grand
Published 15 Aug 2011
[405] Because modifications to the state of the ref’s mapping may end up retrying due to request contention, some automatically generated identifiers pulled from the counter will be dropped on the floor. This seems reasonable, but worth being aware of. [406] ring.util.response contains a number of very useful utilities for creating and modifying response maps. [407] As much of a non sequitur as it is, we would be remiss if we did not suggest that you internalize Tim Berners-Lee’s rant, “Cool URIs don’t change”: http://www.w3.org/Provider/Style/URI.html. [408] Other request method macros Compojure provides out of the box include PUT, DELETE, HEAD, and a special macro ANY, which acts as a wildcard. If necessary, route macros can be defined as needed for “nonstandard” HTTP verbs like COPY
A Short History of Nearly Everything
by
Bill Bryson
Published 5 May 2003
It’s an average rate, in other words, and you can apply it to any large sampling. Someone once worked out, for instance, that dimes have a half-life of about 30 years. *25 There are practical side effects to all this costly effort. The World Wide Web is a CERN offshoot. It was invented by a CERN scientist, Tim Berners-Lee, in 1989. *26 You are of course entitled to wonder what is meant exactly by “a constant of 50” or “a constant of 100.” The answer lies in astronomical units of measure. Except conversationally, astronomers don’t use light-years. They use a distance called the parsec (a contraction of parallax and second), based on a universal measure called the stellar parallax and equivalent to 3.26 light-years.
NeuroTribes: The Legacy of Autism and the Future of Neurodiversity
by
Steve Silberman
Published 24 Aug 2015
Ultimately, the future of computing belonged not to the Big Iron mainframes and networks of “dumb terminals” that McCarthy loved but to the smart little machines that the members of the Homebrew Computer Club were soldering together in their garages. The task of claiming the power of computing for the many remained to be done by Internet pioneers like Vint Cerf and Tim Berners-Lee—and an autistic engineer who launched the first social network for the people in a record store in Berkeley. V Lee Felsenstein had engineering in his blood. His grandfather, William T. Price, made a fortune by shrinking the design of diesel engines so they could fit into trains and trucks.
The Power Law: Venture Capital and the Making of the New Future
by
Sebastian Mallaby
Published 1 Feb 2022
The hacker ethic, championed by communalist nerds who obsessed over code and declined on principle to monetize it, actually originated at MIT—with the Tech Model Railroad Club, a group of MIT undergrads enthralled by the technology behind model trains before their attention was diverted to the TX-0 computer.[6] (The TX-0 was so captivating that the authorities at MIT considered getting rid of it. “People stopped washing, stopped eating, stopped their social life, and, of course, stopped studying,” according to one account.)[7] Similarly, Tim Berners-Lee, the British-born and Geneva-based inventor of the World Wide Web, combined creative imagination with an antimaterialist disdain for business. “If you’re interested in using the code, mail me,” he wrote in a public announcement, refusing to profit from his invention. In Finland, not the sort of place where Bono played a lot of gigs, Linus Torvalds created the bare bones of the Linux operating system and gave it away freely.
Frommer's London 2009
by
Darwin Porter
and
Danforth Prince
Published 25 Aug 2008
Insider’s Tip: A large addition to this museum explores such topics as genetics, digital technology, and artificial intelligence. Four floors of a new Welcome Wing shelter half a dozen exhibition areas and a 450-seat IMAX theater. On an upper floor, visitors can learn how DNA was used to identify living relatives of the Bleadon Man, a 2,000-year-old Iron Age man. On the third floor is the computer that Tim Berners-Lee used to design the World Wide Web outside Geneva, writing the first software for it in 1990. Note also the marvelous interactive consoles placed strategically in locations throughout the museum. These display special itineraries, including directions to the various galleries for families, teens, adults, and those with special interests.
Tools of Titans: The Tactics, Routines, and Habits of Billionaires, Icons, and World-Class Performers
by
Timothy Ferriss
Published 6 Dec 2016
Marc co-created the highly influential Mosaic browser, the first widely used graphical web browser. He also co-founded Netscape, which later sold to AOL for $4.2 billion. He then co-founded Loudcloud, which sold as Opsware to Hewlett-Packard for $1.6 billion. He’s considered one of the founding fathers of the modern Internet, alongside pioneers like Tim Berners-Lee, who launched the Uniform Resource Locator (URL), Hypertext Transfer Protocol (HTTP), and early HTML standards. This all makes him one of the few humans ever to create software categories used by more than a billion people and establish multiple billion-dollar companies. Marc is now co-founder and general partner of venture capital firm Andreessen Horowitz, where he has become one of the most influential and dominant tech investors on the planet.
Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems
by
Martin Kleppmann
Published 16 Mar 2017
.: “TAO: Facebook’s Distributed Data Store for the Social Graph,” at USENIX Annual Technical Conference (USENIX ATC), June 2013. [36] “Apache TinkerPop3.2.3 Documentation,” tinkerpop.apache.org, October 2016. [37] “The Neo4j Manual v2.0.0,” Neo Technology, 2013. [38] Emil Eifrem: Twitter correspondence, January 3, 2014. [39] David Beckett and Tim Berners-Lee: “Turtle – Terse RDF Triple Language,” W3C Team Submission, March 28, 2011. [40] “Datomic Development Resources,” Metadata Partners, LLC, 2013. [41] W3C RDF Working Group: “Resource Description Framework (RDF),” w3.org, 10 February 2004. [42] “Apache Jena,” Apache Software Foundation. [43] Steve Harris, Andy Seaborne, and Eric Prud’hommeaux: “SPARQL 1.1 Query Language,” W3C Recommendation, March 2013
Roller-Coaster: Europe, 1950-2017
by
Ian Kershaw
Published 29 Aug 2018
Outsourcing often meant, too, handing over elements in the production and distribution chain to self-employed people, thereby enabling firms to avoid cumbersome and costly obligations under labour law, although this often meant transposing onerous work practices on to the self-employed in small businesses. Communications and transnational relations were by the first decade of the twenty-first century utterly transformed. The rapid spread of the internet, especially after the creation of the World Wide Web (invented in 1989 by Tim Berners-Lee and made available to the general public two years later), spearheaded a revolution that was changing the possibility of communications and availability of knowledge and information at breathtaking speed and in previously unimaginable ways. Goods could be ordered from overseas to be delivered to the front door with astonishing rapidity at the touch of a computer key.
Founders at Work: Stories of Startups' Early Days
by
Jessica Livingston
Published 14 Aug 2008
Each one of the organizations went forward to figure out how to make this all go. This was in 1989/90. So we were all looking into the future. Livingston: WAIS seems to have ideas that anticipated the Web. Kahle: All these ideas were in the air. The Web came a bit later, but, as I understand, Tim Berners-Lee was working on some of the same things, but doing them locally within CERN in Switzerland. We were doing them within a corporate environment using supercomputers and the Internet as well. Livingston: So Dow Jones, KPMG Peat Marwick, and Apple were all involved? Kahle: Yes, everybody was working together.
This Sceptred Isle
by
Christopher Lee
Published 19 Jan 2012
1401 First Lollard Martyr
1403 Percy’s Revolt; Henry Percy killed at Shrewsbury
1406 James I of Scots
1409 Owen Glyndŵr
1411 Foundation of Guildhall in London
1413 Henry V
1415 Agincourt
1420 Treaty of Troyes; Paston Letters
1422 Henry VI
1429 Joan of Arc at Orléans
1437 James II of Scots
1450 Cade’s Rebellion
1453 End of Hundred Years War; Gutenberg Bible
1455 Wars of the Roses begin
1460 James III of Scots
1461 Edward IV
c.1474 Caxton prints first book in English
1483 Richard III
1485 Henry VII; founding of the Yeomen of the Guard
1488 James IV of Scots
1492 Christopher Columbus reaches America
1509 Henry VIII marries Catherine of Aragon
1513 James V of Scots
1519 Charles V, Holy Roman Emperor
1527 Henry VIII fails in attempt to divorce Catherine of Aragon
1533 Henry VIII marries Anne Boleyn; Cranmer, Archbishop of Canterbury
1536 Henry VIII marries Jane Seymour; Wales annexed to England
1540 Henry VIII marries and divorces Anne of Cleves; marries Catherine Howard
1540 Henry VIII, King of Ireland
1542 Mary, Queen of Scots
1547 Edward VI
1549 First Book of Common Prayer
1553 Mary I
1556 Cranmer executed
1558 Elizabeth I
1561 Mary, Queen of Scots returns to Scotland from France
1562 British slave trade starts
1567 James VI, King of Scotland
1571 First anti-Catholic Penal Law
1580 Drake’s circumnavigation
1587 Mary, Queen of Scots executed
1596 Robert Cecil, Secretary of State
1600 British East India Company incorporated
1601 Essex executed
1603 James I
1603 Ralegh treason trial and imprisonment
1611 Authorized Version of the Bible
1616 Death of William Shakespeare
1618 Ralegh executed; Thirty Years War starts
1625 Charles I
1632 Lord Baltimore granted patent for the settlement of Maryland
1641 The Grand Remonstrance issued
1642 Civil War starts; Battle of Edgehill
1643 Battle of Newbury
1644 Battle of Marston Moor
1645 New Model Army established
1649 Charles I executed; massacres at Wexford and Drogheda
1651 Charles II crowned at Scone; Hobbes’ Leviathan published
1655 Jamaica captured
1658 Cromwell dies
1660 Charles II; Declaration of Breda; Pepys begins his diary
1662 The Royal Society; Boyle’s Law
1666 Fire of London
1670 Hudson’s Bay Company
1673 Test Act
1678 Bunyan’s Pilgrim’s Progress
1685 James II
1689 William III and Mary II
1690 Battle of the Boyne
1692 Massacre of Glencoe
1694 Bank of England
1695 Bank of Scotland
1702 Queen Anne
1704 Battle of Blenheim; capture of Gibraltar
1707 Union with Scotland
1714 George I
1719 Daniel Defoe’s Robinson Crusoe
1722 Walpole, first Prime Minister
1727 George II
1740 War of Austrian Succession; Arne composes ‘Rule Britannia’
1742 Handel’s Messiah
1746 Battle of Culloden
1751 Clive captures Arcot
1755 Dr Johnson’s Dictionary
1756 Seven Years War
1759 General Wolfe dies at Battle of Quebec
1760 George III
1765 Stamp Act; Hargreaves’ spinning jenny
1767 Revd Laurence Sterne’s Tristram Shandy
1768 Royal Academy of Arts founded
1772 Warren Hastings, first Governor General of Bengal
1773 Boston Tea Party
1774 Priestley isolates oxygen
1775 American Revolution – Lexington and Concord
1776 American Declaration of Independence
1779 Captain Cook killed in Hawaii
1780 Gordon Riots; Epsom Derby
1781 Battle of Yorktown
1783 Pitt the Younger PM
1788 Regency Crisis
1789 French Revolution
1792 Tom Paine’s The Rights of Man
1799 Napoleon
1801 Union with Ireland
1805 Trafalgar
1807 Abolition of Slave Trade Act
1815 Waterloo
1820 George IV
1828 University of London founded
1829 Catholic Emancipation Act
1830 William IV
1832 First Reform Act
1833 Abolition of slavery in British colonies Act
1834 Houses of Parliament burned down
1836 Births, Marriages & Deaths Act
1837 Queen Victoria
1838 Public Records Office founded
1839 Bed Chamber Crisis; Opium War
1840 Prince Albert; Treaty of Waitangi
1843 Joule’s First Law
1844 Rochdale Pioneers; first telegraph line in England
1846 Repeal of Corn Laws
1847 Marx and Engels’ The Communist Manifesto
1849 Punjab conquered
1850 Public libraries; Tennyson, Poet Laureate
1854 Crimean War; British Medical Association founded
1855 Daily Telegraph founded; Palmerston PM
1857 Sepoy Rebellion (Indian Mutiny); Trollope’s Barchester Towers
1858 Canning, first Viceroy of India
1859 Darwin’s On the Origin of Species
1861 Prince Albert dies; American Civil War
1865 Abraham Lincoln assassinated
1867 Second Reform Act; first bicycle
1868 TUC
1869 Suez Canal opened; Cutty Sark launched
1870 Death of Dickens
1876 Victoria made Empress of India
1880 Gladstone PM
1881 First Boer War
1884 Third Reform Act
1885 Gordon dies at Khartoum
1887 Queen Victoria’s Golden Jubilee
1891 Elementary school fees abolished
1895 Salisbury PM
1896 Daily Mail founded
1898 Omdurman
1899 Second Boer War
1900 Elgar’s Dream of Gerontius
1901 Edward VII
1903 Suffragettes
1904 Entente Cordiale
1908 Borstal opened
1909 Old Age Pensions
1910 George V
1914 Irish Home Rule; First World War
1916 Lloyd George PM
1918 RAF formed from Royal Flying Corps; Marie Stopes
1919 John Maynard Keynes’ Economic Consequences of the Peace
1920 Black and Tans; Anglican Church in Wales disestablished
1921 Irish Free State
1922 Bonar Law PM
1923 Baldwin PM
1924 First Labour Government (MacDonald PM); Baldwin PM; Lenin dies
1925 Britain joins gold standard
1926 General Strike
1928 Women over twenty-one given vote
1929 The Depression; MacDonald PM
1931 National Government; Statute of Westminster
1932 British Union of Fascists
1933 Hitler
1935 Baldwin PM
1936 Edward VIII; George VI; Spanish Civil War
1937 Chamberlain PM
1938 Austria annexed by Germany; Air Raid Precautions (ARP)
1939 Second World War
1940 Battle of Britain; Dunkirk; Churchill PM
1942 Beveridge Report; fall of Singapore and Rangoon
1944 Butler Education Act; Normandy allied landings
1945 Attlee PM; Germany and Japan surrender
1946 UN founded; National Insurance Act; National Health Service
1947 India Independence; Pakistan formed
1948 Railways nationalized; Berlin Airlift; Ceylon (Sri Lanka) independence
1949 NATO; Irish Independence
1950 Korean War
1951 Churchill PM
1952 Elizabeth II
1955 Eden PM; Cyprus Emergency
1956 Suez Crisis
1957 Macmillan PM
1958 Life Peerages; EEC
1959 Vietnam War; Fidel Castro
1960 Macmillan’s Wind of Change speech
1963 Douglas-Home PM; De Gaulle veto on UK EEC membership; Kennedy assassination
1964 Wilson PM
1965 Southern Rhodesia UDI
1967 Pound devalued
1969 Open University; Northern Ireland Troubles; Robin Knox-Johnston first solo, non-stop sailing circumnavigation
1970 Heath PM
1971 Decimal currency in UK
1972 Bloody Sunday, Northern Ireland
1973 Britain in EEC; VAT
1974 Wilson PM
1976 Callaghan PM; first Concorde passenger flight
1979 Thatcher PM; Rhodesian Settlement
1982 Falklands War
1985 Mikhail Gorbachev; Global warming – British report hole in ozone layer
1986 Chernobyl; Reagan–Gorbachev Zero missile summit
1987 Wall Street Crash
1988 Lockerbie
1989 Berlin Wall down
1990 John Major PM; Iraq invades Kuwait
1991 Gulf War; Helen Sharman first Briton in space; Tim Berners-Lee first website; collapse of Soviet Communism
1992 Maastricht Treaty
1994 Church of England Ordination of Women; Channel Tunnel opens
1995 British forces to Sarajevo
1996 Dolly the Sheep clone
1997 Blair PM; Diana Princess of Wales dies; Hong Kong returns to China
1998 Rolls-Royce sold to BMW; Good Friday Agreement
1999 Scottish Parliament and Welsh Assembly elections
2001 Terrorist attacks on New York
2002 Elizabeth the Queen Mother dies
2003 Second Gulf War
2004 Asian Tsunami
2005 Freedom of Information Act; Prince of Wales and Camilla Parker-Bowles wed; terrorist attacks on London
2006 Queen’s eightieth birthday
2007 Ministry of Justice created; Brown PM
2008 Northern Rock collapse
2009 Market crash; banks partly nationalized; MPs expenses scandal
2010 Cameron PM
Digital Empires: The Global Battle to Regulate Technology
by
Anu Bradford
Published 25 Sep 2023
Comm. on the Judiciary, Jan. 20, 2022).
124. Press Release, William Barr, Att’y Gen., Statement of the Att’y Gen. on the Announcement of Civil Antitrust Lawsuit Filed Against Google (Oct. 20, 2020), available at https://www.justice.gov/opa/pr/statement-attorney-general-announcement-civil-antitrust-lawsuit-filed-against-google.
125. Kari Paul, Washington Crackdown on Google Is the Greatest Threat Yet to Big Tech, The Guardian (Oct. 20, 2020), https://www.theguardian.com/technology/2020/oct/20/google-antitrust-charges-threat-big-tech.
126. See generally California Consumer Privacy Act of 2018, Cal. Civ. Code § 1798.100 (2018), amended by Initiative Proposition 24, Sec. 4 (California Privacy Rights Act of 2020).
127. Press Release, Rep. Ro Khanna, Release: Rep. Khanna Releases “Internet Bill of Rights” Principles, Endorsed by Sir Tim Berners-Lee (Oct. 4, 2018), https://khanna.house.gov/media/press-releases/release-rep-khanna-releases-internet-bill-rights-principles-endorsed-sir-tim#:~:text=Set%20of%20Principles%20for%20an,of%20personal%20data%20by%20companies%3B&text=(10)%20To%20have%20an%20entity,accountability%20to%20protect%20your%20privacy.
128. H.R. 5815, 115th Cong. (2018), available at https://www.govinfo.gov/content/pkg/BILLS-115hr5815ih/html/BILLS-115hr5815ih.htm.
129. Press Release, Sen.
The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal
by
M. Mitchell Waldrop
Published 14 Apr 2001
Shortly thereafter, another group at Thinking Machines, Inc., came up with WAIS, the Wide Area Information Server, which enabled users to search for Internet files based on their content. And then around Christmastime 1990, at CERN, the European Center for Particle Physics in Geneva, Switzerland, an English physicist named Tim Berners-Lee finished the initial coding of a system in which Internet files could be linked via hypertext. Actually, Berners-Lee had already been playing with the idea of hypertext for a full decade by that point, having independently reinvented the idea long before he ever heard of Vannevar Bush, Doug Engelbart, Ted Nelson, or, for that matter, the Internet itself; his first implementation, in 1980, had been a kind of free-form database that simply linked files within a single computer.
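That "free-form database" idea is small enough to sketch. Here is a toy illustration in Python (the class and label names are ours, not Berners-Lee's; his actual 1980 program was called ENQUIRE): documents become nodes, and a hypertext link is nothing more than a named pointer from one node to another.

    # Toy single-machine hypertext in the spirit of Berners-Lee's 1980
    # experiment: documents are nodes, links are named pointers.
    class Doc:
        def __init__(self, title, text):
            self.title = title
            self.text = text
            self.links = {}                 # label -> Doc

        def link(self, label, target):
            self.links[label] = target      # a link is just a reference

    cern = Doc("CERN", "European particle physics laboratory")
    web = Doc("WorldWideWeb", "hypertext system for the Internet")
    web.link("made at", cern)
    cern.link("hosts", web)

    # Following a link is a dictionary lookup; no network is involved,
    # which is exactly the single-computer limitation of the 1980 system.
    print(web.links["made at"].title)       # prints: CERN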
A History of Modern Britain
by
Andrew Marr
Published 2 Jul 2009
By the end of the eighties the hot new topics were virtual reality, computer gaming – SimCity was launched in 1989 – and the exponentially increasing power of microprocessors. Computer graphics were becoming common in films, even though they were clunky and basic by modern standards. But the biggest about-to-happen event was the internet itself. The single most significant achievement by a British person in the early nineties had nothing to do with politics. Sir Tim Berners-Lee, inventor of the World Wide Web, stands alongside James Lovelock for influence above that of any politician. Today’s internet is a combination of technologies, from the satellites that developed from the Soviet Sputnik success of 1957, to the US military programs to link computers, leading to the early ‘net’ systems developed by American universities, and the personal computer revolution itself.
Palo Alto: A History of California, Capitalism, and the World
by
Malcolm Harris
Published 14 Feb 2023
“The network is the computer” became Sun’s slogan, and though the company did make a desktop play, its next real success came in the new market for big computers that sat in closets. These powerful server computers hosted files and programs that skinnier client devices could access through a network. The backbone for the network of networks was still the NSFNET, and a commercial ban limited the internet’s spread through the ’80s. That’s when Tim Berners-Lee at CERN, in Switzerland, published the code for a web browser program called WorldWideWeb that allowed users to post and retrieve multimedia—photos and audio in addition to just text—and a program for a web server computer that always stayed on and hosted sites with pages. The Stanford Linear Accelerator Center (SLAC) installed the hemisphere’s first web server at the end of 1991, allowing Berners-Lee to demonstrate the system by browsing from a conference in France into the SLAC bibliographic database.
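That division of labor, an always-on server holding pages and a thin client that fetches and renders them, is still how the web works, and it takes only a few lines to demonstrate. Below is a minimal sketch in Python using only the standard library; the URL points at CERN's restored copy of the first website, and any reachable web server would do.

    # Minimal web client: ask an always-on server for a page over HTTP.
    from urllib.request import urlopen

    URL = "http://info.cern.ch/hypertext/WWW/TheProject.html"  # restored first website

    with urlopen(URL) as response:  # the client opens a connection to the server
        page = response.read().decode("utf-8", errors="replace")

    print(page[:300])  # the opening of the HTML the server sent back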
Gorbachev: His Life and Times
by
William Taubman
Shimon Peres, Lech Walesa, former French premier Michel Rocard, and Arnold Schwarzenegger offered tributes in person; Bill Clinton, George Shultz, and Bono prerecorded theirs. Gorbachev presented “Man Who Changed the World” awards (obviously named after him) to Ted Turner, World Wide Web inventor Tim Berners-Lee, and the African inventor of cheap, solar-powered lamps, Evans Wadongo. Musical interludes featured Valery Gergiev conducting the London Symphony Orchestra, baritone Dmitry Hvorostovsky, Shirley Bassey (singing “Diamonds Are Forever”), Paul Anka (singing “My Way,” whose lyrics he wrote, which became a signature number for Frank Sinatra), at least one crooning Spice Girl, plus assorted Russian performers and a German rock band, the Scorpions.[92]
[Photograph: Gorbachev with President Barack Obama and Vice President Joseph Biden at the White House, March 20, 2009.]
Frommer's England 2011: With Wales
by
Darwin Porter
and
Danforth Prince
Published 2 Jan 2010
One exhibition explores everything from drug use in sports to how engineers observe sea life with robotic submarines. On an upper floor, visitors can learn how DNA was used to identify living relatives of the Bleadon Man, a 2,000-year-old Iron Age Man. On the third floor is the computer that Tim Berners-Lee used to design the World Wide Web outside Geneva, writing the first software for it in 1990. Note also the marvelous interactive consoles placed strategically in locations throughout the museum. These display special itineraries, including directions for getting to the various galleries for families, teens, adults, and those with special interests.