by Parmy Olson · 284pp · 96,087 words
one of the Google canteens, two floors below the office of Larry Page, the twenty-five-year-old Ukrainian was spitballing with two other researchers, Ashish Vaswani and Jakob Uszkoreit. His lunch mates also didn’t like following the conventions of other scientists in the building. Vaswani was hungry to work on
…
of caution, but more than anything, the company was obsessed with maintaining its reputation and the status quo. Soon, Google was going to experience what Ashish Vaswani describes as a “biblical moment.” As Google continued printing money from its advertising business, OpenAI was taking what looked like a monumental step toward AGI
by Madhumita Murgia · 20 Mar 2024 · 336pp · 91,806 words
in scientific discovery, the one that spurred this latest artificial intelligence advance came from a moment of serendipity. In early 2017, two Google research scientists, Ashish Vaswani and Jakob Uszkoreit, were in a hallway of the search giant’s Mountain View campus, discussing a new idea for how to improve machine translation
by Stephen Witt · 8 Apr 2025 · 260pp · 82,629 words
a word that had appeared many paragraphs earlier might provide a contextual clue to what the next word meant. Polosukhin and Uszkoreit were joined by Ashish Vaswani, another Google researcher, and by early 2017 the three had built a rudimentary English-to-German translator based on the self-attention mechanism. Polosukhin and
by Ray Kurzweil · 25 Jun 2024
how transformers work, and the original technical paper, see Giuliano Giacaglia, “How Transformers Work,” Towards Data Science, March 10, 2019, https://towardsdatascience.com/transformers-141e32e69591; Ashish Vaswani et al., “Attention Is All You Need,” arXiv:1706.03762v5 [cs.CL], December 6, 2017, https://arxiv.org/pdf/1706.03762.pdf.
by Nate Silver · 12 Aug 2024 · 848pp · 227,015 words
Post, May 17, 2023, washingtonpost.com/technology/2023/05/16/sam-altman-open-ai-congress-hearing. “Attention Is All”: Ashish Vaswani et al., “Attention Is All You Need,” arXiv, August 1, 2023, arxiv.org/abs/1706.03762. most rapidly adopted
by Karen Hao · 19 May 2025 · 660pp · 179,531 words
Sutskever would get up: A photo of Sutskever at the event. In August 2017, that changed: Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez et al., “Attention Is All You Need,” in NIPS ’17: Proceedings of the 31st