sorting algorithm

description: algorithm that puts elements of a list in a certain order

74 results

The Art of Computer Programming: Sorting and Searching

by Donald Ervin Knuth  · 15 Jan 1998

we may conclude that either (i) there are many important applications of sorting, or (ii) many people sort when they shouldn't, or (iii) inefficient sorting algorithms have been in common use. The real truth probably involves all three of these possibilities, but in any event we can see that sorting is

Many fascinating unsolved problems remain in this area, as well as quite a few solved ones. From a broader perspective we will find also that sorting algorithms make a valuable case study of how to attack computer programming problems in general. Many important principles of data structure manipulation will be illustrated in

, while external sorting shows us how to live with rather stringent accessing constraints. The time required to sort N records, using a decent general-purpose sorting algorithm, is roughly proportional to N log N; we make about log N "passes" over the data. This is the minimum possible time, as we shall see

problem, assuming that only a few thousand words of internal memory are available, supplemented by about half a dozen tape units (enough tape units for sorting). Algorithms that work well under such limitations also prove to be efficient on modern machines. 10. [15] You are given a tape containing one million words

PERMUTATIONS A permutation of a finite set is an arrangement of its elements into a row. Permutations are of special importance in the study of sorting algorithms, since they represent the unsorted input data. In order to study the efficiency of different sorting methods, we will want to be able to

both cases. Rothe used this fact to prove that the determinant of a matrix is unchanged when the matrix is transposed. The analysis of several sorting algorithms involves the knowledge of how many permutations of n elements have exactly k inversions. Let us denote that number by Iₙ(k); Table 1 lists
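
The quantity being tabulated can also be computed directly; as a sketch (a brute-force O(n²) count in Python, not Knuth's method), the inversion count of a single permutation is:

```python
def inversions(perm):
    """Count the inversions of a permutation: pairs (i, j)
    with i < j but perm[i] > perm[j]."""
    n = len(perm)
    return sum(1 for i in range(n)
                 for j in range(i + 1, n)
                 if perm[i] > perm[j])
```

For example, inversions([3, 1, 2]) counts the two out-of-order pairs (3, 1) and (3, 2); tabulating this count over all n! permutations of n elements yields the numbers Iₙ(k) discussed in the text.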

is approximately 2, in agreement with the fact that about ½(n + 1) runs appear in a random permutation of length n. For applications to sorting algorithms, a slightly different viewpoint is useful; we will consider the length of the kth run of the permutation from left to right, for k = 1
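
The run-length viewpoint is easy to compute; here is a minimal Python sketch (the function name is illustrative) returning the lengths of the ascending runs from left to right:

```python
def run_lengths(perm):
    """Lengths of the ascending runs of a sequence, left to right.
    An ascending run ends where the next element is smaller."""
    if not perm:
        return []
    lengths, length = [], 1
    for i in range(1, len(perm)):
        if perm[i] >= perm[i - 1]:
            length += 1          # the run continues
        else:
            lengths.append(length)
            length = 1           # a new run starts
    lengths.append(length)
    return lengths
```

For instance, run_lengths([3, 1, 2, 5, 4]) gives [1, 3, 1], the runs being 3 | 1 2 5 | 4; averaged over random permutations of length n, the number of runs is about ½(n + 1).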

S until P is empty, we can read out its elements in increasing order. Unfortunately this doesn't turn out to be as efficient a sorting algorithm as other methods we will see; its minimum running time is proportional to n^1.5, but similar algorithms that use trees instead of tableau structures

upon it. Cases A, B, and C above lead to important classes of sorting techniques that are refinements of the simple ideas stated. Many different sorting algorithms have been invented, and we will be discussing about 25 of them in this book. This rather alarming number of methods is actually only

adapt any of the programs to the general case by using address table sorting or list sorting. An analysis of the running time of each sorting algorithm will be given with the MIX programs. Sorting by counting. As a simple example of the way in which we shall study internal sorting

but the running time remains proportional to N². To summarize what we have done so far: We started with Algorithm S, a simple and natural sorting algorithm that does about ¼N² comparisons and ¼N² moves. We improved it in one direction by considering binary insertion, which does about N lg N comparisons and

spread out over their range. Improvements to multiple list insertion are discussed in Section 5.2.5. EXERCISES 1. [10] Is Algorithm S a stable sorting algorithm? 2. [11] Would Algorithm S still sort numbers correctly if the relation "K ≥ Kᵢ" in step S3 were replaced by "K > Kᵢ"? > 3. [30]

If π < π′, does it follow that π is above π′? 5.2.2. Sorting by Exchanging We come now to the second family of sorting algorithms mentioned near the beginning of Section 5.2: "exchange" or "transposition" methods that systematically interchange pairs of elements that are out of order until

1-sorting, but the comparisons do not overlap. Since Batcher's algorithm essentially merges pairs of sorted subsequences, it may be called the "merge exchange sort." Algorithm M (Merge exchange). Records R1, ..., RN are rearranged in place; after sorting is complete their keys will be in order, K1 ≤ ··· ≤ KN. We assume

random order. A generalization of radix exchange to radices higher than 2 is discussed in Section 5.2.5. *Asymptotic methods. The analysis of exchange sorting algorithms leads to some particularly instructive mathematical problems that enable us to learn more about how to find the asymptotic behavior of functions. For example, we

characteristics of π. (See exercise 5.1.4-41 for another way to measure the disorder of a permutation.) 3. [10] Is the bubble sort Algorithm B a stable sorting algorithm? 4. [M25] If t = 1 in step B4, we could actually terminate Algorithm B immediately, because the subsequent step B2 will do nothing
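
The early-termination test mentioned in exercise 4 can be sketched in Python (an illustration, not Knuth's MIX Program B): if a full pass performs no exchange, the file is already sorted and the algorithm can stop at once.

```python
def bubble_sort(a):
    """In-place bubble sort with the early-termination test:
    if a full pass swaps nothing, no later pass could swap
    anything either, so we stop immediately."""
    n = len(a)
    for end in range(n - 1, 0, -1):
        swapped = False
        for i in range(end):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:   # corresponds to t = 1 in step B4
            break
    return a
```

Because equal keys are never exchanged, bubble sort is also stable, which answers exercise 3 affirmatively.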

. Then we need not consider that position again in future selections, and we need not deal with infinite keys. This idea yields our first selection sorting algorithm. Algorithm S (Straight selection sort). Records R1, ..., RN are rearranged in place; after sorting is complete, their keys will be in order, K1 ≤ ··· ≤ KN.

+ 3.5N − 11 units, just slightly slower than straight insertion (Program 5.2.1S). It is interesting to compare Algorithm S to the bubble sort (Algorithm 5.2.2B), since bubble sorting may be regarded as a selection algorithm that sometimes selects more than one element at a time. For this

this goal, not only eliminating −∞ but also avoiding the need for an auxiliary output area. This line of thinking leads us to an important sorting algorithm that was christened "heapsort" by its discoverer J. W. J. Williams [CACM 7 (1964), 347–348]. Heapsort. Let us say that a file of
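
Williams's heapsort can be sketched in Python (a modern rendering, not the original CACM presentation): the array is rearranged into a max-heap in place, then the maximum is repeatedly swapped to the end, so neither −∞ sentinels nor an auxiliary output area is needed.

```python
def heapsort(a):
    """In-place heapsort: no sentinel keys, no output buffer."""
    n = len(a)

    def sift_down(root, end):
        # Push a[root] down until the max-heap property holds on a[:end].
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and a[child] < a[child + 1]:
                child += 1                      # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    for start in range(n // 2 - 1, -1, -1):     # build the heap
        sift_down(start, n)
    for end in range(n - 1, 0, -1):             # repeatedly extract the max
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```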

von Neumann as early as 1945 (see Section 5.5). We shall study merging in considerable detail in Section 5.4, with regard to external sorting algorithms; our main concern in the present section is the somewhat simpler question of merge sorting within a high-speed random-access memory. Table 1 shows
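
The basic step of internal merge sorting is combining two sorted runs into one; a minimal Python sketch (function names are illustrative):

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:        # take from left on ties
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def merge_sort(a):
    """Top-down merge sort: split in half, sort, merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```

Taking from the left run on ties makes the merge, and hence the whole sort, stable.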

; this tells us precisely how to allocate memory for the piles. We have already made use of the same idea in the "distribution counting sort," Algorithm 5.2D. Thus radix sorting can be carried out as follows: Start with a distribution sort based on the least significant digit of the keys
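
The procedure just described, one stable distribution (counting) pass per digit starting from the least significant, can be sketched in Python (an illustration of the idea for nonnegative integer keys, not Knuth's algorithm verbatim):

```python
def radix_sort(keys, base=10):
    """LSD radix sort for nonnegative integers: one stable
    distribution pass per digit, least significant digit first."""
    if not keys:
        return []
    digits, m = 1, max(keys)
    while m >= base:
        m //= base
        digits += 1
    place = 1
    for _ in range(digits):
        piles = [[] for _ in range(base)]           # allocate the piles
        for k in keys:
            piles[(k // place) % base].append(k)    # distribute by digit
        keys = [k for pile in piles for k in pile]  # read piles back in order
        place *= base
    return keys
```

Each pass is stable, so the ordering established by less significant digits survives the later passes.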

distribution sort with only N record areas (and M count fields), instead of 2N record areas. Does this lead to an improvement over the radix sorting algorithm illustrated in Table 1? 2. [13] Is Algorithm R a stable sorting method? 3. [15] Explain why Algorithm H makes BOTM[0] point to

as a function of M and N. 7. [20] Discuss the similarities and differences between Algorithm R and radix exchange sorting (Algorithm 5.2.2R). > 8. [20] The radix-sorting algorithms discussed in the text assume that all keys being sorted are nonnegative. What changes should be made to the algorithms when the

when the keys are numbers expressed in signed-magnitude notation? 10. [30] Design an efficient most-significant-digit-first radix-sorting algorithm that uses linked memory. (As the size of the subfiles decreases, it is wise to decrease M, and to use a nonradix method on

expansion of cot z.] 5. [16] When keys can be equal, each comparison may have three results instead of two: Kᵢ < Kⱼ, Kᵢ = Kⱼ, Kᵢ > Kⱼ. Sorting algorithms for this general situation can be represented as extended ternary trees, in which each internal node i:j has three subtrees; the left, middle, and

right subtrees correspond respectively to the three possible outcomes of the comparison. Draw an extended ternary tree that defines a sorting algorithm for n = 3, when equal keys are allowed. There should be 13 external nodes, corresponding to the 13 possible outcomes listed in exercise 3.

the a's, must make at least n lg n comparisons, even though the algorithm has only two possible outcomes. 30. [M23] (Optimum exchange sorting.) Every exchange sorting algorithm as defined in Section 5.2.2 can be represented as a comparison-exchange tree, namely a binary tree structure whose internal nodes have the

on level 2 it compares 1:3 in each branch unless that comparison would be redundant. In general, we can consider the class of all sorting algorithms whose comparisons are uniform in that way; assuming that the M = N(N−1)/2 pairs {(a, b) | 1 ≤ a < b ≤ N} have been arranged into a sequence

comparisons Ka₁:Kb₁, Ka₂:Kb₂, ... whose outcome is not already known. Each of the M! arrangements of the (a, b) pairs defines a uniform sorting algorithm. The concept of uniform sorting is due to H. L. Beus [JACM 17 (1970), 482–495], whose work has suggested the next few exercises. It

if Kaᵢ < Kbᵢ, the arc bᵢ → aᵢ if Kaᵢ > Kbᵢ. We are concerned primarily with the number of key comparisons made by a uniform sorting algorithm, not with the mechanism by which redundant comparisons are actually avoided. Thus the graph G need not be constructed explicitly; it is used here

shall also consider restricted uniform sorting, in which only paths of length 2 are counted in cases 1, 2, and 3 above. (A restricted uniform sorting algorithm may make some redundant comparisons, but exercise 65 shows that the analysis is somewhat simpler in the restricted case.) Prove that the restricted uniform algorithm

i and (aᵢ, bᵢ), (aⱼ, bⱼ), (aₖ, bₖ) forms a triangle. a) Prove that the average number of comparisons made by the restricted uniform sorting algorithm is ∑ᵢ 2/(cᵢ + 2). b) Use the results of (a) and exercise 64 to determine the average number of irredundant comparisons performed

this sequence do more or fewer comparisons than quicksort, on the average? 66. [M29] In the worst case, quicksort does N(N−1)/2 comparisons. Do all restricted uniform sorting algorithms (in the sense of exercise 63) perform N(N−1)/2 comparisons in their worst case? 67. [M48] (H. L. Beus.) Does quicksort have the minimum average

number of comparisons, over all (restricted) uniform sorting algorithms? 68. [25] The Ph.D. thesis "Electronic Data Sorting" by Howard B. Demuth (Stanford University, October 1956) was perhaps the first publication to deal in

The data structures must be arranged so that comparatively slow peripheral memory devices (tapes, disks, drums, etc.) can quickly cope with the requirements of the sorting algorithm. Consequently most of the internal sorting techniques we have studied (insertion, exchange, selection) are virtually useless for external sorting, and it is necessary to reconsider

containing the input tape. The total number of records to be sorted is 100,000, but this information is not known in advance to the sorting algorithm. The foldout illustration in Chart A summarizes the actions that transpire when ten different merging schemes are applied to this data. The best way to

in Section 5.2.5 that internal radix sorting is superior to merging, on certain high-speed computers, because the inner loop of the radix sort algorithm avoids complicated branching. If the external memory is especially fast, it may be impossible for such machines to merge data rapidly enough to keep up

the number of inversions of a permutation by more than a bounded amount as it moves a bounded distance on tape; hence every one-tape sorting algorithm must take at least N²d units of time on the average, for some positive constant d that depends on the computer configuration. R. M.

moment we shall assume, in fact, that the disks are actually drums, containing 4000 tracks of 5000 characters each, with no seek time required. What sorting algorithm should be used? The method of merging is a fairly natural choice; other methods of internal sorting do not lend themselves so well to a

developed since 1956, this paper is still remarkably up-to-date in many respects. Friend gave careful descriptions of quite a few internal and external sorting algorithms, and he paid special attention to buffering techniques and the characteristics of magnetic tape units. He introduced some new methods (for example, tree selection,

chapter was first being written, compiled with the help of R. L. Rivest, appeared in Computing Reviews 13 (1972), 283–289. Later developments. Dozens of sorting algorithms have been invented since 1970, although nearly all of them are variations on earlier themes. Multikey quicksort, which is discussed in the answer to exercise

G. Eddy, and O. Petersson, Software Practice & Experience 26 (1996), 781–797. Changes in computer hardware have prompted many interesting studies of the efficiency of sorting algorithms when the cost criteria change; see, for example, the discussion of virtual memory in exercise 5.4.9-20. The effect of hardware caches on

possible to do stable minimum-storage merging in O(N log N) units of time.] > 4. [28] A sorting algorithm is called parsimonious if it makes decisions entirely by comparing keys, and if it never makes a comparison whose outcome could have been predicted from

important in practice. Search methods can be classified in several ways. We might divide them into internal versus external searching, just as we divided the sorting algorithms of Chapter 5 into internal versus external sorting. Or we might divide search methods into static versus dynamic searching, where "static" means that the contents

17. Consider the new record as a subfile of length 1. Repeatedly merge the smallest two subfiles if they have the same length. (The resulting sorting algorithm is essentially the same as Algorithm L, but the subfiles are merged at different relative times.) 18. Yes, but it seems to be a complicated

made, yet considering only paths of length 2 when testing for redundancy. Another way to solve this problem is to consider the equivalent tree insertion sorting algorithm of Section 6.2.2, which makes precisely the same comparisons as the uniform algorithm in the same order. 65. (a) The probability that

Effective STL: 50 Specific Ways to Improve Your Use of the Standard Template Library

by Scott Meyers  · 16 Jun 2001  · 474pp  · 91,222 words

from a vector, string, or deque, existing container elements are typically moved (copied) around (see Items 5 and 14). If you use any of the sorting algorithms (see Item 31); next_permutation or previous_permutation; remove, unique, or their ilk (see Item 32); rotate or reverse, etc., objects will be moved (copied

20 you get back are at least as good as the ones you didn’t. For a full sort, you have slightly more control. Some sorting algorithms are stable. In a stable sort, if two elements in a range have equivalent values, their relative positions are unchanged after sorting. Hence, if Widget

A precedes Widget B in the (unsorted) widgets vector and both have the same quality rating, a stable sorting algorithm will guarantee that after the vector is sorted, Widget A still precedes Widget B. An algorithm that is not stable would not make this guarantee
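
The widget example can be made concrete (here in Python for brevity, although the book's context is C++'s stable_sort; the tuple representation is an assumption of this sketch):

```python
# Hypothetical widgets as (name, quality_rating) pairs.
widgets = [("Widget A", 2), ("Widget B", 2), ("Widget C", 1)]

# Python's sorted() is guaranteed stable: among equal ratings,
# Widget A still precedes Widget B after sorting.
by_rating = sorted(widgets, key=lambda w: w[1])
print(by_rating)
# [('Widget C', 1), ('Widget A', 2), ('Widget B', 2)]
```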

that require more: 1. partition, 2. stable_partition, 3. nth_element, 4. partial_sort, 5. sort, 6. stable_sort. My advice on choosing among the sorting algorithms is to make your selection based on what you need to accomplish, not on performance considerations. If you choose an algorithm that does only what

ranges demarcated by input iterators, nor to maps or multimaps, nor to some implementations of set and multiset (see Item 22). Similarly, many of the sorting algorithms (see Item 31) demand random access iterators, so it’s not possible to invoke these algorithms on the elements of a list. If you violate

for the standard associative containers are predicates, and predicate functions are commonly passed as parameters to algorithms like find_if and the various sorting algorithms. (For an overview of the sorting algorithms, turn to Item 31.) • A pure function is a function whose return value depends only on its parameters. If f is a

optimizations advance and 122 algorithms vs. explicit loops 182–184 associative container vs. sorted vector 100–106 case-insensitive string comparisons and 154 comparative, of sorting algorithms 138 copy vs. range insert 26 copying function objects and 164 objects and 21 distance and 122 empty vs. size 23–24 erasing from contiguous

reallocs via reserve 66–68 ostreambuf_iterators and 127 range vs. single-element member functions 24–33 small string optimization 71 sort vs. qsort 203 sorting algorithms, overview of 133–138 string implementation trade-offs 68–73 toupper and 236–237 use_facet and 234 Efficient C++ 202 bibliography entry for 226

qsort 162 declaration for 162 sort vs. 203 queue, as part of the STL 2 R Rabinowitz, Marty xviii random access iterators definition of 5 sorting algorithms requiring 137 range destination, ensuring adequate size 129–133 member functions 25 input iterators and 29 single-element versions vs. 24–33 summary of 31

and 146 sort 133–138 qsort vs. 203 sorted range algorithms requiring 146–150 sorted vectors associative containers vs. 100–106 legacy APIs and 76 sorting algorithms for 133–138 auto_ptrs 41 consistency and 149 stability, in sorting 135 stable_partition 133–138 stable_sort 133–138 stack, as part of

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems

by Martin Kleppmann  · 17 Apr 2017

of an algorithm To define what it means for an algorithm to be correct, we can describe its properties. For example, the output of a sorting algorithm has the property that for any two distinct elements of the output list, the element further to the left is smaller than the ele
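
The correctness property described here can be turned into an executable check; a hedged Python sketch (the function name is illustrative, and ≤ is used so that duplicate elements are allowed):

```python
def is_correct_sort(inp, out):
    """Check the two defining properties of a sorting algorithm's output:
    (1) ordered: each element is <= the element to its right;
    (2) a permutation: the same multiset of elements as the input."""
    ordered = all(out[i] <= out[i + 1] for i in range(len(out) - 1))
    permutation = sorted(inp) == sorted(out)
    return ordered and permutation
```

Note that ordering alone is not enough; an "output" that drops or duplicates elements can be perfectly ordered yet wrong, which is why the permutation check is part of the property.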

of Key” on page 203). The key-value pairs must be sorted, but the dataset is likely too large to be sorted with a conventional sorting algorithm on a single machine. Instead, the sorting is performed in stages. First, each map task partitions its output by reducer, based on the hash

Masterminds of Programming: Conversations With the Creators of Major Programming Languages

by Federico Biancuzzi and Shane Warden  · 21 Mar 2009  · 496pp  · 174,084 words

sorting. They may be different kinds of things. Sometimes they may be character strings, but think of what the possibilities are. When you write the sorting algorithm, you don’t know any of that stuff, which means it has to be put in later. If it’s done at runtime, of course

Real World Haskell

by Bryan O'Sullivan, John Goerzen, Donald Stewart and Donald Bruce Stewart  · 2 Dec 2008  · 1,065pp  · 229,099 words

of purely functional code, is idempotency—applying a function twice has the same result as applying it only once. For our sort routine—a stable sort algorithm—this should certainly be true, or things have gone horribly wrong! This invariant can be encoded as a property simply, as follows: -- file: ch11/QC
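
The same idempotency invariant can be spot-checked outside QuickCheck as well; a minimal Python sketch (the book encodes the property in Haskell):

```python
import random

def prop_idempotent(sort, xs):
    """Idempotency: applying the sort twice must equal applying it once."""
    return sort(sort(xs)) == sort(xs)

# QuickCheck-style spot check on random inputs.
for _ in range(100):
    xs = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
    assert prop_idempotent(sorted, xs)
```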

The Rationalist's Guide to the Galaxy: Superintelligent AI and the Geeks Who Are Trying to Save Humanity's Future

by Tom Chivers  · 12 Jun 2019  · 289pp  · 92,714 words

better bug fixers. But GenProg found simpler solutions. One was that it was supposed to repair a sorting algorithm that was buggy; it was sorting lists into the wrong order. After GenProg ‘fixed’ it, the sorting algorithm was run through the battery of tests, and scored perfectly in each one: not a single list

The Art of Computer Programming: Fundamental Algorithms

by Donald E. Knuth  · 1 Jan 1974

positive constants L and n₀ such that |g(n)| ≥ L|f(n)| for all n ≥ n₀. Using this notation we can correctly conclude that a sorting algorithm whose running time is Ω(n²) will not be as efficient as one whose running time is O(n log n), if n is large enough. However

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems

by Martin Kleppmann  · 16 Mar 2017  · 1,237pp  · 227,370 words

of an algorithm To define what it means for an algorithm to be correct, we can describe its properties. For example, the output of a sorting algorithm has the property that for any two distinct elements of the output list, the element further to the left is smaller than the element further

“Partitioning by Hash of Key”). The key-value pairs must be sorted, but the dataset is likely too large to be sorted with a conventional sorting algorithm on a single machine. Instead, the sorting is performed in stages. First, each map task partitions its output by reducer, based on the hash of

Scala in Depth

by Tom Kleenex and Joshua Suereth  · 2 Jan 2010  · 554pp  · 108,035 words

utilize vector’s natural array separation in my sorting algorithm. Traditionally this has been solved with two mechanisms: overloading and overriding. Using overloading, the sort method is implemented in terms of Iterable and another is

Python for Data Analysis

by Wes McKinney  · 30 Dec 2011  · 752pp  · 131,533 words

sort_index methods and the Series order method are implemented with variants of these functions (which also must take into account missing values). Alternate Sort Algorithms A stable sorting algorithm preserves the relative position of equal elements. This can be especially important in indirect sorts where the relative ordering is meaningful: In [186]: values

O(n²); 'mergesort': speed 2, stable: Yes, work space n/2, worst case O(n log n); 'heapsort': speed 3, stable: No, work space 0, worst case O(n log n). Warning: At the time of this writing, sort algorithms other than quicksort are not available on arrays of Python objects (dtype=object). This means occasionally that algorithms requiring stable sorting will require workarounds when
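
When the stable kind is needed, for instance in the indirect sorts just mentioned, it can be requested explicitly; a minimal sketch (the array values are illustrative):

```python
import numpy as np

values = np.array([5, 0, 1, 3, 1, 5])

# kind='mergesort' is NumPy's stable option: equal values keep their
# original relative order in the index array that argsort returns.
indexer = values.argsort(kind="mergesort")
print(indexer)           # indices of 0, then the two 1s in original order, etc.
print(values[indexer])   # [0 1 1 3 5 5]
```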

, Pivot Tables and Cross-Tabulation aggregate method, Data Aggregation, Column-wise and Multiple Function Application aggregations, Mathematical and Statistical Methods algorithms for sorting, Alternate Sort Algorithms–Alternate Sort Algorithms alignment of data, Time Series and Cross-Section Alignment–Time Series and Cross-Section Alignment all method, Methods for Boolean Arrays, Methods for Boolean Arrays

Options–HDF5 and Other Array Storage Options HDFStore class, Binary Data Formats header argument, Reading and Writing Data in Text Format heapsort sorting method, Alternate Sort Algorithms hierarchical data format (HDF5), Using HDF5 Format–Using HDF5 Format, HDF5 and Other Array Storage Options–HDF5 and Other Array Storage Options hierarchical indexing, Hierarchical

–Memory-mapped Files, Memory-mapped Files defined, Memory-mapped Files saving arrays to file, Memory-mapped Files–Memory-mapped Files mergesort sorting method, Alternate Sort Algorithms, Alternate Sort Algorithms merging data, Combining and Merging Data Sets–Combining Data with Overlap, Database-style DataFrame Merges–Database-style DataFrame Merges, Merging on Index–Merging on Index

.lib.recfunctions, More About Sorting–numpy.searchsorted: Finding elements in a Sorted Array, Indirect Sorts: argsort and lexsort–Indirect Sorts: argsort and lexsort, Alternate Sort Algorithms–Alternate Sort Algorithms, numpy.searchsorted: Finding elements in a Sorted Array–numpy.searchsorted: Finding elements in a Sorted Array, NumPy Matrix Class–NumPy Matrix Class, Advanced Array Input

periods, Quarterly Period Frequencies–Quarterly Period Frequencies quartile analysis, Decile and Quartile Analysis–Decile and Quartile Analysis question mark (?), Introspection, Introspection quicksort sorting method, Alternate Sort Algorithms quotechar option, Manually Working with Delimited Formats quoting option, Manually Working with Delimited Formats R r file mode, Files and the operating system r+ file

Sorting Levels, More About Sorting–numpy.searchsorted: Finding elements in a Sorted Array, Indirect Sorts: argsort and lexsort–Indirect Sorts: argsort and lexsort, Alternate Sort Algorithms–Alternate Sort Algorithms, numpy.searchsorted: Finding elements in a Sorted Array–numpy.searchsorted: Finding elements in a Sorted Array, numpy.searchsorted: Finding elements in a Sorted Array–numpy

Element-wise Array Functions squeeze argument, Reading and Writing Data in Text Format SSA (Social Security Administration), US Baby Names 1880-2010 stable sorting, Alternate Sort Algorithms stacked format, Pivoting “long” to “wide” Format start index, Slicing, Slicing startswith method, String Object Methods, Vectorized string functions in pandas, Vectorized string functions in

Programming Rust: Fast, Safe Systems Development

by Jim Blandy and Jason Orendorff  · 21 Nov 2017  · 1,331pp  · 183,137 words

Coders: The Making of a New Tribe and the Remaking of the World

by Clive Thompson  · 26 Mar 2019  · 499pp  · 144,278 words

Data Structures & Algorithms Interview Questions You'll Most Likely Be Asked

by Vibrant Publishers  · 24 Jan 2011  · 120pp  · 17,504 words

Algorithms in C++ Part 5: Graph Algorithms

by Robert Sedgewick  · 2 Jan 1992

Programming Python

by Mark Lutz  · 5 Jan 2011

Programming in Scala

by Martin Odersky, Lex Spoon and Bill Venners  · 15 Jan 2008  · 754pp  · 48,930 words

Python for Data Analysis: Data Wrangling with Pandas, NumPy, and IPython

by Wes McKinney  · 25 Sep 2017  · 1,829pp  · 135,521 words

Code Complete (Developer Best Practices)

by Steve McConnell  · 8 Jun 2004  · 1,758pp  · 342,766 words

Concepts, Techniques, and Models of Computer Programming

by Peter Van-Roy and Seif Haridi  · 15 Feb 2004  · 931pp  · 79,142 words

API Design Patterns

by Jj Geewax  · 19 Jul 2021  · 725pp  · 168,262 words

Beautiful Visualization

by Julie Steele  · 20 Apr 2010

The Art of Computer Programming

by Donald Ervin Knuth  · 15 Jan 2001

On Lisp: Advanced Techniques for Common Lisp

by Paul Graham  · 8 Sep 1993  · 423pp  · 21,637 words

Radical Technologies: The Design of Everyday Life

by Adam Greenfield  · 29 May 2017  · 410pp  · 119,823 words

Terms of Service: Social Media and the Price of Constant Connection

by Jacob Silverman  · 17 Mar 2015  · 527pp  · 147,690 words

Clojure Programming

by Chas Emerick, Brian Carper and Christophe Grand  · 15 Aug 2011  · 999pp  · 194,942 words

From Bacteria to Bach and Back: The Evolution of Minds

by Daniel C. Dennett  · 7 Feb 2017  · 573pp  · 157,767 words

Exploring Python

by Timothy Budd  · 17 Feb 2009  · 263pp  · 20,730 words

Pearls of Functional Algorithm Design

by Richard Bird  · 15 Sep 2010

The C++ Programming Language

by Bjarne Stroustrup  · 2 Jan 1986  · 923pp  · 516,602 words

How I Became a Quant: Insights From 25 of Wall Street's Elite

by Richard R. Lindsey and Barry Schachter  · 30 Jun 2007

Ajax: The Definitive Guide

by Anthony T. Holdener  · 25 Jan 2008  · 982pp  · 221,145 words

The Nature of Technology

by W. Brian Arthur  · 6 Aug 2009  · 297pp  · 77,362 words

The Pragmatic Programmer

by Andrew Hunt and Dave Thomas  · 19 Oct 1999  · 509pp  · 92,141 words

The Art of R Programming

by Norman Matloff  · 404pp  · 43,442 words

Rapid GUI Programming With Python and Qt

by Mark Summerfield  · 27 Oct 2007  · 643pp  · 53,639 words

Are You Smart Enough to Work at Google?: Trick Questions, Zen-Like Riddles, Insanely Difficult Puzzles, and Other Devious Interviewing Techniques You ... Know to Get a Job Anywhere in the New Economy

by William Poundstone  · 4 Jan 2012  · 260pp  · 77,007 words

Think Complexity

by Allen B. Downey  · 23 Feb 2012  · 247pp  · 43,430 words

Algorithms Unlocked

by Thomas H. Cormen  · 15 Jan 2013

Algorithms to Live By: The Computer Science of Human Decisions

by Brian Christian and Tom Griffiths  · 4 Apr 2016  · 523pp  · 143,139 words

The Art of SEO

by Eric Enge, Stephan Spencer, Jessie Stricchiola and Rand Fishkin  · 7 Mar 2012

Designing Interfaces

by Jenifer Tidwell  · 15 Dec 2010

Types and Programming Languages

by Benjamin C. Pierce  · 4 Jan 2002  · 647pp  · 43,757 words

The C Programming Language

by Brian W. Kernighan and Dennis M. Ritchie  · 15 Feb 1988  · 238pp  · 93,680 words

Learn Algorithmic Trading

by Sebastien Donadio  · 7 Nov 2019

The Ethical Algorithm: The Science of Socially Aware Algorithm Design

by Michael Kearns and Aaron Roth  · 3 Oct 2019

When Computers Can Think: The Artificial Intelligence Singularity

by Anthony Berglas, William Black, Samantha Thalind, Max Scratchmann and Michelle Estes  · 28 Feb 2015

97 Things Every Programmer Should Know

by Kevlin Henney  · 5 Feb 2010  · 292pp  · 62,575 words

Four Battlegrounds

by Paul Scharre  · 18 Jan 2023

The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services, Volume 2

by Thomas A. Limoncelli, Strata R. Chalup and Christina J. Hogan  · 27 Aug 2014  · 757pp  · 193,541 words

Smart and Gets Things Done: Joel Spolsky's Concise Guide to Finding the Best Technical Talent

by Joel Spolsky  · 1 Jun 2007  · 194pp  · 36,223 words

The Art of Assembly Language

by Randall Hyde  · 8 Sep 2003  · 968pp  · 224,513 words

Reset

by Ronald J. Deibert  · 14 Aug 2020

Sunfall

by Jim Al-Khalili  · 17 Apr 2019  · 381pp  · 120,361 words

Erlang Programming

by Francesco Cesarini  · 496pp  · 70,263 words

Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers

by John MacCormick and Chris Bishop  · 27 Dec 2011  · 250pp  · 73,574 words

Programming Ruby 1.9: The Pragmatic Programmer's Guide

by Dave Thomas, Chad Fowler and Andy Hunt  · 15 Dec 2000  · 936pp  · 85,745 words

Big Data at Work: Dispelling the Myths, Uncovering the Opportunities

by Thomas H. Davenport  · 4 Feb 2014

The Mythical Man-Month

by Brooks, Jr. Frederick P.  · 1 Jan 1975  · 259pp  · 67,456 words

The Data Detective: Ten Easy Rules to Make Sense of Statistics

by Tim Harford  · 2 Feb 2021  · 428pp  · 103,544 words

Code Dependent: Living in the Shadow of AI

by Madhumita Murgia  · 20 Mar 2024  · 336pp  · 91,806 words

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy

by George Gilder  · 16 Jul 2018  · 332pp  · 93,672 words

NumPy Cookbook

by Ivan Idris  · 30 Sep 2012  · 197pp  · 35,256 words

High Performance JavaScript

by Nicholas C. Zakas  · 15 Mar 2010  · 375pp  · 66,268 words

New Dark Age: Technology and the End of the Future

by James Bridle  · 18 Jun 2018  · 301pp  · 85,263 words

The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise

by Nathan L. Ensmenger  · 31 Jul 2010  · 429pp  · 114,726 words

Hacking Vim 7.2

by Kim Schulz  · 29 Apr 2010  · 236pp  · 67,823 words

Being Geek: The Software Developer's Career Handbook

by Michael Lopp  · 20 Jul 2010  · 336pp  · 88,320 words

Python for Algorithmic Trading: From Idea to Cloud Deployment

by Yves Hilpisch  · 8 Dec 2020  · 1,082pp  · 87,792 words

Ask Your Developer: How to Harness the Power of Software Developers and Win in the 21st Century

by Jeff Lawson  · 12 Jan 2021  · 282pp  · 85,658 words

Shipping Greatness

by Chris Vander Mey  · 23 Aug 2012  · 231pp  · 71,248 words

How to Turn Down a Billion Dollars: The Snapchat Story

by Billy Gallagher  · 13 Feb 2018  · 359pp  · 96,019 words

Refactoring: Improving the Design of Existing Code

by Martin Fowler, Kent Beck, John Brant, William Opdyke and Don Roberts  · 9 Mar 2012

Inner Entrepreneur: A Proven Path to Profit and Peace

by Grant Sabatier  · 10 Mar 2025  · 442pp  · 126,902 words