Linda problem

description: a problem used in psychology to illustrate the conjunction fallacy in probability theory

13 results

pages: 654 words: 191,864

Thinking, Fast and Slow
by Daniel Kahneman
Published 24 Oct 2011

In the Linda problem, in contrast, intuition often overcame logic even in joint evaluation, although we identified some conditions in which logic prevails. Amos and I believed that the blatant violations of the logic of probability that we had observed in transparent problems were interesting and worth reporting to our colleagues. We also believed that the results strengthened our argument about the power of judgment heuristics, and that they would persuade doubters. And in this we were quite wrong. Instead, the Linda problem became a case study in the norms of controversy. The Linda problem attracted a great deal of attention, but it also became a magnet for critics of our approach to judgment.

As in the Müller-Lyer illusion, the fallacy remains attractive even when you recognize it for what it is. The naturalist Stephen Jay Gould described his own struggle with the Linda problem. He knew the correct answer, of course, and yet, he wrote, “a little homunculus in my head continues to jump up and down, shouting at me—‘but she can’t just be a bank teller; read the description.’” The little homunculus is of course Gould’s System 1 speaking to him in insistent tones. (The two-system terminology had not yet been introduced when he wrote.) The correct answer to the short version of the Linda problem was the majority response in only one of our studies: 64% of a group of graduate students in the social sciences at Stanford and at Berkeley correctly judged “feminist bank teller” to be less probable than “bank teller.”

From the perspective of economic theory, this result is troubling: the economic value of a dinnerware set or of a collection of baseball cards is a sum-like variable. Adding a positively valued item to the set can only increase its value. The Linda problem and the dinnerware problem have exactly the same structure. Probability, like economic value, is a sum-like variable, as illustrated by this example:

probability (Linda is a teller) = probability (Linda is feminist teller) + probability (Linda is non-feminist teller)

This is also why, as in Hsee’s dinnerware study, single evaluations of the Linda problem produce a less-is-more pattern. System 1 averages instead of adding, so when the non-feminist bank tellers are removed from the set, subjective probability increases.
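
The sum-rule decomposition and the averaging account above can be made concrete in a few lines. The following is a minimal editorial sketch, not from the book: the probabilities and the “representativeness” scores are invented numbers chosen only to show the shape of the argument.

```python
# Sum rule: "bank teller" splits into two disjoint sub-cases, so its
# probability is the sum of theirs (hypothetical numbers).
p_feminist_teller = 0.05
p_non_feminist_teller = 0.02
p_teller = p_feminist_teller + p_non_feminist_teller

# The conjunction rule follows at once: the part cannot exceed the whole.
assert p_feminist_teller <= p_teller

# A crude "System 1" that averages how well each feature fits Linda's
# description instead of adding probabilities (hypothetical fit scores).
def averaged_fit(*scores):
    return sum(scores) / len(scores)

fit_feminist, fit_teller = 0.9, 0.1
print(averaged_fit(fit_teller))                 # "bank teller" alone -> 0.10
print(averaged_fit(fit_feminist, fit_teller))   # "feminist bank teller" -> 0.50, feels likelier
```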

pages: 533 words: 125,495

Rationality: What It Is, Why It Seems Scarce, Why It Matters
by Steven Pinker
Published 14 Oct 2021

Yet as the article was becoming a sensation (including with President Bill Clinton, who passed it around the White House), the number of civil wars, the proportion of people without access to clean water, and the rate of American crime were sinking like stones.49 Within three years an effective treatment for AIDS would begin to decimate its death toll. And more than a quarter century later, national borders have barely budged. The conjunction fallacy was first illustrated by Tversky and Kahneman with an example that has become famous as “the Linda problem”:50 Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Please indicate the probability of each of these statements: Linda is a teacher in elementary school.

The dated vignette, with its baby-boomer “Linda,” backhanded compliment “bright,” passé protests, and declining occupation, betrays its early-1980s vintage. But as any psychology instructor knows, the effect is easily replicable, and today, highly intelligent Amanda who marches for Black Lives Matter is still deemed likelier to be a feminist registered nurse than a registered nurse. The Linda problem engages our intuitions in a particularly compelling way. Unlike the selection task, where people make errors when the problem is abstract (“If P then Q”) and get it right when it is couched in certain real-life scenarios, here everyone agrees with the abstract law “prob(A and B) ≤ prob(A)” but is upended when it is made concrete.

As we saw with the Monty Hall dilemma, “probability” has several meanings, including physical propensity, justified strength of belief, and frequency in the long run. Still another sense is provided by the Oxford English Dictionary: “the appearance of truth, or likelihood of being realized, which any statement or event bears in the light of present evidence.”54 People faced with the Linda problem know that “frequency in the long run” is irrelevant: there’s only one Linda, and either she is a feminist bank teller or she isn’t. In any coherent conversation the speaker would supply biographical details for a reason, namely to lead the listener to a plausible conclusion. According to the psychologists Ralph Hertwig and Gerd Gigerenzer, people may have rationally inferred that the relevant meaning of “probability” in this task is not one of the mathematical senses in which the conjunction rule applies, but the nonmathematical sense of “degree of warrant in light of the present evidence,” and they sensibly followed where the evidence pointed.55 In support of the charitable reading, many studies, beginning with ones by Tversky and Kahneman themselves, show that when people are encouraged to reason about probability in the sense of relative frequency, rather than being left to struggle with the enigmatic concept of the probability of a single case, they are likelier to obey the conjunction rule.
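
The frequency reframing described above can be illustrated with a toy count-based version of the question. This is a minimal sketch with invented counts, not data from any of the studies cited in the passage:

```python
# Frequency framing (hypothetical counts): of 1,000 women who fit Linda's
# description, how many are bank tellers, and how many are feminist bank tellers?
population = 1000
feminist_tellers = 30        # invented count
non_feminist_tellers = 10    # invented count

bank_tellers = feminist_tellers + non_feminist_tellers

# Posed as counts over a concrete population, the nesting is hard to miss:
assert feminist_tellers <= bank_tellers
print(f"{bank_tellers} of {population} are bank tellers")
print(f"{feminist_tellers} of {population} are feminist bank tellers")
```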

pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All
by Robert Elliott Smith
Published 26 Jun 2019

I then headed towards another exciting session entitled ‘The Kill Decision: Sci-Fi or Reality?’, but on the way I passed a group of people talking in a circle of metal chairs in the main hall. They were discussing the ‘Linda Problem’. Dammit! The ‘Kill Decision’ session, sadly, would have to wait, as the ‘Linda Problem’ was one of my personal bugbears, and I couldn’t resist taking a chair. The ‘Linda Problem’ was introduced in 1983 by psychologists Daniel Kahneman and Amos Tversky,1 at the headwaters of the field of behavioural economics, which would eventually result in their Nobel Prize in Economics. The problem goes like this: Linda is thirty-one years old, single, outspoken and very bright.

It is one of the many ‘biases’ in human decision-making that are pointed out in behavioural economics, because they indicate that people do not reason ‘correctly’ relative to probability theory. Maybe it was the coffee (I had another cup during my session), but at that moment at SCIFOO, the ‘Linda Problem’ represented everything that I thought was going wrong with AI, and I launched in crusader style. I interrupted, and asked ‘What if the fallacy in the conjunction fallacy isn’t in the people who are answering the question “incorrectly”, but in the people asking the question?’ That got me the floor, so I continued, explaining that what the people who pose the ‘Linda Problem’ think they are presenting is a well-structured problem in probability theory, which has a ‘correct’ answer.

As the session broke up, a researcher from Google walked up to me and said quite bluntly and loudly, ‘Sorry, but I think you’re wrong’. People walking by stopped to see where this was headed. He continued, saying that there is always a set of variables whose probabilities, correctly characterized, define correct answers to any situation, including the ‘Linda Problem’. This must be the case, as probability is the correct representation of the uncertain world around us, and statistics are the only basis from which we can reason and learn. I countered that probabilities only apply to truth uncertainty, and not semantic or ontological uncertainty, and explained each of these categories from Lane and Maxfield.

pages: 336 words: 113,519

The Undoing Project: A Friendship That Changed Our Minds
by Michael Lewis
Published 6 Dec 2016

Plus Danny didn’t believe that people would actually make this particular mistake. Seeing the descriptions side by side, they’d realize that it was illogical to say that anyone was more likely to be a bank teller active in the feminist movement than simply a bank teller. With something of a heavy heart, Danny put what would come to be known as the Linda problem to a class of a dozen students at the University of British Columbia. “Twelve out of twelve fell for it,” he said. “I remember I gasped. Then I called Amos from my secretary’s phone.” They ran many further experiments, with different vignettes, on hundreds of subjects. “We just wanted to look at the boundaries of the phenomenon,” said Danny.

They gave subjects the same description of Linda and asked, simply: “Which of the two alternatives is more probable?”

Linda is a bank teller.
Linda is a bank teller and is active in the feminist movement.

Eighty-five percent still insisted that Linda was more likely to be a bank teller in the feminist movement than she was to be a bank teller. The Linda problem resembled a Venn diagram of two circles, but with one of the circles wholly contained by the other. But people didn’t see the circles. Danny was actually stunned. “At every step we thought, now that’s not going to work,” he said. And whatever was going on inside people’s minds was terrifyingly stubborn.
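
The nested-circles picture can be written down directly. A minimal sketch over a hypothetical toy population (not data from the experiments described here):

```python
# Two descriptions as sets: anyone who is a bank teller AND a feminist is also
# a bank teller, so the first set sits wholly inside the second, like the
# inner of two nested circles.
people = [
    {"name": "a", "teller": True,  "feminist": True},
    {"name": "b", "teller": True,  "feminist": False},
    {"name": "c", "teller": False, "feminist": True},
    {"name": "d", "teller": False, "feminist": False},
]

tellers = {p["name"] for p in people if p["teller"]}
feminist_tellers = {p["name"] for p in people if p["teller"] and p["feminist"]}

assert feminist_tellers <= tellers            # subset: the inner circle
assert len(feminist_tellers) <= len(tellers)  # so it can never be more probable
```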

Danny gathered an auditorium full of UBC students and explained their mistake to them. “Do you realize you have violated a fundamental rule of logic?” he asked. “So what!” a young woman shouted from the back of the room. “You just asked for my opinion!” They put the Linda problem in different ways, to make sure that the students who served as their lab rats weren’t misreading its first line as saying “Linda is a bank teller NOT active in the feminist movement.” They put it to graduate students with training in logic and statistics. They put it to doctors, in a complicated medical story, in which lay embedded the opportunity to make a fatal error of logic.

pages: 453 words: 111,010

Licence to be Bad
by Jonathan Aldred
Published 5 Jun 2019

Forster famously contrasted a simple succession of facts – ‘The king died and then the queen died’ – with a plot: ‘The king died, and then the queen died of grief.’ There is more information in this plot, yet it is no harder to remember: it is cognitively more efficient.16 However, there is a catch. Daniel Kahneman and Amos Tversky provided the first clear evidence. The Linda Problem remains one of their most famous experiments: Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more probable?

Still, we must be careful not to overstate the problem. Yes, ordinary folk reach for narratives to help cope with uncertainty, but experts reach for their tools, theories and computer models. Experts may fool themselves with optimistic bell-curve thinking, but at least they don’t fall for an error as basic as the Linda Problem. Except that they do, as Kahneman and Tversky discovered when they ran similar experiments with doctors and other trained experts. And there is clear evidence that expert decision-makers are reluctant to abandon another narrative – the optimistic orthodoxy about decision-making under uncertainty described in this chapter, the one beginning with von Neumann’s jottings on the back of an envelope, then proceeding via Savage through a series of improvements and applications leading to Nobel prizes and other glory.

pages: 147 words: 39,910

The Great Mental Models: General Thinking Concepts
by Shane Parrish
Published 22 Nov 2019

In order for someone to deliberately get in your way they have to notice you, gauge the speed of your car, consider where you are headed, and swerve in at exactly the right time to cause you to slam on the brakes, yet not cause an accident. That is some effort. The simpler and thus more likely explanation is that they didn’t see you. It was a mistake. There was no intent. So why would you assume the former? Why do our minds make these kinds of connections when the logic says otherwise? The famous Linda problem, demonstrated by the psychologists Daniel Kahneman2 and Amos Tversky in a 1982 paper, is an illuminating example of how our minds work and why we need Hanlon’s Razor. It went like this: Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

With this experiment, and a host of others, Kahneman and Tversky exposed a sort of tic in our mental machinery: we’re deeply affected by vivid, available evidence, to such a degree that we’re willing to make judgments that violate simple logic. We over-conclude based on the available information. We have no trouble packaging in unrelated factors if they happen to occur in proximity to what we already believe. The Linda problem was later criticized as the psychologists setting their test subjects up for failure. If it was stated in a different way, subjects did not always make the error. But this of course was their point. If we present the evidence in a certain light, the brain malfunctions. It doesn’t weigh out the variables in a rational way.

pages: 807 words: 154,435

Radical Uncertainty: Decision-Making for an Unknowable Future
by Mervyn King and John Kay
Published 5 Mar 2020

And our knowledge of the world would lead us to think it more likely that Lenin met Rosa Luxemburg (the leader of the German communist revolution of 1918) than that he met James Joyce (if you are interested, Lenin and Luxemburg did meet when Lenin and his wife changed trains in Berlin in 1908). 8 Philadelphia is not the capital of Pennsylvania, and anyone who offers odds on the answer to such a question is a knave (and anyone who accepts them a fool). You will wind up with an earful of cider. The ‘Linda problem’ is one of the most frequently reported experiments in behavioural economics. In his bestseller Thinking, Fast and Slow , Daniel Kahneman describes it thus: ‘Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

When we ask whether it is likely that Joyce met Lenin, or that Philadelphia is the capital of Pennsylvania, they do not reason probabilistically but interpret the question in the light of their broad contextual knowledge. That was the lesson Sky Masterson’s father had conveyed to his son, and it is one well understood by Kahneman’s respondents. People do not think of the Linda problem in terms of frequencies, or as an exercise in probabilistic reasoning. They see the description of Linda as a story about a real person, and the biography of Linda which ends only by identifying her as a bank teller is not, without more information, a satisfactory account. Faced with such a narrative in real life, one would seek further explanation to resolve the apparent incongruity and be reluctant to believe, far less act on, the information presented.

pages: 542 words: 132,010

The Science of Fear: How the Culture of Fear Manipulates Your Brain
by Daniel Gardner
Published 23 Jun 2009

Kahneman and Tversky were sure people would spot it and correct their intuition. But they were wrong. Almost exactly the same percentage of students—85 percent—said it is more likely that Linda is a bank teller and a feminist than a bank teller only. Kahneman and Tversky also put both versions of the “Linda problem,” as they called it, under the noses of experts trained in logic and statistics. When the experts answered the original question, with its long list of distracting details, they got it just as wrong as the undergraduates. But when they were given the two-line version, it was as if someone had elbowed them in the ribs.

That’s a simple example based on a simple notion of what’s “typical,” but we are capable of forming very complex images of typicality— such as that of a “typical” feminist or a “typical” bank teller. We make these sorts of judgments all the time and we’re scarcely aware of them for the good reason that they usually work, and that makes the Rule of Typical Things an effective way to simplify complex situations and come up with reliable snap judgments. Or at least, it usually is. The Linda problem demonstrates one way the Rule of Typical Things can go wrong. When there’s something “typical” involved, our intuition is triggered. It just feels right. And as always with intuitive feelings, we tend to go with them even when doing so flies in the face of logic and evidence. It’s not just ordinary people who fall into this trap, incidentally.

The participants were all experts—from universities, governments, and corporations—whose job was assessing current trends and peering into the future. If anyone could be expected to judge the chances of things happening rationally, it was this bunch. The psychologists gave a version of the “Linda problem” to two groups, totaling 115 experts. The first group was asked to evaluate the probability of “a complete suspension of diplomatic relations between the USA and the Soviet Union, sometime in 1983.” The second group was asked how likely it was that there would be “a Russian invasion of Poland, and a complete suspension of diplomatic relations between the USA and Poland, sometime in 1983.”

pages: 111 words: 1

Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets
by Nassim Nicholas Taleb
Published 1 Jan 2001

It corresponds to the practice of estimating the frequency of an event according to the ease with which instances of the event can be recalled. (2) The representativeness heuristic: gauging the probability that a person belongs to a particular social group by assessing how similar the person’s characteristics are to the “typical” group member’s. A feminist-style philosophy student is deemed more likely to be a feminist bank teller than to be just a bank teller. This problem is known as the “Linda problem” (the feminist’s name was Linda) and has caused plenty of academic ink to flow (some of the people engaged in the “rationality debate” believe that Kahneman and Tversky are putting highly normative demands on us humans). (3) The simulation heuristic: the ease of mentally undoing an event—playing the alternative scenario.

Simpson had a 1/500,000 chance of not being the killer from the blood standpoint (remember the lawyers used the sophistry that there were four people with such blood types walking around Los Angeles) and adding to it the fact that he was the husband of the person and that there was additional evidence, then (owing to the compounding effect) the odds against him rise to several trillion trillion. “Sophisticated” people make worse mistakes. I can surprise people by saying that the probability of the joint event is lower than either. Recall the representativeness heuristic: with the Linda problem, rational and educated people find the likelihood of an event greater than that of a larger one that encompasses it. I am glad to be a trader taking advantage of people’s biases but I am scared of living in such a society.

An Absurd World

Kafka’s prophetic book, The Trial, about the plight of a man, Joseph K., who is arrested for a mysterious and unexplained reason, hit a spot as it was written before we heard of the methods of the “scientific” totalitarian regimes.

pages: 523 words: 154,042

Fancy Bear Goes Phishing: The Dark History of the Information Age, in Five Extraordinary Hacks
by Scott J. Shapiro

The conjunction rule states that the probability of two events occurring can never be greater than the probability of either of those events occurring by itself:

Conjunction Rule: Prob(x) ≥ Prob(x AND y)

Thus, the probability that a coin will land heads twice in a row (for two tosses) cannot be greater than the probability that a coin lands heads just once (for one toss). Similarly, the probability that Linda is a feminist bank teller cannot be greater than the probability that Linda is a bank teller. The Linda problem, first formulated by the Israeli psychologists Daniel Kahneman and Amos Tversky, is perhaps the most famous example of human violations of the basic rules of probability theory. Kahneman and Tversky spent their careers uncovering how mistaken our judgments and choices can be. The human mind is riddled with upcode that causes us to make biased predictions and irrational choices.
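
The inequality can be checked mechanically on the coin example from the passage. This is an editorial sketch, not code from the book, using ordinary fair-coin arithmetic:

```python
from itertools import product

# Conjunction rule on the coin example: P(first toss heads) versus
# P(heads twice in a row), enumerating all equally likely two-toss outcomes.
outcomes = list(product("HT", repeat=2))      # HH, HT, TH, TT

p_first_heads = sum(o[0] == "H" for o in outcomes) / len(outcomes)      # 0.5
p_both_heads = sum(o == ("H", "H") for o in outcomes) / len(outcomes)   # 0.25

# Prob(x) >= Prob(x AND y): the joint event is a subset of each component.
assert p_first_heads >= p_both_heads
print(p_first_heads, p_both_heads)
```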

pages: 519 words: 104,396

Priceless: The Myth of Fair Value (And How to Take Advantage of It)
by William Poundstone
Published 1 Jan 2010

The question was written so that Linda fits the stereotype of a feminist and doesn’t fit the stereotype of a bank teller. Hunches about Linda defied logic. Those hunches were amazingly tenacious, though. Tversky and Kahneman resorted to “a series of increasingly desperate manipulations” intended to get their subjects to obey simple logic. They tried giving volunteers the Linda problem, followed by two arguments about what the answer should be. The subjects didn’t have to commit to an answer, just to say which argument they believed was more convincing. Argument 1. Linda is more likely to be a bank teller than she is to be a feminist bank teller, because every feminist bank teller is a bank teller, but some women bank tellers are not feminists, and Linda could be one of them.

pages: 755 words: 121,290

Statistics Hacks
by Bruce Frey
Published 9 May 2006

In the 1970s, Nobel Prize winner Daniel Kahneman and his colleague Amos Tversky presented college students with several problems in which one option was highly representative of a given personality description, one option was incongruent with the description, and one option included both the highly similar and the incongruent options. Perhaps the most well-known problem that demonstrates the conjunction fallacy is the now-famous (at least in cognitive psychology circles) Linda Problem: Linda is 31 years old, single, outspoken, and very bright. She majored in Philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and she also participated in antinuclear demonstrations. Subjects were asked to rank these statements based on how likely they were to be true: Linda is a teacher in elementary school.

pages: 913 words: 265,787

How the Mind Works
by Steven Pinker
Published 1 Jan 1997

Our ancestors’ usable probabilities must have come from their own experience, and that means they were frequencies: over the years, five out of the eight people who came down with a purple rash died the following day. Gigerenzer, Cosmides, Tooby, and the psychologist Klaus Fiedler noticed that the medical decision problem and the Linda problem ask for single-event probabilities: how likely is it that this patient is sick, how likely is it that Linda is a bankteller. A probability instinct that worked in relative frequencies might find the questions beyond its ken. There’s only one Linda, and either she is a bankteller or she isn’t. “The probability that she is a bankteller” is uncomputable.