From Daniel Kahneman’s “Thinking, Fast and Slow”

Before starting a project, beware of the Planning Fallacy!

  • “When forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs. They spin scenarios of success while overlooking the potential for mistakes and miscalculations. As a result, they pursue initiatives that are unlikely to come in on budget or on time or to deliver the expected returns—or even to be completed.” (p. 252)

  • “In light of both the outside-view forecast and the eventual outcome, the original estimates we made that Friday afternoon appear almost delusional. This should not come as a surprise: overly optimistic forecasts of the outcome of projects are found everywhere. Amos and I coined the term planning fallacy to describe plans and forecasts that are unrealistically close to best-case scenarios and could be improved by consulting the statistics of similar cases. Examples of the planning fallacy abound in the experiences of individuals, governments, and businesses. The list of horror stories is endless.” (p. 249)

  • “In this view, people often (but not always) take on risky projects because they are overly optimistic about the odds they face. I will return to this idea several times in this book—it probably contributes to an explanation of why people litigate, why they start wars, and why they open small businesses.” (p. 253)






Why do people neglect the Base Rate?

  • People think their individual case is different and unique. This is called the Inside View. Often people don’t even look for the Base Rate, and when they do find it, they ignore it. The Base Rate (the statistic behind the Outside View) is the average rate of a reference class of similar cases.


  • This is a common pattern: “people who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.” (p. 249)


  • “Intuitive predictions tend to be overconfident and overly extreme.” (p. 192)


  • “the greatest responsibility for avoiding the planning fallacy lies with the decision makers who approve the plan. If they do not recognize the need for an outside view, they commit a planning fallacy.” (p. 251)

How Can You Prevent the Planning Fallacy?

  • First, write down the Base Rate: the average rate of a reference class of similar cases. Most people ignore it.

  • Write down your intuitive prediction. Your intuitive prediction will be based on the Inside View: your view of the specific situation, focused on whatever makes your case seem unique.

  • Estimate the Correlation Coefficient. Ask how strongly the evidence that makes your case seem unique actually predicts the outcome. Don’t forget the role of luck and competition neglect, and be aware of unknown unknowns and other hidden factors.

  • Make an updated prediction. Your updated prediction will most likely fall between your intuitive prediction and the base rate. If you have a very high correlation coefficient, it will be closer to the intuitive prediction; if you have a low correlation coefficient, it will be closer to the base rate. (A minimal sketch of this arithmetic follows these steps.)
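In code, the arithmetic behind these four steps is just a regression toward the base rate. Here is a minimal Python sketch with made-up numbers; the function name, the 90-week base rate, the 40-week plan, and the 0.3 correlation are all illustrative assumptions, not from the book:

    def corrected_prediction(base_rate, intuitive_prediction, correlation):
        # Regress the intuitive estimate toward the base rate.
        # correlation is your 0.0-1.0 judgment of how well the evidence
        # behind the intuition predicts the true outcome:
        # 0.0 -> stay at the base rate; 1.0 -> keep the intuition as is.
        return base_rate + correlation * (intuitive_prediction - base_rate)

    # Hypothetical example: similar projects average 90 weeks (Base Rate);
    # our plan says 40 weeks (Inside View); we judge our plan-specific
    # evidence to be only weakly predictive.
    print(corrected_prediction(90, 40, correlation=0.3))  # -> 75.0 weeks

With no useful evidence (correlation 0.0) you stay at the base rate; with complete confidence in your evidence (1.0) you keep the intuitive prediction; anything in between lands in between, exactly as the quote below describes.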

  • “In both cases, you aim for a prediction that is intermediate between the baseline and your intuitive response.

    In the default case of no useful evidence, you stay with the baseline.

    At the other extreme, you also stay with your initial prediction. This will happen, of course, only if you remain completely confident in your initial prediction after a critical review of the evidence that supports it.

    In most cases you will find some reason to doubt that the correlation between your intuitive judgment and the truth is perfect, and you will end up somewhere between the two poles. This procedure is an approximation of the likely results of an appropriate statistical analysis. If successful, it will move you toward unbiased predictions, reasonable assessments of probability, and moderate predictions of numerical outcomes. The two procedures are intended to address the same bias: intuitive predictions tend to be overconfident and overly extreme.” (p. 191)

  • “There is one thing you can do when you have doubts about the quality of the evidence: let your judgments of probability stay close to the base rate. Don’t expect this exercise of discipline to be easy—it requires a significant effort of self-monitoring and self-control.” (p. 153)

  • “The prevalent tendency to underweight or ignore distributional information is perhaps the major source of error in forecasting. Planners should therefore make every effort to frame the forecasting problem so as to facilitate utilizing all the distributional information that is available.” (p. 251)

  • “This may be considered the single most important piece of advice regarding how to increase accuracy in forecasting through improved methods. Using such distributional information from other ventures similar to that being forecasted is called taking an “outside view” and is the cure to the planning fallacy.” (p. 251)

  • “The treatment for the planning fallacy has now acquired a technical name, reference class forecasting, and Flyvbjerg has applied it to transportation projects in several countries. The outside view is implemented by using a large database, which provides information on both plans and outcomes for hundreds of projects all over the world, and can be used to provide statistical information about the likely overruns of cost and time, and about the likely underperformance of projects of different types. The forecasting method that Flyvbjerg applies is similar to the practices recommended for overcoming base-rate neglect …” (p. 251) (A toy numerical sketch follows these quotes.)

  • “This approach to prediction is general. You can apply it whenever you need to predict a quantitative variable, such as GPA, profit from an investment, or the growth of a company. The approach builds on your intuition, but it moderates it, regresses it toward the mean. When you have good reasons to trust the accuracy of your intuitive prediction—a strong correlation between the evidence and the prediction—the adjustment will be small. Intuitive predictions need to be corrected because they are not regressive and therefore are biased.” (p. 190)
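To make the outside view concrete, here is a small hypothetical Python sketch of reference class forecasting. The overrun ratios and the budget are invented for illustration; Flyvbjerg’s actual databases cover hundreds of real projects and use far more careful statistical adjustments:

    import statistics

    # Hypothetical cost-overrun ratios (actual cost / planned cost)
    # from a reference class of completed projects similar to ours.
    reference_overruns = [0.95, 1.1, 1.25, 1.3, 1.4, 1.6, 1.8, 2.0]

    planned_cost = 10_000_000  # our Inside-View budget

    # The Outside View: scale the plan by the distribution of past
    # overruns instead of trusting the plan alone.
    median = statistics.median(reference_overruns)
    p80 = statistics.quantiles(reference_overruns, n=5)[3]  # ~80th percentile

    print(f"Median forecast:   ${planned_cost * median:,.0f}")  # $13,500,000
    print(f"80th-pct forecast: ${planned_cost * p80:,.0f}")     # $18,400,000

The point is that the forecast is anchored in what actually happened to the reference class, not in the plan’s own optimistic story.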





How Daniel Kahneman Fell for the Planning Fallacy.

Daniel Kahneman and his colleagues were excited to design a new curriculum for high schools. They got together to discuss the project, and Kahneman asked each person to predict how long it would take. (To avoid biasing one another, they made their estimates independently.) The responses ranged from 1.5 to 2.5 years. They never considered the possibility of failure: “we had never considered the possibility that we might fail.” (p. 246)

Kahneman was making predictions based on the Inside View and forgot to look at the Base Rate!

Kahneman asked his colleague Seymour how long it had taken other groups to design a new curriculum. Seymour thought about it and said the teams he could recall had taken a minimum of 7 years and a maximum of 10 years, and that about 40% of them had never finished at all. Kahneman then asked how their own team compared to those other teams. Seymour’s answer: a little below average!

Even though Kahneman and Seymour had just surfaced the Base Rate, the team disregarded it and went ahead with the project.

“The new forecast still seemed unreal, because we could not imagine how it could take so long to finish a project that looked so manageable. No crystal ball was available to tell us the strange sequence of unlikely events that were in our future. All we could see was a reasonable plan that should produce a book in about two years, conflicting with statistics indicating that other teams had failed or had taken an absurdly long time to complete their mission.” (p. 246)

“What we had heard was base-rate information, from which we should have inferred a causal story: if so many teams failed, and if those that succeeded took so long, writing a curriculum was surely much harder than we had thought. But such an inference would have conflicted with our direct experience of the good progress we had been making. The statistics that Seymour provided were treated as base rates normally are—noted and promptly set aside.” (p. 247)

It ended up taking them 8 years to finish, and Kahneman had left the team before the project was done.

“We should have quit that day. None of us was willing to invest six more years of work in a project with a 40% chance of failure. Although we must have sensed that persevering was not reasonable, the warning did not provide an immediately compelling reason to quit. After a few minutes of desultory debate, we gathered ourselves together and carried on as if nothing had happened.” (p. 247)

“This embarrassing episode remains one of the most instructive experiences of my professional life. I eventually learned three lessons from it. The first was immediately apparent: I had stumbled onto a distinction between two profoundly different approaches to forecasting, which Amos and I later labeled the inside view and the outside view. The second lesson was that our initial forecasts of about two years for the completion of the project exhibited a planning fallacy. Our estimates were closer to a best-case scenario than to a realistic assessment. I was slower to accept the third lesson, which I call irrational perseverance: the folly we displayed that day in failing to abandon the project. Facing a choice, we gave up rationality rather than give up the enterprise.” (p. 247)

“The inside view is the one that all of us, including Seymour, spontaneously adopted to assess the future of our project. We focused on our specific circumstances and searched for evidence in our own experiences. We had a sketchy plan: we knew how many chapters we were going to write, and we had an idea of how long it had taken us to write the two that we had already done. The more cautious among us probably added a few months to their estimate as a margin of error.”

“Extrapolating was a mistake. We were forecasting based on the information in front of us—WYSIATI—but the chapters we wrote first were probably easier than others, and our commitment to the project was probably then at its peak. But the main problem was that we failed to allow for what Donald Rumsfeld famously called the “unknown unknowns.” There was no way for us to foresee, that day, the succession of events that would cause the project to drag out for so long. The divorces, the illnesses, the crises of coordination with bureaucracies that delayed the work could not be anticipated. Such events not only cause the writing of chapters to slow down, they also produce long periods during which little or no progress is made at all. The same must have been true, of course, for the other teams that Seymour knew about. The members of those teams were also unable to imagine the events that would cause them to spend seven years to finish, or ultimately fail to finish, a project that they evidently had thought was very feasible.” (p. 247)

“The project was my initiative, and it was therefore my responsibility to ensure that it made sense and that major problems were properly discussed by the team, but I failed that test. My problem was no longer the planning fallacy. I was cured of that fallacy as soon as I heard Seymour’s statistical summary. If pressed, I would have said that our earlier estimates had been absurdly optimistic. If pressed further, I would have admitted that we had started the project on faulty premises and that we should at least consider seriously the option of declaring defeat and going home. But nobody pressed me and there was no discussion; we tacitly agreed to go on without an explicit forecast of how long the effort would last. This was easy to do because we had not made such a forecast to begin with. If we had had a reasonable baseline prediction when we started, we would not have gone into it, but we had already invested a great deal of effort—an instance of the sunk-cost fallacy, which we will look at more closely in the next part of the book. It would have been embarrassing for us—especially for me—to give up at that point, and there seemed to be no immediate reason to do so. It is easier to change directions in a crisis, but this was not a crisis, only some new facts about people we did not know. The outside view was much easier to ignore than bad news in our own effort. I can best describe our state as a form of lethargy—an unwillingness to think about what had happened. So we carried on. There was no further attempt at rational planning for the rest of the time I spent as a member of the team—a particularly troubling omission for a team dedicated to teaching rationality. I hope I am wiser today, and I have acquired a habit of looking for the outside view. But it will never be the natural thing to do.” (p. 253)







Other Good Quotes About the Planning Fallacy.


In 2002, a survey of American homeowners who had remodeled their kitchens found that, on average, they had expected the job to cost $18,658; in fact, they ended up paying an average of $38,769. (p. 250)

“Errors in the initial budget are not always innocent. The authors of unrealistic plans are often driven by the desire to get the plan approved—whether by their superiors or by a client—supported by the knowledge that projects are rarely abandoned unfinished merely because of overruns in costs or completion times. In such cases, the greatest responsibility for avoiding the planning fallacy lies with the decision makers who approve the plan. If they do not recognize the need for an outside view, they commit a planning fallacy.” (p. 250)

“A proud emphasis on the uniqueness of cases is also common in medicine, in spite of recent advances in evidence-based medicine that point the other way. Medical statistics and baseline predictions come up with increasing frequency in conversations between patients and physicians. However, the remaining ambivalence about the outside view in the medical profession is expressed in concerns about the impersonality of procedures that are guided by statistics and checklists.” (p. 249)

“Suppose you did not know a thing about this particular legal case, only that it involves a malpractice claim by an individual against a surgeon. What would be your baseline prediction? How many of these cases succeed in court? How many settle? What are the amounts? Is the case we are discussing stronger or weaker than similar claims?” (p. 254)
