From Daniel Kahneman’s “Thinking, Fast and Slow”

Beware of These 18 Biases and Illusions Before You Make a Decision!

  1. Availability Heuristic– If you can easily recall an event, you will think it happens more frequently than it actually does.

    “The process of judging frequency by the ease with which instances come to mind” (p. 129)



  2. Anchoring Effect– If you see a number before making an estimate, that number will pull your judgement and prediction toward it, even when it is irrelevant.

    “However, you should assume that any number that is on the table has had an anchoring effect on you, and if the stakes are high you should mobilize yourself (your System 2) to combat the effect.” (p. 128)




  3. Representativeness Heuristic– If you judge something by how well it matches a stereotypical description and don’t factor in the base rate, you will not make an accurate judgement.



  4. Optimism Bias– People with the optimism bias underestimate the odds they face and don’t invest enough effort to find out what the odds really are.

    “The evidence suggests that an optimistic bias plays a role—sometimes the dominant role—whenever individuals or institutions voluntarily take on significant risks. More often than not, risk takers underestimate the odds they face, and do not invest sufficient effort to find out what the odds are. Because they misread the risks, optimistic entrepreneurs often believe they are prudent, even when they are not.” (p. 256)



  5. Planning Fallacy– When someone plans a project or business, they underestimate the time it will take and overestimate the probability of success, because they fail to factor in the base rate of how similar projects have actually turned out.

    “When forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities.” (p. 252)



  6. Base Rate Neglect– When you predict something, you ignore the base rate (how common the outcome is in the relevant population) and rely only on the specifics of the case in front of you. A worked example follows the quote below.

    “Some people ignore base rates because they believe them to be irrelevant in the presence of individual information. Others make the same mistake because they are not focused on the task.” (p. 153)
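
    A worked illustration of the difference the base rate makes, using the cab problem Kahneman discusses (85% of cabs are Green, 15% are Blue, and a witness who says “Blue” is right 80% of the time). The little script below is a sketch of Bayes’ rule, not anything taken from the book itself.

    ```python
    # Base rate neglect: without the base rate you might trust the witness at 80%.
    # With the base rate, Bayes' rule says the cab is Blue only about 41% of the time.
    base_rate_blue = 0.15      # prior: fraction of cabs that are Blue
    witness_accuracy = 0.80    # the witness identifies the colour correctly 80% of the time

    # P(witness says "Blue") = correct "Blue" calls + mistaken "Blue" calls
    p_says_blue = (base_rate_blue * witness_accuracy
                   + (1 - base_rate_blue) * (1 - witness_accuracy))

    # Bayes' rule: P(cab is Blue | witness says "Blue")
    p_blue_given_says_blue = base_rate_blue * witness_accuracy / p_says_blue

    print(f"Ignoring the base rate: {witness_accuracy:.0%}")
    print(f"Factoring in the base rate: {p_blue_given_says_blue:.0%}")   # roughly 41%
    ```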




  7. Hindsight Bias– If you try to remember your past beliefs and predictions, you won’t be able to remember them accurately.

    “Your inability to reconstruct past beliefs will inevitably cause you to underestimate the extent to which you were surprised by past events.” (p. 202)

    “The tendency to revise the history of one’s beliefs in light of what actually happened produces a robust cognitive illusion.” (p. 203)



  8. Outcome Bias– When you see a good outcome, you assume it was a good decision. This might be false: the good outcome might have been due to luck, even though the decision making was bad.

    When you see a bad outcome, you assume it was a bad decision. This might also be false: the bad outcome might have been due to bad luck, even though the decision was good.

    “Let’s not fall for the outcome bias. This was a stupid decision even though it worked out well.” (p. 208)



  9. Affect Heuristic– “People make judgments and decisions by consulting their emotions;” (p. 138)

    “People let their likes and dislikes determine their beliefs about the world” (p. 103)



  10. Narrative Fallacy– You hear a story or explanation of something and it seems to make a lot of sense. You then use this narrative as evidence to predict what will happen in the future, even though the narrative may be flawed or simply false.

    “Taleb introduced the notion of a narrative fallacy to describe how flawed stories of the past shape our views of the world and our expectations for the future.”



  11. Illusion of Validity– When you believe something is true because of a feeling of confidence, not because the evidence actually supports it.

    “Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.” (p. 212)



  12. Regression to the Mean– If something does unusually well and then does worse, you think there must be an explanation for it. If something does unusually badly and then improves, you also think there must be an explanation. Often there is none: it is simply regression to the mean. An extreme result is partly luck, and luck does not persist, so the next result will probably be closer to average, in either direction. A small simulation follows below.
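
    A minimal simulation (my own illustration, not from the book) of the mechanism: when observed performance is skill plus luck, the people who score highest in one round score closer to average in the next, with no causal story required.

    ```python
    # Regression to the mean: observed score = skill + luck.
    # The round-1 top performers fall back toward the mean in round 2,
    # simply because their good luck does not repeat. Illustrative only.
    import random

    random.seed(0)
    n = 10_000
    skill = [random.gauss(0, 1) for _ in range(n)]
    round1 = [s + random.gauss(0, 1) for s in skill]   # skill + luck
    round2 = [s + random.gauss(0, 1) for s in skill]   # same skill, fresh luck

    # Take the top 10% of round-1 scores and re-examine the same people in round 2.
    cutoff = sorted(round1, reverse=True)[n // 10]
    top = [i for i in range(n) if round1[i] >= cutoff]

    mean = lambda xs: sum(xs) / len(xs)
    print(f"Top group, round 1: {mean([round1[i] for i in top]):.2f}")
    print(f"Same group, round 2: {mean([round2[i] for i in top]):.2f}")   # noticeably lower
    ```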



  13. Halo Effect– If you like someone, you will think everything about them is great, even though in reality you don’t know them that well. If you dislike someone, you will think everything about them is bad, even if you don’t know them that well.

    “The tendency to like (or dislike) everything about a person – including things you have not observed – is known as the halo effect.” (p. 81)



  14. Sunk Cost Fallacy– When you have put money, time, and effort into a project, you feel the need to continue with it because of what you have already invested. This is an illusion: the money, time, and effort are in the past and shouldn’t affect your future decisions!

    “The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy, a costly mistake that is observed in decisions large and small. Driving into a blizzard because one paid for tickets is a sunk-cost error.” (p. 354)



  15. Competition Neglect– You focus on what you know and what you plan to do, while neglecting what you don’t know, including what your competitors are doing.

    “Even when they are not sure they will succeed, these bold people think their fate is almost entirely in their own hands. They are surely wrong: the outcome of a start-up depends as much on the achievements of its competitors and on changes in the market as on its own efforts.” (p. 260)



  16. Conjunction Fallacy– The fallacy people “commit when they judge a conjunction of two events to be more probable than one of the events in direct comparison.” (p. 158)

    Example: Which is Linda more likely to be: a bank teller, or a bank teller who is a feminist? The second option can never be more probable than the first, because it is a subset of the first (see the sketch below).
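
    The rule behind the Linda problem is just that a conjunction of two events can never be more probable than either event on its own. A tiny sketch with made-up numbers (my own, not the book’s):

    ```python
    # Conjunction rule: P(A and B) <= P(A), because "A and B" is a subset of A.
    # The population counts below are invented purely for illustration.
    population = 1000
    bank_tellers = 50              # people who are bank tellers
    feminist_bank_tellers = 10     # bank tellers who are also feminists (a subset)

    p_teller = bank_tellers / population                    # 0.05
    p_feminist_teller = feminist_bank_tellers / population  # 0.01

    # Holds because feminist bank tellers are a subset of bank tellers.
    assert p_feminist_teller <= p_teller
    print(f"P(bank teller) = {p_teller:.2f}")
    print(f"P(feminist bank teller) = {p_feminist_teller:.2f}")
    ```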



  17. Law of Small Numbers– If you see a statistic, check that it is based on a large enough sample. Small samples yield extreme results more often; large samples are more precise. A short simulation follows the quote below.

    “Large samples are more precise than small samples. Small samples yield extreme results more often than large samples do.” (p. 111)
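
    A quick simulation (my own sketch, not from the book) of why small samples look extreme: flip the same fair coin in batches of different sizes and count how often the observed share of heads lands far from 50%.

    ```python
    # Law of small numbers: the same fair coin, sampled in small vs. large batches.
    # Small batches produce "extreme" head-rates (>= 60% or <= 40%) far more often.
    import random

    random.seed(1)

    def extreme_share(sample_size, trials=2_000):
        """Fraction of samples whose heads-rate is at least 60% or at most 40%."""
        extreme = 0
        for _ in range(trials):
            heads = sum(random.random() < 0.5 for _ in range(sample_size))
            rate = heads / sample_size
            if rate >= 0.6 or rate <= 0.4:
                extreme += 1
        return extreme / trials

    for n in (10, 100, 1000):
        print(f"sample size {n:>4}: extreme result in {extreme_share(n):.1%} of samples")
    ```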



  18. Loss Aversion– Losses feel roughly twice as painful as gains feel good. For example, losing $100 feels worse than winning $150 feels good (a sketch follows the quote below).

    “He weighs losses about twice as much as gains, which is normal” (p. 288)
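
    A sketch of the idea in code. The roughly 2-to-1 weighting comes from the quote above; the simple linear value function itself is my own simplification for illustration, not Kahneman and Tversky’s exact prospect-theory formula.

    ```python
    # Loss aversion: losses are weighted about twice as heavily as gains.
    LOSS_AVERSION = 2.0

    def felt_value(amount):
        """Subjective value of a gain (amount > 0) or a loss (amount < 0)."""
        return amount if amount >= 0 else LOSS_AVERSION * amount

    # A coin flip: win $150 or lose $100. The expected dollar value is positive,
    # but the felt value is negative, which is why many people refuse the bet.
    expected_dollars = 0.5 * 150 + 0.5 * (-100)                        # +25
    expected_feeling = 0.5 * felt_value(150) + 0.5 * felt_value(-100)  # -25
    print(expected_dollars, expected_feeling)
    ```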
