From Max Bazerman’s “Judgment in Managerial Decision Making”

Beware of Overconfidence, the Mother of All Biases!

  • “Overconfidence. We lead with an exploration of this bias for two reasons. First, it is one of the most potent and pervasive biases to which human judgment is vulnerable. Second, it facilitates many of the other biases discussed in this book. Without it, we would be better able to acknowledge our own shortcomings and correct our other biases.”

  • “Griffin and Varey (1996) write that ‘overconfidence is not only marked but nearly universal.’ The bias is ‘the most robust finding in the psychology of judgment,’ according to DeBondt and Thaler (1995).”

  • “Overconfidence has been blamed for wars, stock market bubbles, strikes, unnecessary lawsuits, high rates of entrepreneurial bankruptcy, and the failure of corporate mergers and acquisitions. It could also explain the excessively high rate of trading in the stock market, despite the costs, argues Odean (1998). According to Camerer and Lovallo (1999), overconfidence may be the culprit behind high rates of entrepreneurial entry, which occur even though most new businesses go bankrupt within a few short years, having burned through the money provided by backers, investors, and founders. Overconfidence could explain the high rates of corporate mergers and acquisitions, despite the fact that such ventures so often fail, write Malmendier and Tate (2005). Plous (1993) suggests that overconfidence contributed to the nuclear accident at Chernobyl and to the explosion of the Space Shuttle Challenger.”

  • “Overconfidence has been studied in three basic ways: in terms of overprecision, overestimation, and overplacement. Overprecision describes the tendency to be too sure our judgments and decisions are accurate, uninterested in testing our assumptions, and dismissive of evidence suggesting we might be wrong. It leads us to draw overly narrow confidence intervals and to be too certain that we know the truth.”

  • “Overestimation is the common tendency to think we’re better, smarter, faster, more capable, more attractive, or more popular (and so on) than we actually are. As a consequence, we overestimate how much we will accomplish in a limited amount of time or believe we have more control than we actually do.”

  • “If we could specify how uncertain people should be and could systematically vary the consequences of over- or underestimation, then we could test whether people shift their behavior as much as they should. Mannes and Moore (2012) did exactly that. The evidence corroborated what many others had already learned about confidence intervals: people act as if they are sure they know the truth. They draw their bull’s-eyes too small, make their confidence intervals too narrow, and don’t shift their actions as much as they should in the face of uncertainty. The consequence is that, too often, we tumble off a cliff we were too sure we were clear of. We miss our flights and we bounce checks, in part because we underestimate the uncertainties involved.”
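The claim that people “draw their bull’s-eyes too small” can be made concrete with a small simulation (this is an illustrative sketch, not an experiment from the book): a well-calibrated 90% interval for a standard normal quantity spans about ±1.645 standard deviations, while an overprecise judge who halves that width captures the truth far less often than the 90% they claim.

```python
import random

random.seed(0)

def hit_rate(half_width_sd: float, trials: int = 100_000) -> float:
    """Fraction of trials in which a symmetric interval of the given
    half-width (in standard deviations) captures a N(0, 1) outcome."""
    hits = 0
    for _ in range(trials):
        outcome = random.gauss(0.0, 1.0)
        if abs(outcome) <= half_width_sd:
            hits += 1
    return hits / trials

# A well-calibrated 90% interval spans roughly +/-1.645 standard deviations.
calibrated = hit_rate(1.645)

# An overprecise judge draws an interval half as wide as warranted.
overprecise = hit_rate(1.645 / 2)

print(f"calibrated interval hit rate:  {calibrated:.2f}")   # close to 0.90
print(f"overprecise interval hit rate: {overprecise:.2f}")  # closer to 0.59
```

The gap between the stated 90% and the realized hit rate is exactly the pattern Mannes and Moore describe: intervals that are too narrow for the uncertainty actually faced.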

  • “Interventions that force people to think about alternative perspectives, interpretations, or hypotheses are often effective at shaking their overconfidence and inducing greater realism (Koriat, Lichtenstein, & Fischhoff, 1980). In other words, thinking about why you might be wrong can help correct for the influence of the confirmation bias on your confidence judgments. In fact, Don Moore (one of the authors) and his colleagues have found in their research that simply asking people to explicitly consider the likelihood of alternative outcomes to the one they’ve proposed increases the accuracy of their judgments (Haran, Moore, & Morewedge, 2010).”

  • “Overprecision makes us too sure of our judgments, such that we are often in error yet rarely in doubt. Our assurance makes us too reluctant to take advice from others, suspicious of those whose views differ from our own, too quick to act on our opinions, and too slow to update our erroneous beliefs. Research on advice giving and advice taking helps us to understand how people learn from others and when they are open to receiving others’ wisdom. The single most important and robust finding from this substantial literature is that, reluctant to revise our opinions, we tend to ignore feedback from others on the problems we face (Yaniv & Kleinberger, 2000).”

  • “Ross and Ward (1996) use the term naive realism to describe the widespread belief that the way we see the world is the only sensible view. For most of us, the naive view that our perspective is the only legitimate one is the default. Considering others’ perspectives takes energy and attention because it requires us to move from the comfortable familiarity of how we are used to seeing things (Epley, Keysar, Van Boven, & Gilovich, 2004) to the unfamiliar vantage point of an outside view. As we discuss in Chapter 11, an egocentric viewpoint can be a significant impediment to mutual understanding and agreement in negotiation. If we assume that those who see things differently are either stupid (for not seeing the facts right before their eyes) or evil (for seeing the truth but misrepresenting it for their own nefarious ends), we will be unwilling to consider other perspectives and find common ground. The result can be dysfunctional conflict and unnecessary divorces, lawsuits, strikes, and wars (Johnson, 2004).”

  • “Managers’ faith in their own judgment also leads them astray in the context of hiring decisions. Hiring and promotion decisions are probably among the most important decisions any organization makes. Recognizing this importance, managers generally give them a great deal of time, attention, and care—which usually means spending a lot of time interviewing candidates. The more important the position, the more interviews a candidate has. Unfortunately, many decades of study and hundreds of published research findings attest to the difficulty of accurately predicting work performance (Schmidt & Hunter, 1998). Moreover, the traditional face-to-face job interview is low on the list of useful tools we have for helping us predict how someone will perform on the job. Other tools, including the simple IQ test, are cheaper to administer, less biased, and better predictors of job performance. Nevertheless, managers stubbornly cling to the notion that even if others cannot predict how someone will perform based on interviews, they themselves are solid judges of character (Highhouse, 2008). Reluctant to acknowledge the true uncertainties associated with personnel selection, managers make overly precise forecasts of candidates’ potential. In the process, they waste time and effort conducting interviews that aren’t predictive of job performance.”

  • “If we could only accept our personal vulnerability to bias, we could better anticipate our biases, correct them, and avoid the errors they cause. Unfortunately, people tend to be resistant to the idea that their views are biased (Pronin, Lin, & Ross, 2002). While we are often ready to acknowledge general imperfections in human judgment, and especially those of others, we are remarkably reluctant to acknowledge that any particular judgment of ours has been tarnished by bias. As we discuss in Chapter 7, our blindness to our own biases can be particularly problematic in ethical domains, making us unwilling to accept correction, guidance, or regulation.”

  • “We also tend to overestimate our own performance, abilities, or talents, a bias sometimes referred to as self-enhancement (Sedikides & Gregg, 2008). There is also evidence that we evaluate ourselves more positively on more desirable traits (Alicke, 1985) than on less desirable traits.”

  • “The illusion of control. Sometimes people think they have more control over circumstances than they actually do, a phenomenon known as the illusion of control (S. C. Thompson, 1999). In particular, when people have very little control, they tend to overestimate how much control they do have. Superficial signs of control, such as the chance to pick their own lottery ticket numbers, are enough to lead people to believe that they can exert control over uncontrollable events (Langer, 1975). We also cling to superstitious beliefs about performance and competitive success.”

  • “The planning fallacy. The planning fallacy describes the common tendency to overestimate the speed at which we will complete projects and tasks (Buehler, Griffin, & Ross, 1994). Using data from many major infrastructure projects in different countries, Flyvbjerg (2003) highlights the dramatic tendency to underestimate the cost and duration of construction projects, including roads, bridges, tunnels, and buildings.”

  • “Entrepreneurs who believe they are more capable than their potential competitors will choose to enter new markets and compete even when their objective chances of success are not particularly good (Astebro, Jeffrey, & Adomdza, 2007; Koellinger, Minniti, & Schade, 2007). Many people wind up throwing away their life savings on business ideas that fail. Believing that they are better than other managers, many pursue mergers and acquisitions that wind up costing their shareholders handsomely (Malmendier & Tate, 2008). Indeed, evidence suggests, the majority of mergers fail (Ravenscraft & Scherer, 1989).”

  • “Yet we know of no study that has shown that positive illusions or, more broadly, overconfidence leads to better decisions. In contrast, much evidence is in the other direction. We are highly dubious of the overall benefits of overconfidence in general and positive illusions in particular. Our cynicism is shared by a number of scholars, who caution that positive illusions are likely to have a negative impact on learning and on the quality of decision making, personnel decisions, and responses to crises (such as the argument that ‘global warming isn’t that bad’).”

  • “People who are overconfident about their abilities, their traits, and their future are often sure that these opinions are accurate. Being too sure that you will succeed—whether at mountain climbing, taking your company public, or getting a good grade—can set you up for failure. And while displaying confidence in your knowledge and abilities will give you credibility as a leader, that confidence can backfire if it turns out that you were wrong.”

  • “Summing up the evidence from the overconfidence literature, we argue that, when making decisions, you should strive to be well calibrated. That is, you should try to match your private beliefs to reality. This basic prescription is surprisingly difficult to achieve. As the coming chapters will document, we all see the world through our own unique perspective, which includes simplifying assumptions, coping mechanisms, and biases that operate in ways that we often misunderstand.”
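Being “well calibrated” has a simple operational meaning: among all the times you say you are 90% sure, you should be right about 90% of the time. The sketch below (my own illustration, using made-up forecast data rather than anything from the book) shows how to check this by grouping forecasts by stated confidence and comparing each group’s claimed probability with its observed accuracy.

```python
from collections import defaultdict

# Hypothetical forecast log: (stated probability, whether the event occurred).
forecasts = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, False), (0.9, True),
    (0.6, True), (0.6, False), (0.6, True), (0.6, False), (0.6, False),
]

def calibration_table(records):
    """Group forecasts by stated probability and compute each group's
    observed frequency of being right; a calibrated judge's observed
    frequency matches the stated probability."""
    buckets = defaultdict(list)
    for prob, happened in records:
        buckets[prob].append(happened)
    return {
        prob: sum(outcomes) / len(outcomes)
        for prob, outcomes in sorted(buckets.items())
    }

table = calibration_table(forecasts)
for stated, observed in table.items():
    gap = stated - observed
    print(f"stated {stated:.0%} -> observed {observed:.0%} (gap {gap:+.0%})")
```

In this toy log the judge is right only 60% of the time when claiming 90% confidence, so the positive gap flags overconfidence; a well-calibrated record would show gaps near zero in every bucket.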