Olivier Sibony’s “You’re About to Make a Terrible Mistake!”

Favorite Quotes From “You’re About to Make a Terrible Mistake!”

“Even if you are a competent, careful, and hardworking executive, you might end up making avoidable, predictable mistakes. This is precisely the mysterious problem of bad decisions by good leaders that we discussed above. Except it is not “them”—it’s you. And it is not mysterious—it is behavioral.” (p. 13)

“Contrary to much of the advice that you may have read on the topic, you will generally not be able to overcome your own biases… “If we’re so stupid, how did we get to the moon?” The answer, of course, is that “we,” individual humans, did not land on the moon. A large and sophisticated organization, NASA, did. We have cognitive limitations that we may not be able to overcome, but organizations can make up for our shortcomings. They can produce choices that are less biased and more rational than our individual decisions would be.” (p. 14)

“Collaboration is needed because many people are more likely to detect biases than a lonely decision maker is. Good process is required to act on their insights.” (p. 16)

“A wise leader, therefore, does not see herself as someone who simply makes sound decisions; because she realizes she can never, on her own, be an optimal decision maker, she views herself as a decision architect in charge of designing her organization’s decision-making processes.” (p. 16)

“how could widely admired decision makers, surrounded by carefully selected teams, heading time-tested organizations, have fallen into traps that seem very crude to us? The simple answer is that when we are in the grip of a great story, confirmation bias can become irresistible. As we will see, the same reasoning applies to the biases we will discover in the coming chapters.” (p. 30)

“Storytelling makes us construct a coherent story from a selection of facts. But this is never the only possible story, and it can lead us into error.” (p. 35)

“However, in our quest for models, we too often commit three mistakes. First, we attribute all of a company’s success to a single person. Then, we see all the aspects of this person’s behavior as reasons for his or her success. Finally, we’re too quick to think we should imitate the model.” (p. 37)

“While it’s certain that Jobs played a decisive role in its history, it’s also fair to say that many of Apple’s 60,000 employees (as of 2011, the year Jobs died) contributed in some way. Apple’s continuing performance after Steve Jobs’s death confirms this.” (p. 37)

“our first impulse is to attribute success (or failure) to individuals, to their choices, to their personality, but not to the circumstances. This is our first mistake: the attribution error.” (p. 40)

“Better yet, why shouldn’t we study worst practices? After all, everyone agrees that we learn from our mistakes even more than from our successes. Studying companies that collapsed may hold more lessons than focusing on those that succeed. Learning from their mistakes might be a good way to avoid making them ourselves.” (p. 46)

“An HR manager who has hired hundreds of people for the same entry-level position over the years and who has tracked the hires’ subsequent job performance may have developed good intuitive judgment. But this situation is the exception, not the rule.” (p. 54)

“When we make a plan, we don’t necessarily imagine all the reasons it could fail. We overlook the fact that success requires the alignment of many favorable circumstances, while a single glitch can derail everything. Above all, we focus on our plan as seen “from the inside”; that is, we don’t consider the universe of similar projects that took place in the past, ask ourselves what schedules and budgets they anticipated and whether they experienced delays and overruns.” (p. 66)

“Of course, any prediction is uncertain. This is why it is not enough to make predictions: we must also have an idea of the level of confidence that we can place in them. This is particularly important when we make quantitative predictions. In principle, a good practice would be to use a “confidence interval”: if, for instance, we want to be 90 percent confident in our forecast, we will offer not a point prediction but a range, chosen so that we are 90 percent sure the real outcome will fall within it.” (p. 67)
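A minimal sketch (mine, not the book’s) of what such an interval forecast might look like, assuming forecast error is roughly normally distributed; the revenue figures and standard deviation are made up for illustration.

```python
from statistics import NormalDist

# Hypothetical point forecast for next year's revenue (in $M) and an
# assumed standard deviation of our forecast error -- both invented.
point_forecast = 120.0
forecast_std = 15.0

# For a 90% interval, take the 5th and 95th percentiles of the assumed
# error distribution, so that 90% of outcomes should fall between them.
z = NormalDist().inv_cdf(0.95)  # roughly 1.645
low = point_forecast - z * forecast_std
high = point_forecast + z * forecast_std

print(f"Point forecast: {point_forecast:.0f}")
print(f"90% confidence interval: [{low:.0f}, {high:.0f}]")
```

Offering the range rather than the single number is the point: if the real outcomes land outside your 90 percent intervals much more than 10 percent of the time, your forecasts are overconfident.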

“Our overconfidence in our own abilities, our exaggerated faith in our predictions, and organizational pressure to appear self-confident all produce another widespread problem: underestimating our competitors. “Underestimating” is actually an understatement, because usually we simply ignore them, failing to take their behavior and reactions into account at all.” (p. 68)

“The simple fact we so conveniently forget is that, at the very moment when we are presenting a project intended to beat our competitors, those same competitors are coming up with a plan to do the exact same thing to us.” (p. 69)

“Reducing or even eliminating quarterly earnings guidance won’t, by itself, eliminate all short-term performance pressures that U.S. public companies currently face, but it would be a step in the right direction.” (p. 114)

“Don’t we frequently see political leaders put off essential reforms for fear of the immediate reactions of public opinion? Simply put, is anyone ever criticized for their excessive long-term thinking? If the answer to these questions is obvious, it’s because, once again, our search for scapegoats leads us astray. Executives are certainly too focused on the short term. But so are we all.” (p. 115)

“If the first person to speak is in favor of the investment, then the second person will take this into consideration. Had she spoken first, she might have expressed some doubts. But hearing her colleague speak has made her more confident. She is now a bit more likely to approve of the plan without any reservations. Then comes the third person’s turn. As he considers the favorable opinion shared by two of his colleagues, he also becomes more likely to vote for the plan. And so on: in a completely rational fashion, each person will adjust his or her judgment in order to take into account previously expressed opinions. This is an information cascade.” (p. 129)
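A toy simulation of that dynamic, in my own sketch rather than anything from the book: each committee member blends a private read of the plan with the average of the opinions already voiced, and the weights and signal values are arbitrary assumptions chosen only to show the cascade.

```python
def committee_votes(private_signals, weight_on_others=0.6):
    """Toy information-cascade model: each member mixes their private
    read of the plan (0 = reject, 1 = approve) with the average of the
    opinions already expressed before their turn."""
    voiced = []  # opinions expressed so far, in speaking order
    votes = []
    for signal in private_signals:
        if voiced:
            prior = sum(voiced) / len(voiced)
            opinion = (1 - weight_on_others) * signal + weight_on_others * prior
        else:
            opinion = signal  # the first speaker has only her own signal
        voiced.append(opinion)
        votes.append(opinion > 0.5)  # vote "yes" if leaning favorable
    return votes

# Only the first speaker is genuinely enthusiastic; the other two are
# mildly skeptical, yet hearing the earlier opinions tips them to "yes".
print(committee_votes([0.9, 0.45, 0.40]))  # -> [True, True, True]
```

Reverse the speaking order and the same three private opinions can produce a very different vote, which is exactly why the cascade is about process, not about anyone being irrational.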

“Any executive knows she should expect colleagues to be self-interested to some degree. She knows she should ask herself at all times what the person sitting across from her is after. And she often prides herself on her ability not to be fooled by self-serving or self-promoting arguments. For experienced executives in large organizations, this precaution has become second nature.” (p. 139)

“Yet many researchers now hold a very different view. They believe that we are often unable to resist the influence of financial incentives, even if we sincerely intend to do so.” (p. 141)

“As a rule, nothing suggests that auditors are intentionally falsifying accounts to please their clients or that doctors intentionally mislead their patients. The problem, most of the time, is that these people are completely sincere. The phrase bounded ethicality, analogous to the bounded rationality of economics, describes, in the words of Max Bazerman and Don Moore, “the cognitive biases that lead honorable people to engage in unethical behavior without realizing that they are doing so.” This is often called self-serving bias.” (p. 142)

“Psychologists can identify a specific bias in the lab by controlling for all the other factors that may influence their subjects. In real life, however, it is rare to find a situation in which one bias is the sole cause of an error. However tempting it may be to look for a single “root cause,” when we fall into one of the traps we discussed in part 1, it’s usually the work of a combination of mutually reinforcing biases.” (p. 157)

“This is the key difference between a bias and a simple mistake. We all know what a mistake is; we are usually able to recognize when we’ve made a mistake and avoid making the same one twice. But we are almost never aware of bias in ourselves: on the contrary, we feel unchallenged, comfortable, confident in our reasoning.” (p. 162)

“The bottom line: obsessing over our own biases and how to reduce them is a waste of time. The way to improve decisions in an organization is to improve the decision-making practices of the organization.” (p. 169)

“A smart leader facing a strategic choice will rely on her team, ask experts, consult the board of directors, and speak to advisors. She knows that she cannot correct her own biases, but she trusts others to see them quite clearly and help her avoid mistakes. Even if she is the one making the final call, she will never be alone in the decision-making process. By conceding defeat in the individual battle against her own biases, she improves her chances of winning the collective war against bad decisions.” (p. 169)

“When, like the Apollo 13 astronauts, we know that “failure is not an option,” we count, of course, on talented individuals—but we also rely on teamwork and on carefully designed methods.” (p. 179)

“But, above all, space explorers depend on a rigorously standardized process. As Clervoy explains, ‘For each type of emergency situation—fire, air leak, exposure to toxic products—and for less serious incidents, there is a checklist that we follow meticulously.’” (p. 180)

“Instead, astronauts are encouraged to discuss any doubts and report any mishap. They are thanked for doing so and debriefed so that lessons can be drawn for training the next crew or refining the checklists. If astronauts make sound, lifesaving decisions, they owe it to process, not improvisation, and to collaboration, not individual genius.” (p. 181)

“So Gawande asked the same surgeons a different question: if you were the patient and you were about to have an operation, would you like the checklist used? The answer: 93 percent of surgeons would insist upon it. Clearly, when you are the patient, failure is not an option.” (p. 183)

“This result is so surprising that it bears repeating. If we leave aside the factors in an investment decision that we cannot control, “collaboration plus process” counts more than analysis—six times more. The way we make the decision—the “how”—is six times more important than the contents, the “what”!” (p. 195)
