From Annie Duke’s “Thinking in Bets”

Here are the key takeaways from Annie Duke’s “Thinking in Bets.”

Beware of outcome bias or “resulting”

  • At one such tournament, I told the audience that one player would win 76% of the time and the other would win 24% of the time. I dealt the remaining cards, the last of which turned the 24% hand into the winner. Amid the cheers and groans, someone in the audience called out, “Annie, you were wrong!” In the same spirit that he said it, I explained that I wasn’t. “I said that would happen 24% of the time. That’s not zero. You got to see part of the 24%!” (p. 30)

  • “It would be absurd for me, after making a big bet on the best possible starting hand (a pair of aces) and losing, to spend a lot of time thinking that I was wrong to make the decision to play the hand in the first place. That would be resulting.” (p. 33)

  • “Carroll got unlucky. He had control over the quality of the play-call decision, but not over how it turned out. It was exactly because he didn’t get a favorable result that he took the heat. He called a play that had a high percentage of ending in a game-winning touchdown or an incomplete pass (which would have allowed two more plays for the Seahawks to hand off the ball to Marshawn Lynch). He made a good-quality decision that got a bad result.” (p. 7)

  • “It sounded like a bad result, not a bad decision. The imperfect relationship between results and decision quality devastated the CEO and adversely affected subsequent decisions regarding the company. The CEO had identified the decision as a mistake solely because it didn’t work out.”

    “Yet this is exactly what happened to that CEO. He changed his behavior based on the quality of the result rather than the quality of the decision-making process.” (p. 10)
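Duke’s 76%/24% hand can be sketched numerically. Below is a minimal simulation (the probabilities come from her anecdote; the function name and everything else are illustrative): even a clearly good bet loses close to a quarter of the time, so a single bad outcome says little about decision quality.

```python
import random

def loss_rate(win_prob, trials, seed=42):
    """Fraction of trials in which a bet with the given win probability loses."""
    rng = random.Random(seed)
    losses = sum(rng.random() >= win_prob for _ in range(trials))
    return losses / trials

# Duke's 76% favorite still loses roughly 24% of the time,
# so "resulting" from one loss misjudges a good decision.
print(loss_rate(0.76, 100_000))
```

Run enough trials and the loss rate settles near 0.24, which is exactly Duke’s point: the audience member got to “see part of the 24%.”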

Life is like poker, NOT chess.

  • “The decisions we make in our lives—in business, saving and spending, health and lifestyle choices, raising our children, and relationships—easily fit von Neumann’s definition of “real games.” They involve uncertainty, risk, and occasional deception, prominent elements in poker. Trouble follows when we treat life decisions as if they were chess decisions.” (p. 20)

  • “In chess, outcomes correlate more tightly with decision quality.

    In poker, it is much easier to get lucky and win, or get unlucky and lose.”

  • “If life were like chess, nearly every time you ran a red light you would get in an accident (or at least receive a ticket). If life were like chess, the Seahawks would win the Super Bowl every time Pete Carroll called that pass play.

    But life is more like poker. You could make the smartest, most careful decision in firing a company president and still have it blow up in your face. You could run a red light and get through the intersection safely—or follow all the traffic rules and signals and end up in an accident. You could teach someone the rules of poker in five minutes, put them at a table with a world champion player, deal a hand (or several), and the novice could beat the champion. That could never happen in chess.” (p. 22)

To check if you made a good decision, look at the QUALITY of your decision, not the outcome of the decision.

  • “What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge.” (p. 27)

  • “Outcomes don’t tell us what’s our fault and what isn’t, what we should take credit for and what we shouldn’t. Unlike in chess, we can’t simply work backward from the quality of the outcome to determine the quality of our beliefs or decisions. This makes learning from outcomes a pretty haphazard process. A negative outcome could be a signal to go in and examine our decision-making. That outcome could also be due to bad luck, unrelated to our decision, in which case treating that outcome as a signal to change future decisions would be a mistake. A good outcome could signal that we made a good decision. It could also mean that we got lucky, in which case we would be making a mistake to use that outcome as a signal to repeat that decision in the future.” (p. 86)

We are really bad at truthseeking, changing our minds and updating our beliefs.

  • “Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. 

    We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.” (p.55)


  • “if we were good at updating our beliefs based on new information, our haphazard belief-formation process might cause relatively few problems. Sadly, this is not the way it works. We form beliefs without vetting most of them, and maintain them even after receiving clear, corrective information.” (p. 55)

  • “Whether it is a football game, a protest, or just about anything else, our pre-existing beliefs influence the way we experience the world. That those beliefs aren’t formed in a particularly orderly way leads to all sorts of mischief in our decision-making.” (p. 59)

  • “As with many of our irrationalities, how we form beliefs was shaped by the evolutionary push toward efficiency rather than accuracy.” (p. 51)

Beware of confirmation bias.

  • “Information that disagrees with us is an assault on our self-narrative. We’ll work hard to swat that threat away. On the flip side, when additional information agrees with us, we effortlessly embrace it.”

  • “Flaws in forming and updating beliefs have the potential to snowball. Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief.” (p. 59)

  • “Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek.” (p. 64)

Being smart makes confirmation bias worse.

  • “Surprisingly, being smart can actually make bias worse. Let me give you a different intuitive frame: the smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalizing and framing the data to fit your argument or point of view. After all, people in the “spin room” in a political setting are generally pretty smart for a reason.” (p. 62)

  • “Yes, we all have a blind spot about recognizing our biases. The surprise is that blind-spot bias is greater the smarter you are. The researchers tested subjects for seven cognitive biases and found that cognitive ability did not attenuate the blind spot. ‘Furthermore, people who were aware of their own biases were not better able to overcome them.’” (p. 62)

  • “Scientists, overwhelmingly trained and chartered toward truthseeking, aren’t immune. As the authors of the BBS paper recognized, “Even research communities of highly intelligent and well-meaning individuals can fall prey to confirmation bias, as IQ is positively correlated with the number of reasons people find to support their own side in an argument.” That’s how robust these biases are. We see that even judges and scientists succumb to these biases. We shouldn’t feel bad, whatever our situation, about admitting that we also need help.” (p. 147)

To help you think more clearly, treat decisions as if you are making a bet.

  • “By treating decisions as bets, poker players explicitly recognize that they are deciding on alternative futures, each with benefits and risks. They also recognize there are no simple answers. Some things are unknown or unknowable. The promise of this book is that if we follow the example of poker players by making explicit that our decisions are bets, we can make better decisions and anticipate (and take protective measures) when irrationality is likely to keep us from acting in our best interest.” (p. 43)

  • “No matter how far we get from the familiarity of betting at a poker table or in a casino, our decisions are always bets. We routinely decide among alternatives, put resources at risk, assess the likelihood of different outcomes, and consider what it is that we value. Every decision commits us to some course of action that, by definition, eliminates acting on other alternatives.” (p. 44)

  • “Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.” (p. 4)

  • ““Wanna bet?” triggers us to engage in that third step that we only sometimes get to. Being asked if we are willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs. The more objective we are, the more accurate our beliefs become. And the person who wins bets over the long run is the one with the more accurate beliefs.” (p. 66)

  • “Expecting everyone starting to throw the gauntlet down, challenging each other to bet on any opinion, is impractical if you aren’t hanging out in a poker room. (Even in poker rooms, this generally happens only among players who know each other well.) I imagine that if you went around challenging everyone with “Wanna bet?” it would be difficult to make friends and you’d lose the ones you have. But that doesn’t mean we can’t change the framework for ourselves in the way we think about our decisions. We can train ourselves to view the world through the lens of “Wanna bet?” Once we start doing that, we are more likely to recognize that there is always a degree of uncertainty, that we are generally less sure than we thought we were, that practically nothing is black and white, 0% or 100%. And that’s a pretty good philosophy for living.” (p. 66)

When you express a belief, give a probability.

  • “We would be better served as communicators and decision-makers if we thought less about whether we are confident in our beliefs and more about how confident we are. Instead of thinking of confidence as all-or-nothing (“I’m confident” or “I’m not confident”), our expression of our confidence would then capture all the shades of grey in between.” (p. 68)

  • “When we express our beliefs (to others or just to ourselves as part of our internal decision-making dialogue), they don’t generally come with qualifications. What if, in addition to expressing what we believe, we also rated our level of confidence about the accuracy of our belief on a scale of zero to ten? Zero would mean we are certain a belief is not true. Ten would mean we are certain that our belief is true. A zero-to-ten scale translates directly to percentages. If you think the belief rates a three, that means you are 30% sure the belief is accurate. A nine means you are 90% sure.” (p. 68)

  • “So instead of saying to ourselves, “Citizen Kane won the Oscar for best picture,” we would say, “I think Citizen Kane won the Oscar for best picture but I’m only a six on that.” Or “I’m 60% that Citizen Kane won the Oscar for best picture.” That means your level of certainty is such that 40% of the time it will turn out that Citizen Kane did not win the best-picture Oscar.” (p. 68)

  • “Forcing ourselves to express how sure we are of our beliefs brings to plain sight the probabilistic nature of those beliefs, that what we believe is almost never 100% or 0% accurate but, rather, somewhere in between.” (p. 68)

  • “In a similar vein, the number can reflect several different kinds of uncertainty. “I’m 60% confident that Citizen Kane won best picture” reflects that our knowledge of this past event is incomplete. “I’m 60% confident the flight from Chicago will be late” incorporates a mix of our incomplete knowledge and the inherent uncertainty in predicting the future (e.g., the weather might intervene or there might be an unforeseen mechanical issue).” (p. 69)

  • “When confronted with new evidence, it is a very different narrative to say, “I was 58% but now I’m 46%.” That doesn’t feel nearly as bad as “I thought I was right but now I’m wrong.” Our narrative of being a knowledgeable, educated, intelligent person who holds quality opinions isn’t compromised when we use new information to calibrate our beliefs, compared with having to make a full-on reversal. This shifts us away from treating information that disagrees with us as a threat, as something we have to defend against, making us better able to truthseek.” (p. 70)
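Duke’s zero-to-ten scale maps directly onto percentages. A tiny sketch of that mapping (the helper name is mine, not Duke’s):

```python
def confidence_percent(rating):
    """Map Duke's zero-to-ten confidence rating onto a percentage."""
    if not 0 <= rating <= 10:
        raise ValueError("rating must be between 0 and 10")
    return rating * 10

# "I'm only a six on that" means 60% sure the belief is accurate,
# which implies a 40% chance it turns out to be wrong.
print(confidence_percent(6))        # 60
print(100 - confidence_percent(6))  # 40
```

Stating the number forces the shades of grey into the open: a six and a nine are both “confident,” but they commit you to very different odds.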


If you want to see reality, you must form a truthseeking group.

  • “Having the help of others provides many decision-making benefits, but one of the most obvious is that other people can spot our errors better than we can. We can help others in our pod overcome their blind-spot bias and they can help us overcome the same.” (p. 125)

  • “On our own, we have just one viewpoint. That’s our limitation as humans. But if we take a bunch of people with that limitation and put them together in a group, we get exposed to diverse opinions, can test alternative hypotheses, and move toward accuracy.” (p. 138)

  • “It is almost impossible for us, on our own, to get the diversity of viewpoints provided by the combined manpower of a well-formed decision pod. To get a more objective view of the world, we need an environment that exposes us to alternate hypotheses and different perspectives. That doesn’t apply only to the world around us: to view ourselves in a more realistic way, we need other people to fill in our blind spots.” (p. 138)

  • “Motivated reasoning and self-serving bias are two habits of mind that are deeply rooted in how our brains work. We have a huge investment in confirmatory thought, and we fall into these biases all the time without even knowing it. Confirmatory thought is hard to spot, hard to change, and, if we do try changing it, hard to self-reinforce. It is one thing to commit to rewarding ourselves for thinking in bets, but it is a lot easier if we get others to do the work of rewarding us.” (p. 132)

  • “if we can find a few people to choose to form a truthseeking pod with us and help us do the hard work connected with it, it will move the needle—just a little bit, but with improvements that accumulate and compound over time. We will be more successful in fighting bias, seeing the world more objectively, and, as a result, we will make better decisions. Doing it on our own is just harder.” (p. 124)

How can you form a truthseeking group?

  • First, find people willing to join your truthseeking group. They must break the social contract of being agreeable: they must tell you the painful truth and understand that things will get uncomfortable.

    “Forming or joining a group where the focus is on thinking in bets means modifying the usual social contract. It means agreeing to be open-minded to those who disagree with us, giving credit where it’s due, and taking responsibility where it’s appropriate, even (and especially) when it makes us uncomfortable. That’s why, when we do it with others, we need to make it clear the social contract is being modified, or feelings will get hurt, defensiveness will rear its ugly head.” (p. 125)

  • “Members of our decision pod could be our friends, or members of our family, or an informal pod of coworkers, or an enterprise strategy group, or a professional organization where members can talk about their decision-making. 

    So, while we find some people to think in bets with us, with the rest of the world, it is generally better to observe the prevailing social contract and not go around saying, “Wanna bet?” willy-nilly.” (p. 125)

  • “Whatever the obstacles to recruiting people into a decision group (and this chapter points out several, along with strategies for overcoming them), it is worth it to get a buddy to watch your back—or your blind spot. The fortunate thing is that we need to find only a handful of people willing to do the exploratory thinking necessary for truthseeking. In fact, as long as there are three people in the group (two to disagree and one to referee*), the truthseeking group can be stable and productive.” (p. 125)

Beware of confirmatory thought in your truthseeking group.

  • “while a group can function to be better than the sum of the individuals, it doesn’t automatically turn out that way. Being in a group can improve our decision quality by exploring alternatives and recognizing where our thinking might be biased, but a group can also exacerbate our tendency to confirm what we already believe.” (p. 128)

  • “Philip Tetlock and Jennifer Lerner, leaders in the science of group interaction, described the two kinds of group reasoning styles in an influential 2002 paper: “Whereas confirmatory thought involves a one-sided attempt to rationalize a particular point of view, exploratory thought involves even-handed consideration of alternative points of view.” 

  • “In other words, confirmatory thought amplifies bias, promoting and encouraging motivated reasoning because its main purpose is justification. Confirmatory thought promotes a love and celebration of one’s own beliefs, distorting how the group processes information and works through decisions, the result of which can be groupthink. Exploratory thought, on the other hand, encourages an open-minded and objective consideration of alternative hypotheses and a tolerance of dissent to combat bias. Exploratory thought helps the members of a group reason toward a more accurate representation of the world.” (p. 128)

  • “This is why it’s so important to have intellectual and ideological diversity within any group or institution whose goal is to find truth.” (p. 129)

To prevent problems in the truthseeking group, you must have an agreement.

  • “We know our decision-making can improve if we find other people to join us in truthseeking. And we know we need an agreement. What’s in that agreement? What are the features of a productive decision-making pod?” (p. 127)

  • “Without an explicit charter for exploratory thought and accountability to that charter, our tendency when we interact with others follows our individual tendency, which is toward confirmation. The expression “echo chamber” instantly conjures up the image of what results from our natural drift toward confirmatory thought. That was the chorus I heard among some groups of players during breaks of poker tournaments. When one player brought up how unlucky they had gotten, another would nod in assent as a prelude to telling their own hard-luck story, which, in turn, would be nodded at and assented to by the group.” (p. 128)

Some suggested rules for your agreement.

  • Individuals need to avoid confirmatory thought and promote exploratory thought.

  • A focus on accuracy (over confirmation), which includes rewarding truthseeking, objectivity, and open-mindedness within the group.

  • Accountability, for which members have advance notice.

  • Openness to a diversity of ideas. “In addition to accountability and an interest in accuracy, the charter should also encourage and celebrate a diversity of perspectives to challenge biased thinking by individual members.” (p. 129)

  • If the truthseeking group follows this agreement regularly, it becomes a habit. “Once we are in a group that regularly reinforces exploratory thought, the routine becomes reflexive, running on its own. Exploratory thought becomes a new habit of mind, the new routine, and one that is self-reinforced.” (p. 134)

  • “None of this should be surprising to anyone who recognizes the benefits of thinking in bets. We don’t win bets by being in love with our own ideas. We win bets by relentlessly striving to calibrate our beliefs and predictions about the future to more accurately represent the world. In the long run, the more objective person will win against the more biased person. In that way, betting is a form of accountability to accuracy. Calibration requires an open-minded consideration of diverse points of view and alternative hypotheses. Wrapping all that into your group’s charter makes a lot of sense.” (p. 130)

  • “As a rule of thumb, if we have an urge to leave out a detail because it makes us uncomfortable or requires even more clarification to explain away, those are exactly the details we must share. The mere fact of our hesitation and discomfort is a signal that such information may be critical to providing a complete and balanced account. Likewise, as members of a group evaluating a decision, we should take such hesitation as a signal to explore further.” (p. 156)

Here are some questions people should ask themselves:

  • Why might my belief not be true? 
  • What other evidence might be out there bearing on my belief? 
  • Are there similar areas I can look toward to gauge whether similar beliefs to mine are true? 
  • What sources of information could I have missed or minimized on the way to reaching my belief? 
  • What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me? 
  • What other perspectives are there as to why things turned out the way they did? (p. 138)

Not everyone in our lives has to be a truthseeker.

  • “It’s also helpful to recognize that people serve different purposes in our lives. Even if we place a high value on truthseeking, that doesn’t mean everyone in our lives has to adopt that or communicate with us in that way. Truthseeking isn’t a cult; we don’t have to cut off people who don’t share that commitment. Our Pilates friends or our football friends or any of our friends shouldn’t have to take the red pill to remain our friends. Different friends fill different needs and not all of them need to be cut from the same cloth. Those different groups can also provide much-needed balance in our lives. After all, it takes effort to acknowledge and explore our mistakes without feeling bad about ourselves, to forgo credit for a great result, and to realize, with an open mind, that not all our beliefs are true. Truthseeking flies in the face of a lot of comfortable behaviors; it’s hard work and we need breaks to replenish our willpower.” (p. 126)


To prepare for the future, mentally time travel.

  • “One of our time-travel goals is to create moments like that, where we can interrupt an in-the-moment decision and take some time to consider the decision from the perspective of our past and future.”

  • “We can then create a habit routine around these decision interrupts to encourage this perspective taking, asking ourselves a set of simple questions at the moment of the decision designed to get future-us and past-us involved. We can do this by imagining how future-us is likely to feel about the decision or by imagining how we might feel about the decision today if past-us had made it. The approaches are complementary; whether you choose to travel to the past or travel to the future depends solely on what approach you find most effective.” (p. 187)

  • “Business journalist and author Suzy Welch developed a popular tool known as 10-10-10 that has the effect of bringing future-us into more of our in-the-moment decisions. “Every 10-10-10 process starts with a question. . . . [W]hat are the consequences of each of my options in ten minutes? In ten months? In ten years?” This set of questions triggers mental time travel that cues that accountability conversation (also encouraged by a truthseeking decision group).” (p. 188)

  • “We can anticipate and prepare for negative outcomes. By planning ahead, we can devise a plan to respond to a negative outcome instead of just reacting to it. We can also familiarize ourselves with the likelihood of a negative outcome and how it will feel. Coming to peace with a bad outcome in advance will feel better than refusing to acknowledge it, facing it only after it has happened.” (p. 189)

You must use scenario planning.

  • “The reason why we do reconnaissance is because we are uncertain. We don’t (and likely can’t) know how often things will turn out a certain way with exact precision. It’s not about approaching our future predictions from a point of perfection. It’s about acknowledging that we’re already making a prediction about the future every time we make a decision, so we’re better off if we make that explicit. If we’re worried about guessing, we’re already guessing. We are already guessing that the decision we execute will result in the highest likelihood of a good outcome given the options we have available to us. By at least trying to assign probabilities, we will naturally move away from the default of 0% or 100%, away from being sure it will turn out one way and not another. Anything that moves us off those extremes is going to be a more reasonable assessment than not trying at all. Even if our assessment results in a wide range, like the chances of a particular scenario occurring being between 20% and 80%, that is still better than not guessing at all.” (p. 210)

  • “After identifying as many of the possible outcomes as we can, we want to make our best guess at the probability of each of those futures occurring. When I consult with enterprises on building decision trees and determining probabilities of different futures, people frequently resist having to make a guess at the probability of future events mainly because they feel like they can’t be certain of what the likelihood of any scenario is. But that’s the point.” (p. 210)

  • “This is true of most strategic thinking. Whether it involves sales strategies, business strategies, or courtroom strategies, the best strategists are considering a fuller range of possible scenarios, anticipating and considering the strategic responses to each, and so on deep into the decision tree. This kind of scenario planning is a form of mental time travel we can do on our own. It works even better when we do it as part of a scenario-planning group, particularly one that is open-minded to dissent and diverse points of view. 

    Diverse viewpoints allow for the identification of a wider variety of scenarios deeper into the tree, and for better estimates of their probability. In fact, if two people in the group are really far off on an estimate of the likelihood of an outcome, that is a great time to have them switch sides and argue the other’s position. Generally, the answer is somewhere in the middle and both people will end up moderating their positions. But sometimes one person has thought of a key influencing factor the other hasn’t and that is revealed only because the dissent was tolerated. In addition to increasing decision quality, scouting various futures has numerous additional benefits.” (p. 211)
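Assigning explicit probabilities to each scenario also lets you compute an expected value for the decision, which is the backbone of the decision trees Duke describes. A minimal sketch (the scenario names and payoffs are illustrative, not from the book):

```python
def expected_value(scenarios):
    """scenarios: list of (probability, payoff) pairs for one decision."""
    total = sum(p for p, _ in scenarios)
    if abs(total - 1.0) > 1e-9:
        raise ValueError("scenario probabilities should sum to 1")
    return sum(p * payoff for p, payoff in scenarios)

# Illustrative futures for a product launch: big win, modest win, flop.
launch = [(0.5, 120_000), (0.3, 10_000), (0.2, -80_000)]
print(expected_value(launch))
```

Even rough probabilities beat none at all, as the quote above argues: forcing the numbers to sum to one moves you off the 0%-or-100% default, and disagreements about an individual estimate become concrete things the group can argue about.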

Try backcasting.

  • “Just as great poker players and chess players (and experts in any field) excel by planning further into the future than others, our decision-making improves when we can more vividly imagine the future, free of the distortions of the present. By working backward from the goal, we plan our decision tree in more depth, because we start at the end.” (p. 219)

  • “They “found that prospective hindsight—imagining that an event has already occurred—increases the ability to correctly identify reasons for future outcomes by 30%.” (p. 219)

  • “In backcasting, we imagine we’ve already achieved a positive outcome, holding up a newspaper with the headline “We Achieved Our Goal!” Then we think about how we got there.” (p. 220)

  • “Let’s say an enterprise wants to develop a three-year strategic plan to double market share, from 5% to 10%. Each person engaged in the planning imagines holding up a newspaper whose headline reads “Company X Has Doubled Its Market Share over the Past Three Years.” The team leader now asks them to identify the reasons they got there, what events occurred, what decisions were made, what went their way to get the enterprise to capture that market share. This enables the company to better identify strategies, tactics, and actions that need to be implemented to get to the goal. It also allows it to identify when the goal needs to be tweaked. Backcasting makes it possible to identify when there are low-probability events that must occur to reach the goal. That could lead to developing strategies to increase the chances those events occur or to recognizing the goal is too ambitious. The company can also make precommitments to a plan developed through backcasting, including responses to developments that can interfere with reaching the goal and identifying inflection points for re-evaluating the plan as the future unfolds.” (p. 220)

Try premortems.

  • “We all like to bask in an optimistic view of the future. We generally are biased to overestimate the probability of good things happening. Looking at the world through rose-colored glasses is natural and feels good, but a little naysaying goes a long way. A premortem is where we check our positive attitude at the door and imagine not achieving our goals.” (p. 221)

  • “Backcasting and premortems complement each other. Backcasting imagines a positive future; a premortem imagines a negative future. We can’t create a complete picture without representing both the positive space and the negative space. Backcasting reveals the positive space. Premortems reveal the negative space. Backcasting is the cheerleader; a premortem is the heckler in the audience.” (p. 222)

  • “Oettingen recognized that we need to have positive goals, but we are more likely to execute on those goals if we think about the negative futures. We start a premortem by imagining why we failed to reach our goal: our company hasn’t increased its market share; we didn’t lose weight; the jury verdict came back for the other side; we didn’t hit our sales target. Then we imagine why. All those reasons why we didn’t achieve our goal help us anticipate potential obstacles and improve our likelihood of succeeding.” (p. 224)

  • “The key to a successful premortem is that everyone feels free to look for those reasons, and they are motivated to scour everything—personal experience, company experience, historical precedent, episodes of The Hills, sports analogies, etc.—to come up with ways a decision or plan can go bad, so the team can anticipate and account for them.” (p. 224)

  • “It may not feel so good during the planning process to include this focus on the negative space. Over the long run, however, seeing the world more objectively and making better decisions will feel better than turning a blind eye to negative scenarios. In a way, backcasting without premortems is a form of temporal discounting: if we imagine a positive future, we feel better now, but we’ll more than compensate for giving up that immediate gratification through the benefits of seeing the world more accurately, making better initial decisions, and being nimbler about what the world throws our way.” (p. 226)

Beware of tilt!

  • “Tilt, of course, is not just limited to poker. Any kind of outcome has the potential for causing an emotional reaction. We can be tempted to make a reactive, emotional decision in a disagreement with a relationship partner, or because of bad service in a restaurant, or a comment in the workplace, or making a sale only to have it canceled, or having an idea dismissed. We’ve all had this experience in our personal and professional lives: blowing out of proportion a momentary event because of an in-the-moment emotional reaction.” (p. 198)

  • “By recognizing in advance these verbal and physiological signs that ticker watching is making us tilt, we can commit to develop certain habit routines at those moments. We can precommit to walk away from the situation when we feel the signs of tilt, whether it’s a fight with a spouse or child, aggravation in a work situation, or losing at a poker table. We can take some space till we calm down and get some perspective, recognizing that when we are on tilt we aren’t decision fit. Aphorisms like “take ten deep breaths” and “why don’t you sleep on it?” capture this desire to avoid decisions while on tilt.” (p. 199)

  • “We can commit to asking ourselves the 10-10-10 questions or things like, “What’s happened to me in the past when I’ve felt this way?” or “Do I think it’s going to help me to be in this state while I’m making decisions?” Or we can gain perspective by asking how or whether this will have a real effect on our long-term happiness.” (p. 199)

Create precommitment contracts or Ulysses contracts to prevent yourself from making a bad decision.

  • “Odysseus told his crew to tie his hands to the mast and fill their ears with beeswax as they approached the island. They could then steer safely, unaffected by the song they could not hear, while he would get to hear the Sirens’ song without imperiling the ship. The plan worked perfectly. This action—past-us preventing present-us from doing something stupid—has become known as a Ulysses contract.” (p. 200)

  • “A lawyer attending a settlement negotiation can make a precommitment, with the client or other lawyers on their team, as to the lowest amount they would accept in a settlement (or the highest amount they would agree to pay to settle). Home buyers, understanding that in the moment they might get emotionally attached to a home, can commit in advance to their budget. Once they decide on a house they want to buy, they can decide in advance what the maximum amount they’d be willing to pay for it is so that they don’t get caught up in the moment of the bidding.” (p. 202)

  • “Investment advisors do this with clients, determining in advance, as they are discussing the client’s goals, the conditions under which they would buy, sell, hold, or press their positions on particular stocks. If the client later wants to make an emotional decision in the moment (involving, for example, a sudden rise or drop in the value of an investment), the advisor can remind the client of the discussion and the agreement.” (p. 203)
