From Julia Galef’s “The Scout Mindset”

“So I’ve given it one. I call it scout mindset: the motivation to see things as they are, not as you wish they were. Scout mindset is what allows you to recognize when you are wrong, to seek out your blind spots, to test your assumptions and change course. It’s what prompts you to honestly ask yourself questions like “Was I at fault in that argument?” or “Is this risk worth it?” or “How would I react if someone from the other political party did the same thing?” As the late physicist Richard Feynman once said, “The first principle is that you must not fool yourself—and you are the easiest person to fool.” (p. ix)

“Many people actively resist viewing reality accurately because they believe that accuracy is a hindrance to their goals—that if they want to be happy, successful, and influential, it’s better to view themselves and the world through a distorted lens.” (p. xi)

“Or maybe you hold the common belief that when you’re doing something hard, like starting a company, you need to be delusionally overconfident. You might be surprised to learn that some of the world’s most famous entrepreneurs expected their companies to fail. Jeff Bezos put Amazon’s probability of success at about 30 percent. Elon Musk estimated a 10 percent chance of success for each of his companies, Tesla and SpaceX.” (p. xi)

“The best description of motivated reasoning I’ve ever seen comes from psychologist Tom Gilovich. When we want something to be true, he said, we ask ourselves, “Can I believe this?,” searching for an excuse to accept it. When we don’t want something to be true, we instead ask ourselves, “Must I believe this?,” searching for an excuse to reject it.” (p. 5)

“You can see it in the way people happily share news stories that support their narratives about America or capitalism or “kids today,” while ignoring stories that don’t. You can see it in the way we rationalize away red flags in an exciting new relationship, and always think we’re doing more than our fair share of the work. When a coworker screws up, it’s because they’re incompetent, but when we screw up, it’s because we were under a lot of pressure.” (p. 6)

“The tricky thing about motivated reasoning is that even though it’s easy to spot in other people, it doesn’t feel like motivated reasoning from the inside. When we reason, it feels like we’re being objective. Fair-minded. Dispassionately evaluating the facts.” (p. 7)

“In scout mindset, there’s no such thing as a “threat” to your beliefs. If you find out you were wrong about something, great—you’ve improved your map, and that can only help you.” (p. 12)

“Scout mindset is what prompts us to question our assumptions and stress-test our plans. Whether you’re proposing a new product feature or a military maneuver, asking yourself, “What are the most likely ways this could fail?” allows you to strengthen your plan against those possibilities in advance. If you’re a doctor, that means considering alternate diagnoses before settling on your initial guess. As one master clinician used to ask himself—if he suspected a patient had pneumonia, for example—“If this could not be pneumonia, what else would it be?” (p. 13)

“Being the kind of person who welcomes the truth, even if it’s painful, is what makes other people willing to be honest with you. You can say that you want your partner to tell you about any problems in your relationship, or that you want your employees to tell you about any problems in the company, but if you get defensive or combative when you hear the truth, you’re not likely to hear it very often. No one wants to be the messenger that gets shot.” (p. 14)

“In Aesop’s fable “The Fox and the Grapes,” a fox spots a bunch of juicy grapes, high up on a branch he can’t reach, and concludes that the grapes were sour anyway. We use similar “sour grapes” reasoning when we don’t get something we want. When someone we had a great first date with doesn’t return our calls, we may decide they were a bore anyway. When a job offer slips through our fingers, we conclude, “It’s for the best; the hours would have been brutal.” A close cousin to the sour grape is the sweet lemon: when it doesn’t seem feasible to fix a problem, we may try to convince ourselves that our “problem” is actually a blessing, and that we wouldn’t change it even if we could.” (p. 17)

“Small wonder then that in one survey of entrepreneurs, almost everyone estimated their company’s probability of success to be at least 7 out of 10, with a third giving themselves an eyebrow-raising 10 out of 10 chance, despite the fact that the baseline rate of start-up success is closer to 1 in 10. One strategy we use to justify such high confidence is downplaying the relevance of the baseline odds and telling ourselves that success is purely a matter of trying hard enough. As one motivational blogger promised, “[You] have 100% chance of being successful at doing what you love if you commit yourself to it and get off your ass and do it every day.” (p. 20)

“In some religious communities, losing your faith can mean losing your marriage, family, and entire social support system along with it. That’s an extreme case, but all social groups have some beliefs and values that members are implicitly expected to share, such as “Climate change is a serious problem,” or “Republicans are better than Democrats,” or “Our group is fighting for a worthy cause,” or “Children are a blessing.” Dissent may not get you literally kicked out of the group, but it can still alienate you from the other members.” (p. 25)

“Rather than pursuing social acceptance by suppressing your disagreements with your community, you could instead decide to leave and find a different community you fit in to better.” (p. 27)

“In scout mindset, our thinking is guided by the question “Is it true?” We use it to help us see things clearly for the sake of our judgment, so that we can fix problems, notice opportunities, figure out which risks are worth taking, decide how we want to spend our lives, and, sometimes, better understand the world we live in for the sake of sheer curiosity.” (p. 28)

“If you’re worried about a mistake you made and you convince yourself that “It wasn’t my fault,” you’re rewarded with a hit of instant emotional relief. The cost is that you miss out on learning from your mistake, which means you’re less able to prevent it from happening again. But that won’t affect you until some unknown point in the future.” (p. 33)

“Every time you’re willing to say, “I was wrong,” it gets a little bit easier to be wrong in general.” (p. 34)

“Just like the lies we tell others, the lies we tell ourselves have ripple effects. Suppose you tend to rationalize away your own mistakes, and consequently you see yourself as more perfect than you really are. This has a ripple effect on your views of other people: Now, when your friends and family screw up, you might not be very sympathetic. After all, you never make such mistakes. Why can’t they just be better? It’s not that hard. Or suppose that for the sake of your self-esteem, you view yourself through rose-colored glasses, judging yourself to be more charming, interesting, and impressive than you actually appear to other people. Here’s one possible ripple effect: How do you explain the fact that women don’t seem to be interested in dating you, given what a great catch you are? Well, maybe they’re all shallow.” (p. 35)

“The more objective you think you are, the more you trust your own intuitions and opinions as accurate representations of reality, and the less inclined you are to question them. “I’m an objective person, so my views on gun control must be correct, unlike the views of all those irrational people who disagree with me,” we think. Or “I’m unbiased, so if this job applicant seems better to me, he must really be better.” (p. 44)

  1. Do you tell other people when you realize they were right?

“Technically, scout mindset only requires you to be able to acknowledge to yourself that you were wrong, not to other people. Still, a willingness to say “I was wrong” to someone else is a strong sign of a person who prizes the truth over their own ego. Can you think of cases in which you’ve done the same?” (p. 51)

  2. How do you react to personal criticism?

“Maybe you’ve had a boss or a friend who insisted, “I respect honesty! I just want people to be straight with me,” only to react poorly when someone took them up on that. They got offended or defensive or lashed out at the feedback-giver in retaliation. Or perhaps they politely thanked that person for their honesty and then gave them the cold shoulder from then on.” (p. 52)

“To gauge your comfort with criticism, it’s not enough just to ask yourself, “Am I open to criticism?” Instead, examine your track record. Are there examples of criticism you’ve acted upon? Have you rewarded a critic (for example, by promoting him)? Do you go out of your way to make it easier for other people to criticize you?” (p. 52)

  3. Do you ever prove yourself wrong?

“Can you think of any examples in which you voluntarily proved yourself wrong? Perhaps you were about to voice an opinion online, but decided to search for counterarguments first, and ended up finding them compelling. Or perhaps at work you were advocating for a new strategy, but changed your mind after you ran the numbers more carefully and realized it wouldn’t be feasible.” (p. 54)

  4. Do you take precautions to avoid fooling yourself?

“Do you try to avoid biasing the information you get? For example, when you ask your friend to weigh in on a fight you had with your partner, do you describe the disagreement without revealing which side you were on, so as to avoid influencing your friend’s answer? When you launch a new project at work, do you decide ahead of time what will count as a success and what will count as a failure, so you’re not tempted to move the goalposts later?” (p. 56)

  5. Do you have any good critics?

“It’s tempting to view your critics as mean-spirited, ill-informed, or unreasonable. And it’s likely that some of them are. But it’s unlikely that all of them are. Can you name people who are critical of your beliefs, profession, or life choices who you consider thoughtful, even if you believe they’re wrong? Or can you at least name reasons why someone might disagree with you that you would consider reasonable (even if you don’t happen to know of specific people who hold those views)?” (p. 57)

“Being able to name reasonable critics, being willing to say “The other side has a point this time,” being willing to acknowledge when you were wrong—it’s things like these that distinguish people who actually care about truth from people who only think they do.” (p. 57)

“You can’t detect motivated reasoning in yourself just by scrutinizing your reasoning and concluding that it makes sense. You have to compare your reasoning to the way you would have reasoned in a counterfactual world, a world in which your motivations were different—would you judge that politician’s actions differently if he was in the opposite party? Would you evaluate that advice differently if your friend had offered it instead of your spouse? Would you consider that study’s methodology sound if its conclusions supported your side?” (p. 61)

“Try to actually imagine the counterfactual scenario. To see why that matters, think of a six-year-old child who’s just made fun of another child. His mother reprimands him and tries to show him why what he did was wrong by posing this age-old thought experiment: “Imagine you were in Billy’s shoes and someone was making fun of you in front of your friends. How would you feel?” Her son replies instantly: “I wouldn’t mind!” (p. 62)

“Thought experiments only work if you actually do them. So don’t simply formulate a verbal question for yourself. Conjure up the counterfactual world, place yourself in it, and observe your reaction.” (p. 62)

“But one day he did a thought experiment that changed his perception. He asked himself: “Can you honestly say that if the situation was reversed, you wouldn’t be doing the exact same thing?” The answer was clear. “Yeah, if that were the case, I’d definitely be making time with all the hotties,” he realized.” (p. 63)

“The thought experiment Grove and Moore did is called an outsider test: Imagine someone else stepped into your shoes—what do you expect they would do in your situation? When you’re making a tough decision, the question of what to do can get tangled up with other, emotionally fraught questions like, “Is it my fault that I’m in this situation?” or “Are people going to judge me harshly if I change my mind?” The outsider test is designed to strip away those influences, leaving only your honest guess about the best way to handle a situation like the one you’re in.” (p. 65)

“After the song was over, she turned to me and asked me what I thought. I replied enthusiastically, “Yeah, it’s so good! I think it’s my favorite, too.” “Well, guess what?” she replied. “That’s not my favorite song. It’s my least favorite song. I just wanted to see if you would copy me.” (p. 66)

“Now I use Shoshana’s trick as a thought experiment when I want to test how much of “my” opinion is actually my own. If I find myself agreeing with someone else’s viewpoint, I do a conformity test: Imagine this person told me that they no longer held this view. Would I still hold it? Would I feel comfortable defending it to them? For example, suppose you’re in a strategic meeting and your colleague is making the case for hiring more people. You find yourself nodding along in agreement. “That’s true, it would end up saving us money,” you think. That feels like your own opinion—but to check, you can do a conformity test. Imagine your colleague suddenly said, “By the way, everyone, I’m just playing devil’s advocate here. I don’t necessarily believe we should hire right now.” (p. 67)

“I call this type of thought experiment the selective skeptic test: Imagine this evidence supported the other side. How credible would you find it then? Suppose someone criticizes a decision your company made, and your knee-jerk reaction is, “They don’t know what they’re talking about, because they don’t have all the relevant details.” Selective skeptic test: Imagine the person had praised your company’s decision instead. Would you still think that only insiders are informed enough to have valid opinions?” (p. 68)

“David’s thought experiment revealed that his attitude toward his options was likely being influenced by the “status quo bias,” a motivation to defend whatever situation happens to be the status quo. A leading theory for why we’re biased in favor of the status quo is that we’re loss averse: the pain we feel from a loss outweighs the pleasure we feel from a similar-size gain. That makes us reluctant to change our situation, because even if the change would make us better off overall, we fixate more on what we’ll be losing than what we’ll be gaining.” (p. 69)

“I call David’s thought experiment the status quo bias test: Imagine your current situation was no longer the status quo. Would you then actively choose it? If not, that’s a sign that your preference for your situation is less about its particular merits and more about a preference for the status quo.” (p. 70)

“In the metaphor of the scout, it’s like peering through your binoculars at a far-off river and saying, “Well, it sure seems like the river is frozen. But let me find another vantage point—different angle, different lighting, different lens—and see if things look any different.” (p. 72)

“Not all overconfidence is due to motivated reasoning. Sometimes we simply don’t realize how complicated a topic is, so we overestimate how easy it is to get the right answer. But a large portion of overconfidence stems from a desire to feel certain. Certainty is simple. Certainty is comfortable. Certainty makes us feel smart and competent. Your strength as a scout is in your ability to resist that temptation, to push past your initial judgment, and to think in shades of gray instead of black and white. To distinguish the feeling of “95% sure” from “75% sure” from “55% sure.” That’s what we’ll learn to do in this chapter.” (p. 75)

“A tip when you’re imagining betting on your beliefs: You may need to get more concrete about what you believe by coming up with a hypothetical test that could be performed to prove you right or wrong. For example, if you believe “Our computer servers are highly secure,” a hypothetical test might be something like this: Suppose you were to hire a hacker to try to break in to your systems. If they succeed, you lose one month’s salary. How confident do you feel that you would win that bet? If you believe “I was being reasonable in that fight with my partner, and he was being unreasonable,” a hypothetical test might go something like this: Suppose another person, an objective third party, is given all of the relevant details about the fight, and is asked to judge which of you two is being more reasonable. If he judges in your favor, you win $1,000; if not, you lose $1,000. How confident do you feel that you would win that bet?” (p. 84)
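
A note that isn’t from the book: the reason to imagine concrete stakes is that your willingness to take the bet tracks a number. Below is a minimal sketch of the arithmetic behind the hypothetical $1,000 bet above; the stakes and confidence levels are illustrative only.

```python
# Minimal sketch (illustrative, not from the book): the arithmetic behind the
# hypothetical "$1,000 if an objective third party rules in your favor" bet.
def expected_value(p_win: float, win: float = 1000.0, lose: float = -1000.0) -> float:
    """Expected value of the bet, given your honest probability of winning."""
    return p_win * win + (1 - p_win) * lose

# How the same bet looks at different levels of confidence:
for p in (0.55, 0.75, 0.95):
    print(f"{p:.0%} sure -> expected value {expected_value(p):+.0f} dollars")
```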

“Instead, I remind myself of a silver lining: Conceding an argument earns me credit. It makes me more credible in other cases, because I’ve demonstrated that I don’t stick to my guns just for the sake of it. It’s like I’m investing in my future ability to be convincing.” (p. 98)

“Of course, any given individual may have a better or worse chance of success than the overall odds suggest, depending on how talented, hardworking, charismatic, or well connected they are. But the overall odds are an important baseline to be aware of; the longer the odds, the better and luckier you’ll have to be to beat them.” (p. 107)

“For someone else with a more singular passion for acting, or more talent than me, the long odds might be worth it. But to weigh these factors successfully, you need an accurate picture of what the odds actually are.” (p. 108)

“The reality is that there’s no clear divide between the “decision-making” and “execution” stages of pursuing a goal. Over time, your situation will change, or you’ll learn new information, and you’ll need to revise your estimate of the odds.” (p. 110)

“When Musk’s friends told him that he would probably fail, he replied: “Well, I agree. I think we probably will fail.” In fact, he estimated that there was only about a 10 percent chance that a SpaceX craft would ever make it into orbit.” (p. 112)

“In reality, you almost never get to repeat the exact same bet many times. But you’ll have the opportunity to make many different bets over the course of your life. You’ll face bets at your company and in your career more broadly; bets on investment opportunities; chances to bet on trusting another person, or making a difficult ask, or pushing your comfort zone. And the more positive expected value bets you make, the more confident you can be that you’ll end up ahead overall, even if each individual bet is far from a sure thing.” (p. 115)
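
Not from the book, but the claim is easy to check numerically. A quick simulation sketch with invented odds (each bet wins $300 with probability 0.4 and loses $100 otherwise, an expected value of +$60) shows that a long run of such far-from-sure bets almost always ends up ahead.

```python
import random

def run_lifetime(n_bets: int = 100) -> float:
    """Total payoff from a run of independent positive-expected-value bets."""
    total = 0.0
    for _ in range(n_bets):
        # Each bet: 40% chance to win $300, 60% chance to lose $100 (EV = +$60).
        total += 300.0 if random.random() < 0.4 else -100.0
    return total

lifetimes = [run_lifetime() for _ in range(10_000)]
ahead = sum(t > 0 for t in lifetimes) / len(lifetimes)
print(f"Share of simulated lifetimes that end up ahead: {ahead:.1%}")
```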

“Scouts rely on a different kind of morale. Instead of being motivated by the promise of guaranteed success, a scout is motivated by the knowledge that they’re making a smart bet, which they can feel good about having made whether or not it succeeds. Even if a particular bet has a low probability of success, they know that their overall probability of success in the long run is much higher, as long as they keep making good bets.” (p. 119)

“What made the superforecasters so great at being right was that they were great at being wrong.” (p. 138)

“The superforecasters changed their minds all the time. Not dramatic, 180-degree reversals every day, but subtle revisions as they learned new information. The highest-scoring superforecaster, a software engineer named Tim Minto, usually changed his mind at least a dozen times on a single forecast, and sometimes as many as forty or fifty times.” (p. 138)
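
Not from the book: one way to picture these subtle revisions is incremental Bayesian updating, where each new piece of evidence is weak on its own and only nudges the forecast by a few points. The likelihood ratios below are invented for illustration.

```python
def bayes_update(prior: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    posterior_odds = (prior / (1 - prior)) * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

forecast = 0.50  # initial probability assigned to the event
# Each item of news is weak evidence (likelihood ratios near 1), so the
# forecast drifts gradually rather than flipping from one extreme to the other.
for lr in (1.3, 1.2, 0.9, 1.4, 1.1, 0.8, 1.3):
    forecast = bayes_update(forecast, lr)
    print(f"revised forecast: {forecast:.2f}")
```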

“The superforecasters had a very different relationship with their mistakes. When their predictions missed the mark by a lot—if they predicted something was very likely and it didn’t happen or if they predicted something was very unlikely and it did happen—they would go back and reevaluate their process, asking, “What does this teach me about how to make better forecasts?” Here’s an example:” (p. 141)

“But that would be missing one of the biggest benefits of noticing your errors: the opportunity to improve your judgment in general. When Brookshire realized she had been wrong, she asked herself why, and pinpointed two likely culprits. One was confirmation bias: “I had a preexisting belief that men wouldn’t respect me as much in email as women,” Brookshire realized, “and I most remembered the observations that confirmed this belief, forgetting entirely the evidence that showed me my belief was wrong.” The other was recency bias: “I was giving more weight to things that I had observed recently, forgetting things I had observed in the past,” she concluded.” (p. 143)

“So far in this chapter we’ve explored two ways in which scouts think about error differently from most people. First, they revise their opinions incrementally over time, which makes it easier to be open to evidence against their beliefs. Second, they view errors as opportunities to hone their skill at getting things right, which makes the experience of realizing “I was wrong” feel valuable, rather than just painful.” (p. 145)

“If you want to become better at predicting people’s behavior, then shrugging off the times when they violate your expectations is exactly the wrong response. Spock should have leaned in to his confusion about the aliens’ decision to attack: “What am I missing? Why might this behavior make sense to them?” (p. 156)

“The ex-employee claimed the company owed him $130,000 in commissions he had earned before he was fired. The company did the math, and found that the employee was mistaken. They sent him their analysis showing that he was not owed any money, but he still refused to abandon his lawsuit. The executive, who was a client of Deepak Malhotra’s, thought the ex-employee was being completely irrational, since he had no chance of winning in court. Malhotra suggested: “Is it possible that he doesn’t trust your accountant?” He urged the executive to try hiring an objective, third-party accounting firm to do the analysis and send the results directly to the ex-employee. Sure enough, the suit was dropped.” (p. 157)

“That doesn’t mean you should go to the other extreme and abandon a paradigm as soon as you notice the slightest bit of conflicting evidence. What the best decision-makers do is look for ways to make sense of conflicting evidence under their existing theory but simultaneously take a mental note: This evidence stretches my theory by a little (or a lot). If your theory gets stretched too many times, then you admit to yourself that you’re no longer sure what’s happening, and you consider alternate explanations.” (p. 165)

“You’ve probably heard some version of the following speech before: “It’s important to listen to people on the other side of the aisle! Escape your echo chamber! Get outside of your filter bubble! That’s how you broaden your perspective and change your mind.” It’s the kind of advice that sounds good; the kind of advice that well-intentioned people repeat and that other well-intentioned people nod along to enthusiastically. The dirty little secret is that it doesn’t work.” (p. 168)

“I suspect even the well-intentioned people passing around that advice already know, on some level, that it doesn’t work. We’ve all had the experience of receiving strongly worded disagreement on Facebook, maybe from an old classmate or a second cousin who has a totally different worldview from us. And when they explain to us how our views on abortion are immoral or why our political party is incompetent, we don’t usually come away from those interactions feeling enlightened.” (p. 168)

“To give yourself the best chance of learning from disagreement, you should be listening to people who make it easier to be open to their arguments, not harder. People you like or respect, even if you don’t agree with them. People with whom you have some common ground—intellectual premises, or a core value that you share—even though you disagree with them on other issues. People whom you consider reasonable, who acknowledge nuance and areas of uncertainty, and who argue in good faith.” (p. 171)

“Why was that disagreement so productive? Because even though Litterman was on the other side of the climate change issue, he was nevertheless someone with “instant credibility with people like me,” Taylor said later. “He is from Wall Street. He is kind of a soft Libertarian.” Knowing that you have intellectual common ground with someone makes you more receptive to their arguments right off the bat. It also makes it possible for them to explain their side in your “language.” (p. 174)

“Lincoln’s “team of rivals” is now a standard example cited in books and articles urging people to expose themselves to diverse opinions. “Lincoln self-consciously chose diverse people who could challenge his inclinations and test one another’s arguments in the interest of producing the most sensible judgments,” (p. 176)

“It’s an age-old rule of etiquette that you’re not supposed to make conversation about politics or religion. That’s because we all know that people’s political and religious views are often part of their identities. When someone criticizes a belief that’s part of your identity, it’s antagonizing. It’s like someone insulting your family or stomping on your country’s flag.” (p. 186)

“The more you’ve argued a position to other people, especially in public, the more it’s become linked to your ego and reputation, and the harder it is to abandon that position later.” (p. 197)

“The problem with identity is that it wrecks your ability to think clearly. Identifying with a belief makes you feel like you have to be ready to defend it, which motivates you to focus your attention on collecting evidence in its favor. Identity makes you reflexively reject arguments that feel like attacks on you or the status of your group.” (p. 197)

“Keep Your Identity Small,” by tech investor Paul Graham. In it, Graham pointed to the problem I described in the previous chapter and warned, “The more labels you have for yourself, the dumber they make you.” (p. 199)

“On the rare occasions when her experiments did yield significant results, she got excited. But then, “as a scientist must,” Blackmore recalls, “I repeated the experiment, checked for errors, redid the statistics, and varied the conditions, and every time either I found the error or got chance results again.” Eventually, she had to face the truth: that she might have been wrong all along, and that perhaps paranormal phenomena weren’t real.” (p. 214)

“That’s a common theme among people who are good at facing hard truths, changing their mind, taking criticism, and listening to opposing views. Scout mindset isn’t a chore they carry out grudgingly; it’s a deep personal value of theirs, something they take pride in.” (p. 215)

“But in the medium-to-long term, one of the biggest things you can do to change your thinking is to change the people you surround yourself with. We humans are social creatures, and our identities are shaped by our social circles, almost without our noticing.” (p. 219)

“At the end of the day, we’re a bunch of apes whose brains were optimized for defending ourselves and our tribes, not for doing unbiased evaluations of scientific evidence. So why get angry at humanity for not being uniformly great at something we didn’t evolve to be great at? Wouldn’t it make more sense to appreciate the ways in which we do transcend our genetic legacy?” (p. 231)
