From Adam Grant’s “Think Again”

Most People Think and Talk Like a Preacher, Prosecutor, or Politician. This Is Not Good!

  • “As we think and talk, we often slip into the mindsets of three different professions: preachers, prosecutors, and politicians.” (p. 18)


  • “The risk is that we become so wrapped up in preaching that we’re right, prosecuting others who are wrong, and politicking for support that we don’t bother to rethink our own views.” (p. 18)


  • Preacher Mode – “We go into preacher mode when our sacred beliefs are in jeopardy: we deliver sermons to protect and promote our ideals.”


  • Prosecutor Mode – “We enter prosecutor mode when we recognize flaws in other people’s reasoning: we marshal arguments to prove them wrong and win our case.”


  • Politician Mode – “We shift into politician mode when we’re seeking to win over an audience: we campaign and lobby for the approval of our constituents.”





If You Want to Get Closer to the Truth, Think Like a Scientist.

  • “Think like a scientist. When you start forming an opinion, resist the temptation to preach, prosecute, or politick. Treat your emerging view as a hunch or a hypothesis and test it with data.”


  • “If you’re a scientist by trade, rethinking is fundamental to your profession. You’re paid to be constantly aware of the limits of your understanding. You’re expected to doubt what you know, be curious about what you don’t know, and update your views based on new data.” (p. 19)


  • “being a scientist is not just a profession. It’s a frame of mind—a mode of thinking that differs from preaching, prosecuting, and politicking. We move into scientist mode when we’re searching for the truth: we run experiments to test hypotheses and discover knowledge.” (p. 20)


  • “Thinking like a scientist... means being actively open-minded. It requires searching for reasons why we might be wrong—not for reasons why we must be right—and revising our views based on what we learn.” (p. 25)


  • “Skeptics have a healthy scientific stance: They don’t believe everything they see, hear, or read. They ask critical questions and update their thinking as they gain access to new information.” (p. 169)


  • “Deniers are in the dismissive camp, locked in preacher, prosecutor, or politician mode: They don’t believe anything that comes from the other side. They ignore or twist facts to support their predetermined conclusions.” (p. 169)


  • “Scientific thinking favors humility over pride, doubt over certainty, curiosity over closure.” (p. 28)



People with High IQs Can Struggle to Update Their Beliefs.

  • “Research reveals that the higher you score on an IQ test, the more likely you are to fall for stereotypes, because you’re faster at recognizing patterns. And recent experiments suggest that the smarter you are, the more you might struggle to update your beliefs.” (p. 24)


  • “The better you are at crunching numbers, the more spectacularly you fail at analyzing patterns that contradict your views.” (p. 24)


  • “In psychology there are at least two biases that drive this pattern. One is confirmation bias: seeing what we expect to see. The other is desirability bias: seeing what we want to see.” (p. 25)


  • “The tragedy is that we’re usually unaware of the resulting flaws in our thinking. My favorite bias is the “I’m not biased” bias, in which people believe they’re more objective than others. It turns out that smart people are more likely to fall into this trap.” (p. 25)





Other Good Quotes:

“When we’re in scientist mode, we refuse to let our ideas become ideologies. We don’t start with answers or solutions; we lead with questions and puzzles. We don’t preach from intuition; we teach from evidence. We don’t just have healthy skepticism about other people’s arguments; we dare to disagree with our own arguments.” (p. 25)

“As I’ve studied the process of rethinking, I’ve found that it often unfolds in a cycle. It starts with intellectual humility—knowing what we don’t know. We should all be able to make a long list of areas where we’re ignorant.” (p. 27)

“being a professional scientist doesn’t guarantee that someone will use the tools of their training. Scientists morph into preachers when they present their pet theories as gospel and treat thoughtful critiques as sacrilege.” (p. 22)

“Intelligence is traditionally viewed as the ability to think and learn. Yet in a turbulent world, there’s another set of cognitive skills that might matter more: the ability to rethink and unlearn.” (p. 2)

“We question the judgment of experts whenever we seek out a second opinion on a medical diagnosis. Unfortunately, when it comes to our own knowledge and opinions, we often favor feeling right over being right. In everyday life, we make many diagnoses of our own, ranging from whom we hire to whom we marry. We need to develop the habit of forming our own second opinions.” (p. 18)

“No matter how much brainpower you have, if you lack the motivation to change your mind, you’ll miss many occasions to think again.” (p. 24)

“Invite Others to Question Your Thinking. Build a challenge network, not just a support network. It’s helpful to have cheerleaders encouraging you, but you also need critics to challenge you. Who are your most thoughtful critics? Once you’ve identified them, invite them to question your thinking. To make sure they know you’re open to dissenting views, tell them why you respect their pushback—and where they usually add the most value.”

“We all have blind spots in our knowledge and opinions. The bad news is that they can leave us blind to our blindness, which gives us false confidence in our judgment and prevents us from rethinking.” (p. 35)

“If we care about accuracy, we can’t afford to have blind spots. To get an accurate picture of our knowledge and skills, it can help to assess ourselves like scientists looking through a microscope.” (p. 48)

“Our opinions can become so sacred that we grow hostile to the mere thought of being wrong, and the totalitarian ego leaps in to silence counterarguments, squash contrary evidence, and close the door on learning.” (p. 63)

“Don’t confuse confidence with competence. The Dunning-Kruger effect is a good reminder that the better you think you are, the greater the risk that you’re overestimating yourself—and the greater the odds that you’ll stop improving. To prevent overconfidence in your knowledge, reflect on how well you can explain a given subject.”

“Seek out information that goes against your views. You can fight confirmation bias, burst filter bubbles, and escape echo chambers by actively engaging with ideas that challenge your assumptions. An easy place to start is to follow people who make you think—even if you usually disagree with what they think.”

“That’s where the best forecasters excelled: they were eager to think again. They saw their opinions more as hunches than as truths—as possibilities to entertain rather than facts to embrace. They questioned ideas before accepting them, and they were willing to keep questioning them even after accepting them. They were constantly seeking new information and better evidence—especially disconfirming evidence.” (p. 67)

“I’d say the students who enjoyed the experience had a mindset similar to that of great scientists and superforecasters. They saw challenges to their opinions as an exciting opportunity to develop and evolve their thinking. The students who found it stressful didn’t know how to detach. Their opinions were their identities. An assault on their worldviews was a threat to their very sense of self. Their inner dictator rushed in to protect them.” (p. 74)

“Every time we encounter new information, we have a choice. We can attach our opinions to our identities and stand our ground in the stubbornness of preaching and prosecuting. Or we can operate more like scientists, defining ourselves as people committed to the pursuit of truth—even if it means proving our own views wrong.” (p. 76)

“Over the following year, the startups in the control group averaged under $300 in revenue. The startups in the scientific thinking group averaged over $12,000 in revenue.” (p. 21)

“The entrepreneurs who had been taught to think like scientists, in contrast, pivoted more than twice as often. When their hypotheses weren’t supported, they knew it was time to rethink their business models.” (p. 21)