“Super Thinking,” by Gabriel Weinberg and Lauren McCann

Being Wrong Less

“The concept of inverse thinking can help you with the challenge of making good decisions. The inverse of being right more is being wrong less.”

“Or consider healthy eating. A direct approach would be to try to construct a healthy diet, perhaps by making more food at home with controlled ingredients. An inverse approach, by contrast, would be to try to avoid unhealthy options.”

“Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better.”

“The central mental model to help you become a chef with your thinking is arguing from first principles. It’s the practical starting point to being wrong less, and it means thinking from the bottom up, using basic building blocks of what you think is true to build sound (and sometimes new) conclusions. First principles are the group of self-evident assumptions that make up the foundation on which your conclusions rest—the ingredients in a recipe or the mathematical axioms that underpin a formula.”

“When arguing from first principles, you are deliberately starting from scratch. You are explicitly avoiding the potential trap of conventional wisdom, which could turn out to be wrong. Even if you end up in agreement with conventional wisdom, by taking the first-principles approach, you will gain a much deeper understanding of the subject at hand.”

“Ultimately, to be wrong less, you also need to be testing your assumptions in the real world, a process known as de-risking. There is risk that one or more of your assumptions are untrue, and so the conclusions you reach could also be false.” (p. 6)

“Once you identify the critical assumptions to de-risk, the next step is actually going out and testing these assumptions, proving or disproving them, and then adjusting your strategy appropriately.”

“Unfortunately, people often make the mistake of doing way too much work before testing assumptions in the real world. In computer science this trap is called premature optimization, where you tweak or perfect code or algorithms (optimize) too early (prematurely). If your assumptions turn out to be wrong, you’re going to have to throw out all that work, rendering it ultimately a waste of time.”
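
The fix is to measure before polishing. A minimal sketch in Python (the function names here are hypothetical placeholders, not from the book): profile the program first, and only then decide what, if anything, deserves optimizing.

```python
# A minimal sketch, not from the book: profile before optimizing.
# load_data and process are hypothetical placeholders.
import cProfile
import pstats

def load_data():
    return list(range(1_000_000))

def process(data):
    return sum(x * x for x in data)

profiler = cProfile.Profile()
profiler.runcall(lambda: process(load_data()))

# Only after seeing where the time actually goes do we know which code,
# if any, is worth optimizing.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```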

“Back in startup land, there is another mental model to help you test your assumptions, called minimum viable product, or MVP. The MVP is the product you are developing with just enough features, the minimum amount, to be feasibly, or viably, tested by real people.”
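
A hypothetical sketch of that loop in code (the feature names and the pivot heuristic are ours, not the book's): ship the smallest testable version, gather real feedback, and let it drive the next revision.

```python
# A hypothetical sketch of the MVP loop: ship minimal features,
# collect real-world feedback, then revise the plan.
from dataclasses import dataclass, field

@dataclass
class Mvp:
    features: list                        # just enough features to be viably tested
    feedback: list = field(default_factory=list)

    def test_with_users(self, responses):
        self.feedback.extend(responses)

    def keep_going(self) -> bool:
        # Crude placeholder heuristic: continue if most testers found value.
        positives = sum(1 for liked, _note in self.feedback if liked)
        return positives > len(self.feedback) / 2

mvp = Mvp(features=["search"])
mvp.test_with_users([(True, "search is useful"),
                     (False, "needs filters"),
                     (True, "faster than the old tool")])
print(mvp.keep_going())  # True: iterate on this plan rather than starting over
```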

“No matter the context, what they’re all saying is that your first plan is probably wrong. While it is the best starting point you have right now, you must revise it often based on the real-world feedback you receive. And we recommend doing as little work as possible before getting that real-world feedback.”

“Ockham’s razor helps here. It advises that the simplest explanation is most likely to be true. When you encounter competing explanations that plausibly explain a set of data equally well, you probably want to choose the simplest one to investigate first. This model is a razor because it “shaves off” unnecessary assumptions.”
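
A toy illustration of the razor in a model-fitting setting (our construction, not the book's): when models of different complexity explain the data about equally well, investigate the one with the fewest parameters first.

```python
# A toy example of a "razor" in model selection: among fits that explain
# the data about equally well, prefer the one with the fewest parameters.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2 * x + 1 + rng.normal(scale=0.05, size=x.size)  # truly linear data

for degree in (1, 3, 7):
    coeffs = np.polyfit(x, y, degree)
    residual = np.sum((np.polyval(coeffs, x) - y) ** 2)
    print(f"degree {degree}: {degree + 1} parameters, residual {residual:.4f}")

# All three fit about equally well; the razor says investigate the
# two-parameter line first.
```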

“If you’re trying to be as objective as possible when making a decision or solving a problem, you always want to account for your frame of reference. You will of course be influenced by your perspective, but you don’t want to be unknowingly influenced. And if you think you may not have the full understanding of a situation, then you must actively try to get it by looking from a variety of different frames of reference.”

“You can be nudged in a direction by a subtle word choice or other environmental cues. Restaurants will nudge you by highlighting certain dishes on menu inserts, by having servers verbally describe specials, or by just putting boxes around certain items. Retail stores and websites nudge you to purchase certain products by placing them where they are easier to see.”

“Another concept you will find useful when making purchasing decisions is anchoring, which describes your tendency to rely too heavily on first impressions when making decisions. You get anchored to the first piece of framing information you encounter. This tendency is commonly exploited by businesses when making offers.”

“More broadly, these mental models are all instances of a more general model, availability bias, which occurs when a bias, or distortion, creeps into your objective view of reality thanks to information recently made available to you.”

“With the rise of personalized recommendations and news feeds on the internet, availability bias has become a more and more pernicious problem. Online this model is called the filter bubble, a term coined by author Eli Pariser, who wrote a book on it with the same name. Because of availability bias, you’re likely to click on things you’re already familiar with, and so Google, Facebook, and many other companies tend to show you more of what they think you already know and like. Since there are only so many items they can show you—only so many links on page one of the search results—they therefore filter out links they think you are unlikely to click on, such as opposing viewpoints, effectively placing you in a bubble.”
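
A toy simulation of that mechanism (our construction; the topics and boost factor are arbitrary): every click makes similar items rank higher, and everything else drops off “page one.”

```python
# A toy filter-bubble simulation: click-driven ranking narrows the feed.
import random

random.seed(1)
topics = ["politics-left", "politics-right", "sports", "science", "cooking"]
weights = {t: 1.0 for t in topics}

for step in range(20):
    # Show only the top-ranked topics ("page one" of results).
    shown = sorted(topics, key=lambda t: weights[t], reverse=True)[:2]
    clicked = random.choice(shown)   # availability bias: we click the familiar
    weights[clicked] *= 1.5          # the feed reinforces what we clicked

print({t: round(w, 1) for t, w in weights.items()})
# After a few rounds, one or two topics dominate the feed: a bubble.
```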

“When you put many similar filter bubbles together, you get echo chambers, where the same ideas seem to bounce around the same groups of people, echoing around the collective chambers of these connected filter bubbles. Echo chambers result in increased partisanship, as people have less and less exposure to alternative viewpoints.”

“In any conflict between two people, there are two sides of the story. Then there is the third story, the story that a third, impartial observer would recount. Forcing yourself to think as an impartial observer can help you in any conflict situation, including difficult business negotiations and personal disagreements.”

“If you can coherently articulate other points of view, even those directly in conflict with your own, then you will be less likely to make biased or incorrect judgments. You will dramatically increase your empathy—your understanding of other people’s frames of reference—whether or not you agree.”

“Another tactical model that can help you empathize is the most respectful interpretation, or MRI. In any situation, you can explain a person’s behavior in many ways. MRI asks you to interpret the other parties’ actions in the most respectful way possible. It’s giving people the benefit of the doubt.”

“Hanlon’s razor: never attribute to malice that which is adequately explained by carelessness.”

“Hanlon’s razor is especially useful for navigating connections in the virtual world. For example, we have all misread situations online. Since the signals of body language and voice intonation are missing, harmless lines of text can be read in a negative way.”

The fundamental attribution error: “you frequently make errors by attributing others’ behaviors to their internal, or fundamental, motivations rather than external factors. You are guilty of the fundamental attribution error whenever you think someone was mean because she is mean rather than thinking she was just having a bad day.”

“You of course tend to view your own behavior in the opposite way, which is called self-serving bias. When you are the actor, you often have self-serving reasons for your behavior, but when you are the observer, you tend to blame the other’s intrinsic nature. (That’s why this model is also sometimes called actor-observer bias.)”

Kuhn’s paradigm shift model describes “how accepted scientific theories change over time. Instead of a gradual, evolving progression, Kuhn describes a bumpy, messy process in which initial problems with a scientific theory are either ignored or rationalized away. Eventually so many issues pile up that the scientific discipline in question is thrown into a crisis mode, and the paradigm shifts to a new explanation, entering a new stable era.”

“Individuals still hang on to old theories in the face of seemingly overwhelming evidence—it happens all the time in science and in life in general. The human tendency to gather and interpret new information in a biased way to confirm preexisting beliefs is called confirmation bias.”

“Confirmation bias is so hard to overcome that there is a related model called the backfire effect that describes the phenomenon of digging in further on a position when faced with clear evidence that disproves it. In other words, it often backfires when people try to change your mind with facts and figures, having the opposite effect on you than it should; you become more entrenched in the original, incorrect position, not less.”

“The pernicious effects of confirmation bias and related models can be explained by cognitive dissonance, the stress felt by holding two contradictory, dissonant, beliefs at once. Scientists have actually linked cognitive dissonance to a physical area in the brain that plays a role in helping you avoid aversive outcomes. Instead of dealing with the underlying cause of this stress—the fact that we might actually be wrong—we take the easy way out and rationalize the conflicting information away. It’s a survival instinct!”

“First, consider thinking gray, a concept we learned from Steven Sample’s book The Contrarian’s Guide to Leadership. You may think about issues in terms of black and white, but the truth is somewhere in between, a shade of gray.”

“The essence of thinking gray is this: don’t form an opinion about an important matter until you’ve heard all the relevant facts and arguments, or until circumstances force you to form an opinion without recourse to all the facts (which happens occasionally, but much less frequently than one might imagine). F. Scott Fitzgerald once described something similar to thinking gray when he observed that the test of a first-rate mind is the ability to hold two opposing thoughts at the same time while still retaining the ability to function.”

“This model is powerful because it forces you to be patient. By delaying decision making, you avoid confirmation bias since you haven’t yet made a decision to confirm! It can be difficult to think gray because all the nuance and different points of view can cause cognitive dissonance. However, it is worth fighting through that dissonance to get closer to the objective truth.”

“More broadly, playing the Devil’s advocate means taking up an opposing side of an argument, even if it is one you don’t agree with. One approach is to force yourself literally to write down different cases for a given decision or appoint different members in a group to do so. Another, more effective approach is to proactively include people in a decision-making process who are known to hold opposing viewpoints. Doing so will help everyone involved more easily see the strength in other perspectives and force you to craft a more compelling argument in favor of what you believe. As Charlie Munger says, ‘I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.’”

“When something happens, the proximate cause is the thing that immediately caused it to happen. In the case of the Challenger, the Rogers Commission Report showed that the proximate cause was an O-ring seal failing in the cold weather, allowing hot gases to escape and ignite the external fuel tank. The root cause, by contrast, is what you might call the real reason something happened. People’s explanations for their behavior are no different: anyone can give you a reason for their behavior, but that might not be the real reason they did something.”

“As part of its work, the commission conducted a postmortem. In medicine, a postmortem is an examination of a dead body to determine the root cause of death.”

“One technique commonly used in postmortems is called 5 Whys, where you keep asking the question “Why did that happen?” until you reach the root causes.”
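
A minimal sketch of a 5 Whys walk-through, with a hypothetical incident chain (the book describes only the questioning technique itself):

```python
# A minimal 5 Whys sketch. The incident and answers are hypothetical.
why_chain = [
    "The website went down.",                             # the incident
    "The application server ran out of memory.",          # why #1
    "A cache grew without bound.",                        # why #2
    "Cache entries were never evicted.",                  # why #3
    "No eviction policy was ever specified.",             # why #4
    "There is no checklist for reviewing cache configs.", # why #5: a root cause
]

for depth, answer in enumerate(why_chain):
    prefix = "Incident" if depth == 0 else f"Why #{depth}"
    print(f"{prefix}: {answer}")
```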

“Sometimes you may want something to be true so badly that you fool yourself into thinking it is likely to be true. This feeling is known as optimistic probability bias, because you are too optimistic about the probability of success. NASA managers were way too optimistic about the probability of success, whereas the engineers who were closer to the analysis were much more on target.”
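
The gap between those two views is easy to quantify. Feynman’s appendix to the Rogers Commission report famously cited per-flight risk estimates ranging from roughly 1 in 100 (working engineers) to 1 in 100,000 (management); a short calculation shows what each implies over the 25 shuttle flights up to and including Challenger’s:

```python
# What each risk estimate implies over 25 flights (Challenger was the
# 25th shuttle mission). Figures are the oft-cited range from Feynman's
# appendix to the Rogers Commission report.
def prob_at_least_one_failure(p_per_flight: float, n_flights: int) -> float:
    return 1 - (1 - p_per_flight) ** n_flights

for label, p in [("management, ~1/100,000", 1e-5), ("engineers, ~1/100", 1e-2)]:
    print(f"{label}: P(failure in 25 flights) = "
          f"{prob_at_least_one_failure(p, 25):.2%}")
# management: ~0.02%; engineers: ~22%.
```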
