Think Like a Freak Book Summary, by Steven Levitt and Stephen Dubner

Want to get the main points of Think Like a Freak in 20 minutes or less? Read the world’s #1 book summary of Think Like a Freak by Steven Levitt and Stephen Dubner here.

Steven Levitt and Stephen Dubner have built the Freakonomics brand by demonstrating the counterintuitive power of incentives in a wide variety of circumstances. They wrote Think Like a Freak to convey how to think in this economics-style manner.

The overall premise of the book is that there is rarely one right way to think. The complexity of the modern world requires that we think about problems and situations from very different angles, such as that of a child.

Principle

  • Most people put their private benefit ahead of the public or group benefit, even when they claim otherwise. This can mean anything from manipulating grades to improve class test scores to choosing a safe path that avoids humiliation.

Example

  • Only 17% of soccer players shoot a penalty kick down the middle of the goal, even though goalkeepers remain in position to stop such a shot only 2-3% of the time. One hypothesis for this discrepancy is that when a penalty kick down the middle fails, the kicker looks very stupid, would likely feel humiliated, and might worry about being attacked by the fans. If they shoot a normal penalty kick toward a corner and barely miss, or the keeper gets lucky and guesses right, they may have chosen a suboptimal answer for their team, but they reduce the personal downside of getting it wrong.
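
The kicker's tradeoff can be framed as a simple expected-value comparison. Note that only the 17% center-kick rate and the 2-3% stay-in-place rate come from the book; every other probability below is a hypothetical assumption, chosen just to sketch the shape of the decision.

```python
# Illustrative expected-value comparison for a penalty kicker.
# Only the goalie's ~2% "stays centered" rate is from the book;
# every other probability is a hypothetical assumption.

def goal_probability(p_keeper_covers, p_miss_target):
    """Chance of scoring: the keeper must not cover the shot,
    and the kicker must not miss the target outright."""
    return (1 - p_keeper_covers) * (1 - p_miss_target)

# Center shot: the keeper stays home only ~2% of the time,
# and aiming at the middle is easy (assume a 5% miss rate).
p_center = goal_probability(p_keeper_covers=0.02, p_miss_target=0.05)

# Corner shot: assume the keeper dives to the correct corner and
# saves it 25% of the time overall, and aiming for the corner
# risks missing wide (assume a 15% miss rate).
p_corner = goal_probability(p_keeper_covers=0.25, p_miss_target=0.15)

print(f"center: {p_center:.2%}, corner: {p_corner:.2%}")
```

Under these assumed numbers the center shot scores far more often, yet most kickers still aim for a corner: the team-optimal choice loses to the personally safe one.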

The “I Don’t Know” Principles

  • The hardest words to say in English are “I don’t know.” Most people in the West have a strong cultural bias against uttering these words because they believe it makes them look foolish. However, saying “I don’t know” can be reframed as a new beginning: the start of a journey to find the true answer rather than grasping for a false or incomplete one.
  • If people continually pretend to know something in order to avoid admitting ignorance and looking foolish, this might become its own reality over time.
  • Smart people are typically very confident in their intelligence due to their ability to reason and domain knowledge. Combined with the cultural inability to admit ignorance or doubt, this can create a deadly combination of being cocky + wrong. The solution to this problem is often simply admitting that you don’t know. With very complex problems or future predictions, few to no people can actually know for certain.
  • People act this way because of an asymmetric cost: admitting ignorance makes one look dumb in the moment, while a smart-sounding prediction cannot be proved right or wrong until some time in the future. People’s memories are short, and they are liable to forget a failed prediction. On the other hand, if one’s prediction happens to be right, they can then champion it as evidence of how great they are. Put together, these factors lead people to speak assertively about matters they don’t fully understand and cannot predict.
  • One way to achieve understanding is to test whether your ideas fail, that is, to try to prove yourself wrong. The randomized controlled trial has long been a gold standard of testing in science.
  • One strategy for handling questions that you would otherwise have to bluff through is to say “I don’t know, but maybe I can find out.” In many situations, this is the most logical approach. An advantage of doing this is that if you do choose to bluff and make up an answer at some point, people who know your reputation will be more likely to believe you, because you’ve admitted you didn’t know all those other times.
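
The testing bullet above can be made concrete with a toy randomized controlled trial: randomly assign subjects to treatment and control, apply the intervention to one group only, and compare average outcomes. This is a generic illustration, not an experiment from the book; the baseline score, effect size, and noise level are invented.

```python
# Toy randomized controlled trial: randomly assign subjects,
# treat one group, and compare mean outcomes between groups.
# The baseline, effect size, and noise are invented for illustration.
import random

random.seed(42)

subjects = list(range(200))
random.shuffle(subjects)                 # random assignment is the key step
treatment, control = subjects[:100], subjects[100:]

def outcome(treated):
    base = random.gauss(50, 10)          # noisy baseline score
    return base + (5 if treated else 0)  # assumed true effect of +5

treated_scores = [outcome(True) for _ in treatment]
control_scores = [outcome(False) for _ in control]

effect = (sum(treated_scores) / len(treated_scores)
          - sum(control_scores) / len(control_scores))
print(f"estimated effect: {effect:.1f}")
```

Because assignment is random, the difference in group means estimates the causal effect; if the estimate came out near zero, you would have succeeded in proving your idea wrong.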

Example

  • The world is chock full of “experts” in many fields who claim to know how the future will unfold. When their predictions in politics and the stock market were actually tracked over a long period of time, they turned out to be wrong about as often as they were right.
  • The rise and decline in suicide rates over time and place is a very complex and difficult problem. David Lester, a leading psychology professor in the field, freely admits he doesn’t know why. He has theories, but nothing concrete.

Redefining the Question Principles

  • Given it’s so difficult to admit we don’t know the answer, it’s naturally even harder to admit that we don’t even know the right question. One way to help approach this is to try to unpack all of the variables and test each one.
  • Sometimes the way a problem is instinctively framed makes it difficult to really look for other possibilities. Identifying a root cause can help, but often the root cause is obscured by time or distance. For instance, regions in Germany, a nation of great tradition, have different incomes today depending on whether or not they embraced the Protestant Reformation centuries ago; the concept of the Protestant work ethic was empirically confirmed by these regional income differences.

Examples

  • Many education theories have been floated which analyze all kinds of different variables in the school system. However, it’s quite possible that true education reform has to begin in the home. Given that parents have far more time with their kids than schools do, they may play a larger role than any school reform could. Yet education reformers almost never pursue this possibility, such as by training parents, and thus restrict the range of possible answers to within the school.
  • The hot dog eating world champion Takeru Kobayashi ate 50 hot dogs and buns in his first entry, compared to the previous record of 25 ¼. This shocking result came because Kobayashi radically changed his eating strategy. He arrived at this strategy by experimenting with all kinds of different variables and rigorously tracking the results. In doing so, he redefined the problem of how to optimally eat the hot dogs, while his rivals merely tried to stuff them down in a quicker version of how people normally eat. He also trained himself not to pay attention to the perceived barrier of 25 ¼.
  • When you go to do an exercise like pushups, the framing of how much you expect to do affects when you start feeling tired. The trick is to frame your expected amount at a higher level than normal.
  • Some solutions may not easily be considered because the actual answer is too revolting. For instance, doctors studying ulcers finally found the root cause and a potential solution: giving patients healthy gut bacteria. A potentially good source for this was fecal transplants, which is not the first solution that comes to anyone’s mind.

Principles of Thinking Like a Child

  • Children have many valuable habits of thinking worth mimicking, no matter how silly they may feel or look. Children are less hindered by preconceptions and are more likely to find questions or answers adults might instinctively skip over or ignore. The reason this is effective is that the risk of thinking up a wrong idea is very small; it is acting on it that can be costly, which is why one should maintain a careful distance between thinking and acting here.
  • Thinking about and trying to solve big problems is often impractical and usually ends in a lot of wasted effort. Isaac Newton, for instance, recommended focusing on a small area one can actually understand rather than trying to understand all of nature. Small questions are less attractive to ask, and thus there is less competition in trying to solve them.
  • Don’t be afraid to state the obvious. Just as people are scared to say “I don’t know,” many people are afraid to ask simple questions.
  • Lastly, but most importantly, don’t be afraid to have fun and play around with your ideas. A key way to improve one’s ability at any activity is to simply enjoy doing it. So if a reader wants to improve their ability to think, a great method is to foster their joy of thinking.

Examples

  • A group of Chinese researchers tried a simple experiment on education in a poor frontier region of China: they gave out free glasses to half the students. With just a $15 investment per student, the students who received glasses performed 25-50% better on test scores, a phenomenal jump for such a cheap experiment.
  • A child’s question: if driving drunk is dangerous, is drunk walking also dangerous?
  • Magicians find children harder to fool than adults. Kids are relatively free from the assumptions that magicians exploit in adults. Additionally, magicians hold their cards or objects at a height calibrated to the angle at which adults naturally watch their hands. Because children are lower to the ground, they see the trick from a different, unexpected angle. Kids also don’t overthink the problem. These two lessons carry over to thinking as a whole.

Incentive Principles

  • People respond to all kinds of incentives, even in seemingly unexpected or uncorrelated ways.  Both the size and type of incentive can change behavior. Many times people are unaware of what really drives them, and often they do not publicly admit what changes their actions.
  • You need to figure out what people really care about, and supply that to them in a way that is cheap for you and valuable for them.
  • Moral incentives are often a lot weaker than most people expect, while incentives of peer perception work surprisingly well. Many messages achieve counterproductive results because they normalize an action and offer the comfort of knowing others are doing it. For example, raising awareness about teen pregnancy or drunk driving can signal that such behavior is common, making it seem legitimate.
  • Incentives can be used to switch the frame people view something from adversarial to cooperative.
  • Once incentives are put into play, people will try almost anything to game the system.
  • Guilty and innocent people react differently to certain stimuli. Thus one can create a situation where they reveal their true nature by tricking them with the right incentive.
  • Incentives also can create hidden second and third order consequences that are not initially obvious. For instance, if people are given access to free health care, will that change their behavior about how often they use those services or partake in activities that might require them?

Examples

  • An economist went to Las Vegas on holiday and asked a beautiful woman if she would sleep with him for a million dollars. She agreed, even though he wasn’t attractive. He then asked if she would be willing to do so for $100, at which she yelled, “What do you think I am, a prostitute?” He clarified that this was already established, and now they were just haggling over the price.
  • It isn’t obvious what criteria matter most in getting people to conserve energy at home. When asked during an innovative study, people ranked protecting the environment as most important, benefiting society second, saving money third, and the fact that many others do it last. After collecting this data, the researchers sent out placards to people in the experiment, with different versions targeted at each of the above rationales, plus a control group. They then measured energy usage in homes to see how behaviors changed. The clear winner was the sign that said “Join your neighbors.” That is, people’s actual behavior was the opposite of what they claimed: they took an action because of social proof.
  • Brian Mullaney ran a non-profit called Smile Train. His fund-raising strategy was to use social pressure to psychologically bully people into giving, while offering them a perceived path to permanent relief. The mailing material not only had pictures of disfigured children in need of surgery, it also contained a promise never to ask for another donation if the recipient gave one this time. Non-profits had never considered this tactic because it was already so costly to acquire donors in the first place, but the novelty won Smile Train many new donors. Even though Smile Train got in the door by promising never to ask again, only a third of the recipients actually checked that box on the form. The tactic improved donations by 46%.
  • King Solomon and David Lee Roth of the band Van Halen both used incentives in innovative ways to separate the guilty from the innocent. When King Solomon was presented with two women who each claimed a child as their own, he threatened to cut the baby in half and give half to each woman. The true mother, who loved the baby, was the one who pleaded with him to spare the child and give him to the other woman, which revealed to Solomon that she was the real mother. Roth’s band had many technical requirements they had to trust their promoters to get right. To test whether a promoter was trustworthy, they buried a requirement for no brown M&Ms within their 53-page rider. Promoters who failed to sift out the brown M&Ms automatically revealed themselves as not attending to the details, and thus as putting everyone involved at risk.
  • Nigerian scam letters are designed as a time-saving mechanism to automatically weed out anyone savvy enough to not fall for it. So only the very ignorant or foolish would actually take the time to respond to such an outlandish letter, and those are the exact people the scammers are targeting.
  • The company Zappos, which hires call-center workers at $11/hour, gives candidates it is ready to hire a final choice: accept the job, or accept $2,000 in exchange for never being able to work there. This helps weed out people they wouldn’t want anyway, at a much cheaper price than hiring and then firing them.
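
The Zappos offer is a back-of-the-envelope bet that a bad hire costs far more than $2,000. The $2,000 buyout is from the book; every cost figure below is a hypothetical assumption used only to show why the bet can pay off.

```python
# Back-of-the-envelope comparison of Zappos's $2,000 quit offer
# versus hiring and later firing a poor fit.
# All cost figures except the $2,000 offer are hypothetical.

buyout = 2_000                 # the real offer described above

# Assumed costs of a bad hire who must later be let go:
training_cost = 4_000          # weeks of paid training
lost_productivity = 3_000      # mistakes and low output on the job
replacement_hiring = 2_500     # recruiting and onboarding a replacement

bad_hire_cost = training_cost + lost_productivity + replacement_hiring
savings = bad_hire_cost - buyout
print(f"paying the buyout saves roughly ${savings:,} per bad hire")
```

As long as the all-in cost of a bad hire exceeds the buyout, paying uncommitted candidates to leave is the cheaper screen.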

How to Persuade People Entrenched in their Views

  • Smarter and more educated people are more likely to hold extreme views of one kind or another. For instance, they might automatically reject or embrace global warming theory. People can easily trick themselves because their confidence level is frequently uncorrelated with their actual competence in the matter at hand. Their higher levels of intelligence and education allow them to better justify their already established position. Accordingly, people with entrenched views are not going to easily change them.
  • When faced with such people, it is better to approach them subtly rather than attacking their arguments with guns blazing. Just as the incentives chapter observed, advocating for behavior change through moral exhortation is a losing strategy. Most people base their beliefs on ideology and social proof, or herd thinking. No matter how sound your argument may be, it is useless if the other party does not change their behavior because of it.
  • To undermine their position subtly, you should first embrace their argument. Not only will this make them feel engaged with you, but you can learn what matters to them and use it to strengthen your own case. Once that connection is established, you can incorporate the strengths of their argument into an overarching story that weaves in the subtleties of your own position. This story cannot be a banal anecdote; it must convey the nuance of your position in an evocative way that they will easily remember.

To Quit or Not

  • Many people are too willing to keep the status quo, often out of social pressure or the psychological sunk-cost fallacy. Many social narratives in the West shame the idea of quitting halfway, pointing to examples like Winston Churchill’s steadfast refusal to give in to Hitler in World War II. However, these plausible narratives often conceal half-truths: Churchill quit many things in his life, such as his political party (twice) and the government itself. Looked at from another perspective, practicing quitting when the costs are relatively low can train you to recognize, as Churchill did, the key times when you should truly stick it out.
  • While many organizations practice postmortem analysis, a premortem is also a useful tool. A premortem brings together everyone involved in a project, has them imagine the project failed, and then has them write exactly why it failed. This specific process can bring forth doubts people may have but are unwilling to verbally say for fear of being associated with failure, doubt, or uncertainty.
