Reading Richard Thaler’s “Nudge” opened my eyes to some kick-ass concepts, or at least gave me a different angle on some traditional ones. The first one is “choice architect” – a term coined by the authors for someone who is part statistician, part marketing expert. In a way, this summarizes the book very well, as a “nudge” refers to the choices we all make in our lives, from picking a different brand when doing groceries to voting for a certain political candidate. The choice, or lack thereof, comes into play when we are faced with multiple options. How those options are presented will determine your behavior and ultimately your choice.

“Nudge” talks about all the situations where research has shown that not all choices are presented equally. The book is heavy on the public policy aspect, so this isn’t necessarily a self-improvement book, but the underlying theories can certainly be applied if you are trying to gauge your manager’s appreciation of you or if you are trying to pitch your idea to someone.

One thing that the book establishes early on is that Homo economicus doesn’t exist in real life. This is probably obvious to everyone: we all barter as children, and it’s immediately clear that some people are more deliberate about their decisions than others and put their self-interest above everything else. There’s always emotion at play.

Richard Thaler and Cass Sunstein picked some modern examples to illustrate the concept: how items are displayed in a supermarket (the expensive ones at eye level, since those resonate more with us and we will pay a premium for them), how people never change their ringtones even though the phone comes with free alternatives, and so on.

“If private companies or public officials think that one policy produces better outcomes, they can greatly influence the outcome by choosing it as the default.”

The first part of the book explains why we are not really Homo economicus, even though all our economic models are based on that assumption. The reason for this is the interplay between our automatic system and our reflective system.

The automatic system is our instincts. It can be innate, like fear of spiders, or taught through repetition: think of athletes, or of how your driving improves with practice. People rely on it all the time, in familiar settings (playing a sport) and unfamiliar ones (running when you see fire). The problem with the automatic system is that it is easily fooled. Consider this puzzle:
In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake? _____ days.

Your automatic system tells you the answer is 24. Your reflective system should tell you it’s 47. The automatic system is designed to save you: it drives the fight-or-flight response. It’s part of our reptilian brain and it’s very useful at keeping us alive.
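A quick back-of-the-envelope simulation (my own sketch, not from the book) shows why the reflective answer is correct: halving the covered area undoes exactly one doubling, not half of the days.

```python
# Sanity-checking the lily-pad puzzle: the patch doubles every day
# and covers the whole lake on day 48.
full_day = 48
lake = 2 ** full_day                                   # lake area, in "day-0 patch" units
patch = {day: 2 ** day for day in range(full_day + 1)} # patch area on each day

# Find the day on which the patch covers exactly half the lake.
half_day = next(d for d, area in patch.items() if area == lake // 2)
print(half_day)  # 47 — one doubling before full coverage, not 24
```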

The reflective system is our deliberate thought process. Speaking your native language uses your automatic system; speaking a second one uses your reflective system – the rational part of the brain. Being truly bilingual means both languages use the automatic system. Fun fact: voters rely heavily on the automatic system (first impressions, appearance and so on). This is why we don’t elect people who are good at what they do, but people who seem like they will be good at it.

The most common techniques that choice architects use involve three sets of biases: anchoring, availability and similarity.


  • How happy are you? How often do you date?
    Anchoring means your answer to the second question will be influenced by the first one, regardless of whether the two are related. “I am pretty happy, and I’ve only been on 1 date in the past week – that’s pretty good!”
  • How often do you date? How happy are you?
    “I’ve only been on 1 date in the past week, I think that’s kinda lame, I guess that doesn’t make me very happy.” Here happiness is anchored in the frequency of dates the person had in the previous weeks. Anchoring is an involuntary comparison to a nearby reference point.


The second bias is availability: if people can think of relevant examples, they are far more likely to be frightened or concerned. This influences risk-related behavior, including both public and private decisions to take precautions. People assume tornadoes or earthquakes are much more dangerous than asthma attacks, even though asthma attacks have a far greater frequency (by a factor of 20 in this case). This affects people’s probability assessments: easily remembered events inflate their fears, and if no such event comes to mind, their judgments will be biased downwards.


How often does this happen, you might ask. Well, think of FUD – Fear, Uncertainty and Doubt – a classic disinformation strategy used in sales, marketing, public relations, media, politics, religious organizations and propaganda.
If you’ve ever heard a friend say that “Android is too complicated, an iPhone will not cause you any headaches”, that’s very close to a FUD campaign (personally I own both phones, this is just an example).



The third bias, similarity, is where stereotypes come from. When people are asked to judge something in relation to something else, they will reach for their past experiences. They will assume a tall black person is more likely to be a basketball player, even though there is no basis for that statement. Our minds try to find patterns and relate to past events or situations, so that we save time by not re-assessing each situation from scratch.

The last part of the book talks about a lot of other methods that companies or experts use to shape our decisions.

Peer pressure, for example, can be deployed as a choice-shaping method, and it is currently used for green causes. Simply informing people of what others are doing is enough of a nudge to change their choices: voters will vote for whoever they think others will vote for, creating a self-fulfilling prophecy. People react more favorably when shown how many others are already doing something than when threatened with sanctions. One campaign ran “70% of teens are tobacco free” and “tobacco kills” side by side to see which gave better results. Unsurprisingly, the former did.

“Nudge” has another example of peer pressure at work. Residents of a neighborhood received information about their energy consumption over the previous weeks, along with the average consumption in the neighborhood. Those above the average significantly decreased their consumption, but those below the average raised theirs – this is called the boomerang effect.

To solve this, the researchers added happy or sad faces to the feedback. This time, those below the average kept their consumption at the same level. Tweaking the feedback provided to the residents removed the negative effect of the first approach.

Priming: “If you ask people, one day before voting, whether they intend to vote, it will increase voter turnout by 25%. If people are drinking cold coffee, they are more likely to view others as selfish, cold and less sociable than those drinking hot coffee. If you see a lot of briefcases in your office, the environment becomes more competitive and less cooperative.”

Who is the author?

Richard Thaler is a well-known behavioral economist who writes extensively about different biases and strategies for avoiding them. “Nudge” is a fun read, filled with examples of what research in psychology can teach us about economics. So much so that a big part of the book is dedicated entirely to case studies where experts employed various techniques to try to solve large-scale problems.

If you are interested in learning more, check out Richard Thaler’s latest book, Misbehaving: The Making of Behavioral Economics. You can also watch an interesting interview of him with Malcolm Gladwell below.

"We do this because as human beings, we all are susceptible to a wide array of routine biases that can lead to an equally wide array of embarrassing blunders in education, personal finance, health care, mortgages and credit cards, happiness, and even the planet itself."