Ambiguity Aversion
We prefer the things we know to those we don’t — or known risks over unknown ones.
Anchoring & Adjustment
Humans tend to rely too much on the first information given about a topic — many of the examples you will see and hear from The Brainy Business use numbers, but this can also be seen in many other ways.
Baader-Meinhof Phenomenon
This is the illusion that once something has come to your attention (a word, a phrase, an image), it seems to be everywhere. For example, when you are thinking about buying a blue car, you suddenly start seeing blue cars everywhere.
Backfire Effect
When we press too hard to get someone to change their mind using logic, it can backfire and cause them to dig in their heels even more. Coming in too hard with data can create an even bigger divide with that person, one that is harder to overcome.
Bias (Cognitive)
A cognitive bias is the “error” in decision making that occurs when our brains have a plan and the result of the choice is a little off. We are biased to avoid losses, to herd, or to prefer the status quo. The concepts on this list are all biases in decision making (or arise because of biased decision making) and part of what makes us human. As noted in What Your Employees Need and Can't Tell You, because our brain’s decision making runs on bias (and we need to make our decisions that way in order to survive), an “unbiased workplace” is an impossible, unachievable goal. It is better to understand and leverage bias to help it work for us.
Bikeshedding
Also known as Parkinson’s Law of Triviality. The tendency of people to dedicate an inordinate amount of time to something trivial (that feels incredibly important) while letting crucial items go unattended. The concept got its name “bikeshedding” because of Parkinson’s observation of a group that was designing a nuclear power plant, but spent a ridiculous amount of time designing the bike shed (instead of the much more scary and important work of the plant itself).
Blessing of the Uninformed
My reframed name for the curse of knowledge, to help it be seen as a positive thing: when you look at the other side of the equation, it can help your company, communication, and brand be more relatable.
Bystander Effect
The greater the number of people around when something bad is going on (someone is being bullied, there is misconduct at work, a crime is taking place, someone is having a heart attack), the less likely anyone is to take action. In essence, everyone assumes someone else is doing/will do something (because it is so obvious) so they don’t need to/shouldn’t — and therefore no one does anything.
Choice Architecture / Choice Architect
The way decision sets or choices are presented to people is their choice architecture. The person (or entity) presenting the options is the choice architect. Whether you think about it (or want the responsibility) or not, the way you present the options absolutely impacts the decision someone makes, so it is better to be thoughtful about it.
Cobra Effect
When providing an incentive, it is important to look for the loopholes that someone might use to take advantage of it. While incentives can be great, “no loophole goes unexploited” and often the loopholes are hard to see when we are setting up the incentive.
This got its name because when the British ruled India, Delhi was infested with cobras 🐍 — they put a bounty on cobra skins to fix this...but as they were paying out incentives and the problem wasn’t getting better, they found that some people were raising cobras to kill and turn in for the bounty. The program was ended — and everyone let their cobras go free, which made the problem worse.
Cognitive Load / Cognitive Stress
Our brains are amazing, but they tend to get overwhelmed easily. When the conscious brain (System 2) is bogged down with too much information (much less than you think), that cognitive load/stress will cause an increased reliance on the subconscious (System 1) for more and more (even very important things). Because the subconscious has a preference for the status quo, someone with an increased cognitive load is less likely to be receptive to changes (even small ones, and even ones that are “obviously better”).
Cold-Hot Empathy Gap
When we are in a cold state (i.e. not in the “heat of the moment”) we like to think we will have more control and better decision making when the crucial moment comes. There is a gap between what we predict/hope will happen while in a cold state and what actually happens in the hot state. For example, smokers underestimate their cravings to smoke while they are in a cold state, and someone may be confident they will be able to resist dessert — until the menu arrives and everyone else is ordering. Also see Hot-Cold Empathy Gap.
Confirmation Bias
Our brains like to prove themselves right and are constantly filtering the information we see in order to make decisions. Because of this, we are wired to search for information in a biased way, and look for validation of what we already believe to be true (and conveniently ignore information that goes against our beliefs). For example, searching for “Why astrology is bogus” will give you different results than “Why astrology is real” or “Is astrology bogus?” or “Examples for when astrology was proven right” — you get the point.
In the workplace, if you approach a “difficult” person with the confirmation bias that they are “difficult,” that is what you will focus on and continue to see. If you reframe that confirmation bias, it can work for you.
Counterfactual Thinking
These are our “what if” and “if only” thoughts, when we look back and wonder what might have happened if only something had been different. There are many categories of counterfactual thinking (including prefactual thinking, where we look ahead), but for our purposes here it is important to know that this can either become a vicious cycle (where people get stuck in what might have been and beat themselves up) or a way to motivate better outcomes (a virtuous cycle). One key way to break the cycle of negative thinking is to say, “Next time I’ll...”
Curse of Knowledge
Because you know a lot about your area of expertise, it can be extremely difficult to think about problems from the perspective of those who know less than you. In addition to realizing that you can’t unknow what you know (and that it is very hard to sit in the seat of those who know less than you), consider the flip side: there is a blessing of the uninformed, where you can gain insight from new customers, employees, or vendors before they “know too much” and are stuck in your curse of knowledge.
Default
Whatever choice is presented as the default is more likely to be chosen. If you have a setting where people have to opt in to participate, you will have far fewer participants than if they have to opt out (regardless of whether they like the idea more or not). Consider the default settings on your phone for its ringtone, or how quickly your screen saver comes on...we accept defaults for many of our choices.
Dopamine
Part of the DOSE brain chemicals (our four most common “happy” chemicals): dopamine, oxytocin, serotonin, and endorphins. Dopamine is key for motivation as it is released most when we anticipate something. Think about playing a game when you aren’t sure if you’ll win or watching a movie and waiting to know how the story will end — the joy is in the anticipation (when the dopamine is released) and dissipates once you know the result (dopamine stops being released).
Dunning-Kruger Effect
The skewed relationship between confidence and competence. When someone has very low competence they are incredibly confident (known as the “peak of mount stupid”). Once they gain a little competence, they realize how little they actually know and have a huge decrease in confidence (known as falling into the “valley of despair”). As they gain competence their confidence grows (up the “slope of enlightenment”) and their confidence might never reach the original levels (even as they have much more competence than in those early stages on the peak of mount stupid).
Think of someone who graduates from high school knowing everything, but has a rude awakening upon starting college. This is also common when people start new jobs. It is important to know where you and the people you work with are on the Dunning-Kruger curve throughout your working relationship, so you know how best to communicate with them (and understand your own reactions to things).
Economics of Trust
Equations presented by Stephen M.R. Covey in his book The Speed of Trust. In essence, when there is an increase in trust, things can go faster and cost less. When there is a decrease in trust, things are slower and cost more.
More Trust = More Speed + Less Cost
Less Trust = Less Speed + More Cost
Effort Heuristic
When someone puts more work into something, we value it more. This applies to our own effort, and also when we believe someone else put more effort in — we think it is worth more. This doesn’t necessarily mean we like it any more, but we believe it to be more valuable. For example, when someone thought a poem took 4 hours to write, they thought it was worth less than when they believed it took 20 hours (similar effects have been found with paintings, suits of armor, and many other things). See the related concept of the IKEA Effect.
Egocentric Bias
Because we are aware of our own effort going into any action, we often feel like we do more than others. For example, someone on the team believes they put in more time than their coworkers, or a spouse believes they do more housework than their significant other. The interesting thing is, both people tend to feel this way, and because of our naïve cynicism, we also expect other people to have this bias more than we do.
Elephant and the Rider
Way of thinking about our brain as proposed by NYU psychologist Jonathan Haidt. Imagine your brain as a person riding an elephant. The logical, conscious rider (System 2) has a plan and knows the best way to get there. Unfortunately, it is at the mercy of the subconscious elephant (System 1). If the elephant wants to stay put or run in a different direction, the rider can’t pull or push or logic it into going where it wants; it is at the mercy of the elephant. When we create programs at work and in life, we like to think that the rider is in charge and we are communicating with other riders. In reality, the elephant makes the vast majority of the decisions, so it is best to focus on motivating and understanding the elephant (this is what the field of applied behavioral economics is all about).
Endorphins
Part of the DOSE brain chemicals (our four most common “happy” chemicals): dopamine, oxytocin, serotonin, and endorphins. Endorphins mask pain and are tied to that well-known “runner’s high”.
Endowment Effect
We value things we own more than things we do not, even if they were given to us arbitrarily (and we wouldn’t have chosen them over something else). For example, imagine I give one half of the room $2 and the other half a lottery ticket that would have cost $2. When people are asked if they want to trade, very few do because they have painted the best picture for the thing they have been endowed with and it skews their view. Those with the lottery ticket think about how upset they would be if they traded and it was a winner, and those with the cash think about the odds and that the lottery ticket is likely a worthless piece of paper (and they both would have used the same logic if they had been part of the other group).
Extrinsic Motivation
This is when people are motivated by rewards that are outside of themselves. This could include tangible stuff like money or intangibles like fame. Also see Intrinsic Motivation and Type I.
False Consensus Effect
This is our tendency to overestimate how much other people agree with us.
False Uniqueness Bias
This is our tendency to believe we are more unique than we really are (or more alike to others than we really are). When the trait in question is positive, we will underestimate how many others do it. When the trait is negative, we overestimate how many others do it. For example, if you always complete reviews for your employees on time, you will underestimate how many other managers do the same. If you are always late on completing reviews, you will rationalize by overestimating how many other managers do the same (i.e. it isn’t that big of a deal because everyone does it).
Familiarity Bias
We like things more when we are familiar with them. It doesn’t necessarily mean they are better, but our brains like predictability, so we prefer the familiar. This is a “the devil you know is better than the devil you don’t” bias.
Focusing Illusion
Daniel Kahneman put this best: “Nothing is as important as it seems when you are thinking about it.” Our brains focus on things differently when they are top of mind (or salient). If you think you have a stain on your shirt, you will be very focused on your clothes and on where people are looking when you talk to them, in a way you wouldn’t be if it weren’t top of mind. Because our brains like to be right, this can combine with confirmation bias to keep us stuck in a particular mindset, focusing only on the evidence that supports our original thoughts (while conveniently missing everything that proves us wrong).
Framing
How you say something matters more than what you say. My favorite example is of buying ground beef at the grocery store. There are two nearly identical stacks, one listed as “90% Fat Free” and the other as “10% Fat” — which one do you want to buy? Most everyone prefers 90% Fat Free. Logically we know they are the same thing, but our brains hear them completely differently.
When you are presenting a change, it isn’t about the change itself — how you present it makes a big difference in whether or not people will like and support it.
Functional Fixedness
When all you have is a hammer, everything looks like a nail. This is our tendency to look for solutions that fit our current perspective and say other things are “impossible” if they don’t fit that model.
Take a step back from problems to see the bigger picture. If your goal is to “get something to stay on the wall” you could use a hammer/nails, drill/screws, Command strips, or many other things. If you are talking to a “drill” person or someone who is unwilling to put a hole in the wall, they aren’t necessarily wrong. How can we all be right, even when we are saying different things?
Fundamental Attribution Error
When someone isn’t part of our “in-crowd” we will not give them the same benefit of the doubt as we give ourselves. As an example, when someone cuts you off in traffic, they are a “jerk” and you tell yourself a story about their entire personality based on this one moment. When you cut someone off it is because you were running late or distracted because of some outside force — you are still a good person.
We expect others to always give us the benefit of the doubt we give ourselves, but we are many other people’s “them,” so it is important to be aware of how this comes into play at work, both in our immediate reactions (“Ugh, he is so inconsiderate!”) and in how we think others see us (“They know I’m busy and have back-to-back meetings, it’s ok that I’m late again today.”)
Game Theory
Game theory looks at how we make decisions when our own outcomes and choices depend on others, and we have to guess/hope for what they will do and make a choice with incomplete information. A few games are most commonly showcased (check out the podcast episodes for more), including the ultimatum game, the prisoner’s dilemma, and the dictator game.
Some important notes: We humans tend to get emotional and a little vengeful when we play these games, and will often think people are out to get us, so we had better proactively protect ourselves and “look out for number 1.” This isn’t always a great strategy, and we can make decisions that hurt ourselves in the process of keeping others in line. Even if we don’t think we will, the hot-cold empathy gap comes into play here, and things go wrong when we get emotional. At work, it is important to remember the long memories humans have for when we have been wronged, and how that can throw decision-making off years down the line. In the game of life and business, it is also important to shift your strategy from playing “not to lose” to playing “to win”.
Groupthink
When people are in groups, whether because they want a harmonious experience, don’t want to rock the boat, or simply because of the herd mentality of humans, the people within the group start to make bad decisions. These could be irrational or different from what a person would choose if left to their own devices, and it is often an attempt to minimize conflict. “Everyone else seems to be on board so I won’t point out this obvious flaw...”
Habits
The average person makes 35,000 decisions every day, and the bulk of those are made on habit. Habits are built from a context that has been repeated enough over time that the brain knows we get some sort of reward on the other side. Understanding (and often changing) the context is the key to changing habits (willpower is not an effective strategy).
It is important to know that even very small habits being upended can impact how we feel about other changes when they are presented. If too many habits change at once, we rely on the subconscious for big stuff that we may not like the outcome of. Lesson: be aware of the changes going on and how habits might be impacted. Don’t try to make too many changes at once. Prioritize projects.
Herding
Humans are a herding species, like cows, sheep, and guppies. We look to others like us to help determine our best course of action (especially when we are in unfamiliar territory). Be sure to know where the herds are so you can use them in the right way (instead of having a short-term herding gain for a long-term loss). Also see Social Proof.
Heuristic
The mental shortcut (rule of thumb) the brain uses to make its 35,000 decisions per day. When these don’t align with our intentions and show an “error,” we end up with a cognitive bias. The subconscious elephant (System 1) runs on heuristics, and these are the building blocks of behavioral economics/science.
Hot-Cold Empathy Gap
In a cold state it’s much easier to make good decisions than in a hot state, in the moment. When in a hot state, we tend to underestimate how much those hot factors are impacting our decision making. When very hungry, we don’t realize how much that is impacting our reactions to questions someone may ask us, how we judge others, or our preferences for what we will want in the future. Also see Cold-Hot Empathy Gap.
Human Capital Factor (HCF)
(See my interview with Dan Ariely, which doesn’t discuss the Human Capital Factor specifically.) According to a 2022 JP Morgan article, the new Human Capital Factor (HCF), which takes into account employee pride, purpose, and psychological safety, “has demonstrated that it may be more relevant to a company’s stock performance than other investment factors.” The HCF was developed by behavioral economist Dan Ariely, cofounder of Irrational Labs, who said, “[Investors] often focus on ‘what is easy to count’ instead of ‘what is important to count.’” It is time to pay attention to what is important instead of what is easy. Companies need to prioritize engagement and invest in their human capital.
IKEA Effect
People value things they made themselves or partially assembled more than things they didn’t (and they believe that others will find them more valuable than they really will — often close to comparable to what they would pay for an expert’s work).
In business, whenever you can give someone an opportunity to be even a small part of a project, it will increase their likelihood of supporting it. One of my favorite ways to do this with my clients is via questionstorming.
Inequity Aversion
This is our preference for fairness. People can be averse to inequity both when they receive more (i.e., advantageous inequity) and when they receive less (i.e., disadvantageous inequity) than others. Though it is important to note that just because we are averse to something doesn’t mean we like the idea of giving up anything we have to help make things more equitable (loss aversion is a big factor here). The podcast episode on this topic talks a lot about the difference between fairness, inequality, and inequity.
Intrinsic Motivation
This is when people are motivated by rewards that come from inside themselves, like the pride of doing a good job, or competition with oneself to continually improve. Also see Extrinsic Motivation and Type I.
Loss Aversion
Humans don’t like to lose things. Many studies have shown that we feel the pain of a loss twice as much as we feel the “joy” of a realized or potential gain. In other words, it would take $40 to make us feel “healed” from losing $20. Our conscious brain likes to think that because people like gaining things we need to give more things, and we believe we will not act on loss-led messaging, but that is often incorrect. Loss messaging doesn’t have to be negative, and it is a very important piece of effective change messaging.
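The 2:1 pain-to-joy ratio above can be sketched as a toy value function. This is a deliberately simplified, linear version of the prospect-theory idea; the multiplier of 2 comes from the studies cited above, everything else is illustrative:

```python
# Toy loss-aversion value function: losses are weighted roughly twice
# as heavily as equivalent gains (a simplified, linear sketch).
LOSS_MULTIPLIER = 2.0  # "we feel the pain of a loss twice as much"

def felt_value(change_in_dollars: float) -> float:
    """Subjective value of a gain or loss, relative to the status quo."""
    if change_in_dollars >= 0:
        return change_in_dollars
    return LOSS_MULTIPLIER * change_in_dollars  # losses hurt ~2x

print(felt_value(-20.0))  # -40.0: losing $20 feels like losing $40
print(felt_value(40.0))   # 40.0: so a $40 gain is needed to "heal" it
```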
Myopia
When we focus too closely on a problem or situation, we are being myopic. Taking a step back to look at the bigger picture is helpful here, as with functional fixedness.
Naïve Cynicism
We expect other people to have more of an egocentric bias than we do ourselves.
Naïve Realism
The belief that unlike other people, we see reality exactly as it is. We are objective and unbiased. The facts of the world are clear and should be obvious to everyone else. Anyone who agrees with us is rational and everyone else is either uninformed, lazy, irrational, or biased.
Negativity Bias
It is easier for us to remember negative memories than positive ones. We also like to avoid bad things, so we praise ideas, projects, or warnings that protect us from making a mistake (even if it is a very unlikely scenario or objectively worth the risk). Even optimists have a negativity bias, and pessimists have an optimism bias.
Not Invented Here Bias (or Syndrome)
An aversion to using, coming into contact with, or referencing any information that was developed outside the in-group.
Nudges
One of the foundational concepts of behavioral economics, as introduced by Richard Thaler (Nobel Prize winner) and Cass Sunstein in their book Nudge. The term “NUDGES” is an acronym for the six types of nudges: iNcentives, Understand mappings, Defaults, Give feedback, Expect error, Structure complex choices.
In the field of behavioral economics we work in nudges — the concepts of the field are introduced to help nudge a decision toward a certain outcome (but the decider still has free choice). The way the options are presented is known as the choice architecture.
Optimism Bias
Our tendency to think that things will work out well for us and those close to us — and that we are better, faster, stronger, smarter than everyone else (including the “us” of 5 minutes ago). Our optimism bias can lead us to make risky decisions with small likelihood of success and ignore the negatives that don’t fit the model. Even pessimists have an optimism bias, and optimists have a negativity bias.
Ostrich Effect
Ignoring a negative situation by “sticking our heads in the metaphorical sand”.
Overwhelmed Brain
When we are overwhelmed, we tend to make worse decisions. This can be because of time pressure (a form of stress), cognitive load, or many other factors. It takes much less information than we think to overwhelm the brain. For example, when people were asked to remember a 2-digit number and then pick a snack, they were much more likely to choose the healthy fruit salad than those who were remembering a 7-digit number (who were more likely to choose chocolate cake).
Oxytocin
Part of the DOSE brain chemicals (our four most common “happy” chemicals): dopamine, oxytocin, serotonin, and endorphins. Oxytocin is for empathy and social bonding.
Paradox of Choice
Popularized by Barry Schwartz in his book, The Paradox of Choice. Humans like to think they would prefer to have as many options as possible, but when we have too many choices (and it doesn’t take that many), we get stressed and often choose badly or not at all (and feel anxiety or regret as we dwell on the decisions we did make). For most businesses, reducing the choices you present will be a good thing.
Peak-End Rule
When thinking about an experience and how we feel about it, we don’t evaluate all the possible decision points (that would be far too much to process and a very complex equation). Instead, we tend to summarize everything in two points: the peak and the end. If you have a positive experience, it can be a good idea to end on the best possible moment (think of the final display at a fireworks show). If you have a negative experience, it is best to not have the end be on the worst point. Research has found that people preferred the experience of being in pain longer when the pain tapered off a bit before the end than if they could have had the whole medical procedure end earlier.
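The idea above can be sketched as a tiny model: score an experience as the average of its most intense moment and its final moment. This is a deliberately simplified illustration (real remembered evaluations are messier), with ratings on a hypothetical pleasantness scale:

```python
def peak_end_score(ratings: list[float]) -> float:
    """Remembered evaluation ~= average of the peak and the end rating."""
    return (max(ratings) + ratings[-1]) / 2

# The same four moments of a fireworks show, reordered so the best is last:
show_a = [5, 6, 9, 4]  # best moment in the middle, weak finish
show_b = [5, 6, 4, 9]  # finale saved for the end
print(peak_end_score(show_a))  # 6.5
print(peak_end_score(show_b))  # 9.0 — ending on the peak is remembered better
```

For a painful experience the same arithmetic applies, with the peak being the worst moment — which is why tapering the pain off before the end improves the memory of the whole procedure.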
Planning Fallacy
When we are planning for things (like projects), we tend to underestimate how long they will take (thanks in part to optimism bias) and not think about all the little things that can add up into a big problem (emails, notifications, client “fires,” getting a snack, talking to our kids, other breaks, distractions). This causes us to grossly underestimate how long something will take. Unfortunately, while it might seem like having a group work together would make this better, groups have been shown to be even worse at this — perhaps because people are trying to prove how great they are to the group. One tactic that can work is to estimate as if someone else were doing it (especially someone you think is slower than you). And even then, still add a minimum of 10% contingency onto your estimate. It is better to overestimate and come in under than to underestimate and underdeliver.
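The 10% contingency rule above is easy to build into any estimate. A trivial sketch — the 10% minimum comes from the text; the function name and rounding are mine:

```python
def padded_estimate(raw_hours: float, contingency: float = 0.10) -> float:
    """Pad a raw estimate with a minimum 10% contingency."""
    return round(raw_hours * (1 + contingency), 2)

print(padded_estimate(40))  # 44.0 — better to come in under than underdeliver
```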
Precommitment
When we commit to something in advance (especially publicly or with an accountability buddy) we are more likely to follow through on what we say.
Priming
Whatever happens just before a decision has a big impact on the choice someone will make or the associations that come to mind. All the senses are a big part of this, and it is an incredibly important (and often overlooked) factor. All the little things that you think “shouldn’t” matter are so much more impactful than you think.
Psychological Reactance
Humans don’t like to be told what to do and will often rebel against that (even if the thing they are being told to do is clearly in their best interest).
Psychological Safety
The belief that you won't be punished or humiliated for speaking up with ideas, questions, concerns, or mistakes. When we have psychological safety at work, we are more likely to speak up, and the organization has a better experience overall. We also have better decision making across the work because of reduced groupthink.
Questionstorming
My favorite way to work with clients! It is incredibly easy to come up with the right answer to the wrong question and the biggest mistake businesses make is not spending enough time thinking about the problem before jumping into problem-solving mode. Questionstorming attacks “known truths” to help understand the real problem before thinking about behavioral interventions to incorporate into your project. Essentially, it helps us to ensure you are working on the right problem and making the best use of resources.
Reciprocity
When people are given a gift (even a very small one) they feel the need to give back in some way (reciprocate), often in a way that outweighs the gift they were given. In the case of change management, reciprocity is about the gift of transparency and how it helps people be more likely to be on board with a change.
Relativity
Humans are bad at making one-off decisions; we evaluate options relative to one another. The way options are presented will help someone determine what feels like the best choice, and the “obvious best” choice will shift based on the information presented.
Rider, Elephant and the
See Elephant and the Rider.
Risk Aversion
We don’t like to take on risk, and will do what we can to eliminate it (even when the value of eliminating it is far less than the cost of doing so). When we have lots of time we are very risk averse, and when time pressure is on we tend to be more risk-seeking (FOMO), which impacts decision-making.
Salience
The prominence of something in our mind — when we are aware of something and believe it to be important. Priming can create salience in a particular area of decision making and (intentionally or unintentionally) influence the choice someone makes.
Scarcity
When there is less of something, we believe it to be more valuable. In the case of change management, scarcity is referenced in regards to time pressure and cognitive load, and helping people make better decisions by reducing that mental stress.
Serotonin
Part of the DOSE brain chemicals (our four most common “happy” chemicals): dopamine, oxytocin, serotonin, and endorphins. Serotonin is related to our mood and pride.
Sludge
A concept popularized by Cass Sunstein (coauthor of Nudge). Where nudges look at improving decisions by making things easier, sludge slows things down. In many work scenarios, there is far too much sludge that makes it difficult for people to do the “best” thing (whatever that might be), and it is a good idea to reduce it.
However, there is a good side to sludge: it can be leveraged to help us make better decisions by slowing us down at the right points. For example, requiring that every email sit for at least 30 minutes after you write it before it can be sent, so you can review it and avoid unintentional miscommunication.
Social Proof
Because we are a herding species, we look to others to help determine our best course of action. In the case of social proof, this can be testimonials, reviews, star ratings, and anything else that shows us what others have done previously. Also see Herding.
Status Quo Bias
Because our brains run on predictability, we have an affinity for the status quo. This allows the subconscious (System 1) to remain in control more, which it likes. Change may feel uncomfortable because it represents a shift in the status quo, but that feeling doesn’t mean change is “bad.” Everything that is now our status quo was new to us at one point or another.
Survivorship Bias
When we are making decisions about what to do next, we may miss big pieces of data that are incredibly important to our decision making (we are missing what’s missing, to quote the TED Talk by David McRaney on this topic). For example, if we search for stories about successful people getting up at 5:00am and believe that we will be successful if only we get up at 5:00am too, we forget about all the people who are successful but don’t get up at 5:00am, and the many “unsuccessful” people who do get up at 5:00am.
System 1 and System 2
Nobel Prize-winning idea credited to Daniel Kahneman and Amos Tversky, based on their many decades of work, which helped create the field of behavioral economics. This refers to the two systems of the brain which should be considered in decision making. System 1 (which I call subconscious, or the “elephant”) is fast, automatic processing. System 2 (which I call conscious, or the “rider”) is slower and more manual. System 1 makes the vast majority of the decisions using the heuristics and biases that make up the field of behavioral economics.
Time Discounting
When we think about ourselves in the future, our brain lights up as if we are thinking about a completely different person. It is very easy to commit “future me” to do something, but when it is time to actually do it, the task feels much more difficult (and we have a tendency to kick it out into the future again, where it isn’t our problem). This is why we often prefer short-term gains over long-term ones (it is enjoyable to eat that cake now, compared with the long-term benefit that jogging will give me years into the future). To overcome this, it is important to help the person realize the importance of the task or action today, instead of saying they will do something tomorrow (or even 5 minutes from now). Instead of saying, “I will ask Suzie to hold me accountable,” take the step to reach out to Suzie right now, while you are in the moment, to leverage the benefit and value of an accountability buddy (someone we don’t want to lose face in front of, because we are a herding species). I call this the “I’ll Start Monday” Effect.
Time Pressure
When we are put under time pressure (deadlines, too many competing priorities) we will make worse decisions, because time pressure is a form of stress. When we have lots of time, we tend to be more risk averse; when the time pressure is on, we tend to be more loss averse (FOMO), which impacts decision-making.
One of my main arguments in the book is that removing time pressure and cognitive stress will make it easier for people to make the right decisions and work on the right things, as well as feel more intrinsically motivated. This makes them more receptive to change. If you want to naturally help the people around you to be better at change, the best thing you can do is be more thoughtful about your work and communications so you can reduce your own time pressure and theirs.
Type I
As outlined by Daniel Pink in the book Drive, someone who is naturally more motivated by intrinsic means is considered a Type I. Type I people will still accept extrinsic incentives, but they are primarily doing the thing (whatever it is) for themselves. If you give too many extrinsic motivators, people can lose the joy of the intrinsic, so it is important to be careful with incentive programs.
Type X
As outlined by Daniel Pink in the book Drive, someone who is naturally more motivated by extrinsic means is considered a Type X. There are many Type I’s masquerading as Type X’s because our world is so focused on extrinsic incentive plans, so it is a good idea to look for opportunities to get back to intrinsic motivators and be thoughtful about this.
Uncertainty Aversion
Also known as Ambiguity Aversion. We are more likely to prefer the things we know to those we don’t, and known risks over unknown ones.
Unintentional Normalizing
Because we are a herding species, we tend to look to others to help us decide how best to act. Unintentional Normalizing is Melina’s term for what happens when someone shares information they think is shocking and should get someone to change their behavior, but it has the opposite, backfiring effect. Specifically, by pointing to a large group of people doing something in order to argue that things shouldn’t be that way (“lots of people do this thing”), they make the behavior feel normal and “ok” to the very person the fact is trying to change.
For example, imagine the dean at a college says, “We take cheating very seriously and it will not be tolerated. 85% of students try to cheat at some point, but know we are watching you.” When the moment to cheat comes, even though you don’t think you would, being stressed in the moment (say, you didn’t have time to study before a pop quiz) you may recall that 85% of students cheat, realize you have never heard of anyone getting kicked out for cheating, and conclude that a lot of people must get away with it (and that it is ok for you to do so too). By placing the “herd” inside the thing you want to change, you actually make it harder to change, and can cause more people to cheat precisely because you brought it up.
Unity
A concept popularized by Robert Cialdini as his 7th Principle of Persuasion (from the New and Expanded edition of his popular book Influence). It generally says that if we know someone is like us (particularly in a strong way that is core to our identity) we will like them more and afford them the benefit of the doubt (be nicer to them, be more likely to say yes to their requests, etc.) than someone who isn’t part of that group. When used properly, bringing up the point that makes you a “we” just before an ask or nudge can make it more likely to be accepted.
Vulnerability Loop
A concept attributed to Jeffrey Polzer (and popularized by Daniel Coyle in The Culture Code): when we get vulnerable with someone, and they accept and reciprocate, trust is established and our connection with that person deepens. As outlined on The Brainy Business podcast, managers and their teams need to incorporate vulnerability (and be willing to be vulnerable first) to increase trust (and, thus, make their work more efficient and effective, keep employees more engaged, and more).