10 Types of Cognitive Biases That Can Impact Decision-Making in the Lab

To make objective, data-driven decisions to benefit the lab, managers need to be educated about the different types of cognitive biases and seek mechanisms to offset them

Scott D. Hanton, PhD

Published: Dec 29, 2020 | 5 min read

Cognitive bias is a tendency, inclination, or prejudice for or against something or someone. It is often based on stereotypes, results in prejudgments, and can lead to rushed decisions. Some of these biases are buried deep in our reactions to data and situations and are not part of our conscious thinking about the decisions we make.

All people have some degree of unintended cognitive bias, stemming from how we were raised and nurtured. When lab managers are unaware of their biases, blind spots can develop that impact their decision-making. To make objective, data-driven decisions to benefit the lab, managers need to be educated about the different types of cognitive biases and seek mechanisms to offset them with rational, objective thinking as they make key decisions.

10 Common Types of Cognitive Bias

The following are different types of cognitive bias that lab managers should try to identify and mitigate to improve their decision-making for the lab.

1. Confirmation bias

Confirmation bias is the selective search for supportive evidence: the tendency to find only data that supports the desired decision or outcome. This bias can make individuals blind to contrary evidence or cause them to selectively retain information that supports their viewpoint. This is what happens when a lab manager really wants to buy a new instrument and reads up on the amazing science it can do, but doesn’t notice that it wouldn’t meet the lab’s utilization goals. Confirmation bias can be especially difficult to recognize when working alone. The best ways to combat it are to be skeptical of the information supporting a particular viewpoint, to bounce ideas off a group of people, and then to listen to the group’s feedback. A dissenting voice, or a devil’s advocate, can be particularly helpful in reducing confirmation bias.

2. Attribution asymmetry

Attribution asymmetry is characterized by a feeling that our own actions carry more weight than those of others, and that our successes are driven by our abilities and expertise while our failures are driven by external factors, like bad luck or inescapable circumstances. Developing specific, objective criteria to analyze outcomes will help normalize this bias. Asking for specific feedback about our own contributions in the lab, and getting feedback about those contributions from others, is a good way to mitigate attribution asymmetry.

3. Cognitive inertia

Cognitive inertia is a resistance to change. This bias presents a barrier to accepting and internalizing new data, knowledge, or information. It manifests as a stubborn insistence on continuing a chosen path despite good reasons to make a change for the better. We see this in lab managers who don’t recognize that a staff member has grown and developed new skills deserving of a promotion, and who continue to see that individual only as a junior contributor. Curiosity is helpful in combating this bias. Continuing to seek and evaluate alternatives, even when it appears that we’re making progress, helps keep options open. Another useful tool to mitigate this bias is to pose the question: what benefit could come from a proposed change? Turning the focus to potential benefits can reduce resistance to change.

4. Premature termination

Premature termination is a tendency to stop seeking alternatives when the first potential solution is discovered. This bias tends to limit lab managers to solutions that are comfortable and shallow, rather than enabling real solutions that drive important change. If left unchecked, it can lead lab managers to be too cautious or lack innovation. One way to mitigate premature termination is to consciously generate multiple options when facing important decisions, and to ask others for their ideas to broaden and deepen the option pool.

5. Anchoring

Anchoring is a tendency to place too much weight on the initial information received. This form of unconscious bias overly rewards the first impression because original information can be difficult to displace in the brain. For example, anchoring can be observed in the lab when a lab manager retains a service provider who delivered well on the first occasion but has failed to deliver more recently. To address anchoring, try to slow down, carefully evaluate options, and be aware of the assumptions involved. It is also helpful to be curious and seek new information and feedback to compare against the original information.

6. Wishful thinking

Wishful thinking, also called optimism bias, is a blind spot that causes people to see only the positive side of a situation, according to what they want to believe rather than what is true. Wishful thinking leads to a misplaced, rosy view of the world. Lab managers can combat wishful thinking by using a process like de Bono’s Six Thinking Hats to view multiple perspectives when evaluating information and making decisions.

7. Groupthink

Groupthink is the pressure to conform to the opinions or perspective of the group. It is a form of peer pressure in which disagreeing with the majority is seen as ineffective or disloyal. This bias greatly limits a leadership team’s ability to be creative, take appropriate risks, and try new things. Groupthink can be especially challenging for lab leadership teams that have been together for a long time and share extensive knowledge and history. Ways to defeat groupthink include inviting outsiders to discuss issues with the team or assigning a team member to actively disagree or play devil’s advocate.

8. Repetition

Repetition bias is a tendency to believe what has been said and written most often. But just because something is repeated does not make it correct. Most lab managers are scientists trained to be skeptical of data and information. The best defense against repetition bias is using that technical training to evaluate all forms of information entering the lab as if it were lab data. It is important to have ways to share new information with peers and colleagues to get multiple views on any suspect information.

9. Recency

Recency is the tendency to assign the greatest weight to more recent information and allow it to displace older information. Both recency and repetition are best addressed with a skeptical, scientific approach. Careful examination of data and information, and following the scientific method, even for non-technical information, are effective ways to combat these biases.

10. Implicit bias

Implicit biases, or unconscious biases, are attitudes or stereotypes that affect our social behaviors and actions toward, and our understanding of, other people. Implicit biases grow out of the patterns we observe and the shortcuts our brains take to sort out those complex patterns. Even as children, we recognize patterns around groups and belonging. Implicit bias affects our interactions with people of different groups, such as different races, genders, ages, sizes, and sexual orientations. It can negatively impact a lab manager’s efforts to recruit, hire, and retain diverse candidates. Careful examination of hiring materials and practices is critical to reducing the impact of implicit bias in recruiting. To help people learn about their own implicit biases, Harvard has developed the Implicit Association Test (IAT), which can be taken online.

How to Mitigate Bias

All people are impacted by their unintended biases, and lab managers are no exception. One of the keys to addressing these biases is realizing that they are present and have an impact on lab decisions. Getting multiple perspectives, using decision-making tools, and being skeptical, data-driven, and curious are good ways to investigate many of the different forms of unconscious bias. While we can’t prevent ourselves from having unconscious biases, we can mitigate the impact they have on our decisions.