Is Cognitive Bias Hurting Your Company’s Hazard Recognition?


Hazard Recognition is a fundamental part of an effective health and safety program. But are unseen biases posing an overlooked threat? Learn how cognitive biases affect the workplace and how an organization can strengthen its hazard recognition efforts.

Author: Joseph Christian

June 23, 2023

What is Cognitive Bias?

Have you ever thought, ‘That won’t happen to me, I’ve got years of experience’? Have you ever assumed everything you do is safe because a higher-up told you so?

These are examples of cognitive bias – created by the tendency of most people to make decisions based on their own experience, what they’re told, and what they understand, without deeper critical thinking. Simply put, it is our brains taking a shortcut. But when it comes to safety at work, cognitive biases can negatively affect an employee’s ability to properly identify hazards and can lead to unsafe decisions and actions.

Health and safety professionals need to be aware that hazard recognition is not just about physical hazards – but mental ones as well. Only by understanding the pitfalls of cognitive bias and its overlooked effects can your company develop strategies to lessen its impact and exercise some mental muscle in hazard recognition.

Common Cognitive Biases to Watch Out For

While not a comprehensive list, here are some common cognitive biases health and safety professionals run into at work.

1. Confirmation Bias

What is it?
This is when you favor information that confirms your existing beliefs. In the context of hazard recognition, it means people see only what they expect to see. An example is a worker leaning too heavily on their own experience because they have never gotten hurt before, or dismissing the experiences of others because it has not happened to them.

How it negatively affects hazard recognition
Confirmation bias elevates individual experience and preconceptions above the critical thinking and rules-based decision-making that should guide hazard recognition.

At its worst, it habituates workers into feeling nothing can go wrong. When workers stop thinking about hazards, they stop trying to do hazard analysis right. These workers will progressively talk themselves into an empty hazard analysis and miss dangerous conditions.

Leaders of hazard analysis who elevate their own experience over others’ will create a dangerous culture of silence. If workers see and think about hazards but won’t speak up, everyone is put in jeopardy.

Expert tip:
Reward critical thinking in your team. Active thinking challenges the status quo and encourages learning.

Take the time to properly identify hazards.

2. Availability Heuristic

What is it?
This is when your judgment of a situation is influenced by what comes to mind most easily and quickly. Common examples include a rushed person making snap judgments based on the first thing that comes to mind, or a person experiencing an emotion like sadness or anger and deciding based on that immediate feeling.

How it negatively affects hazard recognition
When it comes to hazard recognition, this means a worker takes the path of least resistance. When they are tired, emotional, or short on time, they don’t rely on critical thinking. Instead, they recall whatever comes to mind most readily because it’s easier to push forward that way.

Over time, if this cognitive bias dominates hazard analysis, workers will learn that safety is not a valuable part of their craft because critical thinking is neither used nor encouraged. Another danger is that it fails to address at-risk behavior that cuts corners to save time. Remind employees that working their craft safely is the skill they are being paid for.

Expert tip:
Set aside a scheduled time to stop and discuss dangers at work. During each discussion, assign one person to act as a ‘monkey wrench’ during hazard analysis and ask, ‘What if?’

3. Authority Bias

What is it?
This is when individuals attribute greater accuracy to the opinion of an authority figure and are unwilling or uncomfortable to challenge them, even when they know something is unsafe. It can also happen when a leader refuses to listen to others or automatically attacks anyone who disagrees.

How it negatively affects hazard recognition
Over-relying on an authority figure creates a safety culture weak at questioning and improving hazard recognition. It can cause workers to stop critically thinking about dangers and only focus on doing things the ‘accepted way’. Authority bias also discourages active learning and engagement and can create an environment where workers struggle to adapt to new situations when authority figures are not present.

Expert tip:
Allow other crew members to occasionally lead the hazard analysis. Leaders should show that they welcome other ideas.

4. Dunning-Kruger Effect

What is it?
This is a situation where the less an individual knows, the more confident they’re likely to be. It is a type of overconfidence bias that is dangerous, particularly for young and inexperienced workers.

Another way of understanding the Dunning-Kruger effect is that when individuals begin to learn something new, the inflow of knowledge can give them a lot of confidence. They often feel empowered but don’t understand how complicated something really is.

How it negatively affects hazard recognition
The Dunning-Kruger effect can blind inexperienced workers to the real dangers of their job. A worker may follow only what little they know and discount the experience of others because they do not think about hidden hazards beyond their limited knowledge.

This bias becomes a larger problem when inexperienced workers lack on-site guidance from more experienced ones. It can trap new workers in a dangerous situation where they don’t know what hazards to look for or how to gain the knowledge that protects them.

Expert Tip:
Give new workers a chance to ask and learn from more experienced workers in an open and judgment-free environment. Welcome the observations of fresh eyes.

Critical thinking helps hazard recognition.

How to Address Cognitive Bias

One of the best ways to combat cognitive biases is to create an effective safety culture that values hazard recognition and encourages individuals to take ownership of their safety. Safety is not static or limited to training and a hazard checklist – it requires an openness to learn and improve.

Here are some things OSCAsafe’s team of safety experts recommend based on real-life experience.

  1. Challenge your ideas with self-reflection

  2. Promote a ‘see something, say something’ culture. Hazard recognition only works when hazards are reported, so promote it

  3. Have activities and training that challenge your hazard recognition process and encourage critical thinking

  4. Don’t rush safety. Ensure employees have time to learn, discuss, and practice hazard recognition

  5. Allow employees a safe space to discuss problematic situations and share stories so they can learn from each other

  6. Fully explain dangerous situations and standards to employees and allow them to ask questions

Need Help?

OSCAsafe understands that developing and running effective training programs is a challenge. Simplify things with the help of training experts. Contact OSCAsafe at