The Power of Attention Bias in Safety Management
- Jason Starke, Ph.D.

Let's take a trip back in time. During my early years in the Air Force, I worked as an F-16 Weapons Technician stationed in Northern Japan. Among my colleagues was an old, gruff tech sergeant—Tech Sergeant Stewart, or "Stu" as we called him. He embodied every stereotype of a no-nonsense, seasoned military veteran, complete with a thick South Boston accent.
Every day, before our shift, Stu would gather us and deliver the same piece of advice: "Remember this—one 'aww crap' wipes out a thousand attaboys." This was his way of reminding us that despite all the good work we did, one mistake could overshadow everything.
His message, though blunt, highlighted an important psychological phenomenon: attention bias—our tendency to focus on the unusual, unexpected, or negative while overlooking the routine and positive. This bias plays a significant role in safety management, and understanding it can help us build better, more effective safety systems.
Understanding Attention Bias
Attention bias has been well-researched in psychology. In a workplace setting, especially in safety-critical industries, people tend to notice anomalies more than routine success. Think about it—when a coworker who always performs consistently suddenly does something different, it catches your attention. The same applies to safety incidents; errors stand out more than the countless uneventful operations that go smoothly every day.
Stu’s words reflected this bias. Our team was highly proficient at maintaining and arming F-16s with live munitions, often working long shifts under demanding conditions, yet he emphasized that one mistake could erase all of our past successes.
While vigilance is necessary in high-risk environments, focusing only on mistakes can lead to an unbalanced safety culture. This brings us to another cognitive bias that further complicates safety management: the fundamental attribution error.
The Fundamental Attribution Error
This error occurs when we attribute the cause of a mistake solely to an individual's actions rather than considering systemic factors. We tend to blame people rather than looking at the broader context. This is a common issue in accident investigations.
For instance, after the 2005 Texas City Refinery explosion, initial reports blamed individual technicians for their errors. However, deeper investigations revealed systemic failures—poor maintenance practices, inadequate training, and a culture that prioritized production over safety. The same pattern is evident in many accident analyses: the human factor is emphasized while organizational weaknesses are overlooked.
This misattribution can be dangerous in aviation and other high-risk industries. By focusing blame on individuals, we risk missing critical opportunities to improve underlying systems and prevent future incidents.
Shifting the Focus: Lessons from Safety-II
Traditional safety management, sometimes called Safety-I, is largely reactive—it focuses on what went wrong. In contrast, Safety-II encourages us to study what goes right. Instead of solely analyzing accidents and errors, it asks: What enables safe operations? How do people successfully adapt to challenges?
Most daily operations fall within the "normal" range—there are no major incidents, and people perform their duties effectively despite constant variability. However, because these normal operations don’t draw attention, they remain underexplored. If we only investigate failures, we miss the wealth of information hidden in everyday success.
Practical Steps for a Balanced Approach
Encourage Reporting of Positive Adaptations
In many industries, reporting systems focus on deviations, errors, or violations. But what if we also encouraged reports on successful adaptations? For example, a pilot who skillfully managed a challenging landing in unexpected wind conditions could share that experience.
This type of reporting helps organizations learn from effective problem-solving and refine procedures to support safe adaptations in the future.
Recognize Systemic Factors
When analyzing incidents, look beyond individual actions. Consider environmental factors, organizational culture, and systemic influences.
Instead of defaulting to remedial training for individuals, ask: Was the procedure itself flawed? Were external pressures involved?
Balance Focus Between Successes and Failures
While addressing failures is necessary, also study normal operations. Conduct observational studies or Line Operations Safety Audits (LOSA) to understand everyday performance and the strategies people use to maintain safety.
Capturing and analyzing these insights can help refine processes, improve training, and create more resilient safety systems.
Conclusion
Attention bias and the fundamental attribution error are deeply ingrained in human psychology, but awareness of these biases can lead to more effective safety management. Instead of focusing solely on errors, we must also study success—how people adapt, navigate challenges, and maintain safety despite uncertainty.
By shifting our perspective, we can move beyond a reactive approach and toward a proactive, learning-oriented safety culture. After all, recognizing and reinforcing what works well is just as valuable as correcting what goes wrong.