To Err is Human - Part 1: How Institution of a "Just" Culture Can Lead to Effective Medication-Error Reporting & Prevention

Jubilant Radiopharma Radiopharmacies Division presents the first installment of Dr. Laura Bauman’s To Err Is Human, a blog series focused on common errors in nuclear medicine that may occur during the process of writing, ordering and/or filling prescriptions. This series also emphasizes the critical need for organizations to establish error-prevention strategies.

Dr. Bauman’s first article chronicles the establishment of the “just” culture in the medical profession, and how implementation of this system for reporting errors can help prevent mistakes from being repeated. 

We've all felt that sinking feeling when we realize we've made a mistake. Questions start flying through our minds: How did that happen? Why did I miss that? Who does this impact? How do I fix it? 

We are all human, and humans inevitably make mistakes. But while forgetting to pick up the bread at the grocery store may be merely inconvenient, making a mistake in our line of work in nuclear medicine can have costly and damaging effects. Needless to say, we can never take accuracy for granted. With so few drugs in our niche field, we may believe that we are immune to the widespread problem of medication errors, but there are always opportunities for improvement through education and implementation of best practices.

What would you do as a nuclear technologist or pharmacist if you made an error in ordering or filling a prescription? If you entered practice more than 25 years ago, your tendency might have been to conceal any and all mistakes. Before the 1990s, medical professionals tended to be trained, and thus to believe, that perfect performance was possible and expected. Unfortunately, fear of punishment encouraged practitioners to keep any mistakes a secret, which perpetuated the false notion that mistakes were rare and perfection was possible. This response of past decades was likely due to the “punitive” culture of that time, which used disciplinary action to encourage vigilance to prevent errors.

In 1999, the Institute of Medicine (IOM) released a report, titled To Err is Human: Building a Safer Health System, which recognized that even competent, caring professionals are human and will make errors.1 The report argued that the most effective way to prevent future errors is to build safety and processes of care into the system, rather than punishing individuals who err. 

As with most change implemented to counter a faulty system, the pendulum often swings completely to the other side, creating a process in direct contrast to the perceived shortcomings of the system in question. The “no-blame” culture that emerged in response to the punitive culture sought to reveal and correct potential vulnerabilities in the medication management system by instead encouraging the reporting and analyzing of all errors, without attributing responsibility to the individuals involved.2 Free from the fear that previously stifled error reporting, this culture not only provided a wealth of information and teachable moments, but it also granted amnesty — even to those who willfully behaved recklessly and caused harmful errors.

A "Just" Culture Emerges

In reaction to the amnesty-for-all system, the pendulum eventually settled back toward the center and, beginning in the early 2000s, a more “just” culture began to emerge. This current culture of error reporting balances accountability with open communication of all risks and errors. The just culture recognizes that, while we can all make mistakes, we also have control over our behavior. Those who lose sight of the risks of their actions may drift into unsafe behaviors in order to save time or money. Over time, these risky behaviors may become commonplace and contribute to mistakes that impact patient care, cost and efficiency.3

The just culture seeks to provide stronger incentives for safe behaviors, thus encouraging better day-to-day behavioral choices. But even within a safety-minded organization, there may still be workers who consciously disregard a substantial and unjustifiable risk that they fully understand. When these workers behave recklessly, they are held accountable within the just culture, which stresses the importance of safety as more than just a priority, but as a value underpinning every choice and action. When individuals value safety, they can most effectively practice within a system that is continually improved by analyzing and correcting vulnerabilities through error reporting.3,4,5

In that vein, let's share our experiences and suggest corrective actions arising from specific examples of medication errors in nuclear medicine. As we work together to continually improve our niche field of nuclear medicine, let’s not only explore the “hows” and “whys”, but let’s also focus on developing strategies to prevent such incidents from happening again. 

Stay tuned for Dr. Bauman’s next Industry Insider Blog installment in the To Err Is Human series in which she will address pseudo-homophones and how to guard against potential errors resulting from sound-alike numbers when verbally communicating medications.


  1. To Err Is Human – Building a Safer Health System (Brief). Institute of Medicine. November 1999. National Academy of Sciences. (The complete report, To Err Is Human: Building a Safer Health System, is available from the National Academies Press.)
  2. Our Long Journey Towards a Safety-Minded Just Culture Part I: Where We’ve Been. ISMP Newsletter, Acute Care; Sept. 7, 2006
  3. Our Long Journey Towards a Safety-Minded Just Culture Part II: Where We’re Going. ISMP Newsletter, Acute Care; Sept. 21, 2006
  4. Just Culture and Its Critical Link to Patient Safety (Part I). ISMP Newsletter, Acute Care; May 17, 2012
  5. Just Culture and Its Critical Link to Patient Safety (Part II). ISMP Newsletter, Acute Care; July 12, 2012