Playing It Safe
Radiology practices establish a culture that values the safety of their patients as well as their employees.
To err is human, approximately three percent of the time.1 That’s how often the average human will make a mistake while performing complex tasks. For radiologists — and the medical profession in general — the target error rate is much smaller.
Historically, the medical profession has functioned within hierarchical structures, with administrators and senior staff implementing processes and checks to minimize errors. This is largely effective insofar as senior staff, through experience, understand the sources of common errors and can implement effective systems and processes to check them. However, because it doesn’t foster effective bottom-up communication, this model can also hamper feedback from frontline staff when those systems and processes break down.
In recent years, radiologists around the country have adopted innovative approaches to create a culture of safety marked by transparency, collaboration, and open communication across ranks and disciplines. This culture of safety encompasses everything from ensuring that safety protocols are followed to something as seemingly ancillary as considering whether the placement of the water cooler inadvertently creates a greater fall risk for elderly patients, says Jacqueline A. Bello, MD, FACR, director of neuroradiology at the Montefiore Medical Center in New York.
“Safety is a culture; it’s not a topic,” says Bello, who chairs the ACR Commission on Quality and Safety. “It affects everybody in the workplace, and let’s not forget that the patient is the star of that workplace.”

At Texas Children’s Hospital in Houston, the radiology department greatly improved communication by creating a daily readiness huddle. During the 10- to 15-minute meeting each morning, the leadership, physicians, technologists, nurses, and critical support staff from other departments discuss patient issues, protocols, and critical equipment status. Each huddle begins with a metrics review, which includes the number of days since the last radiology-related serious safety event, days since the last wrong patient/wrong procedure event, and days since the last MRI safety policy violation. The huddle also includes a clinical volume review, daily readiness assessment, and problem accountability.
When the team at Texas Children’s set out to establish the daily readiness huddle, the key question was whether busy professionals would be willing to invest their time each day to know everything that was going on in the department and have a mechanism to voice their concerns. The answer was yes, according to Lane F. Donnelly, MD. Donnelly, who is now chief quality officer at the Lucile Packard Children’s Hospital at Stanford University, was instrumental in establishing the program as associate radiologist-in-chief at Texas Children’s.
During the clinical volume review and daily readiness assessment, staff identify a list of issues and concerns, which are divided into quick hits (which can be resolved in less than 48 hours) and complex issues (which might take as long as 60 days to fix).
Complex issues are tracked on a whiteboard and assigned an owner, a quality coach, and a date for completion. The program identified 102 complex problems during the first year, from April 2015 to March 2016. Each problem was assigned an owner who reported back regularly on progress toward a solution. Ninety-one of the problems were resolved that year. “I think we became a much more functional team,” Donnelly says. “The relationships grew stronger, particularly between groups that normally didn’t get together every day without this process. And our ability to identify and respond to concerns dramatically changed.”
Likewise, problem-solving is central to the 52 in 52 program at the Lucile Packard Children’s Hospital, notes David B. Larson, MD, MBA. The program’s focus is specifically on concerns that can be addressed efficiently by one person in six to eight weeks. Projects overlap, and the goal is to complete 52 projects in 52 weeks.
Staff members usually put forth project ideas, which have ranged from improving communication to decreasing mismarked images. The staff member is then paired with a coach from the leadership team, who provides any needed support. The program succeeds because staff are encouraged to think like problem solvers, taught to identify and overcome barriers, and held accountable to make progress through weekly two-minute updates as they work toward a solution.
Many of the projects address the inherent variation in the medical field by putting forth standard processes. For instance, says Larson, one strategy that has been found to reduce errors is having technologists in the radiology department take a “mini pause” with patients, asking some of the same questions patients answered when they arrived for their appointment: What is your name? What is your birthday? Why are you here? “Staff confirm information at every critical point. If there is a problem, they are expected to stop and resolve it right away,” says Larson.
“We look at quality and safety as two ends of the same spectrum,” Larson says. “Quality is about how to consistently do things well, and safety is about how to never do things we should never do. While humans will always make mistakes, we need to embed strategies to make sure those mistakes are caught and corrected before they cause harm. We don’t view it in terms of having a program for safety and a program for quality. If something needs to be improved, it is the same strategy for both quality and safety.”
When an error does arise, the radiology department at the Mayo Clinic in Rochester, Minn., uses a culture of safety approach to better understand the root of the problem, according to Karl N. Krecke, MD, FACR, assistant professor of diagnostic radiology at the Mayo Clinic School of Medicine. “Medicine is historically a fairly hierarchical system. You have the doctor in charge and the staff functioning below the physician, but it’s not a very effective system for communication, understanding vulnerabilities, and empowering the team to innovate,” Krecke says. One of the keys to moving toward a culture of safety is enabling frontline employees to freely discuss what they perceive to be trouble spots.
“The folks on the frontline have a wealth of information about how we intend for things to work versus how they actually work,” Krecke says. To facilitate this openness and move toward a flatter, less hierarchical organizational chart, the radiology department at the Mayo Clinic developed an intranet-based error reporting structure and encouraged personnel to fill out an event report for anything they felt didn’t go as well as intended. Those reports were then reviewed by a leadership team. If an error required immediate review, it was referred to an internal team for root cause analysis. This team comprises the radiology staff who witnessed or experienced the error, equipment specialists, and a quality officer.
“I prefer a minimum of supervisory staff at initial discussions because I don’t want to know how things are supposed to work. I want to know how they’re actually working,” Krecke says. “What in their day-to-day life encourages staff to take shortcuts or create workarounds for whatever impositions the system puts on them — patient volumes, equipment that doesn’t work right, staffing levels, training?”
Although it took time to build trust, the process now yields open discussions. The team makes a deliberate effort to give everyone at the table an equal voice in finding the answer to the question, “Why did this very talented, very dedicated, very smart individual make a mistake?”
Although the process often begins with heartfelt mea culpas from the individuals in question, it often ends with the identification of an equipment problem, a needed process change, or an environmental distraction. One result, for instance, is that RTs now run through a mini procedural pause, during which they confirm the patient’s name, birthdate, and specific reason for the visit. That pause has reduced the rate at which inaccurately ordered scans are actually performed from 3 in 100 to about 1 in 10,000.
Bello notes that the ACR contributes to this culture of safety by promoting best practices and protocols and through its Annual Conference on Quality and Safety. The conference itself is an embodiment of a culture of safety, attracting radiologists, technology specialists, administrators, and research scientists. “They are convening over a culture,” Bello says.
For radiologists considering incorporating greater transparency, collaboration, and open communication into their departments or practices, Bello’s advice is simple. “Jump in with both feet,” she says.
“People shy away from quality and safety projects because they think, ‘Oh, that’s for the safety officer,’” says Bello. “No, that’s for you. And that’s for me. And that’s for the guy with his shoelace untied. That’s for everybody.”
“When we demonstrated to people that we actually did fix things during the readiness huddles, and that this was a way for them to give voice to the concerns that they had, they bought into it quite quickly,” Donnelly says.
“Just get started,” Larson advises. “It’s the leader’s job to organize and support the team to solve the problem, but then you’ve got to give them room to figure out how to solve it. Leaders should not jump in and try to solve it for them. By tackling the problems themselves, the staff develop problem-solving and leadership skills and then they own the solutions that they implement.”
Develop a Culture of Safety
- Use the ACR Appropriateness Criteria® (acr.org/AC), evidence-based guidelines to assist referring physicians and other providers in making the most appropriate imaging or treatment decision for a specific clinical condition.
- Subscribe to Quality Care News, the ACR quarterly newsletter for medical imaging and radiation oncology professionals, at acr.org/Quality_News.
- Attend the ACR Annual Conference on Quality and Safety, taking place Oct. 26–28, 2018, in Boston.
- Refer to Image Wisely® (imagewisely.org), Image Gently® (imagegently.org), the ACR Guidance Document on MR Safe Practices (acr.org/MR_Safety), and the ACR’s radiation safety webpage (acr.org/Rad_Safety) for additional resources.
ENDNOTE
1. Institute of Medicine. 2000. To Err Is Human: Building a Safer Health System. Washington, DC: The National Academies Press. Accessed Jan. 11, 2018. Available at bit.ly/Err_Human.
By Kevin Wilcox, freelance writer, ACR Press