
Journal of the Academy of Hospital Administration

Medical Errors and Patient Safety

Author(s): Anandh Rao T*, A K Agarwal**

Vol. 15, No. 1 (2003-01 - 2003-06)

Abstract:

Healthcare institutions today are complex matrix organisations. Errors are bound to occur in any complex human endeavour, and healthcare is no exception. Medical errors are ubiquitous, and their human and financial costs are substantial. Many practice systems have developed by evolution rather than design, and the inherent faults that lead to errors are the result of a cluster of organisational pathologies, the "vulnerable system syndrome". Approaches to patient safety should focus on latent errors, which represent failures of system design and process. The top priority must be to redesign systems to prevent, detect, and minimise the effects of undesirable combinations of design, performance, and circumstance. Safety improvement through system monitoring and feedback, and through system and process redesign, in the aviation and nuclear power industries holds many lessons for healthcare. Institutions that treat patient safety as a high priority should have a blame-free, non-punitive system for reporting errors in medical care to peer-review-protected committees empowered to institute system-wide changes that prevent future errors.

Keywords: Medical Errors, Vulnerable System Syndrome, Risk Management Process, System Changes.

Introduction

Newspaper and television stories of catastrophic injuries occurring at the hands of clinicians spotlight the problem of medical error but provide little insight into its nature or magnitude. These horrific cases that make the headlines are just the tip of the iceberg.1

A report from the Institute of Medicine, USA, states that around 100,000 patients a year die from preventable errors in American hospitals.2 This annual toll exceeds the combined number of deaths and injuries from motor vehicle and air crashes, suicides, falls, poisonings, and drownings. In Britain, medical error is the third most frequent cause of death after cancer and heart disease, and kills four times more people than all other types of accidents combined.3 Around 850,000 medical errors occur per year in the UK, resulting in up to 40,000 unintended patient deaths plus other harm.3

Medical errors are ubiquitous and their human and financial costs4,5 are substantial. Patients injured as a result of a medical error spend longer in hospital6,7 and incur higher hospital costs; in the Utah and Harvard studies, length of stay increased by 1.9 to 2.2 days as a result of adverse drug events.8 The annual cost of such "loss" to NHS organisations in the UK is estimated at around 20% of budget.9 According to the CII-McKinsey study of 2002, total expenditure on healthcare in India is expected to more than double by the year 2012, to around Rs 2,00,000 crore.10 Even at Western rates of error, the amount lost to medical errors will be enormous.

Brennan et al reviewed the medical charts of 30,121 patients admitted to 51 acute care hospitals and found that 69% of injuries were caused by errors.14 The Harvard study of medical practice and a study of the quality of Australian healthcare found that medical errors occur in 4-17 percent of admissions, and that 30-51 percent of these adverse events were considered preventable and represented suboptimal care.1 Non-preventable adverse events, in contrast, represent anticipated and unavoidable "complications". Donchin et al, in an observational study at a university hospital in Israel, reported 1.7 errors per patient per day in a medical-surgical intensive care unit.1

All physicians, after all, have had the unwelcome experience of becoming what Wu calls "the second victim":4,5 being involved in an error or patient injury and feeling the attendant sense of guilt or remorse as responsible professionals. Familiar, too, are Helmreich's findings that doctors, like pilots, tend to overestimate their ability to function flawlessly under adverse conditions such as time pressure, fatigue, or high anxiety.5

Vulnerable System Syndrome9

The complexity of healthcare institutions derives from several factors, but perhaps the most significant is the presence of many defences, barriers, safeguards, and administrative controls designed to protect potential victims from local hazards. As in all well-defended systems, a mishap requires some assistance from chance to bring about such a low-probability event. The greater the complexity of the system, the more likely it is that some measure of bad luck is involved in achieving the precise conjunction of defensive gaps and weaknesses necessary to permit an adverse event. This view is summarised in the "Swiss cheese" model of accident or error causation.

(Figure not available: The "Swiss cheese" model of accident causation.)

Notwithstanding the chance element, however, investigations of accidents in a number of hazardous domains suggest that a cluster of organisational pathologies - the "vulnerable system syndrome"9 (VSS) - renders some systems more liable to adverse events. VSS is present to some degree in all organisations, and the ability to recognise its symptoms is an essential skill in the progress towards improved patient safety.
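The layered-defences argument can be made concrete with a small simulation. The sketch below is illustrative only: it assumes each defence independently has a fixed chance of a gap (the hole in one cheese slice) and estimates how often all the gaps line up to let a hazard through; the probabilities are invented, not healthcare data.

```python
import random

def breach_probability(layers, p_hole=0.05, trials=100_000):
    """Monte Carlo estimate of the chance that a hazard slips through
    every defensive layer at once. Each layer independently has a
    'hole' with probability p_hole (an illustrative assumption)."""
    hits = sum(
        1 for _ in range(trials)
        if all(random.random() < p_hole for _ in range(layers))
    )
    return hits / trials

# Each added layer multiplies the breach probability by p_hole,
# so even modest defences make the 'holes lining up' a rare event.
single = breach_probability(1)   # close to p_hole (0.05)
triple = breach_probability(3)   # close to 0.05 ** 3
```

This is why the model predicts that adverse events in well-defended systems require "assistance from chance": only a rare conjunction of gaps across every layer permits the accident trajectory.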

What are medical errors?

Many questions need to be answered. What constitutes a medical error? What is the magnitude of medical errors? What are the possible surrogate markers that can be used to measure error? What are the clinical consequences of these errors? Who is being injured by the errors? Most importantly, what are the causes of these errors and what can be done to prevent them? These questions can be answered only by rigorous study with focused research objectives.7

Medical errors are a serious threat to patient safety in both hospitals and the community.8 Greater public awareness of clinical error, combined with rapidly increasing litigation and insurance costs, has created a pressing need to implement risk management in hospitals to improve patient safety.

Error - Failure of a planned action to be completed as intended (error of execution) or use of a wrong plan to achieve an aim (error of planning); the accumulation of errors results in accidents 13, 22, 23.

Active error - An error that occurs at the level of the frontline operator and whose effects are felt almost immediately.13

Latent error - Errors in design, organization, training, or maintenance that lead to operator errors, and whose effects typically lie dormant in the system for lengthy periods of time.13

Medical errors happen when something that was planned as part of medical care does not work out, or when the wrong plan was used in the first place.10 They can occur anywhere in the healthcare system, from hospitals to patients' homes, and can involve medicines, surgery, diagnosis, equipment, lab reports, and more. They can happen during even the most routine tasks, such as when a hospital patient on a salt-free diet is given a high-salt meal, and also when doctors and their patients have problems communicating. Almost anything can go wrong; Murphy's Law applies to healthcare too, with serious consequences.

Approaches to reduce medical errors

The most commonly cited taxonomy of human error in the medical literature is based on the work of James Reason. Reason describes 2 major categories of error: active error, which generally occurs at the point of human interface with a complex system, and latent error, which represents failures of system design.12

The goal of any taxonomy is to capture the most salient information in the hope that it represents the universe of errors and system failures. Four complementary classifications detailed in the taxonomy are as follows:26

  • Impact - the outcome or effect of error and systems failure, commonly referred to as harm;
  • Type - the perceptible, outward, or visible process that was an error or a failure;
  • Domain - identifies where a health care error and systems failure occurred and the type of individual involved; and
  • Cause - the factors and agents that bring about a health care error and systems failure.
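The four complementary axes can be pictured as fields of a classification record. The sketch below is a minimal illustration; the field names and example values are assumptions for exposition, not the taxonomy's official schema.

```python
from dataclasses import dataclass

@dataclass
class ErrorRecord:
    """One reported error or system failure, classified on the four
    complementary axes of the taxonomy (names are illustrative)."""
    impact: str   # the outcome or effect -- the harm caused
    type: str     # the perceptible, outward process that failed
    domain: str   # where it occurred and the individuals involved
    cause: str    # the factors and agents that brought it about

# A hypothetical medication error, classified on all four axes:
example = ErrorRecord(
    impact="temporary harm, prolonged stay",
    type="wrong dose dispensed",
    domain="inpatient pharmacy / pharmacist",
    cause="look-alike packaging of two strengths",
)
```

Classifying every report on all four axes lets an organisation aggregate incidents by harm, by failure mode, by setting, or by cause, which is the point of a taxonomy intended to "represent the universe of errors".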

In 1997, the Joint Commission on the Accreditation of Healthcare Organizations (JCAHO) mandated the use of Root Cause Analysis in the investigation of sentinel events in accredited hospitals.12 Root Cause Analysis provides a structured, process-focused framework with which to approach sentinel event analysis. Systems and organizational issues can be identified and addressed, and active errors acknowledged. Systematic application of Root Cause Analysis may uncover common root causes that link a disparate collection of accidents (e.g., a variety of serious adverse events occurring at shift change). Careful analysis may suggest system changes to prevent future incidents.

Of the first 112 Root Cause Analysis summaries the JCAHO reviewed, the root causes in 70% of cases (often more than one per case) related to orientation/training, the patient assessment process, the communication process, the physical environment, and information non-availability.3 The remaining 30% related to staff competency, equipment factors, staffing levels, and storage and access issues.

While medical errors are not injuries in the classic sense, their multifactorial etiology bears a remarkable resemblance to other forms of injury. That is, medical errors frequently result from interactions between the host (the patient), the environment (the healthcare system), and the vector of transmission (often healthcare workers).7 The Institute of Medicine report suggests that environmental (healthcare system) factors may be the most important contributors to medical error causation.

In an Australian study, errors of omission outnumbered errors of commission by 2 to 1 in general, but the reverse was true in the emergency department.15 This indicates the effect of the operating system and the environment on patient safety and errors.

Developing Standard Operating Procedures (SOPs) for each function and area in the hospital, together with induction training, will prevent errors to a great extent.

Paradigms from Other Fields

The likelihood of dying per domestic jet flight is estimated to be one in eight million; statistically, an average passenger would have to fly around the clock for more than 438 years before being involved in a fatal crash.27 What is the chance of being discharged from a contemporary healthcare facility without being subjected to a medical error?

Errors result from complex human-system interaction. In many instances, there is no need to re-invent the wheel. Cognitive psychology, various organizations, and fields other than medicine7,13 have already made great advances in studying and reducing human error. Perhaps the area with the most effort and success has been aviation. Several studies of the cognitive and procedural skills of pilots have illustrated the inherent limitations of human performance and how personal and environmental factors can influence the occurrence of errors.7 Within medicine, the specialty of anesthesiology has also made significant advances in the reduction of human error, focusing on cognitive and procedural aspects of medical care. Many of these advances, from both inside and outside the medical arena, could readily be adapted to study medical errors and reduce their occurrence.

The field of anesthesia has clearly made tremendous strides in improving patient safety over the past 50 years.12 It is hard to discern a particular, isolated practice that accounts for this clear and dramatic secular change in safety. While a pragmatist might argue, "Who cares, as long as it's safe?", adapting the lessons of anesthesia (or, for that matter, aviation) to the rest of healthcare is made more challenging by this tenuous causality.

Systems factors and safety

Strategies for the design of safe systems of care should focus on:20

  • Preventing errors
  • Making errors visible
  • Mitigating the effects of error

An automated teller machine that dispenses cash follows one of two sequences to complete a transaction: some machines dispense the money first and then return the card, while others reverse these steps. Since the aim of the transaction is to obtain the money, common sense and research in human factors predict that the person using the machine is more likely to forget the card if it is returned after the money is dispensed. The order is designed into the system and produces a predictable risk of error.

Like the card forgotten at the automated teller machine, many adverse events result from an error made by a person who was capable of performing the task safely, had done so many times in the past, and faced significant personal consequences for the error.20
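The card-before-cash ordering is an example of designing the error out of the sequence itself, sometimes called a forcing function in human factors work. The toy sketch below (all names invented for illustration) shows the safe ordering encoded so the error-prone step simply cannot occur:

```python
class ATM:
    """Toy model of the teller-machine example above (not real ATM
    software). Returning the card *before* dispensing cash means the
    user cannot walk away with cash while the card is still inside."""

    def __init__(self):
        self.events = []

    def return_card(self):
        self.events.append("card returned")

    def dispense_cash(self, amount):
        self.events.append(f"dispensed {amount}")

    def withdraw(self, amount):
        # Safe ordering: the goal step (cash) comes last, so
        # 'forgetting the card' is designed out of the system.
        self.return_card()
        self.dispense_cash(amount)

atm = ATM()
atm.withdraw(100)
# atm.events is ["card returned", "dispensed 100"]
```

The point is not the code but the principle: the system, not the user's vigilance, guarantees the safe sequence.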

Medical practitioners rely heavily on their administrative support staff for the smooth running of their practice. In many practices, systems have developed by evolution rather than design.11 Some work perfectly well, but others may have inherent faults that are only exposed when a problem occurs.

When unique events from an event reporting system for transfusion medicine were subjected to Root Cause Analysis, human failure accounted for 46% of causes, technical failures for 27%, and organizational failures for 27%.12 A similar distribution is seen in the petrochemical industry, perhaps an indication of the universality of the causes of error in complex systems, regardless of industry.

Although we cannot change the aspects of human cognition that make us err, we can design systems that reduce error and make care safer for patients, which in turn reduces the errors that arise from gaps in human cognition.

Vincent et al have suggested a nested hierarchy of factors that determine the safety of a healthcare system.12 The factors relate to institutional context, organisation and management, work environment, care team, individual team member, task, and patient. The proximal causes of errors and adverse events are usually associated with some combination of the care team, one of its members, the task performed, and the patient. Cook and Woods have called this part of Vincent's hierarchy the "sharp end" of the healthcare system. To prevent errors, these factors must be considered in the design of the care system. However, the less obvious factors - institutional context, organisation and management, and work environment, the so-called "blunt end" of the system - must also be addressed to mitigate latent errors due to systemic flaws.

Many tactics are available to make system changes that reduce errors and adverse events; they fall into five categories:20

  • Reducing complexity
  • Optimising information processing
  • Automating wisely
  • Using constraints
  • Mitigating the unwanted side effects of change

These tactics can be deployed to support any of the three strategic components of error prevention, detection, and mitigation; most importantly, they should be reviewed and renewed based on feedback on organisational performance.

Process framework for a Safer Healthcare

Healthcare provision is complex, and the task of reducing error is not simple. One framework that has been successfully used by many organisations is the Risk Management Standard AS/NZS 4360:1999.1,12 This process has been further adapted for use in healthcare to provide a comprehensive approach to managing risk.

Risk Management Process AS/NZS 4360:1999 - Risk Management

(Figure not available: the AS/NZS 4360:1999 risk management process.)

An effective risk-management programme will be based on the following core elements:11

  • identifying each risk
  • measuring the identified risk in terms of magnitude and frequency of occurrence
  • prioritising and controlling the risk
  • constantly monitoring the effectiveness of control measures

Steps in the Risk Management of errors

Based on the concepts of active and latent errors described above, accident analysis is generally broken down into the following steps:12

Data collection: establishing what happened through structured interviews, document review, and/or field observation. These data are used to generate a sequence or timeline of events preceding and following the event.

Data analysis: an iterative process to examine the sequence of events generated above, with the goal of determining the common underlying factors.

  • Establishment of how the event happened, by identification of active failures in the sequence.
  • Establishment of why the event happened, through identification of latent failures in the sequence, which are generalisable.

Healthcare practitioners themselves are best placed to identify where the weak links in a system may lie, so practitioners must be encouraged to carry out self-assessment of risks and of the effectiveness of the measures in place for managing those risks.18 The proposed steps are:

Step 1: Identifying risks

Practitioners examine the processes within their work and identify the key operational risks, e.g., systems for repeat prescriptions, handling of test results,18 checking instruments before and after surgery, receipt of telephone messages, and so on.

Step 2: Determining the cause

Practitioners then identify the sorts of situations that could cause a breakdown in care, for example the absence of a system ensuring that test results are seen and acted upon before being filed.

Step 3: Considering the consequences

The consequences of each risk for the practice are then assessed. These are likely to include injury or harm to patients or staff, death, and damage to reputation.

Step 4: Assessing the likelihood of an adverse incident

Practitioners should then take a view on how likely it is that the practice will be exposed to each risk.

Step 5: Determining the risk

Once the consequences and likelihood of system failure in each of the identified areas have been assessed, it is possible to rate the risks in order of significance. The highest rating is given to system failures with the worst consequences and the greatest likelihood of occurring.
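Step 5 amounts to ordering risks by a combined score of consequence and likelihood. A minimal sketch follows, assuming hypothetical 1-5 scales and invented example risks; real programmes define their own rating bands and matrices.

```python
def risk_rating(consequence: int, likelihood: int) -> int:
    """Score a risk on hypothetical 1-5 scales for consequence and
    likelihood; worst-and-most-likely failures score highest."""
    return consequence * likelihood

# Invented example risks (consequence, likelihood), echoing Step 1:
risks = {
    "test result filed before being actioned": (4, 4),
    "repeat prescription issued without review": (3, 4),
    "instrument count missed after surgery": (5, 1),
}

# Rank risks in order of significance, most serious first.
ranked = sorted(risks, key=lambda name: risk_rating(*risks[name]),
                reverse=True)
```

The ordering, not the absolute numbers, is what matters: it tells the practice which controls to review first in Steps 6 and 7.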

Step 6: Identifying and reviewing controls

Having identified and assessed the risks, practitioners review the controls that are already in place. Are they adequate? How can they be improved? Do new controls need to be introduced?

Step 7: Action plan

The practice now understands where its risks lie and the controls needed to manage them. A clear action plan for improvement can be drawn up, introducing new and better controls and systems for managing risk, e.g., the development of standard operating procedures.

This will bring the following significant benefits to the practice:11

  • It highlights that everyone has a part to play in minimizing risk to patients.
  • It raises awareness among staff of the reasons for, and the need to adhere to, processes and systems.
  • By bringing the practice team together to identify problems and find solutions, there is a much greater chance that all staff will feel a sense of ownership of any procedural changes and will therefore be more committed to making them work well.

Errors made by an individual often reflect system-wide problems.13 The correct response is to redesign systems so that errors are acknowledged, detected, intercepted, and mitigated.2 The IOM report "Crossing the Quality Chasm" points out the obvious: trying harder will not work; changing systems of care will. Poor designs set the workforce up to fail, regardless of how hard they try.21

Systems should be designed to catch errors as early as possible; early error recognition may allow corrective or rescue steps to be taken before injury occurs.13 The identification of potential and preventable adverse events by clinicians and hospitals, through active surveillance systems, voluntary non-punitive reporting, quality improvement systems, and chart sampling, is a reasonable starting point.

Discussion

The decades-long aviation effort to improve safety through system monitoring and feedback holds many important lessons for healthcare. Even in the highly charged political, financially accountable, and legal environment of the nuclear power industry, no penalties are associated with reporting non-consequential events, or "close calls", to the human performance enhancement system.13,17

Despite the pressures, constraints, resistance to change, and other seemingly insurmountable barriers, it is simply not acceptable for patients to be harmed by the same healthcare system that is supposed to offer healing and comfort. "First, do no harm" is an often-quoted maxim attributed to Hippocrates; at a very minimum, the health system needs to offer that assurance and security to all its stakeholders. Medical students learn how to deal with error from their teachers as well as from the culture of the organisation(s) in which they are trained. A study of physician trainees found that they used three major mechanisms to define and defend medical error: denial, discounting, and distancing.16

A comprehensive approach to improving patient safety is needed. This approach cannot focus on a single solution since there is no "magic bullet" that will solve this problem, and indeed, no single recommendation should be considered as the answer. Rather, complex problems require thoughtful, multifaceted responses.

We are left with our feet firmly planted in the middle of competing paradigms. One argues that an evidence-based, scientific approach has served healthcare well and should not be relaxed simply because a popular practice from a "safer" industry sounds attractive. The other counters that medicine's slavish devotion to the scientific and epidemiologic method has placed us in a patient-safety straitjacket, unable to consider the value of practices developed in other fields because of our myopic traditions and "reality".12

Merits can be seen in both arguments. Healthcare clearly has much to learn from other industries. Just as physicians must learn the "basic sciences" of immunology and molecular biology, providers and leaders interested in making healthcare safer must learn the "basic sciences" of organisational theory and human factors engineering. Moreover, the "cases" presented on rounds should, in addition to classical clinical descriptions, also include the Columbia and Challenger tragedies and the successes of Motorola and McDonald's.12 On the other hand, an unquestioning embrace of dozens of promising practices from other fields is likely to be wasteful, distracting, and potentially dangerous. We are drawn to a Cold War-era dictum: "Trust, but verify".

Conclusion

Patients who are sicker, subjected to multiple interventions, and who remain in hospital longer are more likely to suffer serious injury as a result of medical errors.1 Unless we make substantial changes in the organisation and delivery of healthcare, all patients, particularly the most vulnerable, will continue to bear the burden of medical error.

In medicine there is a long tradition of examining past practice to understand how things might have been done differently. However, conferences on morbidity and mortality, grand rounds, medical audit, and peer review currently share the same shortcomings:18,19 a lack of human factors and systems thinking; a narrow focus on individual performance to the exclusion of contributory team and larger organisational issues; hindsight bias; a tendency to search for errors as opposed to the myriad causes of error induction; and a lack of multidisciplinary integration into an organisation-wide safety culture.

A clinical and business risk management programme that is integrated into operational activities will improve organisational performance by enabling a balance between safety and cost.4 Patient safety is a long-term investment that must become a key objective within strategic and business plans. A comprehensive risk management programme will provide assurance to the public, the staff, and the government that providers are doing their reasonable best to improve patient safety within existing operational realities.

The CII-McKinsey study report urged the government to define and enforce minimum standards for healthcare facilities, and to make them mandatory across all parts of the country, in order to provide quality health services.25

Are we ready to change? Or will we procrastinate and dissemble to lament later when the inevitable regulatory backlash occurs? It may seem that the concern for patient safety has just begun, but the patience of the public we serve is already wearing thin. Patients are asking us to promise something reasonable, but more than we have ever promised before: that they will not be harmed by the care that is supposed to help them. We owe them nothing less, and that debt is now due.

The identification of potential and preventable adverse events can occur only in an environment in which patient safety is a high organisational priority, individuals are not blamed for mistakes, and there is a non-punitive system for reporting problems in medical care to peer-review-protected committees empowered to institute system-wide improvements to prevent future error.13

To glimpse the future of error reduction in medicine, visit a department store. Salespeople, tags, and signs help you choose the right items. Store design lowers the chances that you will leave without something you need. Bar codes ensure rapid and accurate item identification, and laser scanning reduces the errors that were common when prices were hand-entered at the counter. Credit card units allow rapid identification of the credit limit, preventing you from spending more than the card allows. A combination of security cameras, security personnel, and sensors at the door prevents you from "accidentally" walking out of the store without paying for an item. A sign at the exit encouraging comments and suggestions allows management to identify additional problems and errors. The benefits of these systems must more than pay for their costs, or they would not be instituted. Customers receive better service and are more satisfied, while the companies reap greater profits.13

References

  1. Saul N Weingart, "Epidemiology of Medical errors", BMJ 2000; 320:774-777.
  2. "Facing up to medical error" (Editorials), BMJ 2000;320.
  3. Stuart Emslie, "Risk Management in the National Health Service in England", ISO General Assembly 2001, Sydney.
  4. Lucian L Leape, "Safe health care: are we up to it?" BMJ 2000;320:725- 726
  5. Albert W Wu, "Medical error: the second victim. The doctor who makes the mistake needs help too", BMJ 2000;320:726-727.
  6. J T Reason, J Carthey, M R de Leval, "Diagnosing 'vulnerable system syndrome': an essential prerequisite to effective risk management", Quality in Health Care 2001;10:ii21-ii25.
  7. Demetrios N. Kyriacou, Jeffrey H. Coben, "Errors in emergency medicine: research strategies", Academic Emergency Medicine 2000; Volume 7, Number 11, 1201-1203.
  8. "Medication Errors and Risk Management in Hospitals", ISMP Medication Alert Vol.3, issue 18, September 9, 1998.
  9. "Clinical Risk Management - A Model for Hospitals". (www.riskmanagment.com.au, accessed on 10/01/2003).
  10. Patient Fact Sheet. "20 Tips to Help Prevent Medical errors". Agency for Healthcare Research and Quality, Publication No.00-P038, February 2000.
  11. "Preventing avoidable harm to patients in general practice". (www.mps-risk-consulting.com, accessed on 10/01/2003).
  12. "Making Health Care Safer - A Critical Analysis of Patient Safety Practices", Root Cause Analysis, Chapter 5, Evidence Report/Technology Assessment, No. 43, Agency for Healthcare Research and Quality, Contract No. 290-97-0013.
  13. Jonathan A. Handler, "Defining, Identifying, and Measuring Error in Emergency Medicine", Academic Emergency Medicine 2000; Volume 7, Number 11, 1183-1188.
  14. Brennan TA, Leape LL, Laird NM, et al. "Incidence of adverse events and negligence in hospitalised patients". N Engl J Med 1991; 324:370-6.
  15. Cherri Hobgood et al, "Medical Errors: Who, What, When, and Why; What the Public Wants to Know", Academic Emergency Medicine 2001; Volume 8, Number 5, 498.
  16. D C Aron and L A Headrick, "Educating physicians prepared to improve care and safety is no accident: it requires a systematic approach", Qual Saf Health Care 2002;11:168-173.
  17. Paul Barach, "Reporting and preventing medical mishaps: lessons from non-medical near miss reporting systems", BMJ 2000;320:759-763.
  18. A K Agarwal, Medical audit and patient care in a hospital, Unpublished Doctoral thesis for M.D., Community Health Administration, NIHFP, 1977, pp. 93, 187.
  19. Thomas W Nolan, "System changes to improve patient safety", BMJ 2000;320:771-773.
  20. A M Kuhn, B J Youngberg, "The need for risk management to evolve to assure a culture of safety", Qual Saf Health Care 2002;11:158-162.
  21. Lucian L. Leape, M.D., "Reporting of Adverse Events", N Engl J Med, Vol. 347, No. 20, November 14, 2002.
  22. S M Dovey, "A preliminary taxonomy of medical errors in family practice", Quality and Safety in Health Care 2002;11:233-238
  23. Schimmel EM, "The hazards of hospitalization", Annals of Internal Medicine 1964; 60:100-110.
  24. Lucian L Leape, "Reporting of medical errors: time for a reality check", West J Med 2001;174:159-161.
  25. CII, Press release, 2002 October.(www.ciionline.org accessed in December 2002)
  26. "Defining 'error': a key performance measurement issue in patient safety", Joint Commission Benchmark, February 2003, Volume 5, Issue 2.
  27. Federal Aviation Administration, Office of System Safety. Aviation Safety Reporting System (ASRS) database [Web page]. 1999.

* Consultant, NIHFW, New Delhi. ([email protected])
** Director, School of Health Sciences, IGNOU, New Delhi.

