Continuing Medical Education, Vol 30, No 11 (2012)

Patient safety

Minimising medical error

T B Welzel, MB ChB, Dip PEC (SA), H Dip Int Med (SA), Dip HIV Man (SA), DTM&H, Dip For Med Clin/Path (SA), BSc (MedSc) Hons (Diving Med), EMDM, CML, MMed Sc (Clin Epi)

Senior Lecturer, Division of Emergency Medicine, Department of Surgery, University of Cape Town

Tyson Welzel has diverse interests spanning all aspects of emergency medicine. He is currently a Senior Lecturer in the Division of Emergency Medicine at UCT and co-ordinator of the MPhil (EM) programme, which includes a stream in patient safety.

Correspondence to: T B Welzel (twelzel@earthling.net)



The casual observer may think that the topic of this article refers to the security measures available to minimise personal violence and theft, a sad reflection of local conditions. It refers, however, to the measures and systems that have to be put in place to minimise medical error and patient harm. The patient safety movement is now 13 years old, launched by the publication of the US Institute of Medicine (IOM) report To Err is Human.1 The basic premise at the time was that up to 98 000 Americans were estimated to die annually because of medical error (a figure still criticised by many as either too high or too low). Indeed, subsequent studies in a number of different healthcare environments put the adverse event rate, as a percentage of admissions, at between 3% (Utah-Colorado Study2 and Harvard Medical Practice Study3) and 17% (Quality in Australian Health Care Study4).

This variation stems from the methodology employed to determine the medical error rate: administrative databases, retrospective record reviews or critical incident registers suffer from reporting bias, i.e. many errors are simply not identified or reported. Prospective studies involving the shadowing of clinicians often suffer from the Hawthorne effect. In addition, the political motives of the authors need to be taken into account when analysing such literature. 

A number of studies have followed the abovementioned findings. The message remains the same: around 10% of all patients entering hospitals are harmed in one way or another, and 2% die because of a medical error.5 The majority of the available data are on hospital-based in-patient care; hospital-based ambulatory care and community-based private practice are usually not assessed, for logistical reasons. No real data exist on error rates in South African healthcare. A recent guesstimate of the medical error rate in developing countries (including South Africa) was 8.2% of admissions.6 The authors of that paper freely admit that this figure is probably a significant underestimate.

These are staggering numbers when one is confronted with them for the first time: if every 10th customer in a shop fell victim to an error, the shop would be out of business very quickly. Many high-risk industries (e.g. chemical, nuclear, aviation) have taken this to heart, to the point that an air traveller now runs a 1:1 000 000 risk of being harmed, in stark contrast to the 1:300 risk a patient runs of being harmed during any form of healthcare intervention.7 The obligation is even stronger in medicine: medical practitioners take an oath to look after patients, following the maxim primum non nocere.
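To put that contrast in numbers, a simple worked ratio using the two figures above:

\[
\frac{P(\text{harm per healthcare encounter})}{P(\text{harm per flight})} = \frac{1/300}{1/1\,000\,000} = \frac{1\,000\,000}{300} \approx 3\,333
\]

In other words, on these figures a patient entering any healthcare intervention runs a risk of harm roughly three thousand times that of an air traveller.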

Why are we in South Africa not more concerned, when most other countries have made patient safety a priority and have dedicated considerable resources and personnel to research and implementation within the confines of multiple patient safety institutes and programmes?

Should a developing nation look at patient safety?

This interesting question may be answered with another question: can a developing nation afford not to invest in patient safety in all its activities? Each medical error, misdiagnosis or surgical mishap can result in treatment delays, worsened outcomes or prolonged care, which translates directly into wasted healthcare money. It can be likened to a bucket with holes in it: a fraction of the water poured into the bucket immediately flows out again. In the USA, it is estimated that every dollar spent on trying to heal may cause up to 45 cents' worth of harm.8 In addition, there are social costs (e.g. the cost to society of time away from work or of caring for a sick family member) that in the USA add up to almost $1 trillion per year.8
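A rough worked illustration of the 'leaky bucket', assuming (purely for illustration) that the 45-cent estimate above scales linearly with spending:

\[
\$1.00 - \$0.45 = \$0.55 \text{ of un-offset value per dollar spent}
\]

so a hypothetical \$100 million hospital budget could, at the upper bound, carry up to \$45 million in harm-related costs. The figures are illustrative only; the point is that safety spending plugs the holes rather than pouring in more water.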

What is patient safety?

Patient safety is defined by the WHO as ‘the prevention of errors and adverse effects to patients associated with health care’.9 The automatic implication is that if we do not measure the outcomes and impact of our interventions and actively track trends, we are unlikely to know where we are going wrong.

The cornerstone of such prevention is the realisation that medical intervention is in itself risky, and becoming ever more so as technology progresses. The medical professional, being human, is only a small (but probably the most error-prone) part of the system that provides patient care.10 That care, in turn, depends on a complex system of providers, technologists, consumables, machines, medications, medical devices and governance structures. Apart from many procedures, medications and implants being inherently associated with possible adverse events, a simple, seemingly non-medical component of this system can have the same effect (e.g. non-availability of adequate staff or water, a power failure, or a poorly written protocol). Hence, one needs to adopt an approach of ‘systems thinking’ rather than simply looking for pure ‘component failures’ when assessing an adverse incident (Table 1). The exciting aspect of patient safety is that it demands intersectoral collaboration: unless we draw on the expertise of the industrial and behavioural psychologist, human factors engineer, ergonomist and systems engineer, we will not be able to improve our outcomes dramatically.

Table 1. Factors contributing to adverse events in developing countries (in descending order of importance)

• Inadequate training or supervision of clinical staff

• No protocol, no policy or failure to implement

• Inadequate communication or reporting

• Delay in providing service

• Defective equipment or supplies

• Unavailable equipment or supplies

• Inadequate functioning of hospital services

• Inadequate staffing

From: Wilson RM, Michel P, Olsen S, et al. Patient safety in developing countries: retrospective estimation of scale and nature of harm to patients in hospital. BMJ 2012;344:e832.


A number of authors have attempted to classify the priority areas in patient safety, yet the list compiled by the Research Priority Setting Working Group of the World Alliance for Patient Safety is currently the most apt (Table 2), taking the systems approach into consideration, and looking at structural factors, processes and outcomes.

Table 2. Unsafe medical care

Structural factors

• Organisational determinants

• Structural accountability (accreditation and regulation)

• Safety culture

• Training, education and human resources

• Stress and fatigue

• Production pressure

• Lack of appropriate knowledge and its transfer

• Devices and procedures with no human factor

Processes

• Misdiagnosis

• Poor test follow-up

• Counterfeit and substandard drugs

• Inadequate patient safety measures

• Lack of patient involvement in patient safety

Outcomes

• Adverse events due to drug management

• Adverse events and injuries due to medical devices

• Injuries due to surgical and anaesthetic errors

• Healthcare-associated infections

• Unsafe blood products

• Safety of pregnant women and newborns

• Safety of the elderly

• Injuries due to falls in hospitals

• Decubitus ulcers

From: Jha A, ed. Summary of the Evidence on Patient Safety: Implications for Research. The Research Priority Setting Working Group of the World Alliance for Patient Safety. Geneva: WHO, 2008.


Culture of safety

Adopting systems thinking would be a huge step forward for everyone working in the healthcare system. In contrast to many other industries, healthcare in much of the world is locked into its own perception of itself: for most medical slip-ups, mishaps and errors, the clinician involved is blamed. Managers and supervisors favour this approach, as the problem is quickly identified and can ‘easily’ be addressed by retraining, implementing a new procedure or getting rid of the ‘rotten apple’. It does not, however, allow the organisation to learn from the mistake and identify the various factors that might have had a bearing on the incident.

If a doctor prescribes the incorrect dose of a medication, or a nurse administers the wrong medication to a patient, did either of them set out to do that harm? In the vast majority of cases they have spent years training and fine-tuning their skills to prevent just such an incident. Identifying the error is not the end; one needs to find the ancillary factors. How many patients does the clinician have to deal with simultaneously? How many distractions are there (e.g. is the clinician also taking outside calls and liaising with other clinicians and family members while caring for patients)? How many hours of sleep did the clinician have that week? Do the medication names sound alike, or do the vials look similar? Is a check procedure in place before medications are administered? While not absolving the individual of responsibility, every part of the process needs to be examined to determine whether it contributes positively as an ‘error net’. Any error rate is worsened or improved by the surrounding system, which is governed by a number of factors, i.e. environmental, stress-related and intrinsic factors.

High-reliability versus normal accident theorists

Two camps with fundamentally opposing views exist, i.e. those who believe one can safeguard a system to eventually reach a near-zero error state and those who believe that, in spite of every attempt at improving a system, serious accidents and errors can and will always happen.

There are merits and limitations to both schools of thought. Both recognise that the human being is the weak link in any system and that an organisation needs to be designed to mitigate and strengthen against this weakness. Indeed, normal human error rates lie at around 0.003 for errors of commission and 0.01 for errors of omission, rising dramatically to 0.25 in a high-stress environment.11 The realisation is that humans make mistakes even when they try not to, and that doctors are human. Yet most doctors do not admit to making mistakes, although admitting them is the first step in self-correction. In a fascinating series of paediatric arterial switch operations observed by a human factors specialist and reported by James Reason,12 the best surgeons were not those who made the fewest mistakes, but those who were pushing the envelope and performing the riskier, more demanding operations. The surgeons with the fewest adverse events were those who anticipated mistakes and corrected them rapidly.12
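A hypothetical worked example shows why these baseline rates matter, assuming (purely for illustration) that an episode of care involves 50 independent steps, each subject to the omission rate above:

\[
P(\text{all 50 steps error-free, baseline}) = (1 - 0.01)^{50} \approx 0.60
\]
\[
P(\text{all 50 steps error-free, high stress}) = (1 - 0.25)^{50} \approx 5.7 \times 10^{-7}
\]

At baseline, a completely error-free episode is still the most common outcome; under sustained high stress it becomes vanishingly unlikely, which is one reason stress and fatigue appear among the structural factors in Table 2.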

The high-reliability theory is based on the idea that four factors are necessary to create and maintain safety:

• Organisational leaders treat safety as a top priority and include it in planning and decision making.

• Multiple back-up systems exist to take over should a component fail (e.g. having multiple staff check the same step, and second and third on-call rostered personnel in case of illness).

• Authority is decentralised to the local level to allow for faster decision making, and personnel are continuously trained and retrained as per local requirements.

• Organisational learning takes place.13

The normal accident theory holds that serious incidents are inevitable, and become more likely as coupling and complexity within an organisation increase. Coupling refers to how closely one process is linked to another in time or distance, and complexity to how predictable an outcome is from its inputs. Healthcare, especially in acute care hospitals, is regarded as a tightly coupled, high-complexity system,14 on a par with the nuclear industry or chemical factories. This theory predicts that each of the four elements put in place by the high-reliability theorists is ‘either ineffective, unlikely to be implemented or even counterproductive’.13

Although at first glance these two theories seem mutually exclusive, they are not. While each medical system should attempt to put measures in place, as suggested by the high-reliability theory, it should also recognise that one cannot make medicine inherently safe and error free. With such a systems approach in mind, administrative and clinical managers first need to identify possible problems in their systems pre-emptively and then, when a patient-related incident occurs, not scapegoat the practitioner involved but look at the enablers in the system that failed to prevent the error. Reason's ‘Swiss cheese model’ is one way to look at redundancy and error prevention.15 Each ‘slice’ represents a barrier to error and harm, yet there are always holes in each barrier. Each slice can catch errors that creep through a preceding step; unfortunately, at times, the holes align and patient harm results.15
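A minimal simulation sketch of the idea, in Python (illustrative only: the three barriers and their 10% 'hole' probabilities are assumptions for the example, not figures from Reason):

```python
import random

def swiss_cheese_pass_rate(n_cases: int, hole_probs: list[float],
                           seed: int = 1) -> float:
    """Estimate the fraction of error-carrying cases that reach the patient.

    Each barrier (slice) independently fails to catch the error, i.e. 'has
    a hole', with its own probability; harm occurs only when all holes align.
    """
    rng = random.Random(seed)
    harmed = 0
    for _ in range(n_cases):
        if all(rng.random() < p for p in hole_probs):
            harmed += 1
    return harmed / n_cases

# Three barriers (say, prescriber, pharmacist and administering nurse),
# each missing 10% of errors: expected pass-through rate = 0.1^3 = 0.001.
print(swiss_cheese_pass_rate(1_000_000, [0.1, 0.1, 0.1]))

# Remove one layer of redundancy and the expected rate rises tenfold, to 0.01.
print(swiss_cheese_pass_rate(1_000_000, [0.1, 0.1]))
```

For truly independent barriers the analytic answer is simply the product of the hole probabilities; the simulation merely makes the 'alignment of holes' explicit. Note the independence assumption: two checks performed by the same exhausted person are, in effect, a single slice.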

Quo vadis?

As the South African healthcare system becomes more stressed, it is meeting ever greater patient numbers with the same or diminishing resources. Over the past decades, any remaining redundancy in personnel or equipment has been lost, authority has been centralised (often owing to a lack of local capacity) and upper management pays only lip service to patient safety issues. Introducing patient safety thinking is not a nicety, but a necessity if we are to uphold the quality of care and protect our patients. At the same time, it recognises that the care a patient receives depends on a number of important factors, not just on the clinician who finally sees the patient.

Why do highly trained pilots use checklists for routine take-off and landing, yet doctors feel that it is demeaning to do so? Something as simple as a pre-procedure checklist has already been shown to have a huge impact on error reduction in theatres,16,17 but such checklists need not be limited to that environment. Patient safety is a huge and complex topic and should be every healthcare worker's concern; I would encourage you to use the references to expand your reading on the topic.
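As a toy sketch of how mechanically simple such a checklist is (the items below are generic illustrations, not the actual WHO Surgical Safety Checklist):

```python
# A pre-procedure checklist as a forcing function: any unconfirmed
# item blocks the procedure until it is actively ticked off.
PRE_PROCEDURE_CHECKLIST = [
    "Patient identity confirmed",
    "Procedure and site confirmed and marked",
    "Allergies reviewed",
    "Equipment checked and working",
]

def outstanding_items(confirmed: set[str]) -> list[str]:
    """Return every checklist item that has not yet been confirmed."""
    return [item for item in PRE_PROCEDURE_CHECKLIST if item not in confirmed]

missing = outstanding_items({"Patient identity confirmed", "Allergies reviewed"})
if missing:
    print("STOP - do not proceed until confirmed:")
    for item in missing:
        print(" -", item)
```

The value lies not in the software but in the discipline the list enforces: nothing proceeds on memory or seniority alone.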

 

In a nutshell

• Patient safety is everyone’s concern.

• Adverse events and medical errors are costing the healthcare system millions in lost funds every year.

• Medical practitioners are human and make mistakes, even those with the best intentions.

• We need to analyse the systems in which we work to determine which factors are error traps and which ones are safety nets.

• If we do not measure our current practices, we do not know how well or how badly we are operating and therefore cannot determine which areas need improvement.

• We need to start developing a culture of safety at our institutions and in our practices.

• One of the best ways to start identifying error traps is to introduce a no-fault, anonymous adverse event reporting system; a minimal sketch of such a report record follows below.
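As a closing sketch, here is one minimal shape such a report record might take (the fields are illustrative assumptions; note the deliberate absence of patient and staff identifiers, which is what makes no-fault reporting possible):

```python
from dataclasses import dataclass, field

@dataclass
class AdverseEventReport:
    """A single no-fault, anonymous adverse event report.

    Deliberately contains no names, staff numbers or patient identifiers,
    so that reporting carries no personal risk and errors surface early.
    """
    event_type: str        # e.g. "medication error", "equipment failure"
    care_setting: str      # e.g. "ward", "theatre", "emergency centre"
    harm_occurred: bool    # did the event reach and harm the patient?
    contributing_factors: list[str] = field(default_factory=list)
    narrative: str = ""    # free text, scrubbed of identifiers

report = AdverseEventReport(
    event_type="medication error",
    care_setting="ward",
    harm_occurred=False,
    contributing_factors=["look-alike vials", "interruptions", "understaffing"],
)
print(report)
```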

References
    1. Kohn LT, Corrigan JM, Donaldson MS. To Err is Human: Building a Safer Health System. Institute of Medicine. Washington, DC: National Academy Press, 1999.

    2. Thomas EJ, Studdert DM, Runciman WB. Incidence and type of adverse events and negligent care in Utah and Colorado. Med Care 2000;38(8):261-271.

    3. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalised patients. New Engl J Med 1991;324(6):370-376.

    4. Wilson RM, Runciman WB, Gibberd RW, et al. The Quality in Australian Health Care Study. Med J Aust 1995;163:458-471.

    5. Reason J. Delivering Patient Safety: Facing the Facts. Edmonton, Canada: Canadian Patient Safety Institute, 2007.

    6. Wilson RM, Michel P, Olsen S, et al. Patient safety in developing countries: retrospective estimation of scale and nature of harm to patients in hospital. BMJ 2012;344:e832.

    7. WHO. 10 Facts on Patient Safety. http://www.who.int/features/factfiles/patient_safety/en/index.html# (accessed 12 August 2012).

    8. Goodman JC, Villarreal P, Jones B. The social cost of adverse medical events, and what we can do about it. Health Aff 2011;30(4):590-595.

    9. WHO Europe. What We Do: Patient Safety. http://www.euro.who.int/en/what-we-do/health-topics/Health-systems/patient-safety (accessed 12 August 2012).

    10. Reason J. The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries. Surrey, England: Ashgate Publishing, 2008.

    11. Smith DJ. Appendix 6: Human error rates. In: Reliability, Maintainability and Risk: Practical Methods for Engineers including Reliability Centred Maintenance and Safety-Related Systems. 7th ed. Oxford: Elsevier Butterworth-Heinemann, 2005:310-311.

    12. Reason J. Surgical excellence (1995-97). In: The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries. Surrey, England: Ashgate Publishing, 2008:184-188.

    13. Sagan SD. The origins of accidents. In: The Limits of Safety. Princeton, New Jersey: Princeton University Press, 1993:11-52.

    14. Gaba DM. Structural and organizational issues in patient safety: a comparison of health care to other high-hazard industries. Calif Manage Rev 2000;43:1-20.

    15. Reason J. Managing the Risks of Organisational Accidents. Aldershot: Ashgate Publishing, 1997.

    16. Gawande A. The Checklist Manifesto: How to Get Things Right. London: Profile Books, 2010.

    17. McConnell DJ, Fargen KM, Mocco J. Surgical checklists: A detailed review of their emergence, development, and relevance to neurosurgical practice. Surg Neurol Int [serial online] 2012 [cited 2012 Aug 18];3:2. http://www.surgicalneurologyint.com/text.asp?2012/3/1/2/92163