Errors of several sorts inevitably occur in any process. The larger and more complex the process, the greater the error rate; the more often the process is repeated, the more errors accumulate.
Much of medical practice and healthcare involves large, complex processes that are frequently repeated.
Errors and Variation
Any process will produce results that vary with no identifiable cause. This random variation is distinct from error. A tool for analysing process variation is the Process Control Chart, sometimes called a trumpet chart because of the shape of the control limits drawn on it.
Much of the reward, criticism, and further planning of the NHS has been based upon random variation within those limits.
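The control limits that give the trumpet chart its shape can be sketched numerically. This is a minimal illustration assuming a simple binomial model; the average rate of 10% and the case counts are invented for the example, not taken from any real unit.

```python
import math

def funnel_limits(p_bar, n, z=3.0):
    """Approximate upper and lower control limits for a proportion
    observed over n cases, assuming a binomial model around an
    overall average rate p_bar."""
    se = math.sqrt(p_bar * (1 - p_bar) / n)
    lower = max(0.0, p_bar - z * se)
    upper = min(1.0, p_bar + z * se)
    return lower, upper

# The limits widen as n falls, producing the "trumpet" shape:
# small units show much more random variation than large ones.
for n in (25, 100, 400):
    lo, hi = funnel_limits(0.10, n)
    print(f"n={n:4d}  limits: {lo:.3f} .. {hi:.3f}")
```

A unit whose observed rate lies inside these limits is showing variation the process itself would produce by chance; rewarding or criticising it for that variation is rewarding or criticising noise.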
Errors are necessary in the sense that they cannot be abolished. To abolish them would be to abolish entropy, and would allow perpetual motion (safely!). The difference between error and negligence is subjective, and this often causes major problems in accepting the concept in healthcare.

For example, suppose that for one surgeon, out of every 100 operations, 60 go well, 30 produce no benefit, and in the remaining 10 the patient is left slightly worse off. Another surgeon performing the same operation has a record of 95 going well, 4 producing no benefit, and 1 patient dying on the operating table. The second surgeon is far more likely to be suspected of negligence, because human cultures perceive errors of commission as more important than errors of omission: they tend to be easier to detect, and so easier to blame. The first surgeon's error of omission, failing to practise to the technical standard of the second, is not perceived by most to be as serious as the second surgeon's error of commission, which involved confusing left and right.
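Whether the gap between the two surgeons' records is larger than random variation could explain can be checked with a standard two-proportion test. This sketch is an illustrative addition, not part of the original argument; the figures are those of the example above, and the pooled normal approximation is an assumption.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for the difference between two observed proportions,
    using the usual pooled normal approximation."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# 95/100 good outcomes versus 60/100 good outcomes, as in the example.
z = two_proportion_z(95, 100, 60, 100)
print(f"z = {z:.2f}")  # well beyond ~2, so unlikely to be chance alone
```

The point is that the *technical* gap between the surgeons is real and large, yet attention falls on the single visible error of commission rather than on the forty quietly worse outcomes.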
Reducing errors is certainly possible and desirable. Since error control mechanisms can add to the complexity of an already complex system, they should be used only where justifiable; root cause analysis can help here. Some error reduction mechanisms are simple, such as measures to minimise the number of workers tired on the job. Some are complex, such as the multiple failsafes associated with the nuclear deterrent. Error reducing systems in an organisation tend to become more complex with time, often as much because of changing perceptions of acceptable risk outside the organisation as inside it. To help understand and reduce error, it helps to have a basic understanding of psychology, and of differences such as that between the explicit knowledge any reader of this wiki might acquire and the implicit knowledge of a trained, competent doctor.
The Swiss cheese model
All our systems have holes, and a series of holes that line up can result in patient harm. To catch errors before they have consequences we tend to line systems up in series: a checking system after a deciding one, or reviews on admission, for instance. Occasionally the holes in the systems line up and an error sails through them all; sometimes one or more layers then needs redesigning to patch that. Regrettably, the most common response to such a completed error is to blame the person running the last system that failed to catch it. This is no more sensible than blaming any other layer, and leads to no repairs or improvements, but it is very tempting.
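The layered-defences idea can be put in rough numbers. The miss rates below are invented for illustration, and real layers are rarely independent; the point of the model is precisely that their holes can line up.

```python
# Probability that an error passes every defensive layer, assuming each
# layer independently misses it with some probability. The rates and the
# layer names are illustrative assumptions, not measured figures.
miss_rates = [0.10, 0.10, 0.10]  # e.g. prescriber, pharmacy, nurse checks

p_through = 1.0
for m in miss_rates:
    p_through *= m  # all layers must fail for the error to complete

print(f"chance an error passes every layer: {p_through:.3f}")
```

Three mediocre independent checks stop far more than any one of them alone, which is why blaming only the last layer misreads how the defence actually works.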
Healthcare errors have to be related to the system they are part of. This leads to doctors, rightly or wrongly, being perceived as more error prone in some cultural environments than in others. An understanding of such issues helps put in context the feeling that the cheese is greener elsewhere!
Learning from errors
Learning from errors is made more difficult when punishment will follow even though no harm was intended; hence the no-blame approach. Tools such as those used in root cause analysis can then be applied to minimise risk, even when one event in the whole chain seems worthy of punishment.