Definition. Human error is an inevitable, unpredictable, and unintentional failure in the way we perceive, think, or behave. It is not a behavioral choice; we do not choose to make errors, but we are all fallible.
Examples. Most errors can be classified as either an execution failure, which is a skill-based error, or a planning failure, which is either a rule-based or knowledge-based mistake. Mental slips and lapses are skill-based execution failures. An example of a mental slip is transposing the digits of a medication dose; omitting or forgetting a step in a process is an example of a mental lapse. Programming a new infusion pump according to the directions for an older model is an example of a rule-based mistake. Prescribing an excessive dose of medication because of a knowledge deficit about a patient’s recent weight loss is an example of a knowledge-based mistake.
Causes. Human error is either endogenous (random human error), arising within an individual from an unpredictable cognitive event, or exogenous (system-based human error), in which some feature of the environment contributes to a failure in cognitive processes. The risk of endogenous errors is increased by negative personal performance shaping factors such as anxiety and stress, fatigue, preoccupation and distractibility, fear and dread, sensory deficits, and other psychosocial factors. The risk of exogenous errors is increased by negative system or environmental performance shaping factors, such as low lighting, interruptions and physical distractions, fatiguing staffing patterns, technology glitches, the absence of job aids (e.g., calculators, labels), and unrestricted access to medications. As negative performance shaping factors grow in scope and intensity, the probability of human error rises significantly.
Perceptual biases also contribute to both endogenous and exogenous errors. Examples of perceptual biases include confirmation bias (seeing what you expect or believe), change blindness (failure to detect changes in plain view), and inattentional blindness (failure to see information because attention is focused elsewhere). Cognitive biases may also influence how individuals respond to an error. Examples of cognitive biases include hindsight bias (the tendency to see past events as predictable), normalcy bias (the belief that it will never happen here), and severity bias (the tendency to base the severity of the response on the severity of the outcome).
Management. Since human errors are inevitable, they are best managed within a Just Culture through system redesign that makes systems error-proof or error-resistant. System redesign often requires the integration of high-leverage strategies (e.g., forcing functions, fail-safes, barriers, automation and technology, standardization and simplification) that increase system reliability and reduce or eliminate the risk of errors and/or patient harm. Discipline, including counseling, is neither warranted nor effective in addressing human error, because erring individuals did not intend the action or any undesirable outcome that resulted. In a Just Culture, the only just option is to console the individual who made the error and to redesign systems to prevent future errors.
Furthermore, the potential or actual severity of the error’s outcome should play no role in determining how individuals are treated, even when patients are harmed. Individuals should know that they will be treated fairly when they report their mistakes, and that they will be held accountable for the quality of their choices, not for the human error itself or the severity of its outcome. Severity bias, by contrast, often leads to a “no harm, no foul” response, with missed opportunities to redesign systems and console individuals after a human error.