|This is the talk page for discussing improvements to the Admitting failure article.|
Several disciplines study failure, trying to learn from it, avoid it, design it out of systems, or clean up the resulting messes. See for example:
- Dörner, Dietrich (1997). The Logic of Failure: Recognizing and Avoiding Error in Complex Situations. Basic Books. Retrieved 2011-02-23.
In aviation safety, agencies like the National Transportation Safety Board exist primarily to analyze accidents, pinpoint the chain of events that led to each failure, and issue safety recommendations to make similar failures less likely. Crew resource management focuses on identifying and avoiding errors by cockpit crews. In software development, failure is taken for granted, so tools and skills for debugging and testing are essential to the success of any significant project. In fire safety, behind almost every provision of a modern fire code is a catastrophic past fire that killed dozens to thousands of people; people tend not to take safety seriously until the death toll is high enough to focus attention.
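The software-development point can be made concrete with a minimal sketch: good practice assumes failure will happen and tests for it explicitly rather than hiding it. The `safe_divide` function below is a hypothetical example invented for illustration, not anything from the article.

```python
# Sketch: software engineering anticipates failure modes and tests for them,
# rather than pretending they cannot occur. `safe_divide` is hypothetical.

def safe_divide(a, b):
    """Divide a by b, raising a clear error on the known failure mode."""
    if b == 0:
        raise ZeroDivisionError("b must be nonzero")
    return a / b

def test_safe_divide():
    # The expected case works.
    assert safe_divide(10, 2) == 5
    # The failure mode is anticipated and checked, not obscured.
    try:
        safe_divide(1, 0)
    except ZeroDivisionError:
        pass  # the documented failure occurred, as the test expects
    else:
        raise AssertionError("expected ZeroDivisionError")

test_safe_divide()
```

A test suite like this is, in effect, a standing board of inquiry: every known failure is recorded as a test so the same mistake cannot silently recur.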
Humans are naturally reluctant to own up to their failures and analyze them objectively; a loss of face often seems like a greater threat than failing to learn from the failure. This may be an aspect of evolutionary psychology: in ancient times, one of the greatest threats to our ancestors was expulsion from the tribe, in a world filled with large predators and hostile neighboring tribes. Thus we evolved to be alarmed by any potential threat to our status, and to do everything we can to avoid being held accountable for our mistakes. An institutional approach to failure must override the tendency of personnel to obscure their failures, because failing to expose and correct individual errors can lead to catastrophic systemic failure of the whole project (e.g. Chernobyl). Individuals who work mostly without supervision, accountable only to themselves, should guard against the instinctive tendency to hide from failure: they must act as their own board of inquiry when necessary, when all that matters is to identify the cause of a failure and learn from it. --Teratornis 21:18, 22 February 2011 (PST)
- Good thoughts. Part of the way to deal with it is to make it safe to admit failure: realistic expectations from aid donors, for example, who should expect failures to be frequent, especially when innovating. --Chriswaterguy 08:22, 23 February 2011 (PST)