Humans are naturally reluctant to own up to their own failures and analyze them objectively; a loss of face often seems like a greater threat than failing to learn from the failure. This may be an aspect of evolutionary psychology: in ancient times, one of the greatest threats to our ancestors was being cast out of the tribe, in a world filled with large predators and hostile neighboring tribes. Thus we evolved to be horrified by any potential threat to our status, and to do everything we can to avoid being held accountable for our mistakes. The institutional approach to failure must override any tendency for personnel to obscure their failures; failing to expose and correct individual failures can lead to catastrophic systemic failure of the whole project (e.g. Chernobyl). Individuals who work mostly without supervision and are accountable only to themselves should guard against the instinctive tendency to hide from failure. They must act as their own board of inquiry when necessary; all that matters is to identify the cause of a failure and learn from it. --[[User:Teratornis|Teratornis]] 21:18, 22 February 2011 (PST)

:Good thoughts. Part of the way to deal with it is to ensure that it is safe to admit failure. Realistic expectations from aid donors, for example: expecting that failures will be frequent, especially when innovating. --[[User:Chriswaterguy|Chriswaterguy]] 08:22, 23 February 2011 (PST)