9 Principles of Safe-Fail Probes


In a fail-safe environment, the key objective is to prevent things from going wrong. The system is structured so that it cannot fail (or so that the probability of failure is extremely low) in accomplishing its assigned mission. High levels of planning and predictability accompany any such initiative. Indeed, some contexts demand this kind of rigour – take, for example, the recent series of “could-have-been-worse” incidents in the South African civil aviation arena. Every eventuality must be covered, and when there is the slightest doubt, get the plane onto the ground, even if it is only the backup system that has given a warning.

However, not all environments are as predictable as the engineering surrounding an aircraft (not that all of civil aviation is predictable – just consider the number of planes, people, airports, weather phenomena, passenger behaviour patterns, and the like; I refer here mainly to the technical safety of an aircraft). Some contexts require a completely different approach. This is the realm not of the complicated but predictable, but of the complex and unpredictable.

In the complex world (now we can talk about all those other things in civil aviation), we instead need to create ways that allow unpredictable things to happen, and then have a plan to deal with them and learn from them. Moreover, we can even create situations that will test the stability (or lack thereof) of a complex system, and then observe the results. If it goes wrong, that is OK – it is allowed to. We can call these types of interventions or events Safe-Fail.

I would like to suggest that when we deliberately set about testing the patterns that exist in complex systems by means of these experiments, or Probes, we should consider the following 9 principles:

  • Don’t be afraid to experiment – some will fail.
  • Every experiment will be different – don’t use the cookie-cutter approach when designing interventions.
  • Don’t learn the same lesson twice – or maybe I should say, don’t make the same mistake twice.
  • Start with a low-risk area when you begin to experiment with a system.
  • Design experiments that can be measured. That is, know what the success and failure indicators of each experiment are.
  • Don’t be afraid – did I mention that already?
  • Try doing multiple experiments on the same system – even at the same time.  Some will work, some will fail – good.  Shut down the ones that fail and create variations on the ones that work.
  • Introduce dissent.  Maximize diversity in the experiment design process by getting as many inputs as possible.
  • Learn from the results of other people’s experiments.
  • Teach other people the results of your experiments.

These, um . . . 9 (actually 10, but you will see that I repeated one for emphasis) principles should be a good start to experimenting.

Don’t be afraid.
