Complicated & Complex Systems in Safety Management | Safety Differently

Worth going to the source to look at the comments

Source: Complicated & Complex Systems in Safety Management | Safety Differently

COMPLICATED & COMPLEX SYSTEMS IN SAFETY MANAGEMENT

When General Stanley McChrystal took over the U.S. Joint Special Operations Command[1] (JSOC) in Iraq during the mid-2000s, he inherited an organisation struggling to overcome the Al Qaeda insurgency plaguing the country. After a few weeks in the job, he realised his new team had been viewing their enemy through the wrong lens, and therefore had been using the wrong strategies to defeat them. Ultimately, this insight led him to revolutionise the Command’s structure, and challenge its very core beliefs about how it could win the war.

At the heart of McChrystal’s revolutionary strategy was an appreciation for the difference between systems that are complicated and those that are complex.[2]

When people describe something as complex, what they usually mean is that it’s really complicated. This suggests there is a continuum of ‘complicatedness’, and that the difference between a complicated and a complex system is one of degree rather than kind. In reality, a complex system is fundamentally different from a complicated one. It’s critical that we understand how they differ, and why that understanding matters if our goal is to manage the safety of a system.[3]

What is a system?

A system can be defined as anything that involves ‘a set of things working together as parts of a mechanism or an interconnecting network’.[4] Examples of systems include an analogue watch, an underground rail network, an air conditioner, a business, a car, a skeleton, an aeroplane, a person, or a government. The way we think about the systems around us influences the methods we choose to solve the problems they pose.

It’s complicated

Traditional thinking tends to lead us to see all systems around us as complicated. A complicated system is usually something technical or mechanical and has many interacting parts.

Think of a jet engine. It contains thousands of mechanical parts, and to understand how it works you can read a manual that will tell you everything you need to know. If it stops working, you can take it apart, locate the broken component, replace it, and return the engine to service. This type of problem-solving works well with complicated systems because they behave in a linear way and are fully knowable (with enough study). The whole is equal to the sum of its parts.[5]

In complicated systems, unwanted events and outcomes (e.g. an oil leak) are usually the direct result of component failures. The possible range of outcomes is finite because the system has been carefully designed for a specific purpose.

Unfortunately, problems start to arise when we treat complex systems as if they were complicated. This is exactly where JSOC found itself in the fight against Al Qaeda when McChrystal took over. The Command had been imagining its adversary as a traditional, hierarchical army with clear lines of vertical command and control – a complicated system. In reality, Al Qaeda was a complex web of cells interacting and operating in unpredictable ways, for which traditional battle tactics were useless.

No, it’s complex!

Complex systems are fundamentally different from their complicated cousins. They contain the same technical components (e.g. physical equipment and computers) but also consist of human elements and vast social networks.

A prime example of a complex ‘socio-technical’ system is an organisation, such as an airline. Airlines consist of many technical elements, like the aircraft and IT systems, but also many social elements, like management teams, frontline workforces, and customers.

Systems typically become complex by default when individuals or groups of people are added to them. Returning to the example of the jet engine – which we recognise as a complicated technical system – as soon as we decide to perform some maintenance on that engine, the new system we’ve created, ‘jet engine maintenance’, automatically becomes complex. This new system contains human, social and organisational elements (policies, procedures, culture etc.), as well as technical parts.

In complex systems, unwanted outcomes do not occur solely because of individual component failures; more often they emerge from the unpredictable interactions between components. For example, the way an engineer maintaining an engine interacts with company policies, procedures, goal conflicts, organisational culture, their team, the environment, and so on.

If we think of the total aviation system, acknowledging that it is complex, then we recognise that the many sub-systems within it (ATC, airports, airlines, manufacturers, maintainers, etc.) will all interact with each other in complex and unpredictable ways. That means any attempt to assert control over the system will ultimately fail, because complex systems cannot be controlled in the same way that complicated ones can.

Critically, in a complex system, the whole is greater than the sum of its parts because outcomes emerge in ways that cannot be totally controlled or predicted.

McChrystal helped his team to see Al Qaeda as a complex web of unpredictable and adaptive elements. This meant employing fundamentally different strategies of battle, which ultimately led to significantly greater success in their fight.

What does this mean for how we manage safety and risk?

It’s natural and normal for us to treat complex problems as if they were complicated: to reduce them to their parts and swap out the troublesome component. This is, after all, how most formal education teaches us to solve problems: by ‘analytical reductionism’.

But in 21st-century airline safety, most of the time we’re dealing with human work performed by pilots, engineers, ground staff, and cabin crew. We can’t truly understand human work – how it normally goes right and sometimes goes wrong – using the same methods we use to understand technical objects.

So when we’re looking for strategies to solve human-centred safety problems, we need to apply complex systems thinking to the task. That means resisting the temptation to disassemble the problem to find the broken ‘component’ (the human).

As safety leaders and practitioners, we should use the thinking, methods, and tools that help us understand the complex and dynamic nature of the systems we operate. We need to study the interactions, patterns, and feedback loops in our systems and identify how small changes can lead to disproportionately large and unintended consequences.
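As a purely illustrative aside (this sketch is not from the original article, and the rule and numbers are arbitrary stand-ins), a few lines of Python show how a simple nonlinear feedback loop can turn a tiny difference in starting conditions into a completely different outcome – the kind of disproportionate effect described above:

# Illustrative sketch only: a toy nonlinear feedback loop (the logistic map).
# Two runs start almost identically; the tiny initial difference grows until
# the two trajectories no longer resemble each other.

def feedback_step(x: float, r: float = 3.9) -> float:
    """Apply one iteration of the feedback rule."""
    return r * x * (1.0 - x)

def run(x0: float, steps: int = 30) -> list:
    """Iterate the feedback rule from a given starting state."""
    states = [x0]
    for _ in range(steps):
        states.append(feedback_step(states[-1]))
    return states

a = run(0.200000)  # baseline starting condition
b = run(0.200001)  # a 'small change': one part in a million

for step in (0, 10, 20, 30):
    print(f"step {step:2d}: a={a[step]:.4f}  b={b[step]:.4f}  gap={abs(a[step] - b[step]):.4f}")

# Early steps look identical; within a few dozen iterations the gap is of the
# same order as the states themselves. The outcome emerges from the feedback,
# not from any single 'component' of the rule.

The point is not the mathematics; it is that in systems dominated by interactions and feedback, inspecting components in isolation tells us little about the outcomes that will emerge.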

Whether we’re designing new policies and procedures, investigating a maintenance error, or risk-assessing a new piece of equipment, we instead need to consider safety in the context of the overall system – to think holistically and embrace complexity.

When writing about systems, one can’t finish a piece without quoting the great Russell Ackoff. A little Ackoff wisdom goes a long way:

“To manage a system effectively, you might focus on the interaction of the parts rather than their behavior taken separately.”

References

[1] https://en.wikipedia.org/wiki/Joint_Special_Operations_Command

[2] https://tinyurl.com/y4vcogp9

[3] https://tinyurl.com/y2ty5wkw

[4] https://www.lexico.com/en/definition/system

[5] https://www.skybrary.aero/bookshelf/books/2882.pdf