Is Complexity really Complex?
On many occasions I have been struck by the fact that what we experience as ‘complex’ is sometimes just a construct in our heads, because we do not understand what we see.
Example? To some, a computer is very complex, hard to understand. To others it is a simple thing that occurs in many variations on a single theme:
In essence, every computer has a heart called the processor; around the processor we find a couple of input and output devices, and there are devices to store data, called memory. That’s it. Whatever computer you meet, they are basically all the same, provided you know where to look for the common denominator(s).
What does that have to do with process analysis?
Assume you find a process that gives you a headache; it dazzles you. There are now three options:
- It is a very good process, but you simply do not know what it does or what you are seeing…
- It is a completely random set of actions that seem to belong together, but that is just ‘a coincidence’…
- Or it is an intermediate form: there is a kernel of valid actions mingled with some more or less useless and/or random activities…
We know that for business processes, 95 to 99.5% of the activities in most processes do not add any value to the intended purpose of that process… (Peters, End of the Hierarchy)
Value or not?
The problem now is: we do not know which part of the process is valuable and which part is ‘noise’.
And even the valuable part of the process may show ‘an infinite number of deviations’…
Of the millions and millions of cars on the roads, no two are identical; yet they are all unmistakably cars.
Unique or the same?
Of the 7 billion people on earth, each and every one is unique, and yet we all immediately recognize them as ‘a human being’, even the most mutilated or handicapped ones. Even stronger: we only need to see part of a face to identify one specific individual! And we do not know why that is…
Obviously there are some ‘parameters’, some ‘markers’, that we scan for to distinguish a car from a bicycle, a man from a woman, or a human being from an ape.
Human beings have a set of common denominators that distinguish them from apes. The eyes of a human being have a set of common denominators that make it possible to tell one person from another. Complex? Only as long as you do not know the ‘system’ behind this process!
When focussing on the differences, things may become complex. When looking at the shared ‘parameters’, things become clear. Looking at humans, they are quite similar:
- 2 kinds
- 4 colors
- Weight between 0.6 and 200 kg
Unless damaged, they all have:
- 2 arms
- 10 fingers
- the same base software
Let’s apply this knowledge to our processes… Could it be true that every invoicing process is basically the same? That they all have the same common denominators? Then what are the common steps in that process, and what are the possible variations on those steps? As soon as we understand this ‘system’ of activities needed to perform a certain task, we have the basis to design a process.
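To make the question concrete, the idea can be sketched as data. The step names and variations below are illustrative assumptions only, not a validated reference model for invoicing:

```python
# Hypothetical common denominators of an invoicing process; the step
# names and variations are illustrative assumptions, not a standard.
COMMON_STEPS = [
    "receive order",
    "check delivery",
    "create invoice",
    "send invoice",
    "register payment",
]

# Per-organisation variations on some of the common steps
VARIATIONS = {
    "send invoice": ["by post", "by e-mail", "via EDI"],
    "register payment": ["manual entry", "bank-statement import"],
}

# The 'system': a fixed backbone of steps, each with a limited
# palette of possible variations
for step in COMMON_STEPS:
    options = VARIATIONS.get(step, ["standard"])
    print(f"{step}: {' | '.join(options)}")
```

The backbone stays the same across organisations; only the choice per step differs. That separation is exactly what makes the process describable.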
For cars we know: they all belong to a certain brand, and within each brand there are several models and several engines; all engines come from a basic palette (gasoline, diesel, LPG, electric…). Cars have a colour or set of colours, and they all have wheels based on a rim and a tire, etc.
So although the occurrences will be infinite, the parameters and their palette of choices are limited!
As soon as the parameters and their palettes are identified, we have the ‘key’ to untangle the ‘complexity’ and bring it back to a set of simple choices…
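A minimal sketch of this idea: a few parameters, each with a limited palette of choices, already span a large space of concrete occurrences, while the description itself stays simple. The parameters and palettes below are hypothetical:

```python
from itertools import product

# Hypothetical parameters, each with a limited palette of choices
palettes = {
    "brand":  ["A", "B", "C", "D"],
    "model":  ["hatchback", "sedan", "wagon", "SUV"],
    "engine": ["gasoline", "diesel", "LPG", "electric"],
    "colour": ["red", "blue", "black", "white", "grey"],
}

# Every concrete occurrence is just one choice per parameter
occurrences = list(product(*palettes.values()))
print(len(occurrences))  # 4 * 4 * 4 * 5 = 320 distinct combinations
```

Four parameters with four or five options each already yield 320 variants; a dozen such parameters yield millions, yet the ‘system’ is still just a short table of palettes.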
Koch’s curve shows how an endless pattern can arise on a line simply by recursively reshaping it. Fractals are other examples of apparently extremely complex figures that are basically recurrences of a much simpler operation. As soon as you see how the figure is constructed, its complexity disappears and it becomes a transparent pattern…
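The construction itself fits in a few lines of code. This is a minimal sketch of the Koch recursion: each segment is replaced by four segments one third its length, with a 60-degree bump in the middle. The seemingly endless curve is just this one rule, repeated:

```python
import math

def koch(p1, p2, depth):
    """Recursively replace segment p1-p2 with the four-segment Koch motif.
    Returns the ordered list of points, excluding p2 (to avoid duplicates)."""
    if depth == 0:
        return [p1]
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = (x2 - x1) / 3, (y2 - y1) / 3
    a = (x1 + dx, y1 + dy)            # point at 1/3 of the segment
    b = (x1 + 2 * dx, y1 + 2 * dy)    # point at 2/3 of the segment
    # apex of the bump: the middle third rotated by 60 degrees around a
    cos60, sin60 = 0.5, math.sin(math.pi / 3)
    peak = (a[0] + dx * cos60 - dy * sin60,
            a[1] + dx * sin60 + dy * cos60)
    pts = []
    for s, e in ((p1, a), (a, peak), (peak, b), (b, p2)):
        pts.extend(koch(s, e, depth - 1))
    return pts

# Each recursion multiplies the segment count by 4: depth 3 gives 4**3 = 64 segments
points = koch((0.0, 0.0), (1.0, 0.0), 3) + [(1.0, 0.0)]
print(len(points))  # 65 points
```

The ‘complexity’ of the figure is entirely generated by one simple, transparent rule, exactly as the text argues.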
Nonlinear dynamics shows how to find out whether what you see is really ‘random’ or whether it has a structure. Although not yet proven, I feel it should be possible to use these techniques to detect whether the occurrence of an incident (such as an accident or a quality defect) is truly random or follows a structure. From Heinrich’s Law and Bird’s Law we know that there is a correlation between the presence of abnormalities and the occurrence of an incident. Accidents and incidents are therefore no longer an ‘unhappy conjunction of circumstances’: the incident simply HAD to happen, given that the parameters had certain values at a given time! It is a deep misunderstanding of statistics to assume that things will not happen because the chance of them happening is incredibly small. The chance that one particular car will be in front of you at your next fuel stop is one in a zillion, and yet tomorrow or next week, when you go for a refill, some car WILL be there!
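The point about small chances can be illustrated with a quick simulation (the numbers are made up for illustration): the probability of one specific outcome is vanishingly small, yet some outcome occurs on every single trial:

```python
import random

random.seed(42)                       # fixed seed for reproducibility

space = 10_000_000                    # e.g. 10 million possible licence plates
specific = 1_234_567                  # one particular plate

p_specific = 1 / space                # chance of seeing *this* plate: 1e-07

# 1000 fuel stops, each with a random plate in front of you
draws = [random.randrange(space) for _ in range(1000)]
hits_specific = sum(d == specific for d in draws)

print(p_specific)                     # 1e-07: 'it will never happen'
print(hits_specific)                  # almost certainly 0 for this one plate
print(len(draws))                     # ...and yet a plate was there all 1000 times
```

The tiny probability applies to each individual outcome, not to the event that *something* happens; confusing the two is exactly the misunderstanding the text describes.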
Value or noise
Distinguishing ‘value’ from ‘noise’ or ‘non-value’ in a process allows us to eliminate unneeded activities from the process. Fewer activities and fewer correction loops in the process will reduce its complexity. To identify the steps taken and analyse their value, Makigami is the starting technique. To then design a robust and structurally correct process, the PSD is used.