[Off-Topic] Agility, Chaos, Self-Organization
This article is a general overview of various related topics that make up the background needed to understand the theme of Self-Organization. Here it is important to understand this concept from its physical and mathematical foundations as well.
The whole motivation comes from the 11th principle of the Agile Manifesto:
“The best architectures, requirements, and designs emerge from self-organizing teams.” – Principles behind the Agile Manifesto
Agility requires self-organization. But this concept is alien to most people and organizations. The 12th principle further adds:
“At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.” – Principles behind the Agile Manifesto
For this you need to understand the concept of a Learning Organization — or better yet, understand the dynamic moment of learning and continuous improvement called the Edge of Chaos, and how implementing Agility essentially means, in physics terms:
Removing the organization — which is a complex system — from its state of dynamic equilibrium, increasing the overall entropy of the system, forcing it toward chaos. At the ideal point, at the edge of chaos, self-organization happens and the organization becomes a learning organization.
Remember: “learning” also includes exploring the unknown, incorporating new information. Doing what you’ve always done means you’re not learning anything, by definition. You need to do things differently, make mistakes, and improve in order to learn.
Let’s get into the concepts now.
Translation: Chaos Theory in Organizational Development
from Wikipedia
Self-Organization, as opposed to natural or social selection, is a dynamic process within the organization in which it recalculates, reinvents, and modifies its structure in order to adapt, survive, grow, and develop.
Self-organization is the result of creative reinvention and adaptation due to the introduction of — or by existing in a constant state of — perturbed equilibrium. An example of an organization that exists in a constant state of perturbation is a learning organization, which is one that “allows self-organization, instead of trying to control bifurcation through planned changes” (Dooley, 1995).
Being “out-of-balance” lends itself to the regrouping and reassessment of the present state of the system to make necessary adjustments and recover control and equilibrium. By understanding and introducing the element of poking the equilibrium (chaos) while facilitating the growth of networks, an organization can shift gears from “normal” to “turbo” in terms of speed and intensity of organizational change. While maintaining a state of equilibrium seems to be an intuitively rational method for enabling an organization to gain a sense of consistency and solidarity, existing at the edge of a chaotic state remains the most beneficial environment for systems to flourish, develop, and grow.
Incidentally, two competing organizations that differ in their level of homeostasis will not be competing for very long. Generally speaking, the organization with the less stable structure will come out ahead while the constant stability of the second will eventually lead to its downfall. Although very similar, small differences in levels of homeostasis are sufficient to make a tremendous difference in future outcomes for each organization. The notion of similarity in origin vs. different outcomes appears with the emergence of bifurcation.
The concept of bifurcation cannot be explained without discussing the term frequently called “sensitivity to initial conditions,” which refers to the high level of importance of primary conditions from which the future path and direction of a system will stretch. This sensitivity to initial conditions is commonly called the Butterfly Effect, where a butterfly beats its tiny wings in one corner of the world resulting in a typhoon or hurricane somewhere else on the globe. While this is an entertaining notion, sensitivity to initial conditions remains in reality a quite abstract concept without the presence of bifurcation, which is the mathematical name for the point of separation of two nearly identical entities that, because of sensitivity to initial conditions, tend to take two very different paths in two evolutionarily or geographically different places.
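Sensitivity to initial conditions can be made concrete in a few lines of code. The sketch below (my own illustration, not from the original article) iterates the logistic map, a classic chaotic system, from two starting points that differ by only one part in a billion:

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x) and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two starting points that differ by one part in a billion.
a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)

# The gap between the trajectories grows roughly exponentially until it
# saturates at the size of the attractor itself.
gap = max(abs(x - y) for x, y in zip(a, b))
print(gap)
```

Within a few dozen iterations the two trajectories bear no resemblance to each other, even though their starting points were practically identical: this is the butterfly effect in miniature.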
The primary goal of an organizational development (O.D.) consultant is to initiate, facilitate, and support successful change in an organization. Using chaos theory as the sole model for change may seem too risky to be supported by any stakeholder. The concept of uncertainty in chaos theory is not an attractive motivation for change compared to many alternative models considered “safer.” By planning and carefully managing disorder, a successful intervention is possible, but only with a truly dedicated arsenal of talented and creative resources. Once an organization is allowed, or actively forced, to enter a chaotic state, change becomes inevitable and bifurcation imminent; but the question remains, “Will the new direction be the intended one?” To have more assurance of the direction taken, most of the planning attention must focus on attractors rather than on the initiation of disorder.
Although chaos eventually gives way to self-organization, how can we control the duration, intensity, and form of its outcome? Poking the equilibrium and introducing disorder into the organization are clearly risky moves. Throwing an organization out of balance could send it into a downward spiral if the structural integrity (i.e., identity) of the system is compromised beyond a point of no return. The only way to reap the benefits of chaos theory in O.D. while maintaining a sense of safety is to steer the organization toward a state of existence at the edge of chaos.
Existing at the edge of chaos, organizations are forced to find new, creative ways to compete to stay ahead. Good examples of this type of learning organization are found throughout the technology field as well as in the airline industry, like Southwest Airlines, which used reinvention not just to survive but also to thrive in a difficult market. In contrast, there are organizations that, due to long periods of equilibrium, find themselves struggling to survive. Telephone companies, for example, were once solid, static entities that dominated the communications market. While the rest of the world was developing new communications technologies, they did not grow creatively at the same rate. The result is an organization battling just to stay alive, unless it accepts the element of chaos through crisis and allows creative adaptability to function freely so that self-organization and reinvention can occur.
While organizations existing at the edge of chaos are known to be the most creative and adaptive, how do their members feel about constant evolution and reinvention? Is it possible to identify with, and remain loyal to, an organization that constantly changes form? The short answer is “yes.” As long as the company doesn’t change its core essence, its identifiable and shared purpose, its members will still experience the organization as a developing system that changes form but maintains the same familiar face.
Perhaps the safest way to use chaos theory in O.D. is not in instigating the organization to change, but in using its principles to deal with problems as they arise. By embracing organizational phenomena previously seen as dysfunctional, such as interpersonal conflicts, and using them as a source of transformational change through the principles of chaos theory (Shelton, 2003), an organization can make “lemonade out of lemons” and become more responsive to change agents while continuously moving forward and growing from the inside out, without fear of complete chaos.
How Business Is a Lot Like Life — The 4 Principles
Fast Company Magazine, March 2001. How Business is a Lot Like Life
Equilibrium is a precursor to death. When a living system is in a state of equilibrium, it is less responsive to changes occurring around it. The risk is greatest when one feels most secure.
When threatened or galvanized by an attractive opportunity, living things move toward the edge of chaos. This condition evokes higher levels of mutation and experimentation and is more likely to produce new solutions.
As living things move closer to the edge of chaos, they tend to self-organize and new forms emerge from the disorder. This property of life, called “self-organization and emergence,” is a major source of innovation, creativity, and evolution.
Living systems cannot be directed along a linear path. Unintended consequences are inevitable. The challenge is to learn how to disturb them in a way that approximates the desired outcome and then correct course as the outcome unfolds.
Definitions
from Wikipedia
Reductionism can either mean (a) a way of understanding the nature of complex things by reducing them to the interactions of their parts, or to simpler or more fundamental things or (b) a philosophical position that a complex system is nothing more than the sum of its parts.
In physics and systems theory, the superposition principle states that, for all linear systems, the net response at a given place and time caused by two or more stimuli is the sum of the responses that would have been caused by each stimulus individually. That is, if input A produces response X and input B produces response Y, then input (A+B) produces response (X+Y).
In mathematics, a nonlinear system is a system (…) that does not satisfy the superposition principle, or whose output is not directly proportional to its input. Less technically, a nonlinear system is any problem where the variable(s) to be solved cannot be written as a linear combination of independent components (…) Nonlinear problems are of interest to engineers and physicists because most physical systems are inherently nonlinear in nature. Nonlinear equations are difficult to solve and give rise to interesting phenomena like chaos. The weather is known to be nonlinear, where simple changes in one part of the system produce complex effects throughout.
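To make these two definitions concrete, here is a minimal sketch (my own illustration, not part of the quoted text) that checks the superposition principle on a linear and a nonlinear response function:

```python
def linear(x):
    """A linear system: response strictly proportional to input."""
    return 3.0 * x

def nonlinear(x):
    """A nonlinear system: quadratic response."""
    return x * x

a, b = 2.0, 5.0

# Superposition holds for the linear system:
# the response to (a + b) equals the sum of the individual responses.
assert linear(a + b) == linear(a) + linear(b)

# It fails for the nonlinear one: (a + b)**2 != a**2 + b**2.
assert nonlinear(a + b) != nonlinear(a) + nonlinear(b)
```

The failure of superposition is exactly what makes nonlinear systems hard to analyze: you cannot study the pieces in isolation and add up the results.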
Emergent Behavior — Thriving at the Edge of Chaos
by Chris Rollins, January 2009.
The cathedral termite, found in parts of Australia, is capable of building mounds for the colony well over 3 meters tall. A single termite has the standard insect body plan: head, thorax, abdomen, legs, and so on, with a minuscule, primitive brain. But combined with others of its species, it is capable of constructing an enormous complex for the colony. Unlike human construction projects, however, there is no foreman and no plan, and quite possibly no individual even knows what it is helping to build.
How is this possible?
The answer lies in the fact that sometimes, a system can end up with more complexity than the sum of its parts — leading to what scientists call “emergent behavior.”
Emergent Behavior, or the spontaneous creation of order, is present all around us. Insects are a good example because they’re familiar to us and can create massive designs that we can appreciate. Other parts of the animal kingdom also demonstrate autonomous order — fish organize into schools that move in concert; birds cluster into flocks or formations in a similar way. There are many examples of nonlinearity — natural magnets aligning in a north-south orientation and crystals forming from liquids, showing a spontaneous increase in order despite the lack of an “intelligent” force.
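A toy model can show order emerging from purely local rules. The sketch below (an illustrative cellular automaton of my own, not taken from the article) lets each cell on a ring repeatedly adopt the majority value among itself and its two neighbors. No cell sees the whole ring, yet the scattered 0s and 1s coarsen into a few uniform domains:

```python
def majority_step(cells):
    """One update of a ring of cells: each takes the majority of its local trio."""
    n = len(cells)
    return [
        1 if cells[i - 1] + cells[i] + cells[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]

def boundaries(cells):
    """Count 0/1 boundaries around the ring: a crude measure of disorder."""
    return sum(1 for i in range(len(cells)) if cells[i] != cells[i - 1])

initial = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 0]
state = initial
for _ in range(10):
    state = majority_step(state)

# Disorder (number of domain boundaries) drops as uniform blocks form.
print(boundaries(initial), "->", boundaries(state))
```

Each cell follows a trivial local rule, yet the global pattern becomes more ordered: a miniature of the “order without a foreman” described above.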
There is a major thermodynamics problem with all of this, of course — entropy, a measure of disorder, should continue to rise. The universe tends toward chaos.
How, then, can order appear spontaneously, especially in a purely physical system?
Entropy still needs to increase overall, even when crystals form or birds flock; the important distinction lies in where entropy rises and where it falls. It turns out that nature allows entropy to decrease in certain regions if it increases elsewhere to compensate. For example, when a sugar solution begins to form crystals, the crystals sit at a lower energy level than the sugar molecules floating free in solution. When a sugar molecule joins the crystal structure, the energy difference is released into the water as heat: the most disorganized form of energy.
Therefore, the crystalline portion of the solution has now decreased in entropy while the total system — including the water and the crystal — has had a net increase in entropy.
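This bookkeeping can be made explicit with a few lines of arithmetic. The numbers below are assumptions chosen purely for illustration, not measured values; the point is the sign of the total:

```python
# Illustrative entropy accounting for crystallization.
T = 298.0                  # temperature of the water bath, in kelvin (assumed)
dS_crystal = -50.0         # entropy change of the crystallizing sugar, J/K (assumed)
Q_released = 20000.0       # heat released into the water, J (assumed)

dS_water = Q_released / T  # entropy gained by the surroundings: Q / T
dS_total = dS_crystal + dS_water

# The crystal's entropy falls, but the water gains more than enough
# entropy to compensate, so the second law is respected.
print(dS_water, dS_total)
```

With these assumed figures the water gains about 67 J/K while the crystal loses 50 J/K, for a net increase of roughly 17 J/K: local order, paid for globally.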
Why Gaussian Statistics Are Almost All Wrong for Organizational Strategies
Power-law phenomena follow a Pareto distribution rather than a Gaussian (normal) one. The fundamental difference lies in the premise about the correlation of events. A Gaussian distribution assumes that events are independent; independent events generate normal distributions, which are at the heart of modern statistics. When events are interdependent, normality is no longer the norm. Instead, the Pareto distribution dominates, because extreme events occur more frequently than the normal distribution, with its bell-shaped curve, would lead us to expect. Physical, biological, ecological, social, and industrial systems exhibit a remarkable variety of fractal structure (Kaye, 1993). Many scholars now believe that power laws are the best analytical framework for describing the origin and form of most natural objects. Given the ubiquity of these findings and the nature of scale-free theory, we believe power laws are an unknown, underappreciated, but equally ubiquitous phenomenon in organizations (Andriani, 2003).
Extremes vs. Averages
Linear thinking is deeply entrenched in our mentality. Scientific and mathematical models are based on the concepts of equilibrium and linearity. Linearity means two things: proportionality between cause and effect; and superposition, meaning that the dynamics of a system can be reconstructed as the sum of the effects of simple causes acting on simple components (Nicolis and Prigogine, 1989). Together these permit the efficient operation of causality, equation solving, and elaborate predictive models. Economics, for example, is almost theistic in its premise that economic phenomena tend toward equilibrium (Mirowski, 1989). These premises, in turn, permit simple linear and analytical equations.
By focusing on systems in equilibrium, statisticians implicitly accept that the number of possible states a system can reach is limited (and computable) and that the search time following the onset of instability (i.e., an exogenous shock) is short compared to the equilibrium time. For this to be true, the many elements making up a system need to be independent data points by assumption; otherwise we could have interdependence, possible mutual causalities, and the occurrence of possible extreme events.
If we take 100 companies of approximately the same size belonging to the same sector, assume independence, and plot a variable — say, profit — we find that most events cluster near the mean, in a distribution that decays rapidly and follows a bell-shaped curve. This is by far the most studied statistical distribution; it is assumed to correctly characterize most of our findings about the natural and social worlds. However, the crux of the matter is the independence of events. In real life, these companies might: benchmark each other, imitate those perceived as most successful, exchange information, organize cartels, pursue acquisitions, compete for limited resources, etc. In a word, they are interdependent, not independent. The statistical distribution governing interconnected agents does not yield a bell-shaped curve, but instead a power-law distribution — Pareto.
Gaussian and Pareto distributions differ radically. A Gaussian distribution is entirely characterized by its mean and variance (Greene, 2002). A Pareto distribution is not so well behaved: a power law has no mean that can represent the typical case and no finite standard deviation on which confidence intervals could be based (Moss, 2002). There are two major implications of this.
One is that the social sciences’ dream of building robust frameworks that allow social scientists to predict the evolution of social phenomena is broken by the absence of statistical regularities in phenomena dominated by persistent interconnectedness. In fact, if there is an absence of stable mean and finite variance, probabilistic statements about individual outcomes become very difficult. The point reflects the more pervasive and structural problem of nonlinearity and emergence in complex systems. Linearity assumes the divisibility of systems into modules whose dynamics can be studied independently of context.
The second point is that power laws decay more slowly than normal distributions. These “fat tails” affect system behaviors in significant ways. For example, Buchanan (2004) reports that financial market drops of 10% in a single day should happen once every 500 years according to the normal distribution. Mandelbrot (Mandelbrot and Hudson, 2004) shows that instead, financial crises happen roughly once every five years. The lesson we can draw from this is that extreme events, which in the Gaussian world could be safely ignored, are not only more common than expected but also of much greater magnitude and with far larger consequences.
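The gap between thin and fat tails is easy to quantify. The sketch below (my own illustration; the Pareto exponent of 1.5 and the scale are assumed values chosen for the example) compares the probability of a large deviation under a standard normal law and under a power law:

```python
import math

def normal_tail(z):
    """P(Z > z) for a standard normal variable, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def pareto_tail(x, alpha=1.5, xm=1.0):
    """P(X > x) for a Pareto variable with minimum xm and tail exponent alpha."""
    return (xm / x) ** alpha

# A 5-sigma event is essentially impossible under the Gaussian,
# while an event at 5x the minimum scale is routine under the fat-tailed law.
print(normal_tail(5.0))   # on the order of 1e-7
print(pareto_tail(5.0))   # on the order of 1e-1
```

Under these assumptions the power law assigns the extreme event a probability several hundred thousand times larger than the Gaussian does, which is the mechanism behind Mandelbrot's once-every-five-years crises.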