Why are we feeling powerless in the face of the current grand challenges affecting our planet and the sustainable environmental development of humanity? This chapter addresses the issue in the framework of complex systems.
In the following, we will limit ourselves to tools dedicated to describing a phenomenon, including quantitative and qualitative statistics, multivariate characterization, modeling and dynamic simulation of an event, etc.
Probability theory provides a rigorous mathematical framework for understanding the notions of risk and eventuality. In finance, as in industry, it is customary to base statistical studies on the normal distribution. We describe a set of possible states of the world and then assign each state a weight describing the probability that the state will occur. In practice, however, the assumption that conventional laws (Gauss, Poisson, exponential, etc.) apply is increasingly disputed. Several phenomena occur: information is generally asymmetric; the frequency of major events such as natural disasters or earthquakes does not match what the tails of these distributions predict… For example:
We have seen previously that the appearance of failures in large complex electronic assemblies does not follow normal laws. Furthermore, we have seen that this phenomenon also appears in microsystems. In the last example given, concerning disruptions, crashes or financial hype, the frequency and amplitude of these phenomena require the introduction of unconventional descriptive models. There, we were dealing with mesosystems and macrosystems. But here again, and to our knowledge, only B. Mandelbrot’s approaches make it possible to establish links between the micro- and macrolevels.
We wish to step for a moment outside the professional framework of finance or industry, because we want to show, in a broader setting, that conventional statistical laws, in particular the Gaussian (normal) law, are no better suited to representing physical phenomena. Indeed, in Nature, examples abound showing that the frequency and amplitude of random phenomena, disasters and risks are higher than these laws predict (which is why we had to resort to Lévy’s laws). This is the case for the occurrence of physical disasters such as avalanches or earthquakes.
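As a hedged illustration (not drawn from the chapter's data), a short simulation contrasts how often "extreme" values occur under a Gaussian law versus a heavy-tailed Pareto law with exponent α < 2; the threshold and parameters below are illustrative assumptions:

```python
import random

random.seed(42)
N = 100_000
THRESHOLD = 5.0  # an "extreme event": more than 5 units from the center

# Gaussian samples (mean 0, standard deviation 1)
gauss = [random.gauss(0.0, 1.0) for _ in range(N)]

# Heavy-tailed samples: Pareto with exponent alpha < 2, shifted to start at 0
alpha = 1.5
pareto = [random.paretovariate(alpha) - 1.0 for _ in range(N)]

gauss_extremes = sum(1 for x in gauss if abs(x) > THRESHOLD)
pareto_extremes = sum(1 for x in pareto if abs(x) > THRESHOLD)

print(f"Gaussian extremes beyond {THRESHOLD}: {gauss_extremes} / {N}")
print(f"Pareto   extremes beyond {THRESHOLD}: {pareto_extremes} / {N}")
```

Under the Gaussian law, an excursion beyond five standard deviations is essentially never observed in 100,000 draws, whereas the heavy-tailed sample produces thousands of such events: exactly the mismatch between conventional tails and observed disasters described above.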
The following two representative examples show this in the universe of the infinitely large: astrophysics. Two situations are described: collisions between galaxies and stellar collisions.
Industrial, economic or social phenomena are time-dependent. A set of states forms a time series of events; this is called the trajectory of a stochastic process. In games of chance, for example, when a die is rolled successively, a sequence of numbers between 1 and 6 is created. Time is of little importance because the throws are independent, so that past history tells us nothing about future draws.
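The independence of successive throws can be checked numerically: the frequency of a 6 immediately after a 6 matches the overall frequency of a 6. A minimal sketch with simulated rolls (an illustration, not from the text):

```python
import random

random.seed(7)
rolls = [random.randint(1, 6) for _ in range(200_000)]

# Overall frequency of rolling a 6
overall = rolls.count(6) / len(rolls)

# Frequency of a 6 immediately after a 6: if the process had memory,
# this conditional frequency would differ from the overall one
after_six = [b for a, b in zip(rolls, rolls[1:]) if a == 6]
conditional = after_six.count(6) / len(after_six)

print(f"P(6)          ~ {overall:.4f}")
print(f"P(6 | prev 6) ~ {conditional:.4f}")
```

Both frequencies come out close to 1/6: conditioning on the past changes nothing, which is precisely what fails in the economic processes discussed next.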
In reality, in industry, human beings intervene through all their choices. The approach is complex because everyone tries to predict the future by questioning the past. We try to analyze past variations to determine trends, or look for causes so as to take them into account in future decisions. Many specialists have therefore developed statistical analysis models with memory effects, particularly in the context of defining stock price trends.
A logical approach will underpin some stock market transactions: if there is regularity in stock prices and if it is, for example, possible to predict a rise in the price of a stock, people who have this information will naturally buy the stock today and sell it tomorrow to pocket the increase in value. In doing so, they push up the daily price through their purchases and pull down the next day’s price through their sales. The supposed regularity will therefore self-destruct. On the other hand, if human behaviors are similar, the law of large numbers will compensate for these irregularities and give rise to a certain order.
The first scientist who took an interest in stock price variations using an innovative approach was Benoît Mandelbrot [MAN 01]. He was able to show that the temporal change in stock market prices was multifractal. This is therefore a regularity, yet of a new type, difficult to exploit given the very nature of fractal properties. Finally, our purpose is to highlight structural properties whose scope is deeper than the descriptive properties found in statistics. This is a major step forward, which shows that prices lie somewhere between order and disorder.
Unlike simple games of chance, economics is a complex game where players’ expectations influence, as we have just said, the probabilities of the next roll of the die. Economic agents change their behavior according to how they view the future and, in turn, their combined actions create the economic phenomena of tomorrow. A rational expectation equilibrium describes a probability distribution that takes into account this logical loop. We therefore create rules for ourselves, but using conventional statistical approaches.
This theory has had some success in the past, but the major events and disruptions observed have never been reliably anticipated. Traditional statistical models therefore offer satisfactory tools in stable periods (permanent or stationary regimes) but are ineffective in the event of a crisis. With current approaches, disasters cannot be predicted; we therefore navigate blind, and this can lead to disastrous monetary policies.
A first correction therefore consists of using Weibull distributions, whose tails may be thicker than those of the normal and other conventional laws. A second is to look for models based on other mathematical theories, such as those defined by Mandelbrot, which better integrate the notion of apparent discontinuities and can therefore lead to more reliable results, especially in crisis situations and with unpredictable, non-stationary phenomena.
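The thicker tails of the Weibull family are visible in closed form: with survival function P(X > x) = exp(−(x/λ)^k), a shape parameter k < 1 gives a far heavier tail than the exponential case k = 1. A small illustrative computation (parameter values are assumptions chosen for the comparison):

```python
import math

def weibull_survival(x, shape, scale=1.0):
    """P(X > x) for a Weibull distribution with the given shape and scale."""
    return math.exp(-((x / scale) ** shape))

x = 10.0
heavy = weibull_survival(x, shape=0.5)  # shape < 1: stretched exponential, thick tail
expo = weibull_survival(x, shape=1.0)   # shape = 1: plain exponential
light = weibull_survival(x, shape=2.0)  # shape > 1: thinner-than-exponential tail

print(f"P(X > {x}) with shape 0.5: {heavy:.3e}")
print(f"P(X > {x}) with shape 1.0: {expo:.3e}")
print(f"P(X > {x}) with shape 2.0: {light:.3e}")
```

At x = 10 the shape-0.5 tail probability exceeds the exponential one by three orders of magnitude: rare events stop being negligibly rare.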
The other approaches used in finance are alternatives that shift the problem from market risk to counterparty risk or minimal risk management. The problem is therefore displaced by taking action, not to counteract an adverse event, but to integrate the event and mitigate its effect. We do not change anything about the risks involved. We can quote:
Similarly, in industry, in terms of the occurrence of rare phenomena, the frequency of occurrence is always a major issue. To illustrate our point and show the importance of new analytical approaches with appropriate statistical or mathematical tools, we refer to a real case study on the manufacture of large computer systems several years ago in a factory at IBM. We consider a production system for the assembly and testing of highly customized and sophisticated products.
The process is under control; it is of the 6-sigma type, and the anomalies observed are infrequent. The computer has about 80 K components; it must be able to operate continuously, at the customer’s premises, for a fixed lifetime. These anomalies, discovered during very stringent tests, are very diversified and often non-critical. The SPQL (Shipped Product Quality Level) is therefore very close to 1.
In terms of test results, we obtain a series of real values giving the number of defects per machine, as detected at the end of the line, machine by machine. This series is: 1, 0, 0, 0, 1, 0, 7, etc. The average of the observed defects is about 0.7, but if we analyze the empirical distribution of the values and compare it to a binomial distribution, we might be surprised… This is because none of the conventional statistical curves is usable, from the Gauss curve to the three-parameter Weibull distribution or the hyperbolic law. A different approach to quality and performance analysis is therefore needed. For this purpose, we have prepared Table 16.1, based on the series of figures described above.
Table 16.1. Anomaly indicators
Size N | STD σ | Skew | σ.s | Key figs | Kurtosis | σ.k | Ratio k | Q. factor | Cp | Cpk |
20 | 1.97 | 2.23 | 0.51 | 4.36 | 4.29 | 0.99 | 4.33 | 0.79 | 1 | 1 |
40 | 1.94 | 2.14 | 0.37 | 5.74 | 3.59 | 0.73 | 4.9 | 0.79 | 1.01 | 1.01 |
60 | 1.84 | 1.97 | 0.3 | 6.38 | 2.94 | 0.60 | 4.84 | 0.79 | 1.07 | 1.07 |
80 | 1.74 | 1.86 | 0.26 | 6.94 | 2.72 | 0.53 | 5.12 | 0.79 | 1.13 | 1.13 |
100 | 2.5 | 4.99 | 0.24 | 20.71 | 33.65 | 0.47 | 70 | 0.8 | 0.78 | 0.78 |
120 | 1.68 | 1.82 | 0.24 | 7.56 | 2.61 | 0.47 | 5.46 | 0.8 | 1.17 | 1.17 |
140 | 2.05 | 2.59 | 0.24 | 10.76 | 7.09 | 0.47 | 14.83 | 0.79 | 0.95 | 0.95 |
Size N in the first column indicates the number of computers involved in the study. Indeed, we take series of varying lengths, knowing that each computer uses the same family of technologies, but that the configuration, like the personalization, is often different: it is a “mass personalized” production. In the analysis of Table 16.1, we can make the following observations:
In this example, we are not able, with a probability close to 1, to show that this is a non-Gaussian or chaotic distribution. In this sense, we are closer to the observations made by Mandelbrot on a series of stock prices: conventional statistical curves predict much too low failure densities and are not representative of reality. Indeed, exceptional cases, such as breakdowns or disasters with high failure/defect rates, are more frequent than those predicted by statistics.
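One simple diagnostic for such count data is the dispersion index (variance divided by mean), which equals 1 for a Poisson process; heavy-tailed defect counts give a much larger value. A minimal sketch, using an illustrative series in the spirit of the one quoted above (the actual IBM data are not reproduced in the text):

```python
from statistics import mean, pvariance

# Hypothetical per-machine defect counts, illustrative only: mostly zeros,
# a few small counts, and occasional large excursions
defects = [1, 0, 0, 0, 1, 0, 7, 0, 0, 1, 0, 0, 2, 0, 0, 0, 1, 0, 0, 1]

m = mean(defects)
v = pvariance(defects)
dispersion = v / m  # equals 1 for a Poisson process

print(f"mean = {m:.2f}, variance = {v:.2f}, dispersion index = {dispersion:.2f}")
```

A dispersion index well above 1 signals overdispersion: the rare large values (the "7" in the series) carry far more weight than any Poisson or binomial model allows, which is exactly the effect visible in the kurtosis column of Table 16.1.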
We do know, however, that stable distributions such as Paul Lévy’s apply [WAL 91] – the latter is a generalization of the Gauss distribution, and its density is characterized by S(α, β, c, δ), where α is the exponent, β represents the asymmetry, c is a scale factor and δ is a position parameter. Without calculating these parameters, note that α = 2 corresponds to a Gaussian distribution. Here, the coefficient is α ≤ 2, which means that there are large variations in the distribution tails and that conventional statistical analysis methods do not apply.
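The tail exponent α can be estimated directly from data, for instance with the Hill estimator. The sketch below is a hypothetical illustration (not drawn from the chapter): it recovers α from synthetic Pareto samples with a known exponent:

```python
import math
import random

def hill_estimate(data, k):
    """Hill estimator of the tail exponent alpha from the k largest observations."""
    tail = sorted(data)[-k:]
    x_min = tail[0]  # threshold: smallest of the k upper order statistics
    return k / sum(math.log(x / x_min) for x in tail)

random.seed(0)
alpha_true = 1.5
sample = [random.paretovariate(alpha_true) for _ in range(20_000)]

alpha_hat = hill_estimate(sample, k=1000)
print(f"true alpha = {alpha_true}, Hill estimate ~ {alpha_hat:.2f}")
```

An estimate below 2, as here, is the empirical signature of the heavy-tailed, non-Gaussian regime described in the text.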
Thus, a methodology related to the analysis of time series and the identification of chaotic properties in processes could be developed and validated.
Let us highlight two points: how a disaster manifests itself and how it can be represented to better study it. This exercise will be followed by positive results if it is possible to identify mechanisms, or organizational rules, to exploit and propagate hidden orders. This exploitation will be done through the interactions existing in the system under study. This also raises the question of how much determinism or chance is involved in the evolution of our systems. So, we cannot avoid addressing this aspect of things.
It is easy to understand that the existence of uncertain and/or unpredictable facts is generally not well accepted by decision-makers. However, as philosophers have often argued, one of the essential driving forces of evolution would be chance. Moreover, it is now more than necessary to consider evolution within a more general framework, that of Nature. Evolution, in fact, affects plants, animals and humans, as well as their by-products, which are industry, the economy, services, transport, etc.
Erasmus Darwin, and even Jean-Baptiste Lamarck, believed that progress, like evolution, was the result of the accumulation of knowledge and that we could use it to change our situation and our environment. Originally, our world was governed by simple and immutable laws that made it possible to maintain order. This notion of progress was taken up by Charles Darwin: it can be biological [DAR 59], as well as scientific, cultural and philosophical.
At that time, progress was a corollary of evolution and, importantly, it was mainly about progressive evolution. Of course, the notion of system dynamics was not widespread at the time, and there was still no mention of deterministic chaos, fractals or catastrophe, in the topological or mathematical sense.
On the other hand, evolution proceeds from the simplest to the most complex and complicated. While the term “complex” describes behavior, the term “complicated” refers to intrinsic structure. This evolution is linked to the notion of progressive evolution and will not be explained here, knowing that many books are devoted to these concepts. Thus, over the last billion years that concern our planet, progress has been prodigious: living beings have evolved in size, in functionality, in terms of activities generated, etc. Our techniques for defending ourselves to survive, for acquiring food, then for possessing the goods needed to live, and our social organization for better controlling our environment, have paved the way to the evolution of the species we know today.
But to reach such a level of evolution, it is necessary to consider complementary assets:
Thus, in studies related to the evolution of the world and the approaches to progress, we see that Nature has been able to combine techniques of the progressive evolution type, where chance is not very present, with jump techniques introducing the notions of chance, uncertainty, deterministic chaos, etc., to form what some call fierce evolution. The two approaches are complementary, and each brings its own set of advantages and balances. They must therefore co-exist in a peaceful way.
We will draw a parallel and advance the same strategies in the areas of finance and risk. Wanting to model everything and/or store it in information and decision-making systems is attractive and reassuring but… ineffective! Indeed, we will never avoid all these phenomena and behaviors that we have explained in our books many times.
In stationary periods, it is certainly useful to have conventional tools and decision-support systems (SIADs) that allow you to act or not to act. But it is much more useful, in the event of unforeseen and uncertain circumstances, to develop the ability to react. Such abilities will make it possible, in a proactive spirit, to anticipate. We see here that the notions of diagnosis are simplified because they lead to binary situations of the Go/No-go type, with reflex functions resulting from relevant, coherent, simple and fast learning.
What has just been developed leads us to ask a question: are we (our society) ready to accept chance and survive the eventualities and disasters of life? Indeed, in everyday life, examples abound to show that very few people are willing to accept fatalism, chance and the risks. We are in an ultra-protected era (although the economic and social differences between citizens are very important).
On another level, the professional one, error is not accepted, although everyone knows the following precept: errare humanum est, perseverare diabolicum (to err is human; to persevere in error is diabolical). Failure is not accepted as a coincidence or as a tribute to evolution: the one who has failed is pointed at by society, by banks and by employees comfortably settled in their offices, with the hidden mindset of a predator.
We live in complex worlds, and chance, unpredictability and uncertainty are part of our lives and it is difficult to implement measurement and control systems because our environment, although evolving in a stable way, will always be subject to eventualities and disasters. It is not a question of remaining indifferent to what surrounds us, but of showing – and this has been repeated several times – common sense, discernment, intuition and emotion. It is therefore human qualities, which are more a matter of the “I” than of the “method”, that we need and that complement existing approaches.
The situation can be clarified by the notion of “common goods”, which is pervasive in our environment and unconsciously used by everybody. According to [DIM 15], common goods are defined in economics as goods, resources and/or products that are either prone to rivalrous behaviors or non-excludable. This definition does not suffice to unlock the situation; therefore, the notion of property (what is proprietary) should evolve.
With regard to a recent and topical example such as climate stability, we can consider conventional examples of common goods such as water, including oceans, and air. Water and air can easily be polluted: water flows can be exploited beyond sustainability levels, and air is often degraded by fossil energy combustion, whether by motor vehicles, smokers, factories, wood and forest fires, etc. These effects are mainly caused by human beings’ activity, unconsciousness, greed and laxity.
In a production process, natural resources and materials are transformed into finished products such as food, shoes, toys, furniture, cars, houses and televisions. The activities leading to these products may be associated with pollution. This is simply due to the energy transformation processes involved (for the second law of thermodynamics, see [MAS 15b]). Thus, urban people are in no suitable position to criticize farmers, to manage risks alone or to instruct the rest of the population on how to behave. In the meantime, the environment is degraded by product usage.
Another example is related to fish stocks in international waters and the difficulty in elaborating regulations, specifications, limitations… As long as we allow ourselves to consume far more resources every year than our Mother Earth can produce in the same time interval, the comfort provided to some contrasts with, for example, fishermen’s hunger.
Good examples should always come from the top, and benefits (if and when there are any) should be shared by the whole of society [MAS 17b]. Through the above examples, we described situations in which economic players withdraw resources to secure short-term gains (or conveniences) without regard for long-term consequences. For instance, how will we accommodate seven billion inhabitants under sustainable conditions and at the best societal cost? For these reasons, the expression “tragedy of the commons” was coined, and nobody knows how to control and reduce the associated risks.
Going further, forest exploitation leads to barren lands, and overfishing reduces the overall fish stock, both of which eventually diminish the yields that can be withdrawn periodically. The typically linear thinking (quantities, volumes, yields) must be replaced by a “frequency thinking” (the ratio of volume withdrawn per period to regeneration capacity), which entails building more global economic models. No one can, for instance, earn the right to deforest at will a country or its private territory. Surely, a nation has the right to exploit its lands, but it has no right to endanger the common goods of the (i.e. our) planet.
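This “frequency thinking” can be sketched with a toy renewable-resource model (an illustrative logistic model with assumed parameters, not taken from the text): as long as the annual harvest stays below the maximum sustainable yield rK/4 allowed by regeneration, the stock persists; above it, the stock collapses.

```python
def simulate_stock(harvest, years=200, r=0.5, K=1.0, x0=0.5):
    """Logistic regeneration minus a fixed annual harvest; returns the final stock."""
    x = x0
    for _ in range(years):
        x = x + r * x * (1.0 - x / K) - harvest
        if x <= 0.0:
            return 0.0  # the resource has collapsed
    return x

msy = 0.5 * 1.0 / 4.0  # maximum sustainable yield r*K/4 = 0.125

sustainable = simulate_stock(harvest=0.10)  # below MSY: the stock settles at a positive level
collapse = simulate_stock(harvest=0.15)     # above MSY: regeneration can never keep up

print(f"stock after 200 years, harvest 0.10: {sustainable:.3f}")
print(f"stock after 200 years, harvest 0.15: {collapse:.3f}")
```

The point of the sketch is that the relevant quantity is not the harvested volume itself but its ratio to the regeneration capacity: a small change in that ratio separates a sustainable regime from an irreversible collapse.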
This concluding discussion enables us to redefine the content of a risk and the way we have to manage it. We consider that common goods are an exploitable form of renewable resource, such as fish stocks, grazing land, etc., which must be managed so as to remain sustainable. These common-pool resources must be subject to sustainability and ethics, and require the widest possible institutional arrangement and consensus, as a shared and collaborative “common-pool resource” management.