Chapter 19

Awaiting the Wildness

The great statistician Maurice Kendall once wrote, “Humanity did not take control of society out of the realm of Divine Providence . . . to put it at the mercy of the laws of chance.”1 As we look ahead toward the new millennium, what are the prospects that we can finish that job, that we can hope to bring more risks under control and make progress at the same time?

The answer must focus on Leibniz’s admonition of 1703, which is as pertinent today as it was when he sent it off to Jacob Bernoulli: “Nature has established patterns originating in the return of events, but only for the most part.” As I pointed out in the Introduction, that qualification is the key to the whole story. Without it, there would be no risk, for everything would be predictable. Without it, there would be no change, for every event would be identical to a previous event. Without it, life would have no mystery.

The effort to comprehend the meaning of nature’s tendency to repeat itself, but only imperfectly, is what motivated the heroes of this book. But despite the many ingenious tools they created to attack the puzzle, much remains unsolved. Discontinuities, irregularities, and volatilities seem to be proliferating rather than diminishing. In the world of finance, new instruments turn up at a bewildering pace, new markets are growing faster than old markets, and global interdependence makes risk management increasingly complex. Economic insecurity, especially in the job market, makes daily headlines. The environment, health, personal safety, and even the planet Earth itself appear to be under attack from enemies never before encountered.

The goal of wresting society from the mercy of the laws of chance continues to elude us. Why?

* * *

For Leibniz, the difficulty in generalizing from samples of information arises from nature’s complexity, not from its waywardness. He believed that there is too much going on for us to figure it all out by studying a set of finite experiments, but, like most of his contemporaries, he was convinced that there was an underlying order to the whole process, ordained by the Almighty. The missing part to which he alluded with “only for the most part” was not random but an invisible element of the whole structure.

Nearly two and a half centuries later, Albert Einstein struck the same note. In a famous comment that appeared in a letter to his fellow-physicist Max Born, Einstein declared, “You believe in a God who plays with dice, and I in complete law and order in a world which objectively exists.”2

Leibniz and Einstein may be correct that God does not play with dice, but, for better or for worse and in spite of all our efforts, human beings do not enjoy complete knowledge of the laws that define the order of the objectively existing world.

Leibniz and Einstein were scientists concerned with the behavior of the natural world, but human beings must contend with the behavior of something beyond the patterns of nature: themselves. Indeed, as civilization has pushed forward, nature’s vagaries have mattered less and the decisions of people have mattered more.

Yet the growing interdependence of humanity was not a concern to any of the innovators in this story until we come to Knight and Keynes in the twentieth century. Most of these men lived in the late Renaissance, the Enlightenment, or the Victorian age, and so they thought about probability in terms of nature and visualized human beings as acting with the same degree of regularity and predictability as they found in nature.

Behavior was simply not part of their deliberations. Their emphasis was on games of chance, disease, and life expectancies, whose outcomes are ordained by nature, not by human decisions. Human beings were always assumed to be rational (Daniel Bernoulli describes rationality as “the nature of man”), which simplifies matters because it makes human behavior as predictable as nature’s—perhaps more so. This view led to the introduction of terminology from the natural sciences to explain both economic and social phenomena. The process of quantifying subjective matters like preferences and risk aversion was taken for granted and above dispute. In all their examples, no decision by any single individual had any influence on the welfare of any other individual.

The break comes with Knight and Keynes, both writing in the aftermath of the First World War. Their “radically distinct notion” of uncertainty had nothing whatsoever to do with nature or with the debate between Einstein and Born. Uncertainty is a consequence of the irrationalities that Knight and Keynes perceived in human nature, which means that the analysis of decision and choice would no longer be limited to human beings in isolated environments like Robinson Crusoe’s. Even von Neumann, with his passionate belief in rationality, analyzes risky decisions in a world where the decisions of each individual have an impact on others, and where each individual must consider the probable responses of others to his or her own decisions. From there, it is only a short distance to Kahneman and Tversky’s inquiries into the failure of invariance and the behavioral investigations of the Theory Police.

Although the solutions to much of the mystery that Leibniz perceived in nature were well in hand by the twentieth century, we are still trying to understand the even more tantalizing mystery of how human beings make choices and respond to risk. Echoing Leibniz, G.K. Chesterton, a novelist and essayist rather than a scientist, described the modern view this way:

The real trouble with this world of ours is not that it is an unreasonable world, nor even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite. Life is not an illogicality; yet it is a trap for logicians. It looks just a little more mathematical and regular than it is; its exactitude is obvious, but its inexactitude is hidden; its wildness lies in wait.3

In such a world, are probability, regression to the mean, and diversification useless? Is it even possible to adapt the powerful tools that interpret the variations of nature to the search for the roots of inexactitude? Will wildness always lie in wait?

* * *

Proponents of chaos theory, a relatively new alternative to the ideas of Pascal and others, claim to have revealed the hidden source of inexactitude. According to chaos theorists, it springs from a phenomenon called “nonlinearity.” Nonlinearity means that results are not proportionate to the cause. But chaos theory also joins with Laplace, Poincaré, and Einstein in insisting that all results have a cause—like the balanced cone that topples over in response to “a very slight tremor.”

Students of chaos theory reject the symmetry of the bell curve as a description of reality. They hold in contempt linear statistical systems in which, for example, the magnitude of an expected reward is assumed to be consistent with the magnitude of the risks taken to achieve it, or, in general, where results achieved bear a systematic relationship to efforts expended. Consequently, they reject conventional theories of probability, finance, and economics. To them, Pascal’s Arithmetic Triangle is a toy for children, Francis Galton was a fool, and Quetelet’s beloved bell curve is a caricature of reality.

Dimitris Chorafas, an articulate commentator on chaos theory, describes chaos as “. . . a time evolution with sensitive dependence on initial conditions.”4 The most popular example of this concept is the flutter of a butterfly’s wings in Hawaii that is the ultimate cause of a hurricane in the Caribbean. According to Chorafas, chaos theorists see the world “in a state of vitality. . . characterized by turbulence and volatility.”5 This is a world in which deviations from the norm do not cluster symmetrically on either side of the average, as Gauss’s normal distribution predicts; it is a craggy world in which Galton’s regression to the mean makes no sense, because the mean is always in a state of flux. The idea of a norm does not exist in chaos theory.

Chaos theory carries Poincaré’s notion of the ubiquitous nature of cause and effect to its logical extreme by rejecting the concept of discontinuity. What appears to be discontinuity is not an abrupt break with the past but the logical consequence of preceding events. In a world of chaos, wildness is always waiting to show itself.

Making chaos theory operational is something else again. According to Chorafas, “The signature of a chaotic time series. . . is that prediction accuracy falls off with the increasing passage of time.” This view leaves the practitioners of chaos theory caught up in a world of minutiae, in which all the signals are tiny and everything else is mere noise.
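Chorafas’s definition can be made concrete with a standard textbook illustration that is not drawn from this chapter: the logistic map, a simple “time evolution with sensitive dependence on initial conditions.” In the sketch below (the starting values and parameters are invented for the demonstration), two forecasts that begin a billionth apart agree at first and then lose all resemblance, which is precisely the sense in which prediction accuracy falls off with the passage of time.

```python
# A standard illustration of "sensitive dependence on initial conditions":
# the logistic map (an example chosen here, not a method from the text).
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate x -> r * x * (1 - x), a simple chaotic time evolution."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)  # initial conditions differ by 1e-9

early_gap = abs(a[5] - b[5])                      # early forecasts still agree
late_gap = max(abs(x - y) for x, y in zip(a, b))  # later they bear no resemblance
print(early_gap, late_gap)
```

The early gap stays microscopic while the eventual gap is of the same order as the values themselves: the signal is swamped, and forecasting beyond the near term becomes hopeless.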

As forecasters in financial markets who focus on volatility, practitioners of chaos theory have accumulated immense quantities of transactions data that have enabled them, with some success, to predict changes in security prices and exchange rates, as well as variations in risk, within the near future.6 They have even discovered that roulette wheels do not produce completely random results, though the advantage bestowed by that discovery is too small to make any gambler rich.

So far, the accomplishments of the theory appear modest compared to its claims. Its practitioners have managed to cup the butterfly in their hands, but they have not yet traced all the airflows impelled by the flutterings of its wings. But they are trying.

In recent years, other sophisticated innovations to foretell the future have surfaced, with strange names like genetic algorithms and neural networks.7 These methods focus largely on the nature of volatility; their implementation stretches the capability of the most high-powered computers.

The objective of genetic algorithms is to replicate the manner in which genes are passed from one generation to the next. The genes that survive create the models that form the most durable and effective offspring.a Neural networks are designed to simulate the behavior of the human brain by sifting out from the experiences programmed into them those inferences that will be most useful in dealing with the next experience. Practitioners of this procedure have uncovered behavior patterns in one system that they can use to predict outcomes in entirely different systems, the theory being that all complex systems like democracy, the path of technological development, and the stock market share common patterns and responses.8
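A rough sketch of the first of these ideas may help. The toy below (invented bit-strings and parameters, not a description of any system mentioned above) keeps the fittest candidate solutions in each generation and breeds them, with occasional mutations, to produce the next:

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

# Toy genetic algorithm: evolve bit-strings toward a target pattern.
TARGET = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]

def fitness(genes):
    # Fitter individuals match more positions of the target.
    return sum(g == t for g, t in zip(genes, TARGET))

def crossover(mom, dad):
    # Splice two parents at a random cut point.
    cut = random.randrange(1, len(mom))
    return mom[:cut] + dad[cut:]

def mutate(genes, rate=0.05):
    # Occasionally flip a bit.
    return [1 - g if random.random() < rate else g for g in genes]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(30)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]  # the fittest "genes" survive intact
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(20)]
    population = survivors + children

best = max(population, key=fitness)
print(fitness(best))  # typically at or near the maximum of 10
```

The mechanism, not the toy problem, is the point: selection, recombination, and mutation gradually concentrate the population on the most durable solutions.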

These models provide important insights into the complexity of reality, but there is no proof of cause and effect in the recognition of patterns that precede the arrival of other patterns in financial markets or in the spin of a roulette wheel. Socrates and Aristotle would be as skeptical about chaos theory and neural networks as the theorists of those approaches are about conventional approaches.

Likeness to truth is not the same as truth. Without any theoretical structure to explain why patterns seem to repeat themselves across time or across systems, these innovations provide little assurance that today’s signals will trigger tomorrow’s events. We are left with only the subtle sequences of data that the enormous power of the computer can reveal. Thus, forecasting tools based on nonlinear models or on computer gymnastics are subject to many of the same hurdles that stand in the way of conventional probability theory: the raw material of the model is the data of the past.

* * *

The past seldom obliges by revealing to us when wildness will break out in the future. Wars, depressions, stock-market booms and crashes, and ethnic massacres come and go, but they always seem to arrive as surprises. After the fact, however, when we study the history of what happened, the source of the wildness appears to be so obvious to us that we have a hard time understanding how people on the scene were oblivious to what lay in wait for them.

Surprise is endemic above all in the world of finance. In the late 1950s, for example, a relationship sanctified by over eighty years of experience suddenly came apart when investors discovered that a thousand dollars invested in low-risk, high-grade bonds would, for the first time in history, produce more income than a thousand dollars invested in risky common stocks.b In the early 1970s, long-term interest rates rose above 5% for the first time since the Civil War and have dared to remain above 5% ever since.

Given the remarkable stability of the key relationships between bond yields and stock yields, and the trendless history of long-term interest rates over so many years, no one ever dreamed of anything different. Nor did people have any reason for doing so before the development of contracyclical monetary and fiscal policy and before they had experienced a price level that only went up instead of rising on some occasions and falling on others. In other words, these paradigm shifts may not have been unpredictable, but they were unthinkable.

If these events were unpredictable, how can we expect the elaborate quantitative devices of risk management to predict them? How can we program into the computer concepts that we cannot program into ourselves, that are even beyond our imagination?

We cannot enter data about the future into the computer because such data are inaccessible to us. So we pour in data from the past to fuel the decision-making mechanisms created by our models, be they linear or nonlinear. But therein lies the logician’s trap: past data from real life constitute a sequence of events rather than a set of independent observations, which is what the laws of probability demand. History provides us with only one sample of the economy and the capital markets, not with thousands of separate and randomly distributed numbers. Even though many economic and financial variables fall into distributions that approximate a bell curve, the picture is never perfect. Once again, resemblance to truth is not the same as truth. It is in those outliers and imperfections that the wildness lurks.
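A small simulation makes the point about outliers concrete (the numbers are hypothetical, not market data): a distribution that looks roughly bell-shaped in its ordinary behavior can still produce far more extreme observations than a normal curve fitted to those ordinary days would predict.

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

# Hypothetical mixture: mostly ordinary days, occasionally a wild one.
N = 100_000
samples = [random.gauss(0, 1) if random.random() < 0.95 else random.gauss(0, 5)
           for _ in range(N)]

extreme = sum(abs(x) > 4 for x in samples)
# A normal curve fitted to the ordinary days (standard deviation 1) would
# predict only about 6 moves beyond 4 in 100,000; the mixture produces
# far more. The bulk looks bell-shaped, but the tails do not behave.
print(extreme)
```

The resemblance to the bell curve is genuine in the middle of the distribution; it is in the tails, where the wildness lurks, that the approximation quietly fails.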

Finally, the science of risk management sometimes creates new risks even as it brings old risks under control. Our faith in risk management encourages us to take risks we would not otherwise take. On most counts, that is beneficial, but we must be wary of adding to the amount of risk in the system. Research reveals that seatbelts encourage drivers to drive more aggressively. Consequently, the number of accidents rises even though the seriousness of injury in any one accident declines.c Derivative financial instruments designed as hedges have tempted investors to transform them into speculative vehicles with sleigh-rides for payoffs and involving risks that no corporate risk manager should contemplate. The introduction of portfolio insurance in the late 1970s encouraged a higher level of equity exposure than had prevailed before. In the same fashion, conservative institutional investors tend to use broad diversification to justify higher exposure to risk in untested areas—but diversification is not a guarantee against loss, only against losing everything at once.
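The arithmetic behind that last sentence is simple. In the sketch below (the 10 percent failure probability is invented for illustration), spreading a stake across independent assets makes total ruin vanishingly unlikely while making the chance of suffering some loss larger, not smaller:

```python
# Hypothetical numbers: each of n independent assets has a 10% chance
# of a total loss in a given period.
p_loss = 0.10

for n in (1, 5, 10):
    p_ruin = p_loss ** n              # every asset fails at once
    p_some = 1 - (1 - p_loss) ** n    # at least one asset fails
    print(n, p_ruin, round(p_some, 4))
```

With ten independent holdings, losing everything at once becomes a one-in-ten-billion event, yet the odds of losing something rise from 10 percent to about 65 percent: diversification insures against ruin, not against loss.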

* * *

Nothing is more soothing or more persuasive than the computer screen, with its imposing arrays of numbers, glowing colors, and elegantly structured graphs. As we stare at the passing show, we become so absorbed that we tend to forget that the computer only answers questions; it does not ask them. Whenever we ignore that truth, the computer supports us in our conceptual errors. Those who live only by the numbers may find that the computer has simply replaced the oracles to whom people resorted in ancient times for guidance in risk management and decision-making.

At the same time, we must avoid rejecting numbers when they show more promise of accuracy than intuition and hunch, where, as Kahneman and Tversky have demonstrated, inconsistency and myopia so often prevail. G.B. Airy, one of many brilliant mathematicians who have served as director of Britain’s Royal Observatory, wrote in 1849, “I am a devoted admirer of theory, hypothesis, formula, and every other emanation of pure intellect which keeps erring man straight among the stumbling-blocks and quagmires of matter-of-fact observations.”9

The central theme of this whole story is that the quantitative achievements of the heroes we have met shaped the trajectory of progress over the past 450 years. In engineering, medicine, science, finance, business, and even in government, decisions that touch everyone’s life are now made in accordance with disciplined procedures that far outperform the seat-of-the-pants methods of the past. Many catastrophic errors of judgment are thus either avoided, or else their consequences are muted.

Cardano the Renaissance gambler, followed by Pascal the geometer and Fermat the lawyer, the monks of Port-Royal and the ministers of Newington, the notions man and the man with the sprained brain, Daniel Bernoulli and his uncle Jacob, secretive Gauss and voluble Quetelet, von Neumann the playful and Morgenstern the ponderous, the religious de Moivre and the agnostic Knight, pithy Black and loquacious Scholes, Kenneth Arrow and Harry Markowitz—all of them have transformed the perception of risk from chance of loss into opportunity for gain, from FATE and ORIGINAL DESIGN to sophisticated, probability-based forecasts of the future, and from helplessness to choice.

Opposed though he was to mechanical applications of the laws of probability and the quantification of uncertainty, Keynes recognized that this body of thought had profound implications for humanity:

The importance of probability can only be derived from the judgment that it is rational to be guided by it in action; and a practical dependence on it can only be justified by a judgment that in action we ought to act to take some account of it.

It is for this reason that probability is to us the “guide of life,” since to us, as Locke says, “in the greatest part of our concernment, God has afforded only the Twilight, as I may so say, of Probability, suitable, I presume, to that state of Mediocrity and Probationership He has been pleased to place us in here.”10

a. al-Khowârizmî, the mathematician whose name furnished the root of the word “algorithm,” would surely be astonished to see the offspring of what he launched nearly 1200 years ago.

b. From 1871 to 1958, stock yields exceeded bond yields by an average of about 1.3 percentage points, with only three transitory reversals, the last in 1929. In an article in Fortune magazine for March 1959, Gilbert Burke declared, “It has been practically an article of faith in the U.S. that good stocks must yield more income than good bonds, and that when they do not, their prices will promptly fall.” (See Bank Credit Analyst, 1995.) There is reason to believe that stocks yielded more than bonds even before 1871, which is the starting point for reliable stock market data. Since 1958, bond yields have exceeded stock yields by an average of 3.5 percentage points.

c. For an extensive analysis of such cases, see Adams, 1995.

Notes

1. Kendall, 1972, p. 42.

2. Quoted in Adams, 1995, p. 17.

3. Chesterton, 1909, pp. 149–150.

4. Chorafas, 1994, p. 15.

5. Ibid., p. 16.

6. See especially Hsieh, 1995, and Focardi, 1996.

7. For interesting and lucid descriptions of advances in these areas, see Focardi, 1996, and Leinweber and Arnott, 1995. The Journal of Investing, Winter 1995, has five excellent articles on the subject.

8. See “Can the Complexity Gurus Explain It All,” Business Week, November 6, 1995, pp. 22–24; this article includes reviews of two books on this subject.

9. Kruskal and Stigler, 1994, p. 7.

10. Keynes, 1921, p. 323.
