Watch a drop of rain trace down a window-pane. Being acquainted with gravity, you might expect it to take a perfectly straight path, but it doesn’t. It zigs and zags, so its position at the bottom of the pane is almost never a plumb drop from where it began.
Or graph the running score of a series of Bernoulli trials, wins minus losses as they accumulate. Provided the probability of winning is strictly between 0 and 1, the path, again, will veer back and forth unpredictably.
You are observing Gaussian randomness in action. Gaussian processes like Brownian motion have continuous sample paths, meaning, approximately, that you can draw them without lifting your pencil from the paper.
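Here is one way to draw the gambler's picture. This is a minimal sketch, not part of the original argument; it assumes Python with numpy and matplotlib, and the seed, trial count, and rescaling are illustrative choices.

```python
# Sketch: the running score of Bernoulli trials traces a jagged but
# connected path, and, suitably rescaled, resembles Brownian motion.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=0)
p = 0.5            # probability of winning each trial
n = 10_000         # number of trials

# Each trial pays +1 with probability p and -1 otherwise.
steps = np.where(rng.random(n) < p, 1, -1)
walk = np.cumsum(steps)

# Rescale time to [0, 1] and space by sqrt(n); by Donsker's theorem the
# result converges in distribution to Brownian motion.
plt.plot(np.arange(n) / n, walk / np.sqrt(n))
plt.xlabel("time (rescaled)")
plt.ylabel("position (rescaled)")
plt.title("Bernoulli random walk, rescaled")
plt.show()
```

The rescaled walk rarely ends where a plumb drop would predict, which is why the raindrop and the Bernoulli gambler trace such similar paths.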
The most famous and widely studied of all Gaussian processes is Brownian motion, first described by the botanist Robert Brown in 1827, which has had a profound impact on almost every branch of science, both physical and social. Its first important applications were made shortly after the turn of the last century by Louis Bachelier and Albert Einstein.
Bachelier wanted to model financial markets; Einstein, the movement of a particle suspended in liquid. Einstein was looking for a way to measure Avogadro's number, and the experiments he suggested proved to be consistent with his predictions. Avogadro's number turned out to be very large indeed: a teaspoon of water contains about 2×10²³ molecules.
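For the skeptical, the teaspoon figure survives a back-of-the-envelope check. A minimal sketch, assuming a level teaspoon holds about 5 mL of water and that water's molar mass is about 18 g/mol:

```python
# Sketch: molecules in a teaspoon of water (assumed: 5 mL teaspoon,
# water at ~1 g/mL, molar mass ~18 g/mol).
AVOGADRO = 6.022e23          # molecules per mole
teaspoon_grams = 5.0         # 5 mL of water weighs about 5 g
molar_mass_water = 18.0      # grams per mole

molecules = teaspoon_grams / molar_mass_water * AVOGADRO
print(f"about {molecules:.1e} molecules")   # ~1.7e23, roughly 2 x 10^23
```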
Bachelier hoped that Brownian motion would lead to a model for security prices that would provide a sound basis for option pricing and hedging. This was finally realized, some seventy years later, by Fischer Black, Myron Scholes and Robert Merton. It was Bachelier's idea that led to the discovery of non-anticipating strategies for tackling uncertainty. Black et al. showed that if a random process is Gaussian, it is possible to construct a non-anticipating strategy that eliminates the randomness.
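What "eliminating randomness" means in practice is delta hedging: against a written option, hold a stock position equal to the option's delta, adjusted continually using only the current price, never the future. The simulation below is a sketch of that idea, not the Black-Scholes-Merton derivation itself; the spot, strike, volatility, zero interest rate, and the numpy/scipy dependencies are all assumptions for illustration.

```python
# Sketch: writing a call and delta hedging it along a simulated Gaussian
# (geometric Brownian motion) price path. The strategy is non-anticipating:
# each rebalance uses only the price observed so far.
import numpy as np
from math import log, sqrt, exp
from scipy.stats import norm

S0, K, sigma, T = 100.0, 100.0, 0.2, 1.0   # spot, strike, volatility, maturity (r = 0)

def call_delta(S, tau):
    if tau <= 0:
        return 1.0 if S > K else 0.0
    d1 = (log(S / K) + 0.5 * sigma**2 * tau) / (sigma * sqrt(tau))
    return norm.cdf(d1)

def call_price(S, tau):
    d1 = (log(S / K) + 0.5 * sigma**2 * tau) / (sigma * sqrt(tau))
    d2 = d1 - sigma * sqrt(tau)
    return S * norm.cdf(d1) - K * norm.cdf(d2)

def hedge_error(n_steps, rng):
    dt = T / n_steps
    S = S0
    cash = call_price(S0, T)             # premium received for writing the call
    delta = call_delta(S0, T)
    cash -= delta * S                    # buy the initial hedge
    for i in range(1, n_steps + 1):
        S *= exp(-0.5 * sigma**2 * dt + sigma * sqrt(dt) * rng.standard_normal())
        new_delta = call_delta(S, T - i * dt)
        cash -= (new_delta - delta) * S  # rebalance, never anticipating
        delta = new_delta
    return cash + delta * S - max(S - K, 0.0)   # residual after paying the payoff

rng = np.random.default_rng(1)
for n in (10, 100, 1000):
    errors = [hedge_error(n, rng) for _ in range(200)]
    print(f"{n:5d} rebalances: hedge-error std {np.std(errors):.3f}")
```

The residual error shrinks as rebalancing becomes more frequent, roughly like one over the square root of the number of rebalances; in the continuous limit it vanishes, which is the precise sense in which Gaussian randomness can be hedged away.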
Theory and practice were reconciled when Norbert Wiener directed his attention to the mathematics of Brownian motion. Among Wiener’s many contributions is the first proof that Brownian motion exists as a rigorously defined mathematical object, rather than merely as a physical phenomenon for which one might pose a variety of models. Today Wiener process and Brownian motion are considered synonyms.
Back to Eustace.
Eustace plays in an uncertain world, his fortunes dictated by random processes. For any Gaussian process, it is possible to tame randomness without anticipating the future. Think of the quadrillions of Eustaces floating about, all encountering continuous changes in pH, salinity and temperature. Some will end up in conformations that mitigate the disruptive effects of these Gaussian fluctuations. Such conformations will have lower overall volatility, less positive entropy, and, consequently, higher alpha.
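To make "lower overall volatility" concrete, here is a sketch with invented dynamics, assuming numpy: two Eustaces absorb identical Gaussian shocks, but one has a buffering conformation, modeled as an Ornstein-Uhlenbeck-style restoring pull toward a set point.

```python
# Sketch: the same Gaussian buffeting, with and without a buffering
# (restoring) conformation. The buffered Eustace shows lower volatility.
import numpy as np

rng = np.random.default_rng(3)
n, dt = 10_000, 0.01
shocks = rng.standard_normal(n) * np.sqrt(dt)   # identical shocks for both

def realized_volatility(damping):
    x, path = 0.0, np.empty(n)
    for i, dw in enumerate(shocks):
        x += -damping * x * dt + dw   # pull back toward the set point, plus shock
        path[i] = x
    return path.std()

print("unbuffered:", realized_volatility(0.0))   # drifts freely; high volatility
print("buffered:  ", realized_volatility(5.0))   # damped; lower volatility, higher alpha
```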
Unfortunately for Eustace, not all randomness is Gaussian. Many random processes have a Poisson component as well. Unlike continuous Gaussian processes, disruptive Poisson processes exhibit completely unpredictable jump discontinuities. You cannot draw them without picking up your pencil.
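The pencil test is easy to demonstrate. A sketch, with invented rates and jump sizes, assuming numpy:

```python
# Sketch: a Gaussian path wiggles continuously; a compound Poisson path
# sits perfectly still and then jumps without warning.
import numpy as np

rng = np.random.default_rng(2)
T, n = 1.0, 1000
dt = T / n

# Gaussian: drawable without lifting the pencil.
gaussian_path = np.cumsum(rng.standard_normal(n) * np.sqrt(dt))

# Poisson: flat almost everywhere, then a jump discontinuity.
rate = 5.0                                 # expected jumps per unit time
arrivals = rng.poisson(rate * dt, size=n)  # 0 almost everywhere, occasionally 1
jump_sizes = rng.standard_normal(n) * 3.0  # invented jump magnitudes
poisson_path = np.cumsum(arrivals * jump_sizes)

print(f"{arrivals.sum()} jump(s); between jumps the path does not move at all")
```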
Against Poisson events a non-anticipating strategy, based on continuous adjustment, is impossible. Accordingly they make trouble for all Eustaces, even human beings. Natural Poisson events, like tornadoes and earthquakes, cost thousands of lives. Financial Poisson events cost billions of dollars. The notorious hedge fund Long-Term Capital Management collapsed because of a Poisson event in August 1998, when the Russian government announced that it intended to default on its sovereign debt. Bonds that were trading around 40 sank within minutes to single digits. LTCM’s board members, ironically, included Robert Merton and Myron Scholes, the masters of financial Gaussian randomness. Yet even they were defeated by Poisson.
All hope is not lost, however, since any Poisson event worth its salt affects its surroundings by generating disturbances before it occurs. Eustaces configured to take these hints will have a selective advantage. Consider a moderately complex Eustace — a wildebeest, say. For wildebeests, lions are very nasty Poisson events; there are no half-lions or quarter-lions. But lions give off a musky stink and sometimes rustle in the grass before they pounce, and wildebeests that take flight on these signals tend to do better in the alpha casino than wildebeests that don’t.
Even the simplest organisms develop anti-Poisson strategies. Fluctuations in pH and salinity, for example, are damped by chemical buffers, while capabilities like chemotaxis evolved as responses to Poisson dynamics.
A successful Eustace must cope with two different aspects of randomness, Gaussian and Poisson. Gaussian randomness generates events continuously; Poisson randomness generates them intermittently. A Gaussian strategy can therefore be adjusted constantly, while a response to a Poisson event must be triggered by a threshold on some signal. Neither of these configurations is fixed. Eustace is a collection of coupled processes, so, in addition to responding to external events, some processes may be coupled to other internal processes, leading to configuration changes.
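A sketch of the two response modes coupled in one organism, assuming numpy; the class name, gain, and threshold are all hypothetical:

```python
# Sketch: continuous adjustment tracks Gaussian drift, while a threshold
# trigger catches the Poisson jump that smooth tracking would miss.
import numpy as np

rng = np.random.default_rng(4)

class Eustace:
    def __init__(self, gain=0.1, threshold=3.0):
        self.estimate = 0.0
        self.gain = gain            # continuous (Gaussian) adjustment rate
        self.threshold = threshold  # trip point for the Poisson response

    def sense(self, signal):
        surprise = signal - self.estimate
        if abs(surprise) > self.threshold:
            self.estimate = signal             # jump response: reconfigure at once
            return "flee"
        self.estimate += self.gain * surprise  # smooth response: small correction
        return "adjust"

env = np.cumsum(rng.standard_normal(500) * 0.1)  # Gaussian drift...
env[250:] += 10.0                                # ...interrupted by one Poisson jump

eustace = Eustace()
responses = [eustace.sense(s) for s in env]
print(responses.count("flee"), "jump response(s);",
      responses.count("adjust"), "continuous adjustments")
```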
We will call this choreographed ensemble of coupled processes an alpha model. Within the context of our model, we can see a path to the tools of information theory, where the numerator of alpha represents stored and new information, and the denominator represents error and noise. The nature of this path will be the subject of the next installment.
Mandelbrot might remark that Gaussian and Poisson processes resemble each other but differ in temporal and quantitative scale.
Everyone familiar with the subject would agree with Mandelbrot. But even Mandelbrot, Feigenbaum and Lorenz together might not be able to control the number of occurrences of an event to produce enough data to yield all the moments of its distribution.
The first application of the Poisson distribution was describing the number of deaths by horse kick in the Prussian army.
Fascinating, boss, keep it coming.
But the "policy" implications are still not at all clear to me. Whether the uncertainty that I face is Poisson-like or Gaussian may matter little to my real-world strategy for deling with it, no? Two equally Gaussian situations may have entirely different degrees of uncertainty (as well as differently significant outcomes–death, say, versus, a few secods of lost time…) and require entirely different strategies to cope with. Same with Poisson events: one may be an event that I can easily and cheaply deal with in a constant way (however irregular the event), or one that I (truly) cannot anticipate at all (hurricanes do not qualify, here)… Is this really the normatively significant difference?
Or, is it "cheating" to look at specific situations to note the varying degrees of certainty that actually exist among them, i.e., the facts that can (perhaps only marginally) alter the level of my uncertainty in any particular instance? Don't we need ALL the facts that we can muster about the specific context, and isn't it in those extra facts that the policy implications lie?
Wiener process. heh heh. Brownian motion. heh heh.
So, Aaron, do you think the election was fixed? Exit polls vs. final vote is right up your analytical alley…
Jim: The difference is significant insofar as most successful Eustaces have already evolved pretty good strategies, or policies as you put it, for dealing with Gaussian randomness. At a high level in the casino Eustaces are distinguished by their ability to cope with Poisson.
Beavis: I don’t know what exit polls have to do with alpha theory, but of course the election was fixed. On election night I myself was riding the 7th Avenue IRT and counted the buttons on my fellow passengers: Kerry-Edwards 34, Bush-Cheney 0. So Kerry not only won, he won unanimously.
Aaron,
Your point is well-taken. Now keep going!
Your description is quite nice, though I feel that, having name-checked the Flying Bernoulli Brothers, you should at least have found a way to point out the binomial distribution upon which their performance hangs…
Anon
Upon consideration, a further comment.
Given the progression of your argument up to this point, it seems possible that you will use Gaussian and Poisson processes as more than just _examples_ of smooth or discontinuous processes. I would point out that actually determining the underlying probability distribution of a complex process is not trivial, and, despite the simplifications it allows, it is not always permissible to assume that, in the end, everything is basically Gaussian. Mandelbrot will tell you otherwise; half a century ago, so would have Kolmogorov.
All of this may have relevance if you start talking about maximizing a function whose underlying probability distribution you cannot describe with certainty.
Anon
Mr. Anon,
The derivation in no way implies that Eustace will know the true nature of the distributions. Rather, the point is to illustrate the different challenges posed by different types of randomness. If a stationary process is characterized by its first two moments, it can be adapted to.
Kolmogorov complexity has its uses but is degenerate when applied to living systems. You’ll find that functions defined with alpha are a much better metric with which to compare organisms and their complexity.