Perceptions rather than rules: the (mis)behaviour of markets

Why do we seem to face 1 in 300 year events every three years? Unlike many physical systems, markets exhibit strange, non-normal statistical distributions. We will consider the complex inter-relationships between perceived risk and society's decisions, the importance of 'feed-forward' and the implications of chaos theory on our understanding of markets.


PERCEPTIONS RATHER THAN RULES:
THE (MIS)BEHAVIOUR OF MARKETS

 

Professor Michael Mainelli

 

Good evening Ladies and Gentlemen. I’m pleased to see so many of you turn up for a lecture on perceptions rather than rules, but given that this is a former school, may I ask you to avoid any (mis)behaviour at least until we get to questions!

Well, it wouldn’t be a Commerce lecture without a commercial. The next Commerce lecture will follow the theme of better choice and explore “Goldilocks Government And The Market: Not Too Little, Not Too Much, But Just Right” here on Monday, 16 January 2006 at 18:00. And a special announcement – in the programme on page 7 we have an international finance event scheduled in February. On the evening of Monday, 6 February 2006 we shall have two lectures, one from Professor Joshua Ronen of NYU and one from me on “Reforming Auditing – Incremental Change or Radical Action?” at the Museum of London at 18:00. We also have a workshop on Radical Change in Auditing for those interested on the following day, Tuesday, 7 February 2006 in the City.

Well, as we say in Commerce – “To Business”.

[Slide: Outline]

 

It’s All About People

This talk is about feed-forward. What is feed-forward, why does it matter to markets and how does it make markets misbehave? Let me tell you a story my father repeatedly told us as children.

Once upon a time there was a man who ran a hot dog stand. This man ran one of the finest hot dog stands in the whole city and, strangely for a hot dog stand, he even used real meat in his sausages. People came from miles around to get tasty hot dogs that were freely covered in onions and sauces. In fact, the man was so successful that he could afford to send his son to Harvard. His son even went on to finish an MBA. After graduation the son came back to work with his dad. “Dad”, he said, “based on the current economic statistics, we’re heading for a recession. You’ve got to stop using all that sauce, and you dish out onions as if they were free.” The father was torn. He’d always been generous to his customers, but his very bright boy didn’t get all that education for nothing. So, reluctantly, he cut back on the sauces and the onions. His son moved him to buying a cheaper brand of hot dog with a more traditional sawdust ratio. It was just in time, because it turned out his son was right – his business took a real dive.

Now I happened to go to Harvard, so I’ve always wished my father had used “Yale” as the example. Even more to my dismay, my father listened attentively to his own story, such that after I went to Harvard he moved half-way around the globe and rather oddly never let me anywhere near his business. But a deeper truth is that what people think or believe will happen, frequently comes to pass. We could say that people’s self-fulfilling prophecies create chaos in the markets. And people’s perceptions cause markets to operate at the edge of chaos. So let me take you on an exploration of Systems Theory, Stochastic Systems Theory, Chaos Theory and Complexity.

Please Don’t Try To Do Systems Theory At Home

We can now start to explore what I meant in my first lecture when I mentioned that one of the four basic areas of knowledge we needed to comprehend in order to understand Commerce was “Stochastic Systems Theory”. I’d like to start by trying to explain Systems Theory to you in such a way that you can start looking at the world slightly differently.

[Slide: Systems Theory – IPOFMG]

Stochastic Systems Theory, Chaos Theory and Complexity have their roots in (just plain old ordinary) Systems Analysis or, as it was called when Norbert Wiener founded it in 1948, Cybernetics. Systems Analysis is an inter-disciplinary study of the communication and control of many independent units acting towards a goal. Systems Analysis has had many noteworthy contributors, such as Ludwig von Bertalanffy, Claude Shannon, Alan Turing, John von Neumann, Stafford Beer and Peter Checkland. With so many intellects, there have been a few wars and one needs to avoid falling into a few of the trenches left behind. However, Gwilym Jenkins summarises well a relatively accepted, traditional view of systems analysis from an engineering perspective, as:


a complex grouping of human beings and machines;
able to be broken into sub-systems;
interaction between sub-system inputs and outputs;
part of a hierarchy of systems;
having an overall objective;
designed in a way capable of meeting its overall objective.

 

If you can memorise seven words, then you can remember Systems Theory. The idea is that all systems exhibit seven components – inputs, processes, outputs, feed-back, monitoring, feed-forward and governance. I tried a few mnemonics such as mopffig, giffmop and pigmoff, but ipofmfg seems to be the best I can do (ugh) that retains the idea of the feed-back and feed-forward cycles inherent in systems. But what do these seven components mean? Well, take an automobile:


inputs, processes and outputs: are all the ‘doing’ components, the engine, wheels, chassis, brakes, steering and fuel systems of a car – do note all the sub-systems;
feed-back: perhaps the one bit of Systems Theory or Cybernetics we’ve all heard about; it’s about how the system reacts to news about how well or poorly it’s doing, such as how we react to our speedometer when we see we’re over the speed limit, or at least the speed camera limit, or how we brake, or accelerate, when a cyclist crosses in front of us;
monitoring: is how we measure what’s going on, the speedometer;
feed-forward: is a setting that anticipates something, a bit like using cruise control or signalling a manoeuvre;
governance: is how we adjust the system to meet our goal, deciding where to drive.

 

[Slide: Please Don’t Try To Do Systems Theory At Home!]

This is all a bit abstract, so let’s go into detail with another example. Let’s have a look at a house with a heating system on a cool day. We want to regulate the temperature in our house. We see the same seven components:


input: fuel;
process: burning the fuel to heat water;
output: pumping the hot water around the radiators;
feed-back: a new, warmer temperature;
monitoring: comparing the new, warmer temperature with our desired temperature;
feed-forward: we set the thermostat based on our desired temperature;
governance: our goal, a bit warmer or a bit cooler?

 

Do note the huge variety of sub-systems that might be present, the pumping system for the radiators, the fuel delivery system, the electrical system that powers the meters, the insides of the meters themselves. However, I want to draw your attention to the peculiar role of feed-back and feed-forward, so I built a small heating simulator. Unfortunately though, I wasn’t able to get a very sensitive fuel input, so the temperature could either go up a degree or down a degree, not a lot in between. I also had to use a computer-based random number generator, despite John von Neumann’s curse that, “Anyone who attempts to generate random numbers by deterministic means is, of course, living in a state of sin.” Let me share some of the results of trying to keep the temperature at 20ºC with you. This sequence of slides shows:

[Slide: Global Warming Goes For A Random Walk]


a random walk of temperature around 20ºC - just a random process that follows the outside temperature;

 

[Slide: Heating System Tries To Hit 20ºC]


a basic heating system that adjusts itself up when the inside temperature is below the target and down when the inside temperature is above the target. For a stupid machine, it doesn’t do too badly, but it’s not perfect;

 

[Slide: Person Knows Up/Down, Tries To Hit 20ºC]


a person trying to do without the system. However, this person has knowledge of whether the outside temperature is going up or down, so he or she injects some feed-forward. If the temperature is going to be cooler, say nightfall is approaching, they turn the heating up. If the temperature is going to be warmer, say the weather prediction is for a sunny day, then they turn the heating down. Notice that the temperature isn’t any better on target;

 

[Slide: Person And Heating System Working Together … hmmm]


so you’d say, let’s use the heating system and the person together and things will be great. Well here we have feed-back and feed-forward together. Not much better, as they often contradict each other;

 

[Slide: Person With Perfect Foreknowledge, Tries To Hit 20ºC]


so I gave the person a bit better knowledge – they could know exactly what the temperature outside was going to be 10 time periods in advance. Not really any better, is it?

 

[Slide: Person And Heating System With Perfect Foreknowledge, Try To Hit 20ºC]


finally, I gave both the heating system and the person perfect foreknowledge of what the outside temperature was going to be 10 time periods in advance – I even let them tell each other, but I didn’t let them cooperate. Not really any better, is it?

 

The point here is that the heating system and the person, even with very good foreknowledge, can’t really improve on basic feed-back; in fact they can make it worse when they cannot use that knowledge to cooperate. Let’s look at four more examples quickly, just so you can see that I didn’t pick a worst case...

[Slides: Looks Like A Market 1-4?]
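For those who want to experiment, here is a minimal Python sketch of the kind of heating simulation just described. It is not the original simulator: the heat-leak rate, the one-degree-per-period control step and the ten-period lookahead are my own illustrative assumptions.

```python
import random

random.seed(42)  # fixed seed so a single run is reproducible

def outside_walk(steps, start=20.0):
    """Outside temperature as a random walk: one degree up or down per period."""
    temps, t = [], start
    for _ in range(steps):
        t += random.choice([-1.0, 1.0])
        temps.append(t)
    return temps

def simulate(outside, target=20.0, feedforward=False, lookahead=10, leak=0.02):
    """Crude room: it leaks a little heat towards the outside temperature, and
    the heating can only nudge things one degree up or down per period.
    With feedforward=True, the controller also reacts to foreknowledge of
    where the outside temperature is heading."""
    inside, t = [], target
    for i, o in enumerate(outside):
        control = 1.0 if t < target else -1.0          # feed-back on last reading
        if feedforward:
            future = outside[min(i + lookahead, len(outside) - 1)]
            control += 1.0 if future < o else -1.0     # colder ahead? turn it up
        t += leak * (o - t) + control
        inside.append(t)
    return inside

def mae(series, target=20.0):
    """Mean absolute error from the target temperature."""
    return sum(abs(x - target) for x in series) / len(series)

outside = outside_walk(200)
fb = simulate(outside)
ff = simulate(outside, feedforward=True)
print(f"outside {mae(outside):.2f}  feed-back {mae(fb):.2f}  with feed-forward {mae(ff):.2f}")
```

On a typical run the crude feed-back controller holds the room within a degree or two of 20ºC while the outside temperature wanders off; adding the feed-forward term often does little better, echoing the slides.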

 

Stochastic Systems Theory

[Slide: Stochastic Systems Theory]

Now let’s get all stochastic. Stochastic is just a fancy word for random; it is largely synonymous with chance. Stochastic comes from the Greek for “to aim at”. I like to think of the metaphor of an archer aiming arrows at a target, shooting and then looking at the pattern of distribution around the target. Systems Theory encourages us to decompose complex systems into sub-systems, recognising interlinked feed-forward and feed-back loops. Stochastic Systems Theory takes basic Systems Theory and asks you to realise that there are many imperfections at all points – what is the target, how do I know the value of the target, where is the target moving, what will this bow do, how well will this arrow fly, from my last hundred shots was I high, low or all over the place? At each of the seven system components you do not have a single number, you have a range of numbers.

A stochastic view of the world develops our appreciation for the roles of chance and imperfect information, both in planning forward and in evaluating backward. Stochastic Systems Theory asks you to recognise the role of chance and probability. If stochastic doesn’t seem important, no greater an authority than Laplace said, “The most important questions of life are, for the most part, really only problems of probability.” Chance and probability feature in most of the key measures and in what people do about them – think of financial people as blindfolded archers shooting at a target, where your job is to analyse the distribution of the arrows.
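The blindfolded-archer picture is easy to make concrete. The sketch below is my own illustration, not from the lecture: it compares a “mild”, normally distributed archer with one whose shots are occasionally caught by a strong gust, a crude fat-tailed distribution. The gust probability and scale are arbitrary assumptions.

```python
import random

random.seed(7)  # fixed seed so a single run is reproducible

def mild_archer(n):
    """Shots scatter normally around the bullseye: a 'mild' distribution."""
    return [random.gauss(0.0, 1.0) for _ in range(n)]

def wild_archer(n, p_gust=0.01):
    """The same archer, but one shot in a hundred is caught by a gust ten
    times as strong: a crude fat-tailed distribution."""
    return [random.gauss(0.0, 10.0 if random.random() < p_gust else 1.0)
            for _ in range(n)]

def worst_miss(shots):
    """The single largest distance from the bullseye."""
    return max(abs(s) for s in shots)

mild, wild = mild_archer(10_000), wild_archer(10_000)
print(f"worst mild miss {worst_miss(mild):.1f}, worst wild miss {worst_miss(wild):.1f}")
```

The wild archer’s worst miss dwarfs the mild archer’s, even though 99% of their shots are identical in character – which is roughly how supposedly 1 in 300 year events keep turning up.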

Now I want to draw your attention to the people in the system, so I need to talk about my wife. We too happen to have a heating system at home, but sadly my relationship with my wife is thermostatically challenged. As with many couples, she likes it warm while I like it cool. She wants a heavier duvet and I want all the windows open. She happens to be concerned about global warming, while I take it very personally – it’s going to be very inconvenient for me. Of course, we have a traditional marriage. So rather than open warfare we engage in covert warfare. When I feel the temperature is getting down to just about right, I realise she’s highly likely to put the thermostat up, so I turn it down a bit more, just to be sure. Unfortunately, she can catch me at this and, before I can convince her that the temperature is exactly perfect or that the thermometer just happens to be under-reading, she’s highly likely to have increased the thermostat, maybe by about ten degrees or so just to be sure. Of course, in order to save the planet, when she’s not looking it’s my duty to turn it down, and open the windows so the birds don’t freeze. This can go on for a while, although, to be fair, not for too long. In the northern hemisphere, we only do this from September through May, unless it’s a cool summer.

[Slide: Husband And Wife In Strife?!?]

I did one final heating system simulation of a couple fighting. You see the basic temperature and you see the heating system and the person holding close to 20ºC in trying circumstances. Again, the black line is temperature. What’s trying about the circumstances for this poor soul is his or her marriage. I have included another person turning things up and down, a bit like my wife and me. Notice that this bright line is just about as good as the plain line where we aren’t fighting each other. People add a bit of chaos to the system, but, frequently they also end up reaching a form of stability in all the chaos, a bit of a strange attractor to a range of temperature. In this case, my wife is winning, things are above 20ºC on average, but we’re not a million miles from where we ought to be.
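The marital tug-of-war can be sketched in a few lines too. The preferred temperatures of 24ºC and 18ºC, the coin-flip over who passes the thermostat and the room’s lag are all my own illustrative assumptions, not the original simulation.

```python
import random

random.seed(1)  # fixed seed so a single run is reproducible

def marital_thermostat(steps=500, hers=24.0, his=18.0):
    """Two people covertly nudge one thermostat towards their own preferred
    temperature; the room simply follows the setpoint with a lag."""
    setpoint, room, history = 20.0, 20.0, []
    for _ in range(steps):
        if random.random() < 0.5:                        # she passes the thermostat
            setpoint += 1.0 if setpoint < hers else -1.0
        else:                                            # he passes the thermostat
            setpoint += -1.0 if setpoint > his else 1.0
        room += 0.3 * (setpoint - room)                  # room lags the setpoint
        history.append(room)
    return history

rooms = marital_thermostat()
average = sum(rooms) / len(rooms)
print(f"room ranges {min(rooms):.1f}-{max(rooms):.1f} C, averaging {average:.1f} C")
```

Neither combatant ever wins outright, yet the room never leaves the band between the two preferences – the fight itself settles into a bounded, attractor-like range.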

One of the strange things about complex systems is that, with enough actors and complexity, they can adapt to circumstances such that the overall system remains stable in a gross sense. For instance, share prices may be based on sentiment, but be attracted to the general region of ‘intrinsic’ value. Here, while my wife and I can start to make the system a bit more unstable, our interaction ultimately seems to stabilise around a temperature we can both live with. Although it might be simpler to talk with each other and agree on a common temperature, I’m led to believe that talking like this might put our marriage at risk. And this, quite naturally, leads us to chaos.

 

The Edges of Chaos

[Slide: The Edges of Chaos]

Markets are always on the edge of chaos. Informally, we note the rapid changes in prices, such as oil at the moment; or complete reversals in fortune, such as Refco at the moment; or the total abandonment of previous passions, such as technology shares at the moment. In my last talk I emphasised that all markets are anthropo-centric; they are purely based on what people think. So how do these thoughts affect the numbers? More formally, we can look at an area of study called Chaos Theory and wonder about its applicability to markets. So let’s leave people’s perceptions for a moment and return after we’ve had a brief excursion through some mathematics. For as Roger Bacon remarked, “Mathematics is the door and the key to the sciences.”

Chaos Theory is not a theory, but a way of approaching problems, or a set of techniques and viewpoints which seem to recur in problems. Chaos Theory attempts to explain boundary conditions between order and disorder, between the easily modelled problem and the impossible to model problem. In Edgar E Peters’ words, “Many systems have now been found where randomness and determinism, or chance and necessity, integrate and coexist.” Chaos Theory is not particularly interested in true chaos or true order, but in areas where things appear chaotic, although they have a strong underlying order. Although earlier work by Cantor, Sierpinski or that great mathematician, Henri Poincaré, has been a great influence, “chaos” as a movement began in the early 1960’s with the work of Lorenz and Mandelbrot. While Chaos Theory is not a formal movement, a number of strong themes recur.

[1] An initial theme in Chaos Theory is self-similarity, or symmetry across scale. Self-similarity is often illustrated by fractals, geometric constructions whose dimensions fall between the integer dimensions. The term fractal was coined in 1975 by Benoît Mandelbrot, from the Latin word fractus or “broken”. Simple equations can produce apparently complex, even beautiful diagrams that you can delve into at deeper and deeper resolutions unto infinity, yet which still seem similar at each scale. No matter how deeply you move into this picture, you still find the Paisley Pillsbury Doughboy shape recurring at deeper and deeper resolutions. Self-similarity in fractals is thought to correspond to analogous similarity across scale in nature, e.g. coastlines or clouds look similar whether viewed close up or far away. Mandelbrot believes that markets are strongly self-similar, i.e. graphs of daily trading look similar to weekly, which in turn look similar to monthly or annual graphs. Mandelbrot demonstrates that markets appear to exhibit a memory of their entire history contained within their fractal dimension.

[Slide: 10 Years Seems Like 10 Days]
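The famous picture itself takes only a few lines. Below is a minimal escape-time sketch of the Mandelbrot set, the iteration z → z² + c that Mandelbrot made famous; the plotting window and resolution are my arbitrary choices.

```python
def escape_time(c, max_iter=100):
    """Iterate z -> z*z + c from zero; count the steps before |z| exceeds 2.
    Points that never escape belong to the Mandelbrot set."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

# A tiny ASCII rendering: '#' marks points that never escape.
for im in range(12, -13, -2):
    print("".join("#" if escape_time(complex(re / 30.0, im / 30.0)) == 100 else "."
                  for re in range(-60, 16)))
```

Shrink the window around any point on the boundary and similar shapes keep recurring – the self-similarity across scale described above.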

[2] A second theme is the recognition that simple models can produce apparent complexity to an observer, such as the relatively simple algorithm that generates the Mandelbrot set. Further, this apparent complexity seems, in many cases, to resemble apparent complexity found in nature, such as trees, clouds or coastlines. Graphs of deterministic non-linear models that appear to generate complex behaviour are popular in Chaos Theory because they illustrate well the existence of a region of models bounded by the organised simplicity of linear models, or continuous non-linear models, and chaos. These models tend to have the characteristics of being aperiodic and having forms that change structurally given small changes to the model variables. At the same time they exhibit an underlying order, with resonances and attractors, which people perceive, or symmetry which people love. Non-linear dynamic models are the posters of the Chaos Theory movement. This theme is taken further to the point that Chaos Theory and its illustrative companions, fractals, are seen to encompass a new science of wholeness, almost a new Romantic movement in science. While there are elements of romanticism within Chaos Theory literature, concrete analytical work sits alongside.

[3] A third large theme in Chaos Theory is said to originate with Poincaré, ‘sensitive dependence on initial conditions’. In one direction, we frequently examine financial-mathematical models and compare them with reality in hopes of finding a lasting comparison from which the existence of a theory may be inferred. Quoting Edgar E Peters again, this “can be confused with ‘data mining’ or torturing the data until it confesses … Actual results depend on many numerical experiments with varying test parameters. If this sounds unscientific, it is”. Another direction is to take a theory, develop a model and compare it with reality.

For example, back in 1990 De Grauwe and Vansenten built a model of the foreign exchange market from generally accepted, though not uncontested, theory. They demonstrated the chaotic properties of the model and contrasted the model with actual exchange rate data showing numerous statistical similarities with the model outputs. The results illustrated that models can be constructed which may be perceived to mimic actual market behaviour but which are not predictive. However, minute changes to the model inputs resulted in new outputs that bore little statistical similarity to actual exchange rate data. The model was sensitive to initial conditions to such a degree that no data could ever be accurate enough to begin forecasting. The small degree of variance in normal input data accuracy was more than sufficient to change the model output markedly. Trivial facts or events could completely alter the model output. “Exact” knowledge of the environment would be necessary to use the model predictively and knowledge to that accuracy is unlikely. Moreover, the model was untestable as there was no means of obtaining real world data of sufficient quality to test it. So De Grauwe and Vansenten succeeded in proving that their foreign exchange model was useless in real life.
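Sensitive dependence on initial conditions can be demonstrated with a far simpler model than a foreign exchange market – the logistic map, used here purely as a stand-in for any chaotic model, not as De Grauwe and Vansenten’s. The starting point, tolerance and chaotic parameter r = 4 are my illustrative choices.

```python
def steps_to_diverge(x0, eps, r=4.0, tol=0.5, limit=200):
    """Run two copies of the logistic map x -> r*x*(1-x) from initial
    conditions eps apart, and count the steps until they disagree by tol."""
    a, b = x0, x0 + eps
    for step in range(limit):
        if abs(a - b) > tol:
            return step
        a, b = r * a * (1.0 - a), r * b * (1.0 - b)
    return limit

# A billion-fold improvement in measurement accuracy buys only a few dozen steps.
print(steps_to_diverge(0.4, 1e-3), steps_to_diverge(0.4, 1e-12))
```

Each extra digit of measurement accuracy buys only a handful of additional steps of predictability, which is exactly why no real-world data could ever be accurate enough to make such a model forecast.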

[Slide: Spot the Fake]

Let’s take a moment to try and see how hard it is to spot the difference. You see before you eight different time series provided by Mandelbrot to illustrate how models can imitate real market prices. Your job is to try and spot which are real and which are fake. The real ones are either share price movements or foreign exchange rates. Let’s take a vote on each one as we go through them.


Fake – Bachelier random walk
Fake – random walk, Mandelbrot adjustments
Fake – random walk, Mandelbrot adjustments
Fake - Mandelbrot multifractal
Real - IBM share prices
Real - Dollar-Deutsche Mark
Fake - Mandelbrot multifractal
Fake - Mandelbrot multifractal

 

So what have we learned? Well, just as De Grauwe and Vansenten told us, we can build lots of models that appear to simulate reality, but we may never be able to select from among them those that have a provable basis in reality. If there is a significant number of possible models, none of which can be verified, but all of which exhibit behaviour compatible with the system being modelled, it is impossible to choose a best fit or, even if you choose one at random, it’s impossible to prove it’s valid. David Deutsch summarises things: “Chaos theory is about limitations on predictability in classical physics, stemming from the fact that almost all classical systems are inherently unstable. The ‘instability’ in question has nothing to do with any tendency to behave violently or disintegrate. It is about an extreme sensitivity to initial conditions.”

We gain three pointers from Chaos Theory to markets:


self-similarity at different scales;
simple models can produce apparently complex behaviour;
extreme sensitivity of models to initial conditions.

 

[Slide: Life Is A Bit Complex]

Chaos Theory’s distinctions among chaos, order and the boundary between chaos and order are particularly useful as metaphors for markets. If there are non-linearities in a system, then behaviour may be as unpredictable as it is interesting. But we also need to be aware that Chaos Theory and its pretty posters may foster illusions. Einstein, echoing William of Ockham, warns us:

“Although it is true that it is the goal of science to discover rules which permit the association and foretelling of facts, this is not its only aim. It also seeks to reduce the connexions discovered to the smallest possible number of mutually independent conceptual elements. It is in this striving after the rational unification of the manifold that it encounters its greatest successes, even though it is precisely this attempt which causes it to run the greatest risk of falling a prey to illusions.” [Einstein, Nature 146, 1940, page 605, as taken from van den Beukel, page 83]

 

Chaos Is A Bit Complex

As if Chaos Theory wasn’t enough for this evening, let’s move on to Complex Systems, or Complexity. Many ancient mythologies and religions dwell on the tension between order and disorder. So much so that our word “Chaos” comes from the name of the progenitor of all Greek gods, Kaos. The link between Chaos Theory and Complexity is the apparent emergence of order from chaos. Another way of expressing this is that some systems are self-organising, or negentropic, i.e. order emerges from them. For some period of time a negentropic system defies the second law of thermodynamics, which states that all processes tend towards greater disorder. In some fashion, a self-organising system gains energy from the wider environment and establishes or increases order. Biological systems such as slime and swarms, and weather systems such as tornadoes and hurricanes, are examples of this, at least until their death. Perhaps Adam Smith’s espousal of the “invisible hand” is a seminal example of recognising self-organising systems emerging from apparent chaos in human organisation.

Complexity is the multi-disciplinary study of systems from which order seems to emerge. Bees, a beehive; cells, a nervous system, a human being; a business unit, a company, a market, an economy – these all comprise complex networks of inter-related entities that interact and affect each other and seem to exhibit a deeper order, a group behaviour. Complexity asserts a commonality among these systems from which we may be able to derive common features and principles about how to treat these phenomena. Complexity studies tend to favour the modelling of complex systems using non-linear mathematics embedded in software. Complexity is frequently bound up with the study of information, imperfect information and the transmission of information among actors within the system. When Ikujiro Nonaka talks about new organisational orders emerging from business chaos, he notes that “the essence of self-organization is in the creation of information”.

[Slide: Dynamic Systems]

A critic might note that we are talking about “perceived complexity”, i.e. humans perceive complexity and humans have a tendency to find patterns in anything. Humans are pattern recognition machines, even when they’re looking at white noise. A Gresham Lecture by David Omand reminded me that Sir James Sutherland’s Crabtree Orations, a set of satirical academic commentaries attributed to the fictitious 18th century poet Joseph Crabtree, are believed to have inspired the great intelligence expert R V Jones to coin Crabtree’s Bludgeon: “no set of mutually inconsistent observations can exist for which some human intellect cannot conceive a coherent explanation, however contrived.” In short, humans have no natural tendency to sharpen their minds on Occam’s Razor. On the other hand, to paraphrase the old joke about paranoia, “Just because the brain sees pictures that aren’t there, doesn’t mean they don’t exist.” A critical mind might well note that some of the observations from the Complexity community could tend towards the vacuous or obvious – “life is complex” or “complexity is all around us”. On the other hand, Chaos Theory and Complexity have provided thrilling analogies for business, society and nature. While to date few practical tools have emerged, surely it is right to explore the boundaries between order and chaos in nature and see what we can learn.

Predictably Chaotic

Chaos Theory and Complexity frequently use similar computer models. This interest has often taken the form of modelling evolution, for instance as ‘vivisystems’, which in turn provokes thoughts of other living-system metaphors for markets. Of course, the search for models is also the search for the ability to predict. The desire for predictive capability is deep-rooted in people. Sherden devotes an entire book to examining the “second oldest profession”, prognosticators. Sherden estimates the market for prediction, “fortune sellers”, at over US$200 billion and then proceeds to demonstrate in most instances how poorly prognosticators perform against their own standards in areas as diverse as weather forecasting, economics, financial analysis, demographics, technology forecasting, futurology and corporate planning.

[Slide: Predictably Chaotic]

Paulos notes, “If a system as trivial as this single nonlinear equation [of population] can demonstrate such chaotic unpredictability, perhaps we shouldn’t be quite as assertive and dogmatic about the predicted effects of various social, economic, and ecological policies on the gigantic nonlinear systems that are the U.S.A. and Planet Earth.” Yet we keep trying to predict the unpredictable. That reminds me of an old Groucho Marx joke that Woody Allen recycled to explain the inevitability of amorous relationships: “Doctor, Doctor, my brother thinks he’s a chicken. Can you help?” “Why don’t you stop him?” “We need the eggs!” Our desire to predict the unpredictable is deeply human; we need the eggs.

Popper, however, doesn’t need any eggs at all: “There can be no prediction of the course of human history by scientific or any other rational methods … We must reject the possibility of social science that would correspond to theoretical physics.” Popper had a keen grasp of systems thinking’s potential applicability long before Complexity emerged as a field of study:

“There is no doubt that the analysis of any concrete social situation is made extremely difficult by its complexity…a complexity due to the fact that social life is a natural phenomenon that presupposes the mental life of individuals…which in its turn presupposes biology, which again presupposes chemistry and physics. The fact that sociology comes last in this hierarchy of sciences plainly shows us the tremendous complexity of the factors involved in social life.”

Popper is correct, there are big problems in treating economics and finance as physical systems, but Complexity’s adherents persist in noting that lessons from other dynamic systems are as applicable to human organisation as they are to nature and that these approaches can afford insights about the limits of prediction.

Many people conclude that strategy or policy can emerge from environmental and organisational chaos, but paradoxically that chaos can be controlled towards certain ends. Many physical or ecological systems have implicit feed-back. For instance, as predators’ prey multiply, predators multiply, forcing a decrease in the prey population, leading the predator population to decrease, leading the prey population to increase, and so on. But as with the heating system metaphor we saw earlier, in addition to feed-back, economic and financial systems exhibit feed-forward, i.e. people’s perceptions affect the probability of future events. If people change their perceptions of a risk, e.g. terrorism recently, then that perception change alters future behaviours, such as travel levels on public transport. I remember well working on some strategic planning for London rail transport in the mid-1980’s. Our extrapolations of employment in the financial services industry indicated that the number of people coming into the City of London’s financial centre would soon disastrously overwhelm the rail system at peak hours, particularly during the morning rush hour. The increasing growth of London as a global financial centre in the 1980’s, just before and after the Big Bang of 1986, meant that people were more integrated with global markets and needed to distribute their working hours to coincide with other markets. So, while in the event employment in financial services did rise, people stopped coming in at a very narrow peak period and started work both earlier and later, thus averting that catastrophe – though I’m hardly claiming that London’s transportation couldn’t be improved.
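The predator-prey feed-back loop just described can be sketched with the classic Lotka-Volterra equations in discrete form; the growth, predation and starvation rates below are arbitrary illustrative values, not fitted to anything.

```python
def predator_prey(steps=300, prey=40.0, pred=9.0,
                  grow=0.1, eat=0.005, starve=0.04, convert=0.00125):
    """Discrete Lotka-Volterra: prey grow, predators eat prey, and predators
    starve without them. Pure feed-back -- nothing here anticipates anything."""
    history = []
    for _ in range(steps):
        prey, pred = (max(prey + grow * prey - eat * prey * pred, 0.0),
                      max(pred + convert * prey * pred - starve * pred, 0.0))
        history.append((prey, pred))
    return history

history = predator_prey()
peak_prey = max(p for p, _ in history)
peak_pred = max(q for _, q in history)
print(f"prey peak {peak_prey:.0f}, predator peak {peak_pred:.0f}")
```

The two populations chase each other in cycles through feed-back alone; no agent in the model anticipates anything, which is precisely what separates this kind of system from a market full of feed-forward.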

 

10,000 Fed-Forward Maniacs

[Slide: 10,000 Fed-Forward Maniacs]

People wonder how financial market risk varies from natural risk such as earthquakes or hurricanes. To start, imagine we know that there is a hurricane coming to town. Imagine, too, that we run an insurance company that will have to pay out on damages. We do some calculations and, in order to reduce likely injuries, we announce the hurricane’s imminent arrival to everyone with an enjoinder to leave town. We tell people that they ought to go, particularly as the police and emergency services are also on their way out. Unfortunately, that leads to some people staying behind to loot, and we wind up paying much more for damage to property by people than we faced from damage by the hurricane. The feed-forward information had unintended consequences. There is a big difference between a natural risk and a human behaviour risk.

In my opinion, the key distinction is that financial markets incorporate people’s perceptions. Mark Twain puts these words in Huckleberry Finn’s mouth, “Hain’t we got all the fools in town on our side? And hain’t that a big enough majority in any town?” Many people’s perceptions matter to markets – and that leads to 10,000 Fat-Tailed Maniacs deciding the fates of most financial markets. Many people object: surely a share has an ‘intrinsic’ value?

I often describe the majority of share buying and selling as gambling against investor perceptions, not against some ‘intrinsic’ value (say the dividend return) of the share. When you buy a share, you bet against past investors prepared to sell to you today; when you sell a share, you bet against future investors prepared to buy today. But it’s not really that simple. A bet crystallises: my football team lost and yours won; I pay you. Shares are different. A traditional share may never ‘crystallise’. You might sell a share today whose ‘intrinsic’ value (if there is such a thing) is as good as yesterday’s, but you believe that future investors will not value it so highly. Indirectly, you realise that future investors may not value the share so highly because the future future investors to whom they will need to sell may not value it so highly, and so on – feed-forward on feed-forward. The problem is that you are betting on the future gambling propensities of future investors, not on today’s investors or today’s facts. This leads to an interesting distinction between the prices of financial instruments and other physical systems – the importance of your perception of other people’s perceptions today of yet other people’s perceptions tomorrow, and so on ad infinitum. As Jack A. Marshall noted, “nobody perceives anything with total accuracy”, so stochastic systems theory and feed-forward on feed-forward may well describe financial markets. A bit like my wife and me with our perceptions about each other’s heating cheating.

In fact, my firm Z/Yen has run a large number of fictional investment games over the years. These games originally began in order to help scientists break an all-too-common assumption that there was a correct price for a share. We wanted the scientists to abandon their Quixotic search for an objective valuation of their technologies using financial-mathematical modelling, and to appreciate the importance of human sentiment in share valuation. We wanted them to see the value of perceptions rather than rules. We made them play a fictional stock market game we called the Technology Bourse. You see before you the results of five share prices across 10 typical games: different shares did vastly better or worse in different games with exactly the same information provided to the scientists; only their perceptions differed. Even scientists found themselves unable to agree consistently on an ‘intrinsic’ value; the sentiments are the markets. On the other hand, the search for, and consensus upon, some methods of determining ‘intrinsic’ value can function as strange attractors. While the share price bounces around nearby, it may not leave the area altogether. It’s a bit like the 20ºC thermostat target: 20ºC is a good approximation of what’s likely to happen, unless everyone starts to believe that the heating will fail and turns up in snowsuits.

 

Fat Tails

[Slide: Leptokurtic Fat Tails]

I want to draw your attention to something statistically strange about these human games. Let’s go back to our heating system for a moment. If we plot the distribution of the feed-forward loop against a normal distribution, we see some odd results. Here’s one typical simulation from the many I ran. You can see that the feed-forward temperatures are not normally distributed; they deviate wildly above and below a normal distribution. Yes, feed-forward hits some numbers close to 20ºC with amazing frequency, but it also winds up wildly off. Interlinking feed-forward actually exacerbates distributions, leading to abnormally large numbers of events at both ends of the predicted normal distribution. In the financial markets, a larger than expected number of events at both ends of a distribution is called “having Fat Tails”. If you want to go home with an impressive word tonight, unlike Mary Poppins I’ll leave you with something much shorter than “supercalifragilisticexpialidocious”: that word is “kurtosis”. Kurtosis describes the ‘peakedness’ of a distribution, informally the volatility of volatility. If a distribution has a higher central peak and some “Fat Tails”, typical of financial markets, then it is described as ‘leptokurtic’. So next time you look at some graphs of your bank balance and notice the fat-tailed highs and lows, you can muse rather pompously out loud, “hmmm, I wonder if these are leptokurtic?” However, one other thing I learned from the heating experiment with my wife: no matter how fancy it sounds, even in jest, don’t even think about calling your wife’s backside “leptokurtic”.
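For the statistically curious, excess kurtosis is easy to compute. The sketch below uses my own illustrative numbers, not the heating data: it compares a plain normal sample with a ‘volatility of volatility’ sample that is usually calm but occasionally wild, which is one simple way fat tails arise.

```python
import random
import statistics

def excess_kurtosis(xs):
    """Sample excess kurtosis: roughly 0 for a normal distribution,
    positive ('leptokurtic') when the peak is high and the tails fat."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    return sum(((x - m) / s) ** 4 for x in xs) / len(xs) - 3.0

random.seed(7)
N = 100_000
normal = [random.gauss(0, 1) for _ in range(N)]
# usually calm (sigma = 1), occasionally wild (sigma = 5)
mixed = [random.gauss(0, 5 if random.random() < 0.05 else 1) for _ in range(N)]

print(f"normal sample: {excess_kurtosis(normal):+.2f}")  # near zero
print(f"mixed sample:  {excess_kurtosis(mixed):+.2f}")   # strongly leptokurtic
```

The mixed sample has a sharper central peak and far more extreme values than its overall standard deviation would suggest, which is exactly the fat-tailed shape described above.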

These leptokurtic distributions are increasingly important. I believe that the feed-forward loop of people’s perceptions might well explain the non-normal distributions we encounter in finance. Financial markets are growing more accustomed to so-called ‘3-plus standard deviation’ events. Under a normal distribution, an event more than three standard deviations from the mean should occur only about once in every 370 events, the ‘once in 300 years’ of popular shorthand. A tsunami in South East Asia, hurricanes in North America, earthquakes in Pakistan: disasters seem more common. Actually, natural disasters probably aren’t more common, though they are more widely reported and more financially important. Financial analysts may moan, but markets aren’t failing because of these disasters. Take hurricanes. Globally, experts believe that hurricanes are probably within the levels of historical normalcy. Hurricane intensity may have increased slightly, though there are some uncertainties, and there is no solid evidence that global warming is, as yet, affecting hurricanes. What is certain is that financial damage has increased, though this can be almost wholly attributed to greater population, greater wealth and greater insurance coverage in the areas at risk. In other words, so far it is people that have caused the risk to increase, not nature, not hurricanes, not global warming. Naturally, while hurricanes may follow typical natural distributions, the financial impact of hurricanes exhibits a leptokurtic distribution.

[Slide: Once In 300 Years]

Non-normal distributions cause large problems for financial analysts. Analyzing non-normal distributions is hard work. First, you have to recognize that you are dealing with non-normal distributions; then you have to deploy more sophisticated mathematics. Please be careful about what I’m saying. There are many natural systems with non-normal distributions; I’m just pointing out that human systems are all too frequently ab-normal. If systems with feed-forward, that is systems with people, are typically non-normal, then we may be looking at the root cause of our apparent increase in 1 in 300 year events, events well beyond the third standard deviation. That root cause is people. We are confusing two types of events: natural events that typically follow a normal distribution, and human systems that will frequently have leptokurtic distributions, exacerbated by feed-forward on feed-forward.
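The ‘once in 370’ arithmetic can be checked directly. In the sketch below the first figure is exact normal theory; the fat-tailed sample is again an invented mixture, standing in for a human system, and it breaches the same three-standard-deviation threshold far more often than normal theory allows.

```python
import math
import random

# two-tailed probability of exceeding 3 standard deviations under normality
p_3sigma = math.erfc(3 / math.sqrt(2))
print(f"normal theory: 1 in {1 / p_3sigma:.0f}")

random.seed(11)
N = 200_000
# mostly calm, occasionally excitable: a stand-in for a human system
sample = [random.gauss(0, 4 if random.random() < 0.03 else 1) for _ in range(N)]
mean = sum(sample) / N
sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / N)
hits = sum(1 for x in sample if abs(x - mean) > 3 * sd)
print(f"observed here: 1 in {N / hits:.0f}")  # far more often than 1 in 370
```

The fat-tailed sample’s own standard deviation is inflated by the wild spells, and even so the 3-sigma events arrive many times more often than a normal model would predict: the apparent surfeit of ‘1 in 300 year’ events.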

And it’s not just heating. We can look at a number of financial failures and see that what failed wasn’t the firm, but people’s perceptions of other people’s perceptions about the firm. The huge runs on banks in the USA during the 1930s Depression were caused by people believing that other people would withdraw their money first, thus leading them to withdraw their own money and banks to fail. Arthur Andersen’s involvement in Enron and other auditing disasters led people to realise that other people would no longer respect it as an auditor. Future audit clients and future auditors willing to work for Arthur Andersen evaporated, forcing the firm to close. More recently, Refco may be an example of a firm that lost out because of people’s perceptions. We often attribute these disasters to a lack of trust. I don’t disagree, but I think there is a more general point here too. Financial markets are about perceptions, and the perceptions are the reality.

This leads us to my last topic tonight, financial bubbles.

 

Bubble, Bubble, Toil and Trouble

[Slide: Bubbles]

Surowiecki, in his fascinating book, The Wisdom of Crowds, shows that large numbers of people can be very good at arriving at correct answers to complex problems, but that to do so four conditions must typically be met:


diversity of opinion – each person should have some private information, even if it’s just an eccentric interpretation of the known facts;
independence – people’s opinions are not determined by the opinions of people around them [a subtle point];
decentralization – people are able to specialize and draw on local knowledge;
and aggregation – some mechanism exists for turning private judgments into a collective decision.

 

Markets clearly meet three of the conditions handily: diversity of opinion, decentralization and aggregation. However, markets do not wholly fulfill Surowiecki’s second condition, independence. As someone once remarked, “when you give a bad weather forecast predicting rain tomorrow, it doesn’t make rain tomorrow more likely.” But when you give a bad forecast predicting a share will fall tomorrow, you do make that fall more likely. People add a very strong feed-forward loop to financial systems. ‘Talking’ a market up or down frequently moves it up or down.
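A toy model of that broken independence, with every parameter invented for illustration: a pundit predicts a fall every day; some fraction of 100 potential sellers believes and sells, while 100 potential buyers simply toss a coin; net order flow moves the price. Once enough people believe the forecast, the forecast makes itself true.

```python
import random

def closing_price(belief, steps=100, seed=3):
    """Each day, 100 potential sellers heed a 'down' forecast with
    probability `belief`; 100 potential buyers buy with probability 0.5.
    Net order flow moves the price by 0.1% per unit of imbalance."""
    random.seed(seed)
    price = 100.0
    for _ in range(steps):
        sellers = sum(random.random() < belief for _ in range(100))
        buyers = sum(random.random() < 0.5 for _ in range(100))
        price *= 1 + 0.001 * (buyers - sellers)
    return price

print(f"forecast ignored:  {closing_price(belief=0.5):.1f}")  # drifts near 100
print(f"forecast believed: {closing_price(belief=0.7):.1f}")  # the fall arrives
```

At `belief=0.5` the forecast has no bite and the price wanders around its starting level; at `belief=0.7` the prediction of a fall produces one, a self-fulfilling forecast rather than a self-standing fact.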

Now financial bubbles are a huge topic, worth an entire lecture, but I wish to point out that they are partly due to our lack of independence of opinion. In many cases we follow what others tell us to follow. Contrarian investors are people who bet against the herd. Contrarians count on bubbles. Contrarians believe, to paraphrase Keynes, that “in the long run, we are all wrong”. You can imagine the funny stories in England during the Dutch tulip bubble, the tulipmania of the 1630s. “Hey, those crazy Dutch, you’ll never believe what they think a tulip is worth. It’ll never catch on here.” And apparently, for the most part, it didn’t. Nevertheless, the UK went on to produce a few good, home-grown manias over the centuries, such as the South Sea Bubble, until we get to the 1990s. “Hey, those crazy Californians, you’ll never believe what they think a computer company is worth. It’ll never catch on here.” This time it did, and arguably Europeans were the bigger suckers, as we didn’t even see the bubble when it was well advanced. As we are just emerging from the biggest financial bubble in history, the Internet Bubble, yes the biggest, this is a time to realize that, as Herbert Simon said, “What information consumes is rather obvious: it consumes the attention of its recipients.” Global information leads to global perceptions and therefore to global bubbles.

So what might we learn from all this? Well, two things stand out:


realize that all information is only a guess, a stochastic stab at what the underlying reality might be. You begin to realize that the world of Commerce is one of nested sets of stochastic systems, based on people’s perceptions, not rules of ‘intrinsic’ value;
start looking for feed-forward and you’ll find it almost everywhere. When fashion people tell you that “black is this year’s new black”, realize that they are trying to create momentum towards buying their large stocks of dark materials. When people tell you about property prices, remember that feed-forward can ensure that a bubble is very hard to spot while it forms, but you’ll almost always hear the ‘pop’, because you’re highly likely to be in the middle of it. Business is not immune to fashion. To quote Professor Colin Haslam last week, “In recent years media and consultancy reports discuss the threat of off-shoring and how it might reduce employment levels in the City of London. Company cases are used to demonstrate the benefits of off-shoring and extrapolations employed to amplify trends.” Also, a bit like fashion and dress lengths coming round every few years, remember that all markets have their favourite, recurring, no-lose investments; whether it’s tire-recycling in the waste markets, ultra-efficient engines in the environmental markets or funds-of-funds in the financial markets, the same stories come around with alarming frequency. One day they’ll be at the top of the wheel, just not this round.

 

In conclusion, I’d like to consider the future misbehaviour of markets. More and more people are joining the global economy. If perception is everything, then as more and more people come into the global economy, we must understand their perceptions. These are people who are likely to add diversity and opinions that we don’t know or understand. Moisés Naim reminds us:

“Statistically, a “normal” human being in today’s world is poor, lives in oppressive physical, social, and political conditions, and is ruled by unresponsive and corrupt government. But normalcy is not only defined by statistics. Normal implies something that is “usual, typical, or expected.” Therefore, normal is not only what is statistically most frequent but also what others assume it to be. In this sense, the expectations of a tiny minority trump the realities of the vast majority. There is an enormous gap between what average citizens in advanced Western democracies – and the richer elites everywhere – assume is or should be normal, and the daily realities faced by the overwhelming majority of people. Information about the dire conditions common in poor countries is plentiful and widely discussed. Curiously, however, expectations about what it means to be normal in today’s world continue to reflect the abnormal reality of a few rich countries rather than the global norm.”

Cultural dissonance often starts with unexamined assumptions about other cultures. To paraphrase the Conservative Party slogan in the last election, “you’re not thinking what I’m not thinking”. Unusually, this mutual ignorance may be a very good thing. Feed-forward on feed-forward is less likely to start when we can’t work out what the other person is thinking. The greater the diversity of opinion, the more likely the herd is to reach the right answer. The paradox we face today is how to ensure that global information does not overwhelm the diversity of local opinion, leading us all to misbehave.

[Slide: Discussion]

Thank you.

 

Further Discussion


Can you suggest a market with no feed-forward?
Is some physical phenomenon a good analogue for markets?
How can we share information and retain diverse opinion?

 

 

Further Reading


DE GRAUWE, Paul and VANSENTEN, Kris, “Deterministic Chaos in the Foreign Exchange Market”, Center for Economic Policy Research, Discussion Paper Number 370, January 1990.
GLEICK, James, Chaos: Making a New Science, William Heinemann, 1988.
JENKINS, Gwilym M., “The Systems Approach”, Journal of Systems Engineering, 1, 1, 1969 (also reprinted in Open Systems Group, Systems Behaviour, 1972).
KELLY, Kevin, Out of Control: The New Biology of Machines, Fourth Estate, 1994.
KINDLEBERGER, Charles P., Manias, Panics, and Crashes: A History of Financial Crises, John Wiley & Sons, 1978.
MANDELBROT, Benoit B., The Fractal Geometry of Nature, W. H. Freeman and Company, New York, 1977 (1983 ed).
MANDELBROT, Benoit B. and HUDSON, Richard L., The (mis)Behaviour of Markets: A Fractal View of Risk, Ruin and Reward, Profile Books Ltd, 2004.
NAIM, Moisés, “Dangerously Unique”, Foreign Policy, 21 September 2005.
PAULOS, John Allen, Beyond Numeracy, Penguin Books, 1991.
PETERS, Edgar E., Chaos and Order in the Capital Markets: A New View of Cycles, Prices, and Market Volatility, John Wiley & Sons, 1991.
POPPER, Karl R., The Poverty of Historicism, Beacon Press, 1957.
SHERDEN, William A., The Fortune Sellers, John Wiley & Sons, 1998.
SUROWIECKI, James, The Wisdom of Crowds: Why the Many Are Smarter Than the Few, Little, Brown, 2004.
VAN DEN BEUKEL, A., More Things in Heaven and Earth: God and the Scientists, SCM Press Ltd, 1991.

 

 

Further Surfing

There are numerous sites on fractals, Chaos Theory and Complexity, so Googling is easy, but some suggestions include:


for the source:

http://www.math.yale.edu/mandelbrot/


for an accessible article by Mandelbrot on fractals and finance:

http://www.elliottwave.com/education/SciAmerican/Mandelbrot_Article2.htm


“nothing new under the sun” for the probable, similar, earlier contribution of Ralph Nelson Elliott of “Elliott Wave” financial fame to fractal financial analysis in the 1930’s:

http://www.elliottwave.com/mandelbrot-bobsletter.htm


for the only-slightly-mathematically-inclined:

http://www.olympus.net/personal/dewey/mandelbrot.html


for the do-it-yourself there is Fractint, a popular free program, amongst others:

http://spanky.triumf.ca/www/fractint/fractint.html


The International Society for the Systems Sciences - www.isss.org.
The Santa Fe Institute, founded in 1984, is widely held to be the first centre for the “science of Complexity” - www.santafe.edu

On hurricanes, their intensity and their frequency:


http://wind.mit.edu/~emanuel/anthro2.htm
http://www.unep.ch/ipcc/

 

 

Thanks

My thanks for help with preparing this lecture go to Roberto Buiza, Linda Cook, Ian Harris and Mary O’Callaghan, as well as Benoit Mandelbrot for inspiring many of the ideas (and indirectly the title of the lecture) since I was first exposed to the wonder of fractals in 1977. I must also thank my co-experimenter on heating systems, Elisabeth Mainelli, for her contributions to Commerce research, and my father for his “never-ending” stories.

 

 

© Professor Michael Mainelli, Gresham College, 14 November 2005

 
