Monday, December 22, 2008

Financial instruments, systems analysis and decision management

I have been reading a little about what is being presented as the causes of the current worldwide financial crisis. Nothing really surprising in the descriptions, of course: we’ve all heard or read these experts who failed to predict anything going into great detail to explain a posteriori what happened, and providing supposedly valuable insight into what is coming.

What really amazes me is how each branch of human activity likes living in its own ivory tower. Each industry learns its own lessons from its own failures, or attempts to, but rarely does one industry draw lessons from another industry’s failures. I am certain some of that has to do with the fact that industries are frequently out of sync in their cycles, but some also comes from a lack of abstraction and high-level view of the nature of the industry, its processes and its decisions.

A key factor in this mess is simply the fact that the financial industry created all sorts of financial instruments working at ever higher levels of abstraction, more and more removed from the reality of the underlying physical or legal entities, without really understanding their behavior as systems.

Derivatives, derivatives on derivatives, swaps, etc.: sure, each and every one of them has a solid – and here, sorry for my friends in that space, I will qualify solid as “seemingly” solid – mathematical foundation. Sure, the bright mathematicians and physicists Wall Street hired by the planeload out of Europe (Russia, France, etc.) are very capable in their domain. They can understand business problems, abstract them into sophisticated models, even understand the conditions under which these models apply, provide all sorts of sensitivity analyses to take potential environmental changes into account, etc. I will not question that.
But I will say that there is no evidence whatsoever that the financial industry applied – or applies – serious systems analysis to the combination of these instruments as they are actually used. The obvious aspects that systems analysts pay attention to have been totally ignored: no real analysis of applicability, of error and exception propagation, of exception management, etc.
As a consequence, each model may very well be well understood in isolation, inducing a false sense of security that the whole is under control.
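To make the point concrete, here is a minimal sketch in Python (the model names, formulas and numbers are entirely invented for illustration) of how two individually “correct” models, composed without checking joint applicability or propagating exceptions, quietly produce confident-looking nonsense:

# A minimal sketch (hypothetical names and numbers) of the composition problem:
# each model knows its own validity domain, but the pipeline that chains them
# swallows exceptions instead of surfacing them.

class ModelError(Exception):
    """Raised when a model is used outside its validity domain."""

def price_mortgage_pool(default_rate: float) -> float:
    # Calibrated, say, for default rates below 5%; outside that range the formula is meaningless.
    if default_rate > 0.05:
        raise ModelError("default rate outside calibration range")
    return 100.0 * (1.0 - 10.0 * default_rate)

def price_cdo_tranche(pool_price: float) -> float:
    # A second model layered on the first, with its own implicit assumptions.
    return max(0.0, pool_price - 60.0) * 1.5

def naive_pipeline(default_rate: float) -> float:
    # The anti-pattern: the exception is swallowed and replaced by a "reasonable" default,
    # so the error silently propagates into every downstream valuation.
    try:
        pool = price_mortgage_pool(default_rate)
    except ModelError:
        pool = 100.0  # pretend nothing happened
    return price_cdo_tranche(pool)

print(naive_pipeline(0.02))   # within the validity domain: fine
print(naive_pipeline(0.12))   # outside it: still returns a confident-looking number

Nothing in this toy pipeline is mathematically wrong at the component level; the failure lives entirely in the interface and in the exception handling.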

When systems break, they most often break at the interface between components (the famous Mars Climate Orbiter failure: http://mars.jpl.nasa.gov/msp98/news/mco990930.html), or in exception handling (the famous Ariane rocket explosion: read the report from one of my teachers, Jacques-Louis Lions, http://esamultimedia.esa.int/docs/esa-x-1819eng.pdf), or in the actual decision making, in particular when it involves humans (see http://architectguy.blogspot.com/2008/12/precision-speed-and-mistakes.html, and this crisis).
The financial industry – driven by its purely speculative focus, and totally blind to the fact that it no longer understands its own instruments – failed and continues to fail on all three counts.
For all their intellect and knowledge, the Nobel Prize-winning creators of LTCM failed. The industry around it failed more than a decade ago, but little was learned from that episode. It went back to the same (maybe less obvious) hubris and disregard for proper risk management. With the same consequences…

Frequently enough, these failures have simple root causes, overlooked only because of the overall complexity of the system, and incredibly cheap to fix compared to the huge cost of the failures themselves. Imagine if the verification systems of Société Générale had actually worked (http://en.wikipedia.org/wiki/January_2008_Soci%C3%A9t%C3%A9_G%C3%A9n%C3%A9rale_trading_loss_incident): Kerviel would be unknown, and the company would not have lost billions of dollars (and even that loss was contained only thanks to the intervention of the company’s crack team and help from the French government).

Other industries have learned these lessons and have built into their practices both the simplification of components and the management of whole systems, to the point where, when a black swan does appear, the consequences can be assessed and remedial action taken. The financial industry should do the same.

Decision Management can help, as it increases its focus on scenario-based risk assessment. Modeling decisions, and managing around them the scenarios that enable systems analysis of the business processes, should allow much better understanding and control of the consequences of those decisions under changing conditions. A lot of the energy spent building predictive models on top of well-bounded simplifications should be shifted towards modeling the decisions that are made, the business outcomes, their dependencies on the environment, and managing vast and continually enriched portfolios of scenarios around them to keep exposure to black swans in check.
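As a rough illustration of what I mean by managing a portfolio of scenarios around a decision, here is a small Python sketch (the decision, scenarios, probabilities and dollar figures are all made up): instead of optimizing against a single predictive estimate, the decision is evaluated against an explicitly maintained set of scenarios, including the unlikely but consequence-heavy ones.

# A minimal sketch of scenario-based decision assessment (invented data).
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    probability: float   # rough, subjective weight
    outcome: float       # business outcome of the decision under this scenario ($M)

def assess(decision: str, scenarios: list[Scenario]) -> None:
    # Report both the expected outcome and the worst case; the tail is the point.
    expected = sum(s.probability * s.outcome for s in scenarios)
    worst = min(scenarios, key=lambda s: s.outcome)
    print(f"{decision}: expected outcome {expected:+.1f}M, "
          f"worst case '{worst.name}' at {worst.outcome:+.1f}M")

# The portfolio should be continually enriched as conditions change.
scenarios = [
    Scenario("benign market", 0.90, +12.0),
    Scenario("mild downturn", 0.09, -5.0),
    Scenario("credit markets freeze", 0.01, -400.0),  # the black-swan-like entry
]

assess("expand leveraged mortgage book", scenarios)

The value is not in the arithmetic, which is trivial, but in forcing the extreme scenarios to be written down, kept up to date, and looked at every time the decision is revisited.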

This will require discipline, as it basically means analyzing highly unlikely but consequence-heavy scenarios at times when things seem to be going well.
But I believe it will be needed.

The Decision Management discipline can help; leverage it.

On a side note:
- Read Nassim Taleb (http://en.wikipedia.org/wiki/Nassim_Nicholas_Taleb).
- Watch him discuss the crisis on TV (http://paul.kedrosky.com/archives/2008/10/12/nassim_taleb_ge.html)
- Watch him and Benoit Mandelbrot (http://en.wikipedia.org/wiki/Benoit_Mandelbrot) debate the crisis: http://uk.youtube.com/watch?v=DLFkQdiXPbo&NR=1
