Better Risk Management Means Greater Chance of Catastrophe
John Mauldin is a consultant who matches up accredited investors with money managers (hedge funds). As a sideline he also publishes a weekly letter on a variety of topics, typically covering ways to look at the economy, investing, and the world stage. If you can ignore the obligatory self-promotion, his insights are often quite thought-provoking.
Earlier this year he published a letter that provided one of the best metaphors for understanding market risk I have seen. The punchline is that the better we get at modeling our world to hedge its risks, the greater the chance of catastrophic failure when our model turns out to be wrong. Why does this happen? Stability sows the seeds of instability, and people tend to keep their risk exposure constant. If improved technology or understanding reduces risk, people will often leverage back up until their risk returns to its previous level.
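To make that leverage dynamic concrete, here is a minimal sketch, using my own illustrative numbers rather than anything from Mauldin's letter: an investor targets a constant level of portfolio volatility, so as measured risk falls, leverage rises, and a shock the model never priced in hits that much harder.

```python
# A minimal sketch of "risk homeostasis": the investor targets a constant
# portfolio volatility, so whenever measured risk falls, leverage rises.
# All numbers here are illustrative assumptions, not figures from the letter.

TARGET_VOL = 0.20  # the investor wants 20% annualized portfolio volatility

def leverage_for(measured_vol: float) -> float:
    """Choose leverage so that leverage * measured_vol == TARGET_VOL."""
    return TARGET_VOL / measured_vol

for measured_vol in (0.20, 0.10, 0.05):   # better hedging appears to cut risk
    lev = leverage_for(measured_vol)
    unmodeled_shock = -0.15                # a move the risk model never priced in
    portfolio_loss = lev * unmodeled_shock
    print(f"measured vol {measured_vol:.0%}: leverage {lev:.1f}x, "
          f"loss in a -15% shock = {portfolio_loss:.0%}")
```

With measured volatility cut from 20% to 5%, leverage quadruples, and the same unmodeled 15% shock becomes a 60% portfolio loss.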
You can see this happening in the investment world with the explosive growth of hedge funds and derivatives. This story, with the lead “The derivatives market has soared, reaching nearly $300 trillion in value. Considering the total value of the stock and bond markets combined amounts to only $65 trillion…”, should give one pause about what new risks exist in our highly hedged environment that traditional investment models don’t capture.
The recent blowup of Amaranth demonstrates how hedge funds can make huge bets that win most of the time (so it looks like they have skill), but when they lose, the losses are catastrophic enough to undo all the wins and more. Unfortunately, unless you understand exactly what a hedge fund is doing, you will never be able to tell whether it is playing this game from the high-level strategy description and simple return numbers it provides potential investors.
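Here is a hedged sketch of that payoff pattern, simulated with made-up numbers (a 2% gain in 95% of months, a 50% loss otherwise) rather than Amaranth's actual positions. The expected monthly return is negative, yet a meaningful fraction of three-year track records show no losing month at all.

```python
# A sketch of a negatively skewed strategy: collect a small premium most
# months, occasionally blow up. The probabilities and sizes are invented.
import random

random.seed(7)

def month_return() -> float:
    # 95% of months: +2%; 5% of months: -50%.
    return 0.02 if random.random() < 0.95 else -0.50

# Expected monthly return: 0.95 * 0.02 + 0.05 * (-0.50) = -0.6% per month,
# yet many short track records never show the loss.
trials = 10_000
clean_records = sum(
    1 for _ in range(trials)
    if all(month_return() > 0 for _ in range(36))  # 3 years, no losing month
)
print(f"spotless 3-year track records: {clean_records / trials:.0%}")
# Roughly 0.95 ** 36, i.e. about 16% of simulated managers look like stars.
```

Roughly one in six simulated managers of this losing strategy ends up with a spotless three-year record, which is exactly the kind of track record that gets marketed as skill.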
There is also a cautionary tale here for those who rely too heavily on the success percentages of Monte Carlo simulations. While the success percentage is clearly important, a deep understanding of the downside risk (i.e., how bad things can get when we do fail) is just as important.
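As an illustration, consider a toy Monte Carlo comparison using hypothetical plans and numbers of my own invention: both plans succeed about 90% of the time, but the average shortfall when they fail differs by a factor of eight.

```python
# Two hypothetical plans with the same Monte Carlo success rate (~90%) but
# very different downside severity when they fail. All inputs are invented.
import random

random.seed(42)

def simulate(loss_when_failing, trials=100_000):
    """Return (success rate, average shortfall conditional on failure)."""
    shortfalls = []
    for _ in range(trials):
        if random.random() < 0.10:                       # ~10% of paths fail
            shortfalls.append(loss_when_failing * random.uniform(0.5, 1.5))
    success_rate = 1 - len(shortfalls) / trials
    avg_shortfall = sum(shortfalls) / len(shortfalls)
    return success_rate, avg_shortfall

for label, loss in (("plan A", 0.10), ("plan B", 0.80)):
    rate, shortfall = simulate(loss)
    print(f"{label}: success {rate:.0%}, "
          f"average shortfall when it fails {shortfall:.0%}")
```

Both plans report the same headline success rate, but one fails gracefully while the other fails catastrophically, and the single percentage hides that difference entirely.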