I was invited to speak on Friday at the XBRL Risk Governance Forum, hosted by the IBM Data Governance Council. I know that at this point most everyone will be tempted to yawn and stop reading. Don’t. Within the work of this Forum are the seeds of reducing the risk of future market crises. Indeed, it could be the foundation for a quantum leap in risk management.
To explain why, let me start by going through the dynamics of market crises. A market crisis occurs when there are highly leveraged investors in a market that is under stress. These investors are forced to sell to meet their margin requirements. Their selling drops prices further – especially because the market was under stress to begin with. So you get a cascade down in the price of that market. A shock that might have initially led to only a five percent drop gets amplified, and the market might drop multiples of that. We have seen this in various guises in the current crisis, from the banks' 'toxic waste', to the downward spiral in housing prices, to the deleveraging of the carry trade, to the quant fund crisis in August 2007.
And the dynamic gets worse. Many of those under pressure to liquidate will discover they no longer can sell in the market that is under stress. If they can’t sell what they want to sell, they sell whatever else they can. So now they move to a second market where they have exposure and start selling there. If many of those who are in the first market also are in the second one, and if the investors in that market are also leveraged, then we see the contagion occur.
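The cascade and contagion just described can be sketched as a toy simulation. Everything below is an illustrative assumption on my part, not a calibrated model: the leverage target, the market depths, and the linear price-impact rule are invented numbers chosen only to show the mechanism.

```python
# Toy sketch of a margin-call cascade spilling into a second market.
# All parameters are invented for illustration; none are calibrated.

def margin_cascade(shock, equity=20.0, target_lev=5.0,
                   depth_a=10.0, depth_b=100.0, impact=0.2, rounds=50):
    """Forced deleveraging after a shock to market A.

    A leveraged investor holds markets A and B, half the book in each.
    The shock pushes leverage above target, forcing sales. The stressed
    market A can absorb only `depth_a` dollars per round, so the overflow
    must be sold in market B, spreading the crisis there. Returns the
    final prices of A and B (both start at 1.0).
    """
    units_a = units_b = equity * target_lev / 2.0   # half the book in each
    price_a, price_b = 1.0 - shock, 1.0             # shock hits market A only
    debt = equity * (target_lev - 1.0)              # fixed borrowing
    for _ in range(rounds):
        assets = units_a * price_a + units_b * price_b
        eq = assets - debt
        if eq <= 0:                                 # insolvent; fire sale ends
            break
        excess = assets - target_lev * eq           # dollars that must be sold
        if excess < 1e-9:
            break                                   # back at target leverage
        sell_a = min(excess, depth_a, units_a * price_a)
        sell_b = min(excess - sell_a, units_b * price_b)
        units_a -= sell_a / price_a
        units_b -= sell_b / price_b
        debt -= sell_a + sell_b                     # sale proceeds repay debt
        price_a *= 1.0 - impact * sell_a / depth_a  # linear price impact
        price_b *= 1.0 - impact * sell_b / depth_b
    return price_a, price_b
```

Run it with a modest shock to market A alone and the endpoint shows both dynamics at once: market A ends up down a multiple of the initial shock, and market B, which was never shocked directly, falls too, purely because the leveraged holder of A also held B.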
Here are two examples of what I am talking about.
Example one is LTCM. The proximate cause of LTCM’s demise was the Russian default in August 1998. But LTCM was not highly exposed to Russia. A reasonable risk manager, aware of the Russian risks, might not have viewed it as critical to the firm. So how did it hurt them? It hurt them because many of those who did have high leverage in Russia also had positions in other markets where LTCM was leveraged. When the Russian debt markets failed and these investors had to come up with capital, they looked around and sold their positions in, among other things, Danish mortgage bonds. So the Danish mortgage bond market went into a tailspin, and because LTCM was big in that market, it took LTCM with it.
Example two is what happened with the Hunt silver bubble. When the bubble burst in 1980, guess which market ended up correlated almost one-to-one with silver: cattle. Why? Because the Hunts had to come up with margin for their silver positions, and they happened to have large holdings of cattle that they could liquidate.
Could we have ever anticipated beforehand that we would see a huge, correlated drop in both Russian MinFins and Danish mortgage bonds? Or in silver and cattle? There is no way these dynamics can be uncovered with conventional, historically based VaR-type analysis. The historical return data do not tell us much, if anything, about leverage, crowding, and linkages based on position holdings.
This is not to say VaR is without value. Everyone involved in risk management understands the limitations of VaR, what it can and cannot do. It is sometimes attacked as a straw man, criticized for failing to do things it was never designed to do and cannot do, such as assess these sorts of liquidity crisis events and the cascade of correlations that result.
But the proper use of markup languages along the lines of XBRL can give us the data we need to address market crises as they start to form. What we must do is have a regulator that extracts the relevant data – in this case position and leverage data – from major investment entities. These would include, as a start, the large banks and largest hedge funds. With assurances of data security – the data would not be revealed beyond the regulator – a government risk manager would then be able to know what currently cannot be known: where is there crowding in the markets? Where are there ‘hot spots’ of high leverage? What linkages exist in the event of a crisis based on the positions these investors hold?
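To make this concrete, here is a minimal sketch of what a regulator could compute once filings arrive in a consistent, tagged form. The record schema, the instrument tags, and the thresholds are all invented for illustration; real XBRL filings would carry far richer structure.

```python
# Hypothetical sketch: finding crowding and leverage 'hot spots' from
# consistently tagged position filings. Schema and thresholds are invented.

from collections import defaultdict

filings = [  # (firm, instrument tag, position in $mm, firm leverage)
    ("Fund A", "DK:MBS",    400, 25.0),
    ("Fund B", "DK:MBS",    350, 30.0),
    ("Bank C", "DK:MBS",     50,  8.0),
    ("Fund A", "RU:MINFIN", 300, 25.0),
    ("Bank C", "US:TSY",    900,  8.0),
]

def hot_spots(filings, min_crowding=500, min_leverage=15.0):
    """Flag instruments where large positions crowd together and most of
    the exposure is held by highly leveraged firms."""
    total = defaultdict(float)    # total $ exposure per instrument tag
    levered = defaultdict(float)  # exposure held by high-leverage firms
    for firm, tag, size, lev in filings:
        total[tag] += size
        if lev >= min_leverage:
            levered[tag] += size
    return sorted(tag for tag in total
                  if total[tag] >= min_crowding
                  and levered[tag] / total[tag] > 0.5)
```

In this made-up data, the Danish mortgage position is flagged: it is both crowded and held mostly by leveraged funds, exactly the configuration that turned the LTCM episode into a spiral. The Treasury position is just as large but unleveraged, so it is not flagged. No single firm's filing reveals this; only the aggregate does.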
For these reasons, the first recommendation in both my Senate and House testimony was “get the data”. How can we do that? Well, first, by legislation requiring investment firms – including large hedge funds – to provide the data. Then by the proper application of a markup language so it can be done in a consistent, aggregatable way.
To give an analogy for this, one that came out in the conference and that illustrates how far behind we are in financial markets, a markup language for risk would do for financial products what bar codes already do for real products. If we discover a problem with peanuts being processed in some factory, we can use the bar codes to know where each product containing those peanuts is in the supply chain, all the way down to the grocery store shelf.
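The recall analogy can be sketched directly. Suppose (and this is an invented schema, with made-up portfolio names and tags) that every product carries a tag and every structured product declares the tags of its underlyings. Then tracing exposure to a troubled instrument is a simple lookup, just like tracing tainted peanuts through the supply chain.

```python
# Hedged sketch: the 'peanut recall' done for securities, using invented
# tags. Real tagging (e.g., via XBRL) would be far richer than this.

holdings = {                    # portfolio -> instrument tags it holds
    "Pension X": ["CDO:ALPHA", "US:TSY"],
    "Fund Y":    ["SUBPRIME:POOL7"],
    "Bank Z":    ["CDO:BETA"],
}
contains = {                    # structured product -> underlying tags
    "CDO:ALPHA": ["SUBPRIME:POOL7", "PRIME:POOL2"],
    "CDO:BETA":  ["PRIME:POOL2"],
}

def exposed_to(tag):
    """Return every portfolio holding `tag`, either directly or packaged
    inside a structured product (one level of nesting, for simplicity)."""
    out = set()
    for name, tags in holdings.items():
        for t in tags:
            if t == tag or tag in contains.get(t, ()):
                out.add(name)
    return sorted(out)
```

Here a query for the subprime pool turns up not just its direct holder but also the pension fund that holds it only through a CDO. A real version would recurse through arbitrarily nested structures, but the point stands: with consistent tags the trace is mechanical; without them it is guesswork.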
Having the proper tags – the proper bar code, if you will – for financial products, ranging from bonds and equities to structured products and swaps, will allow us to understand the potential for crisis events and systemic risk. It will help us anticipate the course of a systemic shock. It will identify cases where many investors might be acting prudently, but where their aggregate positions lead to a level of risk which they on their own cannot see. It also will give us the means to evaluate crises after the fact. Just as the NTSB can use black box information to help improve the airline industry by evaluating the causes of an airline accident, this position and leverage data will act as the black box data to help us understand how a crisis started, and, coupled with interviews of the key participants, help us understand what we need to do to improve the safety of the markets.
February 28, 2009
Rick Bookstaber
Mapping the Market Genome