The Devil Is In The Interactions

Financial markets are growing more tightly interconnected, resulting in more interdependencies. While this creates greater risk of a major catastrophe, it also provides opportunities for actuaries.
By Michael A. Ekhaus

This article introduces the reader to probabilistic methods used to model interactions. Three topics are discussed: two concern modeling methods and the third concerns data analysis methods.

Twenty years after the largest one-day drop in U.S. stock market history, and with the subprime mortgage crunch currently causing ripples in the market, many are wondering whether these events could trigger a cascade of market events resulting in a crash. While this news might sell papers, the real question is whether market knowledge has improved sufficiently to allow for adjustments that avoid a downturn. It's about risk management, and this article suggests improved modeling methods. Approaches currently being developed use advances in mathematics, together with increased computing power, to open the possibility of modeling interactions and exploring the effects that our collective decisions have on events.

Closely Knit

Life is full of interactions, and the interdependencies are becoming more complex. The process of grappling with these complexities continues to evolve. If "the devil is in the details," then the details are in the interactions. Even as financial markets have grown, the financial world is getting "closer" together. Creative financial instruments have created new dependencies: assets such as collateralized debt obligations (CDOs) and credit default swaps weave a network of dependencies throughout the financial world. Such a network can be used to distribute risk, but as the network of interdependencies grows, so does the potential for widespread risk exposure. Consider an analogy with the U.S. power grid. The distributed nature of supplying power lowers the risk of any one locality being without power, but the nature of the grid also increases the potential for a major event such as the blackout in northeast North America in August 2003.

The seeds of catastrophe are often planted by the manner in which the network operates. Whether considering a bridge, a levee, the power grid or our financial system, it's important to understand the failure modes a system can undergo and the events that precede the catastrophe. We often speak of a catastrophe as being a single event, but it is usually the culmination of several events. Understanding how these events interact to trigger a major event can help in developing mitigation strategies to avert or reduce its impact.

There are important differences between financial and physical systems. The collection of over-the-counter financial contracts and their implications make for an arbitrarily complex network. It is not obvious whether this is beneficial or detrimental. It's also not obvious whether regulating these contracts would make their implied network fragile in exactly the manner that could cause a cascade. There are several differing explanations of why the events of Oct. 19, 1987 occurred, and we are not suggesting an alternative here. But if the chain of dependencies allows all the conditions that people cite as causes to exist simultaneously, and if those conditions compound one another in a large, complex, interacting financial network, they might allow an event on the scale of the entire network. One could model the integral components of the financial world, their interdependencies, and the policies that people make. Interconnections increase complexity, becoming an integral part of the network's growth. Whether controlled or not, the decisions we make are a feedback mechanism, and the outcomes of events propagate through our networks. These outcomes trigger other events, good or bad, at unprecedented rates.

An important reason to understand how interactions arise and impinge on every aspect of business (and life) is that there are risks associated with these dependencies. With risk comes opportunity; if there were no interactions, there would be no opportunities. Modeling interactions together with policies allows decisions to be modeled as well. In doing so, a partially controllable feedback mechanism is introduced. The aggregate effects may be studied and, more importantly, stress tested under possible scenarios. Whether considering the entire global financial market or the pricing of a single CDO, understanding the interactions is key. For example, if correlations between loan defaults are modeled as weaker than is actually the case, then the pricing of bundles of such assets (CDOs) will be wrong. If the correlations are stronger than assumed, or there is more sensitivity to interest rates, the investors' risk will not have been mitigated. Hedge funds also rely on accurate modeling of interactions. Regardless of a fund's strategy, the goal is to hedge, and this requires consideration of interactions. The question is whether one needs to consider more than pair-wise correlations. As the world becomes increasingly connected, the answer is surely yes. Pair-wise correlations are a good place to start, but they capture only the aggregated, time-averaged effects of dependencies between entities.
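
To make the pricing sensitivity concrete, here is a minimal sketch (with made-up parameter values) of pooled loan defaults under a one-factor Gaussian copula, a standard but simplified dependence model used here purely for illustration. Strengthening the pairwise correlation leaves each loan's marginal default probability unchanged but fattens the tail of the pool-loss distribution, which is precisely where CDO tranche pricing is decided.

    import numpy as np
    from scipy.stats import norm

    def pool_losses(n_loans=100, p_default=0.02, rho=0.3,
                    n_sims=100_000, seed=0):
        # One-factor Gaussian copula: X_i = sqrt(rho)*Z + sqrt(1-rho)*e_i.
        # Loan i defaults when X_i falls below the threshold matching
        # its marginal default probability p_default.
        rng = np.random.default_rng(seed)
        threshold = norm.ppf(p_default)
        z = rng.standard_normal(n_sims)                # common factor
        eps = rng.standard_normal((n_sims, n_loans))   # idiosyncratic noise
        x = np.sqrt(rho) * z[:, None] + np.sqrt(1 - rho) * eps
        return (x < threshold).sum(axis=1)             # defaults per scenario

    for rho in (0.0, 0.3):
        losses = pool_losses(rho=rho)
        print(f"rho={rho}: mean defaults {losses.mean():.2f}, "
              f"99th percentile {np.percentile(losses, 99):.0f}")

Both runs have the same average default count; the correlated run simply produces far more extreme tail scenarios.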

Understanding the effects that the sequential order of events can have on aggregated behavior is important. Entities are often correlated because of dependencies on other market entities. Understanding these dependencies enables resilience to changes in the underlying factors and helps predict whether assets are mispriced. Modeling interactions can take a variety of forms. One option is to model dependencies and construct elaborate stochastic processes to propagate the effects of dependencies forward in time. Another option is to first consider improved modeling of marginal probabilities, which could later be used to construct a more detailed model. At whatever level one wishes to consider financial modeling, the details are in the interactions.

Policies and Constraints

The purpose of modeling is largely to aid in decisions that relate to the modeling objectives and, in this regard, we should not lose sight of the fact that a model must be repeatedly calibrated against the real world. Often there are goals that decisions aim to optimize; for example, one may want to increase profits, decrease risk and reduce losses. Policies and constraints should be considered part of the larger modeling activity that optimizes the objectives. Since policies define how decisions are made, and these decisions affect the outcomes, such policies and possibly constraints should dictate the granularity of the modeling. Policies should be evaluated as part of the model and not be treated as a disjoint activity.

Interacting Particle Systems and Random Graphs

Statistical mechanics, developed over the last two centuries, models the interactions between particles such as atoms or molecules. A major theme is the study of phase transitions: abrupt changes in the properties of a material resulting from sensitivity to a parameter. For example, with a change in temperature, water may be in a solid, liquid or gas phase.

Mathematicians refer to the rigorous treatment of non-equilibrium statistical mechanics as Interacting Particle Systems. Although the subject has progressed beyond its original motivations, the name has stuck.

Random Graphs

The classical work on random graphs is usually attributed to Paul Erdős and Alfréd Rényi. In particular, they gave the first characterization of a phase transition associated with a random graph.

Although the mathematical description of a phase transition may be new to the reader, the concept, as noted, is as familiar as water freezing to ice or boiling to steam. In this case, water undergoes a phase transition as a result of changes in temperature. As the temperature passes through "critical values," the water drastically changes its physical properties.

Many phenomena in nature exhibit such behavior. Modeling of the dependencies and interactions reveals a mathematical description suggesting that phase transitions, and critical phenomena in general, are not limited to the physical world. Such behavior is a result of probabilistic dependencies within graphical structures that represent the interactions of entities being modeled.

The concept that a probabilistic model may exhibit multiple regimes is not new to actuaries. Actuaries have already considered concepts such as "regime switching." In this regard, a phase transition may be considered a type of regime switching. In the random graph context, the regime change is parametrically driven. In other cases, the regime change is dynamically driven, or a result of stochastic evolution.

In the now classical work of Erdős and Rényi, random graphs are parameterized. For our purposes this parameter is like temperature, and changes in the parameter result in changes to the structure of the random graphs. Similar to water, as the parameter passes through a critical value, the resulting random graph drastically changes its macroscopic properties. Put another way, the overall nature of the graph has changed; it has undergone a phase transition.

As an example, the connectedness of the graph's components becomes drastically different as the parameter varies through a critical value. In the Erdős–Rényi model, when the expected number of edges per vertex passes 1, a "giant" connected component containing a positive fraction of all vertices suddenly emerges.
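
This transition is easy to see numerically. The following sketch (illustrative code and parameters only) samples Erdős–Rényi graphs G(n, p) with p = c/n and reports the size of the largest connected component as c passes through the critical value 1:

    import random
    from collections import defaultdict

    def largest_component(n, p, seed=0):
        """Size of the largest connected component of a G(n, p) sample."""
        rng = random.Random(seed)
        parent = list(range(n))             # union-find over the n vertices
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]   # path halving
                v = parent[v]
            return v
        for i in range(n):
            for j in range(i + 1, n):
                if rng.random() < p:            # edge present with prob. p
                    ri, rj = find(i), find(j)
                    if ri != rj:
                        parent[ri] = rj
        sizes = defaultdict(int)
        for v in range(n):
            sizes[find(v)] += 1
        return max(sizes.values())

    n = 2000
    for c in (0.5, 1.0, 1.5, 2.0):
        print(f"c={c}: largest component {largest_component(n, c / n)} of {n}")

Below the critical value the largest component stays microscopic; above it, a component containing a sizable fraction of all vertices appears.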

In a recent paper, "Contagion in Financial Networks," Prasanna Gai and Sujit Kapadia of the Bank of England (see [1]) apply random graph techniques to model shocks within a banking network and the aggregate spread of credit contagion. Gai and Kapadia use a directed random graph to model bank balance sheets of assets and obligations, and they show how to model the cascade of contagion effects that can flow through the random graph, a virtual banking network.
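
A minimal sketch of this style of model follows. It is written in the spirit of Gai and Kapadia's cascade mechanism but is not their balance-sheet specification: here each bank simply fails once the fraction of its failed borrowers exceeds a uniform capital buffer, and all exposure sizes are equal.

    import random

    def simulate_cascade(n_banks=100, avg_degree=4, buffer=0.05, seed=0):
        """Toy credit-contagion cascade on a directed random graph."""
        rng = random.Random(seed)
        p = avg_degree / (n_banks - 1)
        # borrowers[i] = the banks that bank i has lent to (its exposures)
        borrowers = [[j for j in range(n_banks)
                      if j != i and rng.random() < p]
                     for i in range(n_banks)]
        failed = {rng.randrange(n_banks)}        # one initial failure
        changed = True
        while changed:
            changed = False
            for i in range(n_banks):
                if i in failed or not borrowers[i]:
                    continue
                loss = sum(b in failed for b in borrowers[i]) / len(borrowers[i])
                if loss > buffer:                # losses exceed capital buffer
                    failed.add(i)
                    changed = True
        return len(failed)

    for buffer in (0.05, 0.30):
        print(f"buffer={buffer:.2f}: {simulate_cascade(buffer=buffer)} "
              f"of 100 banks failed")

Even this toy version shows the qualitative point: whether a single failure dies out or engulfs the network depends sharply on connectivity and capital buffers.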

Interacting Particle Systems

An interacting particle system (IPS) is a stochastic process that, in some sense, generalizes a Markov chain. Consider a graph G; associated with each vertex of G is a set of possible states. The set of all possible configurations of states assigned to the vertices is the state space of the interacting particle system.

If there are two states associated with each vertex and 1,000 vertices (dimensions), then there are 2^1,000 possible states (or configurations).

Typically, particle systems are Markov processes. When the graph G has N vertices and k states per vertex, an interacting particle system is a Markov chain with k^N states. This can become a very large Markov chain, so simulations of such processes are performed using sampling techniques.
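
The Markov chain view can be constructed explicitly for a tiny example. The sketch below (a hypothetical three-vertex graph, chosen only to keep the matrix printable) builds the full transition matrix for a neighbor-copying update rule, then shows why the same construction is hopeless at 1,000 vertices:

    import itertools
    import numpy as np

    # Three vertices on a triangle, two states each: the chain lives on
    # 2**3 = 8 configurations. Update rule: a uniformly chosen vertex
    # copies the state of a uniformly chosen neighbor.
    n = 3
    configs = list(itertools.product((0, 1), repeat=n))
    index = {c: i for i, c in enumerate(configs)}
    P = np.zeros((len(configs), len(configs)))
    for c in configs:
        for v in range(n):
            for u in ((v - 1) % n, (v + 1) % n):   # v's two neighbors
                new = list(c)
                new[v] = c[u]                      # v copies u's state
                P[index[c], index[tuple(new)]] += 1.0 / (n * 2)
    print(P.sum(axis=1))        # every row sums to 1: a valid Markov chain

    # At 1,000 vertices the matrix would need 2**1000 rows:
    print(f"number of states: {2 ** 1000:.3e}")    # roughly 1.07e+301

Hence one samples trajectories of the process rather than manipulating the transition matrix directly.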

In a set of papers by Kay Giesecke and Stefan Weber (see [2] and [3]), the authors use an interacting particle system called the voter model as the basis for studying the contagion of economic distress between banks. A bank can be in a state of "high liquidity" or "low liquidity," and the voter model lets the liquidity state evolve across the ensemble of banks. The authors take the graph to be the regular square lattice, the classical setting for studying the voter model. In this setting, the boundaries between the high- and low-liquidity subsets of banks evolve diffusively.
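
The voter model itself is only a few lines of code. The sketch below (simplified, with arbitrary lattice size and step count; not Giesecke and Weber's calibrated setup) runs it on a periodic square lattice, reading state 1 as "high liquidity" and state 0 as "low liquidity":

    import random

    def voter_model(L=50, steps=200_000, seed=0):
        """Voter model on an L x L square lattice with periodic boundaries."""
        rng = random.Random(seed)
        grid = [[rng.randrange(2) for _ in range(L)] for _ in range(L)]
        for _ in range(steps):
            i, j = rng.randrange(L), rng.randrange(L)
            di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            # the chosen site adopts a random nearest neighbor's state
            grid[i][j] = grid[(i + di) % L][(j + dj) % L]
        return sum(map(sum, grid)) / (L * L)

    print("fraction of high-liquidity banks:", voter_model())

Over time, like-state banks clump into growing regions, and it is the boundaries of these regions that wander diffusively.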

It should be pointed out that the graph associated with a particle system may also be a random graph, and one could allow the graph to be static (fixed) or allow the graph to change over time. Although a changing graph makes a rigorous result extremely difficult, simulation of such systems is only incrementally more difficult. After all, it's only software. As compared to 20 years ago, computers are orders of magnitude more powerful and model capabilities are vastly improved.

Policy Decisions and Particle Systems

Together with Sandia National Laboratories, the author has applied particle system methods to study the effectiveness that certain policies have on the large-scale behavior of computer networks under attack in a cyber-conflict. These methods are useful for any interacting system in which decisions have effects that propagate throughout the entire system.

Particle system methods can be further expanded to represent the liquidity state of a banking network or the stability of other financial instruments that interact over time. The network can be real, in the sense that there are physical dependencies resulting from a computerized communication network, or virtual, as implied by a network of contracts such as CDOs and swaps.

Regardless of the network of interactions, one can additionally model policy decisions and study the temporal effects that result. For example, in the banking credit contagion models, an explicit interest rate policy could be modeled. For a given interest rate there is a corresponding susceptibility to, and rate of spreading of, inter-banking credit contagion; this can result, for example, from defaults on adjustable rate mortgage (ARM) loans. If one also incorporated collateralized loan obligations (CLOs) into this interacting model, then lowering interest rates could promote riskier financial derivatives that feed back into the interactions, promoting further credit contagion in the future. Models of this sort often have phase transitions, and understanding the parameter sensitivity is crucial to stabilizing the model's behavior.
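
A sketch of such a policy-augmented contagion model appears below. To be clear, the link it assumes between the interest rate and the spreading probability (lower rates encourage riskier lending, raising contagion susceptibility) is invented purely for illustration, as are all parameter values; the point is the threshold behavior, not the numbers.

    import random

    def contagion_with_policy(rate, n_banks=500, avg_degree=6,
                              rounds=50, seed=0):
        """Toy contagion whose spread depends on an interest-rate policy."""
        rng = random.Random(seed)
        # Hypothetical policy link: lower rates -> higher spread probability.
        p_spread = max(0.0, min(1.0, 0.5 - 4.0 * rate))
        p_edge = avg_degree / (n_banks - 1)
        nbrs = [[j for j in range(n_banks) if j != i and rng.random() < p_edge]
                for i in range(n_banks)]
        distressed = {0}                     # one initially distressed bank
        for _ in range(rounds):
            newly = {j for i in distressed for j in nbrs[i]
                     if j not in distressed and rng.random() < p_spread}
            if not newly:
                break
            distressed |= newly
        return len(distressed)

    for rate in (0.01, 0.05, 0.10):
        print(f"rate={rate:.2f}: {contagion_with_policy(rate)} banks distressed")

With these made-up numbers, the expected number of banks infected by each distressed bank crosses 1 between the low and high rate settings, so the same network jumps from contained distress to a system-wide cascade: a phase transition driven by the policy parameter.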

One might imagine many springs connected to one another. The feedback questions involve understanding the degree to which the network of springs can be stretched without breaking or creating unwanted oscillations. A particle system augmented with policies and constraints is an ideal methodology for exploring such scenarios. Policy statements can be modeled at a level of granularity that mimics real-world decisions. Of course, the appropriate level of granularity is always an issue.

Modeling Mutual Interactions from Data

This section has a related but slightly different focus from the previous sections. Consider why the actuarial profession uses conditional contingency tables: in part, it's to narrow, or better focus, the models used to determine risk and hence the pricing of policies.

Many statistical calculations can be considered functions of contingency tables. Certainly the measurement of empirical correlations is such a case. Even if you want to average over all priors and consider the unconditional statistics, conditionalizing such calculations is important because it will expose assumptions that are implicitly made.

As an example, consider the Wall Street practice of "pairs-trading." Empirical correlations are generated for trading on spreads between correlated assets (the practice also applies to co-integrated assets). One needs to understand two things very well. First, what are the correlations between the entities being traded? Second, the correlations are averages over a period of time, and these estimates are unconditionalized empirical estimators. One would like to understand how sensitive these estimates are to possible priors that have been averaged out of them. As events in the world occur, the priors change value. It takes time for the effects to be noticeable, but one wants to exploit the changes before it's too late. This is analogous to using different tables for issuing life insurance policies depending on knowledge of the insured. Nothing prevents an insurance company from using unconditionalized tables, except that doing so would not sufficiently measure risk. In an analogous manner, Wall Street quants could conditionalize their calculations to better understand their model assumptions. Of course, it is not a simple question to determine what granularity is needed to conditionalize calculations.

Measuring the sensitivity that correlations have to possible priors requires conditionalizing the calculations, which involves a fair amount of bookkeeping. Performing such an activity on a large scale is a combination of data mining and statistical analysis, but it results in a better understanding of models.1
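
A minimal sketch of what conditionalizing buys you (entirely synthetic data; the "regime" flag stands in for whatever prior is being conditioned on):

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    n = 2_000
    regime = rng.random(n) < 0.3              # e.g., "rates rising" days
    common = rng.standard_normal(n)           # shared market factor
    load = np.where(regime, 0.9, 0.1)         # factor loading per regime
    a = load * common + rng.standard_normal(n)
    b = load * common + rng.standard_normal(n)
    df = pd.DataFrame({"a": a, "b": b, "regime": regime})

    # The unconditional correlation averages over both regimes...
    print("unconditional:", round(df["a"].corr(df["b"]), 2))
    # ...while conditionalizing exposes how different the regimes are.
    for flag, g in df.groupby("regime"):
        print(f"regime={flag}: {g['a'].corr(g['b']):.2f}")

The single unconditional number hides two very different dependence regimes; a pairs trade sized on the average would be mis-sized in both.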

Conclusions

Three methods were introduced, all of which concern interactions. Although the examples were chosen from finance and banking, the issues concerning the modeling and data analysis of interactions apply to every problem and field with which actuaries are concerned.

Although a lengthier discussion would be required, the data analysis methods can be combined with either of the other two methods, yielding a data-driven random graph model or a data-driven interacting particle system. Measuring and calculating the dependencies is crucial to determining the risks, and is required to take advantage of the opportunities that come with taking risks.

The author is grateful to Max Rudolph for very useful discussions concerning this article.

Michael A. Ekhaus is with Gibraltar Analytical, LLC in Minneapolis, Minnesota.

References

[1] P. Gai and S. Kapadia, "Contagion in Financial Networks," available for download from Csef.it/Unicredit_conference.htm.
[2] K. Giesecke and S. Weber, "Cyclical Correlations, Credit Contagion, and Portfolio Losses," Journal of Banking and Finance, 28(12), 3009–3036, 2004.
[3] K. Giesecke and S. Weber, "Credit Contagion and Aggregate Losses," Journal of Economic Dynamics and Control, 30, 741–767, 2006.

Footnote:
1. Gibraltar Analytical holds a patent on the methods and has implemented custom applications for performing large-scale conditionalizing of such statistical calculations and other data mining processes.