MODELING AFTER THE 2007–2009 GLOBAL FINANCIAL CRISIS

The period following the 2007–2009 global financial crisis has witnessed the acceleration of a number of modeling trends identified in previous research by Fabozzi, Focardi, and Jonas. In particular, growing awareness of the nonnormal nature of returns is fueling efforts to adopt models and risk measures able to cope with nonlinearity. Among these are conditional value-at-risk (CVaR) to measure risk and copula functions to measure co-movements. Regime-shifting models, which are in principle able to capture transitions from one regime to another, for example from stable to volatile market states, are increasingly being considered as a modeling option. Modelers have also sought new factors and new predictors (hopefully unique ones), using new data sources, in particular from the derivatives market. However, as equity markets began to rebound in 2010, modeling strategies that essentially work only in growth markets came back into favor. These include strategies such as equal-weighted portfolios, which benefit from both mean reversion and upward market valuations.
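
To make the first of these measures concrete, the sketch below computes historical CVaR, also known as expected shortfall, for a series of returns. It is a minimal Python illustration, not any particular model referenced in the text: the Student-t returns are simulated purely for the example, and the 95% confidence level is an arbitrary choice.

```python
import numpy as np

def historical_cvar(returns, alpha=0.95):
    """Historical conditional value-at-risk (expected shortfall).

    VaR is the loss exceeded with probability 1 - alpha; CVaR is the
    average loss conditional on the loss exceeding that threshold, so it
    reflects the (possibly fat) tail rather than a single quantile.
    """
    losses = -np.asarray(returns)          # express returns as losses
    var = np.quantile(losses, alpha)       # value-at-risk at level alpha
    return losses[losses >= var].mean()    # mean loss in the tail beyond VaR

# Illustrative use with simulated heavy-tailed (Student-t) daily returns.
rng = np.random.default_rng(0)
simulated_returns = 0.01 * rng.standard_t(df=3, size=10_000)
print(f"95% CVaR: {historical_cvar(simulated_returns):.4f}")
```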

The 2007–2009 global financial crisis underlined the importance of asset allocation in explaining returns. The renewed awareness of the overwhelming importance of asset allocation, compared to, for example, stock selection or execution, has accelerated the adoption of dynamic asset allocation. According to the paradigm of dynamic asset allocation, investors switch into and out of asset classes, or at least dynamically change the weighting of different asset classes, as a function of their forecast of the average return of each class. While traditional asset allocation is reviewed at time horizons of one to three years, the time horizon of dynamic asset allocation is typically on the order of months.
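
The sketch below illustrates only the mechanical core of such a rule: mapping return forecasts for a set of asset classes to portfolio weights at each rebalancing date. The proportional weighting scheme and the example forecasts are assumptions made for illustration; they are not a method described in the text.

```python
import numpy as np

def dynamic_weights(forecast_returns, floor=0.0):
    """Map return forecasts for each asset class to portfolio weights.

    Placeholder rule: forecasts below `floor` (here zero) get no weight,
    and the remaining classes are weighted in proportion to their
    forecasts. Any forecasting model could supply the inputs, typically
    refreshed monthly under dynamic allocation.
    """
    f = np.clip(np.asarray(forecast_returns, dtype=float), floor, None)
    if f.sum() == 0:                 # no positive forecasts: hold cash
        return np.zeros_like(f)
    return f / f.sum()

# Hypothetical monthly forecasts for equities, bonds, and commodities.
print(dynamic_weights([0.02, 0.005, -0.01]))   # -> [0.8 0.2 0. ]
```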

The execution process has also undergone important changes since about 2004. The increasing availability of intraday (or high-frequency) data and modeling techniques based on minimizing the market impact of trades are behind the diffusion of program trading and high-frequency trading. Program trading, which uses computer programs to reduce the market impact of large trades by subdividing a large trade into many small trades executed under optimal rules, created a flow of high-frequency trades that has, in turn, generated trading opportunities for those able to make, at very low cost, many small trades with very short holding periods. Holding periods for high-frequency trades are generally less than one day and can be as short as a few milliseconds.
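
The subdivision step described above can be illustrated with a very simple slicing rule. The sketch below splits a parent order into equal child orders, a plain TWAP-style schedule; real program-trading engines use far richer rules (volume curves, impact models), so this is only an illustration of the idea, with the order size and slice count chosen arbitrarily.

```python
def slice_order(total_shares, n_slices):
    """Split a parent order into (nearly) equal child orders.

    A plain equal-slice schedule; the remainder is spread over the first
    few child orders so the slices sum exactly to the parent order.
    """
    base, remainder = divmod(total_shares, n_slices)
    return [base + 1 if i < remainder else base for i in range(n_slices)]

child_orders = slice_order(100_000, 13)
print(child_orders, sum(child_orders))   # 13 child orders summing to 100,000
```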

The diffusion of high-frequency trading, which is now estimated to represent more than half of all trading activity on major U.S. and European exchanges, has fundamentally changed the characteristics of exchanges and created a new generation of models. High-frequency trading is a form of trading that leverages high-speed computing, high-speed communications, tick-by-tick data, and technological advances to execute trades in as little as milliseconds. A typical objective of high-frequency trading is to identify and capture (small) price discrepancies present in the market. This is done using computer algorithms that automatically capture and read market data in real time, transmit thousands of order messages per second to an exchange, and execute, cancel, or replace orders based on new information on prices or demand.26
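
As a toy illustration of capturing a small price discrepancy, the check below compares a bid on one venue with an ask on another and reports the edge left after an assumed per-share trading cost. The quotes, venues, and cost figure are all hypothetical; actual high-frequency strategies involve far more than this single comparison.

```python
def cross_venue_edge(bid_a, ask_b, cost_per_share):
    """Per-share profit from buying at venue B's ask and selling at venue
    A's bid, net of an assumed trading cost; None if no profitable edge."""
    edge = bid_a - ask_b - cost_per_share
    return edge if edge > 0 else None

# Hypothetical quotes: venue A bids 100.04 while venue B asks 100.01.
edge = cross_venue_edge(bid_a=100.04, ask_b=100.01, cost_per_share=0.01)
print(f"edge per share: {edge:.2f}" if edge else "no opportunity")
```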

Lastly, awareness of systemic risk has created a new area of research whose objective is to measure market connectivity. This research is based on a fundamental property of all random networks, namely that random networks exhibit connectivity thresholds. Below the connectivity threshold, clusters remain small, with an exponential distribution; however, when approaching the connectivity threshold, very large clusters can form. This fact has an important bearing on risk management because connected clusters might propagate large losses. An example from the 2007–2009 crisis is the propagation of losses due to the subprime mortgage crisis. Focardi and Fabozzi suggest the use of a connectivity parameter based on percolation theory to improve the measurement of credit risk.27 More recently, and in the wake of the 2007–2009 global financial crisis, the Bank of England's executive director of financial stability, Andrew Haldane, suggested the use of connectivity parameters to measure the risk of widespread contagion and propagation of losses.28
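
The threshold effect can be demonstrated with a small simulation. The sketch below uses an Erdős–Rényi random graph as the network model, which is an assumption made for illustration (the text does not commit to a particular random-graph model), and relies on the networkx library; the node count and mean degrees are arbitrary. Below a mean degree of one, the largest connected cluster stays a small fraction of the network; above it, a giant cluster emerges.

```python
import networkx as nx

def largest_cluster_fraction(n, mean_degree, seed=0):
    """Fraction of nodes in the largest connected cluster of an
    Erdos-Renyi random graph G(n, p) with the given mean degree."""
    p = mean_degree / (n - 1)                     # edge probability
    graph = nx.erdos_renyi_graph(n, p, seed=seed)
    largest = max(nx.connected_components(graph), key=len)
    return len(largest) / n

# Sweep the mean degree across the percolation threshold at 1.
for c in (0.5, 1.0, 1.5, 2.0):
    print(f"mean degree {c:.1f}: largest cluster = "
          f"{largest_cluster_fraction(5_000, c):.2%}")
```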
