Preface

It is often said that investment management is an art, not a science. However, since the early 1990s the market has witnessed a progressive shift toward a more industrial view of the investment management process. There are several reasons for this change. First, with globalization the universe of investable assets has grown many times over. Asset managers might have to choose from among several thousand possible investments from around the globe. Second, institutional investors, often together with their consultants, have encouraged asset management firms to adopt an increasingly structured process with documented steps and measurable results. Pressure from regulators and the media is another factor. Finally, the sheer size of the markets makes it imperative to adopt safe and repeatable methodologies.

In its modern sense, financial modeling is the design (or engineering) of financial instruments and portfolios of financial instruments that result in predetermined cash flows contingent upon different events. Broadly speaking, financial models are employed to manage investment portfolios and risk. The objective is the transfer of risk from one entity to another via appropriate financial arrangements. Though the aggregate risk is a quantity that cannot be altered, risk can be transferred if there is a willing counterparty.

Financial modeling came to the forefront of finance in the 1980s, with the broad diffusion of derivative instruments. However, the concept and practice of financial modeling are quite old. The notion of the diversification of risk (central to modern risk management) and the quantification of insurance risk (a requisite for pricing insurance policies) were already understood, at least in practical terms, in the 14th century. The rich epistolary of Francesco Datini, a 14th-century merchant, banker, and insurer from Prato (Tuscany, Italy), contains detailed instructions to his agents on how to diversify risk and insure cargo.

What is specific to modern financial modeling is the quantitative management of risk. Both the pricing of contracts and the optimization of investments require some basic capabilities of statistical modeling of financial contingencies. It is the size, diversity, and efficiency of modern competitive markets that make the use of financial modeling imperative.

This three-volume encyclopedia not only covers the fundamentals of and advances in financial modeling but also provides the mathematical and statistical techniques needed to develop and test financial models, along with the practical issues associated with implementation. The encyclopedia offers the following unique features:

  • The entries for the encyclopedia were written by experts from around the world. This diverse collection of expertise has created the most definitive coverage of established and cutting-edge financial models, applications, and tools in this ever-evolving field.
  • The series emphasizes both technical and managerial issues. This approach provides researchers, educators, students, and practitioners with a balanced understanding of the topics and the necessary background to deal with issues related to financial modeling.
  • Each entry follows a format that includes the author, entry abstract, introduction, body, listing of key points, notes, and references. This enables readers to pick and choose among various sections of an entry, and creates consistency throughout the entire encyclopedia.
  • The numerous illustrations and tables throughout the work highlight complex topics and assist further understanding.
  • Each volume includes a complete table of contents and index for easy access to various parts of the encyclopedia.

TOPIC CATEGORIES

As is the practice in the creation of an encyclopedia, the topic categories are presented alphabetically. The topic categories and a brief description of each topic follow.

VOLUME I

Asset Allocation

A major activity in the investment management process is establishing policy guidelines to satisfy the investment objectives. Setting policy begins with the asset allocation decision. That is, a decision must be made as to how the funds to be invested should be distributed among the major asset classes (e.g., equities, fixed income, and alternative asset classes). The term “asset allocation” includes (1) policy asset allocation, (2) dynamic asset allocation, and (3) tactical asset allocation. Policy asset allocation decisions can loosely be characterized as long-term asset allocation decisions, in which the investor seeks to assess an appropriate long-term “normal” asset mix that represents an ideal blend of controlled risk and enhanced return. In dynamic asset allocation the asset mix (i.e., the allocation among the asset classes) is mechanistically shifted in response to changing market conditions. Once the policy asset allocation has been established, the investor can turn his or her attention to the possibility of active departures from the normal asset mix established by policy. If a decision to deviate from this mix is based upon rigorous objective measures of value, it is often called tactical asset allocation. The fundamental model used in establishing the policy asset allocation is the mean-variance portfolio model formulated by Harry Markowitz in 1952, popularly referred to as the theory of portfolio selection and modern portfolio theory.
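
As a hypothetical illustration of the mean-variance idea behind the Markowitz model, the following sketch finds the minimum-variance mix of two asset classes in closed form. All return, volatility, and correlation figures are assumed for the example, not drawn from the entry.

```python
# Minimum-variance mix of two assets -- an illustrative sketch of the
# mean-variance idea, not a full optimizer. All inputs are hypothetical.

def min_variance_weight(sigma1, sigma2, rho):
    """Weight of asset 1 that minimizes two-asset portfolio variance."""
    cov = rho * sigma1 * sigma2
    return (sigma2**2 - cov) / (sigma1**2 + sigma2**2 - 2 * cov)

def portfolio_stats(w, mu1, mu2, sigma1, sigma2, rho):
    """Expected return and standard deviation of the two-asset portfolio."""
    mu = w * mu1 + (1 - w) * mu2
    var = (w * sigma1)**2 + ((1 - w) * sigma2)**2 \
          + 2 * w * (1 - w) * rho * sigma1 * sigma2
    return mu, var**0.5

# Hypothetical asset classes: equities (10% return, 20% vol) and
# bonds (4% return, 7% vol), with correlation 0.2.
w = min_variance_weight(0.20, 0.07, 0.2)
mu, sigma = portfolio_stats(w, 0.10, 0.04, 0.20, 0.07, 0.2)
```

Note that the minimum-variance portfolio's volatility falls below that of either asset alone, which is the diversification effect the mean-variance framework formalizes.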

Asset Pricing Models

Asset pricing models seek to formalize the relationship that should exist between asset returns and risk if investors behave in a hypothesized manner. At its most basic level, asset pricing is mainly about transforming asset payoffs into prices. The two most well-known asset pricing models are the arbitrage pricing theory and the capital asset pricing model. The fundamental theorem of asset pricing asserts the equivalence of three key issues in finance: (1) absence of arbitrage; (2) existence of a positive linear pricing rule; and (3) existence of an investor who prefers more to less and who has maximized his or her utility. There are two types of arbitrage opportunities. The first is paying nothing today and obtaining something in the future, and the second is obtaining something today with no future obligations. Although the principle of absence of arbitrage is fundamental for understanding asset valuation in a competitive market, there are well-known limits to arbitrage resulting from restrictions imposed on rational traders, and, as a result, pricing inefficiencies may exist for a period of time.
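
The positive linear pricing rule mentioned above can be sketched in a hypothetical two-state world: with positive state prices, every asset's price is a linear combination of its state payoffs, which rules out both kinds of arbitrage. The state prices and payoffs below are assumptions for illustration only.

```python
# A positive linear pricing rule in a two-state world -- illustrative
# numbers only, not a calibrated model.

state_prices = [0.45, 0.50]   # hypothetical prices today of $1 in each future state

def price(payoffs):
    """Price an asset as the state-price-weighted sum of its payoffs."""
    return sum(q * x for q, x in zip(state_prices, payoffs))

bond_price  = price([1.0, 1.0])   # riskless bond paying $1 in both states
stock_price = price([2.0, 0.5])   # a risky payoff

# Any portfolio replicating the bond must cost the same as the bond;
# otherwise buying the cheap side and selling the rich side would yield
# something for nothing -- an arbitrage.
```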

Bayesian Analysis and Financial Modeling Applications

Financial models describe in mathematical terms the relationships between financial random variables through time and/or across assets. The fundamental assumption is that the model relationship is valid independent of the time period or the asset class under consideration. Financial data contain both meaningful information and random noise. An adequate financial model not only extracts optimally the relevant information from the historical data but also performs well when tested with new data. The uncertainty brought about by the presence of data noise makes imperative the use of statistical analysis as part of the process of financial model building, model evaluation, and model testing. Statistical analysis is employed from the vantage point of either of the two main statistical philosophical traditions—frequentist and Bayesian. An important difference between the two lies with the interpretation of the concept of probability. As the name suggests, advocates of the frequentist approach interpret the probability of an event as the limit of its long-run relative frequency (i.e., the frequency with which it occurs as the amount of data increases without bound). Since the time financial models became a mainstream tool to aid in understanding financial markets and formulating investment strategies, the framework applied in finance has been the frequentist approach. However, strict adherence to this interpretation is not always possible in practice. When studying rare events, for instance, large samples of data may not be available, and in such cases proponents of frequentist statistics resort to theoretical results. The Bayesian view of the world is based on the subjectivist interpretation of probability: Probability is subjective, a degree of belief that is updated as information or data are acquired. Only in the last two decades has Bayesian statistics started to gain greater acceptance in financial modeling, despite its introduction about 250 years ago. 
It has been the advancements of computing power and the development of new computational methods that have fostered the growing use of Bayesian statistics in financial modeling.
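
The Bayesian updating of a degree of belief described above can be sketched with a Beta prior on a default probability, updated after observing defaults in a sample. The Beta-binomial pair is chosen because the update has a simple closed form (the Beta prior is conjugate to the binomial likelihood); all numbers are hypothetical.

```python
# Bayesian updating sketch: a Beta(alpha, beta) prior on a default
# probability is revised after observing data. Inputs are hypothetical.

def beta_update(alpha, beta, defaults, n):
    """Posterior Beta parameters after observing `defaults` out of `n`."""
    return alpha + defaults, beta + (n - defaults)

def beta_mean(alpha, beta):
    """Mean of a Beta distribution -- the updated point estimate."""
    return alpha / (alpha + beta)

prior = (2.0, 98.0)                                  # prior mean default rate: 2%
posterior = beta_update(*prior, defaults=5, n=100)   # observe 5 defaults in 100
```

The posterior mean lies between the prior belief (2%) and the observed frequency (5%), illustrating how the data pull the belief toward the sample as evidence accumulates.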

Bond Valuation

The value of any financial asset is the present value of its expected future cash flows. To value a bond (also referred to as a fixed-income security), one must be able to estimate the bond's remaining cash flows and identify the appropriate discount rate(s) at which to discount the cash flows. The traditional approach to bond valuation is to discount every cash flow with the same discount rate. Simply put, the relevant term structure of interest rates used in valuation is assumed to be flat. This approach, however, permits opportunities for arbitrage. Alternatively, the arbitrage-free valuation approach starts with the premise that a bond should be viewed as a portfolio or package of zero-coupon bonds. Moreover, each of the bond's cash flows is valued using a unique discount rate that depends on the term structure of interest rates and on when the cash flow is received. The relevant set of discount rates (that is, spot rates) is derived from an appropriate term structure of interest rates and, when used to value risky bonds, is augmented with a suitable risk spread or premium. Rather than using a model to calculate a bond's fair value, the market price can be taken as given in order to compute a yield measure or a spread measure. Popular yield measures are the yield to maturity, yield to call, yield to put, and cash flow yield. Nominal spread, static (or zero-volatility) spread, and option-adjusted spread are popular relative value measures quoted in the bond market. Complications in bond valuation arise when a bond has one or more embedded options such as call, put, or conversion features. For bonds with embedded options, the financial modeling draws from options theory, more specifically, the use of the lattice model to value a bond with embedded options.
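
The arbitrage-free approach described above can be sketched by discounting each cash flow of a hypothetical 3-year, 5% annual-pay bond at its own spot rate; the term structure used here is assumed for illustration.

```python
# Arbitrage-free valuation sketch: each cash flow is treated as a
# zero-coupon bond and discounted at the spot rate for its maturity.
# The bond and the spot curve are hypothetical.

def bond_value(cash_flows, spot_rates):
    """Present value of cash flows, each discounted at its own spot rate."""
    return sum(cf / (1 + r)**t
               for t, (cf, r) in enumerate(zip(cash_flows, spot_rates), start=1))

cash_flows = [5.0, 5.0, 105.0]    # annual coupons plus principal, per 100 of par
spot_rates = [0.03, 0.04, 0.05]   # hypothetical upward-sloping term structure

value = bond_value(cash_flows, spot_rates)
```

Discounting every cash flow at a single flat 5% rate instead would give exactly par (100), so the difference between the two results is the valuation error the flat-curve assumption introduces.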

Credit Risk Modeling

Credit risk is a broad term used to refer to three types of risk: default risk, credit spread risk, and downgrade risk. Default risk is the risk that the counterparty to a transaction will fail to satisfy the terms of the obligation with respect to the timely payment of interest and repayment of the amount borrowed. The counterparty could be the issuer of a debt obligation or an entity on the other side of a private transaction such as a derivative trade or a collateralized loan agreement (e.g., a repurchase agreement or a securities lending agreement). The default risk of a counterparty is often initially gauged by the credit rating assigned by one of the three rating companies—Standard & Poor’s, Moody's Investors Service, and Fitch Ratings. Although default risk is the one that most market participants think of when reference is made to credit risk, even in the absence of default, investors are concerned about the decline in the market value of their portfolio bond holdings due to a change in credit spread or the price performance of their holdings relative to a bond index. When this risk is due to an adverse change in credit spreads, it is referred to as credit spread risk; when it is attributable solely to the downgrade of an entity's credit rating, it is called downgrade risk. Financial modeling of credit risk is used (1) to measure, monitor, and control a portfolio's credit risk, and (2) to price credit risky debt instruments. There are two general categories of credit risk models: structural models and reduced-form models. There is considerable debate as to which type of model is the best to employ.
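
The structural-model category can be sketched in the spirit of the Merton model: default occurs if firm value falls below the face value of debt at the horizon. The firm value, debt level, drift, and volatility below are hypothetical inputs, and this is a simplified illustration rather than the full model developed in the entries.

```python
# Structural (Merton-style) default probability sketch: the probability
# that firm value ends below the debt's face value at horizon T.
# All inputs are hypothetical.
from math import log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def merton_default_prob(V, D, mu, sigma, T):
    """P(firm value V, with drift mu and volatility sigma, is below debt D at T)."""
    d = (log(V / D) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return norm_cdf(-d)

pd = merton_default_prob(V=120.0, D=100.0, mu=0.05, sigma=0.25, T=1.0)
```

As expected for a structural model, raising leverage (lowering V toward D) increases the implied default probability.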

Derivatives Valuation

A derivative instrument is a contract whose value depends on some underlying asset. The term “derivative” is used to describe this product because its value is derived from the value of the underlying asset. The underlying asset, simply referred to as the “underlying,” can be either a commodity, a financial instrument, or some reference rate or index such as an interest rate or a stock index, leading to the classification of commodity derivatives and financial derivatives. Although there are close conceptual relations between derivative instruments and cash market instruments such as debt and equity, the two classes of instruments are used differently: Debt and equity are used primarily for raising funds from investors, while derivatives are primarily used for dividing up and trading risks. Moreover, debt and equity are direct claims against a firm's assets, while derivative instruments are usually claims on a third party. A derivative's value depends on the value of the underlying, but the derivative instrument itself represents a claim on the “counterparty” to the trade. Derivative instruments are classified in terms of their payoff characteristics: linear and nonlinear payoffs. The former, also referred to as symmetric payoff derivatives, include forward, futures, and swap contracts, while the latter include options. Basically, a linear payoff derivative is a risk-sharing arrangement between the counterparties since both are sharing the risk regarding the price of the underlying. In contrast, nonlinear payoff derivative instruments (also referred to as asymmetric payoff derivatives) are insurance arrangements because one party to the trade is willing to guarantee the counterparty a minimum or maximum (depending on the contract) price. The amount received by the insuring party is referred to as the contract price or premium. Derivative instruments are used for controlling risk exposure with respect to the underlying. 
Hedging is a special case of risk control where a party seeks to eliminate the risk exposure. Derivative valuation or pricing is developed based on no-arbitrage price relations, relying on the assumption that two perfect substitutes must have the same price.
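
The linear/nonlinear distinction above can be sketched as two payoff functions evaluated at expiration; the delivery price and strike of 100 are hypothetical.

```python
# Payoff sketch: a long forward has a linear (symmetric) payoff in the
# underlying's price, while a long call's payoff is nonlinear
# (asymmetric). Prices are hypothetical.

def forward_payoff(spot, delivery_price):
    """Linear payoff of a long forward at expiration: gains and losses one-for-one."""
    return spot - delivery_price

def call_payoff(spot, strike):
    """Nonlinear payoff of a long call at expiration: upside only, loss floored at 0."""
    return max(spot - strike, 0.0)
```

The symmetry of the forward (risk sharing) versus the floored loss of the call (insurance) is visible directly in the two functions: at a spot of 90 versus 110, the forward pays −10 and +10, while the call pays 0 and +10.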

VOLUME II

Difference Equations and Differential Equations

The tools of linear difference equations and differential equations have found many applications in finance. A difference equation is an equation that involves differences between successive values of a function of a discrete variable. A function of such a variable is one whose values form a sequence. The theory of linear difference equations covers three areas: solving difference equations, describing the behavior of difference equations, and identifying the equilibrium (or critical value) and stability of difference equations. Linear difference equations are important in the context of dynamic econometric models. Stochastic models in finance are expressed as linear difference equations with random disturbances added. Understanding the behavior of solutions of linear difference equations helps develop intuition for the behavior of these models. In nontechnical terms, differential equations are equations that express a relationship between a function and one or more derivatives (or differentials) of that function. Differential equations are invaluable for modeling situations in finance where a value changes continuously. Not all changes in value, however, occur continuously. When the change in value occurs incrementally rather than continuously, differential equations have their limitations; instead, a financial modeler can use difference equations, which are recursively defined sequences. It would be difficult to overemphasize the importance of differential equations in financial modeling, where they are used to express laws that govern the evolution of price probability distributions, the solution of economic variational problems (such as intertemporal optimization), and conditions for continuous hedging (such as in the Black-Scholes option pricing model). 
The two broad types of differential equations are ordinary differential equations and partial differential equations. The former are equations or systems of equations involving only one independent variable. Another way of saying this is that ordinary differential equations involve only total derivatives. Partial differential equations are differential equations or systems of equations involving partial derivatives. When one or more of the variables is a stochastic process, we have the case of stochastic differential equations and the solution is also a stochastic process. An assumption must be made about the noise driving a stochastic differential equation. In most applications, the noise term is assumed to be Gaussian, although other types of random variables can be assumed.
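
The equilibrium and stability ideas for difference equations mentioned above can be sketched with a first-order linear equation x_{t+1} = a·x_t + b: when |a| < 1 the sequence converges to the equilibrium b/(1 − a) from any starting value. The coefficients below are hypothetical.

```python
# First-order linear difference equation sketch: x_{t+1} = a*x_t + b.
# With |a| < 1 the iteration converges to the equilibrium b / (1 - a).
# Coefficients and the starting value are hypothetical.

def iterate(a, b, x0, steps):
    """Iterate the recursion x_{t+1} = a*x_t + b for `steps` periods."""
    x = x0
    for _ in range(steps):
        x = a * x + b
    return x

a, b = 0.5, 3.0
equilibrium = b / (1 - a)                 # 6.0
x_far = iterate(a, b, x0=100.0, steps=50)  # converges toward the equilibrium
```

Starting far from the equilibrium at 100, fifty iterations bring the sequence to within a negligible distance of 6, illustrating the stability condition |a| < 1.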

Equity Models and Valuation

Traditional fundamental equity analysis involves the analysis of a company's operations for the purpose of assessing its economic prospects. The analysis begins with the financial statements of the company in order to investigate the earnings, cash flow, profitability, and debt burden. The fundamental analyst will look at the major product lines, the economic outlook for the products (including existing and potential competitors), and the industries in which the company operates. The result of this analysis will be the growth prospects of earnings. Based on the growth prospects of earnings, a fundamental analyst attempts to determine the fair value of the stock using one or more equity valuation models. The two most commonly used approaches for valuing a firm's equity are based on discounted cash flow and relative valuation models. The principal idea underlying discounted cash flow models is that what an investor pays for a share of stock should reflect what is expected to be received from it—return on the investor's investment. What an investor receives are cash dividends in the future. Therefore, the value of a share of stock should be equal to the present value of all the future cash flows an investor expects to receive from that share. To value stock, therefore, an investor must project future cash flows, which, in turn, means projecting future dividends. Popular discounted cash flow models include the basic dividend discount model, which assumes a constant dividend growth, and the multiple-phase models, which include the two-stage dividend growth model and the stochastic dividend discount models. Relative valuation methods use multiples or ratios—such as price/earnings, price/book, or price/free cash flow—to determine whether a stock is trading at higher or lower multiples than its peers. 
There are two critical assumptions in using relative valuation: (1) that the universe of firms selected for the peer group is in fact comparable, and (2) that the average multiple across the universe of firms can be treated as a reasonable approximation of “fair value” for those firms. This second assumption may be problematic during periods of market panic or euphoria. Managers of quantitative equity firms employ techniques that allow them to identify attractive stock candidates, focusing not on a single stock as is done with traditional fundamental analysis but rather on stock characteristics in order to explain why one stock outperforms another stock. They do so by statistically identifying a group of characteristics to create a quantitative selection model. In contrast to traditional fundamental stock selection, quantitative equity managers create a repeatable process that utilizes the stock selection model to identify attractive stocks. Equity portfolio managers have used various statistical models for forecasting returns and risk. These models, referred to as predictive return models, make conditional forecasts of expected returns using the current information set. Predictive return models include regressive models, linear autoregressive models, dynamic factor models, and hidden-variable models.
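
The simplest of the discounted cash flow models described above, the constant-growth dividend discount model (the Gordon model), can be sketched directly: the value of a share is next year's dividend divided by the spread between the required return and the growth rate. The dividend, required return, and growth rate below are hypothetical.

```python
# Constant-growth dividend discount model sketch. Inputs are hypothetical.

def gordon_value(next_dividend, required_return, growth):
    """Present value of a dividend stream growing forever at rate `growth`."""
    if required_return <= growth:
        raise ValueError("required return must exceed the growth rate")
    return next_dividend / (required_return - growth)

# A $2.00 expected dividend, a 9% required return, and 4% perpetual growth:
value = gordon_value(next_dividend=2.00, required_return=0.09, growth=0.04)
```

The guard clause reflects the model's key restriction: the formula is the sum of a convergent geometric series, which exists only when the required return exceeds the growth rate.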

Factor Models and Portfolio Construction

Quantitative asset managers typically employ multifactor risk models for the purpose of constructing and rebalancing portfolios and analyzing portfolio performance. A multifactor risk model, or simply factor model, attempts to estimate and characterize the risk of a portfolio, either relative to a benchmark such as a market index or in absolute terms. The model allows the decomposition of risk into a systematic and an idiosyncratic component. The portfolio's exposure to broad risk factors is captured by the systematic component. For equity portfolios these are typically fundamental factors (e.g., market capitalization and value vs. growth), technical factors (e.g., momentum), and industry, sector, and country factors. For fixed-income portfolios, systematic risk captures a portfolio's exposure to broad risk factors such as the term structure of interest rates, credit spreads, optionality (call and prepayment), credit, and sectors. The portfolio's systematic risk depends not only on its exposure to these risk factors but also on the volatility of the risk factors and how they correlate with each other. In contrast to systematic risk, idiosyncratic risk captures the uncertainty associated with news affecting the holdings of individual issuers in the portfolio. In equity portfolios, idiosyncratic risk can be easily diversified by reducing the importance of individual issuers in the portfolio. Because of the larger number of issuers in bond indexes, however, this is a more difficult task. There are different types of factor models depending on the factors. Factors can be exogenous variables or abstract variables formed by portfolios. Exogenous factors (or known factors) can be identified from traditional fundamental analysis or from economic theory that suggests macroeconomic factors. Abstract factors, also called unidentified or latent factors, can be determined with the statistical tools of factor analysis or principal component analysis. 
The simplest type of factor model is one in which the factors are assumed to be known or observable, so that time-series data on those factors can be used to estimate the model. The four most commonly used approaches for evaluating the return premiums and risk characteristics of factors are portfolio sorts, factor models, factor portfolios, and information coefficients. Despite their association with quantitative asset managers, the basic building blocks of factor models are the same for model builders and for traditional fundamental analysts: Both seek to identify the drivers of returns for the asset class being analyzed.
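
Estimating a model with a known, observable factor can be sketched as a one-factor regression: the stock's exposure (beta) to a market factor is the least-squares slope cov(r, f)/var(f). The short return series below are hypothetical samples, far too small for real estimation.

```python
# One-factor model sketch: estimate a stock's factor exposure (beta)
# by ordinary least squares, using the closed-form slope formula.
# Return series are short, hypothetical samples for illustration only.

def mean(xs):
    return sum(xs) / len(xs)

def beta(asset_returns, factor_returns):
    """OLS slope of asset returns on factor returns: cov(r, f) / var(f)."""
    ma, mf = mean(asset_returns), mean(factor_returns)
    cov = sum((a - ma) * (f - mf) for a, f in zip(asset_returns, factor_returns))
    var = sum((f - mf)**2 for f in factor_returns)
    return cov / var

factor = [0.01, -0.02, 0.03, 0.00]   # hypothetical market-factor returns
stock  = [0.02, -0.03, 0.05, 0.00]   # a stock moving roughly 1.6x the factor

b = beta(stock, factor)
```

In a full multifactor model the same idea extends to several factors estimated jointly, and the residual variance left over after the factor fit is the idiosyncratic component.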

Financial Econometrics

Econometrics is the branch of economics that draws heavily on statistics for testing and analyzing economic relationships. The economic equivalent of the laws of physics, econometrics represents the quantitative, mathematical laws of economics. Financial econometrics is the econometrics of financial markets. It is a quest for models that describe financial time series such as prices, returns, interest rates, financial ratios, defaults, and so on. Although there are similarities between financial econometric models and models of the physical sciences, there are two important differences. First, the physical sciences aim at finding immutable laws of nature; econometric models model the economy or financial markets—artifacts subject to change. Because the economy and financial markets are artifacts subject to change, econometric models are not unique representations valid throughout time; they must adapt to the changing environment. Second, while basic physical laws are expressed as differential equations, financial econometrics uses both continuous-time and discrete-time models.

Financial Modeling Principles

The origins of financial modeling can be traced back to the development of mathematical equilibrium at the end of the nineteenth century, followed in the beginning of the twentieth century with the introduction of sophisticated mathematical tools for dealing with the uncertainty of prices and returns. In the 1950s and 1960s, financial modelers had tools for dealing with probabilistic models for describing markets, the principles of contingent claims analysis, an optimization framework for portfolio selection based on mean and variance of asset returns, and an equilibrium model for pricing capital assets. The 1970s ushered in models for pricing contingent claims and a new model for pricing capital assets based on arbitrage pricing. Consequently, by the end of the 1970s, the frameworks for financial modeling were well known. It was the advancement of computing power and refinements of the theories to take into account real-world market imperfections and conventions starting in the 1980s that facilitated implementation and broader acceptance of mathematical modeling of financial decisions. The diffusion of low-cost, high-performance computers has allowed the broad use of numerical methods, changing the landscape of financial modeling. The importance of finding closed-form solutions and the consequent search for simple models has been dramatically reduced. Computationally intensive methods such as Monte Carlo simulations and the numerical solution of differential equations are now widely used. As a consequence, it has become feasible to represent prices and returns with relatively complex models. Nonnormal probability distributions have become commonplace in many sectors of financial modeling. It is fair to say that the key limitation of financial modeling is now the size of available data samples or training sets, not the computations; it is the data that limit the complexity of estimates. Mathematical modeling has also undergone major changes. 
Techniques such as equivalent martingale methods are being used in derivative pricing, and cointegration, the theory of fat-tailed processes, and state-space modeling (including ARCH/GARCH and stochastic volatility models) are being used in financial modeling.

Financial Statement Analysis

Much of the financial data that are used in constructing financial models for forecasting and valuation purposes draw from the financial statements that companies are required to provide to investors. The four basic financial statements are the balance sheet, the income statement, the statement of cash flows, and the statement of shareholders’ equity. It is important to understand these data so that the information conveyed by them is interpreted properly in financial modeling. The financial statements are created using several assumptions that affect how to use and interpret the financial data. The analysis of financial statements involves the selection, evaluation, and interpretation of financial data and other pertinent information to assist in evaluating the operating performance and financial condition of a company. The operating performance of a company is a measure of how well a company has used its resources—its assets, both tangible and intangible—to produce a return on its investment. The financial condition of a company is a measure of its ability to satisfy its obligations, such as the payment of interest on its debt in a timely manner. There are many tools available in the analysis of financial information. These tools include financial ratio analysis and cash flow analysis. Cash flows are essential ingredients in valuation. Therefore, understanding past and current cash flows may help in forecasting future cash flows and, hence, determine the value of the company. Moreover, understanding cash flow allows the assessment of the ability of a firm to maintain current dividends and its current capital expenditure policy without relying on external financing. Financial modelers must understand how to use these financial ratios and cash flow information in the most effective manner in building models.
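
The ratio and cash flow tools described above can be sketched with hypothetical balance-sheet and cash-flow figures: a liquidity ratio (the current ratio) and a simple free cash flow figure of the kind used as an input to valuation models.

```python
# Financial statement analysis sketch with hypothetical figures
# (in millions); illustrative only.

def current_ratio(current_assets, current_liabilities):
    """Ability to cover short-term obligations with short-term assets."""
    return current_assets / current_liabilities

def free_cash_flow(operating_cash_flow, capital_expenditures):
    """Cash generated after spending to maintain and expand the asset base."""
    return operating_cash_flow - capital_expenditures

cr  = current_ratio(current_assets=500.0, current_liabilities=250.0)
fcf = free_cash_flow(operating_cash_flow=120.0, capital_expenditures=45.0)
```

A current ratio of 2.0 says short-term assets cover short-term liabilities twice over (financial condition), while the free cash flow figure feeds directly into discounted cash flow valuation and the assessment of dividend sustainability.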

Finite Mathematics and Basic Functions for Financial Modeling

The collection of mathematical tools that does not include calculus is often referred to as “finite mathematics.” This includes matrix algebra, probability theory, and statistical analysis. Ordinary algebra deals with operations such as addition and multiplication performed on individual numbers. In financial modeling, it is useful to consider operations performed on ordered arrays of numbers. Ordered arrays of numbers are called vectors and matrices while individual numbers are called scalars. Probability theory is the mathematical approach to formalizing the uncertainty of events. Even though a decision maker may not know which one of the set of possible events will finally occur, probability theory provides the means of assigning each event a probability. Furthermore, it provides the axioms needed to compute the probability of a composed event in a unique way. The rather formal environment of probability theory translates in a reasonable manner to problems related to risk and uncertainty in finance, such as the future price of a financial asset. Today, investors may be aware of the price of a certain asset, but they cannot say for sure what value it might have tomorrow. To make a prudent decision, investors need to assess the possible scenarios for tomorrow's price and assign to each scenario a probability of occurrence. Only then can investors reasonably determine whether the financial asset satisfies an investment objective when included within a portfolio. Probability models are theoretical models of the occurrence of uncertain events. In contrast, statistics is about empirical data and can be broadly defined as a set of methods used to make inferences from a known sample to a larger population that is in general unknown. In finance, a particularly important example is making inferences from the past (the known sample) to the future (the unknown population). 
There are important mathematical functions with which the financial modeler should be acquainted. These include the continuous function, the indicator function, the derivative of a function, the monotonic function, and the integral, as well as special functions such as the characteristic function of random variables and the factorial, the gamma, beta, and Bessel functions.
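
The scenario-based reasoning described above can be sketched as follows, with hypothetical prices and probabilities for tomorrow's asset price.

```python
# Probability-model sketch: assign probabilities to scenarios for
# tomorrow's price, then compute the expected price and its standard
# deviation. All scenarios and probabilities are hypothetical.

scenarios = [(95.0, 0.25), (100.0, 0.50), (110.0, 0.25)]  # (price, probability)

def expected_value(scenarios):
    """Probability-weighted average outcome."""
    return sum(p * x for x, p in scenarios)

def std_dev(scenarios):
    """Dispersion of outcomes around the expected value."""
    mu = expected_value(scenarios)
    var = sum(p * (x - mu)**2 for x, p in scenarios)
    return var**0.5

mu = expected_value(scenarios)
```

By the axioms of probability the scenario probabilities must sum to one; the expected price and its standard deviation then give the decision maker a first quantitative handle on whether the asset fits a portfolio objective.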

Liquidity and Trading Costs

In broad terms, liquidity refers to the ability to execute a trade or liquidate a position with little or no cost or inconvenience. Liquidity depends on the market where a financial instrument is traded, the type of position traded, and sometimes the size and trading strategy of an individual trade. Liquidity risks are those associated with the prospect of imperfect market liquidity and can relate to risk of loss or risk to cash flows. There are two main aspects to liquidity risk measurement: the measurement of liquidity-adjusted measures of market risk and the measurement of liquidity risks per se. Market practitioners often assume that markets are liquid—that is, that they can liquidate or unwind positions at going market prices—usually taken to be the mean of bid and ask prices—without too much difficulty or cost. This assumption is very convenient and provides a justification for the practice of marking positions to market prices. However, it is often empirically questionable, and the failure to allow for liquidity can undermine the measurement of market risk. Because liquidity risk is a major risk factor in its own right, portfolio managers and traders will need to measure this risk in order to formulate effective portfolio and trading strategies. A considerable amount of work has been done in the equity market in estimating liquidity risk. Because transaction costs are incurred when buying or selling stocks, poorly executed trades can adversely impact portfolio returns and therefore relative performance. Transaction costs are classified as explicit costs such as brokerage and taxes, and implicit costs, which include market impact cost, price movement risk, and opportunity cost. Broadly speaking, market impact cost is the price that a trader has to pay for obtaining liquidity in the market and is a key component of trading costs that must be modeled so that effective trading programs for executing trades can be developed. 
Typical forecasting models for market impact costs are based on a statistical factor approach where the independent variables are trade-based factors or asset-based factors.
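As an illustration of this factor approach, the sketch below fits a linear market impact model by least squares on simulated data. The two trade-based factors (relative order size and volatility), the square-root functional form, and all coefficients are illustrative assumptions, not estimates from any actual market.

```python
import numpy as np

# Hypothetical trade-based factors: relative order size (shares / average
# daily volume) and daily volatility.  All parameters are illustrative.
rng = np.random.default_rng(0)
n = 500
rel_size = rng.uniform(0.001, 0.10, n)      # order size / ADV
volatility = rng.uniform(0.01, 0.04, n)     # daily return volatility

# Assume impact (in bps) follows a square-root-of-size factor model plus noise
impact_bps = 10.0 * np.sqrt(rel_size) + 80.0 * volatility + rng.normal(0, 0.3, n)

# Estimate the factor loadings by ordinary least squares
X = np.column_stack([np.sqrt(rel_size), volatility, np.ones(n)])
beta, *_ = np.linalg.lstsq(X, impact_bps, rcond=None)
print(beta)  # estimated loadings on sqrt(size), volatility, and the intercept
```

In practice, the choice of factors and functional form would be validated against actual execution data rather than assumed.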

VOLUME III

Model Risk and Selection

Model risk is the risk of error in pricing or risk-forecasting models. In practice, model risk arises because (1) any model involves simplification and calibration, and both of these require subjective judgments that are prone to error, and/or (2) a model is used inappropriately. Although model risk cannot be avoided, there are many ways in which financial modelers can manage this risk. These include (1) recognizing model risk, (2) identifying, evaluating, and checking the model's key assumptions, (3) selecting the simplest reasonable model, (4) resisting the temptation to ignore small discrepancies in results, (5) testing the model against known problems, (6) plotting results and employing nonparametric statistics, (7) back-testing and stress-testing the model, (8) estimating model risk quantitatively, and (9) reevaluating models periodically. In financial modeling, model selection requires a blend of theory, creativity, and machine learning. The machine-learning approach starts with a set of empirical data that the financial modeler wants to explain. Data are explained by a family of models that include an unbounded number of parameters and are able to fit data with arbitrary precision. There is a trade-off between model complexity and the size of the data sample. To implement this trade-off, ensuring that models have forecasting power, the fitting of sample data is constrained to avoid fitting noise. Constraints are embodied in criteria such as the Akaike information criterion or the Bayesian information criterion. Economic and financial data are generally scarce given the complexity of their patterns. This scarcity introduces uncertainty as regards statistical estimates obtained by the financial modeler. It means that the data might be compatible with many different models with the same level of statistical confidence. Methods of probabilistic decision theory can be used to deal with model risk due to uncertainty regarding the model's parameters.
Probabilistic decision making starts from the Bayesian inference process and involves computer simulations in all realistic situations. Since a risk model is typically a combination of a probability distribution model and a risk measure, a critical assumption is the probability distribution assumed for the random variable of interest. Too often, the Gaussian distribution is the model of choice. Empirical evidence supports the use of probability distributions that exhibit fat tails, such as the Student's t distribution (and its asymmetric version) and the Pareto stable class of distributions (and their tempered extensions). Extreme value theory offers another approach to risk modeling.
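The complexity trade-off described above can be sketched with the Akaike information criterion. Below, polynomial models of increasing degree are fit to simulated data whose true structure is quadratic; the AIC penalty discourages the higher-degree models that merely fit noise. The data-generating process and candidate family are illustrative assumptions.

```python
import numpy as np

def aic(n, rss, k):
    # AIC for a least-squares fit with Gaussian errors:
    # n*log(RSS/n) + 2k, where k counts free parameters
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(1)
n = 200
x = np.linspace(-1, 1, n)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.2, n)  # true model: quadratic

scores = {}
for degree in range(1, 8):
    coeffs = np.polyfit(x, y, degree)
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    scores[degree] = aic(n, rss, degree + 1)

best = min(scores, key=scores.get)   # degree with the lowest AIC
print(best)
```

The Bayesian information criterion works the same way with a stronger penalty, log(n) per parameter instead of 2.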

Mortgage-Backed Securities Analysis and Valuation

Mortgage-backed securities are fixed-income securities backed by a pool of mortgage loans. Residential mortgage-backed securities (RMBS) are backed by a pool of residential mortgage loans (one-to-four family dwellings). The RMBS market includes agency RMBS and nonagency RMBS. The former are securities issued by the Government National Mortgage Association (Ginnie Mae), Fannie Mae, and Freddie Mac. Agency RMBS include passthrough securities, collateralized mortgage obligations, and stripped mortgage-backed securities (interest-only and principal-only securities). The valuation of RMBS is complicated due to prepayment risk, a form of call risk. In contrast, nonagency RMBS are issued by private entities, have no implicit or explicit government guarantee, and therefore require one or more forms of credit enhancement in order to be assigned a credit rating. The analysis of nonagency RMBS must take into account both prepayment risk and credit risk. The most commonly used method for valuing RMBS is the Monte Carlo method, although other methods have garnered favor, in particular the decomposition method. The analysis of RMBS requires an understanding of the factors that impact prepayments.
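As a deliberately simplified sketch of how prepayments enter the analysis, the snippet below projects the monthly cash flows of a hypothetical passthrough under a constant prepayment rate (CPR) and discounts them at a flat rate. A realistic Monte Carlo valuation would instead simulate interest-rate paths and make prepayments path dependent; all parameters here are illustrative.

```python
import numpy as np

# Stylized passthrough valuation under a constant prepayment assumption.
# All parameters (coupon, CPR, discount rate) are illustrative.
balance = 100.0          # initial pool balance
wac = 0.06               # weighted average coupon (annual)
term = 360               # remaining term in months
cpr = 0.06               # constant prepayment rate (annual)
smm = 1 - (1 - cpr) ** (1 / 12)   # single monthly mortality
r_m = wac / 12
disc = 0.05 / 12         # flat monthly discount rate

pv = 0.0
for t in range(1, term + 1):
    n_left = term - t + 1
    # scheduled level payment on the surviving balance
    payment = balance * r_m / (1 - (1 + r_m) ** -n_left)
    interest = balance * r_m
    sched_prin = payment - interest
    prepay = (balance - sched_prin) * smm   # prepaid principal this month
    cash_flow = interest + sched_prin + prepay
    balance -= sched_prin + prepay
    pv += cash_flow / (1 + disc) ** t

print(round(pv, 2))
```

Because prepayments return principal at par, they shrink the premium that a 6% coupon would otherwise command when discounted at 5%, which is the essence of prepayment (call) risk.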

Operational Risk

Operational risk has been regarded as a mere part of a financial institution's “other” risks. However, failures of major financial entities have made regulators and investors aware of the importance of this risk. In general terms, operational risk is the risk of loss resulting from inadequate or failed internal processes, people, or systems or from external events. This risk encompasses legal risk, which includes, but is not limited to, exposure to fines, penalties, or punitive damages resulting from supervisory actions, as well as private settlements. Operational risk can be classified according to several principles: the nature of the loss (internally inflicted or externally inflicted), direct versus indirect losses, degree of expectancy (expected or unexpected), risk type, event type, or loss type, and the magnitude (or severity) and frequency of losses. Operational risk can be the cause of reputational risk, which can arise when the market reaction to an operational loss event reduces the market value of a financial institution by more than the amount of the initial loss. The two principal approaches to modeling operational loss distributions are the nonparametric approach and the parametric approach. It is important to employ a model that captures tail events, and for this reason light-tailed distributions should be used with caution in operational risk modeling. The models that have been proposed for assessing operational risk can be broadly classified into top-down models and bottom-up models. Top-down models quantify operational risk without attempting to identify the events or causes of losses. Bottom-up models quantify operational risk on a micro level, based on identified internal events. The obstacle hindering the implementation of these models is the scarcity of available historical operational loss data.
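A minimal sketch of a parametric bottom-up treatment is the loss distribution approach: the annual number of loss events is modeled as Poisson and individual severities as lognormal (a moderately heavy-tailed choice), and the aggregate annual loss distribution is obtained by simulation. All parameters below are purely illustrative.

```python
import numpy as np

# Loss distribution approach (parameters are purely illustrative):
# annual loss count ~ Poisson(lam), individual severities ~ lognormal.
rng = np.random.default_rng(2)
lam, mu, sigma = 25.0, 9.0, 2.0
n_years = 50_000

annual_losses = np.array([
    rng.lognormal(mu, sigma, rng.poisson(lam)).sum()
    for _ in range(n_years)
])

expected_loss = annual_losses.mean()
var_999 = np.quantile(annual_losses, 0.999)   # capital-style high quantile
print(expected_loss, var_999)
```

The wide gap between the expected loss and the 99.9% quantile is exactly why light-tailed severity assumptions are dangerous in this setting.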

Optimization Tools

Optimization is an area in applied mathematics that, most generally, deals with efficient algorithms for finding an optimal solution among a set of solutions that satisfy given constraints. Mathematical programming, a management science tool that uses mathematical optimization models to assist in decision making, includes linear programming, integer programming, mixed-integer programming, nonlinear programming, stochastic programming, and goal programming. Unlike other mathematical tools that are available to decision makers such as statistical models (which tell the decision maker what occurred in the past), forecasting models (which tell the decision maker what might happen in the future), and simulation models (which tell the decision maker what will happen under different conditions), mathematical programming models allow the decision maker to identify the “best” solution. Markowitz's mean-variance model for portfolio selection is an example of an application of one type of mathematical programming (quadratic programming). Traditional optimization modeling assumes that the inputs to the algorithms are certain, but there are also branches of optimization such as robust optimization that study the optimal decision under uncertainty about the parameters of the problem. Stochastic programming deals with both the uncertainty about the parameters and a multiperiod decision-making framework.
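As a small instance of quadratic programming in the Markowitz spirit, the snippet below computes the fully invested minimum-variance portfolio, for which the QP (minimize w'Σw subject to the weights summing to one, with no further constraints) has a closed-form solution. The covariance matrix is an illustrative assumption.

```python
import numpy as np

# Minimum-variance portfolio: minimize w' Σ w subject to sum(w) = 1.
# With no inequality constraints this QP has the closed-form solution
# w* = Σ^{-1} 1 / (1' Σ^{-1} 1).  The covariance matrix is illustrative.
cov = np.array([
    [0.040, 0.006, 0.012],
    [0.006, 0.090, 0.010],
    [0.012, 0.010, 0.160],
])
ones = np.ones(3)
w = np.linalg.solve(cov, ones)   # Σ^{-1} 1
w /= w.sum()                     # normalize to full investment

port_var = w @ cov @ w
print(w, port_var)
```

With inequality constraints such as no short selling, a numerical QP solver would be needed instead of the closed form.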

Probability Distributions

In financial models where the outcome of interest is a random variable, an assumption must be made about the random variable's probability distribution. There are two types of probability distributions: discrete and continuous. Discrete probability distributions are needed whenever the random variable is to describe a quantity that can assume values from a countable set, either finite or infinite. A discrete probability distribution (or law) is quite intuitive in that it assigns positive probabilities, adding up to one, to certain values, while any other value automatically has zero probability. Continuous probability distributions are needed when the random variable of interest can assume any value inside of one or more intervals of real numbers such as, for example, any number greater than zero. Asset returns, for example, whether measured monthly, weekly, daily, or at an even higher frequency, are commonly modeled as continuous random variables. In contrast to discrete probability distributions that assign positive probability to certain discrete values, continuous probability distributions assign zero probability to any single real number. Instead, only entire intervals of real numbers can have positive probability such as, for example, the event that some asset return is not negative. For each continuous probability distribution, this necessitates the so-called probability density, a function that determines how the entire probability mass of one is distributed. The density often serves as the proxy for the respective probability distribution. To model the behavior of certain financial assets in a stochastic environment, a financial modeler can usually resort to a variety of theoretical distributions. Most commonly, probability distributions are selected that are analytically well known. For example, the normal distribution (a continuous distribution)—also called the Gaussian distribution—is often the distribution of choice when asset returns are modeled.
The exponential distribution, for example, is applied to characterize the randomness of the time between two successive defaults of firms in a bond portfolio. Many other distributions are related to these well-known distributions or built on them. These distributions often display pleasant features such as stability under summation—meaning that the return of a portfolio of assets whose returns follow a certain distribution again follows the same distribution. However, one has to be careful using these distributions since their advantage of mathematical tractability is often outweighed by the fact that the stochastic behavior of true asset returns is not well captured by them. For example, although the normal distribution generally renders modeling easy because all moments of the distribution exist, it fails to reflect stylized facts commonly encountered in asset returns—namely, the possibility of very extreme movements and skewness. To remedy this shortcoming, probability distributions accounting for such extreme price changes have become increasingly popular. Some of these distributions concentrate exclusively on the extreme values while others permit any real number, but in a way capable of reflecting market behavior. Consequently, a financial modeler has available a great selection of probability distributions to realistically reproduce asset price changes. Their common shortcoming is generally that they are mathematically difficult to handle.
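The difference in tail behavior can be made concrete by comparing exceedance frequencies. Below, draws from a Student's t distribution with 4 degrees of freedom (rescaled to unit variance) produce 4-standard-deviation moves far more often than the normal benchmark predicts. The degrees of freedom and threshold are illustrative choices.

```python
import numpy as np

# Tail comparison (illustrative): "returns" drawn from a Student's t
# with 4 degrees of freedom versus a normal with the same variance.
rng = np.random.default_rng(3)
df = 4
n = 1_000_000
t_draws = rng.standard_t(df, n)
t_draws /= np.sqrt(df / (df - 2))        # rescale to unit variance

tail_t = np.mean(np.abs(t_draws) > 4)    # empirical P(|X| > 4 std devs)
tail_normal = 2 * (1 - 0.9999683)        # ≈ P(|Z| > 4) for a standard normal
print(tail_t, tail_normal)
```

Under the normal benchmark a 4-sigma daily move is a once-in-decades event; under the t(4) model it is an unremarkable feature of the sample.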

Risk Measures

The standard assumption in financial models is that the distribution for the return on financial assets follows a normal (or Gaussian) distribution and therefore the standard deviation (or variance) is an appropriate measure of risk in the portfolio selection process. This is the risk measure that is used in the well-known Markowitz portfolio selection model (that is, the mean-variance model), which is the foundation for modern portfolio theory. Mounting evidence since the early 1960s strongly suggests that return distributions do not follow a normal distribution, but instead exhibit heavy tails and, possibly, skewness. The “tails” of the distribution are where the extreme values occur, and these extreme values are more likely than would be predicted by the normal distribution. This means that between periods where the market exhibits relatively modest changes in prices and returns, there will be periods where there are changes that are much higher (that is, crashes and booms) than predicted by the normal distribution. This is of major concern to financial modelers seeking to generate probability estimates for financial risk assessment. To more effectively implement portfolio selection, researchers have proposed alternative risk measures. These risk measures fall into two distinct categories: dispersion measures and safety-first measures. Dispersion measures include mean standard deviation, mean absolute deviation, mean absolute moment, index of dissimilarity, mean entropy, and mean colog. Safety-first risk measures include classical safety first, value-at-risk, average value-at-risk, expected tail loss, MiniMax, lower partial moment, downside risk, probability-weighted function of deviations below a specified target return, and power conditional value-at-risk. Despite these alternative risk measures, the most popular risk measure used in financial modeling is volatility as measured by the standard deviation.
There are different types of volatility: historical volatility, implied volatility, level-dependent volatility, local volatility, and stochastic volatility (e.g., jump-diffusion volatility). There are also risk measures commonly used for bond portfolio management. These measures include duration, convexity, key rate duration, and spread duration.
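Two of the safety-first measures mentioned above, value-at-risk and average value-at-risk (the mean loss beyond VaR, also called expected tail loss), can be estimated directly from a return sample. The sample below is simulated from a normal distribution purely for illustration; with real return data the same estimation lines apply unchanged.

```python
import numpy as np

# Historical-style VaR and average value-at-risk at the 95% level.
# The return sample is simulated here for illustration only.
rng = np.random.default_rng(4)
returns = rng.normal(0.0005, 0.02, 10_000)   # hypothetical daily returns

alpha = 0.95
var_95 = -np.quantile(returns, 1 - alpha)           # loss threshold
avar_95 = -returns[returns <= -var_95].mean()       # mean loss beyond VaR
print(var_95, avar_95)
```

AVaR always exceeds VaR for a continuous return distribution, since it averages the losses in the tail rather than marking its edge.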

Software for Financial Modeling

The development of financial models requires the modeler to be familiar with spreadsheets such as Microsoft Excel and/or a platform to implement concepts and algorithms such as the Palisade Decision Tools Suite and other Excel-based software (mostly @RISK, Solver, and VBA), and MATLAB. Financial modelers can choose one or the other, depending on their level of familiarity and comfort with spreadsheet programs and their add-ins versus programming environments such as MATLAB. Some tasks and implementations are easier in one environment than in the other. MATLAB is a modeling environment that allows for input and output processing, statistical analysis, simulation, and other types of model building for the purpose of analysis of a situation. MATLAB uses a number-array-oriented programming language, that is, a programming language in which vectors and matrices are the basic data structures. Reliable built-in functions, a wide range of specialized toolboxes, easy interface with widespread software like Microsoft Excel, and beautiful graphing capabilities for data visualization make implementation with MATLAB efficient and useful for the financial modeler. Visual Basic for Applications (VBA) is a programming language environment that allows Microsoft Excel users to automate tasks, create their own functions, perform complex calculations, and interact with spreadsheets. VBA shares many of the same concepts as object-oriented programming languages. Despite some important limitations, VBA does add useful capabilities to spreadsheet modeling, and it is a good tool to know because Excel is the platform of choice for many finance professionals.

Stochastic Processes and Tools

Stochastic integration provides a coherent way to represent how instantaneous uncertainty (or volatility) cumulates over time. It is thus fundamental to the representation of financial processes such as interest rates, security prices, or cash flows. Stochastic integration operates on stochastic processes and produces random variables or other stochastic processes. A stochastic integral is defined on each path as the limit of a sum. However, these sums are different from the sums of the Riemann-Lebesgue integrals because the paths of stochastic processes are generally not of bounded variation. Stochastic integrals in the sense of Itô are defined through a process of approximation by (1) defining Brownian motion, which is the continuous limit of a random walk, (2) defining stochastic integrals for elementary functions as the sums of the products of the elementary functions multiplied by the increments of the Brownian motion, and (3) extending this definition to any function through approximating sequences. The major application of integration to financial modeling involves stochastic integrals. An understanding of stochastic integrals is needed to understand an important tool in contingent claims valuation: stochastic differential equations. The dynamics of financial asset returns and prices can be expressed using a deterministic process if there is no uncertainty about their future behavior, or, in the more likely case when values are uncertain, with a stochastic process. Stochastic processes in continuous time are the most widely used tools to explain the dynamics of financial asset returns and prices. They are the building blocks for constructing financial models for portfolio optimization, derivatives pricing, and risk management. Continuous-time processes allow for more elegant theoretical modeling than discrete-time models, and many results proven in probability theory can be applied to obtain simple evaluation methods.
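The limit-of-sums construction has a direct computational counterpart: discretizing an SDE and accumulating increments path by path. The sketch below applies the Euler-Maruyama scheme to geometric Brownian motion; the drift, volatility, and horizon are illustrative assumptions.

```python
import numpy as np

# Euler-Maruyama discretization of the SDE dS = mu*S dt + sigma*S dW,
# i.e., a pathwise approximation of the corresponding stochastic integral.
# Parameters are illustrative.
rng = np.random.default_rng(5)
mu, sigma, s0 = 0.05, 0.20, 100.0
T, n_steps, n_paths = 1.0, 252, 20_000
dt = T / n_steps

s = np.full(n_paths, s0)
for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), n_paths)   # Brownian increments
    s = s + mu * s * dt + sigma * s * dw         # Euler-Maruyama step

print(s.mean())   # should be close to s0 * exp(mu*T) ≈ 105.1
```

Each simulated terminal value is the discrete analogue of s0 plus the stochastic integral of the drift and diffusion terms along that path.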

Statistics

Probability models are theoretical models of the occurrence of uncertain events. In contrast, statistics is about empirical data and can be broadly defined as a set of methods used to make inferences from a known sample to a larger population that is in general unknown. In finance, a particularly important example is making inferences from the past (the known sample) to the future (the unknown population). In statistics, probabilistic models are applied using data so as to estimate the parameters of these models. It is not assumed that all parameter values in the model are known. Instead, the data for the variables in the model are used to estimate the values of the parameters, which are then used to test hypotheses or make inferences. In financial modeling, the statistical technique of regression analysis is the workhorse. However, because regression models are part of the field of financial econometrics, this topic is covered in that topic category. Understanding dependences or functional links between variables is a key theme in financial modeling. In general terms, functional dependencies are represented by dynamic models. Many important models are linear models whose coefficients are correlation coefficients. In many instances in financial modeling, it is important to arrive at a quantitative measure of the strength of dependencies. The correlation coefficient provides such a measure. In many instances, however, the correlation coefficient might be misleading. In particular, there are cases of nonlinear dependencies that result in a zero correlation coefficient. From the point of view of financial modeling, this situation is particularly dangerous as it leads to substantially underestimated risk. Different measures of dependence have been proposed, in particular copula functions.
The copula overcomes the drawbacks of the correlation coefficient as a measure of dependency by allowing for a more general measure than linear dependence, allowing for the modeling of dependence between extreme events, and being invariant under strictly increasing transformations. Another essential tool in financial modeling, because it allows the incorporation of uncertainty in financial models and the consideration of additional layers of complexity that are difficult to incorporate in analytical models, is Monte Carlo simulation. The main idea of Monte Carlo simulation is to represent the uncertainty in market variables through scenarios, and to evaluate parameters of interest that depend on these market variables in complex ways. The advantage of such an approach is that it can easily capture the dynamics of underlying processes and the otherwise complex effects of interactions among market variables. A substantial amount of research in recent years has been dedicated to making scenario generation more accurate and efficient, and a number of sophisticated computational techniques are now available to the financial modeler.
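The danger of relying on the correlation coefficient alone can be shown in a few lines: if Y is a deterministic (nonlinear) function of X, the dependence is perfect, yet for a symmetric X the correlation is essentially zero.

```python
import numpy as np

# A nonlinear dependence that the correlation coefficient misses entirely:
# Y = X^2 is completely determined by X, yet corr(X, Y) ≈ 0 for symmetric X,
# because Cov(X, X^2) = E[X^3] = 0 for the standard normal.
rng = np.random.default_rng(6)
x = rng.normal(0, 1, 200_000)
y = x ** 2

corr = np.corrcoef(x, y)[0, 1]
print(corr)   # close to 0 despite perfect functional dependence
```

A copula-based measure, by working on the joint distribution of ranks rather than on linear co-movement, would register this dependence.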

Term Structure Modeling

The arbitrage-free valuation approach to the valuation of option-free bonds, bonds with embedded options, and option-type derivative instruments requires that a financial instrument be viewed as a package of zero-coupon bonds. Consequently, in financial modeling, it is essential to be able to discount each expected cash flow by the appropriate interest rate. That rate is referred to as the spot rate. The term structure of interest rates provides the relationship between spot rates and maturity. Because of its role in the valuation of cash bonds and option-type derivatives, the estimation of the term structure of interest rates is of critical importance as an input into a financial model. In addition to its role in valuation modeling, term structure models are fundamental to expressing value and risk, and to establishing relative value, across the spectrum of instruments found in the various interest-rate or bond markets. The term structure is most often specified for a specific market such as the U.S. Treasury market, the bond market for double-A rated financial institutions, or the LIBOR and swap markets. Static models of the term structure characterize relationships in a given market at a point in time and do not address future scenarios involving uncertainty. Standard static models include those known as the spot yield curve, discount function, par yield curve, and the implied forward curve. Instantiations of these models may be found in both discrete- and continuous-time frameworks. An important consideration is establishing how these term structure models are constructed and how to transform one model into another. In modeling the behavior of interest rates, stochastic differential equations (SDEs) are commonly used. The SDEs used to model interest rates must capture the market properties of interest rates such as mean reversion and/or a volatility that depends on the level of interest rates.
For a one-factor model, the SDE is used to model the behavior of the short-term rate, referred to as simply the “short rate.” The addition of another factor (i.e., a two-factor model) involves extending the SDE to represent the behavior of the short rate and a long-term rate (i.e., long rate).
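As a sketch of a one-factor short-rate model with mean reversion, the snippet below simulates the Vasicek SDE dr = kappa*(theta - r) dt + sigma dW using its exact one-step transition. Starting the short rate above its long-run level theta, the simulated mean is pulled back toward theta; all parameters are illustrative.

```python
import numpy as np

# One-factor Vasicek short-rate model dr = kappa*(theta - r) dt + sigma dW,
# simulated with the exact Gaussian transition over each step.
# Parameters are illustrative.
rng = np.random.default_rng(7)
kappa, theta, sigma = 2.0, 0.04, 0.01
r0, T, n_steps, n_paths = 0.08, 5.0, 500, 10_000
dt = T / n_steps

r = np.full(n_paths, r0)
a = np.exp(-kappa * dt)                               # per-step decay factor
step_sd = sigma * np.sqrt((1 - a ** 2) / (2 * kappa)) # exact step std dev
for _ in range(n_steps):
    r = theta + (r - theta) * a + step_sd * rng.normal(0.0, 1.0, n_paths)

print(r.mean())   # mean reversion pulls rates from 8% toward theta = 4%
```

A two-factor model would add an analogous equation for a long rate (possibly correlated with the short-rate shocks) and evolve the pair jointly.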

The entries can serve as material for a wide spectrum of courses, such as the following:

  • Financial engineering
  • Financial mathematics
  • Financial econometrics
  • Statistics with applications in finance
  • Quantitative asset management
  • Asset and derivative pricing
  • Risk management

Frank J. Fabozzi
Editor, Encyclopedia of Financial Models
