CHAPTER 9

Implementable Quantitative Equity Research*

Frank J. Fabozzi, Ph.D., CFA, CPA

Professor of Finance
EDHEC Business School

Sergio M. Focardi, Ph.D.

Professor of Finance
EDHEC Business School and Partner, The Intertek Group

K. C. Ma, Ph.D., CFA

KCM Asset Management, Inc. and Roland George Chair of Applied Investments, Stetson University

Finance, like economics, is by nature quantitative, but it is subject to a high level of risk. It is the measurement of risk and the implementation of decision-making processes based on risk that make finance a quantitative science and not simply accounting. Equity investing is one of the most fundamental processes in finance. With the diffusion of affordable fast computers and with progress made in understanding financial processes, financial modeling has become a determinant of investment decision-making processes. Despite the growing diffusion of financial modeling, objections to its use are often raised.

In the second half of the 1990s, there was so much skepticism about quantitative equity investing that David Leinweber, a pioneer in applying advanced techniques borrowed from the world of physics to fund management, wrote an article entitled: “Is quantitative investment dead?”1 In the article, Leinweber defended quantitative fund management and maintained that in an era of ever faster computers and ever larger databases, quantitative investment was here to stay. The skepticism toward quantitative fund management, provoked by the failure of some high-profile quantitative funds at that time, was related to the fact that investment professionals felt that capturing market inefficiencies could best be done by exercising human judgment.

Despite the mainstream academic opinion that equity markets are efficient and therefore unpredictable, the asset manager's job is to capture market inefficiencies and translate them into enhanced returns for clients. At the academic level, the notion of efficient markets has been progressively relaxed. Empirical evidence has led to the acceptance of the notion that financial markets are somewhat predictable and that systematic market inefficiencies can be detected. There is a growing body of evidence that market anomalies exist that can be systematically exploited to earn excess profits after considering risk and transaction costs.2 In the face of this evidence, Andrew Lo proposed replacing the efficient market hypothesis with the adaptive market hypothesis: inefficiencies appear and disappear as the market adapts to changes in a competitive environment.3

In this scenario, a quantitative equity investment management process is characterized by the use of computerized rules as the primary source of decisions. In a quantitative process, human intervention is limited to a control function that intervenes only exceptionally to modify decisions made by computers. We can say that a quantitative process is a process that quantifies things. The notion of quantifying things is central to any modern science, including the dismal science of economics. Note that everything related to accounting—balance sheet and income statement data, and even accounting at the national level—is by nature quantitative. So, in a narrow sense, finance has always been quantitative. The novelty is that we are now quantifying things that are not directly observed, such as risk, or things that are not quantitative per se, such as market sentiment, and that we seek simple rules to link these quantities.

The gradual replacement of traditional human judgment with machine calculation is based on the assumption that computers outperform most humans at such tasks. Because a quantitative process can handle a large amount of information quickly, systematically, and consistently, the ambiguity and unpredictability often associated with subjective choices during decision making can be kept to a minimum. Fact or fancy, most modern portfolio managers include some form of quantitative approach in their overall investment process.

However, “quants” are often too anxious and overzealous to prove their points. This attitude leads to several side effects that offset some of the advantages of quantitative analysis. First, the cards can be unintentionally stacked in favor of finding the significant pattern that the researcher is eager to show. Doing so is much easier than with conventional subjective reasoning, because fast computing power allows numerous trials and errors. Second, researchers who place too much faith in the conventional criterion of significance at some statistical level often jump quickly to the wrong conclusion. What they sometimes fail to realize is that statistical significance is neither a necessary nor a sufficient condition for implementable excess returns, because trading strategies often work only on portions of the data. For example, a return reversal strategy might be profitable even if the information coefficient is very low, albeit at a good confidence level. Third, humans have a tendency to look only at the unusual. Do you notice an event because it is interesting, or is it interesting because you notice it? The resulting bias is that theories are established, and tests performed, more readily on extraordinary events, and apparent correlations are easily mistaken for causation. This bias is further reinforced by high-speed computing, since quantitative tools are very efficient at locating outliers and finding correlations.
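The gap between statistical and economic significance is easy to demonstrate. The following sketch (not from the chapter; the sample size, true information coefficient, and simulation setup are illustrative assumptions) simulates signal/return pairs whose true correlation is a tiny 0.03: with enough observations the t-statistic is highly significant, yet the information coefficient itself remains far too small to guarantee implementable excess returns once transaction costs are considered.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000  # hypothetical sample, e.g., 200 stocks x 250 days of signal/return pairs

# Simulate a signal whose true information coefficient (correlation with
# subsequent returns) is only 0.03 -- economically very weak.
true_ic = 0.03
signal = rng.standard_normal(n)
returns = true_ic * signal + np.sqrt(1 - true_ic**2) * rng.standard_normal(n)

# Sample information coefficient and its t-statistic for H0: correlation = 0.
ic = np.corrcoef(signal, returns)[0, 1]
t_stat = ic * np.sqrt(n - 2) / np.sqrt(1 - ic**2)

print(f"IC = {ic:.4f}, t-statistic = {t_stat:.1f}")
```

Because the t-statistic grows with the square root of the sample size, a near-worthless signal becomes "highly significant" in a large enough backtest, which is exactly why statistical significance alone is not sufficient evidence of an implementable strategy.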

In this chapter, we explain the process of performing quantitative equity research and converting that research into implementable trading strategies. We begin by comparing the discovery process in the physical sciences and in economics.
