Man vs. Machine:
An Introduction to Quantitative Investing


The rise of technology and big data has helped fuel interest in quantitative investing. According to FUSE Research, over the past four years there have been more than $79 billion in net flows into quantitatively managed mutual funds*. Though quantitative investing, described as the application of math, specifically statistics and probability, to financial markets, has become increasingly popular, its foundation dates back to the 1950s. Over the years, quantitative investing has become a widely debated topic in our industry, causing many investors to choose a corner in the man vs. machine ring. However, we challenge that one-sided mindset, as we believe there is a fit for quantitatively managed investments in every portfolio.


It is important to note that there is a wide spectrum of quantitative strategies. On one end of the spectrum are passively run, low-fee ETFs, such as factor-based ETFs, where stocks are chosen based solely on a mathematical or rules-based algorithm. On the other end of the spectrum are the “black-box” strategies, where top-secret, sophisticated algorithms determine investment decisions with little outside understanding. In the middle are strategies that apply quantitative analysis and algorithms to trading methodologies based on specific factors or fundamentals.

Why Quantitative Investing?
Most quantitative managers use mathematical algorithms to perform systematic assessments of both fundamentals and valuation on a large pool of securities. With factor investing as their basis, these strategies are designed to remove emotional input from the investing process, helping investors avoid common behavioral pitfalls. By leveraging a repeatable process, quantitative approaches can improve investment decisions by producing objective analysis that illuminates elusive, but repeating, historical patterns. Some feel that quantitative strategies rely too heavily on historical data; however, careful research and statistical methods can mitigate many potential pitfalls, and developing quantitative investment models is a continual process of research, analysis and refinement.
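To make the idea of a repeatable, rules-based process concrete, here is a minimal sketch of a factor screen that ranks a universe of stocks on a value signal (earnings yield) and a momentum signal, then selects the top names by combined rank. The tickers, numbers and function names are purely illustrative assumptions, not real market data or any manager's actual model.

```python
# Hypothetical rules-based factor screen: rank on value and momentum,
# then pick the best combined ranks. Illustrative data only.

def rank(values, reverse=True):
    """Map each key to its rank (1 = best) by the given metric."""
    ordered = sorted(values, key=values.get, reverse=reverse)
    return {ticker: i + 1 for i, ticker in enumerate(ordered)}

def factor_screen(earnings_yield, momentum, top_n=2):
    """Combine value and momentum ranks; a lower combined rank is better."""
    value_rank = rank(earnings_yield)    # higher earnings yield = cheaper
    momentum_rank = rank(momentum)       # higher trailing return = stronger
    combined = {t: value_rank[t] + momentum_rank[t] for t in earnings_yield}
    return sorted(combined, key=combined.get)[:top_n]

# Illustrative inputs: earnings yield (E/P) and 12-month trailing return.
earnings_yield = {"AAA": 0.08, "BBB": 0.05, "CCC": 0.11, "DDD": 0.03}
momentum = {"AAA": 0.15, "BBB": 0.22, "CCC": 0.04, "DDD": 0.18}

print(factor_screen(earnings_yield, momentum))  # → ['BBB', 'AAA']
```

Because every step is deterministic, the same inputs always produce the same portfolio, which is precisely the "removal of emotional input" the text describes.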

Since quantitative strategies employ mathematical algorithms and are systematic in nature, they can be almost entirely automated. Automation allows for lower operating costs than strategies that require expensive research teams to study hundreds of individual securities prior to investment. Quantitative strategies also make it possible to cover a large universe of stocks in a very timely manner, versus a fundamental approach, where fewer securities can be effectively analyzed.

Finally, quantitative strategies invest based upon well-tested, deeply studied ideas to identify potential returns; fundamental managers, on the other hand, look to create positive returns for portfolios by studying specific characteristics of individual companies and finding investments that they believe are undervalued and will consequently outperform others.

Quantitative Strategies:
- Built to remove emotional input by employing systematic mathematical algorithms based on specific factors
- Performance drawn from conviction in well-studied ideas that identify potential returns
- Greater breadth of analysis across many securities

Fundamental/Qualitative Strategies:
- Built on analyst research into fundamental stock characteristics (e.g., valuation metrics such as the price-to-earnings ratio)
- Performance drawn from the investment manager's differentiated opinions and the individual characteristics of specific security holdings
- Greater depth of analysis on specific securities

The History of Quantitative Investing
The beginnings of quantitative strategies date back to the 1950s, with Harry Markowitz’s seminal work on portfolio theory linking security risk and return. Following Markowitz’s findings, William Sharpe, Jack Treynor, John Lintner and Jan Mossin developed a theory describing the relationship between expected returns and security risk as measured by beta, which became known as the Capital Asset Pricing Model (CAPM). During the 1970s and 1980s, financial market anomalies, such as size, value and momentum, became popular subjects of academic financial research. Many of these studied anomalies were precursors to commonly used factors today.
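For reference, the CAPM relationship described above is commonly written as:

```latex
% Capital Asset Pricing Model (CAPM)
\mathbb{E}[R_i] = R_f + \beta_i \left( \mathbb{E}[R_m] - R_f \right)
```

where R_i is the return on security i, R_f the risk-free rate, R_m the market return, and β_i the security's sensitivity to market risk: the expected return on a security rises linearly with its beta.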

Factor investing rose to popularity in the late 1990s and early 2000s, with Fama and French laying the foundation with their Three-Factor Model. This model became the baseline for much of the quantitative work that followed. A fourth major factor, momentum, was introduced by Carhart in the late 1990s, resulting in the Four-Factor Model.
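In standard notation, the Three-Factor Model, with Carhart's momentum extension shown in brackets, takes the form:

```latex
% Fama-French three-factor model; Carhart's momentum term in brackets
R_i - R_f = \alpha_i + b_i\,(R_m - R_f) + s_i\,\mathrm{SMB}
            + h_i\,\mathrm{HML} \;[\,+\, m_i\,\mathrm{MOM}\,] + \varepsilon_i
```

where SMB ("small minus big") and HML ("high minus low" book-to-market) capture the size and value factors alongside the CAPM market term, and MOM is the momentum factor Carhart added.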

Additional studies by De Bondt and Thaler, known as the fathers of behavioral finance, determined that the stock market tends to overreact to unexpected news events. As a result, momentum is often characterized as an expression of investor sentiment. Since then, additional factors have been introduced, and the search for better factors continues to be a focus for many quantitative academics.

Quantitative Investing Today
Quantitative equity management has grown from the work of these researchers, and the falling cost of processing power, together with the growing availability of data and artificial intelligence, has led to a boom in quantitative investment strategies. Today, sophisticated algorithm-based programs process billions of financial data points in search of often minuscule signals that indicate a stock is likely to outperform the market on a risk-adjusted basis. As the race between traditional and quantitative investors continues, quants are constantly challenged to find new signals and data sets to deliver returns in an ever more efficient market.

On any given day you can find a headline outlining the steps many of the largest money managers are taking to add quantitative investing to their scope of offerings for investors. More and more asset managers are hiring portfolio managers and researchers with deeply rooted academic backgrounds in computer science, physics and engineering. In March 2018, industry giant BlackRock announced a restructuring of its equities unit and the shifting of billions of dollars to its quantitative asset management business, Systematic Active Equities (SAE). In October 2018, JP Morgan announced a mandatory new training program requiring all 300 new analysts, and other asset managers at the firm, to take coding classes; that year’s focus was Python (a programming language well suited to analyzing large data sets). This story continues at other large banks and firms across the industry as technology provides opportunities within the ever-competitive and fee-sensitive asset management landscape.

The Future of Quantitative Investing: An Integrated Approach
Although quantitative and fundamental investing approaches are often seen as opposites, in recent years the line between them has begun to blur. In practice, both approaches incorporate some component of quantitative analysis in order to pare down a large universe of stocks to a smaller pool of purchase candidates that share a common theme or characteristic. Equipped with performance data across multiple market cycles, investment managers now know that quantitative and fundamental strategies perform differently in various market environments. Since quantitative strategy returns are based on alpha factors, while fundamentally run strategies are more focused on company specifics, the methodologies can be complementary. For that reason, an integrated approach to portfolio construction that includes both quantitative and fundamental strategies may be the most effective way to set your clients up for long-term success by providing a low correlation of active returns.
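The diversification benefit of low-correlation active returns can be sketched numerically. The snippet below uses simulated (not real) monthly active returns for a hypothetical quant strategy and a hypothetical fundamental strategy; because the two streams are nearly uncorrelated, a 50/50 blend has a lower tracking error than either strategy alone.

```python
# Hypothetical sketch: blending two strategies whose active
# (benchmark-relative) returns have low correlation lowers the
# volatility of the combined active return. Simulated data only.
import numpy as np

rng = np.random.default_rng(0)
n_months = 120

# Simulated independent monthly active returns for each strategy.
quant_active = rng.normal(0.002, 0.02, n_months)
fundamental_active = rng.normal(0.002, 0.02, n_months)

# 50/50 blend of the two active return streams.
blend = 0.5 * quant_active + 0.5 * fundamental_active

corr = np.corrcoef(quant_active, fundamental_active)[0, 1]
print(f"correlation of active returns: {corr:.2f}")
print(f"quant tracking error: {quant_active.std(ddof=1):.4f}")
print(f"blend tracking error: {blend.std(ddof=1):.4f}")
# With near-zero correlation, the blend's tracking error is roughly
# 1/sqrt(2) of each standalone strategy's tracking error.
```

The same logic underlies the integrated approach above: the lower the correlation between the two strategies' active returns, the smoother the combined excess-return stream.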
