Chapter 12
Technology in Investing

Economists and investment practitioners have been trying to make sense out of the financial markets for many years. Much of the early work, such as the dividend discount model and the Capital Asset Pricing Model (CAPM), was accomplished with little if any support from computers, and it continues to provide a foundation for current views of the world of markets. More recently, however, information technology has become an integral part of the academic study of investing, and is essential to investment managers in their analytical work and trading. In this chapter we consider a few high points of the role of technology in investment theory and practice. For readers interested in a detailed discussion of financial innovation, we recommend two authoritative books, both authored by the late market scholar Peter Bernstein: Capital Ideas,1 published in 1992 and covering the work from 1900 through the 1980s; and Capital Ideas Evolving,2 from 2007, which begins with the development of behavioral finance in the 1990s. For more general ideas on investing, we also point out Jack Treynor’s Treynor on Institutional Investing,3 and A Bibliography of Finance, edited by Richard Brealey and Helen Edwards.4

Information at Work

For much of the last century, financial analysis went begging for computational power. In Chapter 9, we noted the pioneering work in the 1930s of John Burr Williams, a Harvard-trained investment manager who devised a formula for determining the value of stocks from their expected future dividends (to be compared with their market prices in deciding whether to invest). Most of Williams’s formulas are quite complex, requiring forecasts of companies’ dividends far into the future (which in turn call for long-term forecasts of those companies’ earnings and dividend policies, as well as projections of the economy, and so on). As a second and probably more crucial step, investors also have to arrive at a discount rate appropriate to the company in question for calculating the present value. Digital computers were still in the invention stage at that time,5 so all the burdensome work had to be done by hand, or with the help of mechanical calculators. Williams submitted his doctoral thesis to the publishers Macmillan and McGraw-Hill in 1937, but both returned it, objecting to the algebra it contained. Harvard University Press eventually published it as The Theory of Investment Value, but forced Williams to pick up some of the printing costs.6
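In modern notation (the symbols here are ours, not Williams’s), the dividend discount model values a stock as the present value of its expected dividends:

\[
V_0 = \sum_{t=1}^{\infty} \frac{D_t}{(1+r)^t}, \qquad \text{or, if dividends grow at a constant rate } g < r, \quad V_0 = \frac{D_1}{r-g},
\]

where \(V_0\) is the value of the stock today, \(D_t\) the expected dividend in year \(t\), and \(r\) the discount rate appropriate to the company. Even this simplified statement makes plain the two burdens Williams placed on investors: forecasting the stream of dividends and settling on the discount rate.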

Williams’s book The Theory of Investment Value is a classic, and the concept behind it is brilliant, but until computers became available to handle all the computational work, it’s unlikely that many investors actually followed the Williams doctrine. The computational challenges also help to explain the reliance by so many investors, then and now, on shorthand measures such as price-earnings and price-to-book ratios.

Order from Chaos: Applying Scientific Frameworks

Before the waves of hypotheses and theories developed specifically for finance in the 1960s and 1970s, earlier efforts to make sense of the stock market borrowed from our understanding of the physical universe. One example is a trio of papers on Brownian motion in the stock market by M. F. M. Osborne, a physicist at the U.S. Naval Research Laboratory. (Brownian motion describes the random movements of particles suspended in a fluid as they collide with other particles.) His work appeared in the journal Operations Research, rather than in the few journals of the day dedicated to theoretical finance.

Osborne conducted extensive research on the distribution of movements in stock prices, supplemented by the results of marathon coin tosses. To a reader without a background in physics and astronomy, the papers themselves are impenetrable. But a few of his observations are helpful: in a 1959 paper, he noted that prices of individual shares were correlated with the general market at about 0.70.7 And in a 1962 effort, Osborne helpfully concluded:

The picture of chaotic or Brownian motion does not imply that there can be no underlying rational structure [to the stock market]. We have tried to show that there is some underlying structure associated with what appears superficially to be the epitome of bedlam.8

In more familiar terms, imagine a table covered with dust. When a window is opened and the dust flies around the room, the resulting chaos looks like the erratic movement of prices in the stock market. Once the system is in action, the movements may appear to be unpredictable or random, but there is a definite cause.

Borrowing from the scientific world has not been limited to the theoretical side, however: brokers and asset managers have been hiring physicists since the 1980s, seeking to bring fresh insights of “econophysicists” to increasingly dynamic markets and financial instruments.9

Development of a rigorous theoretical framework custom-designed for investing had actually begun a few years earlier, with the pioneering 1952 work of Harry Markowitz, at the time a graduate student at the University of Chicago. (M. F. M. Osborne seems to have been unaware of these early efforts, as his early paper cited only books on astronomy and the behavior of gases.)10

Rather than examine the behavior of prices of individual securities, Markowitz considered the relationships among securities held in portfolios. He posited that investors should diversify, and that they should seek the maximum expected return consistent with the risk they are willing to bear. He offered a systematic solution for finding an efficient portfolio—one that provides the maximum output (return) for a given input (risk), or requires the minimum input for a given output.11

Markowitz’s key insight was that portfolio risk was determined by the variance and covariance of the returns on securities—the extent to which the returns moved together—and not just the riskiness of the individual securities. His “mean-variance” framework provided a foundation for the formulation of the CAPM in the following decade (Chapter 4).
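In the now-standard notation (ours, not Markowitz’s original), a portfolio with weights \(w_i\) in each security has expected return and variance

\[
E[R_p] = \sum_i w_i\, E[R_i], \qquad \sigma_p^2 = \sum_i \sum_j w_i\, w_j\, \sigma_{ij},
\]

where \(\sigma_{ij}\) is the covariance between the returns of securities \(i\) and \(j\), and \(\sigma_{ii}\) is the variance of security \(i\). As the number of holdings grows, the covariance terms come to dominate the double sum, which is why the co-movement of securities, rather than their individual riskiness, drives portfolio risk.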

Computers to the Rescue

In themselves, Markowitz’s portfolio mathematics and the CAPM research that followed required in-depth knowledge of statistics, but they were not computationally complex. However, implementing those ideas required extensive number crunching. “In order to follow Markowitz’s prescriptions, investors must analyze every possible combination of assets, searching for the efficient portfolios among them,” wrote Peter Bernstein.12 For each security, investors need reliable estimates of expected return and variability. “But that is the easy part,” Bernstein added: “They must then determine how each of the many securities under consideration will vary in relation to every one of the others. This is not something you can figure out on the back of an envelope.”

Fortunately, Markowitz was skilled with computers, in particular in linear programming. The output of all this work—scrutinizing combinations of securities—was a set of efficient portfolios, ranked by their expected return and riskiness, which Markowitz named the “Efficient Frontier.”
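A minimal sketch of the exercise in Python, tracing an approximate efficient frontier by brute-force search over random portfolios of three hypothetical securities (the expected returns and covariances are invented inputs, and real implementations rely on optimizers rather than random search):

```python
import numpy as np

# Invented inputs: expected annual returns and covariance matrix for three securities
mu = np.array([0.06, 0.09, 0.12])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])

rng = np.random.default_rng(0)
weights = rng.dirichlet(np.ones(3), size=20_000)   # random long-only portfolios summing to 1

rets = weights @ mu                                # expected return of each candidate portfolio
vols = np.sqrt(np.einsum("ij,jk,ik->i", weights, cov, weights))  # portfolio standard deviations

# Keep the approximate efficient frontier: portfolios whose expected return beats
# everything achievable at equal or lower risk
frontier, best_return = [], -np.inf
for i in np.argsort(vols):
    if rets[i] > best_return:
        best_return = rets[i]
        frontier.append((vols[i], rets[i], weights[i]))

for vol, ret, w in frontier[:5]:
    print(f"risk {vol:.3f}  return {ret:.3f}  weights {np.round(w, 2)}")
```

Even this toy version makes the burden Bernstein described clear: with hundreds of securities, the number of covariances to estimate and combinations to evaluate explodes.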

Computer technology was not a crucial part of the development of the theory of the CAPM, which dictated that all investors own, in some fashion, the market portfolio containing all available assets. However, it was essential to its implementation, in managing such a broad portfolio. Into the early 1960s, the use of computers had been limited to accounting, but in 1964 an ambitious mathematics student, John McQuown, set out to mechanize investing, or at least parts of it.13 He found his way to Wells Fargo Bank in San Francisco, which granted him a generous budget, and McQuown was able to recruit a large and talented team—many of whom later became financial legends, among them Fischer Black, Myron Scholes, William Sharpe, Eugene Fama, Merton Miller, and even Harry Markowitz.

Three years of effort yielded no meaningful results, but Wells Fargo management was dedicated to the project, telling McQuown to start again from scratch. In 1971, six years in, a bank client—the pension fund of the luggage maker Samsonite—asked for a diversified portfolio of $6 million that spanned the entire stock market. The group developed a program for a portfolio that held all 1,500 or so issues listed on the New York Stock Exchange, in equal dollar amounts. Equal weighting called for an enormous number of transactions to keep the positions at their proper weights, along with the additional burden of tracking all the transaction costs. But the fund worked, and forever changed the course of investing.
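A toy sketch in Python shows why equal weighting is so transaction-intensive (the prices and price moves are randomly generated stand-ins, and the fund’s actual mechanics were surely more involved):

```python
import numpy as np

rng = np.random.default_rng(1)
n_stocks, fund_value = 1500, 6_000_000          # roughly the Samsonite fund's profile

prices = rng.uniform(10, 100, n_stocks)         # invented share prices
target = fund_value / n_stocks                  # equal dollar target per position
shares = target / prices                        # initial holdings (fractional, for simplicity)

# One month later, prices move and every position drifts away from its equal weight
new_prices = prices * (1 + rng.normal(0.0, 0.06, n_stocks))
values = shares * new_prices
new_target = values.sum() / n_stocks

trades = new_target - values                    # dollars to buy (+) or sell (-) per stock
print(f"positions needing a trade: {np.count_nonzero(np.abs(trades) > 1)} of {n_stocks}")
print(f"total dollars traded to rebalance: ${np.abs(trades).sum():,.0f}")
```

After a single month of market movement, virtually every one of the roughly 1,500 positions requires a trade to restore its target weight.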

A Virtuous Circle

The expansion of information technology in the investing world follows the general pattern of technological progress described by Erik Brynjolfsson and Andrew McAfee: computers became cheaper, more powerful, and easier to use; the increasing capacity and usefulness led to the amassing of digital financial data; and innovative ideas, both theoretical and practical, became easier to combine and cross-fertilize. Today, every corner and aspect of the markets is highly computerized: long-term equity investing, which is our main focus; as well as short-term trading and brokers’ processing of orders; custody and financial recordkeeping; financial media; and the pricing of securities that ties them all together. Typically, new ideas have emerged on the investing side, leading to a demand for new types of data and forcing the transaction processing and custody side to catch up.

Wells Fargo provides a second early example. Once the innovators at Wells Fargo had gotten started in index funds, in 1973 they devised a more practical approach to a passive portfolio, with a commingled fund that tracked the S&P 500 and weighted stocks by their market capitalization. This product had greater appeal to institutional investors, as the S&P 500 had become an accepted performance benchmark. The scheme of market weighting also required less frequent rebalancing, and thus incurred much lower transaction costs.
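The reason cap weighting needs so little rebalancing can be stated in one line (our notation): if stock \(i\) has price \(p_i\) and shares outstanding \(s_i\), its index weight is

\[
w_i = \frac{p_i\, s_i}{\sum_j p_j\, s_j},
\]

and when prices move, the portfolio’s weights drift in step with the index’s own weights, so trades are needed only for index changes, corporate actions, and cash flows.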

In a process that today seems prehistoric, orders for the initial transaction to establish the fund were hand-carried by messenger from Wells Fargo to the broker Salomon Brothers, who traded the portfolio over several days, charging then-standard commissions that amounted to about one percent of the value of the fund assets. Wells Fargo’s trust accounting systems were overwhelmed by the sudden trade volumes, so that implementation of the new fund represented the start of a new era not only for investment management, but for trading and back office operations as well.14

Expansion of Index Funds

The large, liquid market segment of S&P 500 stocks in the 1970s was a logical starting place for index funds, but it failed to address the one-third or so of U.S. stock market capitalization in smaller companies. Index fund managers therefore established small-cap index funds. The large number of small-cap stocks, as well as their illiquidity, made owning all small companies infeasible and expensive, so the quantitative wrinkle of sampling was applied to select a portfolio that would generate a return representative of the small-cap universe.
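One common form of sampling is stratified sampling, sketched below in Python (the universe, the groupings, and the sampling rule are illustrative assumptions, not any particular manager’s method):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Invented small-cap universe: 4,000 stocks with a sector label and a market capitalization
universe = pd.DataFrame({
    "sector": rng.choice(["tech", "industrial", "consumer", "financial"], size=4000),
    "mkt_cap": rng.lognormal(mean=5.0, sigma=1.0, size=4000),
})

# Draw roughly 10 percent from each sector, weighting the draw by market cap so the
# sample behaves more like the cap-weighted universe it is meant to represent
sample = (universe
          .groupby("sector", group_keys=False)
          .apply(lambda g: g.sample(n=max(1, len(g) // 10),
                                    weights=g["mkt_cap"], random_state=3)))

print(sample.groupby("sector").size())   # holdings per sector in the sampled portfolio
```

The sampled portfolio holds far fewer names than the universe while keeping its sector mix roughly representative.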

“The development of the non-S&P 500 index fund [tracking a small cap stock index] placed active management performance at a greater disadvantage than before,” wrote investment practitioner William Jahnke. The S&P 500 index fund would deliver strong returns in years favoring large cap stocks, while the non-S&P 500 index fund did well when small stocks were in favor; together, they presented tough competition for old-line active managers.15 The development of index funds for international stock markets and bond markets was not far behind.16 By 1990, assets in indexed funds were reported at $270 billion, one-third of which was managed by Wells Fargo.17 After several business combinations, today that business resides with investment management giant BlackRock Corporation.

Betting Against the CAPM

While meeting the early demand from asset owners that were devotees of the CAPM, quantitative managers saw other opportunities in computer-managed strategies that defied the CAPM’s assumptions of market efficiency and optimal market portfolios. Soon to follow were funds that took advantage of the many anomalies and inefficiencies observed by academics.

Robert Hagin, who migrated to the brokerage firm Kidder, Peabody & Co. from teaching at the Wharton School, explained the opportunity to the ICFA Continuing Education Series in 1984: “The era of the two-parameter CAPM is quietly drawing to a close. In its place we are seeing a newly emerging era—an era of multifactor valuation models.”18

“Two things have happened to trigger disillusionment in the CAPM,” Hagin continued: “First, there is the increased evidence of excess returns associated with factors other than beta. . . . Specifically, evidence of ‘abnormal’ returns associated with factors such as a stock’s P/E ratio, its size, and its yield has brought the CAPM into question. . . .” As discussed in Chapter 5, these sorts of factors later emerged in the rehabilitation of the CAPM by Eugene Fama and Kenneth French in the 1990s.
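The multifactor valuation models Hagin described generalize the CAPM’s single beta to several sources of return; in a generic form (our notation, not Hagin’s own specification):

\[
E[R_i] - R_f = \beta_{i,M}\,\lambda_M + \beta_{i,\text{size}}\,\lambda_{\text{size}} + \beta_{i,\text{value}}\,\lambda_{\text{value}} + \cdots,
\]

where each \(\beta\) measures a stock’s exposure to a factor (the market, company size, a valuation measure such as P/E or dividend yield) and each \(\lambda\) is the return premium attached to that factor.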

The CAPM, Hagin conceded, had provided valuable insight into how stocks were influenced by the market, but “with an increased understanding of the shortcomings of the CAPM, many practitioners are looking for relationships between returns and the classical attributes that we use to describe securities.” Computer-assisted quantitative analysis allowed investors to scan the market widely for positive effects, and then test their reliability in producing outperformance over time and in different market conditions.

In 1979, Wells Fargo pioneered a Yield-Tilt Fund that started with a broad index, but tilted the portfolio toward those stocks offering higher dividend yields. This was the beginning of another branch of quantitative investing—“enhanced indexing.” Rather than building a portfolio from scratch, the methodology starts with the full index and over- and underweights securities to emphasize the desired risk exposures. Wells Fargo’s insight evolved into the highly successful Alpha Tilts products of Barclays Global Investors, which are today managed by the Scientific Active Equity group of BlackRock and claim managed assets of $60 billion.
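A minimal illustration of the tilt mechanics in Python (the index weights, dividend yields, and tilt strength are invented, and actual enhanced-index products control many more dimensions of risk):

```python
import numpy as np

# Invented five-stock "index": capitalization weights and dividend yields
index_w = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
yields  = np.array([0.010, 0.035, 0.020, 0.045, 0.030])

# Tilt: scale each weight by how far its yield sits above or below the index average,
# then renormalize so the weights still sum to one
index_yield = index_w @ yields
tilt_strength = 5.0                                # larger = more aggressive tilt (illustrative)
tilted = index_w * (1 + tilt_strength * (yields - index_yield))
tilted = np.clip(tilted, 0, None)
tilted /= tilted.sum()

for w0, w1, y in zip(index_w, tilted, yields):
    print(f"yield {y:.1%}  index weight {w0:.1%} -> tilted weight {w1:.1%}")
```

The portfolio stays close to the index but leans toward the higher-yielding names, which is the essence of enhanced indexing.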

Selecting stocks on fundamental factors—through computer intelligence—continues to be the core of quantitative management. It is the central idea behind systematic strategies investing in value, growth, momentum, and low volatility, which are widely available in the institutional world, and are accessible to individual investors as well, through mutual funds and exchange-traded funds. The current generation of “smart beta” strategies is refined from the early days of selecting stocks by computer, but also has fundamental thinking behind it.

Concurrent Developments

Quantitative active management received a significant, although indirect, boost from the 1974 passage of the Employee Retirement Income Security Act (ERISA). Prior to ERISA, investment managers had been bound by the “prudent man” rule, which required each holding in a portfolio to stand on its own merits, but the new law introduced the “portfolio standard” which considered securities holdings in a portfolio context. “With ERISA a defense could be made for owning a diversified portfolio of low P/E stocks, where many of the individual holdings would be considered imprudent investments by the earlier standard,” wrote William Jahnke.19 The institutional market’s embrace of quantitative strategies lay years ahead, but the provisions of ERISA removed an important obstacle.

Another important advance to come out of the late 1970s was the commercialization of risk models and the portfolio optimizer, generally credited to Barr Rosenberg, who opened a firm consulting to investment managers in 1975, known as BARRA. “Before the optimizer, managers bought 30-stock portfolios and hoped they went up,” says Laurence Siegel, a financial scholar and the Gary P. Brinson director of research at the CFA Institute Research Foundation. “But with Rosenberg’s model, and other commercial optimizers, they had insights into building portfolios that were mean-variance efficient. Optimizers allowed managers to keep track of more diverse portfolios, tracking the overweights and underweights versus indices, and the consequences of those decisions.”20

Trained as an econometrician, Rosenberg was attuned to nonsystematic patterns and extreme observations in data. Echoing the earlier views of M. F. M. Osborne, Rosenberg told author Peter Bernstein:

Randomness is not a mystery. Instead it is the poorly described aspects of a process. . . . This sets me off from most people in finance who say that randomness is just what their model does not capture. . . .21

His models aimed to look beyond how individual stocks varied with the market, to predict risk that arose from industry effects and the broader economy. “Economic events give rise to ripples through the economy, but individual assets respond according to their individual, or microeconomic characteristics,” he explained, such as company size, cost structure, customer groups, and record of growth. In a 2005 interview, Rosenberg elaborated: “The thrust . . . was to associate investment returns with investment fundamentals; the goal was to model expected returns, variances, and covariances in this manner. In other words, to represent the mean-variance world in terms of the influence of fundamentals.”22
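In outline (our notation; the commercial models are far more elaborate), a fundamental factor risk model of the sort Rosenberg commercialized expresses each stock’s return through its exposures to common factors, and the covariance matrix of all stocks follows from those exposures:

\[
r_i = \sum_k X_{ik}\, f_k + \varepsilon_i, \qquad \Sigma = X\, F\, X^{\top} + D,
\]

where \(X_{ik}\) is stock \(i\)’s exposure to factor \(k\) (industry membership, size, growth, and other fundamentals), \(f_k\) are the factor returns, \(F\) is their covariance matrix, and \(D\) is a diagonal matrix of stock-specific variances. An optimizer then searches for portfolio weights \(w\) that minimize \(w^{\top}\Sigma\, w\) for a target expected return, or that minimize tracking error against an index.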

Acceptance of risk models by investment managers was grudging at first, but their power and utility became obvious. Today, they are universal among investment managers, as well as investment consultants and asset owners, who apply them to judge the performance of asset managers both individually and in combination. In 1985, Barr Rosenberg went into competition with his risk model clients, and opened Rosenberg Institutional Equity Management, a quantitative management firm. (In 2010 Rosenberg’s firm encountered regulatory challenges, forcing him to leave the investment industry.)

The Spread of Quant

Quantitative management grew in several directions. Laurence Siegel offers an evolutionary description: “The image I think of is speciation—it started with one organism, and pretty soon you had all these different animals and plants competing with each other in the garden. A few survive.” Early on, the brokerage firms led the quant effort, Siegel says, because they could afford to pay high salaries to large teams. But the buy side was interested as well, both at large firms and smaller specialists. Clusters of quant activity sprang up around the financial academic centers of Boston and Chicago, as well as the West Coast and New York.

William Jahnke, in 1990, saw the evolution differently:

The innovation in the use of computers to manage active investment strategies has come for the most part from new entrants in the business without large established interests. This is true even within the established organizations that have permitted quantitative investment management to develop and coexist. . . .23

Computing and Data, Neck and Neck

The evolution of the data side of finance has been just as impressive and important as that of trading and investing. One crucial early resource was the Center for Research in Security Prices at the University of Chicago’s Booth School of Business. The university had collected prices and dividends on stocks listed on the New York Stock Exchange since 1926, but in 1959 a $300,000 grant from the brokerage firm then known as Merrill, Lynch, Pierce, Fenner & Smith funded computerization of the database—a boon to academics and investment practitioners, greatly increasing the accuracy and scope of their research.24

In 1981 a company called Innovative Market Systems opened its doors, providing pricing services for the inscrutable fixed income markets. By 1983 the firm had attracted $30 million of financing from Merrill Lynch, and in 1986 it was renamed Bloomberg LP after one of the founding partners. The Bloomberg terminal is now ubiquitous in the financial world; it provides not only pricing data on all the world’s markets but high-powered analytics as well, and it has become a superhighway for financial and general news and for communication among securities traders and their clients.

At about the same time arrived what is probably the biggest advance in finance, and for that matter any other field that relies on organized data—the electronic spreadsheet. VisiCalc (for “visible calculator”) was introduced in 1979, and improved upon by Lotus 1-2-3, which added graphing and database functions. Other contenders such as Symphony and Quattro Pro added new and important features, but all were eventually trumped by Microsoft Excel, which has evolved into a powerful and versatile tool, applied today to financial tasks of all sorts.25

The expanding digitization of data has offered astute traders and investors important, if fleeting, information advantages. One case is analysts’ estimates of companies’ earnings per share—a crucial input to valuation models for both fundamental and quantitative investors. Plenty of individual estimates were available, but discerning any useful trends meant gathering large samples from many brokerage firms and tracking the data and trends by hand. But starting in the mid-1970s, Lynch, Jones & Ryan, an innovative New York brokerage firm, began collecting earnings estimates at scale, and offered monthly summaries, on computer tapes, to institutional investor clients in a service known as I/B/E/S.

Available so broadly and quickly, earnings estimate revisions became an entirely new sort of information, and emitted a very strong—although short-term—signal that generated considerable excess returns. Some investors were so eager to plug each month’s new data into their models that rather than wait a day for delivery of the fresh tapes, they would travel to downtown Manhattan and jostle for position at Lynch, Jones & Ryan’s mailroom. Over time competing firms offered similar services and more rapid delivery, as well as consulting services interpreting trends, so that both the data and revision-based strategies became a standard ingredient for many investment firms’ offerings. Momentum in earnings estimates is still closely followed, and still works as a quantitative factor, but its power has diminished over time as more investors have refined their analyses.
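A simplified sketch of a revision signal in Python (the tickers, estimates, and column names are hypothetical stand-ins for a consensus-estimate feed such as I/B/E/S):

```python
import pandas as pd

# Invented monthly consensus EPS estimates for two tickers
est = pd.DataFrame({
    "ticker": ["AAA", "AAA", "AAA", "BBB", "BBB", "BBB"],
    "month": pd.to_datetime(["2016-01-31", "2016-02-29", "2016-03-31"] * 2),
    "consensus_eps": [2.00, 2.05, 2.12, 1.50, 1.48, 1.41],
}).sort_values(["ticker", "month"])

# Revision momentum: month-over-month percentage change in the consensus estimate
est["revision"] = est.groupby("ticker")["consensus_eps"].pct_change()

# Cross-sectional rank each month; the classic strategy favors the largest upward revisions
est["rank_in_month"] = est.groupby("month")["revision"].rank(ascending=False)
print(est.dropna())
```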

Big Data—Beyond Bloomberg

The entire investment community has benefited from massive digitization—the digital compilation and delivery of data on the global economy, markets, and company financials. Through the conduit of the Internet, information of all sorts is available more rapidly, much of it free of charge. While the main use of technology in the 1970s and 1980s was the development of quantitative management techniques, today’s investors don’t have to be serious quants to benefit.

The nature and quality of information has changed as well. Not only is economic information widely and instantly reported from conventional sources, but the world also offers real-time feedback, such as the indexes of economic surprise compiled by Citigroup and HSBC, which measure daily the differences between actual reports on various data points and economists’ expectations. Adobe Systems now publishes several indexes on economic activity—although limited to the digital world—for prices of electronics and groceries, as well as trends in housing and employment, drawn from an extensive flow of online transactions.26 And since July 2014, the Atlanta Federal Reserve Bank has issued what it terms a “nowcast” of U.S. gross domestic product (GDP) growth, named GDPNow, updated five or six times each month. Forecasts are refreshed after successive releases of significant U.S. economic data such as personal income, purchasing managers’ indexes, retail trade, and home sales.27

With so much information available to so many, and so quickly, it might seem that any information advantage from public sources may have been eroded. However, some investment managers see new promise in two areas of current development: big data and artificial intelligence. Much ink has been spilled on the topic of investment applications, discussing the potential for systems in trading, regulatory compliance and reporting, asset custody, and risk management, as well as our focus—the investment management function. Proponents describe the “Four Vs” of big data: volume, velocity, variety and veracity.

As it pertains to portfolio management, we define big data as the assembly of systems to gather and interpret the massive amounts of available information that might offer insight into securities prices—conventional information such as analysts’ research reports, corporate news and financial statements, management conference calls, and media comments. Brokerage firms issue an average of 4,000 research reports per day, in 53 languages, according to a September 2015 paper from the Scientific Active Equity Group of BlackRock Corporation. The authors observe that while humans can make better sense of the messages in such research, they can’t possibly look through it all, making advanced text analysis a requirement for managers of global portfolios hoping to stay ahead.

Big data relevant to securities prices also extends to granular, less organized, real-time information—postings to Facebook, Twitter, and other social media that could hold breaking information on consumer preferences and reactions. Motivated by academic research that links employee satisfaction to superior share performance, BlackRock reports it has devised an automated system that informs stock selection by seeking out employee sentiment on particular companies from job search web sites and social media.28 We can envision similar systems tailored to measure economic developments and sentiment at the level of industries and companies as well.
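As a purely illustrative sketch, and emphatically not a description of BlackRock’s system, the snippet below scores invented employee-review text against small positive and negative word lists and averages the result by company; production systems use far richer language models:

```python
from collections import defaultdict

# Toy sentiment lexicon and invented employee-review snippets tagged by company
POSITIVE = {"great", "growing", "innovative", "promoted", "happy"}
NEGATIVE = {"layoffs", "turnover", "toxic", "declining", "unhappy"}

posts = [
    ("AAA Corp", "great culture, promoted quickly, teams are growing"),
    ("AAA Corp", "happy here, innovative products"),
    ("BBB Inc",  "layoffs again this quarter, morale is declining"),
    ("BBB Inc",  "high turnover and a toxic atmosphere"),
]

scores = defaultdict(list)
for company, text in posts:
    words = text.lower().replace(",", " ").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    scores[company].append(score)

# The company-level averages could then enter a stock-selection model as one signal
for company, vals in scores.items():
    print(company, sum(vals) / len(vals))
```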

Goldman Sachs has posited a highly nuanced opportunity to exploit research analysts’ changes in earnings estimates, by anticipating revisions before the fact through close interpretation of the language in their reports. “[I]nvestment research analysts may sometimes be reluctant to raise or lower a price target or rating too rapidly,” the firm writes: “Analysts may instead opt to reflect new views incrementally, by changing the tone and view of the text they write in their reports. . . . [I]dentifying an analyst’s evolving views prior to the release of higher ratings potentially can provide investors advantages in the decision to buy or sell a stock.” That is, analysts tend to telegraph their punches, and observant investors may be able to benefit.

Goldman also cites the tactic of looking for unintuitive relationships in the share behavior of companies in disparate industries, based on the impact of influences such as oil prices, changing weather, or new regulations: “[These examples entail] the linkage of data which cannot be downloaded from standard market-data terminals.”29

Investment managers are understandably guarded about their views and efforts on big data: it’s a new area with an uncertain and far-off payday. From the few public reports, early adopters of big data techniques have been hedge funds, especially those with a quantitative bent (and large tech budgets). Traditional asset managers have been slower to move. At Epoch, we see significant potential in big data, and are researching the topic with some of the many consultants that have sprung up.

One project that is particularly interesting to us, although still aspirational, is a more granular analysis of the stocks we select for our portfolios. Traditional analytics have focused only on the securities that portfolio managers have purchased, comparing the return of that group to the total return of a benchmark for a quarter or year. We believe there is tremendous insight to be gained from more thorough analyses—a sort of Moneyball for portfolio managers and research analysts. (The practice started in baseball, where analysts diligently gathered and studied every at bat, run, hit, and RBI; popularized as Moneyball, the science is more properly known as sabermetrics, and its acknowledged founder is Bill James, who in 1974 established the statistical analysis committee of the Society for American Baseball Research, along with Pete Palmer and Dick Cramer.)

Ultimately, our goal is to reckon not only our “errors of commission”—our results on the securities purchased—but also our “errors of omission”—the record on those that we should have bought but did not. Likewise, for our research analysts, a system can be devised to rigorously evaluate the choices of stocks recommended or discarded, which will enable us to better focus their coverage.
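A rough sketch of the accounting involved, using invented data (a production system would control for benchmarks, risk, and the timing of decisions):

```python
import pandas as pd

# Invented research coverage: every stock the analysts examined, whether it was
# bought, and its subsequent one-year return
coverage = pd.DataFrame({
    "ticker": ["AAA", "BBB", "CCC", "DDD", "EEE", "FFF"],
    "bought": [True, True, False, False, True, False],
    "fwd_return": [0.12, -0.04, 0.25, 0.02, 0.08, 0.18],
})

held = coverage[coverage["bought"]]
passed = coverage[~coverage["bought"]]

print(f"average return of stocks purchased:    {held['fwd_return'].mean():.1%}")
print(f"average return of stocks passed over:  {passed['fwd_return'].mean():.1%}")
print(f"opportunity cost of the omissions:     "
      f"{passed['fwd_return'].mean() - held['fwd_return'].mean():+.1%}")
```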

Big data takes investment research in a new direction—compiling thousands or millions of transactions, consumer opinions, and weather observations in real time. And it’s available now, in various forms of development. Individual investors can access a rough-and-ready source of new data, for free, from Google Trends, and plot the prices of individual stocks against the frequency of various searches (such as “durable goods,” “auto sales,” or “mobile and wireless”).
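As a hedged example, a reader can download a search-interest series from Google Trends as a CSV file and line it up against a stock’s price history; the file names, column labels, and the two header rows skipped below are assumptions about a typical export rather than a documented format:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed inputs: a Google Trends export ("auto_sales_trends.csv", columns Week and
# "auto sales") and a price history ("stock_prices.csv", columns Date and Close)
trends = pd.read_csv("auto_sales_trends.csv", skiprows=2, parse_dates=["Week"])
prices = pd.read_csv("stock_prices.csv", parse_dates=["Date"])

# Align each price date with the most recent weekly search-interest reading
merged = pd.merge_asof(prices.sort_values("Date"), trends.sort_values("Week"),
                       left_on="Date", right_on="Week")

fig, ax1 = plt.subplots()
ax1.plot(merged["Date"], merged["Close"])
ax1.set_ylabel("stock price")
ax2 = ax1.twinx()
ax2.plot(merged["Date"], merged["auto sales"], color="gray")
ax2.set_ylabel("search interest")
ax1.set_title("Stock price vs. Google search interest (illustrative)")
plt.show()
```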

Artificial Intelligence

The last technological topic we consider represents a quantum leap for investing and technology—computers that can guide themselves and learn from experience, or artificial intelligence (AI). The notion of computers that can operate outside preset programs and think on their own has been around for a long time. (Depending on the source, artificial intelligence may reach back to the myths of ancient Greece, Frankenstein’s monster (1818), or the Babbage difference engine (1822).) A web site dedicated to computer pioneer Alan Turing presents a 22-page paper, written in 1948 but not published at the time, titled “Intelligent Machinery”—laying out principles resembling today’s conception of AI, and discussing how the various computing machines available at the time measured up to the challenge.30 Current efforts at AI have received plenty of attention through their accomplishments beating the best human opponents at several sorts of games, but while complex, those games take place in environments that are defined and relatively well controlled. They’re not subject to constant bombardment by new information such as China’s purchasing managers’ index, or a quick turn in the price of crude oil. What role does AI have in investment management?

For Epoch, the development and implementation of such systems are well beyond our technical capabilities at this time, but we are interested observers nevertheless. The major AI proponents as of early 2016 seem to be large and technically inclined asset managers, in particular hedge funds. That is what press reports say, at least: it’s likely that most firms working on AI are not willing to share their involvement, for fear of losing whatever proprietary edge the effort can produce.

The interest that has been reported is remarkable, however. Point72 Asset Management, a Connecticut-based hedge fund manager, is said to have hired a team of 30 in 2015 (although, as described by Bloomberg, their objective seems more oriented to big data than AI). Bridgewater Associates, currently the world’s largest hedge fund manager, in 2012 hired David Ferrucci, who led the team of engineers at IBM Corporation that developed the Jeopardy!-winning Watson supercomputer.31 In early 2016 the firm also hired as president Jon Rubinstein, said to be the primary developer of Apple Inc.’s innovative iPod music player.

“[T]he human mind has not become any better than it was 100 years ago, and it’s very hard for someone using traditional methods to juggle all the information of the global economy in their head,” said David Siegel, co-head of Two Sigma, another leading computer-powered manager, quoted in the Financial Times. “Eventually the time will come that no human investment manager will be able to beat the computer.”32

At Epoch, we disagree, and our money is on the human-computer combination: racing with the machine.
