The rapid evolution of computer technology in the last few decades has given investment professionals (and amateurs) the capability to access and analyze tremendous amounts of financial data. Additionally, the world wide web, email, and bulletin boards make it possible for people around the globe to access this information quickly, and they provide a means for individuals to voice their opinions and interact. As a result, some of the more intriguing debates in recent years have revolved around the practice and consequences of "data mining."
Data mining involves searching through databases for correlations and patterns that differ from what would be expected to occur by chance or under random conditions. The practice of data mining is, in and of itself, neither good nor bad, and its use has become common in many industries. For instance, in an attempt to improve life expectancy, researchers might use data mining to analyze causes of death and the factors correlated with death rates. Data mining is also used by advertisers and marketing firms to target consumers. But possibly the most notorious group of data miners are stock market researchers who seek to predict future stock price movements. Most, if not all, stock market anomalies have been discovered (or at least documented) via data mining of past prices and related (or sometimes unrelated) variables.
When market-beating strategies are discovered via data mining, there are a number of potential problems in making the leap from a back-tested strategy to successful investing under future, real-world conditions. The first problem is determining the probability that the relationships occurred at random, or whether the anomaly may be unique to the specific sample that was tested. Statisticians are fond of pointing out that if you torture the data long enough, it will confess to anything.
In what is becoming an infamous example, David Leinweber went searching for random correlations to the S&P 500. Peter Coy described Leinweber's findings in a Business Week article titled "He who mines data may strike fool's gold" (6/16/97). The article discussed data mining, Michael Drosnin's book The Bible Code, and the fact that patterns will occur in data by pure chance, particularly if many factors are considered. Many cases of data mining are immune to statistical verification or rebuttal. In describing the pitfalls of data mining, Leinweber "sifted through a United Nations CD-ROM and discovered that historically, the single best predictor of the Standard & Poor's 500-stock index was butter production in Bangladesh." The lesson to learn, according to Coy, is that a "formula that happens to fit the data of the past won't necessarily have any predictive value."
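The danger Leinweber illustrated is easy to reproduce. The sketch below (purely illustrative Python using numpy; the series and parameters are invented, not Leinweber's actual data) searches a few hundred random series for the one most correlated with an equally random "index" and routinely turns up an impressive-looking relationship where none exists:

    # Purely illustrative: search many random "predictor" series for the one
    # most correlated with a random "index" series. With enough candidates,
    # a striking correlation appears by chance alone.
    import numpy as np

    rng = np.random.default_rng(0)
    n_years, n_candidates = 20, 500
    index_returns = rng.normal(size=n_years)                # stand-in for the S&P 500
    candidates = rng.normal(size=(n_candidates, n_years))   # butter production, etc.

    correlations = np.array([np.corrcoef(c, index_returns)[0, 1] for c in candidates])
    best = np.argmax(np.abs(correlations))
    print(f"best of {n_candidates} random series: correlation = {correlations[best]:+.2f}")
    # Typically reports a correlation of roughly 0.6 or more in absolute value,
    # even though no true relationship exists.

A conventional significance test applied to the winning series alone would look impressive; the problem is that hundreds of series were tried before it was chosen.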
Back testing has always been a suspect class of information . . . When you look backwards, you're only going to show what's good.
Barry Miller (SEC) in What's the Stock Market Got to Do with the Production of Butter in Bangladesh? from Money (March 1998)

Anomalies discovered through data mining are considered more significant the longer the time period covered, and if the anomaly can be confirmed in out-of-sample tests over different time periods and in comparable markets (for instance, on foreign exchanges). If an anomaly is discovered in back tests, it's also important to determine how costs (transaction costs, the bid-ask spread, and impact costs for institutional traders) would reduce the returns. Some anomalies are simply not realizable. See the Value Line anomaly and implementation shortfall for more on this topic. Additionally, strategies that have worked in the past may simply stop working as more investors begin investing according to the strategy. See the Efficient Market Hypothesis for more on this topic.
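As a rough illustration of these checks, the sketch below (Python, with hypothetical numbers not taken from any study cited here) splits a strategy's annual excess returns into the period in which the rule was "discovered" and a later out-of-sample period, and nets out an assumed trading cost for each annual rebalance:

    # Hypothetical sketch: compare in-sample and out-of-sample excess returns
    # after an assumed annual rebalancing cost. All figures are invented.
    import numpy as np

    def net_annual_excess(excess_returns, cost_per_rebalance):
        """Average annual excess return minus the assumed trading cost."""
        return np.mean(excess_returns) - cost_per_rebalance

    rng = np.random.default_rng(1)
    in_sample = rng.normal(0.05, 0.10, size=24)      # period in which the rule was found
    out_of_sample = rng.normal(0.00, 0.10, size=24)  # later period with no real edge

    for label, sample in (("in-sample", in_sample), ("out-of-sample", out_of_sample)):
        net = net_annual_excess(sample, cost_per_rebalance=0.02)
        print(f"{label}: net excess return of roughly {net:+.2%} per year")

A rule that only survives this comparison in the period used to find it is a strong candidate for being a data-mining artifact.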
The Motley Fool has been praised by many for offering educational advice to individual investors (for instance, the Motley Fool offers sound recommendations in advising investors to buy and hold stocks, to be wary of stock brokers' and analysts' conflicts of interest, and to be wary of unrealistic performance claims). But the Motley Fool's "Foolish Four" stock strategy and its underlying rationale have drawn criticism.
In 1997, BYU Professors Grant McQueen and Steven Thorley coauthored a paper in the Financial Analysts Journal (FAJ) that questioned the immensely popular Dogs of the Dow strategy (Abstract). Having already gathered the data to analyze the Dow Dogs, the professors followed up by making a case study in data mining out of the Motley Fool's Foolish Four. McQueen and Thorley analyzed the Foolish Four as described in The Motley Fool Investment Guide (MFIG), but the Fools actually have multiple variations of the Foolish Four (see also the Foolish Four explained and Foolish Four History). That research resulted in another article, "Mining Fool's Gold," published in the March/April 1999 issue of the Financial Analysts Journal. In the spirit of the Fool's entertaining and creative writing style, the professors have posted a "lighthearted" version of the paper (in WordPerfect) on the BYU server. The data used in the study can be downloaded here.
McQueen and Thorley include a full explanation of the potential pitfalls of data mining, and they conducted out-of-sample tests on the Foolish Four. The professors reason that data mining can be detected by examining the complexity of the trading rule, the lack of a coherent story or theory, the performance of out-of-sample tests, and the adjustment of returns for risk, transaction costs, and taxes. Additionally, they argue that the Foolish Four and Dow Ten trading rules have become popular enough to impact stock prices at the turn of the year.
The Motley Fool has posted a spirited response to the FAJ paper in their Foolish Four portfolio reports, which are accessible in their 1999 archives. See the reports dated 5/10, 5/11, 5/12, 5/13, 5/14, 5/17, 5/18, 5/19, 5/20, and 5/21. These responses include several counterarguments to the FAJ paper as well as acknowledgements of valid issues discussed in the paper.
While many of the issues are debatable, the real acid test and critical finding of the FAJ paper was an out-of-sample test of Foolish Four returns from 1949 to 1972. For that period the Foolish Four beat the Dow 30 by an average of only 0.32% per year, with substantially more risk. Not only did the strategy underperform the Dow Dogs over that period, but after transaction costs and accounting for risk it clearly would have lagged the DJIA. This critical issue was discussed briefly in the report dated 5/14.
To put this issue in perspective, consider an investor at the start of 1973 looking back at the DJIA performance over the preceding 24 years. It's difficult to rationalize how an investor could have known at that time that the Foolish Four would produce market-beating returns going forward.
In another out-of-sample test, McQueen and Thorley used the base 1973-1996 period discussed in MFIG, but rebalanced in July rather than January. Under those conditions the Foolish Four beat the DJIA by only 2.95% per year on average, substantially lower than the 12.23% advantage with January rebalancing.
In defense of the Fools, several disclosures were at least made in MFIG and on the web site. In the Foolish Four report dated 8/7/98, they disclose that returns were lower when rebalancing occurred in months other than January. Additionally, MFIG repeatedly cites a 25.5% return figure from a twenty-year period, but it does at least mention that the numbers were researched back to 1961, and that over the longer time period the returns dropped to 18.35%. On the other hand, once it is disclosed that a longer period was studied, continuing to cite the stronger short-term numbers and basing arguments on that data can certainly be viewed as suspect. Disclosing and focusing on longer-term results tends to increase the credibility of a data miner's argument.
Jason Zweig voiced his opinion of the Foolish Four and shared his own data-mined "Very Stupid" and "Extra Dumb" portfolios in False Profits from Money magazine (August 1999). On the Morningstar web site you can also read John Rekenthaler's opinion in Just foolin' around, as well as investment advisor William Bernstein's opinions in an article titled Mined: All Mined (see also James O'Shaughnessy's response and the ensuing debate).
In December 2000, The Motley Fool announced that they no longer advocate the "Foolish Four" stock strategy, which they had created. See Re-thinking the Foolish Four for the rationale behind the Fools' decision to stop recommending a strategy they had touted for years on their web site and in their books.
Moving on to another data mining debate, William Brock, Josef Lakonishok, and Blake LeBaron (BLL) published an article titled "Simple Technical Trading Rules and the Stochastic Properties of Stock Returns" in the December 1992 edition of the Journal of Finance. The study is one of the few academic papers to document a successful trading strategy based on technical analysis (see Technical Anomalies for a complete discussion of the article). The professors demonstrated that both moving averages and support-and-resistance tools had predictive value relative to the Dow Jones Industrial Average over the period from 1897 to 1986.
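For readers unfamiliar with such rules, the sketch below shows the general flavor of a simple moving-average rule in Python. It is not BLL's exact specification; the 200-day window and the simulated prices are assumptions made purely for illustration:

    # Illustrative moving-average rule: hold the index when its price is above
    # a trailing 200-day average, otherwise stay in cash. Prices are simulated.
    import numpy as np

    def moving_average_rule(prices, window=200):
        """Daily strategy returns: long when price > trailing MA, else flat."""
        prices = np.asarray(prices, dtype=float)
        daily_returns = np.diff(prices) / prices[:-1]
        strategy_returns = np.zeros_like(daily_returns)
        for t in range(window, len(daily_returns)):
            ma = prices[t - window:t].mean()      # trailing average, known at day t
            if prices[t] > ma:                    # signal uses no future information
                strategy_returns[t] = daily_returns[t]
        return strategy_returns

    rng = np.random.default_rng(2)
    prices = 100 * np.cumprod(1 + rng.normal(0.0003, 0.01, size=2000))
    strategy = moving_average_rule(prices)
    buy_hold = np.prod(1 + np.diff(prices) / prices[:-1]) - 1
    print(f"buy and hold: {buy_hold:+.1%}")
    print(f"200-day MA rule: {np.prod(1 + strategy) - 1:+.1%}")

On random-walk prices like these, the rule has no genuine edge; any apparent advantage in a single simulation is exactly the kind of sample-specific pattern the data-mining critique warns about.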
Data-Snooping, Technical Trading Rule Performance, and the Bootstrap is an article that revisits the BLL paper and will appear in the October 1999 edition of the Journal of Finance. In the article, Ryan Sullivan, Allan Timmermann, and Halbert White (STW) attempt to determine the effect of data snooping on the BLL results. They also use data collected from the period following the original study (the BLL data ran through 1986) in order to provide an out-of-sample test. Adding the recent years provided a full 100 years of data. STW calculated a break-even transaction cost level of 0.27% per trade for the best-performing trading rule over the full period.
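A break-even figure of this kind can be understood with simple arithmetic: it is roughly the per-trade cost at which a rule's cumulative excess return is exactly consumed by trading costs. The sketch below uses made-up inputs and simplifies whatever STW actually computed:

    # Simplified break-even transaction cost: the per-trade cost that would
    # exactly offset a rule's average annual excess return. Inputs are
    # illustrative and do not reproduce the STW methodology.
    def break_even_cost(mean_annual_excess, trades_per_year):
        """Per-trade cost that wipes out the strategy's annual excess return."""
        return mean_annual_excess / trades_per_year

    # e.g. 0.8% average annual excess return with 4 round trips per year
    print(f"{break_even_cost(0.008, trades_per_year=4):.2%} per trade")   # prints 0.20%

If actual trading costs exceed the break-even level, the rule's paper profits cannot be captured in practice.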
Since the original BLL data covered an extremely long period of almost 90 years, one might expect the strategies to perform well in the out-of-sample tests. But the study's conclusions may end up being cited as another example supporting the Efficient Market Hypothesis. STW found "that the results of BLL appear to be robust to data-snooping . . . However, we also find that the superior performance of the best trading rule is not repeated in the out-of-sample experiment covering the period 1987-1996" and "there is scant evidence that technical trading rules were of any economic value during the period 1987-1996." This offers another caveat for stock market data miners and active investors: even if an anomaly worked in the past over very long periods of time, and even if the results do not appear to suffer from the pitfalls of data snooping, once the anomaly is discovered it may cease to work going forward.
Reasonable people can have a reasonable difference of opinion without it becoming an issue of ethics or faith.
Leigh Steinberg in Winning With Integrity

Alarming Efficiency (RR) from Dow Jones Asset Management (5-6/99) is an interesting article that discusses data mining and the problem of "overfitting." Included are comments from investment industry veterans David Shaw, Ted Aronson, and Robert Arnott. The article argues that given a finite amount of historical data and an infinite number of complex models, uninformed investors might be lured into "overfitting" the data. Patterns that are assumed to be systematic may actually be sample-specific and therefore of no value.
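The overfitting problem is easy to demonstrate on synthetic data. In the Python sketch below (pure noise, no market data), more complex models fit the original sample ever more closely while gaining nothing on a fresh sample drawn from the same structureless process:

    # Overfitting demonstration on pure noise: higher-degree polynomials fit
    # the "historical" sample better and better, but do no better on an
    # independent test sample, because there is no real pattern to find.
    import numpy as np

    rng = np.random.default_rng(3)
    x = np.linspace(-1, 1, 30)
    train = rng.normal(size=30)   # "historical" data: pure noise
    test = rng.normal(size=30)    # independent sample: also pure noise

    for degree in (1, 4, 9):
        coeffs = np.polyfit(x, train, degree)
        fit = np.polyval(coeffs, x)
        in_mse = np.mean((train - fit) ** 2)
        out_mse = np.mean((test - fit) ** 2)
        print(f"degree {degree}: in-sample MSE {in_mse:.2f}, out-of-sample MSE {out_mse:.2f}")

The in-sample error shrinks with every added parameter, which is exactly why an impressive back-test fit, by itself, proves very little.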
People are coming to us all the time with trading strategies that reportedly make very large excess returns . . . But the vast majority of the things that people discover by taking standard mathematical tools and sifting through a vast amount of data are statistical artifacts.
David Shaw in Alarming Efficiency (RR) from Dow Jones Asset Management (5-6/99)

Aronson argues that the market is "nearly totally efficient" and that "You're fooling yourself if you think you'll outguess the other guy by more than about 51% or 52% of the time." Aronson believes that investors searching for market inefficiencies have reduced the potential profit from those anomalies to the equivalent of transaction costs. If that is the case, minimizing transaction costs is critical in attempting to beat the market.
So are there any anomalies that have been confirmed in out-of-sample tests? In another forthcoming Journal of Finance article, James L. Davis, Eugene F. Fama, and Kenneth R. French argue that the answer is a definite yes. Companies with low price-to-book-value ratios outperform, and the pattern has been documented in both US and foreign markets. In Characteristics, Covariances, and Average Returns: 1929 to 1997, the authors go a big step further by documenting the returns of low price-to-book-value stocks from 1929 to 1963. For that earlier period, the value premium was even larger (0.50% per month) than in the more recent July 1963 to June 1997 period (0.43% per month).
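For perspective, those monthly premiums compound to meaningful annual figures (simple compounding, before any risk adjustment), as this small calculation shows:

    # Annualizing the monthly value premiums quoted above by compounding.
    for label, monthly in (("1929-1963", 0.0050), ("7/1963-6/1997", 0.0043)):
        annual = (1 + monthly) ** 12 - 1
        print(f"{label}: {monthly:.2%} per month is about {annual:.1%} per year")

That works out to roughly 6.2% per year for the earlier period and 5.3% per year for the later one.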
In the end, do we ever really know for sure what strategies will outperform in the future? Opinions on that question definitely vary, but the standard disclaimer applies as always. Past performance is no guarantee of future performance.
- Statistics Resources
- An Introduction to Data Mining
- What data mining is - and isn't
- Data Mining
- Data Mining Glossary from Two Crows Corp
- Sixteen Tons of Information Overload from Fortune (8/2/99)
- Looking for Patterns ($$) from the Wall Street Journal (6/21/99)
- Gallery Of Statistics Jokes

Additional mathematical discussions are included in the Cherry Picking, Stock Market Scam, and Coin-Flipping pages.