Cliff Asness: Lies, Damned Lies, and Data Mining

by Clifford Asness, Ph.D., AQR Capital Management, Inc.

We are the whipping boy for a recent article on the dangers of data mining in our field. And the whipping is delivered largely based on an unsupported shot taken by my frequent foil and sparring partner, Rob Arnott. Before I take on this attack,[1] we need to back up a bit.

Data mining, that is, searching the data to find in-sample patterns in returns that are not real but random, and then believing you’ve found truth, is a real problem in our field. Random doesn’t tend to repeat, so data mining often fails to produce attractive real-life returns going forward. And given the rewards to gathering assets, often made easier with a good “backtest,” the incentive to data mine is great. We’ve talked about it endlessly for years and written on it many times. But we’re not nihilists who believe everything is data mining.[2],[3] We are more likely to believe in-sample evidence when it’s also accompanied by strong out-of-sample evidence (across time, geography, and asset class[4]) and an economic story that makes sense.[5] In that case, and barring exceptionally convincing evidence something has changed, we not only believe in it but will stick to it like grim death through its inevitable ups and downs. After many years of research and managing portfolios, we believe there are at least four widely known types of factors that are real (that is, they don’t just look good because of data mining).[6],[7],[8] People are often shocked that we believe in only a few core investment concepts – somehow they think there are many more. Nope. For instance: No small firm effect. No January effect. No Super Bowl effect – though if you do believe that indicator, you should be shorting stocks this year because of Tom Brady; sorry if that’s deflating.

But all of this didn’t stop us from being the cannon fodder in this new article. Are we data miners? Heck no. We’ve always explicitly stood for the opposite. The list of things with great backtests we don’t believe in is legion, and the ones we do believe in have, again, tended to work through time, in a multitude of asset classes, and across geographies, with many of these being out-of-sample tests of the original findings. But, when you are trying your best to come up with a story, you find people who will say what you need them to (a journalistic version of data mining!) about someone vaguely interesting (I guess we’re vaguely so). So the reporter asked a non-objective guy with whom I’ve feuded to opine on me and, by extension, AQR.[9],[10] It’s no secret to readers of this perspectives column, and our work in general, that we have had an intellectual dispute with Rob Arnott on the subject of whether the main factors commonly discussed are something one should time (get in/out based on how expensive they look versus history).[11] A secondary debate has indeed been whether some of the main factors in finance[12] are the result of long-term data mining. Well, perhaps because he lost the first point, Arnott upped the secondary topic to primary and unleashed on me in the Businessweek piece, leading with a tiny bit of honey about my “outstanding” prior work but then bringing on a big bowl of vinegar-flavored whoop-ass.[13] Rob says,

“I think Cliff has done some outstanding work over the years,” but adds that he’s “insufficiently skeptical about the pervasiveness of data-mining and its impact even in the factors he uses.”

That is, he says I’m a data miner. That may seem like an innocuous little comment actually prefaced with a kind of compliment. It’s not. It’s a damning accusation that’s provably false, backwards in fact. Worse, it’s a falsehood meant to deflect and confuse, as it kind of rhymes with a separate dispute we’ve been having, the “secondary debate” mentioned above – a dispute Rob’s been ducking. So, if you just read Rob’s comment on its own, by most people’s standards I’m overreacting here. Admittedly, that’s kind of my go-to move. But in the broader context of the ongoing debate, and given what a serious and backwards “shot” he really took, I think I’m reacting appropriately. Of course, I usually think that.


After Rob’s quote the article provides a response from me. They actually ran a somewhat truncated version of what I said. Here is the verbatim response I sent the reporter to Rob’s comment above:

"Rob and AQR largely believe in a very similar set of factors like value, low risk, and momentum, to which we think we've both applied a lot of a priori skepticism. Protestations otherwise are marketing tactics and reflect an ongoing confusion between factor timing, which he believes in more than we do, and long term factor efficacy."

That kind of says it all but way too briefly and calmly for my taste; hence, this longer version you’re reading now.

In the first part of the above quote I was making a very simple point. Rob and Research Affiliates publicly claim to believe in, and run investment products based on, factors that largely overlap with AQR’s. Now, for competitive reasons I wish they’d stop, but it’s a free country. Value, low risk, and momentum are all things we both believe in to various degrees, and when it comes to investing in equities, they cover a large part of what each of us does.[14] Check out how his firm describes one of its products. What exactly does he claim we believe in because of data mining that isn’t in his list here? I mean, if I and AQR are data miners, then double data mining on you, Rob![15] That he’d accuse us of being “insufficiently skeptical” about the dangers of data mining isn’t just at odds with our long history of the exact opposite, but bat**** crazy when it’s mostly the stuff he believes in too. I guess he’s hoping nobody noticed. I noticed. I notice things, particularly when they are about me and they are so very noticeable.[16]

Let me be clear. Rob doesn’t actually think the factors we at AQR believe in are data mined any more than he believes his own are data mined. That’s a smokescreen. What Rob is doing here is the time-honored strategy that the best defense is a good offense, combined with the old adage about pounding the table when you’ve got nothing. Separate from this kerfuffle, Rob has actually accused most of the field of applied finance of data mining in a very specific way, and we’ve shown he’s wrong (or at least massively exaggerating). Apparently he doesn’t like that, so we have this deflection. Please note, he’s not wrong that data mining is a big problem; everybody reasonable thinks that, certainly including me. Whoever shouts it louder at others doesn’t necessarily believe it more. But his very specific accusation against the field about a very specific type of data mining has no teeth. Unfortunately, to understand this we have to get much more into the geeky weeds. Sorry.


Rob has made claims in various papers that some of the major factors that much of the academic/practitioner world (not just AQR, and oddly again including him) has found to work historically are not just due to generic data mining but to a highly specific form of data mining he has uncovered. This is important. He’s not just crying that people data mine, which is always a dangerous possibility. He thinks he’s uncovered precisely the error they’re making. The highly specific type of data mining that he alleges is that some of these factors have richened over the long haul, leading to a one-time, long-term windfall to an investor in that factor (or, more likely, to the backtest of that factor). He claims researchers have mistaken this windfall for repeatable return. It’s a good story that can indeed apply at relatively short horizons and at the peaks or troughs of major factor bubbles/depressions. But as I exhaustively show, based on a technique from Rob and his colleagues’ own papers, his very strong assertions are just wrong. The effect he discusses just doesn’t affect the long-term results enough to matter for the factors in question. In a nutshell, if a factor richens by 100% over 50 years (let’s ignore compounding), a simplistic, and ultimately wrong, approach says the investor gets an extra 2% a year from this richening (this is indeed the approach Rob and colleagues focus on, assuming the investor gets the full 2%). If true, this could indeed bias researchers to find and love factors that have richened. But, as Rob shows himself[17] (but then promptly ignores), if a factor richens like this you don’t get nearly 2% a year in reality, and that difference is most severe for higher-turnover factors (though the difference exists for them all). The factor at the end that is 100% more expensive than the factor at the beginning is composed of very different positions. It’s not a single asset that’s richened; rather, turnover has led the factor investor to own a very different portfolio over time. As such, the investor in the richening factor benefits in a much smaller way, as some, often a lot, of the richening didn’t happen to the stocks they actually owned but to the stocks the strategy ended up owning at the end.[18] When you adjust for this, over the long haul (like the 50+ years used by most credible researchers), none of the factors Arnott and crew examine are seriously harmed by adjusting for long-term richening.
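To make that back-of-the-envelope arithmetic concrete, here is a minimal sketch in Python. The 100% richening, the 50-year window, and especially the “capture fractions” are illustrative stand-in numbers, not output of the adjustment described in the referenced work; the point is only the wedge between annualizing the full portfolio-level richening and crediting the investor with just the re-rating of positions actually held.

```python
# Toy arithmetic only: how much of a long-term "richening" windfall shows up
# per year, naively vs. if only part of it accrued to positions actually held.
# All inputs below are illustrative stand-ins, not estimates from any paper.

total_richening = 1.00   # factor ends 100% more expensive than it started
years = 50

# Simplistic attribution: hand the investor the entire richening.
naive_simple = total_richening / years                      # ~2.0% per year
naive_compound = (1 + total_richening) ** (1 / years) - 1   # ~1.4% per year

print(f"Naive windfall, ignoring compounding: {naive_simple:.2%} per year")
print(f"Naive windfall, with compounding:     {naive_compound:.2%} per year")

# Turnover means the end portfolio holds different stocks than the start
# portfolio, so only re-rating that happened to names while held is return.
# Assume, purely for illustration, that higher-turnover factors capture less.
illustrative_capture = {"lower-turnover factor": 0.5, "higher-turnover factor": 0.1}

for label, capture in illustrative_capture.items():
    per_year = capture * total_richening / years
    print(f"{label}: captures {capture:.0%} of the richening, "
          f"roughly {per_year:.2%} per year rather than {naive_simple:.2%}")
```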

Essentially, despite them (very briefly) flagging the turnover issue themselves and presenting the outline of a way to deal with it (fleshed out in great detail and utilized in my previously referenced work), they then completely ignore all this and still scream “bad researchers have data mined over factor richening and we’re here to save the day!” They are saying that the entire rest of the field has missed something huge and important and has thus misled people. They say this despite their own evidence to the contrary, which I expanded upon. I called him out here and in the last bullet here, suggesting that they either retract their assertion or prove me wrong.[19] He has ignored this as, in general, he references little relevant research by others and none of our recent work, as he’s writing repeated breathless white papers (i.e., watch out for the crash; we’re at the 85th percentile!) rather than participating in give-and-take debate (I promise I’m really OK with both the give and the take, as long as we are actually addressing each other’s points, not just taking shots in the media).

Let’s try to be super clear with a flowchart:

Arnott accuses most of the industry (academics and practitioners) of a specific form of data mining based on their mistaking factor richening for true factor return.

↓

Asness shows that this specific accusation is simply bad math (though, of course, acknowledging that other forms of data mining are always a concern). Asness repeatedly calls on Arnott to defend this specific broadside he undertook against academics and practitioners everywhere. Arnott, to date, declines.

↓

Arnott calls Cliff and AQR data miners in a recent Bloomberg/Businessweek article, presumably as a deflection from the real debate on the very specific accusation he’s made regarding data mining. Or perhaps it was just a revenge shot for past transgressions (i.e., Asness pointing out that both fundamental indexing and value-based factor timing are just systematic value investing and not new findings). Only the Shadow knows.

↓

Asness, befuddled, points out that Arnott believes in largely the same very limited, robust set of factors as AQR/Asness, and thus wonders, in some awe, how Arnott could make that particular accusation with a straight face.

↓

Arnott keeps face straight.

↓

Asness writes this screed.

Rob wants to debate “Resolved: Data mining is a real problem” with himself in the affirmative and me cast as the doubter. That’s not the debate. We both agree on that, and in fact mostly on what factors pass the “it’s not data mining” test. The actual debate is “Resolved: Researchers massively mistake factor richening for true factor return” with Rob arguing for and me against the proposition. That’s a debate I’ve done my part in but where he’s yet to engage. If he wants to, and I’m wrong, so be it. But this ain’t it, and his comments ain’t right.

At AQR we pride ourselves on minimizing data mining. Nobody in our field is perfect on this front, but we’ve had the discipline to walk away from good-looking factors we don’t trust. We have no desire to find things that have worked in the past that won’t work going forward. In many ways we’ve led this fight against data mining for many years. We believe in a small subset of things[20] that have worked out-of-sample through time, out-of-sample across geography, out-of-sample across asset class, and, importantly, are explained by an economic story that’s not just “the data says so.” So, finding myself the victim of a drive-by accusation of data mining by someone who believes in largely the same things we do, even if he occasionally renames them and claims them in the name of the Kingdom of Arnott, was pretty jarring. I would put my “respect for data” (and wood) and “sufficient cynicism about data mining” up against Rob’s any day (by the way, I’m not saying he doesn’t have it too).

Bottom line: his accusation that the industry has data mined over richening is false, his accusation that I’m a data miner is false (and particularly hypocritical), and I think he knows both of these things and says them anyway.

Aside from that, Mrs. Asness and I very much enjoyed the play.
