Dennis Gartman, author of The Gartman Letter, was on CNBC saying he has never seen anything like the recent sharp sell-off in Gold.  It reminds me why we don't use predictive systems.

My father has always said to me, "Plans are useless, planning is everything."  The problem with plans is that they assume life follows the exact sequence you predict -- or want -- it to.  In a non-linear system, any slight deviation from that sequence can dramatically shift the outcome.

In my experience, prediction-based systems are only useful when the data is stationary -- or, for adaptive systems, at least near-term stationary.  In other words, prediction is possible only when the data you are predicting over looks like the data you've trained on.

Consider that for each dimension of complexity you add to your model (e.g. each explanatory variable), the number of data-points required to ensure a "well-rounded" dataset grows exponentially: if we want 10 uniformly spread data-points per variable and we have n explanatory variables, we need 10^n data-points to cover the entire set; at 4 variables, that's 10,000 data-points.  Achieving a well-rounded dataset is nearly impossible in practice because of the curse of dimensionality: the volume of the data-space grows so rapidly that the available data becomes sparse, making statistical significance over the entire space difficult to achieve.
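
To make the arithmetic concrete, here is a minimal sketch (using the 10-values-per-variable grid from the example above and a hypothetical fixed dataset of 10,000 observations) of how quickly the required data grows and how thinly a fixed dataset gets spread as variables are added:

```python
# A rough illustration of the curse of dimensionality, using the figures
# from the text: 10 uniformly spread values per explanatory variable and a
# hypothetical dataset of 10,000 observations.
FIXED_DATASET = 10_000

for n in range(1, 7):
    points_needed = 10 ** n  # 10^n points to cover n variables
    # Best case: every observation lands in a different cell of the grid.
    coverage = min(FIXED_DATASET / points_needed, 1.0)
    print(f"{n} variable(s): {points_needed:>9,} points needed; "
          f"at most {coverage:.1%} of the space covered by 10,000 observations")
```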

And, as the saying goes, the only constant is change.

So not only does the amount of data required for statistical significance grow exponentially, but there is no guarantee that we've even seen the full range of possible values for a given variable.  Consider developing a prediction model that uses the VIX: pre-2008, you would only have data under 50 to train on.  Unless you stress-tested your model with values above 50, 2008 would have been "uncharted territory."
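
As a hedged illustration of that point (synthetic data only, not any actual model), the sketch below fits a flexible model on hypothetical VIX readings drawn from below 50 and then asks it to predict at levels it has never seen:

```python
# Synthetic illustration: a model trained only on VIX levels below 50 has no
# data behind anything it says about the VIX at 60 or 80.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: VIX between 10 and 45 with a made-up, noisy
# "signal" as the target (a stand-in for whatever the model would predict).
vix_train = rng.uniform(10, 45, size=500)
signal_train = 0.05 - 0.001 * vix_train + rng.normal(0, 0.005, size=500)

# Fit a flexible (cubic) polynomial; flexible models tend to extrapolate worst.
model = np.poly1d(np.polyfit(vix_train, signal_train, deg=3))

for vix in (20, 40, 60, 80):
    region = "inside training range" if vix <= 45 else "uncharted territory"
    print(f"VIX {vix:>2}: predicted signal {model(vix):+.4f}  ({region})")
```

Whatever numbers it prints for 60 and 80 are pure extrapolation; nothing in the training data constrains them.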

It is these facts that make me avoid prediction-based modeling -- when I hear, "We've never seen anything like this," I think: "That's another broken prediction model."

Corey is co-founder and Chief Investment Officer of Newfound Research, a quantitative asset manager offering a suite of separately managed accounts and mutual funds. At Newfound, Corey is responsible for portfolio management, investment research, strategy development, and communication of the firm's views to clients.

Prior to offering asset management services, Newfound licensed research from the quantitative investment models developed by Corey. At peak, this research helped steer the tactical allocation decisions for upwards of $10bn.

Corey is a frequent speaker on industry panels and contributes to ETF.com, ETF Trends, and Forbes.com’s Great Speculations blog. He was named a 2014 ETF All Star by ETF.com.

Corey holds a Master of Science in Computational Finance from Carnegie Mellon University and a Bachelor of Science in Computer Science, cum laude, from Cornell University.

You can connect with Corey on LinkedIn or Twitter.