Catching a frisbee is difficult. Doing so successfully requires the catcher to weigh a complex array of physical and atmospheric factors, among them wind speed and frisbee rotation. Were a physicist to write down frisbee-catching as an optimal control problem, they would need to understand and apply Newton’s Law of Gravity.

Yet despite this complexity, catching a frisbee is remarkably common. […] It is a task that an average dog can master. […]

So what is the secret of the dog’s success? The answer, as in many other areas of complex decision-making, is simple. Or rather, it is to keep it simple. For studies have shown that the frisbee-catching dog follows the simplest of rules of thumb: run at a speed so that the angle of gaze to the frisbee remains roughly constant. Humans follow an identical rule of thumb.

– Andrew Haldane, “The dog and the frisbee”

Markets do not obey math.  Two stocks are not bound to wander together by some force called correlation; correlation is simply a numerical summary of historical similarity.  There are no fundamental, underlying equations dictating market behavior for quantitative models to anchor themselves to.  Rather, quantitative models seek to summarize certain salient market features and draw actionable conclusions from them.  Recognizing that quantitative models, by definition, will never be a complete summary of market complexity means that robustness must be the critical element in model design.

Simple by design™ is a core tenet of Newfound Research’s Quantitative Integrity™ philosophy.  We believe that adhering to this tenet enforces model robustness.  The belief is well rooted in econometric best practice, where the most parsimonious model is generally considered the better one.  The siren’s song of “big data” would have us believe that solutions can come from simply throwing more data at our problem.  The risk is that as the quantity of data we use increases, the relationships we uncover become more and more likely to be spurious.  The more complexity we allow in our model, the greater the likelihood that the model departs from assumptions grounded in reality and becomes an artifact of data mining.  The art of modeling is in creating the simplest possible description of a complex phenomenon that still captures the salient features we are interested in.
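
To make the data-mining risk concrete, consider a sketch (purely illustrative, not Newfound’s methodology): generate a random return series and screen a thousand equally random candidate “signals” against it.  The best in-sample correlation looks impressive; out-of-sample, it disappears.

```python
# Illustrative sketch: data mining pure noise.
# Screen many random "signals" against a random return series; with enough
# candidates, some will look strongly related in-sample purely by chance,
# and the winner's apparent edge evaporates out-of-sample.
import numpy as np

rng = np.random.default_rng(42)

n_periods = 252     # one year of daily observations
n_signals = 1000    # number of candidate signals we "mine"

returns_in = rng.normal(size=n_periods)     # in-sample returns (pure noise)
returns_out = rng.normal(size=n_periods)    # out-of-sample returns (pure noise)

signals_in = rng.normal(size=(n_signals, n_periods))
signals_out = rng.normal(size=(n_signals, n_periods))

# In-sample correlation of each candidate signal with returns
corr_in = np.array([np.corrcoef(s, returns_in)[0, 1] for s in signals_in])

best = int(np.argmax(np.abs(corr_in)))
corr_out = np.corrcoef(signals_out[best], returns_out)[0, 1]

print(f"best in-sample |correlation|: {abs(corr_in[best]):.3f}")   # typically ~0.2
print(f"same signal out-of-sample:    {corr_out:.3f}")             # typically ~0.0
```

The more candidates we screen, the more impressive the best spurious relationship will look in-sample.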

One of the mandatory assumptions that all modelers have to make is that once the model is calibrated, the underlying statistical properties of the data do not change.  In other words, we assume that the future looks like the past and that nothing about the problem we are modeling will change in an unexpected way.  The more complexity in our model, the more relationships we require to remain stable.  Unfortunately, the past is often a fragile guide to the future; complex, overfit models exacerbate this risk since they tend to sway with the slightest statistical breeze.  The more parameters we set, the more vulnerable the model is to changes in the underlying data.
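
As a rough illustration of parameter fragility (again a sketch, not our production process), compare a one-parameter trend fit with an eight-parameter polynomial fit to the same noisy series.  Nudging a handful of observations barely moves the simple model’s predictions, but it can swing the complex one’s.

```python
# Illustrative sketch: more parameters, more sensitivity to small data changes.
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(0, 1, 40)
y = 0.5 * x + rng.normal(scale=0.1, size=x.size)   # true relationship is a simple line

def fit_predict(y_obs, degree, x_eval):
    """Fit a polynomial of the given degree and evaluate it on x_eval."""
    coeffs = np.polyfit(x, y_obs, degree)
    return np.polyval(coeffs, x_eval)

x_eval = np.linspace(0, 1, 200)

# Perturb five observations slightly, as if a few data points were revised
y_perturbed = y.copy()
idx = rng.choice(x.size, size=5, replace=False)
y_perturbed[idx] += rng.normal(scale=0.1, size=5)

for degree in (1, 8):
    base = fit_predict(y, degree, x_eval)
    shifted = fit_predict(y_perturbed, degree, x_eval)
    print(f"degree {degree}: max prediction change = {np.max(np.abs(base - shifted)):.3f}")
```

The simple model sways a little; the complex one sways with the breeze.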

In complex environments, having the right model but being uncertain about the parameters is often worse than having the wrong model but knowing the parameters perfectly.  A model’s robustness is measured and tested when it faces uncertainty.  Richard Feynman said it best: “[i]t is not what we know, but what we do not know which we must always address, to avoid major failures ….”

Complexity often arises in models not by initial design, but through adjustments built up over time in response to new, contradictory data.  For example, this evolution of features occurred in Ptolemaic – or geocentric – cosmology as new astronomical observations were made.  Planetary movements that fell outside historical data (i.e. the future did not look like the past) forced the model to extrapolate into a region for which it had no prior data.  The failure of the model on this new data led to increased complexity being added to the model, such as additional epicycles and deferents.  Eventually, the model buckled under its own weight, and it took a complete teardown and the construction of a simpler, heliocentric model to successfully match the data.  At Newfound, we believe that enhancements in the face of contradictory data should focus not on what we can add, but rather on what we can remove from our model to make it more robust.

Beyond the statistical bases and econometric best practices that underlie our simple by design™ policy, there are two practical reasons for it as well.  When we bring model design into the world of active portfolio management, the data a model relies on can impose two limiting factors upon an investment strategy:

  1. Limits on when, and how frequently, you can trade
  2. Limits on what instruments you can trade

Consider a model that utilizes U.S. economic information such as CPI, industrial production and GDP.  While these values may provide insight into the economic cycle, the frequency with which they are released limits how frequently new model values can be calculated and therefore how frequently a strategy based on the model may be able to rebalance.  If the data is released monthly, the model will only be able to provide a new output monthly, reducing how adaptive the model can be to market changes.
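
A toy sketch of this frequency constraint (the CPI values and release dates below are hypothetical and simplified): even with a daily trading calendar, a signal keyed to monthly macro prints can only take a new value about once a month.

```python
# Illustrative sketch: a signal driven by monthly macro releases can only
# update when a new release arrives, no matter how often we could trade.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Daily trading calendar and a toy daily price series (available daily,
# but the signal below cannot use the extra frequency)
days = pd.bdate_range("2023-01-02", "2023-12-29")
prices = pd.Series(100 + rng.normal(size=len(days)).cumsum(), index=days)

# Hypothetical CPI prints, one per month, dated to the last trading day of
# the month (release timing is simplified for illustration)
release_dates = pd.DatetimeIndex(
    days.to_series().groupby([days.year, days.month]).max().values
)
cpi = pd.Series(np.linspace(300.0, 308.0, len(release_dates)), index=release_dates)

# Toy signal: month-over-month CPI change, carried forward until the next print
signal = cpi.pct_change().reindex(days, method="ffill")

print(f"{signal.nunique()} distinct signal values across {len(days)} trading days")
```

However often the strategy is allowed to trade, the model itself has nothing new to say between releases.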

The data also limits the instruments that can be traded.  U.S. economic data is likely far less explanatory for changes in German Bund yields than German economic data is.  Specialized data leads to specialized models.  While specialized models typically have more explanatory power, the tradeoff is that they are more likely to be over-optimized to the data they were trained on.  The assumption of data stationarity becomes far more critical than it is with a more generic model.  Specialization makes it impossible to answer questions like, “how will our model behave if the U.S. economy enters a Japanese-esque deflationary spiral?”

Simple is not easy; in fact, simple is very hard.  Simple by design™ is the practice of constantly removing, not adding, and of avoiding the allure of complexity and specialization.  Simple means leaving explanatory power on the table in favor of robustness.  We believe simple makes all the difference.

Corey is co-founder and Chief Investment Officer of Newfound Research, a quantitative asset manager offering a suite of separately managed accounts and mutual funds. At Newfound, Corey is responsible for portfolio management, investment research, strategy development, and communication of the firm's views to clients.

Prior to offering asset management services, Newfound licensed research derived from the quantitative investment models developed by Corey. At its peak, this research helped steer the tactical allocation decisions for upwards of $10bn.

Corey is a frequent speaker on industry panels and contributes to ETF.com, ETF Trends, and Forbes.com’s Great Speculations blog. He was named a 2014 ETF All Star by ETF.com.

Corey holds a Master of Science in Computational Finance from Carnegie Mellon University and a Bachelor of Science in Computer Science, cum laude, from Cornell University.

You can connect with Corey on LinkedIn or Twitter.