
Time Dilation

This post is available as a PDF download here.

Summary

In the 2014 film Interstellar, Earth has been plagued by crop blights and dust storms that threaten the survival of mankind. Unknown interstellar beings have opened a wormhole near Saturn, creating a path to a distant galaxy and the potential for a new home for humanity.

Twelve volunteers travel into the wormhole to explore twelve potentially hospitable planets, all located near a massive black hole named Gargantua. Of the twelve, only three reported back positive results.

With confirmation in hand, the crew of the spaceship Endurance sets out from Earth with 5,000 frozen human embryos, intent on colonizing the new planets.

After traversing the wormhole, the crew sets down upon the first planet – an ocean world – and quickly discovers that it is actually inhospitable. A gigantic tidal wave kills one member of the crew and severely delays the lander’s departure.

The planet’s close proximity to the supermassive black hole subjects it to extreme time dilation effects. The positive beacon that had been tracked had perhaps been triggered just minutes prior on the planet. For the crew, the three hours spent on the planet amounted to over 23 years on Earth. The crew can only watch, devastated, as their loved ones age before their eyes in the video messages received – and never responded to – during their multi-decade absence.


Our lives revolve around the clock, though we do not often stop to reflect upon the nature of time.

Some aspects of time tie to corresponding natural events. A day is simply reckoned from one midnight to the next, reflecting the Earth’s full rotation about its axis. A year, which reflects the length of time it takes for the Earth to make a full revolution around the Sun, also corresponds to a full set of seasons.

Others, however, are seemingly more arbitrary. The twenty-four-hour day derives from the ancient Egyptians, who divided daytime into ten hours bookended by two twilight hours, with night divided into a further twelve. The division of an hour into sixty minutes comes from the Babylonians, who used a sexagesimal (base-60) counting system.

We impose the governance of the clock upon our financial system as well. Public companies prepare quarterly and annual reports. Economic data is released on a scheduled monthly or quarterly cadence. The trading day for U.S. equity markets is defined as the hours between 9:30am and 4:00pm ET.

In many ways, our imposition of the clock upon markets creates a natural cadence for the flow of information.

Yet, despite our best efforts to impose order, information most certainly does not flow into the market in a constant or steady manner.

New innovations, geopolitical frictions, and errant tweets all represent idiosyncratic events that can reshape our views in an instant. A single event can be of greater import than all the cumulative economic news that came before it; just consider the collapse of Lehman Brothers.

And much like the time dilation experienced by the crew of the Endurance, a few harrowing days in 2008 may have felt longer than the entirety of a tranquil year like 2017.

One way of trying to visualize this concept is by looking at the cumulative variance of returns. Given the clustered nature of volatility, we would expect to see periods where the variance accumulates slowly (“calm markets”) and periods where the variance accumulates rapidly (“chaotic markets”).

When we perform this exercise – by simply summing squared daily returns for the S&P 500 over time – we see precisely this. During market environments that exhibit stable economic growth and little market uncertainty, we see very slow and steady accumulation of variance. During periods when markets are seeking to rapidly reprice risk (e.g. 2008), we see rapid jumps.

Source: CSI Data. Calculations by Newfound Research.
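For readers who want to replicate this exercise, a minimal sketch might look like the following (assuming a pandas Series of daily closing prices called `prices`; the variable names and data source are illustrative, not the exact code behind the figure above):

```python
import numpy as np
import pandas as pd

def cumulative_variance(prices: pd.Series) -> pd.Series:
    """Accumulate squared daily log returns as a crude measure of realized variance.

    Calm regimes accumulate variance slowly; chaotic regimes accumulate it in bursts.
    """
    log_returns = np.log(prices).diff().dropna()
    return (log_returns ** 2).cumsum()

# Hypothetical usage:
# prices = pd.read_csv("sp500.csv", index_col=0, parse_dates=True)["close"]
# cumulative_variance(prices).plot()
```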

If we believe that information flow is not static and constant, then sampling data on a constant, fixed interval will mean that during calm markets we might be over-sampling our data and during chaotic markets we might be under-sampling.

Let’s make this a bit more concrete.

Below we plot the adjusted closing price of the S&P 500 and its 200-day simple moving average. Here, the simple moving average aims to estimate the trend component of price. We can see that during the 2005-2007 period it estimates the underlying trend well, while in 2008 it dramatically lags the price decline.

Source: CSI Data. Calculations by Newfound Research.

The question we might want to ask ourselves is: why are we looking at the prior 200 days? Or, more specifically, why is a day a meaningful unit of measure? We already demonstrated above that it very well may not be: one day might be packed with economically-relevant information and another entirely devoid of it.

Perhaps there are other ways in which we might think about sampling data. We could, for example, sample data based upon cumulative volume intervals. Another might be on a fixed number of cumulative ticks or trades. Yet another might be on a fixed cumulative volatility or variance.

As a firm which makes heavy use of trend-following techniques, we are particularly partial to the latter approach, as the volatility of an asset’s price relative to the strength of its trend should inform the trend lookback horizon. If we think of trend following as the trading strategy that replicates the payoff profile of a straddle, increased volatility levels will decrease the delta of the option positions, and therefore decrease our position size. One interpretation of this effect is that increased volatility decreases our certainty of where price will fall at expiration, and therefore we need to decrease our sensitivity to price movements.

If that all sounds like Greek, consider this simple example. Assume that price follows a highly simplified model as a function of time:
$P_t = m \cdot t + a \cdot \sin(t)$

There are two components to this model: the linear trend, $m \cdot t$, and the noise, $a \cdot \sin(t)$.

Now let’s assume we are attempting to identify whether the linear trend is positive or negative by using a simple moving average (“SMA”) of price:
$\text{SMA}_t = \frac{1}{n} \sum_{i=0}^{n-1} P_{t-i}$

To determine if there is a positive or a negative trend, we simply ask whether our current SMA value is greater or less than the prior SMA value. For a positive trend, we require:
$\text{SMA}_t > \text{SMA}_{t-1}$

Substituting our above definition of the simple moving average:
$\frac{1}{n} \sum_{i=0}^{n-1} P_{t-i} > \frac{1}{n} \sum_{i=0}^{n-1} P_{t-1-i}$

When we recognize that most of the terms on the left also appear on the right, we can re-write the whole comparison as the new price entering the SMA being greater than the old price dropping out of the SMA:
$P_t > P_{t-n}$

Which, through substitution of our original definition, leaves us with:
$m \cdot t + a \cdot \sin(t) > m \cdot (t - n) + a \cdot \sin(t - n)$

Re-arranging a bit, we get:
$m \cdot n > a \cdot \left( \sin(t - n) - \sin(t) \right)$

Here we use the fact that sin(x) is bounded between -1 and 1, meaning that:
$\sin(t - n) - \sin(t) \leq 2$

Assuming a positive trend (m > 0), we can replace the right-hand side with this worst-case value, which gives us a bound on the required SMA length:
$m \cdot n \geq 2a \quad \Longrightarrow \quad n \geq \frac{2a}{m}$

To quickly test this result, we can construct a simple time series where we assume a=3 and m=0.5, which implies that our SMA length should be greater than 11. We plot the time series and SMA below. Note that the SMA is always increasing.
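As a quick numerical check of that claim, here is a minimal sketch under the toy model above (the parameter values come from the text; the SMA length of 12 follows from the n ≥ 2a/m bound, and the variable names are illustrative):

```python
import numpy as np

# Toy model: P_t = m * t + a * sin(t), with a = 3 and m = 0.5.
a, m = 3.0, 0.5
t = np.arange(0, 100)
price = m * t + a * np.sin(t)

# The worst-case bound n >= 2a / m implies an SMA length of at least 12.
n = 12
sma = np.convolve(price, np.ones(n) / n, mode="valid")

# Despite the oscillating noise, the SMA never decreases.
assert np.all(np.diff(sma) >= 0)
```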

Despite being a highly simplified model, it illuminates that our lookback length should be a function of noise versus trend strength. The higher the ratio of noise to trend, the longer the lookback required to smooth out the noise. On the other hand, when the trend is very strong and the noise is weak, the lookback can be quite short.1

Thus, if trend and noise change over time (which we would expect them to), the optimal lookback will be a dynamic function. When trend is much weaker than noise, our lookback period will be extended; when trend is much stronger than noise, the lookback period shrinks.

But what if we transform the sampling domain? Rather than sampling price every time step, what if we sample price as a function of cumulative noise? For example, using our simple model, we could sample when cumulative noise sums back to zero (which, in this example, will be the equivalent of sampling every 2π time-steps).2

Sampling at that frequency, how many data points would we need to estimate our trend? We need not even work out the math as before; a bit of analytical logic will suffice. In this case, because we know the cumulative noise equals zero, we know that a point-to-point comparison will be affected only by the trend component. Thus, we only need n=1 in this new domain.
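A minimal sketch of that logic, again under the toy model (the horizon of 20 samples is arbitrary and illustrative):

```python
import numpy as np

a, m = 3.0, 0.5
two_pi = 2 * np.pi

# Sample the toy model only every 2*pi time-steps, where the cumulative
# noise sums back to zero.
t = np.arange(0, 20) * two_pi
sampled_price = m * t + a * np.sin(t)

# Point-to-point differences reflect only the trend: each equals m * 2*pi
# (up to floating-point error), so a single comparison (n = 1) is enough
# to recover the sign of the trend.
assert np.allclose(np.diff(sampled_price), m * two_pi)
```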

And this is true regardless of the parameterization of trend or noise. Goodbye, dynamic lookback function!

Of course, this is a purely hypothetical – and dramatically over-simplified – model. Nevertheless, it may illuminate why time-based sampling may not be the most efficient practice if we do not believe that information flow is constant.

Below, we again plot the S&P 500 as well as a standard 200-day simple moving average.

We also sample prices of the S&P 500 based upon the cumulative magnitude of log differences, approximating a cumulative 2.5% volatility move. When the market exhibits low volatility, the process samples price less frequently. When the market exhibits high volatility, it samples more frequently. Finally, we plot a 200-period moving average based upon these samples.

We can see that by sampling in a different domain – in this case, the log-difference space – we can generate a process that reacts dynamically in the time domain. During the calm markets of 2006 and early 2007, the 200-period moving average behaves like the 200-day simple moving average, whereas during the 2008 crisis it adapts to the changing price level far more quickly.
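A rough sketch of this sampling scheme might look like the following (assuming a daily price series `prices`; the 2.5% threshold matches the text, but the function name, the reset-to-zero accumulation, and the forward-fill back into the time domain are illustrative assumptions rather than the exact methodology behind the chart):

```python
import numpy as np
import pandas as pd

def sample_on_cumulative_move(prices: pd.Series, threshold: float = 0.025) -> pd.Series:
    """Sample price each time cumulative absolute log returns reach `threshold`.

    Calm markets trigger samples infrequently; volatile markets trigger them often.
    """
    log_returns = np.log(prices).diff().fillna(0.0)
    sample_times = []
    accumulated = 0.0
    for timestamp, r in log_returns.items():
        accumulated += abs(r)
        if accumulated >= threshold:
            sample_times.append(timestamp)
            accumulated = 0.0
    return prices.loc[sample_times]

# Hypothetical usage: a 200-sample moving average in the new domain,
# carried forward so it can be plotted against daily prices.
# sampled = sample_on_cumulative_move(prices)
# vol_domain_ma = sampled.rolling(200).mean().reindex(prices.index, method="ffill")
```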

By changing the domain in which we sample, we may be able to create a model that is dynamic in the time domain, avoiding the time-dilation effects of information flow.

Conclusion

Each morning the sun rises and each evening it sets. Every year the Earth travels in orbit around the sun. What occurs during those time spans, however, varies dramatically day-by-day and year-by-year. Yet in finance – and especially quantitative finance – we often find ourselves using time as a measuring stick.

We find the notion of time almost everywhere in portfolio construction. Factors, for example, are often defined by measurements over a certain lookback horizon and reformed based upon the decay speed of the signal.

Even strategic portfolios are often rebalanced based upon the calendar. As we demonstrated in our paper Rebalance Timing Luck: The Difference Between Hired and Fired, fixed-schedule rebalancing can invite tremendous random impact in our portfolios.

Information does not flow into the market at a constant rate. While time may be a convenient measure, it may actually cause us to sample too frequently in some market environments and not frequently enough in others.

One answer may be to transform our measurements into a different domain. Rather than sampling price based upon the market close of each day, we might sample price based upon a fixed amount of cumulative volume, trades, or even variance. In doing so, we might find that our measures now represent a more consistent amount of information flow, despite representing a dynamic amount of data in the time domain.

  1. In practice, estimating parameters for “noise” and “trend” can be rather complicated. However, strategies that employ a volatility targeting approach may have an advantage, as one parameter becomes fixed.
  2. If this isn’t immediately obvious, consider that the integral of sin(x) from 0 to 2π is zero.

Corey is co-founder and Chief Investment Officer of Newfound Research. Corey holds a Master of Science in Computational Finance from Carnegie Mellon University and a Bachelor of Science in Computer Science, cum laude, from Cornell University. You can connect with Corey on LinkedIn or Twitter.
