
ETF Product Development: Innovation Versus Over-Engineering


  • Building on decades of academic research into quantitative investing, ETF sponsors have rolled out ETFs that mechanically track a single- or multi-factor investment strategy.  The first generation of such ETFs focused on simple themes such as dividend yield/growth and low volatility.  More recent innovation has produced multi-strategy ETFs built on multiple layers of complex rules designed to achieve a specific outcome.
  • As the ETF product landscape has exploded with new product ideas, innovation can easily turn into over-engineering: the complexity embedded within an ETF strategy, combined with ‘tight coupling’ among its components, can produce disappointing results for the investor.
  • When evaluating a complex, multi-strategy ETF, one should analyze the individual components of the strategy, how they interact with each other, and the scenarios in which the strategy can ultimately fail because of that complexity and interdependence.
  • When it comes to strategic beta or complex ETFs, know what you’re buying and why you’re buying it.  What is this ETF designed to achieve, and how does it fit within your asset allocation?  If simpler solutions are available to achieve a similar objective, opt for simplicity over complexity.

ETF Product Development: When Innovation Turns into Over-Engineering

This quarter’s volatility has produced some early victims, notably some high-profile hedge funds and quantitative market-neutral strategies.  As a former quantitative equity portfolio manager, I lived through the infamous August 2007 Quant Meltdown, when highly levered strategies using a combination of value and momentum were forced to liquidate all at once, causing significant losses in what had historically been strong-performing strategies.  With this quarter’s sharp underperformance of similar strategies, the concern floating around trading desks is whether we’re seeing another forced unwind, particularly in strategies focused on price momentum, which had been among the better performers in recent periods.  In discussions with a handful of capital markets desks, I don’t get the sense that there is as much leverage employed today as there was in 2007, but one never knows until the counter-trend unwind exhausts itself.

In some ways, the August 2007 Quant Meltdown served as an early warning of the fragility of capital markets that culminated in the Great Financial Crisis of 2008.  Much has been written about this period (and more recently filmed, as in The Big Short), but I highly recommend reading A Demon of Our Own Design by Richard Bookstaber, who formerly headed firm-wide risk management at Salomon Brothers.  In a nutshell, Bookstaber maintains that any system characterized by ‘complexity’ and ‘interdependence and tight coupling’ is prone to normal accidents, whether nuclear power plants or leveraged financial vehicles tied to the performance of subprime mortgage-backed derivatives.  It’s a cautionary tale, particularly for Wall Street, whose lifeblood is tied to ever-increasing innovation that can quickly mutate into over-engineering.  Complexity, when combined with leverage, leaves financial markets more prone to liquidity-driven accidents and contagious selling across unrelated market segments.  Correlations spike toward one, so diversification no longer helps unless the investment program is invested only in safe assets.

ETF Innovation: Knowing What You’re Buying Is Becoming Increasingly Difficult

This quarter’s market-neutral meltdown partly inspired this blog post, but the primary inspiration was an analysis of a recently introduced multi-strategy ETF designed to provide U.S. equity market exposure with lower volatility and greater risk-adjusted returns.

First, ‘smart’ beta (factor) investing is not ‘smart’ at all, but simply a reformulation of the Dimensional Fund Advisors (DFA) strategy of investing in areas of the market that have afforded higher risk premia over the long run.  The ‘small cap’ and ‘value’ factors outperform the market because they carry higher risks for which investors are compensated over the long run – these factors are no ‘smarter’ than a traditional market-cap-based approach such as the S&P 500.  Corey Hoffstein of Newfound Research published a recent piece in which he makes this astute observation about smart beta investing:

“It is important to point out that for the long-term premiums to exist in these factors, they must be volatile over time. The excess return generated by one investor is at the detriment of another.  If the returns were not time-varying, they would be viewed as “free.” In that case, there would be significant money inflow into the style, driving up prices and valuations and driving down forward expected returns until the premium converged to zero. Quite simply, volatility in the premium itself causes weak hands to fold, passing the premium to the strong hands that remain. [Underline Emphasis Added by 3D]”

Smart beta investing is not a free lunch; it carries real risks, such that the largest harvests of risk premia occur when weaker investors are bailing out at just the wrong time.

Now, ETF product innovation is a good thing: it has afforded ordinary investors access to market segments and themes once available only to institutional investors.  It has captured the systematic elements of many actively managed strategies and has helped expand the list of options for DFA-minded investors who no longer wish to be constrained to the Fama/French three-factor world.

But new entrants to ETF sponsorship, along with greater participation from institutional investors, have resulted in a new cycle of product innovation characterized by increased complexity.  ‘Dynamic’ management of market exposures represents the latest innovation.  Rather than providing a static exposure to, say, currency hedging, the ETF sponsor implements a rules-based dynamic hedging scheme designed to generate superior risk-adjusted performance over a static hedged or fully unhedged equivalent.

The chase for ‘dynamic’ management introduces a new layer of complexity into the underlying exposure an ETF is designed to deliver.  This brings us back to the analysis of a recently launched ETF whose objective is to generate superior risk-adjusted returns over traditional asset classes through a combination of long and short positions, where the short exposure is dynamically managed.  Consider the components:

  1. ‘Long’ position weightings are based on a multi-factor approach combining value and growth metrics.
  2. The long weighting is further adjusted for its volatility characteristics (lower volatility stocks receive an incrementally higher weighting).
  3. The short exposure is designed to hedge out equity market risk.  It is implemented using short S&P equity futures.
  4. These same value and growth metrics are used to determine the hedge ratio where the ETF can be 0%, 50%, or 100% hedged to market risk. 
    1. Some variants of this approach provided by other ETF sponsors use a separate top-down business cycle indicator to determine the hedge ratio or base the hedge ratio on momentum-driven technical analysis.
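To make the layering concrete, here is a minimal Python sketch of the four steps above.  Everything in it – the 50/50 value/growth blend, the inverse-volatility scaling, and the signal cutoffs for the 0%/50%/100% hedge – is an invented illustration for exposition, not the actual ETF’s methodology.

```python
def composite_score(value_score, growth_score):
    """Step 1: blend value and growth metrics into one long-side signal.
    The 50/50 blend is an assumed weighting."""
    return 0.5 * value_score + 0.5 * growth_score


def volatility_adjusted_weights(scores, vols):
    """Step 2: tilt weights toward lower-volatility names by scaling
    each score inversely by volatility, then normalizing to 100%."""
    raw = [s / v for s, v in zip(scores, vols)]
    total = sum(raw)
    return [r / total for r in raw]


def hedge_ratio(market_score):
    """Steps 3-4: map the aggregate value/growth signal to a 0%, 50%,
    or 100% short S&P futures hedge.  Cutoffs (0.6, 0.3) are invented."""
    if market_score > 0.6:
        return 0.0   # strong signal: run unhedged
    elif market_score > 0.3:
        return 0.5   # mixed signal: half-hedged
    return 1.0       # weak signal: fully hedged


# Illustrative three-stock universe with made-up scores and volatilities
scores = [composite_score(v, g) for v, g in [(0.8, 0.4), (0.5, 0.7), (0.2, 0.3)]]
vols = [0.15, 0.25, 0.35]
weights = volatility_adjusted_weights(scores, vols)
ratio = hedge_ratio(sum(scores) / len(scores))
```

Even this toy version shows the coupling: the same value/growth scores drive both the long-side weights and the hedge ratio, so an error in the factor inputs propagates through every layer at once.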

Now consider the complexity embedded in this approach, as well as the interlocking dependencies that make it more vulnerable to the kind of accidents found in Bookstaber’s narrative.  The ETF investor must ask, “What exposure am I ultimately buying with this ETF?”  The answer is not straightforward, because the exposure is contingent on how the ETF is positioned given the latest market conditions.  In addition, multiple things have to go right for this ETF to achieve its objective: the choice of factors must be correct; how the factors are mixed and weighted must be correct; and the hedge ratio must be correct (market timing is a historically dubious exercise).
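A back-of-the-envelope calculation illustrates why stacking contingencies is so punishing.  The component count and success probability below are illustrative assumptions, not a claim about any specific fund: if a strategy depends on four independent design choices, each with a 70% chance of being right, the odds that all four work out fall below one in four.

```python
def joint_success(p_each, n_components):
    """Probability that all n independent components succeed, assuming
    each succeeds with probability p_each (independence is itself an
    optimistic assumption; correlated failures make things worse)."""
    return p_each ** n_components


# Four coupled choices (factor selection, factor mix, volatility
# adjustment, hedge timing), each assumed 'right' 70% of the time:
p_all_right = joint_success(0.70, 4)   # roughly 0.24
```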

But what is this ETF ultimately trying to achieve?  It is chasing that elusive equity market free lunch – the high capital market returns associated with equity investing, but without as much risk.  That sort of objective flies in the face of capital markets pricing theory, and one would expect the opposite result, namely lower risk-adjusted returns once the ETF’s complexities and underlying fees are factored in.  Many aspects of this strategy would have to perform consistently well for the ETF to achieve its objective, whereas the failure of just one aspect can result in underperformance or an accident (especially if combined with leverage).

When it comes to strategic beta or complex ETFs, it is imperative to know what you’re buying and why you’re buying it.  Ask yourself, “What is this ETF designed to achieve, and how does it fit within my asset allocation?”  If simpler solutions are available to achieve a similar objective, opt for simplicity over complexity.  Product innovation is welcome in a growing marketplace, but a balance must be struck between innovation and system complexity.  That is the elegance of design.

By: Benjamin Lavine