Fed: Goodhart's Law and the Thresholds

Published on January 10, 2014

For the Federal Reserve, the main takeaway from this morning’s almost nonsensical Non-Farm Payroll numbers was not only that their higher than normal noise-to-signal ratio rendered them almost meaningless, but that they further undermine the credibility of the 6.5% headline unemployment threshold.

A mere 74,000 net jobs added is being explained away by the cold weather, while a headline unemployment rate that dropped to 6.7% – a mere 0.2 percentage points from the threshold – is likewise being explained away by a plunge in the labor participation rate, which is falling so rapidly for God knows what reason exactly.
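To put rough numbers on that noise-to-signal problem, the sketch below works through the arithmetic under one loudly flagged assumption: a 90% confidence interval of about ±90,000 jobs on the monthly payroll change, in line with the sampling error the BLS has cited for the establishment survey. The trend figure is an illustrative placeholder, not a Fed estimate.

```python
# Rough arithmetic on payroll noise. ASSUMPTION: a 90% confidence
# interval of +/-90,000 jobs on the monthly change, roughly the
# sampling error the BLS has cited for the establishment survey.
Z90 = 1.645          # two-sided 90% z-score
CI90 = 90_000        # assumed 90% confidence interval on the monthly change
se = CI90 / Z90      # implied standard error, about 55,000 jobs

nfp_print = 74_000   # the December headline print
trend = 190_000      # illustrative recent monthly trend, an assumption

low, high = nfp_print - CI90, nfp_print + CI90
print(f"90% range around the print: {low:,} to {high:,} jobs")
print(f"standard errors from zero:  {nfp_print / se:.1f}")           # ~1.4
print(f"standard errors from trend: {(nfp_print - trend) / se:.1f}") # ~-2.1
```

On those assumptions, the print is statistically indistinguishable from zero job growth yet roughly two standard errors below trend – precisely the kind of crossed signal that invites the weather excuse.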

But whatever the take, the conclusion among many Fed officials is that Goodhart’s Law has come to the thresholds, further denting the reliability and credibility of yet another “rule(ish)” guide to monetary policy.

*** Rising uncertainty over the merits of the 6.5% unemployment threshold, and to some extent over whether the central bank’s near-term reaction function is rules-based or discretionary, is stoking market volatility as much as the confusing, crossed-signal data itself. A headline unemployment rate that could cross the threshold within the next few months – when the Fed has barely started to taper its monthly bond purchases, much less begun debating the timing of a first rate hike – is putting a premium on clarity from Fed officials on the likely reaction function once the labor market threshold is indeed crossed. ***

*** The credibility of the unemployment threshold was already a topic of debate at the December Federal Open Market Committee meeting and it will be even more so at the January meeting. It will be difficult to reach a consensus on what to do with the thresholds then, but we do expect those discussions to set in motion a steady stream of speeches laying out the reaction function options that will be on the table for a formal revamp at the March FOMC meeting. ***

*** The easiest but least intellectually honest solution would be simply to punt by lowering the threshold to 6%. More likely is a pointer to the wider array of labor market data, such as the hire/fire rates or the labor participation rate among others, or perhaps a shift in focus to a three or six month rolling average of the sort sketched below. But some FOMC members will press for abandoning the thresholds altogether. ***
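As a hedged illustration of that rolling-average option – the window length and the monthly figures below are hypothetical placeholders, not actual BLS data – smoothing the headline rate before comparing it to the threshold would look something like this:

```python
# Illustrative sketch of the rolling-average option. The monthly
# unemployment prints below are hypothetical, not actual BLS data.
THRESHOLD = 6.5  # headline unemployment threshold, in percent

def trailing_average(series, window):
    """Trailing moving average; None until the window has filled."""
    return [
        None if i + 1 < window else sum(series[i + 1 - window:i + 1]) / window
        for i in range(len(series))
    ]

monthly = [7.3, 7.2, 7.0, 6.8, 6.7, 6.6]        # hypothetical prints
smoothed = trailing_average(monthly, window=3)

for raw, avg in zip(monthly, smoothed):
    if avg is None:
        print(f"print {raw:.1f}  (window still filling)")
    else:
        note = "below" if avg < THRESHOLD else "above"
        print(f"print {raw:.1f}  3-month avg {avg:.2f}  ({note} threshold)")
```

The design point is simply that a single noisy print – like today’s – could not drag the smoothed measure through the threshold on its own.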

Rules Versus Discretion

Charles Goodhart, an adviser to the Bank of England who later became a professor at the London School of Economics, made the case in the 1970s that as soon as a particular data series or measure is used as a target for policy, the underlying behavior alters, soon rendering it useless as a target. The law was made famous in the criticism of the monetarism of the late 1970s and early 1980s, when the government of Margaret Thatcher adopted narrow and broad measures of the money supply to guide monetary policy.

To some extent, the same fatal flaw is becoming apparent in the Fed’s unemployment threshold, which was already being questioned for its reliability due to the influence of a hard to predict or understand labor participation rate. Indeed, a debate has been reopened inside the Fed over the merits of the thresholds altogether, and whether they are a misguided attempt at a rules-based reaction function to guide market expectations when, in fact, by insisting the 6.5% unemployment rate and 2.5% inflation rate are “thresholds” and not “triggers,” the FOMC wanted to retain enough discretion to change its mind.

It was certainly a factor, judging by the Minutes, in the decision in December to discard the notion of lowering the 6.5% headline unemployment threshold to, say, 6%. Behind the concerns over its credibility if the threshold is changed is a nagging uncertainty over just where the natural trend level of unemployment lies, and whether the labor participation rate is likely to drop further still or rise along the lines of the assumptions built into the Fed’s forecasting models.

Lowering the threshold closer to the assumed longer run trend level was thus seen as a bit too risky at this point, at least until the Fed forecasters have a better handle on the falling labor participation rate – how much of it is cyclical rather than structural, and thus an argument for an amply accommodative monetary policy to spur further job growth and demand without excessive wage inflation.

For now, the Fed would perhaps prefer to stick with the more qualitative guidance on the other side of the threshold, simply affirming that it expects rates to be kept low well after the threshold is crossed. That expectation could be reinforced by including the wider array of labor market data points the FOMC is looking at anyway to gauge labor market improvements – the hire/fire rates, openings, the quits rate, the participation rate, etc. – albeit with some wordy awkwardness in the formal statement.

But in light of the crucial importance of forward guidance, one option being weighed by Fed officials, particularly the more dovishly inclined, is to lean more on the low inflation readings to anchor the forward guidance on lower for longer rates.

But the same Goodhart problems may soon emerge with the inflation measures, as no one is quite sure why the low inflation has persisted for so long against modeled expectations or, for that matter, how quickly or exactly why the trend line in inflation could reverse and begin a steadier ascent. And if inflation seems to be approaching the 2.5% safeguard inflation threshold sooner than projected, how will it alter the rates guidance? We can be pretty certain how the market would trade on it.

Perhaps in the case of today’s NFP number, the net jobs added figure is not truly representative of the underlying labor market trends, or perhaps the headline rate is the false signal. Either way, the noise-to-signal ratio is making a mockery of the monthly data point as a reliable guide to policy, and indeed doesn’t say a whole lot beyond the mantra of a “data-dependent” policy path – when is policy not data-dependent anyway?

Fed officials are indeed weighing how and when to offer far more clarity on the forecasting process, and how the most recent data points, with all their revisions and noise, are fed into the Fed’s giant forecasting models. Implicit in that, further clarification of the underlying assumptions in the optimal control framework can also be expected. In one form or another, they still want to put up some easy to grasp guide or threshold to anchor expectations of the central bank’s reaction function. At the zero lower bound, with the higher-cost QE being wound down, forward guidance on rates is the primary policy tool, at least until the great rate tightening finally gets underway.
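For readers less familiar with the optimal control framework, it is usually presented – in Yellen’s own speeches among others – as choosing the path of the policy rate that minimizes a quadratic loss over the inflation and unemployment gaps. The weights and discounting below are the generic textbook form, an illustrative assumption rather than the Fed’s published specification:

```latex
% Generic quadratic loss behind optimal control exercises; the weight
% \lambda and discount factor \beta are illustrative assumptions here,
% not the Fed's published parameters.
\[
  \min_{\{i_t\}} \; \sum_{t=0}^{\infty} \beta^{t}
  \left[ (\pi_t - \pi^{*})^{2} + \lambda \, (u_t - u^{*})^{2} \right]
\]
% \pi^* is the 2% inflation goal, u^* the assumed natural rate of
% unemployment, and the policy rate path {i_t} is constrained by the
% dynamics of the staff forecasting model.
```

The practical upshot is that everything hinges on u* – the very number the falling participation rate is making so hard to pin down – which is why clarifying those assumptions matters as much as any threshold.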

Until then, in some ways, the dilemma Janet Yellen will face as she takes the helm of the central bank in March mirrors the one Paul Volcker faced when he arrived in 1979. Volcker knew what he wanted to do – raise rates to crush the surging inflation – and was searching for a measure to guide policy and expectations in breaking a wage-price spiral, even though doing so would inevitably push the economy into a deep recession. Some thirty-five years later, Yellen is instead facing high unemployment rather than inflation, and is searching for the right means to steer market expectations and shield a still nascent recovery from sliding back into an even deeper recession.

Hints of Yellen’s thinking about guidance, the thresholds, and the optimal control framework will inevitably be fleshed out in her first press conference in March, and perhaps in more general terms at her first Humphrey-Hawkins testimony in February. But a showcase speech soon may be the ideal venue.
