Kevin Warsh, the nominee for Federal Reserve chair, has called for a substantial reassessment of the tools used to gauge inflation. That call landed against a backdrop of sharply divergent readings from different inflation measures for the same recent month, underscoring how difficult it is to identify an agreed-upon underlying inflation trend.
In February, the Fed's preferred inflation gauge - the personal consumption expenditures price index, the measure used by the central bank for its 2% target - showed inflation running nearly a percentage point above that goal. Yet other calculations for the same month produced markedly different impressions.
The Dallas Federal Reserve, using a trimmed mean approach that discards extreme price moves - such as an annualized 384% surge in moving services and a 50% annualized plunge in calculators and typewriters - estimated inflation at about 2.3%, close to target. The Cleveland Fed's Center for Inflation Research, which emphasizes median price changes, produced a result similar to the Bureau of Economic Analysis' headline number. By contrast, a more elaborate statistical model from the New York Fed put trend inflation higher still, at about 3.1%.
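To make the "annualized" figures above concrete: an annualized rate compounds a single month's price change over twelve months. A minimal sketch of the standard compounding formula (the function name is ours, not an official BLS or BEA definition):

```python
def annualize(monthly_pct: float) -> float:
    """Compound a one-month percentage change into an annualized rate.

    For example, a one-month jump of about 14% compounds to roughly
    the 384% annualized surge cited for moving services.
    """
    return ((1 + monthly_pct / 100) ** 12 - 1) * 100
```

The compounding is why a modest monthly outlier becomes an enormous annualized one: `annualize(14.0)` is roughly 382. Observations like these are exactly what trimming is designed to discard.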
Those contrasts illustrate the central difficulty Warsh highlighted at his confirmation hearing - that different, equally reasonable procedures for separating short-lived price swings from the sustained underlying trend can produce materially different policy signals. Omair Sharif, president and founder of Inflation Insights, criticized the notion that a single overlooked measure would provide the right answer. "Said like somebody who has not been in the building for a while and has not looked at inflation research since he left the Fed" in 2011 after serving as governor, Sharif said.
Sharif noted that the trimmed mean measure Warsh mentioned at the hearing currently shows the lowest inflation among commonly used measures. That makes it more consistent with the rate cuts President Donald Trump has called for. However, Sharif cautioned that the trimmed mean missed the broadening and acceleration of inflation in 2021 - a shift captured by other, more widely used statistics - suggesting it can understate inflation when the distribution of price changes shifts.
The trimmed mean maintained by the Dallas Fed is constructed by removing items with outsized price moves and averaging the remainder. This can give a clearer picture when many prices move erratically in opposite directions. But when the distribution of price changes becomes skewed - when far more items post unusually fast increases than unusual declines - the trimming cuts away part of the genuine signal, and the metric may understate the true underlying rise in prices.
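A weighted trimmed mean of this kind can be sketched as follows. The trim fractions, items, and weights below are hypothetical for illustration; the Dallas Fed's actual series trims the weighted distribution asymmetrically and uses detailed PCE components:

```python
def trimmed_mean(items, lower_trim=0.25, upper_trim=0.25):
    """Weighted trimmed mean of price changes.

    items: list of (annualized_pct_change, expenditure_weight) pairs.
    Sorts by price change, discards the bottom `lower_trim` and top
    `upper_trim` shares of total expenditure weight, and averages the
    middle that remains. (Symmetric 25%/25% trims are illustrative only.)
    """
    items = sorted(items)  # order items by price change
    total = sum(w for _, w in items)
    lo, hi = lower_trim * total, (1 - upper_trim) * total
    kept_sum = kept_weight = cum = 0.0
    for change, weight in items:
        start, end = cum, cum + weight
        cum = end
        # portion of this item's weight that falls inside the kept band
        overlap = max(0.0, min(end, hi) - max(start, lo))
        kept_sum += change * overlap
        kept_weight += overlap
    return kept_sum / kept_weight


# Hypothetical price changes (% annualized) with expenditure weights:
# one extreme plunge and one extreme surge dominate the raw average.
changes = [(-50.0, 0.05), (1.5, 0.30), (2.0, 0.35), (3.0, 0.25), (384.0, 0.05)]
raw_mean = sum(c * w for c, w in changes) / sum(w for _, w in changes)
print(round(raw_mean, 1))               # 18.6 - distorted by the outliers
print(round(trimmed_mean(changes), 1))  # 2.0  - the trimmed reading
```

The same mechanics also show the failure mode Sharif described: on a right-skewed set of changes such as `[(1.0, 0.10), (2.0, 0.25), (4.0, 0.30), (8.0, 0.25), (30.0, 0.10)]`, the trimmed mean (about 4.4) comes in well below the raw weighted mean (about 6.8), because the trim discards part of a fat right tail that is genuine inflation, not noise.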
For market participants and outside observers, one practical appeal of well-established gauges is familiarity. Krishna Guha, vice chair at Evercore ISI, wrote that the benefit of sticking with measures like core PCE - personal consumption expenditures excluding food and energy - is that they are better known in markets and are easier to explain without detailed statistical references. Core PCE is commonly regarded as a superior guide to trend inflation, even though the Fed's formal 2% target is defined in terms of headline PCE.
Warsh did not call for an explicit change to the Fed's 2% target at his hearing, but he did say focus should be "left of the decimal point," language that suggests some tolerance for inflation modestly above the objective.
Institutional resistance and the challenge of change
Warsh framed his remarks as a challenge to the "tyranny of the status quo," arguing for fresh frameworks, tools, and communications. But the history and structure of the Federal Reserve make such reforms difficult.
The Fed encompasses the Washington-based Board of Governors and its cadre of board economists, 12 regional Federal Reserve Banks, each with its own staff and analytic perspective, and a broad ecosystem of former officials and outside researchers who are regularly drawn into internal debates. That mix can slow the adoption of new approaches. Current Chair Jerome Powell, for example, shared some of Warsh's criticisms of the Fed's communications toolbox but was largely rebuffed by other policymakers and staff when he proposed changes last year.
Change is not impossible. Research by Governor Christopher Waller and staff economist Andrew Figura in 2022, for instance, helped shift the debate by showing why the fast pace of Fed rate hikes was less likely to produce massive unemployment than many had feared, easing concerns among policymakers about raising rates to control inflation. And in 2020 the Fed adopted a markedly different framework - a move that demonstrated the institution's capacity to embrace new ideas.
That 2020 framework also illustrated the risks of innovation. Officials abandoned it last year after concluding it had downplayed inflation at precisely the moment policymakers needed to respond, delaying policy tightening. The episode is a cautionary note for any future redesign: novel frameworks can both improve outcomes and introduce new blind spots.
Technology, productivity, and the limits of certainty
Warsh has argued that advances like artificial intelligence could raise potential output and thus reduce inflationary pressures, a contention he advanced during questioning from lawmakers. He did not commit to a timeline for interest-rate moves and was careful to hedge the outlook for short-term rates. "We don't know that. We can't bank on that," he said, urging the Fed to do considerable work to evaluate the productivity implications of AI.
Current officials have expressed a similar caution: while AI and related technologies will change parts of the economy, the speed and scale of those effects are uncertain and therefore hard to incorporate into current policy settings. Lawmakers at the hearing warned that misreading the productivity effects of AI could produce a policy mistake that would raise inflation.
Data work under way and the path forward
Fed officials and regional bank researchers have already experimented with a range of methods to tease out persistent inflation from one-off shocks. During the pandemic surge in prices, officials sliced the data in multiple ways and explored alternative "big data" sources - the sorts of private price feeds Warsh referenced - to try to detect the underlying signal in noisy monthly reports.
Sharif observed that government agencies have been integrating more actual price data from private sources into official indexes. He suggested a pragmatic route for Warsh to speed better statistics would be to persuade Congress to increase funding and staff for ongoing projects at agencies such as the Bureau of Labor Statistics. "I think what he was going for was we need to understand the data collection for all types of prices. That is a laudable goal. It is what BLS has been doing for many years," Sharif said. "But I don't think you get to some big new trend you never thought of. What he should have said is what everyone says. We will look at a variety of things to get a handle on inflation."
In short, the technical work is extensive and options abound. The trimmed mean offers one lens that currently reads inflation more mildly. Median measures and headline statistics tell a different story. Some sophisticated models show trend inflation higher still. Choosing among them - or devising a superior composite approach - demands careful research, resources for better price collection, and significant institutional buy-in.
Bottom line
Warsh's push for a fresh inflation framework brings to the foreground an unresolved dilemma at the Fed: a single month of data can produce sharply different policy implications depending on the metric used to interpret it. Reform advocates argue for more nuanced signal extraction, while skeptics warn of the dangers of adopting new measures that may miss shifting patterns in price behavior. The Fed's past willingness to innovate shows change is possible, but the institution's mixed record in recent years also illustrates the costs of moving too quickly without broad consensus and better underlying data.