Inflation leveled off below 3% in 2024 and has drifted around the 3% level in 2025. The rate of increase in the core PCE (personal consumption expenditures) deflator is the inflation measure of most interest to the Federal Reserve as a policy reference, but advances in the core CPI (Consumer Price Index) have settled at about the same level. The core inflation rates exclude food and energy prices due to the volatility of those components, but even with food and energy, inflation in the PCE and the CPI has been running near 3%.
It’s a 2% Target… Or Is It?
The Fed continues to maintain that its “official” inflation target is 2% for the core PCE. However, the central bank is now easing policy despite inflation running a full percentage point faster than the target. The rationale turns on the Fed’s dual mandate to maintain both “price stability” and full employment, goals that are not always compatible.
Currently, the labor market is showing signs of weakness, so the Fed has elected to ease policy by guiding the federal funds rate downward, and by putting a stop to run-off in its balance sheet holdings of securities. The latter ends a brief period of so-called quantitative tightening.
Just a couple of months ago, the central bank announced a new emphasis on targeting 2% inflation in the long run, with notable differences from the “flexible average inflation targeting” (FAIT) that it claimed to have adopted in 2020. In some respects, the Fed appeared to be giving more primacy to the “2%” definition of price stability than to the full employment mandate. Yet the “new approach” still allows plenty of wiggle room and might not differ much from the approach followed prior to FAIT.
“… an asymmetric approach to the dual mandate: It would implement makeup policy on misses below the inflation target, and it would respond to shortfalls from maximum employment. These asymmetries, while well-intended, created an inflationary bias that caused FAIT to fail the ‘stress test’ of the 2021–22 inflation surge. This failure caused the Fed to effectively abandon FAIT in early 2022 and become a single-mandate central bank focused on price stability.”
Scott Sumner says the Fed never really practiced FAIT to begin with. It should have been a symmetric policy, but it wasn’t. During 2021-22, the Fed did not attempt to correct for rising inflation. Instead, it focused on the recessionary effects of Covid and the impingements of Covid-era restrictions on employment.
Clearly, Covid was a shock that monetary policy was ill-suited to address without reinforcing inflation. Furthermore, the Fed believed the pandemic inflation would prove transitory, but continuing to ease policy was a critical error. Stimulating demand via monetary accommodation gave inflation more permanence than the Fed apparently expected.
Lost In the Tea Leaves Again
While a strong commitment to price stability is welcome, it’s not clear that’s what’s guiding the Fed’s decisions at the moment. Again, the Fed’s preferred inflation gauge has flattened out at around 3%. However, with uncertainty about tariffs and tariff pass-throughs in 2026, the weak dollar, and unrelenting Treasury borrowing, easier monetary conditions could well set the stage for persistent inflation above 3%, despite the official 2% target. That might help explain the failure of longer-term interest rates to decline in the wake of the Fed’s latest quarter-point cut in the federal funds target in October.
Suspicious Minds
Speculation that the Fed is allowing its true inflation target to creep upward is hardly new. Back in June, former New York Fed economist Robert Brusca noted the following:
“A Cleveland Fed survey already has the business community thinking that the REAL target for inflation is 2.5%.”
More recently, Mark Sobel of the Official Monetary and Financial Institutions Forum stated that the real target, for now, is probably 3%:
“But could the Fed stealthily and unintentionally end up near 3%? Even apart from above-target inflation in recent years, short- and longer-term structural forces are at play that could usher in slightly higher inflation, notwithstanding Fed speeches on the sanctity of the 2% inflation target.”
Chewing On Data
It’s pretty clear that the Fed has become skittish about the pace of the real economy, lending more weight to the full employment part of its dual mandate. Employment growth slowed over the past year, partly due to government employee buy-outs and separations of illegal immigrants from their employers. The last official employment report was in early September, however, so the nonfarm payroll data is two months out-of-date:
Private payroll growth from ADP over the past two months has not looked especially encouraging:
Tariffs and weakened profit margins have likely had a contractionary effect, and the six-week government shutdown that just ended will shave 0.5% or more off fourth quarter GDP growth. Furthermore, while money (M2) growth has accelerated over the past year, it remains fairly restrained.
And the monetary base has been pretty flat for most of 2025:
We’ll see where these aggregates go from here. The extended “restraint” might now be of some concern to the Fed, given recent doubts about employment and economic growth. Still, in October, Fed Chairman Jerome Powell said that another quarter-point cut in the federal funds rate target in December was not a foregone conclusion. That statement seems to have worried equity investors while offering little solace to bond investors.
Aborted Landing
If (and as long as) the Fed gives primacy or greater weight in its policy deliberations to employment than to inflation, it might as well have adopted an inflation target of 3% or more. The additional erosion in purchasing power wrought by that leniency is bad enough, but the effect of monetary policy on the real side of the economy is more poorly understood than its effect on nominal variables. The Fed’s shift in priorities is both unreliable on the real side and dangerous in terms of price stability. These concerns are even more salient given the upcoming appointment (in May) of a new Fed Chairman by President Trump, who seems eager for easy money.
Tariffs have far-reaching effects that strike some as counter-intuitive, but they are real forces nevertheless. Much like any selective excise tax, tariffs reduce the quantity demanded of the taxed good; buyers (importers) pay more, but sellers of the good (foreign exporters) extract less revenue. Suppose those sellers happen to be the primary buyers of what you produce. Because they have less to spend, you also will earn less revenue.
The Lerner Effect
The imposition of tariffs by the U.S. means that foreigners have fewer dollars to spend on exports from the U.S. (as well as fewer dollars to invest in U.S. assets like Treasury bonds, stocks, and physical capital). That much is true without any change in the exchange rate. However, lower imports also imply a stronger dollar, further eroding the ability of foreigners to purchase U.S. exports.
The implications of the import tariff for U.S. exports may be even more starkly negative. Scott Sumner discusses an economic principle called Lerner Symmetry: a tax on imports can be the exact equivalent of a tax on exports! That’s because two-directional trade flows rely on two-directional flows of income.
Note that this has nothing to do with foreign retaliation against U.S. trade policy, although that will also hurt U.S. exporters. Nor is it a consequence of the very real cost increase that tariffs impose on U.S. export manufacturers who require foreign inputs. That’s a separate issue. Lerner Symmetry is simply part of the mechanics of trade flows in response to a one-sided tariff shock.
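To make the mechanics concrete, here is a stylized sketch of my own (made-up round numbers, and the simplest possible case: balanced trade, no capital flows, no retaliation). Foreigners can only buy U.S. exports with the dollars they earn selling to the U.S., so whatever the tariff does to imports, it does to exports as well:

```python
# Stylized Lerner Symmetry illustration (hypothetical round numbers).
# Assumes balanced trade, no capital flows, and no retaliation, so the
# only dollars foreigners can spend on U.S. exports are the dollars
# they earn selling imports to the U.S.

def trade_flows(imports_before, tariff_cut):
    """U.S. imports and exports after a tariff shrinks import spending."""
    # Assume the tariff reduces import spending by the fraction 'tariff_cut'
    # (a simplifying assumption about demand, not an empirical estimate).
    imports_after = imports_before * (1 - tariff_cut)
    # With no capital flows, every dollar foreigners earn comes back as
    # purchases of U.S. exports, so exports shrink by the same amount.
    exports_after = imports_after
    return imports_after, exports_after

imports_before = 100.0  # say, $100 billion of imports matched by $100 billion of exports
imports_after, exports_after = trade_flows(imports_before, tariff_cut=0.20)

print(f"Imports: {imports_before:.0f} -> {imports_after:.0f}")
print(f"Exports: {imports_before:.0f} -> {exports_after:.0f}")
# Both fall by 20%: the import tariff bites exporters just as an export tax would.
```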
Assumptions For Lerner Symmetry
Scott Sumner enumerated certain conditions that must be in place for full Lerner Symmetry. While they might seem strict, the Lerner effect is nevertheless powerful under relaxed assumptions (though somewhat weaker than full Lerner Symmetry).
As Sumner puts it, while full Lerner Symmetry requires perfect competition, nearly all markets are “workably competitive”. In the longer run, assumptions of price flexibility and full employment are anything but outlandish. Complete non-retaliation is an unrealistic assumption, given the breadth and scale of the Trump tariffs. Some countries will retaliate, but not all, and it is certainly not in their best interests to do so. The assumption of balanced trade is one and the same as the assumption of no capital flows; a departure from these “two” assumptions weakens the symmetry between tariffs and export taxes, because a reduction in capital flows takes up some of the slack from lower revenue earned by foreign producers.
Trump Tariff Impacts
So here we are, after large hikes in tariffs and perhaps more on the way. Or perhaps more exceptions will be carved out for favored supplicants in return for concessions of one kind or another. All that is economically and ethically foul.
But how are imports and exports faring? Here I’ll quote the Yale Budget Lab’s (YBL) September 26th report on tariffs, which includes the chart shown at the top of this post:
“Consumers face an overall average effective tariff rate of 17.9%, the highest since 1934. After consumption shifts, the average tariff rate will be 16.7%, the highest since 1936. …
The post-substitution price increase settles at 1.4%, a $1,900 loss per household.”
The “post-substitution” modifier refers to the fact that price increases caused by tariffs would be somewhat larger but for consumers’ attempts to find lower-priced domestic substitutes. Suppose the PCE deflator ends 2025 with a 2.8% annual increase. The YBL’s price estimate implies that absent the Trump tariffs, the PCE would have increased 1.4%. If that seems small to you (and the tariff effect seems large to you), recall that monetary policy has been and remains moderately restrictive, so we might have expected some tapering in the PCE without tariffs.
We also know that the early effects of the tariffs have been dominated by thinner margins earned by businesses on imported goods. Those firms have been swallowing a large portion of the tariff burden, but they will increasingly attempt to pass the added costs into prices.
But back to the main topic … what about exports? Unfortunately, the data is subject to lags and revisions, so it’s too early to say much. However, we know exports won’t decline as much as imports, given the lack of complete Lerner symmetry. YBL predicts a drop in exports of 14%, but that includes retaliatory effects. In August the WTO predicted only about a 4% decline, which would be about half the decline in imports.
Seeking Compensatory Rents
Perhaps more telling, though it may or may not be a better indicator of the Lerner effect, is the clamoring for relief by American farmers who face diminished export opportunities. As Tyler Cowen says, “Lerner Symmetry Bites”. Other industries will feel the pinch, but many are likely preoccupied with the more immediate problem of increases in the direct cost of imported materials and components.
The farm lobby is certainly on its toes. The Trump Administration is now asking U.S. taxpayers to subsidize soybean producers to the tune of $15 billion. Those exporting farmers are undoubtedly victimized by tariffs. But so much for deficit reduction! More from Cowen:
“Using tariff revenue to subsidize the losses of exporters is a textbook illustration of Lerner Symmetry because the export losses flow directly from the tax on imports! The irony is that President Trump parades the subsidies as a victory while in fact they are simply damage control for a policy he created.”
A List of Harms
Tariffs are as distortionary as any other selective excise tax. They restrict choice and penalize domestic consumers and businesses, whose judgments of cost and quality happen to favor goods from abroad. Tariffs create cost and price pressures in some industries that both erode profit margins and reduce real incomes. For consumers, a tariff is a regressive tax, harming the poor disproportionately.
Tariffs also diminish foreign flows of capital to the U.S., slowing the long-term growth of the economy as well as productivity growth and real wages. And the Lerner effect implies that tariffs harm U.S. exporters by reducing the dollars available to foreigners for purchasing goods from the U.S. In these several ways, Americans are made worse off by tariffs.
We now see attempts to cover for the damage done by tariffs by subsidizing the victims. A “tariff dividend” to consumers? Subsidies to exporters harmed by the Lerner effect? In both cases, we would forego the opportunity to pay down the bloated public debt. Thus, the American taxpayer will be penalized as well.
Stablecoins are a very hot topic, and not only among crypto enthusiasts. This is “Crypto Week” in Congress, but current activity in the stablecoin (SC) space ranges from an explosion of transactions and issuance by banks and other institutions to plans for issuance by other businesses like large retailers, the introduction of new embedded SC features, laws affirming the right of use in non-crypto transactions, regulatory maneuvers, and central bank scrutiny.
The Digital Money Realm
An SC is a digital asset convertible to currency at a value pegged to some other asset with a stable market value. SCs are almost all pegged to the dollar, but they can be algorithmically pegged to a basket of currencies, Treasury securities, gold, silver, or other commodities, or a combination of various kinds of assets. Still, it’s thought that the growth of SCs will reinforce the dollar’s position as the world’s dominant currency.
SCs had their genesis and are still primarily used in the settlement of transactions involving cryptocurrencies and in cross-border transactions. They function as a store of value and provide investors exposure to the underlying asset(s), but they are increasingly seen as transactions media as well. They offer a direct channel to instant settlement without other intermediaries and with low transaction costs.
Unfortunately, the purported stability of SCs has not always held up. In 2022, the collapse of the SC Terra/Luna demonstrated that a run on an SC is a real risk. Pending legislation in the U.S. will attempt to address this risk (see below). Tether is the dominant SC on the market today, and its issuer, Tether Ltd., claims to back it with 100% fiat currency reserves. However, those claims have come under suspicion with concerns about the true liquidity of their backing. Tether has other problems, including money laundering allegations. The bills now under consideration in the Congress would require a major change in the way Tether and other SC issuers do business in the U.S.
Crypto-Week Pending Legislation
SC issuers hold levels of reserves against their outstanding value, but currently only under various state regulations. That’s likely to change soon. Bipartisan legislation is moving through Congress: the so-called GENIUS Act was approved by the Senate in June; the STABLE Act in the House has many similar provisions.
The GENIUS and STABLE bills would require public disclosure and frequent audits, and would establish 100 percent reserve requirements for so-called “payment” SCs. The bills also stipulate that reserves must be held in highly liquid assets like U.S. dollars, money market fund shares, and Treasury securities maturing within 93 days. This is likely a disappointment to “hard money” partisans who’d like to see SCs backed by precious metals. Both bills would also prohibit interest-bearing SCs, obviously an impediment to risk-taking by issuers and also a nod to banks hoping to avoid new competitive pressures. Altogether, the bills would make SCs more currency-like and less like vehicles for saving or speculation of any kind.
A third piece of federal legislation, the so-called CLARITY Act, would sort out the regulatory roles of different federal agencies pertaining to digital assets.
CBDC
Central banks like the Federal Reserve have taken a keen interest in SCs, which amount to an alternative monetary system. Advocates of a Central Bank Digital Currency (CBDC) maintain that it would have greater stability and public trust than privately-issued SCs. No doubt a CBDC would facilitate investigation of fraud and money laundering, and supporters say it would help preserve the sovereignty of the U.S. monetary system.
However, a CBDC is off the table in the U.S. for the foreseeable “political” future. President Trump has issued an executive order (EO) prohibiting the development or issuance of a CBDC in the U.S. The EO asserts that a CBDC would not promote stability and in fact would do the opposite.
Opposition to a CBDC revolves around several issues: 1) it would cause an atrophy in the private development of digital assets and SCs in the U.S.; 2) a CBDC would create grave concerns about surveillance and potential use of the CBDC as an input to a social credit tool; and 3) a CBDC would allegedly pose a risk to the stability of the banking system. #3 is apparently in reference to possible disintermediation when a CBDC is substituted for traditional bank deposits — but SCs have been noted for that same risk.
Neither the GENIUS Act nor the STABLE Act explicitly prohibits a CBDC, which has riled a few conservatives. However, there are provisions in the GENIUS Act that effectively rule a CBDC out at a “retail” and consumer level.
A fourth piece of legislation, the Anti-CBDC Surveillance State Act, would prohibit the Federal Reserve from “… testing, studying, developing, creating, or implementing a central bank digital currency and bar the banks from using such a currency to implement monetary policy.” The bill was passed by the House of Representatives in May, but it has yet to clear the Senate. Some House members might like to have its major provisions incorporated into the current SC legislation, but that remains to be seen, and if such a revision was passed by the House it would require another Senate vote in any case.
Not Quite Like Cash
As a “programmable” currency, a CBDC could be used to control transactions deemed impermissible by a future “regime”. This would be a manifestation of what Dave Friedman calls “The Convergence of AI and the State”. His concerns extend to privately-issued SCs as well, insofar as SCs and other payment systems have us “sleepwalking into a cashless society”.
Privacy has been a downside to SCs and all blockchain transactions from the start, but there are several technological extensions that could protect SC transactions and accounts from nosy governments or nefarious actors. Taurus, a crypto custodian, has launched a stablecoin contract for businesses with privacy features using so-called zero-knowledge proofs that would satisfy “Know Your Customer” requirements and anti-money laundering laws, but without revealing amounts paid or the recipient’s identity. Still, there are legitimate concerns regarding access by regulators, and law enforcement could ultimately gain access to account and transaction data given a reasonable suspicion of wrongdoing. This will almost certainly be addressed in any SC legislation that makes it to Trump’s desk.
Macro Policy Implications
Will broader adoption of SCs compromise the ability of central banks to conduct monetary policy? Scott Sumner says no:
“The Fed will still control the monetary base, and they have almost unlimited ability to adjust both the supply and the demand for base money. This means they will be able to react to the creation of money substitutes as required to prevent any impact on macroeconomic objectives such as employment and the price level.”
When Sumner says the Fed controls the demand for base money, he refers to the interest rate the Fed pays on bank reserves.
As noted above, however, it’s widely feared that public substitution of SCs for bank deposits could drain bank reserves, adding variability to the broader demand for monetary assets, thus weakening the relationship between policy actions, the money stock, and other key variables.
Even if this is correct and Sumner is wrong, the Federal Reserve should be treated as a special (but very important) case. That’s because the dollar is the dominant global currency, almost all SCs are backed by dollars, and essentially all SCs used in the U.S. will be backed by dollar-denominated assets should GENIUS-type legislation become law. That severely limits any potential disintermediation that SCs might otherwise cause. Control of bank reserves should be manageable, and therefore SCs will not meaningfully weaken the Fed’s control of base money or the transmission of monetary policy.
Things are not so simple for countries having home currencies that play a minor role internationally. SCs backed by other currencies or assets are then more likely to weaken the central bank’s control of domestic monetary assets. In fact, SCs might create greater vulnerability to “dollarization” in some countries, which would weaken the efficacy of domestic monetary control. Even if Sumner is correct, the existence of SCs would still add a layer of variability for these central banks, making policy adjustments more complex and error-prone.
Conclusion
Stablecoins are already huge in the crypto world and they are making inroads into the broader financial sector, factor payments, and everyday consumer decisions. Naturally they have attracted a great deal of interest in policy circles, both for their benefits and the risks they present. The purported liquidity and stability of SCs, together with a few prior missteps, make the legislation now before Congress a key to broader adoption, particularly the provisions on reserves and transparency. While not strictly a part of the legislation, the incorporation of privacy features will enhance the value of SCs to all users.
Conservatives and libertarians undoubtedly will welcome the proscription on development of a digital currency by the Fed. Private SCs backed by dollar reserves should allow the Fed to maintain ample control over the monetary base and the supply of monetary assets. Moreover, the growth of dollar-backed SCs will strengthen the dollar’s dominance in international trade and finance. However, while stablecoins can and do reduce transaction costs in a variety of circumstances, dollar-backed SCs cannot be better stores of value than the dollar itself, which we know has had its shortcomings over the years.
As a long-time user of macroeconomic statistics, I admit to longstanding doubts about their accuracy and usefulness for policymaking. Almost any economist would admit to the former, not to mention the many well-known conceptual shortcomings in government economic statistics. However, few dare question the use of most macro aggregates in the modeling and discussion of policy actions. One might think conceptual soundness and a reasonable degree of accuracy would be requirements for serious policy deliberation, but uncertainties are almost exclusively couched in terms of future macro developments; they seldom address variances around measures of the present state of affairs. In many respects, we don’t even know where we are, let alone where we’re going!
Early and Latter Day Admonitions
In the first of a pair of articles, Reuven Brenner discusses the hazards of basing policy decisions on economic aggregates, including critiques of these statistics by a few esteemed economists of the past. The most celebrated developer of national income accounting, Simon Kuznets, was clear in expressing his reservations about the continuity of the U.S. National Income and Product Accounts during the transition to a peacetime economy after World War II. The government controlled a large share of economic activity and prices during the war, largely suspending the market mechanism. After the war, market pricing and private decision-making quickly replaced government and military planners. Thus, the national accounts began to reflect values of production inherent in market prices. That didn’t necessarily imply accuracy, however, as the accounts relied (and still do) on survey information and a raft of assumptions.
The point is that the post-war economic results were not remotely comparable to the data from a wartime economy. Comparisons and growth rates over this span are essentially meaningless. As Brenner notes, the same can be said of the period during and after the pandemic in 2020-21. Activity in many sectors completely shut down. In many cases prices were simply not calculable, and yet the government published aggregates throughout as if everything was business as usual.
More than a decade after Kuznets, the game theorists Oskar Morgenstern and John von Neumann both argued that the calculations of economic aggregates are subject to huge degrees of error. They insisted that the government should never publish such data without also providing broad error bands.
Morgenstern delineated several reasons for the inaccuracies inherent in aggregate economic data. These include sampling errors, both private and political incentives to misreport, systematic biases introduced by interview processes, and inherent difficulties in classifying components of production. Also, myriad assumptions must be fed into the calculation of most economic aggregates. A classic example is the thorny imputation of services provided by owner-occupied homes (akin to the value of services generated by rental units to their occupants). More recently, Charles Manski reemphasized Morgenstern’s concerns about the aggregates, reaching similar conclusions as to the wisdom of publishing wide ranges of uncertainty.
Real or Unreal?
Estimates of real spending and production are subject to even larger errors than estimates of nominal values. The latter are far simpler to measure, to the extent that they represent a simple adding up of current amounts spent (or income earned) over the course of a given time period. In other words, nominal aggregates represent the sum of prices times quantities. To estimate real quantities, nominal values must be adjusted (deflated) by price aggregates, the measurement of which is fraught with difficulties. Spending patterns change dramatically over time as preferences shift, technology advances, new goods and services replace others, and the qualities of goods and services evolve. A “unit of output” today is usually far different than what it was in the past, and adjusting prices for those changes is a notorious challenge.
This difficulty offers a strong rationale for relying on nominal quantities, rather than real quantities, in crafting certain kinds of policy. Perhaps the best example is so-called market monetarism and monetary policy guided by nominal GDP-level targeting, as championed by Scott Sumner.
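For what it’s worth, the bare arithmetic of deflation (illustrative numbers only, not official data) shows how directly errors in the price index contaminate the “real” estimate:

```python
# Backing "real" growth out of nominal growth with a price deflator
# (illustrative numbers only).

def real_growth(nominal_growth, deflator_growth):
    """Real growth implied by nominal growth and deflator growth."""
    return (1 + nominal_growth) / (1 + deflator_growth) - 1

nominal = 0.05  # suppose nominal GDP grew 5%
for deflator in (0.025, 0.030, 0.035):  # alternative deflator estimates
    print(f"deflator {deflator:.1%} -> real growth {real_growth(nominal, deflator):.2%}")

# A half-point error in the deflator moves measured "real" growth by roughly
# half a point: real aggregates inherit every flaw in the price measures.
```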
Government’s Contribution
Another fundamental qualm is the inconsistency between data on government’s contribution to aggregate production versus private sector contributions. This is similar in spirit to Kuznets’ original critique. Private spending is valued at market prices of final output, whereas government spending is often valued at administered prices or at input cost.
An even deeper objection is that much of the value of government output is already subsumed in the value of private production. Kuznets himself thought so! To choose two examples, public infrastructure and law enforcement contribute services which enhance the private sector’s ability to reliably produce and deliver goods to market. To add the government’s “output” of these services separately to the aggregate value of private production is to double count in a very real sense. Even Tyler Cowen is willing to entertain the notion that including defense spending in GDP is double counting. The article to which he links goes further than that.
Nevertheless, our aggregate measures allow for government spending to drive fluctuations in our estimates of GDP growth from one period to another. It’s reasonable to argue that government spending should be reported as a separate measure from private GDP.
But what about the well known Keynesian assertion that an increase in government spending will lift output by some multiple of the change? That proposition is considered valid (by Keynesians) only when resources are idle. Of course, today we see steady growth of government even at full employment, so the government’s effort to commandeer resources creates scarcity that crowds out private activity.
Measurement and Policy Uncertainty
Acting on published estimates of economic aggregates is hazardous for a number of other reasons. Perhaps the most basic is that these aggregates are backward-looking. A policy activist would surely agree that interventions should be crafted in recognition of concurrent data (were it available) or, even better, on the basis of reliable predictions of the future. Financial market prices are probably the best source of such forward-looking information.
In addition, revising the estimates of aggregates and their underlying data is an ongoing process. Initial published estimates are almost always based on incomplete data. Then the estimates can change substantially over subsequent months, underscoring uncertainty about the state of the economy. It is not uncommon to witness consistent biases over time in initial estimates, further undermining the credibility of the effort.
Even worse, substantial annual revisions and so-called “benchmark revisions” are made to aggregates like GDP, inflation, and employment data. Sometimes these revisions alter economic history substantially, such as the occurrence and timing of recessions. All this implies that decisions made on the basis of initial or interim estimates are potentially counterproductive (and on a long enough timeline, every aggregate is an “interim” estimate). At a minimum, the variable nature of revisions, which is an unavoidable aspect of publishing aggregate statistics, magnifies policy uncertainty.
Case Studies?
Brenner cites two historical episodes as support for his argument that aggregates are best ignored by policymakers. They are interesting anecdotes, but he gives few details and they hardly constitute proof of his thesis. In 1961, Hong Kong’s financial secretary stopped publishing all but “the most rudimentary statistics”. Combined with essentially non-interventionist policy including low tax rates, Hong Kong ran off three decades of impressive growth. On the other hand, Argentina’s long economic slide is intended by Brenner to show the downside of relying on economic aggregates and interventionism.
Bad Models, Bad Policy
It’s easy to see that economic aggregates have numerous flaws, rendering them unreliable guides for monetary and fiscal policy. Nevertheless, their publication has tended to encourage the adoption of policy interventions. This points to another issue lurking in the background: the role of economic aggregates in shaping the theory and practice of macroeconomics and the models on which policy recommendations are based. The conceptual difficulties surrounding aggregates, and the errors embedded within measured aggregates, have helped to foster questionable model treatments from a scientific perspective. For example, Paul Romer has said:
“Macroeconomists got comfortable with the idea that fluctuations in macroeconomic aggregates are caused by imaginary shocks, instead of actions that people take, after Kydland and Prescott (1982) launched the real business cycle (RBC) model. … [which] explains recessions as exogenous decreases in phlogiston.”
This is highly reminiscent of a quip by Brenner that macroeconomics has become a bit like astrology. A succession of macro models after the RBC model inherited the dependence on phlogiston. Romer goes on to note that model dependence on “imaginary” forces has aggravated the longstanding problem of statistically identifying individual effects. He also debunks the notion that adding expectations to models helps solve the identification problem. In fact, Romer insists that it makes it worse. He goes on to paint a depressing picture of the state of macroeconomics, one to which its reliance on faulty aggregates has surely contributed.
Aggregates also mask the detailed, real-world impacts of policies that invariably accompany changes in spending and taxes. While a given fiscal policy initiative might appear to be neutral in aggregate terms, it is almost always distortionary. For example, spending and tax programs always entail a redirection of resources, whether a consequence of redistribution, large-scale construction, procurement, or efforts to shape the industrial economy. These are usually accompanied by changes in the structure of incentives, regulatory requirements, and considerable rent seeking activity. Too often, outlays are dedicated to shoring up weak sectors of the economy, short-circuiting the process of creative destruction that serves to foster economic growth. Yet the macro models gloss over all the messy details that can negate the efficacy of activist fiscal policies.
Conclusion
The reliance of macroeconomic policy on aggregates like GDP, employment, and inflation statistics certainly has its dangers. These measures all suffer from theoretical problems, and they simply cannot be calculated without errors. They are backward-looking, and the necessity of making ongoing revisions leads to greater uncertainty. But compared to what? There are ways of shifting the focus to measures subject to less uncertainty, such as nominal income rather than real income. A number of theorists, including Fischer Black, have proposed market-based methods of guiding policy. This deserves broader discussion.
The problems of aggregates are not solely confined to measurement. For example, national income accounting, along with the Keynesian focus on “underconsumption” during recessions, led to the fallacious view that spending decisions drive the economy. This became macroeconomic orthodoxy, driving macro mismanagement for decades and leading to inexorable growth in the dominance of government. Furthermore, macroeconomic models themselves have been corrupted by the effort to explain away impossibly error-prone measurements of aggregate activity.
Brenner has a point: it might be more productive to ignore the economic aggregates and institute stable policies which reinforce the efficacy of private markets in allocating resources. If nothing else, it makes sense to feature the government and private components separately.
In this case, the “A” stands for Altman. Now Sam Altman is no slouch, but he’s taken a few ill-considered positions on public policy. Altman, the CEO of OpenAI, wrote a blog post back in 2021 entitled “Moore’s Law For Everything” in which he predicted that AI will feed an explosion of economic growth. He also said AI will put a great many people out of work and drive down the price of certain kinds of labor. Furthermore, he fears that the accessibility of AI will be heavily skewed against the lowest socioeconomic classes. In later interviews (see here and here), Altman is somewhat more circumspect about those predictions, but the general outline is the same: despite exceptional growth of GDP and wealth, he envisions job losses, an underclass of AI-illiterates, and a greater degree of income and wealth inequality.
Not Quite Like That
We’ve yet to see an explosion of growth, but it’s still very early in the AI revolution. The next several years will be telling. AI holds the potential to vastly increase our production possibilities over the course of the next few decades. For that and other reasons, I don’t buy the more dismal aspects of Altman’s scenario, as my last two posts make clear (here and here).
There will be plenty of jobs for people because humans will have comparative advantages in various areas of production. AI agents might have absolute advantages across most or even all jobs, but a rational deployment would have AI agents specialize only where they have a comparative advantage.
Scarcity will not be the sort of anachronism envisioned by some AI futurists, Altman included, and scarcity of AI agents (and their inputs) will necessitate their specialization in certain tasks. The demand for AI agents will be quite high, and their energy and “compute” requirements will be massive. AI agents will face extremely high opportunity costs in other tasks, leaving many occupations open for human labor, to say nothing of abundant opportunities for human-AI collaboration.
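A toy Ricardian example makes the point (the productivity numbers are invented for illustration): even if an AI agent is absolutely better at every task, its scarce capacity is best spent where its opportunity cost is lowest, leaving the other tasks to humans.

```python
# Toy comparative-advantage example with made-up productivity figures.
# Units of output per hour of (scarce) AI compute or human labor:
ai    = {"code_review": 10.0, "customer_care": 6.0}
human = {"code_review":  1.0, "customer_care": 3.0}

# Opportunity cost of one unit of customer care, in code reviews forgone:
ai_cost    = ai["code_review"] / ai["customer_care"]        # ~1.67
human_cost = human["code_review"] / human["customer_care"]  # ~0.33

print(f"AI opportunity cost of customer care:    {ai_cost:.2f} code reviews")
print(f"Human opportunity cost of customer care: {human_cost:.2f} code reviews")

# The AI holds an absolute advantage in both tasks, but its opportunity cost
# of customer care is five times the human's. Total output is higher when the
# AI specializes in code review and humans handle customer care.
```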
However, I don’t dismiss the likelihood of disruptions in markets for certain kinds of labor if the AI revolution proceeds as rapidly as Altman thinks it will. Many workers would be displaced, and it would take time, training, and a willingness to adapt for them to find new opportunities. But new kinds of jobs for people will emerge with time as AI is embedded throughout the economy.
Altman’s Rx
Altman’s somewhat pessimistic outlook for human employment and inequality leads him to make a couple of recommendations:
1) Ownership of capital must be more broadly distributed.
2) Capital and land must be taxed, potentially replacing income taxes, but primarily to fund equity investments for all Americans.
Here I agree with the spirit of #1. Broad ownership of capital is desirable. It allows greater participation in the capitalist system, which fosters political and economic stability. And wider access to capital, whether owned or not, allows a greater release of entrepreneurial energy. It also diversifies incomes and reduces economic dependency.
Altman proposes the creation of an American Equity Fund (AEF) to hold the proceeds of taxes on land and corporate assets for the benefit of all Americans. I’ll get to the taxes in a moment, but in discussing the importance of educating the public on the benefits of compounding, Altman seems to imply that assets in AEF would be held in individual accounts, as opposed to a single “public” account controlled by the federal government. Individual accounts would be far preferable, but it’s not clear how much control Altman would grant individuals in managing their accounts.
To Kill a Golden Goose
Taxes on capital are problematic. Capital can only be accumulated over time by saving out of income. Thus, as Michael Munger points out, as a general proposition under an income tax, all capital has already been taxed once. And we tax the income from capital at both the corporate and individual level. So corporate income is already double taxed: corporate profits are taxed along with dividend payments to shareholders.
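As a rough illustration of that layering (using the 21% federal corporate rate and a 20% rate on qualified dividends, and ignoring state taxes and other wrinkles), the combined bite on a distributed dollar of profit is well above either rate alone:

```python
# Illustrative double taxation of a distributed dollar of corporate profit.
# Simplified rates: 21% federal corporate tax, 20% on qualified dividends;
# state taxes and the 3.8% net investment income tax are ignored.

corporate_rate = 0.21
dividend_rate  = 0.20

profit          = 1.00
after_corporate = profit * (1 - corporate_rate)          # 0.79 available to distribute
after_dividend  = after_corporate * (1 - dividend_rate)  # 0.632 reaches the shareholder

combined_rate = 1 - after_dividend / profit
print(f"Combined effective rate on distributed profit: {combined_rate:.1%}")  # ~36.8%
```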
Altman proposed in his 2021 blog post to levy a tax of 2.5% on the market value of publicly-traded corporations each year. The tax would be payable in cash or in corporate shares to be placed into the AEF. The latter would establish a kind of Unliquidated Tax Reserve Account (ULTRA), which Munger discusses in the article linked above (my bracketed x% in the quote here):
“Instead of taking [x%] of the liquidated value of the wealth, the state would simply take ownership of the wealth, in place. An ULTRA is a ‘notional equity interest.’ The government literally takes a portion of the value of the asset; that value will be paid to the state when the asset is sold. Now, it is only a ‘notional’ stake, in the sense that no shared right of control or voting rights exists. But for those who advocate for ULTRAs, in any situation where tax agencies are authorized to tax an asset today, but cannot because there is no evaluation event, the taxpayer could be made to pay with an ULTRA rather than with cash.”
This solves all sorts of administrative problems associated with wealth taxes, but it is draconian nevertheless. Munger quotes an example of a successful, privately-held business subject to a 2% wealth tax every year in the form of an ULTRA. After 20 years, the government owns more than a third of the company’s value. That represents a substantial penalty for success! However, the incidence of such a tax might fall more on workers and customers and less on business owners. And Altman would tax corporations more heavily than in Munger’s example.
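A quick check of that example (my own sketch; the exact convention isn’t spelled out here): an annual 2% ULTRA levy leaves the government with roughly a third of the firm after 20 years if each year’s take comes out of the remaining private share, and more than that if it applies to the firm’s full value each year.

```python
# Cumulative government stake from an annual 2% ULTRA levy over 20 years,
# under two possible conventions (the text doesn't say which one applies).

rate, years = 0.02, 20

# Convention A: each year's 2% comes out of the remaining private share.
private_share = (1 - rate) ** years
print(f"Levy on remaining private share: government owns {1 - private_share:.1%}")  # ~33.2%

# Convention B: 2% of the whole firm's value is transferred every year.
print(f"Levy on total firm value:        government owns {min(rate * years, 1.0):.1%}")  # 40.0%
```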
A tax on wealth essentially penalizes thrift, reduces capital accumulation, and diminishes productivity and real wages. But another fundamental reason that taxes on capital should be low is that the supply of capital is elastic. A tax on capital discourages saving and encourages capital flight. The use of avoidance schemes will proliferate, and there will be intense pressure to carve out special exemptions.
A Regressive Dimension
Another drawback of a wealth tax is its regressivity with respect to returns on capital. To see this, we can convert a tax on wealth to an equivalent income tax on returns. Here is Chris Edwards on that point:
“Suppose a person received a pretax return of 6 percent on corporate equities. An annual wealth tax of 2 percent would effectively reduce that return to 4 percent, which would be like a 33 percent income tax—and that would be on top of the current federal individual income tax, which has a top rate of 37 percent.”
… The effect is to impose lower effective tax rates on higher-yielding assets, and vice versa. If equities produced returns of 8 percent, a 2 percent wealth tax would be like a 25 percent income tax. But if equities produced returns of 4 percent, the wealth tax would be like a 50 percent income tax. People with the lowest returns would get hit with the highest tax rates, and even people losing money would have to pay the wealth tax.”
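Edwards’ conversion is easy to reproduce: the equivalent income tax rate is just the wealth tax rate divided by the pretax return, which is exactly why low-return (and money-losing) assets face the harshest effective rates.

```python
# Converting a wealth tax into its equivalent tax rate on investment income.

def equivalent_income_tax(wealth_tax_rate, pretax_return):
    """Income tax rate that takes the same bite as the wealth tax."""
    return wealth_tax_rate / pretax_return

wealth_tax_rate = 0.02
for r in (0.08, 0.06, 0.04):
    print(f"pretax return {r:.0%} -> equivalent income tax {equivalent_income_tax(wealth_tax_rate, r):.0%}")
# 8% -> 25%, 6% -> 33%, 4% -> 50%, matching Edwards' figures.
```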
Edwards notes the extreme inefficiency of wealth taxes demonstrated by the experience of a number of OECD countries. There are better ways to increase revenue and the progressivity of taxes. The best alternative is a tax on consumption, which rewards saving and capital accumulation, promoting higher wages and economic growth. Edwards dedicates a lengthy section of his paper to the superiority of a consumption tax.
Is a Wealth Tax Constitutional?
The constitutionality of a wealth tax is questionable as well. Steven Calabresi and David Schizer (C&S) contend that a federal wealth tax would qualify as a direct tax subject to the rule of apportionment, which would also apply to a federal tax on land. That is, under the U.S. Constitution, these kinds of taxes would have to be the same amount per capita in every state. Thus, higher tax rates would be necessary in less wealthy states.
C&S also note a major distinction between taxes on the value of wealth relative to income, excise, import, and consumption taxes. The latter are all triggered by transactions entered into voluntarily. They are avoidable in that sense, but not wealth taxes. Moreover, C&S believe the founders’ intent was to rely on direct taxes only as a backstop during wartime.
The recent Supreme Court decision in Moore v. United States created doubt as to whether the Court had set a precedent in favor of a potential wealth tax. According to earlier precedent, the Constitution forbade the “laying of taxes” on “unrealized” income or changes in wealth. However, in Moore, the Court ruled that undistributed profits from an ownership interest in a foreign business are taxable under the mandatory repatriation tax, signed into law by President Trump in 2017 as part of his tax overhaul package. But Justice Kavanaugh, who wrote the majority opinion, stated that the ruling was based on the foreign company’s status as a pass-through entity. The Wall Street Journal says of the decision:
“Five Justices open the door to taxing unrealized gains in assets. Democrats will walk through it.”
In a brief post, Calabresi laments Justice Ketanji Brown Jackson’s expansive view of the federal government’s taxing authority under the Sixteenth Amendment, which might well be shared by the Biden Administration. But the Wall Street Journal piece also describes Kavanaugh’s admonition regarding any expectation of a broader application of the Moore opinion:
“Justice Kavanaugh does issue a warning that ‘the Due Process Clause proscribes arbitrary attribution’ of undistributed income to shareholders. And he writes that his opinion should not ‘be read to authorize any hypothetical congressional effort to tax both an entity and its shareholders or partners on the same undistributed income realized by the entity.’”
Growth Is the Way, Not Taxes
AI growth will lead to rapid improvements in labor productivity and real wages in many occupations, despite a painful transition for some workers requiring occupational realignment and periods of unemployment and training. However, people will retain comparative advantages over AI agents in a number of existing occupations. Other workers will find that AI allows them to shift their efforts toward higher-value or even new aspects of their jobs. Along the same lines, there will be a huge variety of new occupations made possible by AI of which we’re only now catching the slightest glimpse. Michael Strain has emphasized this aspect of technological diffusion, noting that 60% of the jobs performed in 2018 did not exist in 1940. In fact, few of those “new” jobs could have been imagined in 1940.
AI entrepreneurs and AI investors will certainly capture a disproportionate share of gains from an AI revolution. Of course, they’ll have created a disproportionate share of that wealth. It might well skew the distribution of wealth in their favor, but that does not reflect negatively on the market process driving the outcome, especially because it will also give rise to widespread gains in living standards.
Altman goes wrong in proposing tax-funded redistribution of equity shares. Those taxes would slow AI development and deployment, reduce economic growth, and produce fewer new opportunities for workers. The surest way to effect a broader distribution of equity capital, and of equity in AI assets, is to encourage innovation, economic growth, and saving. Taxing capital more heavily is a very bad way to do that, whether from heavier taxes on income from capital, new taxes on unrealized gains, or (worst of all) from taxes on the value of capital, including ULTRA taxes.
Altman is right, however, to bemoan the narrow ownership of capital. As I mentioned above, he’s also on-target in saying that most people do not fully appreciate the benefits of thrift and the miracle of compounding. That represents both a failure of education and our calamitously high rate of time preference as a society. Perhaps the former can be fixed! However, thrift is a decision best left in private hands, especially to the extent that AI stimulates rapid income growth.
Killer Regulation
Altman also supports AI regulation, and I’ll cut him some slack by noting that his motives might not be of the usual rent-seeking variety. Maybe. Anyway, he’ll get some form of his wish, as legislators are scrambling to draft a “roadmap” for regulating AI. Some are calling for billions of federal outlays to “support” AI development, with a likely and ill-advised effort to “direct” that development as well. That is hardly necessary given the level of private investment AI is already attracting. Other “roadmap” proposals call for export controls on AI and protections for the film and recording industries.
These proposals are fueled by fears about AI, which run the gamut from widespread unemployment to existential risks to humanity. Considerable attention has been devoted to the alignment of AI agents with human interests and well being, but this has emerged largely within the AI development community itself. There are many alignment optimists, however, and still others who decry any race between tech giants to bring superhuman generative AI to market.
The Biden Administration stepped in last fall with an executive order on AI under emergency powers established by the Defense Production Act. The order ranges more broadly than national defense might necessitate, and it could have damaging consequences. Much of the order is redundant with respect to practices already followed by AI developers. It requires federal oversight over all so-called “foundation models” (e.g., ChatGPT), including safety tests and other “critical information”. These requirements are to be followed by the establishment of additional federal safety standards. This will almost certainly hamstring investment and development of AI, especially by smaller competitors.
Patrick Hedger discusses the destructive consequences of attempts to level the competitive AI playing field via regulation and antitrust actions. Traditionally, regulation tends to entrench large players who can best afford heavy compliance costs and influence regulatory decisions. Antitrust actions also impose huge costs on firms and can result in diminished value for investors in AI start-ups that might otherwise thrive as takeover targets.
Conclusion
Sam Altman’s vision of funding a redistribution of equity capital via taxes on wealth suffers from serious flaws. For one thing, it seems to view AI as a sort of exogenous boon to productivity, wholly independent of investment incentives. Taxing capital would inhibit investment in new capital (and in AI), diminish growth, and thwart the very goal of broad ownership Altman wishes to promote. Any effort to tax capital at a global level (which Altman supports) is probably doomed to failure, and that’s a good thing. The burden of taxes on capital at the corporate level would largely be shifted to workers and consumers, pushing real wages down and prices up relative to market outcomes.
Low taxes on income and especially on capital, together with light regulation, promote saving, capital investment, economic growth, higher real wages, and lower prices. For AI, like all capital investment, public policy should focus on encouraging “aligned” development and deployment of AI assets. A consumption tax would be far more efficient than wealth or capital taxes in that respect, and more effective in generating revenue. Policies that promote growth are the best prescription for broadening the distribution of capital ownership.
The Fed’s “higher for longer” path for short-term interest rates lingers on, and so does inflation in excess of the Fed’s 2% target. No one should be surprised that rate cuts aren’t yet on the table, but the markets freaked out a little with the release of the February CPI numbers last week, which were higher than expected. For now, it only means the Fed will remain patient with the degree of monetary restraint already achieved.
Dashed Hopes
As I’ve said before, there was little reason for the market to have expected the Fed to cut rates aggressively this year. Just a couple of months ago, the market expected as many as six quarter-point cuts in the Fed’s target for the federal funds rate. The only rationale for that reaction would have been faster disinflation or the possibility of an economic “hard landing”. A downturn is not out of the question, especially if the Fed feels compelled to raise its rate target again in an effort to stem a resurgence in inflation. Maybe some traders felt the Fed would act politically, cutting rates aggressively as the presidential election approaches. Not yet anyway, and it seems highly unlikely.
There is no assurance that the Fed can succeed in engineering a “soft landing”, i.e., disinflation to its 2% goal without a recession. No one can claim any certainty on that point — it’s too early to call, though the odds have improved somewhat. As Scott Sumner succinctly puts it, a soft landing basically depends on whether the Fed can disinflate gradually enough.
It’s a Demand-Side Inflation
I’d like to focus a little more on Sumner’s perspective on Fed policy because it has important implications for the outlook. Sumner is a so-called market monetarist and a leading proponent of nominal GDP level targeting by the Fed. He takes issue with those ascribing the worst of the pandemic inflation to supply shocks. There’s no question that disruptions occurred on the supply side, but the Fed did more than accommodate those shocks in attempting to minimize their impact on real output and jobs. In fact, it can fairly be said that a Fed / Treasury collaboration managed to execute the biggest “helicopter drop” of money in the history of the world, by far!
That “helicopter drop” consisted of pandemic relief payments, a fiscal maneuver amounting to a gigantic monetary expansion and stimulus to demand. The profligacy has continued on the fiscal side since then, with annual deficits well in excess of $1 trillion and no end in sight. This reflects government demand against which the Fed can’t easily act to countervail, making the job of achieving a soft landing that much more difficult.
The Treasury, however, is finding a more limited appetite among investors for the flood of bonds it must regularly sell to fund the deficit. Recent increases in long-term Treasury rates reflect these large funding needs as well as the “higher-for-longer” outlook for short-term rates, inflation expectations, and of course better perceived investment alternatives.
The Nominal GDP Proof
There should be no controversy that inflation is a demand-side problem. As Sumner says, supply shocks tend to reverse themselves over time, and that was largely the case as the pandemic wore on in 2021. Furthermore, advances in both real and nominal GDP have continued since then. The difference between the two is inflation, which again, has remained above the Fed’s target.
So let’s see… output and prices both growing? That combination of gains demonstrates that demand has been the primary driver of inflation for three-plus years. Restrictive monetary policy is the right prescription for taming excessive demand growth and inflation.
Here’s Sumner from early March (emphasis his), where he references flexible average inflation targeting (FAIT), a policy the Fed claims to be following, and nominal GDP level targeting (NGDPLT):
“Over the past 4 years, the PCE price index is up 16.7%. Under FAIT it should have risen by 8.2% (i.e., 2%/year). Thus we’ve had roughly 8.5% excess inflation (a bit less due to compounding.)
Aggregate demand (NGDP) is up by 27.6%. Under FAIT targeting (which is similar to NGDPLT) it should have been up by about 17% (i.e., 4%/year). So we’ve had a bit less than 10.6% extra demand growth. That explains all of the extra inflation.”
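Sumner’s figures are easy to verify, including his parenthetical about compounding: compound the 2% and 4% annual targets over four years and compare them to the cumulative changes he cites.

```python
# Checking Sumner's FAIT / NGDP arithmetic over a four-year window.
years = 4

pce_actual  = 0.167              # cumulative PCE inflation he cites
pce_target  = 1.02 ** years - 1  # 2% per year compounded, ~8.2%
ngdp_actual = 0.276              # cumulative NGDP growth he cites
ngdp_target = 1.04 ** years - 1  # 4% per year compounded, ~17.0%

print(f"PCE:  actual {pce_actual:.1%}, target {pce_target:.1%}, "
      f"simple excess {pce_actual - pce_target:.1%}, "
      f"compounded excess {(1 + pce_actual) / (1 + pce_target) - 1:.1%}")
print(f"NGDP: actual {ngdp_actual:.1%}, target {ngdp_target:.1%}, "
      f"simple excess {ngdp_actual - ngdp_target:.1%}, "
      f"compounded excess {(1 + ngdp_actual) / (1 + ngdp_target) - 1:.1%}")
# Roughly 8.5% excess inflation (a bit less after compounding) and a bit
# less than 10.6% excess demand growth, just as Sumner says.
```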
Is Money “Tight”?
The Fed got around to tightening policy in the spring of 2022, but that doesn’t necessarily mean that policy ever advanced to the “tight” stage. Sumner has been vocal in asserting that the Fed’s policy hasn’t looked especially restrictive. Money growth feeds demand and ultimately translates into nominal GDP growth (aggregate demand). The latter is growing too rapidly to bring inflation into line with the 2% target. But wait! Money growth has been moderately negative since the Fed began tightening. How does that square with Sumner’s view?
In fact, the M2 money supply is still approximately 35% greater than at the start of the pandemic. There’s still a lot of M2 sloshing around out there, and the Fed’s portfolio of securities acquired during the pandemic via “quantitative easing” remains quite large ($7.5 trillion). Does this sound like tight money?
Again, Sumner would say that with nominal GDP ripping ahead at 5.7%, the Fed can’t be credibly targeting 2% inflation given an allowance for real GDP growth at trend of around 1.8% (or even somewhat greater than that). It’s an even bigger stretch if M2 velocity (V — turnover) continues to rebound with higher interest rates.
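Sumner’s point reduces to back-of-the-envelope arithmetic: subtract trend real growth from nominal GDP growth and you get the inflation rate the Fed is implicitly tolerating. At the NGDP pace cited here, it’s nowhere near 2% even under generous assumptions about trend growth.

```python
# Inflation implied by nominal GDP growth and an assumed real-growth trend.

def implied_inflation(ngdp_growth, real_trend):
    return (1 + ngdp_growth) / (1 + real_trend) - 1

ngdp_growth = 0.057                       # nominal GDP growth cited above
for real_trend in (0.018, 0.022, 0.025):  # alternative trend real-growth assumptions
    print(f"real trend {real_trend:.1%} -> implied inflation "
          f"{implied_inflation(ngdp_growth, real_trend):.1%}")
# Even granting real growth well above trend, implied inflation stays well
# above 2% -- and a rebound in velocity would only add to demand growth.
```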
Wage growth also exceeds a level consistent with the Fed’s target. The chart below shows the gap between price inflation and wage inflation that left real wages well below pre-pandemic levels. Since early 2023, wages have made up part of that decline, but stubborn wage inflation can impede progress against price inflation.
Just Tight Enough?
Despite Sumner’s doubts, there are arguments to be made that Fed policy qualifies as restrictive. Even moderate declines in liquidity can come as a shock to markets grown accustomed to torrents from the money supply firehose. And to the extent that inflation expectations have declined, real interest rates may be higher now than they were in early November. In any case, it’s clear the market was disappointed in the higher-than-expected CPI, and traders were not greatly assuaged by the moderate report on the PPI that followed.
However, the Fed pays closest attention to another price index: the core deflator for personal consumption expenditures (PCE). Inflation by this measure is trending much closer to the Fed’s target (see the second chart below). Still, from the viewpoint of traders, many of whom, not long ago, expected six rate cuts this year, the reality of “higher for longer” is a huge disappointment.
Danger Lurks
As I noted, many believe the odds of a soft landing have improved. However, the now-apparent “stickiness” of inflation and the knowledge that the Fed will stand pat or possibly hike rates again have rekindled fears that the economy could turn south before the Fed elects to cut its short-term interest rate target. That might surprise Sumner in the absence of more tightening, as his arguments are partly rooted in the continuing strength of aggregate demand and nominal GDP growth.
There’s a fair degree of consensus that the labor market remains strong, which underscores Sumner’s doubts as to the actual tenor of monetary policy. The March employment numbers were deceptive, however. The gain in civilian employment was just shy of 500,000, but that gain was entirely in part-time employment. Full-time employment actually declined slightly. In fact, the same is true over the prior 12 months. And over that period, the number of multiple jobholders increased by more than total employment. Increasing reliance on part-time work and multiple jobs is a sign of stress on household budgets, and a signal that firms may be reluctant to commit to full-time hires. From the establishment survey, the gain in nonfarm employment was dominated once again by government and health care. These numbers hardly support the notion that the economy is on solid footing.
There are other signs of stress: credit card delinquencies hit an all-time high in February. High interest rates are taking a toll on households and business borrowers. Retail sales were stronger than expected in March, but excess savings accumulated during the pandemic were nearly depleted as of February, so it’s not clear how long the spending can last. And while the index of leading indicators inched up in February, it was the first gain in two years and the index has shown year-over-year declines over that entire two-year period.
Conclusion
It feels a little hollow for me to list a series of economic red flags, having done so a few times over the past year or so. The risks of a hard landing are there, to be sure. The behavior of the core PCE deflator over the next few months will have much more influence on Fed policy, as will any dramatic changes in the real economy. The “data dependence” of policy is almost a cliché at this point. The Fed will stand pat for now, and I doubt the Fed will raise its rate target without a dramatic upside surprise on the core deflator. Likewise, any downward rate moves won’t be forthcoming without more softening in the core deflator toward 2% or definitive signs of a recession. So rate cuts aren’t likely for some months to come.
Leftism has taken on new dimensions amid its preoccupation with identity politics, victimhood, and “wokeness”. Traditional socialists are still among us, of course, but “wokeists” and “identitarians” have been on the progressive vanguard of late, rooting for the deranged human butchers of Hamas and the dismantling of liberal institutions. This didn’t happen overnight, of course, and traditional socialists are mostly fine with it.
An older story is the rebranding of leftism that took place in the U.S. during the first half of the 20th century, when the word “liberal” was co-opted by leftists. Before that, a liberal orientation was understood to be antithetical to the collectivist mindset long associated with the Left. Note also that liberalism retains its original meaning even today in much of Europe. Often we hear the term “classical liberal” to denote the “original” meaning of liberalism, but the modifier should be wholly unnecessary.
Liberalism Is Not “In-Betweenism”
In this vein, Nate Silver presents a basic taxonomy of political orientation in a recent Substack post. It includes the diagram above, which distinguishes between socialism, conservatism, and liberalism. Silver draws on a classic essay by Friedrich Hayek, “Why I Am Not a Conservative” (1960), in which Hayek discussed the meaning of the word “liberal” (and see here). Liberalism’s true emphasis is a tolerance for individual rights and freedoms, subject to varying articulations of the “nonaggression principle”. That is, “do as you like, but do no harm to others”.
We often see a linear representation distinguishing between so-called progressives on the left and conservatives on the right. Of course, a major hallmark of leftist thinking is extreme interventionism. Leftists or progressives are always keen to detect the slightest whiff of an externality or the slightest departure from the perfectly competitive market ideal. They seem eager to find a role for government in virtually every area of life. While it’s not a limiting case, we can substitute socialism or statism for progressivism on the far left, as Silver does, whereby the state takes primacy in economic and social affairs.
Conservatism, on the other hand, is a deep resistance to change, whether institutional, social, or sometimes economic. Conservatives too often demonstrate a willingness to use the coercive power of the state to prevent change. Hayek noted the willingness of both socialists and conservatives to invoke state power for their own ends.
Similarly, religious conservatives often demand state support beyond that afforded by the freedom to worship in the faith of one’s choice. They might strongly reject certain freedoms held to be fundamental by liberals. Meanwhile, socialists often view mere religious freedom as a threat to the power of the state, or at least they act like it (e.g. see here for an example).
Like conservatives, dedicated statists would doubtless resist change if it meant a loss of their own power. That is, they’d wish to preserve socialist institutions. On this point, witness the vitriol from the Left over what it perceives as threats to the public school monopoly. Witness also the fierce resistance among public employees to reducing the scale of the administrative state, and how advocates of entitlements fiercely resist decreases in the growth rate of those expenditures.
Silver, like Hayek, objects to the traditional, linear framework in which liberals are thought to occupy a range along a line between socialism and conservatism. He objects to that because real liberals value individual liberty as a natural human right, a viewpoint typically abhorred by both socialists and conservatives. There is nothing “in between” about it! And of course, conservatives and progressives are equally guilty in their mistaken use of the word “liberal”.
Mapping Political Preferences
Liberty, statism, and conservatism are not exactly orthogonal political dimensions. Larger government almost always means less economic liberty. At a minimum, state dominance implies a social burden associated with public monopoly and monopsony power, as well as tax and welfare-state incentive problems. These features compromise or corrupt the exercise of basic rights. On the other hand, capitalism and its concomitant reliance on consumer sovereignty, individual initiative, free exchange and secure property rights is most in harmony with true liberalism.
For conservatives, resistance to change in support of a traditionally free market economy might offer something of a contradiction. In one sense, it corresponds to upholding market institutions. However, free markets allow new competitors and new technologies to undermine incumbents, whom conservatives sometimes wish to defend through regulatory or protectionist measures. And conservatives are all too happy to join in the chorus of “price gouging” complaints in response to the healthy operation of the free market in bringing forth supplies.
All that is to say that preferences involving liberty, statism, and traditionalism are not independent of one another. They cannot simply be mapped onto a three-dimensional space. At least the triangular representation gets liberalism out of the middle, but it’s difficult to visualize other ideological positions there. For example, “state religionism” could lie anywhere along the horizontal line at the top or even below it if certain basic liberties are preserved. Fascism combines elements of socialism and a deformed version of capitalism that is properly called corporatism, but where would it fall within the triangle?
Big Government Liberalism?
Silver says he leans heavily toward a “big government” version of liberalism, but big government is hard to square with broad liberties. Granted, any well-functioning society must possess a certain level of “state capacity” to defend against private or public violations of individual rights, adjudicate disputes, and provide true public goods. It’s not clear whether Silver’s preferences lie within the bounds of those ambitions. Still, he deserves credit for his recognition that liberalism is wholly different from the progressive, socialist vision. It is the opposite.
The “New” Triangle
Silver attempts to give the triangular framework a more contemporary spin by replacing conservatism with “MAGA Conservatism” and socialism with “Social Justice Leftism” (SJL), or “wokeism”. Here, I’m treating MAGA as a “brand”. Nothing below is intended to imply that America should not be a great nation.
The MAGA variant of conservatism emphasizes nationalism, though traditional conservatives have never been short on love of nation. For that matter, as a liberal American, it’s easier to forgive nationalist sentiments than it is the “Death to America” refrain we now hear from some SJLs.
The MAGA brand is also centered around a single individual, Donald Trump, whose rhetoric strikes many as nativistic. And Trump is a populist whose policy proposals are often nakedly political and counterproductive.
SJL shares with socialism an emphasis on various forms of redistribution and social engineering, but with a new focus on victimhood based on classes of identity. Of SJL, Silver says:
“Proponents of SJL usually dislike variations on the term ‘woke’, but the problem is that they dislike almost every other term as well. And we need some term for this ideology, because it encompasses quite a few distinctive features that differentiate it both from liberalism and from traditional, socialist-inflected leftism. In particular, SJL is much less concerned with the material condition of the working class, or with class in general. Instead, it is concerned with identity — especially identity categories involving race, gender and sexuality, but sometimes also many others as part of a sort of intersectional kaleidoscope.”
The gulf between liberals and SJLs couldn’t be wider on issues like free speech, “equity”, and equality of opportunity. MAGAns, on the other hand, have some views on individual rights and responsibility that are largely consistent with liberals, but reflexive populism often leads them to advocate rent protection, corporate welfare, and protectionism.
Divided Liberalism
Liberalism emphasizes limited government, individual autonomy, and free exchange. However, there are issues upon which true liberals are of divided opinion. For example, one such area of controversy is the conflict between a woman’s right to choose and the fetal right to life. Many true liberals disagree over whether the rights of a fetus outweigh its mother’s right to choose, but most would concede that the balance shifts to the fetus at some point well short of birth (putting aside potential dangers to the mother’s life). Open borders is another area that can divide true liberals. On one side, the right to unrestricted mobility is thought to supersede any public interest in enforcing borders and limiting the flow of immigrants. On the other side, questions of national sovereignty, national security, as well as social and state capacity to absorb immigrants take primacy.
Don’t Call Lefties “Liberal”… They’re Not!
True liberalism (including most strains of libertarianism) recognizes various roles that a well-functioning state should play, but it also recognizes the primacy of the individual and individual rights as a social underpinning. As Hayek noted, true liberals are not resistant to change per se, unlike conservatives. But modern progressives demand changes of the worst kind: that the state should intervene to pursue their favored objectives, laying claim to an ever-greater share of private resources. This requires government coercion on a massive scale, the antithesis of liberalism. It’s time to recognize that “progressives” aren’t liberals in any sense of the word. For that matter, they don’t even stand for progress.
I’ll close with a quote from Adam Smith that I cribbed from Scott Sumner. Unfortunately, Sumner does not give the full reference, but I’ll take his word that Smith wrote this 20 years before the publication of The Wealth of Nations:
“Little else is requisite to carry a state to the highest degree of opulence from the lowest barbarism, but peace, easy taxes, and a tolerable administration of justice; all the rest being brought about by the natural course of things. All governments which thwart this natural course, which force things into another channel, or which endeavour to arrest the progress of society at a particular point, are unnatural, and to support themselves are obliged to be oppressive and tyrannical.”
The joke’s on me, but my “out” on the question above is “long and variable lags” in the impact of monetary policy, a description that goes back to the work of Milton Friedman. If you call me out on my earlier forebodings of a hard landing or recession, I’ll plead that I repeatedly quoted Friedman on this point as a caveat! That is, the economic impact of a monetary tightening will be lagged by anywhere from 9 to 24 months. So maybe we’re just not there yet.
Of course, maybe I’m wrong and we won’t have to get “there”: the rate of inflation has indeed tapered over the past year. A soft landing now seems like a more realistic possibility. Still, there’s a ways to go, and as Scott Sumner says, when it comes to squeezing inflation out of the system, “It’s the final percentage point that’s the toughest.” One might say the Federal Reserve is hedging its bets, avoiding further increases in its target federal funds rate absent evidence of resurging price pressures.
Strong Growth or Mirage?
Economic growth is still strong. Real GDP in the third quarter grew at an astonishing 5.2% annual rate. A bulge in inventories accounted for about a quarter of the gain, which might lead to some retrenchment in production plans. Government spending also accounted for roughly a quarter, which is as much a literal liability as a dubious gain in real output. Unfortunately, fiscal policy is working at cross purposes to the current thrust of monetary policy. Profligate spending and burgeoning budget deficits might artificially prop up the economy for a time, but they add to risks going forward, not to mention uncertainty surrounding the strength and timing of the effects of tight money.
Consumers accounted for almost half of the third quarter growth despite a slim 0.1% increase in real personal disposable income. That reinforces the argument that consumers are depleting their pandemic savings and becoming more deeply indebted heading into the holidays.
The economy continues to produce jobs at a respectable pace. The November employment report was slightly better than expected, but it was buttressed by the return of striking workers, and retail and manufacturing jobs declined. Still, the unemployment rate fell slightly, so the labor market has remained stronger than expected by most economists.
Consumer sentiment had been in the dumps until the University of Michigan report for December, which erased four months of declines. The expectations index is one component of the leading economic indicators, which has been at levels strongly suggesting a recession ahead for well over a year now. See the chart below:
But expectations improved sharply in November, and that included a decline in inflation expectations.
Another component of the LEI is the slope of the yield curve (measured by the difference between the 10-year Treasury bond yield and the federal funds rate). This spread has been a reliable predictor of recessions historically. The 10-year bond yield has declined by over 90 basis points since mid-October, a sign that bond investors think the inflation threat is subsiding. However, that drop deepened the yield curve’s inversion, meaning that the recession signal has strengthened.
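For clarity, the spread in question is simply the 10-year yield less the federal funds rate, and negative values indicate inversion. A trivial sketch with hypothetical numbers, not the actual readings discussed above:

```python
# The LEI's yield-curve component is the 10-year Treasury yield minus the
# federal funds rate. The values below are hypothetical placeholders.
ten_year_yield = 4.20   # percent (hypothetical)
fed_funds_rate = 5.33   # percent (hypothetical)

spread = ten_year_yield - fed_funds_rate
print(f"10y minus fed funds: {spread:+.2f} percentage points")
if spread < 0:
    print("Inverted: a falling 10-year yield with an unchanged funds rate "
          "deepens the inversion and strengthens the recession signal.")
```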
Disinflation, But Still Inflation
Inflation measures have been slowing, and the Fed’s “target” inflation rate of 2% appears within reach. In the Fed’s view, the most important inflation gauge is the personal consumption expenditures deflator excluding food and energy prices (the “core” PCE). The next chart shows the extent to which it has tapered over the past two quarters. While it’s encouraging that inflation has edged closer to the Fed’s target, it does not mean the inflation fight is over. Still, the decision taken at the December meeting of the Fed’s Open Market Committee (FOMC) to leave its interest rate target unchanged is probably wise.
Real wages declined during most of the past three years with the surge in price inflation (see next chart). Some small gains occurred over the past few months, but the earlier declines reinforce the view that consumers need to tighten their belts to maintain savings or avoid excessive debt.
Has Policy Really Been “Tight”?
The prospect of a hard landing presupposes that policy is “tight” and has been tight for some months, but there is disagreement over whether that is, in fact, the case. Scott Sumner, at the link above in the second paragraph, is skeptical that policy is “tight” even now. That’s despite the fact that the Fed hiked its federal funds rate target 11 times between March 2022 and July 2023 (by a total of 5.25%). The Fed waited too long to get started on its upward rate moves, which helps explain the continuing strength of the economy right now.
The real fed funds rate turned positive (arguably) as early as last winter as the rate rose and as expected inflation began to decline. There is also solid evidence that real interest rates on the short-end of the maturity spectrum are higher than “neutral” real rates and have been for well over a year (see chart below). If the Fed leaves its rate target unchanged over the next few months, assuming expected inflation continues to taper, the real rate will rise passively and the Fed’s policy stance will have tightened further.
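Here is a stylized sketch of that passive-tightening mechanism. The numbers are hypothetical placeholders rather than actual readings:

```python
# Passive tightening: with the nominal funds rate held fixed, the real rate
# rises as expected inflation declines. All numbers are hypothetical.
nominal_funds_rate = 5.4                        # percent, held constant
expected_inflation_path = [3.5, 3.0, 2.6, 2.3]  # percent, assumed to decline

for period, exp_inf in enumerate(expected_inflation_path, start=1):
    real_rate = nominal_funds_rate - exp_inf
    print(f"Period {period}: real funds rate ~ {real_rate:.1f}%")
# Each step above an assumed neutral real rate (say 0.5% to 1%) represents an
# incrementally tighter stance without any action by the Fed.
```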
Another view is that the Fed’s policy became “tight” when the monetary aggregates began to decrease (April 2022 for M2). A few months later the Fed began so-called “quantitative tightening” (QT), allowing securities to run off its balance sheet rather than reinvesting the proceeds. Thus far, QT has reversed only a portion of the vast liquidity provided by the Fed during the pandemic. However, markets do grow accustomed to generous ongoing flows of liquidity. Cutting them off creates financial tensions that have real economic effects. No doubt the Fed’s commitment to QT established some credibility that a real policy shift was underway. So it’s probably fair to say that policy became “tight” as this realization took hold, which might place the date demarcating “tight” policy around 15 – 18 months ago.
Back to the Lags
Again, changes in monetary policy have a discernible impact only with a lag. The broad range of timing discussed among monetary experts (again, going back to Milton Friedman) is 9 – 24 months. We’re right in there now, which adds to the conviction among many forecasters that the onset of recession is likely during the first half of 2024. That’s my position, and while the tapering of inflation we’ve witnessed thus far is quite encouraging, it might take sustained monetary restraint before we’re at or below the Fed’s 2% target. That also increases the risk that we’ll ultimately suffer through a hard landing. In fact, there are prominent voices like hedge fund boss Bill Ackman who predict the Fed must begin to cut the funds rate soon to avoid a hard landing. Jamie Dimon, CEO of JPMorgan, says the U.S. is headed for a hard landing in 2024.
Looking Forward
If new data over the next few months is consistent with a “soft landing” (and it would take much more than a few months to be conclusive), or especially if the data more strongly indicate an incipient recession, the Fed certainly won’t raise its target rate again. The Fed is likely to begin to cut the funds rate sometime next year, and sooner if a recession seems imminent. Otherwise, my guess is the Fed waits at least until well into the second quarter. The average of FOMC member forecasts at the December meeting works out to three quarter-point rate cuts by year-end 2024. When the Fed does cut its target rate, I hope it won’t at the same time abandon QT, the continuing run-off of securities from its currently outsized portfolio. Reducing the Fed’s holdings of securities will restrain money growth and give the central bank more flexibility over future policy actions. QT will also put pressure on Congress and the President to reduce budget deficits.
I’m not terribly surprised to learn that scientific advancement has slowed over my lifetime. A recent study published in the journal Nature documented a secular decline in the frequency of “disruptive” or “breakthrough” scientific research across a range of fields. Research has become increasingly dominated by “incremental” findings, according to the authors. The graphic below tells a pretty dramatic story:
The index values used in the chart range “from 1 for the most disruptive to -1 for the least disruptive.” The methodology used to assign these values, which summarize academic papers as well as patents, produces a few oddities. Why, for example, does the tech revolution of the last 40 years create barely a blip in the technology index in the chart above? And why have tech research and social science research always been more “disruptive” than other fields of study?
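For what it’s worth, the measure in the Nature study is, as I understand it, a version of the “CD index” associated with Funk and Owen-Smith: a later paper that cites the focal work without citing its references pushes the score toward +1 (disruptive), while a later paper citing the focal work along with its references pushes it toward -1 (consolidating). Here is my own simplified illustration of that idea, not the authors’ code:

```python
# Simplified CD-style disruptiveness score for a single focal paper.
#   f = later papers citing the focal paper but none of its references
#   b = later papers citing the focal paper AND at least one of its references
#   r = later papers citing only the focal paper's references
# This is my paraphrase of the published index, not the authors' code.
def cd_index(f: int, b: int, r: int) -> float:
    n = f + b + r
    return (f - b) / n if n else 0.0

print(cd_index(f=80, b=5, r=15))    # cited mostly on its own: ~0.75 (disruptive)
print(cd_index(f=10, b=70, r=20))   # cited alongside its sources: -0.60 (consolidating)
```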
Putting those questions aside, the Nature paper finds trends that are basically consistent across all fields. Apparently, systematic forces have led to declines in these measures of breakthrough scientific findings. The authors try to provide a few explanations as to the forces at play: fewer researchers, incrementalism, and a growing role of large-team research that induces conformity. But if research has become more incremental, that’s more accurately described as a manifestation of the disease, rather than a cause.
Conformity
Steven F. Hayward skewers the authors a little, and perhaps unfairly, stating a concern held by many skeptics of current scientific practices. Hayward says the paper:
“… avoids the most significant and obvious explanation with the myopia of Inspector Clouseau, which is the deadly confluence of ideology and the increasingly narrow conformism of academic specialties.”
Conformism in science is nothing new, and it has often interfered with the advancement of knowledge. The earliest cases of suppression of controversial science were motivated by religious doctrine, but challenges to almost any scientific “consensus” seem to be looked upon as heresy. Several early cases of suppression are discussed here. Matt Ridley has described the case of Mary Wortley Montagu, who visited Ottoman Turkey in the early 1700s and witnessed the application of pus from smallpox blisters to small scratches on the skin of healthy subjects. The mild illness this induced led to immunity, but the British medical establishment ridiculed her. A similar fate was suffered by a Boston physician in 1721. Ridley says:
“Conformity is the enemy of scientific progress, which depends on disagreement and challenge. Science is the belief in the ignorance of experts, as [the physicist Richard] Feynman put it.”
When was the Scientific Boom?
I couldn’t agree more with Hayward and Ridley on the damaging effects of conformity. But what gave rise to our recent slide into scientific conformity, and when did it begin? The Nature study on disruptive science used data on papers and patents starting in 1945. The peak year for disruptive science within the data set was … 1945, but the index values were relatively high over the first two decades of the data set. Maybe those decades were very special for science, with a variety of applications and high-profile accomplishments that have gone unmatched since. As Scott Sumner says in an otherwise unrelated post, in many ways we’ve failed to live up to our own expectations:
“In retrospect, the 1950s seem like a pivotal decade. The Boeing 707, nuclear power plants, satellites orbiting Earth, glass walled skyscrapers, etc., all seemed radically different from the world of the 1890s. In contrast, airliners of the 2020s look roughly like the 707, we seem even less able to build nuclear power plants than in the 1960s, we seem to have a harder time getting back to the moon than going the first time, and we still build boring glass walled skyscrapers.”
It’s difficult to put the initial levels of the “disruptiveness” indices into historical context. We don’t know whether science was even more disruptive prior to 1945, or how the indices used by the authors of the Nature article would have captured it. And it’s impossible to say whether there is some “normal” level of disruptive research. Is a “normal” index value equal to zero, which we now approach as an asymptote?
Some incredible scientific breakthroughs occurred decades before 1945, to take Einstein’s theory of relativity as an obvious example. Perhaps the index value for physical sciences would have been much higher at that time, were it measured. Whether the immediate post-World War II era represented an all-time high in scientific disruption is anyone’s guess. Presumably, the world is always coming from a more primitive base of knowledge. Discoveries, however, usually lead to new and deeper questions. The authors of the Nature article acknowledge and attempt to test for the “burden” of a growing knowledge base on the productivity of subsequent research and find no effect. Nevertheless, it’s possible that the declining pattern after 1945 represents a natural decay following major “paradigm shifts” in the early twentieth century.
The Psychosis Now Known As “Wokeness”
The Nature study used papers and patents only through 2010. Therefore, the decline in disruptive science predates the revolution in “wokeness” we’ve seen over the past decade. But “wokeness” amounts to a radicalization of various doctrines that have been knocking around for years. The rise of social justice activism, critical theory, and anthropogenic global warming theology all began long before the turn of the century and had far-reaching effects that extended to the sciences. The recency of “wokeness” certainly doesn’t invalidate Hayward and Ridley when they note that ideology has a negative impact on research productivity. It’s likely, however, that some fields of study are relatively immune to the effects of politicization, such as the physical sciences. Surely other fields are more vulnerable, like the social sciences.
Citations: Not What They Used To Be?
There are other possible causes of the decline in disruptive science as measured by the Nature study, though the authors believe they’ve tested and found these explanations lacking. It’s possible that an increase in collaborative work led to a change in citation practices. For example, this study found that while self-citation has remained stable, citation of those within an author’s “collaboration network” has declined over time. Another paper identified a trend toward citing review articles in Ecology Journals rather than the research upon which those reviews were based, resulting in incorrect attribution of ideas and findings. That would directly reduce the measured “disruptiveness” of a given paper, but it’s not clear whether that trend extends to other fields.
Believe it or not, “citation politics” is a thing! It reflects the extent to which a researcher is expected to suck up to prominent authors in a field of study, or to anyone else who might be deemed potentially helpful or harmful. In a development that speaks volumes about trends in research productivity, authors are now urged to append a “Citation Diversity Statement” to their papers. Here’s an academic piece addressing the subject of “gendered citation practices” in contemporary physics. The 11 authors of that paper would do well to spend more time thinking about problems in physics than obsessing over whether their world is “unfair”.
Science and the State
None of those other explanations diminish my strong conviction that science has been politicized and that politicization is harming our progress toward a better world. In fact, politicized science usually leads us astray. Perhaps the most egregious example of politicized conformism today is climate science, though the health sciences went headlong toward a distinctly unhealthy conformism during the pandemic (and see this for a dark laugh).
Politicized science leads to both conformism and suppression. Here are several channels through which politicization might create these perverse tendencies and reduce research productivity or disruptiveness:
Political or agenda-driven research is guided by subjective criteria, rather than objective inquiry and even-handed empiricism
Research funding via private or public grants is often contingent upon whether the research can be expected to support the objectives of the funding NGOs, agencies, or regulators. The gravy train is reserved for those who support the “correct” scientific narrative
Promotion or tenure decisions may be sensitive to the political implications of research
Government agencies have been known to block access to databases funded by taxpayers when a scientist wishes to investigate the “wrong questions”
Journals and referees have political biases that may influence the acceptance of research submissions, which in turn influences the research itself
The favorability of coverage by a politicized media influences researchers, who are sensitive to the damage the media can do to one’s reputation
The chance that one’s research might have a public policy impact is heavily influenced by politics
The talent sought and/or attracted to various fields may be diminished by the primacy of political considerations. Indoctrinated young activists generally aren’t the material from which objective scientists are made
Conclusion
In fairness, there is a great deal of wonderful science being conducted these days, despite the claims appearing in the Nature piece and the politicized corruption undermining good science in certain fields. Tremendous breakthroughs are taking place in areas of medical research such as cancer immunotherapy and diabetes treatment. Fusion energy is inching closer to a reality. Space research is moving forward at a tremendous pace in both the public and private spheres, despite NASA’s clumsiness.
I’m sure there are several causes for the 70-year decline in scientific “disruptiveness” measured in the article in Nature. Part of that decline might have been a natural consequence of coming off an early twentieth-century burst of scientific breakthroughs. There might be other clues related to changes in citation practices. However, politicization has become a huge burden on scientific progress over the past decade. The most awful consequences of this trend include a huge misallocation of resources from industrial planning predicated on politicized science, and a meaningful loss of lives owing to the blind acceptance of draconian health policies during the Covid pandemic. When guided by the state or politics, what passes for science is often no better than scientism. There are, however, even in climate science and public health disciplines, many great scientists who continue to test and challenge the orthodoxy. We need more of them!
I leave you with a few words from President Dwight Eisenhower’s Farewell Address in 1961, in which he foresaw issues related to the federal funding of scientific research:
“Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades.
In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.
Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.
The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded.
Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.”
The debate over the Federal Reserve’s policy stance has undergone an interesting but understandable shift, though I disagree with the “new” sentiment. For the better part of this year, the consensus was that the Fed waited too long and was too dovish about tightening monetary policy, and I agree. Inflation ran at rates far in excess of the Fed’s target, but the necessary correction was delayed and weak at the start. This violated the necessary symmetry of a legitimate inflation-targeting regime under which the Fed claims to operate, and it fostered demand-side pressure on prices while risking embedded expectations of higher prices. The Fed was said to be “behind the curve”.
Punch Bowl Resentment
The past few weeks have seen equity markets tank amid rising interest rates and growing fears of recession. This brought forth a chorus of panicked analysts. Bloomberg has a pretty good take on the shift. Hopes from some economists for a “soft landing” notwithstanding, no one should have imagined that tighter monetary policy would be without risk of an economic downturn. At least the Fed has committed to a more aggressive policy with respect to price stability, which is one of its key mandates. To be clear, however, it would be better if we could always avoid “hard landings”, but the best way to do that is to minimize over-stimulation by following stable policy rules.
Price Trends
Some of the new criticism of the Fed’s tightening is related to a perceived change in inflation signals, and there is obvious logic to that point of view. But have prices really peaked or started to reverse? Economist Jeremy Siegel thinks signs point to lower inflation and believes the Fed is being too aggressive. He cites a series of recent inflation indicators that have been lower in the past month. Certainly a number of commodity prices are generally lower than in the spring, but commodity indices remain well above their year-ago levels and there are new worries about the direction of oil prices, given OPEC’s decision this week to cut production.
Central trends in consumer prices show that there is a threat of inflation that may be fairly resistant to economic weakness and Fed actions, as the following chart demonstrates:
Overall CPI growth stopped accelerating after June, and it wasn’t just moderation in oil prices that held it back (and that moderation might soon reverse). Growth of the Core CPI, which excludes food and energy prices, stopped accelerating a bit earlier, but growth in the CPI and the Core CPI are still running above 8% and 6%, respectively. More worrisome is the continued upward trend in more central measures of CPI growth. Growth in the median component of the CPI continues to accelerate, as has the so-called “Trimmed CPI”, which excludes the most extreme sets of high and low growth components. The response of those central measures lagged behind the overall CPI, but it means there is still inflationary momentum in the economy. There is a substantial risk that more permanent inflation is becoming embedded in expectations, and therefore in price and wage setting, including long-term contracts.
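For the uninitiated, the median and trimmed measures are computed each month from the distribution of price changes across CPI components, weighted by expenditure shares. Here is a rough sketch of the idea with hypothetical component data (the published versions, such as the Cleveland Fed’s, differ in detail, including the trimming percentages):

```python
# Rough sketch of a weighted median and trimmed mean across CPI components.
# Component price changes and expenditure weights below are hypothetical.
def central_measures(changes, weights, trim=0.08):
    pairs = sorted(zip(changes, weights))              # sort components by price change
    total = sum(w for _, w in pairs)
    # Weighted median: the component at the 50th percentile of weight.
    cum, median = 0.0, pairs[-1][0]
    for chg, w in pairs:
        cum += w
        if cum >= total / 2:
            median = chg
            break
    # Trimmed mean: drop the most extreme `trim` share of weight on each tail.
    lo, hi = trim * total, (1 - trim) * total
    cum = kept_sum = kept_w = 0.0
    for chg, w in pairs:
        keep = max(0.0, min(cum + w, hi) - max(cum, lo))  # weight inside the kept band
        kept_sum += chg * keep
        kept_w += keep
        cum += w
    return median, kept_sum / kept_w

changes = [12.0, 8.5, 7.0, 6.2, 5.5, 4.0, 1.0, -3.0]      # percent, annualized (hypothetical)
weights = [0.05, 0.10, 0.15, 0.20, 0.20, 0.15, 0.10, 0.05]
median, trimmed = central_measures(changes, weights)
print(f"Median: {median:.1f}%   Trimmed mean: {trimmed:.1f}%")
```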
The Fed pays more attention to a measure of prices called the Personal Consumption Expenditures (PCE) deflator. Unlike the CPI, the PCE deflator accounts for changes in the composition of a typical “basket” of goods and services. In particular, the Fed focuses most closely on the Core PCE deflator, which excludes food and energy prices. Inflation in the PCE deflator is lower than the CPI, in large part because consumers actively substitute away from products with larger price increases. However, the recent story is similar for these two indices:
Both overall PCE inflation and Core PCE inflation stopped accelerating a few months ago, but growth in the median PCE component has continued to increase. This central measure of inflation still has upward momentum. Again, this raises the prospect that inflationary forces remain strong, and that higher and more widespread expected inflation might make the trend more difficult for the Fed to rein in.
That leaves the Fed little choice if it hopes to bring inflation back down to its target level. It’s really only a choice of whether to do it faster or slower. One big qualification is that the Fed can’t do much about supply shortfalls, which have been a source of price pressure since the start of the rebound from the pandemic. However, demand pressures have been present since the acceleration in price growth began in earnest in early 2021. At this point, it appears that they are driving the larger part of inflation.
The following chart shows share decompositions for growth in both the “headline” PCE deflator and the Core PCE deflator. Actual inflation rates are NOT shown in these charts. Focus only on the bolder colored bars. (The lighter bars represent estimates having less precision.) Red represents “supply-side” factors contributing to changes in the PCE deflator, while blue summarizes “demand-side” factors. This division is based on a number of assumptions (methodological source at the link), but there is no question that demand has contributed strongly to price pressures. At least that gives a sense about how much of the inflation can be addressed by actions the Fed might take.
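As I understand the methodology, decompositions of this sort classify each spending category by whether its unexpected price and quantity movements go in the same direction (demand-driven) or in opposite directions (supply-driven), with statistically weak cases labeled ambiguous. A toy version of that sign rule, with made-up category data:

```python
# Toy sign-rule classification of spending categories as demand- or supply-driven.
# The "surprise" price and quantity moves below are hypothetical residuals.
categories = {
    "restaurants": (+0.8, +0.6),   # price and quantity both up: demand
    "used cars":   (+1.5, -0.9),   # price up, quantity down: supply
    "apparel":     (+0.1, +0.05),  # moves too small to call: ambiguous
}

def classify(price_surprise, quantity_surprise, threshold=0.2):
    if abs(price_surprise) < threshold or abs(quantity_surprise) < threshold:
        return "ambiguous"
    return "demand" if price_surprise * quantity_surprise > 0 else "supply"

for name, (dp, dq) in categories.items():
    print(f"{name:12s} -> {classify(dp, dq)}")
```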
I mentioned the role of expectations in laying the groundwork for more permanent inflation. Expected inflation not only becomes embedded in pricing decisions: it also leads to accelerated buying. So expectations of inflation become a self-fulfilling prophecy that manifests on both the supply side and the demand side. Firms are planning to raise prices in 2023 because input prices are expected to continue rising. In terms of the charts above, however, I suspect this phenomenon is likely to appear in the “ambiguous” category, as it’s not clear that the counting method can discern the impacts of expectations.
What’s a Central Bank To Do?
Has the Fed become too hawkish as inflation accelerated this year while proving to be more persistent than expected? One way to look at that question is to ask whether real interest rates are still conducive to excessive rate-sensitive demand. With PCE inflation running at 6 – 7% and Treasury yields below 4%, real returns are still negative. That hardly seems like a prescription for taming inflation, nor does it seem “hawkish”. Rate increases, however, are not the most reliable guide to the tenor of monetary policy. As both John Cochrane and Scott Sumner point out, interest rate increases are NOT always accompanied by slower money growth or slowing inflation!
However, Cochrane has demonstrated elsewhere that it’s possible the Fed was on the right track with its earlier dovish response, and that price pressures might abate without aggressive action. I’m skeptical to say the least, and continuing fiscal profligacy won’t help in that regard.
The Policy Instrument That Matters
Ultimately, the best indicator that policy has tightened is the dramatic slowdown (and declines) in the growth of the monetary aggregates. The charts below show five years of year-over-year growth in two monetary measures: the monetary base (bank reserves plus currency in circulation), and M2 (checking, saving, and money market accounts plus currency).
Growth of these aggregates slowed sharply in 2021 after the Fed’s aggressive moves to ease liquidity during the first year of the pandemic. The monetary base and M2 growth have slowed much more in 2022 as the realization took hold that inflation was not transitory, as had been hoped. Changes in the growth of the money stock take time to influence economic activity and inflation, but perhaps the effects have already begun, and they will probably be felt in earnest during the first half of 2023.
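Anyone can replicate the basic money growth calculation from public data. A minimal sketch, assuming the pandas_datareader package and the FRED series codes M2SL (M2) and BOGMBASE (monetary base):

```python
# Year-over-year growth of M2 and the monetary base from FRED (monthly data).
# Assumes the pandas_datareader package and the series codes M2SL and BOGMBASE.
import pandas_datareader.data as web

series = web.DataReader(["M2SL", "BOGMBASE"], "fred", start="2017-01-01")
yoy = series.pct_change(12) * 100      # 12-month percent change
print(yoy.dropna().tail())             # negative values indicate outright contraction
```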
The Protuberant Balance Sheet
Since June, the Fed has also taken steps to reduce the size of its bloated balance sheet. In other words, it is allowing its large holdings of U.S. Treasuries and Agency Mortgage-Backed Securities to shrink. These securities were acquired during rounds of so-called quantitative easing (QE), which were a major contributor to the money growth in 2020 that left us where we are today. The securities holdings were about $8.5 trillion in May and now stand at roughly $8.2 trillion. Allowing the portfolio to run off reduces bank reserves and liquidity. The process was accelerated in September, but there is increasing concern among analysts that this quantitative tightening will cause disruptions in financial markets and ultimately the real economy. There is no question that reducing the size of the balance sheet is contractionary, but that is another necessary step toward reducing the rate of inflation.
The Federal Spigot
The federal government is not making the Fed’s job any easier. The energy shortages now afflicting markets are largely the fault of misguided federal policy restricting supplies, with an assist from Russian aggression. Importantly, however, heavy borrowing by the U.S. Treasury continues with no end in sight. This puts even more pressure on financial markets, especially when such ongoing profligacy leaves little question that the debt won’t ever be repaid out of future budget surpluses. The only way the government’s long-term budget constraint can be preserved is if the real value of that debt is bid downward. That’s where the so-called inflation tax comes in, and however implicit, it is indeed a tax on the public.
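The arithmetic of the inflation tax is simple enough: surprise inflation shrinks the real value of outstanding nominal debt, transferring resources from bondholders to the government. A stylized example with made-up numbers:

```python
# Stylized inflation-tax arithmetic; all numbers are hypothetical.
nominal_debt = 26.0          # trillions of dollars outstanding
expected_inflation = 0.02    # inflation priced in by bondholders
actual_inflation = 0.06      # realized inflation

real_debt_expected = nominal_debt / (1 + expected_inflation)
real_debt_actual = nominal_debt / (1 + actual_inflation)
transfer = real_debt_expected - real_debt_actual

print(f"Real debt erosion from surprise inflation: ~${transfer:.2f} trillion")
```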
Don’t Dismiss the Real Costs of Inflation
Inflation is a costly process, especially when it erodes real wages. It takes its greatest toll on the poor. It penalizes holders of nominal assets, like cash, savings accounts, and non-indexed debt. It creates a high degree of uncertainty in interpreting price signals, which ordinarily carry information to which resource flows respond. That means it confounds the efficient allocation of resources, costing all of us in our roles as consumers and producers. The longer it continues, the more it erodes our economy’s ability to enhance well being, not to mention the instability it creates in the political environment.
Imminent Recession?
So far there are only limited signs of a recession. Granted, real GDP declined in both the first and second quarters of this year, but many reject that standard as overly broad for calling a recession. Moreover, consumer spending held up fairly well. Employment statistics have remained solid, though we’ll get an update on those this Friday. Nevertheless, payroll gains have held up and the unemployment rate edged up to a still-low 3.7% in August.
Those are backward-looking signs, however. The financial markets have been signaling recession via the inverted yield curve, which is a pretty reliable guide. The weak stock market has taken a bite out of wealth, which is likely to mean weaker demand for goods. In addition to energy-supply shocks, the strong dollar makes many internationally-traded commodities very costly overseas, which places the global economy at risk. Moreover, consumers have run-down their savings to some extent, corporate earnings estimates have been trimmed, and the housing market has weakened considerably with higher mortgage rates. Another recent sign of weakness was a soft report on manufacturing growth in September.
Deliver the Medicine
The Fed must remain on course. At least it has pretensions of regaining credibility for its inflation targeting regime, and ultimately it must act in a symmetric way when inflation overshoots its target, as it now has. It’s not clear how far the Fed will have to go to squeeze demand-side inflation down to a modest level. It should also be noted that as long as supply-side pressures remain, it might be impossible for the Fed to engineer a reduction of inflation to as low as its 2% target. Therefore, it must always bear supply factors in mind to avoid over-contraction.
As to raising the short-term interest rates the Fed controls, we can hope we’re well beyond the halfway point. Reductions in the Fed’s balance sheet will continue in an effort to tighten liquidity and to provide more long-term flexibility in conducting operations, at least until bank reserves threaten to fall below the Fed’s so-called “ample reserves” criterion, which is intended to give banks the wherewithal to absorb small shocks. Signs that inflationary pressures are abating are a minimum requirement for laying off the brakes. Clear signs of recession would also lead to more gradual moves or possibly a reversal. But again, demand-side inflation is not likely to ease very much without at least a mild recession.
In advanced civilizations the period loosely called Alexandrian is usually associated with flexible morals, perfunctory religion, populist standards and cosmopolitan tastes, feminism, exotic cults, and the rapid turnover of high and low fads---in short, a falling away (which is all that decadence means) from the strictness of traditional rules, embodied in character and inforced from within. -- Jacques Barzun