You might know someone so smart and multi-talented that they are objectively better at everything than you. Let’s call him Harvey Specter. Harvey’s prospects on the labor market are very good. Economists would say he has an absolute advantage over you in every single pursuit! What a bummer! But obviously that doesn’t mean Harvey can or should do everything, while you do nothing.
Fears of Human Obsolescence
That’s the very situation many think awaits workers with the advent of artificial general intelligence (AGI), and especially with the marriage of AGI and advanced robotics (also see here). Any job a human can do, AGI or AGI robots of various kinds will be able to do better, faster, and in far greater quantity. The humanoid AGI robots will be like your talented acquaintance Harvey, but exponentiated. They won’t need much “sleep” or downtime, and treating wear and tear on their “health” will be a simple matter of replacing components. AGI and its robotic manifestations will have an absolute advantage in every possible endeavor.
But even with the existence of super-human AGI robots, I claim that work will be available to you if you want or need it. You won’t face the same set of pre-AGI opportunities, but there will be many opportunities for humans nonetheless. How can that be if AGI robots can do everything better? Won’t they be equipped to meet all of our material needs and wants?
Specter of the Super Productive
Let’s return to the example of you and Harvey, your uber-talented acquaintance. You’ll each have an area of specialization, but on what basis? Harvey has his pick of very lucrative and stimulating opportunities. You, however, are limited to a less dazzling array of prospects. There might be some overlap, and hard work or luck can make up for large differences, but chances are you’ll specialize in something that requires less talent than Harvey. You might wind up in the same profession, but Harvey will be a star.
Where will you end up? The answer is that you and Harvey will find your respective areas of specialization based on comparative advantage, not absolute advantage. The key is relative opportunity cost, or its inverse: how much you expect to gain from a given area of specialization relative to the rewards you must forego.
For example, Harvey doesn’t sacrifice much by shunning less challenging areas of specialization. That is, he faces a low opportunity cost, while his chosen area offers great rewards for his talent.
You, on the other hand, might not have much to gain in Harvey’s line of work, if you can get it. You might be a flop if you do! Realistically, you forego very little if you instead pursue more achievable success in a less daunting area. You’ll be better off choosing an option for which your relative gains are highest, or said differently, where your relative opportunity cost is low.
A Quick Illustration
If you’re unwilling to slog through a simple numerical example, skip this section and the graph below. The graph was produced the old-fashioned way: by a human being with a pencil, paper, ruler, and smartphone camera.
Here goes: Harvey can produce up to 100 units of X per period or 100 units of Y, or some linear combination of the two. Harvey’s opportunity costs are constant along this tradeoff between X and Y because it’s a straight line. It costs him one unit of Y output to produce every additional unit of X, and vice versa.
You, on the other hand, cannot produce X or Y as well as Harvey in an absolute sense. At most, you can produce up to 50 units of X per period, 20 units of Y, or some combination of the two along your own constant cost (straight line) tradeoff. You sacrifice 5/2 = 2.5 units of X to produce each unit of Y, so Harvey has the lower opportunity cost and a comparative advantage for Y. But it only costs you 2/5 = 0.4 units of Y to produce each additional unit of X, so you have a comparative advantage over Harvey in X production.
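For readers who’d rather let a few lines of code do the bookkeeping, here is a minimal sketch of the same comparison. The output limits come straight from the example above; the function and variable names are just my own illustrative labels.

```python
# Maximum per-period output of X and Y for each producer (from the example above).
harvey = {"X": 100, "Y": 100}
you = {"X": 50, "Y": 20}

def opportunity_costs(limits):
    """Constant opportunity costs along a straight-line production tradeoff."""
    return {
        "Y per extra X": limits["Y"] / limits["X"],
        "X per extra Y": limits["X"] / limits["Y"],
    }

harvey_oc = opportunity_costs(harvey)  # {'Y per extra X': 1.0, 'X per extra Y': 1.0}
you_oc = opportunity_costs(you)        # {'Y per extra X': 0.4, 'X per extra Y': 2.5}

# Comparative advantage goes to whoever gives up less of the other good.
assert you_oc["Y per extra X"] < harvey_oc["Y per extra X"]    # you specialize in X
assert harvey_oc["X per extra Y"] < you_oc["X per extra Y"]    # Harvey specializes in Y
```

Despite Harvey’s absolute advantage in both goods, the assertions hold: specialization is dictated by the relative tradeoffs, not by the absolute output limits.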
Reciprocal Advantages
In the end, you and Harvey specialize in the respective areas for which each of you has the lowest relative opportunity cost and a comparative advantage. Unless your respective tradeoffs have identical slopes (unlikely), the reciprocal nature of opportunity costs dictates that if Harvey has a comparative advantage in one area of production, you have a comparative advantage in the other.
Obviously, Harvey’s formidable absolute advantage over you in everything doesn’t impinge on these choices. In the real world, of course, comparative advantages play out across many dimensions of output, but the principle is the same. And once we specialize, we can trade with one another to mutual advantage.
No Such Thing As a Free AGI Robot
That brings us back to AGI and AGI robots. Like Harvey, they might well have an absolute advantage in every area of specialization, or they can learn quickly to achieve such an advantage, but that doesn’t mean they should do everything!
Just as in times preceding earlier technological breakthroughs, we cannot even imagine the types of jobs that will dominate the human and AGI work forces in the future. We already see complementarity between humans and AGI in many applications. AGI makes those workers much more productive, which leads to higher wages.
However, substitution of AGIs for human labor is a dominant theme of the many AGI “harm” narratives. In fact, substitution is already a reality in many occupations, like coding, and substitution is likely to broaden and intensify as the marriage of AGI and robotics gains speed. But that will occur only in industries for which the relative opportunity costs of AGIs, including all of the ancillary resources needed to produce them, are favorable. Among other things, AGI will require a gigantic expansion in energy production and infrastructure, which necessitates a massive exploitation of resources. Relative opportunity costs in the use of these resources will not always favor the dominance of AGIs in production. Like Harvey, AGIs and their ancillary resources cannot do everything because they cannot have comparative advantages without reciprocal comparative disadvantages.
Super-Abundance vs. Scarcity
Some might insist that AGIs will lead to such great prosperity that humans will no longer need to work. All of our material wants will be met in a new age of super-abundance. Despite the foregoing, that might suggest to some that AGIs will do everything! But here I make another claim: our future demands on resources will not be satisfied by whatever abundance AGIs make possible. We will still want to do more, whether we choose to construct fusion reactors, megastructures in space (like Dyson spheres or ring worlds), terraform Mars, undertake interstellar travel, perfect asteroid defense, battle disease, extend longevity, or improve our lives in ways now imagined or unimagined.
As a result, scarcity will remain a major force. To that extent, resources will have competing uses, they will face opportunity costs, and they will have comparative advantages vis-à-vis alternative uses to which they can be put. Scarcity is a reality that governs opportunity costs, and that means humans will always have roles to play in production.
Concluding Remarks
I wrote about human comparative advantages once before, about seven years ago. I think I was groping along the right path. The only other article I’ve seen to explicitly mention a comparative advantage of human labor vs. AGIs in the correct context is by Andrew Mayne in the most recent issue of Reason Magazine. It’s almost a passing reference, but it deserves more because it is foundational.
Harvey Specter shouldn’t occupy his scarce time performing tasks that compromise his ability to deliver his most rewarding services. Likewise, before long it will become apparent that highly productive AGI assets, and the resources required to build and operate them, should not be tied up in activities that humans can perform at lesser sacrifice. That’s a long way of saying that humans will still have productive roles to play, even when AGI achieves an absolute advantage in everything. Some of the roles played by humans will be complementary to AGIs in production, but human labor will also be valuable as a substitute for AGI assets in other applications. As long as AGI assets have any comparative advantages, humans will have reciprocal comparative advantages as well.
Housing costs are taking a toll on many Americans. Home prices have risen about 47% cumulatively since 2020, while higher mortgage rates have compounded the difficulties faced by potential homebuyers. Meanwhile, rents are up about 23% over the same period. There just aren’t enough homes available, and the primary cause is an extensive set of regulatory obstacles to increasing the supply of homes.
High housing costs are often blamed on various manifestations of greed. Renters tend to resent their landlords, while those suffering from housing sticker-shock sometimes cast paranoid blame on people with second homes, investor properties, Airbnb rentals, and even residential developers, as if those seeking to build new housing are at the root of the problem.
Quite the contrary: we have an acute shortage of housing. The chart below shows how home vacancy rates have fallen to a level that can’t accommodate the normal frictions associated with housing turnover.
Doubts about this shortfall might owe to confusion over the meaning of one statistic: our high current level of housing units per capita. It does not indicate a plentiful stock of housing, as some assume. Alex Tabarrok, in commenting favorably on a lengthier post by Kevin Erdmann, offers a simple example demonstrating that units per capita is not a reliable guide to the adequacy of housing supply:
“Suppose we have 100 homes and 100 families, each with 2 parents and 2 kids. Thus, there are 100 homes, 400 people and 0.25 homes per capita. Now the kids grow up, get married, and want homes of their own but they have fewer kids of their own, none for simplicity. Imagine that supply increases substantially, say to 150 homes. The number of homes per capita goes up to 150/400 (.375), an all time high! Supply-side skeptics are right about the numbers, wrong about the meaning. The reality is that the demand for homes has increased to 200 but supply has increased to just 150 leading to soaring prices.”
Fewer kids have led to more homes per capita even as we suffer from a shortage of housing. In the long run, lower fertility might make it easier for housing supply to catch up with demand, but not if government continues to hamstring housing construction. Only new construction can rectify this shortfall.
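For anyone who wants the arithmetic behind that conclusion laid out explicitly, here is a minimal sketch; the numbers are Tabarrok’s, and the framing of demand as a count of households is my own shorthand.

```python
# Before: 100 homes, 100 families of four (from Tabarrok's example).
homes, people = 100, 400
print(homes / people)        # 0.25 homes per capita

# After: the 200 grown kids form 100 new two-person households (none with kids of their own),
# so total demand is 200 households, while supply rises only to 150 homes.
homes, households = 150, 200
print(homes / people)        # 0.375 homes per capita, an all-time high
print(households - homes)    # 50 households still without homes of their own
```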
That’s the message of Bryan Caplan’s “Build, Baby, Build”. Caplan has been a prominent advocate of eliminating obstacles to the construction of new housing. His book is unusual among contributions to the economics literature because it tells the story of counterproductive housing policy in the form of a “graphic novel”, which is to say an elaborate comic book. Caplan appears in the book as protagonist, teacher, and persistent gadfly.
Government obstructs additions to the supply of housing in a variety of ways: rent controls, zoning laws, density restrictions, height limits, environmental rules, and compliance paperwork. And very often these interventions are supported by existing occupants and even owners of existing homes as a matter of NIMBYism. Construction of new homes, the sure answer to the problem of an inadequate supply of housing, is actively resisted. These limitations have widespread implications for the health of the economy.
As Caplan points out, the scarcity and expense of housing limits mobility, so workers are often unable to exploit opportunities that require a move, particularly to areas of rapid growth. This makes it difficult for the labor market to adjust to negative shocks or long-term decline that might displace workers in specific locales. The mobility of resources is key to a well-functioning economy, but our policies fail miserably on this count.
Rent control is an insidious policy option usually favored in dense urban areas by current renters as well as politicians seeking a visible and easy “fix” to rising rental rates. The problem is obvious: rent control destroys incentives to improve or even maintain properties. Depending on specific rules, it might even discourage development of new rental units. The result is a slow decay of the existing housing stock.
Zoning laws are an old tool of NIMBYism. The objective is to keep multifamily housing (or certain kinds of commercial development) safely away from single-family neighborhoods, or to prevent developments with relatively small lot sizes. There is also agricultural zoning, which can prevent new development along urban peripheries. It’s not difficult to understand how restrictive zoning causes rents and housing prices to escalate.
Similarly, density limits, height restrictions, burdensome filing requirements, and environmental rules all work to limit the supply of new homes.
As if crushing the supply side wasn’t enough, housing costs will come under pressure from the demand side as the Biden Administration pushes new home-buying subsidies. The Administration proposes tax credits of $400 a month (at least while mortgage rates remain elevated) and an end to title insurance fees on government-backed mortgages. This would drive prices higher still. The Administration also threatens to prosecute landlords who “collude” by utilizing third-party algorithms for information in establishing rental rates. Finally, Biden proposes to dedicate billions to the construction of affordable housing, but the history of affordable housing initiatives and building subsidies is one of drastically inflated costs. Biden’s plan is unlikely to differ in that regard.
As wrongheaded as it is, the fact that the public is often favorably disposed to so much housing regulation is easy to understand. Rent controls prevent increases in rents to existing tenants, an easily “seen” benefit. The deleterious long-term consequences on the stock of housing are “unseen”, in the language of Frederic Bastiat.
As for zoning, homeowners are resistant to the construction of nearby “low-value” units for a variety of reasons, some aesthetic and some practical, like maintaining home values or preventing excessive traffic. “Keeping the riffraff out” is undoubtedly at play as well.
This resistance extends well beyond the limits of enforcing private property rights. It is pure rent-seeking behavior in the public sphere for private benefit. Politicians and government officials tend to view the motives behind zoning as sensible, however, despite the long-term consequences of strict zoning for housing supply. Similarly, environmental restrictions sound all well and good, but they too have their “unseen” negative consequences.
Most puzzling is the animus with which so many regard private residential developers, who generally build what people want: low-density suburban enclaves. Developers do it for profit, but this alienates voters who are ignorant of the economic role of profit. As in any other pursuit, profit creates a basic incentive to undertake development, to provide the kinds of homes and neighborhood amenities demanded by consumers, and to do so efficiently.
On the other hand, sprawling development inflicts external costs on incumbent residents due to added congestion, and developers and their home buyers benefit from the provision of roads that are free to users. The solution is to internalize the cost of building roads by pricing their use. Homebuyers would then weigh the value of buying in a particular area against the full marginal cost, including road use, while helping to defray the cost of maintenance and upgrades to roads and other infrastructure.
Our housing policies restrict the actions of landlords, developers, and ultimately consumers of housing. The misallocations of resources occur every time a tenant or homeowner feels they can’t afford to move in response to changing circumstances. Here is Veronique de Rugy, in an article inspired by Ryan Bourne’s “The War on Prices”, on the constraints imposed on individuals by one form of misguided intervention (my bracketed additions):
“Prices and wages [and housing rents] set on market dynamics reflect underlying economic realities and then send out a signal for help. Price [rent] controls only mask these realities, which inevitably worsens the economy’s ability to respond with what ordinary consumers and workers need.”
But our housing problem is not solely caused by interference with the price mechanism. Rather, excessive regulation of rents and a panoply of other details of the legal environment for housing have led to our current shortfall. The lesson is to deregulate and to let developers build (and rehabilitate) the housing that people need.
The chart above makes a convincing case that we have a spending problem at the federal level. Really, we’ve had a spending problem for a long time. But at least tax revenue today remains reasonably well-aligned with its 50-year historical average as a share of GDP. Not spending. Even larger deficits opened up during the pandemic and they haven’t returned to pre-pandemic levels.
We’ve seen Joe Biden break spending records. His initiatives, often of questionable merit, have included the $1.8 trillion American Rescue Plan and the nearly $0.8 trillion Infrastructure Investment and Jobs Act, along with several other significant spending measures such as the Promise to Address Comprehensive Toxics Act and the subsidy-laden CHIPS Act. Meanwhile, emergency spending has become a regular occurrence on Biden’s watch. More recently, he’s made repeated efforts to forgive massive amounts of student loans despite the Supreme Court’s clear ruling that such gifts are unconstitutional.
Indeed, while Biden keeps pretty busy spinning tales of his days driving an 18-wheeler, cannibals devouring his Uncle Bosie Finnegan, his upbringing in black churches, synagogues, or in the Puerto Rican community, he still finds time to dream up ways for the government to spend money it doesn’t have. Or his kindly puppeteers do.
Biden’s New Budget
Eric Boehm expressed wonderment at Biden’s fiscal 2025 budget not long after its release in March. He was also mystified by the gall it took to produce a “fact sheet” in which the White House congratulated itself on fiscal responsibility. That’s how this Administration characterizes deficits projected at $16 trillion over the next ten years. No joke!
Furthermore, the Administration says the record spending will be “paid for”. Well, yes, with tax increases and lots of borrowing! There are a great many fabulist claims made by the White House about the budget. This link from the Office of Management and Budget includes a handy list of propaganda sheets they’ve managed to produce on the virtues of their proposal.
The Congressional Budget Office (CBO) projects ten-year deficits under current law that are $3 trillion higher than Biden’s proposed budget. That’s the basis of the White House’s boast of fiscal restraint. But the difference is basically paid for with a couple of accounting tricks (see below). More charitably, one could say it’s paid for with higher taxes, aided by the assumption of slightly faster economic growth. The latter will be a good trick while undercutting incentives and wages with a big boost to the corporate tax rate.
The revenue projected by the White House from those taxes does not come anywhere close to eliminating the gap shown in the CBO’s chart above. Federal spending under Biden’s budget grows at about 4% annually, just a bit slower than nominal GDP. Thus, the federal share of GDP remains roughly constant and only slightly higher than the CBO’s current projection for 2034. Nevertheless, spending relative to GDP would continue at an historically high rate. Over the next decade, it would average more than three percentage points of GDP above its 50-year average. That would be about $1.3 trillion in 2034!
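As a rough sanity check on that last figure, the sketch below assumes nominal GDP in the neighborhood of $43 trillion by 2034. That GDP level is my own ballpark, not a number from the budget documents.

```python
# Back-of-envelope check of the "$1.3 trillion in 2034" figure (assumed inputs).
assumed_nominal_gdp_2034 = 43e12   # assumption: roughly $43 trillion nominal GDP in 2034
excess_spending_share = 0.03       # spending about 3 percentage points of GDP above its 50-year average
print(assumed_nominal_gdp_2034 * excess_spending_share)   # ~1.29e12, i.e. about $1.3 trillion
```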
Meanwhile, the ratio of tax revenue to GDP under Biden’s proposal, as they project it, would average slightly higher than its 50-year average, reaching a full percentage point above by 2034 (and higher than the CBO baseline). That’s probably optimistic.
There is little real effort in this budget to reduce federal deficits, even with Treasury borrowing rates now near 15-year highs. Interest expense has grown to an alarming share of spending. In fact, it’s expected to exceed spending on defense in 2024! Perhaps not coincidentally, the White House assumes a greater decline in interest rates than the CBO does over the next 10 years.
Treats or Tricks?
The situation is likely worse than the White House depicts, given that its budget incorporates assumptions that look generous to their claim of fiscal restraint. First, they frontload nondefense discretionary spending, allowing Biden to make extravagant promises for the near-term while pushing off steep declines in budget commitments to the out-years. The sharp reductions in this category of spending pares more than $2 trillion from the 10-year deficit. From the link above:
Biden also proposes to restore the expanded child tax credit — for one year! How handy from a budget perspective: heroically call for an expanded credit (for a year) while avoiding, for the time being, the addition of a couple of trillion to the 10-year deficit.
Code Red
So where does this end? The ratio of federal debt to GDP will resume its ascent after a slight decline from the pandemic high. Here is the CBO’s projection:
The Biden budget shows a relatively stable debt to GDP ratio through 2034 due to the assumptions of slightly faster GDP growth, lower Treasury borrowing rates, and the aforementioned “fiscal restraint”. But don’t count on it!
The government’s growing dominance over real resources will have negative consequences for growth in the long term. Purely as a fiscal matter, however, it must be paid for in one of three ways: revenue from explicit taxes, federal borrowing, or an implicit tax on the public more commonly known as the inflation tax. The last two are intimately related.
Bond investors always face at least a small measure of default risk even when lending to the U.S. Treasury. There is almost no chance the government would ever default outright by failing to pay interest or principal when due. However, investors hold an expectation that the value of their bonds will erode in real terms due to inflation. To compensate, they demand an “inflation premium” in the interest rate they earn on Treasury bonds. But an upside surprise to inflation would constitute a “soft default” on the real value of their bonds. This occurred during and after the pandemic, and it was triggered by a burgeoning federal deficit.
Brief Mechanics
John Cochrane has explained the mechanism by which acts of fiscal profligacy can be transmitted to the price of goods. The real value of outstanding federal debt cannot exceed the expected real value of future surpluses (a present value summed across positive and negative surpluses). If expected surpluses are reduced via some emergency or shock such that repayment in real terms is less likely, then the real value of government debt must fall. That means either interest rates or the price level must rise, or some combination of the two.
The Federal Reserve can prevent interest rates from rising (by purchasing bonds and increasing the money supply), but that leaves a higher price level as the only way the real value of debt can come into line. In other words, an unexpected increase in the path of federal deficits would be financed by money printing and an inflation tax. The incidence of this unexpected “implicit” tax falls not only to bondholders, but also on the public at large, who suffer an unexpected decline in the purchasing power of their nominal assets and incomes. This in turn tends to free-up real resources for government absorption.
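For those who prefer the mechanism in symbols, a stylized version of the debt valuation condition Cochrane works with looks roughly like this (my shorthand, not his exact notation):

```latex
\frac{B_{t-1}}{P_t} \;=\; E_t \sum_{j=0}^{\infty} \frac{s_{t+j}}{(1+r)^{j}}
```

Here B is the nominal debt outstanding, P the price level, s the stream of future real primary surpluses, and r a real discount rate. If expected surpluses fall and the Fed holds nominal rates down, the left side can only shrink to match the right side through a higher price level.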
Government Debt Is Risky
It appears that investors expect the future deficits now projected by the CBO (and the White House) to be paid down someday, to some extent, by future surpluses. That might seem preposterous, but markets apparently aren’t surprised by the projected deficits. After all, fiscal policy decisions can change tremendously over the course of a few years. But it still feels like excessive optimism. Whatever the case, Cochrane cautions that the next fiscal emergency, be it a new pandemic, a war, a recession, or some other crisis, is likely to create another huge expansion in debt and a substantial increase in the price level. Joe Biden doesn’t seem inclined to put us in a position to deal with that risk very effectively. Unfortunately, it’s not clear that Donald Trump will either. And neither seems inclined to seriously address the insolvencies of Social Security and Medicare. If unaddressed, those mandatory obligations will become real crises over the next decade.
The current protests on college campuses across the nation bring into focus differing opinions on the limits of free speech and assembly. Particular questions seem to defy resolution. Nevertheless, there is some misunderstanding regarding the settled breadth of the First Amendment.
The protestors have acted as if they have constitutional carte blanche to gather anywhere to say anything in opposition to Israel and its war against Hamas terrorists; a subset thinks this encompasses “occupation” of any space for any duration; a still smaller subset believes this includes a right to condemn Jews, all Jews.
I strongly doubt, however, that many of the protestors truly believe their constitutional protections extend to intimidation and bullying of Jewish students attempting to go about their business on campus (scroll to a few of the articles here), destruction of property, or the use of “fighting words”, or physical attacks on Jews or other “oppressors”.
It’s well known that the Constitution does not protect “fighting words”, including threats. Furthermore, Eugene Volokh explains that there is no constitutional right to “occupy” a college campus, either public or private.
Of course, private schools are not legally bound to respect free speech or assembly rights. They can regulate activity on their private campuses in any way they see fit. Some explicitly abide by the same rights as public universities, which seems reasonable for any institution dedicated to the free spirit of inquiry.
Volokh, however, cites Supreme Court precedents in which a majority held that government can prohibit camping in certain parks, for example, and that public colleges and universities can impose restrictions on campus activities:
“There is no First Amendment right to camp out in any university, public or private. Indeed, there is no First Amendment right to camp out even in public parks (see Clark v. CCNV (1984)), and the government’s power to limit the use of property used for a public university is even greater than its power as to parks (Widmar v. Vincent (1981)):
“‘A university differs in significant respects from public forums such as streets or parks or even municipal theaters. A university’s mission is education, and decisions of this Court have never denied a university’s authority to impose reasonable regulations compatible with that mission upon the use of its campus and facilities. We have not held, for example, that a campus must make all of its facilities equally available to students and nonstudents alike, or that a university must grant free access to all of its grounds or buildings.’
“Likewise, if UC Berkeley had held a law student party in the law school building rather than at Dean Chemerinsky’s house, it could have stopped students from using the party as an occasion to orate to the audience (especially with their own sound amplification devices, which the student brought to Chemerinsky’s house). See Spears v. Arizona Bd. of Regents (D. Ariz. 2019) (upholding public university’s right to stop people from speaking with sound amplification at an on-campus book fair).”
Volokh also notes, however, that public universities cannot restrict mere “offensive” expression, which would include certain antisemitic statements or even swastikas (for example), as long as the expression falls short of “fighting words” or explicit threats. Do calls for the “extermination of Jews” qualify as fighting words? That deserves a resounding yes. It’s clearly hate speech, and it’s exactly the sort of expression that might be deemed so offensive to counterprotestors (for example) as to constitute an immediate threat to public order.
Does the meaning of “fighting words” include such chants as “From the river to the sea…”? Some say that depends on the speaker, but that can’t provide a sound basis of distinction. It is clearly associated with calls to eliminate the state of Israel. Some believe it also implies the genocide of Jews in Israel, and Jews can’t be blamed for finding it threatening. Okay, how about “Intifada”? I doubt all of the students involved in the current protests understand the genocidal implications of these words. The agitators understand them well enough.
This is a grey area in our understanding of the First Amendment. The “River to the Sea” chant, and Intifada, seem like fighting words to me, but they might not qualify as direct threats to anyone on campus. By comparison, the swastika is “just” a party emblem, whatever policies it stands for, and apparently the Court did not deem it a direct threat to anyone in Skokie, Illinois. The legal distinctions here feel inadequate. Still, we say the “mere” expression of offensive ideas or symbols is protected speech, provided that it does not directly threaten harm to any party.
Many libertarians, with whom I usually agree, urge tolerance of the protests, including at least cautious tolerance of the encampments. The Foundation for Individual Rights and Expression (FIRE) has strenuously objected to the actions of police in Austin, Texas in dispersing demonstrators at the University of Texas. Alex Tabarrok has reposted a tweet or two apparently critical of the government’s response to protestors in Texas and at Emory University in Atlanta, though it should be noted that the economics professor who was taken down and handcuffed on video had actually hit a police officer. Michael Munger, in a variation of his “worst enemy test” of government power, says that giving campus authorities “the power to crush us, at their discretion” is probably a bad idea. But they have that power if they choose to exercise it, for better or worse. (By “us”, I don’t think Munger intended to take sides).
I’m highly skeptical of the motives and incentives of some of the “occupiers” of campus spaces, not to mention their status as students. More importantly, there is ample evidence that “fighting words” and threats against Jews have been used by many of the protesters. This violates the codes of conduct at many schools, and should not only be censured, but any student identified as guilty of this sort of hate speech should be expelled, not merely suspended. There should be severe consequences for professors choosing to participate in these protests as well.
This behavior should have long-term consequences, and that is happening at some schools. I saw the following quote from P.J. O’Rourke on Instapundit, which seems appropriate here:
“There’s only one basic human right, the right to do as you damn well please. And with it comes the only basic human duty, the duty to take the consequences.”
The kids are wearing masks for a reason, and it ain’t Covid! Now, the protestors’ demands include “amnesty” for their participation in the protests. That shouldn’t play well if you’re provably guilty of calling for the extermination of a race of people. But here’s the thing: certain institutions like Columbia University have allowed the aberrant behavior to go on with little challenge, showing that the real limits to free speech and assembly are whatever acquiescent campus administrators are willing to put up with.
Removing these encampments is more than justified on constitutional grounds at any school, public or private. The arrest of some of the more intransigent elements among the protesters may be well justified. Insulting hate speech is one thing, but eliminationist hate speech constitutes fighting words and should not be tolerated. Of course, forcibly removing the encampments is risky in terms of public safety because some of the protestors will physically challenge the police. Comparatively innocent (though naive) students might get caught up in a conflict with law enforcement, but ignorance is no defense. They should not be there. Those risks must be taken to end the “hate encampments”, which are a direct threat to the rights of others wishing only to go about their business.
This is a first for me…. The following is partly excerpted from a post of two weeks ago, but I’ve made a number of edits and additions. The original post was way too long. This is a bit shorter, and I hope it distills a key message.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Failures of industrial policies are nothing new, but the current manipulation of electric power generation by government in favor of renewable energy technologies is egregious. These interventions are a reaction to an overwrought climate crisis narrative, but they have many shortcomings and risks of their own. Chief among them is whether the power grid will be capable of meeting current and future demand for power while relying heavily on variable resources, namely wind and sunshine. The variability implies idle and drastically underutilized hours every day without any ability to call upon the assets to produce when needed.
The variability is vividly illustrated by the chart above showing a representative daily profile of power demand versus wind and solar output. Below, with apologies to Dante, I describe the energy hellscape into which we’re being driven on the horns of irrational capital outlays. These projects would be flatly rejected by any rational investor but for the massive subsidies afforded by government.
The First Circle of Dormancy: Low Utilization
Wind and solar power assets have relatively low rates of utilization due to the intermittency of wind and sunshine. Capacity factors for wind turbines averaged almost 36% in the U.S. in 2022, while solar facilities averaged only about 24%. This compared with nuclear power at almost 93%, natural gas (66%), and coal (48%).
Despite their low rates of utilization, new wind and solar facilities are always touted at their full nameplate capacity. We hear a great deal about “additions to capacity”, which overstate the actual power-generating potential by a factor of three to four. More importantly, this also means wind and solar power costs per unit of output are often vastly understated. These assets contribute less economic value to the electric grid than more heavily utilized generating assets.
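To illustrate the gap between nameplate and expected output, here is a minimal sketch using the capacity factors quoted above; the 100 MW facility size is a hypothetical chosen only for illustration.

```python
HOURS_PER_YEAR = 8760
nameplate_mw = 100   # hypothetical facility size, for illustration only

# 2022 U.S. average capacity factors quoted above.
capacity_factors = {"solar": 0.24, "wind": 0.36, "coal": 0.48, "natural gas": 0.66, "nuclear": 0.93}

for source, cf in capacity_factors.items():
    expected_mwh = nameplate_mw * HOURS_PER_YEAR * cf   # expected annual output
    overstatement = 1 / cf                              # how much nameplate exaggerates expected output
    print(f"{source}: ~{expected_mwh:,.0f} MWh/year; nameplate overstates output {overstatement:.1f}x")
```

For wind and solar, the exaggeration factor works out to roughly 2.8x and 4.2x, consistent with the three-to-four-times range cited above.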
Sometimes wind and solar facilities are completely idle or dormant. Sometimes they operate at just a fraction of capacity. In what follows, I will use the terms “idle” and “dormant” loosely to refer to assets that not only operate at low levels of utilization, but are prone to low utilization and also fall within the Second Circle of Dormancy.
The Second Circle of Dormancy: Non-Dispatchability
The First Circle of Dormancy might be more like a Purgatory than a Hell. That’s because relatively low average utilization of an asset could be justifiable if demand is subject to large fluctuations. This is often the case, as with assets like roads, bridges, restaurants, amusement parks, and many others. However, capital invested in wind and solar facilities is idle on an uncontrollable basis, which is more truly condemnable. Wind and solar do not provide “dispatchable” power, meaning they are not “on call” in any sense during idle or less productive periods. Not only is their power output uncontrollable, it is not entirely predictable.
Again, variable but controllable utilization allows flexibility and risk mitigation in many applications. But when utilization levels are uncontrollable, the capital in question has greatly diminished value to the power grid and to power customers relative to dispatchable sources having equivalent capacity and utilization. It’s no wonder that low utilization, variability, and non-dispatchability are underemphasized or omitted by promoters of wind and solar energy. This sort of uncontrollable down-time is a drain on real economic returns to capital.
The Third Circle of Dormancy: Transmission Infrastructure
The idleness that besets the real economic returns to wind and solar power generation extends to the transmission facilities necessary for getting power to the grid. Transmission facilities are costly, but that cost is magnified by the broad spatial distribution of wind and solar generating units. Transmission from offshore facilities is particularly complex. When wind turbines and solar panels are dormant, so are the transmission facilities needed to reach them. Thus, low utilization and the non-dispatchability of those units diminish the value of the capital that must be committed for both power generation and its transmission.
The Fourth Circle of Dormancy: Backup Power Assets
The reliability of the grid requires that any commitment to variable wind and solar power must also include a commitment to back-up capacity. For example, consider the shipping concerns that are now experimenting with sails on cargo ships. What is the economic value of such a ship without back-up power? Can you imagine these vessels drifting in the equatorial calms for days on end? Even light winds would slow the transport of goods significantly. Idle, non-dispatchable capital is unproductive capital.
Likewise, solar-powered signage can underperform or fail over the course of several dark, wintry days, even with battery backup. The signage is more reliable and valuable when it is backed-up by another power source. But again, idle, non-dispatchable capital is unproductive capital.
The needed provision of backup power sources represents an imposed cost of wind and solar, which is built into the cost estimates shown in a section below. But here’s another case of dormancy: some part of the capital commitment, either primary energy sources or the needed backups, will be idle regardless of wind and solar conditions… all the time. Of course, back-up power facilities should be dispatchable because they must serve an insurance function. Backup power therefore has value in preserving the stability of the grid even while completely idle. However, at best that value offsets a small part of the social loss inherent in primary reliance on variable and non-dispatchable power sources.
We can’t wholly “replace” dispatchable generating capacity with renewables without serious negative consequences. At the same time, maintaining existing dispatchable power sources as backup carries a considerable cost at the margin for wind and solar. At a minimum, it requires normal maintenance on dispatchable generators, periodic replacement of components, and an inventory of fuel. If renewables are intended to meet growth in power demand, the imposed cost is far greater because backup sources for growth would require investment in new dispatchable capacity.
The Fifth Circle of Dormancy: Outages
The pursuit of net-zero carbon emissions via wind and solar power creates uncontrollably dormant capital, which increasingly lacks adequate backup power. Providing that backup should be a priority, but it’s not.
Perhaps much worse than the cost of providing backup power sources is the risk and imposed cost of grid instability in their absence. That cost would be borne by users in the form of outages. Users are placed at increasing risk of losing power at home, at the office and factories, at stores, in transit, and at hospitals. This can occur at peak hours or under potentially dangerous circumstances like frigid or hot weather.
Outage risks involve yet another kind of idle capital: the potential for shutdowns of all electrified physical capital across an entire region. Not only can grid failure lead to economy-wide idle capital, but the risk itself transforms all capital powered by electricity into non-dispatchable productive capacity.
Reliance on wind and solar power makes backup capacity an imperative. Better still, just scuttle the wind and solar binge and provide for growth with reliable sources of power!
Quantifying Infernal Costs
A “grid report card” from the Mackinac Center for Public Policy gets right to the crux of the imposed-cost problem:
“… the more renewable generation facilities you build, the more it costs the system to make up for their variability, and the less value they provide to electricity markets.”
The report card uses cost estimates for Michigan from the Center of the American Experiment. Here are the report’s average costs per MWh through 2050, including the imposed costs of backup power:
Existing coal plant: $33/MWh
Existing gas-powered plant: $22/MWh
New wind: $180/MWh
New solar: $278/MWh
New nuclear reactor (light water): $74/MWh
Small modular reactor: $185/MWh
New coal plant with carbon capture and storage (CCS): $106/MWh
New natural gas plant with CCS: $64/MWh
It should be no surprise that existing coal and gas facilities are the most cost-effective. Preserve them! Of the new installations, natural gas is the least costly, followed by the light water reactor and coal. New wind and solar capacity are particularly costly.
Proponents of net zero are loath to recognize the imposed cost of backup power for two reasons. First, it is a real cost that can be avoided by society only at the risk of grid instability, something they’d like to ignore. To them, it represents something of an avoidable external cost. Second, at present, backup dispatchable power would almost certainly entail CO2 emissions, violating the net zero dictum. But in attempting to address a presumed externality (climate warming) by granting generous subsidies to wind and solar investors, the government and NGOs induce an imposed cost on society with far more serious and immediate consequences.
Deadly Sin: Subsidizing Dormant Capital
Wind and solar capital outlays are funded via combinations of private investment and public subsidies, and the former is very much contingent on the latter. That’s because the flood of subsidies is what allows private investors a chance to profit from uncontrollably dormant capital. Wind and solar power are far more heavily subsidized than fossil fuels, as noted by Mitch Rolling and Isaac Orr:
“In 2022, wind and solar generators received three and eighteen times more subsidies per MWh, respectively, than natural gas, coal, and nuclear generators combined. Solar is the clear leader, receiving anywhere from $50 to $80 per MWh over the last five years, whereas wind is a distant second at $8 to $10 per MWh …. Renewable energy sources like wind and solar are largely dependent on these subsidies, which have been ongoing for 30 years with no end in sight.”
But even generous subsidies often aren’t enough to ensure financial viability. Rent-enabled malinvestments like these crowd out genuinely productive capital formation. Those lost opportunities span the economy and are not limited to power plants that might otherwise have used fossil fuels.
Despite billions of dollars in “green energy” subsidies, bankruptcy has been all too common among wind and solar firms. That financial instability demonstrates the uneconomic nature of many wind and solar investments. Bankruptcy pleadings represent yet another way investors are insulated against wind and solar losses.
Subsidized Off-Hour (Wasted) Output
This almost deserves a sixth circle, except that it’s not about dormancy. Wind and solar power are sometimes available when they’re not needed, in which case the power goes unused because we lack effective power storage technology. Battery technology has a long way to go before it can overcome this problem.
When wind and solar facilities generate unneeded power during off-hours, their operators are nevertheless paid for that power when they sell it into the grid, where it goes to waste. It’s another subsidy to wind and solar power producers, and one that undermines incentives for investment in batteries.
A Path To Redemption
Space-based solar power beamed to earth may become a viable alternative to terrestrial wind and solar production within a decade or so. The key advantages would be constancy and the lack of an atmospheric filter on available solar energy, producing power 13 times as efficiently as earth-bound solar panels. From the last link:
“The intermittent nature of terrestrial renewable power generation is a major concern, as other types of energy generation are needed to ensure that lights stay on during unfavorable weather. Currently, electrical grids rely either on nuclear plants or gas and coal fired power stations as a backup….”
Construction of collection platforms in geostationary orbit will take time, of course, but development of space-based solar should be a higher priority than blanketing vast tracts of land with inefficient solar panels while putting power users at risk of outages.
No Sympathy for Malinvestment
This post identified five ways in which investments in wind and solar power create frequent and often extended periods of damnably dormant physical capital:
Low Utilization
Nondispatchable Utilization
Idle Transmission Infrastructure
Idle Backup Generators
Outages of All Electrified Capital
Power demand is expected to soar given the coming explosion in AI applications, and especially if the heavily-subsidized and mandated transition to EVs comes to pass. But that growth in demand will not and cannot be met by relying solely on renewable energy sources. Their variability implies substantial idle capacity, higher costs, and service interruptions. Such a massive deployment of dormant capital represents an enormous waste of resources, and the sad fact is it’s been underway for some time.
In the years ahead, the net-zero objective will motivate more bungled industrial planning as a substitute for market-driven forces. Costs will be driven higher by the imposed costs of backup capacity and/or outages. Ratepayers, taxpayers, and innocents will all share these burdens.
Creating idle, non-dispatchable physical capital is malinvestment which diminishes future economic growth. The boom in wind and solar activity began in earnest during the era of negative real interest rates. Today’s higher rates might slow the malinvestment, but they won’t bring it to an end without a substantial shift in the political landscape. Instead, taxpayers will shoulder an even greater burden, as will ratepayers whose power providers are guaranteed returns on their regulated rate bases.
The Fed’s “higher for longer” path for short-term interest rates lingers on, and so does inflation in excess of the Fed’s 2% target. No one should be surprised that rate cuts aren’t yet on the table, but the markets freaked out a little with the release of the February CPI numbers last week, which were higher than expected. For now, it only means the Fed will remain patient with the degree of monetary restraint already achieved.
Dashed Hopes
As I’ve said before, there was little reason for the market to have expected the Fed to cut rates aggressively this year. Just a couple of months ago, the market expected as many as six quarter-point cuts in the Fed’s target for the federal funds rate. The only rationale for that reaction would have been faster disinflation or the possibility of an economic “hard landing”. A downturn is not out of the question, especially if the Fed feels compelled to raise its rate target again in an effort to stem a resurgence in inflation. Maybe some traders felt the Fed would act politically, cutting rates aggressively as the presidential election approaches. Not yet anyway, and it seems highly unlikely.
There is no assurance that the Fed can succeed in engineering a “soft landing”, i.e., disinflation to its 2% goal without a recession. No one can claim any certainty on that point — it’s too early to call, though the odds have improved somewhat. As Scott Sumner succinctly puts it, a soft landing basically depends on whether the Fed can disinflate gradually enough.
It’s a Demand-Side Inflation
I’d like to focus a little more on Sumner’s perspective on Fed policy because it has important implications for the outlook. Sumner is a so-called market monetarist and a leading proponent of nominal GDP level targeting by the Fed. He takes issue with those ascribing the worst of the pandemic inflation to supply shocks. There’s no question that disruptions occurred on the supply side, but the Fed did more than accommodate those shocks in attempting to minimize their impact on real output and jobs. In fact, it can fairly be said that a Fed / Treasury collaboration managed to execute the biggest “helicopter drop” of money in the history of the world, by far!
That “helicopter drop” consisted of pandemic relief payments, a fiscal maneuver amounting to a gigantic monetary expansion and stimulus to demand. The profligacy has continued on the fiscal side since then, with annual deficits well in excess of $1 trillion and no end in sight. This reflects government demand against which the Fed can’t easily act to countervail, making the job of achieving a soft landing that much more difficult.
The Treasury, however, is finding a more limited appetite among investors for the flood of bonds it must regularly sell to fund the deficit. Recent increases in long-term Treasury rates reflect these large funding needs as well as the “higher-for-longer” outlook for short-term rates, inflation expectations, and of course better perceived investment alternatives.
The Nominal GDP Proof
There should be no controversy that inflation is a demand-side problem. As Sumner says, supply shocks tend to reverse themselves over time, and that was largely the case as the pandemic wore on in 2021. Furthermore, advances in both real and nominal GDP have continued since then. The difference between the two is inflation, which, again, has remained above the Fed’s target.
So let’s see… output and prices both growing? That combination of gains demonstrates that demand has been the primary driver of inflation for three-plus years. Restrictive monetary policy is the right prescription for taming excessive demand growth and inflation.
Here’s Sumner from early March (emphasis his), where he references flexible average inflation targeting (FAIT), a policy the Fed claims to be following, and nominal GDP level targeting (NGDPLT):
“Over the past 4 years, the PCE price index is up 16.7%. Under FAIT it should have risen by 8.2% (i.e., 2%/year). Thus we’ve had roughly 8.5% excess inflation (a bit less due to compounding.)
Aggregate demand (NGDP) is up by 27.6%. Under FAIT targeting (which is similar to NGDPLT) it should have been up by about 17% (i.e., 4%/year). So we’ve had a bit less than 10.6% extra demand growth. That explains all of the extra inflation.”
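For anyone who wants to replicate Sumner’s arithmetic, here is a minimal sketch using only the figures he cites: 2% and 4% annual targets over four years, against 16.7% cumulative PCE inflation and 27.6% NGDP growth.

```python
years = 4
pce_actual, ngdp_actual = 0.167, 0.276            # cumulative figures cited by Sumner

pce_target = 1.02 ** years - 1                    # ~0.082: a 2% FAIT path compounds to 8.2%
ngdp_target = 1.04 ** years - 1                   # ~0.170: a 4% NGDPLT-style path compounds to ~17%

# Simple differences, as in the quote...
print(pce_actual - pce_target)                    # ~0.085, the "roughly 8.5% excess inflation"
print(ngdp_actual - ngdp_target)                  # ~0.106, the "10.6%" extra demand growth

# ...and the compounding-adjusted versions ("a bit less due to compounding").
print((1 + pce_actual) / (1 + pce_target) - 1)    # ~0.078
print((1 + ngdp_actual) / (1 + ngdp_target) - 1)  # ~0.091
```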
Is Money “Tight”?
The Fed got around to tightening policy in the spring of 2022, but that doesn’t necessarily mean that policy ever advanced to the “tight” stage. Sumner has been vocal in asserting that the Fed’s policy hasn’t looked especially restrictive. Money growth feeds demand and ultimately translates into nominal GDP growth (aggregate demand). The latter is growing too rapidly to bring inflation into line with the 2% target. But wait! Money growth has been moderately negative since the Fed began tightening. How does that square with Sumner’s view?
In fact, the M2 money supply is still approximately 35% greater than at the start of the pandemic. There’s still a lot of M2 sloshing around out there, and the Fed’s portfolio of securities acquired during the pandemic via “quantitative easing” remains quite large ($7.5 trillion). Does this sound like tight money?
Again, Sumner would say that with nominal GDP ripping ahead at 5.7%, the Fed can’t be credibly targeting 2% inflation given an allowance for real GDP growth at trend of around 1.8% (or even somewhat greater than that). It’s an even bigger stretch if M2 velocity (V — turnover) continues to rebound with higher interest rates.
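In growth-rate terms, the implied inflation rate is approximately the gap between nominal GDP growth and trend real growth (a quick first-order approximation, since growth rates are only roughly additive):

```python
ngdp_growth, trend_real_growth = 0.057, 0.018   # figures cited above
print(ngdp_growth - trend_real_growth)          # ~0.039: implied inflation near 4%, roughly double the 2% target
```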
Wage growth also exceeds a level consistent with the Fed’s target. The chart below shows the gap between price inflation and wage inflation that left real wages well below pre-pandemic levels. Since early 2023, wages have made up part of that decline, but stubborn wage inflation can impede progress against price inflation.
Just Tight Enough?
Despite Sumner’s doubts, there are arguments to be made that Fed policy qualifies as restrictive. Even moderate declines in liquidity can come as a shock to markets grown accustomed to torrents from the money supply firehose. And to the extent that inflation expectations have declined, real interest rates may be higher now than they were in early November. In any case, it’s clear the market was disappointed in the higher-than-expected CPI, and traders were not greatly assuaged by the moderate report on the PPI that followed.
However, the Fed pays closest attention to another price index: the core deflator for personal consumption expenditures (PCE). Inflation by this measure is trending much closer to the Fed’s target (see the second chart below). Still, from the viewpoint of traders, many of whom, not long ago, expected six rate cuts this year, the reality of “higher for longer” is a huge disappointment.
Danger Lurks
As I noted, many believe the odds of a soft landing have improved. However, the now-apparent “stickiness” of inflation and the knowledge that the Fed will stand pat or possibly hike rates again have rekindled fears that the economy could turn south before the Fed elects to cut its short-term interest rate target. That might surprise Sumner in the absence of more tightening, as his arguments are partly rooted in the continuing strength of aggregate demand and nominal GDP growth.
There’s a fair degree of consensus that the labor market remains strong, which underscores Sumner’s doubts as to the actual tenor of monetary policy. The March employment numbers were deceptive, however. The gain in civilian employment was just shy of 500,000, but that gain was entirely in part-time employment. Full-time employment actually declined slightly. In fact, the same is true over the prior 12 months. And over that period, the number of multiple jobholders increased by more than total employment. Increasing reliance on part-time work and multiple jobs is a sign of stress on household budgets, and a sign that firms may be reluctant to commit to full-time hires. From the establishment survey, the gain in nonfarm employment was dominated once again by government and health care. These numbers hardly support the notion that the economy is on solid footing.
There are other signs of stress: credit card delinquencies hit an all-time high in February. High interest rates are taking a toll on households and business borrowers. Retail sales were stronger than expected in March, but excess savings accumulated during the pandemic were nearly depleted as of February, so it’s not clear how long the spending can last. And while the index of leading indicators inched up in February, it was the first gain in two years, and the index has shown year-over-year declines over that entire two-year period.
Conclusion
It feels a little hollow for me to list a series of economic red flags, having done so a few times over the past year or so. The risks of a hard landing are there, to be sure. The behavior of the core PCE deflator over the next few months will have much more influence on Fed policy, as would any dramatic changes in the real economy. The “data dependence” of policy is almost a cliché at this point. The Fed will stand pat for now, and I doubt the Fed will raise its rate target without a dramatic upside surprise on the core deflator. Likewise, any downward rate moves won’t be forthcoming without more softening in the core deflator toward 2% or definitive signs of a recession. So rate cuts aren’t likely for some months to come.
A week ago I posted about electrification and particularly EV mandates, one strand of government industrial policy under which non-favored sectors of the economy must labor. This post examines a related industrial policy: manipulation of power generation by government policymakers in favor of renewable energy technologies, while fossil fuels are targeted for oblivion. These interventions are a reaction to an overwrought climate crisis narrative, but they present many obstacles, oversights and risks of their own. Chief among them is whether the power grid will be capable of meeting current and future demand for power while relying heavily on variable resources: wind and sunshine.
Like almost everything I write, this post is too long! Here is a guide to what follows. Scroll down to whatever sections might be of interest:
Malinvestment: Idle capital
Key Considerations to chew on
False Premises: zero CO2? Low cost?
Imposed Cost: what and how much?
Supporting Growth: with renewables?
Resource Constraints: they’re tight!
Technological Advance: patience!
The Presumed Elephant: CO2 costs
Conclusion
Malinvestment
The intermittency of wind and solar power creates a fundamental problem of physically idle capital, which leaves the economy short of its production possibilities. To clarify, capital invested in wind and solar facilities is often idle in two critical ways. First, wind and solar assets have relatively low rates of utilization because of their variability, or intermittency. Second, neither provides “dispatchable” power: it is not “on call” in any sense during those idle periods, which are not entirely predictable. Wind and solar assets therefore contribute less value to the electric grid than dispatchable sources of power having equivalent capacity and utilization.
Is “idle capital” a reasonable characterization? Consider the shipping concerns that are now experimenting with sails on cargo ships. What is the economic value of such a ship without back-up power? Can you imagine them drifting in the equatorial calms for days on end? Even light winds would slow the transport of goods significantly. Idle capital might be bad enough, but a degree of idleness allows flexibility and risk mitigation in many applications. Idle, non-dispatchable capital, however, is unproductive capital.
Likewise, solar-powered signage can underperform or fail over the course of several dark, wintry days, even with battery backup. The signage is more reliable and valuable when it is backed up by another power source. Again, idle, non-dispatchable capital is unproductive capital.
The pursuit of net-zero carbon emissions via wind and solar power creates idle capital that increasingly lacks adequate backup power. Providing that backup should be a priority, but it’s not. This misguided effort is funded from both private investment and public subsidies, but the former is very much contingent on the latter. That’s because the flood of subsidies is what allows private investors to profit from idle capital. Rent-enabled investments like these crowd out genuinely productive capital formation, which is not limited to power plants that might otherwise use fossil fuels.
Creating idle or unemployed physical capital is malinvestment, and it diminishes future economic growth. The surge in this activity began in earnest during the era of negative real interest rates. Today, in an era of higher rates, taxpayers can expect an even greater burden, as can ratepayers whose power providers are guaranteed returns on their regulated rate bases.
Key Considerations
The forced transition to net zero will be futile, especially if wind and solar energy are the primary focus. Keep the following in mind:
The demand for electricity is expected to soar, and soon! Policymakers have high hopes for EVs, and while adoption rates might fall well short of their goals, they’re doing their clumsy best to force EVs down our throats with mandates. But facilitating EV charging presents difficulties. Lionel Shriver states the obvious: “Going Electric Requires Electricity”. Reliable electricity!
Perhaps more impressive than prospects for EVs is the expected growth in power demand from data centers required by the explosion of artificial intelligence applications across many industries. It’s happening now! This will be magnified with the advent of artificial general intelligence (AGI).
Dispatchable power sources are needed to back up unreliable wind and solar power to ensure service continuity. Maintaining backup power carries a huge “imposed cost” at the margin for wind and solar. At present, that would entail CO2 emissions, violating the net zero dictum.
Perhaps worse than the cost of backup power would be the cost borne by users under the complete elimination of certain dispatchable power sources. An imposed cost then takes the form of outages. Users are placed at risk of losing power at home, at the office and factories, at stores, in transit, and at hospitals at peak hours or under potentially dangerous circumstances like frigid or hot weather.
Historically, dispatchable power has allowed utilities to provide reliable electricity on-demand. Just flip the switch! This may become a thing of the past.
Wind and solar power are sometimes available when they’re not needed, in which case the power goes unused because we lack effective power storage technology.
Wind and solar power facilities operate at low rates of utilization, yet new facilities are always touted at their full nameplate capacity. Capacity factors for wind turbines averaged almost 36% in the U.S. in 2022, while solar facilities averaged only about 24%. This compared with nuclear power at almost 93%, natural gas (66%), and coal (48%). Obviously, the low capacity factors for wind and solar reflect their variable nature, rather than dispatchable responses to fluctuations in power demand.
Low utilization and variability are underemphasized or omitted by those promoting wind and solar plants in the media, and often in discussions of public policy, and no wonder! We hear a great deal about “additions to capacity”, which overstate the actual power-generating potential by factors of three to four times (a short arithmetic sketch at the end of this list makes the point concrete). Here is a typical example.
Wind and solar power are far more heavily subsidized than fossil fuels. This is true in absolute terms and especially on the basis of actual power output, which reveals their overwhelmingly uneconomic nature. From the link above, here are Mitch Rolling and Isaac Orr on this point:
“In 2022, wind and solar generators received three and eighteen times more subsidies per MWh, respectively, than natural gas, coal, and nuclear generators combined. Solar is the clear leader, receiving anywhere from $50 to $80 per MWh over the last five years, whereas wind is a distant second at $8 to $10 per MWh …. Renewable energy sources like wind and solar are largely dependent on these subsidies, which have been ongoing for 30 years with no end in sight.”
The first-order burden of subsidies falls on taxpayers. The second-order burdens manifest in an unstable grid and higher power costs. But just to be clear, subsidies are paid by governments to producers or consumers to reduce the cost of activity favored by policymakers. However, the International Monetary Fund frequently cites “subsidy” figures that include staff estimates of unaddressed externalities. These are based on highly simplified models and are subject to great uncertainty, of course, especially when dollar values are assigned to categories like “climate change”. Despite what alarmists would have us believe, the extent and consequences of climate change are not settled scientific issues, let alone the dollar cost.
Wind and solar power are extremely land- and/or sea-intensive. For example, Casey Handmer estimates that a one-Gigawatt data center, if powered by solar panels, would need a footprint of 20,000 acres.
Solar installations are associated with a significant heat island effect: “We found temperatures over a PV plant were regularly 3–4 °C warmer than wildlands at night….”
In addition to the destruction of habitat both on- and offshore, turbine blades create noise, electromagnetism, and migration barriers. Wind farms have been associated with significant bird and bat fatalities. Collisions with moving blades are one thing, but changes to the winds and air pressure around turbines are also a danger to avian species.
Solar farms present dangers to waterfowl. These creatures are tricked into diving toward what they believe to be bodies of water, only to crash into the panels.
The production of wind and solar equipment requires the intensive use of scarce resources, including environmentally-sensitive materials. Extracting these materials often requires the excavation of massive amounts of rock subject to extensive processing. Mining and processing rely heavily on diesel fuel. Net zero? No.
Wind and solar facilities often present major threats of toxicity at disposal, or even sooner. A recent hail storm in Texas literally destroyed a solar farm, and the smashed panels have prompted concerns not only about solar “sustainability”, but also that harsh chemicals may be leaking into the local environment.
The transmission of power is costly, but that cost is magnified by the broad spatial distribution of wind and solar generating units. Transmission from offshore facilities is particularly complex. And high voltage lines run into tremendous local opposition and regulatory scrutiny.
When wind turbines and solar panels are idle, so are the transmission facilities needed to reach them. Thus, the low utilization and variability of those units drive up the capital needed for both generation and transmission.
There is also an acute shortage of transformers, which presents a major bottleneck to grid development and stability.
While zero carbon is the ostensible goal, zero carbon nuclear power has been neglected by our industrial planners. That neglect plays off exaggerated fears about safety. Fortunately, there is a growing realization that nuclear power may be the surest way to achieve carbon reductions while meeting growth in power demand. In fact, some new data centers plan to go off-grid with their own modular reactors.
At the Shriver link, she notes the smothering nature of power regulation, which obstructs the objective of providing reliable power and any hope of achieving net zero.
The Biden administration has resisted the substitution of low-CO2-emitting power sources for high-CO2-emitting sources. For example, natural gas is more energy efficient in a variety of applications than other fuel sources. Yet policymakers seem determined to discourage the production and use of natural gas.
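As promised above, here is a minimal arithmetic sketch (in Python) of the capacity factor point, using the 2022 averages cited in this list. The 100 MW nameplate figure is an assumption chosen purely for illustration, not a reference to any particular project.

```python
# Rough back-of-the-envelope sketch: how much energy does "100 MW of capacity"
# actually deliver at the capacity factors cited above? The 100 MW nameplate
# size is a hypothetical, illustrative figure.

HOURS_PER_YEAR = 8760
NAMEPLATE_MW = 100  # hypothetical plant size

capacity_factors = {   # 2022 U.S. averages cited in the text
    "wind": 0.36,
    "solar": 0.24,
    "natural gas": 0.66,
    "nuclear": 0.93,
}

for source, cf in capacity_factors.items():
    actual_mwh = NAMEPLATE_MW * HOURS_PER_YEAR * cf
    nameplate_mwh = NAMEPLATE_MW * HOURS_PER_YEAR
    overstatement = nameplate_mwh / actual_mwh
    print(f"{source:12s}: ~{actual_mwh:>9,.0f} MWh/yr "
          f"(nameplate overstates output by {overstatement:.1f}x)")
```

At the cited capacity factors, quoting nameplate capacity overstates likely wind output by roughly 2.8 times and solar output by roughly 4.2 times, consistent with the “three to four times” range mentioned above.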
False Premises
Wind and solar energy are touted by the federal government as zero-carbon and low-cost technologies, but both claims are false. Extracting the needed resources and fabricating, installing, connecting, and ultimately disposing of these facilities all entail substantial carbon emissions.
The claim that wind and solar have a cost advantage over traditional power sources is based on misleading comparisons. First, putting claims about the cost of carbon aside, it goes without saying that the cost of replacing already operational coal or natural gas generating capacity with new wind and solar facilities is greater than doing nothing.
The hope among net zero advocates is that existing fossil fuel generating plants can be decommissioned as more renewables come on-line. Again, this thinking ignores the variable nature of renewable power. Dispatchable backup power is required to reliably meet power demand. Otherwise, fluctuating power supplies undermine the economy’s productive capacity, leading to declines in output, income, health, and well-being. That is costly, but so is maintaining and adding back-up capacity. The costs of wind and solar should account for this necessity, which implies that wind and solar generating units carry a high cost at the margin.
Imposed Costs
A “grid report card” from the Mackinac Center for Public Policy notes the conceptual flaw in comparing the levelized cost (à la Lazard) of a variable resource with one capable of steady and dispatchable performance. From the report, here is the crux of the imposed-cost problem:
“… the more renewable generation facilities you build, the more it costs the system to make up for their variability, and the less value they provide to electricity markets.”
A commitment to variable wind and solar power along with back-up capacity also implies that some capital will be idle regardless of wind and solar conditions. This is part of the imposed cost of wind and solar built into the accounting below. But while back-up power facilities will have idle periods, they are dispatchable and serve an insurance function, so they have value even when idle in preserving the stability of the grid. For that matter, sole reliance on dispatchable power sources requires excess capacity to serve a similar insurance function.
The Mackinac report card uses estimates of imposed cost from an Institute for Energy Research study to construct the following comparison (expand the view or try clicking the image for a better view):
The figures shown in this table are somewhat dated, but the Mackinac authors use updated costs for Michigan from the Center of the American Experiment. These are shown below in terms of average costs per MWh through 2050, but the labels require some additional explanation.
The two bars on the left show costs for existing coal ($33/MWh) and gas-powered ($22) plants. The third and fourth bars are for new wind ($180) and solar ($278) installations. The fifth and sixth bars are for new nuclear reactors: a light water reactor ($74) and a small modular reactor ($185). Finally, the last two bars are for a new coal plant ($106) and a natural gas plant ($64), both with carbon capture and storage (CCS). It’s no surprise that existing coal and gas facilities are the most cost effective. Natural gas is by far the least costly of the new installations, followed by the light water reactor and coal.
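For readers who prefer the numbers in one place, here is a minimal sketch that simply tabulates the Center of the American Experiment figures described above and expresses each as a multiple of the cheapest option (existing natural gas). No figures beyond those quoted above are introduced.

```python
# Average cost per MWh through 2050, as described above (Center of the
# American Experiment figures for Michigan). Purely a restatement for clarity.
cost_per_mwh = {
    "existing coal": 33,
    "existing natural gas": 22,
    "new wind": 180,
    "new solar": 278,
    "new light water reactor": 74,
    "new small modular reactor": 185,
    "new coal with CCS": 106,
    "new natural gas with CCS": 64,
}

cheapest = min(cost_per_mwh.values())
for source, cost in sorted(cost_per_mwh.items(), key=lambda kv: kv[1]):
    print(f"{source:28s} ${cost:>4}/MWh  ({cost / cheapest:.1f}x existing gas)")
```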
The Mackinac “report card” is instructive in several ways. It provides a detailed analysis of different types of power generation across five dimensions, including reliability, cost, cleanliness, and market feasibility (the latter because some types of power, such as hydro and geothermal, have geographic limits). Natural gas comes out the clear winner on the report card because it is plentiful, energy dense, dispatchable, clean burning, and low-cost.
Supporting Growth
Growth in the demand for power cannot be met with variable resources without dispatchable backup or intolerable service interruptions. Unreliable power would seriously undermine the case for EVs, which is already tenuous at best. Data centers and other large users will go off-grid before they stand for it. This would represent a flat-out market rejection of renewable investments, ESGs be damned!
Casey Handmer makes some interesting projections of the power requirements of data centers supporting not just AI, but AGI, which he discusses in “How To Feed the AIs”. Here is his darkly humorous closing paragraph, predicated on meeting power demands from AGI via solar:
“It seems that AGI will create an irresistibly strong economic forcing function to pave the entire world with solar panels – including the oceans. We should probably think about how we want this to play out. At current rates of progress, we have about 20 years before paving is complete.”
Resource Constraints
Efforts to force a transition to wind and solar power will lead to more dramatic cost disadvantages than shown in the Mackinac report. By “forcing” a transition, I mean aggressive policies of mandates and subsidies favoring these renewables. These policies would effectuate a gross misallocation of resources. Many of the commodities needed to fabricate the components of wind and solar installations are already quite scarce, particularly on the domestic U.S. front. Inflating the demand for these commodities will result in shortages and escalating costs, magnifying the disadvantages of wind and solar power in real economic terms.
To put a finer point on the infeasibility of the net zero effort, Simon P. Michaux produced a comparative analysis in 2022 of the existing power mix versus a hypothetical power mix of renewable energy sources performing an equal amount of work, but at net-zero carbon emissions (the link is a PowerPoint summary). In the renewable energy scenario, he calculated the total quantities of various resources needed to achieve the objective over one generation of the “new” grid (to last 20-30 years). He then calculated the numbers of years of mining or extraction needed to produce those quantities based on 2019 rates of production. Take a look at the results in the right-most column:
Those are sobering numbers. Granted, they are based on 2019 wind and solar technology. However, it’s clear that phasing out fossil fuels using today’s wind and solar technology would be out of the question within the lifetime of anyone currently living on the planet. Michaux seems to have a talent for understatement:
“Current thinking has seriously underestimated the scale of the task ahead.”
He also emphasizes the upward price pressure we’re likely to witness in the years ahead across a range of commodities.
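The core of Michaux’s calculation is a simple ratio: the tonnage of each commodity required to build one generation of the replacement grid, divided by the 2019 annual rate of production. A minimal sketch of that arithmetic follows; the commodity and both numbers in the example are hypothetical placeholders for illustration, not Michaux’s figures.

```python
# Illustrative only: the tonnage and production values below are hypothetical
# placeholders, not Michaux's estimates. The point is the method:
# years_of_mining = tonnes_required_for_one_grid_generation / annual_output_2019

def years_of_mining(tonnes_required: float, annual_output_2019: float) -> float:
    return tonnes_required / annual_output_2019

# Example with made-up values for a generic battery metal:
required = 500_000_000      # tonnes needed for one 20-30 year grid generation (hypothetical)
produced_2019 = 2_500_000   # tonnes mined in 2019 (hypothetical)
print(f"Years of mining at 2019 rates: {years_of_mining(required, produced_2019):,.0f}")
```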
Technological Breakthroughs
Michaux’s analysis assumes static technology, but there may come a time in the not-too-distant future when advances in wind and solar power and battery storage allow them to compete with hydrocarbons and nuclear power on a true economic basis. The best way to enable real energy breakthroughs is through market-driven economic growth. Energy production and growth are hampered, however, when governments strong-arm taxpayers, electricity buyers, and traditional energy producers while rewarding renewable developers with subsidies.
We know that improvements will come across a range of technologies. We’ve already seen reductions in the costs of solar panels themselves. Battery technology has a long way to go, but it has improved and might some day be capable of substantial smoothing in the delivery of renewable power. Collection of solar power in space is another possibility, as the feasibility of beaming power to earth has been demonstrated. This solution might also have advantages in terms of transmission depending on the locations and dispersion of collection points on earth, and it would certainly be less land intensive than solar power is today. Carbon capture and carbon conversion are advancing technologies, making net zero a more feasible possibility for traditional sources of power. Nuclear power is zero carbon, but like almost everything else, constructing plants is not. Nevertheless, fission reactors have made great strides in terms of safety and efficiency. Nuclear fusion development is still in its infancy, but there have been notable advances of late.
Some or all of these technologies will experience breakthroughs that could lead to a true, zero-carbon energy future. The timeline is highly uncertain, but it’s likely to be faster than anything like the estimates in Michaux’s analysis. Who knows? Perhaps AI will help lead us to the answers.
A Presumed Elephant
This post and my previous post have emphasized two glaring instances of government failure on their own terms: a headlong plunge into unreliable renewable energy, and forced electrification done prematurely and wrong. Some would protest that I left out the veritable “elephant in the room”: the presumed external or spillover costs associated with CO2 emissions from burning fossil fuels. Renewables and electrification are both intended to prevent those costs.
External costs were not ignored, of course. Externalities were discussed explicitly in several different contexts such as the mining of new materials, EV tire wear, the substitution of “cleaner” fuels for others, toxicity at disposal, and the exaggerated reductions in CO2 from EVs when the “long tailpipe” problem is ignored. However, I noted explicitly that estimates of unaddressed externalities are often highly speculative and uncertain, and especially the costs of CO2 emissions. They should not be included in comparisons of subsidies.
Therefore, the costs of various power generating technologies shown above do not account for estimates of externalities. If you’re inclined, other SCC posts on the CO2 “elephant” can be found here.
Conclusion
Power demand is expected to soar given the coming explosion in AI applications, and especially if the heavily-subsidized and mandated transition to EVs comes to pass. But that growth in demand will not and cannot be met by relying on renewable energy sources. Their variability implies substantial idle capacity, higher costs, and service interruptions. Such a massive deployment of idle capital would represent an enormous waste of resources, but the sad fact is it’s been underway for some time.
In the years ahead, the net-zero objective will come to exemplify a bumbling effort at industrial planning. Costs will be driven higher, including the costs inflicted by outages and environmental damage. Ratepayers, taxpayers, and innocents will share these burdens. Travis Fisher is spot on when he says the grid is becoming a “dangerous liability” thanks to wounds inflicted by subsidies, regulations, and mandates.
“The National Electrical Grid is teetering on collapse. The shift away from full-time available power (like fossil fuels, LNG, etc.) to so-called ‘green’ sources has deeply impacted reliability.”
“Also, as more whale-killing off-shore wind farms are planned, the Biden administration forgot to plan for the thousands of miles of transmission lines that will be needed. And in a perfect example of leftist autophagy, there is considerable opposition from enviro-groups who will tie up the construction of wind farms and transmission lines in court for decades.”
Meanwhile, better alternatives to wind and solar have been routinely discouraged. The substantial reductions in carbon emissions achieved in the U.S. over the past 15 years were caused primarily by the substitution of natural gas for coal in power generation. Much more of that is possible. The Biden Administration, however, wishes to prevent that substitution in favor of greater reliance on high-cost, unreliable renewables. And the Administration wishes to do so without adequately backing up those variable power sources with dispatchable capacity. Likewise, nuclear power has been shunted aside, despite its safety, low risk, and dispatchability. However, there are signs of progress in attitudes toward bringing more nuclear power on-line.
Industrial policy usually meets with failure, and net zero via wind and solar power will be no exception. Like forced electrification, unreliable power fails on its own terms. Net zero ain’t gonna happen any time soon, and not even by 2050. That is, it won’t happen unless net zero is faked through mechanisms like fraudulent carbon credits (and there might not be adequate faking capacity for that!). Full-scale net-zero investment in wind and solar power, battery capacity, and incremental transmission facilities will drive the cost of power upward, undermining economic growth. Finally, wind and solar are not the environmental panacea so often promised. Quite the contrary: mining of the necessary minerals, component fabrication, installation, and even operation all have negative environmental impacts. Disposal at the end of their useful lives might be even worse. And the presumed environmental gains, reduced atmospheric carbon concentrations and lower temperatures, are more scare story than science.
Postscript: here’s where climate alarmism has left us, and this is from a candidate for the U.S. Senate (she deleted the tweet after an avalanche of well-deserved ridicule):
Industrial policy allows government planners to select favored and disfavored industries or sectors. It thereby bypasses and distorts impersonal market signals that would otherwise direct scarce resources to the uses most valued by market participants. Instead, various forms of aid and penalties are imposed on different sectors in order to accomplish the planners’ objectives. This includes interventions in foreign trade and attempts to steer technological development. Industrial policy often comes under the guise of enhanced national security. Of course, it can also be used to reward cronies. And it has a poor record of accomplishing its objectives and avoiding unintended consequences.
The Sausage Factory
The executive and legislative branches of the U.S. government are loaded with economic interventionists, regardless of party affiliation. In an age of (Chevron) judicial deference to “experts” within the administrative state, it is not uncommon for legislative language to give abundant leeway to those who implement policy within the executive branch (though a couple of upcoming Supreme Court decisions might change that balance). Increasingly, bills are stuffed so full of provisions that lawmakers find it all but impossible to read them in full, let alone make an accurate assessment of their virtues, drawbacks, and internal contradictions.
Even worse is the fact that bills are, in great part, written by relatively youthful legislative staffers with little real world experience in industry, and who harbor the naive belief that whatever is wished, government can make it so. But their work also proceeds under guidance from lawmakers, administration officials, consultants, and lobbyists who have their own agendas and axes to grind. This is how industrial policy is promulgated in the U.S., and it is through this ugly prism that we must view environmental policy.
The Left dictates environmental and energy policy in several states, especially California, where energy costs have soared under renewable energy initiatives. California households now pay almost triple the rate per kilowatt-hour paid in Washington, and more than double what’s paid in Oregon. Something similar may happen in New York, which has highly ambitious goals for renewable energy even as the costs of the state’s offshore wind projects are out of control. These and other state-level “laboratories” are demonstrating that a renewable energy agenda can carry very high costs to the populace. The same is true of the painful experience in Germany with its much-heralded Energiewende.
Net Zero
The Left is also pulling the strings within the federal bureaucracy and the Biden Administration. The objective is an industrial policy to achieve “net zero” CO2 emissions, a practical impossibility for at least several decades (unless it’s faked, of course). Nevertheless, that policy calls for phasing out the use of fossil fuels. Under this agenda, mandates and subsidies are bestowed upon the use of renewable electric power sources, while restrictions and penalties are imposed on the production and use of fossil fuels. A subsequent post on the subject of power generation will address this prototypical failure of central planning.
Electrification
Here, I discuss another key objective of our industrial planners: electrify whatever is not electrified in order to advance the net zero agenda. Of course, for some time to come, more than half of electric power will be generated using fossil fuels (currently about 60%, with another 18% nuclear), so the policy is largely a sham on its face, but we’ll return to that point below. The EV tailpipe is very long, as they say.
Electrification means, among other things, the forced adoption of electric vehicles (EVs). President Biden’s EPA has issued rules on auto emissions that are expected to require that, by 2032, 60% or more of cars and light trucks sold be EVs. The USA Today article at the link offers this rich aside:
“…the original proposal — which was always technology-neutral in theory, meaning automakers could sell any cars and light-duty trucks they wanted as long as they hit the fleetwide reductions….”
Technology neutral? Hahaha! We aren’t forcing you to choose technologies as long as you meet our technological requirements!
EV Doldrums
Anyway, the EPA’s targets are completely impractical, partly because the value for drivers is lacking. Not coincidentally, the market for EVs seems to have chilled of late. Hertz has soured on heavy use of EVs in its fleet, and Ford has announced reductions in EV production. The new UAW agreements will make it difficult for some domestic producers to turn a profit on EVs. Fisker is just about broke. Apple has cancelled development of its EV, and several other automakers have reduced their production plans. Toyota was the first producer to raise the red flag on the breakneck transition to EVs in favor of a measured reliance on hybrids. Of course, there are other prominent voices cautioning against rapid attempts at electrification in general.
To be fair, some EVs are marvelous machines, but they and their supporting infrastructure are not yet well-suited to the mass market.
A Tangled Web
Here are some drawbacks of EVs that have yet to be adequately addressed:
They are expensive, even with the rich-man’s subsidy to buyers paid by the government and carbon credit subsidies granted to producers.
Costly battery replacement is an eventuality that looms over the wallets of EV owners.
EVs have limited range given the state of battery technology, especially when the weather is cold.
There presently exist far too few charging stations to make EVs workable for many people. In any case, charging away from home can be extremely time-consuming, and the fees vary widely.
The purchase and installation of EV chargers at home is a separate matter, and can cost $4,000 or more if an upgrade to the service panel is necessary. Installed costs commonly range from $1,175 to $3,300, depending on the type of charger and the region.
EVs are much heavier than vehicles powered by internal combustion engines. As a result, EV tire wear can be a surprising source of both cost and pollution.
Battery fires in EVs are extremely difficult to extinguish, creating a new challenge for emergency responders.
Reliance on EVs for local emergency services would be dangerous without duplicative investment by local jurisdictions to offset the down-time required for charging.
For decades to come, the power grid will be unable to handle the load required for widespread adoption of EVs. A rapid conversion would be impossible without a great expansion in generating and transmission capacity, including transformer availability.
Domestically we lack the natural resources to produce the batteries required by EVs in a quantity that would satisfy the Administration’s goals. This forces dependence on China, our chief foreign adversary.
The mining of those resources is destructive to the environment. Much of it is done in China due to the country’s abundance of rare earth minerals, but wherever the mining occurs, it relies heavily on diesel power.
Joel Kotkin points out that China now hosts the world’s largest EV producer, BYD. Biden’s mandates might very well allow China to dominate the U.S. auto market, even as its own CO2 emissions soar.
Producers of EVs earn carbon credits for each vehicle sold, which they can sell to other auto producers who fall short of their required mix of EVs in total production. Tesla, for example, earned revenue of $1.8 billion from carbon credit sales in 2022. But note again that these so-called zero-emission vehicles use electricity generated with an average of 60% fossil fuels. Thus, the scheme is largely a sham (a brief arithmetic sketch just before the next section makes the point).
The push for EVs has been hampered by the botched rollout of (non-Tesla) charging stations under a huge Biden initiative in the Infrastructure Investment and Jobs Act. Progress has been bogged down by sheer complexity and expense, including the cost of bringing adequate power supplies to the chargers as well as the difficulty of meeting contracting requirements and operating standards. This is exemplary of the failures that usually await government efforts to engineer outcomes contrary to market forces.
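As promised above, here is a rough “long tailpipe” sketch comparing per-mile CO2 for an EV charged from a majority-fossil grid with a conventional gasoline car. The grid intensity, EV consumption, and fuel economy inputs are round, assumed values for illustration only, not measured data.

```python
# Rough "long tailpipe" arithmetic. All inputs are assumed, round figures:
# a grid at ~0.4 kg CO2 per kWh delivered, an EV drawing ~0.30 kWh per mile
# at the wall, and a 30 mpg gasoline car at ~8.9 kg CO2 per gallon burned.

GRID_KG_CO2_PER_KWH = 0.40    # assumed average grid intensity
EV_KWH_PER_MILE = 0.30        # assumed wall-to-wheels consumption
GASOLINE_KG_CO2_PER_GAL = 8.9
ICE_MPG = 30                  # assumed fuel economy

ev_kg_per_mile = EV_KWH_PER_MILE * GRID_KG_CO2_PER_KWH
ice_kg_per_mile = GASOLINE_KG_CO2_PER_GAL / ICE_MPG

print(f"EV:  ~{ev_kg_per_mile:.2f} kg CO2/mile")
print(f"ICE: ~{ice_kg_per_mile:.2f} kg CO2/mile")
```

Under these assumed inputs, the EV’s per-mile emissions are lower than the gasoline car’s but nowhere near zero, which is the point: the carbon credit scheme treats them as zero-emission vehicles.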
Electric Everything?
Like EVs, electric stoves have drawbacks that limit their popularity, including price and the nature of the heat needed for quality food preparation. In addition to autos and stoves, wholesale electrification would require the replacement or costly reconfiguration of a huge stock of business and household capital that is now powered by fossil fuels, like gas furnaces, tractors, chain saws, and many other tools and appliances. This set of legacy investment choices was guided by market prices that reflect the scarcity and efficiency of the resources, yet government industrial planners propose to lay much of it to waste.
Central Planning: a False Conceit
John Mozena quotes Adam Smith on the social and economic hazards of rejecting the market mechanism and instead accepting governmental authority over the allocation of resources:
“All governments which thwart this natural course, which force things into another channel, or which endeavour to arrest the progress of society at a particular point, are unnatural, and to support themselves are obliged to be oppressive and tyrannical.”
“As Ludwig von Mises and Friedrich Hayek pointed out during the socialist calculation debate, central planners lack the information that is produced by markets. By over-riding market prices and substituting their own judgment, regulators incur the same loss of information.”
Advocates of EV industrial policy have failed to appreciate the large gaps between the technology they are determined to dictate and basic consumer requirements. These gaps are along such margins as range, charging time, tire and battery wear, and perhaps most importantly, affordability. The planners have failed to foresee the massive demands on the power grid of a forced replacement of the internal combustion auto stock with EVs. The planners elide the true nature of EV-driven emissions, which are never zero carbon but instead depend on the mix of power sources used to charge EV batteries. Finally, EV mandates show that the industrial planners are oblivious to other environmental burdens inherent in EVs, whatever their true carbon footprint might be.
About a year ago I wrote about the sketchy nature of carbon credits (or “offsets”), which are purchased by people or entities whose actions generate CO2 emissions they’d like to offset. Those actions would include Taylor Swift’s private air travel, electric power generation, and many other activities whose participants wish to have “greenwashed”.
One short digression before I get started: see those black clouds of CO2 in the image above? Well, carbon dioxide doesn’t really look like that. In fact, CO2 is transparent. Trees breathe it! Visually, it’s less obvious than the greenhouse gas known as water vapor in those puffy white clouds, but virtually every image you’ll ever see on-line depicting CO2 emissions shows dark, roiling smoke. I just hate to spoil the scary effect, but there it is.
Back to carbon credits, which help fund projects that offset CO2 emissions (at least theoretically), such as planting new forest acreage (which would absorb CO2 … someday) or preventing deforestation. Other types of offset activities include investment in renewable energy projects and carbon capture technology. So, for example, if a utility’s power generation emits CO2, the creation or preservation of some amount of forested acreage can serve as a carbon sink adequate to offset the utility’s emissions. Net zero! Or so the utility might claim.
If only it were that simple! Paul Mueller explains that the incentive structure of these arrangements is perverse. What if credits are sold on the basis of supposed efforts to preserve forests that were never at risk to begin with? In fact, the promise of revenue from the sale of credits may be a powerful incentive to falsely present forested lands as targets for development. For that matter, cutting forestland for lumber makes more sense if it can be replanted immediately in exchange for revenue from the sale of carbon credits. And newly planted acreage won’t lead to absorption of much CO2 for many years, until the trees begin to mature. Then there are the risks of forest fires or disease that could compromise a forest’s ultimate value as a carbon sink.
Whether through fraud, calamity, or mismanagement, the sad truth is that projects serving as a basis for credits have done far less to reduce deforestation than promised. On top of that, another issue plaguing carbon markets for some time has been double counting of offsets, which can occur under several circumstances. Ultimately, CO2 emissions themselves may have done more to promote the growth of forests than purchases of carbon credits, because CO2 gives life to vegetation!
Obviously, the purchase of offsets raises the incremental cost of any project having CO2 emissions. The incidence of this added cost is borne to a large extent by consumers, especially because power demand is fairly inelastic. The craziness of offset logic may even dictate the purchase of offsets when a plant emitting more CO2 (e.g., coal) is replaced by a plant emitting less (natural gas), because the replacement would still emit carbon!
Some carbon offsets help pay for the construction of renewable power facilities like wind and solar farms. These renewable power facilities contribute to the power supply, of course, but wind turbines and solar farms typically operate at a small fraction of nameplate capacity due to the intermittency of wind and sunshine. Thus, these offsets are far less than complete. And from that low rate of renewable utilization we can deduct another fraction: periods of actual utilization often occur when no one wants the power, and while utilities can sell that excess power into the grid, it doesn’t replace other power at those times and it therefore doesn’t contribute to reductions in CO2 emissions.
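A minimal sketch of the two discounts just described, applied to a renewable-backed offset. The nameplate figure, the capacity factor, and the share of output that actually displaces fossil generation are all assumptions for illustration, not data from any project.

```python
# Illustrative only: how much of a renewable project's nameplate "offset"
# survives the two discounts described above? Every input is an assumption.

NAMEPLATE_OFFSET_MWH = 100_000   # offset claimed at full nameplate output (hypothetical)
CAPACITY_FACTOR = 0.30           # assumed share of potential output actually generated
DISPLACEMENT_SHARE = 0.70        # assumed share of that output that displaces fossil power
                                 # (the rest arrives when it isn't needed)

effective_offset = NAMEPLATE_OFFSET_MWH * CAPACITY_FACTOR * DISPLACEMENT_SHARE
print(f"Effective offset: ~{effective_offset:,.0f} MWh "
      f"({effective_offset / NAMEPLATE_OFFSET_MWH:.0%} of the nameplate claim)")
```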
Claims of achieving net zero are very much in vogue in the corporate world, and for a few related reasons. One is that they help keep activists and protesters away from the gates. There are, however, plenty of activists serving on corporate boards, in the executive suite, and among regulators.
The purchase of carbon offsets by “socially responsible corporations” might put stakeholder pressure on competitors who are “insufficiently green”. That would help to compensate for the higher costs imposed by offsets. After all, carbon credits are not cheap. In fact, smaller competitors might struggle to fund additional outlays for the credits.
Finally, claims of carbon neutrality also help with another constituency: “woke” investors. “Achieving” net zero boosts a firm’s so-called ESG score, presumed to reflect soundness in terms of environmental (E) and social (S) responsibility, as well as the quality of internal governance (G). With firms jockeying for ESG improvements, they help keep the offset charade going.
There is no common standard for calculating ESG, and there is considerable variance in ESG scores across rating firms. This should be cause for great skepticism, but too many investors are vulnerable to suggestions that screening on ESGs can enable both social responsibility and better returns. Sadly, they are sometimes paying higher fees for the privilege. The ESG fad among these investors might have helped fulfill hopes of greater returns for a while, but the imagined ESG advantage may have faded.
Carbon credits or offsets are plagued by bad incentives that often lead to wasteful outlays if not outright fraud. At present, they generally fail to reduce atmospheric CO2 as promised and they contribute to higher costs, which are passed on to consumers. They also serve as an unworthy basis for higher ESG scores, which are something of a sham in any case.
There have been efforts underway to improve the quality and legitimacy of carbon offsets. Some of this is voluntary due diligence on the part of purchasers. The effort also includes various NGOs and regulators. Ultimately, the drive for quality is likely to push the price of offsets upward dramatically. Perhaps offsets will become more credible, but they won’t come cheap. The cost of achieving net zero targets will largely come out of consumers’ pockets, and those net zeros will still be nominal at best.
First, a preliminary issue: many resources qualify as commons in the very broadest sense, yet free societies have learned over time that many resources are used much more productively when property rights are assigned to individuals. For example, modern agriculture owes much to defining exclusive property rights to land so that conflicting interests don’t have to compete (e.g., the farmer and the cowman). Federal land is treated as a commons, however. There is a rich history on the establishment of property rights, but within limits, the legal framework in place can define whether a resource is treated as a commons, a club good, or private property. The point here is that there are substantial economic advantages to preserving strong property rights, rather than treating all resources as communal.
The authors of the planetary commons (PC) paper present a rough sketch for governance over use of the planet’s resources, given their belief that a planetary crisis is unfolding before our eyes. The paper has two main thrusts as I see it. One is to broadly redefine virtually all physical resources as common pool interests because their use, in the authors’ view, may entail some degree of external cost involving degradation of the biosphere. The second is to propose centralized, “planetary” rule-making over the amounts and ways in which those resources are used.
It’s an Opinion Piece
The PC paper is billed as the work product of a “collaborative team of 22 leading international researchers”. This group includes four attorneys (one of whom was a lead author) and one philosopher. Climate impact researchers are represented, who undoubtedly helped shape assumptions about climate change and its causes that drive the PC’s theses. (More on those assumptions in a section below.) There are a few social scientists of various stripes among the credited authors, one meteorologist, and a few “sustainability”, “resilience”, and health researchers. It’s quite a collection of signees, er… “research collaborators”.
Grabby Interventionists
The reasoning underlying a “planetary commons” (PC) is that the planet’s biosphere qualifies as a commons. The biosphere must include virtually any public good like air and sunshine, any common good like waterways, or any private good or club good. After all, any object can play host to tiny microbes regardless of ownership status. So the PC authors’ characterization of the planet’s biosphere as a commons is quite broad in terms of conventional notions of resource attributes.
We usually think of spillover or external costs as arising from some use of a private resource that imposes costs on others, such as air or water pollution. However, mere survival requires that mankind exploit both public and non-public resources, acts that can always be said to impact the biosphere in some way. Efforts to secure shelter, food, and water all impinge on the earth’s resources. To some extent, mankind must use and shape the biosphere to succeed, and it’s our natural prerogative to do so, just like any other creature in the food chain.
Even if we are to accept the PC paper’s premise that the entire biosphere should be treated as a commons, most spillovers are de minimis. From a public policy perspective, it makes little sense to attempt to govern over such minor externalities. Monitoring behavior would be costly, if not impossible, at such an atomistic level. Instead, free and civil societies rely on a high degree of self-governance and informal enforcement of ethical standards to keep small harms to a minimum.
Unfortunately, the identification and quantification of meaningful spillover costs is not always clear-cut. This has led to an increasingly complex regulatory environment, an increasingly litigious business environment, and efforts by policymakers to manage the detailed inputs and outputs of the industrial economy.
All of that is costly in its own right, especially because the activities giving rise to those spillovers often enable large welfare enhancements. Regulators and planners face great difficulties in estimating the costs and benefits of various “correctives”. The very undertaking creates risk that often exceeds the cost of the original spillover. Nevertheless, the PC paper expands on the murkiest aspects of spillover governance by including “… all critical biophysical Earth-regulating systems and their functions, irrespective of where they are located…” as part of a commons requiring “… additional governance arrangements….”
Adoption of the PC framework would authorize global interventions (and ultimately local interventions, including surveillance) on a massive scale based on guesswork by bureaucrats regarding the evolution of the biosphere.
Ostrom Upside Down
Not only would the PC framework represent an expansion of the grounds for intervention by public authorities, it seeks to establish international authority for intervention into public and private affairs within sovereign states. The authors attempt to rationalize such far-reaching intrusions in a rather curious way:
“Drawing on the legacy of Elinor Ostrom’s foundational research, which validated the need for and effectiveness of polycentric approaches to commons governance (e.g., ref. 35, p. 528, ref. 36, p. 1910), we propose that a nested Earth system governance approach be followed, which will entail the creation of additional governance arrangements for those planetary commons that are not yet adequately governed.”
Anyone having a passing familiarity with Elinor Ostrom’s work knows that she focused on the identification of collaborative solutions to common goods problems. She studied voluntary and often strictly private efforts among groups or communities to conserve common pool resources, as opposed to state-imposed solutions. Ostrom accepted assigned rights and pricing solutions to managing common resources, but she counseled against sole reliance on market-based tools.
Surely the PC authors know they aren’t exactly channeling Ostrom:
“An earth system governance approach will require an overarching global institution that is responsible for the entire Earth system, built around high-level principles and broad oversight and reporting provisions. This institution would serve as a universal point of aggregation for the governance of individual planetary commons, where oversight and monitoring of all commons come together, including annual reporting on the state of the planetary commons.”
Polycentricity was used by Ostrom to describe the involvement of different, overlapping “centers of authority”, such as individual consumers and producers, cooperatives formed among consumers and producers, other community organizations, local jurisdictions, and even state or federal regulators. Some of these centers of authority supersede others in various ways. For example, solutions developed by cooperatives or lower centers of authority must align with the legal framework within various government jurisdictions. However, as David Henderson has noted, Ostrom observed that management of pooled resources at lower levels of authority was generally superior to centralized control. Henderson quotes Ostrom and a co-author on this point:
“When users are genuinely engaged in decisions regarding rules affecting their use, the likelihood of them following the rules and monitoring others is much greater than when an authority simply imposes rules.”
The authors of the PC have something else in mind, and they bastardize the spirit of Ostrom’s legacy in the process. For example, the next sentence is critical for understanding the authors’ intent:
“If excessive emissions and harmful activities in some countries affect planetary commons in other areas—for example, the melting of polar ice—strong political and legal restrictions for such localized activities would be needed.”
Of course, there are obvious difficulties in measuring impacts of various actions on polar ice, assigning responsibility, and determining the appropriate “restrictions”. But in essence, the PC paper advocates for a top-down model of governance. Polycentrism is thus reduced to “you do as we say”, which is not in the spirit of Ostrom’s research.
Planetary Governance
Transcending national sovereignty on questions of the biosphere is key to the authors’ ambitions. At a bare minimum, the authors desire legally-binding commitments to international agreements on environmental governance, unlike the unenforceable promises made for the Paris Climate Accords:
“At present, the United Nations General Assembly, or a more specialized body mandated by the Assembly, could be the starting point for such an overarching body, even though the General Assembly, with its state-based approach that grants equal voting rights to both large countries and micronations, represents outdated traditions of an old European political order.”
But the votes of various “micronations” count for zilch when it comes to real “claims” on the resources of other sovereign nations! Otherwise, there is nothing “voluntary” about the regime proposed in the PC paper.
“A challenge for such regimes is to duly adapt and adjust notions of state sovereignty and self-determination, and to define obligations and reciprocal support and compensation schemes to ensure protection of the Earth system, while including comprehensive stewardship obligations and mandates aimed at protecting Earth-regulating systems in a just and inclusive way.”
So there! The way forward is to adopt the broadest possible definition of market failure and global regulation of any and all private activity touching on nature in any way. And note here a similarity to the Paris Accords: achieving commitments would fall to national governments whose elites often demonstrate a preference for top-down solutions.
Ah Yes, Redistribution
It should be apparent by now that the PC paper follows a now well-established tradition in multi-national climate “negotiations” to serve as subterfuge for redistribution (which, incidentally, includes the achievement of interspecies justice):
“For instance, a more equal sharing of the burdens of climate stabilization would require significant multilateral financial and technology transfers in order not to harm the poorest globally (116).”
The authors insist that participation in this governance would be “voluntary”, but the following sentence seems inconsistent with that assurance:
“… considering that any move to strengthen planetary commons governance would likely be voluntarily entered into, the burdens of conservation must be shared fairly (115).”
Wait, what? “Voluntary” at what level? Who defines “fairness”? The authors approvingly offer this paraphrase of the words of Brazilian President Lula da Silva:
“… who affirmed the Amazon rainforest as a collective responsibility which Brazil is committed to protect on behalf of all citizens around the world, and that deserves and justifies compensation from other nations (117).”
Let Them Eat Cake
Furthermore, PC would require de-growth and so-called “sufficiency” for thee (i.e., be happy with less), if not for those who’ll design and administer the regime.
“… new principles that align with novel Anthropocene dynamics and that could reverse the path-dependent course of current governance. These new principles are captured under a new legal paradigm designed for the Anthropocene called earth system law and include, among others, the principles of differentiated degrowth and sufficiency, the principle of interconnectivity, and a new planetary ethic (e.g., principle of ecological sustainability) (134).”
If we’re to take the PC super-regulators at their word, the regulatory regime would impinge on fertility decisions as well. Just who might we trust to govern humanity thusly? If we’re wise enough to apply the Munger Test, we wouldn’t grant that kind of power to our worst enemy!
Global Warmism
The underlying premise of the PC proposal is that a global crisis is now unfolding before our eyes: anthropogenic global warming (AGW). The authors maintain that emissions of carbon dioxide are the cause of rising temperatures, rapidly rising sea levels, more violent weather, and other imminent disasters.
“It is now well established that human actions have pushed the Earth outside of the window of favorable environmental conditions experienced during the Holocene…”
“Earth system science now shows that there are biophysical limits to what existing organized human political, economic, and other social systems can appropriate from the planet.”
For a variety of reasons, both of these claims are more dubious than one might suppose based on popular narratives. As for the second of these, mankind’s limitless capacity for innovation is a more powerful force for sustainability than the authors would seem to allow. On the first claim, it’s important to note that the PC paper’s forebodings are primarily based on modeled, prospective outcomes, not historical data. The models are drastically oversimplified representations of the earth’s climate dynamics driven by exogenous carbon forcing assumptions. Their outputs have proven to be highly unreliable, overestimating warming trends almost without exception. These models exaggerate climate sensitivity to carbon forcings, and they largely ignore powerful natural forcings such as variations in solar irradiance, geological heating, and even geological carbon forcings. The models are also notorious for their inadequate treatment of feedback effects from cloud cover. Their predictions of key variables like water vapor are wildly in error.
The measurement of the so-called “global temperature” is itself subject to tremendous uncertainty. Weather stations come and go. They are distributed very unevenly across land masses, and measurement at sea is even sketchier. Averaging all these temperatures would be problematic even if there were no other issues… but there are. Individual stations are often sited poorly, including distortions from heat island effects. Aging of equipment creates a systematic upward bias, but correcting for that bias (via so-called homogenization) causes a “cooling the past” bias. It’s also instructive to note that the increase in global temperature from pre-industrial times actually began about 80 years prior to the onset of more intense carbon emissions in the 20th century.
Climate alarmists often speak in terms of temperature anomalies, rather than temperature levels. In other words, to what extent do temperatures differ from long-term averages? The magnitude of these anomalies, using the past several decades as a base, tends to be anywhere from zero degrees to well above one degree Celsius, depending on the year. Relative to temperature levels, the anomalies are a small fraction. Given the uncertainty in temperature levels, the anomalies themselves are dwarfed by the noise in the original series!
Pick Your Own Tipping Point
It seems that “tipping point” scares are heavily in vogue at the moment, and the PC proposal asks us to quaff deeply of these narratives. Everything is said to be at a tipping point into irrecoverable disaster that can be forestalled only by reforms to mankind’s unsustainable ways. To speak of the possibility of other causal forces would be a sacrilege. There are supposed tipping points for the global climate itself as well as tipping points for the polar ice sheets, the world’s forests, sea levels and coastal environments, severe weather, and wildlife populations. But none of this is based on objective science.
For example, the 1.5 degree limit on global warming is a wholly arbitrary figure invented by the IPCC for the Paris Climate Accords, yet the authors of the PC proposal would have us believe that it was some sort of scientific determination. And it does not represent a tipping point. Cliff Mass explains that climate models do not behave as if irreversible tipping points exist.
Likewise, the rise of sea levels has not accelerated from prior trends, so it has nothing to do with carbon forcing.
One thing carbon forcings have accomplished is a significant greening of the planet, which, if anything, bodes well for the biosphere.
What about the disappearance of the polar ice sheets? On this point, Cliff Mass quotes Chapter 3 of the IPCC’s Special Report on the implications of 1.5C or more warming:
“there is little evidence for a tipping point in the transition from perennial to seasonal ice cover. No evidence has been found for irreversibility or tipping points, suggesting that year-round sea ice will return given a suitable climate.”
The PC paper also attempts to connect global warming to increases in forest fires, but that’s incorrect: there has been no increasing trend in forest fires or annual burned acreage. If anything, trends in measures of forest fire activity have been negative over the past 80 years.
Concluding Thoughts
The alarmist propaganda contained in the PC proposal is intended to convince opinion leaders and the public that they’d better get on board with draconian and coercive steps to curtail economic activity. They appeal to the sense of virtue that must always accompany consent to authoritarian action, and that means vouching for sacrifice in the interests of environmental and climate equity. All the while, the authors hide behind a misleading version of Elinor Ostrom’s insights into the voluntary and cooperative husbandry of common pool resources.
One day we’ll be able to produce enough carbon-free energy to accommodate high standards of living worldwide and growth beyond that point. In fact, we already possess the technological know-how to substantially reduce our reliance on fossil fuels, but we lack the political will to avail ourselves of nuclear energy. With any luck, that resistance will soften as modular nuclear units are installed.
Ultimately, we’ll see advances in fusion technology, beamed non-intermittent solar power from orbital collection platforms, advances in geothermal power, and effective carbon capture. Developing these technologies and implementing them at global scales will require massive investments that can be made possible only through economic growth, even if that means additional carbon emissions in the interim. We must unleash the private sector to conduct research and development without the meddling and clumsy efforts at top-down planning that typify governmental efforts (including an end to mandates, subsidies, and taxes). We must also reject ill-advised attempts at geoengineered cooling that are seemingly flying under the regulatory radar. Meanwhile, let’s save ourselves a lot of trouble by dismissing the interventionists in the planetary commons crowd.
In advanced civilizations the period loosely called Alexandrian is usually associated with flexible morals, perfunctory religion, populist standards and cosmopolitan tastes, feminism, exotic cults, and the rapid turnover of high and low fads---in short, a falling away (which is all that decadence means) from the strictness of traditional rules, embodied in character and inforced from within. -- Jacques Barzun