
Sacred Cow Chips


Category Archives: Global Warming

Wind, Solar, and the Five Circles of Dormant Capital

22 Monday Apr 2024

Posted by Nuetzel in Energy, Global Warming, Industrial Policy


Tags

Backup Power, Battery Technology, Capacity Factors, Center of the American Experiment, Climate Change, Dante’s Inferno, Dispatchable Power, Dormant Capital, Fossil Fuels, Green Energy, Imposed Costs, Industrial Planning, Isaac Orr, Mackinac Center for Public Policy, Malinvestment, Mitch Rolling, Power Outages, Power Transmission, Solar Energy, Space Based Solar Power, Subsidies, Wind Energy

This is a first for me…. The following is partly excerpted from a post of two weeks ago, but I’ve made a number of edits and additions. The original post was way too long. This is a bit shorter, and I hope it distills a key message.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Failures of industrial policies are nothing new, but the current manipulation of electric power generation by government in favor of renewable energy technologies is egregious. These interventions are a reaction to an overwrought climate crisis narrative, but they have many shortcomings and risks of their own. Chief among them is whether the power grid will be capable of meeting current and future demand for power while relying heavily on variable resources, namely wind and sunshine. The variability implies idle and drastically underutilized hours every day without any ability to call upon the assets to produce when needed.

The variability is vividly illustrated by the chart above showing a representative daily profile of power demand versus wind and solar output. Below, with apologies to Dante, I describe the energy hellscape into which we’re being driven by irrational capital outlays. These projects would be flatly rejected by any rational investor but for the massive subsidies afforded by government.

The First Circle of Dormancy: Low Utilization

Wind and solar power assets have relatively low rates of utilization due to the intermittency of wind and sunshine. Capacity factors for wind turbines averaged almost 36% in the U.S. in 2022, while solar facilities averaged only about 24%. This compared with nuclear power at almost 93%, natural gas (66%), and coal (48%).

Despite their low rates of utilization, new wind and solar facilities are always touted at their full nameplate capacity. We hear a great deal about “additions to capacity”, which overstate the actual power-generating potential by factors of three to four times. More importantly, this also means wind and solar power costs per unit of output are often vastly understated. These assets contribute less economic value to the electric grid than more heavily utilized generating assets.
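The scale of the overstatement is easy to sketch. Here’s a minimal Python illustration using the capacity factors cited above; the 100 MW facility is hypothetical:

```python
# Sketch: nameplate capacity vs. expected output, using the capacity
# factors cited above (2022 U.S. averages). The 100 MW facility is hypothetical.
capacity_factors = {"wind": 0.36, "solar": 0.24, "nuclear": 0.93,
                    "natural gas": 0.66, "coal": 0.48}

def expected_output_mwh(nameplate_mw, capacity_factor, hours_per_year=8760):
    """Expected annual output: nameplate rating scaled by the capacity factor."""
    return nameplate_mw * capacity_factor * hours_per_year

nameplate = 100  # MW
for source, cf in capacity_factors.items():
    mwh = expected_output_mwh(nameplate, cf)
    print(f"{source:12s} {mwh:>9,.0f} MWh/yr; nameplate overstates potential {1/cf:.1f}x")
```

At a 24% capacity factor, “capacity” overstates usable solar potential by better than four times; for wind, by nearly three.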

Sometimes wind and solar facilities are completely idle or dormant. Sometimes they operate at just a fraction of capacity. In what follows, I use the terms “idle” and “dormant” loosely to mean not only assets operating at low levels of utilization, but also those prone to low utilization and falling within the Second Circle of Dormancy.

The Second Circle of Dormancy: Non-Dispatchability

The First Circle of Dormancy might be more like a Purgatory than a Hell. That’s because relatively low average utilization of an asset could be justifiable if demand is subject to large fluctuations. This is often the case, as with assets like roads, bridges, restaurants, amusement parks, and many others. However, capital invested in wind and solar facilities is idle on an uncontrollable basis, which is more truly condemnable. Wind and solar do not provide “dispatchable” power, meaning they are not “on call” in any sense during idle or less productive periods. Not only is their power output uncontrollable, it is not entirely predictable.

Again, variable but controllable utilization allows flexibility and risk mitigation in many applications. But when utilization levels are uncontrollable, the capital in question has greatly diminished value to the power grid and to power customers relative to dispatchable sources having equivalent capacity and utilization. It’s no wonder that low utilization, variability, and non-dispatchability are underemphasized or omitted by promoters of wind and solar energy. This sort of uncontrollable down-time is a drain on real economic returns to capital.

The Third Circle of Dormancy: Transmission Infrastructure

The idleness that besets the real economic returns to wind and solar power generation extends to the transmission facilities necessary for getting power to the grid. Transmission facilities are costly, but that cost is magnified by the broad spatial distribution of wind and solar generating units. Transmission from offshore facilities is particularly complex. When wind turbines and solar panels are dormant, so are the transmission facilities needed to reach them. Thus, low utilization and the non-dispatchability of those units diminish the value of the capital that must be committed for both power generation and its transmission.

The Fourth Circle of Dormancy: Backup Power Assets

The reliability of the grid requires that any commitment to variable wind and solar power must also include a commitment to back-up capacity. Consider, for example, the shipping concerns now experimenting with sails on cargo ships. What is the economic value of such a ship without back-up power? Can you imagine these vessels drifting in the equatorial calms for days on end? Even light winds would slow the transport of goods significantly. Idle, non-dispatchable capital is unproductive capital.

Likewise, solar-powered signage can underperform or fail over the course of several dark, wintry days, even with battery backup. The signage is more reliable and valuable when it is backed-up by another power source. But again, idle, non-dispatchable capital is unproductive capital.

The needed provision of backup power sources represents an imposed cost of wind and solar, which is built into the cost estimates shown in a section below. But here’s another case of dormancy: some part of the capital commitment, either primary energy sources or the needed backups, will be idle regardless of wind and solar conditions… all the time. Of course, back-up power facilities should be dispatchable because they must serve an insurance function. Backup power therefore has value in preserving the stability of the grid even while completely idle. However, at best that value offsets a small part of the social loss inherent in primary reliance on variable and non-dispatchable power sources.

We can’t wholly “replace” dispatchable generating capacity with renewables without serious negative consequences. At the same time, maintaining existing dispatchable power sources as backup carries a considerable cost at the margin for wind and solar. At a minimum, it requires normal maintenance on dispatchable generators, periodic replacement of components, and an inventory of fuel. If renewables are intended to meet growth in power demand, the imposed cost is far greater because backup sources for growth would require investment in new dispatchable capacity.
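A back-of-the-envelope sketch makes the point (the numbers here are my own illustration, not from the sources cited): serving 1 GW of new demand with wind at a 36% capacity factor requires both a large wind nameplate and near-full dispatchable backup.

```python
# Back-of-the-envelope sketch (illustrative numbers, not from the sources
# cited): capital needed to serve 1 GW of new demand with wind plus backup.
def capital_needed_gw(demand_gw, capacity_factor, backup_fraction=1.0):
    """Nameplate wind/solar capacity plus dispatchable backup, both in GW."""
    renewable_nameplate = demand_gw / capacity_factor
    backup = demand_gw * backup_fraction  # backup sized near peak demand
    return renewable_nameplate, backup

wind_nameplate, backup = capital_needed_gw(1.0, 0.36)  # 36% wind capacity factor
print(f"wind nameplate: {wind_nameplate:.2f} GW, dispatchable backup: {backup:.2f} GW")
```

Nearly 3 GW of wind nameplate plus roughly 1 GW of dispatchable backup, all to serve 1 GW of demand.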

The Fifth Circle of Dormancy: Outages

The pursuit of net-zero carbon emissions via wind and solar power creates uncontrollably dormant capital, which increasingly lacks adequate backup power. Providing that backup should be a priority, but it’s not.

Perhaps much worse than the cost of providing backup power sources is the risk and imposed cost of grid instability in their absence. That cost would be borne by users in the form of outages. Users are placed at increasing risk of losing power at home, at the office and factories, at stores, in transit, and at hospitals. This can occur at peak hours or under potentially dangerous circumstances like frigid or hot weather.

Outage risks include another kind of idle capital: the potential for economy-wide shutdowns of all electrified physical capital across a particular region. Not only can grid failure lead to economy-wide idle capital, but this risk transforms all capital powered by electricity into non-dispatchable productive capacity.

Reliance on wind and solar power makes backup capacity an imperative. Better still, just scuttle the wind and solar binge and provide for growth with reliable sources of power!

Quantifying Infernal Costs

A “grid report card” from the Mackinac Center for Public Policy gets right to the crux of the imposed-cost problem:

“… the more renewable generation facilities you build, the more it costs the system to make up for their variability, and the less value they provide to electricity markets.”

The report card uses cost estimates for Michigan from the Center of the American Experiment. Here are the report’s average costs per MWh through 2050, including the imposed costs of backup power:

—Existing coal plant: $33/MWh

—Existing gas-powered: $22

— New wind: $180

—New solar: $278

—New nuclear reactor (light water): $74

—Small modular reactor: $185

—New coal plant: $106 with carbon capture and storage (CCS)

—New natural gas: $64 with CCS

It should be no surprise that existing coal and gas facilities are the most cost effective. Preserve them! Of the new installations, natural gas is the least costly, followed by the light water reactor and coal. New wind and solar capacity are particularly costly.
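For a sense of relative magnitudes, the quoted estimates can be ranked against the cheapest source. A quick sketch using the figures above:

```python
# A quick ranking of the Mackinac/Center of the American Experiment cost
# estimates quoted above (average $/MWh through 2050, incl. imposed costs).
costs_per_mwh = {
    "existing coal": 33,
    "existing gas": 22,
    "new wind": 180,
    "new solar": 278,
    "new nuclear (light water)": 74,
    "small modular reactor": 185,
    "new coal + CCS": 106,
    "new gas + CCS": 64,
}

for name, cost in sorted(costs_per_mwh.items(), key=lambda kv: kv[1]):
    ratio = cost / costs_per_mwh["existing gas"]
    print(f"${cost:>3}/MWh  {name:<27s} ({ratio:.1f}x existing gas)")
```

New solar comes in at more than twelve times the cost of an existing gas plant; new wind at more than eight times.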

Proponents of net zero are loath to recognize the imposed cost of backup power for two reasons. First, it is a real cost that can be avoided by society only at the risk of grid instability, something they’d like to ignore. To them, it represents something of an avoidable external cost. Second, at present, backup dispatchable power would almost certainly entail CO2 emissions, violating the net zero dictum. But in attempting to address a presumed externality (climate warming) by granting generous subsidies to wind and solar investors, the government and NGOs impose a cost on society with far more serious and immediate consequences.

Deadly Sin: Subsidizing Dormant Capital

Wind and solar capital outlays are funded via combinations of private investment and public subsidies, and the former is very much contingent on the latter. That’s because the flood of subsidies is what allows private investors a chance to profit from uncontrollably dormant capital. Wind and solar power are far more heavily subsidized than fossil fuels, as noted by Mitch Rolling and Isaac Orr:

“In 2022, wind and solar generators received three and eighteen times more subsidies per MWh, respectively, than natural gas, coal, and nuclear generators combined. Solar is the clear leader, receiving anywhere from $50 to $80 per MWh over the last five years, whereas wind is a distant second at $8 to $10 per MWh …. Renewable energy sources like wind and solar are largely dependent on these subsidies, which have been ongoing for 30 years with no end in sight.”

But even generous subsidies often aren’t enough to ensure financial viability. Rent-enabled malinvestments like these crowd out genuinely productive capital formation. Those lost opportunities span the economy and are not limited to power plants that might otherwise have used fossil fuels.

Despite billions of dollars in “green energy” subsidies, bankruptcy has been all too common among wind and solar firms. That financial instability demonstrates the uneconomic nature of many wind and solar investments. Bankruptcy pleadings represent yet another way investors are insulated against wind and solar losses.

Subsidized Off-Hour (Wasted) Output

This almost deserves a sixth circle, except that it’s not about dormancy. Wind and solar power are sometimes available when they’re not needed, in which case the power goes unused because we lack effective power storage technology. Battery technology has a long way to go before it can overcome this problem.

When wind and solar facilities generate power during off-hours, their operators are nevertheless paid for it by selling it into the grid, even though it goes unused. It’s another subsidy to wind and solar power producers, and one that undermines incentives for investment in batteries.

A Path To Redemption

Space-based solar power beamed to earth may become a viable alternative to terrestrial wind and solar production within a decade or so. The key advantages would be constancy and the lack of an atmospheric filter on available solar energy, producing power 13 times as efficiently as earth-bound solar panels. From the last link:

“The intermittent nature of terrestrial renewable power generation is a major concern, as other types of energy generation are needed to ensure that lights stay on during unfavorable weather. Currently, electrical grids rely either on nuclear plants or gas and coal fired power stations as a backup…. “

Construction of collection platforms in geostationary orbit will take time, of course, but development of space-based solar should be a higher priority than blanketing vast tracts of land with inefficient solar panels while putting power users at risk of outages.

No Sympathy for Malinvestment

This post identified five ways in which investments in wind and solar power create frequent and often extended periods of damnably dormant physical capital:

  • Low Utilization
  • Non-Dispatchable Utilization
  • Idle Transmission Infrastructure
  • Idle Backup Generators
  • Outages of All Electrified Capital

Power demand is expected to soar given the coming explosion in AI applications, and especially if the heavily-subsidized and mandated transition to EVs comes to pass. But that growth in demand will not and cannot be met by relying solely on renewable energy sources. Their variability implies substantial idle capacity, higher costs, and service interruptions. Such a massive deployment of dormant capital represents an enormous waste of resources, and the sad fact is it’s been underway for some time.

In the years ahead, the net-zero objective will motivate more bungled industrial planning as a substitute for market-driven forces. Costs will be driven higher by the imposed costs of backup capacity and/or outages. Ratepayers, taxpayers, and innocents will all share these burdens.

Creating idle, non-dispatchable physical capital is malinvestment which diminishes future economic growth. The boom in wind and solar activity began in earnest during the era of negative real interest rates. Today’s higher rates might slow the malinvestment, but they won’t bring it to an end without a substantial shift in the political landscape. Instead, taxpayers will shoulder an even greater burden, as will ratepayers whose power providers are guaranteed returns on their regulated rate bases.

Tangled Up In Green Industrial Policy II: Rewarding Idle Capital

06 Saturday Apr 2024

Posted by Nuetzel in Energy, Global Warming, Industrial Policy


Tags

AI, Capacity Factors, Carbon Capture, Casey Handmer, Center of the American Experiment, Charles Glasser, crowding out, Dispatchable Power, EV Mandates, Externalities, Heat Island Effect, Hydrocarbons, Idle Capital, IMF, Imposed Cost, Industrial Policy, Institute for Energy Research, Lazard Levelized Costs, Lionel Shriver, Long Tailpipe, Mackinac Center for Public Policy, Malinvestment, Modular Reactors, Natural Gas, Net Zero, Nuclear Fusion, Power Transmission, Production Possibilities, Renewable energy, Simon P. Michaux, Subsidies, Toxicity, Travis Fisher, Wildlife Hazards

A week ago I posted about electrification and particularly EV mandates, one strand of government industrial policy under which non-favored sectors of the economy must labor. This post examines a related industrial policy: manipulation of power generation by government policymakers in favor of renewable energy technologies, while fossil fuels are targeted for oblivion. These interventions are a reaction to an overwrought climate crisis narrative, but they present many obstacles, oversights and risks of their own. Chief among them is whether the power grid will be capable of meeting current and future demand for power while relying heavily on variable resources: wind and sunshine.

Like almost everything I write, this post is too long! Here is a guide to what follows. Scroll down to whatever sections might be of interest:

  • Malinvestment: Idle capital
  • Key Considerations to chew on
  • False Premises: zero CO2? Low cost?
  • Imposed Cost: what and how much?
  • Supporting Growth: with renewables?
  • Resource Constraints: they’re tight!
  • Technological Advance: patience!
  • The Presumed Elephant: CO2 costs
  • Conclusion

Malinvestment

The intermittency of wind and solar power creates a fundamental problem of physically idle capital, which leaves the economy short of its production possibilities. To clarify, capital invested in wind and solar facilities is often idle in two critical ways. First, wind and solar assets have relatively low rates of utilization because of their variability, or intermittency. Second, neither provides “dispatchable” power: it is not “on call” in any sense during those idle periods, which are not entirely predictable. Wind and solar assets therefore contribute less value to the electric grid than dispatchable sources of power having equivalent capacity and utilization.

Is “idle capital” a reasonable characterization? Consider the shipping concerns that are now experimenting with sails on cargo ships. What is the economic value of such a ship without back-up power? Can you imagine them drifting in the equatorial calms for days on end? Even light winds would slow the transport of goods significantly. Idle capital might be bad enough, but a degree of idleness allows flexibility and risk mitigation in many applications. Idle, non-dispatchable capital, however, is unproductive capital.

Likewise, solar-powered signage can underperform or fail over the course of several dark, wintry days, even with battery backup. The signage is more reliable and valuable when it is backed-up by another power source. Again, idle, non-dispatchable capital is unproductive capital.

The pursuit of net-zero carbon emissions via wind and solar power creates idle capital, which increasingly lacks adequate backup power. Providing that backup should be a priority, but it’s not. This misguided effort is funded from both private investment and public subsidies, but the former is very much contingent on the latter. That’s because the flood of subsidies is what allows private investors to profit from idle capital. Rent-enabled investments like these crowd out genuinely productive capital formation, which is not limited to power plants that might otherwise use fossil fuels.

Creating idle or unemployed physical capital is malinvestment, and it diminishes future economic growth. The surge in this activity began in earnest during the era of negative real interest rates. Today, in an era of higher rates, taxpayers can expect an even greater burden, as can ratepayers whose power providers are guaranteed returns on their regulated rate bases.

Key Considerations

The forced transition to net zero will be futile, but especially if wind and solar energy are the primary focus. Keep the following in mind:

  • The demand for electricity is expected to soar, and soon! Policymakers have high hopes for EVs, and while adoption rates might fall well short of their goals, they’re doing their clumsy best to force EVs down our throats with mandates. But facilitating EV charging presents difficulties. Lionel Shriver states the obvious: “Going Electric Requires Electricity”. Reliable electricity!
  • Perhaps more impressive than prospects for EVs is the expected growth in power demand from data centers required by the explosion of artificial intelligence applications across many industries. It’s happening now! This will be magnified with the advent of artificial general intelligence (AGI).
  • Dispatchable power sources are needed to back-up unreliable wind and solar power to ensure service continuity. Maintaining backup power carries a huge “imposed cost” at the margin for wind and solar. At present, that would entail CO2 emissions, violating the net zero dictum.
  • Perhaps worse than the cost of backup power would be the cost borne by users under the complete elimination of certain dispatchable power sources. An imposed cost then takes the form of outages. Users are placed at risk of losing power at home, at the office and factories, at stores, in transit, and at hospitals at peak hours or under potentially dangerous circumstances like frigid or hot weather.
  • Historically, dispatchable power has allowed utilities to provide reliable electricity on-demand. Just flip the switch! This may become a thing of the past.
  • Wind and solar power are sometimes available when they’re not needed, in which case the power goes unused because we lack effective power storage technology.
  • Wind and solar power facilities operate at low rates of utilization, yet new facilities are always touted at their full nameplate capacity. Capacity factors for wind turbines averaged almost 36% in the U.S. in 2022, while solar facilities averaged only about 24%. This compared with nuclear power at almost 93%, natural gas (66%), and coal (48%). Obviously, the low capacity factors for wind and solar reflect their variable nature, rather than dispatchable responses to fluctuations in power demand.
  • Low utilization and variability are underemphasized or omitted by those promoting wind and solar plant in the media and often in discussions of public policy, and no wonder! We hear a great deal about “additions to capacity”, which overstate the actual power-generating potential by factors of three to four times. Here is a typical example.
  • Wind and solar power are far more heavily subsidized than fossil fuels. This is true in absolute terms and especially on the basis of actual power output, which reveals their overwhelmingly uneconomic nature. From the link above, here are Mitch Rolling and Isaac Orr on this point:
    • “In 2022, wind and solar generators received three and eighteen times more subsidies per MWh, respectively, than natural gas, coal, and nuclear generators combined. Solar is the clear leader, receiving anywhere from $50 to $80 per MWh over the last five years, whereas wind is a distant second at $8 to $10 per MWh …. Renewable energy sources like wind and solar are largely dependent on these subsidies, which have been ongoing for 30 years with no end in sight.”
  • The first-order burden of subsidies falls on taxpayers. The second-order burdens manifest in an unstable grid and higher power costs. But just to be clear, subsidies are paid by governments to producers or consumers to reduce the cost of activity favored by policymakers. However, the International Monetary Fund frequently cites “subsidy” figures that include staff estimates of unaddressed externalities. These are based on highly-simplified models and subject to great uncertainty, of course, especially when dollar values are assigned to categories like “climate change”. Despite what alarmists would have us believe, the extent and consequences of climate change are not settled scientific issues, let alone the dollar cost.
  • Wind and solar power are extremely land- and/or sea-intensive. For example, Casey Handmer estimates that a one-Gigawatt data center, if powered by solar panels, would need a footprint of 20,000 acres. 
  • Solar installations are associated with a significant heat island effect: “We found temperatures over a PV plant were regularly 3–4 °C warmer than wildlands at night….”
  • Wind and solar power both represent major hazards to wildlife both during and after construction.
    • In addition to the destruction of habitat both on- and offshore, turbine blades create noise, electromagnetism, and migration barriers. Wind farms have been associated with significant bird and bat fatalities. Collisions with moving blades are one thing, but changes to the winds and air pressure around turbines are also a danger to avian species.
    • There is a strong likelihood that offshore wind development is endangering whales and dolphins.
    • Solar farms present dangers to waterfowl. These creatures are tricked into diving toward what they believe to be bodies of water, only to crash into the panels.
  • The production of wind and solar equipment requires the intensive use of scarce resources, including environmentally-sensitive materials. Extracting these materials often requires the excavation of massive amounts of rock subject to extensive processing. Mining and processing rely heavily on diesel fuel. Net zero? No.
  • Wind and solar facilities often present major threats of toxicity at disposal, or even sooner. A recent hail storm in Texas literally destroyed a solar farm, and the smashed panels have prompted concerns not only about solar “sustainability”, but also that harsh chemicals may be leaking into the local environment.
  • The transmission of power is costly, but that cost is magnified by the broad spatial distribution of wind and solar generating units. Transmission from offshore facilities is particularly complex. And high voltage lines run into tremendous local opposition and regulatory scrutiny.
  • When wind turbines and solar panels are idle, so are the transmission facilities needed to reach them. Thus, low utilization and the variability of those units drives up the capital needed for power and power transmission.
  • There is also an acute shortage of transformers, which presents a major bottleneck to grid development and stability.
  • While zero carbon is the ostensible goal, zero carbon nuclear power has been neglected by our industrial planners. That neglect plays off exaggerated fears about safety. Fortunately, there is a growing realization that nuclear power may be the surest way to carbon reductions while meeting growth in power demand. In fact, new data centers will go off-grid with their own modular reactors.
  • At the Shriver link, he notes the smothering nature of power regulation, which obstructs the objective of providing reliable power and any hope of achieving net zero.
  • The Biden administration has resisted the substitution of low CO2 emitting power sources for high CO2 emitting sources. For example, natural gas is more energy efficient in a variety of applications than other fuel sources. Yet policymakers seem determined to discourage the production and use of natural gas.
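The land-intensity point above lends itself to quick arithmetic. Here’s a sketch converting Handmer’s estimate (roughly 20,000 acres of solar per 1 GW data center) into square miles, at 640 acres per square mile:

```python
# Sketch of the land-use arithmetic behind the Handmer estimate quoted above:
# ~20,000 acres of solar panels per 1 GW data center, converted to square miles.
ACRES_PER_SQ_MILE = 640

def solar_footprint(data_center_gw, acres_per_gw=20_000):
    """Return (acres, square miles) of solar needed for a given load in GW."""
    acres = data_center_gw * acres_per_gw
    return acres, acres / ACRES_PER_SQ_MILE

acres, sq_mi = solar_footprint(1.0)
print(f"{acres:,.0f} acres, about {sq_mi:.1f} square miles")
```

That’s more than 31 square miles of panels for a single gigawatt-scale data center.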

False Premises

Wind and solar energy are touted by the federal government as zero carbon and low-cost technologies, but both claims are false. Extracting the needed resources, fabricating, installing, connecting, and ultimately disposing of these facilities is high in carbon emissions.

The claim that wind and solar have a cost advantage over traditional power sources is based on misleading comparisons. First, putting claims about the cost of carbon aside, it goes without saying that the cost of replacing already operational coal or natural gas generating capacity with new wind and solar facilities is greater than doing nothing.

The hope among net zero advocates is that existing fossil fuel generating plant can be decommissioned as more renewables come on-line. Again, this thinking ignores the variable nature of renewable power. Dispatchable backup power is required to reliably meet power demand. Otherwise, fluctuating power supplies undermine the economy’s productive capacity, leading to declines in output, income, health, and well being. That is costly, but so is maintaining and adding back-up capacity. Costs of wind and solar should account for this necessity. It implies that wind and solar generating units carry a high cost at the margin.

Imposed Costs

A “grid report card” from the Mackinac Center for Public Policy notes the conceptual flaw in comparing the levelized cost (à la Lazard) of a variable resource with one capable of steady and dispatchable performance. From the report, here is the crux of the imposed-cost problem:

“… the more renewable generation facilities you build, the more it costs the system to make up for their variability, and the less value they provide to electricity markets.”

A commitment to variable wind and solar power along with back-up capacity also implies that some capital will be idle regardless of wind and solar conditions. This is part of the imposed cost of wind and solar built into the accounting below. But while back-up power facilities will have idle periods, they are dispatchable and serve an insurance function, so they have value even when idle in preserving the stability of the grid. For that matter, sole reliance on dispatchable power sources requires excess capacity to serve an insurance function of a similar kind.

The Mackinac report card uses estimates of imposed cost from the Institute for Energy Research to construct the following comparison:

The figures shown in this table are somewhat dated, but the Mackinac authors use updated costs for Michigan from the Center of the American Experiment. These are shown below in terms of average costs per MWh through 2050, but the labels require some additional explanation.

The two bars on the left show costs for existing coal ($33/MWh) and gas-powered ($22) plants. The third and fourth bars are for new wind ($180) and solar ($278) installations. The fifth and sixth bars are for new nuclear reactors (a light water reactor ($74) and a small modular reactor ($185)). Finally, the last two bars are for a new coal plant ($106) and a natural gas plant ($64), both with carbon capture and storage (CCS). It’s no surprise that existing coal and gas facilities are the most cost effective. Natural gas is by far the least costly of the new installations, followed by the light water reactor and coal.

The Mackinac “report card” is instructive in several ways. It provides a detailed analysis of different types of power generation across five dimensions, including reliability, cost, cleanliness, and market feasibility (the latter because some types of power (hydro, geothermal) have geographic limits). Natural gas comes out the clear winner on the report card because it is plentiful, energy dense, dispatchable, clean burning, and low-cost.

Supporting Growth

Growth in the demand for power cannot be met with variable resources without dispatchable backup or intolerable service interruptions. Unreliable power would seriously undermine the case for EVs, which is already tenuous at best. Data centers and other large users will go off-grid before they stand for it. This would represent a flat-out market rejection of renewable investments, ESGs be damned!

Casey Handmer makes some interesting projections of the power requirements of data centers supporting not just AI, but AGI, which he discusses in “How To Feed the AIs”. Here is his darkly humorous closing paragraph, predicated on meeting power demands from AGI via solar:

“It seems that AGI will create an irresistibly strong economic forcing function to pave the entire world with solar panels – including the oceans. We should probably think about how we want this to play out. At current rates of progress, we have about 20 years before paving is complete.”

Resource Constraints

Efforts to force a transition to wind and solar power will lead to more dramatic cost disadvantages than shown in the Mackinac report. By “forcing” a transition, I mean aggressive policies of mandates and subsidies favoring these renewables. These policies would effectuate a gross misallocation of resources. Many of the commodities needed to fabricate the components of wind and solar installations are already quite scarce, particularly on the domestic U.S. front. Inflating the demand for these commodities will result in shortages and escalating costs, magnifying the disadvantages of wind and solar power in real economic terms.

To put a finer point on the infeasibility of the net zero effort, Simon P. Michaux produced a comparative analysis in 2022 of the existing power mix versus a hypothetical power mix of renewable energy sources performing an equal amount of work, but at net-zero carbon emissions (the link is a PowerPoint summary). In the renewable energy scenario, he calculated the total quantities of various resources needed to achieve the objective over one generation of the “new” grid (to last 20–30 years). He then calculated the numbers of years of mining or extraction needed to produce those quantities based on 2019 rates of production. Take a look at the results in the right-most column:
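Michaux's headline numbers boil down to simple division: the total quantity of each commodity required, divided by the 2019 annual production rate, gives the years of mining needed. Here is a minimal sketch of that arithmetic; the quantities below are hypothetical placeholders, not Michaux's actual estimates:

```python
# Years of mining needed = total quantity required / annual production rate.
# All figures below are hypothetical placeholders, NOT Michaux's estimates.

required_tonnes = {     # total tonnes needed for one generation of the grid
    "copper": 4.5e9,
    "lithium": 9.4e8,
    "nickel": 9.4e8,
}
production_2019 = {     # tonnes mined per year at 2019 rates
    "copper": 2.4e7,
    "lithium": 9.5e4,
    "nickel": 2.4e6,
}

for metal in required_tonnes:
    years = required_tonnes[metal] / production_2019[metal]
    print(f"{metal}: {years:,.0f} years of mining at 2019 rates")
```

Even with generous placeholder assumptions, the point of the exercise survives: when the numerator is measured in billions of tonnes and the denominator in millions, the quotient is measured in centuries.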

Those are sobering numbers. Granted, they are based on 2019 wind and solar technology. However, it’s clear that phasing out fossil fuels using today’s wind and solar technology would be out of the question within the lifetime of anyone currently living on the planet. Michaux seems to have a talent for understatement:

“Current thinking has seriously underestimated the scale of the task ahead.”

He also emphasizes the upward price pressure we’re likely to witness in the years ahead across a range of commodities.

Technological Breakthroughs

Michaux’s analysis assumes static technology, but there may come a time in the not-too-distant future when advances in wind and solar power and battery storage allow them to compete with hydrocarbons and nuclear power on a true economic basis. The best way to enable real energy breakthroughs is through market-driven economic growth. Energy production and growth is hampered, however, when governments strong-arm taxpayers, electricity buyers, and traditional energy producers while rewarding renewable developers with subsidies.

We know that improvements will come across a range of technologies. We’ve already seen reductions in the costs of solar panels themselves. Battery technology has a long way to go, but it has improved and might some day be capable of substantial smoothing in the delivery of renewable power. Collection of solar power in space is another possibility, as the feasibility of beaming power to earth has been demonstrated. This solution might also have advantages in terms of transmission depending on the locations and dispersion of collection points on earth, and it would certainly be less land intensive than solar power is today. Carbon capture and carbon conversion are advancing technologies, making net zero a more feasible possibility for traditional sources of power. Nuclear power is zero carbon, but like almost everything else, constructing plants is not. Nevertheless, fission reactors have made great strides in terms of safety and efficiency. Nuclear fusion development is still in its infancy, but there have been notable advances of late.

Some or all of these technologies will experience breakthroughs that could lead to a true, zero-carbon energy future. The timeline is highly uncertain, but it’s likely to be faster than anything like the estimates in Michaux’s analysis. Who knows? Perhaps AI will help lead us to the answers.

A Presumed Elephant

This post and my previous post have emphasized two glaring instances of government failure on their own terms: a headlong plunge into unreliable renewable energy, and forced electrification done prematurely and wrong. Some would protest that I left out the veritable “elephant in the room”: the presumed external or spillover costs associated with CO2 emissions from burning fossil fuels. Renewables and electrification are both intended to prevent those costs.

External costs were not ignored, of course. Externalities were discussed explicitly in several different contexts such as the mining of new materials, EV tire wear, the substitution of “cleaner” fuels for others, toxicity at disposal, and the exaggerated reductions in CO2 from EVs when the “long tailpipe” problem is ignored. However, I noted explicitly that estimates of unaddressed externalities are often highly speculative and uncertain, and especially the costs of CO2 emissions. They should not be included in comparisons of subsidies.

Therefore, the costs of various power generating technologies shown above do not account for estimates of externalities. If you’re inclined, other SCC posts on the CO2 “elephant” can be found here.

Conclusion

Power demand is expected to soar given the coming explosion in AI applications, and especially if the heavily-subsidized and mandated transition to EVs comes to pass. But that growth in demand will not and cannot be met by relying on renewable energy sources. Their variability implies substantial idle capacity, higher costs, and service interruptions. Such a massive deployment of idle capital would represent an enormous waste of resources, but the sad fact is it’s been underway for some time.
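The idle-capacity arithmetic is straightforward: to serve a given average load, nameplate capacity must be scaled up by the inverse of the capacity factor. A quick sketch, using capacity factors in line with the U.S. figures cited earlier (the 1,000 MW load is a hypothetical round number):

```python
# To deliver a given average load, nameplate capacity must equal
# load / capacity_factor, leaving (1 - capacity_factor) idle on average.
# Capacity factors approximate published 2022 U.S. averages; the load
# figure is hypothetical.

avg_load_mw = 1000.0

capacity_factors = {
    "nuclear": 0.93,
    "natural gas": 0.66,
    "wind": 0.36,
    "solar": 0.24,
}

for source, cf in capacity_factors.items():
    nameplate_mw = avg_load_mw / cf   # capacity required to serve the load
    idle_share = 1 - cf               # average share of nameplate sitting idle
    print(f"{source}: {nameplate_mw:,.0f} MW nameplate, "
          f"{idle_share:.0%} idle on average")
```

Solar requires more than four times the nameplate capacity of nuclear to deliver the same average output, with roughly three-quarters of that capital dormant at any given time, and none of it dispatchable on demand.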

In the years ahead, the net-zero objective will prove representative of a bumbling effort at industrial planning. Costs will be driven higher, including the cost inflicted by outages and environmental damage. Ratepayers, taxpayers, and innocents will share these burdens. Travis Fisher is spot on when he says the grid is becoming a “dangerous liability” thanks to wounds inflicted by subsidies, regulations, and mandates.

As Charles Glasser put it on Instapundit:

“The National Electrical Grid is teetering on collapse. The shift away from full-time available power (like fossil fuels, LNG, etc.) to so-called ‘green’ sources has deeply impacted reliability.”

“Also, as more whale-killing off-shore wind farms are planned, the Biden administration forgot to plan for the thousands of miles of transmission lines that will be needed. And in a perfect example of leftist autophagy, there is considerable opposition from enviro-groups who will tie up the construction of wind farms and transmission lines in court for decades.”

Meanwhile, better alternatives to wind and solar have been routinely discouraged. The substantial reductions in carbon emissions achieved in the U.S. over the past 15 years were caused primarily by the substitution of natural gas for coal in power generation. Much more of that is possible. The Biden Administration, however, wishes to prevent that substitution in favor of greater reliance on high-cost, unreliable renewables. And the Administration wishes to do so without adequately backing up those variable power sources with dispatchable capacity. Likewise, nuclear power has been shunted aside, despite its safety, low risk, and dispatchability. However, there are signs of progress in attitudes toward bringing more nuclear power on-line.

Industrial policy usually meets with failure, and net zero via wind and solar power will be no exception. Like forced electrification, unreliable power fails on its own terms. Net zero ain’t gonna happen any time soon, and not even by 2050. That is, it won’t happen unless net zero is faked through mechanisms like fraudulent carbon credits (and there might not be adequate faking capacity for that!). Full-scale net-zero investment in wind and solar power, battery capacity, and incremental transmission facilities will drive the cost of power upward, undermining economic growth. Finally, wind and solar are not the environmental panacea so often promised. Quite the contrary: mining of the necessary minerals, component fabrication, installation, and even operation all have negative environmental impacts. Disposal at the end of their useful lives might be even worse. And the presumed environmental gains (reduced atmospheric carbon concentrations and lower temperatures) are more scare story than science.

Postscript: here’s where climate alarmism has left us, and this is from a candidate for the U.S. Senate (she deleted the tweet after an avalanche of well-deserved ridicule):

Lords of the Planetary Commons Insist We Banish Sovereignty, Growth

29 Thursday Feb 2024

Posted by Nuetzel in Central Planning, Environmental Fascism, Global Warming, Liberty

≈ Leave a comment

Tags

Anthropocene, Beamed Solar Power, Carbon Capture, Carbon Forcings, Cliff Mass, Common Pool Resources, Elinor Ostrom, Externalities, Fusion Power, Geoengineering, Geothermal Power, global warming, Heat Islands, Interspecies Justice, IPCC, Lula Da Silva, Munger Test, Nuclear power, Orbital Solar Collection, Paris Climate Accords, Planetary Commons, Polycentrism, Private Goods, Property Rights, Public goods, Redistribution, Solar Irradiance, Spillovers, Tipping points

We all share Planet Earth as our home, so there’s a strong sense in which it qualifies as a “commons”. That’s one sensible premise of a new paper entitled “The planetary commons: A new paradigm for safeguarding Earth-regulating systems in the Anthropocene”. The title is a long way of saying that the authors desire broad-based environmental regulation, and that’s what ultimately comes across.

First, a preliminary issue: many resources qualify as commons in the very broadest sense, yet free societies have learned over time that many resources are used much more productively when property rights are assigned to individuals. For example, modern agriculture owes much to defining exclusive property rights to land so that conflicting interests don’t have to compete (e.g., the farmer and the cowman). Federal land is treated as a commons, however. There is a rich history on the establishment of property rights, but within limits, the legal framework in place can define whether a resource is treated as a commons, a club good, or private property. The point here is that there are substantial economic advantages to preserving strong property rights, rather than treating all resources as communal.

The authors of the planetary commons (PC) paper present a rough sketch for governance over use of the planet’s resources, given their belief that a planetary crisis is unfolding before our eyes. The paper has two main thrusts as I see it. One is to broadly redefine virtually all physical resources as common pool interests because their use, in the authors’ view, may entail some degree of external cost involving degradation of the biosphere. The second is to propose centralized, “planetary” rule-making over the amounts and ways in which those resources are used.

It’s an Opinion Piece

The PC paper is billed as the work product of a “collaborative team of 22 leading international researchers”. This group includes four attorneys (one of whom was a lead author) and one philosopher. Climate impact researchers are represented, who undoubtedly helped shape assumptions about climate change and its causes that drive the PC’s theses. (More on those assumptions in a section below.) There are a few social scientists of various stripes among the credited authors, one meteorologist, and a few “sustainability”, “resilience”, and health researchers. It’s quite a collection of signees, er… “research collaborators”.

Grabby Interventionists

The reasoning underlying a “planetary commons” (PC) is that the planet’s biosphere qualifies as a commons. The biosphere must include virtually any public good like air and sunshine, any common good like waterways, or any private good or club good. After all, any object can play host to tiny microbes regardless of ownership status. So the PC authors’ characterization of the planet’s biosphere as a commons is quite broad in terms of conventional notions of resource attributes.

We usually think of spillover or external costs as arising from some use of a private resource that imposes costs on others, such as air or water pollution. However, mere survival requires that mankind exploit both public and non-public resources, acts that can always be said to impact the biosphere in some way. Efforts to secure shelter, food, and water all impinge on the earth’s resources. To some extent, mankind must use and shape the biosphere to succeed, and it’s our natural prerogative to do so, just like any other creature in the food chain.

Even if we are to accept the PC paper’s premise that the entire biosphere should be treated as a commons, most spillovers are de minimis. From a public policy perspective, it makes little sense to attempt to govern over such minor externalities. Monitoring behavior would be costly, if not impossible, at such an atomistic level. Instead, free and civil societies rely on a high degree of self-governance and informal enforcement of ethical standards to keep small harms to a minimum.

Unfortunately, the identification and quantification of meaningful spillover costs is not always clear-cut. This has led to an increasingly complex regulatory environment, an increasingly litigious business environment, and efforts by policymakers to manage the detailed inputs and outputs of the industrial economy.

All of that is costly in its own right, especially because the activities giving rise to those spillovers often enable large welfare enhancements. Regulators and planners face great difficulties in estimating the costs and benefits of various “correctives”. The very undertaking creates risk that often exceeds the cost of the original spillover. Nevertheless, the PC paper expands on the murkiest aspects of spillover governance by including “… all critical biophysical Earth-regulating systems and their functions, irrespective of where they are located…” as part of a commons requiring “… additional governance arrangements….”

Adoption of the PC framework would authorize global interventions (and ultimately local interventions, including surveillance) on a massive scale based on guesswork by bureaucrats regarding the evolution of the biosphere.

Ostrom Upside Down

Not only would the PC framework represent an expansion of the grounds for intervention by public authorities, it seeks to establish international authority for intervention into public and private affairs within sovereign states. The authors attempt to rationalize such far-reaching intrusions in a rather curious way:

“Drawing on the legacy of Elinor Ostrom’s foundational research, which validated the need for and effectiveness of polycentric approaches to commons governance (e.g., ref. 35, p. 528, ref. 36, p. 1910), we propose that a nested Earth system governance approach be followed, which will entail the creation of additional governance arrangements for those planetary commons that are not yet adequately governed.”

Anyone having a passing familiarity with Elinor Ostrom’s work knows that she focused on the identification of collaborative solutions to common goods problems. She studied voluntary and often strictly private efforts among groups or communities to conserve common pool resources, as opposed to state-imposed solutions. Ostrom accepted assigned rights and pricing solutions to managing common resources, but she counseled against sole reliance on market-based tools.

Surely the PC authors know they aren’t exactly channeling Ostrom:

“An earth system governance approach will require an overarching global institution that is responsible for the entire Earth system, built around high-level principles and broad oversight and reporting provisions. This institution would serve as a universal point of aggregation for the governance of individual planetary commons, where oversight and monitoring of all commons come together, including annual reporting on the state of the planetary commons.”

Polycentricity was used by Ostrom to describe the involvement of different, overlapping “centers of authority”, such as individual consumers and producers, cooperatives formed among consumers and producers, other community organizations, local jurisdictions, and even state or federal regulators. Some of these centers of authority supersede others in various ways. For example, solutions developed by cooperatives or lower centers of authority must align with the legal framework within various government jurisdictions. However, as David Henderson has noted, Ostrom observed that management of pooled resources at lower levels of authority was generally superior to centralized control. Henderson quotes Ostrom and a co-author on this point:

“When users are genuinely engaged in decisions regarding rules affecting their use, the likelihood of them following the rules and monitoring others is much greater than when an authority simply imposes rules.”

The authors of the PC have something else in mind, and they bastardize the spirit of Ostrom’s legacy in the process. For example, the next sentence is critical for understanding the authors’ intent:

“If excessive emissions and harmful activities in some countries affect planetary commons in other areas—for example, the melting of polar ice—strong political and legal restrictions for such localized activities would be needed.”

Of course, there are obvious difficulties in measuring impacts of various actions on polar ice, assigning responsibility, and determining the appropriate “restrictions”. But in essence, the PC paper advocates for a top-down model of governance. Polycentrism is thus reduced to “you do as we say”, which is not in the spirit of Ostrom’s research.

Planetary Governance

Transcending national sovereignty on questions of the biosphere is key to the authors’ ambitions. At a bare minimum, the authors desire legally-binding commitments to international agreements on environmental governance, unlike the unenforceable promises made for the Paris Climate Accords:

“At present, the United Nations General Assembly, or a more specialized body mandated by the Assembly, could be the starting point for such an overarching body, even though the General Assembly, with its state-based approach that grants equal voting rights to both large countries and micronations, represents outdated traditions of an old European political order.”

But the votes of various “micronations” count for zilch when it comes to real “claims” on the resources of other sovereign nations! Otherwise, there is nothing “voluntary” about the regime proposed in the PC paper.

“A challenge for such regimes is to duly adapt and adjust notions of state sovereignty and self-determination, and to define obligations and reciprocal support and compensation schemes to ensure protection of the Earth system, while including comprehensive stewardship obligations and mandates aimed at protecting Earth-regulating systems in a just and inclusive way.”

So there! The way forward is to adopt the broadest possible definition of market failure and global regulation of any and all private activity touching on nature in any way. And note here a similarity to the Paris Accords: achieving commitments would fall to national governments whose elites often demonstrate a preference for top-down solutions.

Ah Yes, Redistribution

It should be apparent by now that the PC paper follows a now well-established tradition in multi-national climate “negotiations” to serve as subterfuge for redistribution (which, incidentally, includes the achievement of interspecies justice):

“For instance, a more equal sharing of the burdens of climate stabilization would require significant multilateral financial and technology transfers in order not to harm the poorest globally (116).”

The authors insist that participation in this governance would be “voluntary”, but the following sentence seems inconsistent with that assurance:

“… considering that any move to strengthen planetary commons governance would likely be voluntarily entered into, the burdens of conservation must be shared fairly (115).”

Wait, what? “Voluntary” at what level? Who defines “fairness”? The authors approvingly offer this paraphrase of the words of Brazilian President Lula da Silva:

“… who affirmed the Amazon rainforest as a collective responsibility which Brazil is committed to protect on behalf of all citizens around the world, and that deserves and justifies compensation from other nations (117).”

Let Them Eat Cake

Furthermore, PC would require de-growth and so-called “sufficiency” for thee (i.e., be happy with less), if not for those who’ll design and administer the regime.

“… new principles that align with novel Anthropocene dynamics and that could reverse the path-dependent course of current governance. These new principles are captured under a new legal paradigm designed for the Anthropocene called earth system law and include, among others, the principles of differentiated degrowth and sufficiency, the principle of interconnectivity, and a new planetary ethic (e.g., principle of ecological sustainability) (134).”

If we’re to take the PC super-regulators at their word, the regulatory regime would impinge on fertility decisions as well. Just who might we trust to govern humanity thusly? If we’re wise enough to apply the Munger Test, we wouldn’t grant that kind of power to our worst enemy!

Global Warmism

The underlying premise of the PC proposal is that a global crisis is now unfolding before our eyes: anthropogenic global warming (AGW). The authors maintain that emissions of carbon dioxide are the cause of rising temperatures, rapidly rising sea levels, more violent weather, and other imminent disasters.

“It is now well established that human actions have pushed the Earth outside of the window of favorable environmental conditions experienced during the Holocene…”

“Earth system science now shows that there are biophysical limits to what existing organized human political, economic, and other social systems can appropriate from the planet.”

For a variety of reasons, both of these claims are more dubious than one might suppose based on popular narratives. As for the second of these, mankind’s limitless capacity for innovation is a more powerful force for sustainability than the authors would seem to allow. On the first claim, it’s important to note that the PC paper’s forebodings are primarily based on modeled, prospective outcomes, not historical data. The models are drastically oversimplified representations of the earth’s climate dynamics driven by exogenous carbon forcing assumptions. Their outputs have proven to be highly unreliable, overestimating warming trends almost without exception. These models exaggerate climate sensitivity to carbon forcings, and they largely ignore powerful natural forcings such as variations in solar irradiance, geological heating, and even geological carbon forcings. The models are also notorious for their inadequate treatment of feedback effects from cloud cover. Their predictions of key variables like water vapor are wildly in error.

The measurement of the so-called “global temperature” is itself subject to tremendous uncertainty. Weather stations come and go. They are distributed very unevenly across land masses, and measurement at sea is even sketchier. Averaging all these temperatures would be problematic even if there were no other issues… but there are. Individual stations are often sited poorly, including distortions from heat island effects. Aging of equipment creates a systematic upward bias, but correcting for that bias (via so-called homogenization) causes a “cooling the past” bias. It’s also instructive to note that the increase in global temperature from pre-industrial times actually began about 80 years prior to the onset of more intense carbon emissions in the 20th century.

Climate alarmists often speak in terms of temperature anomalies, rather than temperature levels. In other words, to what extent do temperatures differ from long-term averages? The magnitude of these anomalies, using the past several decades as a base, tends to be anywhere from zero degrees to well above one degree Celsius, depending on the year. Relative to temperature levels, the anomalies are a small fraction. Given the uncertainty in temperature levels, the anomalies themselves are dwarfed by the noise in the original series!
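To make the anomaly-versus-level point concrete, here is a toy calculation; every figure below is hypothetical and chosen for illustration, not drawn from actual climate data:

```python
# Toy illustration of an anomaly relative to the underlying temperature
# level. All figures are hypothetical, not actual climate data.

baseline_temp_c = 14.0    # assumed long-term global mean level, deg C
observed_temp_c = 15.1    # assumed observed global mean for some year

anomaly_c = observed_temp_c - baseline_temp_c   # the reported "anomaly"

# The anomaly is a small fraction of the level from which it's computed...
share_of_level = anomaly_c / observed_temp_c

# ...so its meaning depends on the uncertainty in that level.
level_uncertainty_c = 0.5   # assumed uncertainty in the absolute level

print(f"anomaly: {anomaly_c:.1f} C")
print(f"anomaly as share of level: {share_of_level:.1%}")
print(f"anomaly / level uncertainty: {anomaly_c / level_uncertainty_c:.1f}x")
```

The sketch simply shows the mechanics: the anomaly is a difference of two large, uncertain numbers, so its interpretation hinges on how well the underlying levels are actually measured.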

Pick Your Own Tipping Point

It seems that “tipping point” scares are heavily in vogue at the moment, and the PC proposal asks us to quaff deeply of these narratives. Everything is said to be at a tipping point into irrecoverable disaster that can be forestalled only by reforms to mankind’s unsustainable ways. To speak of the possibility of other causal forces would be a sacrilege. There are supposed tipping points for the global climate itself as well as tipping points for the polar ice sheets, the world’s forests, sea levels and coastal environments, severe weather, and wildlife populations. But none of this is based on objective science.

For example, the 1.5 degree limit on global warming is a wholly arbitrary figure invented by the IPCC for the Paris Climate Accords, yet the authors of the PC proposal would have us believe that it was some sort of scientific determination. And it does not represent a tipping point. Cliff Mass explains that climate models do not behave as if irreversible tipping points exist.

Consider also that there has been absolutely no increase in the frequency or intensity of severe weather.

Likewise, the rise of sea levels has not accelerated from prior trends, so it has nothing to do with carbon forcing.

One thing carbon forcings have accomplished is a significant greening of the planet, which if anything bodes well for the biosphere.

What about the disappearance of the polar ice sheets? On this point, Cliff Mass quotes Chapter 3 of the IPCC’s Special Report on the implications of 1.5C or more warming:

“there is little evidence for a tipping point in the transition from perennial to seasonal ice cover. No evidence has been found for irreversibility or tipping points, suggesting that year-round sea ice will return given a suitable climate.”

The PC paper also attempts to connect global warming to increases in forest fires, but that’s incorrect: there has been no increasing trend in forest fires or annual burned acreage. If anything, trends in measures of forest fire activity have been negative over the past 80 years.

Concluding Thoughts

The alarmist propaganda contained in the PC proposal is intended to convince opinion leaders and the public that they’d better get on board with draconian and coercive steps to curtail economic activity. They appeal to the sense of virtue that must always accompany consent to authoritarian action, and that means vouching for sacrifice in the interests of environmental and climate equity. All the while, the authors hide behind a misleading version of Elinor Ostrom’s insights into the voluntary and cooperative husbandry of common pool resources.

One day we’ll be able to produce enough carbon-free energy to accommodate high standards of living worldwide and growth beyond that point. In fact, we already possess the technological know-how to substantially reduce our reliance on fossil fuels, but we lack the political will to avail ourselves of nuclear energy. With any luck, that will soften with installations of modular nuclear units.

Ultimately, we’ll see advances in fusion technology, beamed non-intermittent solar power from orbital collection platforms, advances in geothermal power, and effective carbon capture. Developing these technologies and implementing them at global scales will require massive investments that can be made possible only through economic growth, even if that means additional carbon emissions in the interim. We must unleash the private sector to conduct research and development without the meddling and clumsy efforts at top-down planning that typify governmental efforts (including an end to mandates, subsidies, and taxes). We must also reject ill-advised attempts at geoengineered cooling that are seemingly flying under the regulatory radar. Meanwhile, let’s save ourselves a lot of trouble by dismissing the interventionists in the planetary commons crowd.

Canadian Wildfires, Smoky Days Are Recurring Events

11 Sunday Jun 2023

Posted by Nuetzel in Forest Fires, Global Warming, Wildfires

≈ Leave a comment

Tags

Anthropomorphic Global Warming, Boreal Forests, Canadian Fires, Climate Change, Dark Days, David Marcus, Edward Struzik, Fire Suppression, Forest Management, Prescribed Burns, Québec Fires, rent seeking, Wildfires

Smoke from this spring’s terrible forest fires in Canada has fouled the air in much of the country and blown into the northeastern U.S. and mid-Atlantic coastal states. If the fires continue at this pace over the rest of the fire season, they would break Canadian records for both the number of fires and the area burned.

Large wildfires with smoky conditions occur in these regions from time to time, and it’s not unusual for fires to ignite in the late spring. The article shown above appeared in the New York Tribune on June 5, 1903. Other “dark day” episodes were recorded in New England in 1706, 1732, 1780, 1814, 1819, 1836, 1881, 1894, and 1903, and several times in the 20th century. I list early years specifically because they preceded by decades (even centuries) the era of supposed anthropogenic global warming, now euphemistically known as “climate change”.

More recently, however, in the past 10 years, Quebec experienced relatively few wildfires. That left plenty of tinder in the boreal forests with highly flammable, sappy trees. In May, a spell of sunshine helped dry the brush in the Canadian forests. Then lightning and human carelessness sparked the fires, along with multiple instances of arson, some perpetrated by climate change activists.

On top of all that, poor forest management contributed to the conflagrations. So-called fire suppression techniques have done more harm than good over the years, as I’ve discussed on this blog in the past. David Marcus emphasizes the point:

“For years, Canadian parks officials have been warning that their country does not do enough to cull its forests and now we’re witnessing the catastrophic results.

It’s simple really. Edward Struzik, author of ‘Dark Days at Noon, The Future of Fire’ lays it out well.

‘We have been suppressing fires for so many decades in North America that we have forests that are older than they should be,’ he said. …

‘Prescribed burns are one of the best ways to mitigate the wildfire threat,’ he added.”

Nevertheless, the media are eager to blame climate change for any calamity. That’s one part simple naïveté on the part of young journalists, fresh off the turnip truck as it were, with little knowledge or inclination to understand the history and causes of underlying forest conditions. But many seasoned reporters are all too ready to support the climate change narrative as well. There’s also an element of calculated political misinformation in these claims, abetted by those seeking rents from government climate policies.

Wildfires are as old as time, and absent good forest management practices, they are necessary for forest renewal. Agitation to sow climate panic based on wildfires is highly unscrupulous. There is no emergency except for the need to reform forest management, reduce the fuel load, and more generally, put an end to the waste of resources inherent in government climate change initiatives.

See this tweet! Hmmm.

Relax: Natural Variability Causes Heatwaves

30 Saturday Jul 2022

Posted by Nuetzel in Global Warming

≈ 2 Comments

Tags

Al Gore, Anthony Watts, Build Back Better, Cliff Mass, Climate Emergency, CO2, Emergency Powers, Forest Management, Greenhouse Gases, Heat Index, Heatwaves, Joe Biden, National Oceanic and Atmospheric Administration, NOAA, Urban Heat Island, Wildfires

Lately almost any passing weather phenomenon is said to have been rooted in climate change and higher carbon concentrations. The recent heatwaves that seared parts of Europe and the U.S. are no exception, and climate change activists always find heat spells ripe for rhetorical exploitation. But while these would-be Cassandras and Gretas push their fearful narrative, there are strong reasons to doubt that these weather events are any cause for alarm. This summer’s heat waves, like all others, were of limited geographic scope, and they certainly weren’t the most severe heat waves on record in terms of either duration or magnitude. More on that below.

Data Problems

Temperature measurements tend to be exaggerated these days because so many “official” temperature records come from local airports or other urban sites rich in impervious cover and heat absorbing building materials. This gives rise to the so-called “urban heat island effect”, which refers to the elevated temperatures measured in urban versus rural areas. It’s even worse than that, however, as the vast majority of active weather stations in the U.S. are sited at “hot spots”, and many of them are poorly maintained. Data problems plague European temperature records as well.

Furthermore, official temperature records are extremely short on climatological scales, going back only about 150 years in the U.S. And these records have been “adjusted” by weather authorities like the National Oceanic and Atmospheric Administration (NOAA), usually with the early records “cooled” relative to more recent readings. That means the long-term trend in temperatures is biased upward.

Climate Catastrophists

Nevertheless, Joe Biden has been threatening to declare a wholly unjustified “climate emergency”, perhaps thinking these dog days are the perfect time to assume a host of new emergency powers. It’s unclear whether the new “Build Back” bill making its way through Congress will be enough to satisfy the appetite of Biden’s handlers for costly and ultimately ineffective climate measures.

It’s tempting to think delirium from the heat waves is what prompted Al Gore to compare climate change skeptics to the dithering police officers in Uvalde, TX, but Gore’s fever is nothing new. We’re still waiting for the world to end, which he once predicted would occur by 2016.

Even weather reporters on TV are breathless in their descriptions of the heatwaves. They’ve certainly become dramatists for the climate-change cause. And people love good scare stories. It gives them an excuse to polish up their pitchforks! Or to be lazy and stay inside. It’s telling that so many people now quote heat index values (which combine heat and humidity), rather than actual temperatures, in the warm summer months. After all, it’s more thrilling to say it’s 105 outside than it is to say 95.

Anyway, compare the paired maps in each of the graphics below (here are links to sources for the first and second):

The temperatures are comparable, but the use of RED colors on the 2022 maps is so much more frightening! This post from Anthony Watts provides a list of links to news sources taking alarmist perspectives on the heatwaves in the U.S. and Europe, and falsely attributing the heatwaves to CO2.

Same Old High Pressure Domes

Cliff Mass offers a bone to the climate change community. He thinks perhaps 5% – 10% of the recent temperature anomaly in the UK is attributable to greenhouse gases. An effect of that magnitude is hardly worthy of government action, let alone panic. Mass says:

“Natural variability of the atmosphere was the proximate cause of the warmth and does not represent an existential threat to the population of Europe.”

The heat wave phenomenon is typical of slow-moving high-pressure systems that often develop during the summer months. These domes of high pressure vary in temperature and geographic breadth, and they are sandwiched between or adjacent to low-pressure systems with cooler temperatures. That’s been the case in both Europe and the U.S. during this summer’s heat waves, as illustrated by the following graphics. The northern hemisphere is not entirely enveloped in a heat wave.

And the rest of the globe? In the tropics (below 20 degrees latitude), June 2022 was the coolest June in 22 years, according to satellite temperature readings! Furthermore, the monthly anomaly in June was the coolest in 10 years. In the Southern Hemisphere, Australia and South America have had extremely cold winters. Antarctica had its coldest winter on record in 2021. Yet Joe Biden is under the misapprehension that we’re experiencing “a climate emergency”.

These are not the worst heat waves on record. Both the U.S. and Europe experienced higher temperatures and prolonged heat waves during the 1930s. For example, St. Louis, Missouri matched or exceeded 110 degrees four times in the 1930s, and twice in 1954, whereas the city topped out at 102 so far this year, and that was after a cool spring. There was an extreme European heat wave in 1976 that was drier and much lengthier, and others occurred in 1911 and 1906. Of course, available temperature comparisons are distorted because the early readings weren’t as impacted by urban heat islands. There are historical accounts of drastic heat waves much earlier, such as the 1500s and 1700s. Here is more heatwave history, in case you’re interested.

We’ll Be Fine

Heat isn’t the only story, of course. A wide range of other disastrous events are blamed on climate change. Wildfires are a prime example, but as we know, wildfires are not new, and the worst wildfires have more to do with poor forest management than anything else. Likewise, there is little if any association between extreme weather events and climate change. In that context, it’s also worth noting that cold weather is much deadlier than hot weather. The climate today, and going forward, presents far fewer dangers to humanity than in the past.

I did a lot of dirty, outdoor work in my youth, and it was hot! There were times just as hot as this summer, if not worse, I’d venture to say. Anyone old enough to have lived through the 1970s or even the 1950s should recognize the heatwave Chicken Littles as such.

The SEC’s Absurd Climate Overreach

04 Monday Apr 2022

Posted by Nuetzel in Central Planning, Global Warming

≈ 2 Comments

Tags

capital costs, Carbon Emissions, Carbon Forcing Models, carbon Sensitivity, central planning, Corporatism, Disclosure Requirements, ESG Risk, ESG Scores, Green Energy, Greenhouse Gas, Hester Peirce, John Cochrane, Litigation Risk, Paris Agreement, Regulatory Risk, Renewable energy, Scope 1, Scope 2, Scope 3, SEC Climate Mandate, Securities and Exchange Commission

The Securities and Exchange Commission recently issued a proposed rule for reporting on climate change risk, and it is fairly outrageous. It asks that corporations report on their own direct greenhouse gas emissions (GHG – Scope 1), the emissions caused by their purchases of energy inputs (Scope 2), and the emissions caused by their “downstream” customers and “upstream” suppliers (Scope 3). This is another front in the Biden Administration’s efforts to bankrupt producers of fossil fuels and to force the private sector to radically alter its mix of energy inputs. The SEC’s proposed “disclosures” are sheer lunacy on several levels.

The SEC Mandate

If implemented, the rule would allow the SEC to stray well outside the bounds of its regulatory authority. The SEC’s role is not to regulate emissions or the environment. Rather, as its web site makes clear, the agency is charged with:

“… protecting investors, maintaining fair, orderly, and efficient markets, and facilitating capital formation.”

Given this mission, the SEC requires management to disclose material financial risks. Are a firm’s GHG emissions really material risks? The first problem here is quite practical: John Cochrane notes the outrageous costs that would be associated with compliance:

“‘Disclosure’ usually means revealing something you know. A perfectly honest answer to ‘disclose what you know about your carbon emissions’ is, ‘we have no idea what our carbon emissions are.’ Back that up with every document the company has ever produced, and you have perfectly ‘disclosed.’ There is no asymmetric information, fraud, etc.

The SEC has already required the production of new information, and as Hester Peirce makes perfectly clear, the climate rules again make a huge dinner out of that appetizer: essentially telling companies to hire a huge number of climate consultants to generate new information, and also how to run businesses.”

In a separate post, Cochrane quotes SEC Commissioner Hester Peirce’s response to the proposed rule. She emphasizes that companies are already required to disclose all material risks. Perhaps they have properly declined to disclose climate risks because those risks are not material.

“Current SEC disclosure mandates are intended to provide investors with an accurate picture of the company’s present and prospective performance through managers’ own eyes. How are they thinking about the company? What opportunities and risks do the board and managers see? What are the material determinants of the company’s financial value?”

Identifying the Risk Causers

Regardless of the actual risks to a firm caused by climate change, the SEC’s proposed GHG disclosures put a more subtle issue into play. Peirce describes what amounts to a fundamental shift in the SEC’s philosophy regarding the motivation and purpose of disclosure:

“The proposal, by contrast, tells corporate managers how regulators, doing the bidding of an array of non-investor stakeholders, expect them to run their companies. It identifies a set of risks and opportunities—some perhaps real, others clearly theoretical—that managers should be considering and even suggests specific ways to mitigate those risks. It forces investors to view companies through the eyes of a vocal set of stakeholders, for whom a company’s climate reputation is of equal or greater importance than a company’s financial performance.”

In other words, a major risk faced by these firms has nothing to do with climate change itself, but with perceptions of “climate-related” risks by other parties. That transforms the question of climate risk into something that is, in fact, regulatory and political. Is this the true nature of the SEC’s concern, all dressed up in the scientism typically relied upon by climate change activists?

The reaction of government bureaucrats to the risks they perceive is a palpable threat to investor well-being. For example, GHG emissions might lead to future regulatory penalties from various government agencies, including fines, taxes, and mitigation mandates. In addition, with the growth of investment management based on what are essentially shambolic and ad hoc ESG scores, GHG or carbon emissions might lead to constraints on a firm’s access to capital. Just ask the oil and gas industry! That penalty is imposed by activist investors and fund managers who wish to force an unwise and premature end to the use of fossil fuels. There is also a threat that GHG disclosures themselves, based (as they will be) on flimsy estimates, could create litigation risk for many companies.

Much Ado About Nothing

While there are major regulatory and political risks to investors, let’s ask, for the sake of argument: how would one degree Celsius of warming by the end of this century affect corporate results? Generally not at all. (The bounds described in the Paris Agreement are 1.5 to 2 degrees, but these are based on unrealistic scenarios — see links below.) It would happen gradually in any case, with ample opportunity to adapt to the operating environment. To think otherwise requires great leaps of imagination. For example, climate alarmists probably fancy that violent weather or wildfires will wipe out facilities, yet there is no reliable evidence that the mild warming experienced to date has been associated with more violent weather or an increased incidence of wildfires (and see here). There are a great many “sacred cows” worshiped by climate-change neurotics, and the SEC undoubtedly harbors many of those shibboleths.

What probabilities can be attached to each incremental degree of warming that might occur over several decades? The evidence we’ve seen comes from so-called carbon-forcing models parameterized with unrealistically high carbon sensitivities and subjected to unrealistic carbon-concentration scenarios. Estimates of these probabilities are not reliable.

Furthermore, climate change risks, even if they could be measured reliably in the aggregate, cannot reasonably be allocated to individual firms. The magnitude of the firm’s own contribution to that risk is equivalent to the marginal reduction in risk if the firm implemented a realistic zero-carbon operating rule. For virtually any firm, we’re talking about something infinitesimal. It involves tremendous guesswork given that various parties around the globe take a flexible approach to emissions, and will continue to do so. The very suggestion of such an exercise is an act of hubris.

Back To The SEC’s Mandated Role

Let’s return to the practical problems associated with these kinds of disclosure requirements. Cochrane also points out that the onerous nature of the SEC proposal, and the regulatory and political threats it embodies, will hasten the transition away from public ownership in many industries.

“The fixed costs alone are huge. The trend to going private and abandoning public markets, at least in the U.S. will continue. The trend to large oligopolized politically compliant static businesses in the U.S. will continue.

I would bet these rules wind up in court, and that these are important issues. They should be.”

Unfortunately, private companies will still have to deal with certain investors who would shackle their use of energy inputs and demand forms of diligence (… not to say “due”) of their own.

The SEC’s proposed climate risk disclosures are stunningly authoritarian, and they are designed to coalesce with other demands by the regulatory state to kill carbon-based energy and promote renewables. These alternative energy sources are, as yet, unable to offer an economical and stable supply of power. The fraudulent nature of the alleged risks make this all the more appalling. The SEC has effectively undertaken an effort to engage in corporatist industrial policy benefitting a certain class of “green” energy investors, exposing the proposal as yet another step on the road to fascism. Let’s hope Cochrane is right: already, 16 state attorneys general are preparing a legal challenge. May the courts ultimately see through the SEC’s sham!

Hyperbolic Scenarios, Crude Climate Models, and Scientism

07 Sunday Nov 2021

Posted by Nuetzel in Climate science, Global Warming

≈ 6 Comments

Tags

Carbon Efficiency, Carbon forcing, carbon Sensitivity, Cloud Feedback, COP26, G20, Global Temperature, IEA, Intergovernmental Panel on Climate Change, International Energy Agency, IPCC, Joe Biden, Joe Brandon, Judith Curry, Justin Ritchie, Net Zero Emissions, Nic Lewis, Precautionary Principle, Prince Charles, RCP8.5, rent seeking, Representative Concentration Pathway, Roger Pielke Jr., Scientism, United Nations

What we hear regarding the dangers of climate change is based on predictions of future atmospheric carbon concentrations and corresponding predictions of global temperatures. Those predictions are not “data” in the normal, positive sense. They do not represent “the way things are” or “the way things have been”, though one might hope the initial model conditions align with reality. Nor can the predictions be relied upon as “the way things will be”. Climate scientists normally report a range of outcomes produced by models, yet we usually hear only one type of consequence for humanity: catastrophe!

Models Are Not Reality

The kinds of climate models quoted by activists and by the UN’s Intergovernmental Panel on Climate Change (IPCC) have been around for decades. Known as “carbon forcing” models, they are highly simplified representations of the process determining global temperatures. The primary forecast inputs are atmospheric carbon concentrations over time, which again are themselves predictions.

It’s usually asserted that climate model outputs should guide policy, but we must ask: how much confidence can we have in the predictions to allow government to take coercive actions having immediate, negative impacts on human well being? What evidence can be marshaled to show prospective outcomes under proposed policies? And how well do these models fit the actual, historical data? That is, how well do model predictions track our historical experience, given the historical paths of inputs like carbon concentrations?

Faulty Inputs

The IPCC has been defining and updating sets of carbon scenarios since 1990. The scenarios outline the future paths of greenhouse gas emissions (and carbon forcings). They were originally based on economic and demographic modeling before an apparent “decision by committee” to maintain consistency with scenarios issued in the past. Roger Pielke Jr. and Justin Ritchie describe the evolution of this decision process, and they call for change:

“Our research (and that of several colleagues) indicates that the scenarios of greenhouse gas (GHG) emissions through the end of the twenty-first century are grounded in outdated portrayals of the recent past. Because climate models depend on these scenarios to project the future behavior of the climate, the outdated scenarios provide a misleading basis both for developing a scientific evidence base and for informing climate policy discussions. The continuing misuse of scenarios in climate research has become pervasive and consequential—so much so that we view it as one of the most significant failures of scientific integrity in the twenty-first century thus far. We need a course correction.”

One would certainly expect the predicted growth of atmospheric carbon to evolve over time. However, as Pielke and Ritchie note, the IPCC’s baseline carbon scenario today, known as RCP8.5 (“Representative Concentration Pathway”), is remarkably similar to the “business as usual” (BAU) scenario it first issued in 1990:

“The emissions scenarios the climate community is now using as baselines for climate models depend on portrayals of the present that are no longer true. And once the scenarios lost touch with reality, so did the climate, impact, and economic models that depend on them for their projections of the future. Yet these projections are a central part of the scientific basis upon which climate policymakers are now developing, debating, and adopting policies.”

The authors go on to discuss a few characteristics of the BAU scenario that today seem implausible, including:

“… RCP8.5 foresees carbon dioxide emissions growing rapidly to at least the year 2300 when Earth reaches more than 2,000 ppm of atmospheric carbon dioxide concentrations. But again, according to the IEA and other groups, fossil energy emissions have likely plateaued, and it is plausible to achieve net-zero emissions before the end of the century, if not much sooner.”

Pielke and Ritchie demonstrate that the IPCC’s baseline range of carbon emissions by 2045 is centered well above (actually double) the mid-range of scenarios developed by the International Energy Agency (IEA), and there is very little overlap between the two. However, global carbon emissions have been flat over the past decade. Even if we extrapolate the growth in atmospheric CO2 parts per million over the past 20 years, it would rise to less than 600 ppm by 2100, not 1,200 ppm. It’s true that a few countries (China comes to mind) continue to exploit less “carbon efficient” energy resources like coal, but the growth trend in concentrations is likely to continue to taper over time.
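That back-of-the-envelope extrapolation can be sketched in a few lines. The starting concentration and growth rate below are rough, illustrative round numbers chosen by me (a 2021 level near 415 ppm and an average gain of about 2.2 ppm per year over the prior two decades), not an official data series:

```python
# Toy linear extrapolation of atmospheric CO2 concentration to 2100.
# Both inputs are hypothetical round numbers for illustration only.
ppm_2021 = 415.0          # assumed current concentration, parts per million
growth_per_year = 2.2     # assumed recent average growth, ppm per year
years = 2100 - 2021       # horizon of the extrapolation

ppm_2100 = ppm_2021 + growth_per_year * years
print(round(ppm_2100))    # roughly 589 -- under 600 ppm, far below RCP8.5's ~1,200
```

Even this simple straight-line projection, which ignores any tapering of emissions growth, lands well short of the concentrations assumed in the IPCC’s baseline scenario.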

It therefore appears that the IPCC’s climate scenarios, which are used broadly as model inputs by the climate research community, are suspect. As the saying goes: garbage in, garbage out. But what about the climate models themselves?

Faulty Models

The model temperature predictions have been grossly in error. They have been and continue to be “too hot”. The chart at the top of this post is typical of the comparisons of model projections and actual temperatures. Before the year 2000, most of the temperature paths projected by the particular model charted above ran higher than actual temperatures. However, the trends subsequently diverged and the gap has become more extreme over the past two decades.

The problem is not merely one of faulty inputs. The models themselves are deeply flawed, as they fail to account adequately for natural forces that strongly influence our climate. It’s been clear for many years that the sun’s radiative energy has a massive impact on temperatures, and it is affected not only by the intensity of the solar cycle but also by cloud cover on Earth. Unfortunately, carbon forcing models do not agree on the role that increased clouds might have in amplifying warming. However, a reduction in cloud cover over the past 20 years, and a corresponding increase in radiative heat, can account for every bit of the warming experienced over that time.

This finding not only offers an alternative explanation for two decades of modest warming, it also strikes at the very heart of the presumed feedback mechanism usually assumed to amplify carbon-induced warming. The overall effect is summarized by the so-called carbon sensitivity, measured as the response of global temperature to a doubling of carbon concentration. The IPCC puts that sensitivity in a range of 1.5C to 4.5C. However, findings published by Nic Lewis and Judith Curry are close to the low end of that range, as are those found by Frank Bosse reported here. The uncertainties surrounding the role of cloud cover and carbon sensitivities reveal that the outputs relied upon by climate alarmists are extreme model simulations, not the kind of reliable intelligence upon which drastic policy measures should be taken.

The constant anxiety issued from the Left on the issue of climate change, and not a little haranguing of the rest of us, is misplaced. The IPCC’s scenarios for the future paths of carbon concentration are outdated and seriously exaggerated, and they represent a breach of scientific protocol. Yet the scenarios are widely used as the basis of policy discussions at both the domestic and international levels. The climate models themselves embed questionable assumptions that create a bias toward calamitous outcomes.

Yet Drastic Action Is Urged

The UN’s 2021 climate conference, or COP26 (“Conference of the Parties …”) is taking place in Glasgow, Scotland this month. Like earlier international climate conferences, the hope is that dire forecasts will prompt broad agreement on goals and commitments, and that signatory countries will translate these into policy at the national level.

Things got off to a bad start when, before COP26 even began, the G20 nations failed to agree on a goal of “net-zero” carbon emissions by 2050. Another bad portent for the conference is that China and India, both big carbon emitters, will not attend, which must be tremendously disappointing to attendees. After all, COP26 has been billed by Prince Charles himself as “the last chance saloon, literally”, for saving the world from catastrophe. He said roughly the same thing before the Paris conference in 2014. And Joe Brandon … er, Biden, blurted some hyperbole of his own:

“Climate change is already ravaging the world. … It’s destroying people’s lives and livelihoods and doing it every single day. … It’s costing our nations trillions of dollars.”

All this is unadulterated hogwash. But it is the stuff upon which a crisis-hungry media feeds. This hucksterism is but one form of climate rent-seeking. Other forms are much more troubling: scary scenarios and model predictions serve the self-interest of regulators, grant-seeking researchers, interventionist politicians, and green investors who suckle at the public teat. It is a nightmare of scientism fed by the arrogance of self-interested social planners. The renewable energy technologies promoted by these investors, politicians, and planners are costly and land-intensive, providing only intermittent output (requiring backup fossil fuel capacity), and they have nasty environmental consequences of their own.

The precautionary principle is no excuse for the extreme policies advocated by alarmists. We already have economically viable “carbon efficient” and even zero-carbon energy alternatives, such as natural gas, modular nuclear power, and expanded opportunities for exploiting geothermal energy. This argues against premature deployment of wasteful renewables. The real crisis is the threat posed by the imposition of draconian green policies to our long-term prosperity, and especially to the world’s poor.

The Futility and Falsehoods of Climate Heroics

01 Tuesday Jun 2021

Posted by Nuetzel in Climate science, Environmental Fascism, Global Warming, Uncategorized

≈ Leave a comment

Tags

Atmospheric Carbon, Biden Administration, Carbon forcing, Carbon Mitigation, Climate Change, Climate Sensitivity, ExxonMobil, Fossil fuels, global warming, Green Energy, Greenhouse Gas, IPCC, John Kerry, Judith Curry, Natural Gas, Netherlands Climate Act, Nic Lewis, Nuclear power, Putty-Clay Technology, Renewables, Ross McKitrick, Royal Dutch Shell, Social Cost of Carbon, William Nordhaus

The world’s gone far astray in attempts to battle climate change through forced reductions in carbon emissions. Last Wednesday, in an outrageously stupid ruling, a Dutch court ordered Royal Dutch Shell to reduce its emissions by 45% by 2030 relative to 2019 levels. It has nothing to do with Shell’s historical record on the environment. Rather, the Court said Shell’s existing climate action plans did not meet “the company’s own responsibility for achieving a CO2 reduction.” The decision will be appealed, but it appears that “industry agreements” under the Netherlands’ Climate Act of 2019 are in dispute.

Later that same day, a shareholder dissident group supporting corporate action on climate change won at least two ExxonMobil board seats. And then we have the story of John Kerry’s effort to stop major banks from lending to the fossil fuel industry. Together with the Biden Administration’s other actions on energy policy, we are witnessing the greatest attack on conventional power sources in history, and we’ll all pay dearly for it. 

The Central Planner’s Conceit

Technological advance is a great thing, and we’ve seen it in the development of safe nuclear power generation, but the environmental left has successfully placed roadblocks in the way of its deployment. Instead, they favor the mandated adoption of what amount to beta versions of technologies that might never be economic and create extreme environmental hazards of their own (see here, here, here, and here). For private adopters, green energy installations are often subsidized by the government, disguising their underlying inefficiencies. These premature beta versions are then embedded in our base of productive capital and often remain even as they are made obsolete by subsequent advances. The “putty-clay” nature of technology decisions should caution us against premature adoptions of this kind. This is just one of the many curses of central planning.

Not only have our leftist planners forced the deployment of inferior technologies: they are actively seeking to bring more viable alternatives to ruination. As I mentioned, nuclear power and even natural gas offer paths to reduced carbon emissions, yet climate alarmists wage war against the latter as fiercely as against other fossil fuels. We have Kerry’s plot to deny funding to the fossil fuel industry, and even activist “woke” investors attempting to override management expertise and divert internal resources to green energy. It’s not as if renewable energy sources are absent from these energy firms’ development portfolios. Allocations of capital and staff to these projects usually depend upon a company’s professional and technical expertise, market forces, and (less propitiously) incentives decreed by the government. Yet the activist investors are there to impose their will.

Placing Faith and Fate In Models

All these attempts to remake our energy complex and the economy are based on the presumed external costs associated with carbon emissions. Those costs, and the potential savings achievable through the mitigation efforts of government and private greenies around the globe, have been wildly exaggerated.

The first thing to understand about the climate “science” relied upon by the environmental left is that it is almost exclusively model-dependent. In other words, it is based on mathematical relationships specified by the researchers. Their projections depend on those specs, the selection of parameter values, and the scenarios to which they are subjected. The models are usually calibrated to be roughly consistent with outcomes over some historical time period, but as modelers in almost any field can attest, that is not hard to do. It’s still possible to produce extreme results out-of-sample. The point is that these models are generally not estimated statistically from a lengthy sample of historical data. Even when sound statistical methodologies are employed, the samples are blinkingly short on climatological timescales. That means they are highly sample-specific and likely to propagate large errors out-of-sample. But most of these are what might be called “toy models” specified by the researcher. And what are often billed as “findings” are merely projections based on scenarios that are themselves manufactured by imaginative climate “researchers” cum grant-seeking partisans. In fact, it’s much worse than that because even historical climate data is subject to manipulation, but that’s a topic for another day.

Key Assumptions

What follows are basic components of the climate apocalypse narrative as supported by “the science” of man-made or anthropogenic global warming (AGW):

(A) The first kind of model output to consider is the increase in atmospheric carbon concentration over time, measured in parts per million (PPM). This is a function of many natural processes, including volcanism and other kinds of outgassing from oceans and decomposing biomass, as well as absorption by carbon sinks like vegetation and various geological materials. But the primary focus is human carbon generating activity, which depends on the carbon-intensity of production technology. As Ross McKitrick shows (see chart below), projections from these kinds of models have demonstrated significant upside bias over the years. Whether that is because of slower than expected economic growth, unexpected technological efficiencies, an increase in the service-orientation of economic activity worldwide, or feedback from carbon-induced greening or other processes, most of the models have over-predicted atmospheric carbon PPM. Those errors tend to increase with the passage of time, of course.

(B) Most of the models promoted by climate alarmists are carbon forcing models, meaning that carbon emissions are the primary driver of global temperatures and other phenomena like storm strength and increases in sea level. With increases in carbon concentration predicted by the models in (A) above, the next stage of models predicts that temperatures must rise. But the models tend to run “hot.” This chart shows the mean of several prominent global temperature series contrasted with 1990 projections from the Intergovernmental Panel on Climate Change (IPCC).

The following is even more revealing, as it shows the dispersion of various model runs relative to three different global temperature series:

And here’s another, which is a more “stylized” view, showing ranges of predictions. The gaps show errors of fairly large magnitude relative to the mean trend of actual temperatures of 0.11 degrees Celsius per decade.

(C) Climate sensitivity to “radiative forcing” is a key assumption underlying all of the forecasts of AGW. A simple explanation is that a stronger greenhouse effect, and increases in the atmosphere’s carbon concentration, cause more solar energy to be “trapped” within our “greenhouse,” and less is radiated back into space. Climate sensitivity is usually measured in degrees Celsius relative to a doubling of atmospheric carbon. 
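The arithmetic behind that definition can be sketched under the standard logarithmic-forcing approximation. The 280 ppm pre-industrial baseline below is an assumed round number, and the function is an illustration of the definition, not any particular published model:

```python
import math

def warming_from_co2(sensitivity_c, ppm, baseline_ppm=280.0):
    """Equilibrium warming implied by a CO2 concentration: the sensitivity
    per doubling times the number of doublings above the assumed baseline."""
    return sensitivity_c * math.log2(ppm / baseline_ppm)

# One full doubling (280 -> 560 ppm) at the ends of the IPCC's stated range:
print(warming_from_co2(1.5, 560.0))  # 1.5 (degrees C)
print(warming_from_co2(4.5, 560.0))  # 4.5
```

Note how the logarithm matters: at a 1.5C sensitivity, even a quadrupling of CO2 (two doublings) implies only 3C of equilibrium warming.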

And how large is the climate’s sensitivity to a doubling of carbon PPM? The IPCC says it’s in a range of 1.5C to 4.5C. However, findings published by Nic Lewis and Judith Curry are close to the low end of that range, as are those found by the author of the paper described here.
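The arithmetic behind these sensitivity figures is simple to sketch. Under the standard logarithmic approximation, each doubling of atmospheric CO2 adds the same equilibrium warming, so warming scales with the base-2 log of the concentration ratio. The snippet below is a minimal illustration of that relationship only; the 280 ppm preindustrial baseline and 420 ppm current figure are round numbers I've chosen for the example, not values taken from any of the models or papers discussed above.

```python
import math

def equilibrium_warming(c_now_ppm, c_base_ppm, sensitivity_c):
    """Equilibrium warming (deg C) under the logarithmic approximation:
    each doubling of CO2 concentration adds `sensitivity_c` degrees."""
    return sensitivity_c * math.log2(c_now_ppm / c_base_ppm)

# Warming implied by a rise from ~280 ppm to ~420 ppm (a ratio of 1.5,
# or about 0.58 of a doubling), across the IPCC's 1.5C-4.5C range:
for s in (1.5, 3.0, 4.5):
    print(f"S = {s}C per doubling -> {equilibrium_warming(420, 280, s):.2f}C")
```

Note how wide the implied range is: the same observed concentration increase is consistent with very different amounts of equilibrium warming, which is why the sensitivity estimate dominates every downstream projection.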

In separate efforts, Finnish and Japanese researchers have asserted that the primary cause of recent warming is an increase in low cloud cover, which the Japanese team attributes to increases in the Earth’s bombardment by cosmic rays due to a weakening magnetic field. The Finnish authors note that most of the models used by the climate establishment ignore cloud formation, an omission they believe leads to a massive overstatement (10x) of sensitivity to carbon forcings. Furthermore, they assert that carbon forcings are mainly attributable to ocean discharge as opposed to human activity.

(D) Estimates of the Social Cost of Carbon (SCC) per ton of emissions are used as a rationale for carbon abatement efforts. The SCC was pioneered by economist William Nordhaus in the 1990s, and today there are a number of prominent models that produce distributions of possible SCC values, which tend to have high dispersion and extremely long upper tails. Of course, the highest estimates are driven by the same assumptions about extreme climate sensitivities discussed above. The Biden Administration is using an SCC of $51 per ton. Some recommend the adoption of even higher values for regulatory purposes in order to achieve net-zero emissions at an early date, revealing the manipulative purposes to which the SCC concept is put. This is a raw attempt to usurp economic power, not any sort of exercise in optimization, as this admission from a “climate expert” shows. In the midst of a barrage of false climate propaganda (hurricanes! wildfires!), he tells 60 Minutes that an acceptable limit on warming of 1.5C is just a number they “chose” as a “tipping point.”
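Conceptually, an SCC estimate is the present value of the stream of future marginal damages attributed to one extra ton emitted today, which is why both the assumed climate sensitivity and the chosen discount rate drive the result. The sketch below shows only that discounting mechanic; the flat $2-per-ton-per-year damage stream and 100-year horizon are purely illustrative assumptions of mine, not figures from Nordhaus, the IPCC, or the Biden Administration's $51 estimate.

```python
def social_cost_of_carbon(damages_per_ton_by_year, discount_rate):
    """Present value of an assumed stream of future marginal damages
    (dollars per year) caused by one extra ton emitted in year 0."""
    return sum(d / (1.0 + discount_rate) ** t
               for t, d in enumerate(damages_per_ton_by_year))

# Illustrative only: a flat $2/ton/year damage stream over 100 years.
flat_damages = [2.0] * 100
for r in (0.02, 0.03, 0.05):
    print(f"r = {r:.0%}: SCC ~ ${social_cost_of_carbon(flat_damages, r):.0f}/ton")
```

Even with identical damage assumptions, moving the discount rate between 2% and 5% roughly halves or doubles the answer, which helps explain both the high dispersion across published SCC models and the appeal of low discount rates to those seeking high regulatory values.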

As a measurement exercise, more realistic climate sensitivities yield much lower SCCs. McKitrick presents a chart from Lewis-Curry comparing their estimates of the SCC at lower climate sensitivities to an average of earlier estimates used by IPCC:

High levels of the SCC are used as a rationale for high-cost carbon abatement efforts. If the SCC is overstated, however, then costly abatements represent waste. And there is no guarantee that spending an amount on abatements equal to the SCC will eliminate the presumed cost of a ton’s worth of anthropogenic warming. Again, there are strong reasons to believe that the warming experienced over the past several decades has had multiple causes, and human carbon emissions might have played a relatively minor role.

Crisis Is King

Some people just aren’t happy unless they have a crisis over which to harangue the rest of us. But try as they might, the vast resources dedicated to carbon reduction are largely wasted. I hesitate to say their effort is quixotic because they want more windmills and are completely lacking in gallantry. As McKitrick notes, it takes many years for abatement to have a meaningful impact on carbon concentrations, and since emissions mix globally, unilateral efforts are practically worthless. Worse yet, the resource costs of abatement and lost economic growth are unacceptable, especially when some of the most promising alternative sources of “clean” energy are dismissed by activists. So we forego economic growth, rush to adopt immature energy alternatives, and make very little progress toward the stated goals of the climate alarmists.

Feckless Greens Burn Aussie Bush

09 Thursday Jan 2020

Posted by Nuetzel in Forest Fires, Global Warming, Wildfires

≈ 1 Comment

Tags

Arson, Arson Raptors, Australia, Black Kite, CO2 Forcings, David Packham, David Ward, Dead Vegetation, Eucalyptus, Gasoline Trees, Human Ignitions, Invasive Grasses, James Morrow, Jennifer Marohasy, Leslie Eastman, Marc Lallanilla, Massachusetts, Mike Shedlock, Mishtalk, Myron Ebell, New South Wales, Patrick Michaels, Prescribed Burns, Queensland, Roy Spencer, Victoria, Whistling Kite, Willis Eschenbach

The raging Australian bush fires have been expansive and deadly. Country-wide, over 12 million acres have burned over the past few months, an area approaching twice the size of Massachusetts. The burnt areas in some regions rival or exceed individual fires of the past, such as the Black Friday fire of 1939, which burned about 5 million acres. As bad as the recent fires have been, note that various outlets in the U.S. have felt it necessary to exaggerate the size of the burnt area (also see here). And the season’s burnt area has not even approached the total for 1974-1975, when over 250 million acres burned.

So what causes these bush fires? Dry weather and plenty of fuel from dead vegetation create the hazard, of course. A spark is needed, as from lightning, an accident, an arsonist, or perhaps even a blistering sun, but warm temperatures are unnecessary. Nevertheless, the narrative we hear year-in and year-out is that global warming is to blame for wildfires. My commentary on the climate-change hubbub over the 2018 California fires is here. As for Australia’s fires, there is similarly ample evidence that climate change or warming has nothing to do with it. Rather, as in California, there is a pattern of mismanagement of forests and brush combined with population growth, accidents, and arson, and of course a dry spell. This dry spell has been severe, but the trend in Australia over the past 120 years has been toward more precipitation, not less, and the past 25 years have been relatively rainy. The rain comes with a downside, however: it encourages growth in vegetation, much of which dies every dry season, leaving plenty of fuel for fires. And the fuel has been accumulating.

Mike Shedlock at Mishtalk offers some pertinent observations. First, he quotes James Morrow in the Wall Street Journal:

“Byzantine environmental restrictions prevent landholders from clearing scrub, brush and trees. State governments don’t do their part to reduce the fuel load in parks. Last November a former fire chief in Victoria slammed that state’s ‘minimalist approach’ to hazard-reduction burning in the off-season. That complaint is heard across the country.“

Prescribed burns have been in decline and focused on areas adjacent to suburbs, leaving vast areas of accumulating fuel. This is a product of wrongheaded conservation efforts and resistance to CO2 emissions. These policymakers haven’t done favors for Australia or the world on either count. Shedlock reinforces this point with the following statement from Patrick Michaels and Myron Ebell:

“Australia has been ready to explode for years. David Packham, former head of Australia’s National Rural Fire Research Centre, warned in a 2015 article in the Age that fire fuel levels had climbed to their most dangerous levels in thousands of years. He noted this was the result of ‘misguided green ideology.'”

Eucalyptus trees grow thickly in many fire-prone areas of Australia, and Shedlock says these trees act as a multiplier on the fire hazard. Yet these trees remain a favorite landscape feature for suburbanites, even in fire-prone areas. He quotes Marc Lallanilla in LiveScience:

“Fallen eucalyptus leaves create dense carpets of flammable material, and the trees’ bark peels off in long streamers that drop to the ground, providing additional fuel that draws ground fires up into the leaves, creating massive, fast-spreading ‘crown fires’ in the upper story of eucalyptus forests. … Additionally, the eucalyptus oil that gives the trees their characteristic spicy fragrance is a flammable oil: This oil, combined with leaf litter and peeling bark during periods of dry, windy weather, can turn a small ground fire into a terrifying, explosive firestorm in a matter of minutes. That’s why eucalyptus trees — especially the blue gums (Eucalyptus globulus) that are common throughout New South Wales — are sometimes referred to wryly as ‘gasoline trees.’“

The introduction of non-native invasive grasses has also been blamed for increasing the fuel load in the bush. And as incredible as it may seem, certain birds native to Australia are spreading bushfires by carrying and dropping burning sticks in grasslands to flush out prey. Birds are indeed tool users! The Whistling Kite and the Black Kite are sometimes called “arson raptors”, according to Leslie Eastman at this link.

The hypothesis that climate warming from CO2 emissions is the cause of the bushfires is undermined by all of the above. Then, of course, there are the arsonists and accidental fires. Over 180 people have been arrested for setting recent bushfires intentionally in New South Wales alone, and 103 others in Queensland. (Also see here.) Jim Steele reports that human ignitions account for 66% of bushfires, while just 11% are caused by lightning. Population growth has brought more people into close proximity with the bush, which increases the exposure of humans to fire danger and might well add to the number of accidents and potential arsonists. Obviously, human and avian arson, and accidents, are not within the line of causation that climate alarmists have in mind.

Roy Spencer addresses some of the inconsistencies in the claimed link between climate warming and the Australian bushfires. First, of course, is the trend in rainfall. Climate models based on CO2 forcings predict no long-term trend in Australia’s rainfall, but again, rainfall has increased in Australia during the era of accelerated forcings. Interestingly, the fires of 1974-75 occurred during a period that was quite rainy, but that rain might have added so much to the annual vegetation cycle that it exacerbated the effect of the dry season. Temperatures in Australia were quite warm in 2019, but the climate models cannot account for that variation, especially as Australian temperatures are subject to high variability from year to year. It’s been hotter before, though the temperature records in Australia have been subject to some controversial “editing”. Finally, Spencer notes that global wildfire activity has been in decline for many years, despite the mild warming we’ve experienced over the past 50 years (also see here).

Australia has bushfires every year, and this year has been particularly bad, though it might not reach the proportions of the fires in 1974-75. The causes are: poor burn management practices, or sometimes neglect and no burn management at all, allowing dead vegetation to accumulate to dangerous levels; arson, which has been implicated in a large number of fires this year; and an exceptionally dry 2019. The contention that global warming or climate change is responsible for these bushfires is a dangerous distraction from reforms that can minimize fire hazards in the future.

For additional reading of interest, see “Australia Fires … and Misfires” by Willis Eschenbach and “The Mathematics of Connectivity and Bush Fires: A Note From David Ward” a post from Jennifer Marohasy’s blog.

Doomsayers Batting Zero, Draft Kids To Cause

22 Sunday Sep 2019

Posted by Nuetzel in Environmental Fascism, Global Warming

≈ Leave a comment

Tags

Al Gore, Arthur Chrenkoff, Capitalism, Carbon Forcings, Chicken Little, Child Advocacy, Climate Alarmism, Climate Deaths, David Viner, Goose Eggs, Greta Thunberg, Michael Oppenheimer, Model Bias, Over-Prediction, Paul Ehrlich, Prince Charles, Scott Adams, Seeing CO2, United Nations

Empiricists, take note: The kids were out in the streets on Friday, skipping school to warn us of a climate doomsday fast approaching. Like Greta Thunberg, one of several teenage girls billed as modern-day Cassandras, they just know it. But wait, I think I heard the same thing many years ago… doomsday is nigh! In fact, I’ve heard it over and over through my entire adulthood. And here’s the empirical regularity: “Goose Eggs: No Climate Doomsday Warning Has Come True“. Ever. From the link:

     “Some examples:

    • 1967 — Stanford … expert Paul Ehrlich predicted “time of famines” in 1975.
    • 1971 — A top NASA expert predicted an “ice age” by 2021.
    • 1988 — It was predicted that the Maldives would be under water by last year.
    • 2008 — Gore said the Arctic would be free of ice by 2013.
    • 2009 — [Prince] Charles said there was just 96 months left to save the world.”

Here are a few other warnings that haven’t panned out:

“Within a few years ‘children just aren’t going to know what snow is.’ Snowfall will be ‘a very rare and exciting event.’” — Dr. David Viner, senior research scientist at the climatic research unit (CRU) of the University of East Anglia [March 2000]”

“[By] 1995, the greenhouse effect would be desolating the heartlands of North America and Eurasia with horrific drought, causing crop failures and food riots…[By 1996] The Platte River of Nebraska would be dry, while a continent-wide black blizzard of prairie topsoil will stop traffic on interstates, strip paint from houses and shut down computers. — Michael Oppenheimer in 1990″

There have been many others (also see here and here). Oh, but you just wait, they say. This time it’s different, and it won’t be long! You know, people just love to worry. Even so, what kind of daft world do we inhabit with children and adults completely freaked out about “problems” that don’t approximate reality?

Predictions of a more clinical variety, such as upward temperature trends, have been way off on a consistent basis: much too high, that is. But here’s the key: all of the other calamitous developments said to be in our future are predicated on those temperature forecasts. The warnings are not based on data per se, but on crappy climate models (and see here), which are simplifications of reality, loosely calibrated to capture a relatively short period of historical records. And the models are crappy because they often rely on one input, CO2 forcings. The modelers have difficulty addressing the empirical sensitivity of temperature to carbon, the net effects of radiative forcing, clouds, and ocean circulation. In many prominent cases they don’t even try. Hey look, we’re all gonna die!

A striking misconception one hears repeatedly is that we experience many more hot days than in the past, and that they are hotter. Sure, extremely hot days are bad, though not as bad as extremely cold days, and probably worse than warm nights. The truth is, however, that nearly all of the warming experienced over the past few decades has been in nighttime lows, not daytime highs. More “seasoned” climate alarmists don’t seem to have any memory of the hot days of their youth, and the kids… well, they just fell off the turnip truck, so they have no idea.

One of the great perversions of climate alarmism is the notion that the private enterprise system must be heavily regulated or even abolished in order to put an end to global warming. Never mind that governments are directly responsible for a major share of environmental degradation. And as private economies flourish, the environmental efficiency of production actually improves. In fact, if one were to stipulate that climate change is a problem, as I will for just this one sentence, vibrant capitalism offers the best path to environmental solutions. There are several basic reasons. One is that economic growth and higher income levels give consumers the wherewithal to demand and pay for costlier “green” products. More fundamentally, economic growth facilitates development and investment in cleaner technologies by business and government.

Miss Thunberg doesn’t understand any of this, of course, but she’s a pretty good little scold:

“This is all wrong. I shouldn’t be up here. I should be back in school on the other side of the ocean, yet you come to us young people for hope. How dare you.”

Here’s Arthur Chrenkoff’s take on poor Thunberg and her message:

“[She] should be going to Beijing or Bangalore and staging her protests there instead of, or at least in addition to, Sweden or New York. She should be hounding President Xi and Prime Minister Modi about their shameful emissions. She should be leading throngs of Asian kids out of schools for her Friday student strikes. She should be castigating the industries and the consumers of the developing world for destroying the planet and killing humanity in the process. She should be doing all this if she were serious about the global nature of the problem.”

I especially like this quote from Scott Adams on the “child advocate” phenomenon we’re witnessing:

“Adults sometimes like to use children to carry their messages because it makes it hard for the other side to criticize them without seeming like monsters. If adults have encouraged you to panic about climate change without telling you what I am telling you here, they do not have your best interests at heart. They are using you.“

Of course, Thunberg is thoroughly propagandized and a useful theatrical tool for the alarmist establishment. She has made all sorts of ridiculous and unquestioned claims before the United Nations and elsewhere (e.g., that people are dying from climate change (no), or that she can “see” CO2 (okay, her mother said that, but what a hoot!)). Don’t think for a second that “we have to listen to the children” is uttered sincerely by any adult climate alarmist. It’s manipulation. I feel sorry for Thunberg not least because she is probably deeply frightened about the climate, but also because she is a tool of a death cult.

You really can’t blame kids for being worried about bogeymen foisted upon them by foolish elders, but you can blame the adults for their own frightened acceptance of chicken-little climate augury. And that’s what the kids are being taught. The schools certainly won’t penalize them for missing classes. In fact, many of their teachers accompanied them to the protests.

The climate scare is part of a larger agenda to dismantle not just capitalism, but a host of innocent individual liberties. Scaring children and making teens into miserable pessimists will groom them as good (if neurotic) environmental soldiers for life. They’ll be fit as compliant subjects of a new, environmental fascist state, never to know the sweet freedom and growth possible without the needless bindings imposed by climate cranks. Children, the protection you’ve been told to demand isn’t necessary or worth it. You’re fighting for goose eggs!
