Sacred Cow Chips

Tag Archives: Climate Change

Grow Or Collapse: Stasis Is Not a Long-Term Option

18 Wednesday Jan 2023

Posted by Nuetzel in Climate, Environment, Growth

≈ Leave a comment

Tags

Asymptotic Burnout, Benjamin Friedman, Climate Change, Dead Weight Loss, Degrowth, Fermi Paradox, Lewis M. Andrews, Limits to Growth, NIMBYism, Paul Ehrlich, Population Bomb, Poverty, regulation, Robert Colvile, Stakeholder Capitalism, State Capacity, Stubborn Attachments, Subsidies, Tax Distortions, Thomas Malthus, Tyler Cowen, Veronique de Rugy, Zero Growth

Growth is a human imperative and a good thing in every sense. We’ve long heard from naysayers, however, that growth will exhaust our finite resources, ending in starvation and the collapse of human civilization. They say, furthermore, that the end is nigh! It’s an old refrain. Thomas Malthus lent it credibility over 200 years ago (perhaps unintentionally), and we can pick on poor Paul Ehrlich’s “Population Bomb” thesis as a more modern starting point for this kind of hysteria. Lewis M. Andrews puts Ehrlich’s predictions in context:

“A year after the book’s publication, Ehrlich went on to say that this ‘utter breakdown’ in Earth’s capacity to support its bulging population was just fifteen years away. … For those of us still alive today, it is clear that nothing even approaching what Ehrlich predicted ever happened. Indeed, in the fifty-four years since his dire prophesy, those suffering from starvation have gone from one in four people on the planet to just one in ten, even as the world’s population has doubled.”

False Limits

The “limits” argument comes from the environmental Left, but it creates for them an uncomfortable tradeoff between limiting growth and the redistribution of a fixed (they hope) or shrinking (more likely) pie. That’s treacherous ground on which to build popular support. It’s also foolish to stake a long-term political agenda on baldly exaggerated claims (and see here) about the climate and resource constraints. Ultimately, people will recognize those ominous forecasts as manipulative propaganda.

Last year, an academic paper argued that growing civilizations must eventually reach a point of “asymptotic burnout” due to resource constraints, and must undergo a “homeostatic awakening”: no growth. The authors rely on a “superlinear scaling” argument based on cross-sectional data on cities, and they offer their “burnout” hypothesis as an explanation for the Fermi Paradox: the puzzling quiet we observe in a universe we’d otherwise expect to be teeming with life… civilizations reach their “awakenings” before finding ways to communicate with, or even detect, their distant neighbors. I addressed this point and its weaknesses last year, but here I mention it only to demonstrate that the “limits to growth” argument lives on in new incarnations.

Growth-limiting arguments are tenuous on at least three fundamental grounds: 1) they fail to consider the ability of markets to respond to scarcity; 2) they underestimate the potential of human ingenuity not only to adapt to challenges, but to invent new solutions, exploit new resources, and use existing resources more efficiently; and 3) they ignore that homeostasis is impossible, because zero growth cannot be achieved without destructive coercion, the suspension of cooperative market mechanisms, and losses from non-market (i.e., political or otherwise coercive) competition for a fixed level of societal wealth and production.

The zero-growth world is one that lacks opportunities and rewards for honest creation of value, whether through invention or simple, hard work. That value is determined through the interaction of buyers and sellers in markets, the most effective form of voluntary cooperation and social organization ever devised by mankind. Those preferring to take spoils through the political sphere, or who otherwise compete on the basis of force, either have little value to offer or simply lack the mindset to create value to exchange with others at arm’s length.

Zero-Growth Mentality

As Robert Colvile writes in a post called “The Morality of Growth”:

“A society without growth is not just politically far more fragile. It is hugely damaging to people’s lives – and in particular to the young, who will never get to benefit from the kind of compounding, increasing prosperity their parents enjoyed.”

Expanding on this theme is commenter Slocum at the Marginal Revolution site, where Colvile’s essay was linked:

“Humans behave poorly when they perceive that the pie is fixed or shrinking, and one of the main drivers for behaving poorly is feelings of envy coming to the forefront. The way we encourage people not to feel envy (and to act badly) is not to try to change human nature, or ‘nudge’ them, but rather to maintain a state of steady improvement so that they (naturally) don’t feel envious, jealous, tribal, xenophobic etc. Don’t create zero-sum economies and you won’t bring out the zero-sum thinking and all the ills that go with it.”

And again, this dynamic leads not to zero growth (even if that were the goal), but to decay. Given the political instability to which negative growth can lead, collapse is a realistic possibility.

I liked Colvile’s essay, but it probably should have been titled “The Immorality of Non-Growth”. It covers several contemporary obstacles to growth, including the rise of “stakeholder capitalism”, the growth of government at the expense of the private sector, strangling regulation, tax disincentives, NIMBYism, and the ease with which politicians engage in populist demagoguery in establishing policy. All those points have merit. But if his ultimate purpose was to shed light on the virtues of growth, it seems almost as if he lost his focus in examining only the flip side of the coin. I came away feeling that he didn’t expend as much effort on the moral virtues of growth as he’d intended, though I found this nugget well said:

“It is striking that the fastest-growing societies also tend to be by far the most optimistic about their futures – because they can visibly see their lives getting better.”

Compound Growth

A far better discourse on growth’s virtues is offered by Veronique de Rugy in “The Greatness of Growth”. It should be obvious that growth is a potent tonic, but its range as a curative receives strangely little emphasis in popular discussion. First, de Rugy provides a simple illustration of the power of long-term, compound growth in raising average living standards:

This is just a mechanical exercise, but it conveys the power of growth. At 2% real growth, real GDP per capita would double in 35 years and quadruple in 70 years. At 4% growth, real GDP per capita would double in 18 years… less than a generation! It would quadruple in 35 years. If you’re just now starting a career, imagine nearing retirement at a standard of living four times as lavish as that of today’s senior employees (who make a lot more than you do now). We’ll talk a little more about how such growth rates might be achieved, but first, a little more on what growth can achieve.
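The doubling times above are simple consequences of compounding, and easy to verify (a quick sketch; the code is purely illustrative, not from de Rugy’s article):

```python
import math

def years_to_multiply(rate: float, factor: float) -> float:
    """Years for income to grow by `factor` at a constant annual growth `rate`."""
    return math.log(factor) / math.log(1.0 + rate)

for rate in (0.02, 0.04):
    print(f"{rate:.0%} growth: double in {years_to_multiply(rate, 2):.0f} years, "
          f"quadruple in {years_to_multiply(rate, 4):.0f} years")
```

At 2%, doubling takes about 35 years and quadrupling about 70; at 4%, about 18 and 35 years, matching the figures above.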

The Rewards of Growth

Want to relieve poverty? There is no better or more permanent solution than economic growth. Here are some illustrations of this phenomenon:

Want to rein in the federal budget deficit? Growth reduces the burden of existing debt and shrinks fiscal deficits, though it might interfere with what little discipline spendthrift politicians currently face. We’ll have to find other fixes for that problem, but at least growth can insulate us from their profligacy.

And who can argue with the following?

“All the stuff an advocate anywhere on the political spectrum claims to value—good health, clean environment, safety, families and quality of life—depends on higher growth. …

There are other well-documented material consequences of modern economic growth, such as lower homicide rates, better health outcomes (babies born in the U.S. today are expected to live into their upper 70s, not their upper 30s as in 1860), increased leisure, more and better clothing and shelter, less food insecurity and so on.”

De Rugy argues convincingly that growth might well entail a greater boost in living standards for lower ranges of the socioeconomic spectrum than for the well-to-do. That would benefit not just those impoverished due to a lack of skills, but also those early in their careers as well as seniors attempting to earn extra income. For those with a legitimate need of a permanent safety net, growth allows society to be much more generous.

What de Rugy doesn’t mention is how growth can facilitate greater saving. In a truly virtuous cycle, saving is transformed into productivity-enhancing additions to the stock of capital. And not just physical capital, but human capital through investment in education as well. In addition, growth makes possible additional research and development, facilitating the kind of technical innovation that can sustain growth.

Getting Out of the Way of Growth

Later in her piece, de Rugy evaluates various ways to stimulate growth, including deregulation, wage and price flexibility, eliminating subsidies, less emphasis on redistribution, and simplifying the tax code. The policies these reforms would unwind are stultifying and impose deadweight losses on society. That’s not to deny the benefits of adequate state capacity for providing true public goods and a legal and judicial system to protect individual rights. Inadequate state capacity is a major impediment to growth in the less developed world, whereas countries in the developed world tend to have an excess of state “capacity”, which often runs amok!

In the U.S., our regulatory state imposes huge compliance costs on the private sector and effectively prohibits or destroys incentives for a great deal of productive (and harmless) activity. Interference with market pricing stunts growth by diverting resources from their most valued uses. Instead, it directs them toward uses that are favored by political elites and cronies. Subsidies do the same by distorting tradeoffs at a direct cost to taxpayers. Our system of income taxes is rife with behavioral distortions and compliance costs, bleeding otherwise productive gains into the coffers of accountants, tax attorneys, and bureaucrats. Finally, redistribution often entails the creation of disincentives, fostering a waste of human potential and a pathology of dependence.
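The waste from tax distortions can be illustrated with the standard deadweight-loss triangle. This is a stylized sketch with an invented linear demand response, not an estimate for any actual tax:

```python
def deadweight_loss(tax: float, quantity_lost_per_dollar: float) -> float:
    """Harberger triangle: DWL = 1/2 * tax wedge * quantity lost, assuming a
    linear demand response of `quantity_lost_per_dollar` units per dollar of tax."""
    quantity_lost = quantity_lost_per_dollar * tax
    return 0.5 * tax * quantity_lost

# Doubling the tax wedge quadruples the loss: distortions grow with the
# square of the wedge, which is why behavioral distortions compound so badly.
print(deadweight_loss(1.0, 10.0))  # 5.0
print(deadweight_loss(2.0, 10.0))  # 20.0
```

The quadratic growth of the triangle is the textbook result; the numbers themselves are made up for illustration.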

Growth and Morality

Given the unequivocally positive consequences of growth to humanity, could the moral case for growth be any clearer? De Rugy quotes Benjamin Friedman’s “The Moral Consequences of Economic Growth”:

“Growth is valuable not only for our material improvement but for how it affects our social attitudes and our political institutions—in other words, our society’s moral character, in the term favored by the Enlightenment thinkers from whom so many of our views on openness, tolerance and democracy have sprung.”

De Rugy also paraphrases Tyler Cowen’s position on growth from his book “Stubborn Attachments”:

“… economic growth, properly understood, should be an essential element of any ethical system that purports to care about universal human well-being. In other words, the benefits are so varied and important that nearly everyone should have a pro-growth program at or near the top of their agenda.”

Conclusion

Agitation for “degrowth” is often made in good faith by truly frightened people. Better education would help them, but our educational establishment has been corrupted by the same ignorant narrative. When it comes to rulers, the fearful are no less tyrannical than power-hungry authoritarians. In fact, fear can be instrumental in enabling that kind of transformation in the personalities of activists. A basic failing is their inability to recognize the many ways in which growth improves well-being, including the societal wealth to enable adaptation to changing conditions and the investment necessary to enhance our range of technological solutions for mitigating existential risks. Not least, however, is the failure of the zero-growth movement to understand the cruelty their position condones in exchange for their highly speculative assurances that we’ll all be better off if we just do as they say. A terrible downside will be unavoidable if and when growth is outlawed.

The Futility and Falsehoods of Climate Heroics

01 Tuesday Jun 2021

Posted by Nuetzel in Climate science, Environmental Fascism, Global Warming, Uncategorized

≈ Leave a comment

Tags

Atmospheric Carbon, Biden Administration, Carbon forcing, Carbon Mitigation, Climate Change, Climate Sensitivity, ExxonMobil, Fossil fuels, global warming, Green Energy, Greenhouse Gas, IPCC, John Kerry, Judith Curry, Natural Gas, Netherlands Climate Act, Nic Lewis, Nuclear power, Putty-Clay Technology, Renewables, Ross McKitrick, Royal Dutch Shell, Social Cost of Carbon, William Nordhaus

The world’s gone far astray in attempts to battle climate change through forced reductions in carbon emissions. Last Wednesday, in an outrageously stupid ruling, a Dutch court ordered Royal Dutch Shell to reduce its emissions by 45% by 2030 relative to 2019 levels. The ruling has nothing to do with Shell’s historical record on the environment. Rather, the Court said Shell’s existing climate action plans did not meet “the company’s own responsibility for achieving a CO2 reduction.” The decision will be appealed, but it appears that “industry agreements” under the Netherlands’ Climate Act of 2019 are in dispute.

Later that same day, a shareholder dissident group supporting corporate action on climate change won at least two ExxonMobil board seats. And then we have the story of John Kerry’s effort to stop major banks from lending to the fossil fuel industry. Together with the Biden Administration’s other actions on energy policy, we are witnessing the greatest attack on conventional power sources in history, and we’ll all pay dearly for it. 

The Central Planner’s Conceit

Technological advance is a great thing, and we’ve seen it in the development of safe nuclear power generation, but the environmental left has successfully placed roadblocks in the way of its deployment. Instead, they favor the mandated adoption of what amount to beta versions of technologies that might never be economic and that create extreme environmental hazards of their own (see here, here, here, and here). For private adopters, green energy installations are often subsidized by the government, which disguises their underlying inefficiencies. These premature beta versions are then embedded in our base of productive capital and often remain even as they are made obsolete by subsequent advances. The “putty-clay” nature of technology decisions should caution us against premature adoptions of this kind. This is just one of the many curses of central planning.

Not only have our leftist planners forced the deployment of inferior technologies: they are actively seeking to bring more viable alternatives to ruination. I mentioned nuclear power, and even natural gas offers a path for reducing carbon emissions, yet climate alarmists wage war against it as fiercely as against other fossil fuels. We have Kerry’s plot to deny funding to the fossil fuel industry, and even activist “woke” investors attempting to override management expertise and divert internal resources to green energy. It’s not as if renewable energy sources are absent from these energy firms’ development portfolios. Allocations of capital and staff to these projects usually depend upon a company’s professional and technical expertise, market forces, and (less propitiously) incentives decreed by the government. Yet the activist investors are there to impose their will.

Placing Faith and Fate In Models

All these attempts to remake our energy complex and the economy are based on the presumed external costs associated with carbon emissions. Those costs, and the potential savings achievable through the mitigation efforts of government and private greenies around the globe, have been wildly exaggerated.

The first thing to understand about the climate “science” relied upon by the environmental left is that it is almost exclusively model-dependent. In other words, it is based on mathematical relationships specified by the researchers. Their projections depend on those specifications, the selection of parameter values, and the scenarios to which they are subjected. The models are usually calibrated to be roughly consistent with outcomes over some historical time period, but as modelers in almost any field can attest, that is not hard to do. It’s still possible to produce extreme results out-of-sample. The point is that these models are generally not estimated statistically from a lengthy sample of historical data. Even when sound statistical methodologies are employed, the samples are vanishingly short on climatological timescales. That means the results are highly sample-specific and likely to propagate large errors out-of-sample.

But most of these are what might be called “toy models” specified by the researcher. And what are often billed as “findings” are merely projections based on scenarios that are themselves manufactured by imaginative climate “researchers” cum grant-seeking partisans. In fact, it’s much worse than that, because even historical climate data is subject to manipulation, but that’s a topic for another day.

Key Assumptions

What follows are basic components of the climate apocalypse narrative as supported by “the science” of man-made or anthropogenic global warming (AGW):

(A) The first kind of model output to consider is the increase in atmospheric carbon concentration over time, measured in parts per million (PPM). This is a function of many natural processes, including volcanism and other kinds of outgassing from oceans and decomposing biomass, as well as absorption by carbon sinks like vegetation and various geological materials. But the primary focus is human carbon-generating activity, which depends on the carbon-intensity of production technology. As Ross McKitrick shows (see chart below), projections from these kinds of models have demonstrated significant upside bias over the years. Whether that is because of slower than expected economic growth, unexpected technological efficiencies, an increase in the service-orientation of economic activity worldwide, or feedback from carbon-induced greening or other processes, most of the models have over-predicted atmospheric carbon PPM. Those errors tend to increase with the passage of time, of course.

(B) Most of the models promoted by climate alarmists are carbon forcing models, meaning that carbon emissions are the primary driver of global temperatures and other phenomena like storm strength and increases in sea level. With increases in carbon concentration predicted by the models in (A) above, the next stage of models predicts that temperatures must rise. But the models tend to run “hot.” This chart shows the mean of several prominent global temperature series contrasted with 1990 projections from the Intergovernmental Panel on Climate Change (IPCC).

The following is even more revealing, as it shows the dispersion of various model runs relative to three different global temperature series:

And here’s another, which is a more “stylized” view, showing ranges of predictions. The gaps show errors of fairly large magnitude relative to the mean trend of actual temperatures of 0.11 degrees Celsius per decade.

(C) Climate sensitivity to “radiative forcing” is a key assumption underlying all of the forecasts of AGW. A simple explanation: increases in the atmosphere’s carbon concentration strengthen the greenhouse effect, so more solar energy is “trapped” within our “greenhouse” and less is radiated back into space. Climate sensitivity is usually measured in degrees Celsius relative to a doubling of atmospheric carbon.

And how large is the climate’s sensitivity to a doubling of carbon PPM? The IPCC says it’s in a range of 1.5C to 4.5C. However, findings published by Nic Lewis and Judith Curry are close to the low end of that range, as are those found by the author of the paper described here.
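For readers who want the arithmetic: because radiative forcing is roughly logarithmic in CO2 concentration, the warming implied by any given sensitivity can be sketched with the textbook approximation below. This is a back-of-the-envelope formula, not the output of any particular climate model:

```python
import math

def equilibrium_warming(sensitivity: float, ppm: float, ppm_base: float = 280.0) -> float:
    """Warming in degrees C when CO2 rises from ppm_base to ppm, given a
    climate sensitivity in degrees C per doubling (forcing ~ log of concentration)."""
    return sensitivity * math.log(ppm / ppm_base) / math.log(2.0)

# Warming at a full doubling (560 ppm from a ~280 ppm preindustrial base),
# across the IPCC's 1.5C-4.5C sensitivity range:
for s in (1.5, 3.0, 4.5):
    print(f"sensitivity {s}C per doubling -> {equilibrium_warming(s, 560.0):.1f}C")
```

At a full doubling, warming equals the sensitivity by definition; the formula’s real use is showing how intermediate concentrations scale, and how much rides on which end of the sensitivity range turns out to be right.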

In separate efforts, Finnish and Japanese researchers have asserted that the primary cause of recent warming is an increase in low cloud cover, which the Japanese team attributes to increases in the Earth’s bombardment by cosmic rays due to a weakening magnetic field. The Finnish authors note that most of the models used by the climate establishment ignore cloud formation, an omission they believe leads to a massive overstatement (10x) of sensitivity to carbon forcings. Furthermore, they assert that carbon forcings are mainly attributable to ocean discharge as opposed to human activity.

(D) Estimates of the Social Cost of Carbon (SCC) per ton of emissions are used as a rationale for carbon abatement efforts. The SCC was pioneered by economist William Nordhaus in the 1990s, and today there are a number of prominent models that produce distributions of possible SCC values, which tend to have high dispersion and extremely long upper tails. Of course, the highest estimates are driven by the same assumptions about extreme climate sensitivities discussed above. The Biden Administration is using an SCC of $51 per ton. Some recommend the adoption of even higher values for regulatory purposes in order to achieve net-zero emissions at an early date, revealing the manipulative purposes to which the SCC concept is put. This is a raw attempt to usurp economic power, not any sort of exercise in optimization, as this admission from a “climate expert” shows. In the midst of a barrage of false climate propaganda (hurricanes! wildfires!), he tells 60 Minutes that an acceptable limit on warming of 1.5C is just a number they “chose” as a “tipping point.”
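To see why SCC estimates are so malleable, consider a toy present-value calculation. The damage flow, discount rates, and horizon below are purely illustrative, not taken from Nordhaus or any actual integrated assessment model:

```python
def toy_scc(annual_damage: float, discount_rate: float, horizon_years: int = 300) -> float:
    """Toy social cost of carbon: present value of a constant annual damage
    flow attributed to one ton of emissions. All inputs are illustrative."""
    return sum(annual_damage / (1.0 + discount_rate) ** t
               for t in range(1, horizon_years + 1))

# The same $1/year damage flow produces very different SCCs depending on
# the discount rate -- one reason published estimates have such long tails.
print(round(toy_scc(1.0, 0.07)))  # ~14
print(round(toy_scc(1.0, 0.02)))  # ~50
```

The climate sensitivity assumption enters through the damage flow, and the discount rate choice then multiplies the effect, so small changes in either can move the SCC severalfold.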

As a measurement exercise, more realistic climate sensitivities yield much lower SCCs. McKitrick presents a chart from Lewis-Curry comparing their estimates of the SCC at lower climate sensitivities to an average of earlier estimates used by IPCC:

High levels of the SCC are used as a rationale for high-cost carbon abatement efforts. If the SCC is overstated, however, then costly abatements represent waste. And there is no guarantee that spending an amount on abatements equal to the SCC will eliminate the presumed cost of a ton’s worth of anthropogenic warming. Again, there are strong reasons to believe that the warming experienced over the past several decades has had multiple causes, and human carbon emissions might have played a relatively minor role.

Crisis Is King

Some people just aren’t happy unless they have a crisis over which to harangue the rest of us. But try as they might, the vast resources they dedicate to carbon reduction are largely wasted. I hesitate to say their effort is quixotic because they want more windmills and are completely lacking in gallantry. As McKitrick notes, it takes many years for abatement to have a meaningful impact on carbon concentrations, and since emissions mix globally, unilateral efforts are practically worthless. Worse yet, the resource costs of abatement and lost economic growth are unacceptable, especially when some of the most promising alternative sources of “clean” energy are dismissed by activists. So we forego economic growth, rush to adopt immature energy alternatives, and make very little progress toward the stated goals of the climate alarmists.

Myth Makers in Lab Coats

02 Friday Apr 2021

Posted by Nuetzel in Climate science, Research Bias, Science

≈ Leave a comment

Tags

Cambridge, Canonization Effect, Citation Bias, Climate Change, Climatology, Lee Jussim, Medical Science, Model Calibration, National Oceanic and Atmospheric Administration, Pandemic, Political Bias, Psychology Today, Publication Bias, Replication Crisis, Reporting Bias, Spin

The prestige of some elements of the science community has taken a beating during the pandemic due to hugely erroneous predictions, contradictory pronouncements, and misplaced confidence in interventions that have proven futile. We know that medical science has suffered from a replication crisis, and other areas of inquiry like climate science have been compromised by politicization. So it seemed timely when a friend sent me this brief exposition of how “scientific myths” are sometimes created, authored by Lee Jussim in Psychology Today. It’s a real indictment of the publication process in scientific journals, and one can well imagine the impact these biases have on journalists, who themselves are prone to exaggeration in their efforts to produce “hot” stories.

The graphic above appears in Jussim’s article, taken from a Cambridge study of reporting and citation biases in research on treatments for depression. But as Jussim asserts, the biases at play here are not “remotely restricted to antidepressant research”.

The first column of dots represents trial results submitted to journals for publication. A green dot signifies a positive result: that the treatment or intervention was associated with significantly improved patient outcomes. The red dots are trials in which the results were either inconclusive or the treatment was associated with detrimental outcomes. The trials were split about equally between positive and non-positive findings, but far fewer of the trials with non-positive findings were published. From the study:

“While all but one of the positive trials (98%) were published, only 25 (48%) of the negative trials were published. Hence, 77 trials were published, of which 25 (32%) were negative.”

The third column shows that even within the set of published trials, certain negative results were NOT reported or secondary outcomes were elevated to primary emphasis:

“Ten negative trials, however, became ‘positive’ in the published literature, by omitting unfavorable outcomes or switching the status of the primary and secondary outcomes.”

The authors went further by classifying whether the published narrative put a “positive spin” on inconclusive or negative results (yellow dots):

“… only four (5%) of 77 published trials unambiguously reported that the treatment was not more effective than placebo in that particular trial.”

Finally, the last column represents citations of the published trials in subsequent research, where the size of the dots corresponds to different levels of citation:

“Compounding the problem, positive trials were cited three times as frequently as negative trials (92 v. 32 citations). … Altogether, these results show that the effects of different biases accumulate to hide non-significant results from view.”
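The funnel described in the quoted passages can be reconstructed from the reported percentages. The trial counts below are inferred from those percentages rather than copied from the study, so treat them as approximate:

```python
# Trial counts inferred from the percentages quoted above (approximate):
positive_submitted, negative_submitted = 53, 52  # roughly even split
positive_published = 52   # "all but one" positive trial, i.e. ~98%
negative_published = 25   # 48% of the negative trials

published = positive_published + negative_published
print(published)                                    # 77 trials published
print(round(100 * negative_published / published))  # 32% of them negative
print(92 / 32)  # positive trials cited ~3x as often (92 v. 32 citations)
```

Each stage of the funnel removes mostly negative results, which is exactly how the accumulation of biases hides non-significant findings from view.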

As Jussim concludes, it’s safe to say these biases are not confined to antidepressant research. He also writes of the “canonization effect”, which occurs when certain conclusions become widely accepted by scientists:

“It is not that [the] underlying research is ‘invalid.’ It is that [the] full scope of findings is mixed, but that the mixed nature of those findings does not make it into what gets canonized.”

I would say canonization applies more broadly across areas of research. For example, in climate research, empirics often take a back seat to theoretical models “calibrated” over short historical records. The theoretical models often incorporate “canonized” climate change doctrine which, on climatological timescales, can only be classified as speculative. Of course, the media and the public have difficulty distinguishing this practice from real empirics.

All this is compounded by the institutional biases introduced by the grant-making process, the politicization of certain areas of science (another source of publication bias), and mission creep within government bureaucracies. In fact, some of these agencies control the very data upon which much research is based (the National Oceanic and Atmospheric Administration, for example), and there is credible evidence that this information has been systematically distorted over time.

The authors of the Cambridge study discuss efforts to mitigate the biases in published research. Unfortunately, reforms have met with mixed success at best. The antidepressant research reflects tendencies that are all too human and perhaps financially motivated. Add to that the political motivation underlying the conduct of broad areas of research, and the dimensions of the problem seem almost insurmountable without a fundamental revolution of ethics within the scientific community. For now, the biases have made “follow the science” into something of a joke.

Everything’s Big In Texas Except Surge Capacity

01 Monday Mar 2021

Posted by Nuetzel in Electric Power, Price Mechanism, Shortage

≈ 3 Comments

Tags

Austin Vernon, Blackouts, Climate Change, Coal Power, Solar Power, Electric Reliability Council of Texas, ERCOT, Gas Power, Green Energy, H. Sterling Burnett, Heartland Institute, Judith Curry, Lynne Kiesling, Nuclear power, Renewables, Surge Capacity, Texas, Tyler Cowen, Variable-Rate Pricing, Vernon L. Smith, Wind Power

The February cold snap left millions of Texas utility customers without power. I provide a bit of a timeline at the bottom of this post. What happened? Well, first, don’t waste your time arguing with alarmists about whether “climate change” caused the plunge in temperatures. Whether it was climate change (it wasn’t) or anything else, the power shortage had very nuts-and-bolts causes and was avoidable.

Texas has transitioned to producing a significant share of its power with renewables: primarily wind and solar, which is fine across a range of weather conditions, though almost certainly uneconomic in a strict sense. The problem in February was that the state lacks adequate capacity to meet surges under extreme weather conditions. But it wasn’t just that the demand for power surged during the cold snap: renewables were not able to maintain output due to frozen wind turbines and snow-covered solar panels, and even some of the gas- and coal-fired generators had mechanical issues. The reliability problem is typical of many renewables, however, which is why counting on them to provide base loads is extremely risky.

Judith Curry’s web site featured an informative article by a planning engineer this week: “Assigning Blame for the Blackouts in Texas”. The Electric Reliability Council of Texas (ERCOT) is the independent, non-profit operator of the state’s electric grid, with membership that includes utilities, electric cooperatives, other sellers, and consumers. Apparently ERCOT failed to prepare for such an extreme weather event and the power demand it engendered:

“… unlike utilities under traditional models, they don’t ensure that the resources can deliver power under adverse conditions, they don’t require that generators have secured firm fuel supplies, and they don’t make sure the resources will be ready and available to operate.”

ERCOT’s emphasis on renewables was costly, draining resources that otherwise might have been used to provide an adequate level of peak capacity and winterization of existing capacity. Moreover, it was paired with a desire to keep the price of power low. ERCOT has essentially “devalued capacity”:

“Texas has stacked the deck to make wind and solar more competitive than they could be in a system that better recognizes the value of dependable resources which can supply capacity benefits. … capacity value is a real value. Ignoring that, as Texas did, comes with real perils. … In Texas now we are seeing the extreme shortages and market price spikes that can result from devaluing capacity. “

Lest there be any doubt about the reliance on renewables in Texas, the Heartland Institute’s H. Sterling Burnett notes that ERCOT data:

“… shows that five days before the first snowflake fell, wind and solar provided 58% of the electric power in Texas. But clouds formed, temperatures dropped and winds temporarily stalled, resulting in more than half the wind and solar power going offline in three days never to return during the storm, when the problems got worse and turbines froze and snow and ice covered solar panels.”

Power prices must cover the cost of meeting “normal” energy needs as well as the cost of providing for peak loads. That means investment in contracts that guarantee fuel supplies as well as peak generating units. It also means inter-connectivity to other power grids. Instead, ERCOT sought to subsidize costly renewable power in part by skimping on risk-mitigating assets.

Retail pricing can also help avert crises of this kind. Texas customers on fixed-rate plans had no incentive to conserve as temperatures fell. Consumers can be induced to lower their thermostats with variable-rate plans, and turning one down by even a degree can have a significant impact on usage under extreme conditions. The huge spike in bills for variable-rate customers during the crisis has much to do with the fact that too few customers are on these plans to begin with. Among other things, Lynne Kiesling and Vernon L. Smith discuss the use of digital devices to exchange information on scarcity with customers or their heating systems in real time, allowing quick adjustment to changing incentives. And if a customer demands a fixed-rate plan, the rate must be high enough to pay the customer’s share of the cost of peak capacity.
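The kind of device-level demand response Kiesling and Smith describe can be sketched as a simple rule: nudge the thermostat setpoint down as the real-time price climbs above its normal level. This is only an illustrative toy, not any actual utility’s demand-response protocol; the price levels, the one-degree-per-doubling rule, and the cap are all hypothetical.

```python
import math

def adjusted_setpoint(base_setpoint_f: float, price_per_kwh: float,
                      normal_price: float = 0.12,
                      max_reduction_f: float = 4.0) -> float:
    """Lower the heating setpoint as the real-time price rises above normal.

    Hypothetical rule: cut the setpoint one degree Fahrenheit for each
    doubling of the price over the normal rate, capped at max_reduction_f.
    """
    if price_per_kwh <= normal_price:
        return base_setpoint_f
    reduction = min(max_reduction_f, math.log2(price_per_kwh / normal_price))
    return base_setpoint_f - reduction

# At the normal price, no change; at 4x the price, down two degrees;
# during an extreme spike, the reduction hits the cap.
print(adjusted_setpoint(70, 0.12))  # 70
print(adjusted_setpoint(70, 0.48))  # 68.0
print(adjusted_setpoint(70, 9.00))  # 66.0
```

Even this crude rule captures the essential point: when scarcity is communicated through prices, small automatic adjustments by many customers add up to a meaningful reduction in peak load.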

Price incentives make a big difference, but there are other technological advances that might one day allow renewables to provide more reliable power, as discussed in Tyler Cowen’s post on the “energy optimism” of Austin Vernon. I find Vernon far too optimistic about the near-term prospects for battery technology. I am also skeptical of wind and solar due to drawbacks like land use and other (often ignored) environmental costs, especially given the advantages of nuclear power in providing “green energy” (if only our governments would catch on). The main thing is that sufficient capacity must be maintained to meet surges in demand under adverse conditions, and economic efficiency dictates that it is a risk against which ratepayers cannot be shielded.

Note: For context on the chart at the top of this post, temperatures in much of Texas fell on the 9th of February, and then really took a dive on the 14th before recovering on the 19th. Wind generation fell immediately, and solar power diminished a day or two later. Gas and coal helped to offset the early reductions, but it took several days for gas to ramp up. Even then there were shortages. Then, on the 16th, there were problems maintaining gas and coal generation. Gas was still carrying a higher than normal load, but not enough to meet demand.

End of Snowfalls Is Greatly Exaggerated

03 Monday Feb 2020

Posted by Nuetzel in Uncategorized

≈ Leave a comment

Tags

Baby Boomers, Climate Change, Climate models, Gen X, global warming, Millennials, NOAA, Snowfalls, The Independent, Thomas Jefferson

Snow Cover Anomaly

Everyone seems to think it snowed more in their youth than in recent years, but that’s generally incorrect, at least for late-stage baby boomers, Gen Xers, and Millennials. Gregory Wrightstone thought the same thing as he reflected on his youth in Pittsburgh, but after checking snowfall records he was surprised to find an upward trend. In “Warming and the Snows of Yesteryear“, Wrightstone says his look at the records from other areas showed similar upward trends. The chart above from NOAA shows the Northern Hemisphere has experienced mostly positive snowfall anomalies over the past 20 years. So, the truth is that snowfalls have not decreased over the last 50+ years, contrary to our fond memories of big snows in childhood. Interestingly, Thomas Jefferson thought the same thing in 1801, but I’m not sure whether he was right.

We’ve been told by climate alarmists that “snowfalls are a thing of the past” due to global warming (The Independent in March, 2000). If anything, however, snowfalls have increased, and big snowfalls still happen. As with so many climate predictions over the years, this too is a bust. Most of those predictions have relied on models fitted to a short and incomplete historical record, and the models are too poorly specified to capture the complexities of global climate trends. Don’t bet the house on them, and don’t presume to bet my house on them either, please!

Scorning the Language of the Left

12 Sunday Jan 2020

Posted by Nuetzel in Censorship, Leftism, Political Correctness

≈ Leave a comment

Tags

Abortion, Boy George, Brett Kavanaugh, Brexit, Check Your Privilege, Cisgender, Climate Change, Donald Trump, Gender, Harper's, Hate Speech, Identitarian, Israel, Lefty Lingo, LGBTQ, Lionel Shriver, Microaggession, Patriarchy, Phobic, Privilege, Progressive Speech, Pronouns, Queer, Safe Space, STFU, Sustainability

It’s hard not to ridicule some of the language adopted by our lefty friends, and it can be fun! But it’s not just them. We hear it now from employers, schools, and otherwise sensible people too eager to signal their modernity and virtue. Lionel Shriver dissects some of this “Lefty Lingo” in an entertaining piece in Harper’s. It’s funny, but it aroused my contempt for the smugness of the “wokescenti” (a term Shriver attributes to Meghan Daum) and my pity for those “normals” simply desperate to project progressive sophistication.

Here are a few of Shriver’s observations:

“Privilege”: makes you incapable of understanding that which you criticize.

“Whereas a privilege can be acquired through merit—e.g., students with good grades got to go bowling with our teacher in sixth grade—privilege, sans the article, is implicitly unearned and undeserved. The designation neatly dispossesses those so stigmatized of any credit for their achievements while discounting as immaterial those hurdles an individual with a perceived leg up might still have had to overcome (an alcoholic parent, a stutter, even poverty). For privilege is a static state into which you are born, stained by original sin. Just as you can’t earn yourself into privilege, you can’t earn yourself out of it, either. … . it’s intriguing that the P-bomb is most frequently dropped by folks of European heritage, either to convey a posturing humility (“I acknowledge my privilege”) or to demonize the Bad White People, the better to distinguish themselves as the Good White People.

Meanwhile, it isn’t clear what an admission of privilege calls you to do, aside from cower. That tired injunction ‘Check your privilege’ translates simply to ‘S.T.F.U.’—and it’s telling that ‘Shut the fuck up’ is now a sufficiently commonplace imperative to have lodged in text-speak.”

“Cisgender”: “Cis-” is a linguistic shell game whereby the typical case is labelled cis-typical.

“Denoting, say, a woman born a woman who thinks she’s a woman, this freighted neologism deliberately peculiarizes being born a sex and placidly accepting your fate, and even suggests that there’s something a bit passive and conformist about complying with the arbitrary caprices of your mother’s doctor. Moreover, unless a discussion specifically regards transgenderism, in which case we might need to distinguish the rest of the population (‘non-trans’ would do nicely), we don’t really need this word, except as a banner for how gendercool we are. It’s no more necessary than words for ‘a dog that is not a cat,’ a ‘lamppost that is not a fire hydrant,’ or ‘a table that is actually a table.’ Presumably, in order to mark entities that are what they appear to be, we could append ‘cis’ to anything and everything. ‘Cisblue’ would mean blue and not yellow. ‘Cisboring’ would mean genuinely dull, and not secretly entertaining after all.”

“Microaggression“: Anything you say that bothers them, even a little.

“… a perverse concoction, implying that the offense in question is so minuscule as to be invisible to the naked eye, yet also that it’s terribly important. The word cultivates hypersensitivity.”

“_____-phobic”: the typical use of this suffix in identity politics stands “phobia” on its head. To be fair, however, it started with a presumption that people hate that which they fear. Maybe also that they fear and hate that which they don’t care for, but we’ll just focus on fear and hate. For example, there is the notion that men have deep fears about their own sexuality. Thus, the prototypical gay-basher in film is often compensating for his own repressed homosexual longings, you see. And now, the idea is that we always fear “otherness” and probably hate it too. Both assertions are tenuous. At least those narratives are rooted in “fear”, but it’s not quite the same phenomenon as hate, and yet “phobic” seems to have been redefined as odium:

“The ubiquitous ‘transphobic,’ ‘Islamophobic,’ and ‘homophobic’ are also eccentric, in that the reprobates so branded are not really being accused of fearfulness but hatred.”

“LGBTQ“: Lumping all these “types” together can be misleading, as they do not always speak in unison on public policy. But if we must, how about “Let’s Go Back To ‘Queer'”, as Shriver suggests. The LGBs I know don’t seem to mind it as a descriptor, but maybe that’s only when they say it. Not sure about the trannies. There is a great Libertarian economist who is transsexual (Deirdre McCloskey), and somehow “queer” doesn’t seem quite right for her. Perhaps she’s just a great woman.

“The alphabet soup of ‘LGBTQ’ continues to add letters: LGBTQIAGNC, LGBTQQIP2SAA, or even LGBTIQCAPGNGFNBA. A three-year-old bashing the keyboard would produce a more functional shorthand, and we already have a simpler locution: queer.”

“Problematic”, “Troubling” and “Inappropriate”: I’m sure some of what I’ve said above is all three. I must confess I’ve used these terms myself, and they are perfectly good words. It’s just funny when the Left uses them in the following ways.

“Rare instances of left-wing understatement, ‘problematic’ and ‘troubling’ are coyly nonspecific red flags for political transgression that obviate spelling out exactly what sin has been committed (thereby eliding the argument). Similarly, the all-purpose adjectival workhorse ‘inappropriate’ presumes a shared set of social norms that in the throes of the culture wars we conspicuously lack. This euphemistic tsk-tsk projects the prim censure of a mother alarmed that her daughter’s low-cut blouse is too revealing for church. ‘Inappropriate’ is laced with disgust, while once again skipping the argument. By conceit, the appalling nature of the misbehavior at issue is glaringly obvious to everyone, so what’s wrong with it goes without saying.”

Here are a few others among my favorites:

“Patriarchy“: This serves the same function as “privilege” but is directed more specifically at the privilege enjoyed by males. Usually white, heterosexual males. It seeks to preemptively discredit any argument a male might make, and often it is used to discredit Western political and economic thought generally. That’s because so much of it was the product of the patriarchy, don’t you know! And remember, it means that males are simply incapable of understanding the plight of females … and children, let alone queers! Apparently fathers are bad, especially if they’re still straight. Mothers are good, unless they stand with the patriarchy.

“Hate Speech“: This expression contributes nothing to our understanding of speech that is not protected by the Constitution. If anything its use is intended to deny certain kinds of protected speech. Sure, originally it was targeted at such aberrations as racist or anti-gay rhetoric, assuming that always meant “hate”, but even those are protected as long as they stop short of “fighting words”. There are many kinds of opinions that now seem to qualify as “hate speech” in the eyes of the Identitarian Left, even when not truly “hateful”, such as church teachings in disapproval of homosexuality. There is also a tendency to characterize certain policy positions as “hate speech”, such as limits on immigration and opposition to “living wage” laws. Hypersensitivity, once more.

“Sustainability“: What a virtue signal! It’s now a big game to characterize whatever you do as promoting “sustainability”. But let’s get one thing straight: an activity is sustainable only if its benefits exceed its resource costs. That is the outcome sought by voluntary participants in markets, or they do not trade. Benefits and costs “estimated” by government bureaucrats without the benefit of market prices are not reliable guides to sustainability. Nor is Lefty politics a reliable guide to sustainability. Subsidies for favored activities actually undermine that goal.

There are many other Lefty catch phrases and preferred ways of speaking. We didn’t even get to “safe space”, “social justice”, and the pronoun controversy. Shriver closes with some general thoughts on the lefty lingo. I’ll close by quoting one of those points:

“The whole lexicon is of a piece. Its usage advertises that one has bought into a set menu of opinions—about race, gender, climate change, abortion, tax policy, #MeToo, Trump, Brexit, Brett Kavanaugh, probably Israel, and a great deal else. Reflexive resort to this argot therefore implies not that you think the same way as others of your political disposition but that you don’t think. You have ordered the prix fixe; you’re not in the kitchen cooking dinner for yourself.”


Amazon Fire Fraud

03 Tuesday Sep 2019

Posted by Nuetzel in Forest Fires, Global Greening

≈ Leave a comment

Tags

Al Gore, Alexander Hammond, Amazon Basin, Amazon Fires, Brazil, Climate Change, Coyote Blog, Deforestation, FEE, Global Fire Emissions Database, Jair Bolsonaro, Leonardo DiCaprio, Michael Shellenberger, P.T. Barnum, Reforestation, Sugar Cane, U.S. Ethanol Mandates, Warren Meyer

Leftist activists recently pounced on another opportunity to mischaracterize events, this time in the Amazon Basin, where recent fires were held to be unprecedented. The fires were also characterized as evidence of a massive conspiracy between capitalists and the new government of President Jair Bolsonaro to open the rain forest to commercial exploitation. Warren Meyer squares away the facts at this Coyote Blog post, which is where I found the chart above. Forest clearing in Brazil has been much lower over the past 10 years than during the period 1988-2008. It stepped up somewhat during the first half of 2019, but it still ran at a rate well below 2008. A key reason for the increase is fascinating, but I’ll merely tease that for now.

As for the fires, Meyer provides the following quote about this year’s fires from NASA in a statement accompanying a satellite photo:

“As of August 16, 2019, satellite observations indicated that total fire activity in the Amazon basin was slightly below average in comparison to the past 15 years. Though activity has been above average in Amazonas and to a lesser extent in Rondônia, it has been below average in Mato Grosso and Pará, according to the Global Fire Emissions Database.“

So what was the cause of all the alarm? Meyer points to an August 22 story in the Washington Post, though WaPo might not have been the first. The article was either poorly researched and “fact checked” or it was a deliberate attempt to raise alarm. The sloppy story was picked up elsewhere, of course, and distorted memes spread on social media condemning the Brazilian government and capitalism generally.

The burning that is taking place has been started by farmers preparing land for crops, a process that occurs every year. Meyer quotes the New York Times on this point, which noted that very little of the burning was taking place in old-growth forests.

What’s really ironic and crazy about all this is that U.S. environmental policy is responsible for some of the burning that is taking place in the Amazon. Meyer notes that U.S. ethanol mandates have subsidized a years-long trend of increased sugar cane production in the Amazon Basin. Of course, burning is a regular part of the normal sugar cane harvest. Moreover, that production has contributed to land clearance, offsetting some of the forces that have brought the rate of deforestation in Brazil down overall.

The whole episode dovetails with the ongoing narrative that fires are burning out of control across the globe due to climate change. We heard similar propaganda last year after several large fires in California. Michael Shellenberger does his best to set the record straight, demonstrating that the annual land area burned worldwide has declined by 25% since 2003. He contrasts that record with the hopelessly errant reporting by major media organizations.

As P.T. Barnum once said, a sucker’s born every minute. He might as well have been talking about the armies of well-meaning but gullible greenies who fall for every scare story told by the likes of Al Gore and Leonardo DiCaprio. And scare stories are exactly what these tales of a global conflagration amount to. Meanwhile, as Alexander Hammond explains, global reforestation has taken hold. In what is apparently a paradox to some, this is largely the result of economic growth. Hammond discusses the logical connections between economic development and environmental goods, including reforestation and biodiversity. The bottom line is that the best policies for reforestation are not those imposing obstacles to growth, as the environmental Left would have it. Rather, it is policies that promote development and income growth, which are generally more compatible with individual liberty, that will encourage growth in the world’s forests.

A Carbon Tax Would Be Fine, If Only …

01 Friday Mar 2019

Posted by Nuetzel in Environment, Global Warming, Taxes

≈ Leave a comment

Tags

A.C. Pigou, Carbon Dividend, Carbon Tax, Climate Change, Economic Development, External Cost, Fossil fuels, Green New Deal, IPCC, John Cochrane, Michael Shellenberger, Pigouvian Tax, Quillette, Renewable energy, Revenue Neutrality, Robert P. Murphy, Social Cost of Carbon, Warren Meyer, William D. Nordhaus

I’ve opposed carbon taxes on several grounds, but I admit that it might well be less costly as a substitute for the present mess that is U.S. climate policy. Today, we incur enormous costs from a morass of energy regulations and mandates, prohibitions on development of zero-carbon nuclear power, and subsidies to politically-connected industrialists investing in corn ethanol, electric cars, and land- and wildlife-devouring wind and solar farms. (For more on these costly and ineffective efforts, see Michael Shellenberger’s “Why Renewables Can’t Save the Planet” in Quillette.) Incidentally, the so-called Green New Deal calls for a complete conversion to renewables in unrealistically short order, but with very little emphasis on a carbon tax.

The Carbon Tax

Many economists support the carbon tax precisely because it’s viewed as an attractive substitute for many other costly policies. Some support using revenue from the tax to pay a flat rebate or “carbon dividend” to everyone each year (essentially a universal basic income). Others have pitched the tax as a revenue-neutral replacement for other taxes that are damaging to economic growth, such as payroll taxes or taxes on capital. Economic growth would improve under the carbon tax, or so the story goes, because the carbon tax is a tax on a “bad”, as opposed to taxes on “good” factors of production. I view these ideas as politically naive. If we ever get the tax, we’ll be lucky to get much regulatory relief in the bargain, and the revenue is not likely to be offset by reductions in other taxes.

But let’s look a little closer at the concept of the carbon tax, and I beg my climate-skeptic friends to stick with me for a few moments and keep a straight face. The tax is a way to attach an explicit price to the use of fuels that create carbon emissions. The emissions are said to inflict social or external costs on other parties, costs which are otherwise ignored by consumers and businesses in their many decisions involving energy use. The carbon tax is a so-called Pigouvian tax: a way to “internalize the externality” by making fossil fuels more expensive to burn. The tax itself involves no prohibitions on behavior of any kind. Certain behaviors are taxed to encourage more “desirable” behavior.
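The mechanics of “internalizing the externality” can be illustrated with a stylized linear market. The numbers below are purely hypothetical: a linear demand curve, a linear private marginal cost (supply) curve, and a constant per-unit external cost. The point is that a Pigouvian tax set equal to the external cost moves the market from the untaxed quantity to the socially optimal one, with no prohibitions on behavior.

```python
def equilibrium_quantity(demand_intercept, demand_slope,
                         supply_intercept, supply_slope, tax=0.0):
    """Solve P_demand(q) = P_supply(q) + tax for a linear market.

    demand: P = demand_intercept - demand_slope * q
    supply: P = supply_intercept + supply_slope * q  (private marginal cost)
    """
    return (demand_intercept - supply_intercept - tax) / (demand_slope + supply_slope)

EXTERNAL_COST = 10.0  # hypothetical per-unit external cost of emissions

# The untaxed market ignores the externality and overproduces...
q_market = equilibrium_quantity(100, 1.0, 20, 1.0)                      # 40.0
# ...the social optimum equates demand with the full social marginal cost...
q_optimal = equilibrium_quantity(100, 1.0, 20 + EXTERNAL_COST, 1.0)     # 35.0
# ...and a Pigouvian tax equal to the external cost reproduces that optimum.
q_taxed = equilibrium_quantity(100, 1.0, 20, 1.0, tax=EXTERNAL_COST)    # 35.0
```

Note that the entire exercise hinges on knowing the external cost; as argued below, that is precisely where the scheme runs into trouble in practice.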

Setting the Tax

But what is the appropriate level of the tax? At what level will it approximate the true “social cost of carbon”? Any departure from that cost would be sub-optimal. Robert P. Murphy contrasts William D. Nordhaus’ optimal carbon tax with more radical levels, which Nordhaus believes would be needed to meet the goals of the United Nation’s Intergovernmental Panel on Climate Change (IPCC). Nordhaus won the 2018 Nobel Prize in economics for his work on climate change. Whatever one might think of the real risks of climate change, Nordhaus clearly recognizes the economic downsides of mitigating against those risks.

Nordhaus has estimated that the social cost of carbon will be $44/ton in 2025 (about $0.39 per gallon of gas). He claims that a carbon tax at that level would limit increases in global temperature to 3.5º Celsius by 2100. He purports to show that the costs of a $44 carbon tax in terms of reduced economic output would be balanced by the gains from limiting climate warming. Less warming would require a higher tax with fewer incremental rewards, and even more incremental lost output. The costs of the tax would then outweigh benefits. For perspective, according to Nordhaus, a stricter limit of 2.5º C implies a carbon tax equivalent to $2.50 per gallon of gas. The IPCC, however, prescribes an even more radical limit of 1.5º C. That would inflict a huge cost on humanity far outweighing the potential benefits of less warming.
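The per-gallon equivalents quoted above follow from simple arithmetic: burning a gallon of gasoline emits roughly 8.9 kg of CO2 (the EPA’s commonly cited figure), so a tax stated in dollars per metric ton translates directly into cents per gallon. A quick check of the numbers:

```python
KG_CO2_PER_GALLON = 8.887  # EPA's standard estimate for a gallon of gasoline

def tax_per_gallon(tax_per_metric_ton: float) -> float:
    """Convert a carbon tax in $/metric ton of CO2 to $/gallon of gasoline."""
    return tax_per_metric_ton * KG_CO2_PER_GALLON / 1000.0

def tax_per_ton(per_gallon: float) -> float:
    """Invert the conversion: the $/ton tax implied by a per-gallon surcharge."""
    return per_gallon * 1000.0 / KG_CO2_PER_GALLON

print(round(tax_per_gallon(44.0), 2))  # 0.39 -- Nordhaus's $44/ton figure
print(round(tax_per_ton(2.50)))        # ~281 -- the $/ton implied by $2.50/gallon
```

So the stricter 2.5º C limit implies a carbon tax on the order of $280 per ton, more than six times Nordhaus’s estimate of the social cost of carbon.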

A Carbon Tax, If…

Many economists have come down in favor of a carbon tax under certain qualifications: revenue-neutrality, a “carbon dividend”, or as a pre-condition to deregulation of carbon sources and de-subsidization of alternatives. John Cochrane discusses a carbon tax in the context of the “Economists’ Statement on Carbon Dividends” (Cochrane’s more recent thoughts are here):

“It’s short, sweet, and signed by, as far as I can tell, every living CEA chair, every living Fed Chair, both Democrat and Republican, and most of the living Nobel Prize winners. … It offers four principles 1. A carbon tax, initially $40 per ton. 2. The carbon tax substitutes for regulations and subsidies and (my words) the vast crony-capitalist green boondoggle swamp, which is chewing up money and not saving carbon. 3. Border adjustment like VAT have [sic] 4. ‘All the revenue should be returned directly to U.S. citizens through equal lump-sum rebates.'”

Rather than a carbon dividend, Warren Meyer proposes that a carbon tax be accompanied by a reduction in the payroll tax, an elimination of all subsidies, mandates, and prohibitions, development of more nuclear power-generating capacity, and contributions to a cleanup of Chinese and Asian coal-power generation. That’s a lot of stuff, and I think it exceeds Meyer’s normal realism with respect to policy issues.

My Opposition

Again, I oppose the adoption of a carbon tax for several reasons, despite my sympathy for the logic of Pigouvian taxation of externalities. At the risk of repeating myself, here I elaborate on my reasons for opposition:

Government Guesswork: First, Nordhaus’ estimates notwithstanding, we do not and cannot know the climate/economic tradeoffs with any precision. We can barely measure global climate, and the history of what measures we have is short and heavily manipulated. Models purporting to show the relationship between carbon forcing and global climate change are notoriously unreliable. So even if we can agree on the goal (1.5º, 2.5º, 3.5º), and we won’t, the government will get the tradeoffs wrong. I took the following from a comment on Cochrane’s blog, a quote from A.C. Pigou himself:

“It is not sufficient to contrast the imperfect adjustments of unfettered enterprise with the best adjustment that economists in their studies can imagine. For we cannot expect that any State authority will attain, or even wholeheartedly seek, that ideal. Such authorities are liable alike to ignorance, to sectional pressure, and to personal corruption by private interest. A loud-voiced part of their constituents, if organized for votes, may easily outweigh the whole.”

Political Hazards: Second, we won’t get the hoped-for political horse trade made explicit in the “Economists’ Statement …” discussed above. As a political matter, the setting of the carbon tax rate will almost assuredly get us a rate that’s too high. Experiences with carbon taxes in Australia, British Columbia, and France have been terrible thus far, sowing widespread dissatisfaction with the resultant escalation of energy prices.

Economic Growth: Neither is it a foregone conclusion that a revenue-neutral carbon tax will stimulate economic growth, and it might actually reduce output. As Robert P. Murphy explains in another post, the outcome depends on the structure of taxes prior to the change. The substitution of the carbon tax will increase output only if it replaces taxes on a factor of production (labor or capital) that is overtaxed prior to the change. That undermines a key selling point: that the carbon tax would necessarily produce a “double dividend”: a reduction in carbon emissions and higher economic growth. Nevertheless, I’d allow that revenue neutrality combined with elimination of carbon regulation and “green” subsidies would be a good bet from an economic growth perspective.

Overstated Risks: Finally, I oppose carbon taxes because I’m unconvinced that the risk and danger of global warming are as great as even Nordhaus would have it. In other words, the external costs of carbon don’t amount to much. Our recorded temperature history is extremely short and is therefore not a reliable guide to the long-term nature of the systemic relationships at issue. Even worse, temperature records are manipulated to exaggerate the trend in temperatures (also see here, here and here). There is no evidence of an uptrend in severe weather events, and the dangers of sea level rise associated with increasing carbon concentrations also have been greatly exaggerated. Really, at some point one must take notice of the number of alarming predictions and doomsday headlines from the past that have not been borne out even remotely. Furthermore, higher carbon concentrations and even warming itself would be of some benefit to humanity. In addition to a greener environment, the benefits include more rapid economic growth, improved agricultural yields, and a reduction in the salient danger of cold-weather deaths.

Economic Development: The use of fossil fuels has helped to enable strong growth in incomes in developed economies. It has also given us energy alternatives such as nuclear power as well as research into other alternatives, albeit with very mixed success thus far. And while a carbon tax would create an additional incentive to develop such alternatives, a U.S. tax would not accomplish much if any global temperature reduction. Such a tax would have to be applied on a global scale. Talk about a political long-shot! Increasing the price of carbon emissions also has enormous downsides for the less developed world. These fragile economies would benefit greatly from development of fossil fuel energy, enabling reductions in poverty and the income growth necessary to someday join in the prosperity of the developed economies. This, along with liberalization of markets, is the affordable way to bring economic success to these countries, which in turn will enable them to consider the energy alternatives that might come to fruition by that time. Fighting the war on fossil fuels in the underdeveloped world is nothing if not cruel.


The Disastrous Boomerang Effect of Fire Suppression

15 Thursday Nov 2018

Posted by Nuetzel in Environment, Wildfires

≈ Leave a comment

Tags

Biomass Harvesting, Camp Fire, Climate Change, Donald Trump, Fire Suppression, Forest Fires, Forest Management, George E. Gruell, PG&E, Prescribed Burns, Sierra Nevada, Spontaneous Combustion, Timber Harvest, U.S. Forest Service, Warren Meyer, Wildfires

We can lament the tragic forest fires burning in California, but a discussion of contributing hazards and causes is urgent if we are to minimize future conflagrations. The Left points the finger at climate change. Donald Trump, along with many forestry experts, points to forest mismanagement. Whether you believe in climate change or not, Trump is correct on this point. However, he blames the state of California when in fact a good deal of the responsibility falls on the federal government. And as usual, Trump has inflamed passions with unnecessarily aggressive rhetoric and threats:

“There is no reason for these massive, deadly and costly forest fires in California except that forest management is so poor. Billions of dollars are given each year, with so many lives lost, all because of gross mismanagement of the forests. Remedy now or no more Fed payments.”

Trump was condemned for his tone, of course, but also for the mere temerity to discuss the relationship between policy and fire hazards at such a tragic moment. Apparently, it’s a fine time to allege causes that conform to the accepted wisdom of the environmental Left, but misguided forest management strategy is off-limits.

The image at the top of this post is from the cover of a book by wildlife biologist George E. Gruell, published in 2001. The author includes hundreds of historical photos of forests in the Sierra Nevada range from as early as 1849. He pairs them with photos of the same views in the late 20th century, such as the photo inset on the cover shown above. The remarkable thing is that the old forests were quite thin by comparison. The following quote is from a review of the book on Amazon:

“Even the famed floor of Yosemite is now mostly forested with conifers. I myself love conifers but George makes an interesting point that these forests are “man made” and in many ways are unhealthy from the standpoint that they lead to canopy firestorms that normally don’t exsist when fires are allowed to naturally burn themselves out. Fire ecology is important and our fear of forest fires has led to an ever worsening situation in the Sierra Nevada.”

I posted this piece on forest fires and climate change three months ago. There is ample reason to attribute the recent magnitude of wildfires to conditions influenced by forest management policy. The contribution of a relatively modest change in average temperatures over the past several decades (but primarily during the 1990s) is rather doubtful. And the evidence that warming-induced drought is the real problem is weakened considerably by the fact that the 20th century was wetter than normal in California. In other words, recent dry conditions represent something of a return to normal, making today’s policy-induced overgrowth untenable.

Wildfires are a natural phenomenon and have occurred historically from various causes such as lightning strikes and even spontaneous combustion of dry biomass. They are also caused by human activity, both accidental and intentional. In centuries past, Native Americans used so-called controlled or prescribed burns to preserve and restore grazing areas used by game. In the late 19th and early 20th centuries, fire suppression became official U.S. policy, leading to an unhealthy accumulation of overgrowth and debris in American forests over several decades. This trend, combined with a hot, dry spell in the 1930s, led to sprawling wildfires. However, Warren Meyer says the data on burnt acreage during that era was exaggerated because the U.S. Forest Service insisted on counting acres burned by prescribed burns in states that did not follow its guidance against the practice.

The total acreage burned by wildfires in the U.S. was minimal from the late 1950s to the end of the century, when a modest uptrend began. In California, while the number of fires continued to decline over the past 30 years, the trend in burnt acreage has been slightly positive. Certainly this year’s mega-fires will reinforce that trend. So the state is experiencing fewer but larger fires.

The prior success in containing fires was due in part to active logging and other good forest management policies, including prescribed burns. However, the timber harvest declined through most of this period under federal fire suppression policies, California state policies that increased harvesting fees, and pressure from environmentalists. The last link shows that the annual “fuel removed” from forests in the state has declined by 80% since the 1950s. But attitudes could be changing, as both the state government and environmentalists (WSJ, link could be gated) are beginning to praise biomass harvesting as a way to reduce wildfire risk. Well, yes!

The only reason wildfire control ever became a priority is the presence of people, and of human infrastructure, in forest lands. Otherwise, the fires would burn as they always have. Needless to say, homes and communities surrounded by overgrown forests are at great risk. In fact, it’s been reported that the massive Camp Fire in Northern California was caused by a PG&E power line. If so, it’s possible that the existing right-of-way was not properly maintained by PG&E, but it may also be that rights-of-way are of insufficient width to prevent electrical sparks from blowing into adjacent forests, an especially dangerous situation when those forests are overgrown.

Apparently Donald Trump is under the impression that state policies are largely responsible for overgrown and debris-choked forests. In fact, both federal and state environmental regulations have played a major role in discouraging timber harvesting and prescribed burns. After all, the federal government owns about 57% of the forested land in California. Much of the rest is owned privately or is tribal land. Trump’s threat to withhold federal dollars was his way of attempting to influence state policy, but the vast bulk of federal funds devoted to forest management is dedicated to national forests. A relatively small share subsidizes state and community efforts. Disaster-related funding is and should be a separate matter, but Trump made the unfortunate suggestion that those funds are at issue. Nevertheless, he was correct to identify the tremendous fire hazard posed by overgrown forests and excessive debris on the forest floor. Changes to both federal and state policy must address these conditions.

For additional reading, I found this article to give a balanced treatment of the issues.

Climate Change, Hurricanes and Noisy Statistics

22 Friday Sep 2017

Posted by Nuetzel in Global Warming

≈ Leave a comment

Tags

AGW, Atlantic Multi-Decadal Oscillation, Climate Change, Cool the Past, East Anglia University, El Nino, Fabius Maximus, global warming, Hurricane Harvey, Hurricane Irma, Hurricane Maria, Michael Mann, NOAA, Roger Pielke Sr, Roy Spencer, Ryan Maue, Sea Surface Temperatures, Signal-to-Noise, Statistical Noise, Storm Intensity, Watt's Up With That?

[Chart: 24-month cumulative global tropical cyclone energy, compiled by Ryan Maue]

The nasty spate of hurricanes this year has been a catch-up of sorts following a decade of subdued activity. In fact, global hurricane activity has been flat to declining in frequency since 1970. Until the recent increase, hurricane activity had been trending down in terms of 24-month cumulative energy since the 1990s, as the chart above shows. The historical data on the number of U.S. landfalls extends back to 1900, and it has had a negative trend as well. Nevertheless, we hear from climate alarmists that Hurricanes Harvey and Irma, which ended a drought of record length in U.S. hurricane landfalls, and now presumably Maria, were a consequence of anthropogenic global warming (AGW), er… climate change.

The implication is that increases in the atmospheric concentration of CO2 led to these hurricanes or their high intensity. Apparently, the paucity of hurricane activity over the previous ten years can be waved off as a fluke. A further implication of the alarmist view is that the longer negative trends in hurricane frequency and energy can be ignored in the context of any relation to CO2 concentration. But how so? One confounding factor I’ve seen mentioned blames El Nino warming in the Pacific, and a consequent increase in Atlantic wind shear, for the long lull in activity after 2005. That has a ring of plausibility, but a closer look reveals that actual El Nino activity during those years was hardly impressive, with the exception of 2015-16.

More historical data can be seen in the charts on the tropical cyclone page on the Watts Up With That? blog. (The charts in question start about two-thirds of the way down the page.) Hurricane expert Ryan Maue compiled a number of these charts, including the one above. He authored an editorial in the Wall Street Journal this week bemoaning the climate-change hype surrounding Harvey and Irma (if the link doesn’t work, it is available at the WSJ’s Opinion page on Facebook, posted on 9/17). Maue believes that both the climate science community and the media share in the blame for that hype. But he also says the following:

“Although a clear scientific consensus has emerged over the past decade that climate change influences hurricanes in the long run, its effect upon any individual storm is unclear.“

Maue provides a link to this NOAA web site offering cautious support for the proposition that there is a link between global warming and hurricane intensity, though the data it cites ends about ten years ago, so it does not capture the recent lull. Also, some of the information it provides is based on modeled global temperatures and hurricane activity through 2100. As is well-known by now, or should be, long-term climate forecasts based on carbon forcings are notoriously inaccurate, and NOAA admits that the association between those predicted temperatures and future hurricanes is tenuous:

“It is premature to conclude that human activities–and particularly greenhouse gas emissions that cause global warming–have already had a detectable impact on Atlantic hurricane or global tropical cyclone activity.“

Perhaps the idea that there is consensus regarding the relationship between climate change and hurricanes is more of a stretch than Maue and NOAA let on. Here is a summary of 30 peer-reviewed studies showing no connection to either hurricane frequency or intensity. Most of these studies are more recent than the end of the data record cited by NOAA. And in fact, many of these studies find support for a negative link between global temperatures and hurricane activity.

One of the prominent alarmists in the climate research community is Penn State’s Michael Mann, who has famously claimed that hurricanes are more frequent now than at any time in the past 1,000 years. He based his conclusions on highly speculative hurricane “proxies” identified in layers of sediment. Mann’s claims and research technique have been called into question by other climate scientists, who have arrived at contrary results in their own research. Lest anyone forget, Mann was implicated in a data manipulation fraud related to the East Anglia climate scandal. Though cleared by a group of tenured professors at his own university, a number of climate scientists believe Mann violated scientific standards.

The claim that global warming will cause hurricanes to become increasingly intense relies on elevated sea surface temperatures. This year, temperatures in the Gulf of Mexico are elevated and are said to have had a role in strengthening Harvey as it approached the Gulf Coast. Texas, however, has experienced as many landfalls of major hurricanes with cooler Gulf waters as with warmer waters. And Irma strengthened in a part of the Atlantic without such warm temperatures. Instead, minimal wind shear was implicated as a factor contributing to Irma’s strength.

In general, Atlantic temperatures have been relatively warm since the late 1990s, a fact that most scientists would at least partially attribute to the “Atlantic multi-decadal oscillation“, a regular cycle in water temperatures that repeats with a period of multiple decades. Potentially adding to that temperature increase is a controversial change in NOAA’s calibration of sea surface temperatures, as an increasing share of those readings are taken from buoys rather than ship-board measurement. There is some suspicion that NOAA’s adjustments “cool the past” more than is justified, a suspicion that was heightened by allegations from one whistle-blowing NOAA scientist early this year. Then, there is the contention that the sea surface temperature makes little difference if it is matched by an increase in air temperature.

Overall, NOAA says the combination of frequency and intensity of tropical cyclones will increase by 2%-11% over the rest of this century. As Roy Spencer notes, that is not a terribly alarming figure given the risks people have always willingly accepted by living in coastal areas. In any case, the range is based on models of climate behavior that are of questionable reliability. And like past temperature predictions produced by carbon-forcing climate models, it is likely to be a gross overestimate. Here is Roger Pielke, Sr., who is quoted in this wide-ranging post on hurricanes and climate at the Fabius Maximus web site:

“Model projections of hurricane frequency and intensity are based on climate models. However, none have shown skill at predicting past (as hindcasts) variations in hurricane activity (or long term change in their behavior) over years, decades, and longer periods. Thus, their claim of how they will change in the future remains, at most, a hypothesis (i.e. speculation). When NOAA, IPCC and others communicate to the media and public, to be scientifically honest, they should mention this.”

Despite the spike in activity this year, strong hurricanes are intermittent and fairly rare. Establishing reliable statistical connections with other forces is difficult with emergent events like hurricanes. Moreover, the degree of error in measuring global or regional temperature itself is much larger than is generally acknowledged, and the global warming “signal” is very weak. As we say in the statistical analysis business, noisy data are compatible with diverse hypotheses. The relationship between hurricanes and climate change is a prime example.
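The point about noisy data is easy to demonstrate. The sketch below is a hypothetical illustration (not the author’s analysis, and the numbers are invented for the purpose): it simulates 50 years of annual storm counts drawn around a baseline of six storms per year with a weak upward drift, then fits an ordinary least-squares trend. With counts this variable, the estimated slope typically carries a standard error large enough that the trend cannot be distinguished from zero, which is exactly the sense in which noisy data are compatible with diverse hypotheses.

```python
import numpy as np

rng = np.random.default_rng(0)

years = np.arange(50)           # 50 years of hypothetical annual storm counts
true_trend = 0.02               # assumed weak upward drift (storms/year/year)
lam = 6.0 + true_trend * years  # mean count per year
counts = rng.poisson(lam)       # noisy observations around that mean

# Ordinary least-squares trend line and the standard error of its slope
slope, intercept = np.polyfit(years, counts, 1)
resid = counts - (slope * years + intercept)
resid_var = np.sum(resid**2) / (len(years) - 2)
se = np.sqrt(resid_var / np.sum((years - years.mean())**2))

t_stat = slope / se
print(f"estimated slope: {slope:.3f} +/- {se:.3f}  (t = {t_stat:.2f})")
```

A t-statistic well under 2 means the fitted trend is statistically indistinguishable from no trend at all, even though a real (if tiny) drift was built into the simulation; detecting it reliably would require a far longer record or a much stronger signal.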
