Sacred Cow Chips

Category Archives: Global Warming

Warming Bias and Hot-Town Thermometers

27 Monday May 2019

Posted by Nuetzel in Global Warming


Tags

AIRS, Albedo, Axial Tilt, Diurnal Temperature Range, Eccentricity, global warming, Insolation, Interglacial, Javier, Jim Steele, NASA, Obliquity, Paleoclimatology, Roy Spencer, Satellite Temperatures, Urban Heat Islands


A few little-recognized facts about global warming are summarized nicely by climate researcher Javier in a comment on this post by Dr. Roy Spencer:

“It is mainly over land and not over sea. It is mainly in the Northern Hemisphere and not in the Southern Hemisphere. It is mainly during winter and not during summer. And it affects mainly minimal (night) temperature and not maximal (day) temperature.”

I added the hyperlinks to Javier’s comment. The last two items on his list emphasize a benign aspect of the warming we’ve experienced since the late 1970s. After all, cold temperatures are far deadlier than warm temperatures.

Here is a disclaimer: my use of the term “global warming” refers to the fact that averages of measured temperatures have risen in a few fits and starts over the past four decades. I do not use the term to mean a permanent trend induced by human activity, since that time span is very short in climatological terms, and the observed increase is well within the historical range of natural variation.

Few seem aware that the surface temperature record is plagued by an obvious issue: the siting of most weather stations in urban environments. In fact, urban weather stations account for 82% of total stations in the U.S., as Jim Steele notes in “Our Urban ‘Climate Crisis’”. Temperatures run hot in cities due to the heat-absorbing characteristics of building materials and the high proportion of impervious ground cover. And some stations well outside of metropolitan areas are also situated near concrete and pavement. There is little doubt that urbanization and thoughtless siting decisions for weather stations have corrupted temperature measurements and exaggerated surface warming trends.

Hot summer days always arouse expressions of climate alarm. However, increases in summer and daytime temperatures have been relatively modest compared to increases in winter and nighttime temperatures. In his post (also linked above), Roy Spencer reports that 80% of the U.S. warming observed by a NASA satellite system (AIRS) from September 2002 to March 2019 occurred at night.

Of course, climate alarmists also claim that global warming makes temperatures more volatile. So, they argue, there are now more very hot days even if the change in the average summer temperature is modest. The facts do not support that claim, however. Indeed, the world has experienced less temperature volatility as global temperatures have risen. And less extreme weather, as it happens, is contrary to another theme in the warmist narrative.

There is some reason to believe that the relative increase in nighttime temperature is connected to the urban heat island effect. Pavement, concrete, and other materials retain heat overnight. Thus, increasing urbanization leads to nighttime temperatures that do not fall from their daily highs as much as they did a few decades back. The magnification of daytime heating is not as pronounced as the effect of retained heat overnight, which causes the diurnal temperature range to decrease. But I should note that some rural farmers insist that nighttime lows have increased relative to daytime highs there as well, and Roy Spencer himself is not confident that the satellite temperature data on which his finding was based reflects a strong urban heat island effect.

For perspective, it’s good to remember that we live in the midst of an interglacial period. These are relatively brief, temperate intervals between lengthier glacial periods (see here, and more from Javier here). The current interglacial is well advanced, having begun about 11,700 years ago, but Javier estimates that it could last for another 1,500 years. That would be longer than the historical average. At the peak of the last interglacial period, temperatures were about 2C higher than today and sea levels were 5 meters higher. The last interglacial ended about 120,000 years ago, but the historical average time between interglacials is only about 41,000 years. These low-frequency changes in the global climate are generally driven by the Earth’s axial tilt (obliquity), recurring cycles in the shape of our elliptical orbit around the Sun (eccentricity), and the Earth’s solar exposure (insolation) and albedo.

Biased surface temperature records have both inspired and reinforced the sense of panic surrounding global warming. Few observers seem aware that a strong bias exists, let alone its source: the urban heat island effect. And few seem to realize that most of the warming we’ve experienced since the 1970s has occurred at night, not during the day, and that these changes are well within the range of natural variation. Dramatic climate change happens at both long and short time scales for reasons that are largely astronomical. The lengthy historical record accumulated by paleoclimatologists shows that current concerns over global warming are exaggerated. I’m quite confident that mankind will find ways to adapt to climate change in either direction, but some global warming might be beneficial once the next glacial period begins.


A Carbon Tax Would Be Fine, If Only …

01 Friday Mar 2019

Posted by Nuetzel in Environment, Global Warming, Taxes


Tags

A.C. Pigou, Carbon Dividend, Carbon Tax, Climate Change, Economic Development, External Cost, Fossil fuels, Green New Deal, IPCC, John Cochrane, Michael Shellenberger, Pigouvian Tax, Quillette, Renewable energy, Revenue Neutrality, Robert P. Murphy, Social Cost of Carbon, Warren Meyer, William D. Nordhaus

I’ve opposed carbon taxes on several grounds, but I admit that a carbon tax might well be less costly as a substitute for the present mess that is U.S. climate policy. Today, we incur enormous costs from a morass of energy regulations and mandates, prohibitions on development of zero-carbon nuclear power, and subsidies to politically-connected industrialists investing in corn ethanol, electric cars, and land- and wildlife-devouring wind and solar farms. (For more on these costly and ineffective efforts, see Michael Shellenberger’s “Why Renewables Can’t Save the Planet” in Quillette.) Incidentally, the so-called Green New Deal calls for a complete conversion to renewables in unrealistically short order, but with very little emphasis on a carbon tax.

The Carbon Tax

Many economists support the carbon tax precisely because it’s viewed as an attractive substitute for many other costly policies. Some support using revenue from the tax to pay a flat rebate or “carbon dividend” to everyone each year (essentially a universal basic income). Others have pitched the tax as a revenue-neutral replacement for other taxes that are damaging to economic growth, such as payroll taxes or taxes on capital. Economic growth would improve under the carbon tax, or so the story goes, because the carbon tax is a tax on a “bad”, as opposed to taxes on “good” factors of production. I view these ideas as politically naive. If we ever get the tax, we’ll be lucky to get much regulatory relief in the bargain, and the revenue is not likely to be offset by reductions in other taxes.

But let’s look a little closer at the concept of the carbon tax, and I beg my climate-skeptic friends to stick with me for a few moments and keep a straight face. The tax is a way to attach an explicit price to the use of fuels that create carbon emissions. The emissions are said to inflict social or external costs on other parties, costs which are otherwise ignored by consumers and businesses in their many decisions involving energy use. The carbon tax is a so-called Pigouvian tax: a way to “internalize the externality” by making fossil fuels more expensive to burn. The tax itself involves no prohibitions on behavior of any kind. Certain behaviors are taxed to encourage more “desirable” behavior.
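The logic of internalizing the externality can be sketched with a toy competitive market. All numbers here are made up for illustration: a linear demand curve, a constant private marginal cost, and a fixed external cost per unit.

```python
# Toy Pigouvian-tax example with illustrative (made-up) numbers:
# linear demand P = 100 - Q, constant private marginal cost of 20,
# and an external cost of 10 per unit borne by third parties.

def market_quantity(tax: float) -> float:
    """Competitive quantity where the demand price equals private MC plus tax."""
    # 100 - Q = 20 + tax  =>  Q = 80 - tax
    return 80.0 - tax

external_cost = 10.0

q_untaxed = market_quantity(0.0)            # market ignores the externality
q_taxed = market_quantity(external_cost)    # tax set equal to the external cost

# The socially optimal quantity equates the demand price with the FULL
# marginal cost (private + external): 100 - Q = 30  =>  Q = 70.
q_optimal = 100.0 - (20.0 + external_cost)

print(q_untaxed, q_taxed, q_optimal)  # 80.0 70.0 70.0
```

A tax set exactly equal to the marginal external cost reproduces the socially optimal quantity without prohibiting anything, which is the whole appeal of the Pigouvian approach. The practical difficulty, as the next section argues, is knowing that external cost.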

Setting the Tax

But what is the appropriate level of the tax? At what level will it approximate the true “social cost of carbon”? Any departure from that cost would be sub-optimal. Robert P. Murphy contrasts William D. Nordhaus’ optimal carbon tax with more radical levels, which Nordhaus believes would be needed to meet the goals of the United Nations’ Intergovernmental Panel on Climate Change (IPCC). Nordhaus won the 2018 Nobel Prize in economics for his work on climate change. Whatever one might think of the real risks of climate change, Nordhaus clearly recognizes the economic downsides of mitigating those risks.

Nordhaus has estimated that the social cost of carbon will be $44/ton in 2025 (about $0.39 per gallon of gas). He claims that a carbon tax at that level would limit increases in global temperature to 3.5º Celsius by 2100. He purports to show that the costs of a $44 carbon tax in terms of reduced economic output would be balanced by the gains from limiting climate warming. Less warming would require a higher tax with fewer incremental rewards, and even more incremental lost output. The costs of the tax would then outweigh benefits. For perspective, according to Nordhaus, a stricter limit of 2.5º C implies a carbon tax equivalent to $2.50 per gallon of gas. The IPCC, however, prescribes an even more radical limit of 1.5º C. That would inflict a huge cost on humanity far outweighing the potential benefits of less warming.
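The per-gallon figures above follow from simple arithmetic, assuming the EPA’s standard combustion estimate of roughly 8.9 kg of CO2 per gallon of gasoline:

```python
# Rough conversion of a carbon tax per metric ton of CO2 into a tax per
# gallon of gasoline. The 8.887 kg CO2/gallon figure is the EPA's standard
# combustion estimate; treat it as an assumption here.

KG_CO2_PER_GALLON = 8.887  # kg of CO2 emitted per gallon of gasoline burned

def tax_per_gallon(tax_per_ton: float) -> float:
    """Convert a tax in $/metric-ton-CO2 into $/gallon of gasoline."""
    return tax_per_ton * KG_CO2_PER_GALLON / 1000.0

print(round(tax_per_gallon(44.0), 2))  # ~0.39, matching the figure above

# Working backward, the $2.50/gallon figure for a 2.5º C limit implies a
# carbon tax of roughly $280 per ton:
print(round(2.50 * 1000.0 / KG_CO2_PER_GALLON))  # ~281
```

In other words, the move from a 3.5º C limit to a 2.5º C limit implies more than a six-fold increase in the tax rate, which conveys some sense of the steeply rising marginal cost of abatement.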

A Carbon Tax, If…

Many economists have come down in favor of a carbon tax under certain qualifications: revenue-neutrality, a “carbon dividend”, or as a pre-condition to deregulation of carbon sources and de-subsidization of alternatives. John Cochrane discusses a carbon tax in the context of the “Economists’ Statement on Carbon Dividends” (Cochrane’s more recent thoughts are here):

“It’s short, sweet, and signed by, as far as I can tell, every living CEA chair, every living Fed Chair, both Democrat and Republican, and most of the living Nobel Prize winners. … It offers four principles 1. A carbon tax, initially $40 per ton. 2. The carbon tax substitutes for regulations and subsidies and (my words) the vast crony-capitalist green boondoggle swamp, which is chewing up money and not saving carbon. 3. Border adjustment like VAT have [sic] 4. ‘All the revenue should be returned directly to U.S. citizens through equal lump-sum rebates.'”

Rather than a carbon dividend, Warren Meyer proposes that a carbon tax be accompanied by a reduction in the payroll tax, an elimination of all subsidies, mandates, and prohibitions, development of more nuclear power-generating capacity, and contributions to a cleanup of Chinese and Asian coal-power generation. That’s a lot of stuff, and I think it exceeds Meyer’s normal realism with respect to policy issues.

My Opposition

Again, I oppose the adoption of a carbon tax for several reasons, despite my sympathy for the logic of Pigouvian taxation of externalities. At the risk of repeating myself, here I elaborate on my reasons for opposition:

Government Guesswork: First, Nordhaus’ estimates notwithstanding, we do not and cannot know the climate/economic tradeoffs with any precision. We can barely measure global climate, and the histories of the measures we do have are short and heavily manipulated. Models purporting to show the relationship between carbon forcing and global climate change are notoriously unreliable. So even if we can agree on the goal (1.5º, 2.5º, 3.5º), and we won’t, the government will get the tradeoffs wrong. I took the following from a comment on Cochrane’s blog, a quote from A.C. Pigou himself:

“It is not sufficient to contrast the imperfect adjustments of unfettered enterprise with the best adjustment that economists in their studies can imagine. For we cannot expect that any State authority will attain, or even wholeheartedly seek, that ideal. Such authorities are liable alike to ignorance, to sectional pressure, and to personal corruption by private interest. A loud-voiced part of their constituents, if organized for votes, may easily outweigh the whole.”

Political Hazards: Second, we won’t get the hoped-for political horse trade made explicit in the “Economists’ Statement …” discussed above. As a political matter, the setting of the carbon tax rate will almost assuredly get us a rate that’s too high. Experiences with carbon taxes in Australia, British Columbia, and France have been terrible thus far, sowing widespread dissatisfaction with the resultant escalation of energy prices.

Economic Growth: Neither is it a foregone conclusion that a revenue-neutral carbon tax will stimulate economic growth, and it might actually reduce output. As Robert P. Murphy explains in another post, the outcome depends on the structure of taxes prior to the change. The substitution of the carbon tax will increase output only if it replaces taxes on a factor of production (labor or capital) that is overtaxed prior to the change. That undermines a key selling point: that the carbon tax would necessarily produce a “double dividend”: a reduction in carbon emissions and higher economic growth. Nevertheless, I’d allow that revenue neutrality combined with elimination of carbon regulation and “green” subsidies would be a good bet from an economic growth perspective.

Overstated Risks: Finally, I oppose carbon taxes because I’m unconvinced that the risk and danger of global warming are as great as even Nordhaus would have it. In other words, the external costs of carbon don’t amount to much. Our recorded temperature history is extremely short and is therefore not a reliable guide to the long-term nature of the systemic relationships at issue. Even worse, temperature records are manipulated to exaggerate the trend in temperatures (also see here, here and here). There is no evidence of an uptrend in severe weather events, and the dangers of sea level rise associated with increasing carbon concentrations also have been greatly exaggerated. Really, at some point one must take notice of the number of alarming predictions and doomsday headlines from the past that have not been borne out even remotely. Furthermore, higher carbon concentrations and even warming itself would be of some benefit to humanity. In addition to a greener environment, the benefits include more rapid economic growth, improved agricultural yields, and a reduction in the salient danger of cold-weather deaths.

Economic Development: The use of fossil fuels has helped to enable strong growth in incomes in developed economies. It has also given us energy alternatives such as nuclear power as well as research into other alternatives, albeit with very mixed success thus far. And while a carbon tax would create an additional incentive to develop such alternatives, a U.S. tax would not accomplish much if any global temperature reduction. Such a tax would have to be applied on a global scale. Talk about a political long-shot! Increasing the price of carbon emissions also has enormous downsides for the less developed world. These fragile economies would benefit greatly from development of fossil fuel energy, enabling reductions in poverty and the income growth necessary to someday join in the prosperity of the developed economies. This, along with liberalization of markets, is the affordable way to bring economic success to these countries, which in turn will enable them to consider the energy alternatives that might come to fruition by that time. Fighting the war on fossil fuels in the underdeveloped world is nothing if not cruel.


Climate Change and Disorders of the Mind

13 Sunday Jan 2019

Posted by Nuetzel in Environment, Global Warming, Socialism


Tags

Alexandria Ocasio-Cortez, Discount Rate, Gale Pooley, Green New Deal, Ingrid Newkirk, Julian Simon, Simon Abundance Index, Marian Tupy, Michael Bastasch, Modern Monetary Theory, Paul Ehrlich, PETA, Socialism, Tim Ball, Tom Harris, University of Missouri, Voluntary Human Extinction Movement, Yellow Vests

Let’s hear from an environmentalist and radical animal-rights activist:

“… the extinction of Homo Sapiens would mean survival for millions if not billions, of Earth-dwelling species. Phasing out the human race will solve every problem on earth, social and environmental.”

Okay then, you first! That is an actual quote of Ingrid Newkirk, the misanthropic president of People for the Ethical Treatment of Animals (PETA), as documented by Tom Harris and Tim Ball in “Extreme Environmentalists Are Anti-Human“. I’m no psychologist, but I believe most shrinks would categorize misanthropy as a condition of general dislike for humanity that usually poses no real threat to others. Not always, however, and by my reckoning the sentiments expressed by Newkirk are the ramblings of a disturbed individual. But she’s not alone in her psychosis, by any means.

The sheer lunacy of the environmental Left is nowhere more evident than in the call for mankind’s extinction, and it is not unusual to hear it these days. Here’s a similarly deranged and tyrannical statement from the Voluntary Human Extinction Movement:

“Phasing out the human race by voluntarily ceasing to breed will allow Earth’s biosphere to return to good health … the hopeful alternative to the extinction of millions of species of plants and animals is the voluntary extinction of one species: Homo sapiens … us.“

The policies advocated by many environmentalists don’t go quite that far, but they nevertheless tend to be anti-human, as Harris and Ball demonstrate. In particular, the emphasis on eliminating the use of fossil fuels over the next three decades would consign most people, and especially those in developing countries, to ongoing lives of penury. Here are Harris and Ball:

“Of course, the poor and disadvantaged would be most affected by the inevitable huge rise in energy costs that would accompany the end of fossil fuels. … By promoting the idea that CO2 emissions must be reduced, climate mitigation activists are supporting the expanded use of biofuels. This is resulting in vast quantities of the world’s grain being diverted to fuel instead of food, causing food prices to rise — also causing the most pain among the world’s poor.“

I am highly skeptical of the risks presented by climate change. The magnitudes of climate changes on both global and regional scales, even to the present, are subject to so much uncertainty in measurement as to be largely unworthy of policy action. Climate models based on “carbon forcings” have been increasingly in error, and the risks about which we are warned are based on forecasts from the same models far into the future — taking little account of the potential benefits of warming. The purported risks, and the benefits of mitigating actions, are translated into economic terms by models that are themselves subject to tremendous uncertainty. Then, the future calamitous outcomes and the benefits of mitigation are discounted so lightly as to make the lives of future human beings… and plants and animals, and their hypothetical preferences, almost as important as those of actual human beings who, in the present, are asked to bear the very certain costs of mitigation. The entire pursuit is madness.
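The sensitivity of these calculations to the discount rate is easy to demonstrate. The damage figure and horizon below are hypothetical, chosen only to show how dramatically the choice of rate drives the present value of far-future damages:

```python
# Illustrative (made-up) numbers: the present value today of $1 trillion
# of climate damages assumed to occur 100 years from now, at several
# discount rates. The choice of rate swamps everything else.

def present_value(future_damages: float, rate: float, years: int) -> float:
    """Standard discounting: PV = D / (1 + r)^t."""
    return future_damages / (1.0 + rate) ** years

damages = 1_000_000_000_000  # $1 trillion, 100 years out (hypothetical)

for rate in (0.001, 0.01, 0.03, 0.07):
    pv = present_value(damages, rate, 100)
    print(f"discount rate {rate:.1%}: PV = ${pv / 1e9:,.1f} billion")
```

At a near-zero rate, future damages count almost dollar-for-dollar against present mitigation costs; at a market-like 7% rate, the same damages shrink to about a billion dollars of present value. That single parameter choice largely determines whether aggressive mitigation passes a cost-benefit test.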

Last spring I had a brief discussion with an economist engaged in research on the economics of climate change at the University of Missouri. I mentioned the uncertainties in measuring and aggregating temperatures over time and place (here is one example). He said, with a straight face, that those uncertainties should be disregarded or else “we can’t say anything”. Well yes, as a matter of scientific principle, a high variance always means a greater likelihood that one must fail to reject the null hypothesis! Yet the perspective adopted by the alarmist community is that a disastrous outcome is the null hypothesis — the sky is falling! If it weren’t for government grant money, I’m sure the sense of impending doom would be psychologically debilitating.
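A minimal numerical sketch of that statistical point, using illustrative numbers rather than climate data: the same estimated effect becomes statistically indistinguishable from zero when measurement noise is high.

```python
# With noisy measurements, the same estimated effect yields a t-statistic
# too small to reject the null hypothesis of "no effect". All numbers
# here are illustrative.

import math

def t_statistic(effect: float, std_dev: float, n: int) -> float:
    """t-stat for testing whether a sample mean effect differs from zero."""
    return effect / (std_dev / math.sqrt(n))

effect, n = 0.5, 30          # the same estimated effect in both cases
t_precise = t_statistic(effect, std_dev=0.5, n=n)  # low measurement noise
t_noisy = t_statistic(effect, std_dev=3.0, n=n)    # high measurement noise

# At roughly the 5% significance level, |t| must exceed about 2 to
# reject the null of no effect.
print(round(t_precise, 2), round(t_noisy, 2))
```

The first case clears the significance threshold easily; the second, with identical point estimate but six times the noise, does not. Disregarding the variance, as my interlocutor suggested, simply assumes the conclusion.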

And now we are presented with a “Green New Deal” (GND), courtesy of a certain congressional freshman, Alexandria Ocasio-Cortez, whose apparent media appeal is disproportionately greater than her intellectual acumen. The GND would eliminate fossil fuels and nuclear power (which emits zero carbon) from the U.S. energy mix by the impossibly early 2035. That would require the replacement of 88% of U.S. energy sources in about 17 years, which would cripple the U.S. economy and real incomes. The poor would suffer the most, but of course the GND promises much more than a makeover of our energy sources. In fact, it would mandate the replacement of “non-essential individual means of transport with high-quality and modern mass transit”. Welcome to the new authoritarian paradise! All transportation and anything else requiring power would be electrified, a massive infrastructural investment. Oh, and the proposal calls for a slew of socialist programs: a federal job guarantee, a living wage, universal health care, and of course income redistribution. Interestingly, this proposal is consistent with the agenda described in the most widely-reported climate paper in 2018, which Michael Bastasch describes as a call for global socialism.

Ocasio-Cortez’s desperate hope is that all this can be paid for via reductions in defense spending, high taxes on the rich, and “Modern Monetary Theory”. She really doesn’t understand the latter except that it sounds expedient. Like many other leftist numbskulls, she undoubtedly thinks that printing money offers society a free lunch. But printing money simply cannot be transformed into real resources, and such attempts generally have destructive consequences. So the GND might not reflect mental illness so much as sheer stupidity. Anyone familiar with the history of socialism and the realities of public finance knows that the GND would have punishing consequences for everyday people. The so-called Yellow Vests in France should serve to warn of the affront taken by those oppressed by over-reaching government: their protests were originally motivated by a proposed increase in the fuel tax on top of already high energy taxes and other policies that artificially increase the cost of energy.

The environmental lobby has long promoted doomsday scenarios: population growth would outstrip the globe’s capacity for producing food, and resources would become increasingly scarce. In fact, the opposite has occurred. This is demonstrated by Gale L. Pooley and Marian L. Tupy in “The Simon Abundance Index: A New Way to Measure Availability of Resources“. The index is named after the brilliant Julian Simon, who famously made a bet with the doomsayer Paul Ehrlich on the likely course of prices for five metals. Simon was correct in predicting that markets and human ingenuity would lead to greater abundance, and that prices would fall. But the deep paranoia of the environmental Left continues today. They are oblivious to the lessons of history and the plain market solutions that lie before them. Indeed, those solutions are rejected because they rely on positive action by the presumed villains in their delusional tale: free people. The demonization of mankind, private action, and markets is not just symptomatic of misanthropy; it reflects a deeply paranoid and manipulative psychological state. These would-be tyrants are a real danger to the human race.

Certainty Laundering and Fake Science News

05 Wednesday Dec 2018

Posted by Nuetzel in Global Warming, Risk, Science


Tags

Ashe Schow, Certainty Laundering, Ceteris Paribus, Fake News, Fake Science, Fourth Annual Climate Assessment, Money Laundering, Point Estimates, Statistical Significance, Warren Meyer, Wildfires

Intriguing theories regarding all kinds of natural and social phenomena abound, but few if any of those theories can be proven with certainty or even validated at a high level of statistical significance. Yet we constantly see reports in the media about scientific studies purporting to prove one thing or another. Naturally, journalists pounce on interesting stories, and they can hardly be blamed when scientists themselves peddle “findings” that are essentially worthless. Unfortunately, the scientific community is doing little to police this kind of malpractice. And incredible as it seems, even principled scientists can be so taken with their devices that they promote uncertain results with few caveats.

Warren Meyer coined the term “certainty laundering” to describe a common form of scientific malpractice. Observational data is often uncontrolled and/or too thin to test theories with any degree of confidence. What’s a researcher to do in the presence of such great uncertainties? Start with a theoretical model in which X is true by assumption and choose parameter values that seem plausible. In all likelihood, the sparse data that exist cannot be used to reject the model on statistical grounds. The data are therefore “consistent with a model in which X is true”. Dramatic headlines are then within reach. Bingo!

The parallel drawn by Meyer between “certainty laundering” and the concept of money laundering is quite suggestive. The latter is a process by which economic gains from illegal activities are funneled through legal entities in order to conceal their subterranean origins. Certainty laundering is a process that may encompass the design of the research exercise, its documentation, and its promotion in the media. It conceals from attention the noise inherent in the data upon which the theory of X presumably bears.

Another tempting exercise that facilitates certainty laundering is to ask how much a certain outcome would have changed under some counterfactual circumstance, call it Z. For example, while atmospheric CO2 concentration increased by roughly one part per 10,000 (0.01%) over the past 60 years, Z might posit that the change did not take place. Then, given a model that embodies a “plausible” degree of global temperature sensitivity to CO2, one can calculate how different global temperatures would be today under that counterfactual. This creates a juicy but often misleading form of attribution. Meyer refers to this process as a way of “writing history”:

“Most of us are familiar with using computer models to predict the future, but this use of complex models to write history is relatively new. Researchers have begun to use computer models for this sort of retrospective analysis because they struggle to isolate the effect of a single variable … in their observational data.”

These “what-if-instead” exercises generally apply ceteris paribus assumptions inappropriately, presuming the dominant influence of a single variable while ignoring other empirical correlations which might have countervailing effects. The exercise usually culminates in a point estimate of the change “implied” by X, without any mention of possible errors in the estimated sensitivity nor any mention of the possible range of outcomes implied by model uncertainty. In many such cases, the actual model and its parameters have not been validated under strict statistical criteria.
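The point-estimate problem can be sketched with the standard logarithmic relationship between CO2 concentration and implied warming. The concentrations and the sensitivity range below are illustrative assumptions for this sketch, not the values from any particular study:

```python
# A sketch of the "what-if-instead" exercise with parameter uncertainty
# made explicit. Implied warming = S * log2(C_new / C_old), where S is
# the assumed warming per doubling of CO2. All inputs are illustrative.

import math

def implied_warming(sensitivity: float, c_old: float, c_new: float) -> float:
    """Warming implied by a CO2 change, given a per-doubling sensitivity S."""
    return sensitivity * math.log(c_new / c_old) / math.log(2.0)

c_old, c_new = 315.0, 415.0   # ppm, roughly 1959 vs. recent (assumed)

point = implied_warming(3.0, c_old, c_new)   # one "plausible" parameter value
low = implied_warming(1.5, c_old, c_new)     # low end of an assumed range
high = implied_warming(4.5, c_old, c_new)    # high end of an assumed range

print(f"point estimate: {point:.2f} C, but range: {low:.2f} to {high:.2f} C")
```

The headline number is the single `point` figure; the three-fold spread between `low` and `high`, which follows directly from the unsettled sensitivity parameter, is what the certainty-laundering process quietly drops.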

Meyer goes on to describe a climate study from 2011 that was quite blatant about its certainty laundering approach. He provides the following quote from the study:

“These question cannot be answered using observations alone, as the available time series are too short and the data not accurate enough. We therefore used climate model output generated in the ESSENCE project, a collaboration of KNMI and Utrecht University that generated 17 simulations of the climate with the ECHAM5/MPI-OM model to sample the natural variability of the climate system. When compared to the available observations, the model describes the ocean temperature rise and variability well.”

At the time, Meyer wrote the following critique:

“[Note the first and last sentences of this paragraph] First, that there is not sufficiently extensive and accurate observational data to test a hypothesis. BUT, then we will create a model, and this model is validated against this same observational data. Then the model is used to draw all kinds of conclusions about the problem being studied.

This is the clearest, simplest example of certainty laundering I have ever seen. If there is not sufficient data to draw conclusions about how a system operates, then how can there be enough data to validate a computer model which, in code, just embodies a series of hypotheses about how a system operates?”

In “Imprecision and Unsettled Science“, I wrote about the process of calculating global surface temperatures. That process is plagued by poor quality and uncertainties, yet many climate scientists and the media seem completely unaware of these problems. They view global and regional temperature data as infallible, but in reality these aggregated readings should be recognized as point estimates with wide error bands. Those bands imply that the conclusions of any research utilizing aggregate temperature data are subject to tremendous uncertainty. Unfortunately, that fact doesn’t get much play.

As Ashe Schow explains, junk science is nothing new. Successful replication rates of study results in most fields are low, and the increasing domination of funding sources by government tends to promote research efforts supporting the preferred narratives of government bureaucrats.

But perhaps we’re not being fair to the scientists, or most scientists at any rate. One hopes that the vast majority theorize with the legitimate intention of explaining phenomena. The unfortunate truth is that adequate data for testing theories is hard to come by in many fields. Fair enough, but Meyer puts his finger on a bigger problem: One simply cannot count on the media to apply appropriate statistical standards in vetting such reports. Here’s his diagnosis of the problem in the context of the Fourth National Climate Assessment and its estimate of the impact of climate change on wildfires:

“The problem comes further down the food chain:

  1. When the media, and in this case the US government, uses this analysis completely uncritically and without any error bars to pretend at certainty — in this case that half of the recent wildfire damage is due to climate change — that simply does not exist
  2. And when anything that supports the general theory that man-made climate change is catastrophic immediately becomes — without challenge or further analysis — part of the ‘consensus’ and therefore immune from criticism.”

That is a big problem for science and society. A striking point estimate is often presented without adequate emphasis on the degree of noise that surrounds it. Indeed, even given a range of estimates, the top number is almost certain to be stressed more heavily. Unfortunately, the incentives facing researchers and journalists are skewed toward this sort of misplaced emphasis. Scientists and other researchers are not immune to the lure of publicity and the promise of policy influence. Sensational point estimates have additional value if they support an agenda that is of interest to those making decisions about research funding. And journalists, who generally are not qualified to make judgements about the quality of scientific research, are always eager for a good story. Today, the spread of bad science, and bad science journalism, is all the more virulent as it is propagated by social media.

The degree of uncertainty underlying a research result just doesn’t sell, but it is every bit as crucial to policy debate as a point estimate of the effect. Policy decisions have expected costs and benefits, but the costs are often front-loaded and more certain than the hoped-for benefits. Any valid cost-benefit analysis must account for uncertainties, but once a narrative gains steam, this sort of rationality is too often cast to the wind. Cascades in public opinion and political momentum are all too vulnerable to the guiles of certainty laundering. Trends of this kind are difficult to reverse and are especially costly if the laundered conclusions are wrong.

The Non-Trend In Hurricane Activity

18 Thursday Oct 2018

Posted by Nuetzel in Global Warming, Hurricanes

≈ 1 Comment

Tags

David Middleton, El Nino, global warming, Hurricane Michael, Media Bias, Natural Disasters, Roy Spencer, Ryan Maue, Selection Bias, Tropical Cyclone Energy, Warren Meyer

People are unaccountably convinced that there is an upward trend in severe weather events due to global warming. But there is no upward trend in the data on either the frequency or severity of those events. Forget, for the moment, the ongoing debate about the true extent of climate warming. In fact, I’ll stipulate that warming has occurred over the past 40 years, though most of it was confined to the jump roughly coincident with two El Ninos in the 1990s; there’s been little if any discernible trend since. But what about the trend in severe weather? I’ve heard people insist that it is true, but a few strong hurricanes do not constitute a trend.

The two charts at the top of this post were created by hurricane expert Ryan N. Maue. I took them from an article by David Middleton, but visit Maue’s web site on tropical cyclone activity for more. The last month plotted is September 2018, so the charts do not account for Hurricane Michael, and the 2018 totals are for a partial year. The first nine months of each year typically account for about 3/4 of annual tropical cyclones, so 2018 will be a fairly strong year. Nevertheless, the charts refute the contention that there has been an upward trend in tropical cyclone activity. In fact, in the lower chart, the years following the 1990s increase in global temperatures are shown to have been a time of lower cyclone energy. Roy Spencer weighs in on the negative trend in major landfalling hurricanes in the U.S. and Florida stretching over many decades.

Warren Meyer blames “media selection bias” for the mistaken impression of dangerous trends that do not exist. That is, the news media are very likely to report extreme events, as they should, but they are very unlikely to report a paucity of extreme events, no matter how lengthy or unusual the dearth:

“Does anyone doubt that if we were having a record-heavy tornado season, this would be leading every newscast?  [But] if a record-heavy year is newsworthy, shouldn’t a record-light year be newsworthy as well?  Apparently not.” 

It so happens that 2018, thus far, has seen very close to a record low number of tornadoes in the U.S.

Meyer also highlights the frequent use of misleading statistics on the real value of damage from natural disasters. That aggregate value has almost certainly grown over the years, but the growth has had nothing to do with the number or severity of natural disasters. Meyer explains:

“Think about places where there are large natural disasters in the US — two places that come to mind are California fires and coastal hurricanes. Do you really think that the total property value in California or on the US coastline has grown only at inflation? You not only have real estate price increases, but you have the value of new construction. The combination of these two is WAY over the 2-3% inflation rate.”

Recent experiences are always the most vivid in our minds. The same is true of broad impressions drawn from reports on the most recent natural disasters. The drama and tragedy of these events should never be minimized, and the fact that there is no upward trend in cyclone activity is no consolation to victims of those disasters. Still, the media can’t seem to resist the narrative that the threat of such events is increasing, even if it can’t be proven. Indeed, even if it’s not remotely correct. Reporters are human and generally not good at science, and they are not immune to the tendency to exaggerate the significance of events upon which they report. A dangerous, prospective trend is at once scary, exciting, and possibly career-enhancing. As for the public, sheer repetition is enough to convince most people that such a threat is undeniable… that everybody knows it… that the trend is already underway. The fact is that the upward trend in hurricane activity (and other kinds of severe weather) is speculative, not real.

Forest Fires Ignite Climate Change Delusions

10 Friday Aug 2018

Posted by Nuetzel in Global Warming, Wildfires

≈ 5 Comments

Tags

Arson, Bob Zybach, Cal Fire, Controlled Burns, Dust Bowl, Fire Suppression, Forest Fires, Forest Management, Grazing, High Pressure System, Logging, Megafires, Mendocino Complex Fire, Thomas Fire, Wildfires

The geographic extent of this summer’s forest fires won’t come close to the aggregate record for the U.S. Far from it. Yes, there are some terrible fires now burning in California, Oregon, and elsewhere, and the total burnt area this summer in the U.S. is likely to exceed the 2017 total. But as the chart above shows, the burnt area in 2017 was less than 20% of the record set way back in 1930. The same is true of the global burnt area, which has declined over many decades. In fact, this 2006 paper reported the following:

“Analysis of charcoal records in sediments [31] and isotope-ratio records in ice cores [32] suggest that global biomass burning during the past century has been lower than at any time in the past 2000 years. Although the magnitude of the actual differences between pre-industrial and current biomass burning rates may not be as pronounced as suggested by those studies [33], modelling approaches agree with a general decrease of global fire activity at least in past centuries [34]. In spite of this, fire is often quoted as an increasing issue around the globe [11,26–29].”

People have a tendency to exaggerate the significance of current events. Perhaps the youthful can be forgiven for thinking hot summers are a new phenomenon. Incredibly, more “seasoned” folks are often subject to the same fallacies. The fires in California have so impressed climate alarmists that many of them truly believe global warming is the cause of forest fires in recent years, including the confused bureaucrats at Cal Fire, the state’s firefighting agency. Of course, the fires have given fresh fuel to self-interested climate activists and pressure groups, an opportunity for greater exaggeration of an ongoing scare story.

This year, however, and not for the first time, a high-pressure system has been parked over the West, bringing southern winds up the coast along with warmer waters from the south, keeping things warm and dry inland. It’s just weather, though a few arsonists and careless individuals always seem to contribute to the conflagrations. Beyond all that, the impact of a warmer climate on the tendency for biomass to burn is considered ambiguous for realistic climate scenarios.

And what of the “mega-fires” burning in the West, like the huge Mendocino Complex Fire and last year’s Thomas Fire? Unfortunately, many decades of fire suppression measures — prohibitions on logging, grazing, and controlled burns — have left the forests with too much dead wood and debris, especially on public lands. From the last link:

“Oregon, like much of the western U.S., was ravaged by massive wildfires in the 1930s during the Dust Bowl drought. Megafires were largely contained due to logging and policies to actively manage forests, but there’s been an increasing trend since the 1980s of larger fires.

Active management of the forests and logging kept fires at bay for decades, but that largely ended in the 1980s over concerns too many old growth trees and the northern spotted owl. Lawsuits from environmental groups hamstrung logging and government planners cut back on thinning trees and road maintenance.

[Bob] Zybach [a forester] said Native Americans used controlled burns to manage the landscape in Oregon, Washington and northern California for thousands of years. Tribes would burn up to 1 million acres a year on the west coast to prime the land for hunting and grazing, Zybach’s research has shown.

‘The Indians had lots of big fires, but they were controlled,’ Zybach said. ‘It’s the lack of Indian burning, the lack of grazing’ and other active management techniques that caused fires to become more destructive in the 19th and early 20th centuries before logging operations and forest management techniques got fires under control in the mid-20th Century.”

The annual burnt area from wildfires has declined over the past ninety years both in the U.S. and globally. Even this year’s wildfires are unlikely to come close to the average burn extent of the 1930s. The large wildfires this year are due to a combination of decades of poor forest management along with a weather pattern that has trapped warm, dry air over the West. The contention that global warming has played a causal role in the pattern is balderdash, but apparently that explanation seems plausible to the uninformed, and it is typical of the propaganda put forward by climate change interests.

Science of the Spurious: Global Warming and Suicide

27 Friday Jul 2018

Posted by Nuetzel in Global Warming

≈ Leave a comment

Tags

Carbon Mitigation, Causality vs. Correlation, El Nino, Global Warming Hiatus, Inflammatory Chain Reaction, Marshall Burke, Nature Journal, P-Value Problem, Rare Events, Suicide Rate, Suicides and Global Warming

The latest entry in the scare-mongering literature of global warming, published this month in Nature, purports to show that warming will lead to more suicides! I’m not sure whether these researchers deserve an award for naiveté or cynicism, but they should get one or the other. The lead author is listed as Marshall Burke of Stanford University.

The basic finding of their research is that an increase in average monthly temperature of one degree Celsius (1.8 degrees Fahrenheit) increases the monthly suicide rate by 0.68% (or 0.42% when accounting for the previous month’s temperature as well). They might have used the preferred verbiage “is associated with”, rather than “increases”, because they surely know that correlation is not the same as causation, but perhaps they wished to impress the news media. Let’s put their result in perspective: the annual U.S. suicide rate per 100,000 persons was 11.64 over the years 1968-2004 in their sample. A 0.68% increase in the suicide rate would have brought that up to roughly 11.72. Of course, the average U.S. surface temperature did NOT increase by 1 degree Celsius over that period — it was about half that, and temperatures have been relatively flat since then.
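
To make the magnitude concrete, the arithmetic is simple enough to check in a few lines. The 11.64 baseline and 0.68% effect are the figures quoted above; everything else is just multiplication:

```python
baseline_rate = 11.64  # annual U.S. suicides per 100,000 persons, 1968-2004 sample
effect = 0.0068        # the paper's reported 0.68% increase per 1 degree C

# Apply the proportional increase to the baseline rate
adjusted_rate = baseline_rate * (1 + effect)
print(round(adjusted_rate, 2))  # roughly 11.72 per 100,000
```

In other words, the headline "effect" moves the rate from 11.64 to about 11.72 per 100,000 — a change of less than one-tenth of one suicide per 100,000 persons.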

The real problem here is that most of the variation in temperatures across the sample used by Burke and his co-authors is seasonal and geographical. While they claim to have accounted for such confounding influences using non-parametric controls, they give few specifics, so I am unconvinced. It has long been known that suicides tend to be seasonal and are higher in the warmer months of the year. The reasons cited vary, including a boost provided by warmth in the energy needed to execute a suicide plan, “inflammatory chain reactions” from high pollen counts, seasonal peaks in bipolar disorder, and stress from greater social interactions during warm weather. These are seasonal phenomena with no bearing on the climatic question at hand. And let’s face it: if warmer weather gives you the energy to kill yourself, the temperature is probably not the problem.

The authors also report a positive “effect” of temperatures on suicides using annual data, but with a rather large variance. This result probably captures geographical variation in suicide rates, though again, the authors claim to have made adjustments. Southern states tend to have high suicide rates, but no one has suggested that warm, southern climates are to blame. Instead, there are other socioeconomic factors that probably account for this regional variation. I suspect that this is another source of the correlation the authors use to project forward as a likely impact of global warming. (While the inter-mountain West tends to have high suicide rates relative to other regions, many of those states are lightly populated, so they would receive low weights in any analysis of the kind discussed here.)

Finally, the trend toward slightly warmer temperatures between 1968 and the late 1990s was spurred largely by a series of strong El Nino events, especially in 1997-98. Suicide rates in the U.S., on the other hand, reached a high in the mid-1970s, ran slightly lower until hitting another peak in the mid-1980s, and then tapered through the late 1990s even as temperatures spiked. Since 1998, suicides have trended up as temperature trends flattened during the so-called “global warming hiatus”, which is ongoing. This sequence not only contradicts the authors’ narrative; it reinforces the fact that the variation exploited in the samples may well be seasonal and geographical, and not related to climate trends.

An issue over which Burke, et al demonstrate no awareness is the exaggerated statistical significance of meaningless effects in very large samples. This has been called the “p-value problem” because large samples can lead to vanishingly small p-values (which measure statistical significance). In a very large sample, any small difference may appear to be statistically significant. It’s a well-known pitfall in empirical work. A suicide is what’s known in the statistical literature as a “rare event”, given its annual incidence of about 0.01% of the population. I submit that the estimated impact of a 1% change in that rate, a change of 0.0001%, is well-nigh meaningless.
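
A back-of-the-envelope sketch shows the mechanics of the p-value problem. The effect size and sample sizes below are hypothetical, and a simple z-test stands in for whatever the authors actually estimated; the point is only that a fixed, trivially small mean shift sails past conventional significance thresholds once the sample is large enough:

```python
import math

def two_sided_p(effect, sd, n):
    """Two-sided p-value for a mean shift under a simple z-test."""
    z = effect / (sd / math.sqrt(n))
    return math.erfc(abs(z) / math.sqrt(2))

# An effect of 0.1% of one standard deviation -- substantively meaningless --
# looks "insignificant" in a modest sample and "highly significant" in a huge one
for n in (10_000, 1_000_000, 100_000_000):
    print(f"n={n:>11,}  p={two_sided_p(0.001, 1.0, n):.4f}")
```

The effect never changes; only the sample size does. Statistical significance in a very large sample tells you almost nothing about whether an effect matters.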

But the authors, undaunted, do their very best to make it seem meaningful. First, they pick a sub-sample that yields a somewhat higher estimated effect. Then they apply it to a future climate change scenario that is considered extreme and “extremely unlikely” by climate researchers. They calculate the cumulative increase in suicides implied by that estimate out to 2050 — 32 years — for the U.S. and Mexico combined: about 22,000 extra suicides (they give a confidence interval of 9,000 to 39,000). That would be a lot, of course, but aggregating over many years using a high-end estimate and an extreme scenario can make an otherwise tiny effect appear large. And remember, their confidence interval is tightened considerably via the use of many observations on essentially irrelevant seasonal and geographic variation.
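
For illustration only, here is how aggregation turns a small annual effect into a big cumulative number. Every figure below is an assumption of mine, not the paper's: a round combined annual baseline and a high-end proportional effect that ramps linearly to its peak over the 32-year horizon:

```python
annual_baseline = 50_000  # assumed combined U.S.-Mexico suicides per year (illustrative)
peak_effect = 0.028       # assumed high-end proportional increase reached in year 32
years = 32

# Effect assumed to ramp linearly from near zero in year 1 to the peak in the final year
total_extra = sum(annual_baseline * peak_effect * (t / years)
                  for t in range(1, years + 1))
print(round(total_extra))  # a five-digit cumulative total from a sub-3% annual effect
```

Summing a modest percentage over three decades of a large baseline is guaranteed to produce a headline-sized number, regardless of how small the annual effect is.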

Burke and his co-authors have succeeded in publishing a piece of research that is not just flimsy, but that they apply in a way that is grossly misleading. They made it as ripe and plump as possible for promotion by the news media, which seems to love a great scare story. I might just as easily claim that as declines in income are associated with higher suicides, efforts at carbon mitigation requiring high taxes and punitive consumer rates for electric power will lead to an increase in suicides. And I could “prove” it with statistics. Then we would have a double-warming whammy! But I have a better idea: let’s expose bad research for what it is, and that includes just about all of the literature that warns of catastrophe from global warming.

Deceits of the Climate Claimants

23 Monday Jul 2018

Posted by Nuetzel in Global Warming

≈ 1 Comment

Tags

Al Gore, Alpine Tree Lines, Armadillos, Desertification, Global Greening, global warming, Ocean Acidification, Polar Bears, Social Cost of Carbon, Steven Hayward

Well-meaning souls innocently parrot the global warming narrative but generally know little of the controversies surrounding its validation, or lack thereof. That includes much of the mainstream media. Every warm day is evidence of global warming. Every cold day is evidence of extreme volatility brought on by climate change. Every big storm, every forest fire, and every endangered species is attributed to warming. The poles are melting, the sea is rising, the sky is falling, and it is mostly bullshit. But in the meantime, the mythology of global warming has become an all-purpose cudgel for state oversight and “redistributive justice”, primarily to the benefit of the “climate change industrial complex”. The myths are repeated so frequently that many accept them as facts. Here, I list a few of these myths along with information that should give pause to anyone tempted to take them too seriously.

The science is settled: There are a number of great scientists who dispute the global warming narrative (and see here). But a few studies have claimed incredibly widespread consensus (97%) among scientists that mankind drives climate change. These studies are generally plagued by biased samples of scientists (sometimes including non-scientists), faulty selection and classification of paper abstracts, and direct involvement of climate activists in the research process. These studies tend to present the “consensus” as one side of a stark dichotomy, with no nuance or middle ground for those subscribing to anything less than the inevitability of a warming catastrophe.

Record high temperatures: The temperatures that are almost always reported are surface temperatures that are subject to extreme bias. The most drastic bias is caused by increasing urbanization. Urban weather instruments are often sited in areas with an increasing amount of impervious ground cover, which absorbs sunlight and heat, leading to the so-called “urban heat-island effect”. This has imparted an upward trend in urban temperature readings. Moreover, urban temperature readings tend to be over-sampled in estimates of global surface temperatures, reinforcing the distortions in measured warming.

Melting poles: Arctic sea ice extent has been in modest retreat since 1980, when satellite measurement began to allow more accurate readings. The Antarctic, however, has shown a trend in the other direction, as shown in this piece by Judith Curry. In the same article, Curry shows that specific Arctic locations had less sea ice 6,000 to 8,000 years ago than today. For more complete information on satellite-era trends in sea ice extent, see this informative reference page (scroll way down for Antarctic information). Looks like Al Gore’s dire prediction that the poles would melt by 2007 was just a little off target.

Polar bear extinction: We are constantly seeing warnings of polar bear extinction on social media. Memes feature desperate-looking bears stranded on ice floes, drifting away from their cubs. Perhaps you aren’t supposed to know that polar bears are extremely strong swimmers. Or that the polar bear population has been thriving, increasing by an estimated 10-20% since 2001. So whether or not the past few decades have seen a decline in sea ice, the bears seem to be doing just fine.

Rising sea levels: The rate of increase in sea levels over the past 8,000 years has been very slow relative to the 10,000 years prior to that, when they rose at rates of up to 5.5 meters per century. That compares to recent rates of about one foot per century. Predictions that islands in the Pacific would be swallowed by the seas have not come to pass. In fact, satellite images show that more of the world’s sandy shorelines accreted than receded between 1984 and 2016. This does not appear to be a crisis by any means.

Increasing storms: No, the frequency and intensity of tropical cyclone activity have decreased since 1900, a trend that has continued unabated over the past 20 years. I know of at least one study suggesting otherwise, but it is based purely on modeled relationships, not hard data, and not tested against data. The frequency and intensity of droughts and floods have been flat to declining as well. And while more weak tornadoes are detected today than in the past, the frequency of moderate to strong tornadoes has decreased over the past 45 years.

Desertification: Increases in carbon concentration have not been associated with desertification, as the media seem to have concluded. As noted above, the frequency of drought has been steady to declining. In fact, precipitation data suggest that patterns of variability in rainfall do not square with the predictions of climate models. Indeed, the world has seen an increase in green vegetation since 1985, even in arid regions.

Ocean acidification: The reported declines in ocean pH levels over the past few centuries are actually smaller than the normal seasonal variation in pH levels. The presumed negative impact on sea life appears, after all, to be minimal to nonexistent (see the same link).

Higher alpine tree lines: We’ve been waiting. It hasn’t happened, but that hasn’t stopped some activists from stating it as established fact.

Armadillo northward migration: I’ve heard this cited as “proof” of global warming. The range of armadillos extended as far north as southern Missouri and Kansas in the early 1970s, so this isn’t new. In fact, armadillos began their migration northward into the U.S. before the mid-1800s. Some biologists have attributed the migration to warming but acknowledge many other reasons, including more forested habitat in the north and factors such as movement of cattle by rail. Armadillos burrow and are able to keep warm underground in the winter. Of course, a series of warm winters can bring them further north along with other species, but a few cold winters can take a toll on the population and push them south again.

U.S. carbon criminality: U.S. CO2 emissions have been in almost steady decline on a per capita basis for at least seven decades, long before the carbon freak-out began. The declines have resulted largely from the normal market process of competitive efficiency in production. China leads the world in total annual CO2 emissions by a wide margin, about 80% ahead of the U.S. in 2017. Total U.S. emissions actually declined in 2017 for the third straight year, while emissions in China, the EU, and for the world all increased. In fact, China was actually in compliance with its pledge under the Paris Accord despite the increase, so the pledge was not especially ambitious.

High social cost of carbon: The estimates used by the Environmental Protection Agency are plagued by poor methodology and are subject to great uncertainty. Some studies rely on a series of tenuous causal links, such as CO2 emissions to global temperatures to ice melt to sea level to real dollars of coastal damage many years hence, all without considering variances at each stage, and assuming zero effort to adapt or mitigate damages over long time frames. A shortcut approach relies on historical correlations between temperatures and such measures as heat-related deaths, labor productivity and real output. These estimates extrapolate old relationships to the distant future and ignore the very real human tendency to adapt. The underlying assumptions are undercut by such basic facts as ongoing migration to warmer regions. The estimates also fail to account for the likelihood that warmer weather will improve agricultural productivity.

The public’s interest in climate change has waned, and no wonder: sensible people do not buy hype and demands for sacrifice in the face of contradictory evidence. Revelations of statistical fraud have led to even more skepticism. And when your “proof” is founded on model extrapolation, often theoretically-based rather than empirically-based, you’re skating on thin scientific ice. At this link, Steven Hayward has an interesting take on the public’s increasingly jaundiced view of global warming activism:

“Scientists who are genuinely worried about the potential for catastrophic climate change ought to be the most outraged at how the left politicized the issue and how the international policy community narrowed the range of acceptable responses. Treating climate change as a planet-scale problem that could be solved only by an international regulatory scheme transformed the issue into a political creed for committed believers. Causes that live by politics, die by politics.”

Sea Level Measurement and Perspective

26 Monday Mar 2018

Posted by Nuetzel in Global Warming, Sea Level

≈ 1 Comment

Tags

Absolute Sea Level, Carbon Concentation, Kip Hansen, NASA, NOAA, Relative Sea Level, Satellite Altimetry, Sea Levels, Sedimentation, Tidal Gauges, Vertical Land Motion

The measurement of sea level change is much more complicated than most people realize. In fact, the reported changes that alarm so many are minuscule relative to the uncertainties caused by these measurement difficulties. First, consider the easy part: If you drive a stake into the ground at the shore at high tide one day, your task of measuring sea level change will be complicated by the changing day-to-day tides. Those changes will force you to calculate average readings off of your stake over complete lunar cycles (and even that isn’t quite right, since the gravitational pull of the sun matters as well, and the moon’s distance from earth fluctuates). Or, you can make comparisons only between readings one lunar cycle apart.
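
The averaging step can be sketched with simulated readings off the stake. The stake height, tide amplitude, and cycle length below are illustrative, not real gauge data; the point is that a mean taken over a full lunar cycle suppresses the tidal swing, while a partial-cycle mean does not:

```python
import math
import statistics

SYNODIC_MONTH_DAYS = 29.53  # approximate length of one full lunar (tidal) cycle

# Simulated daily high-tide readings: a constant sea level of 5000 mm on the
# stake, plus a 300 mm tidal swing tied to the lunar cycle
readings = [5000 + 300 * math.sin(2 * math.pi * d / SYNODIC_MONTH_DAYS)
            for d in range(30)]

cycle_mean = statistics.fmean(readings)      # averaged over ~one full cycle
week_mean = statistics.fmean(readings[:7])   # averaged over a partial cycle

print(round(cycle_mean), round(week_mean))
```

The full-cycle mean lands within a millimeter or two of the true 5000 mm level, while the one-week mean is biased by more than 100 mm — which is why valid comparisons must span complete lunar cycles.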

Once the local reference point is established at the shore and the tides are controlled for, there are two kinds of changes that cause the sea to rise or fall relative to the “zero” point on your stake. The sea water can rise or fall, of course, but the land itself might do so as well! Settling or upwelling at the land surface can be caused by a variety of geological phenomena. “Vertical land motion”, up or down, occurs almost everywhere. That means sea level is a relative concept. In addition, over time the placement of on-shore sea-level gauges often changes with harbor and ship channel alterations, and even accidents. These all require adjustments in order to make valid comparisons across time. That’s to say nothing of variations in air pressure and water currents, which certainly affect on-shore readings. Today, sea levels are also measured by satellite, but that doesn’t make sea level measurement simpler by any means, as you’ll see below.

Kip Hansen discusses the vagaries of sea level measurement in an excellent post. If it isn’t already obvious, changes in readings from a single tide level gauge do not show the rate at which the absolute sea level is changing over time. They show only the local net effect of the absolute sea level change and the land movement. But Hansen emphasizes an implication about which few are aware: a comparison of relative sea level changes in two different locales shows only the difference in “vertical land motion” between the two sites (at least as a first approximation).
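
Hansen's point reduces to one line of arithmetic. The numbers below are hypothetical, with uplift treated as positive land motion:

```python
def relative_sea_level_change(absolute_rise_mm, land_uplift_mm):
    """What a tide gauge sees: absolute sea level rise minus local land uplift."""
    return absolute_rise_mm - land_uplift_mm

absolute_rise = 1.7  # mm/yr, shared by every site on the same ocean (illustrative)

site_a = relative_sea_level_change(absolute_rise, land_uplift_mm=-2.0)  # subsiding land
site_b = relative_sea_level_change(absolute_rise, land_uplift_mm=+1.0)  # uplifting land

# The absolute component cancels: the gap between the two gauges reflects
# only the difference in vertical land motion between the two sites
print(site_a - site_b)  # 3.0 mm/yr, whatever the absolute rise happens to be
```

Change `absolute_rise` to any value and the gap between the two gauges stays the same — which is exactly why comparing two relative readings tells you about land motion, not about the ocean.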

Hansen notes a major discrepancy between the absolute sea level changes reported by NOAA (1.7 mm per year) and NASA (3.0 mm per year). These figures are estimated by satellite readings, which have extremely poor resolution (measured in cm, not mm) compared to tidal gauges. This quote from Hansen in the comments section is revealing (emphasis added):

“Satellite Altimetry — when reporting sea level rise — is not a measurement, but a complex calculation with a dozen or so ‘corrections’ and ‘adjustments’ for confounding factors, all of which are of greater magnitude than the change in sea surface height being sought. Many of these confounders are orders of magnitude greater. Some additions, such as the famed GIA adjustment, are acknowledged not to appear in the physical sea surface height at all, but are added on the basis that ‘the sea would have risen the 0.3 mm/yr if the ocean basins hadn’t expanded.’ There is no scientific justification for the difference. In this essay, I point out that NOAA has stuck to its scientific guns and not gone along with the NASA figure.”

There are statements on NOAA’s web site that seem to endorse the NASA estimate, but Hansen discounts those references. He advises that there is a big difference between the NOAA science community and its marketing staff, which undoubtedly dominates the content viewed by the public on the site.

There are many other factors that play havoc with sea level estimates, some of which are intractable. One issue, which comes up in the comments to Hansen’s article, has to do with sedimentation and its displacement of sea water. While its effect spreads out across the entire ocean, Hansen stops short of calling it a contributor to absolute sea level rise, though that would be the implication in terms of measurement.

Alarm over rising sea levels is based partly on the focus of local media on relative sea level changes. That may well be an important local issue, whether the land is settling or the absolute sea level is rising (though the two may have different local policy implications). But local concerns about relative sea level are often translated into global concerns that confuse relative with absolute sea levels. This makes excellent fodder for the propaganda of the leftist climate change movement. That propaganda is so effective that it sometimes feeds back to foment local concern, even in areas experiencing reductions in relative sea level! These concerns fly in the face of local experience as well as the absolute rates of change estimated by NOAA.

I’ll close with the following comment taken from an earlier post on SacredCowChips:

“The prospect of rising sea levels is another matter that concerns alarmists, who always fail to note that sea levels have been increasing for a very long time, well before carbon concentrations could have had any impact. In fact, the sea level increases in the past few centuries are a rebound from lows during the Little Ice Age…. But even those fluctuations look minor by comparison to the increases in sea levels that occurred over 8,000 years ago.“

Climate Change, Hurricanes and Noisy Statistics

22 Friday Sep 2017

Posted by Nuetzel in Global Warming

≈ Leave a comment

Tags

AGW, Atlantic Multi-Decadal Oscillation, Climate Change, Cool the Past, East Anglia University, El Nino, Fabius Maximus, global warming, Hurricane Harvey, Hurricane Irma, Hurricane Maria, Michael Mann, NOAA, Roger Pielke Sr, Roy Spencer, Ryan Maue, Sea Surface Temperatures, Signal-to-Noise, Statistical Noise, Storm Intensity, Watt's Up With That?


The nasty spate of hurricanes this year has been a catch-up of sorts following a decade of subdued activity. In fact, global hurricane activity has been flat to declining in frequency since 1970. Until the recent increase, hurricane activity had been trending down in terms of 24-month cumulative energy since the 1990s, as the chart above shows. The historical data on the number of U.S. landfalls extends back to 1900, and it has had a negative trend as well. Nevertheless, we hear from climate alarmists that Hurricanes Harvey and Irma, which ended a drought of record length in U.S. hurricane landfalls, and now presumably Maria, were a consequence of anthropogenic global warming (AGW), er… climate change.

The implication is that increases in the atmospheric concentration of CO2 led to these hurricanes or their high intensity. Apparently, the paucity of hurricane activity over the previous ten years can be waved off as a fluke. A further implication of the alarmist view is that the longer negative trends in hurricane frequency and energy can be ignored in the context of any relation to CO2 concentration. But how so? One confounding factor I’ve seen mentioned blames El Nino warming in the Pacific, and a consequent increase in Atlantic wind shear, for the long lull in activity after 2005. That has a ring of plausibility, but a closer look reveals that actual El Nino activity during those years was hardly impressive, with the exception of 2015-16.

More historical data can be seen in the charts on the tropical cyclone page on the Watts Up With That? blog. (The charts in question start about two-thirds of the way down the page.) Hurricane expert Ryan Maue compiled a number of these charts, including the one above. He authored an editorial in the Wall Street Journal this week bemoaning the climate-change hype surrounding Harvey and Irma (if the link doesn’t work, it is available at the WSJ’s Opinion page on Facebook, posted on 9/17). Maue believes that both the climate science community and the media share in the blame for that hype. But he also says the following:

“Although a clear scientific consensus has emerged over the past decade that climate change influences hurricanes in the long run, its effect upon any individual storm is unclear.“

Maue provides a link to this NOAA web site offering cautious support for the proposition that there is a link between global warming and hurricane intensity, though the data it cites ends about ten years ago, so it does not capture the recent lull. Also, some of the information it provides is based on modeled global temperatures and hurricane activity through 2100. As is well-known by now, or should be, long-term climate forecasts based on carbon forcings are notoriously inaccurate, and NOAA admits that the association between those predicted temperatures and future hurricanes is tenuous:

“It is premature to conclude that human activities–and particularly greenhouse gas emissions that cause global warming–have already had a detectable impact on Atlantic hurricane or global tropical cyclone activity.“

Perhaps the idea that there is consensus regarding the relationship between climate change and hurricanes is more of a stretch than Maue and NOAA let on. Here is a summary of 30 peer-reviewed studies showing no connection to either hurricane frequency or intensity. Most of these studies are more recent than the end of the data record cited by NOAA. And in fact, many of these studies find support for a negative link between global temperatures and hurricane activity.

One of the prominent alarmists in the climate research community is Penn State’s Michael Mann, who has famously claimed that hurricanes are more frequent now than at any time in the past 1,000 years. He based his conclusions on highly speculative hurricane “proxies” identified in layers of sediment. Mann’s claims and research technique have been called into question by other climate scientists, who have arrived at contrary results in their own research. Lest anyone forget, Mann was implicated in allegations of data manipulation related to the East Anglia climate scandal. Though cleared by a group of tenured professors at his own university, a number of climate scientists believe Mann violated scientific standards.

The claim that global warming will cause hurricanes to become increasingly intense relies on elevated sea surface temperatures. This year, temperatures in the Gulf of Mexico are elevated and are said to have had a role in strengthening Harvey as it approached the Gulf Coast. Texas, however, has experienced as many landfalls of major hurricanes with cooler Gulf waters as with warmer waters. And Irma strengthened in a part of the Atlantic without such warm temperatures. Instead, minimal wind shear was implicated as a factor contributing to Irma’s strength.

In general, Atlantic temperatures have been relatively warm since the late 1990s, a fact that most scientists would at least partially attribute to the “Atlantic multi-decadal oscillation“, a regular cycle in water temperatures that repeats with a period of multiple decades. Potentially adding to that temperature increase is a controversial change in NOAA’s calibration of sea surface temperatures, as an increasing share of those readings are taken from buoys rather than ship-board measurement. There is some suspicion that NOAA’s adjustments “cool the past” more than is justified, a suspicion that was heightened by allegations from one whistle-blowing NOAA scientist early this year. Then, there is the contention that the sea surface temperature makes little difference if it is matched by an increase in air temperature.

Overall, NOAA says the combination of frequency and intensity of tropical cyclones will increase by 2%-11% over the rest of this century. As Roy Spencer notes, that is not a terribly alarming figure given the risks people have always willingly accepted by living in coastal areas. In any case, the range is based on models of climate behavior that are of questionable reliability. And like past temperature predictions produced by carbon-forcing climate models, it is likely to be a gross overestimate. Here is Roger Pielke, Sr., who is quoted in this wide-ranging post on hurricanes and climate at the Fabius Maximus web site:

“Model projections of hurricane frequency and intensity are based on climate models. However, none have shown skill at predicting past (as hindcasts) variations in hurricane activity (or long term change in their behavior) over years, decades, and longer periods. Thus, their claim of how they will change in the future remains, at most, a hypothesis (i.e. speculation). When NOAA, IPCC and others communicate to the media and public, to be scientifically honest, they should mention this.”

Despite the spike in activity this year, strong hurricanes are intermittent and fairly rare, which makes it difficult to establish reliable statistical connections between hurricanes and other forces. Moreover, the degree of error in measuring global or regional temperature itself is much larger than is generally acknowledged, and the global warming “signal” is very weak. As we say in the statistical analysis business, noisy data are compatible with diverse hypotheses. The relationship between hurricanes and climate change is a prime example.
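The point about noisy data is easy to illustrate with a toy simulation. The sketch below uses entirely hypothetical numbers (not actual hurricane or temperature records): a weak linear trend is buried in large year-to-year noise, and an ordinary least-squares fit recovers a slope estimate whose standard error is large relative to the signal. With data like these, both "there is a trend" and "there is no trend" remain plausible hypotheses.

```python
import random

# Hypothetical example: 40 "annual" observations with a weak underlying
# trend (the signal) swamped by large natural variability (the noise).
random.seed(42)
true_slope = 0.02          # weak trend per year (assumed, for illustration)
noise_sd = 1.0             # large year-to-year variability (assumed)
years = list(range(40))
obs = [true_slope * t + random.gauss(0, noise_sd) for t in years]

# Ordinary least-squares slope and its standard error.
n = len(years)
mx = sum(years) / n
my = sum(obs) / n
sxx = sum((t - mx) ** 2 for t in years)
slope = sum((t - mx) * (y - my) for t, y in zip(years, obs)) / sxx
resid = [y - (my + slope * (t - mx)) for t, y in zip(years, obs)]
se = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5

# When the standard error is comparable to the true slope, the estimate
# is often statistically indistinguishable from zero even though a real
# trend exists -- and a spurious "trend" can just as easily appear.
print(f"estimated slope = {slope:.3f}, std. error = {se:.3f}")
```

The magnitudes here are made up, but the moral is general: a weak signal plus noisy measurement leaves wide room for competing interpretations, which is precisely the situation with hurricane trends.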
