The latest entry in the scare-mongering literature of global warming, published this month in Nature, purports to show that warming will lead to more suicides! I’m not sure whether these researchers deserve an award for naiveté or cynicism, but they should get one or the other. The lead author is listed as Marshall Burke of Stanford University.
The basic finding of their research is that an increase in average monthly temperature of one degree Celsius (1.8 degrees Fahrenheit) increases the monthly suicide rate by 0.68% (or 0.42% when accounting for the previous month’s temperature as well). They might have used the preferred verbiage “is associated with”, rather than “increases”, because they surely know that correlation is not the same as causation, but perhaps they wished to impress the news media. Let’s put their result in perspective: the annual U.S. suicide rate per 100,000 persons was 11.64 over the years 1968-2004 in their sample. A 0.68% increase in the suicide rate would have brought that up to roughly 11.72. Of course, the average U.S. surface temperature did NOT increase by 1 degree Celsius over that period — it was about half that, and temperatures have been relatively flat since then.
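The arithmetic is easy to verify with a quick back-of-the-envelope sketch, using only the figures quoted above:

```python
# Back-of-the-envelope check using only the figures quoted in the paper
base_rate = 11.64             # annual U.S. suicides per 100,000, 1968-2004 average
effect = 0.0068               # the paper's 0.68% increase per 1 degree C
print(round(base_rate * (1 + effect), 2))   # 11.72
```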
The real problem here is that most of the variation in temperatures across the sample used by Burke and his co-authors is seasonal and geographical. While they claim to have accounted for such confounding influences using non-parametric controls, they give few specifics, so I am unconvinced. It has long been known that suicides tend to be seasonal and are higher in the warmer months of the year. The reasons cited vary, including a boost provided by warmth in the energy needed to execute a suicide plan, “inflammatory chain reactions” from high pollen counts, seasonal peaks in bipolar disorder, and stress from greater social interactions during warm weather. These are seasonal phenomena that have no bearing on the question at hand: long-term climate trends. And let’s face it: if warmer weather gives you the energy to kill yourself, the temperature is probably not the problem.
The authors also report a positive “effect” of temperatures on suicides using annual data, but with a rather large variance. This result probably captures geographical variation in suicide rates, though again, the authors claim to have made adjustments. Southern states tend to have high suicide rates, but no one has suggested that warm, southern climates are to blame. Instead, there are other socioeconomic factors that probably account for this regional variation. I suspect that this is another source of the correlation the authors use to project forward as a likely impact of global warming. (While the inter-mountain West tends to have high suicide rates relative to other regions, many of those states are lightly populated, so they would receive low weights in any analysis of the kind discussed here.)
Finally, the trend toward slightly warmer temperatures between 1968 and the late 1990s was spurred largely by a series of strong El Niño events, especially in 1997-98. Suicide rates in the U.S., on the other hand, reached a high in the mid-1970s, ran slightly lower until hitting another peak in the mid-1980s, and then tapered through the late 1990s even as temperatures spiked. Since 1998, suicides have trended up as temperature trends flattened during the so-called “global warming hiatus”, which is ongoing. This sequence not only contradicts the authors’ narrative; it reinforces the fact that the variation exploited in the samples may well be seasonal and geographical, and not related to climate trends.
An issue of which Burke, et al. demonstrate no awareness is the exaggerated statistical significance of meaningless effects in very large samples. This has been called the “p-value problem” because large samples can lead to vanishingly small p-values (which measure statistical significance). In a very large sample, any small difference may appear to be statistically significant. It’s a well-known pitfall in empirical work. A suicide is what’s known in the statistical literature as a “rare event”, given its annual incidence of about 0.01% of the population. I submit that the estimated impact of a 1% change in that rate, a change of about 0.0001 percentage points, is well-nigh meaningless.
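The point is easy to illustrate with a hypothetical calculation of my own (this is not the authors' method, just a textbook two-sample z-test for proportions): the very same 1% relative change in a 0.01% base rate is statistically invisible in a modest sample, but "highly significant" in an enormous one.

```python
import math

def z_stat(p_base, p_alt, n):
    """Two-sample z-statistic for a difference in proportions (normal approximation)."""
    se = math.sqrt(p_base * (1 - p_base) / n + p_alt * (1 - p_alt) / n)
    return (p_alt - p_base) / se

p = 0.0001           # ~0.01% annual incidence: a "rare event"
p_up = p * 1.01      # the same rate after a 1% relative increase

print(z_stat(p, p_up, 100_000))         # tiny z-statistic, nowhere near significance
print(z_stat(p, p_up, 10_000_000_000))  # a huge n makes the identical difference "significant"
```

The effect size never changes; only the sample size does.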
But the authors, undaunted, do their very best to make it seem meaningful. First, they pick a sub-sample that yields a somewhat higher estimated effect. Then they apply it to a future climate change scenario that climate researchers consider extreme and “extremely unlikely”. They calculate the cumulative increase in suicides implied by that estimate out to 2050 — 32 years — for the U.S. and Mexico combined: about 22,000 extra suicides (they give a confidence interval of 9,000 to 39,000). That would be a lot, of course, but aggregating over many years using a high-end estimate and an extreme scenario can make an otherwise tiny effect appear large. And remember, their confidence interval is tightened considerably via the use of many observations on essentially irrelevant seasonal and geographic variation.
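Putting that headline number back on an annual, per-capita basis shows how modest it is. (The combined U.S.-Mexico population of roughly 460 million below is my own ballpark assumption, used only for scale; it is not from the paper.)

```python
# Spread the cumulative projection over years and population
extra_suicides = 22_000       # the paper's cumulative projection out to 2050
years = 32
population = 460_000_000      # assumed combined U.S.-Mexico population (rough)
per_year = extra_suicides / years            # roughly 690 per year
per_100k = per_year / population * 100_000   # roughly 0.15 per 100,000 per year
print(round(per_year), round(per_100k, 2))
```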
Burke and his co-authors have succeeded in publishing a piece of research that is not just flimsy, but that they apply in a way that is grossly misleading. They made it as ripe and plump as possible for promotion by the news media, which seems to love a great scare story. I might just as easily claim that as declines in income are associated with higher suicides, efforts at carbon mitigation requiring high taxes and punitive consumer rates for electric power will lead to an increase in suicides. And I could “prove” it with statistics. Then we would have a double-warming whammy! But I have a better idea: let’s expose bad research for what it is, and that includes just about all of the literature that warns of catastrophe from global warming.
Despite evidence to the contrary, there’s one thing climate change alarmists seem to consider a clincher: ocean warming. Their stylized account has the seas absorbing heat from our warming atmosphere as human activity forces carbon emissions into the air. That notion seems to be reinforced, at least in the popular imagination, by the fact that the sea is a “carbon sink”, but that is a matter of carbon sequestration and not a mechanism of ocean warming. While ocean temperatures have warmed slightly over the past few decades, that warming is almost entirely coincidental, rather than a result of slightly warmer air temperatures.
Heat and the Hydrosphere
There is no doubt that the oceans store heat very efficiently, but that heat comes primarily from solar radiation and undersea geothermal sources. In fact, water stores heat far more efficiently than the atmosphere. According to Paul Homewood, a given cross section of sea water to a depth of just 2.6 meters is capable of holding as much heat as a column of air of the same width extending from the ocean surface to the outermost layers of the atmosphere! (See here for an earlier reference.) However, that does not imply that the oceans are very effective at drawing heat from warmer air or particularly carbon back-radiation. Both the air and water draw heat from solar radiation, and how much in any given location depends on the Sun’s angle in the sky.
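Homewood's figure is easy to sanity-check from standard physical constants (the values below are textbook approximations of my own choosing, not taken from his piece): the mass of the whole air column above a square meter is roughly sea-level pressure divided by gravity, and equating heat capacities gives the equivalent depth of sea water.

```python
# Rough check of the ~2.6 meter claim from standard constants
P = 101_325          # sea-level atmospheric pressure, Pa
g = 9.81             # gravitational acceleration, m/s^2
cp_air = 1005        # specific heat of air at constant pressure, J/(kg K)
rho_water = 1025     # density of sea water, kg/m^3
c_water = 3990       # specific heat of sea water, J/(kg K)

air_mass = P / g                                  # ~10,300 kg of air above each m^2
depth = air_mass * cp_air / (rho_water * c_water) # sea-water depth with equal heat capacity
print(round(depth, 1))                            # close to the 2.6 m cited
```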
A solid guide is that air temperatures are heavily influenced by water temperatures, but not as much vice versa. When temperatures in the upper layers of the ocean rise from natural forces, including reduced upward circulation from greater depths, that heat passes into the atmosphere through radiation and the evaporation of water vapor. Homewood notes that El Niño patterns make the influence of the Pacific Ocean waters on climate pretty obvious. The impact of the Gulf Stream on European climates is also instructive.
The Indian Ocean accounted for about half of the sea warming that occurred within the globe’s top 700 meters of waters over the years 2000 – 2019, though the Indian Ocean represents only about 20% of the world’s sea surface. The authors of that research found that the warming was not caused by trends in surface forcing of any kind, including warmer air temperatures. They said the ocean warming:
“… has been driven by significant changes in oceanic fluxes and not by surface forcing. … the ocean has been driving a rapid increase in Indian Ocean heat content.”
This was consistent with an earlier study of global sea temperatures covering the period 1984 – 2006 that found:
“… diminished ocean cooling due to vertical ocean processes … A conclusion is that natural variability, rather than long-term climate change, dominates the SST [sea surface temperature] and heat flux changes over this 23-yr period.”
It’s a Water World
Heat released by the oceans tends to dominate variations in global temperatures. A 2018 study found that evaporative heat transfer to the atmosphere from the oceans was closely associated with variations in air temperatures:
“When the atmosphere gets extra warm it receives more heat from the ocean, when it is extra cool it receives less heat from the ocean, making it clear that the ocean is the driving force behind these variations. …
The changes in solar radiation received at the Earth’s surface are clearly a trigger for these variations in global mean temperature, but the mechanisms by which these changes occur are a bit more complex and depend on the time-scale of the changes.”
Measurement
Willis Eschenbach reviewed a prominent study of ocean temperature changes and noted that the authors’ estimate of total warming of the oceans was quite small:
“… over the last sixty years, the ocean has warmed a little over a tenth of one measly degree … now you can understand why they put it in zettajoules—it’s far more alarming that way.”
Eschenbach goes on to discuss the massive uncertainty underlying measurements of ocean temperatures, particularly below a depth of 2,000 meters, but even well above that depth given the extremely wide spacing of so-called ARGO floats. However, the relative stability of the point estimates over 60 years is noteworthy, not to mention the “cold water” doused on alarmist claims about ocean overheating.
Sun Engine
Ocean warmth begins with energy from the Sun and from the deep interior of the Earth. The force of solar energy is greatest in the tropics, where sunlight is perpendicular to the surface of the Earth and is least dispersed by the thickness of the atmosphere. The sun’s radiative force is smallest in the polar regions, where the angle of its light is acute. As Anthony Watts says:
“All elements of Earth’s weather, storm fronts, hurricanes, the jet stream, and even ocean currents, are driven to redistribute energy from the tropics to the poles.”
Both land and sea absorb heat from the Sun and from volcanic activity, though the heat is moderated by the sea. That moderation is especially impactful in the Southern Hemisphere, which has far less land area, greater exposure of sea surface to the Sun, and about half of the average ocean temperature variation experienced in the North.
Ultimately, the importance of natural sunlight to air and sea temperatures can’t be overemphasized. Henrik Svensmark and some co-authors have estimated that a 15% reduction in cosmic ray flux following a coronal mass ejection leads to a reduction in cloud cover within roughly 9 – 12 days. The ultimate increase in the Earth’s “energy budget” over about a week’s time is about the same size as a doubling of CO2, which certainly puts things in perspective. However, the oceans, and hence cloud cover, moderate the impact of the Sun, with or without the presence of additional greenhouse gases forced by human activity.
Vapors
The importance of evaporation from bodies of water also deserves great emphasis. No one doubts the massive influence of greenhouse gases (GHGs) on the climate. Water vapor accounts for about 90% of GHGs, and it originates predominantly from oceans. Meanwhile, carbon dioxide accounts for less than 4% of GHGs, and it appears that only a small part is from anthropogenic sources (and see here and below).
Changing levels of water vapor dominate variation in GHG concentrations. Water vapor is also a critical input to cloud formation, a phenomenon that climate models are generally ill-equipped to explain. Clouds reflect solar radiation back into space, reducing the Sun’s net contribution to the Earth’s energy budget. On the other hand, clouds can trap heat in the lower layers of the atmosphere. The globe has an average of 60 – 70% cloud cover, and most of that is over the oceans. Increased cloud cover generally leads to declines in temperature.
A 2015 study identified a process through which the sea surface has an unexpectedly large impact on climate. This was from the formation of isoprene, a film on the ocean surface, which leads to more cloud formation. In addition to biological sources, isoprene was found to originate, surprisingly, from the effect of sunlight.
The Big Sink
Man-made emissions of CO2 constitute only about 5% of naturally discharged CO2, which is roughly matched by natural removal. CO2 is absorbed, dissolved, or transformed in a variety of ways on both land and sea, but the oceans collectively represent the world’s largest carbon sink. They hold about 50 times more CO2 than the atmosphere. Carbon is stored in sea water at great depths, and it enhances undersea vegetation just as it does on land. It is sequestered in a variety of sea organisms as calcium carbonate and is locked in sediments as well. A longstanding question is whether there is some limit on the capacity of the oceans and other sinks to store carbon, but apparently the uptake over time has remained roughly constant at just under 50% of all natural and man-made CO2 emissions (also see here). So far, we don’t appear to be approaching any sort of “saturation point”.
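The rough constancy of that uptake share can be illustrated with a toy one-box model (entirely my own construction with hypothetical parameters, not an actual carbon-cycle model): if sinks remove a fixed fraction of the excess CO2 in the air each year while emissions grow exponentially, the airborne fraction of cumulative emissions settles near a constant on its own.

```python
# Toy one-box carbon sketch with assumed, illustrative parameters
k = 0.02                  # assumed fraction of excess airborne CO2 removed per year
growth = 0.02             # assumed annual growth rate of emissions
e = 10.0                  # initial annual emissions, arbitrary units
C_excess, emitted = 0.0, 0.0
for year in range(200):
    C_excess += e - k * C_excess   # add this year's emissions, sinks remove a share
    emitted += e
    e *= 1 + growth
print(round(C_excess / emitted, 2))   # airborne fraction settles near 0.5
```

The steady airborne fraction here is a property of the model's constant removal rate, which is consistent with the roughly constant uptake described above.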
One claim about the rising carbon stored undersea is that it will drive down the oceans’ pH levels. In other words, it will lead to “ocean acidification” and harm a variety of marine life. Rud István has ridiculed that term (quite rightly) because slightly less alkaline sea water does not make it “acidic”. More substantively, he notes the huge natural variations in ocean pH levels across different marine environments, the exaggeration inherent in some estimates of pH changes that do not account for physical buffering, and the fact that the impact on many organisms is inconsistent with the presumed harms of reduced pH. In fact, errors in some of the research pointing to those harms have been acknowledged. In addition, the much feared “coral crisis” seems to have been a myth.
Conclusion
The upper layers of the oceans have warmed somewhat over the past 60 years, but the warming had natural causes. Heat transfer from the atmosphere to the hydrosphere is relatively minor compared to the absorption of heat by oceans via solar forcings. It is also minor compared to the transfer of heat from oceans to surface air. As Jim Steele has explained it:
“Greenhouse longwave energy penetrates only a few microns into the ocean surface and even less into most soils, but the sun’s shortwave energy passes much more deeply into the ocean.”
It’s reasonable to concede that warmer air temperatures via man-made GHGs might be a minor reinforcement to natural sources of ocean warming, or it might slightly moderate ocean cooling. However, measuring that contribution would be difficult against the massive background of natural forcings on ocean temperatures.
Oceans are dominant in terms of heat storage from natural forcings and in terms of carbon sequestration. In fact, the oceans have thoroughly outperformed alarmist projections as a carbon sink. Dire prognostications of the effect of carbon dioxide on marine life have been drastically over-emphasized as well.
I added the hyperlinks to Javier’s comment. The last two items on his list emphasize a benign aspect of the warming we’ve experienced since the late 1970s. After all, cold temperatures are far deadlier than warm temperatures.
Here is a disclaimer: my use of the term “global warming” refers to the fact that averages of measured temperatures have risen in a few fits and starts over the past four decades. I do not use the term to mean a permanent trend induced by human activity, since that time span is very short in climatological terms, and the observed increase is well within the historical range of natural variation.
Few seem aware that the surface temperature record is plagued by an obvious issue: the siting of most weather stations in urban environments. In fact, urban weather stations account for 82% of total stations in the U.S., as Jim Steele writes in “Our Urban ‘Climate Crisis’”. Temperatures run hot in cities due to the heat-absorbing characteristics of building materials and the high proportion of impervious ground cover. And some stations well outside of metropolitan areas are also situated near concrete and pavement. There is little doubt that urbanization and thoughtless siting decisions for weather stations have corrupted temperature measurements and exaggerated surface warming trends.
Hot summer days always arouse expressions of climate alarm. However, increases in summer temperatures, and daytime temperatures, have been relatively modest compared to increases in winter and nighttime temperatures. In Roy Spencer’s post (also linked above), he reports that 80% of the U.S. warming observed by a NASA satellite system (AIRS) from September 2002 to March 2019 occurred at night.
Of course, climate alarmists also claim that global warming makes temperatures more volatile. So, they argue, there are now more very hot days even if the change in the average summer temperature is modest. The facts do not support that claim, however. Indeed, the world has experienced less temperature volatility as global temperatures have risen. And less extreme weather, as it happens, is contrary to another theme in the warmist narrative.
There is some reason to believe that the relative increase in nighttime temperature is connected to the urban heat island effect. Pavement, concrete, and other materials retain heat overnight. Thus, increasing urbanization leads to nighttime temperatures that do not fall from their daily highs as much as they did a few decades back. The magnification of daytime heating is not as pronounced as the effect of retained heat overnight, which causes the diurnal temperature range to decrease. But I should note that some rural farmers insist that nighttime lows have increased relative to daytime highs there as well, and Roy Spencer himself is not confident that the satellite temperature data on which his finding was based reflects a strong urban heat island effect.
For perspective, it’s good to remember that we live in the midst of an interglacial period. These are relatively brief, temperate intervals between lengthier glacial periods (see here, and more from Javier here). The current interglacial is well advanced, having begun about 11,700 years ago, but Javier estimates that it could last for another 1,500 years. That would be longer than the historical average. At the peak of the last interglacial period, temperatures were about 2°C higher than today and sea levels were 5 meters higher. The last interglacial ended about 120,000 years ago, but the historical average time between interglacials is only about 41,000 years. These low frequency changes in the global climate are generally driven by the Earth’s axial tilt (obliquity), recurring cycles in the shape of our elliptical orbit around the Sun (eccentricity), and the Earth’s solar exposure (insolation) and albedo.
Biased surface temperature records have both inspired and reinforced the sense of panic surrounding global warming. Few observers seem to understand the existence of a strong bias, let alone its source: the urban heat island effect. And few seem to realize that most of the warming we’ve experienced since the 1970s has occurred at night, not during the day, and that these changes are well within the range of natural variation. Dramatic climate change happens at both long and short time scales for reasons that are largely astronomical. The lengthy historical record accumulated by paleoclimatologists shows that current concerns over global warming are exaggerated. I’m quite confident that mankind will find ways to adapt to climate change in either direction, but some global warming might be beneficial once the next glacial period begins.
First, a preliminary issue: many resources qualify as commons in the very broadest sense, yet free societies have learned over time that many resources are used much more productively when property rights are assigned to individuals. For example, modern agriculture owes much to defining exclusive property rights to land so that conflicting interests don’t have to compete (e.g., the farmer and the cowman). Federal land is treated as a commons, however. There is a rich history on the establishment of property rights, but within limits, the legal framework in place can define whether a resource is treated as a commons, a club good, or private property. The point here is that there are substantial economic advantages to preserving strong property rights, rather than treating all resources as communal.
The authors of the planetary commons (PC) paper present a rough sketch for governance over use of the planet’s resources, given their belief that a planetary crisis is unfolding before our eyes. The paper has two main thrusts as I see it. One is to broadly redefine virtually all physical resources as common pool interests because their use, in the authors’ view, may entail some degree of external cost involving degradation of the biosphere. The second is to propose centralized, “planetary” rule-making over the amounts and ways in which those resources are used.
It’s an Opinion Piece
The PC paper is billed as the work product of a “collaborative team of 22 leading international researchers”. This group includes four attorneys (one of whom was a lead author) and one philosopher. Climate impact researchers are represented as well; they undoubtedly helped shape the assumptions about climate change and its causes that drive the PC’s theses. (More on those assumptions in a section below.) There are a few social scientists of various stripes among the credited authors, one meteorologist, and a few “sustainability”, “resilience”, and health researchers. It’s quite a collection of signees, er… “research collaborators”.
Grabby Interventionists
The reasoning underlying a “planetary commons” (PC) is that the planet’s biosphere qualifies as a commons. The biosphere, so defined, must encompass virtually any public good like air and sunshine, any common good like waterways, and even private goods and club goods. After all, any object can play host to tiny microbes regardless of ownership status. So the PC authors’ characterization of the planet’s biosphere as a commons is quite broad in terms of conventional notions of resource attributes.
We usually think of spillover or external costs as arising from some use of a private resource that imposes costs on others, such as air or water pollution. However, mere survival requires that mankind exploit both public and non-public resources, acts that can always be said to impact the biosphere in some way. Efforts to secure shelter, food, and water all impinge on the earth’s resources. To some extent, mankind must use and shape the biosphere to succeed, and it’s our natural prerogative to do so, just like any other creature in the food chain.
Even if we are to accept the PC paper’s premise that the entire biosphere should be treated as a commons, most spillovers are de minimis. From a public policy perspective, it makes little sense to attempt to govern over such minor externalities. Monitoring behavior would be costly, if not impossible, at such an atomistic level. Instead, free and civil societies rely on a high degree of self-governance and informal enforcement of ethical standards to keep small harms to a minimum.
Unfortunately, the identification and quantification of meaningful spillover costs is not always clear-cut. This has led to an increasingly complex regulatory environment, an increasingly litigious business environment, and efforts by policymakers to manage the detailed inputs and outputs of the industrial economy.
All of that is costly in its own right, especially because the activities giving rise to those spillovers often enable large welfare enhancements. Regulators and planners face great difficulties in estimating the costs and benefits of various “correctives”. The very undertaking creates risk that often exceeds the cost of the original spillover. Nevertheless, the PC paper expands on the murkiest aspects of spillover governance by including “… all critical biophysical Earth-regulating systems and their functions, irrespective of where they are located…” as part of a commons requiring “… additional governance arrangements….”
Adoption of the PC framework would authorize global interventions (and ultimately local interventions, including surveillance) on a massive scale based on guesswork by bureaucrats regarding the evolution of the biosphere.
Ostrom Upside Down
Not only would the PC framework represent an expansion of the grounds for intervention by public authorities, it seeks to establish international authority for intervention into public and private affairs within sovereign states. The authors attempt to rationalize such far-reaching intrusions in a rather curious way:
“Drawing on the legacy of Elinor Ostrom’s foundational research, which validated the need for and effectiveness of polycentric approaches to commons governance (e.g., ref. 35, p. 528, ref. 36, p. 1910), we propose that a nested Earth system governance approach be followed, which will entail the creation of additional governance arrangements for those planetary commons that are not yet adequately governed.”
Anyone having a passing familiarity with Elinor Ostrom’s work knows that she focused on the identification of collaborative solutions to common goods problems. She studied voluntary and often strictly private efforts among groups or communities to conserve common pool resources, as opposed to state-imposed solutions. Ostrom accepted assigned rights and pricing solutions to managing common resources, but she counseled against sole reliance on market-based tools.
Surely the PC authors know they aren’t exactly channeling Ostrom:
“An earth system governance approach will require an overarching global institution that is responsible for the entire Earth system, built around high-level principles and broad oversight and reporting provisions. This institution would serve as a universal point of aggregation for the governance of individual planetary commons, where oversight and monitoring of all commons come together, including annual reporting on the state of the planetary commons.”
Polycentricity was used by Ostrom to describe the involvement of different, overlapping “centers of authority”, such as individual consumers and producers, cooperatives formed among consumers and producers, other community organizations, local jurisdictions, and even state or federal regulators. Some of these centers of authority supersede others in various ways. For example, solutions developed by cooperatives or lower centers of authority must align with the legal framework within various government jurisdictions. However, as David Henderson has noted, Ostrom observed that management of pooled resources at lower levels of authority was generally superior to centralized control. Henderson quotes Ostrom and a co-author on this point:
“When users are genuinely engaged in decisions regarding rules affecting their use, the likelihood of them following the rules and monitoring others is much greater than when an authority simply imposes rules.”
The authors of the PC have something else in mind, and they bastardize the spirit of Ostrom’s legacy in the process. For example, the next sentence is critical for understanding the authors’ intent:
“If excessive emissions and harmful activities in some countries affect planetary commons in other areas—for example, the melting of polar ice—strong political and legal restrictions for such localized activities would be needed.”
Of course, there are obvious difficulties in measuring impacts of various actions on polar ice, assigning responsibility, and determining the appropriate “restrictions”. But in essence, the PC paper advocates for a top-down model of governance. Polycentrism is thus reduced to “you do as we say”, which is not in the spirit of Ostrom’s research.
Planetary Governance
Transcending national sovereignty on questions of the biosphere is key to the authors’ ambitions. At a bare minimum, the authors desire legally-binding commitments to international agreements on environmental governance, unlike the unenforceable promises made for the Paris Climate Accords:
“At present, the United Nations General Assembly, or a more specialized body mandated by the Assembly, could be the starting point for such an overarching body, even though the General Assembly, with its state-based approach that grants equal voting rights to both large countries and micronations, represents outdated traditions of an old European political order.”
But the votes of various “micronations” count for zilch when it comes to real “claims” on the resources of other sovereign nations! Otherwise, there is nothing “voluntary” about the regime proposed in the PC paper.
“A challenge for such regimes is to duly adapt and adjust notions of state sovereignty and self-determination, and to define obligations and reciprocal support and compensation schemes to ensure protection of the Earth system, while including comprehensive stewardship obligations and mandates aimed at protecting Earth-regulating systems in a just and inclusive way.”
So there! The way forward is to adopt the broadest possible definition of market failure and global regulation of any and all private activity touching on nature in any way. And note here a similarity to the Paris Accords: achieving commitments would fall to national governments whose elites often demonstrate a preference for top-down solutions.
Ah Yes, Redistribution
It should be apparent by now that the PC paper follows a now well-established tradition in multi-national climate “negotiations” to serve as subterfuge for redistribution (which, incidentally, includes the achievement of interspecies justice):
“For instance, a more equal sharing of the burdens of climate stabilization would require significant multilateral financial and technology transfers in order not to harm the poorest globally (116).”
The authors insist that participation in this governance would be “voluntary”, but the following sentence seems inconsistent with that assurance:
“… considering that any move to strengthen planetary commons governance would likely be voluntarily entered into, the burdens of conservation must be shared fairly (115).”
Wait, what? “Voluntary” at what level? Who defines “fairness”? The authors approvingly offer this paraphrase of the words of Brazilian President Lula da Silva,
“… who affirmed the Amazon rainforest as a collective responsibility which Brazil is committed to protect on behalf of all citizens around the world, and that deserves and justifies compensation from other nations (117).”
Let Them Eat Cake
Furthermore, the PC would require degrowth and so-called “sufficiency” for thee (i.e., be happy with less), if not for those who’ll design and administer the regime.
“… new principles that align with novel Anthropocene dynamics and that could reverse the path-dependent course of current governance. These new principles are captured under a new legal paradigm designed for the Anthropocene called earth system law and include, among others, the principles of differentiated degrowth and sufficiency, the principle of interconnectivity, and a new planetary ethic (e.g., principle of ecological sustainability) (134).”
If we’re to take the PC super-regulators at their word, the regulatory regime would impinge on fertility decisions as well. Just who might we trust to govern humanity thusly? If we’re wise enough to apply the Munger Test, we wouldn’t grant that kind of power to our worst enemy!
Global Warmism
The underlying premise of the PC proposal is that a global crisis is now unfolding before our eyes: anthropogenic global warming (AGW). The authors maintain that emissions of carbon dioxide are the cause of rising temperatures, rapidly rising sea levels, more violent weather, and other imminent disasters.
“It is now well established that human actions have pushed the Earth outside of the window of favorable environmental conditions experienced during the Holocene…”
“Earth system science now shows that there are biophysical limits to what existing organized human political, economic, and other social systems can appropriate from the planet.”
For a variety of reasons, both of these claims are more dubious than one might suppose based on popular narratives. As for the second of these, mankind’s limitless capacity for innovation is a more powerful force for sustainability than the authors would seem to allow. On the first claim, it’s important to note that the PC paper’s forebodings are primarily based on modeled, prospective outcomes, not historical data. The models are drastically oversimplified representations of the earth’s climate dynamics driven by exogenous carbon forcing assumptions. Their outputs have proven to be highly unreliable, overestimating warming trends almost without exception. These models exaggerate climate sensitivity to carbon forcings, and they largely ignore powerful natural forcings such as variations in solar irradiance, geological heating, and even geological carbon forcings. The models are also notorious for their inadequate treatment of feedback effects from cloud cover. Their predictions of key variables like water vapor are wildly in error.
The measurement of the so-called “global temperature” is itself subject to tremendous uncertainty. Weather stations come and go. They are distributed very unevenly across land masses, and measurement at sea is even sketchier. Averaging all these temperatures would be problematic even if there were no other issues… but there are. Individual stations are often sited poorly, including distortions from heat island effects. Aging of equipment creates a systematic upward bias, but correcting for that bias (via so-called homogenization) causes a “cooling the past” bias. It’s also instructive to note that the increase in global temperature from pre-industrial times actually began about 80 years prior to the onset of more intense carbon emissions in the 20th century.
Climate alarmists often speak in terms of temperature anomalies, rather than temperature levels. In other words, to what extent do temperatures differ from long-term averages? The magnitude of these anomalies, using the past several decades as a base, tends to be anywhere from zero degrees to well above one degree Celsius, depending on the year. Relative to absolute temperature levels, the anomalies are a small fraction. Given the uncertainty in temperature levels, the anomalies themselves are dwarfed by the noise in the original series!
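To make the scale comparison concrete, here is a rough back-of-the-envelope calculation. The figures are illustrative round numbers (an approximate global mean surface temperature of 288 Kelvin and a large 1°C anomaly), not official records:

```python
# Illustrative comparison of a temperature anomaly to the absolute
# temperature level it is measured against. Round numbers only.

MEAN_SURFACE_TEMP_K = 288.0   # approximate global mean surface temperature
ANOMALY_C = 1.0               # a large anomaly in degrees Celsius (= Kelvin)

# Fraction of the absolute (Kelvin) level represented by the anomaly
fraction = ANOMALY_C / MEAN_SURFACE_TEMP_K
print(f"Anomaly as share of absolute level: {fraction:.2%}")  # about 0.35%
```

Even a full degree of anomaly amounts to roughly a third of one percent of the absolute temperature level, which is the point about signal versus noise.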
Pick Your Own Tipping Point
It seems that “tipping point” scares are heavily in vogue at the moment, and the PC proposal asks us to quaff deeply of these narratives. Everything is said to be at a tipping point into irrecoverable disaster that can be forestalled only by reforms to mankind’s unsustainable ways. To speak of the possibility of other causal forces would be a sacrilege. There are supposed tipping points for the global climate itself as well as tipping points for the polar ice sheets, the world’s forests, sea levels and coastal environments, severe weather, and wildlife populations. But none of this is based on objective science.
For example, the 1.5 degree limit on global warming is a wholly arbitrary figure invented by the IPCC for the Paris Climate Accords, yet the authors of the PC proposal would have us believe that it was some sort of scientific determination. And it does not represent a tipping point. Cliff Mass explains that climate models do not behave as if irreversible tipping points exist.
Likewise, the rise of sea levels has not accelerated from prior trends, so it has nothing to do with carbon forcing.
One thing carbon forcings have accomplished is a significant greening of the planet, which if anything bodes well for the biosphere.
What about the disappearance of the polar ice sheets? On this point, Cliff Mass quotes Chapter 3 of the IPCC’s Special Report on the implications of 1.5C or more warming:
“there is little evidence for a tipping point in the transition from perennial to seasonal ice cover. No evidence has been found for irreversibility or tipping points, suggesting that year-round sea ice will return given a suitable climate.”
The PC paper also attempts to connect global warming to increases in forest fires, but that’s incorrect: there has been no increasing trend in forest fires or annual burned acreage. If anything, trends in measures of forest fire activity have been negative over the past 80 years.
Concluding Thoughts
The alarmist propaganda contained in the PC proposal is intended to convince opinion leaders and the public that they’d better get on board with draconian and coercive steps to curtail economic activity. They appeal to the sense of virtue that must always accompany consent to authoritarian action, and that means vouching for sacrifice in the interests of environmental and climate equity. All the while, the authors hide behind a misleading version of Elinor Ostrom’s insights into the voluntary and cooperative husbandry of common pool resources.
One day we’ll be able to produce enough carbon-free energy to accommodate high standards of living worldwide and growth beyond that point. In fact, we already possess the technological know-how to substantially reduce our reliance on fossil fuels, but we lack the political will to avail ourselves of nuclear energy. With any luck, that will soften with installations of modular nuclear units.
Ultimately, we’ll see advances in fusion technology, beamed non-intermittent solar power from orbital collection platforms, advances in geothermal power, and effective carbon capture. Developing these technologies and implementing them at global scales will require massive investments that can be made possible only through economic growth, even if that means additional carbon emissions in the interim. We must unleash the private sector to conduct research and development without the meddling and clumsy efforts at top-down planning that typify governmental efforts (including an end to mandates, subsidies, and taxes). We must also reject ill-advised attempts at geoengineered cooling that are seemingly flying under the regulatory radar. Meanwhile, let’s save ourselves a lot of trouble by dismissing the interventionists in the planetary commons crowd.
Several years ago my wife and I dined on the roof of Mick Fleetwood’s restaurant in beautiful Lahaina on Maui. Sadly, that restaurant was destroyed by this week’s wildfire, along with the famous banyan tree and most of the town. The death toll keeps climbing in what was an unimaginably tragic event.
Climate alarmists, including Hawaii Governor Josh Green and one of my favorite personalities, Steve Parrish, jumped to the immediate and wrong-headed conclusion that this fire was caused by climate change. To them it sounds perfectly sensible, but nothing could be further from the truth. As I noted in a post last week, this year has seen relatively little burning around the globe. It just so happens that the western side of Maui is particularly prone to grass fires, and this one happened to be huge.
Here’s a good explanation from James Steele of the circumstances which culminated in the fire that destroyed Lahaina. He first acknowledges that wildfires are quite common in Hawaii, but very few are caused by lightning strikes:
“According to Hawaii Wildfire Management Organization, 98% of all Hawaiian fires are started by people, of which 75% are due to carelessness. .. As retirees flock to Hawaii seeking the health benefits of a warmer climate, the population has tripled since 1980, which only increases the probability of a careless fire being started.”
Arson is part of the story, but I don’t know of any reports of arson that might be implicated in this conflagration. Fires from electrical lines have also been mentioned as possible triggers.
The western part of Maui tends to be much drier than more eastern parts of the island. Here’s Steele again:
“Lahaina is situated on the leeward side of Maui’s mountains. These highlands wring out the moisture carried by the trade winds, with only 15” of rain falling in Lahaina compared to 300” on the mountains to the east.”
Expanding acreage of invasive grasses has led to excessive fire risk. This has occurred with declining production of agricultural products like pineapples and sugar cane. These “small diameter” grasses dry out very quickly and become dangerous fuel for wind-blown fires. Even worse, a very wet spring led to more growth in the grasses than normally occurs. Once dry weather set in, a tinder box was created in western Maui.
Cliff Mass makes some of the same points, and he offers some detail on strong trade winds that developed last week between a powerful high-pressure system to the north of Maui and Hurricane Dora to the south. And no, climate change is not increasing the frequency or severity of hurricanes. Maui’s position between other islands, and the mountains east of Lahaina, create a funneling effect for the winds. Mass speculates that these high winds blew down power lines, igniting the fires.
The risk of catastrophic fires in West Maui has been known for years. The link summarizes statements made by a director of a non-profit involved in planning and preparing earlier reports on fire risk:
“… significant progress in implementing the community-based work since [her] organization’s inception in 2002. But the necessary ‘enormous infrastructure’ investments have not come.
‘I don’t know if I understood the urgency of those bigger investments.’ ….”
The recommended investments included 70 miles of fire breaks and 90 miles of fuel breaks. Essentially, these breaks would be bare land intended to reduce fire spread and intensity. Apparently that work was never initiated, nor was work begun on other types of infrastructure needed to minimize risk.
The tragic fire in West Maui resulted from a confluence of declining agriculture, invasive and fire-prone grasses, an especially wet spring followed by a dry summer, a few days of unusually strong trade winds, and geography that funnels and focuses the intensity of those winds. The “spark” or “sparks” might well have been from downed power lines, or possibly some other kind of accident, carelessness, or even arson. It was not caused by global warming, as much as the climate change activists might like to convince you. The assertion that fires are becoming more frequent and severe has absolutely no basis in fact.
It happens every summer! It’s been hot, and the news media and professional grifters in the anti-carbon climate-change establishment want us to panic about it. Granted, the weather really was quite hot for several weeks in July across parts of the U.S., Europe, and elsewhere, but it’s cooled off considerably since then, especially in my neck of the woods.
July is typically the warmest month of the year, and July 2023 was the warmest July for the troposphere on the satellite record. (The troposphere is the lowest 13 km of the atmosphere, but that’s an average — it’s thicker toward equatorial latitudes, thinner toward the poles.) However, attribution of this summer’s heat waves to carbon-induced climate change is misplaced. What follows are a few considerations in evaluating this claim, and the lengths to which climate activists go to distort weather data and reporting.
The Biggest Greenhouse Gas
One speculative explanation for the recent heat wave has gained some traction: the eruption of the Hunga Tonga-Hunga Ha’apai volcano in the South Pacific on Jan. 15, 2022 (and see here). This underwater eruption spewed massive quantities of water vapor into the stratosphere, which encircled the globe in fairly short order. Water vapor acts as a greenhouse gas, and it is by far the most important greenhouse gas. This plume of vapor may have affected the climate with a delay, and it is not expected to dissipate for at least a couple of years. However, there are theories that the eruption might have led to some offsetting effects due to the reflective properties of water and ice in the stratosphere. See here for an interesting debate on the estimated effects of this “shock” to the atmosphere.
NASA has estimated that the Hunga Tonga eruption resulted in a 10% increase in atmospheric water vapor, while the European Space Agency puts the increase at 13%. Now, in addition to this added water vapor, we have the early effects of an El Niño event in the Pacific, which may elevate temperatures over the next couple of years.
However, the temperatures in July simply don’t justify the claim that we’re experiencing “unprecedented” warmth. The satellite records go back only to 1979, which is an especially narrow window on climatological scales. The longer record of temperatures shows earlier periods of higher temperatures. For example, U.S. surface temperature records indicate that the 1930s had periods warmer than this July. Moreover, while estimates of paleo-climate data are a matter of great dispute, there is no question that the globe has experienced warmer temperatures in the past, with an ice-free Arctic.
So, was July 3 really the hottest day in history? No way, and the worst part of this warm spell wasn’t even the warmth. Rather, it was the attempts to make weather a political matter, as if public policymakers possess some kind of control knob over weather phenomena, or as if we should bestow upon them dictatorial powers to act on their fantasy.
Longer Trends
There’s plenty of other evidence running contrary to the “hotter-than-any-time-in-history” foolishness. Take a look at trends in hot and cool weather from individual U.S. weather stations over a somewhat longer time span than the satellite record. The red symbols shown on the map below mark stations reporting increases in the number of unusually hot days (heat in the 95th percentile) between 1948 and 2020, with larger symbols corresponding to greater increases in extremely hot days. The blue symbols mark stations reporting increases in the number of unusually cool days (in the 5th percentile) over the same period. The data in this chart is published by the EPA, and it is definitely not alarming.
The next chart shows the so-called Heat Wave Index produced by the EPA. Recent spikes in the index are muted relative to the Dust Bowl days of the 1930s.
Journalism or Exaggeration?
Reports of hot weather in Europe have been distorted as well, often placing more emphasis on forecasts of high temperatures than on the temperatures themselves. It’s almost as if authorities, with the aid of the news media and naive weather reporters, are determined to raise an exaggerated sense of alarm among the citizenry. Almost?
Cold 10x Deadlier Than Heat
The next chart vividly illustrates an attempt to propagandize climate misinformation. Take a look at the left side of this illustration, which appeared in the medical journal Lancet. Note the difference in the horizontal scale for heat deaths vs. cold deaths. The chart on the right side uses equivalent scales for heat vs. cold deaths. This should qualify the journal for some kind of award for mendacity, or perhaps sheer stupidity. It’s the cold that really kills, not the heat! I’m moving south!
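The axis trick can be made concrete with a little arithmetic. The death counts below are hypothetical round numbers chosen only to mirror the roughly 10x cold-versus-heat mortality ratio discussed in the text, not figures from the Lancet study; the point is how mismatched per-unit scales make unequal bars look identical:

```python
# Sketch of how mismatched axis scales can make very different
# quantities look comparable. Death counts are hypothetical round
# numbers, not data from the Lancet study.

heat_deaths = 100_000
cold_deaths = 1_000_000   # roughly 10x the heat deaths

# Honest chart: one scale (deaths per pixel) for both bars
honest_scale = 2_000
print("same scale  :", heat_deaths // honest_scale, "px vs",
      cold_deaths // honest_scale, "px")   # 50 px vs 500 px

# Distorted chart: the heat axis is stretched 10x relative to cold
heat_scale, cold_scale = 200, 2_000
print("mixed scales:", heat_deaths // heat_scale, "px vs",
      cold_deaths // cold_scale, "px")     # 500 px vs 500 px: bars look equal
```

With a common scale, the cold-deaths bar is ten times longer; stretch the heat axis tenfold and the two bars render at identical lengths, which is exactly the visual impression the mismatched charts create.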
Finding Hot Water
And here’s a take-down of some incredible water temperature propaganda. A PBS News Hour reporter has pushed claims that South Florida water temperatures reached 101 degrees this summer. The claim rests on a single reading from a buoy not subject to the cooling effects of deep water circulation, one located where fresh water often overlays salt water, which traps heat. Data from other buoys not far away showed much lower temperatures.
Spreads Like Wildfire
Another fallacious claim we hear too often is that global warming is literally causing the world to go up in flames. The facts run contrary to these scare stories. Björn Lomborg notes the following:
“For more than two decades, satellites have recorded fires across the planet’s surface. The data are unequivocal: Since the early 2000s, when 3% of the world’s land caught fire, the area burned annually has trended downward.
“In 2022, the last year for which there are complete data, the world hit a new record-low of 2.2% burned area. Yet you’ll struggle to find that reported anywhere.”
The heavy focus by the media on this year’s wildfires in North America offers a perfect example of the media’s tendency to “cherry pick for clicks”. Africa and Europe have had little burning this year, and in North America, arson has played a conspicuous role (and see here) in the wildfires.
Distorted Measurements
Personally, I have trouble accepting claims that temperatures are any warmer now than they were in my youth, at least where I grew up. My subjective and local assessment aside, there are strong reasons to doubt the reliability and significance of trends in official temperature records. The urban heat-island effect has distorted temperatures by ever greater magnitudes, as growing metropolitan areas absorb heat readily compared to rural green space.
Furthermore, poor siting of weather stations and temperature gauges has become all too common. This includes equipment located at airports and other areas in close proximity to asphalt or concrete. This contributes to an upward bias in more recent temperature data. It’s also worth noting in this context that satellite temperature readings must be calibrated periodically to surface temperatures. If the latter are corrupted in any way, the satellite readings may be corrupted as well.
“Adjusting” the Past
Official historical records also include a variety of “adjustments” to temperature data that raise concerns. Ostensibly, these adjustments are justified by an interest in maintaining a consistent historical record. Changes in equipment or its exact location can create discontinuities, for example. Unfortunately, the adjustments appear to have had a systematic tendency to “cool the past” relative to more recent data. This reinforcement of the warming trend over the past few decades is suspicious, to say the least. It does very little to build confidence in the agencies responsible for these records.
Conclusion
The hot temperatures in July brought the usual deluge of propaganda, including distortions in the reporting of weather phenomena. And we hear increasing calls to force a transition to EVs (which are powered mostly by fossil-fuel electric plants), subsidize intermittent renewable power sources, and end the use of air conditioning and gas stoves. Yet these coercive measures would do nothing to prevent summer heat or climate change generally. Water vapor represents 95% of greenhouse gases, and the huge vapor shock from the Hunga Tonga eruption might well make us prone to warmer temperatures for at least some months to come, mixed with signals from the Pacific El Niño pattern. But these are not evidence of a man-made crisis, despite perverse cheers from those rooting for more draconian state intrusions and an end to growth, or indeed, a reversal in gains to human well-being.
Smoke from this spring’s terrible forest fires in Canada has fouled the air in much of the country and blown into the northeastern U.S. and mid-Atlantic coastal states. The severity of the fires, if they continued at this pace over the rest of the fire season, would break Canadian records for number of fires and burned area.
Large wildfires with smoky conditions occur in these regions from time to time, and it’s not unusual for fires to ignite in the late spring. The article shown above appeared in the New York Tribune on June 5, 1903. Other “dark day” episodes were recorded in New England in 1706, 1732, 1780, 1814, 1819, 1836, 1881, 1894, and 1903, and several times in the 20th century. I list early years specifically because they preceded by decades (even centuries) the era of supposed anthropogenic global warming, now euphemistically known as “climate change”.
More recently, however, in the past 10 years, Quebec experienced relatively few wildfires. That left plenty of tinder in the boreal forests with highly flammable, sappy trees. In May, a spell of sunshine helped dry the brush in the Canadian forests. Then lightning and human carelessness sparked the fires, along with multiple instances of arson, some perpetrated by climate change activists.
On top of all that, poor forest management contributed to the conflagrations. So-called fire suppression techniques have done more harm than good over the years, as I’ve discussed on this blog in the past. David Marcus emphasizes the point:
“For years, Canadian parks officials have been warning that their country does not do enough to cull its forests and now we’re witnessing the catastrophic results.
It’s simple really. Edward Struzik, author of ‘Dark Days at Noon: The Future of Fire’, lays it out well.
‘We have been suppressing fires for so many decades in North America that we have forests that are older than they should be,’ he said. …
‘Prescribed burns are one of the best ways to mitigate the wildfire threat,’ he added.”
Nevertheless, the media are eager to blame climate change for any calamity. That’s one part simple naïveté on the part of young journalists, fresh off the turnip truck as it were, with little knowledge or inclination to understand the history and causes of underlying forest conditions. But many seasoned reporters are all too ready to support the climate change narrative as well. There’s also an element of calculated political misinformation in these claims, abetted by those seeking rents from government climate policies.
Wildfires are as old as time; without good forest management practices they are necessary for forest renewal. Agitation to sow climate panic based on wildfires is highly unscrupulous. There is no emergency except for the need to reform forest management, reduce the fuel load, and more generally, put an end to the waste of resources inherent in government climate change initiatives.
It doesn’t take much due diligence to reveal that certain green “commitments” are flimsy gestures at best. I discussed the poor economics of recycling mandates in a post a few days ago. Here I discuss two other prominent examples of fake virtue: so-called carbon offsets and green bonds. These are devices often utilized by private actors to assuage activists, gain favor with public policymakers, or simply to claim and promote themselves as “zero-footprint”. No doubt many well-intentioned people believe in the goodness of these instruments, blissfully ignorant of the underlying fakery. Of course, this is dwarfed by the broad flimsiness (and cost implications) of claims about climate catastrophe, which is what motivates carbon credits and most green bonds in the first place. This includes “commitments” made by various nations under the Paris Climate Accords, but that is a subject for another day.
Climate Credits
I mentioned Blake Lovewall’s interesting commentary on carbon credits recently. Purchasing these credits is a way of “greenwashing” activities that emit carbon dioxide. Also known as carbon offsets, this is a $2 billion market with growth fueled by a desire by businesses to appeal to environmental activists and “green” investors, and to boost their ESG scores. I’ll quote here from my own piece, which had as its main thrust the waste inherent in wind and solar projects (Lovewall quotes are in blue type):
“The resulting carbon emissions are, in reality, unlikely to be offset by any quantity of carbon credits these firms might purchase, which allow them to claim a ‘zero footprint’. Blake Lovewall describes the sham in play here:
‘The biggest and most common Carbon offset schemes are simply forests. Most of the offerings in Carbon marketplaces are forests, particularly in East Asian, African and South American nations. …
The only value being packaged and sold on these marketplaces is not cutting down the trees. Therefore, by not cutting down a forest, the company is maintaining a ‘Carbon sink’ …. One is paying the landowner for doing nothing. This logic has an acronym, and it is slapped all over these heralded offset projects: REDD. That is a UN scheme called “Reduce Emissions from Deforestation and Forest Degradation”. I would re-name it to, “Sell off indigenous forests to global investors”.’
Lovewall goes on to explain that these carbon offset investments do not ensure that forests remain pristine by any stretch of the imagination. For one thing, the requirements for managing these ‘preserves’ are often subject to manipulation by investors working with government; as such, the credits are often vehicles for graft. In Indonesia, for example, carbon credited forests have been converted to palm oil plantations without any loss of value to the credits! Lovewall also cites a story about carbon offset investments in Brazil, where the credits provided capital for a massive dam in the middle of the rainforest. This had severe environmental and social consequences for indigenous peoples. It’s also worth noting that planting trees, wherever that might occur under carbon credits, takes many years to become a real carbon sink.”
Lovewall makes a strong case that carbon credits are a huge fraud. This was reinforced by a recent investigation conducted by the Guardian, Die Zeit and SourceMaterial, a “non-profit investigative journalism organization”, according to the Guardian. The investigation was based on independent research studies as well as interviews with various parties. They found that at least 90% of “rainforest credits” do not represent carbon reductions. Two studies found no abatement whatsoever in deforestation under the credits. Furthermore, the deforestation threats (absent credits) had been overstated by some 400%. The investigation also noted serious human rights violations associated with the offset projects. Rainforest credits are only one kind of carbon offset, but similar problems plague other types of credits as well, such as those earned by shuttering fossil fuel plants in developing countries desperately short on power generation.
That so much of the carbon credit market is fraudulent should infuriate climate change radicals. The findings also are a disgrace to participants in these markets, revealing that much of the “net zero” propaganda trumpeted by corporate PR organizations is a charade. Regrettably, it is motivated by an unnecessary panic over carbon dioxide emissions and their presumed role in global warming. Spending on environmental initiatives should be a warning flag for investors. The resources firms dedicate to those credits deserve careful scrutiny. The fascination with ESG scores is another sign that corporate managers have lost sight of their fundamental mission: to maximize shareholder value by serving their customers well.
Green Bonds
Another suspicious form of “commitment” is embodied in the issuance of so-called “green bonds” to raise funds for environmental initiatives. This form of investing is so ostensibly “virtuous” that these bonds are demanded even with specific commitments that are quite “soft”. This just released study finds that green bonds offer little assurance of any positive environmental impact:
“… we find a concerning lack of enforceability of green promises. Moreover, these promises have been getting weaker over time. Green bonds often make vague commitments, exclude failures to live up to those commitments from default events, and disclaim an obligation to perform in other parts of the document. These shortcomings are known to market participants. Yet, demand for these instruments has been growing. We ask why green bond promises are so weak, while the same investors demand strong promises from the same issuers in other settings.”
Green bonds are “virtue ornaments” typically purchased by institutional investors with some sort of environmental or ESG objective. Apparently, earning returns is an afterthought. Unfortunately, these funds managers are usually investing on behalf of other people. While some of those clients might wholly support the environmental objectives, many others have no clue.
Fortunately, there are alternatives, and I’m tempted to say caveat emptor applies here. However, it really is a remarkable breach of fiduciary duty to manage funds based on objectives other than maximizing expected returns, or to in any way sacrifice returns in favor of “green” objectives. That is happening before our very eyes. Even clients who wish to invest funds for green objectives are being shaken down here. According to the research cited above, the green bond “commitments” are hardly worth the paper they’re written on.
Institutional investors go right along, scrambling to add green bonds to their portfolios. This helps drive down the effective cost of funds to the green bond issuers. Thus, highly speculative climate or environmental initiatives can be funded on the cheap. They do, however, produce lucrative opportunities for the climate crisis industry.
One More Time
People save to build wealth, typically for their retirement years. If that’s your objective, you probably shouldn’t invest in firms expending their resources on carbon credits. At best, the credits are a buy-off to activists, many of whom are just as ignorant of the whole sham.
One might plausibly ask whether I should love carbon credits because they allow, at least, certain forms of beneficial economic activity to avoid challenge by crazies. Perhaps that’s true taking the world as it is, but my hope is that exposing various layers of climate hysteria and craziness is one way to change the world. The whole carbon credit enterprise enables extraction of still greater rents by climate change opportunists, to say nothing of human rights abuses taking place under the guise of these credits.
Like carbon offsets, green bonds promote fictitious virtue. They are another way in which green profiteers extract rents from well-meaning savers and investors, some of whom are unaware that ESG objectives are undermining their returns. Even if investors prefer to sacrifice returns in the pursuit of green goals, the initiatives thus funded often have no environmental merit, particularly when it comes to reducing carbon emissions. Despite the efforts of these bond issuers to convince us of their green bona fides, their “commitments” to green results are usually flimsy.
I’m not terribly surprised to learn that scientific advancement has slowed over my lifetime. A recent study published in the journal Nature documented a secular decline in the frequency of “disruptive” or “breakthrough” scientific research across a range of fields. Research has become increasingly dominated by “incremental” findings, according to the authors. The graphic below tells a pretty dramatic story:
The index values used in the chart range “from 1 for the most disruptive to -1 for the least disruptive.” The methodology used to assign these values, which summarize academic papers as well as patents, produces a few oddities. Why, for example, does the tech revolution of the last 40 years create barely a blip in the technology index in the chart above? And why have tech research and social science research always been more “disruptive” than other fields of study?
Putting those questions aside, the Nature paper finds trends that are basically consistent across all fields. Apparently, systematic forces have led to declines in these measures of breakthrough scientific findings. The authors try to provide a few explanations as to the forces at play: fewer researchers, incrementalism, and a growing role of large-team research that induces conformity. But if research has become more incremental, that’s more accurately described as a manifestation of the disease, rather than a cause.
Conformity
Steven F. Hayward skewers the authors a little, and perhaps unfairly, stating a concern held by many skeptics of current scientific practices. Hayward says the paper:
“… avoids the most significant and obvious explanation with the myopia of Inspector Clouseau, which is the deadly confluence of ideology and the increasingly narrow conformism of academic specialties.”
Conformism in science is nothing new, and it has often interfered with the advancement of knowledge. The earliest cases of suppression of controversial science were motivated by religious doctrine, but challenges to almost any scientific “consensus” seem to be looked upon as heresy. Several early cases of suppression are discussed here. Matt Ridley has described the case of Mary Wortley Montagu, who visited Ottoman Turkey in the early 1700s and witnessed the application of pus from smallpox blisters to small scratches on the skin of healthy subjects. The mild illness this induced led to immunity, but the British medical establishment ridiculed her. A similar fate was suffered by a Boston physician in 1721. Ridley says:
“Conformity is the enemy of scientific progress, which depends on disagreement and challenge. Science is the belief in the ignorance of experts, as [the physicist Richard] Feynman put it.”
When was the Scientific Boom?
I couldn’t agree more with Hayward and Ridley on the damaging effects of conformity. But what gave rise to our recent slide into scientific conformity, and when did it begin? The Nature study on disruptive science used data on papers and patents starting in 1945. The peak year for disruptive science within the data set was … 1945, but the index values were relatively high over the first two decades of the data set. Maybe those decades were very special for science, with a variety of applications and high-profile accomplishments that have gone unmatched since. As Scott Sumner says in an otherwise unrelated post, in many ways we’ve failed to live up to our own expectations:
“In retrospect, the 1950s seem like a pivotal decade. The Boeing 707, nuclear power plants, satellites orbiting Earth, glass walled skyscrapers, etc., all seemed radically different from the world of the 1890s. In contrast, airliners of the 2020s look roughly like the 707, we seem even less able to build nuclear power plants than in the 1960s, we seem to have a harder time getting back to the moon than going the first time, and we still build boring glass walled skyscrapers.”
It’s difficult to put the initial levels of the “disruptiveness” indices into historical context. We don’t know whether science was even more disruptive prior to 1945, or how the indices used by the authors of the Nature article would have captured it. And it’s impossible to say whether there is some “normal” level of disruptive research. Is a “normal” index value equal to zero, which we now approach as an asymptote?
Some incredible scientific breakthroughs occurred decades before 1945, to take Einstein’s theory of relativity as an obvious example. Perhaps the index value for physical sciences would have been much higher at that time, were it measured. Whether the immediate post-World War II era represented an all-time high in scientific disruption is anyone’s guess. Presumably, the world is always coming from a more primitive base of knowledge. Discoveries, however, usually lead to new and deeper questions. The authors of the Nature article acknowledge and attempt to test for the “burden” of a growing knowledge base on the productivity of subsequent research and find no effect. Nevertheless, it’s possible that the declining pattern after 1945 represents a natural decay following major “paradigm shifts” in the early twentieth century.
The Psychosis Now Known As “Wokeness”
The Nature study used papers and patents only through 2010. Therefore, the decline in disruptive science predates the revolution in “wokeness” we’ve seen over the past decade. But “wokeness” amounts to a radicalization of various doctrines that have been knocking around for years. The rise of social justice activism, critical theory, and anthropogenic global warming theology all began long before the turn of the century and had far-reaching effects that extended to the sciences. The recency of “wokeness” certainly doesn’t invalidate Hayward and Ridley when they note that ideology has a negative impact on research productivity. It’s likely, however, that some fields of study are relatively immune to the effects of politicization, such as the physical sciences. Surely other fields are more vulnerable, like the social sciences.
Citations: Not What They Used To Be?
There are other possible causes of the decline in disruptive science as measured by the Nature study, though the authors believe they’ve tested and found these explanations lacking. It’s possible that an increase in collaborative work led to a change in citation practices. For example, this study found that while self-citation has remained stable, citation of those within an author’s “collaboration network” has declined over time. Another paper identified a trend toward citing review articles in ecology journals rather than the research upon which those reviews were based, resulting in incorrect attribution of ideas and findings. That would directly reduce the measured “disruptiveness” of a given paper, but it’s not clear whether that trend extends to other fields.
Believe it or not, “citation politics” is a thing! It reflects the extent to which a researcher is expected to suck up to prominent authors in a field of study, or to anyone else who might be deemed potentially helpful or harmful. In a development that speaks volumes about trends in research productivity, authors are now urged to append a “Citation Diversity Statement” to their papers. Here’s an academic piece addressing the subject of “gendered citation practices” in contemporary physics. The 11 authors of this paper would do well to spend more time thinking about problems in physics than obsessing over whether their world is “unfair”.
Science and the State
None of those alternative explanations diminishes my strong feeling that science has been politicized, and that politicization is harming our progress toward a better world. In fact, it usually leads us astray. Perhaps the most egregious example of politicized conformism today is climate science, though the health sciences went headlong toward a distinctly unhealthy conformism during the pandemic (and see this for a dark laugh).
Politicized science leads to both conformism and suppression. Here are several channels through which politicization might create these perverse tendencies and reduce research productivity or disruptiveness:
- Political or agenda-driven research is driven by subjective criteria rather than objective inquiry and even-handed empiricism.
- Research funding via private or public grants is often contingent upon whether the research can be expected to support the objectives of the funding NGOs, agencies, or regulators. The gravy train is reserved for those who support the “correct” scientific narrative.
- Promotion or tenure decisions may be sensitive to the political implications of research.
- Government agencies have been known to block access to taxpayer-funded databases when a scientist wishes to investigate the “wrong questions”.
- Journals and referees have political biases that may influence the acceptance of research submissions, which in turn influences the research itself.
- The favorability of coverage by a politicized media influences researchers, who are sensitive to the damage the media can do to one’s reputation.
- The chance that one’s research might have a public policy impact is heavily influenced by politics.
- The talent sought and/or attracted to various fields may be diminished by the primacy of political considerations. Indoctrinated young activists generally aren’t the material from which objective scientists are made.
Conclusion
In fairness, there is a great deal of wonderful science being conducted these days, despite the claims appearing in the Nature piece and the politicized corruption undermining good science in certain fields. Tremendous breakthroughs are taking place in areas of medical research such as cancer immunotherapy and diabetes treatment. Fusion energy is inching closer to a reality. Space research is moving forward at a tremendous pace in both the public and private spheres, despite NASA’s clumsiness.
I’m sure there are several causes for the 70-year decline in scientific “disruptiveness” measured in the article in Nature. Part of that decline might have been a natural consequence of coming off an early twentieth-century burst of scientific breakthroughs. There might be other clues related to changes in citation practices. However, politicization has become a huge burden on scientific progress over the past decade. The most awful consequences of this trend include a huge misallocation of resources from industrial planning predicated on politicized science, and a meaningful loss of lives owing to the blind acceptance of draconian health policies during the Covid pandemic. When guided by the state or politics, what passes for science is often no better than scientism. There are, however, even in climate science and public health disciplines, many great scientists who continue to test and challenge the orthodoxy. We need more of them!
I leave you with a few words from President Dwight Eisenhower’s Farewell Address in 1961, in which he foresaw issues related to the federal funding of scientific research:
“Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades.
In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.
Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.
The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded.
Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.”
The frantic rush to force transition to a zero-carbon future is unnecessary and destructive to both economic well-being and the global environment. I do not subscribe to the view that a zero-carbon goal is an eventual necessity, but even if we stipulate that it is, a rational transition would eschew the immediate abandonment of fossil fuels and adopt a gradual approach relying heavily on market signals rather than a mad dash via coercion.
I’ve written about exaggerated predictions of temperature trends and catastrophes on a number of occasions (and see here for a similar view from a surprising source). What might be less obvious is the waste inherent in forcing the abandonment of mature and economic technologies in favor of, as yet, under-developed and uneconomic technologies. These failures should be obvious when the grid fails, as it does increasingly. It is often better to leave the development and dispersion of new technologies to voluntary decision-making. In time, advances will make alternative, low- or zero-carbon energy sources cost-effective and competitive for users. That will include efficient energy storage at scale, new nuclear technologies, geothermal techniques, and further improvements in the carbon efficiency of fossil fuels themselves. These should be chosen by private industry, not government planners.
Boneheads At the Helm
Production of fossil fuels has been severely hampered by the Biden Administration’s policies. The sanctions on Russian oil that only began to take hold in March have caused an additional surge in the price of oil. Primarily, however, we’ve witnessed an artificial market disruption instigated by Biden’s advisors on environmental policy. After all, neither Russian oil imports nor the more recent entreaties to rogue states such as Iraq and Venezuela for oil would have been necessary if not for the Administration’s war on fossil fuels. Take a gander at this White House Executive Order issued in January 2021. It reads like a guidebook on how to kill an industry. In a column this weekend, Kevin Williamson quipped about “the Biden administration’s uncanny ability to get everything everywhere wrong all at once.” That was about policy responses to inflation, but it applies to energy in particular.
Scorning the Miracle
Fossil fuels are the source of cheap and reliable energy that has lifted humanity to an unprecedented level of prosperity. Fossil fuels have given a comfortable existence to billions of people, allowing them to rise out of poverty. This prosperity gives us the luxury of time to develop substitutes, not to mention much greater safety against the kind of weather extremes that have always been a fact of life. The world still gets 80% of its energy from fossil fuels. These fuels are truly a miracle, and we should not discard such valuable technologies prematurely. Doing so forces huge long-term investments in inferior technologies that are likely to be superseded in the future by more economic refinements or even energy sources and methods now wholly unimagined. There are investors who will still wish to pursue those new technologies, perhaps with non-pecuniary motives, and there are a few consumers who really want alternatives to fossil fuels.
Biden’s apparent hope that his aggressive climate agenda will be a great legacy of his presidency is at the root of his intransigence toward fossil fuels. His actions in this regard have had a profoundly negative psychological effect on the oil and gas industry. Steps such as cancellations of pipeline projects are immediately impactful in that regard, to say nothing of the supplies that would have ultimately flowed through those pipelines. These cancellations reinforce the message Biden’s been sending to the industry and its investors since his campaign: we mean to shut you down! Who wants to invest in new wells under those circumstances? Other actions have followed: no new federal oil and gas leases, methane restrictions, higher drilling fees on federal land, and a variety of climate change initiatives that bode ill for the industry, such as the SEC’s mandate on carbon disclosures and the Federal Reserve’s proposed role in policing climate impacts.
And now, Democrats are contemplating a move that would make gasoline even more scarce: price controls. As Don Boudreaux says in a recent letter to The Hill:
“Progressives incessantly threaten to tax and regulate carbon fuels into oblivion. These threats cannot but reduce investors’ willingness to fund each of the many steps – from exploration through refining to transporting gasoline to market – that are necessary to keep energy prices low. One reality reflected by today’s high prices at the pump is this hostility to carbon fuels generally and to petroleum especially. And gasoline price controls would only make matters worse by further reducing the attractiveness of investing in the petroleum industry: Why invest in bringing products to market if the prices at which you’re allowed to sell are dictated by grandstanding politicians?”
The kicker is that all these policies are futile in terms of their actual impact on global carbon concentrations, let alone their highly tenuous link to global temperatures. The policies are also severely regressive, inflicting disproportionate harm on the poor, who can least afford such an extravagant transition. Biden wants the country to sacrifice its standard of living in pursuit of these questionable goals, while major carbon-emitting nations like China and India essentially ignore the issue.
Half-Baked Substitution
Market intervention always has downsides to balance against the potential gains of “internalizing externalities”. In this case, the presumed negative externalities are imagined harms of catastrophic climate change from the use of fossil fuels; the presumed external benefits are the avoidance of carbon emissions and climate change via renewables and other “zero-carbon” technologies. With those harms and gains in question, it’s especially important to ask who loses. Taxpayers are certainly on that list. Users of energy produced with fossil fuels end up paying higher prices and are forced to conserve or submit to coerced conversion away from fossil fuels. Then there are the wider impediments to economic growth and, as noted above, the distributional consequences.
Users of immature or inferior energy alternatives might also end up as losers, and there are likely to be external costs associated with those technologies as well. It’s not widely appreciated that today’s so-called clean energy alternatives are plagued by their need to obtain certain minerals that are costly to extract in economic and environmental terms, not to mention highly carbon intensive. And when solar and wind facilities fail or reach the end of their useful lives, disposal creates another set of environmental hazards. In short, the losses imposed through forced internalization of highly uncertain externalities are all too real.
Unfortunately, the energy sources favored by the Administration fail to meet base-load power needs on windless and/or cloudy days. The intermittency of these key renewables means that other power sources, primarily fossil-fuel and nuclear capacity, must remain available to meet demand on an ongoing basis. That means wind and solar cannot strictly replace fossil fuels and nuclear capacity unless we’re willing to tolerate severe outages. Growth in energy demand met by renewables must be matched by growth in backup capacity.
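The arithmetic behind the backup-capacity point is simple enough to sketch. Using purely hypothetical numbers of my own choosing (the 5% worst-case availability figure is an illustrative assumption, not data), the dispatchable capacity a grid must keep on hand barely shrinks no matter how much renewable nameplate capacity is added:

```python
# Hypothetical illustration: how much dispatchable (fossil/nuclear)
# backup a grid needs when wind and solar are nearly unavailable.
peak_demand_gw = 100.0            # assumed peak load
wind_solar_nameplate_gw = 60.0    # assumed installed renewable capacity
worst_case_availability = 0.05    # assumed output on a windless, cloudy evening

# Firm contribution of renewables under worst-case conditions
firm_renewable_gw = wind_solar_nameplate_gw * worst_case_availability

# Backup needed to avoid outages at peak
backup_needed_gw = peak_demand_gw - firm_renewable_gw
print(backup_needed_gw)  # 97.0
```

Even with 60 GW of renewable nameplate capacity on a 100 GW system, roughly 97 GW of dispatchable capacity must still exist under these assumptions, which is the sense in which intermittent sources add to, rather than replace, the required capital stock.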
A call for “energy pragmatism” by Dan Ervin hinges on the use of coal to provide the “bridge to the energy future”, both because a large amount of coal generating capacity remains in place and because coal can stabilize the grid given the intermittency of wind and solar. Ervin also bases his argument for coal on recent increases in the price of natural gas, though a reversal of the Biden EPA’s attacks on gas and coal, which Ervin acknowledges, would argue strongly in favor of natural gas as a pragmatic way forward.
Vehicle Mandates
The Administration has pushed mandates for electric vehicle (EV) production and sales, including subsidized charging stations. Of course, the power used by EVs is primarily generated by fossil fuels. Furthermore, rapid growth in EVs will put a tremendous additional strain on the electric grid, which renewables will not be able to relieve without additional backup capacity from fossil fuels and nuclear. This severely undermines the supposed environmental benefits of EVs.
Once again, mandates and subsidies are necessary because EV technology is not yet economic for most consumers. Those buyers don’t want to spend what’s necessary to purchase an EV, nor do they wish to suffer the inconveniences that re-charging often brings. This is a case in which policy is outrunning the underlying infrastructure required to support it. And while adoption of EVs is growing, it is still quite low (and see here).
Wising Up
Substitution into new inputs or technologies happens more rationally when prices accurately reflect true benefits and scarcities. The case for public subsidies and mandates in the push for a zero-carbon economy rests on model predictions of catastrophic global warming and a theoretical link between U.S. emissions and temperatures. Both links are weak and highly uncertain. What is certain is the efficiency of fossil fuels to power gains in human welfare.
This Bartley J. Madden quote sums up a philosophy of progress that is commendable for firms, and probably no less for public policymakers:
“Keep in mind that innovation is the key to sustainable progress that jointly delivers on financial performance and taking care of future generations through environmental improvements.”
Madden genuflects to the “sustainability” crowd, who otherwise don’t understand the importance of trusting markets to guide innovation. If we empower those who wish to crush private earnings from existing technologies, we concede the future to central planners, who are likely to choose poorly with respect to technology and timing. Let’s forego the coercive approach in favor of time, development, and voluntary adoption!
In advanced civilizations the period loosely called Alexandrian is usually associated with flexible morals, perfunctory religion, populist standards and cosmopolitan tastes, feminism, exotic cults, and the rapid turnover of high and low fads -- in short, a falling away (which is all that decadence means) from the strictness of traditional rules, embodied in character and inforced from within. -- Jacques Barzun