
Sacred Cow Chips


Tag Archives: IPCC

Is “Global Temperature” a Fiction?

01 Friday May 2026

Posted by Nuetzel in Climate science, Global Warming

≈ Leave a comment

Tags

ARGO Buoys, Atmospheric CO2, Christopher Essex, Extensive Measure, Global Temperature, GMST, Intensive Measure, IPCC, Jack Salmon, Jonathan Cohler, NOAA, Ocean Acidification, pH, Price Level, Satellite Temperatures, Sea Surface Temperatures, Temperature Averaging, Urban Heat Island Effect, Weather Station Siting

At the heart of the climate crisis narrative lies a huge weakness regarding a thing its believers take for granted: whether our measures of global temperature are meaningful, let alone reliable. The problems arise at the level of individual weather stations (their siting and their geographical distribution) and, perhaps even more critically, in their aggregation into the so-called Global Mean Surface Temperature (GMST).

The Weather Station Network

In the U.S. and worldwide, we have about one weather station for every one thousand square miles. However, the geographic distribution of weather stations is highly uneven (see the map of land-based stations above), and coverage is sparser in rural areas than in urban environments. It’s also very sparse in highly remote and extreme environments.

At best, a temperature reading at a particular weather station might be approximately representative of its surrounding area at that moment. However, temperatures from place-to-place are influenced by many varying features of local geography. That includes altitude, the presence of waterways and bodies of water, other surface features such as rock and greenery, and human land use. Thus, conditions at a given weather station might not be at all descriptive of the surrounding area.

Moreover, there are no well-defined geographic “zones” to which weather stations are assigned. Attempts to do so involve arbitrary and irregular boundaries and drastic variations in size. “Averaging” temperatures across such zones requires a crude attempt to assign weights based on distances and ultimately yields mongrelized statistics. Furthermore, daily temperature averages are based on averages of high and low temperatures at each station. Such an average might only describe the actual temperature at a station for an instant, but regardless of duration, the timing is likely to differ across any two stations. Not only that, but many weather stations do not record “daily” temperatures based on normal calendar days. Thus, temperature averages across stations are calculated across locations, extremes only, and time. And again, inputs of temperatures from the individual stations are not representative of their respective zones.
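To make the arbitrariness concrete, here is a toy sketch in Python. All station names, readings, and weights are made up for illustration: the same high/low data yield different "regional averages" under two equally arbitrary weighting schemes.

```python
# Toy illustration (hypothetical numbers): the same station readings produce
# different "regional averages" under equally arbitrary weighting schemes.

# (high, low) daily extremes at three hypothetical stations, in degrees C
stations = {"A": (18.0, 6.0), "B": (25.0, 15.0), "C": (12.0, 2.0)}

# Step 1: each station's "daily temperature" is just (high + low) / 2,
# which may describe the actual temperature there for only an instant.
daily = {k: (hi + lo) / 2 for k, (hi, lo) in stations.items()}

# Step 2: aggregate across stations under two arbitrary weighting schemes.
equal_weights    = {"A": 1/3, "B": 1/3, "C": 1/3}
distance_weights = {"A": 0.5, "B": 0.3, "C": 0.2}  # e.g., by assigned zone area

avg_equal    = sum(daily[k] * w for k, w in equal_weights.items())
avg_weighted = sum(daily[k] * w for k, w in distance_weights.items())

print(daily)                    # {'A': 12.0, 'B': 20.0, 'C': 7.0}
print(avg_equal, avg_weighted)  # ~13.0 vs ~13.4: same data, different "average"
```

Neither weighting is more "correct" than the other, which is the point: the choice of zones and weights determines the answer.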

Deterioration in the quality of weather station sites has been the subject of sharp criticism over the years. There are now a large number of poorly-sited stations, often located in close proximity to paved surfaces, concrete, metals, or exhaust fans. These kinds of features impart an upward bias to the local temperature record. Individually, these are small examples of the well-known urban heat island effect. In the aggregate, they create a substantial exaggeration in measured temperatures, accounting for about 50% of the estimated warming trend for the U.S.

According to this study, the upward bias is more severe for poorly-sited stations, and the quality of siting often deteriorates over time as urban growth encroaches on outlying communities. Urban sites tend to warm the most, followed by semi-urban sites, followed by rural sites. Even worse, the study found that the NOAA temperature adjustment process creates a contagion of the warming bias, passing biases from poor sites along to better stations as an artifact. That is, the process adjusts temperatures upward for well-sited stations to more closely match poor sites!

Ocean temperatures present their own challenges. Several different techniques have been used over the years, but the most consistent and reliable ocean temperatures are from so-called ARGO buoys, which have been available only since 2003. Before that, ocean temperatures were taken using buckets dipped into the water from the sides of ships, and from engine water intakes. Unfortunately, error rates on reported observations from ARGO buoys (which involve several factors besides the accuracy of the thermometers themselves, such as transmission errors) are not precisely known, but they appear to be well outside acceptable limits. Thus, reasonably good sea surface records have only recently contributed to global temperature coverage, and even those are subject to great uncertainty. (Satellite temperature measurements, by the way, are really indirect estimates of temperatures based on radiance and subsequent calibrations.)

Thus, historical temperature records are an amalgam of different measurement instruments at different locations at different times of the day, adding layers of inconsistency to the calculation of temperature averages.

Physically Untethered

I was prompted to write this post after reading a mathematical analysis of the impossibility of aggregating temperature readings across multiple weather stations in any meaningful way. The analysis, by Jonathan Cohler, is a damning indictment of GMST as a concept. It relies on a series of calculations and transformations that are arbitrarily chosen from many unsuitable alternatives. Cohler says that such an “average temperature” calculation is necessarily “untethered” from the various states of nature it attempts to summarize.

Temperature itself is a so-called intensive quantity. That means it is independent of the size of the system it characterizes. If you combine it with an identical twin system, the temperature of the combined whole doesn’t double, unlike measures like mass or volume. The latter are examples of extensive quantities.
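A minimal sketch of the distinction, with made-up numbers: when two identical systems are combined, extensive quantities like mass and volume sum, while the intensive quantity (temperature) does not.

```python
# Sketch: combining two identical systems doubles the extensive quantities
# (mass, volume) but leaves the intensive quantity (temperature) unchanged.
# All values are hypothetical.

system = {"mass_kg": 2.0, "volume_m3": 0.5, "temp_C": 20.0}

def combine(a, b):
    """Combine two systems, assuming they are at the same temperature."""
    return {
        "mass_kg":   a["mass_kg"] + b["mass_kg"],      # extensive: sums
        "volume_m3": a["volume_m3"] + b["volume_m3"],  # extensive: sums
        "temp_C":    a["temp_C"],                      # intensive: does not add
    }

merged = combine(system, system)
print(merged)  # {'mass_kg': 4.0, 'volume_m3': 1.0, 'temp_C': 20.0}
```

Note that the `combine` rule for temperature holds only because the two systems are identical twins in equilibrium; for systems at different temperatures there is no similarly simple, universally valid rule, which is precisely the aggregation problem discussed below.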

Temperatures vary from one spot to another within a given system while in disequilibrium, and of course they vary over the course of any day. However, the validity of a temperature measurement at a particular location and time requires a local state of equilibrium in the immediate vicinity of the measuring instrument. Otherwise, a temperature measurement would not be a valid descriptor of the condition of the (very local) system.

Faulty Aggregations

With that in mind, imagine the many arbitrary ways we can devise to aggregate temperatures across weather stations for which conditions differ drastically. These are all attempts to calculate a single temperature for a large and geographically uneven system in a continuing state of disequilibrium. And every combination of weather station temperatures represents an artificially combined “system” in a state of disequilibrium. That’s true of any two adjacent weather stations or of all the weather stations on the globe. No one method of doing so can claim validity as a measure of system-wide temperature. This contrasts with extensive quantities, for which well-defined rules of aggregation exist (e.g., summation) regardless of a system’s dynamic condition.

Over time, the temperature records involve a changing number of stations, local environmental conditions, accuracy, and a varying mix of seawater bucket measurements, ship engine water intake measurements, and ARGO floats. These disparities reinforce the impossibility of measuring wide-ranging “average” trends in temperature.

As Cohler demonstrates mathematically, these temperature averages are physically meaningless. He offers a crazy-sounding example of blending two intensive measurements: averaging the pH of your morning coffee with the pH of seawater at a nearby coast. This is very much of a kind with averaging temperatures across weather stations under disparate conditions. Furthermore, as noted above, the steps employed to arrive at the temperature to be used for each station, and the weight each station is assigned in the average, are hardly a unique set of calculations. There is an infinite number of equally invalid aggregations of the same data.
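The coffee-and-seawater example can be made concrete. Because pH is logarithmic (pH = -log10 of hydrogen ion concentration), an arithmetic average of two pH readings corresponds to no physical state. A toy sketch with illustrative numbers (real mixing also involves buffering chemistry, ignored here):

```python
import math

# pH is an intensive, logarithmic quantity: pH = -log10([H+]).
# Arithmetically averaging two pH readings gives a number that describes
# no physical mixture. Values below are illustrative only.

ph_coffee, ph_sea = 5.0, 8.1

# Naive arithmetic average of the two pH readings:
naive_avg = (ph_coffee + ph_sea) / 2            # 6.55

# pH of an idealized equal-volume mixture: average the H+ concentrations,
# then convert back to pH.
h_mix = (10**-ph_coffee + 10**-ph_sea) / 2
mixed_ph = -math.log10(h_mix)                   # ~5.30

print(naive_avg, round(mixed_ph, 2))
```

The naive average (6.55) and the idealized physical mixture (about 5.30) disagree badly because the acidic parcel dominates the actual chemistry, a fact the arithmetic mean simply cannot see.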

Grand Ambiguity

Cohler is not the first to point out that the concept of a global temperature average is physically meaningless. In 2007, Christopher Essex, et al. published a paper entitled “Does a Global Temperature Exist?” The abstract states (my brackets):

“Distinct and equally valid [or invalid] statistical rules can and do show opposite trends when applied to the results of computations from physical models and real data in the atmosphere. A given temperature field can be interpreted as both ‘warming’ and ‘cooling’ simultaneously, making the concept of warming in the context of the issue of global warming physically ill-posed.”

This is all the more salient in a world with warming biases at poorly sited weather stations and a strong urban heat island effect.

My Glass House?

Of course, there are other areas in which similar statistical “sins” are common, some committed repeatedly by climate alarmists themselves. One is ocean water pH, which Cohler explains cannot be meaningfully averaged across “parcels”; the result is meaningless. If that isn’t enough for you to harbor doubts about the ocean acidification narrative, just read the first few paragraphs of the tweet linked above!

Similar examples occur in the world of economic data. For example, prices are intensive measures, but economists often refer to an aggregate “price level”. Can such a thing truly exist? Simply averaging prices of all goods and services creates a meaningless figure. Each price can be weighted in a variety of ways (e.g., by shares of a fixed or varying “market basket”). There are several prominent alternatives, all of which have strengths and weaknesses, but none has a claim as an accurate measure of “the price level.”
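As a sketch of how the weighting choice alone changes the answer, here are two textbook index formulas, Laspeyres (fixed base-period basket) and Paasche (current-period basket), applied to the same hypothetical prices and quantities:

```python
# Toy sketch (all prices and quantities are hypothetical): two standard but
# different weighting choices yield different "price levels" from one data set.

base    = {"bread": (2.0, 100), "fuel": (3.0, 50)}   # item: (price, quantity)
current = {"bread": (2.2, 90),  "fuel": (4.5, 40)}

def basket_ratio(p_num, p_den, q):
    """Ratio of the cost of basket q at prices p_num vs. prices p_den."""
    return (sum(p_num[i] * q[i] for i in q) /
            sum(p_den[i] * q[i] for i in q))

p0 = {i: p for i, (p, _) in base.items()}
q0 = {i: q for i, (_, q) in base.items()}
p1 = {i: p for i, (p, _) in current.items()}
q1 = {i: q for i, (_, q) in current.items()}

laspeyres = basket_ratio(p1, p0, q0)  # weights from the base-period basket
paasche   = basket_ratio(p1, p0, q1)  # weights from the current-period basket

print(round(laspeyres, 3), round(paasche, 3))  # 1.271 1.26
```

Here the Laspeyres index says prices rose about 27.1% while the Paasche index says 26%; both are internally consistent, and neither is "the" price level.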

In fact, though economists talk about it constantly, it can be said that “the price level” does not exist as an objective reality, just as there is no “global temperature.” The difference is that economists readily acknowledge this fundamental ambiguity surrounding price aggregation. Some even insist, for example, that only nominal aggregates (e.g., total spending = prices x quantities), rather than inflation in “the price level”, be considered in certain policy domains, though there is more than one reason for that preference. In contrast, climate officialdom, in organizations like the IPCC and NOAA, is loath to acknowledge weaknesses in GMST.

Conclusion

There are many reasons to question the climate orthodoxy, which holds that human emissions of carbon dioxide, a trace gas, produce a warming global temperature trend. An issue that’s been largely taken for granted is the integrity of the so-called global temperature, most commonly the GMST. The reality is that it’s impossible to identify a unique method of calculating a global temperature. It’s possible to specify many different aggregations of local temperature readings, but there is no “true” way of measuring global temperature. Another way of putting this is that it’s impossible to define a single global temperature as a physical reality. There is no such thing.

Nevertheless, global temperature is a critical pillar on which climate alarmism rests, and Cohler has published equally damning critiques of several other climate measurements (also see here), such as mean ocean pH, ocean heat content, and human contribution to atmospheric CO2. Climate authorities should acknowledge the inherent weakness of relying on temperature aggregations, and especially any one aggregation. Perhaps they could define several alternatives, as economists have with price indices, acknowledging the impossibility of pinning down a true global temperature.

The real lesson here is that we should approach climate statistics with both skepticism and humility. Even if you must pretend that it exists, any measure of a so-called global temperature and its trend is of highly uncertain value. This is critical when it comes to assessing climate policy. As Jack Salmon says in a somewhat broader context:

“One of the most striking features of modern climate economics is not consensus, it’s dispersion. Depending on which paper, model, or administration you consult, the economic damages from climate change range from modest to catastrophic.”

Lords of the Planetary Commons Insist We Banish Sovereignty, Growth

29 Thursday Feb 2024

Posted by Nuetzel in Central Planning, Environmental Fascism, Global Warming, Liberty

≈ Leave a comment

Tags

Anthropocene, Beamed Solar Power, Carbon Capture, Carbon Forcings, Cliff Mass, Common Pool Resources, Elinor Ostrom, Externalities, Fusion Power, Geoengineering, Geothermal Power, global warming, Heat Islands, Interspecies Justice, IPCC, Lula Da Silva, Munger Test, Nuclear power, Orbital Solar Collection, Paris Climate Accords, Planetary Commons, Polycentrism, Private Goods, Property Rights, Public goods, Redistribution, Solar Irradiance, Spillovers, Tipping points

We all share Planet Earth as our home, so there’s a strong sense in which it qualifies as a “commons”. That’s one sensible premise of a new paper entitled “The planetary commons: A new paradigm for safeguarding Earth-regulating systems in the Anthropocene”. The title is a long way of saying that the authors desire broad-based environmental regulation, and that’s what ultimately comes across.

First, a preliminary issue: many resources qualify as commons in the very broadest sense, yet free societies have learned over time that many resources are used much more productively when property rights are assigned to individuals. For example, modern agriculture owes much to defining exclusive property rights to land so that conflicting interests don’t have to compete (e.g., the farmer and the cowman). Federal land is treated as a commons, however. There is a rich history on the establishment of property rights, but within limits, the legal framework in place can define whether a resource is treated as a commons, a club good, or private property. The point here is that there are substantial economic advantages to preserving strong property rights, rather than treating all resources as communal.

The authors of the planetary commons (PC) paper present a rough sketch for governance over use of the planet’s resources, given their belief that a planetary crisis is unfolding before our eyes. The paper has two main thrusts as I see it. One is to broadly redefine virtually all physical resources as common pool interests because their use, in the authors’ view, may entail some degree of external cost involving degradation of the biosphere. The second is to propose centralized, “planetary” rule-making over the amounts and ways in which those resources are used.

It’s an Opinion Piece

The PC paper is billed as the work product of a “collaborative team of 22 leading international researchers”. This group includes four attorneys (one of whom was a lead author) and one philosopher. Climate impact researchers, who undoubtedly helped shape the assumptions about climate change and its causes that drive the PC’s theses, are also represented. (More on those assumptions in a section below.) There are a few social scientists of various stripes among the credited authors, one meteorologist, and a few “sustainability”, “resilience”, and health researchers. It’s quite a collection of signees, er… “research collaborators”.

Grabby Interventionists

The reasoning underlying a “planetary commons” (PC) is that the planet’s biosphere qualifies as a commons. The biosphere must include virtually any public good like air and sunshine, any common good like waterways, and any private good or club good. After all, any object can play host to tiny microbes regardless of ownership status. So the PC authors’ characterization of the planet’s biosphere as a commons is quite broad in terms of conventional notions of resource attributes.

We usually think of spillover or external costs as arising from some use of a private resource that imposes costs on others, such as air or water pollution. However, mere survival requires that mankind exploit both public and non-public resources, acts that can always be said to impact the biosphere in some way. Efforts to secure shelter, food, and water all impinge on the earth’s resources. To some extent, mankind must use and shape the biosphere to succeed, and it’s our natural prerogative to do so, just like any other creature in the food chain.

Even if we are to accept the PC paper’s premise that the entire biosphere should be treated as a commons, most spillovers are de minimis. From a public policy perspective, it makes little sense to attempt to govern over such minor externalities. Monitoring behavior would be costly, if not impossible, at such an atomistic level. Instead, free and civil societies rely on a high degree of self-governance and informal enforcement of ethical standards to keep small harms to a minimum.

Unfortunately, the identification and quantification of meaningful spillover costs is not always clear-cut. This has led to an increasingly complex regulatory environment, an increasingly litigious business environment, and efforts by policymakers to manage the detailed inputs and outputs of the industrial economy.

All of that is costly in its own right, especially because the activities giving rise to those spillovers often enable large welfare enhancements. Regulators and planners face great difficulties in estimating the costs and benefits of various “correctives”. The very undertaking creates risk that often exceeds the cost of the original spillover. Nevertheless, the PC paper expands on the murkiest aspects of spillover governance by including “… all critical biophysical Earth-regulating systems and their functions, irrespective of where they are located…” as part of a commons requiring “… additional governance arrangements….”

Adoption of the PC framework would authorize global interventions (and ultimately local interventions, including surveillance) on a massive scale based on guesswork by bureaucrats regarding the evolution of the biosphere.

Ostrom Upside Down

Not only would the PC framework represent an expansion of the grounds for intervention by public authorities, it seeks to establish international authority for intervention into public and private affairs within sovereign states. The authors attempt to rationalize such far-reaching intrusions in a rather curious way:

“Drawing on the legacy of Elinor Ostrom’s foundational research, which validated the need for and effectiveness of polycentric approaches to commons governance (e.g., ref. 35, p. 528, ref. 36, p. 1910), we propose that a nested Earth system governance approach be followed, which will entail the creation of additional governance arrangements for those planetary commons that are not yet adequately governed.”

Anyone having a passing familiarity with Elinor Ostrom’s work knows that she focused on the identification of collaborative solutions to common goods problems. She studied voluntary and often strictly private efforts among groups or communities to conserve common pool resources, as opposed to state-imposed solutions. Ostrom accepted assigned rights and pricing solutions to managing common resources, but she counseled against sole reliance on market-based tools.

Surely the PC authors know they aren’t exactly channeling Ostrom:

“An earth system governance approach will require an overarching global institution that is responsible for the entire Earth system, built around high-level principles and broad oversight and reporting provisions. This institution would serve as a universal point of aggregation for the governance of individual planetary commons, where oversight and monitoring of all commons come together, including annual reporting on the state of the planetary commons.”

Polycentricity was used by Ostrom to describe the involvement of different, overlapping “centers of authority”, such as individual consumers and producers, cooperatives formed among consumers and producers, other community organizations, local jurisdictions, and even state or federal regulators. Some of these centers of authority supersede others in various ways. For example, solutions developed by cooperatives or lower centers of authority must align with the legal framework within various government jurisdictions. However, as David Henderson has noted, Ostrom observed that management of pooled resources at lower levels of authority was generally superior to centralized control. Henderson quotes Ostrom and a co-author on this point:

“When users are genuinely engaged in decisions regarding rules affecting their use, the likelihood of them following the rules and monitoring others is much greater than when an authority simply imposes rules.”

The authors of the PC have something else in mind, and they bastardize the spirit of Ostrom’s legacy in the process. For example, the next sentence is critical for understanding the authors’ intent:

“If excessive emissions and harmful activities in some countries affect planetary commons in other areas—for example, the melting of polar ice—strong political and legal restrictions for such localized activities would be needed.”

Of course, there are obvious difficulties in measuring impacts of various actions on polar ice, assigning responsibility, and determining the appropriate “restrictions”. But in essence, the PC paper advocates for a top-down model of governance. Polycentrism is thus reduced to “you do as we say”, which is not in the spirit of Ostrom’s research.

Planetary Governance

Transcending national sovereignty on questions of the biosphere is key to the authors’ ambitions. At a bare minimum, the authors desire legally-binding commitments to international agreements on environmental governance, unlike the unenforceable promises made for the Paris Climate Accords:

“At present, the United Nations General Assembly, or a more specialized body mandated by the Assembly, could be the starting point for such an overarching body, even though the General Assembly, with its state-based approach that grants equal voting rights to both large countries and micronations, represents outdated traditions of an old European political order.”

But the votes of various “micronations” count for zilch when it comes to real “claims” on the resources of other sovereign nations! Otherwise, there is nothing “voluntary” about the regime proposed in the PC paper.

“A challenge for such regimes is to duly adapt and adjust notions of state sovereignty and self-determination, and to define obligations and reciprocal support and compensation schemes to ensure protection of the Earth system, while including comprehensive stewardship obligations and mandates aimed at protecting Earth-regulating systems in a just and inclusive way.”

So there! The way forward is to adopt the broadest possible definition of market failure and global regulation of any and all private activity touching on nature in any way. And note here a similarity to the Paris Accords: achieving commitments would fall to national governments whose elites often demonstrate a preference for top-down solutions.

Ah Yes, Redistribution

It should be apparent by now that the PC paper follows a now well-established tradition in multi-national climate “negotiations” to serve as subterfuge for redistribution (which, incidentally, includes the achievement of interspecies justice):

“For instance, a more equal sharing of the burdens of climate stabilization would require significant multilateral financial and technology transfers in order not to harm the poorest globally (116).”

The authors insist that participation in this governance would be “voluntary”, but the following sentence seems inconsistent with that assurance:

“… considering that any move to strengthen planetary commons governance would likely be voluntarily entered into, the burdens of conservation must be shared fairly (115).”

Wait, what? “Voluntary” at what level? Who defines “fairness”? The authors approvingly offer this paraphrase of the words of Brazilian President Lula da Silva,

“… who affirmed the Amazon rainforest as a collective responsibility which Brazil is committed to protect on behalf of all citizens around the world, and that deserves and justifies compensation from other nations (117).”

Let Them Eat Cake

Furthermore, PC would require de-growth and so-called “sufficiency” for thee (i.e., be happy with less), if not for those who’ll design and administer the regime.

“… new principles that align with novel Anthropocene dynamics and that could reverse the path-dependent course of current governance. These new principles are captured under a new legal paradigm designed for the Anthropocene called earth system law and include, among others, the principles of differentiated degrowth and sufficiency, the principle of interconnectivity, and a new planetary ethic (e.g., principle of ecological sustainability) (134).”

If we’re to take the PC super-regulators at their word, the regulatory regime would impinge on fertility decisions as well. Just who might we trust to govern humanity thusly? If we’re wise enough to apply the Munger Test, we wouldn’t grant that kind of power to our worst enemy!

Global Warmism

The underlying premise of the PC proposal is that a global crisis is now unfolding before our eyes: anthropogenic global warming (AGW). The authors maintain that emissions of carbon dioxide are the cause of rising temperatures, rapidly rising sea levels, more violent weather, and other imminent disasters.

“It is now well established that human actions have pushed the Earth outside of the window of favorable environmental conditions experienced during the Holocene…”

“Earth system science now shows that there are biophysical limits to what existing organized human political, economic, and other social systems can appropriate from the planet.”

For a variety of reasons, both of these claims are more dubious than one might suppose based on popular narratives. As for the second of these, mankind’s limitless capacity for innovation is a more powerful force for sustainability than the authors would seem to allow. On the first claim, it’s important to note that the PC paper’s forebodings are primarily based on modeled, prospective outcomes, not historical data. The models are drastically oversimplified representations of the earth’s climate dynamics driven by exogenous carbon forcing assumptions. Their outputs have proven to be highly unreliable, overestimating warming trends almost without exception. These models exaggerate climate sensitivity to carbon forcings, and they largely ignore powerful natural forcings such as variations in solar irradiance, geological heating, and even geological carbon forcings. The models are also notorious for their inadequate treatment of feedback effects from cloud cover. Their predictions of key variables like water vapor are wildly in error.

The measurement of the so-called “global temperature” is itself subject to tremendous uncertainty. Weather stations come and go. They are distributed very unevenly across land masses, and measurement at sea is even sketchier. Averaging all these temperatures would be problematic even if there were no other issues… but there are. Individual stations are often sited poorly, including distortions from heat island effects. Aging of equipment creates a systematic upward bias, but correcting for that bias (via so-called homogenization) causes a “cooling the past” bias. It’s also instructive to note that the increase in global temperature from pre-industrial times actually began about 80 years prior to the onset of more intense carbon emissions in the 20th century.

Climate alarmists often speak in terms of temperature anomalies, rather than temperature levels. In other words, to what extent do temperatures differ from long-term averages? The magnitude of these anomalies, using the past several decades as a base, tend to be anywhere from zero degrees to well above one degree Celsius, depending on the year. Relative to temperature levels, the anomalies are a small fraction. Given the uncertainty in temperature levels, the anomalies themselves are dwarfed by the noise in the original series!

Pick Your Own Tipping Point

It seems that “tipping point” scares are heavily in vogue at the moment, and the PC proposal asks us to quaff deeply of these narratives. Everything is said to be at a tipping point into irrecoverable disaster that can be forestalled only by reforms to mankind’s unsustainable ways. To speak of the possibility of other causal forces would be a sacrilege. There are supposed tipping points for the global climate itself as well as tipping points for the polar ice sheets, the world’s forests, sea levels and coastal environments, severe weather, and wildlife populations. But none of this is based on objective science.

For example, the 1.5 degree limit on global warming is a wholly arbitrary figure invented by the IPCC for the Paris Climate Accords, yet the authors of the PC proposal would have us believe that it was some sort of scientific determination. And it does not represent a tipping point. Cliff Mass explains that climate models do not behave as if irreversible tipping points exist.

Consider also that there has been absolutely no increase in the frequency or intensity of severe weather.

Likewise, the rise of sea levels has not accelerated from prior trends, so it has nothing to do with carbon forcing.

One thing carbon forcings have accomplished is a significant greening of the planet, which if anything bodes well for the biosphere.

What about the disappearance of the polar ice sheets? On this point, Cliff Mass quotes Chapter 3 of the IPCC’s Special Report on the implications of 1.5C or more warming:

“there is little evidence for a tipping point in the transition from perennial to seasonal ice cover. No evidence has been found for irreversibility or tipping points, suggesting that year-round sea ice will return given a suitable climate.”

The PC paper also attempts to connect global warming to increases in forest fires, but that’s incorrect: there has been no increasing trend in forest fires or annual burned acreage. If anything, trends in measures of forest fire activity have been negative over the past 80 years.

Concluding Thoughts

The alarmist propaganda contained in the PC proposal is intended to convince opinion leaders and the public that they’d better get on board with draconian and coercive steps to curtail economic activity. They appeal to the sense of virtue that must always accompany consent to authoritarian action, and that means vouching for sacrifice in the interests of environmental and climate equity. All the while, the authors hide behind a misleading version of Elinor Ostrom’s insights into the voluntary and cooperative husbandry of common pool resources.

One day we’ll be able to produce enough carbon-free energy to accommodate high standards of living worldwide and growth beyond that point. In fact, we already possess the technological know-how to substantially reduce our reliance on fossil fuels, but we lack the political will to avail ourselves of nuclear energy. With any luck, that will soften with installations of modular nuclear units.

Ultimately, we’ll see advances in fusion technology, beamed non-intermittent solar power from orbital collection platforms, advances in geothermal power, and effective carbon capture. Developing these technologies and implementing them at global scales will require massive investments that can be made possible only through economic growth, even if that means additional carbon emissions in the interim. We must unleash the private sector to conduct research and development without the meddling and clumsy efforts at top-down planning that typify governmental efforts (including an end to mandates, subsidies, and taxes). We must also reject ill-advised attempts at geoengineered cooling that are seemingly flying under the regulatory radar. Meanwhile, let’s save ourselves a lot of trouble by dismissing the interventionists in the planetary commons crowd.

Hurricane—Warming Link Is All Model, No Data

18 Tuesday Oct 2022

Posted by Nuetzel in Climate science, Hurricanes, Uncategorized

≈ 2 Comments

Tags

Carbon Forcing Models, carbon Sensitivity, Climate Alarmism, Geophysical Fluid Dynamics Laboratory, Glenn Reynolds, Greenhouse Gases, Hurricane Ian, Hurricane Models, IPCC, Model Calibration, Named Storms, National Hurricane Center, National Oceanic and Atmospheric Administration, Neil L. Frank, NOAA, Paul Driessen, Roger Pielke Jr., Ron DeSantis, Ryan Maue, Satellite Data, Tropical Cyclones

There was deep disappointment among political opponents of Florida Governor Ron DeSantis at their inability to pin blame on him for Hurricane Ian’s destruction. It was a terrible hurricane, but they so wanted it to be “Hurricane Hitler”, as Glenn Reynolds noted with tongue in cheek. That just didn’t work out for them, given DeSantis’ competent performance in marshaling resources for aid and cleanup after the storm. Their last-ditch refuge was to condemn DeSantis for dismissing the connection they presume to exist between climate change and hurricane frequency and intensity. That criticism didn’t seem to stick, however, and it shouldn’t.

There is no linkage to climate change in actual data on tropical cyclones. It is a myth. Yes, models of hurricane activity have been constructed that embed assumptions leading to predictions of more hurricanes, and more intense hurricanes, as temperatures rise. But these are models constructed as simplified representations of hurricane development. The following quote from the climate modelers at the Geophysical Fluid Dynamics Laboratory (GFDL) (a division of the National Oceanic and Atmospheric Administration (NOAA)) is straightforward on this point (emphases are mine):

“Through research, GFDL scientists have concluded that it is premature to attribute past changes in hurricane activity to greenhouse warming, although simulated hurricanes tend to be more intense in a warmer climate. Other climate changes related to greenhouse warming, such as increases in vertical wind shear over the Caribbean, lead to fewer yet more intense hurricanes in the GFDL model projections for the late 21st century.”

Models typically are said to be “calibrated” to historical data, but no one should take much comfort in that. As a long-time econometric modeler myself, I can say without reservation that such assurances are flimsy, especially with respect to “toy models” containing parameters that aren’t directly observable in the available data. In such a context, a modeler can take advantage of tremendous latitude in choosing parameters to include, sensitivities to assume for unknowns or unmeasured relationships, and historical samples for use in “calibration”. Sad to say, modelers can make these models do just about anything they want. The cautious approach to claims about model implications is a credit to GFDL.
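To see why “calibration” alone is weak assurance, consider a deliberately silly sketch (made-up noise, not climate data, and not any actual climate-model tuning procedure): a model with one tuned parameter per observation “fits” history perfectly, yet its tuned values carry no information about the future.

```python
import random

random.seed(7)

# Made-up "historical" series: noise around a true mean of zero
history = [random.gauss(0.0, 1.0) for _ in range(50)]
future = [random.gauss(0.0, 1.0) for _ in range(50)]

def rmse(pred, actual):
    """Root-mean-square error of predictions against actuals."""
    return (sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual)) ** 0.5

# Parsimonious model: one parameter (the sample mean)
mean = sum(history) / len(history)
simple_in = rmse([mean] * 50, history)
simple_out = rmse([mean] * 50, future)

# Over-parameterized model: one "tuned" value per observation.
# It reproduces history exactly -- a perfect "calibration"...
flexible_in = rmse(history, history)   # 0.0 by construction
# ...but those tuned values say nothing about the future
flexible_out = rmse(history, future)

print(simple_in, simple_out)       # both roughly 1.0
print(flexible_in, flexible_out)   # 0.0 in-sample, much worse out-of-sample
```

The flexible model looks far better against the calibration sample, which is exactly why in-sample fit is a poor credential when unobservable parameters can be chosen freely.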

Before I get to the evidence on hurricanes, it’s worth remembering that the entire edifice of climate alarmism relies not just on the temperature record, but on models based on other assumptions about the sensitivity of temperatures to CO2 concentration. The models relied upon to generate catastrophic warming assume very high sensitivity, and those models have a very poor track record of prediction. Estimates of sensitivity are highly uncertain, and this article cites research indicating that the IPCC’s assumptions about sensitivity are about 50% too high. And this article reviews recent findings that carbon sensitivity is even lower, about one-third of what many climate models assume. In addition, this research finds that sensitivities are nearly impossible to estimate from historical data with any precision because the record is plagued by different sources and types of atmospheric forcings, accompanying aerosol effects on climate, and differing half-lives of various greenhouse gases. If sensitivities are as low as discussed at the links above, it means that predictions of warming have been grossly exaggerated.

The evidence that hurricanes have become more frequent or severe, or that they now intensify more rapidly, is basically nonexistent. Roger Pielke Jr. of the University of Colorado and meteorologist Ryan Maue have both researched hurricanes extensively for many years. They described their compilation of data on land-falling hurricanes in this Forbes piece in 2020. They point out that hurricane activity is much more likely to be missing or undercounted in older data, especially storms that never made landfall. That’s one of the reasons for the focus on landfalling hurricanes in the first place. With the advent of satellite data, storms are highly unlikely to be missed, but even landfalls sometimes went unreported historically. The farther back one goes, the less is known about the extent of hurricane activity, but Pielke and Maue consider the post-1970 data fairly comprehensive.

The chart at the top of this post is a summary of the data that Pielke and Maue have compiled. There are no obvious trends in the number of storms or their strength. The 1970s were quiet, while the 90s were more turbulent. The absence of trends also characterizes NOAA’s data on U.S. landfalling hurricanes since 1851, as noted by Paul Driessen. Here is Driessen on Florida hurricane history:

“Using pressure, Ian was not the fourth-strongest hurricane in Florida history but the tenth. The strongest hurricane in U.S. history moved through the Florida Keys in 1935. Among other Florida hurricanes stronger than Ian was another Florida Keys storm in 1919. This was followed by the hurricanes in 1926 in Miami, the Palm Beach/Lake Okeechobee storm in 1928, the Keys in 1948, and Donna in 1960. We do not know how strong the hurricane in 1873 was, but it destroyed Punta Rassa with a 14-foot storm surge. Punta Rassa is located at the mouth of the river leading up to Ft. Myers, where Ian made landfall.”

Neil L. Frank, veteran meteorologist and former head of the National Hurricane Center, bemoans the changed conventions for naming storms in the satellite era. A typical clash of warm and cold air often produces thunderstorms and wind, but few such systems were assigned names under the older conventions. These are not the kinds of systems that usually produce tropical cyclones, although they sometimes do, and many of them are named today. Right or wrong, that gives the false impression of a trend in the number of named storms. Not only is it easier to identify storms today, given the advent of satellite data, but storms are assigned names more readily, even when they don’t strictly meet the definition of a tropical cyclone. It’s a wonder that certain policy advocates get away with calling the outcome of all this a legitimate trend!

As Frank insists, there is no evidence of a trend toward more frequent and powerful hurricanes during the last several decades, and there is no evidence of rapid intensification. More importantly, there is no evidence that climate change is leading to more hurricane activity. It’s also worth noting that today we suffer far fewer casualties from hurricanes owing to much earlier warnings, better precautions, and better construction.

Climate Alarmism and Junk Science

02 Thursday Dec 2021

Posted by Nuetzel in Uncategorized, Research Bias, Climate

≈ 8 Comments

Tags

Carbon Forcing Models, Climate Alarmism, Green Subsidies, Intergovernmental Panel on Climate Change, IPCC, Kevin Trenberth, Model Bias, Model Ensembles, National Center for Atmospheric Research, Norman Rogers, Redistribution, rent seeking

The weak methodology and accuracy of climate models is the subject of an entertaining Norman Rogers post. I want to share just a few passages along with a couple of qualifiers.

Rogers quotes Kevin Trenberth, former Head of Climate Analysis at the National Center for Atmospheric Research, with apparent approval. Oddly, Rogers does not explain that Trenberth is a strong proponent of the carbon-forcing models used by the UN’s Intergovernmental Panel on Climate Change (IPCC). He should have made that clear, but Trenberth actually did say the following:

“‘[None of the] models correspond even remotely to the current observed climate [of the Earth].’“

I’ll explain the context of this comment below, but it constitutes a telling admission of the poor foundations on which climate alarmism rests. The various models used by the IPCC are all a little different and they are calibrated differently. I’ve noted elsewhere that their projections are consistently biased toward severe over-predictions of temperature trends. Rogers goes on from there:

“The models can’t properly model the Earth’s climate, but we are supposed to believe that, if carbon dioxide has a certain effect on the imaginary Earths of the many models it will have the same effect on the real earth.”

But how on earth can a modeler accept the poor track record of these models? It’s not as if the bias is difficult to detect! On this question, Rogers says:

“The climate models are an exemplary representation of confirmation bias, the psychological tendency to suspend one’s critical facilities in favor of welcoming what one expects or desires. Climate scientists can manipulate numerous adjustable parameters in the models that can be changed to tune a model to give a ‘good’ result.“

And why are calamitous projections desirable from the perspective of climate modelers? Follow the money and the status rewards of reinforcing the groupthink:

“Once money and status started flowing into climate science because of the disaster its denizens were predicting, there was no going back. Imagine that a climate scientist discovers gigantic flaws in the models and the associated science. Do not imagine that his discovery would be treated respectfully and evaluated on its merits. That would open the door to reversing everything that has been so wonderful for climate scientists. Who would continue to throw billions of dollars a year at climate scientists if there were no disasters to be prevented? “

Indeed, it has been a gravy train. Today, it is reinforced by green-preening politicians, the many billions of dollars committed by investors seeking a continuing flow of public subsidies for renewables, tempting opportunities for international redistribution (and graft), and a mainstream media addicted to peddling scare stories. The parties involved all rely on, and profit by, alarmist research findings.

Rogers’ use of the Trenberth quote above might suggest that Trenberth is a critic of the climate models used by the IPCC. However, the statement was in line with Trenberth’s long-standing insistence that the IPCC models are exclusively for constructing “what-if” scenarios, not actual forecasting. Perhaps his meaning also reflected his admission that climate models are “low resolution” relative to weather forecasting models. Or maybe he was referencing longer-term outcomes that are scenario-dependent. Nevertheless, the quote is revealing to the extent that one would hope these models are well-calibrated to initial conditions. That is seldom the case, however.

As a modeler, I must comment on a point made by Rogers about the use of ensembles of models. That essentially means averaging the predictions of multiple models that differ in structure. Rogers denigrates the approach, and while it is agnostic with respect to theories of the underlying process generating the data, it certainly has its uses in forecasting. Averaging the predictions of two different models with statistically independent and unbiased predictions will generally produce more accurate forecasts than the individual models. Rogers may or may not be aware of this, but he has my sympathies in this case because the IPCC is averaging across a large number of models that are clearly biased in the same direction! Rogers adds this interesting tidbit on the IPCC’s use of model ensembles:
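The variance-reduction point can be illustrated with a tiny simulation (a sketch with made-up numbers, not climate output): averaging two unbiased, independent forecasters cuts the error by roughly a factor of sqrt(2), but averaging forecasters that share a common bias leaves that bias fully intact, which is the ensemble problem described above.

```python
import random

random.seed(42)
TRUTH = 1.0      # the quantity being forecast (arbitrary units)
N = 100_000      # simulated forecast occasions

def rmse(errors):
    """Root-mean-square of a list of forecast errors."""
    return (sum(e * e for e in errors) / len(errors)) ** 0.5

# Two unbiased, independent forecasters, each with noise sd = 0.5
a = [TRUTH + random.gauss(0, 0.5) for _ in range(N)]
b = [TRUTH + random.gauss(0, 0.5) for _ in range(N)]
avg = [(x + y) / 2 for x, y in zip(a, b)]

print(rmse([x - TRUTH for x in a]))    # ~0.50
print(rmse([x - TRUTH for x in avg]))  # ~0.35, i.e., 0.5 / sqrt(2)

# Two forecasters sharing a common warm bias of +0.4:
# averaging reduces noise but cannot remove the shared bias
c = [TRUTH + 0.4 + random.gauss(0, 0.5) for _ in range(N)]
d = [TRUTH + 0.4 + random.gauss(0, 0.5) for _ in range(N)]
avg_biased = [(x + y) / 2 for x, y in zip(c, d)]
mean_err = sum(x - TRUTH for x in avg_biased) / N
print(mean_err)  # ~+0.4: the common bias survives averaging
```

In other words, ensembles help with independent noise, not with a systematic warm tilt shared across members.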

“There is a political reason for using ensembles. In order to receive the benefits flowing from predicting a climate catastrophe, climate science must present a unified front. Dissenters have to be canceled and suppressed. If the IPCC were to select the best model, dozens of other modeling groups would be left out. They would, no doubt, form a dissenting group questioning the authority of those that gave the crown to one particular model.”

Rogers discusses one more aspect of the underpinnings of climate models, one that I’ve covered several times on this blog. That is the extent to which historical climate data is completely lacking, plagued by discontinuities and gaps in coverage, or distorted by imperfections in measurement. The data used to calibrate climate models has been manipulated, adjusted, infilled, and estimated over lengthy periods by various parties to produce “official” and unofficial temperature series. While these efforts might seem valiant as exercises in understanding the past, they are fraught with uncertainty. Rogers provides a link to the realclimatescience blog, which details many of the data shortcomings as well as shenanigans perpetrated by researchers and agencies who have massaged, imputed, or outright created these historical data sets out of whole cloth. Rogers aptly notes:

“The purported climate catastrophe ahead is 100% junk science. If the unlikely climate catastrophe actually happens, it will be coincidental that it was predicted by climate scientists. Most of the supporting evidence is fabricated.”

Hyperbolic Scenarios, Crude Climate Models, and Scientism

07 Sunday Nov 2021

Posted by Nuetzel in Climate science, Global Warming

≈ 6 Comments

Tags

Carbon Efficiency, Carbon forcing, carbon Sensitivity, Cloud Feedback, COP26, G20, Global Temperature, IEA, Intergovernmental Panel on Climate Change, International Energy Agency, IPCC, Joe Biden, Joe Brandon, Judith Curry, Justin Ritchie, Net Zero Emissions, Nic Lewis, Precautionary Principle, Prince Charles, RCP8.5, rent seeking, Representative Concentration Pathway, Roger Pielke Jr., Scientism, United Nations

What we hear regarding the dangers of climate change is based on predictions of future atmospheric carbon concentrations and corresponding predictions of global temperatures. Those predictions are not “data” in the normal, positive sense. They do not represent “the way things are” or “the way things have been”, though one might hope the initial model conditions align with reality. Nor can the predictions be relied upon as “the way things will be”. Climate scientists normally report a range of outcomes produced by models, yet we usually hear only one type of consequence for humanity: catastrophe!

Models Are Not Reality

The kinds of climate models quoted by activists and by the UN’s Intergovernmental Panel on Climate Change (IPCC) have been around for decades. Known as “carbon forcing” models, they are highly simplified representations of the process determining global temperatures. The primary forecast inputs are atmospheric carbon concentrations over time, which again are themselves predictions.

It’s usually asserted that climate model outputs should guide policy, but we must ask: how much confidence can we have in the predictions to allow government to take coercive actions having immediate, negative impacts on human well being? What evidence can be marshaled to show prospective outcomes under proposed policies? And how well do these models fit the actual, historical data? That is, how well do model predictions track our historical experience, given the historical paths of inputs like carbon concentrations?

Faulty Inputs

The IPCC has been defining and updating sets of carbon scenarios since 1990. The scenarios outline the future paths of greenhouse gas emissions (and carbon forcings). They were originally based on economic and demographic modeling before an apparent “decision by committee” to maintain consistency with scenarios issued in the past. Roger Pielke Jr. and Justin Ritchie describe the evolution of this decision process, and they call for change:

“Our research (and that of several colleagues) indicates that the scenarios of greenhouse gas (GHG) emissions through the end of the twenty-first century are grounded in outdated portrayals of the recent past. Because climate models depend on these scenarios to project the future behavior of the climate, the outdated scenarios provide a misleading basis both for developing a scientific evidence base and for informing climate policy discussions. The continuing misuse of scenarios in climate research has become pervasive and consequential—so much so that we view it as one of the most significant failures of scientific integrity in the twenty-first century thus far. We need a course correction.”

One would certainly expect the predicted growth of atmospheric carbon to evolve over time. However, as Pielke and Ritchie note, the IPCC’s baseline carbon scenario today, known as RCP8.5 (“Representative Concentration Pathway”), is remarkably similar to the “business as usual” (BAU) scenario it first issued in 1990:

“The emissions scenarios the climate community is now using as baselines for climate models depend on portrayals of the present that are no longer true. And once the scenarios lost touch with reality, so did the climate, impact, and economic models that depend on them for their projections of the future. Yet these projections are a central part of the scientific basis upon which climate policymakers are now developing, debating, and adopting policies.”

The authors go on to discuss a few characteristics of the BAU scenario that today seem implausible, including:

“… RCP8.5 foresees carbon dioxide emissions growing rapidly to at least the year 2300 when Earth reaches more than 2,000 ppm of atmospheric carbon dioxide concentrations. But again, according to the IEA and other groups, fossil energy emissions have likely plateaued, and it is plausible to achieve net-zero emissions before the end of the century, if not much sooner.”

Pielke and Ritchie demonstrate that the IPCC’s baseline range of carbon emissions by 2045 is centered well above (actually double) the mid-range of scenarios developed by the International Energy Agency (IEA), and there is very little overlap between the two. However, global carbon emissions have been flat over the past decade. Even if we extrapolate the growth in atmospheric CO2 parts per million over the past 20 years, it would rise to less than 600 ppm by 2100, not 1,200 ppm. It’s true that a few countries (China comes to mind) continue to exploit less “carbon efficient” energy resources like coal, but the growth trend in concentrations is likely to continue to taper over time.
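The back-of-envelope extrapolation above can be made explicit (a sketch using approximate round numbers for atmospheric CO2, not an official series): concentrations rose from roughly 370 ppm in 2001 to roughly 415 ppm in 2021, and continuing that linear pace lands well under 600 ppm by 2100.

```python
# Rough linear extrapolation of atmospheric CO2 concentration.
# The ppm values are approximate round numbers, not official data.
ppm_2001, ppm_2021 = 370.0, 415.0
growth_per_year = (ppm_2021 - ppm_2001) / (2021 - 2001)   # ppm per year
ppm_2100 = ppm_2021 + growth_per_year * (2100 - 2021)
print(growth_per_year)  # 2.25 ppm/yr
print(ppm_2100)         # ~593 ppm -- far below RCP8.5's 1,200+ ppm by 2100
```

And since the growth rate of concentrations appears to be tapering rather than accelerating, even this linear pace is arguably generous.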

It therefore appears that the IPCC’s climate scenarios, which are used broadly as model inputs by the climate research community, are suspect. As the saying goes: garbage in, garbage out. But what about the climate models themselves?

Faulty Models

The model temperature predictions have been grossly in error, and they have been and continue to be “too hot”. The chart at the top of this post is typical of comparisons between model projections and actual temperatures. Even before the year 2000, most of the temperature paths projected by the model charted above ran higher than actual temperatures, and the trends subsequently diverged, with the gap becoming more extreme over the past two decades.

The problem is not merely one of faulty inputs. The models themselves are deeply flawed, as they fail to account adequately for natural forces that strongly influence our climate. It’s been clear for many years that the sun’s radiative energy has a massive impact on temperatures, and it is affected not only by the intensity of the solar cycle but also by cloud cover on Earth. Unfortunately, carbon forcing models do not agree on the role that increased clouds might have in amplifying warming. However, a reduction in cloud cover over the past 20 years, and a corresponding increase in radiative heat, can account for every bit of the warming experienced over that time.

This finding not only offers an alternative explanation for two decades of modest warming; it also strikes at the very heart of the feedback mechanism usually assumed to amplify carbon-induced warming. The overall effect is summarized by the so-called carbon sensitivity, measured as the response of global temperature to a doubling of atmospheric carbon concentration. The IPCC puts that sensitivity in a range of 1.5C to 4.5C. However, findings published by Nic Lewis and Judith Curry are close to the low end of that range, as are results from Frank Bosse reported here. The uncertainties surrounding the role of cloud cover and carbon sensitivities reveal that the outputs relied upon by climate alarmists are extreme model simulations, not the kind of reliable intelligence upon which drastic policy measures should be taken.
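The standard relationship behind these sensitivity numbers holds that equilibrium warming scales with the logarithm of the concentration ratio: ΔT = S · log2(C/C0), where S is the sensitivity per doubling. A quick sketch shows how much the assumed S matters (the ppm figures are illustrative round numbers):

```python
from math import log2

def warming(sensitivity_per_doubling, c0_ppm, c_ppm):
    """Equilibrium warming (deg C) for a rise from c0 to c ppm,
    using the standard logarithmic forcing relationship."""
    return sensitivity_per_doubling * log2(c_ppm / c0_ppm)

# A full doubling (e.g., ~280 ppm pre-industrial to 560 ppm)
for s in (1.5, 3.0, 4.5):   # low, mid, and high end of the IPCC range
    print(s, warming(s, 280, 560))  # a doubling yields exactly S degrees

# At roughly 415 ppm today, equilibrium warming relative to 280 ppm:
print(warming(1.5, 280, 415))  # ~0.85 C at the low-end sensitivity
print(warming(4.5, 280, 415))  # ~2.55 C at the high end
```

A factor-of-three spread in S translates directly into a factor-of-three spread in projected warming, which is why low empirical sensitivity estimates undercut the catastrophic scenarios.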

The constant anxiety issued from the Left on the issue of climate change, and not a little haranguing of the rest of us, is misplaced. The IPCC’s scenarios for the future paths of carbon concentration are outdated and seriously exaggerated, and they represent a breach of scientific protocol. Yet the scenarios are widely used as the basis of policy discussions at both the domestic and international levels. The climate models themselves embed questionable assumptions that create a bias toward calamitous outcomes.

Yet Drastic Action Is Urged

The UN’s 2021 climate conference, or COP26 (“Conference of the Parties …”) is taking place in Glasgow, Scotland this month. Like earlier international climate conferences, the hope is that dire forecasts will prompt broad agreement on goals and commitments, and that signatory countries will translate these into policy at the national level.

Things got off to a bad start when, before COP26 even began, the G20 nations failed to agree on a goal of “net-zero” carbon emissions by 2050. Another bad portent for the conference is that China and India, both big carbon emitters, will not attend, which must be tremendously disappointing to attendees. After all, COP26 has been billed by Prince Charles himself as “the last chance saloon, literally”, for saving the world from catastrophe. He said roughly the same thing before the Paris conference in 2015. And Joe Brandon … er, Biden, blurted some hyperbole of his own:

“Climate change is already ravaging the world. … It’s destroying people’s lives and livelihoods and doing it every single day. … It’s costing our nations trillions of dollars.”

All this is unadulterated hogwash. But it is the stuff upon which a crisis-hungry media feeds. This hucksterism is but one form of climate rent-seeking. Other forms are much more troubling: scary scenarios and model predictions serve the self-interest of regulators, grant-seeking researchers, interventionist politicians, and green investors who suckle at the public teat. It is a nightmare of scientism fed by the arrogance of self-interested social planners. The renewable energy technologies promoted by these investors, politicians, and planners are costly and land-intensive, providing only intermittent output (requiring backup fossil fuel capacity), and they have nasty environmental consequences of their own.

The precautionary principle is no excuse for the extreme policies advocated by alarmists. We already have economically viable “carbon efficient” and even zero-carbon energy alternatives, such as natural gas, modular nuclear power, and expanded opportunities for exploiting geothermal energy. This argues against premature deployment of wasteful renewables. The real crisis is the threat posed by the imposition of draconian green policies to our long-term prosperity, and especially to the world’s poor.

Green Climate Policy Wreaks Poverty

03 Friday Sep 2021

Posted by Nuetzel in Climate science, Environmental Fascism

≈ 6 Comments

Tags

Assessment Report #6, Carbon Emissions, Cooling the Past, Deforestation, Democratic Republic of Congo, Diablo Canyon, Disparate impact, Economic Development, Energy Poverty, Fossil fuels, Hügo Krüger, Intergovernmental Panel on Climate Change, IPCC, Jennifer Marohasy, Jim Crow Environmentalism, Joel Kotkin, Judith Curry, Michael Schellenberger, Natural Gas, Net Zero Carbon, Nuclear power, Rare Earth Minerals, Regressive Policy, Remodeled Temperatures, Renewable energy, Steve Koonin

Have no doubt: climate change warriors are at battle with humanity itself, ostensibly on behalf of the natural world. They would have us believe that their efforts to eliminate the use of fossil fuels are necessary to keep our planet from becoming a blazing hothouse. However, the global temperature changes we’ve witnessed over the past 150 years, based on the latest Assessment Report (AR6) from the Intergovernmental Panel on Climate Change (IPCC), are well within the range of historical variation.

“Remodeled” History

Jennifer Marohasy posted an informative discussion of the IPCC’s conclusions last month, putting them into a broader climatological context and focusing in particular on measurement issues. In short, discussing “global” temperatures with any exactitude is something of a sham. Moreover, the local temperature series upon which the global calculations are based have been “remodeled.” They are not direct observations. I don’t think it’s too crude to say they’ve been manipulated because the changed records are almost always in one direction: to “cool” the past.

Judith Curry is succinct in her criticism of the approach to climate change adopted by alarmist policymakers and many climate researchers: 

“In a nutshell, we’ve vastly oversimplified both the problem and its solutions. The complexity, uncertainty, and ambiguity of the existing knowledge about climate change is being kept away from the policy and public debate. The solutions that have been proposed are technologically and politically infeasible on a global scale.”

We need a little more honesty!

The Real Victims

I want to focus here on some of the likely casualties of the war on fossil fuels. Those are, without a doubt, the world’s poor, who are being consigned by climate activists to a future of abject suffering. Joel Kotkin and Hügo Krüger are spot-on in their recent piece on the inhumane implications of anti-carbon ideology.

Energy-poor areas of the world are now denied avenues through which to enhance their peoples’ well being. Attempts to fund fossil-fuel power projects are regularly stymied by western governments and financial institutions in the interests of staving off political backlash from greens. Meanwhile, far more prosperous nations power their economies with traditional carbon-based energy sources. Most conspicuously, China continues to fuel its rapid growth with coal and other fossil fuels, getting little pushback from climate activists. If you’re wondering how the composition of energy output has evolved, this time-lapse chart is a pretty good guide.

One of the most incredible aspects of this situation is how nuclear energy has been spurned, despite its status as a proven and safe solution to carbon-free power. This excellent thread by Michael Schellenberger covers the object lesson in bad public policy offered by the proposed closing of the Diablo Canyon nuclear plant in California.

In both the U.S. and other parts of the world, as Kotkin and Krüger note, it is not just the high up-front costs that lead to the rejection of these nuclear projects. The green lobby and renewable energy interests are now so powerful that nuclear energy is hardly considered. Much the same is true of low-carbon natural gas: 

“Sadly, the combination of virtue-signaling companies and directives shaped by green activists in rich countries – often based on wildly exaggerated projections, notes former Barack Obama advisor Steve Koonin – make such a gradual, technically feasible transition all but impossible. Instead, it is becoming increasingly unlikely that developing countries will be able to tap even their own gas.”

Energy is the lifeblood of every economy. Inadequate power creates obstacles to almost any form of production and renders some kinds of production impossible. And ironically, the environmental consequences of “energy poverty” are dire. Many under-developed economies are largely dependent on deforestation for energy. Without a reliable power grid and cheap energy, consumers must burn open fires in their homes for heat and cooking, a practice responsible for 50% of child pneumonia deaths worldwide, according to Kotkin and Krüger.

Green Environmental Degradation

Typically, under-developed countries are reliant on the extraction of natural resources demanded by the developed world:

“The shift to renewables in the West, for example, has increased focus on developing countries as prime sources for critical metals – copper, lithium, and rare-earth minerals, in particular – that could lead to the devastation of much of the remaining natural and agricultural landscape. … Lithium has led to the depletion of water resources in Latin America and the further entrenchment of child labor in the Democratic Republic of the Congo as the search for cobalt continues.”

Unfortunately, the damage is not solely due to dependence on resource extraction:

“The western greens, albeit unintentionally, are essentially turning the Third World into the place they send their dirty work. Already, notes environmental author Mike Shellenberger, Africans are stuck with loads of discarded, highly toxic solar panels that expose both the legions of rag-pickers and the land itself to environmental degradation – all in the name of environmentalism.”

Battering the Poor In the West

Again, wealthy countries are in far better shape to handle the sacrifices required by the climate calamitists, but it still won’t be easy. In fact, lower economic strata will suffer far more than technocrats, managers, and political elites. The environmental left leans on the insidious lever of energy costs in order to reduce demand, but making energy more costly takes a far larger bite out of the budgets of the poor. In another recent piece, “Jim Crow Returns to California,” Kotkin discusses the disparate impact these energy policies have on minorities. 

“This surge in prices derives from the state’s obsession — shared by the ruling tech oligarchs — with renewable energy and the elimination of fossil fuels. Yet as a recent Massachusetts Institute of Technology (MIT) report has shown, over-reliance on renewables is costly, because it requires the production of massive (and environmentally unfriendly) battery-storage capacity — the price of which is invariably passed on to the taxpayer.

This is not bad news for the tech oligarchs, who have been prominent among those profiting from ‘clean energy’ investments. But many other Californians, primarily those in the less temperate interior, find themselves falling into energy poverty or are dependent on state subsidies that raise electricity prices for businesses and the middle class. Black and Latino households are already forced to pay from 20 to 43% more of their household incomes on energy than white households. Last year, more than 4 million households in California (30% of the total) experienced energy poverty.”

Kotkin touches on other consequences of these misguided policies to minority and non-minority working people. In addition to jobs lost in the energy sector, a wide variety of wage earners will suffer as their employers attempt to deal with escalating energy costs. The immediate effects are bad enough, but in the long-run the greens’ plans would scale back the economy’s productive machinery in order to eliminate carbon emissions — net zero means real incomes will decline! 

Energy costs have a broad impact on consumers’ budgets. Almost every product imaginable is dependent on energy, and consumer prices will reflect the higher costs. In addition, the “green” effort to curtail development everywhere except in high-density transit corridors inflates the cost of housing, inflicting more damage on workers’ standards of living.

Tighten Your Belts

These problems won’t be confined to California if environmental leftists get their version of justice. Be prepared for economic stagnation for the world’s poor and a sharply reduced standard of living in the developed world, all quite unnecessarily. We’ll all pay in the long run, but the poor will pay much more in relative terms.

Climate Activists Run From Rigor

02 Tuesday Jul 2019

Posted by Nuetzel in Climate science

≈ 1 Comment

Tags

Carbon Forcings, Chicken Little, Global Energy Budget, IPCC, John Christy, Richard Feynman, Testable Hypotheses, Unfalsifiable Claims

Climate activists are seemingly averse to empiricism, and to the scientific method for that matter. Esteemed climatologist John Christy makes that point abundantly clear in a recent speech entitled “Putting Climate Change Claims To the Test“. Christy was one of two lead authors of the third Assessment Report (AR3) of the United Nations Intergovernmental Panel on Climate Change (IPCC), published in 2001, but his research into systematic discrepancies between climate models and actual temperature trends put him in the doghouse with the IPCC. He hasn’t been asked to serve as a lead author since. The transcript linked above is awkward in a few spots where Christy makes informal references to slides, which do, however, accompany and align with the transcript. He covers a lot of ground in this speech, but I’ll cover just a few points. Read the whole thing!

Christy notes that all of the climate models used by the IPCC have substantially over-predicted temperatures for the past thirty years, by an average across models of more than 2.5 times! He measured the errors 17 years ago and again recently, and the magnitude of those errors was almost identical. Yet little progress has been made in correcting the climate models. And why bother? The press simply won’t report the errors, and the IPCC and the activist community are too enraptured by their religious, end-of-days narrative to give it up.
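The arithmetic behind a claim like "over-predicted by 2.5 times" is just a comparison of fitted temperature trends. The sketch below uses synthetic series, not Christy's data, purely to show the mechanics of the comparison; the slope values are illustrative assumptions.

```python
# Illustrative only: comparing a modeled warming trend to an observed one.
# The two series below are synthetic stand-ins, NOT Christy's actual data.
import numpy as np

def trend_per_decade(years, temps):
    # Least-squares slope in degrees/year, scaled to degrees/decade.
    slope = np.polyfit(years, temps, 1)[0]
    return slope * 10

years = np.arange(1990, 2020)
observed = 0.011 * (years - 1990)   # ~0.11 C/decade (assumed)
modeled  = 0.028 * (years - 1990)   # ~0.28 C/decade (assumed)

ratio = trend_per_decade(years, modeled) / trend_per_decade(years, observed)
print(round(ratio, 2))  # the model trend is ~2.5x the observed trend
```

A ratio like this, averaged across many models, is the kind of summary statistic behind the over-prediction figure Christy cites.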

Christy uses a stylized global “energy budget” to illustrate the various sources of climate forcing. He attributes the climate model errors to a failure to adequately account for the escape of energy from the atmosphere into space. He also demonstrates that the magnitude of carbon forcing from human activity represents a tiny contribution to the impact of Earth’s total energy forcings.

Another major point from Christy is that the climate research community has lost its scientific bearings. The very title of his speech refers to testable hypotheses, which is what real science is all about. Christy provides a punchy quote from Richard Feynman on this issue: “Science is a belief in the ignorance of experts.” Today it is routine for climate scientists to report results based on extrapolations from models hinging on mere assumptions they claim to have backtested. Those backtests are often based on flimsy standards and tend to receive little scrutiny, just as long as they are consistent with the so-called “expert consensus”. In other words, those claims amount to a big “what-if” exercise, and the underlying assumptions often lack rigorous testing. Christy goes on:

“Michael Crichton says that in science consensus is irrelevant, what is relevant is reproducible results, consensus is inappropriate. So, as an aside, there’s a strange thing happening in climate science: the proliferation of unfalsifiable claims, in other words the unfalsifiable hypothesis.  Remember I said that scientific method: you make a claim and the claim has to be testable and falsifiable and then you check and see if it’s the real thing.

Well here’s the claim. Whatever happens is consistent with global warming.  Maybe it’s snow or no snow. More hurricanes? Less hurricanes? This method says wait for something to happen and then claim that human-caused warming is to blame. That’s the unfalsifiable hypothesis and it has no information value and there is no way to test it, it has no testable parameter and so it is not science. The unfalsifiable hypothesis predicts anything is possible therefore nothing is testable.“

In other words, today climate research is infested with “fake science”. Christy marches through a variety of climate-related phenomena in his speech, offering evidence on each that is sometimes mixed but often contrary to the implications of climate models and claims made by activists. The big picture is nothing short of damning to the catastrophic warming narrative. Yet the dire scenarios feared by the Chicken Littles of the world, which never come to pass, continue to be reported eagerly by the media and pressed for costly political action.

The UN’s Mass Extinction Fiction

20 Monday May 2019

Posted by Nuetzel in Biodiversity, Central Planning, Environment

≈ Leave a comment

Tags

African Elephants, Beepocalypse, Biodiversity, Bird Eater Tarantulas, CO2 Emissions, Dan Hannon, Extinction, Gary Wrightstone, Global Greening, Habitat Loss, IPCC, IUCN Red List, Jimmy Carter, Matt Ridley, Non-Native Species, Paris Accord, Polar Bears

A big story early this month warned of mass extinctions and a collapse of the planet’s biodiversity. This was based on a report by the UN’s Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES). A high-level presentation of the data by IPBES was constructed in a way that is easily revealed as misleading (see below). But the first thing to ask about bombastic reports like this is whether the authors are self-interested. There is big money in promoting apocalyptic scenarios and public programs to avert them. Large government grants are at stake for like-minded scientists, and political power is at stake for biodiversity activists worldwide. Like many other scare stories reported as “news”, this one feeds into the statist political agenda of the environmental Left.

Exaggerated claims of species endangerment are not a new phenomenon. We’ve heard grossly erroneous forecasts of polar bear extinctions, frightening but false warnings of a “beepocalypse”, and faulty claims about declines in the population of African elephants. These are headline-grabbing and more thrilling to report than mourning the prospective loss of an obscure species of cave lichen. But a mass extinction is something else! Dan Hannon reminds us of the following:

“In 1980, for example, the Jimmy Carter administration distributed to foreign governments a report claiming that, by the year 2000, 2 million species would be wiped out. In fact, by 2010, there had been 872 documented extinctions.” 

Of course, that figure does not account for the multitude of new species discovered. There are many. Recent examples just gruesome enough to garner attention are the three new species of bird eater tarantulas discovered in 2017.

In the more general mass-extinction context of the IPBES report, the blame for the extremely pessimistic outlook is placed squarely on human activity. The authors allege CO2 emissions as the primary culprit, which is at best a theory, and one at odds with the chief driver of extinctions during the industrial era: the introduction of non-native species into environments whose flora and fauna are unable to withstand the new competitors. Matt Ridley elaborates:

“The introduction by people of predators, parasites and pests, especially to islands, has been and continues to be far and away the greatest cause of local and global extinction of native fauna.”

There is no question that the IPBES report on extinctions was intended to create alarm. As Gary Wrightstone demonstrates, the lack of rigor and misleading expositional techniques used in the report are a tell:

“… the data were lumped together by century rather than shorter time frames, which, as we shall see accentuates the supposed increase in extinctions. … The base data were derived from the International Union for Conservation of Nature and Natural Resources (IUCN) Red List, which catalogues every known species that has gone the way of the dodo and the carrier pigeon. Review of the full data set reveals a much different view of extinction and what has been happening recently.”

The more granular charts Wrightstone presents are indeed contrary to the narrative in the IPBES report. And Wrightstone also highlights the following in a postscript:

“In an incredibly ironic twist that poses a difficult conundrum for those who are intent on saving the planet from our carbon dioxide excesses, the new study reports that the number one cause of predicted extinctions is habitat loss. Yet their solution is to pave over vast stretches of land for industrial scale solar factories and to construct immense wind factories that will cover forests and grasslands, killing the endangered birds and other species they claim to want to save.”

The enduring extinction racket is one among other fronts in the war on capitalism. The IPBES report must use the term “transformative” a thousand times, as it recommends “steering away from the current limited paradigm of economic growth“. Matt Ridley highlights the faulty attribution of alleged declines in biodiversity to “western values and capitalism”:

“On the whole what really diminishes biodiversity is a large but poor population trying to live off the land. As countries get richer and join the market economy they generally reverse deforestation, slow species loss and reverse some species declines.”

And Ridley also says this:

“A favourite nostrum of many environmentalists is that you cannot have infinite growth with finite resources. But this is plain wrong, because economic growth comes from doing more with less. So if I invent a new car engine that gets twice as many miles per gallon, I’ve caused economic growth but we’ll use less fuel. Likewise if I increase the yield of a crop, I need less land and probably less fuel too.”

It’s no coincidence that future extinctions foretold by IPBES are predicted to have drastic impacts on less-developed countries. It thus appears that IPBES exists in a happy synergy with the UN’s Intergovernmental Panel on Climate Change (IPCC), as well as proponents of the Paris Accord and the entire climate lobby. An objective that helps them garner support around the globe is to redistribute existing wealth to less-developed countries in the name of environmental salvation. That would prove a poor substitute for the kinds of free-market policies that would truly enhance prospects for economic growth in those nations.

The threat of mass extinctions is greatly exaggerated by the UN, IPBES, climate change activists, and members of the media who can’t resist promoting a crisis. Any diminished biodiversity we might experience going forward won’t be solved by limiting economic growth, as the IPBES report claims. Instead, advances in productivity, particularly in agriculture, can allow expansion of native habitat, as recent experience with reforestation and global greening demonstrates. This principle is as applicable to under-developed countries as anywhere else.

The kinds of centrally planned limits on human activity contemplated by the IPBES report are likely to backfire by making us poorer. Those limits would impose costs by misallocating resources away from things that people value most highly. They would also force people to forego the adoption of innovative production techniques, leading to the substitution of other resources, such as inefficient land use. And those limits would deny basic freedoms, including the unfettered use of private property.

A Carbon Tax Would Be Fine, If Only …

01 Friday Mar 2019

Posted by Nuetzel in Environment, Global Warming, Taxes

≈ Leave a comment

Tags

A.C. Pigou, Carbon Dividend, Carbon Tax, Climate Change, Economic Development, External Cost, Fossil fuels, Green New Deal, IPCC, John Cochrane, Michael Shellenberger, Pigouvian Tax, Quillette, Renewable energy, Revenue Neutrality, Robert P. Murphy, Social Cost of Carbon, Warren Meyer, William D. Nordhaus

I’ve opposed carbon taxes on several grounds, but I admit that it might well be less costly as a substitute for the present mess that is U.S. climate policy. Today, we incur enormous costs from a morass of energy regulations and mandates, prohibitions on development of zero-carbon nuclear power, and subsidies to politically-connected industrialists investing in corn ethanol, electric cars, and land- and wildlife-devouring wind and solar farms. (For more on these costly and ineffective efforts, see Michael Shellenberger’s “Why Renewables Can’t Save the Planet” in Quillette.) Incidentally, the so-called Green New Deal calls for a complete conversion to renewables in unrealistically short order, but with very little emphasis on a carbon tax.

The Carbon Tax

Many economists support the carbon tax precisely because it’s viewed as an attractive substitute for many other costly policies. Some support using revenue from the tax to pay a flat rebate or “carbon dividend” to everyone each year (essentially a universal basic income). Others have pitched the tax as a revenue-neutral replacement for other taxes that are damaging to economic growth, such as payroll taxes or taxes on capital. Economic growth would improve under the carbon tax, or so the story goes, because the carbon tax is a tax on a “bad”, as opposed to taxes on “good” factors of production. I view these ideas as politically naive. If we ever get the tax, we’ll be lucky to get much regulatory relief in the bargain, and the revenue is not likely to be offset by reductions in other taxes.

But let’s look a little closer at the concept of the carbon tax, and I beg my climate-skeptic friends to stick with me for a few moments and keep a straight face. The tax is a way to attach an explicit price to the use of fuels that create carbon emissions. The emissions are said to inflict social or external costs on other parties, costs which are otherwise ignored by consumers and businesses in their many decisions involving energy use. The carbon tax is a so-called Pigouvian tax: a way to “internalize the externality” by making fossil fuels more expensive to burn. The tax itself involves no prohibitions on behavior of any kind. Certain behaviors are taxed to encourage more “desirable” behavior.

Setting the Tax

But what is the appropriate level of the tax? At what level will it approximate the true “social cost of carbon”? Any departure from that cost would be sub-optimal. Robert P. Murphy contrasts William D. Nordhaus’ optimal carbon tax with more radical levels, which Nordhaus believes would be needed to meet the goals of the United Nations’ Intergovernmental Panel on Climate Change (IPCC). Nordhaus won the 2018 Nobel Prize in economics for his work on climate change. Whatever one might think of the real risks of climate change, Nordhaus clearly recognizes the economic downsides of mitigating those risks.

Nordhaus has estimated that the social cost of carbon will be $44/ton in 2025 (about $0.39 per gallon of gas). He claims that a carbon tax at that level would limit increases in global temperature to 3.5º Celsius by 2100. He purports to show that the costs of a $44 carbon tax in terms of reduced economic output would be balanced by the gains from limiting climate warming. Less warming would require a higher tax with fewer incremental rewards, and even more incremental lost output. The costs of the tax would then outweigh benefits. For perspective, according to Nordhaus, a stricter limit of 2.5º C implies a carbon tax equivalent to $2.50 per gallon of gas. The IPCC, however, prescribes an even more radical limit of 1.5º C. That would inflict a huge cost on humanity far outweighing the potential benefits of less warming.
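The conversion from a tax per ton of CO2 to a surcharge per gallon of gasoline is simple arithmetic. The sketch below assumes roughly 8.887 kg of CO2 emitted per gallon of gasoline burned (a commonly cited EPA estimate, not a figure from the post) and recovers the approximate per-gallon numbers quoted above.

```python
# Convert a carbon tax in $/metric ton of CO2 into an approximate
# gasoline surcharge in $/gallon.
# Assumption (not from the post): ~8.887 kg CO2 per gallon of gasoline,
# a commonly cited EPA estimate.
KG_CO2_PER_GALLON = 8.887

def tax_per_gallon(tax_per_ton):
    """Per-gallon surcharge implied by a CO2 tax of tax_per_ton $/metric ton."""
    return tax_per_ton * KG_CO2_PER_GALLON / 1000.0

print(round(tax_per_gallon(44.0), 2))   # Nordhaus' 2025 estimate -> 0.39
print(round(tax_per_gallon(281.0), 2))  # roughly the $2.50/gal level -> 2.5
```

Working backward, the $2.50/gallon tax Nordhaus associates with a 2.5º C limit corresponds to a carbon price of roughly $280 per ton, more than six times his estimated social cost.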

A Carbon Tax, If…

Many economists have come down in favor of a carbon tax under certain qualifications: revenue-neutrality, a “carbon dividend”, or as a pre-condition to deregulation of carbon sources and de-subsidization of alternatives. John Cochrane discusses a carbon tax in the context of the “Economists’ Statement on Carbon Dividends” (Cochrane’s more recent thoughts are here):

“It’s short, sweet, and signed by, as far as I can tell, every living CEA chair, every living Fed Chair, both Democrat and Republican, and most of the living Nobel Prize winners. … It offers four principles 1. A carbon tax, initially $40 per ton. 2. The carbon tax substitutes for regulations and subsidies and (my words) the vast crony-capitalist green boondoggle swamp, which is chewing up money and not saving carbon. 3. Border adjustment like VAT have [sic] 4. ‘All the revenue should be returned directly to U.S. citizens through equal lump-sum rebates.'”

Rather than a carbon dividend, Warren Meyer proposes that a carbon tax be accompanied by a reduction in the payroll tax, an elimination of all subsidies, mandates, and prohibitions, development of more nuclear power-generating capacity, and contributions to a cleanup of Chinese and Asian coal-power generation. That’s a lot of stuff, and I think it exceeds Meyer’s normal realism with respect to policy issues.

My Opposition

Again, I oppose the adoption of a carbon tax for several reasons, despite my sympathy for the logic of Pigouvian taxation of externalities. At the risk of repeating myself, here I elaborate on my reasons for opposition:

Government Guesswork: First, Nordhaus’ estimates notwithstanding, we do not and cannot know the climate/economic tradeoffs with any precision. We can barely measure global climate, and the history of what measures we have is short and heavily manipulated. Models purporting to show the relationship between carbon forcing and global climate change are notoriously unreliable. So even if we could agree on the goal (1.5º, 2.5º, 3.5º), and we won’t, the government would get the tradeoffs wrong. I took the following from a comment on Cochrane’s blog, a quote from A.C. Pigou himself:

“It is not sufficient to contrast the imperfect adjustments of unfettered enterprise with the best adjustment that economists in their studies can imagine. For we cannot expect that any State authority will attain, or even wholeheartedly seek, that ideal. Such authorities are liable alike to ignorance, to sectional pressure, and to personal corruption by private interest. A loud-voiced part of their constituents, if organized for votes, may easily outweigh the whole.”

Political Hazards: Second, we won’t get the hoped-for political horse trade made explicit in the “Economists’ Statement …” discussed above. As a political matter, the setting of the carbon tax rate will almost assuredly get us a rate that’s too high. Experiences with carbon taxes in Australia, British Columbia, and France have been terrible thus far, sowing widespread dissatisfaction with the resultant escalation of energy prices.

Economic Growth: Neither is it a foregone conclusion that a revenue-neutral carbon tax will stimulate economic growth, and it might actually reduce output. As Robert P. Murphy explains in another post, the outcome depends on the structure of taxes prior to the change. The substitution of the carbon tax will increase output only if it replaces taxes on a factor of production (labor or capital) that is overtaxed prior to the change. That undermines a key selling point: that the carbon tax would necessarily produce a “double dividend”: a reduction in carbon emissions and higher economic growth. Nevertheless, I’d allow that revenue neutrality combined with elimination of carbon regulation and “green” subsidies would be a good bet from an economic growth perspective.

Overstated Risks: Finally, I oppose carbon taxes because I’m unconvinced that the risk and danger of global warming are as great as even Nordhaus would have it. In other words, the external costs of carbon don’t amount to much. Our recorded temperature history is extremely short and is therefore not a reliable guide to the long-term nature of the systemic relationships at issue. Even worse, temperature records are manipulated to exaggerate the trend in temperatures (also see here, here and here). There is no evidence of an uptrend in severe weather events, and the dangers of sea level rise associated with increasing carbon concentrations also have been greatly exaggerated. Really, at some point one must take notice of the number of alarming predictions and doomsday headlines from the past that have not been borne out even remotely. Furthermore, higher carbon concentrations and even warming itself would be of some benefit to humanity. In addition to a greener environment, the benefits include more rapid economic growth, improved agricultural yields, and a reduction in the salient danger of cold-weather deaths.

Economic Development: The use of fossil fuels has helped to enable strong growth in incomes in developed economies. It has also given us energy alternatives such as nuclear power as well as research into other alternatives, albeit with very mixed success thus far. And while a carbon tax would create an additional incentive to develop such alternatives, a U.S. tax would not accomplish much if any global temperature reduction. Such a tax would have to be applied on a global scale. Talk about a political long-shot! Increasing the price of carbon emissions also has enormous downsides for the less developed world. These fragile economies would benefit greatly from development of fossil fuel energy, enabling reductions in poverty and the income growth necessary to someday join in the prosperity of the developed economies. This, along with liberalization of markets, is the affordable way to bring economic success to these countries, which in turn will enable them to consider the energy alternatives that might come to fruition by that time. Fighting the war on fossil fuels in the underdeveloped world is nothing if not cruel.

 

Climate Summit Success? Let’s Talk In Five Years

02 Wednesday Dec 2015

Posted by Nuetzel in Global Warming, Human Welfare

≈ 1 Comment

Tags

AGW, Benny Peiser, Carbon Emissions, Carbon Verification, Climate Alarmism, Climate and Terrorism, Climate Hysteria, Climate Summit, COP 21, global warming, IPCC, Joel Kotkin, Matt Ridley, Regressive Climate Policy


Misplaced priorities are on full display in Paris for the next ten days at the climate conference known as COP-21 (“Conference of the Parties”). Joel Kotkin makes note of the hysteria in evidence among climate activists fostered by political opportunists, economic illiteracy and fraudulent climate research. Of course, climate alarmism offers handsome rewards for politician-cronyists and rent-seeking corporatists. With that seemingly in mind, President Barack Obama is playing the role of opportunist-in-chief, claiming that climate change is the biggest threat to U.S. security while blithely asserting that the climate is responsible for the growing danger from terrorism. Here is Kotkin on such tenuous claims:

“… this reflects the growing tendency among climate change activists to promote their cause with sometimes questionable assertions. Generally level-headed accounts, such as in the Economist and in harder-edge publications like the Daily Telegraph, have demonstrated that many claims of climate change activists have already been disproven or are somewhat exaggerated.“

“Somewhat exaggerated” is an understatement, given the scandals that have erupted in the climate research community, the miserable predictive record of carbon forcing models, and the questionable practices employed by NASA and NOAA researchers in adjusting surface temperature data (see below for links). When it comes to climate activism, the Orwellian aspect of Groupthink is palpable:

“Rather than address possible shortcomings in their models, climate change activists increasingly tend to discredit critics as dishonest and tools of the oil companies. There is even a move to subject skeptics to criminal prosecution for deceiving the public.“

This is thoroughly contrary to the spirit of scientific inquiry, to say nothing of free speech. As if to parody their questionable approach to an issue of science, climate-change devotees have come out in full force to attack the excellent Matt Ridley, a sure sign that they find his message threatening to the power of their mantra. Ridley and Benny Peiser have an op-ed in the Wall Street Journal this week entitled “Your Complete Guide to the Climate Debate” (should be ungated for now). The authors discuss the weakness of the scientific case for anthropogenic global warming (AGW); the fact that they use findings of the Intergovernmental Panel on Climate Change (IPCC) to make this critique must be particularly galling to the alarmists. Ridley and Peiser cover the correspondingly flimsy case for draconian environmental policies to deal with the perceived threat of AGW. They also emphasize the regressive nature of the demands made by the environmental left, who are either ignorant of or unfazed by the following truths:

“… there are a billion people with no grid electricity whose lives could be radically improved—and whose ability to cope with the effects of weather and climate change could be greatly enhanced—with the access to the concentrated power of coal, gas or oil that the rich world enjoys. Aid for such projects has already been constrained by Western institutions in the interest of not putting the climate at risk. So climate policy is hurting the poor.“

Finally, Ridley and Peiser explain the economic incentives that are likely to undermine any meaningful international agreement in Paris. Less developed countries have been asked to reduce their carbon emissions, which they can ill afford, and to agree to a verification framework. Those parties might agree if they view the framework as sufficiently easy to game (and it will be), and if they are compensated handsomely by the developed world. The latter will represent an insurmountable political challenge for the U.S. and other developed countries, who are already attempting to promulgate costly new restrictions on carbon emissions.

“Concerned about the loss of industrial competitiveness, the Obama administration is demanding an international transparency-and-review mechanism that can verify whether voluntary pledges are met by all countries. Developing countries, however, oppose any outside body reviewing their energy and industrial activities and carbon-dioxide emissions on the grounds that such efforts would violate their sovereignty.

… China, India and the ‘Like-Minded Developing Countries’ group are countering Western pressure by demanding a legally binding compensation package of $100 billion a year of dedicated climate funds, as promised by President Obama at the U.N. climate conference in Copenhagen in 2009.

However, developing nations are only too aware that the $100 billion per annum funding pledge is never going to materialize, not least because the U.S. Congress would never agree to such an astronomical wealth transfer. This failure to deliver is inevitable, but it will give developing nations the perfect excuse not to comply with their own national pledges.“

These conflicting positions may mean that the strongest point of accord at the Paris conference will be to meet again down the road.

“Expect an agreement that is sufficiently vague and noncommittal for all countries to sign and claim victory. Such an agreement will also have to camouflage deep and unbridgeable divisions while ensuring that all countries are liberated from legally binding targets a la Kyoto.“

This morning, an apparently sleepy and deluded President Obama spoke at the Paris conference before heading back to the U.S. He insisted again that the agreement he expects to come out of Paris will be a “powerful rebuke” to terrorists. Yeah, that’ll show ’em! Even a feeble agreement will be trumpeted as a great victory by the conference parties; Obama and the Left will attempt to wield it as a political cudgel, a brave accomplishment if it succeeds in any way, and a vehicle for blame if it is blocked by the principled opponents of climate alarmism. The media will play along without considering scientific evidence running contrary to the hysterical global warming narrative. Meanwhile, the frailty of the agreement will represent something of a win for humanity.

Here are some links to previous posts on this topic from Sacred Cow Chips:

Climate Negotiators To Discuss Economic Cannibalism

A Cooked Up Climate Consensus

Fitting Data To Models At NOAA

Carbon Farce Meets Negative Forcings

Subsidized Waste: The Renewable Irony

Manipulating Temperatures, People & Policy

Record Hot Baloney

Alluring Apocalypse Keeps Failing To Materialize

The Stench of Green Desperation

Cut CO2, But What About the Environment?

Live Long and Prosper With Fossil Fuels

Divesting of Human Well-Being

 

 

← Older posts
Follow Sacred Cow Chips on WordPress.com

Recent Posts

  • Is “Global Temperature” a Fiction?
  • ESG Contortions: Virtue, Returns, and Politics
  • Grading Trump II, So Far
  • A Warsh Policy Scenario At the Federal Reserve
  • The Coexistence of Labor and AI-Augmented Capital

Archives

  • May 2026
  • April 2026
  • March 2026
  • February 2026
  • January 2026
  • December 2025
  • November 2025
  • October 2025
  • September 2025
  • August 2025
  • July 2025
  • June 2025
  • May 2025
  • April 2025
  • March 2025
  • February 2025
  • January 2025
  • December 2024
  • November 2024
  • October 2024
  • September 2024
  • August 2024
  • July 2024
  • June 2024
  • May 2024
  • April 2024
  • March 2024
  • February 2024
  • January 2024
  • December 2023
  • November 2023
  • August 2023
  • July 2023
  • June 2023
  • May 2023
  • April 2023
  • March 2023
  • February 2023
  • January 2023
  • December 2022
  • November 2022
  • October 2022
  • September 2022
  • August 2022
  • July 2022
  • June 2022
  • May 2022
  • April 2022
  • March 2022
  • February 2022
  • January 2022
  • December 2021
  • November 2021
  • October 2021
  • September 2021
  • August 2021
  • July 2021
  • June 2021
  • May 2021
  • April 2021
  • March 2021
  • February 2021
  • January 2021
  • December 2020
  • November 2020
  • October 2020
  • September 2020
  • August 2020
  • July 2020
  • June 2020
  • May 2020
  • April 2020
  • March 2020
  • February 2020
  • January 2020
  • December 2019
  • November 2019
  • October 2019
  • September 2019
  • August 2019
  • July 2019
  • June 2019
  • May 2019
  • April 2019
  • March 2019
  • February 2019
  • January 2019
  • December 2018
  • November 2018
  • October 2018
  • September 2018
  • August 2018
  • July 2018
  • June 2018
  • May 2018
  • April 2018
  • March 2018
  • February 2018
  • January 2018
  • December 2017
  • November 2017
  • October 2017
  • September 2017
  • August 2017
  • July 2017
  • June 2017
  • May 2017
  • April 2017
  • March 2017
  • February 2017
  • January 2017
  • December 2016
  • November 2016
  • October 2016
  • September 2016
  • August 2016
  • July 2016
  • June 2016
  • May 2016
  • April 2016
  • March 2016
  • February 2016
  • January 2016
  • December 2015
  • November 2015
  • October 2015
  • September 2015
  • August 2015
  • July 2015
  • June 2015
  • May 2015
  • April 2015
  • March 2015
  • February 2015
  • January 2015
  • December 2014
  • November 2014
  • October 2014
  • September 2014
  • August 2014
  • July 2014
  • June 2014
  • May 2014
  • April 2014
  • March 2014

Blogs I Follow

  • Passive Income Kickstart
  • OnlyFinance.net
  • TLC Cholesterol
  • Nintil
  • kendunning.net
  • DCWhispers.com
  • Hoong-Wai in the UK
  • Marginal REVOLUTION
  • Stlouis
  • Watts Up With That?
  • American Elephants
  • The View from Alexandria
  • The Gymnasium
  • A Force for Good
  • Notes On Liberty
  • troymo
  • SUNDAY BLOG Stephanie Sievers
  • Miss Lou Acquiring Lore
  • Your Well Wisher Program
  • Objectivism In Depth
  • RobotEnomics
  • Orderstatistic
  • Paradigm Library
  • Scattered Showers and Quicksand
  • Jam Review
