Sacred Cow Chips

Category Archives: Risk

Rejecting Fossil Fuels at Our Great Peril

18 Wednesday May 2022

Posted by Nuetzel in Central Planning, Energy, Risk, Technology

≈ 2 Comments

Tags

Bartley J. Madden, Biden Administration, Dan Ervin, Don Boudreaux, Electric Vehicles, Energy Mandates, Energy subsidies, EV Adoption, External Benefits, External Costs, Fossil fuels, Grid Stability, Intermittency, Kevin Williamson, Markets, Power Outages, Price Controls, regressivity, Renewable energy, Russia Sanctions, SEC Carbon Mandate, Sustainability

The frantic rush to force transition to a zero-carbon future is unnecessary and destructive to both economic well-being and the global environment. I do not subscribe to the view that a zero-carbon goal is an eventual necessity, but even if we stipulate that it is, a rational transition would eschew the immediate abandonment of fossil fuels and adopt a gradual approach relying heavily on market signals rather than a mad dash via coercion.

I’ve written about exaggerated predictions of temperature trends and catastrophes on a number of occasions (and see here for a similar view from a surprising source). What might be less obvious is the waste inherent in forcing the abandonment of mature and economic technologies in favor of as-yet under-developed and uneconomic technologies. These failures should be obvious when the grid fails, as it increasingly does. It is often better to leave the development and dispersion of new technologies to voluntary decision-making. In time, advances will make alternative, low- or zero-carbon energy sources cost-effective and competitive for users. That will include efficient energy storage at scale, new nuclear technologies, geothermal techniques, and further improvements in the carbon efficiency of fossil fuels themselves. These should be chosen by private industry, not government planners.

Boneheads At the Helm

Production of fossil fuels has been severely hampered by the Biden Administration’s policies. The sanctions on Russian oil that only began to take hold in March have caused an additional surge in the price of oil. Primarily, however, we’ve witnessed an artificial market disruption instigated by Biden’s advisors on environmental policy. After all, neither Russian oil imports nor the more recent entreaties to rogue states such as Iran and Venezuela for oil would have been necessary if not for the Administration’s war on fossil fuels. Take a gander at this White House Executive Order issued in January 2021. It reads like a guidebook on how to kill an industry. In a column this weekend, Kevin Williamson quipped about “the Biden administration’s uncanny ability to get everything everywhere wrong all at once.” That was about policy responses to inflation, but it applies to energy in particular.

Scorning the Miracle

Fossil fuels are the source of the cheap and reliable energy that has lifted humanity to an unprecedented level of prosperity. Fossil fuels have given a comfortable existence to billions of people, allowing them to rise out of poverty. This prosperity gives us the luxury of time to develop substitutes, not to mention much greater safety against the kind of weather extremes that have always been a fact of life. The world still gets 80% of its energy from fossil fuels. These fuels are truly a miracle, and we should not discard such valuable technologies prematurely. Doing so forces huge long-term investments in inferior technologies that are likely to be superseded in the future by more economic refinements or even energy sources and methods now wholly unimagined. There are investors who will still wish to pursue those new technologies, perhaps with non-pecuniary motives, and there are a few consumers who really want alternatives to fossil fuels.

Biden’s apparent hope that his aggressive climate agenda will be a great legacy of his presidency is at the root of his intransigence toward fossil fuels. His actions in this regard have had a profoundly negative psychological effect on the oil and gas industry. Steps such as cancellations of pipeline projects are immediately impactful in that regard, to say nothing of the supplies that would have ultimately flowed through those pipelines. These cancellations reinforce the message Biden’s been sending to the industry and its investors since his campaign: we mean to shut you down! Who wants to invest in new wells under those circumstances? Other actions have followed: no new federal oil and gas leases, methane restrictions, higher drilling fees on federal land, and a variety of climate change initiatives that bode ill for the industry, such as the SEC’s mandate on carbon disclosures and the Federal Reserve’s proposed role in policing climate impacts.

And now, Democrats are contemplating a move that would make gasoline even more scarce: price controls. As Don Boudreaux says in a recent letter to The Hill:

“Progressives incessantly threaten to tax and regulate carbon fuels into oblivion. These threats cannot but reduce investors’ willingness to fund each of the many steps – from exploration through refining to transporting gasoline to market – that are necessary to keep energy prices low. One reality reflected by today’s high prices at the pump is this hostility to carbon fuels generally and to petroleum especially. And gasoline price controls would only make matters worse by further reducing the attractiveness of investing in the petroleum industry: Why invest in bringing products to market if the prices at which you’re allowed to sell are dictated by grandstanding politicians?”

The kicker is that all these policies are futile in terms of their actual impact on global carbon concentrations, let alone their highly tenuous link to global temperatures. The policies are also severely regressive, inflicting disproportionate harm on the poor, who can least afford such an extravagant transition. Biden wants the country to sacrifice its standard of living in pursuit of these questionable goals, while major carbon-emitting nations like China and India essentially ignore the issue.

Half-Baked Substitution

Market intervention always has downsides to balance against the potential gains of “internalizing externalities”. In this case, the presumed negative externalities are imagined harms of catastrophic climate change from the use of fossil fuels; the presumed external benefits are the avoidance of carbon emissions and climate change via renewables and other “zero-carbon” technologies. With those harms and gains in question, it’s especially important to ask who loses. Taxpayers are certainly on that list. Users of energy produced with fossil fuels end up paying higher prices and are forced to conserve or submit to coerced conversion away from fossil fuels. Then there are the wider impediments to economic growth and, as noted above, the distributional consequences.

Users of immature or inferior energy alternatives might also end up as losers, and there are likely to be external costs associated with those technologies as well. It’s not widely appreciated that today’s so-called clean energy alternatives are plagued by their need for certain minerals that are costly to extract in economic and environmental terms, not to mention highly carbon intensive. And when solar and wind facilities fail or reach the end of their useful lives, disposal creates another set of environmental hazards. In short, the losses imposed through forced internalization of highly uncertain externalities are all too real.

Unfortunately, the energy sources favored by the Administration fail to meet base-load power needs on windless and/or cloudy days. The intermittency of these key renewables means that other power sources, primarily fossil-fuel and nuclear capacity, must remain available to meet demand on an ongoing basis. That means wind and solar cannot strictly replace fossil fuels and nuclear capacity unless we’re willing to tolerate severe outages. Growth in energy demand met by renewables must be matched by growth in backup capacity.
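To put rough numbers on that backup requirement, here’s a minimal back-of-the-envelope sketch. All of the figures (demand growth, nameplate capacity, worst-case renewable output) are assumptions chosen purely for illustration, not data from any actual grid:

```python
# Back-of-the-envelope sketch: dispatchable backup needed when new demand is
# "met" by intermittent renewables. All numbers are illustrative assumptions.

new_peak_demand_mw = 1_000        # assumed growth in peak demand (MW)
renewable_nameplate_mw = 3_000    # assumed wind/solar nameplate built to serve it
worst_case_output_share = 0.05    # assumed output share on a windless, cloudy evening

firm_renewable_mw = renewable_nameplate_mw * worst_case_output_share
backup_needed_mw = max(0.0, new_peak_demand_mw - firm_renewable_mw)

print(f"Firm renewable output at worst case: {firm_renewable_mw:.0f} MW")
print(f"Dispatchable backup still required:  {backup_needed_mw:.0f} MW")
# Under these assumptions, nearly all of the new peak demand must still be
# covered by fossil or nuclear capacity despite the renewable build-out.
```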

A call for “energy pragmatism” by Dan Ervin hinges on the use of coal to provide the “bridge to the energy future”, both because there remains a large amount of coal generating capacity and it can stabilize the grid given the intermittency of wind and solar. Ervin also bases his argument for coal on recent increases in the price of natural gas, though a reversal of the Biden EPA’s attacks on gas and coal, which Ervin acknowledges, would argue strongly in favor of natural gas as a pragmatic way forward.

Vehicle Mandates

The Administration has pushed mandates for electric vehicle (EV) production and sales, including subsidized charging stations. Of course, the power used by EVs is primarily generated by fossil fuels. Furthermore, rapid growth in EVs will put a tremendous additional strain on the electric grid, which renewables will not be able to relieve without additional backup capacity from fossil fuels and nuclear. This severely undermines the supposed environmental benefits of EVs.

Once again, mandates and subsidies are necessary because EV technology is not yet economic for most consumers. Those buyers don’t want to spend what’s necessary to purchase an EV, nor do they wish to suffer the inconveniences that re-charging often brings. This is a case in which policy is outrunning the underlying infrastructure required to support it. And while adoption of EVs is growing, it is still quite low (and see here).

Wising Up

Substitution into new inputs or technologies happens more rationally when prices accurately reflect true benefits and scarcities. The case for public subsidies and mandates in the push for a zero-carbon economy rests on model predictions of catastrophic global warming and a theoretical link between U.S. emissions and temperatures. Both links are weak and highly uncertain. What is certain is the efficiency of fossil fuels to power gains in human welfare.

This Bartley J. Madden quote sums up a philosophy of progress that is commendable for firms, and probably no less for public policymakers:

“Keep in mind that innovation is the key to sustainable progress that jointly delivers on financial performance and taking care of future generations through environmental improvements.”

Madden genuflects to the “sustainability” crowd, who otherwise don’t understand the importance of trusting markets to guide innovation. If we empower those who wish to crush private earnings from existing technologies, we concede the future to central planners, who are likely to choose poorly with respect to technology and timing. Let’s forego the coercive approach in favor of time, development, and voluntary adoption!

The Social Security Filing Dilemma

19 Monday Apr 2021

Posted by Nuetzel in Risk, Social Security

≈ Leave a comment

Tags

Deferred Benefits, Full Retirement Age, Life Expectancy, Opportunity cost, Retirement Savings, Risk Tolerance, Social Security, Time Preference

A 67-year-old friend told me he won’t file for Social Security (SS) benefits until he turns 70 because “it will pay off as long as I live to at least 81”. Okay, so benefit levels increase by about 8% for each year they’re deferred after your “full retirement age” (probably about 66 for him), and he has no doubt he’ll live more than the extra 11 years. Yes, his decision will “pay off” in a “break-even” sense if he lives that long: he’ll collect more incremental dollars of benefits beyond his 70th birthday than he’ll lose during the three-year deferral (but actually, he’d have to live till he’s 81.5 to break even). But that does not mean his decision is “optimal”.
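A quick way to check that arithmetic is to compare the benefits forgone during the deferral years with the incremental benefits collected afterward. Here’s a minimal sketch under simplified assumptions: the 8% increases compound over the three deferral years (which appears to be how the 81.5 figure arises), and there are no cost-of-living adjustments, taxes, or discounting:

```python
# Break-even check for deferring Social Security benefits from 67 to 70.
# Assumptions (illustrative): benefits grow 8% per deferral year, compounded;
# no cost-of-living adjustments, no discounting, no taxes.

base_benefit = 1.0          # annual benefit if claimed at 67 (normalized)
growth = 1.08               # 8% increase per year of deferral
deferral_years = 3          # waiting from age 67 to age 70

forgone = deferral_years * base_benefit                   # benefits skipped: 3.00
deferred_benefit = base_benefit * growth ** deferral_years
extra_per_year = deferred_benefit - base_benefit          # ~0.26 extra per year

years_to_break_even = forgone / extra_per_year            # ~11.6 years
print(f"Break-even age: {70 + years_to_break_even:.2f}")  # about 81.5
```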

Good things come to those who wait. I’ll simplify here just a bit, but let’s say an 8% increase in benefits is uniform for every year deferred beyond age 62. (It’s actually a bit more than that after full retirement age, but it’s less than 8% in some years prior to full retirement age.) 8% is a very good, “safe” return, assuming you don’t mind putting your faith in the government to make good.

The Reaper approaches: Unlike your personal savings, SS benefits end at death (a surviving spouse would continue to receive the higher of your respective benefit payments). That means the “safe” 8% return is eroded by diminishing life expectancy with each passing year. For example, average life expectancy at age 62 is 25.4 years, but it falls to 24.5 years at age 63. That’s a decline of 3.5% in the number of years one can expect to receive those higher, deferred benefits. At ages 69 and 70, remaining life expectancy is 19.6 and 18.8 years, respectively. Therefore, waiting the extra year to age 70 means a 4.1% decline in future years of benefits. So rather than a safe, 8% return, subtract about 4%. You’re looking at roughly a 4% uncertain return for deferral of benefits between age 62 and age 70. If you have health issues, it’s obviously worse.
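Here’s the same adjustment as a small sketch, using the life-expectancy figures quoted above (treat them, and the flat 8% gross return, as the illustrative values they are):

```python
# Rough "net return" from deferring benefits one more year: the 8% benefit
# increase less the proportional decline in remaining life expectancy.
# Life-expectancy values are the ones quoted in the post.

life_expectancy = {62: 25.4, 63: 24.5, 69: 19.6, 70: 18.8}

def net_return(age_from, age_to, gross=0.08):
    # proportional loss in expected years of collecting the higher benefit
    decline = (life_expectancy[age_from] - life_expectancy[age_to]) / life_expectancy[age_from]
    return gross - decline

print(f"Defer 62 -> 63: {net_return(62, 63):.1%}")   # 8% - 3.5% = ~4.5%
print(f"Defer 69 -> 70: {net_return(69, 70):.1%}")   # 8% - 4.1% = ~3.9%
```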

Opportunity Cost: It would be fine to take an expected 4% annual return for deferring SS benefits if you had no immediate use for the extra funds. But you could take the early benefits and invest them! If you’re still working, you could possibly save a like amount of funds from your employment income tax-deferred. So taking the early benefits would be worthwhile if you can earn at least 4% on the funds. Sure, investment returns are uncertain, but over a few years, a 4% annualized return (which I’ll call the “hurdle” rate) should not be hard to beat.

The same logic applies to an already retired individual who would withdraw funds from savings to afford the deferral of SS benefits. Instead, if he or she takes the benefits immediately, leaving a like amount invested, any return in excess of about 4% will have made it worthwhile. But of course, all of this is beside the point if you really just want to retire and the early benefits allow you to do so. You value the benefits now!
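To see how the hurdle plays out, here’s a hedged comparison of two simple strategies for someone like my friend: claim at 67 and invest every benefit check, or defer to 70 and collect (and invest) the larger benefit. The death age, candidate returns, and compounding convention are all assumptions for illustration, and the crossover point moves with them:

```python
# Compare claiming at 67 and investing the benefits vs. deferring to 70,
# for a given investment return and age at death. Benefits are normalized,
# and the 8% deferral credit compounds as in the earlier sketch. Illustrative only.

def wealth_at_death(claim_age, death_age, annual_return, base=1.0, growth=1.08):
    benefit = base * growth ** (claim_age - 67)
    wealth = 0.0
    for _ in range(claim_age, death_age):     # collect and reinvest each year
        wealth = wealth * (1 + annual_return) + benefit
    return wealth

for r in (0.00, 0.02, 0.04, 0.06):
    early = wealth_at_death(67, 85, r)
    late = wealth_at_death(70, 85, r)
    winner = "claim early" if early > late else "defer to 70"
    print(f"return {r:.0%}: early={early:.1f}, deferred={late:.1f} -> {winner}")
# With these assumptions the crossover lands in the low single digits,
# broadly consistent with the rough 4% hurdle discussed above.
```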

But what about taxes? Investment income will generally be taxed, and it’s possible the incremental benefits from deferred SS benefits won’t be. That might swing the calculus in favor of waiting a few extra years to file. And taking benefits early, while still employed, might mean a larger share of the early benefits will be taxed. If 80% of your benefits are taxed at a marginal rate of 25%, state and federal, you’re out 20% of your early benefits. Also, if you expect to be in a lower tax bracket in the future (good luck!), or if you plan to move to a low-tax state at some point in the future, deferring benefits might be more advantageous.
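The 20% haircut mentioned above is just the product of the taxable share and the marginal rate; here is the one-line arithmetic with those assumed parameters:

```python
# Quick arithmetic behind the "out 20%" claim: if 80% of early benefits are
# taxable at a combined 25% marginal rate, the tax bite is 0.80 * 0.25 = 20%.
taxable_share = 0.80     # assumed share of early benefits subject to tax
marginal_rate = 0.25     # assumed combined federal + state marginal rate

print(f"Share of early benefits lost to tax: {taxable_share * marginal_rate:.0%}")
```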

On the other hand, if you’re subject to tax on a portion of your early benefits, you’re likely to be subject to tax on benefits you defer as well. If your SS benefits and investment income are both taxed, the issue might be close to a wash, but that hurdle return I mentioned above might have to be a bit higher than 4% to justify early benefits.

Optimal? So what is an “optimal” decision about when to file for SS benefits? For anyone in their 60s today who has not yet filed for SS benefits, it depends on your tolerance for market risk and your tax status.

—Over a few years in the market, you can likely earn more than the rough 4% annual hurdle discussed above, so taking benefits as early as 62 might be a reasonable decision. That’s especially true if you already have some cash set aside to ride out market downturns.

—If you are an extremely conservative investor then you are unlikely to achieve a 4% return, so the “safe” return from deferring SS benefits is your best bet.

—If you believe your tax status will be more favorable later, that might swing the pendulum in favor of deferral, again depending on risk tolerance.

—If you are afraid that failing health and death might come prematurely, filing early is a reasonable decision.

—If you simply want to retire early and the benefits will enable you to do that, filing early is simply a matter of personal time preference.

So my friend who is deferring his SS benefits until age 70 might or might not be optimizing: 1) he is supremely confident in his long-term health, but that’s not something he should count on; 2) he might be an extremely cautious investor (okay…); and 3) he’s still working, and he might expect his tax status to improve by age 70 (I doubt it).

I plan to retire before I turn 65, and I think I’ll be happy to take the benefits and leave more of my money invested. As for Social Security generally, I’d be happy to take a steeply discounted lump sum immediately and invest it, rather than wait for retirement, but that ain’t gonna happen!

On the Meaning of Herd Immunity

09 Saturday May 2020

Posted by Nuetzel in Pandemic, Public Health, Risk

≈ 2 Comments

Tags

Antibody, Antigen, Carl T. Bergstrom, Christopher Moore, Covid-19, Herd Immunity, Heterogeneity, Household Infection, Immunity, Infection Mortality Risk, Initial Viral Load, John Cochrane, Lockdowns, Marc Lipsitch, Muge Cevik, Natalie Dean, Natural Immunity, Philippe Lemoine, R0, Santa Fe Institute, SARS-CoV-2, Social Distancing, Super-Spreaders, Zvi Mowshowitz

Immunity doesn’t mean you won’t catch the virus. It means you aren’t terribly susceptible to its effects if you do catch it. There is great variation in the population with respect to susceptibility. This simple point may help to sweep away confusion over the meaning of “herd immunity” and what share of the population must be infected to achieve it.

Philippe Lemoine discusses this point in his call for an “honest debate about herd immunity”. He reproduces the following chart, which appeared in this NY Times piece by Carl T. Bergstrom and Natalie Dean:

Herd immunity, as defined by Bergstrom and Dean, occurs when there are sufficiently few susceptible individuals remaining in the population to whom the actively-infected can pass the virus. The number of susceptible individuals shrinks over time as more individuals are infected. The chart indicates that new infections will continue after herd immunity is achieved, but the contagion recedes because fewer additional infections are possible.
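That dynamic, in which infections continue past the threshold but the epidemic then recedes, can be illustrated with a bare-bones SIR simulation. The parameters below are arbitrary illustrations, not estimates fitted to Covid-19:

```python
# Minimal SIR simulation: new infections continue after the herd immunity
# threshold (1 - 1/R0 for a homogeneous population) is crossed, but the
# epidemic recedes. Parameters are illustrative, not fitted to any disease.

R0, gamma = 2.5, 0.2            # reproduction number, recovery rate (per day)
beta = R0 * gamma               # transmission rate
S, I, R = 0.999, 0.001, 0.0     # susceptible, infected, recovered shares
dt, threshold = 0.1, 1 - 1 / R0

t, crossed_at = 0.0, None
while I > 1e-6:
    new_infections = beta * S * I * dt
    recoveries = gamma * I * dt
    S, I, R = S - new_infections, I + new_infections - recoveries, R + recoveries
    t += dt
    if crossed_at is None and (1 - S) >= threshold:
        crossed_at = t

print(f"Homogeneous herd immunity threshold: {threshold:.0%}")
print(f"Threshold crossed around day {crossed_at:.0f}; "
      f"share ever infected by the end: {1 - S:.0%}")   # overshoots well past 60%
```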

We tend to think of the immune population as those having already been exposed to the virus, and who have recovered. Those individuals have antibodies specifically targeted at the antigens produced by the virus. But many others have a natural immunity. That is, their immune systems have a natural ability to adapt to the virus.

Heterogeneity

At any point in a pandemic, the uninfected population covers a spectrum of individuals ranging from the highly susceptible to the barely or non-susceptible. Immunity, in that sense, is a matter of degree. The point is that the number of susceptible individuals doesn’t start at 100%, as most discussions of herd immunity imply, but at something much smaller. If a relatively high share of the population has low susceptibility, the virus won’t have to infect such a large share of the population to achieve effective herd immunity.

The apparent differences in susceptibility across segments of the population may be the key to early herd immunity. We’ve known for a while that the elderly and those with pre-existing conditions are highly vulnerable. Otherwise, youth and good health are associated with low vulnerability.

Lemoine references a paper written by several epidemiologists showing that “variation in susceptibility” to Covid-19 “lowers the herd immunity threshold”:

“Although estimates vary, it is currently believed that herd immunity to SARS-CoV-2 requires 60-70% of the population to be immune. Here we show that variation in susceptibility or exposure to infection can reduce these estimates. Achieving accurate estimates of heterogeneity for SARS-CoV-2 is therefore of paramount importance in controlling the COVID-19 pandemic.”

The chart below is from that paper. It shows a measure of this variation on the horizontal axis. The colored vertical lines show estimates of the variation in susceptibility drawn from historical viral episodes. The dashed line shows the required exposure for herd immunity as a function of this measure of heterogeneity.

Their models show that under reasonable assumptions about heterogeneity, the reduction in the herd immunity threshold (in terms of the percent infected) may be dramatic, to perhaps less than 20%.
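For a sense of the math, one closed-form expression often cited in this literature (for gamma-distributed susceptibility, the case studied in the paper linked above) puts the threshold at 1 - (1/R0)^(1/(1 + CV^2)), where CV is the coefficient of variation of susceptibility. Take the sketch below as an illustration of the direction and rough magnitude of the effect, not a definitive calculation:

```python
# Herd immunity threshold under gamma-distributed individual susceptibility,
# using the closed form 1 - (1/R0)**(1/(1 + CV**2)). With CV = 0 (homogeneous
# population) this reduces to the textbook 1 - 1/R0. Illustrative only.

def herd_immunity_threshold(R0: float, cv: float) -> float:
    return 1 - (1 / R0) ** (1 / (1 + cv ** 2))

R0 = 2.5
for cv in (0.0, 1.0, 2.0, 3.0):
    print(f"CV = {cv:.0f}: threshold = {herd_immunity_threshold(R0, cv):.0%}")
# CV = 0 gives the familiar 60%; larger variation pushes the threshold down
# toward the sub-20% figures mentioned above.
```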

Then there are these tweets from Marc Lipsitch, who links to this study:

“As an illustration we show that if R0=2.5 in an age-structured community with mixing rates fitted to social activity studies, and also categorizing individuals into three categories: low active, average active and high active, and where preventive measures affect all mixing rates proportionally, then the disease-induced herd immunity level is hD=43% rather than hC=1−1/2.5=60%.”

Even the celebrated Dr. Bergstrom now admits, somewhat grudgingly, that heterogeneity reduces the herd immunity threshold, though he doesn’t think the difference is large enough to change the policy conversation. Lipsitch is also cautious about the implications.

Augmented Heterogeneity

Theoretically, social distancing reduces the herd immunity threshold. That’s because infected but “distanced” people are less likely to come into close contact with the susceptible. However, that holds only so long as distancing lasts. John Cochrane discusses this at length here. Social distancing compounds the mitigating effect of heterogeneity, reducing the infected share of the population required for herd immunity.

Another compounding effect on heterogeneity arises from the variability of initial viral load on infection (IVL), basically the amount of the virus transmitted to a new host. Zvi Mowshowitz discusses its potential importance and what it might imply about distancing, lockdowns, and the course of the pandemic. In any particular case, a weak IVL can turn into a severe infection and vice versa. In large numbers, however, IVL is likely to bear a positive relationship to severity. Mowshowitz explains that a low IVL can give one’s immune system a head start on the virus. Nursing home infections, taking place in enclosed, relatively cold and dry environments, are likely to involve heavy IVLs. In fact, so-called household infections tend to involve heavier IVLs than infections contracted outside of households. And, of course, you are very unlikely to catch Covid outdoors at all.

Further Discussion

How close are we to herd immunity? Perhaps much closer than we thought, but maybe not close enough to let down our guard. Almost 80% of the population is less than 60 years of age. However, according to this analysis, about 45% of the adult population (excluding nursing home residents) has at least one of six conditions indicating elevated susceptibility to Covid-19 relative to young individuals with no co-morbidities. The absolute level of risk might not be “high” in many of those cases, but it is elevated. Again, children have extremely low susceptibility based on what we’ve seen so far.

This is supported by the transmission dynamics discussed in this Twitter thread by Dr. Muge Cevik. She concludes:

“In summary: While the infectious inoculum required for infection is unknown, these studies indicate that close & prolonged contact is required for #COVID19 transmission. The risk is highest in enclosed environments; household, long-term care facilities and public transport. …

Although limited, these studies so far indicate that susceptibility to infection increases with age (highest >60y) and growing evidence suggests children are less susceptible, are infrequently responsible for household transmission, are not the main drivers of this epidemic.”

Targeted isolation of the highly susceptible in nursing homes, as well as various forms of public “distancing aid” to the independent elderly or those with co-morbidities, is likely to achieve large reductions in the effective herd immunity ratio at low cost relative to general lockdowns.

The existence of so-called super-spreaders is another source of heterogeneity, and one that lends itself to targeting with limitations or cancellations of public events and large gatherings. What’s amazing about this is how the super-spreader phenomenon can lead to the combustion of large “hot spots” in infections even when the average reproduction rate of the virus is low (R0 < 1). This is nicely illustrated by Christopher Moore of the Santa Fe Institute. Super-spreading also implies, however, that while herd immunity signals a reduction in new infections and declines in the actively infected population, “hot spots” may continue to flare up in a seemingly random fashion. The consequences will depend on how susceptible individuals are protected, or on how they choose to mitigate risks themselves.

Conclusion

I’ve heard too many casual references to herd immunity requiring something like 70% of the population to be infected. It’s not that high. Many individuals already have a sort of natural immunity. Recognition of this heterogeneity has driven a shift in the emphasis of policy discussions to the idea of targeted lockdowns, rather than the kind of indiscriminate “dumb” lockdowns we’ve seen. The economic consequences of shifting from broad to targeted lockdowns would be massive. And why not? The health care system has loads of excess capacity, and Covid infection fatality risk (IFR) is turning out to be much lower than the early, naive estimates we were told to expect, which were based on confirmed case fatality rates (CFRs).

Beepocamyth: Neonics Don’t Kill the Buzz

08 Saturday Feb 2020

Posted by Nuetzel in Agriculture, Biodiversity, Environment, Risk

≈ 1 Comment

Tags

Beepocalypse, Colony Collapse Disorder, Fish & Wildlife Service, Genetic Literacy Project, Glyphosate, Jon Entine, Junk Science, Kayleen Schreiber, National Wildlife Refuges, Neonicotiniods, Neonics, Nydia Velázquez, Paul Driessen, Pesticides, Sierra Club

False claims that a certain class of pesticides threaten the world’s bee populations are commonplace, and we hear the same more recently about various species of birds. The origins of the “beepocalypse” rumor were not based on scientific evidence, but on a narrative that developed among environmental activists in response to a phenomenon called Colony Collapse Disorder (CCD) that began around 2006, roughly a decade after neonicotinoid pesticides (so-called neonics) replaced earlier, more toxic compounds as the pesticides of choice. But Jon Entine writes at The Genetic Literacy Project:

“What causes CCD? It still remains a mystery, in part. But researchers turned up historical examples of CCD-like bee die offs across the globe over hundreds of years, well before the introduction of pesticides, but activist groups would have none of it.”

CCD essentially tapered off by 2009, according to Entine, and the number of honeybee colonies is higher now than before the introduction of neonics. See Entine’s charts at the link showing changes in honeybee populations over time. In Australia, where the use of neonics has been especially heavy, bee populations have grown steadily and remain quite healthy.

Entine’s article provides a nice summary of the real and imagined threats to the world’s bee populations as well as distorted claims associated with normal winter die-offs. He provides a number of useful links on these subjects, and he summarizes research showing the lack of any real threat to bees from neonics:

“Over the past seven years, there have been a flood of studies about the potential impact of neonics on bees. Many small-scale, forced-feeding studies that generally overdosed bees with neonics found various negative effects; not a surprise, many entomologists have said, as they do not replicate real world impacts.

In contrast, a multitude of large-population field studies—the ‘gold-standard’ of bee research—have consistently demonstrated there are no serious adverse effects of neonic insecticides on honeybees at the colony level from field-realistic neonic exposure. …

By last year, even the Sierra Club—for years one of the leading proponents of the honeybee Armageddon narrative—was backpeddling, writing: ‘Honeybees are at no risk of dying off. While diseases, parasites and other threats are certainly real problems for beekeepers, the total number of managed honeybees worldwide has risen 45% over the last half century.'”

Then Entine turns his attention to another front in the war on pesticides: a Canadian study in which white-crowned sparrows were force-fed a mixture of seeds and pesticide via gavage — i.e., through a tube:

“Only sparrows force-fed the highest dosage were affected, and then only temporarily. They stopped eating, quickly lost body weight and fat, became disoriented and paused their migratory flight—all after tube full of chemicals was forced down their throat and into their stomach. … That said, within a few days of what was likely a trauma-inducing experience, all recovered completely and continued their migration normally.”

Yet the authors reported that the very existence of some wild birds is threatened by neonics, and the media, always eager to report a crisis, ran with it.

Paul Driessen also describes the junk science underlying misleading narratives regarding pesticide use. It is a driving force behind legislation in the House and Senate that would ban the use of neonics in National Wildlife Refuges, where the Fish & Wildlife Service permits farmers to grow various crops. Driessen has some advice for Rep. Nydia Velázquez (D-NY), a sponsor of the legislation:

“She should also recognize potentially serious threats to bees, wildlife, soils, waters and plants in refuges from sources that she, her colleagues and their environmentalist and media allies routinely ignore: solar panels, for instance. Not only do they blanket many thousands of acres, allowing little to grow beneath or between them. They can also leach cadmium and other metals into soils and waters. They should no longer be built near wildlife refuges.

Finally, it’s not just bees. It’s also birds, and bats – which are already being killed and even eradicated in many areas by America’s 56,000 wind turbines. Imagine what Green New Deal turbine numbers would do.”

More perspective is offered in this excellent six-part (and growing?) “Pesticides and Food” series (all at the link) by Kayleen Schreiber:

  1. Has pesticide use decreased? Yes, dramatically, both per capita and per unit of output.
  2. Have pesticides improved? Yes, with dramatically lower toxicity, improved biodegradability, and lower use rates.
  3. How dangerous is glyphosate (a herbicide)? Not very. Covered in my last post. Glyphosate is only 1/10th as toxic as caffeine.
  4. How do organic pesticides compare to synthetic pesticides? It’s a mixed bag, with great variability across both classes. Organics are more toxic in some applications, and synthetics are more toxic in others.
  5. Soil health: Are synthetic pesticides more sustainable than “natural” organics? Organics require more tillage, which creates sustainability problems.
  6. Pesticide residues — something to worry about? The USDA finds little residue in its testing, with extremely low detection rates for both organics and synthetics.


Certainty Laundering and Fake Science News

05 Wednesday Dec 2018

Posted by Nuetzel in Global Warming, Risk, Science

≈ Leave a comment

Tags

Ashe Schow, Certainty Laundering, Ceteris Paribus, Fake News, Fake Science, Fourth Annual Climate Assessment, Money Laundering, Point Estimates, Statistical Significance, Warren Meyer, Wildfires

Intriguing theories regarding all kinds of natural and social phenomena abound, but few if any of those theories can be proven with certainty or even validated at a high level of statistical significance. Yet we constantly see reports in the media about scientific studies purporting to prove one thing or another. Naturally, journalists pounce on interesting stories, and they can hardly be blamed when scientists themselves peddle “findings” that are essentially worthless. Unfortunately, the scientific community is doing little to police this kind of malpractice. And incredible as it seems, even principled scientists can be so taken with their devices that they promote uncertain results with few caveats.

Warren Meyer coined the term “certainty laundering” to describe a common form of scientific malpractice. Observational data is often uncontrolled and/or too thin to test theories with any degree of confidence. What’s a researcher to do in the presence of such great uncertainties? Start with a theoretical model in which X is true by assumption and choose parameter values that seem plausible. In all likelihood, the sparse data that exist cannot be used to reject the model on statistical grounds. The data are therefore “consistent with a model in which X is true”. Dramatic headlines are then within reach. Bingo!

The parallel drawn by Meyer between “certainty laundering” and the concept of money laundering is quite suggestive. The latter is a process by which economic gains from illegal activities are funneled through legal entities in order to conceal their subterranean origins. Certainty laundering is a process that may encompass the design of the research exercise, its documentation, and its promotion in the media. It conceals from attention the noise inherent in the data upon which the theory of X presumably bears.

Another tempting exercise that facilitates certainty laundering is to ask how much a certain outcome would have changed under some counterfactual circumstance, call it Z. For example, while atmospheric CO2 concentration increased by roughly one part per 10,000 (0.01%) over the past 60 years, Z might posit that the change did not take place. Then, given a model that embodies a “plausible” degree of global temperature sensitivity to CO2, one can calculate how different global temperatures would be today under that counterfactual. This creates a juicy but often misleading form of attribution. Meyer refers to this process as a way of “writing history”:

“Most of us are familiar with using computer models to predict the future, but this use of complex models to write history is relatively new. Researchers have begun to use computer models for this sort of retrospective analysis because they struggle to isolate the effect of a single variable … in their observational data.”

These “what-if-instead” exercises generally apply ceteris paribus assumptions inappropriately, presuming the dominant influence of a single variable while ignoring other empirical correlations that might have countervailing effects. The exercise usually culminates in a point estimate of the change “implied” by X, with no mention of possible errors in the estimated sensitivity or of the range of outcomes implied by model uncertainty. In many such cases, the actual model and its parameters have not been validated under strict statistical criteria.
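Here’s a toy sketch of the general problem: propagate uncertainty in a model’s sensitivity parameter through a counterfactual calculation and compare the spread with the single point estimate that makes the headline. The linear model and every number below are invented for illustration and refer to no particular study:

```python
# Toy illustration of certainty laundering: a counterfactual attribution
# reported as a single point estimate hides the spread implied by parameter
# uncertainty. The model and all numbers are invented for illustration.

import random

random.seed(0)
forcing_change = 1.0            # assumed change in the driver (arbitrary units)
best_guess_sensitivity = 0.8    # the "plausible" value behind the headline
sensitivity_sd = 0.4            # the uncertainty the headline never mentions

point_estimate = best_guess_sensitivity * forcing_change

draws = sorted(
    max(0.0, random.gauss(best_guess_sensitivity, sensitivity_sd)) * forcing_change
    for _ in range(10_000)
)
lo, hi = draws[int(0.05 * len(draws))], draws[int(0.95 * len(draws))]

print(f"Headline point estimate of the attributed change: {point_estimate:.2f}")
print(f"90% range once parameter uncertainty is included: {lo:.2f} to {hi:.2f}")
```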

Meyer goes on to describe a climate study from 2011 that was quite blatant about its certainty laundering approach. He provides the following quote from the study:

“These question cannot be answered using observations alone, as the available time series are too short and the data not accurate enough. We therefore used climate model output generated in the ESSENCE project, a collaboration of KNMI and Utrecht University that generated 17 simulations of the climate with the ECHAM5/MPI-OM model to sample the natural variability of the climate system. When compared to the available observations, the model describes the ocean temperature rise and variability well.”

At the time, Meyer wrote the following critique:

“[Note the first and last sentences of this paragraph] First, that there is not sufficiently extensive and accurate observational data to test a hypothesis. BUT, then we will create a model, and this model is validated against this same observational data. Then the model is used to draw all kinds of conclusions about the problem being studied.

This is the clearest, simplest example of certainty laundering I have ever seen. If there is not sufficient data to draw conclusions about how a system operates, then how can there be enough data to validate a computer model which, in code, just embodies a series of hypotheses about how a system operates?”

In “Imprecision and Unsettled Science”, I wrote about the process of calculating global surface temperatures. That process is plagued by poor-quality data and uncertainty, yet many climate scientists and the media seem completely unaware of these problems. They view global and regional temperature data as infallible, but in reality these aggregated readings should be recognized as point estimates with wide error bands. Those bands imply that the conclusions of any research utilizing aggregate temperature data are subject to tremendous uncertainty. Unfortunately, that fact doesn’t get much play.
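As a sketch of what those error bands do to downstream results, here is a toy calculation showing how measurement error in an aggregated temperature series widens the uncertainty around an estimated trend. The data are synthetic, and the “true” trend and error size are pure assumptions:

```python
# Toy illustration: a trend fitted to an aggregated temperature series is a
# point estimate whose uncertainty grows with the measurement error in the
# series. Synthetic data; the "true" trend and error size are assumptions.

import random

random.seed(1)
years = list(range(40))
true_trend = 0.02          # assumed warming trend, degrees per year
measurement_sd = 0.15      # assumed std. dev. of error in each annual reading

def fitted_trend(series):
    # ordinary least squares slope of the series against years
    n = len(series)
    mean_x, mean_y = sum(years) / n, sum(series) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, series))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

trends = sorted(
    fitted_trend([true_trend * x + random.gauss(0, measurement_sd) for x in years])
    for _ in range(2_000)
)
lo, hi = trends[int(0.05 * len(trends))], trends[int(0.95 * len(trends))]
print(f"Assumed true trend: {true_trend:.3f} deg/yr")
print(f"90% range of fitted trends given measurement error: {lo:.3f} to {hi:.3f}")
```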

As Ashe Schow explains, junk science is nothing new. Successful replication rates of study results in most fields are low, and the increasing domination of funding sources by government tends to promote research efforts supporting the preferred narratives of government bureaucrats.

But perhaps we’re not being fair to the scientists, or most scientists at any rate. One hopes that the vast majority theorize with the legitimate intention of explaining phenomena. The unfortunate truth is that adequate data for testing theories is hard to come by in many fields. Fair enough, but Meyer puts his finger on a bigger problem: One simply cannot count on the media to apply appropriate statistical standards in vetting such reports. Here’s his diagnosis of the problem in the context of the Fourth National Climate Assessment and its estimate of the impact of climate change on wildfires:

“The problem comes further down the food chain:

  1. When the media, and in this case the US government, uses this analysis completely uncritically and without any error bars to pretend at certainty — in this case that half of the recent wildfire damage is due to climate change — that simply does not exist
  2. And when anything that supports the general theory that man-made climate change is catastrophic immediately becomes — without challenge or further analysis — part of the ‘consensus’ and therefore immune from criticism.”

That is a big problem for science and society. A striking point estimate is often presented without adequate emphasis on the degree of noise that surrounds it. Indeed, even given a range of estimates, the top number is almost certain to be stressed more heavily. Unfortunately, the incentives facing researchers and journalists are skewed toward this sort of misplaced emphasis. Scientists and other researchers are not immune to the lure of publicity and the promise of policy influence. Sensational point estimates have additional value if they support an agenda that is of interest to those making decisions about research funding. And journalists, who generally are not qualified to make judgements about the quality of scientific research, are always eager for a good story. Today, the spread of bad science, and bad science journalism, is all the more virulent as it is propagated by social media.

The degree of uncertainty underlying a research result just doesn’t sell, but it is every bit as crucial to policy debate as a point estimate of the effect. Policy decisions have expected costs and benefits, but the costs are often front-loaded and more certain than the hoped-for benefits. Any valid cost-benefit analysis must account for uncertainties, but once a narrative gains steam, this sort of rationality is too often cast to the wind. Cascades in public opinion and political momentum are all too vulnerable to the guiles of certainty laundering. Trends of this kind are difficult to reverse and are especially costly if the laundered conclusions are wrong.
